CN106030578B - Search system and control method of search system - Google Patents


Info

Publication number
CN106030578B
CN106030578B (application CN201580010270.3A)
Authority
CN
China
Prior art keywords
data
unit
image
commodity
physical quantity
Prior art date
Legal status
Active
Application number
CN201580010270.3A
Other languages
Chinese (zh)
Other versions
CN106030578A (en)
Inventor
与那霸诚
Current Assignee
Fujifilm Corp
Original Assignee
Fujifilm Corp
Priority date
Filing date
Publication date
Application filed by Fujifilm Corp filed Critical Fujifilm Corp
Publication of CN106030578A
Application granted
Publication of CN106030578B


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
        • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
            • G06Q 30/00 Commerce
                • G06Q 30/06 Buying, selling or leasing transactions
                    • G06Q 30/0601 Electronic shopping [e-shopping]
                        • G06Q 30/0623 Item investigation
                            • G06Q 30/0625 Directed, with specific intent or strategy
                                • G06Q 30/0627 Using item specifications
        • G06F ELECTRIC DIGITAL DATA PROCESSING
            • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
                • G06F 16/50 Information retrieval of still image data
                    • G06F 16/54 Browsing; Visualisation therefor
                    • G06F 16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
                • G06F 16/90 Details of database functions independent of the retrieved data types
                    • G06F 16/95 Retrieval from the web
                        • G06F 16/951 Indexing; Web crawling techniques
                        • G06F 16/953 Querying, e.g. by the use of web search engines
                            • G06F 16/9535 Search customisation based on user profiles and personalisation
                            • G06F 16/9538 Presentation of query results

Abstract

The invention provides a search system, a server system, and control methods therefor that can simply and accurately specify images of commodities matching a user's preference by means of a perceptual word, and display those images in a form that lets the user intuitively grasp their information. In a preferred embodiment of the present invention, the user designates perceptual word data at the client terminal (S11), and the perceptual word data is transmitted to the server system (S12). The server system receives the perceptual word data (S13), acquires physical quantities of commodities associated with the perceptual word data (S14), and acquires image data of the commodities associated with those physical quantities (S15), thereby generating display information data indicating the display form of the images (S16). The image data and the display information data of the commodities are transmitted from the server system to the client terminal (S17, S18), and the image data of the commodities is displayed on the client terminal based on the display information data (S19).

Description

Search system and control method of search system
Technical Field
The present invention relates to a search system, a server system, a control method of the search system and the server system, and more particularly, to a search technique using perceptual words.
Background
When searching for a commodity such as Western-style clothes on an EC (Electronic Commerce) site on the Internet, it is common to narrow down the commodities by specifying a price, a size, a color, and the like. For example, when a user designates a desired suit color on a web page, a list of suit images related to the designated color is displayed, and by selecting a desired suit image from the displayed list, the user can purchase that suit.
However, the colors available for selection on an EC site are limited, and the user sometimes cannot specify the exact color desired. In addition, since users often select products based on a vague impression when purchasing at a physical store or the like, searching for products by a specific color is not always preferable from the viewpoint of promoting purchases.
In this way, when searching for product images by color or the like, products matching the color or impression actually desired by the user are not necessarily presented in the search results. Therefore, various search techniques based on information other than color have been proposed.
Patent document 1 discloses a product information notification system capable of changing the additional information of a product for each user and presenting the information effectively. In this product information notification system, basic information of a product group to be browsed by the user is selected from a product information database storing basic information (product name, size, color, brand, and the like) of each product, and the selected basic information is provided to the user. Corresponding additional information is then selected from an additional information database according to the user's member information, and the selected additional information is provided to the user.
Patent document 2 discloses an image search device that performs a search using a keyword including a perceptual word. In this image search device, the acquired keyword is divided into nouns and perceptual words, feature quantities for search are acquired from a perceptual word/noun-feature quantity database according to the combination of the nouns and perceptual words, and an image is searched for using the acquired feature quantities.
Prior art documents
Patent document
Patent document 1: Japanese Laid-Open Patent Publication No. 2002-157268
Patent document 2: Japanese Laid-Open Patent Publication No. 2011-065499
Disclosure of Invention
Technical problem to be solved by the invention
As a method for allowing a user (customer) to easily search for a desired product, using a perceptual word as a search word is effective. However, the impression evoked by a perceptual word differs not only from user to user but also easily varies over time. That is, even for the same perceptual word, the product called to mind may differ from user to user, and even for a single user, preferences change over time, so the product called to mind by the same perceptual word does not necessarily stay the same.
It is therefore desirable to improve the overall convenience of product search on EC sites by realizing accurate product searches through an intuitive search technique that uses perceptual words as keywords, and by appropriately presenting images of the plurality of retrieved candidate products to the user, ultimately increasing product sales.
Accordingly, a new technology is desired that allows a user who accesses an EC site to easily find a desired product and to reach a purchase decision more reliably. More specifically, it is desired to provide a new search system and the like which, over the series of search actions from "the user inputting a perceptual word as a search keyword" to "displaying the search results", enables the user to make a purchase decision the instant the search results based on the perceptual word are seen.
However, no such technology covering this series of search actions has yet been proposed for product search using perceptual words.
For example, the product information notification system of patent document 1 aims at effective presentation by changing the additional information of a product for each user, but does not perform a search using perceptual words. Moreover, this product information notification system gives no consideration to enabling the user to easily find a desired product in terms of how search results are displayed.
Likewise, the image search device of patent document 2 employs a search technique based on perceptual words, but, as with the product information notification system of patent document 1, gives no consideration to enabling the user to easily find a desired product in terms of how search results are displayed.
The present invention has been made in view of the above circumstances, and an object thereof is to provide a technique for simply and accurately specifying, by means of a perceptual word, a plurality of product images matching the user's preference, and for displaying those product images so that the user can intuitively understand the relationships among them.
Means for solving the technical problem
One aspect of the present invention relates to a search system including a client terminal and a server system connected to the client terminal via a network. The server system includes: a server receiving unit that receives data transmitted from the client terminal via the network; a physical quantity acquisition unit that acquires a physical quantity of a commodity associated with the perceptual word data received by the server receiving unit from the client terminal via the network; an image search unit that acquires image data of the commodity associated with the physical quantity of the commodity acquired by the physical quantity acquisition unit; an image arrangement unit that determines a display form of the image data of the commodity acquired by the image search unit and generates display information data indicating the determined display form; and a server transmitting unit that transmits the image data of the commodity acquired by the image search unit and the display information data generated by the image arrangement unit to the client terminal via the network. The client terminal includes: a display unit; a terminal input unit through which a user designates perceptual word data; a terminal transmitting unit that transmits the perceptual word data designated through the terminal input unit to the server system via the network; a terminal receiving unit that receives the image data of the commodity and the display information data transmitted from the server system via the network; and a display control unit that displays the image data of the commodity received by the terminal receiving unit on the display unit based on the display information data received by the terminal receiving unit.
According to this aspect, the image data of the commodity is acquired based on the physical quantity of the commodity associated with the perceptual word data designated by the user, so an image of a commodity matching the user's preference can be specified simply and accurately by means of a perceptual word. Further, since the image data of the commodity is displayed based on the display information data, the commodity images can be displayed in such a way that the user can intuitively grasp the information of the commodity images obtained by the search.
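For illustration only, the flow of this aspect can be sketched in a few lines of Python. Everything in the sketch (function names, table contents, data shapes) is an assumption added for explanation; the patent does not prescribe any concrete implementation.

```python
# Minimal sketch of the server-side flow of this aspect.  All names,
# tables, and data shapes are hypothetical.
from typing import Dict, List

def acquire_physical_quantity(word: str) -> Dict[str, str]:
    # Physical quantity acquisition unit: perceptual word -> physical quantity
    conversion_table = {"lovely": {"color": "pink", "pattern": "floral"}}
    return conversion_table.get(word, {})

def search_images(quantity: Dict[str, str]) -> List[str]:
    # Image search unit: physical quantity -> commodity image data
    image_db = {("pink", "floral"): ["dress_001.jpg", "dress_002.jpg"]}
    key = (quantity.get("color"), quantity.get("pattern"))
    return image_db.get(key, [])

def arrange_images(images: List[str]) -> List[dict]:
    # Image arrangement unit: decide a display form, emit display info data
    return [{"image": img, "slot": i} for i, img in enumerate(images)]

def handle_request(perceptual_word: str) -> dict:
    quantity = acquire_physical_quantity(perceptual_word)
    images = search_images(quantity)
    display_info = arrange_images(images)
    # The server transmitting unit would send both back to the client.
    return {"images": images, "display_info": display_info}

print(handle_request("lovely"))
```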
The following is preferred: the server system further includes: a user information database that stores user identification data in association with user attribute data; a user attribute information acquisition unit that accesses the user information database to acquire the user attribute data associated with the user identification data received from the client terminal via the network; a conversion table database that stores a plurality of conversion tables which are determined according to the user attribute data and which associate physical quantities of commodities with perceptual word data; a conversion table acquisition unit that accesses the conversion table database to acquire the conversion table associated with the user attribute data acquired by the user attribute information acquisition unit; and an image database that stores the image data of the commodities in association with the physical quantities of the commodities. The user identification data and the perceptual word data are designated by the user through the terminal input unit; the terminal transmitting unit transmits the designated user identification data and perceptual word data to the server system via the network; the physical quantity acquisition unit acquires the physical quantity of the commodity associated with the perceptual word data received by the server receiving unit, with reference to the conversion table acquired by the conversion table acquisition unit; and the image search unit accesses the image database to acquire the image data of the commodity associated with the physical quantity of the commodity acquired by the physical quantity acquisition unit.
According to this aspect, the physical quantity of the commodity is acquired with reference to the conversion table determined according to the user attribute data, so the image data of the commodity is acquired in accordance with the attributes of the user.
The following is preferred: the image arrangement unit determines the display form of the image data of the products acquired by the image search unit to be a form in which the image data is displayed according to the characteristic data of the products, and the display information data indicates a display form in which the image data of the products acquired by the image search unit is displayed according to the characteristic data of the products.
According to this aspect, since the image data of the product acquired by the image search unit is displayed based on the characteristic data of the product, the user can intuitively understand the relationship between the product images based on the product characteristics.
The following is preferred: the display information data indicates a display form in which at least a part of the image data of the product acquired by the image search unit is displayed on a coordinate system indicating the characteristic data of the product.
According to this aspect, at least a part of the image data of the product is displayed on the coordinate system representing the characteristic data of the product, so that the user can intuitively understand the relationship between the product images on the coordinate system.
The following is preferred: the characteristic data of the product is determined according to a characteristic, among the characteristics of the product, that differs from the perceptual word data designated by the user through the terminal input unit.
According to this aspect, the image data of the product is displayed according to a characteristic different from the perceptual word data specified by the user. Here, "a characteristic different from the perceptual word data specified by the user through the terminal input unit" may be a product characteristic based on the perceptual word or may be a product characteristic not based on the perceptual word.
The following is preferred: the characteristic data of the product is determined based on at least one of the price of the product and the size of the product.
In this case, the user can intuitively grasp information on the displayed products relating to at least one of the price and the size of the products.
The following is preferred: the physical quantity of the product is determined by at least one of the color of the product, the pattern of the product, the texture of the product, and the model of the product.
According to this aspect, the image data of the product can be acquired from at least one of the color of the product, the pattern of the product, the texture of the product, and the model of the product.
The following is preferred: the user attribute data is determined according to at least one of the gender, age, race, nationality, religion, and denomination of the user.
According to this aspect, the "physical quantity of the product associated with the perceptual word data" can be acquired with reference to a conversion table based on at least one of the gender, age, race, nationality, religion, and denomination of the user.
The following is preferred: the display control unit displays a plurality of perceptual words on the display unit, and the terminal input unit receives a command from the user, specifies at least one of the plurality of perceptual words displayed on the display unit in accordance with the command, and designates the specified perceptual word as the perceptual word data.
According to this aspect, the user can designate the perceptual word data simply by selecting at least one of the plurality of perceptual words displayed on the display unit, which is highly convenient.
The following is preferred: the image data of the commodity is acquired by photographing the commodity.
According to this aspect, appropriate commodity image data obtained by photographing the actual commodity is used.
The following is preferred: metadata indicating a physical quantity of the commodity is added to the image data of the commodity, and the image search unit acquires, as the image data of the commodity associated with the physical quantity of the commodity acquired by the physical quantity acquisition unit, the image data of the commodity to which metadata indicating that physical quantity is added.
According to this aspect, the image data of the product associated with the physical quantity of the product can be easily acquired from the metadata.
The following is preferred: the physical quantity of the commodity associated with the image data of the commodity is acquired by analyzing the image data of the commodity.
According to this aspect, the physical quantity of the commodity is acquired by image analysis.
The following is preferred: the server system further includes an image analysis unit that acquires the physical quantity of the commodity by analyzing the image data of the commodity, and stores the acquired physical quantity in the image database in association with the image data of the commodity.
According to this aspect, the image analysis unit analyzes the image data to acquire the physical quantity of the product.
Another aspect of the present invention relates to a server system connected to a client terminal via a network, the server system including: a server receiving unit that receives data transmitted from the client terminal via the network; a physical quantity acquisition unit that acquires a physical quantity of a commodity associated with the perceptual word data received by the server receiving unit from the client terminal via the network; an image search unit that acquires image data of the commodity associated with the physical quantity of the commodity acquired by the physical quantity acquisition unit; an image arrangement unit that determines a display form of the image data of the commodity acquired by the image search unit and generates display information data indicating the determined display form; and a server transmitting unit that transmits the image data of the commodity acquired by the image search unit and the display information data generated by the image arrangement unit to the client terminal via the network.
Another aspect of the present invention relates to a method for controlling a search system including a client terminal and a server system connected to the client terminal via a network. In the server system, data transmitted from the client terminal is received via the network by a server receiving unit; a physical quantity of a commodity associated with the perceptual word data received by the server receiving unit from the client terminal via the network is acquired by a physical quantity acquisition unit; image data of the commodity associated with the acquired physical quantity is acquired by an image search unit; a display form of the acquired image data is determined by an image arrangement unit, which generates display information data representing the determined display form; and the image data of the commodity acquired by the image search unit and the display information data generated by the image arrangement unit are transmitted to the client terminal via the network by a server transmitting unit. In the client terminal, perceptual word data designated by the user through a terminal input unit is transmitted to the server system via the network by a terminal transmitting unit; the image data of the commodity and the display information data transmitted from the server system are received via the network by a terminal receiving unit; and the image data of the commodity received by the terminal receiving unit is displayed on a display unit by a display control unit based on the display information data received by the terminal receiving unit.
Another aspect of the present invention relates to a method for controlling a server system connected to a client terminal via a network. Data transmitted from the client terminal is received via the network by a server receiving unit; a physical quantity of a commodity associated with the perceptual word data received by the server receiving unit from the client terminal via the network is acquired by a physical quantity acquisition unit; image data of the commodity associated with the physical quantity acquired by the physical quantity acquisition unit is acquired by an image search unit; a display form of the image data of the commodity acquired by the image search unit is determined by an image arrangement unit, which generates display information data representing the determined display form; and the image data of the commodity acquired by the image search unit and the display information data generated by the image arrangement unit are transmitted to the client terminal via the network by a server transmitting unit.
Effects of the invention
According to the present invention, the image data of a commodity is acquired from the physical quantity of the commodity associated with the perceptual word data designated by the user, and the image data is displayed based on display information data indicating its display form. Therefore, commodity images matching the user's preference can be specified simply and accurately by means of a perceptual word, and displayed so that the user can intuitively grasp their information.
Drawings
Fig. 1 is a conceptual diagram of a search system.
Fig. 2 is a block diagram showing an example of a functional configuration of the client terminal.
Fig. 3 shows an example of an input screen of user identification data displayed on the display unit of the client terminal.
Fig. 4 shows an example of a display unit of a client terminal that displays a plurality of perceptual words (search word candidates).
Fig. 5 is a block diagram showing an example of a functional configuration of the server system.
Fig. 6 is a conceptual diagram of a data structure showing a correspondence relationship between user identification data and user attribute data stored in the user information database.
Fig. 7 is a conceptual diagram of a data structure showing an example of the structure data of the user attribute data.
Fig. 8 is a conceptual diagram of a data structure showing a correspondence relationship between user attribute data stored in the conversion table database and the conversion table.
Fig. 9 is a conceptual diagram of a data structure showing an example of the structure data of the conversion table.
Fig. 10 is a conceptual diagram showing a relationship between perceptual word data (perceptual space) and physical quantity data (physical measurement space) defined by the conversion table.
Fig. 11 is a conceptual diagram of a data structure showing the correspondence between image data and metadata stored in an image database.
Fig. 12 is a conceptual diagram of a data structure showing an example of the structure data of metadata.
Fig. 13 shows an example of a display mode of the image data of the product acquired by the image search unit.
Fig. 14 is a block diagram showing an example of a functional configuration of the server system, and is a diagram showing a case where the functional configuration of the server system is realized by a plurality of servers.
Fig. 15 is a conceptual diagram illustrating an example of a search system in which a server system and a client terminal exist in one country (country a).
Fig. 16 is a conceptual diagram illustrating an example of a search system when a server system and a client terminal exist across a plurality of countries (countries a to D).
Fig. 17 is a flowchart of the search processing and the search result display processing.
Fig. 18 is a diagram showing an appearance of the smartphone.
Fig. 19 is a block diagram showing the structure of the smartphone shown in fig. 18.
Detailed Description
An embodiment of the present invention will be described with reference to the drawings. In the following embodiments, an example in which "western style clothes" is a product to be searched is described, but the present invention is not limited to this, and can be applied to the case of searching for any other product.
Fig. 1 is a conceptual diagram of a search system 1. The search system 1 according to the present embodiment includes client terminals 11 and a server system 10 connected to each client terminal 11 via a network 12 such as the internet.
The client terminal 11 is a terminal operated by a user when searching for a product such as Western-style clothes, and may take the form of a portable terminal such as a smartphone or tablet, a personal computer, or the like.
The server system 10 performs a product search in accordance with an instruction transmitted from the client terminal 11 via the network 12, and feeds back the search result to the client terminal 11 via the network 12.
In the search system 1, first, a functional configuration of the client terminal 11 will be described.
Fig. 2 is a block diagram showing an example of the functional configuration of the client terminal 11.
The client terminal 11 of the present example includes a terminal input unit 20, a terminal external input/output unit 23 (a terminal transmission unit 21 and a terminal reception unit 22), a display control unit 25, a display unit 27, and a terminal system controller 28.
The terminal input unit 20 includes: an operation unit that is directly operated by a user to input data such as a user ID, a password, and a search term (search basic information); and an information specifying unit that specifies data types such as a user ID, a password, and search basic information, which are input through the operation unit.
The user ID and the password are data for identifying the user who operates the client terminal 11; hereinafter, either one of the user ID and the password, or the combination of both, is also referred to as "user identification data D1". The "search term (search basic information)" is a term indicating a feature of a product desired by the user, and in the present embodiment perceptual word data D2 is used as the search word. The perceptual word data D2 is data indicating a perceptual term (perceptual word), and as a search word it indicates the impression of the desired commodity expected by the user.
For example, when the client terminal 11 is a portable terminal such as a smartphone, keys and a touch panel provided in the client terminal 11 can be used as the operation unit, and the information specifying unit can specify the perceptual word data selected through the operation unit. The user can input each data class through the operation unit in any manner: the data may be entered directly, or one or more desired items may be selected from a plurality of candidates displayed on the display unit 27.
Fig. 3 shows an example of an input screen of the user identification data D1 displayed on the display unit 27 of the client terminal 11. In this example, a mobile terminal having a touch panel provided in the display unit 27 as a user operation unit (terminal input unit 20) is used as the client terminal 11.
In order to be authenticated as an authorized user, the user inputs the user ID and password set for each user into the client terminal 11. In the example shown in fig. 3, a user ID data input field 50, a password input field 51, and a software keyboard 52 are displayed on the display unit 27. The user ID data input field 50 is a field in which the user inputs the user ID, and the password input field 51 is a field in which the user inputs the password. Using the software keyboard 52, the user inputs the assigned user ID into the user ID data input field 50 and the password into the password input field 51.
The software keyboard 52 is composed of character pads displayed on the display unit 27 and the touch panel (display unit 27); when the user touches the portion of the touch panel corresponding to a character pad displayed on the display unit 27, the character or symbol corresponding to the touched position is input into the user ID data input field 50 or the password input field 51. The character pads displayed as the software keyboard 52 are not particularly limited, and the software keyboard 52 can display on the display unit 27 not only input characters such as hiragana, letters, numerals, and symbols, but also function keys such as a space key, an enter key, a delete key, and a display switch key.
For example, when the user touches the touch panel (display unit 27) at the position corresponding to the user ID data input field 50, the client terminal 11 shifts to the user ID input mode, and the user can input the user ID into the user ID data input field 50 using the software keyboard 52. Similarly, when the user touches the touch panel at the position corresponding to the password input field 51, the client terminal 11 shifts to the password input mode, and the user can input the password into the password input field 51 using the software keyboard 52. When the user touches the position corresponding to the enter key of the software keyboard 52 with the user identification data and the password entered, authentication processing is performed, and if the user identification data and the password are verified to be correct, the display on the display unit 27 transitions to the search processing screen.
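The authentication exchange described above is conventional; a minimal server-side sketch, under the assumption of a simple credential store, might look as follows (a real system would store salted password hashes rather than plain-text passwords).

```python
# Hypothetical sketch of the user authentication step.  The credential
# store is a stand-in; a real system would never keep plain passwords.

CREDENTIALS = {"user001": "s3cret"}  # user ID -> password (illustrative only)

def authenticate(user_id: str, password: str) -> bool:
    # On success the client transitions to the search screen (fig. 4);
    # on failure it stays on the login screen (fig. 3).
    return CREDENTIALS.get(user_id) == password

print(authenticate("user001", "s3cret"))  # True  -> show search screen
print(authenticate("user001", "wrong"))   # False -> show error
```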
Fig. 4 shows an example of the display unit 27 of the client terminal 11 that displays a plurality of perceptual words (candidates of search words).
The terminal input unit 20 (touch panel) of this example receives a command from the user, specifies at least one of the plurality of perceptual words displayed on the display unit 27 in accordance with the command, and designates perceptual word data D2 indicating the specified perceptual word as the "perceptual word data D2 input through the terminal input unit 20". More specifically, a plurality of perceptual words that are candidates for the search word are displayed on the display unit 27, and when the user touches the portion of the touch panel (display unit 27) corresponding to any one of the displayed perceptual words, the perceptual word corresponding to the touched position is input as the search word.
The display form of the plurality of perceptual words on the display unit 27 is not particularly limited: the perceptual words may be displayed in an orderly manner according to a predetermined rule, or at random. The display may be a so-called "tag cloud", and the display sizes of the perceptual words on the display unit 27 may differ from one another. In the example of fig. 4, the perceptual word "lovely" is displayed largest, the perceptual words "leisure" and "gorgeous" are displayed relatively large, the perceptual words "refined", "natural", and "elegant" are displayed somewhat smaller, and the perceptual word "formal" is displayed smallest.
The plurality of perceptual words displayed on the display unit 27 may be predetermined or may be changed for each search process. The perceptual word data displayed on the display unit 27 may be stored in the client terminal 11 in advance, or may be transmitted from the server system 10 to the client terminal 11 (display control unit 25 (see fig. 2)). In the latter case, the display control unit 25 displays the search word candidates (the plurality of perceptual words) on the display unit 27 in tag-cloud form based on the perceptual word data received, for each search process, from the server system 10 through the terminal external input/output unit 23 (terminal receiving unit 22). The perceptual word data transmitted from the server system 10 to the client terminal 11 is not particularly limited, and may be changed according to, for example, how frequently each search word is designated in search processing in the server system 10 and the attributes of the user.
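As one possible reading of the frequency-based variation mentioned above, a tag-cloud sizing rule could scale each candidate word by how often it has been specified. The rule and numbers below are assumptions; the text only says that display sizes may differ.

```python
# Hypothetical tag-cloud sizing rule: scale each candidate perceptual
# word by its designation frequency.  Frequencies and point sizes are
# invented for illustration.

def tag_cloud_sizes(frequencies: dict, min_pt: int = 12, max_pt: int = 36) -> dict:
    lo, hi = min(frequencies.values()), max(frequencies.values())
    span = (hi - lo) or 1  # avoid division by zero when all counts match
    return {
        word: round(min_pt + (count - lo) / span * (max_pt - min_pt))
        for word, count in frequencies.items()
    }

print(tag_cloud_sizes({"lovely": 90, "leisure": 60, "gorgeous": 55,
                       "refined": 30, "natural": 28, "elegant": 25,
                       "formal": 10}))
```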
The terminal transmission unit 21 (see fig. 2) transmits data classes such as user identification data D1 and perceptual word data D2 (search word) designated by the user through the terminal input unit 20 to the server system 10 via the network 12 (see fig. 1). The terminal receiving unit 22 receives data such as "search result data D5 (image data of a product)", display information data D6 ", and the like, which will be described later, transmitted from the server system 10 via the network 12. The terminal transmitting unit 21 and the terminal receiving unit 22 constitute a terminal external input/output unit 23, and the terminal transmitting unit 21 and the terminal receiving unit 22 may be constituted by the same equipment.
The display control unit 25 controls the overall display on the display unit 27, for example switching the display unit 27 between the screen for designating the user identification data D1 (user ID input mode and password input mode; see fig. 3) and the screen for designating the perceptual word data D2 (search word; see fig. 4). The display control unit 25 also displays the image data of the commodity received by the terminal receiving unit 22 on the display unit 27 based on the display information data received by the terminal receiving unit 22.
The terminal system controller 28 (see fig. 2) controls the terminal external input/output unit 23 (terminal transmitting unit 21 and terminal receiving unit 22), the terminal input unit 20, and the display control unit 25 to cause each unit to execute the above-described processing and other processing, and also controls units of the client terminal 11 that are not shown. For example, in the user authentication process (see fig. 3), the terminal system controller 28 controls the terminal input unit 20 and the terminal external input/output unit 23 so that the user identification data D1 input by the user through the terminal input unit 20 is transmitted to the server system 10 via the terminal transmitting unit 21 and the network 12. The terminal system controller 28 also controls the terminal input unit 20, the terminal receiving unit 22, and the display control unit 25 so as to receive the user authentication result transmitted from the server system 10, the search word candidates (the plurality of perceptual word data; see fig. 4), the search result data D5, the display information data D6, and the like, and to display the success or failure of the user authentication, the tag cloud of search words (see fig. 4), the retrieved product images 66 (see fig. 13), and the like on the display unit 27.
In this way, when the user performs a product search by operating the client terminal 11 having the above-described configuration, the user identification data D1 (user ID and password) and the perceptual word data D2 (search word) are input using the terminal input unit 20 (touch panel). The user identification data D1 and the perceptual word data D2 designated by the user through the terminal input unit 20 are passed from the terminal input unit 20 to the terminal transmitting unit 21 and transmitted from the terminal transmitting unit 21 to the server system 10 via the network 12.
Next, a functional configuration of the server system 10 will be described.
Fig. 5 is a block diagram showing an example of the functional configuration of the server system 10.
The server system 10 of the present example includes a server external input/output unit 33 (server transmission unit 31 and server reception unit 32), a user attribute information acquisition unit 35, a conversion table acquisition unit 37, a physical quantity acquisition unit 39, an image search unit 41, an image arrangement unit 43, an image analysis unit 45, a user information database 36, a conversion table database 38, an image database 42, and a server system controller 48.
The server receiving unit 32 receives data such as the user identification data D1 and the perceptual word data D2 (search word) transmitted from the client terminal 11 via the network 12, passes the user identification data D1 to the user attribute information acquisition unit 35, and passes the perceptual word data D2 to the physical quantity acquisition unit 39.
The user attribute information acquisition unit 35 accesses the user information database 36 to acquire the user attribute data D3 associated with the user identification data D1 received from the client terminal 11 via the network 12 and the server reception unit 32.
Fig. 6 is a conceptual diagram of a data structure showing the correspondence relationship between the user identification data D1 and the user attribute data D3 stored in the user information database 36. Fig. 7 is a conceptual diagram of a data structure showing an example of the structure data of the user attribute data D3.
The user information database 36 stores the user identification data D1 in association with the user attribute data D3. Since each piece of user identification data D1 is stored in the user information database 36 in association with the corresponding user attribute data D3, once the user identification data D1 is determined, the corresponding user attribute data D3 can be determined. The user attribute data D3 is not particularly limited as long as it indicates attributes of the user; for example, data based on at least one of the user's gender data D1, age data D2, race data D3, nationality data D4, religion data D5, and denomination data D6 may be used as the user attribute data D3.
The user attribute data D3 stored in the user information database 36 is specified in advance by the user, and the user attribute data D3 specified by the user is stored in the user information database 36 in association with the user identification data D1 (user ID) assigned to the user. For example, when the user first uses the search system 1, the user attribute data D3 may be input to the client terminal 11 (terminal input unit 20) together with the user identification data D1 such as a password. At this time, the input user identification data D1 and user attribute data D3 are transmitted from the client terminal 11 (terminal transmitting unit 21) to the server system 10 (server receiving unit 32), and are stored in the user information database 36 in association with each other under the control of the server system controller 48.
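The association of figs. 6 and 7 can be pictured as a simple keyed lookup. The field names and lookup function below are assumptions for illustration.

```python
# Hypothetical sketch of the user information database 36 (figs. 6, 7):
# user identification data D1 maps to user attribute data D3.
from dataclasses import dataclass

@dataclass
class UserAttributes:            # user attribute data D3
    gender: str
    age: int
    race: str = ""
    nationality: str = ""
    religion: str = ""
    denomination: str = ""

USER_INFO_DB = {                 # keyed by user ID (user identification data D1)
    "user001": UserAttributes(gender="female", age=24, nationality="JP"),
}

def get_user_attributes(user_id: str) -> UserAttributes:
    # Corresponds to the user attribute information acquisition unit 35.
    return USER_INFO_DB[user_id]

print(get_user_attributes("user001"))
```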
The conversion table acquisition unit 37 (see fig. 5) accesses the conversion table database 38 to acquire the conversion table T associated with the user attribute data D3 acquired by the user attribute information acquisition unit 35.
Fig. 8 is a conceptual diagram showing the data structure of the correspondence relationship between the user attribute data D3 stored in the conversion table database 38 and the conversion table T. Fig. 9 is a conceptual diagram of a data structure showing an example of the structure data of the conversion table T.
The conversion table T depends on the user attribute data D3, and the user attribute data D3 and the conversion tables T are stored in the conversion table database 38 in association with each other, so that once the user attribute data D3 is determined, the corresponding conversion table T can be determined. In each conversion table T stored in the conversion table database 38, the physical quantity of the commodity (physical quantity data D4) is associated with perceptual word data D2, so that once the perceptual word data D2 is determined, the corresponding physical quantity data D4 can be determined.
The "physical quantity of a product (physical quantity data D4)" is not particularly limited as long as it is data representing physical characteristics of a product, and for example, data based on at least one of a color of a product (for example, a representative color), a pattern of a product (for example, a representative pattern), a texture of a product, and a model of a product may be used as the physical quantity data D4.
The physical quantity acquiring unit 39 (see fig. 5) in this example refers to the conversion table T acquired by the conversion table acquiring unit 37 to acquire a physical quantity (physical quantity data D4) of a commodity associated with the perceptual word data D2 (search word) received from the client terminal 11 via the network 12 by the server receiving unit 32. As described above, since the conversion table T acquired by the conversion table acquisition unit 37 depends on the user attribute data D3, the physical quantity data D4 acquired by the physical quantity acquisition unit 39 also varies depending on the user attribute data D3.
Fig. 10 is a conceptual diagram showing the relationship between the perceptual word data D2 (perceptual space 80) and the physical quantity data D4 (physical measurement space 82) defined by the conversion table T.
The conversion table T defines a region in the physical measurement space 82 (hereinafter referred to as "physical quantity region 86") corresponding to a region in the perceptual space 80 (hereinafter referred to as "perceptual region 84"). That is, for each piece of perceptual word data D2 a perceptual region 84 is assumed to exist in the perceptual space 80, and a physical quantity region 86 (physical quantity data D4) corresponding to that perceptual region 84 exists in the physical measurement space 82. In the example shown in fig. 10, when certain perceptual word data D2 occupies the perceptual region 84a, a specific physical quantity region 86 (physical quantity data D4) in the physical measurement space 82 relating to the color feature amount, the pattern feature amount, and the texture feature amount corresponds to the perceptual region 84a (see the hatched portion in fig. 10).
The conversion table T defines "the correlation between the sensitive region 84 of the sensitive word data D2 shown in the sensitive space 80 and the physical quantity region 86 of the physical quantity data D4 shown in the physical measurement space 82", and converts the data in the sensitive space 80 into the data in the physical measurement space 82.
In the present embodiment, a conversion table T is prepared for each set of user attribute data D3; when the user attribute data D3 differ, conversion tables T specified by different criteria are used, so that the specific position of the perceptual region 84 in the perceptual space 80 differs even for the same perceptual word. That is, even for the same perceptual word, the physical quantities (color, pattern, texture, layout, and the like) of the commodity associated with that word vary among users having different attributes. For example, for the word "lovely", the physical quantities (color, pattern, texture, layout, and the like) of the commodities that a "60-year-old male" and a "10-year-old female" associate with the word are expected to differ greatly. Since the physical quantity acquisition unit 39 of the present embodiment uses the "conversion table T corresponding to the user attribute data D3" acquired by the conversion table acquisition unit 37, it can acquire physical quantity data D4 reflecting such differences in the "physical quantity of a commodity associated with a perceptual word" between users having different attributes.
The physical quantity data D4 acquired by the physical quantity acquisition unit 39 is not limited to data relating to a single type of physical quantity, and may relate to a combination of a plurality of types of physical quantities. For example, the physical quantity data D4 acquired by the physical quantity acquisition unit 39 may relate to the "color of the product", or to a combination of the "color of the product" and the "pattern of the product". When the physical quantity data D4 relates to a combination of a plurality of physical quantities, the physical quantity acquisition unit 39 may acquire the physical quantity data D4 after weighting each physical quantity.
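The attribute-dependent conversion described above, including the optional weighting, can be sketched as a table keyed by a user-attribute class. The (gender, age band) key and all table contents below are invented to mirror the "60-year-old male" versus "10-year-old female" example.

```python
# Hypothetical sketch of conversion tables T (figs. 8-10): one table per
# user-attribute class, each mapping a perceptual word to weighted target
# physical quantities.  All contents are invented for illustration.

CONVERSION_TABLES = {
    ("female", "10s"): {
        "lovely": {"color": ("pink", 0.6), "pattern": ("floral", 0.4)},
    },
    ("male", "60s"): {
        "lovely": {"color": ("beige", 0.7), "pattern": ("plain", 0.3)},
    },
}

def acquire_physical_quantity(attr_class, word):
    # Physical quantity acquisition unit 39: the same word yields a
    # different result depending on the table selected from user attributes.
    table = CONVERSION_TABLES[attr_class]
    return table.get(word, {})

print(acquire_physical_quantity(("female", "10s"), "lovely"))
print(acquire_physical_quantity(("male", "60s"), "lovely"))
```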
In the example shown in fig. 10, "a color feature amount defined by RGB (red, green, blue) data", "a pattern feature amount defined by pattern density and pattern size", and "a texture feature amount defined by glossiness and transparency" are used as references for specifying the physical quantity region 86 in the physical measurement space 82, but the present invention is not limited to these. For example, the physical quantity region 86 may also be defined by a layout feature amount defined by the overall width (thin to fat), the sleeve length (short to long), the garment length (short to long), the width and height of the neckline, the cross-sectional area (small to large) of the space defined by the neckline through which the user's head passes, the angle (small to large) of a V-neck, the curvature (small to large) of a U-neck, and the like. The reference for specifying the perceptual region 84 in the perceptual space 80 is likewise not particularly limited, and the perceptual region 84 may be specified by any sensibility axis such as "warm to cool" or "formal to casual".
The image database 42 (see fig. 5) stores the image data of the commodity in association with the physical quantity (physical quantity data) of the commodity, and the image searching unit 41 accesses the image database 42 to acquire the image data of the commodity in association with the "physical quantity (physical quantity data D4) of the commodity" acquired by the physical quantity acquiring unit 39.
Fig. 11 is a conceptual diagram of a data structure showing the correspondence relationship between the image data I and the metadata M stored in the image database 42. Fig. 12 is a conceptual diagram of a data structure showing an example of the structure data of the metadata M.
Metadata M indicating the physical quantity of the commodity is added to the image data I of the commodity stored in the image database 42, and the image data I of a plurality of commodities in the image database 42 is stored together with the metadata M.
The image data I of each commodity is acquired by photographing the corresponding commodity. Then, the physical quantity (physical quantity data unit 60) of the product included in the metadata M associated with the image data I of each product is acquired by analyzing the image data I of the corresponding product.
That is, the metadata M includes a characteristic data portion 62 indicating characteristics of the commodity, and the characteristic data portion 62 includes a physical quantity data portion 60 indicating physical quantities of the commodity as well as characteristic data indicating other characteristics of the commodity. The physical quantity data portion 60 includes, for example, color data M1, pattern data M2, texture data M3, and layout data M4 obtained by analyzing the image data I of the commodity. The color data M1 can be specified by RGB (red, green, blue) data, the pattern data M2 by pattern density and pattern size, and the texture data M3 by glossiness and transparency, for example. The layout data M4 can be specified by, for example, the overall width (thin to fat), the sleeve length (short to long), the garment length (short to long), the width and height of the neckline, the cross-sectional area (small to large) of the space defined by the neckline through which the user's head passes, the angle (small to large) of a V-neck, the curvature (small to large) of a U-neck, and the like. On the other hand, the characteristic data included in the characteristic data portion 62 other than the physical quantity data portion 60 is specified by methods other than analysis of the image data I of the commodity, and includes, for example, price data M5 and size data M6 of the commodity individually specified by the provider of the commodity image.
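The structure of the metadata M (figs. 11 and 12) can be sketched as nested records; the concrete field types below are assumptions.

```python
# Hypothetical sketch of the metadata M attached to each commodity
# image I (figs. 11, 12).  Field types are assumptions.
from dataclasses import dataclass

@dataclass
class PhysicalQuantityData:      # physical quantity data portion 60
    color: tuple                 # color data M1, e.g. RGB
    pattern: dict                # pattern data M2: density, size
    texture: dict                # texture data M3: glossiness, transparency
    layout: dict                 # layout data M4: sleeve length, neckline, ...

@dataclass
class Metadata:                  # metadata M (characteristic data portion 62)
    physical: PhysicalQuantityData
    price: int                   # price data M5 (set by the image provider)
    size: str                    # size data M6

m = Metadata(
    physical=PhysicalQuantityData(
        color=(220, 140, 160),
        pattern={"density": 0.4, "size": 0.2},
        texture={"gloss": 0.1, "transparency": 0.0},
        layout={"sleeve": "long", "neckline": "U"},
    ),
    price=5900,
    size="M",
)
print(m.price, m.physical.color)
```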
In this example, the image analysis unit 45 (see fig. 5) generates the physical quantity data portion 60. That is, the image analysis unit 45 receives as input the image information data D7 consisting of "metadata M that does not yet include the physical quantity data portion 60" and "image data I obtained by photographing a commodity". The image analysis unit 45 adds the physical quantity data portion 60, obtained by analyzing the image data I included in the image information data D7, to the metadata M, and outputs the image information data D7, now consisting of "metadata M containing the physical quantity data portion 60" and the "image data I", to the image database 42 for storage. In this manner, the image analysis unit 45 analyzes the image data I of the commodity to acquire the physical quantities of the commodity (physical quantity data portion 60), and stores the acquired physical quantities in the image database 42 in association with the image data I of the commodity.
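As one concrete possibility for this analysis step, the representative color (color data M1) could be derived by averaging pixel values. The patent leaves the analysis algorithm open, so this mean-RGB sketch is purely an assumption.

```python
# Hypothetical image-analysis step: derive a representative color for
# color data M1 by averaging RGB pixel values.  Mean RGB is only one
# simple possibility; the patent does not specify the method.
from PIL import Image
import numpy as np

def representative_color(path: str) -> tuple:
    img = np.asarray(Image.open(path).convert("RGB"), dtype=np.float64)
    r, g, b = img.reshape(-1, 3).mean(axis=0)
    return (round(r), round(g), round(b))

# Example (assuming a local commodity photograph exists at this path):
# print(representative_color("dress_001.jpg"))
```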
The image search unit 41 accesses the image database 42 to acquire image data of the commodity associated with the physical quantity (physical quantity data D4) of the commodity acquired by the physical quantity acquisition unit 39. That is, the image search unit 41 acquires the image data I of the product to which the metadata M indicating the physical quantity (physical quantity data D4) of the product acquired by the physical quantity acquisition unit 39 is added as the image data I of the product associated with the physical quantity of the product acquired by the physical quantity acquisition unit 39.
For example, when the physical quantity data D4 relates to the "color of the commodity", the image search unit 41 acquires the image data I to which metadata M indicating the color associated with the perceptual word data D2 input to the physical quantity acquisition unit 39 is added (for example, image data I whose color data M1 indicates "red"). When the physical quantity data D4 relates to a combination of a plurality of types of physical quantities, the image search unit 41 acquires the image data I to which metadata M indicating those physical quantities is added. For example, when the physical quantity data D4 relates to the "color of the commodity" and the "pattern of the commodity", the image search unit 41 acquires the image data I to which metadata M indicating the color and pattern associated with the perceptual word data D2 input to the physical quantity acquisition unit 39 is added. When the physical quantity data D4 relates to a combination of a plurality of types of physical quantities that are weighted by type, the image search unit 41 acquires the image data I to which metadata M indicating those physical quantities is added, in accordance with the weighting.
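The weighted matching described above can be sketched as a scoring function over image metadata; the distance measure, weights, and sample data below are assumptions.

```python
# Hypothetical sketch of weighted image search (image search unit 41):
# score each image's metadata against the target physical quantities,
# weighting each physical quantity type.  All numbers are invented.

def color_distance(c1, c2):
    # Euclidean RGB distance normalized to 0..1 (441.7 ~ sqrt(3 * 255**2))
    return sum((a - b) ** 2 for a, b in zip(c1, c2)) ** 0.5 / 441.7

def score(meta, target, weights):
    s = 0.0
    if "color" in target:
        s += weights["color"] * (1 - color_distance(meta["color"], target["color"]))
    if "pattern" in target:
        s += weights["pattern"] * (1.0 if meta["pattern"] == target["pattern"] else 0.0)
    return s

images = [
    {"file": "a.jpg", "color": (230, 150, 170), "pattern": "floral"},
    {"file": "b.jpg", "color": (40, 40, 60), "pattern": "plain"},
]
target = {"color": (220, 140, 160), "pattern": "floral"}
weights = {"color": 0.6, "pattern": 0.4}

ranked = sorted(images, key=lambda m: score(m, target, weights), reverse=True)
print([m["file"] for m in ranked])  # best match first
```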
The image arrangement unit 43 determines the display form of the commodity image data I retrieved by the image search unit 41, and generates display information data D6 indicating the determined display form. Specifically, the image arrangement unit 43 determines, as the display form of the image data I acquired by the image search unit 41, a form in which the image data I of the commodities is displayed according to the characteristic data of the commodities, and generates display information data D6 indicating that display form.
The display information data D6 generated by the image arranging unit 43 indicates a display form in which at least a part of the image data I of the product acquired by the image searching unit 41 is displayed on the coordinate system indicating the characteristic data of the product. The "product characteristic data" here is determined according to a characteristic different from the perceptual word data (search word) specified by the user via the client terminal 11 among the characteristics of the product. For example, the "characteristic data of the product" may be determined from at least one of the price of the product and the size of the product (the price data M5 and the size data M6 in fig. 12).
Fig. 13 shows examples of display forms of the commodity image data I acquired by the image search unit 41. In part (a) of fig. 13, the horizontal axis represents the size (small to large) of the commodity and the vertical axis represents the price range of the commodity. In part (b), the horizontal axis represents the model (thin to fat) of the commodity and the vertical axis the price range. In part (c), the horizontal axis represents the color tone (dark to light) of the commodity and the vertical axis the price range. In part (d), the horizontal axis represents the pattern density (sparse to dense) of the commodity and the vertical axis the color tone (dark to light). A plurality of commodity images 66 (image data I) (three horizontally by three vertically in fig. 13, nine in total) are arranged in the coordinate system 64 shown in each of parts (a) to (d) of fig. 13.
In this manner, the image arrangement unit 43 (see fig. 5) generates display information data D6 indicating a display form (see parts (a) to (d) of fig. 13) in which the product image data I (product images 66) acquired through the search by the image search unit 41 and included in the search result data D5 is displayed on the coordinate system 64. Although parts (a) to (d) of fig. 13 show examples in which the plurality of product images 66 are displayed on the two-dimensional coordinate system 64, the present invention is not limited to these examples, and the display information data D6 may represent a display form in which the plurality of product images 66 are displayed one-dimensionally, three-dimensionally, or in other dimensions.
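As a purely illustrative sketch of how the display information data D6 could place each product image on such a coordinate system, the following Python function normalizes two items of characteristic data into (x, y) positions; the keys "price" and "size", and all other names, are assumptions, not the disclosed format of D6.

def arrange_images(products, x_key="size", y_key="price"):
    """Return a hypothetical D6: a normalized (x, y) position per product image."""
    xs = [p[x_key] for p in products]
    ys = [p[y_key] for p in products]

    def normalize(value, low, high):
        # Map a characteristic value onto the [0, 1] axis of coordinate system 64.
        return 0.5 if high == low else (value - low) / (high - low)

    return [
        {
            "image_id": p["image_id"],
            "x": normalize(p[x_key], min(xs), max(xs)),
            "y": normalize(p[y_key], min(ys), max(ys)),
        }
        for p in products
    ]

The same structure extends naturally to the one-dimensional or three-or-more-dimensional display forms mentioned above.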
The server transmission unit 31 transmits the search result data D5 (including the product image data I) acquired by the image search unit 41 and the display information data D6 generated by the image arrangement unit 43 to the client terminal 11 via the network 12. The search result data D5 and the display information data D6 transmitted by the server transmission unit 31 are received by the terminal reception unit 22 (see fig. 2) of the client terminal 11. The display control unit 25 then specifies, based on the received display information data D6, the display form of the product image data I (product images 66) obtained through the search by the image search unit 41, and displays the product image data I (product images 66) on the display unit 27 in the specified display form (see parts (a) to (d) of fig. 13).
The server system controller 48 controls the server external input/output unit 33 (the server transmission unit 31 and the server reception unit 32), the user attribute information acquisition unit 35, the conversion table acquisition unit 37, the physical quantity acquisition unit 39, the image search unit 41, the image arrangement unit 43, and the image analysis unit 45, causing each unit to execute the above-described processing and other processing, and also controls units (not shown) of the server system 10. For example, when performing the user authentication process (see fig. 3), the server system controller 48 receives the user identification data D1 transmitted from the client terminal 11 via the server reception unit 32, accesses an ID code database (not shown), determines whether the received user identification data D1 is correct, and transmits the determination result to the client terminal 11 via the server transmission unit 31. When the display unit 27 of the client terminal 11 displays a tag cloud of search words, the server system controller 48 accesses a search word database (not shown), selects a plurality of items of perceptual word data used for the tag cloud display (see fig. 4) based on various information such as the user attribute data D3, and transmits the selected perceptual word data to the client terminal 11 via the server transmission unit 31.
The specific form in which the functional configuration of the server system 10 shown in fig. 5 is realized is not particularly limited; all of the functional configurations shown in fig. 5 may be realized by a single server, or they may be distributed over a plurality of servers.
Fig. 14 is a block diagram showing an example of the functional configuration of the server system 10 in a case in which the server system is implemented by a plurality of servers.
The server system 10 of this example includes a web server 15, a database server 16, and an image analysis server 17. The web server 15 includes the server system controller 48, the server external input/output unit 33 (server transmission unit 31 and server reception unit 32), the user attribute information acquisition unit 35, the conversion table acquisition unit 37, the physical quantity acquisition unit 39, the image search unit 41, and the image arrangement unit 43 described above. The database server 16 includes the user information database 36, the conversion table database 38, and the image database 42, together with a database controller 70 that controls these databases. The image analysis server 17 includes the image analysis unit 45 described above, together with an image analysis controller 72 that controls it.
The database server 16 and the image analysis server 17 each also have a data input/output unit similar to the server external input/output unit 33 (the server transmission unit 31 and the server reception unit 32) of the web server 15, but these units are not illustrated.
Each unit of the web server 15 is controlled by the server system controller 48, each unit of the database server 16 is controlled by the database controller 70, and each unit of the image analysis server 17 is controlled by the image analysis controller 72.
For example, when acquiring the user attribute data D3 from the user identification data D1, the user attribute information acquisition section 35 accesses the user information database 36 via the server external input/output section 33 and the database controller 70, thereby acquiring the user attribute data D3 associated with the user identification data D1. Similarly, the conversion table acquisition section 37 accesses the conversion table database 38 via the server external input/output section 33 and the database controller 70, and the image search section 41 accesses the image database 42 via the server external input/output section 33 and the database controller 70.
Likewise, the image analysis unit 45 stores the image information data D7 resulting from image analysis in the image database 42 via the image analysis controller 72 and the database controller 70.
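The division of labor in fig. 14 can be pictured with the following Python sketch, in which the units of the web server 15 and the image analysis server 17 reach the databases only through the database controller 70; the class, its method names, and the storage shapes are assumptions made for illustration, not the disclosed interfaces.

class DatabaseController:
    """Stand-in for the database controller 70 on the database server 16."""

    def __init__(self, user_db, table_db, image_db):
        self.user_db = user_db    # user information database 36: D1 -> D3
        self.table_db = table_db  # conversion table database 38: D3 -> T
        self.image_db = image_db  # image database 42: image data I with metadata M

    def user_attributes(self, user_id):
        # Called by the user attribute information acquisition section 35.
        return self.user_db[user_id]

    def conversion_table(self, attributes):
        # Called by the conversion table acquisition section 37.
        return self.table_db[attributes]

    def images_matching(self, physical):
        # Called by the image search section 41.
        return [img for img in self.image_db
                if all(img["metadata"].get(k) == v for k, v in physical.items())]

    def store_image_info(self, image_id, info):
        # Called via the image analysis controller 72 to persist D7.
        for img in self.image_db:
            if img["image_id"] == image_id:
                img["metadata"].update(info)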
The server system 10 configured by a plurality of servers is not limited to the example shown in fig. 14. For example, any two servers of the web server 15, the database server 16, and the image analysis server 17 shown in fig. 14 may be implemented by a single server. Further, a part of the databases (the user information database 36, the conversion table database 38, and the image database 42) included in the database server 16 shown in fig. 14 may be provided in the web server 15.
The server system 10 and the client terminal 11, which are connected via the network 12, may be installed in the same country or in different countries, and when the server system 10 is implemented by a plurality of servers, some or all of those servers may be installed in different countries.
Fig. 15 is a conceptual diagram illustrating an example of the search system 1 in which the server system 10 and the client terminal 11 exist in one country (country A). Fig. 16 is a conceptual diagram illustrating an example of the search system 1 in which the server system 10 and the client terminal 11 exist across a plurality of countries (countries A to D).
The server system 10 shown in fig. 15 and 16 includes a web server 15, a database server 16, an image analysis server 17, and a mail server 18. The web server 15, the database server 16, and the image analysis server 17 can have the same configuration as the server system 10 shown in fig. 14, for example, and the mail server 18 is a server that transmits and receives electronic mails to and from the client terminal 11.
In the example shown in fig. 16, the web server 15 is provided in country A, the database server 16 in country B, the mail server 18 in country C, and the image analysis server 17 in country D. Further, the client terminals 11 may exist in a plurality of countries; as shown in fig. 16, client terminals 11 exist in countries A and B.
Next, the flow of the search process will be described.
Fig. 17 is a flowchart of the search processing and the search result display processing. In fig. 17, "S (S13 to S17)" indicates a process mainly performed by the server system 10, and "C (S11 to S12 and S18 to S19)" indicates a process mainly performed by the client terminal 11.
In the search system 1 of the present embodiment, when a product search is performed, the user specifies, through the terminal input unit 20, the user identification data D1 and the perceptual word data D2 designated as the search word (S11 in fig. 17). The user identification data D1 and the perceptual word data D2 specified through the terminal input unit 20 are transmitted to the server system 10 via the terminal transmission unit 21 and the network 12 (S12).
The user identification data D1 and the perceptual word data D2 transmitted from the client terminal 11 are received by the server reception unit 32 (S13); the user identification data D1 is sent to the user attribute information acquisition unit 35, and the perceptual word data D2 is sent to the physical quantity acquisition unit 39. The user attribute information acquisition unit 35 acquires the user attribute data D3 from the user identification data D1, the conversion table acquisition unit 37 acquires the corresponding conversion table T from the user attribute data D3, and the acquired conversion table T is sent to the physical quantity acquisition unit 39. The physical quantity acquisition unit 39 then sends the physical quantity data D4, acquired with reference to the conversion table T, to the image search unit 41 (S14).
Thereafter, the image search unit 41 searches the image data I stored in the image database 42 and acquires the product image data (search result data D5) associated with the physical quantity data D4 (S15), and the image arrangement unit 43 generates the display information data D6 indicating the display form of the acquired product image data (S16). The acquired search result data D5 (product image data) and the display information data D6 are transmitted to the client terminal 11 via the server transmission unit 31 (S17).
The search result data D5 (product image data) and the display information data D6 transmitted from the server system 10 are received by the terminal reception unit 22 (S18), and the display control unit 25 displays the product image data acquired through the search on the display unit 27 in the display form indicated by the display information data D6 (S19).
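Condensed into a runnable toy, steps S11 to S19 form one request/response cycle. In the following Python sketch every dictionary and value is invented for exposition; it shows only the order of processing, not the actual implementation.

USER_ATTRIBUTES = {"user-1": ("female", "20s")}                    # D1 -> D3
CONVERSION_TABLES = {                                              # D3 -> T
    ("female", "20s"): {"cute": {"color": "pink", "pattern": "floral"}},
}
IMAGE_DB = [                                                       # image database 42
    {"image_id": "img-1", "metadata": {"color": "pink", "pattern": "floral"}},
    {"image_id": "img-2", "metadata": {"color": "black", "pattern": "plain"}},
]

def search(user_id, perceptual_word):
    attributes = USER_ATTRIBUTES[user_id]              # S13: D1 -> D3
    table = CONVERSION_TABLES[attributes]              # S14: D3 -> T
    physical = table[perceptual_word]                  # S14: D2 -> D4
    hits = [img for img in IMAGE_DB                    # S15: D4 -> D5
            if all(img["metadata"].get(k) == v for k, v in physical.items())]
    layout = [{"image_id": h["image_id"], "slot": i}   # S16: D6
              for i, h in enumerate(hits)]
    return hits, layout                                # S17: sent to the terminal

# S11-S12 would supply (user_id, perceptual_word); S18-S19 would receive
# (hits, layout) and display each image in the indicated form.
print(search("user-1", "cute"))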
As described above, according to the search system 1 of the present embodiment, a plurality of conversion tables T corresponding to the attributes of the user are prepared and stored, and the conversion tables T actually used for the product search are switched according to the attributes of the user. In this way, the attribute of the user (user attribute data D3) is acquired for each user, and the conversion table T is switched according to the attribute, whereby an appropriate product search can be performed in accordance with the attribute.
Further, while the perceptual feature value (perceptual word data) is not directly related to the product, the physical quantity is, and the perceptual word is related to the product via the perceptual space 80 and the physical measurement space 82 (fig. 10). Therefore, even if a search is performed using perceptual words whose meanings vary with user attributes, time, and the like, a relationship between perceptual words and products appropriate to those attributes, that time, and the like can be established by adjusting the correlation between the region of the perceptual space 80 (perceptual word data) and the region of the physical measurement space 82 (physical quantity data). By flexibly changing the relationship between physical measurement values and perceptual words in this way and using the optimum conversion table T, an accurate product search can be performed.
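To make this concrete, the conversion table T can be thought of as a mapping keyed first by user attributes and then by perceptual word, whose values are regions of the physical measurement space 82. The following Python sketch invents the attribute keys, words, and ranges purely for exposition; in practice the tables would be prepared as described above.

CONVERSION_TABLES = {
    # Different user attribute data D3 map the same perceptual word to
    # different regions of the physical measurement space 82 (all values assumed).
    ("female", "teens"): {"cute": {"saturation": (0.7, 1.0), "pattern_scale": (0.0, 0.4)}},
    ("female", "40s"):   {"cute": {"saturation": (0.4, 0.7), "pattern_scale": (0.2, 0.6)}},
}

def physical_region(attributes, perceptual_word):
    """Resolve perceptual word data D2 into physical quantity data D4."""
    table = CONVERSION_TABLES[attributes]  # switched by user attribute data D3
    return table[perceptual_word]

def in_region(measurements, region):
    """True if an image's measured quantities fall inside the resolved region."""
    return all(low <= measurements[k] <= high for k, (low, high) in region.items())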
Furthermore, the image data of the products selected by the search is displayed on the coordinate system 64 based on characteristic amounts (for example, price and size) different from the perceptual word designated by the user, so the user can intuitively check the search results in terms of those characteristic amounts and quickly reach a purchase decision. In this way, the search system 1 according to the present embodiment can prompt the user toward a purchase decision by combining a simple search using intuitive perceptual words with a display of search result images that can be grasped intuitively.
As described above, according to the present embodiment, a plurality of product images matching the preference of the user can be specified easily and accurately by using perceptual words, and the product images can be displayed in a form that allows the user to intuitively understand the relationships among them.
The functional configurations described above can be realized by arbitrary hardware, software, or a combination of the two. For example, the present invention can be applied to a program that causes a computer to execute the processing in each unit of the server system 10 and the client terminal 11 and the control method of the processing as a whole, to a computer-readable storage medium (non-transitory recording medium) on which such a program is recorded, or to a computer on which such a program can be installed.
The client terminal 11 of the present invention is not particularly limited, and examples thereof include a mobile phone, a smartphone, a PDA (Personal Digital Assistant), and a portable game machine. An example of a smartphone to which the present invention is applicable will be described below.
< Structure of smartphone >
Fig. 18 is a diagram showing the external appearance of the smartphone 101. The smartphone 101 shown in fig. 18 has a flat-plate-shaped housing 102 and includes, on one surface of the housing 102, a display input unit 120 in which a display panel 121 serving as a display unit and an operation panel 122 serving as an input unit are integrated. The housing 102 also includes a speaker 131, a microphone 132, an operation unit 140, and a camera unit 141. The configuration of the housing 102 is not limited to this; for example, the display unit and the input unit may be independent of each other, or a folding structure or a structure having a slide mechanism may be employed.
Fig. 19 is a block diagram showing the configuration of the smartphone 101 shown in fig. 18. As shown in fig. 19, the smartphone includes, as main components, a wireless communication unit 110, a display input unit 120, a communication unit 130, an operation unit 140, a camera unit 141, a storage unit 150, an external input/output unit 160, a GPS (Global Positioning System) receiving unit 170, a motion sensor unit 180, a power supply unit 190, and a main control unit 100. The smartphone 101 also has, as a main function, a wireless communication function of performing mobile wireless communication via a base station apparatus and a mobile communication network.
The wireless communication unit 110 performs wireless communication with a base station apparatus accommodated in the mobile communication network in accordance with a command from the main control unit 100. Using this wireless communication, various file data such as voice data and image data, e-mail data, and the like are transmitted and received, and Web data, streaming data, and the like are received.
The display input unit 120 is a so-called touch panel that includes a display panel 121 and an operation panel 122; under the control of the main control unit 100, it displays images (still images and moving images), character information, and the like to visually convey information to the user, and it detects user operations on the displayed information.
The display panel 121 uses an LCD (Liquid Crystal Display), an OELD (Organic Electro-Luminescence Display), or the like as a display device. The operation panel 122 is a device that is placed such that an image displayed on the display surface of the display panel 121 is visible, and that detects the coordinates of an operation made with a user's finger or a stylus. When the user operates the device with a finger or a stylus, a detection signal generated by the operation is output to the main control unit 100. The main control unit 100 then detects the operation position (coordinates) on the display panel 121 based on the received detection signal.
As shown in fig. 18, in the smartphone 101 illustrated as one embodiment of the imaging apparatus of the present invention, the display panel 121 and the operation panel 122 are integrally configured as the display input unit 120, with the operation panel 122 disposed so as to completely cover the display panel 121. With this configuration, the operation panel 122 may also have a function of detecting user operations in a region outside the display panel 121. In other words, the operation panel 122 may include a detection region (hereinafter referred to as a display region) for the portion overlapping the display panel 121 and a detection region (hereinafter referred to as a non-display region) for the outer edge portion not overlapping the display panel 121.
The size of the display region may completely match the size of the display panel 121, but the two do not necessarily need to match. The operation panel 122 may include two sensing regions: an outer edge portion and an inner portion other than the outer edge portion. The width of the outer edge portion is appropriately designed according to the size of the housing 102 and the like. As the position detection method employed in the operation panel 122, a matrix switch method, a resistive film method, a surface acoustic wave method, an infrared method, an electromagnetic induction method, a capacitance method, or the like can be used, and any of these methods may be employed.
The communication unit 130 includes a speaker 131 and a microphone 132, converts the user's voice input through the microphone 132 into voice data that can be processed by the main control unit 100 and outputs the voice data to the main control unit 100, and decodes the voice data received through the wireless communication unit 110 or the external input/output unit 160 and outputs the voice data from the speaker 131. As shown in fig. 18, for example, a speaker 131 may be mounted on the same surface as the surface on which the display input unit 120 is provided, and a microphone 132 may be mounted on a side surface of the housing 102.
The operation unit 140 receives a command from a user as a hardware key using a key switch or the like. For example, as shown in fig. 18, the operation unit 140 is a push-button switch that is mounted on a side surface of the housing 102 of the smartphone 101, and is turned on when pressed by a finger or the like, and turned off by the restoring force of a spring or the like when released.
The storage unit 150 stores the control program and control data of the main control unit 100, application software, address data associating names and telephone numbers of communication partners, data of transmitted and received e-mails, Web data downloaded by Web browsing, and downloaded content data, and also temporarily stores stream data and the like. The storage unit 150 is composed of an internal storage unit 151 built into the smartphone and an external storage unit 152 having a detachable external memory slot. Each of the internal storage unit 151 and the external storage unit 152 is realized by a storage medium such as a flash memory type, hard disk type, multimedia card micro type, or card type memory (for example, a microSD (registered trademark) memory), a RAM (Random Access Memory), or a ROM (Read Only Memory).
The external input/output unit 160 functions as an interface to all external devices connected to the smartphone 101, and is used for direct or indirect connection to other external devices by communication (for example, Universal Serial Bus (USB), IEEE 1394, or the like) or a network (for example, the Internet, wireless LAN, Bluetooth (registered trademark), RFID (Radio Frequency Identification), infrared communication (IrDA (registered trademark)), UWB (Ultra Wideband) (registered trademark), ZigBee (registered trademark), or the like).
Examples of the external devices connected to the smartphone 101 include a wired/wireless headset, a wired/wireless external charger, a wired/wireless data terminal, a memory card or a SIM (Subscriber Identity Module)/UIM (User Identity Module) card connected via a card socket, an external audio/video device connected via an audio/video I/O (Input/Output) terminal, a wirelessly connected external audio/video device, a wired/wirelessly connected smartphone, a wired/wirelessly connected personal computer, a wired/wirelessly connected PDA, and earphones. The external input/output unit can transfer data received from such an external device to each constituent element inside the smartphone 101 and can transmit data inside the smartphone 101 to the external device.
The GPS receiving unit 170 receives GPS signals transmitted from GPS satellites ST1 to STn in accordance with a command from the main control unit 100, executes positioning calculation processing based on the received GPS signals, and detects a position comprising the latitude, longitude, and altitude of the smartphone 101. When position information can be acquired from the wireless communication unit 110 or the external input/output unit 160 (for example, a wireless LAN), the GPS receiving unit 170 can also detect the position using that position information.
The motion sensor unit 180 includes, for example, a three-axis acceleration sensor, and detects the physical movement of the smartphone 101 in accordance with a command from the main control unit 100. By detecting the physical movement of the smartphone 101, the moving direction and acceleration of the smartphone 101 are detected. The detection result is output to the main control unit 100.
The power supply unit 190 supplies power stored in a battery (not shown) to each unit of the smartphone 101 in accordance with a command from the main control unit 100.
The main control unit 100 includes a microprocessor, operates according to the control program and the control data stored in the storage unit 150, and controls the respective units of the smartphone 101 in a unified manner. The main control unit 100 has a mobile communication control function and an application processing function for controlling each unit of the communication system for performing voice communication and data communication by the wireless communication unit 110.
The application processing function is realized by the main control unit 100 operating in accordance with application software stored in the storage unit 150. Examples of the application processing function include an infrared communication function of controlling the external input/output unit 160 to perform data communication with a counterpart device, an e-mail function of transmitting and receiving e-mails, and a Web browsing function of browsing Web pages.
The main control unit 100 also has an image processing function of displaying video on the display input unit 120 and the like based on image data (still image or moving image data) such as received data or downloaded stream data. The image processing function refers to a function in which the main control unit 100 decodes the image data, performs image processing on the decoded result, and displays the resulting image on the display input unit 120.
Further, the main control section 100 performs display control for the display panel 121, and operation detection control for detecting a user operation through the operation section 140 and the operation panel 122.
Through execution of the display control, the main control unit 100 displays software keys such as icons for starting application software and scroll bars, or displays a window for creating an e-mail. The scroll bar is a software key for receiving a command to move the displayed portion of an image, such as a large image that cannot fit entirely within the display area of the display panel 121.
Through execution of the operation detection control, the main control unit 100 detects user operations through the operation unit 140, receives operations on the icons and inputs of character strings into the input fields of the window through the operation panel 122, and receives requests to scroll the displayed image through the scroll bar.
By executing the operation detection control, the main control unit 100 has a touch panel control function of determining whether the operation position on the operation panel 122 is in an overlapping portion (display area) overlapping the display panel 121 or in an outer edge portion (non-display area) not overlapping the display panel 121 other than the overlapping portion, and controlling the sensing area of the operation panel 122 and the display position of the software keys.
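The determination described here amounts to a point-in-rectangle test. The following toy Python sketch assumes the panel geometry purely for illustration:

DISPLAY_RECT = (0, 0, 1080, 1920)  # display panel 121: (x, y, width, height), assumed
PANEL_RECT = (0, 0, 1080, 2000)    # operation panel 122 extends further, assumed

def classify_touch(x, y):
    """Classify an operation position as display area or non-display area."""
    dx, dy, dw, dh = DISPLAY_RECT
    if dx <= x < dx + dw and dy <= y < dy + dh:
        return "display area"      # overlapping portion
    px, py, pw, ph = PANEL_RECT
    if px <= x < px + pw and py <= y < py + ph:
        return "non-display area"  # outer edge portion
    return "outside panel"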
The main control unit 100 can also detect a gesture operation on the operation panel 122 and execute a preset function in accordance with the detected gesture operation. The gesture operation is not a conventional simple touch operation, but an operation of drawing a trajectory with a finger or the like, designating a plurality of positions simultaneously, or, by combining these, drawing a trajectory from at least one of a plurality of positions.
The camera unit 141 is a digital camera that performs electronic imaging using an imaging element such as a CMOS. The camera section 141 can convert image data obtained by shooting into compressed image data such as JPEG under the control of the main control section 100, store the compressed image data in the storage section 150, and output the compressed image data through the external input/output section 160 and the wireless communication section 110. As shown in fig. 18, in the smartphone 101, the camera unit 141 is mounted on the same surface as the display input unit 120, but the mounting position of the camera unit 141 is not limited to this, and may be mounted on the back surface of the display input unit 120, or may be mounted with a plurality of camera units 141. When a plurality of camera units 141 are mounted, shooting can be performed individually by switching the camera units 141 for shooting, or shooting can be performed by using a plurality of camera units 141 at the same time.
The camera unit 141 can be used for various functions of the smartphone 101. For example, an image acquired by the camera unit 141 can be displayed on the display panel 121, and an image of the camera unit 141 can be used as one of the operation inputs of the operation panel 122. When the GPS receiving unit 170 detects a position, it can also do so by referring to an image from the camera unit 141. Further, by referring to an image from the camera unit 141, the optical axis direction of the camera unit 141 of the smartphone 101 and the current usage environment can be determined either without using the three-axis acceleration sensor or in combination with it. Of course, images from the camera unit 141 can also be used within application software.
Further, the position information acquired by the GPS receiving unit 170, the voice information acquired by the microphone 132 (which may be converted into text information by voice-text conversion by the main control unit or the like), the posture information acquired by the motion sensor unit 180, and the like may be added to the image data of the still picture or the moving picture, and the information may be recorded in the storage unit 150 and output through the external input/output unit 160 and the wireless communication unit 110.
For example, the terminal system controller 28 and the display control unit 25 shown in fig. 2 can be realized by the main control unit 100 shown in fig. 19, and the terminal external input/output unit 23 (fig. 2) can be realized by the wireless communication unit 110 and/or the external input/output unit 160 (fig. 19). The terminal input unit 20 (fig. 2) can be realized by the operation panel 122 (display input unit 120) (fig. 19), and the display unit 27 (fig. 2) can be realized by the display panel 121 (display input unit 120) (fig. 19).
Description of the symbols
1-search system, 10-server system, 11-client terminal, 12-network, 15-web server, 16-database server, 17-image analysis server, 18-mail server, 20-terminal input section, 21-terminal transmission section, 22-terminal reception section, 23-terminal external input/output section, 25-display control section, 27-display section, 28-terminal system controller, 31-server transmission section, 32-server reception section, 33-server external input/output section, 35-user attribute information acquisition section, 36-user information database, 37-conversion table acquisition section, 38-conversion table database, 39-physical quantity acquisition section, 41-image search section, 42-image database, 43-image arrangement section, 45-image analysis section, 48-server system controller, 50-user ID data input field, 51-password input field, 52-soft keyboard, 60-physical quantity data, 62-characteristic data, 64-coordinate system, 66-commodity image, 70-database controller, 72-image analysis controller, 80-perceptual space, 82-physical measurement space, 84-perceptual area, 86-physical quantity area, 100-main control section, 101-smartphone, 102-housing, 110-wireless communication section, 120-display input section, 121-display panel, 122-operation panel, 130-communication section, 131-speaker, 132-microphone, 140-operation section, 141-camera section, 150-storage section, 151-internal storage section, 152-external storage section, 160-external input/output section, 170-GPS receiving section, 180-motion sensor section, 190-power supply section, T-conversion table, I-image data, M-metadata, D1-user identification data, D2-perceptual word data, D3-user attribute data, D4-physical quantity data, D5-search result data, D6-display information data, D7-image information data.

Claims (13)

1. A search system comprising a client terminal and a server system connected to the client terminal via a network,
the server system has:
a server receiving unit that receives data transmitted from the client terminal via the network;
a physical quantity acquisition unit that acquires a physical quantity of a commodity associated with perceptual word data received by the server reception unit from the client terminal via the network;
an image search unit that acquires image data of a commodity associated with the physical quantity of the commodity acquired by the physical quantity acquisition unit;
an image arrangement unit that specifies a display form of the image data of the product acquired by the image search unit and generates display information data indicating the specified display form;
a server transmission unit that transmits the image data of the product acquired by the image search unit and the display information data generated by the image arrangement unit to the client terminal via the network;
a user information database that stores user identification data and user attribute data in association with each other;
a user attribute information acquisition section that accesses the user information database to acquire the user attribute data associated with user identification data received from the client terminal via the network;
a conversion table database for storing a plurality of conversion tables which are determined according to the user attribute data and which associate the physical quantity of the commodity with the perceptual word data;
a conversion table acquisition unit that accesses the conversion table database to acquire a conversion table associated with the user attribute data acquired by the user attribute information acquisition unit; and
an image database for storing the image data of the commodity in association with the physical quantity of the commodity,
the client terminal has:
a display unit;
a terminal input unit that receives a user command to specify the perceptual word data;
a terminal transmitting unit that transmits the perceptual word data specified by the terminal input unit to the server system via the network;
a terminal receiving unit configured to receive the image data of the product and the display information data transmitted from the server system via the network; and
a display control unit for displaying the image data of the commodity received by the terminal receiving unit on the display unit based on the display information data received by the terminal receiving unit,
the terminal input unit receives a user command to specify user identification data and the perceptual word data,
the terminal transmitting unit transmits the user identification data and the perceptual word data specified by the terminal input unit to the server system via the network,
the physical quantity acquiring unit acquires the physical quantity of the commodity associated with the perceptual word data received by the server receiving unit with reference to the conversion table corresponding to the user attribute data acquired by the conversion table acquiring unit, the physical quantity of the commodity acquired by the physical quantity acquiring unit varying in accordance with the user attribute data, and
the image search unit accesses the image database to acquire image data of the commodity associated with the physical quantity of the commodity acquired by the physical quantity acquisition unit.
2. The search system of claim 1,
the image arrangement unit determines a display mode of the image data of the product acquired by the image search unit to a display mode in which the image data of the product is displayed based on the characteristic data of the product,
the display information data indicates a display mode in which the image data of the product acquired by the image search unit is displayed based on the characteristic data of the product.
3. The search system of claim 2,
the display information data indicates a display mode in which at least a part of the image data of the product acquired by the image search unit is displayed on a coordinate system indicating the characteristic data of the product.
4. The search system according to claim 2 or 3,
the characteristic data of the product is determined according to a characteristic different from the perceptual word data designated by the terminal input unit among the characteristics of the product.
5. The search system according to claim 2 or 3,
the characteristic data of the commodity is determined based on at least one of the price of the commodity and the size of the commodity.
6. The search system according to any one of claims 1 to 3,
the physical quantity of the product is determined according to at least one of the color of the product, the pattern of the product, the texture of the product, and the model of the product.
7. The search system according to any one of claims 1 to 3,
the user attribute data is determined according to at least any one of the gender, age, race, nationality, religion, and occupation of the user.
8. The search system according to any one of claims 1 to 3,
the display control section displays a plurality of perceptual words on the display section,
the terminal input unit receives a command from a user, specifies at least one of the plurality of perceptual words displayed on the display unit, and specifies the specified perceptual word as the perceptual word data.
9. The search system according to any one of claims 1 to 3,
the image data of the commodity is acquired by photographing the commodity.
10. The search system according to any one of claims 1 to 3,
metadata indicating a physical quantity of the product is added to the image data of the product,
the image search unit acquires image data of a commodity to which metadata indicating a physical quantity of the commodity acquired by the physical quantity acquisition unit is added, as image data of the commodity associated with the physical quantity of the commodity acquired by the physical quantity acquisition unit.
11. The search system according to any one of claims 1 to 3,
the physical quantity of the commodity associated with the image data of the commodity is acquired by analyzing the image data of the commodity.
12. The search system according to any one of claims 1 to 3,
the server system further includes an image analysis unit that acquires a physical quantity of the commodity by analyzing the image data of the commodity, associates the acquired physical quantity of the commodity with the image data of the commodity, and stores the associated physical quantity of the commodity in the image database.
13. A method for controlling a search system including a client terminal and a server system connected to the client terminal via a network,
in the server system,
receiving, by a server receiving section, data transmitted from the client terminal via the network,
acquiring, by a physical quantity acquiring unit, a physical quantity of a commodity associated with perceptual word data received by the server receiving unit from the client terminal via the network,
acquiring, by an image search unit, image data of a commodity associated with the physical quantity of the commodity acquired by the physical quantity acquisition unit,
an image arrangement unit for specifying a display mode of the image data of the product acquired by the image search unit and generating display information data representing the specified display mode,
transmitting, by a server transmitting section, the image data of the commodity acquired by the image searching section and the display information data generated by the image arranging section to the client terminal via the network,
storing, in a user information database, the user identification data and the user attribute data in association with each other,
accessing, by a user attribute information acquisition section, the user information database to acquire the user attribute data associated with user identification data received from the client terminal via the network,
storing, in a conversion table database, a plurality of conversion tables which are determined according to the user attribute data and which associate the physical quantity of the commodity with the perceptual word data,
accessing, by a conversion table acquiring section, the conversion table database to acquire a conversion table associated with the user attribute data acquired by the user attribute information acquiring section,
storing, in an image database, the image data of the commodity in association with the physical quantity of the commodity,
in the client terminal,
transmitting, by a terminal transmitting section, the perceptual word data specified by the terminal input section receiving a user's command to the server system via the network,
receiving, by a terminal receiving unit, the image data of the commodity and the display information data transmitted from the server system via the network,
displaying, by a display control unit, the image data of the commodity received by the terminal receiving unit on a display unit based on the display information data received by the terminal receiving unit,
in the control method of the search system,
receiving a user command through the terminal input unit to specify user identification data and the perceptual word data,
transmitting, by the terminal transmission unit, the user identification data and the perceptual word data specified by the terminal input unit to the server system via the network,
acquiring, by the physical quantity acquiring unit, a physical quantity of the commodity associated with the perceptual word data received by the server receiving unit with reference to the conversion table corresponding to the user attribute data acquired by the conversion table acquiring unit, the physical quantity of the commodity acquired by the physical quantity acquiring unit varying in accordance with the user attribute data,
the image search unit accesses the image database to acquire image data of the commodity associated with the physical quantity of the commodity acquired by the physical quantity acquisition unit.
CN201580010270.3A 2014-02-28 2015-01-19 Search system and control method of search system Active CN106030578B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2014-038524 2014-02-28
JP2014038524A JP6114706B2 (en) 2014-02-28 2014-02-28 Search system and search system control method
PCT/JP2015/051244 WO2015129333A1 (en) 2014-02-28 2015-01-19 Search system, server system, and method for controlling search system and server system

Publications (2)

Publication Number Publication Date
CN106030578A CN106030578A (en) 2016-10-12
CN106030578B true CN106030578B (en) 2019-12-27

Family

ID=54008662

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201580010270.3A Active CN106030578B (en) 2014-02-28 2015-01-19 Search system and control method of search system

Country Status (5)

Country Link
US (1) US10664887B2 (en)
EP (1) EP3113047A4 (en)
JP (1) JP6114706B2 (en)
CN (1) CN106030578B (en)
WO (1) WO2015129333A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10795641B2 (en) * 2016-08-16 2020-10-06 Sony Corporation Information processing device and information processing method
JP7047255B2 (en) * 2017-03-24 2022-04-05 富士フイルムビジネスイノベーション株式会社 Display devices, display systems and programs
JP2019200729A (en) * 2018-05-18 2019-11-21 シャープ株式会社 Electronic apparatus, control unit, control method, and program
US11586408B2 (en) 2019-01-29 2023-02-21 Dell Products L.P. System and method for aligning hinged screens
JP7469050B2 (en) 2020-01-14 2024-04-16 東芝テック株式会社 Program and information processing system
CN111767738A (en) * 2020-03-30 2020-10-13 北京沃东天骏信息技术有限公司 Label checking method, device, equipment and storage medium
CN113297475A (en) * 2021-03-26 2021-08-24 阿里巴巴新加坡控股有限公司 Commodity object information searching method and device and electronic equipment

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1914611A (en) * 2004-01-29 2007-02-14 泽塔普利株式会社 Information search system, information search method, information search device, information search program, image recognition device, image recognition method, image recognition program, and sales sy
CN101441651A (en) * 2007-11-20 2009-05-27 富士胶片株式会社 Product search system, product search method, and product search program

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11120180A (en) * 1997-10-17 1999-04-30 Sharp Corp Data retrieval device and storage medium recording data retrieval program
JP2002157268A (en) 2000-11-22 2002-05-31 Fujitsu Ltd Method and system for announcing article information
JP2003256461A (en) * 2002-03-04 2003-09-12 Fuji Photo Film Co Ltd Method and device for retrieving image, and program
US7660822B1 (en) * 2004-03-31 2010-02-09 Google Inc. Systems and methods for sorting and displaying search results in multiple dimensions
WO2006036781A2 (en) * 2004-09-22 2006-04-06 Perfect Market Technologies, Inc. Search engine using user intent
US8566712B1 (en) * 2006-01-04 2013-10-22 Google Inc. Image management
US20090112910A1 (en) * 2007-10-31 2009-04-30 Motorola, Inc. Method and apparatus for personalization of an application
US8538943B1 (en) * 2008-07-24 2013-09-17 Google Inc. Providing images of named resources in response to a search query
EP4145371A1 (en) * 2008-08-08 2023-03-08 Nikon Corporation Search supporting system, search supporting method and search supporting program
GB2465378A (en) * 2008-11-14 2010-05-19 Want2Bthere Ltd Image based search system and method
JP5401962B2 (en) * 2008-12-15 2014-01-29 ソニー株式会社 Image processing apparatus, image processing method, and image processing program
JP2011065499A (en) 2009-09-18 2011-03-31 Seiko Epson Corp Image search device and image search method
US8600824B2 (en) * 2010-04-28 2013-12-03 Verizon Patent And Licensing Inc. Image-based product marketing systems and methods
JP2012108721A (en) * 2010-11-17 2012-06-07 Viva Computer Co Ltd Image search device and image search method
US9075825B2 (en) * 2011-09-26 2015-07-07 The University Of Kansas System and methods of integrating visual features with textual features for image searching
JP5506104B2 (en) * 2011-09-30 2014-05-28 楽天株式会社 Information processing apparatus, information processing method, and information processing program

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1914611A (en) * 2004-01-29 2007-02-14 泽塔普利株式会社 Information search system, information search method, information search device, information search program, image recognition device, image recognition method, image recognition program, and sales sy
CN100465957C (en) * 2004-01-29 2009-03-04 泽塔普利株式会社 Information search system, information search method, information search device, information search program, image recognition device, image recognition method, image recognition program, and sales sy
CN101441651A (en) * 2007-11-20 2009-05-27 富士胶片株式会社 Product search system, product search method, and product search program

Also Published As

Publication number Publication date
US10664887B2 (en) 2020-05-26
CN106030578A (en) 2016-10-12
WO2015129333A1 (en) 2015-09-03
EP3113047A1 (en) 2017-01-04
US20160321730A1 (en) 2016-11-03
JP6114706B2 (en) 2017-04-12
JP2015162195A (en) 2015-09-07
EP3113047A4 (en) 2017-02-15

Similar Documents

Publication Publication Date Title
CN106030578B (en) Search system and control method of search system
US10776854B2 (en) Merchandise recommendation device, merchandise recommendation method, and program
US11460983B2 (en) Method of processing content and electronic device thereof
US11227326B2 (en) Augmented reality recommendations
US9870633B2 (en) Automated highlighting of identified text
US10891674B2 (en) Search apparatus, search system and search method
KR20170090370A (en) Analyzing method for review information mobile terminal, system and mobile terminal thereof
JP6120467B1 (en) Server device, terminal device, information processing method, and program
US10817923B2 (en) Information providing system, information providing apparatus, information providing method, and program
US20230091214A1 (en) Augmented reality items based on scan
CN111476154A (en) Expression package generation method, device, equipment and computer readable storage medium
CN113869063A (en) Data recommendation method and device, electronic equipment and storage medium
JP2017228278A (en) Server device, terminal device, information processing method, and program
CN111291555B (en) Commodity specification identification method, commodity specification identification device and computer readable storage medium
JP6321204B2 (en) Product search device and product search method
JP6414982B2 (en) Product image display control device, product image display control method, and program
JP6317714B2 (en) Print reception apparatus, print reception server, and print reception method
JP2019036045A (en) Control method, server and control program
KR20150033002A (en) Image providing system and image providing mehtod of the same

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant