US20180157682A1 - Image information processing system - Google Patents
- Publication number
- US20180157682A1 (application US15/575,092; US201615575092A)
- Authority
- US
- United States
- Prior art keywords
- matching
- search
- image data
- image
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/51—Indexing; Data structures therefor; Storage structures
- G06F16/5838—Retrieval using metadata automatically derived from the content, using colour
- G06F16/5854—Retrieval using metadata automatically derived from the content, using shape and object relationship
- G06F16/904—Browsing; Visualisation therefor
- G06F18/22—Matching criteria, e.g. proximity measures
- G06T1/00—General purpose image data processing
- G06F17/30265, G06F17/30994, G06K9/6202, G06K9/6215, G06K9/78 (legacy classification codes)
Definitions
- the present invention relates to an image information processing system that searches for names and descriptions of captured objects on image data and provides the names and descriptions to users.
- An image search method and an apparatus as disclosed in Patent Literature 1 enable obtainment of information regarding an unknown target from a captured image of that target.
- the image search method compares a binarized image of a captured image of an object and a binarized image of each item stored in an image database to search for candidates of the name of the object from the image database.
- Patent Literature 1 JP 10-254901 A
- The number of images stored in the image database for comparison with the captured image is limited. Therefore, when the captured image and a stored image are compared, search results differ substantially, even for the same object, owing to differences in capturing conditions such as the season and the time of day at which the image was captured. To absorb these differences, the degree of similarity calculated from the comparison must be treated with greater ambiguity so that a search result can still be obtained, which reduces search precision.
- the present invention provides an image information processing system.
- An image information processing system of the present invention, made to solve the above problem, includes: an image database configured to store image data for matching of a captured object for matching in association with identification information including a name of the captured object for matching; an image acquiring unit configured to acquire image data; a search target input receiving unit configured to receive selection of a search object from the image data acquired by the image acquiring unit; an image search processing unit configured to calculate the degree of similarity between the search object received by the search target input receiving unit and the captured object for matching stored in the image database, and search for image data for matching of a captured object for matching having a degree of similarity that exceeds a reference value; a display processing unit configured to display a search result including the searched image data for matching and the identification information on a display unit of an information communication terminal used by a user; and a search result processing unit configured to receive a selection input from the user to the search result displayed on the display unit of the information communication terminal, and store the image data acquired by the image acquiring unit, as image data for matching, in association with the identification information of the search result of which the selection input from the user has been received.
- This configuration enables a search for the name of the captured object and the like with high precision, using an image captured by the user, even if the user does not know the name of the captured object from the captured image, for example.
- the search result processing unit stores image data of a captured search object having the degree of similarity that exceeds a predetermined value set to be higher than the reference value, as image data for matching, in association with the identification information of the search result of which the selection input from the user has been received.
- the image data stored in the image database as the image data for matching is limited to image data with a high degree of similarity. Therefore, precision of the image search can be further improved.
- The image database stores attribute information including type information, capturing position information, and capturing time information corresponding to the image data for matching of a captured object for matching, in association with the identification information.
- the search target input receiving unit receives an input of the attribute information corresponding to the search object
- the image search processing unit calculates the degree of coincidence between the attribute information corresponding to the search object of which the input has been received by the search target input receiving unit and the attribute information corresponding to the image data for matching, extracts image data for matching of a captured object for matching having the degree of coincidence that exceeds an attribute reference value, calculates the degree of similarity between the search object received by the search target input receiving unit and the captured object for matching of the extracted image data for matching, and searches for the image data for matching of a captured object for matching having the degree of similarity that exceeds a reference value.
- the image information processing system of the present invention can search for the name and the like of the captured object with high precision by using the captured image, even if the user does not know the name and the like of the captured object from the image captured by the user.
- FIG. 1 is a conceptual diagram schematically illustrating an example of a system configuration of an image information processing system according to the present embodiment.
- FIG. 2 is a conceptual diagram schematically illustrating an example of a hardware configuration of the image information processing system according to the present embodiment.
- FIG. 3 is a diagram illustrating an example of items of a record of image information for matching of the present embodiment.
- FIG. 4 is a diagram illustrating an example of items of a record of image data for matching of the present embodiment.
- FIG. 5 is a diagram illustrating an example of a state in which a search object of the present embodiment is selected.
- FIG. 6 is a diagram illustrating an example of a state in which a search result of the present embodiment is displayed.
- FIG. 7 is a diagram illustrating another example of a state in which a search result of the present embodiment is displayed.
- FIG. 8 is a diagram illustrating an example of a screen prompting registration of image data of the present embodiment as image data for matching.
- FIG. 9 is an explanatory diagram of processing of registering image data of the present embodiment as image data for matching.
- FIG. 10 is a flowchart illustrating a processing process of the image information processing system of a first example.
- FIG. 11 is a flowchart illustrating a processing process of the image information processing system of a second example.
- FIG. 12 is a flowchart illustrating a processing process of the image information processing system of a third example.
- FIG. 13 is a diagram illustrating another example of a state in which a search result of the present embodiment is displayed.
- An image information processing system 1 of the present embodiment is realized using a computer including a server and the like.
- The computer includes an arithmetic device 30 such as a CPU that executes arithmetic processing of a program, a storage device 31 such as a RAM or a hard disk that stores information, a display device 32 such as a display that performs display, an input device 33 such as a mouse, a keyboard, or a touch panel used to perform an input, and a communication device 34 that transmits/receives the processing results of the arithmetic device 30 and the information stored in the storage device 31 to/from external devices.
- The functions of the means of the present invention are merely logically distinguished, and may physically or virtually occupy the same area. Further, in the image information processing system 1 of the present invention, these functions may be realized by one server, or each function may be distributed to two or more servers. In addition, a part or all of the functions of the present invention may be performed by an information communication terminal 20 such as a smartphone equipped with a camera function used by the user. In particular, the information communication terminal 20 may perform processing of an image acquiring unit 12 , a search target input receiving unit 13 , a display processing unit 15 , and the like, which will be described below. With this configuration, the processing load of a management server 10 can be reduced.
- the management server 10 having the functions of the image information processing system 1 can transmit/receive information to/from the information communication terminal 20 used by an administrator or a user through a network such as the Internet.
- Examples of the information communication terminal 20 include a smartphone equipped with a camera function, a personal computer, and portable communication terminals such as a mobile phone, a PHS, and a tablet computer.
- the image information processing system 1 includes an image database 11 , the image acquiring unit 12 , the search target input receiving unit 13 , an image search processing unit 14 , the display processing unit 15 , and a search result processing unit 16 .
- the image database 11 stores image data for matching 100 of the captured object for matching in association with identification information 100 a including a name of the captured object for matching.
- The image data for matching 100 includes, as illustrated in FIG. 4 , image files of natural scenery such as flowers, birds, animals and plants such as insects, buildings such as towers, and mountains, and image IDs attached to the image files.
- the image database 11 may store attribute information 100 b including type information, capturing position information, and capturing time information corresponding to the image data for matching 100 of the captured object for matching in association with the identification information 100 a.
- the image acquiring unit 12 acquires image data 2 from the information communication terminal 20 used by the user.
- the file format of the image data 2 is not particularly limited.
- the image data 2 may be image data 2 captured by the user with the camera.
- the search target input receiving unit 13 receives selection of a search object 2 a from the image data 2 acquired by the image acquiring unit 12 .
- Examples of means for receiving the selection of the search object 2 a include: means for displaying the image data 2 on a display unit 21 , made of a liquid crystal display or the like, of the information communication terminal 20 and prompting the user to tap the search object 2 a from among the captured objects in the image data 2 ; means for moving a selection window 3 , as illustrated in FIG. 5 , to surround the search object 2 a ; and means for reading the user's gaze on the search object 2 a with a so-called in-camera provided on the display unit 21 side of the information communication terminal 20 .
- the search target input receiving unit 13 cuts out, enlarges, rotates, and moves a portion of the received search object 2 a to convert the image data 2 into a composition with which the degree of similarity with the image data for matching 100 stored in the image database 11 can be easily calculated.
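The composition conversion described above (cutting out the selected object and rescaling it to a fixed composition) can be sketched as follows. This is an illustrative outline in Python/NumPy, not the patent's actual implementation; the function name, the fixed 64x64 output size, and the nearest-neighbour resampling are all assumptions.

```python
import numpy as np

def normalize_search_object(image, box, out_size=(64, 64)):
    """Cut out the selected search object and rescale it to a fixed
    composition, so that similarity with the stored matching images is
    easier to calculate (illustrative sketch only)."""
    top, left, bottom, right = box
    crop = image[top:bottom, left:right]              # cut out the object
    rows = np.linspace(0, crop.shape[0] - 1, out_size[0]).astype(int)
    cols = np.linspace(0, crop.shape[1] - 1, out_size[1]).astype(int)
    return crop[np.ix_(rows, cols)]                   # nearest-neighbour resize

# Example: a 100x100 greyscale image with the object in a 40x20 region,
# selected with a slightly larger window.
img = np.zeros((100, 100), dtype=np.uint8)
img[30:70, 50:70] = 255
patch = normalize_search_object(img, (20, 40, 80, 80))
print(patch.shape)  # (64, 64)
```

A real system would also rotate the crop and pad it to preserve aspect ratio, but the essential point is that every query is brought into the same composition as the stored matching images before comparison.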
- the search target input receiving unit 13 may receive an input of attribute information corresponding to the search object 2 a .
- the attribute information corresponding to the search object 2 a include type information, capturing position information, and capturing time information.
- Examples of the type information include “flower”, “bird”, “insect”, “building”, and “natural scenery”.
- the capturing position information is information of a captured place, and may be acquired from a GPS device provided in the information communication terminal 20 .
- the capturing time information includes not only time information but also date information.
- the image search processing unit 14 calculates the degree of similarity between the search object 2 a received by the search target input receiving unit 13 and the captured object for matching of the image data for matching 100 stored in the image database 11 , and searches for the image data for matching 100 of a captured object for matching having the degree of similarity that exceeds a reference value.
- the degree of similarity is calculated using a contour shape, a binarized image, color distribution, or the like, which is extracted by a known image processing technology.
- the degree of similarity is calculated to be 100% when the search object 2 a and the captured object for matching are perfectly matched.
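As one concrete, hypothetical instance of such a similarity measure, a colour-distribution comparison can be implemented as a histogram intersection scaled so that a perfect match yields 100%. The patent also mentions contour shapes and binarized images; this sketch covers only the colour-distribution case.

```python
import numpy as np

def similarity_percent(a, b, bins=16):
    """Histogram-intersection similarity between two greyscale images,
    scaled so that a perfect match is calculated as 100%."""
    ha, _ = np.histogram(a, bins=bins, range=(0, 256))
    hb, _ = np.histogram(b, bins=bins, range=(0, 256))
    ha = ha / ha.sum()                 # normalize to probability distributions
    hb = hb / hb.sum()
    return 100.0 * np.minimum(ha, hb).sum()

x = np.random.default_rng(0).integers(0, 256, size=(32, 32))
print(similarity_percent(x, x))  # 100.0: the images are perfectly matched
```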
- The image search processing unit 14 calculates the degree of coincidence between the attribute information corresponding to the search object 2 a of which the input has been received by the search target input receiving unit 13 and the attribute information 100 b corresponding to the image data for matching 100 , extracts the image data for matching 100 of a captured object for matching having a degree of coincidence that exceeds an attribute reference value, calculates the degree of similarity between the search object 2 a received by the search target input receiving unit 13 and the captured object for matching of the extracted image data for matching 100 , and searches for the image data for matching 100 of a captured object for matching having a degree of similarity that exceeds a reference value.
- For the degree of coincidence, it is favorable to set an attribute reference value for each piece of attribute information; for example, the attribute reference value for the type information is favorably a perfect match.
- the display processing unit 15 displays a search result 110 including the searched image data for matching 100 and the identification information 100 a on the display unit 21 of the information communication terminal 20 .
- the display processing unit 15 can arrange and display the search result 110 in descending order according to the degree of similarity. Further, the display processing unit 15 changes the display size of the search result 110 according to the degree of similarity as illustrated in FIG. 6 , so that the user can easily visually recognize the search result 110 with a high degree of similarity.
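A minimal sketch of this ordering and size scaling might look as follows; the linear size-scaling rule is an assumption for illustration, not taken from the patent.

```python
def order_results(results, base_size=100):
    """Sort search results in descending order of similarity and attach a
    display size proportional to the similarity (hypothetical rule)."""
    ranked = sorted(results, key=lambda r: r["similarity"], reverse=True)
    for r in ranked:
        r["display_size"] = int(base_size * r["similarity"] / 100)
    return ranked

results = [
    {"name": "tulip", "similarity": 85},
    {"name": "rose", "similarity": 95},
    {"name": "poppy", "similarity": 90},
]
ranked = order_results(results)
print([r["name"] for r in ranked])  # ['rose', 'poppy', 'tulip']
```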
- The display processing unit 15 displays, as illustrated in FIG. 7 , a screen 110 a composed of the captured object for matching and the corresponding identification information 100 a on the display unit 21 of the information communication terminal 20 . Further, when receiving an input to a registration button 111 displayed on the screen 110 a from the user, the display processing unit 15 displays, on the display unit 21 of the information communication terminal 20 , a screen that prompts a selection input as to whether to register the image data 2 , as illustrated in FIG. 8 .
- The search result processing unit 16 receives a selection input from the user on the screen, displayed on the display unit 21 of the information communication terminal 20 , that prompts a selection input as to whether to register the image data 2 , and stores the image data 2 acquired by the image acquiring unit 12 in the image database 11 as image data for matching 100 , in association with the identification information 100 a of the search result 110 of which the selection input from the user has been received.
- The search result processing unit 16 assigns to the image data 2 an image ID for distinguishing the image data for matching 100 , and stores the image ID in the image database 11 in association with an identification ID of the identification information 100 a.
- The search result processing unit 16 stores the image data 2 in the image database 11 as image data for matching 100 , in association with the identification information 100 a of the search result 110 of which the selection input from the user has been received, limiting registration to image data 2 of a captured search object 2 a whose degree of similarity exceeds a predetermined value set to be higher than the reference value.
- the image acquiring unit 12 acquires the image data 2 of “flower” captured by the information communication terminal 20 used by the user (S 102 ).
- The image data 2 need not be image data captured by the information communication terminal 20 used by the user, and may be image data 2 acquired through the Internet or the like.
- the search target input receiving unit 13 receives selection of the search object 2 a from the image data 2 displayed on the display unit 21 of the information communication terminal 20 (S 103 ). Then, the search target input receiving unit 13 cuts out, enlarges, rotates, and moves a portion of the received search object 2 a to convert the image data 2 into a composition with which the degree of similarity with the image data for matching 100 stored in the image database 11 can be easily calculated.
- a method of receiving the selection of the search object 2 a is not particularly limited. For example, the selection of the search object 2 a is received by tapping the search object 2 a or by moving the selection window 3 to surround the search object 2 a as illustrated in FIG. 5 .
- the image search processing unit 14 calculates the degree of similarity between the search object 2 a received by the search target input receiving unit 13 and the captured object for matching stored in the image database 11 (S 104 ).
- the degree of similarity is calculated using a contour shape, a binarized image, color distribution, or the like, which is extracted by a known image processing technology.
- the degree of similarity calculated in this way can be displayed as a numerical value, and is calculated to be 100% when the search object 2 a and the captured object for matching are perfectly matched.
- the image search processing unit 14 searches for the image data for matching 100 of a captured object for matching having the degree of similarity that exceeds a reference value (S 105 ).
- the reference value can be appropriately changed. For example, 80% may be set as the reference value.
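The search step S 105 then reduces to filtering by the reference value; a minimal sketch, using the 80% example value given above, could be:

```python
def search_matching_images(similarities, reference_value=80):
    """Return the image IDs whose degree of similarity exceeds the
    reference value (80% is the example value given above)."""
    return [image_id for image_id, s in similarities.items()
            if s > reference_value]

sims = {"img-001": 92.0, "img-002": 80.0, "img-003": 64.5}
print(search_matching_images(sims))  # ['img-001']: exactly 80.0 does not exceed 80
```

Note the strict comparison: the description says the similarity must *exceed* the reference value, so a result at exactly the reference value is not returned.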
- the display processing unit 15 displays the search result 110 including the image data for matching 100 and the identification information 100 a on the display unit 21 of the information communication terminal 20 (S 106 ).
- The search results 110 are displayed in descending order, from the search result 110 having the highest degree of similarity calculated by the image search processing unit 14 to the one having the lowest. Further, the display processing unit 15 may display the search result 110 having the highest degree of similarity in the largest display size, as illustrated in FIG. 6 .
- the display processing unit 15 can receive a selection input to the search result 110 from the user, and display the screen 110 a configured from the captured object for matching and the identification information 100 a for the captured object for matching illustrated in FIG. 7 on the display unit 21 of the information communication terminal 20 .
- When receiving an input to the registration button 111 illustrated in FIG. 7 , the display processing unit 15 receives the user's selection input to the search result 110 displayed on the display unit 21 of the information communication terminal 20 , and displays a screen that prompts a selection input as to whether to register the image data 2 , as illustrated in FIG. 8 .
- The search result processing unit 16 stores the image data 2 acquired by the image acquiring unit 12 in the image database 11 as image data for matching 100 , in association with the identification information 100 a of the search result 110 of which the selection input from the user has been received (S 108 ).
- The search result processing unit 16 assigns to the image data 2 an image ID for distinguishing the image data for matching 100 , and stores the image ID in the image database 11 in association with an identification ID of the identification information 100 a.
- the image data 2 is sequentially accumulated as the image data for matching 100 , and the accuracy of the calculated degree of similarity is improved. Therefore, the search precision can be improved and a more accurate search result 110 can be provided.
- In some cases, the image data 2 may not be appropriate for storage as image data for matching 100 . Therefore, in the second example, a case in which only image data 2 that satisfies a predetermined condition is stored in the image database 11 as image data for matching 100 will be described.
- An example of a processing process of the second example will be described with reference to the flowchart shown in FIG. 11 . The same reference numeral is given to a process similar to that in the first example, and a detailed description thereof is omitted.
- the search result processing unit 16 determines whether the degree of similarity exceeds a predetermined value set to be higher than a reference value (S 201 ).
- When the search result processing unit 16 determines that the degree of similarity exceeds the predetermined value set to be higher than the reference value, it stores the image data 2 of the search object 2 a in the image database 11 as image data for matching 100 , in association with the identification information 100 a of the received search result 110 (S 108 ).
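The decision at S 201 followed by the registration at S 108 can be sketched as below. The concrete 80/90 threshold pair and the image-ID naming scheme are hypothetical; the patent only requires that the predetermined value be set higher than the reference value.

```python
def maybe_register(image_db, image_data, identification_id, similarity,
                   reference_value=80, predetermined_value=90):
    """Store the user's image data as new matching data only when its
    similarity exceeds a predetermined value set higher than the search
    reference value; returns the new image ID, or None if not stored."""
    assert predetermined_value > reference_value
    if similarity > predetermined_value:              # decision S 201
        image_id = "img-%04d" % (len(image_db) + 1)   # hypothetical ID scheme
        image_db[image_id] = {"data": image_data,     # registration S 108
                              "identification_id": identification_id}
        return image_id
    return None

db = {}
print(maybe_register(db, b"...", "flower-tulip", 93.5))  # stored
print(maybe_register(db, b"...", "flower-rose", 85.0))   # None: below the predetermined value
```

Only high-similarity user images are accumulated this way, which is what keeps the growing database from degrading search precision.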
- The degrees of similarity between the search object 2 a and all the captured objects for matching stored in the image database 11 are calculated, and the search results 110 whose calculated degrees of similarity exceed the reference value are displayed, as results having high degrees of similarity, on the display unit 21 of the information communication terminal 20 used by the user.
- A delay in the similarity calculation by the image search processing unit 14 and the load on the management server 10 that performs the calculation both increase. Therefore, in the present third example, a case of limiting the number of image data for matching 100 for which the degree of similarity is to be calculated will be described with reference to the flowchart shown in FIG. 12 .
- the same reference numeral is given to a process similar to that in the first and second examples, and a detailed description thereof is omitted.
- the search target input receiving unit 13 receives not only an input of the search object 2 a but also an input of the attribute information corresponding to the search object 2 a (S 301 ).
- the search object 2 a of the present example is “flower”, and thus the search target input receiving unit 13 receives selection of the type information of “flower”.
- the search target input receiving unit 13 may receive inputs of the capturing position information and the capturing time information of the search object 2 a .
- the capturing position information and the capturing time information are respectively acquired from a GPS device and a date device provided in the information communication terminal 20 .
- the image search processing unit 14 calculates the degree of coincidence between the attribute information corresponding to the search object 2 a received by the search target input receiving unit 13 and the attribute information 100 b corresponding to the captured object for matching stored in the image database 11 (S 302 ), and extracts the captured object for matching having the degree of coincidence that exceeds the attribute reference value (S 303 ).
- the degree of coincidence is calculated to be 100% when the attribute information of the search object 2 a and the attribute information 100 b of the captured object for matching are perfectly matched.
- the image search processing unit 14 calculates the degree of similarity between the search object 2 a and the extracted captured object for matching (S 104 ), and searches for the image data for matching 100 of the captured object for matching having the degree of similarity that exceeds the reference value (S 105 ).
- the degree of similarity is calculated limiting the search range to the captured object for matching in which the type information in the attribute information 100 b is set to “flower”. Therefore, the search range is limited, and the processing speed is improved.
- the search range is limited to the captured object for matching in which the capturing time information of the attribute information 100 b is set to “day time”, so that the degree of similarity can be calculated between the objects in similar conditions. Therefore, the search precision can be improved.
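The two-stage narrowing of this third example (attribute coincidence first, similarity second) might be sketched as follows. The attribute set, the coincidence formula, and the thresholds are assumptions for illustration; `query_sim` stands in for the similarity calculation of S 104.

```python
def two_stage_search(query_attrs, query_sim, db,
                     attribute_reference=99.9, reference_value=80):
    """First narrow the candidates by attribute coincidence (S 302/S 303),
    then compute similarity only for the survivors (S 104/S 105)."""
    candidates = []
    for image_id, rec in db.items():
        matched = sum(query_attrs[k] == rec["attrs"].get(k)
                      for k in query_attrs)
        coincidence = 100.0 * matched / len(query_attrs)
        if coincidence > attribute_reference:     # effectively a perfect match
            candidates.append(image_id)
    return [i for i in candidates if query_sim(db[i]) > reference_value]

db = {
    "img-1": {"attrs": {"type": "flower", "time": "daytime"}},
    "img-2": {"attrs": {"type": "bird", "time": "daytime"}},
}
hits = two_stage_search({"type": "flower", "time": "daytime"},
                        lambda rec: 90.0, db)
print(hits)  # ['img-1']: 'img-2' fails the type coincidence check
```

Because the cheap attribute comparison runs first, the expensive image-similarity calculation is performed only on the reduced candidate set, which is the speed-up this example is after.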
- The invention disclosed in the present specification is not limited to the configurations of the inventions and embodiments described above. Within an applicable scope, it also includes: a specified configuration in which a partial configuration of the aforementioned configuration is changed to another configuration disclosed in the present specification; a specified configuration in which another configuration disclosed in the present specification is added to the aforementioned configuration; and a specified configuration having a more generic concept, in which the partial configuration is deleted to the extent that the function and effect are obtained. It also includes the modifications described below.
- The display processing unit 15 may display the identification information 100 a associated with the image data 2 of the search object 2 a by the search result processing unit 16 near the search object 2 a in the image data 2 .
- the search target input receiving unit 13 cuts out a portion of the search object 2 a
- the search target input receiving unit 13 acquires coordinate position information of the image data 2 of the search object 2 a .
- the display processing unit 15 then shifts and displays the identification information 100 a associated with the image data 2 of the search object 2 a by the search result processing unit 16 to a position not overlapping with the coordinate position information.
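One hypothetical placement rule implementing this non-overlapping display could be: put the label above the object's bounding box when there is room, otherwise below it. The coordinate convention (row, column with the origin at the top left) and the margin value are assumptions.

```python
def place_label(object_box, label_size, margin=4):
    """Return a (row, col) position for the identification label that does
    not overlap the search object's bounding box."""
    top, left, bottom, right = object_box
    label_w, label_h = label_size
    if top - label_h - margin >= 0:
        return (top - label_h - margin, left)   # above the object
    return (bottom + margin, left)              # below the object

# Object near the top edge of the screen: the label is pushed below it.
print(place_label((2, 10, 40, 60), (30, 12)))   # (44, 10)
```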
Abstract
An image information processing system includes an image database storing image data for matching of a captured object for matching in association with identification information including a name of the captured object, an image acquiring unit acquiring image data, a search target input receiving unit receiving selection of a search object from the image data, an image search processing unit that calculates the degree of similarity between the search object and the captured object, and searches for image data for matching, a display processing unit displaying the search result, and a search result processing unit that receives a selection input from a user to the search result, and stores the image data acquired by the image acquiring unit as image data for matching to the image database in association with the identification information of the search result of which the selection input from the user has been received.
Description
- The present invention relates to an image information processing system that searches for names and descriptions of captured objects on image data and provides the names and descriptions to users.
- With the spread of smartphones and the like equipped with a camera function, there are increasing cases where users capture natural scenery such as flowers, birds, buildings, and mountains that the users are curious about in daily life, while traveling, and the like.
- As often happens when checking detailed information on an object captured by the user, the user does not even know the name of the captured object, has difficulty entering an appropriate keyword when performing a keyword search, and cannot obtain a satisfactory search result.
- Therefore, an image search method and an apparatus as disclosed in
Patent Literature 1 enable a user to obtain information about an unknown target from a captured image of that target. The image search method compares a binarized image of a captured image of an object with a binarized image of each item stored in an image database to search the image database for candidate names of the object. - However, the number of images stored in the image database for comparison with the captured image is limited. Therefore, when the captured image and the images stored in the image database are compared, search results differ substantially, even for the same object, because of differences in capturing conditions such as the season and the time of day when each image was captured. To absorb these differences, the ambiguity of the degree of similarity calculated from the comparison result is increased so that a search result can be obtained more easily, which reduces search precision.
- In view of the above problems, the present invention provides an image information processing system.
- An image information processing system of the present invention made to solve the above problem includes: an image database configured to store image data for matching of a captured object for matching in association with identification information including a name of the captured object for matching; an image acquiring unit configured to acquire image data; a search target input receiving unit configured to receive selection of a search object from the image data acquired by the image acquiring unit; an image search processing unit configured to calculate the degree of similarity between the search object received by the search target input receiving unit and the captured object for matching stored in the image database, and search for image data for matching of a captured object for matching having the degree of similarity that exceeds a reference value; a display processing unit configured to display a search result including the searched image data for matching and the identification information, on a display unit of an information communication terminal used by a user; and a search result processing unit configured to receive a selection input from the user to the search result displayed on the display unit of the information communication terminal, and store the image data acquired by the image acquiring unit, as image data for matching, in association with the identification information of the search result of which the selection input from the user has been received.
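To make the claimed arrangement concrete, the association that the image database maintains between image data for matching and identification information can be sketched as below. This is an illustrative model only; the class and field names (`MatchingRecord`, `image_id`, and so on) are assumptions, not terms from the specification.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class MatchingRecord:
    """One image-database entry: image data for matching tied to identification information."""
    image_id: str          # ID assigned to the image file for matching
    name: str              # name of the captured object (identification information)
    description: str = ""  # further identification information shown to the user

@dataclass
class ImageDatabase:
    records: List[MatchingRecord] = field(default_factory=list)

    def add(self, record: MatchingRecord) -> None:
        self.records.append(record)

    def by_name(self, name: str) -> List[MatchingRecord]:
        """All matching images registered under one identification name."""
        return [r for r in self.records if r.name == name]

db = ImageDatabase()
db.add(MatchingRecord("IMG0001", "tulip", "a spring-flowering bulb"))
db.add(MatchingRecord("IMG0002", "tulip"))
db.add(MatchingRecord("IMG0003", "sparrow"))
```

Several image files can share one identification entry, which is what lets the system accumulate user-contributed images under a name that has already been confirmed.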
- This configuration enables a high-precision search for, for example, the name of a captured object using an image captured by the user, even when the user does not know the name of the captured object from the captured image.
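As one hedged illustration of how such a similarity search might be computed: the description later names contour shape, binarized images, and color distribution as candidate features, and the first example uses 80% as a reference value. The sketch below uses only a coarse color-histogram intersection as the similarity measure; the function names and the histogram scheme are assumptions.

```python
from collections import Counter

def color_histogram(pixels, bins=4):
    """Quantize 8-bit RGB pixels into bins**3 buckets and normalize to frequencies."""
    step = 256 // bins
    counts = Counter((r // step, g // step, b // step) for r, g, b in pixels)
    n = len(pixels)
    return {bucket: c / n for bucket, c in counts.items()}

def similarity(h1, h2):
    """Histogram intersection scaled to a percentage; 100 means identical distributions."""
    return 100.0 * sum(min(h1.get(k, 0.0), h2.get(k, 0.0)) for k in h1.keys() | h2.keys())

def search(query_pixels, candidates, reference_value=80.0):
    """Return {image_id: similarity} for candidates whose similarity exceeds the reference value."""
    hq = color_histogram(query_pixels)
    scores = {image_id: similarity(hq, color_histogram(px))
              for image_id, px in candidates.items()}
    return {image_id: s for image_id, s in scores.items() if s > reference_value}

red, blue = [(240, 16, 16)] * 64, [(16, 16, 240)] * 64
hits = search(red, {"IMG0001": red, "IMG0002": blue})  # only the red image exceeds 80%
```

A production system would combine several features, but the thresholding against a reference value works the same way regardless of the feature used.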
- Further, in the image information processing system of the present invention, the search result processing unit stores image data of a captured search object having the degree of similarity that exceeds a predetermined value set to be higher than the reference value, as image data for matching, in association with the identification information of the search result of which the selection input from the user has been received.
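A minimal sketch of this stricter registration rule, assuming illustrative threshold values (the specification fixes neither number) and a plain dictionary in place of the image database:

```python
REFERENCE_VALUE = 80.0         # threshold for showing a search result (example value from the text)
REGISTRATION_THRESHOLD = 90.0  # "predetermined value set to be higher than the reference value" (assumed)

def maybe_register(db, identification_id, image_id, degree_of_similarity):
    """Store the user's image as matching data only when its similarity exceeds
    the stricter registration threshold; merely exceeding the search reference
    value is not enough."""
    if degree_of_similarity > REGISTRATION_THRESHOLD:
        db.setdefault(identification_id, []).append(image_id)
        return True
    return False

db = {}
accepted = maybe_register(db, "ID-TULIP", "IMG0100", 94.0)  # stored as matching data
rejected = maybe_register(db, "ID-TULIP", "IMG0101", 85.0)  # shown as a hit, but not stored
```

The gap between the two thresholds is the design point: borderline hits are still presented to the user, but only confident ones are allowed to feed back into the database.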
- With this configuration, the image data stored in the image database as the image data for matching is limited to image data with a high degree of similarity. Therefore, precision of the image search can be further improved.
- Further, in the image information processing system of the present invention, the image database stores attribute information including type information, capturing position information, and capturing time information corresponding to the image data for matching of a captured object for matching, in association with the identification information, the search target input receiving unit receives an input of the attribute information corresponding to the search object, and the image search processing unit calculates the degree of coincidence between the attribute information corresponding to the search object of which the input has been received by the search target input receiving unit and the attribute information corresponding to the image data for matching, extracts image data for matching of a captured object for matching having the degree of coincidence that exceeds an attribute reference value, calculates the degree of similarity between the search object received by the search target input receiving unit and the captured object for matching of the extracted image data for matching, and searches for the image data for matching of a captured object for matching having the degree of similarity that exceeds a reference value.
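One way this two-stage search could look, sketched under the assumption that the degree of coincidence is the percentage of matching attribute items and that the attribute reference value demands near-perfect coincidence:

```python
def coincidence(query_attrs, record_attrs):
    """Percentage of the query's attribute items (type, position, time, ...)
    that coincide with a matching record's attributes."""
    if not query_attrs:
        return 0.0
    matched = sum(1 for k, v in query_attrs.items() if record_attrs.get(k) == v)
    return 100.0 * matched / len(query_attrs)

def prefilter(records, query_attrs, attribute_reference=99.0):
    """Extract only records whose degree of coincidence exceeds the attribute
    reference value; the (expensive) similarity is then computed on this
    smaller set only."""
    return [r for r in records if coincidence(query_attrs, r["attrs"]) > attribute_reference]

records = [
    {"image_id": "IMG0001", "attrs": {"type": "flower", "time": "daytime"}},
    {"image_id": "IMG0002", "attrs": {"type": "bird", "time": "daytime"}},
]
kept = prefilter(records, {"type": "flower", "time": "daytime"})  # only the flower survives
```

Because attribute comparison is cheap relative to image comparison, pruning on coincidence first is what delivers the speed-up claimed above.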
- With this configuration, the image data for matching with matched attribute information is extracted and the degree of similarity is calculated. Therefore, a search processing speed can be improved.
- The image information processing system of the present invention can search for the name and the like of the captured object with high precision by using the captured image, even if the user does not know the name and the like of the captured object from the image captured by the user.
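That precision rests on the feedback loop described in the embodiments: each user-confirmed image is assigned an image ID and accumulated as new image data for matching under the confirmed identification ID. A minimal sketch of that registration step follows; the ID format and dictionary layout are assumptions for illustration.

```python
import itertools

_image_ids = itertools.count(1)  # source of fresh image IDs

def register_confirmed_image(db, identification_id):
    """Assign a fresh image ID to the user's confirmed image and associate it,
    as new image data for matching, with the selected identification ID."""
    image_id = f"IMG{next(_image_ids):04d}"
    db.setdefault(identification_id, []).append(image_id)
    return image_id

db = {"ID-TULIP": ["SEED-01"]}  # one matching image already registered
new_id = register_confirmed_image(db, "ID-TULIP")
```

Every confirmation grows the pool of matching images for that name, so later similarity calculations have more capturing conditions (season, time of day) to match against.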
-
FIG. 1 is a conceptual diagram schematically illustrating an example of a system configuration of an image information processing system according to the present embodiment. -
FIG. 2 is a conceptual diagram schematically illustrating an example of a hardware configuration of the image information processing system according to the present embodiment. -
FIG. 3 is a diagram illustrating an example of items of a record of image information for matching of the present embodiment. -
FIG. 4 is a diagram illustrating an example of items of a record of image data for matching of the present embodiment. -
FIG. 5 is a diagram illustrating an example of a state in which a search object of the present embodiment is selected. -
FIG. 6 is a diagram illustrating an example of a state in which a search result of the present embodiment is displayed. -
FIG. 7 is a diagram illustrating another example of a state in which a search result of the present embodiment is displayed. -
FIG. 8 is a diagram illustrating an example of a screen prompting registration of image data of the present embodiment as image data for matching. -
FIG. 9 is an explanatory diagram of processing of registering image data of the present embodiment as image data for matching. -
FIG. 10 is a flowchart illustrating a processing process of the image information processing system of a first example. -
FIG. 11 is a flowchart illustrating a processing process of the image information processing system of a second example. -
FIG. 12 is a flowchart illustrating a processing process of the image information processing system of a third example. -
FIG. 13 is a diagram illustrating another example of a state in which a search result of the present embodiment is displayed. - Hereinafter, a system configuration of an image information processing system of the present embodiment will be described with reference to
FIG. 1 . - An image
information processing system 1 of the present embodiment is realized using a computer including a server and the like. As illustrated in FIG. 2 , in the image information processing system 1 of the present embodiment, the computer includes an arithmetic device 30 such as a CPU that executes arithmetic processing of a program, a storage device 31 such as a RAM or a hard disk that stores information, a display device 32 such as a display that performs display, an input device 33 such as a mouse, a keyboard, or a touch panel used to perform an input, and a communication device 34 that transmits/receives the processing result of the arithmetic device 30 and the information stored in the storage device 31 to/from other computers. - Functions of the means of the present invention are merely logically distinguished, and may physically or virtually occupy the same area. Further, in the image
information processing system 1 of the present invention, these functions may be realized by one server, or each function may be distributed to two or more servers. In addition, a part or all of the functions of the present invention may be performed by an information communication terminal 20 , such as a smartphone equipped with a camera function, used by the user. In particular, the information communication terminal 20 may perform the processing of an image acquiring unit 12 , a search target input receiving unit 13 , a display processing unit 15 , and the like, which will be described below. With this configuration, the processing load of a management server 10 can be reduced. - The
management server 10 having the functions of the image information processing system 1 can transmit/receive information to/from the information communication terminal 20 used by an administrator or a user through a network such as the Internet. Examples of the information communication terminal 20 include a smartphone equipped with a camera function, a personal computer, and portable communication terminals such as a mobile phone, a PHS, and a tablet computer. - The image
information processing system 1 includes an image database 11, the image acquiring unit 12, the search target input receiving unit 13, an image search processing unit 14, the display processing unit 15, and a search result processing unit 16. - As illustrated in
FIG. 3 , the image database 11 stores image data for matching 100 of the captured object for matching in association with identification information 100 a including a name of the captured object for matching. The image data for matching 100 includes, as illustrated in FIG. 4 , image files of natural scenery such as flowers, birds, animals and plants such as bugs, buildings such as towers, and mountains, and image IDs attached to the image files. Further, the image database 11 may store attribute information 100 b including type information, capturing position information, and capturing time information corresponding to the image data for matching 100 of the captured object for matching in association with the identification information 100 a . - The
image acquiring unit 12 acquires image data 2 from the information communication terminal 20 used by the user. The file format of the image data 2 is not particularly limited. For example, in a case where the information communication terminal 20 is a smartphone equipped with a camera function, the image data 2 may be image data 2 captured by the user with the camera. - The search target
input receiving unit 13 receives selection of a search object 2 a from the image data 2 acquired by the image acquiring unit 12 . Examples of means for receiving the selection of the search object 2 a include means for displaying the image data 2 on a display unit 21 , such as a liquid crystal display, of the information communication terminal 20 , and prompting the user to tap the search object 2 a from among the captured objects of the image data 2 , means for moving a selection window 3 as illustrated in FIG. 5 to surround the search object 2 a , and means for reading gaze information given by the user on the search object 2 a with a so-called in-camera provided on the display unit 21 side of the information communication terminal 20 . - Then, the search target
input receiving unit 13 cuts out, enlarges, rotates, and moves a portion of the received search object 2 a to convert the image data 2 into a composition with which the degree of similarity with the image data for matching 100 stored in the image database 11 can be easily calculated. - Further, the search target
input receiving unit 13 may receive an input of attribute information corresponding to the search object 2 a . Examples of the attribute information corresponding to the search object 2 a include type information, capturing position information, and capturing time information. Examples of the type information include “flower”, “bird”, “insect”, “building”, and “natural scenery”. The capturing position information is information on the captured place, and may be acquired from a GPS device provided in the information communication terminal 20 . The capturing time information includes not only time information but also date information. - The image
search processing unit 14 calculates the degree of similarity between the search object 2 a received by the search target input receiving unit 13 and the captured object for matching of the image data for matching 100 stored in the image database 11 , and searches for the image data for matching 100 of a captured object for matching having the degree of similarity that exceeds a reference value. Here, the degree of similarity is calculated using a contour shape, a binarized image, color distribution, or the like, extracted by a known image processing technology. The degree of similarity is calculated to be 100% when the search object 2 a and the captured object for matching are perfectly matched. - Further, the image
search processing unit 14 calculates the degree of coincidence between the attribute information corresponding to the search object 2 a of which the input has been received by the search target input receiving unit 13 and the attribute information 100 b corresponding to the image data for matching 100 , extracts the image data for matching 100 of a captured object for matching having the degree of coincidence that exceeds an attribute reference value, calculates the degree of similarity between the search object 2 a received by the search target input receiving unit 13 and the captured object for matching of the extracted image data for matching 100 , and searches for the image data for matching 100 of a captured object for matching having the degree of similarity that exceeds a reference value. For the degree of coincidence, it is favorable to set an attribute reference value for each item of attribute information; for example, the attribute reference value for the type information is favorably a perfect match. - The
display processing unit 15 displays a search result 110 including the searched image data for matching 100 and the identification information 100 a on the display unit 21 of the information communication terminal 20 . The display processing unit 15 can arrange and display the search results 110 in descending order of the degree of similarity. Further, the display processing unit 15 changes the display size of each search result 110 according to the degree of similarity as illustrated in FIG. 6 , so that the user can easily visually recognize the search result 110 with a high degree of similarity. - Further, when receiving a selection input to the search result 110 displayed on the
display unit 21 of the information communication terminal 20 from the user, the display processing unit 15 displays, as illustrated in FIG. 7 , a screen 110 a composed of the captured object for matching and the identification information 100 a corresponding to the captured object for matching on the display unit 21 of the information communication terminal 20 . Further, when receiving an input to a registration button 111 displayed on the screen 110 a from the user, the display processing unit 15 displays a screen that prompts a selection input as to whether to register the image data 2 , which is displayed on the display unit 21 of the information communication terminal 20 as illustrated in FIG. 8 . - The search
result processing unit 16 receives a selection input from the user on the screen that prompts a selection input as to whether to register the image data 2 , which is displayed on the display unit 21 of the information communication terminal 20 , and stores the image data 2 acquired by the image acquiring unit 12 as the image data for matching 100 in the image database 11 in association with the identification information 100 a of the search result 110 of which the selection input from the user has been received. To be specific, as illustrated in FIG. 9 , the search result processing unit 16 assigns to the image data 2 an image ID for distinguishing the image data for matching 100 , and stores the image ID in the image database 11 in association with an identification ID of the identification information 100 a . - Further, the search
result processing unit 16 stores the image data 2 as the image data for matching 100 in the image database 11 in association with the identification information 100 a of the search result 110 of which the selection input from the user has been received, limiting the stored image data 2 to the image data 2 of a captured search object 2 a having the degree of similarity that exceeds a predetermined value set to be higher than a reference value. - Next, an example of a processing process of the image
information processing system 1 will be described below with reference to the flowchart of FIG. 10 . - In the first example, processing of capturing an unknown flower planted in a park or the like using the camera of a smartphone having a camera function as the information communication terminal 20 (S101), and displaying identification information including the name of the flower from the
image data 2 of captured “flower” will be described. - The
image acquiring unit 12 acquires the image data 2 of “flower” captured by the information communication terminal 20 used by the user (S102). Note that the image data 2 need not be image data captured by the information communication terminal 20 used by the user and may instead be image data 2 acquired through the Internet or the like. - The search target
input receiving unit 13 receives selection of the search object 2 a from the image data 2 displayed on the display unit 21 of the information communication terminal 20 (S103). Then, the search target input receiving unit 13 cuts out, enlarges, rotates, and moves a portion of the received search object 2 a to convert the image data 2 into a composition with which the degree of similarity with the image data for matching 100 stored in the image database 11 can be easily calculated. A method of receiving the selection of the search object 2 a is not particularly limited. For example, the selection of the search object 2 a is received by tapping the search object 2 a or by moving the selection window 3 to surround the search object 2 a as illustrated in FIG. 5 . - The image
search processing unit 14 calculates the degree of similarity between the search object 2 a received by the search target input receiving unit 13 and the captured object for matching stored in the image database 11 (S104). The degree of similarity is calculated using a contour shape, a binarized image, color distribution, or the like, extracted by a known image processing technology. The degree of similarity calculated in this way can be displayed as a numerical value, and is calculated to be 100% when the search object 2 a and the captured object for matching are perfectly matched. - The image
search processing unit 14 then searches for the image data for matching 100 of a captured object for matching having the degree of similarity that exceeds a reference value (S105). Here, the reference value can be changed as appropriate; for example, 80% may be set as the reference value. - The
display processing unit 15 displays the search result 110 including the image data for matching 100 and the identification information 100 a on the display unit 21 of the information communication terminal 20 (S106). The search results 110 are displayed in descending order of the degree of similarity calculated by the image search processing unit 14 , from the search result 110 with the highest degree of similarity to the search result 110 with the lowest. Further, the display processing unit 15 may display the search result 110 with the highest degree of similarity in the largest display size, as illustrated in FIG. 6 . - Further, the
display processing unit 15 can receive a selection input to the search result 110 from the user, and display the screen 110 a composed of the captured object for matching and the identification information 100 a for the captured object for matching, illustrated in FIG. 7 , on the display unit 21 of the information communication terminal 20 . - When receiving an input to the
registration button 111 illustrated in FIG. 7 , the display processing unit 15 receives the selection input by the user to the search result 110 displayed on the display unit 21 of the information communication terminal 20 , and displays the screen that prompts a selection input as to whether to register the image data 2 , illustrated in FIG. 8 . - Then, when a Yes button is input on the screen that prompts a selection input as to whether to register the
image data 2 illustrated in FIG. 8 (S107), the search result processing unit 16 stores the image data 2 acquired by the image acquiring unit 12 in the image database 11 as the image data for matching 100 in association with the identification information 100 a of the search result 110 of which the selection input from the user has been received (S108). To be specific, as illustrated in FIG. 9 , the search result processing unit 16 assigns to the image data 2 an image ID for distinguishing the image data for matching 100 , and stores the image ID in the image database 11 in association with an identification ID of the identification information 100 a . - In this manner, the
image data 2 is sequentially accumulated as the image data for matching 100 , and the accuracy of the calculated degree of similarity is improved. Therefore, the search precision can be improved and a more accurate search result 110 can be provided. - In the first example, whether to store the
image data 2 captured by the user in the image database 11 as the image data for matching 100 is left to the user's own decision. Therefore, the image data 2 may not be appropriate to store as the image data for matching 100 . In the second example, therefore, a case in which only image data 2 that satisfies a predetermined condition can be stored in the image database 11 as image data for matching 100 will be described. An example of a processing process of the second example will be described with reference to the flowchart shown in FIG. 11 . The same reference numeral is given to a process similar to that in the first example, and a detailed description thereof is omitted. - When the Yes button is input on the screen that prompts a selection input as to whether to register the
image data 2 illustrated in FIG. 8 , the search result processing unit 16 determines whether the degree of similarity exceeds a predetermined value set to be higher than the reference value (S201). When the search result processing unit 16 determines that the degree of similarity exceeds the predetermined value set to be higher than the reference value, the search result processing unit 16 stores the image data 2 of the search object 2 a in the image database 11 as the image data for matching 100 in association with the identification information 100 a of the received search result 110 (S108). - What is associated with the
identification information 100 a as the image data for matching 100 is limited to only the image data 2 having a high degree of similarity. Therefore, the search precision can be further improved. - In the first and second examples, the degrees of similarity between the
search object 2 a and all the captured objects for matching stored in the image database 11 are calculated, and the search results 110 whose calculated degrees of similarity exceed the reference value are displayed, as results having high degrees of similarity, on the display unit 21 of the information communication terminal 20 used by the user. As the number of image data for matching 100 increases, however, the similarity calculation by the image search processing unit 14 slows down and the load on the management server 10 that performs the calculation increases. Therefore, in the third example, a case of limiting the number of image data for matching 100 for which the degree of similarity is calculated will be described with reference to the flowchart shown in FIG. 12 . The same reference numeral is given to a process similar to that in the first and second examples, and a detailed description thereof is omitted. - The search target
input receiving unit 13 receives not only an input of the search object 2 a but also an input of the attribute information corresponding to the search object 2 a (S301). The search object 2 a of the present example is “flower”, and thus the search target input receiving unit 13 receives selection of the type information “flower”. Furthermore, the search target input receiving unit 13 may receive inputs of the capturing position information and the capturing time information of the search object 2 a . The capturing position information and the capturing time information are respectively acquired from a GPS device and a date device provided in the information communication terminal 20 . - The image
search processing unit 14 calculates the degree of coincidence between the attribute information corresponding to the search object 2 a received by the search target input receiving unit 13 and the attribute information 100 b corresponding to the captured object for matching stored in the image database 11 (S302), and extracts the captured object for matching having the degree of coincidence that exceeds the attribute reference value (S303). For example, the degree of coincidence is calculated to be 100% when the attribute information of the search object 2 a and the attribute information 100 b of the captured object for matching are perfectly matched. - The image
search processing unit 14 then calculates the degree of similarity between the search object 2 a and the extracted captured object for matching (S104), and searches for the image data for matching 100 of the captured object for matching having the degree of similarity that exceeds the reference value (S105). In the third example, in a case where the type information of the attribute information corresponding to the search object 2 a is “flower”, the degree of similarity is calculated limiting the search range to the captured object for matching in which the type information in the attribute information 100 b is set to “flower”. Therefore, the search range is limited, and the processing speed is improved. - Further, in a case where the capturing time information of the attribute information corresponding to the
search object 2 a corresponds to “daytime”, the search range is limited to the captured object for matching in which the capturing time information of theattribute information 100 b is set to “day time”, so that the degree of similarity can be calculated between the objects in similar conditions. Therefore, the search precision can be improved. - The invention disclosed in the present specification is not limited to the configurations of the inventions and embodiments, and also includes a specified configuration in which a partial configuration of the aforementioned configuration is changed to another configuration disclosed in the present specification, a specified configuration in which another configuration disclosed in the present specification is added to the aforementioned configuration, a specified configuration having a more generic concept in which the partial configuration is deleted to the extent of obtaining the function and effect, within an applicable scope and also includes modifications described below.
- As illustrated in
FIG. 13 , the display processing unit 15 may display the identification information 100 a associated with the image data 2 of the search object 2 a by the search result processing unit 16 near the search object 2 a of the image data 2 . To be specific, when the search target input receiving unit 13 cuts out a portion of the search object 2 a , the search target input receiving unit 13 acquires coordinate position information of the image data 2 of the search object 2 a . The display processing unit 15 then shifts the identification information 100 a associated with the image data 2 of the search object 2 a by the search result processing unit 16 and displays it at a position not overlapping with the coordinate position information.
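The shift-to-a-free-position behavior described above can be sketched as a simple geometric rule; the preference order (right of the object, then below it, then the upper-left fallback) is an assumed policy for illustration, not one the specification prescribes.

```python
def place_label(object_box, label_size, canvas_size):
    """Choose a position for the identification label that does not overlap the
    search object's bounding box: try the space to the right, then below,
    then fall back to the upper-left."""
    ox, oy, ow, oh = object_box   # x, y, width, height of the cut-out search object
    lw, lh = label_size
    cw, ch = canvas_size
    if ox + ow + lw <= cw:        # enough room to the right of the object
        return (ox + ow, oy)
    if oy + oh + lh <= ch:        # enough room below the object
        return (ox, oy + oh)
    return (max(0, ox - lw), max(0, oy - lh))

pos = place_label((10, 10, 40, 30), (30, 12), (200, 120))  # fits to the right of the object
```

The coordinate position information acquired at cut-out time supplies `object_box`; the label lands adjacent to, but never on top of, the selected region.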
- 1 Image information processing system
- 2 Image data
- 3 Selection window
- 10 Management server
- 11 Image DB
- 12 Image acquiring unit
- 13 Search target input receiving unit
- 14 Image search processing unit
- 15 Display processing unit
- 16 Search result processing unit
- 20 Information communication terminal
- 21 Display unit
- 30 Arithmetic device
- 31 Storage device
- 32 Display device
- 33 Input device
- 34 Communication device
- 100 Image data for matching
- 110 Search result
Claims (3)
1. An image information processing system comprising:
an image database configured to store image data for matching of a captured object for matching in association with identification information including a name of the captured object for matching;
an image acquiring unit configured to acquire image data;
a search target input receiving unit configured to receive selection of a search object from the image data acquired by the image acquiring unit;
an image search processing unit configured to calculate the degree of similarity between the search object received by the search target input receiving unit and the captured object for matching stored in the image database, and search for image data for matching of a captured object for matching having the degree of similarity that exceeds a reference value;
a display processing unit configured to display a search result including the searched image data for matching and the identification information, on a display unit of an information communication terminal used by a user; and
a search result processing unit configured to receive a selection input from the user to the search result displayed on the display unit of the information communication terminal, and store the image data acquired by the image acquiring unit, as image data for matching, in association with the identification information of the search result of which the selection input from the user has been received.
2. The image information processing system according to claim 1 , wherein
the search result processing unit stores image data of a captured search object having the degree of similarity that exceeds a predetermined value set to be higher than the reference value, as image data for matching, in association with the identification information of the search result of which the selection input from the user has been received.
3. The image information processing system according to claim 1 , wherein
the image database stores attribute information including type information, capturing position information, and capturing time information corresponding to the image data for matching of a captured object for matching, in association with the identification information,
the search target input receiving unit receives an input of the attribute information corresponding to the search object, and
the image search processing unit calculates the degree of coincidence between the attribute information corresponding to the search object of which the input has been received by the search target input receiving unit and the attribute information corresponding to the image data for matching, extracts image data for matching of a captured object for matching having the degree of coincidence that exceeds an attribute reference value, calculates the degree of similarity between the search object received by the search target input receiving unit and the captured object for matching of the extracted image data for matching, and searches for the image data for matching of a captured object for matching having the degree of similarity that exceeds a reference value.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-117539 | 2015-06-10 | ||
JP2015117539A JP2017004252A (en) | 2015-06-10 | 2015-06-10 | Image information processing system |
PCT/JP2016/066355 WO2016199662A1 (en) | 2015-06-10 | 2016-06-02 | Image information processing system |
Publications (1)
Publication Number | Publication Date |
---|---|
US20180157682A1 true US20180157682A1 (en) | 2018-06-07 |
Family ID: 57503568
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/575,092 Abandoned US20180157682A1 (en) | 2015-06-10 | 2016-06-02 | Image information processing system |
Country Status (5)
Country | Link |
---|---|
US (1) | US20180157682A1 (en) |
EP (1) | EP3309694A4 (en) |
JP (1) | JP2017004252A (en) |
CN (1) | CN107636653A (en) |
WO (1) | WO2016199662A1 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6862952B2 (en) * | 2017-03-16 | 2021-04-21 | 株式会社リコー | Information processing system, information processing device, information processing program and information processing method |
JP7095338B2 (en) * | 2018-03-19 | 2022-07-05 | 株式会社リコー | Image search device, information processing system and image search method |
JP2020035086A (en) * | 2018-08-28 | 2020-03-05 | 富士ゼロックス株式会社 | Information processing system, information processing apparatus and program |
JP7087844B2 (en) * | 2018-08-31 | 2022-06-21 | トヨタ自動車株式会社 | Image processing equipment, image processing system and vehicle |
Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020178135A1 (en) * | 1998-02-24 | 2002-11-28 | Sumiyo Tanaka | Image searching system and image searching method, and a recording medium storing an image searching program |
US20050162523A1 (en) * | 2004-01-22 | 2005-07-28 | Darrell Trevor J. | Photo-based mobile deixis system and related techniques |
US20070133947A1 (en) * | 2005-10-28 | 2007-06-14 | William Armitage | Systems and methods for image search |
US20090034805A1 (en) * | 2006-05-10 | 2009-02-05 | Aol Llc | Using Relevance Feedback In Face Recognition |
US20100250136A1 (en) * | 2009-03-26 | 2010-09-30 | Chen Chien-Hung Z | Computer based location identification using images |
US7844591B1 (en) * | 2006-10-12 | 2010-11-30 | Adobe Systems Incorporated | Method for displaying an image with search results |
US20110314049A1 (en) * | 2010-06-22 | 2011-12-22 | Xerox Corporation | Photography assistant and method for assisting a user in photographing landmarks and scenes |
US20120239638A1 (en) * | 2008-06-05 | 2012-09-20 | Enpulz, L.L.C. | Search engine supporting mixed image and text input |
US8472664B1 (en) * | 2008-01-31 | 2013-06-25 | Google Inc. | Inferring locations from an image |
US8782077B1 (en) * | 2011-06-10 | 2014-07-15 | Google Inc. | Query image search |
US20150154232A1 (en) * | 2012-01-17 | 2015-06-04 | Google Inc. | System and method for associating images with semantic entities |
US20150169645A1 (en) * | 2012-12-06 | 2015-06-18 | Google Inc. | Presenting image search results |
US20150227780A1 (en) * | 2014-02-13 | 2015-08-13 | FacialNetwork, Inc. | Method and apparatus for determining identity and programing based on image features |
US20160127597A1 (en) * | 2014-10-30 | 2016-05-05 | Fuji Xerox Co., Ltd. | Non-transitory computer readable medium |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050076004A1 (en) * | 2003-09-30 | 2005-04-07 | Hiroyuki Yanagisawa | Computer, database generating method for electronic picture book service, photographed subject information providing method, recording medium, and computer data signal |
JP2008146602A (en) * | 2006-12-13 | 2008-06-26 | Canon Inc | Document retrieving apparatus, document retrieving method, program, and storage medium |
JP5200015B2 (en) * | 2007-06-14 | 2013-05-15 | パナソニック株式会社 | Image recognition apparatus and image recognition method |
JP5525737B2 (en) * | 2009-03-18 | 2014-06-18 | オリンパス株式会社 | Server system, terminal device, program, information storage medium, and image search method |
CN104321802B (en) * | 2012-05-24 | 2017-04-26 | 株式会社日立制作所 | Image analysis device, image analysis system, and image analysis method |
KR102059913B1 (en) * | 2012-11-20 | 2019-12-30 | 삼성전자주식회사 | Tag storing method and apparatus thereof, image searching method using tag and apparauts thereof |
JP6064618B2 (en) * | 2013-01-23 | 2017-01-25 | 富士ゼロックス株式会社 | Information processing apparatus and program |
2015
- 2015-06-10 JP JP2015117539A patent/JP2017004252A/en active Pending

2016
- 2016-06-02 CN CN201680033786.4A patent/CN107636653A/en not_active Withdrawn
- 2016-06-02 US US15/575,092 patent/US20180157682A1/en not_active Abandoned
- 2016-06-02 WO PCT/JP2016/066355 patent/WO2016199662A1/en active Application Filing
- 2016-06-02 EP EP16807369.0A patent/EP3309694A4/en not_active Withdrawn
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180165856A1 (en) * | 2016-12-09 | 2018-06-14 | Canon Kabushiki Kaisha | Control method and storage medium |
US10460494B2 (en) * | 2016-12-09 | 2019-10-29 | Canon Kabushiki Kaisha | Control method and storage medium |
US10506110B2 (en) | 2016-12-09 | 2019-12-10 | Canon Kabushiki Kaisha | Image processing apparatus, control method, and storage medium |
US10560601B2 (en) | 2016-12-09 | 2020-02-11 | Canon Kabushiki Kaisha | Image processing method, image processing apparatus, and storage medium |
US11647370B2 (en) | 2018-02-16 | 2023-05-09 | Maxell, Ltd. | Mobile information terminal, information presentation system and information presentation method |
US20220147050A1 (en) * | 2020-11-10 | 2022-05-12 | Meteorolite Ltd. | Methods and devices for operating an intelligent mobile robot |
Also Published As
Publication number | Publication date |
---|---|
EP3309694A4 (en) | 2018-12-05 |
CN107636653A (en) | 2018-01-26 |
JP2017004252A (en) | 2017-01-05 |
EP3309694A1 (en) | 2018-04-18 |
WO2016199662A1 (en) | 2016-12-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20180157682A1 (en) | Image information processing system | |
US10242410B2 (en) | Storage medium, image processing method and image processing apparatus | |
US20230410500A1 (en) | Image display system, terminal, method, and program for determining a difference between a first image and a second image | |
WO2016015437A1 (en) | Method, apparatus and device for generating picture search library and searching for picture | |
US20130243249A1 (en) | Electronic device and method for recognizing image and searching for concerning information | |
US11335087B2 (en) | Method and system for object identification | |
JP2015090524A5 (en) | ||
JP2009058252A5 (en) | ||
US20180005022A1 (en) | Method and device for obtaining similar face images and face image information | |
US20170039450A1 (en) | Identifying Entities to be Investigated Using Storefront Recognition | |
US20180218083A1 (en) | Method for Recommending Clothes Collocation and Intelligent Terminal | |
US8918414B2 (en) | Information providing device, information providing method, information providing processing program, and recording medium having information providing processing program recorded thereon | |
US20150063686A1 (en) | Image recognition device, image recognition method, and recording medium | |
US11593303B2 (en) | Systems and methods for automatically placing a fire system device icon on a drawing of a building floor plan | |
US20220067343A1 (en) | Information processing apparatus, information processing method, and storage medium | |
US20140350844A1 (en) | Method for searching data and method for planning itinerary | |
JP2019083532A (en) | Image processing system, image processing method, and image processing program | |
US9064020B2 (en) | Information providing device, information providing processing program, recording medium having information providing processing program recorded thereon, and information providing method | |
KR20140006440U (en) | System and method for plant idendification using both images and taxonomic characters | |
US11216969B2 (en) | System, method, and computer-readable medium for managing position of target | |
CN109084750B (en) | Navigation method and electronic equipment | |
CN105893576A (en) | Information inquiry method and device and mobile terminal | |
WO2017000715A1 (en) | Contact matching method and device | |
JP2009042962A (en) | Retrieval system | |
CN109522449B (en) | Searching method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: WE'LL CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:WAKABAYASHI, KAZUYOSHI;REEL/FRAME:044162/0620 Effective date: 20171106 |
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |