WO2013099038A1 - Image search system, image search method, image search device, program, and information storage medium - Google Patents
Image search system, image search method, image search device, program, and information storage medium
- Publication number
- WO2013099038A1 (PCT/JP2011/080535, JP2011080535W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- search
- original image
- feature information
- processed
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/583—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/53—Querying
- G06F16/532—Query formulation, e.g. graphical querying
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/583—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
- G06F16/5838—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/583—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
- G06F16/5854—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using shape and object relationship
Definitions
- the present invention relates to an image search system, an image search method, an image search device, a program, and an information storage medium.
- An image search system for searching for an image similar to the original image is known (for example, Patent Document 1).
- For example, an image search system for searching for a clothing image similar to a given clothing image is known.
- The present invention has been made in view of the above problems, and an object of the present invention is to provide an image search system, an image search method, an image search device, a program, and an information storage medium capable of searching for images that differ from the original image in a specific part.
- In order to solve the above problems, an image search system according to the present invention includes: an original image acquisition means for acquiring an original image; a reception means for receiving designation of a partial area of the original image to be processed and processing content for the partial area; a search means for performing an image search based on a plurality of types of feature information regarding a region other than the partial area of a processed image obtained by applying the processing content to the partial area of the original image, or of the original image, and at least one piece of feature information selected based on the processing content from among the plurality of types of feature information regarding the partial area of the processed image or the original image; and an output control means for causing an output means to output a search result of the search means.
- An image search method according to the present invention includes: an original image acquisition step of acquiring an original image; a reception step of receiving designation of a partial area of the original image to be processed and processing content for the partial area; a search step of performing an image search based on a plurality of types of feature information regarding a region other than the partial area of a processed image obtained by applying the processing content to the partial area of the original image, or of the original image, and at least one piece of feature information selected based on the processing content from among the plurality of types of feature information regarding the partial area of the processed image; and an output control step of causing an output means to output a search result of the search step.
- An image search device according to the present invention includes: means for acquiring processing content for a partial area of an original image; and search condition setting means for setting a search condition for an image search based on a plurality of types of feature information regarding a region other than the partial area of a processed image obtained by applying the processing content to the partial area of the original image, or of the original image, and at least one piece of feature information selected based on the processing content from among the plurality of types of feature information regarding the partial area of the processed image or the original image.
- A program according to the present invention causes a computer to function as: means for acquiring processing content for a partial area of an original image; and search condition setting means for setting a search condition for an image search based on a plurality of types of feature information regarding a region other than the partial area of a processed image obtained by applying the processing content to the partial area of the original image, and at least one piece of feature information selected based on the processing content from among the plurality of types of feature information regarding the partial area of the processed image.
- An information storage medium according to the present invention is a computer-readable information storage medium storing a program for causing a computer to function as: means for acquiring processing content for a partial area of an original image; and search condition setting means for setting a search condition for an image search based on a plurality of types of feature information regarding a region other than the partial area of a processed image obtained by applying the processing content to the partial area of the original image, and at least one piece of feature information selected based on the processing content from among the plurality of types of feature information regarding the partial area of the processed image.
- In one aspect of the present invention, when the search result of the search means includes the original image, the output control means may refrain from causing the output means to output the original image as a search result of the search means.
- In one aspect of the present invention, the output control means may include: output order setting means for setting an output order of the images found by the search means based on a similarity between each found image and the original image; and means for causing the output means to output the found images according to the output order, wherein the output order setting means sets the output order of an image having a low similarity to the original image higher than the output order of an image having a high similarity to the original image.
- In one aspect of the present invention, the search means may include determination means for determining whether or not a target image is the same as or similar to the processed image, and may search for an image that is the same as or similar to the processed image.
- The determination means may include: means for determining whether or not a first similarity, indicating a degree of similarity between the plurality of types of feature information regarding the region other than the partial area of the processed image or the original image and the plurality of types of feature information regarding the region of the target image corresponding to the region other than the partial area of the processed image, is equal to or higher than a first reference similarity; and means for determining whether or not a second similarity, indicating a degree of similarity between the at least one piece of feature information regarding the partial area of the processed image and the at least one piece of feature information regarding the region of the target image corresponding to the partial area of the processed image, is equal to or higher than a second reference similarity that is lower than the first reference similarity.
- When the first similarity is determined to be equal to or higher than the first reference similarity and the second similarity is determined to be equal to or higher than the second reference similarity, the target image may be determined to be the same as or similar to the processed image.
- In one aspect of the present invention, the search means may include determination means for determining whether or not a target image is similar to the original image, and may search for an image similar to the original image.
- The determination means may include: means for determining whether or not a first similarity, indicating a degree of similarity between the plurality of types of feature information regarding the region other than the partial area of the original image and the plurality of types of feature information regarding the region of the target image corresponding to the region other than the partial area of the original image, is equal to or higher than a first reference similarity; and means for determining whether or not a second similarity, indicating a degree of similarity between the at least one piece of feature information regarding the partial area of the original image and the at least one piece of feature information regarding the region of the target image corresponding to the partial area of the original image, is equal to or higher than a second reference similarity that is lower than the first reference similarity. When both determinations are affirmative, the target image may be determined to be similar to the original image.
- FIG. 1 is a diagram illustrating an example of the overall configuration of an image search system according to an embodiment of the present invention. FIG. 2 is a diagram illustrating an example of screen transitions when an image search is performed. FIG. 3 is a diagram illustrating an example of a processing screen. FIG. 4 is a diagram for explaining an example of processing applied to an original image. FIG. 5 is a diagram for explaining another example of processing applied to an original image. FIG. 6 is a diagram for explaining another example of processing applied to an original image. FIG. 7 is a diagram illustrating an example of a search result screen. FIG. 8 is a functional block diagram of the image search system. FIG. 9 is a diagram for explaining a processing region and a non-processing region.
- FIG. 1 shows an example of the overall configuration of an image search system 1 according to an embodiment of the present invention.
- the image search system 1 according to the present embodiment includes a server 10, a database 16, and a user terminal 20.
- the server 10 and the user terminal 20 are connected to a communication network 2 including, for example, the Internet, and data communication between the server 10 and the user terminal 20 is possible.
- the server 10 includes a control unit 11, a main storage unit 12, an auxiliary storage unit 13, a communication unit 14, and an optical disc drive unit 15.
- the control unit 11 includes, for example, one or more CPUs, and executes information processing according to an operating system or programs stored in the auxiliary storage unit 13.
- the main storage unit 12 is, for example, a RAM.
- the auxiliary storage unit 13 is, for example, a hard disk or a solid state drive.
- the communication unit 14 is for performing data communication via the communication network 2.
- the optical disk drive unit 15 is for reading a program and data recorded on an optical disk (information storage medium).
- the program and data are supplied to the auxiliary storage unit 13 via an optical disc (information storage medium). That is, the program and data stored on the optical disc are read by the optical disc drive unit 15 and stored in the auxiliary storage unit 13.
- the server 10 may include a component for reading a program or data stored in an information storage medium (for example, a memory card) other than the optical disk. Then, a program or data may be supplied to the auxiliary storage unit 13 via an information storage medium (for example, a memory card) other than the optical disk. Further, the program and data may be supplied to the auxiliary storage unit 13 via the communication network 2.
- the server 10 can access the database 16.
- the database 16 stores a large number of images that are targets of the image search.
- the database 16 may be constructed in the server 10 or may be constructed in a server different from the server 10.
- the user terminal 20 is an information processing device used by a user.
- the user terminal 20 includes a control unit 21, a main storage unit 22, an auxiliary storage unit 23, a communication unit 24, a display unit 25, an audio output unit 26, and an operation unit 27.
- the control unit 21, main storage unit 22, auxiliary storage unit 23, and communication unit 24 are the same as the control unit 11, main storage unit 12, auxiliary storage unit 13, and communication unit 14 of the server 10.
- the program and data are supplied to the auxiliary storage unit 23 via the communication network 2.
- the user terminal 20 may include an optical disk drive unit.
- the program and data may be supplied to the auxiliary storage unit 23 via an optical disk (information storage medium).
- the user terminal 20 may include a component for reading a program or data stored in an information storage medium (for example, a memory card) other than the optical disk. Then, a program and data may be supplied to the auxiliary storage unit 23 via an information storage medium (for example, a memory card) other than the optical disk.
- the display unit 25 is a liquid crystal display or an organic EL display, for example, and displays various screens.
- the audio output unit 26 is a speaker or a headphone terminal, for example, and outputs various sounds.
- the operation unit 27 is used by the user for operation.
- a pointing device for the user to specify a position in the screen displayed on the display unit 25 is provided as the operation unit 27.
- the user terminal 20 includes a touch panel provided on the display unit 25 so as to overlap the display unit 25. Note that, for example, a mouse or a stick may be provided in the user terminal 20 instead of the touch panel.
- the image search system 1 can search for images with different specific parts of the original image.
- the image search function will be described below. In the following description, an example of searching for an image that differs from a clothing image in a specific part will be described.
- FIG. 2 is a diagram for explaining an example of transition of a screen displayed on the display unit 25 of the user terminal 20 when performing an image search.
- an original image selection screen is first displayed on the display unit 25 as shown in FIG. 2 (S101).
- the original image selection screen is a screen for selecting an original image used for image search from images stored in the database 16.
- the processing screen is a screen for processing the original image.
- FIG. 3 shows an example of the processing screen.
- the original image 32 selected on the original image selection screen is displayed.
- An image of a long sleeve cut-and-sew is displayed as an original image 32 on the processing screen 30 shown in FIG.
- FIG. 3 shows the processing screen 30 in the initial state.
- An original image 32 in an unprocessed state is displayed on the processing screen 30 in the initial state.
- the background portion (portion other than clothes) of the original image 32 is removed. The removal of the background portion may be executed when the original image 32 is displayed on the processing screen 30, or an image from which the background portion has been removed in advance may be stored in the database 16.
- the processing screen 30 displays a menu 34 (pull-down menu) for specifying the processing content.
- Instead of the menu 34 (pull-down menu), for example, radio buttons (option buttons) for specifying the processing content may be displayed.
- After the user specifies the processing content for the original image 32 in the menu 34, the user performs a processing operation corresponding to that processing content. For example, the following processing contents can be designated in the menu 34: (1) color deletion (color removal), (2) color change, (3) pattern deletion, (4) pattern change, and (5) deformation.
- On the processing screen 30, the processed image obtained by processing the original image 32 is displayed.
- FIG. 4 shows a case where the color of the sleeve portion of the original image 32 is deleted.
- the user selects “Delete Color” in the menu 34. Thereafter, the user deletes the color of the sleeve portion by pointing to the pixel of the sleeve portion (that is, the pixel whose color is to be deleted). For example, the user performs processing as shown in FIG. 4 when emphasizing a portion other than the sleeve and not emphasizing the color of the sleeve portion.
- The user also performs the above processing when the user wants to search for an image of a long-sleeved cut-and-sew in which the color of the sleeve portion of the long-sleeved cut-and-sew shown in the original image 32 is changed to some desired color.
- When deleting the pattern of the sleeve portion of the original image 32, the user first selects "Delete pattern" in the menu 34. Thereafter, the user deletes the pattern of the sleeve portion by pointing to the pixels of the sleeve portion (that is, the pixels from which the pattern is to be deleted). For example, the user performs this processing when the portion other than the sleeve is important to the user and the pattern of the sleeve portion is not.
- FIG. 5 shows a case where the pattern of the sleeve portion of the original image 32 is changed.
- the user first selects “change pattern” in the menu 34.
- When "change pattern" is selected, a menu for changing the pattern is displayed, and the user changes the pattern of the sleeve portion to a desired pattern.
- When the user wants to search for an image of a long-sleeved cut-and-sew in which the pattern of the sleeve portion of the long-sleeved cut-and-sew shown in the original image 32 is changed to a desired pattern, the user performs processing as shown in FIG. 5.
- FIG. 6 shows a case where the sleeve portion of the original image 32 is deformed.
- the sleeve portion is deformed so as to shrink upward.
- the user first selects "deform" in the menu 34. Thereafter, the user designates a region surrounding the sleeve portion (that is, the portion to be deformed), and deforms the sleeve portion by, for example, shrinking that region upward. For example, when searching for a short-sleeved cut-and-sew that has the same color and pattern as the long-sleeved cut-and-sew shown in the original image 32, the user performs the processing shown in FIG. 6.
- FIG. 7 shows an example of the search result screen.
- FIG. 7 shows a search result screen 40 when the image search is executed based on the processed image 36 (see FIG. 4) obtained by deleting the color of the sleeve portion of the original image 32.
- the search result screen 40 shown in FIG. 7 indicates that eight images 42A, 42B, 42C, 42D, 42E, 42F, 42G, and 42H are found as images similar to the processed image 36. In FIG. 7, the images 42D, 42E, 42F, and 42G are omitted.
- the images 42A to 42H obtained as search results are displayed according to the output order.
- images are displayed in order from the highest output order. Therefore, the image 42A is the image with the highest output order, and the image 42H is the image with the lowest output order.
- the image 42H is the original image 32 (that is, the image before being processed). That is, the output order of the original image 32 is set to the lowest.
- the output order of the images 42A to 42H is set so that the lower the similarity to the original image 32 (that is, the image before being processed), the higher the output order. It is like that.
- The original image 32 is not likely to be the image that the user wants to find, and an image having a high degree of similarity to the original image 32 is also not likely to be the image that the user wants to find.
- In this embodiment, the output order is set as described above, so that the original image 32 itself and images having a high similarity to the original image 32 are not preferentially displayed.
- When the user selects one of the images on the search result screen 40, a product screen showing detailed information on the product (clothing) shown in that image is displayed (S104). For example, the product can be purchased on the product screen.
- FIG. 8 is a functional block diagram showing functions realized in the image search system 1.
- the image search system 1 includes an original image acquisition unit 50, a reception unit 52, a search condition setting unit 54, a search unit 56, and an output control unit 58.
- these functional blocks are realized in the server 10 or the user terminal 20.
- the original image acquisition unit 50, the reception unit 52, and the search condition setting unit 54 are realized in the user terminal 20, and the search unit 56 and the output control unit 58 are realized in the server 10. That is, when the control unit 21 of the user terminal 20 executes processing according to the program, the control unit 21 functions as the original image acquisition unit 50, the reception unit 52, and the search condition setting unit 54. Further, when the control unit 11 of the server 10 executes the process according to the program, the control unit 11 functions as the search unit 56 and the output control unit 58.
- the original image acquisition unit 50 acquires an original image 32 to be used for image search.
- the original image acquisition unit 50 acquires the original image 32 selected by the user on the original image selection screen.
- the accepting unit 52 accepts designation of a partial area to be processed in the original image 32 and processing contents for the partial area.
- the accepting unit 52 accepts processing applied to a partial area of the original image 32 by displaying the processing screen 30 on the display unit 25. That is, by accepting processing applied to a partial area of the original image 32, the accepting unit 52 accepts the designation of the partial area to be processed and the processing content for that area.
- In other words, in this embodiment, the user actually processes the original image 32, and the accepting unit 52 thereby receives the designation of the partial area to be processed and the processing content for that area.
- However, the accepting unit 52 may accept the designation of the partial area to be processed and the processing content for the partial area without having the user actually process the original image 32.
- The search condition setting unit 54 sets a search condition for searching for an image that is the same as or similar to the processed image 36, based on a plurality of types of feature information regarding the non-processing region of the processed image 36 obtained by applying the processing content specified by the user to a partial area of the original image 32, and at least one piece of feature information selected, based on the processing content applied to the processing region, from among a plurality of types of feature information regarding the processing region of the processed image 36.
- the “processing region” is a region including a portion designated as a processing target (that is, a portion subjected to processing).
- the “non-processed area” is an area other than the processed area.
- FIG. 9 is a diagram for explaining the processing region and the non-processing region.
- FIG. 9 shows a processed region and a non-processed region when the sleeve portion is designated as a processing target (that is, when the sleeve portion is processed).
- a rectangular region including the sleeve portion is set as the processing region 60.
- An area other than the processing region 60 is set as the non-processing region 62.
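- As a rough, hypothetical illustration only (not part of the publication), the rectangular processing region 60 and its complement, the non-processing region 62, could be derived from the pixels the user points at roughly as in the Python sketch below; the function name `split_regions` and its arguments are invented for this sketch.

```python
import numpy as np

def split_regions(image_shape, designated_pixels):
    """Return boolean masks for the processing region (60) and the
    non-processing region (62) from user-designated (row, col) pixels.
    Illustrative sketch only, not the method of the publication."""
    rows, cols = zip(*designated_pixels)
    top, bottom = min(rows), max(rows)     # rectangle enclosing the designated pixels
    left, right = min(cols), max(cols)

    processing = np.zeros(image_shape[:2], dtype=bool)
    processing[top:bottom + 1, left:right + 1] = True   # processing region 60
    non_processing = ~processing                        # non-processing region 62
    return processing, non_processing
```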
- the search condition setting unit 54 acquires a plurality of types of feature information related to the non-processed area 62 of the processed image 36.
- the "plurality of types of feature information" is, for example, the following feature information: (1) feature information regarding color (feature values), (2) feature information regarding pattern (feature values), and (3) feature information regarding shape (feature values).
- A known method can be adopted for extracting each type of feature information.
- The search condition setting unit 54 acquires, from among the plurality of types of feature information regarding the processing region 60 of the processed image 36, at least one piece of feature information selected based on the processing content applied to the processing region 60.
- The "plurality of types of feature information" in this case is also feature information such as (1) to (3) above.
- That is, the search condition setting unit 54 selects, based on the processing content for the processing region 60, the feature information used for the image search from among the feature information (1) to (3) regarding the processing region 60 of the processed image 36.
- FIG. 10 shows an example of the correspondence between the processing content applied to the processing region 60 and the feature information of the processing region 60 used for the image search.
- In the "processing content for the processing region" column of FIG. 10, a circle indicates the processing content specified for the processing region 60 (that is, the processing content applied to the processing region 60). In the "feature information of the processing region used for image search" column of FIG. 10, a circle indicates that the feature information is used for the image search, and a blank indicates that the feature information is not used for the image search.
- In FIG. 10(A), a circle is added to the "color-deletion" column in the "processing content for the processing region" column. This shows a case where "color deletion" is designated as the processing content for the processing region 60.
- In FIG. 10(A), circles are added to the "pattern" column and the "shape" column in the "feature information of the processing region used for image search" column, and the "color" column is blank. This indicates that the feature information regarding the pattern and shape of the processing region 60 is used for the image search, and the feature information regarding the color of the processing region 60 is not used for the image search.
- That is, when the color of the sleeve portion is deleted as shown in FIG. 4, only the feature information regarding the pattern and the shape, among the feature information (1) to (3) regarding the processing region 60, is used for the image search.
- The feature information regarding the color is not used for the image search.
- The case where the user deletes the color of the sleeve portion is a case where the user does not particularly specify the color of the sleeve portion.
- For this reason, the feature information regarding the color of the processing region 60 is not used for the image search.
- On the other hand, the case where the user does not delete or change the pattern of the sleeve portion is a case where the user wants to search for clothes having a sleeve portion with the same or a similar pattern to the sleeve portion shown in the original image 32.
- Similarly, the case where the user does not deform the sleeve portion is a case where the user wants to search for clothes having a sleeve portion with the same or a similar shape to the sleeve portion shown in the original image 32.
- For this reason, the feature information regarding the pattern and the shape of the processing region 60 is used for the image search.
- FIG. 10B shows a case where “color deletion” and “deformation” are designated as the processing contents for the processing region 60.
- In FIG. 10(B), circles are added to the "pattern" column and the "shape" column in the "feature information of the processing region used for image search" column, and the "color" column is blank. This indicates that the feature information regarding the pattern and shape of the processing region 60 is used for the image search, and the feature information regarding the color of the processing region 60 is not used for the image search.
- The case where the user deletes the color of the sleeve portion and deforms the sleeve portion is a case where the user does not specify the color of the sleeve portion but positively specifies the shape of the sleeve portion. For this reason, the feature information regarding the color of the processing region 60 is not used for the image search, while the feature information regarding the shape of the processing region 60 is used for the image search.
- The case where the user does not delete or change the pattern of the sleeve portion is a case where the user wants to search for clothes having a sleeve portion with the same or a similar pattern to the sleeve portion shown in the original image 32. For this reason, the feature information regarding the pattern of the processing region 60 is used for the image search.
- In FIG. 10(C), a circle is added to the "pattern-deletion" column in the "processing content for the processing region" column. This shows a case where "pattern deletion" is designated as the processing content for the processing region 60.
- In the "feature information of the processing region used for image search" column, circles are added to the "color" column and the "shape" column, and the "pattern" column is blank. This indicates that the feature information regarding the color and shape of the processing region 60 is used for the image search, and the feature information regarding the pattern of the processing region 60 is not used for the image search.
- The case where the user has deleted the pattern of the sleeve portion is a case where the user has not specified the pattern of the sleeve portion. For this reason, the feature information regarding the pattern of the processing region 60 is not used for the image search.
- The case where the user does not delete or change the color of the sleeve portion is a case where the user wants to search for clothes having a sleeve portion with the same or a similar color to the sleeve portion shown in the original image 32.
- Likewise, the case where the user does not deform the sleeve portion is a case where the user wants to search for clothes having a sleeve portion with the same or a similar shape to the sleeve portion shown in the original image 32. For this reason, the feature information regarding the color and shape of the processing region 60 is used for the image search.
- In FIG. 10(D), circles are added to the "pattern-deletion" column and the "deformation" column in the "processing content for the processing region" column. This shows a case where "pattern deletion" and "deformation" are designated as the processing content for the processing region 60.
- In the "feature information of the processing region used for image search" column, circles are added to the "color" column and the "shape" column, and the "pattern" column is blank. This indicates that the feature information regarding the color and shape of the processing region 60 is used for the image search, and the feature information regarding the pattern of the processing region 60 is not used for the image search.
- The case where the user deletes the pattern of the sleeve portion and deforms the sleeve portion is a case where the user does not specify the pattern of the sleeve portion but positively specifies the shape of the sleeve portion. For this reason, the feature information regarding the pattern of the processing region 60 is not used for the image search, while the feature information regarding the shape of the processing region 60 is used for the image search.
- The case where the user does not delete or change the color of the sleeve portion is a case where the user wants to search for clothes having a sleeve portion with the same or a similar color to the sleeve portion shown in the original image 32. For this reason, the feature information regarding the color of the processing region 60 is used for the image search.
- In FIG. 10(E), circles are added to the "color-deletion" column and the "pattern-deletion" column in the "processing content for the processing region" column. This shows a case where "color deletion" and "pattern deletion" are designated as the processing content for the processing region 60.
- In the "feature information of the processing region used for image search" column, a circle is added to the "shape" column, and the "color" and "pattern" columns are blank. This indicates that only the feature information regarding the shape of the processing region 60 is used for the image search, and the feature information regarding the color and pattern of the processing region 60 is not used for the image search.
- the case where the user deletes the color and pattern of the sleeve portion is a case where the user does not specify the color and pattern of the sleeve portion. For this reason, the feature information regarding the color and pattern of the processing region 60 is not used for the image search.
- the case where the user has not deformed the sleeve portion is a case where the user wants to search for clothes having a sleeve portion having the same or similar shape as the sleeve portion shown in the original image 32. For this reason, the feature information regarding the shape of the processing region 60 is used for image retrieval.
- In FIG. 10(F), circles are added to the "color-deletion" column, the "pattern-deletion" column, and the "deformation" column in the "processing content for the processing region" column.
- A circle is added to the "shape" column in the "feature information of the processing region used for image search" column, and the "color" and "pattern" columns are blank. This indicates that the feature information regarding the shape of the processing region 60 is used for the image search, and the feature information regarding the color and pattern of the processing region 60 is not used for the image search.
- The case where the user deletes the color and pattern of the sleeve portion and deforms the sleeve portion is a case where the user does not specify the color and pattern of the sleeve portion but positively specifies the shape of the sleeve portion. For this reason, the feature information regarding the color and pattern of the processing region 60 is not used for the image search, while the feature information regarding the shape of the processing region 60 is used for the image search.
- In FIG. 10(G), a circle is added to the "color-change" column in the "processing content for the processing region" column. This shows a case where "color change" is designated as the processing content for the processing region 60.
- Circles are added to the "color" column, the "pattern" column, and the "shape" column in the "feature information of the processing region used for image search" column.
- The case where the user changes the color of the sleeve portion is a case where the user positively specifies the color of the sleeve portion. For this reason, the feature information regarding the color of the processing region 60 is used for the image search.
- The case where the user does not delete or change the pattern of the sleeve portion is a case where the user wants to search for clothes having a sleeve portion with the same or a similar pattern to the sleeve portion shown in the original image 32. For this reason, the feature information regarding the pattern of the processing region 60 is also used for the image search.
- Likewise, the case where the user does not deform the sleeve portion is a case where the user wants to search for clothes having a sleeve portion with the same or a similar shape to the sleeve portion shown in the original image 32.
- feature information related to the shape of the processing region 60 is also used for image retrieval.
- In FIG. 10(H), a circle is added to the "pattern-change" column in the "processing content for the processing region" column.
- This shows a case where "pattern change" is designated as the processing content for the processing region 60.
- Circles are added to the "color" column, the "pattern" column, and the "shape" column in the "feature information of the processing region used for image search" column.
- That is, when the pattern of the sleeve portion is changed as shown in FIG. 5, all of the feature information (1) to (3) regarding the processing region 60 is used for the image search.
- the case where the user changes the pattern of the sleeve portion is a case where the user positively specifies the pattern of the sleeve portion.
- the feature information regarding the pattern of the processing area 60 is used for image search.
- The case where the user does not delete or change the color of the sleeve portion is a case where the user wants to search for clothes having a sleeve portion with the same or a similar color to the sleeve portion shown in the original image 32.
- For this reason, the feature information regarding the color of the processing region 60 is also used for the image search.
- Likewise, the case where the user does not deform the sleeve portion is a case where the user wants to search for clothes having a sleeve portion with the same or a similar shape to the sleeve portion shown in the original image 32.
- feature information related to the shape of the processing region 60 is also used for image retrieval.
- In FIG. 10(I), a circle is added to the "deformation" column in the "processing content for the processing region" column. This shows a case where "deformation" is designated as the processing content for the processing region 60.
- Circles are added to the "color" column, the "pattern" column, and the "shape" column in the "feature information of the processing region used for image search" column.
- That is, when the sleeve portion is deformed as shown in FIG. 6, all of the feature information (1) to (3) regarding the processing region 60 is used for the image search.
- the case where the user deforms the sleeve portion is a case where the user positively specifies the shape of the sleeve portion.
- the feature information regarding the shape of the processing region 60 is used for image retrieval.
- The case where the user has not deleted or changed the color and pattern of the sleeve portion is a case where the user wants to search for clothes having a sleeve portion with the same or a similar color and pattern to the sleeve portion shown in the original image 32. For this reason, the feature information regarding the color and pattern of the processing region 60 is also used for the image search.
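- To summarize the correspondence of FIG. 10 described above, the selection of processing-region feature types can be sketched as follows; this Python snippet uses invented names and is only an illustration of the rule that a deleted property is not compared, while changed, deformed, or untouched properties are.

```python
ALL_FEATURE_TYPES = {"color", "pattern", "shape"}

def features_for_processing_region(operations):
    """operations: subset of {"color_deletion", "color_change",
    "pattern_deletion", "pattern_change", "deformation"} applied to region 60."""
    used = set(ALL_FEATURE_TYPES)
    if "color_deletion" in operations:    # deleted color -> color is not specified
        used.discard("color")
    if "pattern_deletion" in operations:  # deleted pattern -> pattern is not specified
        used.discard("pattern")
    # Color change, pattern change, and deformation positively specify a property,
    # so they do not remove anything from the comparison.
    return used

# Example corresponding to FIG. 10(F): color and pattern deleted, sleeve deformed.
assert features_for_processing_region(
    {"color_deletion", "pattern_deletion", "deformation"}) == {"shape"}
```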
- the search unit 56 searches for the same or similar image as the processed image 36 based on the search condition set by the search condition setting unit 54. Details will be described later (see FIG. 12).
- the output control unit 58 outputs the search result of the search unit 56 to the output unit.
- the output control unit 58 causes the display unit 25 to display the search result screen 40.
- the output control unit 58 determines the similarity between the image searched by the search unit 56 and the original image 32 (that is, the original image that has not been processed). Based on the above, the output order of the images searched by the search unit 56 is set. Then, the output control unit 58 causes the output unit to output the images searched by the search unit 56 according to the output order.
- the output control unit 58 sets the output order of images having a low similarity to the original image 32 higher than the output order of images having a high similarity to the original image 32.
- For example, suppose that a first image and a second image are included in the images found by the search unit 56, and that the similarity between the original image 32 and the first image is lower than the similarity between the original image 32 and the second image.
- the output control unit 58 sets the output order of the first image higher than the output order of the second image. In other words, the output control unit 58 lowers the output order of the second image from the output order of the first image.
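- A minimal sketch of this ordering rule, assuming a similarity score in [0, 1] has already been computed per image (as in S308 below); the names are illustrative only.

```python
def order_results(found_image_ids, similarity_to_original):
    """Sort found images so that images LESS similar to the unprocessed
    original image 32 are output first (i.e., given a higher output order)."""
    return sorted(found_image_ids, key=lambda i: similarity_to_original[i])

# Example: the first image (similarity 0.3) is ranked above the second (0.9).
print(order_results(["second", "first"], {"first": 0.3, "second": 0.9}))
# -> ['first', 'second']
```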
- FIG. 11 is a flowchart illustrating an example of processing executed when processing on the original image 32 is completed on the processing screen 30 and a search instruction operation is performed by the user.
- When the control unit 21 of the user terminal 20 executes the processing shown in FIG. 11 according to the program, the control unit 21 functions as the search condition setting unit 54.
- When the control unit 11 of the server 10 executes the processing shown in FIG. 11 according to the program, the control unit 11 functions as the search unit 56 and the output control unit 58.
- As shown in FIG. 11, the control unit 21 of the user terminal 20 first specifies the processing region 60 and the non-processing region 62 based on the processing performed on the processing screen 30, that is, based on the processed image 36 (S201).
- The control unit 21 then acquires feature information regarding the non-processing region 62 of the processed image 36 (S202). For example, the control unit 21 acquires feature information regarding the color, pattern, and shape of the non-processing region 62.
- The control unit 21 also acquires feature information regarding the processing region 60 of the processed image 36 (S203). For example, the control unit 21 acquires at least one piece of the feature information regarding the color, pattern, and shape of the processing region 60, based on the processing content applied to the processing region 60 and the correspondence illustrated in FIG. 10.
- The control unit 21 then requests the server 10 to execute an image search (S204). For example, the control unit 21 transmits the following information to the server 10: the identification information of the original image 32; information indicating the processing region 60 and the non-processing region 62; the feature information regarding the processing region 60; and the feature information regarding the non-processing region 62.
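- For illustration only, the information sent in S204 could be shaped like the following request payload; all field names and values here are invented, not taken from the publication.

```python
# Hypothetical shape of the search request sent from the user terminal in S204.
search_request = {
    "original_image_id": "item-00123",
    "processing_region": {"top": 40, "left": 10, "bottom": 120, "right": 90},
    # Only the feature types selected per FIG. 10 (here: color was deleted).
    "processing_region_features": {
        "pattern": [0.12, 0.55, 0.33],
        "shape": [0.80, 0.07, 0.13],
    },
    # The non-processing region always contributes color, pattern, and shape.
    "non_processing_region_features": {
        "color": [0.40, 0.35, 0.25],
        "pattern": [0.22, 0.48, 0.30],
        "shape": [0.61, 0.19, 0.20],
    },
}
```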
- The control unit 11 of the server 10 executes an image search based on the information received from the user terminal 20 (S205).
- each image stored in the database 16 is set as a comparison target with the processed image 36, and for example, a process as shown in FIG. 12 is executed.
- the background portion (portion other than clothing) of the comparison target image is removed.
- an image from which the background portion has been removed in advance is stored in the database 16.
- the control unit 11 acquires feature information of the non-processed area 62 of the image (comparison target image) set as the comparison target with the processed image 36 (S301).
- the “non-processed region 62 of the comparison target image” means a region of the comparison target image corresponding to the non-processed region 62 of the processed image 36.
- The control unit 11 calculates the similarity (first similarity) between the feature information of the non-processing region 62 of the processed image 36 and the feature information of the non-processing region 62 of the comparison target image (S302). That is, the control unit 11 calculates this similarity by comparing the feature information regarding the color, pattern, and shape of the non-processing region 62 of the processed image 36 with the feature information regarding the color, pattern, and shape of the non-processing region 62 of the comparison target image.
- The control unit 11 determines whether or not the similarity (first similarity) calculated in step S302 is equal to or higher than a threshold value (first reference similarity) (S303).
- When the first similarity is equal to or higher than the threshold value, the control unit 11 acquires the feature information of the processing region 60 of the comparison target image (S304).
- the processed region 60 of the comparison target image means a region of the comparison target image corresponding to the processed region 60 of the processed image 36.
- The control unit 11 calculates the similarity (second similarity) between the feature information of the processing region 60 of the processed image 36 and the feature information of the processing region 60 of the comparison target image (S305).
- For example, when the processed image 36 is an image obtained by performing the "color deletion" processing on the processing region 60 of the original image 32, as shown in FIG. 4, the feature information regarding the pattern and shape of the processing region 60 of the processed image 36 is used, and the feature information regarding the color of the processing region 60 of the processed image 36 is not used. Therefore, the similarity is calculated by comparing the feature information regarding the pattern and shape of the processing region 60 of the processed image 36 with the feature information regarding the pattern and shape of the processing region 60 of the comparison target image.
- Similarly, when the processed image 36 is an image obtained by performing the "pattern deletion" processing on the processing region 60 of the original image 32, the feature information regarding the color and shape of the processing region 60 of the processed image 36 is used, and the feature information regarding the pattern of the processing region 60 of the processed image 36 is not used. Therefore, the similarity is calculated by comparing the feature information regarding the color and shape of the processing region 60 of the processed image 36 with the feature information regarding the color and shape of the processing region 60 of the comparison target image.
- Likewise, when the processed image 36 is an image obtained by performing the "color deletion" and "pattern deletion" processing on the processing region 60 of the original image 32, only the feature information regarding the shape of the processing region 60 of the processed image 36 is used. Therefore, the similarity is calculated by comparing the feature information regarding the shape of the processing region 60 of the processed image 36 with the feature information regarding the shape of the processing region 60 of the comparison target image.
- When the processed image 36 is an image obtained by performing only the "pattern change" processing on the processing region 60 of the original image 32, as shown in FIG. 5, the feature information regarding the color, pattern, and shape of the processing region 60 of the processed image 36 is used. For this reason, the similarity is calculated by comparing the feature information regarding the color, pattern, and shape of the processing region 60 of the processed image 36 with the feature information regarding the color, pattern, and shape of the processing region 60 of the comparison target image.
- When the processed image 36 is an image obtained by performing the "deformation" processing on the processing region 60 of the original image 32, as shown in FIG. 6, the feature information regarding the color, pattern, and shape of the processing region 60 is used.
- In this case as well, the similarity is calculated by comparing the feature information regarding the color, pattern, and shape of the processing region 60 of the processed image 36 with the feature information regarding the color, pattern, and shape of the processing region 60 of the comparison target image.
- The control unit 11 then determines whether or not the similarity calculated in step S305 is equal to or higher than a threshold value (second reference similarity) (S306).
- The threshold value (second reference similarity) in step S306 is set lower than the threshold value (first reference similarity) in step S303. That is, a looser search is executed for the processing region 60 than for the non-processing region 62. In other words, a relatively loose search is performed on the part that the user has intentionally processed, so that as many search results as possible can be presented to the user.
- When the similarity calculated in step S305 is lower than the threshold value (second reference similarity), the control unit 11 determines that the comparison target image and the processed image 36 are not similar.
- On the other hand, when the similarity is equal to or higher than the threshold value, the control unit 11 determines that the comparison target image and the processed image 36 are similar (S307). Further, in this case, the control unit 11 acquires the degree of similarity between the comparison target image and the original image 32 (that is, the original image that has not been processed) (S308). For example, the control unit 11 calculates this similarity by comparing the feature information of the comparison target image with the feature information of the original image 32. The similarity calculated in step S308 is used to determine the output order when the comparison target image is displayed on the search result screen 40.
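- The two-stage determination of steps S301 to S307 can be sketched as follows; the threshold values, the `similarity` helper, and all names are assumptions made for this illustration, not values from the publication.

```python
def is_similar_to_processed_image(processed_feats, candidate_feats, selected_types,
                                  similarity, first_ref=0.8, second_ref=0.6):
    """processed_feats / candidate_feats: dicts with "non_processing" and
    "processing" sub-dicts mapping feature type -> feature vector.
    selected_types: feature types kept for region 60 per the FIG. 10 rule.
    similarity: function mapping two such sub-dicts to a score in [0, 1]."""
    # S302/S303: strict check on the non-processing regions (color, pattern, shape).
    first = similarity(processed_feats["non_processing"],
                       candidate_feats["non_processing"])
    if first < first_ref:
        return False
    # S305/S306: looser check on the processing regions, restricted to the
    # feature types selected from the processing content.
    second = similarity(
        {t: processed_feats["processing"][t] for t in selected_types},
        {t: candidate_feats["processing"][t] for t in selected_types})
    return second >= second_ref   # S307: similar when both checks pass
```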
- The control unit 11 transmits the search result to the user terminal 20 (S206). That is, the control unit 11 transmits data indicating the list of images found in step S205 (that is, the images determined in step S307 to be similar to the processed image 36) to the user terminal 20, whereby the search result screen is displayed on the display unit 25 of the user terminal 20.
- the control unit 11 sets the output order of the images searched in step S205 based on the similarity acquired in step S308. That is, the lower the similarity obtained in step S308, the higher the output order is set.
- the control unit 21 of the user terminal 20 displays a search result screen on the display unit 25 (S207). This is the end of the description of the processing illustrated in FIG.
- the image search system 1 it is possible to search for an image in which a specific part of the original image 32 is different. For example, it becomes possible to search for a clothing image in which at least one of the color, pattern, and shape of the sleeve portion of the clothing image (original image 32) is different.
- the original image 32 is not the image that the user wants to find.
- images that are not required by the user are not displayed on the search result screen 40.
- Note that, since the non-processing region 62 of the processed image 36 is the same as the non-processing region 62 of the original image 32, the feature information of the non-processing region 62 may be acquired from the original image 32.
- For example, the accepting unit 52 may determine that "deletion of the color of the sleeve portion" has been specified.
- Likewise, the accepting unit 52 may determine that "deletion of the pattern of the sleeve portion" has been specified.
- For example, the search condition setting unit 54 may set a search condition for searching for an image similar to the original image 32, based on the plurality of types of feature information regarding the non-processing region 62 of the original image 32 and at least one piece of feature information selected, based on the processing content for the processing region 60, from among the plurality of types of feature information regarding the processing region 60 of the original image 32.
- For example, the search condition setting unit 54 may set a search condition for searching for an image similar to the original image 32, based on the feature information regarding the color, pattern, and shape of the non-processing region 62 of the original image 32 and the feature information regarding the pattern and shape of the processing region 60 of the original image 32.
- In this case, in step S202 of FIG. 11, the control unit 21 acquires feature information regarding the non-processing region 62 of the original image 32.
- the control unit 21 acquires feature information regarding the color, pattern, and shape of the non-processed region 62 of the original image 32.
- In step S203, the control unit 21 acquires feature information regarding the processing region 60 of the original image 32.
- That is, the control unit 21 acquires at least one piece of the feature information regarding the color, pattern, and shape of the processing region 60 of the original image 32, based on the processing content specified for the processing region 60 and the correspondence shown in FIG. 10. For example, if "color deletion" is designated as the processing content for the sleeve portion ((A) in FIG. 10), the control unit 21 acquires the feature information regarding the pattern and shape of the processing region 60 of the original image 32.
- In step S302 of FIG. 12, the control unit 11 calculates the similarity (first similarity) between the feature information of the non-processing region 62 of the original image 32 and the feature information of the non-processing region 62 of the comparison target image.
- In step S305, the control unit 11 calculates the similarity (second similarity) between the feature information of the processing region 60 of the original image 32 and the feature information of the processing region 60 of the comparison target image. For example, if "color deletion" is designated as the processing content for the sleeve portion ((A) in FIG. 10), the control unit 11 calculates the degree of similarity between the feature information regarding the pattern and shape of the processing region 60 of the original image 32 and the feature information regarding the pattern and shape of the processing region 60 of the comparison target image.
- Further, in step S203, the control unit 21 may acquire the feature information regarding the color and shape of the processing region 60 of the original image 32.
- In this case, in step S305, the control unit 11 may calculate the similarity between the feature information regarding the color and shape of the processing region 60 of the original image 32 and the feature information regarding the color and shape of the processing region 60 of the comparison target image. Even in this case, it is possible to search for a clothing image that has a body portion with the same or a similar color, pattern, and shape to the body portion (the portion other than the sleeves) of the clothing image (original image 32) and a sleeve portion with the same or a similar color and shape to the sleeve portion of the clothing image (original image 32).
- Similarly, in step S203, the control unit 21 may acquire only the feature information regarding the shape of the processing region 60 of the original image 32.
- In this case, in step S305, the control unit 11 may calculate the similarity between the feature information regarding the shape of the processing region 60 of the original image 32 and the feature information regarding the shape of the processing region 60 of the comparison target image. Even in this case, it is possible to search for a clothing image that has a body portion with the same or a similar color, pattern, and shape to the body portion (the portion other than the sleeves) of the clothing image (original image 32) and a sleeve portion with the same or a similar shape to the sleeve portion of the clothing image (original image 32).
- the feature information of the non-processed area 62 of the original image 32 is acquired in the modification [4] described above. Instead, the feature information of the non-processed area 62 of the processed image 36 may be acquired.
- the search unit 56 and the output control unit 58 may be realized in the user terminal 20. That is, the user terminal 20 may directly access the database 16.
- the search condition setting unit 54 may be realized in the server 10.
- an image stored in the database 16 may be divided into fine blocks, and feature information calculated in advance for each block may be stored in the database 16. Then, in steps S301 and S304 of FIG. 12, the feature information of the comparison target image may be acquired based on the feature information for each block stored in the database 16. By doing so, the processing load may be reduced.
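- A possible way to exploit such per-block feature information, given here only as an assumption about how the aggregation might be done (averaging is an illustrative choice, not a detail of the publication):

```python
import numpy as np

def region_features_from_blocks(block_features, block_mask):
    """Approximate a region's feature vector from precomputed per-block features.

    block_features: array of shape (H_blocks, W_blocks, D) stored in the database.
    block_mask: boolean array of shape (H_blocks, W_blocks) marking the blocks
    covered by the processing or non-processing region of the comparison image.
    """
    selected = block_features[block_mask]   # (N_selected, D)
    return selected.mean(axis=0)            # e.g. average the block feature vectors
```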
- 1 image search system, 2 communication network, 10 server, 11, 21 control unit, 12, 22 main storage unit, 13, 23 auxiliary storage unit, 14, 24 communication unit, 15 optical disc drive unit, 20 user terminal, 25 display unit, 26 audio output unit, 27 operation unit, 30 processing screen, 32 original image, 34 menu, 36 processed image, 40 search result screen, 42A, 42B, 42C, 42D, 42E, 42F, 42G, 42H image, 50 original image acquisition unit, 52 reception unit, 54 search condition setting unit, 56 search unit, 58 output control unit, 60 processing region, 62 non-processing region.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Library & Information Science (AREA)
- Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Mathematical Physics (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
(1) Color deletion (color removal)
(2) Color change
(3) Pattern deletion
(4) Pattern change
(5) Deformation
(1) Feature information regarding color (feature values)
(2) Feature information regarding pattern (feature values)
(3) Feature information regarding shape (feature values)
- Identification information of the original image 32
- Information indicating the processing region 60 and the non-processing region 62
- Feature information regarding the processing region 60
- Feature information regarding the non-processing region 62
Claims (9)
- An image search system comprising:
an original image acquisition means for acquiring an original image;
a reception means for receiving a designation of a partial region of the original image that is to be processed, and processing content for the partial region;
a search means for executing an image search based on a plurality of types of feature information relating to a region, other than the partial region, of a processed image obtained by applying processing of the processing content to the partial region of the original image, or of the original image, and on at least one piece of feature information selected based on the processing content from among the plurality of types of feature information relating to the partial region of the processed image or the original image; and
an output control means for causing an output means to output a search result of the search means. - The image search system according to claim 1,
wherein, in a case where the search result of the search means includes the original image, the output control means does not cause the output means to output the original image as a search result of the search means. - The image search system according to claim 1,
wherein the output control means includes:
an output order setting means for setting an output order of the images found by the search means based on degrees of similarity between the images found by the search means and the original image; and
a means for causing the output means to output the images found by the search means in accordance with the output order,
and wherein the output order setting means sets the output order of an image having a low degree of similarity to the original image higher than the output order of an image having a high degree of similarity to the original image. - The image search system according to claim 1,
wherein the search means includes a determination means for determining whether or not a target image is identical or similar to the processed image, and searches for an image identical or similar to the processed image,
the determination means including:
a means for determining whether or not a first degree of similarity, which indicates a degree of similarity between the plurality of types of feature information relating to the region, other than the partial region, of the processed image or the original image and the plurality of types of feature information relating to a region of the target image corresponding to the region other than the partial region of the processed image, is equal to or higher than a first reference degree of similarity; and
a means for determining whether or not a second degree of similarity, which indicates a degree of similarity between the at least one piece of feature information relating to the partial region of the processed image and the at least one piece of feature information relating to a region of the target image corresponding to the partial region of the processed image, is equal to or higher than a second reference degree of similarity that is lower than the first reference degree of similarity,
and wherein the determination means determines that the target image is identical or similar to the processed image when the first degree of similarity is determined to be equal to or higher than the first reference degree of similarity and the second degree of similarity is determined to be equal to or higher than the second reference degree of similarity. - The image search system according to claim 1,
wherein the search means includes a determination means for determining whether or not a target image is similar to the original image, and searches for an image similar to the original image,
the determination means including:
a means for determining whether or not a first degree of similarity, which indicates a degree of similarity between the plurality of types of feature information relating to the region, other than the partial region, of the original image and the plurality of types of feature information relating to a region of the target image corresponding to the region other than the partial region of the original image, is equal to or higher than a first reference degree of similarity; and
a means for determining whether or not a second degree of similarity, which indicates a degree of similarity between the at least one piece of feature information relating to the partial region of the original image and the at least one piece of feature information relating to a region of the target image corresponding to the partial region of the original image, is equal to or higher than a second reference degree of similarity that is lower than the first reference degree of similarity,
and wherein the determination means determines that the target image is similar to the original image when the first degree of similarity is determined to be equal to or higher than the first reference degree of similarity and the second degree of similarity is determined to be equal to or higher than the second reference degree of similarity. - An image search method comprising:
an original image acquisition step of acquiring an original image;
a reception step of receiving a designation of a partial region of the original image that is to be processed, and processing content for the partial region;
a search step of executing an image search based on a plurality of types of feature information relating to a region, other than the partial region, of a processed image obtained by applying processing of the processing content to the partial region of the original image, or of the original image, and on at least one piece of feature information selected based on the processing content from among the plurality of types of feature information relating to the partial region of the processed image; and
an output control step of causing an output means to output a search result of the search step. - An image search device comprising:
a means for acquiring processing content for a partial region of an original image; and
a search condition setting means for setting a search condition for an image search based on a plurality of types of feature information relating to a region, other than the partial region, of a processed image obtained by applying processing of the processing content to the partial region of the original image, or of the original image, and on at least one piece of feature information selected based on the processing content from among the plurality of types of feature information relating to the partial region of the processed image or the original image. - A program for causing a computer to function as:
a means for acquiring processing content for a partial region of an original image, and
a search condition setting means for setting a search condition for an image search based on a plurality of types of feature information relating to a region, other than the partial region, of a processed image obtained by applying processing of the processing content to the partial region of the original image, and on at least one piece of feature information selected based on the processing content from among the plurality of types of feature information relating to the partial region of the processed image. - A computer-readable information storage medium storing a program for causing a computer to function as:
a means for acquiring processing content for a partial region of an original image, and
a search condition setting means for setting a search condition for an image search based on a plurality of types of feature information relating to a region, other than the partial region, of a processed image obtained by applying processing of the processing content to the partial region of the original image, and on at least one piece of feature information selected based on the processing content from among the plurality of types of feature information relating to the partial region of the processed image.
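As an illustration of the two-threshold judgement recited in claims 4 and 5 above, the following is a minimal Python sketch; cosine similarity and the numeric reference values stand in for the unspecified similarity measure and thresholds, and the function and constant names are assumptions, not the claimed implementation.

```python
# Hypothetical sketch of the two-threshold comparison: the region outside the
# partial region must match strictly (first reference similarity), while the
# partial (processed) region is held to a looser second reference similarity.
import numpy as np

FIRST_REFERENCE_SIMILARITY = 0.9    # illustrative value for the non-processed region
SECOND_REFERENCE_SIMILARITY = 0.6   # lower threshold for the processed region

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two feature vectors (0 when either is zero)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def is_match(query_outside, target_outside, query_inside, target_inside) -> bool:
    """query_outside/target_outside: all feature vectors for the region other
    than the partial region; query_inside/target_inside: only the feature
    vectors selected on the basis of the processing content."""
    first_similarity = np.mean([cosine(q, t) for q, t in zip(query_outside, target_outside)])
    second_similarity = np.mean([cosine(q, t) for q, t in zip(query_inside, target_inside)])
    return (first_similarity >= FIRST_REFERENCE_SIMILARITY
            and second_similarity >= SECOND_REFERENCE_SIMILARITY)
```

Because the second reference degree of similarity is lower than the first, a target image can still be judged a match even when the features of the edited partial region differ noticeably from the query, which is what allows images that differ mainly in the processed region to be retrieved.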
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2011/080535 WO2013099038A1 (ja) | 2011-12-29 | 2011-12-29 | 画像検索システム、画像検索方法、画像検索装置、プログラム、及び情報記憶媒体 |
US14/369,205 US9600495B2 (en) | 2011-12-29 | 2011-12-29 | Image search system, image search method, image search device, program, and information recording medium |
JP2013551170A JP5788996B2 (ja) | 2011-12-29 | 2011-12-29 | 画像検索システム、画像検索方法、画像検索装置、プログラム、及び情報記憶媒体 |
TW101145043A TWI533149B (zh) | 2011-12-29 | 2012-11-30 | Image retrieval device, image retrieval method, computer program product, and information memory media |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2011/080535 WO2013099038A1 (ja) | 2011-12-29 | 2011-12-29 | 画像検索システム、画像検索方法、画像検索装置、プログラム、及び情報記憶媒体 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013099038A1 true WO2013099038A1 (ja) | 2013-07-04 |
Family
ID=48696596
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/080535 WO2013099038A1 (ja) | 2011-12-29 | 2011-12-29 | 画像検索システム、画像検索方法、画像検索装置、プログラム、及び情報記憶媒体 |
Country Status (4)
Country | Link |
---|---|
US (1) | US9600495B2 (ja) |
JP (1) | JP5788996B2 (ja) |
TW (1) | TWI533149B (ja) |
WO (1) | WO2013099038A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016085706A (ja) * | 2014-10-29 | 2016-05-19 | 株式会社フィール | 情報提供システム、および情報公開装置 |
JP2016103235A (ja) * | 2014-11-28 | 2016-06-02 | 日本電信電話株式会社 | 画像検索装置、画像登録装置、画像特徴選択装置、方法、及びプログラム |
JP2018142074A (ja) * | 2017-02-27 | 2018-09-13 | 三菱重工業株式会社 | 特徴量算出装置、画像類似度判定装置、画像検索装置、特徴量算出方法及びプログラム |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015059838A1 (ja) * | 2013-10-25 | 2015-04-30 | 楽天株式会社 | 検索システム、検索条件設定装置、検索条件設定装置の制御方法、プログラム、及び情報記憶媒体 |
CN110674837A (zh) * | 2019-08-15 | 2020-01-10 | 深圳壹账通智能科技有限公司 | 视频相似度获取方法、装置、计算机设备及存储介质 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08329135A (ja) * | 1995-05-30 | 1996-12-13 | Minolta Co Ltd | デザイン設計支援システム |
JPH11212990A (ja) * | 1998-01-26 | 1999-08-06 | Toray Ind Inc | 画像の検索装置および画像の検索表示方法ならびに物品の製造方法 |
JP2005293129A (ja) * | 2004-03-31 | 2005-10-20 | Toto Ltd | 物品特定システム及び方法 |
JP2006155588A (ja) * | 2004-11-05 | 2006-06-15 | Fuji Xerox Co Ltd | 画像処理装置、画像処理方法及び画像処理プログラム |
JP2009288928A (ja) * | 2008-05-28 | 2009-12-10 | Fujifilm Corp | 服飾検索方法及び装置、服飾検索プログラム、並びに服飾登録装置 |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6002798A (en) * | 1993-01-19 | 1999-12-14 | Canon Kabushiki Kaisha | Method and apparatus for creating, indexing and viewing abstracted documents |
US6101000A (en) * | 1998-01-30 | 2000-08-08 | Eastman Kodak Company | Photographic processing apparatus and method |
JP2000163576A (ja) * | 1998-11-25 | 2000-06-16 | Hitachi Ltd | 画像検索方法及びその実施装置並びにその処理プログラムを記録した媒体 |
JP4245872B2 (ja) * | 2002-08-28 | 2009-04-02 | 富士フイルム株式会社 | 類似度判定方法および装置並びにプログラム |
JP2004206689A (ja) * | 2002-12-11 | 2004-07-22 | Fuji Photo Film Co Ltd | 画像修正装置および画像修正プログラム |
JP4266784B2 (ja) * | 2003-11-14 | 2009-05-20 | キヤノン株式会社 | 画像処理システム及び画像処理方法 |
JP2006018551A (ja) * | 2004-07-01 | 2006-01-19 | Sony Corp | 情報処理装置および方法、並びにプログラム |
JP4747828B2 (ja) * | 2005-12-21 | 2011-08-17 | 富士ゼロックス株式会社 | 履歴管理装置 |
JP4990917B2 (ja) * | 2006-02-23 | 2012-08-01 | イマジネスティクス エルエルシー | データベース内の構成部品を探索するための入力としてユーザが構成部品を描くことができるようにする方法 |
US20080240572A1 (en) * | 2007-03-26 | 2008-10-02 | Seiko Epson Corporation | Image Search Apparatus and Image Search Method |
US20090202179A1 (en) * | 2008-02-13 | 2009-08-13 | General Electric Company | method and system for providing region based image modification |
JP5412169B2 (ja) * | 2008-04-23 | 2014-02-12 | 株式会社日立ハイテクノロジーズ | 欠陥観察方法及び欠陥観察装置 |
JP5298831B2 (ja) * | 2008-12-19 | 2013-09-25 | 富士ゼロックス株式会社 | 画像処理装置及びプログラム |
EP2199952A1 (en) * | 2008-12-22 | 2010-06-23 | Nederlandse Organisatie voor toegepast-natuurwetenschappelijk Onderzoek TNO | Method and apparatus for identifying combinations of matching regions in images. |
JP5226553B2 (ja) * | 2009-02-06 | 2013-07-03 | キヤノン株式会社 | 画像処理装置、画像処理方法、プログラムおよび記録媒体 |
JP2011054081A (ja) * | 2009-09-04 | 2011-03-17 | Sony Corp | 画像処理装置および方法、並びにプログラム |
JP2011138420A (ja) | 2009-12-29 | 2011-07-14 | Rakuten Inc | 検索システム |
-
2011
- 2011-12-29 WO PCT/JP2011/080535 patent/WO2013099038A1/ja active Application Filing
- 2011-12-29 US US14/369,205 patent/US9600495B2/en active Active
- 2011-12-29 JP JP2013551170A patent/JP5788996B2/ja active Active
-
2012
- 2012-11-30 TW TW101145043A patent/TWI533149B/zh active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08329135A (ja) * | 1995-05-30 | 1996-12-13 | Minolta Co Ltd | デザイン設計支援システム |
JPH11212990A (ja) * | 1998-01-26 | 1999-08-06 | Toray Ind Inc | 画像の検索装置および画像の検索表示方法ならびに物品の製造方法 |
JP2005293129A (ja) * | 2004-03-31 | 2005-10-20 | Toto Ltd | 物品特定システム及び方法 |
JP2006155588A (ja) * | 2004-11-05 | 2006-06-15 | Fuji Xerox Co Ltd | 画像処理装置、画像処理方法及び画像処理プログラム |
JP2009288928A (ja) * | 2008-05-28 | 2009-12-10 | Fujifilm Corp | 服飾検索方法及び装置、服飾検索プログラム、並びに服飾登録装置 |
Non-Patent Citations (2)
Title |
---|
TAKATOSHI KAWADA ET AL.: "Cloth Region Detection Using Shape Information from Image for Similar Cloth Image Retrieval", PROCEEDINGS OF THE 2008 IEICE GENERAL CONFERENCE JOHO SYSTEM 2, 5 March 2008 (2008-03-05), pages 180 * |
TSUTOMU HORIKOSHI ET AL.: "3D Modeling Using Rough Sketches and 3D Shape Retrieval System", TRANSACTIONS OF INFORMATION PROCESSING SOCIETY OF JAPAN, vol. 35, no. 9, 15 September 1994 (1994-09-15), pages 1750 - 1758 * |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016085706A (ja) * | 2014-10-29 | 2016-05-19 | 株式会社フィール | 情報提供システム、および情報公開装置 |
JP2016103235A (ja) * | 2014-11-28 | 2016-06-02 | 日本電信電話株式会社 | 画像検索装置、画像登録装置、画像特徴選択装置、方法、及びプログラム |
JP2018142074A (ja) * | 2017-02-27 | 2018-09-13 | 三菱重工業株式会社 | 特徴量算出装置、画像類似度判定装置、画像検索装置、特徴量算出方法及びプログラム |
Also Published As
Publication number | Publication date |
---|---|
JP5788996B2 (ja) | 2015-10-07 |
TW201331773A (zh) | 2013-08-01 |
JPWO2013099038A1 (ja) | 2015-04-30 |
TWI533149B (zh) | 2016-05-11 |
US20140369610A1 (en) | 2014-12-18 |
US9600495B2 (en) | 2017-03-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5788996B2 (ja) | 画像検索システム、画像検索方法、画像検索装置、プログラム、及び情報記憶媒体 | |
US20080275850A1 (en) | Image tag designating apparatus, image search apparatus, methods of controlling operation of same, and programs for controlling computers of same | |
CN104281259A (zh) | 信息处理装置、信息处理方法和程序 | |
US20110093478A1 (en) | Filter hints for result sets | |
JP2008108200A (ja) | 情報抽出装置及び方法、並びにプログラム及び記憶媒体 | |
JP2006331418A (ja) | ソートしたコンテキスト内のリンク情報を表示するシステム及び方法 | |
US20110125731A1 (en) | Information processing apparatus, information processing method, program, and information processing system | |
JP2007047864A (ja) | データを編集する画面の表示を制御するシステム、およびその方法 | |
EP3373285A1 (en) | Display apparatus and information displaying method thereof | |
KR102035766B1 (ko) | 클리어런스 체크 프로그램, 클리어런스 체크 방법, 및 클리어런스 체크 장치 | |
JP2010102593A (ja) | 情報処理装置およびその方法、プログラム、記録媒体 | |
KR101483611B1 (ko) | 이미지에서 객체를 추출하기 위한 방법 및 단말기 | |
US10318610B2 (en) | Display method and electronic device | |
CN104156666A (zh) | 文件扩展名加入颜色属性来区分文件类型的方法及装置 | |
JP4830763B2 (ja) | 画像処理システムおよび画像処理プログラム | |
JP4693167B2 (ja) | 帳票検索装置、帳票検索方法、プログラム及びコンピュータ読み取り可能な記憶媒体 | |
JP6483580B2 (ja) | 画像処理装置,画像処理方法,画像処理プログラムおよびそのプログラムを格納した記録媒体 | |
JP2007164532A (ja) | タスク表示装置、タスク表示方法及びタスク表示プログラム | |
JP6031566B1 (ja) | 特徴抽出装置、画像検索装置、方法、及びプログラム | |
JP6409294B2 (ja) | 情報処理装置、システム、方法及びプログラム | |
CN104850589B (zh) | 一种检索结果的显示方法及装置 | |
JP2008305312A (ja) | 検索結果表示装置および方法、プログラム、並びに記録媒体 | |
CN113490051B (zh) | 一种视频抽帧方法、装置、电子设备及存储介质 | |
JP2018085093A (ja) | 情報処理装置、制御方法、プログラム | |
US8768060B2 (en) | Image processing apparatus, image processing method and computer-readable medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11878819 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2013551170 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14369205 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 11878819 Country of ref document: EP Kind code of ref document: A1 |