WO2011045920A1 - Color analysis device, color analysis method, and color analysis program - Google Patents
- Publication number: WO2011045920A1 (PCT/JP2010/006058)
- Authority: WIPO (PCT)
- Prior art keywords: color, natural language, description, language description, image
Classifications
- G—PHYSICS; G06—COMPUTING; G06F—ELECTRIC DIGITAL DATA PROCESSING; G06F16/00—Information retrieval; G06F16/50—Information retrieval of still image data
- G06F16/532—Query formulation, e.g. graphical querying
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/5838—Retrieval using metadata automatically derived from the content, using colour
- G06F16/5854—Retrieval using metadata automatically derived from the content, using shape and object relationship
- G06F16/5862—Retrieval using metadata automatically derived from the content, using texture
Definitions
- the present invention relates to a color analysis device, a color analysis method, and a color analysis program that analyze colors based on descriptions about colors in natural language sentences.
- video captured by surveillance cameras is also stored as digital data, and computer-based video analysis has made searching for people and objects more efficient.
- one approach performs the search by comparing the query sentence with metadata, described as natural language phrases, assigned to each image in advance.
- another approach searches by converting natural language expressions relating to the colors and shapes in the query sentence into image feature quantities.
- Non-Patent Document 2 discloses a method in which metadata is automatically added, using a tool, to images extracted from documents (blogs), so that the images can be searched in the same way as a full-text document search.
- Non-Patent Document 3 discloses a method that represents sensory words such as "square" and "clean", together with the lengths and angles of the parts of an object (such as a chair) in the image, as symbols or numerical values, and searches by associating them with shape feature quantities.
- as the latter type of search method, there is a method that converts natural language expressions relating to color into color feature quantities of an image.
- a color expression using words such as "blue" or "yellowish" is associated with a color feature quantity expressed as a distribution of values in a color space such as RGB or HSI, and methods for searching landscape photographs and the like are disclosed.
- Such an image search method using color feature amounts can also be applied when searching for an image to which no metadata is assigned in advance.
- it is also effective in cases where image search using shape features cannot be performed effectively (for example, when searching among products of the same shape, such as T-shirts and handkerchiefs, or when searching photographic images of indefinite shape, such as natural scenery).
- the present invention has been made to solve the above problem. Its object is to provide a color analysis device, a color analysis method, and a color analysis program that, when an image search is performed based on a description of colors expressed in natural language, can accurately search for a desired image even when a plurality of colors are described.
- a color analysis device according to the present invention replaces a natural language description (a description of a color expressed in natural language) with data indicating a distribution of values in a predetermined color space. It is characterized by color ratio determining means that, using phrases indicating the relationships between the colors included in the natural language description, determines the ratio of the area occupied by each of those colors to the entire area of the image targeted by the description.
- another aspect of the color analysis apparatus replaces a natural language description (a description of a color expressed in natural language) with data indicating a distribution of values in a predetermined color space, and includes color ratio determination means for determining the ratio of the area occupied by each of the colors specified by the natural language description, and by a predetermined color stored in advance, to the entire area of the image targeted by the description.
- a color analysis method according to the present invention generates data representing a distribution of values in a predetermined color space by using phrases indicating the relationships between the colors included in a natural language description (a description of colors expressed in natural language) to determine the ratio of the area occupied by each of those colors to the entire area of the image targeted by the description.
- another aspect of the color analysis method according to the present invention generates data representing a distribution of values in a predetermined color space by determining the ratio of the area occupied, within the entire area of the image targeted by a natural language description (a description of colors expressed in natural language), by each of the colors specified by the description and by a predetermined color stored in advance.
- a color analysis program according to the present invention causes a computer to execute a process of determining, using phrases indicating the relationships between the colors included in a natural language description (a description of colors expressed in natural language), the ratio of the area occupied by each of those colors to the entire area of the image targeted by the description, and a process of generating data indicating a distribution of values in a predetermined color space.
- another aspect of the color analysis program causes a computer to execute a process of determining the ratio of the area occupied, within the entire area of the target image, by each of the colors specified by a natural language description (a description of colors expressed in natural language) and by a predetermined color stored in advance, and a process of generating data indicating a distribution of values in a predetermined color space.
- according to the present invention, when an image search is performed based on a description of colors expressed in natural language, a desired image can be searched with high accuracy even when a plurality of colors are described.
- FIG. 1 is a block diagram showing an example of a functional configuration of a color analysis apparatus according to the present invention.
- the color analysis apparatus includes a dictionary 101, a natural language sentence input unit 102, a color description extraction unit 103, a color ratio determination unit 104, and an image feature quantity generation unit 105.
- the color analysis apparatus can generate image feature data corresponding to the description of the natural language related to the input color by using these means. That is, in the present invention, the color analysis apparatus converts a description related to the color of a natural language sentence into a feature amount of an image.
- the color analysis device is specifically realized by an information processing device such as a personal computer that operates according to a program.
- the color analysis apparatus can be applied, for example, to an application that performs image search over accumulated camera footage. Applied in this way, it becomes possible to search for images from characteristics of the subject heard from a person (hearsay information and the like).
- the color analysis apparatus according to the present invention can be applied in particular to searching for goods and equipment such as desired clothes and shoes, and, by comparing the color distribution generated from a natural language sentence with the color distribution of the object actually being searched for, to analyzing the difference between a person's remembered color and the actual color, analyzing the influence of lighting, adjusting lighting, and the like.
- the dictionary 101 is a dictionary including natural language words representing colors (hereinafter also referred to as color descriptions) and natural language words representing relationships between colors (hereinafter also referred to as related words). Specifically, the dictionary 101 is stored in a storage device such as a magnetic disk device or an optical disk device.
- the natural language sentence input means 102 is realized by a CPU of an information processing apparatus that operates according to a program and an input device such as a keyboard and a mouse.
- the natural language sentence input unit 102 has a function of inputting a query sentence described in a natural language for an image to be searched in accordance with a user operation. It is assumed that the natural language sentence input means 102 inputs a query sentence describing characteristics relating to the color of the image to be searched.
- the natural language sentence input unit 102 may input or receive a file including a query sentence describing an image to be searched, for example.
- the color description extraction means 103 is specifically realized by a CPU of an information processing apparatus that operates according to a program.
- the color description extraction unit 103 has a function of extracting a word (color description) representing a color from the query sentence input by the natural language sentence input unit 102 using the dictionary 101.
- the color description extraction unit 103 has a function of extracting a related word from the query sentence input by the natural language sentence input unit 102 using the dictionary 101.
- the color ratio determination unit 104 is realized by a CPU of an information processing apparatus that operates according to a program.
- the color ratio determination unit 104 has a function of determining a ratio between a plurality of colors included in the inquiry sentence based on the color description extracted by the color description extraction unit 103 and related words.
- the color ratio determination unit 104 extracts a value indicating a color space corresponding to the color description extracted by the color description extraction unit 103 and a division ratio corresponding to a related word from the dictionary 101. Then, the color ratio determination unit 104 calculates the color area and color ratio (ratio of the area indicated by the color included in the inquiry sentence) in the image to be searched based on the value indicating the extracted color space and the division ratio.
- the image feature quantity generation means 105 is specifically realized by a CPU of an information processing apparatus that operates according to a program.
- the image feature quantity generation means 105 has a function of generating, based on the color ratio determined by the color ratio determination means 104, an image feature quantity (feature data indicating a distribution of values in a predetermined color space) that indicates the color characteristics of the search target image. In the present embodiment, the image feature quantity generation unit 105 generates an image feature quantity that associates each color region obtained by the color ratio determination unit 104 with its color ratio.
- FIG. 2 is an explanatory diagram showing an example of input / output data input to the color analyzer and an image search using the color analyzer.
- the natural language sentence input unit 102 inputs an inquiry sentence such as “A shirt with a yellow line on a blue background ...” according to a user operation, for example.
- the color analysis apparatus also handles sentences, described in natural language, about a search target image that is described using a plurality of colors, such as "a yellow line on a blue background". It can generate image feature data that includes one or more local regions, each occupying a specific range in the color space, and the ratio (color ratio) of the area occupied by the color of each local region to the entire area of the search target image. Then, by performing image collation using the image feature quantity generated by the color analysis device, an image search for the search target can be performed.
- FIG. 3 is an explanatory diagram showing a specific example of the dictionary 101. The extraction of color descriptions from a natural language sentence by the color description extraction means 103, and the determination by the color ratio determination means 104 of the color-space region and area ratio of each color described in the sentence, can be executed by referring to a dictionary 101 prepared with the content structure shown in FIG. 3.
- by referring to the dictionary 101 illustrated in FIG. 3, the color description extraction unit 103 can extract "blue" and "yellow" from the query sentence. Further, the color description extraction unit 103 can extract the relation word "ni" representing a relationship between color names. However, the color description extraction unit 103 extracts a matching word as a relation word only from the character string positioned between two color names in the query sentence. In this example, the color description extraction unit 103 extracts color descriptions and related words in their order of appearance in the query sentence: [color name: blue], [relation: ni], [color name: yellow].
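The extraction step just described might be sketched as follows (a hypothetical illustration: the dictionary contents, the romanized relation words such as "ni", and the function name are assumptions for the example, not the patent's actual implementation):

```python
# Hypothetical sketch of the color-description extraction step.
COLOR_NAMES = {"blue", "yellow", "white", "red"}   # color-description dictionary
RELATION_WORDS = {"ni", "to"}                      # relation words ("ni" ~ "on", "to" ~ "and")

def extract_descriptions(tokens):
    """Extract color names, plus relation words that lie between two color names."""
    result = []
    for i, tok in enumerate(tokens):
        if tok in COLOR_NAMES:
            result.append(("color", tok))
        elif tok in RELATION_WORDS:
            # keep a relation word only if it sits between two color names
            before = any(t in COLOR_NAMES for t in tokens[:i])
            after = any(t in COLOR_NAMES for t in tokens[i + 1:])
            if before and after:
                result.append(("relation", tok))
    return result

# "a yellow line on a blue background" -> "blue ni yellow"
print(extract_descriptions(["blue", "ni", "yellow"]))
# [('color', 'blue'), ('relation', 'ni'), ('color', 'yellow')]
```

Note that the positional check mirrors the text's rule that a relation word counts only when it appears between two color names.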
- FIG. 4 is an explanatory diagram illustrating an example of processing by the color ratio determination unit 104.
- following the same process as described above, the color description extraction means 103 extracts [color name: blue], [relation: ni], and [color name: yellow].
- the color ratio determining unit 104 adds the color region width W to the value of [color name: blue] in the color space (the HSV space in the dictionary 101 shown in FIG. 3), obtaining the region (240, 1.0, 0.5, 0.15).
- the color ratio determination unit 104 can obtain the region (60, 1.0, 0.5, 0.08) for [color name: yellow] by the same processing.
- next, the color ratio determination unit 104 applies the area division ratio indicated by the relation word [relation: ni] connecting the two colors to the regions obtained for them in the color space, and determines the area ratio. Since the dictionary 101 shown in FIG. 3 specifies a ratio of 0.8:0.2 for [relation: ni], the color ratio determination means 104 can determine that the region of [color name: blue] occupies 0.8 (80%) of the target image and the region of [color name: yellow] occupies 0.2 (20%).
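The region construction and ratio assignment above can be sketched as follows (the HSV centers, widths, and the 0.8:0.2 division ratio follow the running example; the data layout is an assumption):

```python
# Illustrative sketch: attach a region width W to each color's HSV center and
# split the image area by the relation word's division ratio.
HSV_CENTERS = {"blue": (240, 1.0, 0.5), "yellow": (60, 1.0, 0.5)}
REGION_WIDTH = {"blue": 0.15, "yellow": 0.08}
DIVISION_RATIO = {"ni": (0.8, 0.2)}  # "A ni B": A covers 80%, B covers 20%

def determine_color_ratio(color_a, relation, color_b):
    region_a = HSV_CENTERS[color_a] + (REGION_WIDTH[color_a],)
    region_b = HSV_CENTERS[color_b] + (REGION_WIDTH[color_b],)
    ratio_a, ratio_b = DIVISION_RATIO[relation]
    return [(region_a, ratio_a), (region_b, ratio_b)]

# "blue ni yellow" ("yellow on a blue background")
for region, ratio in determine_color_ratio("blue", "ni", "yellow"):
    print(region, ratio)
# (240, 1.0, 0.5, 0.15) 0.8
# (60, 1.0, 0.5, 0.08) 0.2
```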
- as another example, consider an input natural language sentence containing the description "blue and red on a white background". In this case as well, by the same processing as in FIG. 4A, the color ratio determining means 104 can assign area ratios of 0.8, 0.1, and 0.1 as shown in FIG. 4B, determining the color ratio of the colors constituting the target image. In the example shown in FIG. 4B, however, the color ratio determination unit 104 determines the ratios by giving [relation: ni] priority over [relation: to] when combining the color names.
- the color ratio determination unit 104 may also determine the area ratio of the color regions using the default color value included in the dictionary 101 illustrated in FIG. 3. For example, the color ratio determination unit 104 may apply the default color to determine the color ratio only when the color-related description extracted from the natural language sentence contains no color name, or only one. In addition, even when the query sentence contains two or more color names, if the relation word connecting them is [relation: to], which represents equal division, the color ratio determination unit 104 may add the default color as one of the equally divided colors when determining the color ratio.
- in that case, the color ratio determining unit 104 determines that the color descriptions and relation word in the query sentence correspond to [color name: white], [relation: to], and [color name: red], and determines the color ratio according to expression (1).
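The combination of a background relation ("ni") with equal division ("to") discussed above can be sketched as follows (a simplified illustration; the 0.8:0.2 "ni" ratio follows the dictionary example, and the function name is an assumption):

```python
# Hedged sketch: "white ni (blue to red)" -> white gets the background share,
# and the foreground colors split the remainder equally via "to".
def split_with_background(background, foreground_colors, ni_ratio=(0.8, 0.2)):
    bg_share, fg_share = ni_ratio
    each = fg_share / len(foreground_colors)          # "to" divides equally
    ratios = {background: bg_share}
    ratios.update({c: each for c in foreground_colors})
    return ratios

# "blue and red on a white background"
print(split_with_background("white", ["blue", "red"]))
# {'white': 0.8, 'blue': 0.1, 'red': 0.1}
```

The same helper could accept the dictionary's default color as an extra foreground entry when the query names fewer colors than needed.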
- the values indicating the color space for each color included in the dictionary 101 are determined by, for example, a system administrator who manages the color analyzer. It is assumed that it has been input and registered in advance.
- the division ratio for each relation word can be obtained as a statistical value, for example by aggregating the history of past image searches. In the present embodiment, it is assumed that aggregating past search histories yielded statistics indicating that colors connected by the relation word "to" appear in roughly equal proportions (about 50% each), and that colors connected by "ni" appear in a ratio of about 80:20, and that the dictionary 101 was set in advance based on these statistics.
- in the present embodiment, the HSV space is used as the color space for each color, but the color space is not limited to the one illustrated; for example, an RGB space or an HLS space may be used.
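As an aside on those alternative color spaces: Python's standard colorsys module converts between HSV, RGB, and HLS, which is one way the same dictionary values could be re-expressed (the patent does not prescribe any conversion method; this is purely illustrative). colorsys expects each component in [0, 1], so a hue of 240 degrees becomes 240/360.

```python
import colorsys

# The running example's "blue" center (H=240, S=1.0, V=0.5) in other spaces.
rgb = colorsys.hsv_to_rgb(240 / 360, 1.0, 0.5)
print([round(c, 6) for c in rgb])          # [0.0, 0.0, 0.5] -> dark blue

h, l, s = colorsys.rgb_to_hls(*rgb)
print(round(h * 360), round(l, 3), s)      # 240 0.25 1.0
```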
- the image feature quantity generation unit 105 shown in FIG. 1 can output a set of one or more color regions and color ratios obtained by the color ratio determination unit 104 as an image feature quantity.
- as described above, the color analysis apparatus can generate, from an input natural language sentence (query sentence), an image feature quantity containing, for each of the one or more colors constituting the image, its region in the color space and the ratio of the area it occupies. Therefore, as in the example shown in FIG. 2, an image composed of two or more colors can be searched based on a natural language sentence.
- FIG. 5 is a block diagram illustrating an example of a hardware configuration of the color analysis apparatus.
- the color analysis device 12 can be realized with a hardware configuration similar to that of a general computer device, and includes a CPU (Central Processing Unit) 121, a main storage unit 122, an output unit 123, an input unit 124, a communication unit 125, and an auxiliary storage unit 126.
- the main storage unit 122 is a main memory such as a RAM (Random Access Memory), and is used as a data work area or a temporary data save area.
- the output unit 123 is a display device such as a liquid crystal display device or a printing device such as a printer, and has a function of outputting data.
- the input unit 124 is an input device such as a keyboard or a mouse, and has a function of inputting data. Further, when data is input by reading a file, the input unit 124 may be an external recording medium reading device or the like.
- the communication unit 125 is connected to a peripheral device and has a function of transmitting and receiving data.
- the auxiliary storage unit 126 is a ROM (Read Only Memory), a hard disk device, or the like.
- the above-described components 121 to 126 are connected to each other via a system bus 127.
- the auxiliary storage unit 126 of the color analysis device 12 stores various programs for analyzing the colors of the search target image based on a query sentence input as a natural language sentence. For example, the auxiliary storage unit 126 stores a color analysis program that causes the computer to execute a process of determining, using phrases indicating the relationships between the colors included in a natural language description (a description of colors expressed in natural language), the ratio of the area occupied by each of those colors to the entire area of the image targeted by the description, and a process of generating data indicating a distribution of values in a predetermined color space.
- the color analysis device 12 may be implemented in hardware, with a circuit component such as an LSI (Large Scale Integration) incorporating a program that realizes the functions shown in FIG. 1. Alternatively, the functions of the components shown in FIG. 1 may be realized in software by having the CPU 121 of the computer execute a program: the CPU 121 loads the program stored in the auxiliary storage unit 126 into the main storage unit 122 and executes it, thereby controlling the operation of the color analysis device 12 and realizing each function described above in software.
- FIG. 6 is a flowchart illustrating an example of processing in which the color analysis apparatus inputs a natural language sentence and analyzes colors included in an image to be searched.
- when the user wants to perform an image search, he or she operates the color analysis device to input a query sentence (natural language sentence) describing, in natural language, the features (especially the colors) of the image to be searched.
- the natural language sentence input means 102 of the color analysis apparatus inputs a natural language sentence according to the user's operation (step S10).
- the color description extraction unit 103 of the color analysis apparatus extracts a color description and related words for the search target image from the natural language sentence input by the natural language sentence input unit 102 (step S11).
- the color ratio determining unit 104 of the color analyzing apparatus determines a ratio between a plurality of colors included in the inquiry sentence based on the color description and related words extracted by the color description extracting unit 103 (step S12).
- the color ratio determination unit 104 extracts a value indicating a color space corresponding to the extracted color description and a division ratio corresponding to a related word from the dictionary 101.
- then, the color ratio determination unit 104 obtains the color regions and color ratios in the search target image based on the extracted color-space values and the division ratio.
- the image feature quantity generation unit 105 of the color analysis apparatus generates an image feature quantity indicating characteristics relating to the color of the search target image based on the color ratio determined by the color ratio determination unit 104 (step S13).
- the image feature amount generation unit 105 generates an image feature amount that includes the color area and the color ratio obtained by the color ratio determination unit 104 in association with each other.
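Steps S10 through S13 can be composed end to end as in the following sketch (illustrative only; the dictionary contents and value choices follow the running example, and the structure is an assumption, not the patent's implementation):

```python
# End-to-end sketch of steps S11-S13 for the query "blue ni yellow".
DICTIONARY = {
    "colors": {"blue": ((240, 1.0, 0.5), 0.15), "yellow": ((60, 1.0, 0.5), 0.08)},
    "relations": {"ni": (0.8, 0.2)},
}

def analyze(tokens):
    # S11: extract color descriptions and the relation word between them
    colors = [t for t in tokens if t in DICTIONARY["colors"]]
    relation = next(t for t in tokens if t in DICTIONARY["relations"])
    # S12: determine the color ratio from the relation word's division ratio
    ratios = DICTIONARY["relations"][relation]
    # S13: pair each color region with its ratio to form the image feature
    feature = []
    for name, ratio in zip(colors, ratios):
        center, width = DICTIONARY["colors"][name]
        feature.append((center + (width,), ratio))
    return feature

print(analyze(["blue", "ni", "yellow"]))
# [((240, 1.0, 0.5, 0.15), 0.8), ((60, 1.0, 0.5, 0.08), 0.2)]
```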
- the image feature amount generated by the color analysis device is input to an image search device (not shown) that performs image search.
- the image search device performs an image search based on the image feature amount generated by the color analysis device, and searches for and extracts an image that matches the search target image.
- the image search device and the color analysis device may be realized using the same information processing device or may be realized using different information processing devices.
- as described above, the color analysis apparatus generates data representing an image feature quantity from a natural language sentence that is input from a keyboard, read from a file, or received over a network via the communication unit. An image search for images composed of one or more colors can then be realized using an image search apparatus (not shown) connected via a network, or an image search program executed on the same hardware as the color analysis apparatus 12.
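One way such a downstream search device might score candidate images against the generated feature is sketched below (a deliberately simplified matcher of our own devising, not the patent's collation method: it compares each requested area share with the share of image pixels whose hue falls inside the region, ignoring hue wraparound for brevity):

```python
# Hedged sketch of matching an image against the generated feature quantity.
def pixel_share(pixels, center, width):
    """Fraction of pixels whose hue lies within width*360 degrees of the center hue."""
    h0 = center[0]
    inside = sum(1 for (h, s, v) in pixels if abs(h - h0) <= width * 360)
    return inside / len(pixels)

def score(feature, pixels):
    # lower is better: total deviation from the requested area ratios
    return sum(abs(ratio - pixel_share(pixels, region[:3], region[3]))
               for region, ratio in feature)

feature = [((240, 1.0, 0.5, 0.15), 0.8), ((60, 1.0, 0.5, 0.08), 0.2)]
image = [(240, 1.0, 0.5)] * 80 + [(60, 1.0, 0.5)] * 20   # 80% blue, 20% yellow
print(score(feature, image))   # 0.0 for a perfect match
```

A real collation would use full three-dimensional color-space regions and a proper histogram distance, but the deviation-from-ratio idea is the same.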
- as described above, the color analysis apparatus extracts the plurality of color descriptions and the related words contained in the input natural language sentence, obtains a color region and color ratio for each color based on them, and generates an image feature quantity that associates each region with its ratio. Therefore, when an image search is performed based on a description of colors expressed in natural language, even a description of a plurality of colors can be converted into color feature quantities of an image, and a desired image can be searched with high accuracy. In other words, even if the image to be searched is composed of a plurality of colors, it can be found accurately from a plain natural language expression.
- FIG. 7 is a block diagram illustrating a functional configuration example of the color analysis apparatus according to the second embodiment.
- the color analysis apparatus according to the second embodiment differs from the first embodiment in that it further includes target configuration knowledge 107, a target description extraction unit 108, a color value adjustment unit 109, and a color ratio adjustment unit 110.
- the information included in the dictionary 106 is different from the information included in the dictionary 101 shown in the first embodiment.
- the functions of the natural language sentence input unit 102, the color description extraction unit 103, the color ratio determination unit 104, and the image feature quantity generation unit 105 are the same as those shown in the first embodiment, so their description is omitted.
- FIG. 8 is an explanatory diagram showing a specific example of the dictionary 106 according to the second embodiment.
- the dictionary 106 differs from the first embodiment in that, compared to the dictionary 101 shown in FIG. 3, it further includes an object name dictionary, a prefix modifier dictionary, and a postfix modifier dictionary.
- the target name dictionary includes target names other than colors (for example, objects such as clothes) among words that can be a feature of an image to be searched.
- for example, the object name dictionary includes an entry ("clothing: jersey, clothing, blue") indicating that the object denoted by the word "jersey" is an instance of the class "clothing". Here "blue" is the default color for [clothing name: jersey]; for the region identified as a jersey in the image, it can be used in the same way as the default color in the dictionary 101 shown in the first embodiment.
- in the present embodiment, clothes such as jerseys and jackets are described as examples of objects that can be included in the search target image, but the invention is not limited to those illustrated; analysis may also be performed on belongings such as bags and cameras.
- the prefix modifier dictionary includes prefix modifiers such as “bright” and “light” among the words that can be the characteristics of the image to be searched.
- the postfix modifier dictionary includes postfix modifiers such as "-ish" and "-type" among the words that can be a feature of the image to be searched.
- the object configuration knowledge 107 includes information indicating a class that is a category to which an object that can be included in an image to be searched belongs, and information indicating a relationship between the classes.
- FIG. 9 is an explanatory diagram illustrating a specific example of information indicating classes included in the target configuration knowledge 107 and information indicating relationships between the classes in the second embodiment.
- the target configuration knowledge 107 is specifically stored in a storage device such as a magnetic disk device or an optical disk device.
- in FIG. 9, "clothing", "upper jacket", "upper body", and so on represent classes, and the labels "classification", "overwrite", and "priority" on the arrows connecting the classes represent specific relationships between them. For the "overwrite" and "priority" relationships, FIG. 9 also describes what proportion of the region shared on the image is occupied by the class at the base of the arrow relative to the class at its tip.
- in the target configuration knowledge 107 illustrated in FIG. 9, for example, “upper jacket” and “upper”, which share the same region because both are classified under “upper body”, are set so that “upper jacket” takes priority over “upper”. As a result, if objects corresponding to both “upper jacket” and “upper” are described in the same natural language sentence, it can be determined from the value shown in FIG. 9 that the color of the object corresponding to “upper jacket” occupies 0.7 (70%) of the shared region, and that the color of the object corresponding to “upper” occupies the remaining 0.3 (30%). If, on the other hand, the relationship is set to “overwrite”, it is determined that the color of the object corresponding to the class at the base of the arrow occupies 1.0 (100%) of the shared region, i.e., the entire region.
- the target name dictionary, the prefix modifier dictionary, and the postfix modifier dictionary included in the dictionary 106 are assumed to be input and registered in advance by, for example, a system administrator who manages the color analysis apparatus. The target configuration knowledge 107 is likewise assumed to be input and registered in advance by, for example, the system administrator.
- the target description extraction means 108 is specifically realized by a CPU of an information processing apparatus that operates according to a program.
- the target description extraction unit 108 has a function of extracting, by referring to the dictionary 106, target names described in the target name dictionary from the natural language sentence input by the natural language sentence input unit 102.
- the target description extraction unit 108 also has a function of extracting, from the target name dictionary, the class to which each extracted target name belongs, and a function of determining the correspondence between the extracted target names and the color descriptions extracted by the color description extraction unit 103. That is, the target description extraction unit 108 extracts target names from the natural language sentence using the dictionary 106 and determines the type of each target.
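A minimal sketch of this dictionary lookup, using the example entries from this section (“jersey” in class “clothing”; “jacket” in class “upper jacket”; “sweater” in class “upper”). The matching logic and function name are illustrative assumptions, not the patent's implementation:

```python
# Illustrative target name dictionary, following the examples of FIG. 8:
# each target name maps to the class it belongs to.
TARGET_NAME_DICT = {
    "jersey": "clothing",       # from the example entry "jersey, clothing, blue"
    "jacket": "upper jacket",
    "sweater": "upper",
}

def extract_targets(sentence):
    """Return (target name, class) pairs for every dictionary target found in the sentence."""
    found = []
    for name, cls in TARGET_NAME_DICT.items():
        if name in sentence.lower():
            found.append((name, cls))
    return found

print(extract_targets("A man wearing a blue jersey"))
```

A correspondence between each extracted target name and the nearby color description would then be determined in a later step, as the section describes.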
- in the example described above, the target description extraction unit 108 is arranged so as to execute its processing after the color description extraction unit 103.
- however, the order of the color description extraction unit 103 and the target description extraction unit 108 is not limited to the example shown in FIG. 7.
- the target description extraction unit 108 may be arranged before the color description extraction unit 103. In this case, the target description extraction unit 108 first extracts the target names from the natural language sentence input by the natural language sentence input unit 102, and the color description extraction unit 103 then extracts the color descriptions from the natural language sentence. The processing by the color description extraction unit 103 and the target description extraction unit 108 may also be executed in parallel.
- the color value adjusting means 109 is specifically realized by a CPU of an information processing apparatus that operates according to a program.
- the color value adjusting unit 109 has a function of referring to the dictionary 106 and extracting color names and color modifiers from the natural language sentence input by the natural language sentence input unit 102. Specifically, the color value adjusting unit 109 extracts the prefix modifiers included in the prefix modifier dictionary and the postfix modifiers included in the postfix modifier dictionary. The color value adjusting unit 109 also has a function of correcting the region in the color space of each color based on the extracted prefix and postfix modifiers. That is, the color value adjusting unit 109 uses the color modifiers included in the dictionary 106 to adjust the position or size of the region that a color expressed in natural language in the sentence occupies in the predetermined color space.
- FIG. 10 is an explanatory diagram illustrating an example of processing in which the color value adjusting unit 109 according to the second embodiment corrects an area in the color space of each color included in the search target image.
- when the input natural language sentence contains the description “reddish”, the color value adjusting unit 109 extracts from it the contiguous pair “color name: red” and “postfix modifier: -ish”.
- the color value adjusting unit 109 then corrects (in this example, adds) the correction value associated with “-ish” to the value (0, 1.0, 0.5, 0.1) indicating the region of “red” in the color space, obtaining (0, 0.9, 0.5, 0.3) as the region after correction.
- the color value adjusting unit 109 sets this corrected value as the value of the region indicating “reddish”.
- the color ratio determining unit 104 uses the correction value obtained by the color value adjusting unit 109 to obtain the color area and color ratio in the search target image.
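The correction can be traced with the concrete numbers of FIG. 10. The correction value (0, -0.1, 0, +0.2) for “-ish” is inferred here from the before/after region values the section shows, and the sketch is illustrative rather than the patent's implementation:

```python
# Region of "red" in the color space, given as a (H, S, V, W) tuple in the text.
RED_REGION = (0, 1.0, 0.5, 0.1)
# Correction value for the postfix modifier "-ish" (inferred from the
# before/after values (0, 1.0, 0.5, 0.1) -> (0, 0.9, 0.5, 0.3) in FIG. 10).
ISH_CORRECTION = (0, -0.1, 0, 0.2)

def adjust_region(region, correction):
    """Apply a modifier's correction value to a color region by element-wise addition."""
    return tuple(round(r + c, 6) for r, c in zip(region, correction))

reddish = adjust_region(RED_REGION, ISH_CORRECTION)
print(reddish)  # (0, 0.9, 0.5, 0.3)
```

The color ratio determining unit would then work with `reddish` in place of the uncorrected region for “red”.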
- the color ratio adjusting unit 110 is specifically realized by a CPU of an information processing apparatus that operates according to a program.
- the color ratio adjusting unit 110 has a function of adjusting, by referring to the dictionary 106 and the target configuration knowledge 107, the area ratio of each color region obtained by the color ratio determining unit 104 within a single target.
- specifically, when a target is composed of partial objects, the color ratio adjusting unit 110 changes the area ratio of the colors occupying the surface of each partial object according to the composition ratio between the partial objects. In other words, the color ratio adjusting unit 110 determines the area ratio of each color occupying the specific region on the image corresponding to each target, using the relationships between target types included in the target configuration knowledge 107.
- FIG. 11 is an explanatory diagram illustrating an example of processing in which the color ratio adjusting unit 110 according to the second embodiment adjusts the area ratio of the color region.
- the dictionary 106 describes at least “jacket” belonging to the “upper jacket” class and “sweater” belonging to the “upper” class as clothes names.
- the target configuration knowledge 107 is indicated simply as “knowledge” in FIG. 11.
- the color ratio adjusting unit 110 first determines, based on the contents of the dictionary 106, that beige occupies 100% of the jacket (color ratio 1.0) and that red occupies 100% of the sweater (color ratio 1.0) (step 1).
- next, the color ratio adjusting unit 110 determines that the jacket is a kind of upper jacket (an instance of the “upper jacket” class) and that the sweater is a kind of upper garment (an instance of the “upper” class) (step 2).
- the color ratio adjusting unit 110 can then determine, based on the contents of the target configuration knowledge 107, that both the “upper jacket” class and the “upper” class are classified under the “upper body” class. It can further determine that the “upper jacket” class is “prioritized” where the two overlap (i.e., where the jacket and the sweater overlap), and that the “prioritized” color occupies the area ratio 0.7 (70%) of the whole instance of the “upper body” class. Accordingly, the color ratio adjusting unit 110 can determine that the jacket color “beige” has the color ratio 1.0 × 0.7 in the whole “upper body”, and that the sweater color “red” has the color ratio 1.0 × (1.0 − 0.7) (step 3). As a result of this determination, the color ratio adjusting unit 110 can determine that the color composition ratio is beige 0.7 and red 0.3 (step 4).
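The four steps above can be traced numerically. The sketch below assumes only what this section states (the 0.7 priority share from FIG. 9 and the 1.0 per-garment color ratios); the function name is illustrative:

```python
# Share of the shared "upper body" region given to the prioritized class
# ("upper jacket"), per the "priority" relation in FIG. 9.
PRIORITY_SHARE = 0.7

def compose_upper_body(jacket_colors, sweater_colors):
    """Scale each garment's own color ratios by its share of the shared region.

    jacket_colors / sweater_colors map color name -> ratio within that garment.
    The jacket ("upper jacket" class) is prioritized over the sweater ("upper" class).
    """
    composed = {}
    for color, ratio in jacket_colors.items():
        composed[color] = round(composed.get(color, 0.0) + ratio * PRIORITY_SHARE, 6)
    for color, ratio in sweater_colors.items():
        composed[color] = round(composed.get(color, 0.0) + ratio * (1.0 - PRIORITY_SHARE), 6)
    return composed

# Steps 1-4: beige occupies the whole jacket, red the whole sweater,
# and the priority relation splits the shared region 0.7 / 0.3.
result = compose_upper_body({"beige": 1.0}, {"red": 1.0})
print(result)  # {'beige': 0.7, 'red': 0.3}
```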
- the image feature amount generation unit 105 generates an image feature amount based on each color region after the area ratio adjustment by the color ratio adjustment unit 110.
- the color analysis apparatus shown in the present embodiment can be realized by the same hardware configuration as the color analysis apparatus of the first embodiment shown in FIG. 2.
- as described above, in the present embodiment, the color analysis apparatus extracts target names and classes from the input natural language sentence and corrects the region in the color space of each color included in the search target image. The color analysis apparatus then adjusts the area ratio of each color region included in the search target image and generates an image feature amount based on the adjusted color regions. Therefore, according to the present embodiment, an image feature amount in the color space of an image can be generated from a natural language sentence even when a plurality of objects with different colors overlap each other, so that even such images can be searched using a natural language sentence.
- FIG. 12 is a block diagram illustrating a functional configuration example of the color analysis apparatus according to the third embodiment.
- the color analysis apparatus of the third embodiment differs from the first embodiment in that it includes a color region adjusting unit 111 in addition to the components shown in FIG. 1.
- the information included in the dictionary 101 is the same as that in the first embodiment, and a description thereof will be omitted.
- the functions of the natural language sentence input unit 102, the color description extraction unit 103, the color ratio determination unit 104, and the image feature amount generation unit 105 are the same as those shown in the first embodiment, so their description is omitted.
- the color analysis apparatus newly includes a color area adjustment unit 111 in addition to the components shown in the first embodiment.
- the color area adjustment unit 111 is realized by a CPU of an information processing apparatus that operates according to a program.
- when a plurality of color names are extracted from the natural language sentence, the color region adjusting unit 111 has a function of adjusting the size in the color space of each local region according to the distance in the color space between the local regions indicated by the extracted color names.
- that is, the color region adjusting unit 111 adjusts the size of the region occupied by each color included in the natural language sentence, with respect to the entire region of the image targeted by the sentence, according to the distance in the predetermined color space between the regions occupied by those colors.
- FIG. 13 is an explanatory diagram illustrating an example of processing in which the color area adjusting unit 111 according to the third embodiment adjusts the size of each local area on the color space.
- as shown in FIG. 13A, when local region 1 and local region 2 are positioned so that the distance in the color space between their boundary colors has the specific value, no processing is applied to local region 1 or local region 2 to change the distance from the center color to the boundary color in the color space.
- when local region 1 and local region 2 are located farther apart than the specific distance, the color region adjusting unit 111 enlarges the distance from the center color to the boundary color in both local regions until the minimum distance between the boundary colors of the two regions becomes the specific value.
- when local region 1 and local region 2 are located closer than the specific distance, the color region adjusting unit 111 reduces the distance from the center color to the boundary color in both local regions until the minimum distance between the regions becomes the specific value, or until the overlap between the regions disappears.
- a limit value may be set on the ratio by which the distance from the center color to the boundary color in both local regions is enlarged or reduced. Alternatively, instead of uniformly enlarging or reducing around the center color, the distance between the center colors may be measured and, as shown in FIG. 13, overlap may be avoided by reducing each of local region 1 and local region 2 on the side facing the other region, while maintaining or increasing the area each region occupies in the color space by enlarging it on the opposite side.
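The shrink case and the no-change case above can be sketched on a single axis of the color space, with each local region modeled as a (center color, radius) pair. This is a hedged illustration, not the patent's algorithm, and the value of `MIN_GAP` (the “specific value”) is an assumption:

```python
# Minimum boundary-to-boundary gap between two local regions; stands in for the
# "specific value" in the text (magnitude assumed for illustration).
MIN_GAP = 0.1

def adjust_radii(region1, region2):
    """Shrink both radii by a common factor until the gap between the regions
    reaches MIN_GAP; regions already far enough apart are left unchanged."""
    (c1, rad1), (c2, rad2) = region1, region2
    gap = abs(c2 - c1) - (rad1 + rad2)  # boundary-to-boundary distance
    if gap >= MIN_GAP:
        return region1, region2  # no overlap risk: leave both regions as-is
    # Common scale factor that opens a MIN_GAP-wide gap between the boundaries.
    factor = max((abs(c2 - c1) - MIN_GAP) / (rad1 + rad2), 0.0)
    return (c1, rad1 * factor), (c2, rad2 * factor)

# Overlapping regions are shrunk until a MIN_GAP-wide gap opens between them.
a, b = adjust_radii((0.2, 0.15), (0.5, 0.2))
```

The asymmetric variant in the text (shrinking only the facing sides while enlarging the opposite sides) would additionally move each region's center away from the other region rather than scaling around a fixed center.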
- the image feature amount generation unit 105 generates an image feature amount based on each color area after size adjustment by the color area adjustment unit 111.
- the color analysis apparatus shown in the present embodiment can be realized by the same hardware configuration as the color analysis apparatus of the first embodiment shown in FIG. 2.
- as described above, in the present embodiment, when a plurality of color names are extracted from a natural language sentence, the color analysis apparatus adjusts the size of each local region in the color space and generates an image feature amount based on the adjusted color regions. By adjusting the size of each local region according to the relative distances, in the color space, of the regions corresponding to the colors specified in natural language, the range of images matching the generated image feature amount can be kept wide while the distinction between the specified colors is maintained. It is therefore possible to search for images, taking a natural language sentence as input, with high coverage while faithfully reflecting the requirements entered in the natural language sentence.
- the present invention has been described above with reference to the first to third embodiments as preferred embodiments.
- the color analysis apparatus according to the present invention is not limited to the above-described embodiments. That is, the configuration and function of the color analysis apparatus according to the present invention can be variously changed by those skilled in the art within the scope of the present invention.
- FIG. 14 is a block diagram illustrating a minimum functional configuration example of the color analysis apparatus.
- the color analysis apparatus includes a color ratio determination unit 104 as a minimum component.
- the color analysis apparatus with the minimum configuration shown in FIG. 14 performs processing for replacing a natural language description, which is a description of a color expressed in a natural language, with data indicating a distribution of values in a predetermined color space.
- the color ratio determining unit 104 has a function of determining, using phrases indicating the relationships between colors included in the natural language description, the proportion of the area occupied by the colors included in the natural language description with respect to the entire region of the image targeted by the description.
- as described above, the color analysis apparatus according to the present invention is a color analysis apparatus that replaces a natural language description (for example, a natural language sentence), which is a description of colors expressed in natural language, with data (for example, an image feature amount) indicating a distribution of values in a predetermined color space, and is characterized by comprising color ratio determination means (for example, realized by the color ratio determining unit 104) for determining, using a phrase (for example, a related word) indicating the relationship between colors included in the natural language description, the ratio (for example, the color ratio) that the colors included in the natural language description occupy with respect to the entire region of the image targeted by the description.
- the color analysis apparatus may comprise data generation means (for example, realized by the image feature amount generation unit 105) for generating data indicating the distribution of values in the predetermined color space based on the determination result of the color ratio determination means.
- the data generation means may generate, as the data indicating the distribution of values in the color space, data including the region (for example, a color region) occupied in the predetermined color space by the colors expressed in natural language in the natural language description, and the area ratio (for example, a color ratio) occupied by the colors included in the natural language description with respect to the entire region of the image targeted by the description.
- the color analysis apparatus may comprise dictionary storage means (for example, a storage device that stores the dictionary 101) for storing a dictionary (for example, the dictionary 101) that contains at least color names for identifying descriptions of colors in natural language sentences and information on phrases indicating the relationships between colors.
- the dictionary storage means may be configured to store a dictionary including at least parallel particles (for example, particles such as “to” and “ni”) as the phrases indicating the relationships between colors.
- the dictionary storage means may store a dictionary that further includes information on color modifiers (for example, prefix modifiers and postfix modifiers), and the color analysis apparatus may further comprise color value adjustment means (for example, realized by the color value adjusting unit 109) for adjusting, using the color modifiers included in the dictionary, the position or size of the region occupied in the predetermined color space by the colors expressed in natural language in the natural language description.
- the color analysis apparatus, which replaces a natural language description of colors expressed in natural language with data indicating a distribution of values in a predetermined color space, may comprise color ratio determination means (for example, realized by the color ratio determining unit 104) for determining, for the color identified from the natural language description and a predetermined color stored in advance (for example, the default color shown in FIG. 3), the proportion of the area that each of these colors occupies with respect to the entire region of the image targeted by the description.
- the dictionary storage means may store a dictionary that further includes the names of targets to be identified as parts of an image and information on the types of those targets (for example, the information included in the target name dictionary shown in FIG. 8), and the color analysis apparatus may further comprise: knowledge storage means (for example, a storage device that stores the target configuration knowledge 107) for storing target configuration knowledge (for example, the target configuration knowledge 107) including at least information on the relationships between different target types; target description extraction means (for example, realized by the target description extraction unit 108) for extracting target names from the natural language description using the dictionary stored in the dictionary storage means and determining the types of the targets; and color ratio adjustment means (for example, realized by the color ratio adjusting unit 110) for determining, using the relationships between the target types included in the target configuration knowledge, the area ratio of each color occupying the specific region on the image corresponding to each target.
- the color analysis apparatus may further comprise color region adjustment means (for example, realized by the color region adjusting unit 111) for adjusting the size of the region occupied by each color included in the natural language description, with respect to the entire region of the image targeted by the description, according to the distance in the predetermined color space between the regions occupied by those colors.
- the present invention can be applied to applications that search accumulated video captured by cameras. In particular, the present invention can be applied to uses such as searching for goods and equipment such as desired clothes and shoes, comparing the color distribution generated from a natural language sentence with the color distribution of the object originally searched for to analyze the difference between a person's memory colors and the actual colors, and analyzing the influence of lighting or adjusting the lighting.
Abstract
Description
Hereinafter, a first embodiment of the present invention will be described with reference to the drawings. FIG. 1 is a block diagram showing an example of the functional configuration of a color analysis apparatus according to the present invention. As shown in FIG. 1, in this embodiment, the color analysis apparatus includes a dictionary 101, a natural language sentence input unit 102, a color description extraction unit 103, a color ratio determination unit 104, and an image feature amount generation unit 105. Using these units, the color analysis apparatus can generate image feature amount data corresponding to the input natural language description of colors. That is, in the present invention, the color analysis apparatus converts the color-related description of a natural language sentence into an image feature amount. Specifically, the color analysis apparatus is realized by an information processing apparatus, such as a personal computer, that operates according to a program.
[HSVW: 0, 1.0, 0.5, 0.1] × [0.5] ... Equation (1)
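Reading Equation (1) as weighting the HSVW region of “red” by its 0.5 share of the image area, the first embodiment's flow can be sketched as follows. The structure, the function name, and the region value used for “blue” are assumptions for illustration, not the patent's code:

```python
# Color-name dictionary mapping each color name to its (H, S, V, W) region.
COLOR_DICT = {
    "red": (0, 1.0, 0.5, 0.1),    # region of "red" as given in Equation (1)
    "blue": (240, 1.0, 0.5, 0.1), # assumed entry, for illustration only
}

def analyze(sentence):
    """Return the image feature: one (name, region, ratio) triple per color found,
    splitting the image area equally among the mentioned colors."""
    colors = [name for name in COLOR_DICT if name in sentence.lower()]
    ratio = 1.0 / len(colors)  # equal split: the 0.5 factor when two colors appear
    return [(name, COLOR_DICT[name], ratio) for name in colors]

feature = analyze("A red and blue shirt")
print(feature)
```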
Next, a second embodiment of the present invention will be described with reference to the drawings. FIG. 7 is a block diagram showing an example of the functional configuration of the color analysis apparatus in the second embodiment. As shown in FIG. 7, the color analysis apparatus of this embodiment differs from the first embodiment in that, in addition to the components shown in FIG. 1, it includes target configuration knowledge 107, a target description extraction unit 108, a color value adjusting unit 109, and a color ratio adjusting unit 110. In this embodiment, the information contained in the dictionary 106 also differs from the information contained in the dictionary 101 of the first embodiment.
Next, a third embodiment of the present invention will be described with reference to the drawings. FIG. 12 is a block diagram showing an example of the functional configuration of the color analysis apparatus in the third embodiment. As shown in FIG. 12, the color analysis apparatus of this embodiment differs from the first embodiment in that it includes a color region adjusting unit 111 in addition to the components shown in FIG. 1.
102 Natural language sentence input means
103 Color description extraction means
104 Color ratio determination means
105 Image feature amount generation means
106 Dictionary
107 Target configuration knowledge
108 Target description extraction means
109 Color value adjustment means
110 Color ratio adjustment means
111 Color region adjustment means
12 Color analysis apparatus
121 CPU
122 Main memory unit
123 Output unit
124 Input unit
125 Communication unit
126 Auxiliary storage unit
127 System bus
Claims (21)
- A color analysis device that replaces a natural language description, which is a description of colors expressed in natural language, with data indicating a distribution of values in a predetermined color space, the device comprising: color ratio determination means for determining, using a phrase indicating a relationship between the colors included in the natural language description, the proportion of the area occupied by the colors included in the natural language description with respect to the entire region of the image targeted by the natural language description.
- The color analysis device according to claim 1, further comprising data generation means for generating data indicating a distribution of values in the predetermined color space based on the determination result of the color ratio determination means, wherein the data generation means generates, as the data indicating the distribution of values in the color space, data including the region occupied in the predetermined color space by the colors expressed in natural language in the natural language description, and the area ratio occupied by those colors with respect to the entire region of the image targeted by the natural language description.
- The color analysis device according to claim 1 or 2, further comprising dictionary storage means for storing a dictionary including at least color names for identifying descriptions of colors in natural language sentences and information on phrases indicating relationships between colors.
- The color analysis device according to claim 3, wherein the dictionary storage means stores a dictionary including at least parallel particles as the phrases indicating relationships between colors.
- The color analysis device according to claim 3 or 4, wherein the dictionary storage means stores a dictionary further including information on color modifiers, the device further comprising color value adjustment means for adjusting, using the color modifiers included in the dictionary, the position or size of the region occupied in the predetermined color space by the colors expressed in natural language in the natural language description.
- A color analysis device that replaces a natural language description, which is a description of colors expressed in natural language, with data indicating a distribution of values in a predetermined color space, the device comprising: color ratio determination means for determining, for the color identified from the natural language description and a predetermined color stored in advance, the proportion of the area occupied by each of these colors with respect to the entire region of the image targeted by the natural language description.
- The color analysis device according to any one of claims 3 to 5, wherein the dictionary storage means stores a dictionary further including the names of targets to be identified as parts of an image and information on the types of those targets, the device further comprising: knowledge storage means for storing target configuration knowledge including at least information on the relationships between different target types; target description extraction means for extracting the names of the targets from the natural language description using the dictionary stored in the dictionary storage means and determining the types of the targets; and color ratio adjustment means for determining, using the relationships between the target types included in the target configuration knowledge, the area ratio of each color occupying the specific region on the image corresponding to each target.
- The color analysis device according to any one of claims 1 to 7, further comprising color region adjustment means for adjusting the size of the region occupied by each color included in the natural language description, with respect to the entire region of the image targeted by the natural language description, according to the distance in the predetermined color space between the regions occupied by those colors.
- A color analysis method comprising generating data indicating a distribution of values in a predetermined color space by determining, using phrases indicating the relationships between colors included in a natural language description, which is a description of colors expressed in natural language, the proportion of the area occupied by the colors included in the natural language description with respect to the entire region of the image targeted by the natural language description.
- The color analysis method according to claim 9, wherein data including the region occupied in the predetermined color space by the colors expressed in natural language in the natural language description, and the area ratio occupied by the colors included in the natural language description with respect to the entire region of the image targeted by the natural language description, is generated as the data indicating the distribution of values in the color space.
- The color analysis method according to claim 9 or 10, comprising storing a dictionary including at least color names for identifying descriptions of colors in natural language sentences and information on phrases indicating relationships between colors.
- The color analysis method according to claim 11, comprising storing a dictionary including at least parallel particles as the phrases indicating relationships between colors.
- The color analysis method according to claim 11 or 12, comprising storing a dictionary further including information on color modifiers, and adjusting, using the color modifiers included in the dictionary, the position or size of the region occupied in the predetermined color space by the colors expressed in natural language in the natural language description.
- A color analysis method comprising generating data indicating a distribution of values in a predetermined color space by determining, for a color identified from a natural language description, which is a description of colors expressed in natural language, and a predetermined color stored in advance, the proportion of the area occupied by each of these colors with respect to the entire region of the image targeted by the natural language description.
- The color analysis method according to any one of claims 11 to 13, comprising: storing a dictionary further including the names of targets to be identified as parts of an image, information on the types of those targets, and information on the relationships between the different target types; storing target configuration knowledge including at least information on the relationships between the different target types; extracting the names of the targets from the natural language description using the dictionary and determining the types of the targets; and determining, using the relationships between the target types included in the target configuration knowledge, the area ratio of each color occupying the specific region on the image corresponding to each target.
- The color analysis method according to any one of claims 9 to 15, comprising adjusting the size of the region occupied by each color included in the natural language description, with respect to the entire region of the image targeted by the natural language description, according to the distance in the predetermined color space between the regions occupied by those colors.
- A color analysis program for causing a computer to execute: a process of determining, using phrases indicating the relationships between colors included in a natural language description, which is a description of colors expressed in natural language, the proportion of the area occupied by the colors included in the natural language description with respect to the entire region of the image targeted by the natural language description; and a process of generating data indicating a distribution of values in a predetermined color space.
- The color analysis program according to claim 17, causing the computer to further execute a process of adjusting, using the color modifiers included in the natural language description, the position or size of the region occupied in the predetermined color space by the colors expressed in natural language in the natural language description.
- A color analysis program for causing a computer to execute: a process of determining, for a color identified from a natural language description, which is a description of colors expressed in natural language, and a predetermined color stored in advance, the proportion of the area occupied by each of these colors with respect to the entire region of the image targeted by the natural language description; and a process of generating data indicating a distribution of values in a predetermined color space.
- The color analysis program according to any one of claims 17 to 19, causing the computer to further execute: a process of extracting, from the natural language description, the names of targets to be identified as parts of an image and determining the types of the targets; and a process of determining, using information on the relationships between different target types, the area ratio of each color occupying the specific region on the image corresponding to each target.
- The color analysis program according to any one of claims 17 to 20, causing the computer to further execute a process of adjusting the size of the region occupied by each color included in the natural language description, with respect to the entire region of the image targeted by the natural language description, according to the distance in the predetermined color space between the regions occupied by those colors.
Priority Applications (4)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP10823188.7A | 2009-10-16 | 2010-10-12 | Color analysis device, color analysis method, and color analysis program |
| US13/502,063 | 2009-10-16 | 2010-10-12 | Color description analysis device, color description analysis method, and color description analysis program |
| JP2011543938A | 2009-10-16 | 2010-10-12 | Color analysis device, color analysis method, and color analysis program |
| CN201080046661.8A | 2009-10-16 | 2010-10-12 | Color description analysis device and color description analysis method |
Applications Claiming Priority (2)

| Application Number | Priority Date |
|---|---|
| JP2009239000 | 2009-10-16 |
| JP2009-239000 | 2009-10-16 |
Publications (1)

| Publication Number | Publication Date |
|---|---|
| WO2011045920A1 | 2011-04-21 |

Family

ID=43875976

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2010/006058 (WO2011045920A1) | Color analysis device, color analysis method, and color analysis program | 2009-10-16 | 2010-10-12 |
Country Status (5)

| Country | Link |
|---|---|
| US (1) | 9400808B2 |
| EP (1) | 2490132A4 |
| JP (1) | 5682569B2 |
| CN (1) | 102667767B |
| WO (1) | 2011045920A1 |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5724430B2 (ja) * | 2011-02-15 | 2015-05-27 | カシオ計算機株式会社 | 情報検索装置およびプログラム |
WO2013021889A1 (ja) * | 2011-08-05 | 2013-02-14 | 楽天株式会社 | 色名決定装置、色名決定方法、情報記録媒体、ならびに、プログラム |
CN103718212B (zh) * | 2011-08-05 | 2016-10-12 | 乐天株式会社 | 颜色确定装置、颜色确定系统和颜色确定方法 |
JP6019968B2 (ja) * | 2012-09-10 | 2016-11-02 | 株式会社リコー | レポート作成システム、レポート作成装置及びプログラム |
US10409822B2 (en) * | 2014-05-06 | 2019-09-10 | Shutterstock, Inc. | Systems and methods for presenting ranked search results |
JP6964980B2 (ja) | 2014-05-19 | 2021-11-10 | アベリー・デニソン・リテイル・インフォメーション・サービシズ・リミテッド・ライアビリティ・カンパニーAvery Dennison Retail Information Services, Llc | スキャン可能なマークを有する合成イメージ熱転写物 |
CN104298786B (zh) * | 2014-11-12 | 2018-07-10 | 广州出益信息科技有限公司 | 一种图像检索方法及装置 |
US11062142B2 (en) | 2017-06-29 | 2021-07-13 | Accenture Global Solutions Limited | Natural language unification based robotic agent control |
US11436771B2 (en) | 2020-11-20 | 2022-09-06 | International Business Machines Corporation | Graph-based color description generation |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH05108728A (ja) * | 1991-10-21 | 1993-04-30 | Hitachi Ltd | 画像のフアイリングならびに検索方法 |
JP2007304738A (ja) * | 2006-05-10 | 2007-11-22 | Viva Computer Co Ltd | 画像蓄積・検索システムと同システム用の画像蓄積装置及び画像検索装置並びにプログラム |
JP2009003581A (ja) | 2007-06-19 | 2009-01-08 | Viva Computer Co Ltd | 画像蓄積・検索システム及び画像蓄積・検索システム用プログラム |
JP2009239000A (ja) | 2008-03-27 | 2009-10-15 | Dainippon Screen Mfg Co Ltd | 基板処理システム |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4144961B2 (ja) * | 1999-01-14 | 2008-09-03 | 富士フイルム株式会社 | 画像データ通信システム,サーバ・システムおよびその制御方法ならびにサーバ・システムを制御するためのプログラムを格納した記録媒体 |
JP5170961B2 (ja) * | 2006-02-01 | 2013-03-27 | ソニー株式会社 | 画像処理システム、画像処理装置および方法、プログラム、並びに記録媒体 |
JP2007206919A (ja) * | 2006-02-01 | 2007-08-16 | Sony Corp | 表示制御装置および方法、プログラム、並びに記録媒体 |
JP2007206920A (ja) * | 2006-02-01 | 2007-08-16 | Sony Corp | 画像処理装置および方法、検索装置および方法、プログラム、並びに記録媒体 |
US8885236B2 (en) * | 2006-06-30 | 2014-11-11 | Geoffrey J. Woolfe | Natural language color communication and system interface |
US7755646B2 (en) * | 2006-10-17 | 2010-07-13 | Hewlett-Packard Development Company, L.P. | Image management through lexical representations |
US20090192990A1 (en) * | 2008-01-30 | 2009-07-30 | The University Of Hong Kong | Method and apparatus for realtime or near realtime video image retrieval |
US8229210B2 (en) * | 2008-04-02 | 2012-07-24 | Bindu Rama Rao | Mobile device with color detection capabilities |
2010
- 2010-10-12 CN CN201080046661.8A patent/CN102667767B/zh active Active
- 2010-10-12 WO PCT/JP2010/006058 patent/WO2011045920A1/ja active Application Filing
- 2010-10-12 US US13/502,063 patent/US9400808B2/en active Active
- 2010-10-12 JP JP2011543938A patent/JP5682569B2/ja active Active
- 2010-10-12 EP EP10823188.7A patent/EP2490132A4/en not_active Withdrawn
Non-Patent Citations (6)
Title |
---|
HARADA ET AL.: "On Constructing Shape Feature Space for Interpreting Subjective Expressions", TRANSACTIONS OF INFORMATION PROCESSING SOCIETY OF JAPAN, vol. 40, no. 5, 1999, pages 2356 - 2366 |
HIROSHI SUGIURA ET AL.: "Image Retrieval by Natural Language", IEICE TECHNICAL REPORT, vol. 94, no. 51, 20 May 1994 (1994-05-20), pages 55 - 62, XP008153549 * |
IHARA ET AL.: "Mobloget: A Retrieval System for Texts and Images in Blogs", WORKSHOP ON INTERACTIVE SYSTEMS AND SOFTWARE (WISS2005), 2005, pages 69 - 74
KOBAYAKAWA ET AL.: "Interactive Image Retrieval Based on Wavelet Transform and Its Application to Japanese Historical Image Data", TRANSACTIONS OF INFORMATION PROCESSING SOCIETY OF JAPAN, vol. 40, no. 3, 1999, pages 899 - 911 |
See also references of EP2490132A4 |
SHOJI HARADA ET AL.: "On Constructing Pictorial Feature Space For Image Retrieval", IEICE TECHNICAL REPORT, vol. 95, no. 322, 19 October 1995 (1995-10-19), pages 7 - 12, XP008153534 * |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013005262A1 (ja) * | 2011-07-07 | 2013-01-10 | パイオニア株式会社 | 画像抽出方法、画像抽出装置、画像抽出システム、サーバー、ユーザー端末、通信システムおよびプログラム |
JP5687806B1 (ja) * | 2014-03-28 | 2015-03-25 | 楽天株式会社 | 色推定装置、色推定方法及び色推定プログラム |
WO2015145766A1 (ja) * | 2014-03-28 | 2015-10-01 | 楽天株式会社 | 色推定装置、色推定方法及び色推定プログラム |
CN105574046A (zh) * | 2014-10-17 | 2016-05-11 | 阿里巴巴集团控股有限公司 | 一种设置网页色彩的方法及装置 |
CN105574046B (zh) * | 2014-10-17 | 2019-07-12 | 阿里巴巴集团控股有限公司 | 一种设置网页色彩的方法及装置 |
JP6028130B1 (ja) * | 2016-02-09 | 2016-11-16 | 楽天株式会社 | 色分類装置、色分類方法、プログラム、ならびに、非一時的なコンピュータ読取可能な情報記録媒体 |
WO2017138088A1 (ja) * | 2016-02-09 | 2017-08-17 | 楽天株式会社 | 色分類装置、色分類方法、プログラム、ならびに、非一時的なコンピュータ読取可能な情報記録媒体 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2011045920A1 (ja) | 2013-03-04 |
US9400808B2 (en) | 2016-07-26 |
CN102667767A (zh) | 2012-09-12 |
CN102667767B (zh) | 2015-01-07 |
US20120195499A1 (en) | 2012-08-02 |
JP5682569B2 (ja) | 2015-03-11 |
EP2490132A1 (en) | 2012-08-22 |
EP2490132A4 (en) | 2016-10-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5682569B2 (ja) | 色彩解析装置、色彩解析方法、及び色彩解析プログラム | |
US8700494B2 (en) | Identifying product variants | |
US7917514B2 (en) | Visual and multi-dimensional search | |
US20070171473A1 (en) | Information processing apparatus, Information processing method, and computer program product | |
US20080104020A1 (en) | Handwritten Query Builder | |
JP2007286864A (ja) | 画像処理装置、画像処理方法、プログラムおよび記録媒体 | |
KR102119253B1 (ko) | 영상데이터의 추상적특성 획득 방법, 장치 및 프로그램 | |
JP6525921B2 (ja) | 画像処理装置、画像処理方法、検索装置 | |
JP2019520662A (ja) | 商標画像のコンテンツ・ベースの検索及び取得 | |
JP4374902B2 (ja) | 類似画像検索装置、類似画像検索方法、および類似画像検索プログラム | |
JP5507962B2 (ja) | 情報処理装置及びその制御方法、プログラム | |
Chang et al. | Deformed trademark retrieval based on 2D pseudo-hidden Markov model | |
JP2002297648A (ja) | 情報検索装置、情報検索プログラム及び記録媒体 | |
JP6736988B2 (ja) | 画像検索システム、画像処理システム及び画像検索プログラム | |
JP2011238043A (ja) | マンガコンテンツの要約を生成する要約マンガ画像生成装置、プログラム及び方法 | |
JP2003330941A (ja) | 類似画像分類装置 | |
JP2008305311A (ja) | 表示装置および方法、プログラム、並びに記録媒体。 | |
JP5790661B2 (ja) | 順序判定装置、順序判定方法および順序判定プログラム | |
JP2013016024A (ja) | 情報検索方法および装置 | |
EP4195135A1 (en) | Information processing device, information processing method, information processing system, and program | |
JP2020047031A (ja) | 文書検索装置、文書検索システム及びプログラム | |
JP5913774B2 (ja) | Webサイトを共有する方法、電子機器およびコンピュータ・プログラム | |
JP7138264B1 (ja) | 情報処理装置、情報処理方法、情報処理システム、およびプログラム | |
JP2005316881A (ja) | 図面検索のためのプログラム、図面検索装置及び図面検索結果表示方法 | |
JP2010262578A (ja) | 帳票辞書生成装置、帳票識別装置、帳票辞書生成方法、及びプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080046661.8 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10823188 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011543938 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010823188 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13502063 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |