KR20150099087A - Method and device for providing color information of image - Google Patents

Method and device for providing color information of image Download PDF

Info

Publication number
KR20150099087A
Authority
KR
South Korea
Prior art keywords
color
information
image
values
color values
Prior art date
Application number
KR1020140020596A
Other languages
Korean (ko)
Inventor
이동혁
박정훈
황성택
Original Assignee
삼성전자주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성전자주식회사 filed Critical 삼성전자주식회사
Priority to KR1020140020596A priority Critical patent/KR20150099087A/en
Publication of KR20150099087A publication Critical patent/KR20150099087A/en

Links

Images

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01J - MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J 3/00 - Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J 3/28 - Investigating the spectrum
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01J - MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J 5/00 - Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J 5/60 - Radiation pyrometry, e.g. infrared or optical thermometry using determination of colour temperature
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/90 - Determination of colour characteristics
    • G - PHYSICS
    • G10 - MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L - SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L 13/00 - Speech synthesis; Text to speech systems
    • G10L 13/02 - Methods for producing synthetic speech; Speech synthesisers

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Human Computer Interaction (AREA)
  • Acoustics & Sound (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Computational Linguistics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Color Image Communication Systems (AREA)

Abstract

The present invention relates to a method and a device for providing color information of an image. The method for providing color information of an image comprises: obtaining lighting information on a lighting environment in which the image is taken; compensating, using the obtained lighting information, first color values of the colors expressed in the image; analyzing color information on the compensated first color values based on a list of reference colors defining the color types expressed in the image; and providing the analyzed color information.

Description

[0001] The present invention relates to a method of providing color information on an image displayed on a device, and to a device performing the method.

Around the world there are about 40 million totally blind people and about 250 million people with partial blindness, and the number of people with visual impairments keeps increasing. Like sighted people, visually impaired people may want to know information about items such as clothes, bags, and shoes at a shopping mall and coordinate them with their own style. A blind person may also try to dress from his or her wardrobe to suit the situation and mood of the day. However, if there is no one to help, it may be difficult for a blind person to recognize the color of an object he or she has chosen, or to select an object of a desired color. Recently, attempts have been made to help the visually impaired understand the color of objects through the camera of a widely used mobile phone. However, such attempts provide only inaccurate color information based solely on the RGB (Red, Green, Blue) values of the preview image.

Provided are a method of providing color information on an image, and a device performing the method. Also provided is a computer-readable recording medium on which a program for causing a computer to execute the method is recorded. The technical problems to be solved by the present embodiment are not limited to those described above, and other technical problems may exist.

According to an aspect, a method of providing color information for an image in a device includes obtaining illumination information in an illumination environment in which the image is captured; Using the acquired illumination information to compensate for first color values of colors represented in the image; Analyzing color information for the compensated first color values based on a list of reference colors defining a type of colors represented in the image; And providing the analyzed color information.

Further, the compensating step adjusts the hue or brightness of the first color values of the colors in the first color space according to the kind of the acquired illumination information.

The compensating may include adjusting the hue of the first color values when the obtained illumination information is information on an attribute of the light source, and adjusting the brightness of the first color values when the obtained illumination information is information on illuminance.

In addition, the obtained illumination information includes information on properties of the light source in the illumination environment, which is measured using at least one of a spectral spectrometer and a color temperature meter provided in the device.

The acquired illumination information includes information on illuminance in the illumination environment measured using an illuminance sensor provided in the device.

Further, the acquiring step acquires the illumination information from the meta information about the illumination environment included in the received image data when the image is based on image data received from outside the device.

In addition, the analyzing step analyzes the color information for the colors represented by pixels in a predetermined area in the image.

In addition, the predetermined area may be an area excluding an edge area of a predetermined thickness in the image or a partial area designated by a user's input in the image.

The analyzing may further include: generating the list of reference colors; converting the compensated first color values in a first color space into second color values in a second color space; and classifying the converted second color values based on the generated list of reference colors, wherein the providing step provides the color information including information about at least one reference color corresponding to the classified second color values.

The method may further include converting third color values of the reference colors in the first color space into fourth color values in the second color space, and the classifying may classify the converted second color values based on the converted fourth color values.

The first color space includes at least one of an RGB (Red, Green, Blue) color space, a YCbCr color space, and a CMYK color space, and the second color space includes at least one of a CIELAB color space and a hue saturation value (HSV) color space.

The classifying may further include: setting seed color values corresponding to the reference colors in the second color space; and determining, by performing a color difference calculation using the seed color values and the converted second color values, at least one cluster of seed color values to which the converted second color values belong, wherein the second color values are classified based on the determination result.

The color difference calculation may include at least one of a color difference calculation using k-means clustering and a color difference calculation for searching a histogram bin having a minimum color difference.

If the number of reference colors to be included in the color information is smaller than the number of clusters to which the converted second color values belong, the determining may further include determining a more dominant cluster by merging neighboring clusters in the second color space.

In addition, the providing step provides the color information for the dominant reference color among the at least one reference color corresponding to the classified second color values.

In addition, the providing step provides the color information for a predetermined number of reference colors in the most dominant order among the at least one reference color corresponding to the classified second color values.

In addition, the providing step provides at least one of a name and a ratio for the at least one reference color corresponding to the classified second color values using TTS (text to speech).

In addition, the list of reference colors includes colors classified by the Macbeth color chart.

In addition, when the image expresses human skin, the method further includes providing fashion information or beauty information matching the analyzed color information.

The method further includes the step of classifying the images stored in the device using at least one of the obtained illumination information and the analyzed color information.

According to another aspect, there is provided a computer-readable recording medium recording a program for causing a computer to execute the method of providing color information.

According to yet another aspect, a device for providing color information on an image comprises: an illumination environment information obtaining unit for obtaining illumination information in an illumination environment in which the image is taken; A lighting influence analyzer for compensating for first color values of colors represented in the image using the obtained illumination information; A color analyzer for analyzing color information on the compensated first color values based on a list of reference colors defining a kind of colors represented in the image; And a color information providing unit for providing the analyzed color information.

Further, the illumination effect analyzing unit adjusts the hue or brightness of the first color values of the colors in the first color space according to the obtained illumination information.

The illumination environment information obtaining unit may include at least one of a spectral spectrometer, a color temperature meter, and an illuminance sensor, and may obtain information on an attribute of the light source in the illumination environment measured using at least one of the spectral spectrometer and the color temperature meter, or information on the illuminance in the illumination environment measured using the illuminance sensor.

The color analyzer may analyze the color information for the colors represented by pixels in a predetermined area in the image, and the predetermined area may be a region excluding an edge area of a predetermined thickness in the image or a partial area designated by the user's input in the image.

The color analyzer may further include: a reference color generator for generating the list of reference colors; a color space converter for converting the compensated first color values in the first color space into second color values in the second color space; and a color classification unit for classifying the converted second color values based on the generated list of reference colors, wherein the color information providing unit provides the color information including information about at least one reference color corresponding to the classified second color values.

The color space conversion unit may convert third color values of the reference colors in the first color space into fourth color values in the second color space, and the color classification unit may classify the converted second color values based on the converted fourth color values.

The color classification unit may further include: a seed color value setting unit for setting seed color values corresponding to the reference colors in the second color space; and a cluster determination unit for determining, through a color difference calculation using the seed color values and the converted second color values, at least one cluster of seed color values to which the converted second color values belong, wherein the second color values are classified based on the determination result.

Also, the color information providing unit provides at least one of a name and a ratio for the at least one reference color corresponding to the classified second color values using TTS (text to speech).

The apparatus further includes a user interface unit for displaying the image and receiving a user input for setting an area in which the color information is to be provided in the displayed image.

According to the above description, objective color analysis information on the colors of an image displayed on a device can be provided through image analysis by the device. In particular, visually impaired persons who cannot see an object expressed in an image can recognize the color information of the object more conveniently through voice guidance of the color information provided by the device. In addition, objective color information can be used in various fields such as fashion, beauty, and education to help people make decisions.

FIG. 1 is a diagram showing a device for providing color information on an image according to the present embodiment.
FIG. 2A is a configuration diagram of a device for providing color information on an image according to the present embodiment.
FIG. 2B is a detailed configuration diagram of the color analysis unit in the device of FIG. 2A.
FIG. 2C is a diagram showing an illumination environment information acquisition unit mounted in the device according to the present embodiment.
FIGS. 3A and 3B are views for explaining an area to be provided with color information by the user through the user interface unit according to the present embodiment.
FIGS. 4A and 4B illustrate how the reference color generator generates a list of reference colors using a Macbeth color chart according to the present embodiment.
FIG. 5 is a diagram for explaining setting of the preprocessing area in the color space conversion unit according to the present embodiment.
FIG. 6 is a diagram for explaining a generally known process of converting color values of an RGB color space into color values of a CIELAB color space.
FIG. 7 is a detailed configuration diagram of a color classification unit according to the present embodiment.
FIG. 8 is a diagram for explaining classification of second color values in a cluster determination unit into a predetermined number of clusters corresponding to reference colors, according to the present embodiment.
FIG. 9 is a diagram for explaining color information about colors of an image classified by the color classification unit according to the present embodiment.
FIG. 10 is a diagram showing a device for providing color information through a TTS according to the present embodiment.
FIG. 11 is a flowchart of a method of providing color information for an image displayed on a device according to the present embodiment.

Hereinafter, the embodiments will be described in detail with reference to the drawings.

FIG. 1 is a diagram showing a device for providing color information on an image according to the present embodiment.

Referring to FIG. 1, the device 10 provides the user 20 with color information 30 for the displayed image 11. The color information 30 may include information that the dominant color represented in the displayed image 11 is "white", or the names and ratios of the colors represented in the displayed image 11, such as "white 90%, black 10%".

If the visual cognitive abilities of the user 20 are normal, the user can directly look at the image 11 displayed on the device 10 and identify the colors represented in the image 11. However, if the user 20 is totally blind, partially blind, or color-blind, it may be difficult to identify the colors represented in the image 11 without assistance.

In such a case, the device 10 may provide the color information represented in the displayed image 11 to the user 20 through audio guidance means such as TTS (text to speech).

FIG. 2A is a configuration diagram of a device for providing color information on an image according to the present embodiment.

Referring to FIG. 2A, the device 10 includes a user interface unit 110, a processor 120, a color information providing unit 130, and an illumination environment information obtaining unit 140. Here, the processor 120 includes a lighting effect analyzing unit 121 and a color analyzing unit 123. In the device 10 of FIG. 2A, only the components related to the present embodiment are shown so that its characteristics are not obscured; other general-purpose components may be further included in addition to the components shown in FIG. 2A. It will also be appreciated by those of ordinary skill in the art that the device 10 may be implemented with other types of hardware modules capable of performing the operations described in this embodiment.

The device 10 described in this embodiment may correspond to a portable device such as a smartphone, a tablet device, a personal digital assistant (PDA), a wearable watch, or wearable glasses, or to another computing device, and is not limited to devices similar to those listed above.

The user interface unit 110 may include an input device capable of receiving input information from the user 20, such as a keypad, a keyboard, a mouse, or a touch screen, and a display device capable of displaying a user interface (UI) screen and processing information of the device 10.

The user interface unit 110 displays an image (11 in FIG. 1) on a user interface (UI) screen. The image 11 may correspond to an image taken by the device 10, an image stored in the device 10, or an image retrieved by web surfing using the device 10.

The user interface unit 110 can also receive from the user 20 selection information for the area of the image 11 for which color information is to be provided.

FIGS. 3A and 3B are views for explaining an area to be provided with color information by the user through the user interface unit according to the present embodiment.

Referring to FIG. 3A, when the image 11 is displayed on the user interface (UI) screen of the device 10, the user 20 may wish to be provided with color information on the distribution of colors in the entire area 301 of the image 11. In this case, the user 20 can designate the entire area 301 through the user interface unit 110 so that the entire image 11 is selected.

Referring to FIG. 3B, when the image 11 is displayed on the user interface (UI) screen of the device 10, the user 20 may wish to be provided with color information on the distribution of colors in only a partial area 303 of the image 11. In such a case, the user 20 can designate the partial area 303 through the user interface unit 110 so that only a part of the image 11 is selected.

Alternatively, without the user 20 designating the entire area 301 or the partial area 303 of the image 11 through the user interface unit 110, the device 10 can be set in advance so that the entire area 301 or a partial area 303 at a preset position is designated automatically. That is, the present embodiment is not limited by any one way of specifying the area for which color information is to be provided.

Referring back to FIG. 2A, the illumination environment information acquisition unit 140 acquires illumination information in an illumination environment in which an image is captured.

The illumination environment information acquisition unit 140 may correspond to, for example, a spectral spectrometer. That is, the illumination environment information acquisition unit 140 can acquire information about the spectrum of the light source in the illumination environment at the time the image 11 is photographed. It is generally known that the spectrum differs depending on the type of light source. Based on the acquired information about the spectrum of the light source, the illumination environment information acquisition unit 140 acquires illumination information about the light source at the time the image 11 is photographed, such as whether it is an incandescent lamp, a fluorescent lamp, a candle, or sunlight.

The illumination environment information acquisition unit 140 may correspond to a color temperature meter as another example. That is, the illumination environment information acquisition unit 140 can acquire information about the color temperature of the light source in the illumination environment at the time the image 11 is photographed. It is generally known that light sources can be classified according to color temperature. Based on the acquired information about the color temperature of the light source, the illumination environment information obtaining unit 140 acquires illumination information about the light source at the time the image 11 is photographed.

However, the illumination environment information acquisition unit 140 may correspond to other types of hardware modules in addition to the spectrum spectrometer or the color temperature meter, as long as it can acquire illumination information relating to the attributes of the light source.

The illumination environment information acquisition unit 140 may correspond to an illuminance sensor as another example. That is, the illumination environment information acquisition unit 140 can acquire information on the value of the illuminance in the illumination environment at the time the image 11 is photographed.

If the illumination environment information acquisition unit 140 is a separately manufactured special-purpose hardware module such as a spectral spectrometer, a color temperature meter, or an illuminance sensor, information about the light source or the illuminance at the time the image 11 is photographed can be obtained directly.

Meanwhile, the illumination environment information acquisition unit 140 may acquire illumination information from meta information about the illumination environment included in image data received from outside the device 10. That is, when the image 11 is not an image captured by the device 10 but image data transmitted by an external device or received through web surfing on the Internet or the like, the illumination environment information acquisition unit 140 can acquire illumination information on the attributes of the light source, the illuminance, and so on by analyzing the meta information included in the image data. In this case, the illumination environment information acquisition unit 140 may be implemented in the processor 120 or as a separate processor module.
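As a rough illustration only, one way such meta information could be read is from standard EXIF fields of the received image data, for example with the Pillow library in Python. The patent does not specify this scheme; the library choice, function names, and the use of the EXIF LightSource field are assumptions.

```python
from PIL import Image, ExifTags

def illumination_from_metadata(path):
    """Best-effort read of illumination-related EXIF metadata (assumed scheme)."""
    exif = Image.open(path).getexif()
    # Map numeric EXIF tag IDs to their standard names.
    named = {ExifTags.TAGS.get(tag, tag): value for tag, value in exif.items()}
    # 'LightSource' encodes the light source type (daylight, tungsten, ...)
    # in standard EXIF; it may simply be absent from the received data.
    return {key: named[key] for key in ("LightSource",) if key in named}
```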

FIG. 2C is a diagram showing an illumination environment information acquisition unit mounted in the device according to the present embodiment.

Referring to FIG. 2C, the illumination environment information obtaining unit 140 may be mounted at a position adjacent to the camera module 150 on the rear surface of the device 10. The illumination environment information acquisition unit 140 acquires information about the attributes of the light source or the illuminance in the illumination environment when an image is captured by the camera module 150. The illumination environment information obtaining unit 140 may be a hardware module such as a spectral spectrometer, a color temperature meter, or an illuminance sensor modularly mounted in the device 10.

In FIG. 2C, the illumination environment information obtaining unit 140 is located adjacent to the camera module 150 provided on the rear side of the device 10, but this is only for convenience of explanation; the position at which the illumination environment information obtaining unit 140 is mounted on the device 10 may vary.

The processor 120 is a hardware component that controls the overall operation and functions of the device 10, and the processor 120 performs the following processes to provide color information for the image 11 to the user 20.

The illumination effect analysis unit 121 compensates the first color values of the colors represented in the image using the acquired illumination information. For example, the illumination effect analysis unit 121 may adjust the hue or brightness of the first color values of the colors of the image 11 in the first color space, depending on the type of the obtained illumination information. The illumination effect analyzing unit 121 may also adjust the saturation and the like in addition to the hue and brightness of the first color values.

Here, the first color space may include RGB (Red, Green, Blue), YCbCr color space, or CMYK color space, for example. If the first color space is an RGB color space, the first color values refer to RGB color values. Hereinafter, for convenience of explanation, the first color space is an RGB color space, and the first color values are RGB color values. However, the present embodiment is not limited thereto.

Assume that the illumination information obtained by the illumination environment information acquisition unit 140 indicates that the light source is an incandescent lamp whose RGB color values are R of 100, G of 100, and B of 75. The illumination effect analysis unit 121 can compensate the hue of the RGB color values of the colors of the image 11 using the obtained illumination information as follows.

Assume that the RGB color values representing the color of a pixel of the image 11 are 200 for R, 200 for G, and 100 for B. The lighting effect analyzer 121 first calculates the brightness for the pixel. For example, based on a gray-level brightness, the brightness of the pixel can be calculated as 190 = 200 * 0.3 + 200 * 0.6 + 100 * 0.1. Here, the coefficients 0.3, 0.6, and 0.1 are known parameters that are multiplied by the RGB color values to yield the brightness at a particular gray level.

On the other hand, in the RGB color values of the light source obtained by the illumination environment information obtaining unit 140, the B color value should be increased by about 33% = (100 - 75) / 75 to become similar to the R color value or the G color value. Therefore, the lighting effect analyzer 121 adjusts the hue of the RGB color values of the pixel so that the B color value is increased by 33%. That is, the illumination effect analyzing unit 121 adjusts the RGB color values of the corresponding pixel such that R is 200, G is 200, and B is 133.

Next, the illumination effect analyzing unit 121 again calculates the brightness at the same gray level for the adjusted RGB color values of the pixel. That is, the illumination effect analyzing unit 121 calculates the brightness from the adjusted RGB color values of the pixel as 193.3 = 200 * 0.3 + 200 * 0.6 + 133 * 0.1.

The brightness at the corresponding pixel when the influence of illumination was not considered was 190, but the adjusted brightness at the corresponding pixel when considering the influence of illumination was 193.3. That is, a brightness difference of 3.3 is generated.

Thus, the illumination influence analyzer 121 reduces each of the adjusted RGB color values by 3.3, so that the finally compensated RGB color values of the pixel are 196.7 for R, 196.7 for G, and 129.7 for B.

As such, the illumination effect analyzer 121 can use the acquired illumination information to compensate for the first color values of the colors of the image 11 in the first color space.

Furthermore, the illumination effect analyzing unit 121 can compensate for the brightness of the first color values in a similar manner if the acquired illumination information is related to illuminance. The illumination effect analyzing unit 121 can also compensate the first color values in a similar manner even when the first color values are not RGB color values but YCbCr color values or the like.
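As a rough Python sketch of the compensation arithmetic described above, the following function reproduces the numerical example (light source R = 100, G = 100, B = 75; pixel R = 200, G = 200, B = 100). The gray-level weights 0.3, 0.6, and 0.1, the 33% channel boost, and the brightness re-normalization are taken from the description; generalizing the boost as a per-channel gain relative to the strongest light-source channel, as well as the function and variable names, are assumptions and not part of the patent.

```python
def compensate_pixel(pixel_rgb, light_rgb, weights=(0.3, 0.6, 0.1)):
    """Sketch of the illumination-compensation example from the description.

    pixel_rgb -- (R, G, B) of one image pixel, e.g. (200, 200, 100)
    light_rgb -- (R, G, B) measured for the light source, e.g. (100, 100, 75)
    weights   -- gray-level brightness coefficients given in the text
    """
    # Brightness of the pixel before compensation (190 in the example).
    brightness_before = sum(c * w for c, w in zip(pixel_rgb, weights))

    # Weak light-source channels are boosted to match the strongest one;
    # the same relative boost is applied to the pixel (B: about +33% here).
    max_light = max(light_rgb)
    gains = [max_light / c for c in light_rgb]            # e.g. [1.0, 1.0, 1.33]
    adjusted = [c * g for c, g in zip(pixel_rgb, gains)]  # about (200, 200, 133)

    # Brightness after the hue adjustment (193.3 in the example).
    brightness_after = sum(c * w for c, w in zip(adjusted, weights))

    # Subtract the brightness difference from every channel so that the
    # compensated pixel keeps its original brightness.
    diff = brightness_after - brightness_before
    return tuple(c - diff for c in adjusted)


if __name__ == "__main__":
    print(compensate_pixel((200, 200, 100), (100, 100, 75)))
    # -> roughly (196.7, 196.7, 130); the description rounds B to 133 and gets 129.7
```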

The color analyzer 123 analyzes the color information for the compensated first color values based on the list of reference colors that define the kind of colors represented in the image 11.

FIG. 2B is a detailed configuration diagram of the color analysis unit in the device of FIG. 2A.

Referring to FIG. 2B, the color analyzer 123 includes a reference color generator 1231, a color space converter 1233, and a color sorter 1235.

The reference color generator 1231 first generates a list of reference colors that define the kinds of colors represented in the image 11. In general, there are countless colors in the natural world, but various criteria are known for defining them as a limited number of color types.

The reference color generator 1231 may use as reference colors the names of the 24 colors defined by the Macbeth color chart (or color checker) used for color calibration and the like. The reference color generator 1231 may also use as reference colors the colors defined by the 'color names of light source colors' according to KS standard 'KS A 0012' or the 'color names of object colors' according to 'KS A 0011'. That is, the list of reference colors usable by the reference color generator 1231 is not limited to any one standard. In the present embodiment, for convenience of description, the Macbeth color chart is used as an example; however, it will be understood by those skilled in the art that the same can be applied to the KS standards and the like.

FIGS. 4A and 4B illustrate how the reference color generator generates a list of reference colors using a Macbeth color chart according to the present embodiment.

Referring to FIGS. 4A and 4B, the reference color generator 1231 generates a list of reference colors including a Dark Skin color, a Light Skin color, a Blue Sky color, a Foliage color, a Blue Flower color, a Bluish Green color, a Moderate Red color, a Purple color, a Yellow Green color, an Orange Yellow color, a Blue color, a Green color, a Red color, a Yellow color, a Magenta color, a Cyan color, a White color, and a Black color.

As shown in FIGS. 4A and 4B, the RGB values for the reference colors are specified differently in environment A and environment B. This is because, even for the same reference color, environmental conditions such as illuminance and reflectance may differ between environment A and environment B. For example, environment A can be an indoor environment lit by fluorescent light, and environment B can be an outdoor daylight environment lit by sunlight. That is, the RGB values defining the reference colors in environment A may be different from the RGB values defining the reference colors in environment B.

Therefore, the reference color generator 1231 sets in advance the RGB values of the reference colors in the environment where the device 10 is located.

As a result, whether the reference color generator 1231 uses the 24 colors defined in the Macbeth color chart or N colors defined by another standard (N being a natural number of 1 or more), it indexes the reference colors by setting reference RGB values corresponding to the reference colors.
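A minimal sketch of how such an indexed reference-color list might be represented, assuming Python. The three RGB triples are the ones quoted later in this description for the dark skin and light skin patches and one further patch; the patch name attached to the third triple and the remaining entries are placeholders, not values from the patent.

```python
# Hypothetical reference-color table for one capture environment.
REFERENCE_COLORS_ENV_A = {
    "dark skin":  (131, 81, 54),    # RGB triple quoted in the description
    "light skin": (238, 177, 147),  # RGB triple quoted in the description
    "blue sky":   (103, 137, 162),  # assumed patch name for the third quoted triple
    # ... remaining Macbeth patches (foliage, blue flower, ..., white, black)
}

# Indexing the reference colors, as described: each name is paired with an
# integer index and its reference RGB value for the current environment.
REFERENCE_INDEX = {
    i: (name, rgb) for i, (name, rgb) in enumerate(REFERENCE_COLORS_ENV_A.items())
}
```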

Referring again to FIG. 2B, the color space conversion unit 1233 preprocesses the image 11. According to the present embodiment, the preprocessing of the color space converter 1233 may include a process of setting a preprocessing area of the image 11 and a process of removing noise from the preprocessing area. Other general image preprocessing procedures may additionally be included.

First, when the user 20 selects the entire area (301 in FIG. 3A) of the image 11 through the user interface unit 110, or when the entire area 301 is selected automatically, the color space conversion unit 1233 preprocesses the inner region of the entire area 301 of the image excluding an edge area of a predetermined thickness. This is because the edge region of the image 11 can appear at a relatively low illuminance under the influence of the LED illumination of the device 10, the illumination of the external environment, or sunlight, and removing the edge region minimizes such influence. In other words, in the case of an image 11 photographed with the LED illumination of the device 10, the edge region can be dark while the middle region appears bright, so distortion of the color information can be prevented by removing the edge region.

However, when the user 20 selects a partial area (303 in FIG. 3B) of the image 11 through the user interface unit 110, the color space conversion unit 1233 preprocesses, instead of the entire area 301, the partial area 303 designated by the input of the user 20.

Next, the color space conversion unit 1233 removes noise in the preprocessed area of the image 11 by applying a general noise removal processing method such as Gaussian filtering to the preprocessed area.
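The preprocessing just described (cropping away an edge margin of a predetermined thickness or taking the user-designated area, then applying Gaussian filtering) could look roughly as follows in Python with OpenCV. The margin value, kernel size, and function names are illustrative assumptions, not parameters stated in the patent.

```python
import cv2  # OpenCV, used here only as one possible noise-removal backend

def preprocess(image_bgr, margin=10, roi=None):
    """Crop the preprocessing area and remove noise, as in the description.

    image_bgr -- the captured image as an H x W x 3 array
    margin    -- assumed edge thickness (pixels) to exclude for whole-image analysis
    roi       -- optional (x, y, w, h) area designated by the user's input
    """
    if roi is not None:
        x, y, w, h = roi
        area = image_bgr[y:y + h, x:x + w]
    else:
        h, w = image_bgr.shape[:2]
        area = image_bgr[margin:h - margin, margin:w - margin]

    # Gaussian filtering is named in the description as an example of a
    # general noise-removal method; the 5x5 kernel is an assumption.
    return cv2.GaussianBlur(area, (5, 5), 0)
```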

FIG. 5 is a diagram for explaining setting of the preprocessing area in the color space conversion unit according to the present embodiment.

Referring to FIG. 5, the color space conversion unit 1233 preprocesses the inner regions of the image 11 except for the edge region 501 having a predetermined thickness, as described above.

Referring again to FIG. 2B, the color space conversion unit 1233 converts the first color values of the colors represented in the image 11 in the first color space into the second color values in the second color space.

Here, the color space conversion unit 1233 converts the first color values compensated by the illumination effect analysis unit 121 into the second color values.

Here, the first color space may include RGB (Red, Green, Blue), YCbCr color space, CMYK color space, or the like, as described above. In addition, the second color space may include, for example, a CIELAB color space or a hue saturation value (HSV) color space. That is, the color space conversion unit 1233 converts color values in a certain color space into color values in a different kind of color space. Hereinafter, the color space conversion unit 1233 converts the color values from the RGB color space to the CIELAB color space, but the present embodiment is not limited thereto.

FIG. 6 is a diagram for explaining a generally known process of converting color values of an RGB color space into color values of a CIELAB color space.

Referring to FIG. 6, the R, G, and B values of the color values in the RGB color space are first converted into X, Y, and Z values through Equation (601), and then converted into an L* value, an a* value, and a b* value.

Meanwhile, the process of converting the color values of the RGB color space into the color values of the CIELAB color space will be apparent to those skilled in the art, so a detailed description will be omitted.
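For reference, the generally known sRGB to XYZ to CIELAB conversion of the kind referred to here can be sketched as follows (D65 white point). This is the standard conversion, not a formula reproduced from the patent's Equation (601); a concrete implementation may differ in white point and matrix coefficients.

```python
def rgb_to_lab(r, g, b):
    """Generally known sRGB -> XYZ -> CIELAB conversion (D65 white point)."""
    # 1. Normalize and linearize sRGB.
    def lin(c):
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    rl, gl, bl = lin(r), lin(g), lin(b)

    # 2. Linear RGB -> XYZ (sRGB matrix).
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl

    # 3. XYZ -> L*a*b* relative to the D65 white point.
    xn, yn, zn = 0.95047, 1.0, 1.08883
    def f(t):
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116
    fx, fy, fz = f(x / xn), f(y / yn), f(z / zn)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)
```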

Referring again to FIG. 2B, the color space conversion unit 1233 not only transforms the color values of the image 11 itself, but also converts the RGB values of the reference colors described above into the color values of the second color space.

More specifically, the color space conversion unit 1233 converts the third color values (RGB values) of the reference colors in the first color space (for example, the RGB color space) into fourth color values (L*a*b* values) in the second color space (for example, the CIELAB color space). Accordingly, the color space conversion unit 1233 converts the RGB values of the reference colors, such as (131, 81, 54) for the dark skin color, (238, 177, 147) for the light skin color, (103, 137, 162), and so on, into fourth color values (L*a*b* values) in the second color space (for example, the CIELAB color space).

The color classification unit 1235 classifies the second color values converted from the first color values by the color space conversion unit 1233, based on the list of reference colors generated by the reference color generation unit 1231. More specifically, the color classification unit 1235 classifies the second color values (L*a*b* values) converted from the first color values (RGB values) on the basis of the fourth color values (L*a*b* values) converted from the third color values (RGB values) of the reference colors.

FIG. 7 is a detailed configuration diagram of a color classification unit according to the present embodiment.

Referring to FIG. 7, the color classification unit 1235 may include a seed color value setting unit 1237 and a cluster determination unit 1239.

The seed color value setting unit 1237 sets seed color values corresponding to the reference colors in the second color space. That is, when the Macbeth color chart is used as in the present embodiment, the seed color values are the fourth color values (L*a*b* values) of the reference colors such as the dark skin color and the light skin color.

The cluster determination unit 1239 determines, through a color difference calculation using the seed color values (the fourth color values, that is, the L*a*b* values) and the second color values, at least one cluster of seed color values to which the second color values belong. Here, a cluster means a distribution range of color values, set so that a second color value within it can be determined to correspond to the reference color of that seed color value. The range of a cluster is not fixed and can change according to the distribution of the second color values.

The cluster determining unit 1239 performs a color difference computation using k-means clustering, or a color difference computation that searches for the histogram bin having the minimum color difference, on the seed color values and the second color values. In addition to k-means clustering or histogram bin search, the color difference computations performed by the cluster determiner 1239 may apply various other algorithms that classify a distribution of data values into a predetermined number of clusters according to reference values used as classification criteria.
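A minimal sketch of the minimum-color-difference variant described here: each converted second color value is assigned to the seed (reference) color with the smallest color difference in CIELAB, and the distribution ratio per cluster is counted. The k-means variant would instead iteratively update cluster centers starting from the seeds; the function names and the use of a plain Euclidean distance as the color difference are assumptions.

```python
from collections import Counter
import math

def classify_by_seed_colors(lab_pixels, seed_lab):
    """Assign every L*a*b* value to the nearest seed color and compute ratios.

    lab_pixels -- iterable of (L, a, b) second color values of the image
    seed_lab   -- dict {reference color name: (L, a, b) seed color value}
    Returns {reference color name: ratio}, most dominant first.
    """
    counts = Counter()
    for lab in lab_pixels:
        # Minimum color difference (here: simple Euclidean distance in CIELAB).
        nearest = min(seed_lab, key=lambda name: math.dist(lab, seed_lab[name]))
        counts[nearest] += 1

    total = sum(counts.values())
    return {name: n / total for name, n in counts.most_common()}
```

With this sketch, the dominant reference color of the image is simply the first entry of the returned dictionary, and a predetermined number of the most dominant colors can be reported by truncating it.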

FIG. 8 is a diagram for explaining classification of second color values in a cluster determination unit into a predetermined number of clusters corresponding to reference colors, according to the present embodiment.

Referring to FIG. 8, the distribution of the second color values (L*a*b* values) of the image 11 is classified into clusters 802, 804, and 806 by the seed color value 801 of the White color, the seed color value 803 of the Yellow color, and the seed color value 805 of the Black color. Clustering or sorting by the seed color values 801, 803, and 805 may be performed using k-means clustering, or using a color difference operation that searches for the histogram bin having the minimum color difference with respect to the seed color values 801, 803, and 805 and the second color values.

Referring again to FIG. 2B, the cluster determination unit 1239 determines at least one of the clusters to which the second color values belong, based on the clustering or classification result as shown in FIG. 8. Then, the cluster determination unit 1239 determines at least one reference color corresponding to the determined at least one cluster.

More specifically, the cluster determining unit 1239 can determine a cluster 802 of the White color, a cluster 804 of the Yellow color, and a cluster 806 of the Black color, as shown in FIG. 8. Accordingly, the cluster determination unit 1239 determines the White color, the Yellow color, and the Black color as the classification result of the second color values. In other words, the cluster determination unit 1239 can determine all the reference colors corresponding to all classified clusters.

The cluster determination unit 1239 can also determine the ratio of the white color, the yellow color, and the black color according to the distribution ratio of the second color values belonging to the respective clusters 802, 804, and 806.

On the other hand, the cluster determination unit 1239 can determine that there are two or more reference colors in the image 11 as described above, but the cluster determination unit 1239 can also determine only the single reference color that is the most dominant.

That is, the cluster determining unit 1239 can determine only the reference color having the highest distribution ratio among the two or more reference colors. For example, when the second color values contained in the White color cluster 802 are relatively the most numerous among the White color cluster 802, the Yellow color cluster 804, and the Black color cluster 806 shown in FIG. 8, the cluster determination unit 1239 can determine the White color for the image 11.

Further, the cluster determination unit 1239 may determine a predetermined number of reference colors in the most dominant order, rather than only one reference color. Here, the predetermined number may be a number set in advance by the user 20.

The cluster determining unit 1239 can decide to merge neighboring clusters in the second color space when the number of reference colors to be included in the color information is smaller than the number of clusters to which the second color values belong. For example, when a White color cluster and a Yellow color cluster are adjacent to each other but the number of second color values belonging to the White color cluster is relatively larger than the number belonging to the Yellow color cluster, the cluster determination unit 1239 can determine only one White color cluster by merging the Yellow color cluster into the White color cluster.

FIG. 9 is a diagram for explaining color information about colors of an image classified by the color classification unit according to the present embodiment.

Referring to FIG. 9, in the case of the image 901, the color classification unit 1235 can classify the entire region of the image 901 as gray 100%. In the case of the image 902, the color classification unit 1235 can classify the entire area of the image 902 as white 50% and blue 50%. In the case of the image 903, the color classification unit 1235 can classify the entire area of the image 903 as white 60%, brown 30%, and black 10%.

Referring again to FIGS. 2A and 2B, the color information providing unit 130 provides color information for at least one reference color corresponding to the second color values classified by the color classification unit 1235 (cluster determination unit 1239). Here, the color information includes at least one of a name and a ratio of the determined reference color.

The color information providing unit 130 may provide at least one of the name and the ratio of the determined reference color using TTS (text to speech). As described above, when the user 20 is visually impaired, the user 20 may not be able to recognize the color information even if it is displayed on the user interface (UI) screen of the user interface unit 110; therefore, the color information providing unit 130 provides color information on the name of the reference color or the ratio of the reference color through voice guidance.

FIG. 10 is a diagram showing a device for providing color information through a TTS according to the present embodiment.

Referring to FIG. 10, the device 10 displays "Cement: 25%, Mouse: 21%, India Ink: 18%, Silver: 13%, Salmon: 5%" as the color information of the entire area 1002 of the image through the UI screen of the user interface unit 110. Further, the device 10 can display the phrase "The Color of center region is Salmon" as the color information of the center area 1003 of the image through the user interface (UI) screen of the user interface unit 110.

Further, through the color information providing unit 130, the device 10 can output the voice guidance "Cement: 25%, Mouse: 21%, India Ink: 18%, Silver: 13%, Salmon: 5%" or "The Color of the center region is Salmon" via TTS.
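A small sketch of how such voice guidance could be produced, using the pyttsx3 text-to-speech library as a stand-in for whatever TTS engine the device actually uses; the patent only requires TTS, not this particular library, and the formatting helper is an assumption.

```python
import pyttsx3

def speak_color_info(ratios):
    """Read out reference-color names and ratios, e.g. {'Cement': 0.25, ...}."""
    text = ", ".join(f"{name}: {ratio:.0%}" for name, ratio in ratios.items())
    engine = pyttsx3.init()
    engine.say(text)          # e.g. "Cement: 25%, Mouse: 21%, ..."
    engine.runAndWait()
```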

Referring again to FIG. 2A, the device 10 can provide fashion information or beauty information utilizing the color information together with the color information. When the image 11 to be analyzed is related to human skin, the user interface unit 110 may display the skin image and provide beauty information matching the color of the skin. That is, the user interface unit 110 can provide color information corresponding to the skin image together with information on cosmetics or clothes that match the skin color. In addition, while displaying the skin image, the user interface unit 110 may provide information on colors matching the skin color, for example to help produce an artificial hand or other prosthesis whose color is close to the skin color.

The device 10 may also utilize the color information for educational purposes. For example, the device 10 may provide the color information on the distribution of the various colors represented in the image 11 through the user interface unit 110 in the form of quizzes.

The device 10 may utilize the color information to classify the images stored in the device 10 according to the illumination information or the color type. For example, images taken at the beach may have blue as the dominant color, so the device 10 can separately classify and manage, as beach images, the images whose dominant color is blue, based on the color information analyzed for each of the stored images. Alternatively, since the illumination information for images photographed at the beach indicates sunlight as the light source, the device 10 can separately classify and manage, as beach images, the images for which sunlight is the light source, based on the illumination information obtained for each of the stored images.

In addition, the device 10 can provide not only color information but also information on various fields such as education information, fashion information, beauty information, etc. using color information.

FIG. 11 is a flowchart of a method of providing color information for an image displayed on a device according to the present embodiment.

Referring to FIG. 11, the method of providing color information according to the present embodiment consists of steps processed in a time-series manner in the device 10 described above. Therefore, even if omitted below, the descriptions given above with reference to the drawings also apply to the method of providing color information according to the present embodiment.

In step 1101, the illumination environment information acquisition unit 140 acquires illumination information in the illumination environment in which the image 11 is photographed.

In step 1102, the illumination effect analysis unit 121 compensates the first color values of the colors represented in the image 11 using the acquired illumination information.

In step 1103, the color analyzer 123 analyzes the color information for the compensated first color values based on the list of reference colors defining the kind of colors represented in the image 11.

In step 1104, the color information providing unit 130 provides the analyzed color information.
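Putting the steps of FIG. 11 together, the overall flow can be sketched as below. The helper functions are the illustrative ones sketched earlier in this description (compensate_pixel, rgb_to_lab, classify_by_seed_colors, speak_color_info) and are assumptions rather than the patent's actual implementation; the acquired illumination information is passed in here as a simple light-source RGB triple.

```python
def provide_color_information(image_rgb_pixels, light_rgb, seed_lab):
    """Steps 1101-1104: acquire illumination, compensate, analyze, provide."""
    # 1101: light_rgb stands in for the acquired illumination information.
    # 1102: compensate the first color values using the illumination information.
    compensated = [compensate_pixel(px, light_rgb) for px in image_rgb_pixels]
    # 1103: convert to the second color space and classify against the seeds.
    lab_pixels = [rgb_to_lab(*px) for px in compensated]
    ratios = classify_by_seed_colors(lab_pixels, seed_lab)
    # 1104: provide the analyzed color information (display and/or TTS).
    speak_color_info(ratios)
    return ratios
```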

The above-described embodiments of the present invention can be written as a program executable by a computer and can be implemented in a general-purpose digital computer that runs the program using a computer-readable recording medium. In addition, the structure of the data used in the above-described embodiments of the present invention can be recorded on a computer-readable recording medium through various means. The computer-readable recording medium includes storage media such as magnetic storage media (e.g., ROM, floppy disks, hard disks, etc.) and optical reading media (e.g., CD-ROMs, etc.).

The present invention has been described with reference to the preferred embodiments. It will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. Therefore, the disclosed embodiments should be considered in an illustrative rather than a restrictive sense. The scope of the present invention is defined by the appended claims rather than by the foregoing description, and all differences within the scope of equivalents thereof should be construed as being included in the present invention.

10: device 110: user interface unit
120: processor 130: color information providing unit
140: Lighting environment information obtaining unit
121: illumination effect analysis unit 123: color analysis unit

Claims (30)

A method of providing color information for an image in a device,
Acquiring illumination information in an illumination environment in which the image is captured;
Using the acquired illumination information to compensate for first color values of colors represented in the image;
Analyzing color information for the compensated first color values based on a list of reference colors defining a type of colors represented in the image; And
And providing the analyzed color information.
The method according to claim 1,
The compensating step
And adjusting the color or brightness of the first color values of the colors in the first color space according to the type of the obtained illumination information.
3. The method of claim 2,
The compensating step
Adjusting the color of the first color values when the obtained illumination information is information on the property of the light source and adjusting the brightness of the first color values when the obtained illumination information is information on the illumination.
The method according to claim 1,
The obtained illumination information is
And information about properties of the light source in the illumination environment, measured using at least one of a spectral spectrometer and a color temperature meter provided in the device.
The method according to claim 1,
The obtained attribute information is
And information about illuminance in the illumination environment measured using an illuminance sensor provided in the device.
The method according to claim 1,
The obtaining step
Wherein if the image is based on image data received from outside the device, the illumination information is obtained from meta information about the illumination environment contained in the received image data.
The method according to claim 1,
The analyzing step
Wherein the color information is analyzed for the colors represented by pixels in a region in the image.
8. The method of claim 7,
The predetermined region
Wherein the image is a region excluding an edge region of a predetermined thickness in the image or a region designated by a user's input in the image.
The method according to claim 1,
The analyzing step
Generating a list of reference colors;
Converting the compensated first color values in a first color space to second color values in a second color space; And
And classifying the converted second color values based on the generated list of reference colors,
The providing step
Wherein the color information comprises information about at least one reference color corresponding to the sorted second color values.
10. The method of claim 9,
Further comprising converting third color values of the reference colors in the first color space to fourth color values in the second color space,
The classifying step
And classifying the converted second color values based on the converted fourth color values.
10. The method of claim 9,
The first color space
Comprises at least one of an RGB (Red, Green, Blue) color space, a YCbCr color space, and a CMYK color space, and
The second color space
Comprises at least one of a CIELAB color space and a hue saturation value (HSV) color space.
10. The method of claim 9,
The classifying step
Setting seed color values corresponding to the reference colors in the second color space; And
Determining at least one cluster among the clusters of seed color values to which the converted second color values belong, through a color difference calculation using the seed color values and the converted second color values,
And the sorted second color values are based on the determination result.
13. The method of claim 12,
The color difference calculation
Comprises at least one of a color difference computation using k-means clustering and a color difference computation for searching for a histogram bin having a minimum color difference.
13. The method of claim 12,
The step of determining
Determining a more dominant cluster by merging neighboring clusters on the second color space if the number of reference colors to be included in the color information is less than the number of clusters to which the transformed second color values belong.
10. The method of claim 9,
The providing step
And provides the color information for the dominant reference color, of the at least one reference color corresponding to the sorted second color values.
10. The method of claim 9,
The providing step
Providing the color information for a predetermined number of reference colors in a most dominant order of the at least one reference color corresponding to the sorted second color values.
10. The method of claim 9,
The providing step
Providing at least one of a name and a ratio for the at least one reference color corresponding to the sorted second color values using text to speech.
The method according to claim 1,
The list of reference colors
And colors categorized by a Macbeth color chart.
The method according to claim 1,
Further comprising providing fashion information or beauty information that matches the analyzed color information when the image is representative of a person's skin.
The method according to claim 1,
Further comprising classifying images stored in the device using at least one of the acquired illumination information and the analyzed color information.
A computer-readable recording medium having recorded thereon a program for causing a computer to execute the method according to any one of claims 1 to 20.
A device for providing color information for an image,
An illumination environment information acquiring unit acquiring illumination information in an illumination environment in which the image is photographed;
A lighting influence analyzer for compensating for first color values of colors represented in the image using the obtained illumination information;
A color analyzer for analyzing color information on the compensated first color values based on a list of reference colors defining a kind of colors represented in the image; And
And a color information providing unit for providing the analyzed color information.
23. The method of claim 22,
The illumination effect analysis unit
And using the acquired illumination information to compensate for brightness of the first color values of the colors in a first color space.
23. The method of claim 22,
The illumination environment information obtaining unit
At least one of a spectral spectrometer, a color temperature meter, and an illuminance sensor,
The illumination environment information obtaining unit
Obtains information about an attribute of the light source in the illumination environment measured using at least one of the spectral spectrometer and the color temperature meter, or information on illuminance in the illumination environment measured using the illuminance sensor.
23. The method of claim 22,
The color analyzer
Analyzing the color information for the colors represented by pixels in a region in the image,
The predetermined region
Wherein the image is a region excluding an edge region of a predetermined thickness in the image or a region designated by a user's input in the image.
23. The method of claim 22,
The color analyzer
A reference color generator for generating a list of the reference colors;
A color space transformer for transforming the compensated first color values in the first color space into second color values in the second color space; And
And a color classifier for classifying the converted second color values based on the generated list of reference colors,
The color information providing unit
Wherein the color information comprises information about at least one reference color corresponding to the sorted second color values.
27. The method of claim 26,
The color space conversion unit
Converting third color values of the reference colors in the first color space to fourth color values in the second color space,
The color classification unit
And classifies the converted second color values based on the converted fourth color values.
27. The method of claim 26,
The color classification unit
A seed color value setting unit for setting seed color values corresponding to the reference colors in the second color space; And
And a cluster determination unit that determines at least one cluster among the clusters of seed color values to which the converted second color values belong, through a color difference calculation using the seed color values and the converted second color values,
And the sorted second color values are based on the determination result.
27. The method of claim 26,
The color information providing unit
And providing at least one of a name and a ratio for the at least one reference color corresponding to the sorted second color values using text to speech (TTS).
23. The method of claim 22,
Further comprising a user interface for displaying the image and receiving a user input for setting an area in which the color information is to be provided in the displayed image.
KR1020140020596A 2014-02-21 2014-02-21 Method and device for providing color information of image KR20150099087A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020140020596A KR20150099087A (en) 2014-02-21 2014-02-21 Method and device for providing color information of image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020140020596A KR20150099087A (en) 2014-02-21 2014-02-21 Method and device for providing color information of image

Publications (1)

Publication Number Publication Date
KR20150099087A true KR20150099087A (en) 2015-08-31

Family

ID=54060325

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020140020596A KR20150099087A (en) 2014-02-21 2014-02-21 Method and device for providing color information of image

Country Status (1)

Country Link
KR (1) KR20150099087A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105640748A (en) * 2016-03-16 2016-06-08 宁波市江东精诚自动化设备有限公司 Vibration blind-guiding clothing


Similar Documents

Publication Publication Date Title
US9990377B1 (en) Content based systems and methods for conducting spectrum color based image search
US8594420B2 (en) Color naming, color categorization and describing color composition of images
Mojsilovic A computational model for color naming and describing color composition of images
US9460521B2 (en) Digital image analysis
CN110619301B (en) Emotion automatic identification method based on bimodal signals
CN104360796B (en) The method, apparatus and electronic equipment of color are applied on an electronic device
TWI431549B (en) Image processing apparatus and method and computer program product
US20150262549A1 (en) Color Palette Generation
US8064691B2 (en) Method for identifying color in machine and computer vision applications
CN109359317A (en) A kind of lipstick is matched colors the model building method and lipstick color matching selection method of selection
Lecca et al. Tuning the locality of filtering with a spatially weighted implementation of random spray Retinex
Wang et al. Optimal illumination for local contrast enhancement based on the human visual system
Khanh et al. Color Quality of Semiconductor and Conventional Light Sources
CN114117197A (en) Apparatus and method for color matching and recommendation
Wannous et al. Improving color correction across camera and illumination changes by contextual sample selection
Moreno et al. Color correction: A novel weighted von kries model based on memory colors
JP2009151350A (en) Image correction method and device
Hussain et al. Max-RGB based colour constancy using the sub-blocks of the image
US7936920B2 (en) Method and apparatus for multiple data channel analysis using relative strength histograms
US10909351B2 (en) Method of improving image analysis
KR20150099087A (en) Method and device for providing color information of image
Almobarak et al. Classification of aesthetic photographic images using SVM and KNN classifiers
JPH116765A (en) Pearl color classifying device
You et al. Saturation enhancement of blue sky for increasing preference of scenery images
Zeng Preferred skin colour reproduction

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination