WO2011062102A1 - Information processing apparatus, information processing method, program, and electronic device - Google Patents
Information processing apparatus, information processing method, program, and electronic device
- Publication number
- WO2011062102A1 (PCT/JP2010/070025)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- skin
- subject
- light
- wavelength
- captured image
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/136—Segmentation; Edge detection involving thresholding
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/174—Segmentation; Edge detection involving the use of two or more images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/143—Sensing or illuminating at different wavelengths
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/56—Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10048—Infrared image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10141—Special mode during image acquisition
- G06T2207/10152—Varying illumination
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20224—Image subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30088—Skin; Dermal
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2209/00—Details of colour television systems
- H04N2209/04—Picture signal generators
- H04N2209/041—Picture signal generators using solid-state devices
- H04N2209/042—Picture signal generators using solid-state devices having a single pick-up sensor
- H04N2209/044—Picture signal generators using solid-state devices having a single pick-up sensor using sequential colour illumination
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/50—Control of the SSIS exposure
- H04N25/53—Control of the integration time
- H04N25/531—Control of the integration time by controlling rolling shutters in CMOS SSIS
Definitions
- The present invention relates to an information processing apparatus, an information processing method, a program, and an electronic device, and is particularly suitable for use in detecting the shape of a human hand from a captured image obtained by imaging a subject.
- As described in Non-Patent Document 1, there is a skin recognition system that detects (recognizes) a skin region representing human skin in a captured image obtained by imaging a subject.
- FIG. 1 shows a configuration example of a conventional skin recognition system 1.
- the skin recognition system 1 includes a light emitting device 21, a camera 22, and an image processing device 23.
- The light-emitting device 21 includes LEDs 21a1 and 21a2 (indicated by two black circles) that emit light of a wavelength λ1 (for example, near-infrared light of 870 nm), and LEDs 21b1 and 21b2 (indicated by two white circles) that emit light of a wavelength λ2 different from λ1 (for example, near-infrared light of 950 nm).
- In the following, when there is no need to distinguish between LED 21a1 and LED 21a2, they are simply referred to as LED 21a. Likewise, when there is no need to distinguish between LED 21b1 and LED 21b2, they are simply referred to as LED 21b.
- The combination of the wavelengths λ1 and λ2 is chosen so that, for example, the reflectance of human skin when irradiated with light of wavelength λ1 is larger than its reflectance when irradiated with light of wavelength λ2.
- Furthermore, the combination is chosen so that the reflectance of materials other than human skin when irradiated with light of wavelength λ1 is almost the same as their reflectance when irradiated with light of wavelength λ2.
- The outputs of the LED 21a and the LED 21b are adjusted so that, when a subject having the same reflectance at the wavelengths λ1 and λ2 is irradiated with light of either wavelength, the luminance values of the corresponding pixels in the images captured by the camera 22 are the same.
- The LEDs 21a and 21b are arranged in a grid pattern and, for example, emit light alternately.
- The camera 22 has a lens used for imaging a subject such as a user, and the front surface of the lens is covered with a visible-light cut filter 22a that blocks visible light.
- Thus, the camera 22 receives only the reflected component of the invisible light with which the light emitting device 21 irradiates the subject, and supplies the resulting captured image to the image processing device 23.
- The camera 22 is a global shutter type camera: it incorporates an image sensor that receives reflected light from the subject and performs exposure at the same timing for each of the plurality of horizontal lines constituting the built-in image sensor.
- the camera 22 images a subject and supplies a captured image obtained as a result to the image processing device 23.
- FIG. 2 shows an example of the image sensor 22b built in the camera 22.
- The imaging element 22b is composed of a plurality of light receiving elements, which form a plurality of horizontal lines 0 to 11 as shown in FIG. 2.
- FIG. 3 shows the operation of a global shutter type camera adopted as the camera 22.
- In FIG. 3, the HD signal is a horizontal synchronization signal and the VD signal is a vertical synchronization signal.
- The irradiation times t1, t3, ... represent the times during which the LED 21a irradiates the subject with light of wavelength λ1, and the irradiation times t2, t4, ... represent the times during which the LED 21b irradiates the subject with light of wavelength λ2.
- The irradiation times t1, t2, t3, t4, ... are determined as the intervals between rising edges of the VD signal.
- numbers 0 to 11 shown on the left side of FIG. 3 represent 12 horizontal lines 0 to 11 constituting the image pickup device 22b built in the global shutter type camera, respectively.
- In FIG. 3, the horizontal length represents the exposure time during which exposure is performed, and the vertical length (height) represents the charge amount accumulated according to the exposure time.
- First, the LED 21a irradiates the subject with light of wavelength λ1 for the irradiation time t1.
- The camera 22 exposes each of the horizontal lines 0 to 11 constituting the built-in image sensor 22b for the duration of the irradiation time t1, starting at the same timing at which the irradiation time t1 starts.
- Thus, the charge amount obtained by the exposure of each of the horizontal lines 0 to 11 constituting the image sensor 22b results from receiving only the reflected light produced when the subject is irradiated with light of wavelength λ1. The camera 22 therefore generates a first captured image based on this charge amount and supplies it to the image processing device 23.
- Next, the LED 21b irradiates the subject with light of wavelength λ2 for the irradiation time t2. Furthermore, the camera 22 exposes each of the horizontal lines 0 to 11 constituting the built-in image sensor 22b for the duration of the irradiation time t2, starting at the same timing at which the irradiation time t2 starts.
- Thus, the charge amount obtained by the exposure of each of the horizontal lines 0 to 11 constituting the image sensor 22b results from receiving only the reflected light produced when the subject is irradiated with light of wavelength λ2. The camera 22 therefore generates a second captured image based on this charge amount and supplies it to the image processing device 23.
- the image processing device 23 generates a VD signal and an HD signal. Then, the image processing device 23 controls the light emission of the light emitting device 21 and the imaging of the camera 22 based on, for example, the rising edge interval generated in the generated VD signal or HD signal.
- The image processing device 23 calculates the absolute difference between the luminance values of corresponding pixels of the first and second captured images from the camera 22, and detects a skin region on the first captured image (or the second captured image) based on the calculated absolute differences.
- As described above, the first captured image is obtained by receiving only the reflected light when the subject is irradiated with light of wavelength λ1, and the second captured image is obtained by receiving only the reflected light when the subject is irradiated with light of wavelength λ2.
- Furthermore, as the combination of the wavelengths λ1 and λ2, a combination is adopted in which the reflectance of human skin when irradiated with light of wavelength λ1 is larger than its reflectance when irradiated with light of wavelength λ2.
- Accordingly, the luminance values of the pixels constituting the skin region on the first captured image are relatively large, while the luminance values of the pixels constituting the skin region on the second captured image are relatively small. The absolute difference between the luminance values of the skin-region pixels in the first and second captured images is therefore relatively large.
- Also, as described above, a combination of wavelengths is used for which the reflectance of materials other than human skin when irradiated with light of wavelength λ1 is almost the same as their reflectance when irradiated with light of wavelength λ2.
- Accordingly, the luminance values of the pixels constituting regions other than the skin region are almost the same in the first and second captured images, so the absolute difference between those luminance values is relatively small.
- Therefore, the image processing device 23 can detect a region as the skin region when, for example, its absolute difference values are relatively large.
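The absolute-difference decision described above can be sketched as follows. This is a minimal illustration only, not the patent's implementation; the array values and the threshold of 20 are assumptions chosen for the example.

```python
import numpy as np

def detect_skin(first_image, second_image, threshold=20):
    """Mark as skin the pixels whose absolute luminance difference
    between the two captured images is relatively large
    (hypothetical threshold)."""
    diff = np.abs(first_image.astype(np.int16) - second_image.astype(np.int16))
    return diff > threshold  # boolean skin mask

# Toy 2x2 luminance images: the left column behaves like skin
# (bright under wavelength λ1, noticeably darker under wavelength λ2).
first = np.array([[200, 50], [210, 52]], dtype=np.uint8)   # λ1 image
second = np.array([[120, 48], [130, 51]], dtype=np.uint8)  # λ2 image
mask = detect_skin(first, second)  # left column True, right column False
```

The cast to a signed type before subtracting avoids unsigned wrap-around for pixels where the second image is brighter than the first.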
- A global shutter type camera such as the camera 22 is more expensive to manufacture than a rolling shutter type camera, which performs exposure at a different timing for each of the plurality of horizontal lines 0 to 11 constituting the image sensor 22b.
- It is therefore desirable for the skin recognition system 1 to be able to adopt a rolling shutter type camera, which can be procured at roughly 1/10 the cost of a global shutter type camera.
- The present invention has been made in view of such a situation. On the premise of adopting a rolling shutter type camera, by adjusting the exposure time of the camera and the irradiation time of the light emitting device, the skin region can be detected accurately based on the first and second captured images captured by the camera.
- An information processing apparatus according to one aspect of the present invention is an information processing apparatus that detects a skin region representing human skin from a captured image obtained by imaging a subject, and includes: first irradiating means for irradiating the subject with light of a first wavelength; second irradiating means for irradiating the subject with light of a second wavelength longer than the first wavelength; generating means that incorporates an image sensor composed of a plurality of lines including skin detection lines used for generating a skin detection region used for detection of the skin region, receives reflected light from the subject at a different timing for each of the plurality of lines, and generates a captured image including at least the skin detection region; control means for controlling the first irradiating means, the second irradiating means, and the generating means so that reflected light from the subject is received by the skin detection lines while the subject is irradiated with light of the first wavelength, generating a first captured image including at least the skin detection region, and so that reflected light from the subject is received by the skin detection lines while the subject is irradiated with light of the second wavelength, generating a second captured image including at least the skin detection region; and detecting means for detecting the skin region based on the first captured image and the second captured image.
- The image sensor may be composed of the plurality of lines including skin detection lines arranged at intervals of n lines (n being a natural number).
- The control means may control the first irradiating means, the second irradiating means, and the generating means so that reflected light from the subject is received by the skin detection lines while the subject is irradiated with light of the first wavelength, generating the first captured image including the skin detection region, and so that reflected light from the subject is received by the skin detection lines while the subject is irradiated with light of the second wavelength, generating the second captured image including the skin detection region. The detecting means may include: extraction means for extracting the skin detection region included in the first captured image as a first extracted image and extracting the skin detection region included in the second captured image as a second extracted image; and skin region detecting means for detecting the skin region based on the first and second extracted images.
- The control means may control the first irradiating means, the second irradiating means, and the generating means so that reflected light from the subject is received by the skin detection lines for at least a predetermined light reception time while the subject is irradiated with light of the first wavelength, and so that reflected light from the subject is received by the skin detection lines for at least the predetermined light reception time while the subject is irradiated with light of the second wavelength.
- The generating means may sequentially image the subject at predetermined imaging timings to generate the captured images, and the control means may control the first irradiating means, the second irradiating means, and the generating means so that the first captured image is generated at a predetermined imaging timing and the second captured image is generated at the imaging timing following that predetermined imaging timing.
- The first and second irradiating means may irradiate light of wavelengths for which the difference obtained by subtracting the reflectance of reflected light obtained by irradiating human skin with light of the second wavelength from the reflectance of reflected light obtained by irradiating human skin with light of the first wavelength is equal to or greater than a predetermined difference threshold.
- The first wavelength λ1 and the second wavelength λ2 may satisfy 640 nm ≤ λ1 ≤ 1000 nm and 900 nm ≤ λ2 ≤ 1100 nm.
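As a simple illustration, the claimed wavelength ranges can be written as a validity check (a sketch only; the function name and the added condition that λ2 exceed λ1, taken from the surrounding description, are assumptions):

```python
def wavelengths_valid(lambda1_nm, lambda2_nm):
    """Check the claimed ranges 640 nm <= λ1 <= 1000 nm and
    900 nm <= λ2 <= 1100 nm, with λ2 longer than λ1."""
    return (640 <= lambda1_nm <= 1000
            and 900 <= lambda2_nm <= 1100
            and lambda2_nm > lambda1_nm)

# The 870 nm / 950 nm pair used by the conventional system
# satisfies both ranges.
ok = wavelengths_valid(870, 950)
```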
- The first irradiating means may irradiate the subject with a first infrared ray as the light of the first wavelength, and the second irradiating means may irradiate the subject with a second infrared ray having a longer wavelength than the first infrared ray as the light of the second wavelength.
- The detecting means may detect the skin region based on the luminance values of the first captured image and the luminance values of the second captured image.
- The skin region detecting means may detect the skin region based on the luminance values of the first extracted image and the luminance values of the second extracted image.
- An information processing method according to one aspect of the present invention is an information processing method of an information processing apparatus that detects a skin region representing human skin from a captured image obtained by imaging a subject, the information processing apparatus including first irradiating means, second irradiating means, generating means, control means, and detecting means. The first irradiating means irradiates the subject with light of a first wavelength, and the second irradiating means irradiates the subject with light of a second wavelength longer than the first wavelength. The generating means, which incorporates an image sensor composed of a plurality of lines including skin detection lines used for generating a skin detection region used for detection of the skin region, receives reflected light from the subject at a different timing for each of the plurality of lines and generates a captured image including at least the skin detection region. The control means controls the first irradiating means, the second irradiating means, and the generating means so that reflected light from the subject is received by the skin detection lines while the subject is irradiated with light of the first wavelength, generating the first captured image including at least the skin detection region, and so that reflected light from the subject is received by the skin detection lines while the subject is irradiated with light of the second wavelength, generating the second captured image including at least the skin detection region. The method includes a step in which the detecting means detects the skin region based on the first captured image and the second captured image.
- A program according to one aspect of the present invention causes a computer of an information processing apparatus that detects a skin region representing human skin from a captured image obtained by imaging a subject to function as control means and detecting means. The information processing apparatus includes first irradiating means for irradiating the subject with light of a first wavelength, second irradiating means for irradiating the subject with light of a second wavelength longer than the first wavelength, and generating means that incorporates an image sensor composed of a plurality of lines including skin detection lines used for generating a skin detection region used for detection of the skin region, receives reflected light from the subject at a different timing for each of the plurality of lines, and generates a captured image including at least the skin detection region. The control means controls the first irradiating means, the second irradiating means, and the generating means so that reflected light from the subject is received by the skin detection lines while the subject is irradiated with light of the first wavelength, generating a first captured image including at least the skin detection region, and so that reflected light from the subject is received by the skin detection lines while the subject is irradiated with light of the second wavelength, generating a second captured image including at least the skin detection region. The detecting means detects the skin region based on the first captured image and the second captured image.
- An electronic apparatus according to one aspect of the present invention is an electronic apparatus including an information processing apparatus that detects a skin region representing human skin from a captured image obtained by imaging a subject. The information processing apparatus includes: first irradiating means for irradiating the subject with light of a first wavelength; second irradiating means for irradiating the subject with light of a second wavelength longer than the first wavelength; generating means that incorporates an image sensor composed of a plurality of lines including skin detection lines used for generating a skin detection region used for detection of the skin region, receives reflected light from the subject at a different timing for each of the plurality of lines, and generates a captured image including at least the skin detection region; control means for controlling the first irradiating means, the second irradiating means, and the generating means so that reflected light from the subject is received by the skin detection lines while the subject is irradiated with light of the first wavelength, generating a first captured image including at least the skin detection region, and so that reflected light from the subject is received by the skin detection lines while the subject is irradiated with light of the second wavelength, generating a second captured image including at least the skin detection region; and detecting means for detecting the skin region based on the first captured image and the second captured image.
- According to one aspect of the present invention, by controlling the first irradiating means, the second irradiating means, and the generating means, reflected light from the subject is received by the skin detection lines while the subject is irradiated with light of the first wavelength, and the first captured image including at least the skin detection region is generated; reflected light from the subject is received by the skin detection lines while the subject is irradiated with light of the second wavelength, and the second captured image including at least the skin detection region is generated. The skin region is then detected based on the first captured image and the second captured image.
- According to the present invention, by adjusting the exposure time of the camera and the irradiation time of the light emitting device, the skin region can be detected with high accuracy based on the first and second captured images captured by the camera.
- FIG. 5 is a diagram for explaining an overview of processing performed by the image processing apparatus of FIG. 4.
- 1. First embodiment (an example in which, when a rolling shutter type camera is employed, a captured image including the regions used by the camera for skin detection is generated)
- 2. Second embodiment (an example in which, when a rolling shutter type camera is employed, a skin detection image composed of the regions used by the camera for skin detection is generated)
- FIG. 4 shows a configuration example of the information processing system 41 according to the first embodiment.
- the information processing system 41 includes a light emitting device 61, a camera 62, and an image processing device 63.
- The light emitting device 61 includes LEDs 61a1 and 61a2 having the same functions as the LEDs 21a1 and 21a2 of FIG. 1, and LEDs 61b1 and 61b2 having the same functions as the LEDs 21b1 and 21b2 of FIG. 1.
- In the following, when there is no need to distinguish between LED 61a1 and LED 61a2, they are simply referred to as LED 61a. Likewise, when there is no need to distinguish between LED 61b1 and LED 61b2, they are simply referred to as LED 61b.
- The number of LEDs 61a is not limited to two, and is determined as appropriate so that the subject is irradiated with as much of the necessary light as possible. The same applies to the LEDs 61b.
- The LED 61a irradiates the subject with light of wavelength λ1.
- The LED 61b irradiates the subject with light of a wavelength λ2 different from the wavelength λ1. Here, the wavelength λ2 is assumed to be longer than the wavelength λ1.
- The camera 62 is a rolling shutter type camera: it incorporates an image sensor that receives reflected light from the subject and performs exposure at a different timing for each of the plurality of horizontal lines constituting the built-in image sensor.
- In the following, the image sensor incorporated in the camera 62 is described as being composed of the plurality of horizontal lines 0 to 11, as in the case shown in FIG. 2, but the number of horizontal lines is not limited to this.
- The horizontal lines 0 to 11 constituting the image sensor built into the camera 62 need only be arranged parallel to one another; it goes without saying that this does not mean they are arranged horizontally with respect to the ground.
- the camera 62 has a lens used for imaging a subject such as a user, and the front surface of the lens is covered with a visible light cut filter 62a that blocks visible light.
- Thus, the camera 62 receives only the reflected component of the invisible light with which the light emitting device 61 irradiates the subject, and supplies the resulting captured image to the image processing device 63.
- the camera 62 images a subject and supplies a captured image obtained as a result to the image processing device 63.
- the camera 62 sequentially starts imaging the subject at predetermined imaging timing (at intervals of time t in FIG. 5 described later), and generates a captured image by the imaging.
- the image processing device 63 generates a VD signal and an HD signal, and controls the light emitting device 61 and the camera 62 based on the generated VD signal and the HD signal.
- Specifically, for example, the image processing device 63 adjusts the irradiation time TL, during which light of wavelength λ1 or λ2 is irradiated, and the exposure time Ts of each of the horizontal lines 0 to 11, so that each horizontal line constituting the image sensor of the camera 62 receives only the reflected light of one of the wavelengths λ1 and λ2.
- Numbers 0 to 11 shown on the left side of FIG. 5 represent twelve horizontal lines 0 to 11 constituting an image sensor incorporated in a rolling shutter type camera, respectively.
- The times t1, t2, t3, t4, ... represent the intervals at which the rising edges of the VD signal occur, and t/12 represents the interval at which the rising edges of the HD signal occur.
- In FIG. 5, the horizontal length represents the exposure time Ts during which exposure is performed on each horizontal line constituting the image sensor incorporated in the rolling shutter type camera, and the vertical length (height) represents the accumulated charge amount.
- Specifically, for example, the irradiation time TL during which light of one wavelength is irradiated and the exposure time Ts are adjusted so that the reflected light of that wavelength is received on the horizontal lines 6 to 11 for at least the exposure time (Ts × x/100).
- That is, the irradiation time TL and the exposure time Ts are adjusted so that the reflected light of wavelength λ1 from the subject is received for at least a first light reception time, and the reflected light of wavelength λ2 from the subject is received for at least a second light reception time, the longer of the first and second light reception times being the above-described exposure time (Ts × x/100).
- x is a value from 0 to 100, and changes according to the irradiation light quantity of the LED 61a and the LED 61b, the light receiving sensitivity characteristic of the camera 62, and the like.
- Here, Ts × x/100 represents the minimum exposure time required for skin detection.
- (TL, Ts) (2t / 3, t / 4) can be adopted as the combination (TL, Ts) of the irradiation time TL and the exposure time Ts satisfying the expression (2).
- When the total number of the horizontal lines 6 to 11 that receive the reflected light of one wavelength for the minimum exposure time (Ts × x/100) required for skin detection is L (here, L = 6), and the total number of the plurality of horizontal lines 0 to 11 constituting the image sensor is n, expression (1) can be generalized to the following expression (3): TL ≥ (L - 1) × t/n + Ts × x/100 ... (3)
- The variables L, n, and x are determined in advance by the performance of the camera 62, the company that manufactures the information processing system 41, and the like. Then, (TL, Ts) is determined based on expression (3) obtained by substituting the determined L, n, and x.
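Expression (3) can be evaluated numerically. The sketch below assumes n is the total number of horizontal lines (12 in the running example); with L = 6, Ts = t/4, and x = 100, it reproduces the combination (TL, Ts) = (2t/3, t/4) given above for expression (2).

```python
def min_irradiation_time(L, n, t, Ts, x):
    """Right-hand side of expression (3):
    TL >= (L - 1) * t / n + Ts * x / 100."""
    return (L - 1) * t / n + Ts * x / 100

t = 1.0  # one imaging interval, in arbitrary time units
# 6 skin detection lines out of 12, full minimum exposure (x = 100)
TL = min_irradiation_time(L=6, n=12, t=t, Ts=t / 4, x=100)  # 2t/3
```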
- For example, at time t2, the image processing device 63 controls the LED 61a to irradiate the subject with light of wavelength λ1 for the irradiation time TL.
- The image processing device 63 also controls the camera 62 so that, among the plurality of horizontal lines 0 to 11 constituting the image sensor incorporated in the camera 62, the horizontal lines 6 to 11 receive the reflected light produced by irradiating the subject with light of wavelength λ1 for at least the minimum exposure time necessary for skin detection.
- the camera 62 generates a first captured image and supplies it to the image processing device 63.
- the image processing device 63 controls the LED 61b to irradiate the subject with light having the wavelength ⁇ 2 at the irradiation time TL at the time t3.
- the image processing device 63 controls the camera 62 so that the reflected light produced when the subject is irradiated with light of wavelength λ2 is received, on the horizontal lines 6 to 11 among the plurality of horizontal lines 0 to 11 constituting the image sensor incorporated in the camera 62, for at least the minimum exposure time (Ts × x/100) necessary for skin detection.
- the camera 62 generates a second captured image and supplies it to the image processing device 63.
- first and second captured images in the first embodiment are different from the first and second captured images described with reference to FIGS. 1 to 3.
- the image processing device 63 extracts, as a first extracted image, the region obtained from the horizontal lines 6 to 11 that receive the reflected light of wavelength λ1, out of all the regions (the regions obtained from the horizontal lines 0 to 11) constituting the first captured image supplied from the camera 62.
- similarly, the image processing device 63 extracts, as a second extracted image, the region obtained from the horizontal lines 6 to 11 that receive the reflected light of wavelength λ2, out of all the regions (the regions obtained from the horizontal lines 0 to 11) constituting the second captured image supplied from the camera 62.
- the image processing device 63 detects a skin region on the first or second extracted image based on the extracted first and second extracted images. The detection of the skin area by the image processing device 63 will be described later with reference to FIGS.
- as described above, the image processing device 63 adjusts the irradiation time TL and the exposure time Ts by an adjustment method different from that of the image processing device 23 of the conventional skin recognition system 1.
- the image processing device 63 extracts the first extracted image from the first captured image supplied from the camera 62, and extracts the second extracted image from the second captured image supplied from the camera 62.
- the image processing device 63 then detects the skin region based on the extracted first and second extracted images.
- FIG. 6 shows an example in which, when the information processing system 41 employing a rolling shutter camera as the camera 62 detects the skin region by the same process as the conventional skin recognition system 1, the skin region cannot be detected with high accuracy.
- FIG. 6 is configured in the same manner as FIG. 3, and thus the description thereof is omitted.
- the LED 61a irradiates the subject with light having the wavelength ⁇ 1 for the irradiation time t1.
- the LED 61b irradiates the subject with light having a wavelength ⁇ 2 for an irradiation time t2.
- the camera 62 performs exposure of the horizontal lines 0 to 11 constituting the built-in image sensor at different timings. That is, for example, every time a rising edge occurs in the HD signal generated by the image processing device 63, the camera 62 starts exposure of the horizontal lines 0 to 11 in ascending order of line number.
- in this case, the exposure of each of the horizontal lines 0 to 10 among the horizontal lines 0 to 11 constituting the image sensor extends from the irradiation time during which the light of wavelength λ1 is irradiated (for example, irradiation time t1) into the irradiation time during which the light of wavelength λ2 is irradiated (for example, irradiation time t2).
- accordingly, the charge amount obtained by the exposure of each of the horizontal lines 0 to 10 includes both the reflected light when the subject is irradiated with light of wavelength λ1 and the reflected light when the subject is irradiated with light of wavelength λ2.
- based on these accumulated charge amounts, the camera 62 generates the first and second captured images used for skin region detection and supplies them to the image processing device 63.
- as a result, the image processing device 63 must detect the skin region based on a first captured image obtained by receiving the reflected light of both wavelengths λ1 and λ2, and a second captured image likewise obtained by receiving the reflected light of both wavelengths λ1 and λ2.
- therefore, it becomes difficult for the image processing device 63 to detect the skin region using the difference in reflectance between the wavelength λ1 and the wavelength λ2, and the accuracy of detecting the skin region is greatly reduced.
- the image processing device 63 adjusts the irradiation time TL and the exposure time Ts, and detects the skin region based on the extracted first and second extracted images. Therefore, in the information processing system 41, even when a rolling shutter type camera is employed as the camera 62, the skin region can be detected with high accuracy.
- FIG. 7 shows spectral reflection characteristics with respect to human skin.
- the horizontal axis indicates the wavelength of the irradiation light irradiated on the human skin
- the vertical axis indicates the reflectance of the irradiation light irradiated on the human skin.
- the reflectance of the reflected light obtained by irradiating human skin with light of 870 [nm] is 63 [%], and the reflectance of the reflected light obtained by irradiating it with light of 950 [nm] is 50 [%].
- a combination in which the wavelength ⁇ 1 is 870 [nm] and the wavelength ⁇ 2 is 950 [nm] is employed as the combination of the wavelengths ⁇ 1 and ⁇ 2.
- This combination is a combination in which the difference in reflectance with respect to human skin is relatively large, and the difference in reflectance with respect to portions other than human skin is relatively small.
- the first extracted image is composed of a region obtained by receiving only the reflected light when the subject is irradiated with light of wavelength ⁇ 1.
- the second extracted image is composed of a region obtained by receiving only the reflected light when the subject is irradiated with light of wavelength ⁇ 2.
- therefore, the absolute difference between the luminance value of a pixel constituting the skin region on the first extracted image and the luminance value of the corresponding pixel constituting the skin region on the second extracted image is a relatively large value, corresponding to the difference in reflectance of human skin.
- conversely, the absolute difference between the luminance value of a pixel constituting the non-skin region (the region other than the skin region) on the first extracted image and the luminance value of the corresponding pixel constituting the non-skin region on the second extracted image is a relatively small value, corresponding to the difference in reflectance of portions other than human skin.
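This separation can be sketched numerically. The 63% and 50% skin reflectances come from the description above; the non-skin reflectances and the incident light level are assumed example values, chosen only to illustrate the contrast.

```python
# Illustrative sketch of why the 870/950 nm pair separates skin from
# non-skin. Skin reflectances (63% and 50%) are from the description;
# the non-skin reflectances and incident intensity are assumed values.

def luminance(incident, reflectance):
    # Luminance observed by the sensor, proportional to reflected light.
    return incident * reflectance

incident = 200  # assumed incident light level (sensor units)

skin_y1 = luminance(incident, 0.63)   # skin under 870 nm light
skin_y2 = luminance(incident, 0.50)   # skin under 950 nm light

cloth_y1 = luminance(incident, 0.80)  # hypothetical non-skin material,
cloth_y2 = luminance(incident, 0.78)  # nearly flat across both wavelengths

print(abs(skin_y1 - skin_y2))    # relatively large for skin
print(abs(cloth_y1 - cloth_y2))  # relatively small for non-skin
```

A threshold placed between the two differences then separates skin from non-skin pixels.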
- FIG. 8 shows an outline of processing performed by the image processing apparatus 63.
- the image processing device 63 is supplied with first and second captured images from the camera 62.
- the image processing device 63 extracts a first extracted image 81 including a skin region 81a and a non-skin region 81b (a region other than the skin region 81a) from the first captured image supplied from the camera 62.
- similarly, the image processing device 63 extracts a second extracted image 82 composed of a skin region 82a and a non-skin region 82b (the region other than the skin region 82a) from the second captured image supplied from the camera 62.
- the image processing device 63 smoothes the extracted first extracted image 81 and second extracted image 82 using LPF (low-pass filter). Then, the image processing device 63 calculates a difference absolute value between luminance values of corresponding pixels of the first extracted image 81 after smoothing and the second extracted image 82 after smoothing, and the difference absolute value is calculated. A difference image 83 having pixel values is generated.
- the image processing apparatus 63 performs smoothing using the LPF on the first extracted image 81 and the second extracted image 82, but the timing for performing the smoothing is not limited to this. That is, for example, the image processing device 63 may perform smoothing using LPF on the first and second captured images supplied from the camera 62.
- furthermore, the image processing device 63 binarizes the difference image 83 by setting each pixel value constituting the difference image 83 to 1 if it is equal to or greater than a predetermined threshold, and to 0 if it is less than the predetermined threshold.
- since the skin region 83a in the difference image 83 is composed of pixels whose pixel values are the absolute differences between the skin region 81a and the skin region 82a, the pixel values of the pixels constituting the skin region 83a are relatively large.
- since the non-skin region 83b in the difference image 83 is composed of pixels whose pixel values are the absolute differences between the non-skin region 81b and the non-skin region 82b, the pixel values of the pixels constituting the non-skin region 83b are relatively small.
- by the binarization performed by the image processing device 63, the difference image 83 is converted into a binarized image 84 composed of a skin region 84a, in which the pixel values of the pixels constituting the skin region 83a are set to 1, and a non-skin region 84b, in which the pixel values of the pixels constituting the non-skin region 83b are set to 0.
- then, the image processing device 63 detects, as the skin region, the skin region 84a composed of pixels having a pixel value of 1 among the pixels constituting the binarized image 84 obtained by the binarization.
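The smoothing, difference, and binarization steps above can be sketched as follows. This is a minimal NumPy sketch assuming a 3×3 mean filter as the LPF and an example threshold; neither choice is specified by the description.

```python
# Minimal sketch of the described pipeline: smooth both extracted images
# with an LPF, take the absolute luminance difference, then binarize.
# The 3x3 mean filter and the threshold value are assumed examples.
import numpy as np

def mean_filter_3x3(img):
    """Simple low-pass filter: average each pixel with its 8 neighbours."""
    padded = np.pad(img.astype(float), 1, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    h, w = img.shape
    for dy in (0, 1, 2):
        for dx in (0, 1, 2):
            out += padded[dy:dy + h, dx:dx + w]
    return out / 9.0

def detect_skin_mask(extract1, extract2, threshold=10.0):
    """Smooth both extracted images, take |Y1 - Y2|, then binarize."""
    diff = np.abs(mean_filter_3x3(extract1) - mean_filter_3x3(extract2))
    return (diff >= threshold).astype(np.uint8)  # 1 = skin candidate

# Toy 4x4 example: the left two columns behave like skin (large luminance
# difference between the wavelengths), the right two like background.
y1 = np.array([[126, 126, 80, 80]] * 4, dtype=float)  # under wavelength λ1
y2 = np.array([[100, 100, 80, 80]] * 4, dtype=float)  # under wavelength λ2
print(detect_skin_mask(y1, y2))
```

The resulting mask marks the left columns as skin candidates and the right columns as background, mirroring the skin region 84a and non-skin region 84b of the binarized image 84.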
- as described above, the image processing device 63 detects the skin region according to whether or not the absolute difference |Y1 − Y2| (corresponding to the pixel value of the difference image 83) is equal to or greater than a predetermined threshold.
- the skin region detection method is not limited to this.
- that is, for example, a difference image 83 whose pixel values are the difference (Y1 − Y2) obtained by subtracting the luminance value Y2 from the luminance value Y1 may be generated, and the skin region may be detected depending on whether the difference (Y1 − Y2) is equal to or greater than a predetermined threshold.
- if there is no irradiation unevenness, a fixed threshold can be used as the threshold for detecting the skin region.
- when irradiation unevenness occurs, however, the threshold compared against the absolute difference |Y1 − Y2| or the difference (Y1 − Y2) must be changed dynamically according to the state of the unevenness.
- the image processing device 63 must determine whether or not the irradiation unevenness occurs, and perform complicated processing of dynamically changing the threshold according to the state of the irradiation unevenness. For this reason, it is desirable that the threshold used for detecting the skin region is always a fixed threshold regardless of the unevenness of irradiation.
- therefore, the absolute difference |Y1 − Y2| or the difference (Y1 − Y2) is normalized (divided) by a division value described later and compared with a predetermined threshold to detect the skin region.
- the predetermined threshold value can be a fixed threshold value regardless of irradiation unevenness.
- the division value represents a value based on at least one of the luminance values Y1 and Y2.
- that is, as the division value, the luminance value Y1, the luminance value Y2, or the average value {(Y1 + Y2)/2} of the luminance values Y1 and Y2 can be adopted.
- the skin region may be detected based on whether the ratio Y2 / Y1 is equal to or greater than a predetermined threshold.
- in this case as well, a fixed threshold can be used regardless of irradiation unevenness. Further, since only the ratio Y2/Y1 needs to be calculated, the comparison with the predetermined threshold can be performed more quickly than when the absolute difference |Y1 − Y2| is normalized by a division value.
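The robustness of the normalized criteria can be sketched numerically: if uneven irradiation scales both luminance values of a pixel by the same factor k, the raw difference |Y1 − Y2| scales with k, while the difference normalized by the average and the ratio Y2/Y1 stay unchanged. The luminance values and scale factor below are assumed example numbers.

```python
# Sketch of why normalization makes a fixed threshold workable under
# irradiation unevenness. Uneven illumination is modeled as a common
# scale factor k applied to both luminance values; values are assumed.

def normalized_diff(y1, y2):
    return (y1 - y2) / ((y1 + y2) / 2)  # divide by the average value

y1, y2 = 126.0, 100.0        # well-lit skin pixel (example values)
k = 0.5                      # dimmer part of the image
y1_dim, y2_dim = k * y1, k * y2

print(abs(y1 - y2), abs(y1_dim - y2_dim))  # raw difference shrinks: 26.0 vs 13.0
print(normalized_diff(y1, y2), normalized_diff(y1_dim, y2_dim))  # unchanged
print(y2 / y1, y2_dim / y1_dim)            # ratio is also unchanged
```

Because the normalized quantities are invariant to the common scale factor, a single fixed threshold separates skin from non-skin in both the bright and dim parts of the image.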
- in the following, the image processing device 63 will be described as detecting the skin region depending on whether or not the absolute difference |Y1 − Y2| is equal to or greater than a predetermined threshold.
- FIG. 9 shows a configuration example of the image processing device 63.
- the image processing apparatus 63 includes a control unit 101, an extraction unit 102, a calculation unit 103, and a binarization unit 104.
- the control unit 101 controls the light emitting device 61 to alternately emit (irradiate) the LEDs 61a and the LEDs 61b of the light emitting device 61. That is, for example, the control unit 101 applies light of wavelength ⁇ 1 from the LED 61a to the subject at the irradiation time TL (time from the start of exposure on the horizontal line 6 to the end of exposure on the horizontal line 11) at times t2, t4,. Irradiate.
- control unit 101 irradiates the subject with light having the wavelength ⁇ 2 from the LED 61b at the irradiation time TL at times t3, t5,.
- the control unit 101 controls the camera 62 to expose the horizontal lines 0 to 11 constituting the image pickup device built in the camera 62 for the exposure time Ts from the timing when the rising edge of the HD signal is detected in ascending order of the number. By doing so, the subject is imaged.
- the extraction unit 102 is supplied with first and second captured images from the camera 62.
- the extraction unit 102 extracts, as a first extracted image, the region obtained from the horizontal lines 6 to 11, which generate the region used for skin detection, out of all the regions constituting the first captured image from the camera 62, and supplies it to the calculation unit 103.
- similarly, the extraction unit 102 extracts, as a second extracted image, the region obtained from the horizontal lines 6 to 11, which generate the region used for skin detection, out of all the regions constituting the second captured image from the camera 62, and supplies it to the calculation unit 103.
- the calculation unit 103 performs smoothing using LPF on the first and second extracted images from the extraction unit 102.
- then, the calculation unit 103 calculates the absolute differences between the first and second extracted images after smoothing, and supplies a difference image composed of pixels having the calculated absolute differences as pixel values to the binarization unit 104.
- the binarization unit 104 binarizes the difference image from the calculation unit 103 and detects a skin region on the first extracted image (or the second extracted image) based on the binarized image obtained as a result. The detection result is output.
- This skin detection process is repeatedly executed from when the information processing system 41 is turned on, for example.
- step S1 the control unit 101 controls the LED 61a of the light emitting device 61 to irradiate the subject with light having the wavelength ⁇ 1 from the LED 61a at the irradiation time TL at time t2, t4,.
- in step S2, the camera 62 performs exposure for the exposure time Ts from the timing at which the rising edge of the HD signal is detected for each of the horizontal lines 0 to 11 constituting the built-in image sensor, and supplies the resulting first captured image to the extraction unit 102 of the image processing device 63.
- step S3 the control unit 101 controls the LED 61b of the light-emitting device 61, and irradiates the subject with light having a wavelength ⁇ 2 from the LED 61b at the irradiation time TL at times t3, t5,.
- in step S4, the camera 62 performs exposure for the exposure time Ts from the timing at which the rising edge of the HD signal is detected for each of the horizontal lines 0 to 11 constituting the built-in image sensor, and supplies the resulting second captured image to the extraction unit 102.
- in step S5, the extraction unit 102 extracts, as the first extracted image, the region obtained from the horizontal lines 6 to 11, which generate the region used for skin detection, out of all the regions constituting the first captured image from the camera 62, and supplies it to the calculation unit 103.
- similarly, the extraction unit 102 extracts, as the second extracted image, the region obtained from the horizontal lines 6 to 11, which generate the region used for skin detection, out of all the regions constituting the second captured image from the camera 62, and supplies it to the calculation unit 103.
- step S6 the calculation unit 103 performs smoothing using LPF on the first and second extracted images supplied from the extraction unit 102. Then, the calculation unit 103 generates a difference image based on the absolute difference value between the luminance values of the corresponding pixels of the smoothed first and second extracted images, and supplies the difference image to the binarization unit 104.
- step S7 the binarization unit 104 binarizes the difference image supplied from the calculation unit 103.
- step S8 the binarization unit 104 detects a skin region from the binarized image obtained by binarization. This completes the skin detection process of FIG.
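The overall flow of steps S1 to S8 can be summarized in code. This is a hedged control-flow sketch: the LED and camera classes are stand-ins for the hardware described above (their names and interfaces are assumptions, not the actual device API), and the LPF smoothing of step S6 is omitted for brevity.

```python
# Control-flow sketch of steps S1-S8. StubLed / StubCamera are stand-ins
# for LED 61a/61b and camera 62; only the order of operations follows
# the described skin detection process.
import numpy as np

class StubLed:
    """Stand-in for LED 61a / 61b; records each irradiation."""
    def __init__(self, name):
        self.name, self.flashes = name, []
    def flash(self, duration):
        self.flashes.append(duration)

class StubCamera:
    """Stand-in for the rolling-shutter camera; returns a flat test frame."""
    def capture(self):
        return np.full((12, 16), 100.0)  # 12 horizontal lines

def skin_detection_cycle(led_a, led_b, camera, TL, threshold=10.0):
    led_a.flash(TL)                         # S1: irradiate with wavelength λ1
    first = camera.capture()                # S2: first captured image
    led_b.flash(TL)                         # S3: irradiate with wavelength λ2
    second = camera.capture()               # S4: second captured image
    ext1, ext2 = first[6:12], second[6:12]  # S5: lines 6-11 -> extracted images
    diff = np.abs(ext1 - ext2)              # S6: difference image (LPF omitted)
    binary = diff >= threshold              # S7: binarization
    return binary                           # S8: 1-pixels form the skin region

mask = skin_detection_cycle(StubLed("61a"), StubLed("61b"), StubCamera(), TL=2 / 3)
print(mask.shape)
```

With the flat stub frames the two captures are identical, so the returned mask is empty; on real hardware the two wavelengths would differ over skin, as described above.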
- as described above, in the first embodiment, the first and second captured images are captured with an irradiation time TL and an exposure time Ts that satisfy expression (2) (or expression (3)).
- then, out of all the regions constituting the first captured image, the region obtained by receiving only the reflected light when the subject is irradiated with light of wavelength λ1 is extracted as the first extracted image.
- similarly, out of all the regions constituting the second captured image, the region obtained by receiving only the reflected light when the subject is irradiated with light of wavelength λ2 is extracted as the second extracted image.
- the skin region is detected using the difference in the reflection ratio between the wavelength ⁇ 1 and the wavelength ⁇ 2. Therefore, even when a rolling shutter type camera is employed as the camera 62, it is possible to detect the skin region with high accuracy.
- therefore, the skin region can be detected even in the information processing system 41 that employs a rolling shutter camera; the camera to be used can thus be selected from many types, and it becomes possible to suppress the manufacturing cost of the information processing system 41.
- in the first embodiment, the region obtained from the horizontal lines 6 to 11 is extracted as the first or second extracted image, but the region extracted as the first or second extracted image is not limited to this.
- that is, for example, the LED 61a may flash for the irradiation time TL from the timing at which exposure of the horizontal line 3 starts to the timing at which exposure of the horizontal line 8 ends, among the horizontal lines 0 to 11 constituting the image sensor incorporated in the camera 62. The same applies to the LED 61b.
- the skin area can be detected with higher accuracy.
- in this case, instead of the region obtained from the horizontal lines 6 to 11, the region obtained from the horizontal lines 3 to 8 may be extracted as the first or second extracted image.
- that is, which regions the image processing device 63 extracts from the first and second captured images as the first and second extracted images used for detecting the skin region may be changed as appropriate.
- incidentally, in the first embodiment, the image processing device 63 detects the skin region based on the first and second extracted images (the regions obtained from the horizontal lines 6 to 11 among the horizontal lines 0 to 11) extracted from the first and second captured images, respectively.
- in this case, the horizontal lines 0 to 5 among the horizontal lines 0 to 11 cannot be used for detecting the skin region. This is equivalent to the upper half of the captured image being unusable for detection of the skin region, halving the effective angle of view of the image sensor.
- next, a case in which the skin region is detected while maintaining the original angle of view of the image sensor will be described.
- in this case, the camera 141 directly generates first and second skin detection images (corresponding to the first and second extracted images) used for detection of the skin region, and the skin region is detected based on the generated first and second skin detection images.
- FIG. 11 shows an example of an information processing system 121 that directly detects a skin region from first and second skin detection images obtained by imaging a subject.
- the information processing system 121 is configured in the same manner as the information processing system 41 of the first embodiment, except that a camera 141 and an image processing device 142 are provided instead of the camera 62 and the image processing device 63 of the information processing system 41.
- the camera 141 incorporates an image sensor that receives reflected light from a subject, and is a rolling shutter type that performs exposure to receive reflected light from the subject at different timings for each of a plurality of horizontal lines that constitute the built-in image sensor.
- the camera 141 is driven so that, when receiving the reflected light from the subject and performing exposure, it generates an image composed of the regions obtained from only the six horizontal lines used for skin region detection out of the twelve horizontal lines.
- FIG. 12 shows an image sensor 141a built into the camera 141 that, in the second embodiment, generates the regions obtained from the horizontal lines 6 to 11 receiving the reflected light from the subject as the first and second skin detection images (corresponding to the first and second extracted images). In this case, as indicated by the hatched portion in FIG. 12, only the horizontal lines 6 to 11 among the horizontal lines 0 to 11 are used for skin detection.
- FIG. 13 shows an image sensor 141b built into the camera 141 that, in the second embodiment, generates the regions obtained from the horizontal lines 0, 2, 4, 6, 8, and 10 receiving the reflected light from the subject as the first and second skin detection images.
- the horizontal lines 0, 2, 4, 6, 8, and 10 are used for skin detection.
- in both cases the number of horizontal lines used for skin detection is the same, but the arrangement of FIG. 13 has the advantage that the angle of view of the image sensor is not reduced during skin detection. The resolution of the captured image is reduced accordingly, but what matters for skin detection is that the shape and movement of the skin region can be obtained with reasonable accuracy, and widening the angle of view over which skin can be detected is often more important than image quality.
- in the following, the camera 141 will be described on the assumption that the image sensor 141b is driven as shown in FIG. 13.
- the horizontal lines used for the image captured by the image sensor 141b are not limited to lines that differ by one horizontal line, such as the horizontal lines 0, 2, 4, 6, 8, and 10 (or 1, 3, 5, 7, 9, and 11); lines that differ by n horizontal lines (n being a natural number of 2 or more) may also be used.
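The two line selections of FIGS. 12 and 13 can be contrasted with simple array slicing. This is an illustrative sketch only; the 12×8 frame and its contents are assumed example data, not sensor output.

```python
# Contrast of the two line selections: contiguous lines 6-11 (FIG. 12)
# halve the vertical field of view, while every other line 0, 2, ..., 10
# (FIG. 13) keeps the full field of view at half vertical resolution.
# The 12x8 frame is an assumed example.
import numpy as np

frame = np.arange(12 * 8).reshape(12, 8)  # 12 horizontal lines, 8 columns

contiguous = frame[6:12]      # FIG. 12 style: lines 6 to 11 only
alternating = frame[0:12:2]   # FIG. 13 style: lines 0, 2, 4, 6, 8, 10

# Both selections use six lines, but only the alternating one still
# spans the sensor from line 0 to line 10, keeping the full field of view.
print(contiguous.shape, alternating.shape)
```

The contiguous selection starts at line 6, so the top half of the scene is lost; the alternating selection retains content from line 0 downward.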
- the camera 141 sequentially starts imaging the subject at a predetermined imaging timing (at an interval of time t in FIG. 14 described later), and performs image processing on the first or second skin detection image obtained as a result. Supply to device 142.
- that is, the camera 141 supplies to the image processing device 142 each of a first skin detection image obtained when the subject is irradiated with light of wavelength λ1, and a second skin detection image obtained when the subject is irradiated with light of wavelength λ2.
- the image processing device 142 controls the camera 141, receives the VD signal and the HD signal from the camera 141, and controls the light emitting device 61 based on the received VD signal and the HD signal.
- the image processing device 142 adjusts the irradiation time TL for irradiating light of wavelength λ1 or λ2 and the exposure time Ts of the horizontal lines 0, 2, 4, 6, 8, and 10 so that only reflected light of one of the wavelengths λ1 and λ2 is received on the horizontal lines used for skin detection among the plurality of horizontal lines constituting the image sensor of the camera 141.
- Numbers 0, 2, 4, 6, 8, and 10 shown on the left side of FIG. 14 are used for skin detection among 12 horizontal lines constituting the image sensor 141b built in the rolling shutter camera 141, respectively. Six horizontal lines 0, 2, 4, 6, 8, 10 are represented. The rest of the configuration is the same as in the case of FIG.
- here, the total number L of the horizontal lines 0, 2, 4, 6, 8, and 10 that receive reflected light of one wavelength for at least the exposure time (Ts × x/100) necessary for skin detection is 6.
- the expression (3) is expressed by the following expression (1 ′).
- the exposure time Ts is set to t / 3 and the irradiation time TL is set to 3t / 4.
- however, the exposure time Ts and the irradiation time TL are not limited to these values; they need only satisfy expression (2′) (or expression (3)).
- that is, the irradiation time TL and the exposure time Ts for irradiating light of wavelength λ1 or λ2 are adjusted so that the horizontal lines actually used for imaging during the skin detection operation receive only reflected light of one of the wavelengths λ1 and λ2.
- in other words, the irradiation time TL and the exposure time Ts are adjusted so that each of the plurality of horizontal lines 0, 2, 4, 6, 8, and 10 receives reflected light of one wavelength for at least the minimum exposure time (Ts × x/100) necessary for skin detection.
- the image processing apparatus 142 controls the LED 61a to irradiate the subject with light having the wavelength ⁇ 1 at the irradiation time TL at the time t1.
- the image processing device 142 controls the camera 141 so that the reflected light when the subject is irradiated with light of wavelength λ1 is received on the horizontal lines 0, 2, 4, 6, 8, and 10 of the image sensor 141b for the exposure time Ts. Accordingly, the camera 141 generates a first skin detection image and supplies it to the image processing device 142.
- the image processing device 142 controls the LED 61b to irradiate the subject with light of wavelength ⁇ 2 at the irradiation time TL at time t2.
- the image processing device 142 controls the camera 141 so that the reflected light when the subject is irradiated with light of wavelength λ2 is received on the plurality of horizontal lines 0, 2, 4, 6, 8, and 10 of the image sensor 141b for the exposure time Ts. Accordingly, the camera 141 generates a second skin detection image and supplies it to the image processing device 142.
- the image processing device 142 detects a skin region on the first or second skin detection image based on the first and second skin detection images from the camera 141.
- the number of horizontal lines used for skin detection is not limited to six. That is, for example, the number can be determined, according to the conditions, within a range in which every horizontal line used for skin detection can receive the reflected light of wavelength λ1 from the subject at times t1, t3, ..., and the reflected light of wavelength λ2 from the subject at times t2, t4, ..., each for at least the minimum exposure time (Ts × x/100) necessary for detection. Further, the arrangement of the horizontal lines used for skin detection is not limited to the arrangements shown in FIGS. 12 and 13, and may be any arrangement.
- FIG. 15 shows a configuration example of the image processing apparatus 142.
- the image processing device 142 is configured in the same manner as the image processing device 63 of FIG. 9, except that it includes a control unit 161 in place of the control unit 101 in FIG. 9 and a calculation unit 162 in place of the extraction unit 102 and the calculation unit 103 in FIG. 9.
- the control unit 161 controls the light emitting device 61 to alternately emit (irradiate) the LEDs 61a and the LEDs 61b of the light emitting device 61. That is, for example, the control unit 161 applies the light of wavelength ⁇ 1 from the LED 61a to the subject at the irradiation time TL (time from the start of exposure on the horizontal line 0 to the end of exposure on the horizontal line 10) at times t1, t3,. Irradiate.
- the controller 161 causes the LED 61b to irradiate the subject with light having the wavelength ⁇ 2 at the irradiation time TL at the times t2, t4,.
- further, the control unit 161 controls the camera 141 to image the subject by performing exposure for the exposure time Ts on the horizontal lines 0, 2, 4, 6, 8, and 10 constituting the image sensor 141b built into the camera 141, in ascending order of line number, from the timing at which the rising edge of the HD signal is detected.
- the calculation unit 162 is supplied with the first and second skin detection images from the camera 141.
- the calculating unit 162 performs smoothing using LPF on the first and second skin detection images from the camera 141.
- then, the calculation unit 162 calculates the absolute differences between the luminance values of the first and second skin detection images after smoothing, and supplies a difference image composed of pixels having the calculated absolute differences as pixel values to the binarization unit 104.
- the binarization unit 104 binarizes the difference image from the calculation unit 162, detects a skin region based on the resulting binarized image, and outputs the detection result.
- This skin detection process is repeatedly executed, for example, when the information processing system 121 is turned on.
- step S21 the control unit 161 controls the LED 61a of the light emitting device 61 to irradiate the subject with light having the wavelength ⁇ 1 from the LED 61a at the irradiation time TL at the times t1, t3,.
- in step S22, the camera 141 performs exposure for the exposure time Ts from the timing at which the rising edge of the HD signal is detected for each of the horizontal lines 0, 2, 4, 6, 8, and 10 of the built-in image sensor 141b, and supplies the resulting first skin detection image to the calculation unit 162 of the image processing device 142.
- step S23 the control unit 161 controls the LED 61b of the light emitting device 61 to irradiate the subject with light having the wavelength ⁇ 2 from the LED 61b with the irradiation time TL at the times t2, t4,. In this case, it is assumed that the LED 61a is turned off.
- in step S24, the camera 141 performs exposure for the exposure time Ts from the timing at which the rising edge of the HD signal is detected for each of the horizontal lines 0, 2, 4, 6, 8, and 10 of the built-in image sensor 141b, and supplies the resulting second skin detection image to the calculation unit 162.
- in step S25, the calculation unit 162 performs smoothing using the LPF on the first and second skin detection images supplied from the camera 141. Then, the calculation unit 162 generates a difference image based on the absolute differences between the luminance values of the corresponding pixels of the smoothed first and second skin detection images, and supplies the difference image to the binarization unit 104.
- step S26 the binarization unit 104 binarizes the difference image supplied from the calculation unit 162.
- step S27 the binarization unit 104 detects a skin region from the binarized image obtained by binarization. This completes the skin detection process of FIG.
- in many cameras, an image quality mode at the time of imaging can be selected. For example, when VGA and QVGA can be selected, the number of horizontal lines of the image sensor used at the time of imaging in the QVGA mode is half that in the VGA mode.
- When the camera's image quality mode selection specifications meet these conditions and the camera can be used with the image processing apparatus 142, the skin region can be detected directly from the first and second skin detection images from the camera 141, without performing the processing of the first embodiment that extracts the first and second extracted images from the first and second captured images.
- The DSP (Digital Signal Processor) that operates as the image processing apparatus 142 may therefore be cheaper than the DSP that operates as the image processing device 63 in the first embodiment. This makes it possible, for example, to manufacture the information processing system 121 at a lower cost than the information processing system 41.
- Since the regions generated by the horizontal lines 0, 2, 4, 6, 8, and 10 of the horizontal lines 0 to 11 are used as the first and second skin detection images, the skin region can be detected with an angle of view as wide as when the region generated by the horizontal lines 0 to 11 is used as the first and second skin detection images. For this reason, it becomes possible to grasp the skin region over a wider range.
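The trade-off described above can be illustrated with toy data (the array values below are assumptions for illustration): keeping only the even horizontal lines of a 12-line sensor halves the vertical resolution but preserves the full vertical field of view, whereas cropping to lines 6 to 11, as in the first embodiment, halves the view:

```python
# Each pixel stores its source line index so we can see which scene lines
# survive each strategy.
frame = [[line] * 4 for line in range(12)]  # 12-line sensor, 4 pixels/line

even_line_image = frame[0::2]  # lines 0, 2, 4, 6, 8, 10: full-height view
cropped_image = frame[6:]      # lines 6 to 11: bottom half of the view only

print(len(even_line_image), even_line_image[0][0], even_line_image[-1][0])  # 6 0 10
print(len(cropped_image), cropped_image[0][0], cropped_image[-1][0])        # 6 6 11
```

Both images have the same number of rows, but only the even-line image still spans the top of the scene, which is why the second embodiment keeps the sensor's original angle of view.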
- In the first embodiment, the first captured image is obtained by irradiating light of wavelength λ1 from the LED 61a for the irradiation time TL at time t2, and the second captured image, which differs from the first captured image by one frame, is obtained by irradiating light of wavelength λ2 from the LED 61b for the irradiation time TL at time t3; however, the present invention is not limited to this.
- In this case, the irradiation time TL is the time from the start of exposure of horizontal line 0 to the end of exposure of horizontal line 11.
- Therefore, the region extracted as the first extracted image (the entire region constituting the first captured image) is obtained by receiving reflected light of both wavelengths λ1 and λ2. The same applies to the region extracted as the second extracted image.
- To avoid this, the control unit 101 of the image processing apparatus 63 controls the LEDs 61a and 61b so that the irradiation time of the wavelength-λ1 light from the LED 61a and the irradiation time of the wavelength-λ2 light from the LED 61b do not overlap. The camera 62 then captures first and second captured images that differ by a predetermined number of frames.
- the first and second captured images are generated as follows.
- Light of wavelength λ1 is irradiated from when the rising edge of the 10th HD signal occurs at time t1 until the 12th rising edge occurs at time t2.
- A first captured image is thereby obtained by the camera 62 and supplied to the image processing device 63.
- Irradiation with the light of wavelength λ1 and the light of wavelength λ2 is stopped from when the rising edge of the 12th HD signal occurs at time t2 until the 10th rising edge occurs at time t3.
- During this period, the captured image obtained by the camera 62 is not used for skin detection, and is ignored (or discarded) by the image processing device 63.
- Light of wavelength λ2 is irradiated from when the rising edge of the 10th HD signal occurs at time t3 until the 12th rising edge occurs at time t4.
- A second captured image is thereby obtained by the camera 62 and supplied to the image processing device 63.
- the image processing device 63 detects the skin region based on the first captured image from the camera 62 and the second captured image captured two frames after the first captured image.
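The frame sequence above can be sketched as a repeating schedule (the function name, LED labels, and the assumption that the pattern repeats with period four are illustrative, not stated in the document): when the irradiation time is longer than one frame, the two LEDs alternate with an idle frame in between, so only every other captured frame is kept for skin detection.

```python
def led_for_frame(i):
    """LED state while frame i is exposed: λ1, idle, λ2, idle, repeating."""
    return ("led_61a (λ1)", "off (frame discarded)",
            "led_61b (λ2)", "off (frame discarded)")[i % 4]

schedule = [led_for_frame(i) for i in range(6)]

# A first/second image pair is two frames apart, matching the text above.
first_frame, second_frame = 0, 2
```

This matches the described timing: the λ1 frame, a discarded settling frame, then the λ2 frame captured two frames later.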
- The camera 62 starts imaging at predetermined imaging timings (at intervals of time t).
- It is desirable to detect the skin region based on the first captured image obtained at a predetermined imaging timing and the second captured image obtained at the imaging timing after the time t has elapsed from that predetermined imaging timing.
- That is, it is desirable that the image processing device 63 detect the skin region based on the first captured image and the second captured image captured only one frame after the first captured image.
- For this purpose, it is desirable that the irradiation time TL be equal to or less than the time t (the interval of each of the times t1, t2, t3, and t4) from one rising edge of the VD signal to the next.
- If the irradiation time TL is set to be equal to or shorter than the time t, the irradiation time of the wavelength-λ1 light from the LED 61a and the irradiation time of the wavelength-λ2 light from the LED 61b can be prevented from overlapping, and the skin region can be detected based on the first captured image and the second captured image captured one frame after it. That is, the camera 62 can generate the first and second captured images at a higher frame rate. The same applies to the second embodiment.
- The irradiation time for the wavelength-λ1 light from the LED 61a and the irradiation time for the wavelength-λ2 light from the LED 61b may also differ. The same applies to the second embodiment.
- In each case, irradiation is performed so that reflected light of one wavelength can be received for at least the exposure time (Ts × x/100) that is the minimum necessary for skin detection.
- Any irradiation method may be used as long as the reflected light of one wavelength can be received within the minimum exposure time (Ts × x/100) necessary for skin detection.
- For example, light of one wavelength may be irradiated intermittently during the irradiation time TL. The same applies to the second embodiment.
- Although the exposure time for receiving the reflected light of wavelength λ1 from the subject is the same on horizontal lines 6 to 11 constituting the image sensor of the camera 62, the invention is not limited to this.
- Different exposure times may be used, as long as the exposure time for receiving the reflected light of wavelength λ1 from the subject is sufficient to receive the minimum amount of wavelength-λ1 reflected light necessary for skin detection, and the exposure time for receiving the reflected light of wavelength λ2 from the subject is sufficient to receive the minimum amount of wavelength-λ2 reflected light necessary for skin detection. The same applies to the second embodiment.
- In the above description, the combination of the wavelength λ1 and the wavelength λ2 is 870 [nm] and 950 [nm]; however, any combination may be used as long as the difference obtained by subtracting the reflectance of human skin at the wavelength λ2 from the reflectance of human skin at the wavelength λ1 is sufficiently larger than the corresponding reflectance difference obtained for things other than the user's skin.
- For example, the LED 61a can be configured to emit irradiation light having a wavelength λ1 of less than 930 [nm] and the LED 61b to emit irradiation light having a wavelength λ2 of 930 [nm] or more, such as the combinations 800 [nm] and 950 [nm], 870 [nm] and 1000 [nm], or 800 [nm] and 1000 [nm].
- In this way, skin detection can be performed with high accuracy.
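The wavelength-pair condition above can be sketched numerically. The reflectance values below are rough assumed figures for illustration only (the document does not give these numbers); the idea is that a pair qualifies when the drop in skin reflectance from λ1 to λ2 exceeds, by a margin, the reflectance difference seen on non-skin materials:

```python
# Assumed, illustrative reflectance tables (wavelength in nm -> reflectance).
skin_reflectance = {800: 0.55, 870: 0.50, 950: 0.35, 1000: 0.30}
cloth_reflectance = {800: 0.40, 870: 0.40, 950: 0.39, 1000: 0.38}

def good_pair(l1, l2, margin=0.05):
    """True if skin's λ1-λ2 reflectance drop clearly exceeds non-skin's."""
    skin_diff = skin_reflectance[l1] - skin_reflectance[l2]
    other_diff = abs(cloth_reflectance[l1] - cloth_reflectance[l2])
    return skin_diff - other_diff >= margin

pairs = [(870, 950), (800, 950), (870, 1000), (800, 1000)]
ok = [p for p in pairs if good_pair(*p)]
print(ok)  # all four combinations listed in the text qualify
```

With these assumed tables, every pair that straddles the 930 nm boundary passes, while a pair drawn from one side of it (e.g. 950 nm and 1000 nm) does not.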
- It is more desirable that the wavelengths λ1 and λ2 be in the near-infrared region, excluding the visible light region.
- In that case, instead of the visible light cut filter 62a, a filter that passes only the light emitted from the LED 61a into the lens of the camera 62 may be used. The same applies to the LED 61b.
- The information processing system 41 has been described above; however, the information processing system 41 may be built into an electronic device such as a television receiver, and the channel (frequency) received by the television receiver may be changed according to the skin-region detection result of the information processing system 41. Further, for example, the information processing system 41 may be built into a portable electronic device such as a mobile phone, in addition to a television receiver. The same applies to the second embodiment.
- The above-described series of processing can be executed by dedicated hardware or by software.
- When the series of processing is executed by software, a program constituting the software is installed from a recording medium into a so-called embedded computer or, for example, a general-purpose personal computer capable of executing various functions by installing various programs.
- FIG. 17 shows a configuration example of a personal computer that executes the above-described series of processing by a program.
- a CPU (Central Processing Unit) 201 executes various processes in accordance with a program stored in a ROM (Read Only Memory) 202 or a storage unit 208.
- a RAM (Random Access Memory) 203 appropriately stores programs executed by the CPU 201, data, and the like. These CPU 201, ROM 202, and RAM 203 are connected to each other by a bus 204.
- An input/output interface 205 is also connected to the CPU 201 via the bus 204. Connected to the input/output interface 205 are an input unit 206 composed of a keyboard, a mouse, a microphone, and the like, and an output unit 207 composed of a display, a speaker, and the like. The CPU 201 executes various processes in response to commands input from the input unit 206, and outputs the processing results to the output unit 207.
- the storage unit 208 connected to the input / output interface 205 includes, for example, a hard disk and stores programs executed by the CPU 201 and various data.
- the communication unit 209 communicates with an external device via a network such as the Internet or a local area network.
- the program may be acquired via the communication unit 209 and stored in the storage unit 208.
- The drive 210 connected to the input/output interface 205 drives a removable medium 211 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and acquires the programs, data, and the like recorded therein. The acquired programs and data are transferred to and stored in the storage unit 208 as necessary.
- A recording medium that records (stores) a program to be installed in a computer and made executable by the computer is composed of the removable medium 211, which is a package medium made up of a magnetic disk (including a flexible disk), an optical disk (including a CD-ROM (Compact Disc-Read Only Memory) and a DVD (Digital Versatile Disc)), a magneto-optical disk (including an MD (Mini-Disc)), a semiconductor memory, or the like; the ROM 202 in which the program is permanently stored; the hard disk constituting the storage unit 208; and the like.
- The program is recorded on the recording medium, as necessary, via a wired or wireless communication medium such as a local area network, the Internet, or digital satellite broadcasting, through the communication unit 209, which is an interface such as a router or a modem.
- In this specification, the steps describing the series of processes described above include not only processes performed in time series in the described order, but also processes executed in parallel or individually rather than necessarily in time series.
- In this specification, the term "system" represents an entire apparatus composed of a plurality of apparatuses.
Landscapes
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Signal Processing (AREA)
- Studio Devices (AREA)
- Image Processing (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- Image Analysis (AREA)
- Exposure Control For Cameras (AREA)
- Color Television Image Signal Generators (AREA)
- Image Input (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
Description
FIG. 1 shows a configuration example of a conventional skin recognition system 1.
Next, the global shutter camera employed as the camera 22 will be described with reference to FIGS. 2 and 3.
640nm≦λ1≦1000nm
900nm≦λ2≦1100nm
The wavelengths can be set so as to satisfy the above.
1. First embodiment (an example in which a rolling shutter camera is employed and the camera generates a captured image including the region used for skin detection)
2. Second embodiment (an example in which a rolling shutter camera is employed and the camera generates a skin detection image composed of the region used for skin detection)
3. Modifications
[Configuration example of the information processing system 41]
FIG. 4 shows a configuration example of the information processing system 41 according to the first embodiment.
Next, the method by which the image processing device 63 adjusts the irradiation time TL and the exposure time Ts will be described with reference to FIG. 5.
TL ≧ (6-1)×t/12 + Ts×100/100 ・・・(1)
TL ≧ 5t/12 + Ts ・・・(2)
TL ≧ (L-1)×t/n + Ts×x/100 ・・・(3)
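Eq. (3) generalizes Eqs. (1) and (2): the irradiation time TL must cover the spread of exposure start times across the L skin-detection lines of an n-line rolling-shutter sensor ((L-1) × t/n, with frame interval t) plus the fraction x of the exposure time Ts needed for detection. A small sketch of this reading (the helper name and sample numbers are illustrative assumptions, not from the document):

```python
def min_irradiation_time(num_lines, total_lines, frame_time, exposure, pct=100):
    """Smallest TL satisfying Eq. (3): TL >= (L-1)*t/n + Ts*x/100."""
    return (num_lines - 1) * frame_time / total_lines + exposure * pct / 100

# Eq. (1) case: L = 6 of n = 12 lines, full exposure required (x = 100).
tl = min_irradiation_time(6, 12, frame_time=24.0, exposure=10.0)
print(tl)  # 20.0
```

Setting x below 100 (i.e. accepting a partial exposure under one wavelength, as discussed for the minimum exposure Ts × x/100) shortens the required TL proportionally.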
Next, the processing performed by the image processing device 63 will be described with reference to FIGS. 7 to 9.
FIG. 7 shows the spectral reflectance characteristics of human skin.
FIG. 8 shows an overview of the processing performed by the image processing device 63.
FIG. 9 shows a configuration example of the image processing device 63.
Next, the skin detection process performed by the information processing system 41 will be described with reference to the flowchart of FIG. 10.
In the first embodiment, the image processing device 63 detects the skin region based on the first and second extracted images extracted from the first and second captured images (obtained from horizontal lines 6 to 11 of horizontal lines 0 to 11). In this example, the region of horizontal lines 0 to 5 cannot be used for detecting the skin region. This means that roughly the upper half of the captured image cannot be used for skin region detection, which is equivalent to halving the angle of view of the image sensor. The second embodiment shows an example in which the skin region is detected while the original angle of view of the image sensor is maintained.
Compared with FIG. 12, the number of horizontal lines used for skin detection is the same, but FIG. 13 has the advantage that the angle of view of the image sensor is not reduced during skin detection. The resolution of the captured image drops accordingly, but for skin detection it is important that the shape and motion of the skin region be obtained with moderate accuracy, and widening the angle of view over which skin can be detected often takes priority over image quality.
Next, the method by which the image processing device 142 adjusts the irradiation time TL and the exposure time Ts will be described with reference to FIG. 14.
TL ≧ (6-1)×t/12 + Ts×100/100 ・・・(1')
TL ≧ 5t/12 + Ts ・・・(2')
Next, FIG. 15 shows a configuration example of the image processing device 142.
Next, the skin detection process performed by the information processing system 121 will be described with reference to the flowchart of FIG. 16.
In the first embodiment, for example, light of wavelength λ1 is emitted from the LED 61a for the irradiation time TL at time t2 to obtain the first captured image, and light of wavelength λ2 is emitted from the LED 61b for the irradiation time TL at time t3 to obtain the second captured image, which differs from the first captured image by one frame; however, the present invention is not limited to this.
Next, FIG. 17 shows a configuration example of a personal computer that executes the above-described series of processing by a program.
Claims (13)
- An information processing apparatus for detecting a skin region representing human skin from a captured image obtained by imaging a subject, the information processing apparatus comprising:
first irradiation means for irradiating the subject with light of a first wavelength;
second irradiation means for irradiating the subject with light of a second wavelength longer than the first wavelength;
generation means incorporating an image sensor composed of a plurality of lines including skin detection lines used to generate a skin detection region used for detecting the skin region, the generation means receiving reflected light from the subject at a different timing for each of the plurality of lines and generating the captured image including at least the skin detection region;
control means for controlling the first irradiation means, the second irradiation means, and the generation means so as to cause the skin detection lines to receive reflected light from the subject while the subject is irradiated with the light of the first wavelength, thereby generating a first captured image including at least the skin detection region, and
to cause the skin detection lines to receive reflected light from the subject while the subject is irradiated with the light of the second wavelength, thereby generating a second captured image including at least the skin detection region; and
detection means for detecting the skin region based on the first captured image and the second captured image.
- The information processing apparatus according to claim 1, wherein the image sensor is composed of the plurality of lines including the skin detection lines arranged at intervals of n lines (n being a natural number),
the control means controls the first irradiation means, the second irradiation means, and the generation means so as to cause the skin detection lines to receive reflected light from the subject while the subject is irradiated with the light of the first wavelength, thereby generating the first captured image consisting of the skin detection region as a first skin detection image, and
to cause the skin detection lines to receive reflected light from the subject while the subject is irradiated with the light of the second wavelength, thereby generating the second captured image consisting of the skin detection region as a second skin detection image, and
the detection means detects the skin region based on the first skin detection image and the second skin detection image.
- The information processing apparatus according to claim 1, wherein the control means controls the first irradiation means, the second irradiation means, and the generation means so as to cause the skin detection lines to receive reflected light from the subject while the subject is irradiated with the light of the first wavelength, thereby generating the first captured image including the skin detection region, and
to cause the skin detection lines to receive reflected light from the subject while the subject is irradiated with the light of the second wavelength, thereby generating the second captured image including the skin detection region, and
the detection means includes:
extraction means for extracting the skin detection region included in the first captured image as a first extracted image and extracting the skin detection region included in the second captured image as a second extracted image; and
skin region detection means for detecting the skin region based on the first and second extracted images.
- The information processing apparatus according to any one of claims 1 to 3, wherein the control means controls the first irradiation means, the second irradiation means, and the generation means so as to cause the skin detection lines to receive reflected light from the subject for at least a predetermined light reception time while the subject is irradiated with the light of the first wavelength, and
to cause the skin detection lines to receive reflected light from the subject for at least the predetermined light reception time while the subject is irradiated with the light of the second wavelength.
- The information processing apparatus according to any one of claims 1 to 4, wherein the generation means sequentially images the subject at predetermined imaging timings to generate the captured images, and
the control means controls the first irradiation means, the second irradiation means, and the generation means so as to generate the first captured image at a predetermined imaging timing and to generate the second captured image at the imaging timing following the predetermined imaging timing.
- The information processing apparatus according to any one of claims 1 to 5, wherein the first and second irradiation means irradiate light of wavelengths for which the difference obtained by subtracting the reflectance of reflected light obtained by irradiating human skin with the light of the second wavelength from the reflectance of reflected light obtained by irradiating human skin with the light of the first wavelength is equal to or greater than a predetermined difference threshold.
- The information processing apparatus according to claim 6, wherein the first wavelength λ1 and the second wavelength λ2 satisfy
640nm≦λ1≦1000nm
900nm≦λ2≦1100nm.
- The information processing apparatus according to claim 7, wherein the first irradiation means irradiates the subject with first infrared light as the light of the first wavelength, and
the second irradiation means irradiates the subject with second infrared light, longer in wavelength than the first infrared light, as the light of the second wavelength.
- The information processing apparatus according to claim 1 or 2, wherein the detection means detects the skin region based on luminance values of the first captured image and luminance values of the second captured image.
- The information processing apparatus according to claim 3, wherein the skin region detection means detects the skin region based on luminance values of the first extracted image and luminance values of the second extracted image.
- An information processing method of an information processing apparatus for detecting a skin region representing human skin from a captured image obtained by imaging a subject,
the information processing apparatus including first irradiation means, second irradiation means, generation means, control means, and detection means,
the method comprising the steps of:
the first irradiation means irradiating the subject with light of a first wavelength;
the second irradiation means irradiating the subject with light of a second wavelength longer than the first wavelength;
the generation means, which incorporates an image sensor composed of a plurality of lines including skin detection lines used to generate a skin detection region used for detecting the skin region, receiving reflected light from the subject at a different timing for each of the plurality of lines and generating the captured image including at least the skin detection region;
the control means controlling the first irradiation means, the second irradiation means, and the generation means so as to cause the skin detection lines to receive reflected light from the subject while the subject is irradiated with the light of the first wavelength, thereby generating a first captured image including at least the skin detection region, and
to cause the skin detection lines to receive reflected light from the subject while the subject is irradiated with the light of the second wavelength, thereby generating a second captured image including at least the skin detection region; and
the detection means detecting the skin region based on the first captured image and the second captured image.
- A program for causing a computer of an information processing apparatus, which detects a skin region representing human skin from a captured image obtained by imaging a subject and which comprises first irradiation means for irradiating the subject with light of a first wavelength, second irradiation means for irradiating the subject with light of a second wavelength longer than the first wavelength, and generation means incorporating an image sensor composed of a plurality of lines including skin detection lines used to generate a skin detection region used for detecting the skin region, the generation means receiving reflected light from the subject at a different timing for each of the plurality of lines and generating the captured image including at least the skin detection region, to function as:
control means for controlling the first irradiation means, the second irradiation means, and the generation means so as to cause the skin detection lines to receive reflected light from the subject while the subject is irradiated with the light of the first wavelength, thereby generating a first captured image including at least the skin detection region, and
to cause the skin detection lines to receive reflected light from the subject while the subject is irradiated with the light of the second wavelength, thereby generating a second captured image including at least the skin detection region; and
detection means for detecting the skin region based on the first captured image and the second captured image.
- An electronic apparatus incorporating an information processing apparatus for detecting a skin region representing human skin from a captured image obtained by imaging a subject, wherein
the information processing apparatus includes:
first irradiation means for irradiating the subject with light of a first wavelength;
second irradiation means for irradiating the subject with light of a second wavelength longer than the first wavelength;
generation means incorporating an image sensor composed of a plurality of lines including skin detection lines used to generate a skin detection region used for detecting the skin region, the generation means receiving reflected light from the subject at a different timing for each of the plurality of lines and generating the captured image including at least the skin detection region;
control means for controlling the first irradiation means, the second irradiation means, and the generation means so as to cause the skin detection lines to receive reflected light from the subject while the subject is irradiated with the light of the first wavelength, thereby generating a first captured image including at least the skin detection region, and
to cause the skin detection lines to receive reflected light from the subject while the subject is irradiated with the light of the second wavelength, thereby generating a second captured image including at least the skin detection region; and
detection means for detecting the skin region based on the first captured image and the second captured image.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/509,385 US20120224042A1 (en) | 2009-11-18 | 2010-11-10 | Information processing apparatus, information processing method, program, and electronic apparatus |
CN201080051320XA CN102668543A (zh) | 2009-11-18 | 2010-11-10 | 信息处理装置、信息处理方法、程序以及电子装置 |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009262511 | 2009-11-18 | ||
JP2009-262511 | 2009-11-18 | ||
JP2010-246902 | 2010-11-02 | ||
JP2010246902A JP2011130419A (ja) | 2009-11-18 | 2010-11-02 | 情報処理装置、情報処理方法、プログラム、及び電子機器 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011062102A1 true WO2011062102A1 (ja) | 2011-05-26 |
Family
ID=44059581
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/070025 WO2011062102A1 (ja) | 2009-11-18 | 2010-11-10 | 情報処理装置、情報処理方法、プログラム、及び電子機器 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20120224042A1 (ja) |
JP (1) | JP2011130419A (ja) |
CN (1) | CN102668543A (ja) |
WO (1) | WO2011062102A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130049927A1 (en) * | 2011-08-23 | 2013-02-28 | Takashi Ichimori | Instruction beam detection apparatus and method of detecting instruction beam |
CN114007062A (zh) * | 2021-10-29 | 2022-02-01 | 上海商汤临港智能科技有限公司 | 一种测量系统、方法、装置、计算机设备及存储介质 |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2602692A1 (en) * | 2011-12-05 | 2013-06-12 | Alcatel Lucent | Method for recognizing gestures and gesture detector |
US9593982B2 (en) | 2012-05-21 | 2017-03-14 | Digimarc Corporation | Sensor-synchronized spectrally-structured-light imaging |
US9060113B2 (en) | 2012-05-21 | 2015-06-16 | Digimarc Corporation | Sensor-synchronized spectrally-structured-light imaging |
US8786767B2 (en) * | 2012-11-02 | 2014-07-22 | Microsoft Corporation | Rapid synchronized lighting and shuttering |
US9509919B2 (en) * | 2014-11-17 | 2016-11-29 | Duelight Llc | System and method for generating a digital image |
US20140378810A1 (en) | 2013-04-18 | 2014-12-25 | Digimarc Corporation | Physiologic data acquisition and analysis |
US9621760B2 (en) | 2013-06-07 | 2017-04-11 | Digimarc Corporation | Information coding and decoding in spectral differences |
KR20160023441A (ko) * | 2014-08-22 | 2016-03-03 | 서울바이오시스 주식회사 | 발광소자가 구비된 카메라와, 이를 이용한 피부 촬영 방법 및 피부 상태 측정 방법 |
US10113910B2 (en) | 2014-08-26 | 2018-10-30 | Digimarc Corporation | Sensor-synchronized spectrally-structured-light imaging |
CN106341614B (zh) * | 2015-07-08 | 2019-12-13 | 全崴科技有限公司 | 摄影影像调整方法 |
US10154201B2 (en) * | 2015-08-05 | 2018-12-11 | Three In One Ent Co., Ltd | Method for adjusting photographic images to higher resolution |
CN106331513B (zh) * | 2016-09-06 | 2017-10-03 | 深圳美立知科技有限公司 | 一种高质量皮肤图像的获取方法及系统 |
TWI662940B (zh) * | 2018-06-01 | 2019-06-21 | 廣達電腦股份有限公司 | 影像擷取裝置 |
JP2023018822A (ja) * | 2021-07-28 | 2023-02-09 | パナソニックIpマネジメント株式会社 | 検査方法および検査装置 |
EP4162862A1 (en) | 2021-10-07 | 2023-04-12 | Koninklijke Philips N.V. | Methods and apparatus for analysing images of hair and skin on a body of a subject |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006047067A (ja) * | 2004-08-03 | 2006-02-16 | Funai Electric Co Ltd | 人体検出装置及び人体検出方法 |
JP2008182360A (ja) * | 2007-01-23 | 2008-08-07 | Funai Electric Co Ltd | 皮膚領域検出撮像装置 |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7602942B2 (en) * | 2004-11-12 | 2009-10-13 | Honeywell International Inc. | Infrared and visible fusion face recognition system |
US7720281B2 (en) * | 2006-07-31 | 2010-05-18 | Mavs Lab, Inc. | Visual characteristics-based news anchorperson segment detection method |
JP2009212825A (ja) * | 2008-03-04 | 2009-09-17 | Funai Electric Co Ltd | 皮膚領域検出撮像装置 |
-
2010
- 2010-11-02 JP JP2010246902A patent/JP2011130419A/ja not_active Withdrawn
- 2010-11-10 CN CN201080051320XA patent/CN102668543A/zh active Pending
- 2010-11-10 US US13/509,385 patent/US20120224042A1/en not_active Abandoned
- 2010-11-10 WO PCT/JP2010/070025 patent/WO2011062102A1/ja active Application Filing
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006047067A (ja) * | 2004-08-03 | 2006-02-16 | Funai Electric Co Ltd | 人体検出装置及び人体検出方法 |
JP2008182360A (ja) * | 2007-01-23 | 2008-08-07 | Funai Electric Co Ltd | 皮膚領域検出撮像装置 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130049927A1 (en) * | 2011-08-23 | 2013-02-28 | Takashi Ichimori | Instruction beam detection apparatus and method of detecting instruction beam |
US9407305B2 (en) * | 2011-08-23 | 2016-08-02 | Lapis Semiconductor Co., Ltd. | Instruction beam detection apparatus and method of detecting instruction beam |
CN114007062A (zh) * | 2021-10-29 | 2022-02-01 | 上海商汤临港智能科技有限公司 | 一种测量系统、方法、装置、计算机设备及存储介质 |
Also Published As
Publication number | Publication date |
---|---|
CN102668543A (zh) | 2012-09-12 |
US20120224042A1 (en) | 2012-09-06 |
JP2011130419A (ja) | 2011-06-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2011062102A1 (ja) | 情報処理装置、情報処理方法、プログラム、及び電子機器 | |
JP4831267B2 (ja) | 情報処理装置、情報処理方法、プログラム及び電子装置 | |
US20110298909A1 (en) | Image processing apparatus, image processing method, program and electronic apparatus | |
CN110298874B (zh) | 基于单个图像传感器及在可见光谱中的结构光图案撷取图像及相关三维模型的方法及装置 | |
US8411920B2 (en) | Detecting device, detecting method, program, and electronic apparatus | |
TWI516990B (zh) | 具有可調整追蹤參數的導航裝置 | |
JP2013506525A (ja) | フォトプレチスモグラフィーを実行するための方法及びシステム | |
KR101909082B1 (ko) | 사용자 인터랙션을 이용하여 이동 단말을 제어하는 장치 및 방법 | |
JP2012002541A (ja) | 画像処理装置、画像処理方法、プログラム、及び電子機器 | |
JP6373577B2 (ja) | 撮像制御装置 | |
JP2013096941A (ja) | 撮像装置、撮像方法、及びプログラム | |
JP5573209B2 (ja) | 画像処理装置、画像処理方法、プログラム、及び電子機器 | |
US8805006B2 (en) | Information processing device configured to detect a subject from an image and extract a feature point from the subject, information processing method, program and electronic apparatus | |
JP5505693B2 (ja) | 検出装置、検出方法、プログラム、及び電子機器 | |
JP6673223B2 (ja) | 画像処理装置、画像処理方法及びプログラム | |
JP2013062689A (ja) | 撮像装置、その制御方法、及びプログラム | |
JP5287792B2 (ja) | 情報処理装置、情報処理方法及びプログラム | |
CN109981992B (zh) | 一种在高环境光变化下提升测距准确度的控制方法及装置 | |
JP2011158447A (ja) | 画像処理装置、画像処理方法、プログラム、及び電子機器 | |
JP2012004984A (ja) | 画像処理装置、画像処理方法、プログラム、及び電子装置 | |
US11388346B2 (en) | Image capturing control apparatus and image capturing control method | |
US9294686B2 (en) | Image capture apparatus and image capture method | |
JP2008204126A (ja) | 顔判別装置および方法並びにプログラム | |
JP2023154475A (ja) | 撮像装置およびその制御方法 | |
JP2012000147A (ja) | 画像処理装置、画像処理方法、プログラム、及び電子装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080051320.X Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10831497 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13509385 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 10831497 Country of ref document: EP Kind code of ref document: A1 |