WO2021044540A1 - Control device, control method, and storage medium - Google Patents
- Publication number: WO2021044540A1 (application PCT/JP2019/034714)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- illuminance
- size
- pupil
- eye
- iris
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/74—Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/14—Arrangements specially adapted for eye photography
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/1032—Determining colour for diagnostic purposes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/107—Measuring physical dimensions, e.g. size of the entire body or parts thereof
- A61B5/1079—Measuring physical dimensions, e.g. size of the entire body or parts thereof using optical or photographic means
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/117—Identification of persons
- A61B5/1171—Identification of persons based on the shapes or appearances of their bodies or parts thereof
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6813—Specially adapted to be attached to a specific body part
- A61B5/6814—Head
- A61B5/6821—Eye
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/141—Control of illumination
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/193—Preprocessing; Feature extraction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/71—Circuitry for evaluating the brightness variation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/11—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for measuring interpupillary distance or diameter of pupils
- A61B3/112—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for measuring interpupillary distance or diameter of pupils for measuring diameter of pupils
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10048—Infrared image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10141—Special mode during image acquisition
- G06T2207/10152—Varying illumination
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
Definitions
- The present disclosure relates to a technique for controlling a device, and more particularly to a technique for controlling illumination that irradiates a person with light.
- As a method of biometric personal authentication, there is iris authentication, which authenticates a person using an image of the iris.
- The apparent shape of the iris changes with the amount of light reaching the pupil. In a dark environment, for example, the pupil dilates. If the pupil is too dilated in the image used for iris recognition, the visible area of the iris narrows; as a result, authentication accuracy is likely to decrease. Conversely, when the eye is exposed to strong light, the pupil contracts and becomes smaller. If the pupil is too small, pupil detection may fail, and the shape of the iris captured in the image may differ significantly from its shape in the image used for authentication, which can also reduce authentication accuracy.
- Irradiation with light strong enough to constrict the pupil significantly not only causes discomfort to the user but can also damage the retina.
- Therefore, safe and highly accurate iris recognition can be realized by using an iris image in which the pupil size falls within a predetermined range.
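As a concrete illustration of how pupil size varies with ambient light, the Moon & Spencer (1944) fit is often quoted. It is used below only as an example: the disclosure does not specify any particular formula, and the function name is ours.

```python
import math

def pupil_diameter_mm(luminance_cd_m2: float) -> float:
    """Approximate pupil diameter (mm) for an adapting luminance (cd/m^2),
    using the Moon & Spencer (1944) fit. Illustrative only: the disclosure
    assumes some monotone illuminance-size relationship, not this formula."""
    return 4.9 - 3.0 * math.tanh(0.4 * math.log10(luminance_cd_m2))

dark = pupil_diameter_mm(0.01)      # dim scene: pupil dilates (~7 mm)
bright = pupil_diameter_mm(1000.0)  # bright scene: pupil contracts (~2.4 mm)
```

A pupil size that is neither too large nor too small then corresponds to a bounded interval of luminance, which is what the "predetermined range" above amounts to.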
- Patent Document 1 describes an iris imaging device that captures iris images suitable for personal authentication.
- The iris imaging device of Patent Document 1 estimates the ambient brightness based on the captured image.
- The iris imaging device irradiates the eyes with visible illumination light.
- The iris imaging device appropriately controls the amount of visible light emitted and captures a plurality of iris images with different pupil sizes.
- Patent Document 2 describes a control system that irradiates one of the left and right eyes with light and controls the imaging unit so as to image the outer corner of the other eye during the irradiation.
- Patent Document 4 obtains an iris image in which the pupil has a desired size by decreasing or increasing the illumination until the pupil size reaches that desired size.
- Patent Document 3 describes a display device that controls its light-emitting unit in consideration of the brightness of the ambient light at the time of shooting, and a control method thereof.
- Patent Document 4 describes a biological information measuring device that measures biological information of a subject from the brightness of at least a part of the skin region around the detected pupil region.
- An object of the present disclosure is to provide a control device or the like that can shorten the time required to obtain an iris image in a desired state.
- A control device according to one aspect of the present disclosure includes: an acquisition means for acquiring an input image including an eye region, which is the region of an eye portion; an estimation means for estimating the illuminance of the eye portion from the acquired input image; and a determination means for determining, based on an illuminance-size relationship between illuminance and pupil size, the amount of visible light illuminating the eye portion so that the pupil size of the eye portion satisfies a size condition.
- In a control method according to one aspect of the present disclosure, an input image including an eye region, which is the region of an eye portion, is acquired; the illuminance of the eye portion is estimated from the acquired input image; and, based on the illuminance-size relationship between illuminance and pupil size, the amount of visible light illuminating the eye portion is determined so that the pupil size of the eye portion satisfies a size condition.
- A storage medium according to one aspect of the present disclosure stores a program that causes a computer to execute: an acquisition process for acquiring an input image including an eye region, which is the region of an eye portion; an estimation process for estimating the illuminance of the eye portion from the acquired input image; and a determination process for determining, based on the illuminance-size relationship between illuminance and pupil size, the amount of visible light illuminating the eye portion so that the pupil size of the eye portion satisfies a size condition.
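The acquisition, estimation, and determination steps above can be sketched as follows. Everything here is a minimal illustration under assumed names and numbers: the look-up-table entries are invented, and a real system would use a measured illuminance-size relationship (such as the look-up table of FIG. 13).

```python
# Hypothetical illuminance-size relationship: total illuminance at the
# eye (lux) -> pupil diameter (mm). The entries are invented for
# illustration; a real table would be measured in advance.
LUT = [(1.0, 7.0), (10.0, 5.5), (100.0, 4.0), (1000.0, 2.5)]

def illuminance_for_size(target_mm: float) -> float:
    """Smallest tabulated illuminance whose pupil diameter does not exceed
    target_mm (pupil size decreases monotonically as illuminance grows)."""
    for lux, mm in LUT:
        if mm <= target_mm:
            return lux
    return LUT[-1][0]

def visible_light_amount(ambient_lux: float, target_mm: float) -> float:
    """Determination step: the visible-light illuminance the lamp must add
    so the total reaches the level required for the target pupil size.
    Clamped at zero because illumination can only add light."""
    return max(0.0, illuminance_for_size(target_mm) - ambient_lux)

# Example: ambient illuminance estimated at 30 lux, target pupil 4 mm.
extra = visible_light_amount(30.0, 4.0)  # -> 70.0 lux still to be supplied
```

Because the lamp output is computed in one step from the estimated illuminance, rather than by iteratively nudging the light until the pupil happens to reach the target, this is where the claimed shortening of the acquisition time comes from.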
- The present disclosure has the effect of shortening the time required to obtain an iris image in a desired state.
- FIG. 1 is a block diagram showing an example of the configuration of the imaging system according to the first embodiment of the present disclosure.
- FIG. 2 is a diagram schematically showing an example of arrangement of an image pickup device, a visible light illumination device, and a near infrared light illumination device with respect to a person.
- FIG. 3 is a flowchart showing an example of the operation of the imaging system according to the first embodiment of the present disclosure.
- FIG. 4 is a flowchart showing an example of the operation of the control device according to the first modification of the first embodiment of the present disclosure.
- FIG. 5 is a flowchart showing an example of the operation of the control device according to the first modification of the first embodiment of the present disclosure.
- FIG. 6 is a block diagram showing an example of the configuration of the imaging system of the second modification of the first embodiment of the present disclosure.
- FIG. 7 is a block diagram showing an example of the configuration of the imaging system of the third modification of the first embodiment of the present disclosure.
- FIG. 8 is a block diagram showing a configuration of an imaging system according to a fourth modification of the first embodiment of the present disclosure.
- FIG. 9 is a block diagram showing an example of the configuration of the imaging system according to the second embodiment of the present disclosure.
- FIG. 10 is a block diagram showing an example of the configuration of the control device according to the third embodiment of the present disclosure.
- FIG. 11 is a flowchart showing an example of the operation of the control device according to the third embodiment of the present disclosure.
- FIG. 12 is a diagram showing an example of a computer hardware configuration capable of realizing the control device according to the embodiment of the present disclosure.
- FIG. 13 is a diagram schematically showing an example of a look-up table according to the first embodiment of the present disclosure.
- FIG. 14 is a diagram showing an all-around light environment model.
- FIG. 1 is a block diagram showing an example of the configuration of the imaging system of the present embodiment.
- The imaging system 1 of FIG. 1 includes a control device 100, an image pickup device 200, a visible light illumination device 300, and a notification destination device 500.
- The imaging system 1 may further include a near-infrared light illumination device 400.
- In FIG. 1, the visible light illumination device 300 is drawn below the control device 100, but the actual arrangement need not follow this example.
- FIG. 2 schematically shows an example of the arrangement of the image pickup device 200, the visible light illumination device 300, and the near-infrared light illumination device 400 with respect to a person.
- The imaging device 200 may be arranged so as to image the head of a person.
- The visible light illumination device 300 may be arranged so as to irradiate the head of the imaged person with visible light.
- The near-infrared light illumination device 400 may be arranged so as to irradiate the head of the imaged person with near-infrared light.
- The control device 100 is communicably connected to the image pickup device 200.
- The control device 100 is communicably connected to the visible light illumination device 300.
- The control device 100 may be installed at a location distant from the image pickup device 200, the visible light illumination device 300, and the near-infrared light illumination device 400.
- Alternatively, the control device 100 may be installed adjacent to them.
- The person is assumed to be moving by walking.
- However, the person may be moving by means other than walking, or may not be moving at all.
- The image pickup device 200 is a camera capable of capturing moving images and still images (hereinafter collectively referred to as images).
- The image pickup device 200 transmits the captured images to the control device 100.
- The image pickup device 200 may include, for example, an image sensor such as a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal-Oxide-Semiconductor) image sensor.
- The image sensor of the image pickup device 200 is sensitive to a wavelength band including at least a part of the near-infrared band (for example, 780 nm to 1000 nm).
- The image pickup device 200 may be, for example, a visible light camera whose image sensor has no IR (Infrared) cut filter attached.
- The image pickup device 200 may be a camera capable of capturing color images, or one capable of capturing grayscale images.
- The image pickup device 200 may also be a near-infrared light camera.
- The image pickup device 200 may be installed at, for example, a walk-through gate or in a passage.
- The image pickup device 200 may be installed so as to be able to image the face of a person passing through the walk-through gate, passage, or the like (hereinafter also referred to as a walking path).
- The number of image pickup devices 200 does not have to be one.
- The image pickup device 200 may be realized by a plurality of cameras that image different ranges.
- For example, the image pickup device 200 may be realized by a plurality of cameras arranged so that, when people of various heights walk along the walking path, the range through which a face can pass is covered by at least one of the cameras.
- The image pickup device 200 may be set to perform imaging with its focus position fixed.
- The focus position is, for example, the distance from the camera center of the image pickup device 200 to the in-focus position.
- The depth of field is determined by the focal length and aperture value of the image pickup device 200 and the distance from the image pickup device 200 to the in-focus plane.
- The depth of field of the image pickup device 200 represents the range of distances from the camera center within which an object appears in focus in the captured image.
- The lower limit of the depth of field is the distance from the image pickup device 200 to the closest object considered in focus in the captured image.
- The upper limit of the depth of field is the distance from the image pickup device 200 to the farthest object considered in focus in the captured image.
- The depth of field of the image pickup device 200 thus represents the range from the above lower limit to the above upper limit.
- If the distance to the person falls within the range represented by the depth of field, the person can be considered in focus. The focus position may therefore be set, for example, so that the expected distance to the person coincides with the upper limit of the depth of field.
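The near and far limits discussed above follow from the standard thin-lens depth-of-field formulas. The sketch below is not part of the disclosure; the focal length, aperture, and circle-of-confusion values are illustrative.

```python
def depth_of_field(f_mm: float, n: float, s_mm: float,
                   c_mm: float = 0.03) -> tuple:
    """Near and far limits (mm) of acceptable focus for focal length f_mm,
    f-number n, subject distance s_mm, and circle of confusion c_mm."""
    h = f_mm * f_mm / (n * c_mm) + f_mm  # hyperfocal distance
    near = s_mm * (h - f_mm) / (h + s_mm - 2.0 * f_mm)
    far = s_mm * (h - f_mm) / (h - s_mm) if s_mm < h else float("inf")
    return near, far

# A long lens nearly wide open, focused at 2 m: the zone of sharp focus
# is only a few centimetres deep, which is why the choice of focus
# position matters for imaging the iris of a passing person.
near, far = depth_of_field(f_mm=100.0, n=2.8, s_mm=2000.0)
```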
- The imaging range of the image pickup device 200 is the range in which the distance to the image pickup device 200 is included in the depth of field.
- The imaging device 200 may be set to focus automatically when a subject such as a person enters the imaging range.
- This imaging range may be the range in which a subject that the imaging device 200 can focus on can exist.
- The image pickup device 200 may be set to detect eyes in the captured image and, when eyes are detected, to focus on them.
- The image pickup device 200 may focus on the eye by any of various existing eye-focusing methods.
- In this case, the focus position may be the distance from the camera center of the image pickup device 200 to the object that appears most precisely in focus in the captured image.
- The imaging device 200 may be set to continue imaging even during the focusing operation.
- The image pickup device 200 may be set to keep the iris of a moving person in focus over a plurality of consecutive images (frames). In other words, it may be set so that the position of the person's iris remains within its depth of field while it images a person moving through the imaging range a plurality of times.
- The imaging device 200 may be set to image continuously regardless of whether a person (for example, a pedestrian on the walking path) is within the imaging range.
- Alternatively, the imaging device 200 may be set to start imaging when a person such as a pedestrian enters the imaging range and to continue until no moving object remains in the imaging range.
- An object detection sensor realized by a near-infrared light sensor, an ultrasonic sensor, or the like may detect a person on the walking path; such a sensor, which transmits a detection signal when a person is detected, may be connected to the imaging device 200.
- In the following description, the image pickup device 200 is assumed to be set to image continuously after starting operation.
- The image pickup device 200 sends the images (specifically, image data) obtained by imaging to the control device 100.
- The images transmitted by the image pickup device 200 to the control device 100 may be a moving image or a plurality of still images.
- The image pickup device 200 further transmits camera parameters and the like at the time of imaging (hereinafter referred to as imaging information) to the control device 100.
- The imaging information is, for example, information such as the shutter speed, aperture value, sensitivity, focal length, and focus position.
- When the image pickup device 200 is realized by a plurality of cameras, it may send all the images captured by the cameras to the control device 100; in that case, it may send the images and the imaging information to the control device 100 as a set for each camera.
- The visible light illumination device 300 is a light that emits visible light.
- The visible light illumination device 300 is configured so that the amount of light it emits can be controlled by another device such as the control device 100.
- The visible light illumination device 300 is configured so that the intensity of the emitted light can be changed under the control of the control device 100.
- The visible light illumination device 300 may be mounted so as to illuminate the eye portion of the face of a person (for example, a person walking along the walking path) imaged by the image pickup device 200.
- The eye portion refers to the range of a person's face that includes the eyes.
- The visible light illumination device 300 is mounted so that the light it emits is incident on the pupils of the person imaged by the image pickup device 200.
- The visible light illumination device 300 may be attached to the image pickup device 200.
- The visible light illumination device 300 may be mounted facing the same direction as the optical axis of the image pickup device 200, or mounted near the image pickup device 200 facing a direction similar to that optical axis.
- The relationship between the position and orientation of the visible light illumination device 300 and those of the image pickup device 200 is measured in advance and given to the control device 100.
- The near-infrared light illumination device 400 is a device that irradiates near-infrared light.
- The near-infrared light illumination device 400 may be installed so as to illuminate the position where the image pickup device 200 is expected to perform imaging.
- The intensity of the emitted near-infrared light may be configured to stay within a predetermined range (for example, a range that does not damage the retina) throughout the region where the face (particularly the eye portion) of a person imaged by the image pickup device 200 can exist.
- The near-infrared light illumination device 400 may be realized by a plurality of near-infrared light illuminators arranged together.
- The near-infrared light illumination device 400 may be configured to irradiate near-infrared light continuously after the imaging system 1 starts operation, or only while the image pickup device 200 is performing imaging.
- The near-infrared light illumination device 400 may be attached to the image pickup device 200, or mounted near it, facing the same direction as, or a direction similar to, the optical axis of the image pickup device 200.
- The near-infrared light illumination device 400 may be connected to the above-mentioned object detection sensor and configured to irradiate near-infrared light when a moving object is detected on the walking path by that sensor.
- In the following description, the near-infrared light illumination device 400 is assumed to continue irradiating near-infrared light regardless of whether the image pickup device 200 is performing imaging.
- The control device 100 includes an acquisition unit 110, an estimation unit 120, a relationship storage unit 130, a determination unit 140, a control unit 150, and a notification unit 160.
- The control device 100 may further include an image storage unit 170.
- The acquisition unit 110 acquires the images (specifically, image data) captured by the image pickup device 200 from the image pickup device 200.
- The acquisition unit 110 further acquires the imaging information from the imaging device 200.
- The acquisition unit 110 sends the still images and the imaging information to the estimation unit 120.
- The acquisition unit 110 may, for example, generate still image data for each frame from acquired moving image data, and send the generated still image data and the imaging information to the estimation unit 120.
- The acquisition unit 110 may detect the pupil in an acquired image. When a pupil is detected, the acquisition unit 110 may send the image to the estimation unit 120; when no pupil is detected, it need not send the image.
- The acquisition unit 110 may determine whether the eye region of an acquired image is in focus. The eye region is the area of the image that includes the eyes of a person's face.
- The method by which the acquisition unit 110 determines whether the eye region is in focus may be any existing method.
- The acquisition unit 110 may, for example, detect high-contrast regions in the acquired image and detect the pupil region within them. When the pupil region is detected in a high-contrast region, the acquisition unit 110 may determine that the eye region is in focus. When the eye region of the acquired image is in focus, the acquisition unit 110 may send the image to the estimation unit 120; when it is out of focus, it need not send the image.
- When the image pickup device 200 is realized by a plurality of cameras as described above, the acquisition unit 110 receives a plurality of images captured by those cameras. In that case, the acquisition unit 110 may detect the pupil region in each received image, send the images in which a pupil is detected to the estimation unit 120, and withhold the images in which no pupil is detected.
- The acquisition unit 110 may instead extract, from the plurality of received images, the images in which the eye region is in focus, and send only those to the estimation unit 120; when no such image is extracted, it need not send any image.
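The contrast-based focus check described above can be sketched with a Laplacian-variance sharpness measure. This is one common stand-in for "detecting a region with high contrast"; the measure and the threshold are our assumptions, not the patent's.

```python
import numpy as np

def sharpness(gray: np.ndarray) -> float:
    """Variance of a discrete Laplacian over a grayscale patch.
    Higher values mean more local contrast, i.e. better focus."""
    lap = (-4.0 * gray[1:-1, 1:-1]
           + gray[:-2, 1:-1] + gray[2:, 1:-1]
           + gray[1:-1, :-2] + gray[1:-1, 2:])
    return float(lap.var())

def frames_in_focus(eye_patches, threshold):
    """Keep only the frames whose eye region passes the sharpness test,
    mirroring the acquisition unit forwarding only in-focus images."""
    return [p for p in eye_patches if sharpness(p) >= threshold]
```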
- The acquisition unit 110 may store the acquired images and imaging information in the image storage unit 170.
- The estimation unit 120 receives the images (specifically, image data) and the imaging information from the acquisition unit 110.
- The estimation unit 120 estimates the illuminance at the eye portion from the received image based on the received imaging information.
- The eye portion refers to the area of the face that includes the eyes.
- The eye portion includes at least the iris, including the pupil, and the sclera.
- The illuminance of the pupil is considered to be the same as the illuminance of the eye portion, and also the same as the illuminance of the sclera.
- The estimation unit 120 first detects the scleral region of the eye in the received image. Specifically, the estimation unit 120 may extract the pupil region in the received image and then extract the scleral region based on the position of the extracted pupil. For example, the estimation unit 120 may detect the contour of the iris surrounding the pupil and the contour of the eye containing the iris, and extract, as the scleral region, the region between the iris contour and the eye contour that is brighter than the iris region. The method by which the estimation unit 120 extracts the scleral region is not limited to these examples.
- The estimation unit 120 further estimates the illuminance of the sclera based on the imaging information and the pixel values of the scleral region. Specifically, the estimation unit 120 estimates the illuminance of the sclera from imaging information such as the sensitivity, aperture value, and shutter speed, and from the brightness of the scleral region represented by its pixel values. The estimation unit 120 may estimate the illuminance that would produce the pixel values extracted from the image if a sclera with given reflection characteristics were imaged under the conditions represented by the received imaging information. The reflection characteristics of the sclera may be determined by an appropriate approximation.
- the estimation unit 120 may consider that the portion of the sclera other than the portion where reflection close to specular reflection occurs is a uniformly diffuse reflection surface having a constant reflectance.
- the estimation unit 120 may detect pixels brighter than a predetermined reference in the detected sclera region as pixels in a portion where reflection close to specular reflection occurs.
- the estimation unit 120 may exclude the pixels detected as the pixels of the portion where the reflection close to the specular reflection occurs from the sclera region whose pixel value is used for calculating the illuminance.
- the estimation unit 120 may calculate the illuminance in the sclera region based on the pixel value of the sclera region in the image and the reflectance of the sclera.
- the administrator of the control device 100 may experimentally obtain the reflectance in advance and give it to the estimation unit 120.
- the estimation unit 120 uses the estimated sclera illuminance as the illuminance of the eye portion including the pupil.
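The estimation steps above (exclude near-specular pixels from the sclera region, then divide the remaining brightness by a reflectance obtained in advance) can be sketched as follows. This is a minimal illustration, not the patented implementation: the function name, the percentile-based specular threshold, and the sample values are hypothetical, and absolute luminance values per pixel are assumed to be available already.

```python
import numpy as np

def estimate_eye_illuminance(sclera_luminance, reflectance, specular_percentile=95):
    """Hypothetical sketch of the sclera-based illuminance estimate.

    sclera_luminance: 1-D sequence of absolute luminance values (cd/m^2)
    for the extracted sclera region; reflectance: the diffuse reflectance
    of the sclera, obtained in advance (e.g., experimentally).
    """
    pixels = np.asarray(sclera_luminance, dtype=float)
    # Exclude pixels brighter than a reference value, treating them as
    # near-specular reflections rather than uniform diffuse reflection.
    threshold = np.percentile(pixels, specular_percentile)
    diffuse = pixels[pixels <= threshold]
    # Luminance = illuminance * reflectance (uniform diffuser convention
    # used later in the text), so illuminance = mean luminance / reflectance.
    return diffuse.mean() / reflectance
```

The percentile threshold merely stands in for the "predetermined reference" mentioned above; any brightness criterion could be substituted.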
- the estimation unit 120 may calculate the brightness of the sclera region in the color image according to the calculation method described below.
- a method of calculating the brightness of the sclera in a color image taken by a camera such as an image pickup apparatus 200 will be described. It is assumed that the pixel value of each pixel included in the color image is represented by the value of the R (Red) component, the value of the G (Green) component, and the value of the B (Blue) component. In other words, it is assumed that the color image is composed of RGB.
- the RGB chromaticity and the white chromaticity can be specified in advance as the color characteristics of the camera.
- the chromaticity of RGB and the chromaticity of white may be given to the estimation unit 120 in advance.
- the RGB data (three values of RGB) of the RGB color system are set so as to be uniquely converted into the tristimulus value XYZ in the XYZ color system.
- the following is an example of a conversion method for converting RGB data into tristimulus value XYZ data.
- the relationship between the RGB values of the input image and the tristimulus values XYZ is determined by, for example, the following equation (1):
(X, Y, Z)^T = M_RX (R, G, B)^T ... (1)
- M_RX is a 3 x 3 transformation matrix.
- the conversion formula for converting RGB to XYZ is not limited to the example of formula (1).
- the conversion formula may be any formula that can uniquely convert RGB to XYZ; for example, it may be defined, as a form to which quadratic terms are added, by the following equation (2):
(X, Y, Z)^T = M'_RX (R, G, B, R^2, G^2, B^2, RG, GB, BR)^T ... (2)
- M'_RX is a 3 x 9 transformation matrix.
- the M RX and M'RX may be calculated in advance by performing color calibration of the camera using, for example, a known color patch, and may be given to the estimation unit 120.
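As an illustration of the linear RGB-to-XYZ conversion described above, the sketch below applies a 3 x 3 matrix to a linear RGB triple and returns the stimulus value Y. The matrix here is the standard sRGB-to-XYZ (D65) matrix used only as a stand-in; a real system would use the camera-specific M_RX obtained by color calibration.

```python
import numpy as np

# Stand-in for a calibrated M_RX: the sRGB-to-XYZ (D65) matrix.
M_RX = np.array([[0.4124, 0.3576, 0.1805],
                 [0.2126, 0.7152, 0.0722],
                 [0.0193, 0.1192, 0.9505]])

def rgb_to_tristimulus_y(rgb):
    """Convert a linear RGB triple to XYZ and return the stimulus value Y."""
    xyz = M_RX @ np.asarray(rgb, dtype=float)
    return float(xyz[1])
```

For reference white (1, 1, 1), this matrix yields Y = 1.0, consistent with the white chromaticity being specified in advance as a camera color characteristic.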
- the tristimulus value Y can be converted to absolute brightness L (cd / m 2).
- the image pickup apparatus 200 may send information on the aperture value F, the shutter speed, and the gain at the time of photographing to the acquisition unit 110 of the control apparatus 100.
- the acquisition unit 110 may receive the information on the aperture value F, the shutter speed, and the gain at the time of shooting from the image pickup device 200, and may send the received information to the estimation unit 120.
- the estimation unit 120 may receive information on the aperture value F, the shutter speed, and the gain at the time of shooting from the acquisition unit 110.
- the estimation unit 120 may convert the tristimulus value Y into absolute brightness L (cd / m 2 ) from the information of the aperture value F, the shutter speed, and the gain at the time of shooting, for example, according to the method shown below.
- the tristimulus value Y represents the stimulus value Y among the tristimulus values XYZ.
- the estimation unit 120 may convert the tristimulus value Y into absolute brightness L according to the following method. Generally, in shooting with a camera, for example, the camera sets the aperture value F, shutter speed, and gain of the lens in order to obtain an appropriate exposure. In the following description, it is assumed that the gain when the input image is captured is set to 1.0.
- the estimation unit 120 acquires data indicating 1.0 as a gain value set when the input image is captured from the image pickup apparatus 200 via the acquisition unit 110.
- the tristimulus value Y corresponding to the pixel values (R, G, B) of the captured image is determined, for a given absolute brightness L (the amount of light incident on the camera), by two variables: the lens aperture value F and the shutter speed.
- the camera to be used is calibrated in advance. Specifically, in the calibration, a plurality of captured images are taken while changing the imaging conditions (that is, the amount of incident light (absolute brightness L (cd/m^2)), the lens aperture value F, and the shutter speed). From the pixel values (R, G, B) of each captured image, the tristimulus value Y is calculated by equation (1) or equation (2). Then, the relationship between the imaging conditions under which each captured image was taken (that is, the absolute brightness L, the lens aperture value F, and the shutter speed) and the tristimulus value Y obtained from that captured image is derived.
- a LUT (Look Up Table) may be generated.
- in this LUT, values corresponding to the imaging conditions (that is, the absolute brightness L, the lens aperture value F, and the shutter speed S) may be associated with the tristimulus value Y calculated from the pixel value of the pixel of the captured image corresponding to the location having the above-mentioned absolute brightness L.
- the LUT may include a plurality of such combinations of the absolute brightness L, the lens aperture value F, the shutter speed S, and the tristimulus value Y.
- the LUT may be generated so that the absolute brightness L is uniquely determined by the combination of the lens aperture value F, the shutter speed S, and the tristimulus value Y.
- the absolute brightness L can be obtained by this LUT. Therefore, this calibration produces a LUT for obtaining the absolute brightness L, which is the amount of incident light corresponding to the lens aperture value F, the shutter speed, and the tristimulus value Y.
- from the conditions at the time of shooting with the camera (that is, the lens aperture value F and the shutter speed) and the tristimulus value Y corresponding to the pixel values (R, G, B) of the image captured under those conditions, the absolute brightness L (cd/m^2), which is the amount of incident light at the time of shooting, can be determined.
- Such a LUT may be given to the estimation unit 120 in advance.
- the estimation unit 120 receives the input image, the lens aperture value F when the input image is captured, and the shutter speed.
- the estimation unit 120 extracts the sclera region from the input image.
- the estimation unit 120 calculates the tristimulus value Y from the pixel values (R, G, B) of the pixels included in the extracted sclera region.
- the estimation unit 120 determines the absolute brightness L by using the LUT from the received lens aperture value and shutter speed and the calculated tristimulus value Y.
- the LUT may include a plurality of different combinations of absolute luminance L and tristimulus value Y for the same combination of aperture value and shutter speed.
- the estimation unit 120 identifies the absolute brightness associated in the LUT with the tristimulus value.
- the estimation unit 120 may use the identified absolute brightness as the absolute brightness corresponding to the calculated tristimulus value.
- the absolute brightness corresponding to the tristimulus values represents the absolute brightness of the object whose brightness is represented as the pixel value for which the tristimulus values are calculated.
- FIG. 13 is a diagram showing an example of a LUT.
- An example of the LUT will be described with reference to FIG.
- for example, a completely white plate is photographed multiple times while changing the aperture value and the shutter speed. Suppose that the amount of light incident on the camera from the completely white plate is L1 or L2, that the aperture value is F1 or F2, and that the shutter speed is S1 or S2. Taking one shot for each combination of conditions, eight captured images are obtained.
- Eight tristimulus values Y can be obtained by obtaining the tristimulus values for the pixel values of each of the obtained captured images by the equation (1) or the equation (2).
- a LUT as shown in FIG. 13 is generated from the eight imaging conditions and the eight tristimulus values Y obtained under the eight imaging conditions.
- the estimation unit 120 holds, for example, a LUT as shown in FIG. 13 in advance.
- the estimation unit 120 calculates the tristimulus value Y from the pixel values of the received input image.
- suppose the estimation unit 120 calculates the tristimulus value YX from the pixel values of an image taken at the aperture value F1 and the shutter speed S2. The estimation unit 120 then extracts from the LUT the tristimulus values calculated from images taken under the same shooting conditions as those of the input image (that is, the aperture value F1 and the shutter speed S2).
- the tristimulus values calculated from the images taken with the aperture value being F1 and the shutter speed being S2 are Y2 and Y6.
- the estimation unit 120 compares the tristimulus values Y2 and Y6 extracted from the LUT with the tristimulus values YX calculated from the input image.
- the estimation unit 120 reads out from the LUT the amount of incident light when the tristimulus value is Y2 and the amount of incident light when the tristimulus value is Y6.
- the amount of incident light when the tristimulus value is Y2 is L1
- the amount of incident light when the tristimulus value is Y6 is L2.
- the estimation unit 120 calculates the amount of incident light LX by an existing method such as interpolation, based on the differences between the tristimulus values Y2 and Y6 extracted from the LUT and the tristimulus value YX calculated from the input image, and on the amounts of incident light L1 and L2 read from the LUT.
- in the example shown in FIG. 13, there are eight shooting conditions, but the number of shooting conditions is not limited to eight. When creating the LUT, shooting may be performed under more conditions to expand the amount of information in the LUT. By using such a LUT, a more accurate amount of incident light (that is, absolute brightness L) can be obtained.
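The LUT lookup with interpolation described above can be sketched as follows. The data layout (a mapping from shooting conditions to (Y, L) pairs) and the linear interpolation between the two bracketing entries are illustrative assumptions, not the only interpolation the text permits.

```python
def incident_light_from_lut(lut, aperture, shutter, y_x):
    """Interpolate the absolute brightness L for tristimulus value y_x
    among LUT entries sharing the shooting conditions (aperture, shutter).

    lut maps (aperture, shutter) -> list of (tristimulus Y, absolute L) pairs.
    """
    entries = sorted(lut[(aperture, shutter)])
    (y_lo, l_lo), (y_hi, l_hi) = entries[0], entries[-1]
    if y_hi == y_lo:
        return l_lo
    # Linear interpolation between the two entries, as in the FIG. 13 example.
    t = (y_x - y_lo) / (y_hi - y_lo)
    return l_lo + t * (l_hi - l_lo)
```

With the FIG. 13 example, the entries for (F1, S2) would be (Y2, L1) and (Y6, L2), and a query YX between Y2 and Y6 yields an LX between L1 and L2.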
- the relationship between the amount of incident light, the lens aperture value F, the shutter speed, and the image output value may differ depending on the model and individual differences of the camera (or of the combination of camera and lens, if the lens is removable). Therefore, by performing calibration for each camera (or each combination of camera and lens), a LUT that can convert the tristimulus value Y into a highly accurate absolute brightness L (cd/m^2) can be generated.
- the estimation unit 120 calculates the apparent irradiance at each pixel position of the reflector in the image.
- the method described in Reference 1 and the method described in Reference 2 can be applied to the calculation of irradiance.
- Reference 1: Imari Sato, Yoichi Sato, Katsushi Ikeuchi, "Estimation of Light Source Environment Based on Shadows of Objects", IPSJ Journal: Computer Vision and Image Media, Vol. 41, No. SIG 10 (CVIM 1), December 2000.
- FIG. 14 is a diagram schematically showing an all-around light environment model considering a surface light source.
- the calculation of the irradiance will be described using the all-around light environment model in consideration of the surface light source shown in FIG.
- Point A is the center of the sphere. Any point on the reflector can be selected as the center A of the sphere.
- the radiance distribution of the light source from the entire periphery, which is observed at the center A of the sphere is modeled assuming that there is no obstacle blocking the light source and the center A.
- let this radiance distribution be L(θ, φ), where θ represents the zenith angle and φ represents the azimuth angle.
- the illuminance E_A at the center A is obtained by integrating, over all directions, the incident light energy received from the minute solid angle dω_i represented by the minute zenith angle dθ_i and the minute azimuth dφ_i (the subscript i indicates incidence).
- the illuminance E_A is expressed, for example, by the following equation (3):
E_A = ∫(φ_i: 0 to 2π) ∫(θ_i: 0 to π/2) L(θ_i, φ_i) cos θ_i sin θ_i dθ_i dφ_i ... (3)
- the condition that the above equation (3) holds is that the light reflection characteristic of the reflector is Lambertian reflection.
- the reflection on the sclera is not necessarily the Lambertian reflection.
- the situation in which the imaging device 200 images the face of a passing person corresponds to observing a minute object, the eye, from a distance. Therefore, when a point on the sclera is selected as the center A, the light reflected at the center A is an integral of the ambient light incident on the center A from all directions, and is constant regardless of the viewpoint direction. Therefore, the above equation (3) can be considered valid for the illuminance of the sclera. In other words, the illuminance of the sclera is approximately represented by equation (3).
- the luminance value I_A of the reflector recorded in the image is represented by the product of the illuminance E_A and the surface reflectance S_A of the reflector, as shown by the following equation (4):
I_A = E_A S_A ... (4)
- the surface reflectance S A of the reflector can be obtained in advance, for example, by measurement.
- as the luminance value I_A, the absolute brightness L can be used, determined from the tristimulus value Y calculated by equation (1) or (2) together with the aperture value F of the camera lens at the time of shooting, the shutter speed, the gain, and the like.
- the estimation unit 120 may calculate the illuminance E_A by dividing the calculated absolute brightness L (that is, I_A) by the surface reflectance S_A obtained in advance.
- strictly, the image luminance reflects the spectral sensitivity characteristic of the camera, which is expressed as a function of the wavelength λ. Here, however, the wavelength λ can be regarded as a constant. Therefore, the image luminance I_A^k (k being r, g, b) at the point A is expressed, for example, by the following equation (5):
I_A^k = τ_k E_A^k S_A^k ... (5)
- τ_k is the camera gain.
- the estimation unit 120 may acquire the camera gain from the camera. That is, by equation (5), the illuminance E_A^k at the point A can be calculated from the image luminance I_A^k at the point A and the camera gain. It is assumed that the surface reflectance S_A^k (k being r, g, b) of the reflector is obtained in advance, for example, by measurement. The estimation unit 120 may calculate E_A^k by dividing I_A^k by τ_k and S_A^k. The illuminance E_A of visible light is obtained by adding E_A^r, E_A^g, and E_A^b, as shown in the following equation (6):
E_A = E_A^r + E_A^g + E_A^b ... (6)
- the illuminance E A around the sclera can be obtained.
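The per-channel division and summation described above can be sketched as follows. The function name and the dictionary layout are hypothetical; the arithmetic follows the structure of equations (5) and (6), with I_A^k the image luminance, τ_k the camera gain, and S_A^k the surface reflectance per channel.

```python
def illuminance_from_image_luminance(i_a, tau, s_a):
    """Per-channel illuminance E_A^k = I_A^k / (tau_k * S_A^k),
    summed over k in {r, g, b} to give the visible-light illuminance E_A."""
    e = {k: i_a[k] / (tau[k] * s_a[k]) for k in ("r", "g", "b")}
    return e["r"] + e["g"] + e["b"]
```

The reflectances S_A^k are assumed to have been measured in advance, as stated above.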
- the estimation unit 120 sends the illuminance of the eye portion to the determination unit 140.
- the estimation unit 120 may further send the distance information to the determination unit 140.
- the distance information may be information representing the distance from the visible light illuminating device 300 to the eye portion in the three-dimensional space.
- for example, the estimation unit 120 may send information representing the distance from the image pickup device 200 to the eye portion to the determination unit 140 as the above-mentioned distance information. For example, when the image pickup device 200 is configured so that the focus distance is fixed and the distance from the image pickup device 200 to the eye portion is given, the estimation unit 120 may set the given distance as the distance from the visible light illumination device 300 to the eye portion. When the acquisition unit 110 acquires the focus distance sent from the image pickup device 200, the estimation unit 120 may set the acquired focus distance as the distance from the visible light illumination device 300 to the eye portion.
- the estimation unit 120 may estimate the distance from the visible light illumination device 300 to the eye portion. Specifically, the estimation unit 120 may estimate the distance from the visible light illumination device 300 to the eye portion based on the positional relationship between the image pickup device 200 and the visible light illumination device 300 and the positional relationship between the image pickup device 200 and the eye portion. In this case, the positional relationship between the image pickup device 200 and the visible light illumination device 300 may be given in advance. Further, the estimation unit 120 may calculate the positional relationship between the image pickup device 200 and the eye portion based on the image including the region of the eye portion captured by the image pickup device 200 and the imaging information at the time that image was captured.
- the estimation unit 120 of this modification may calculate the absolute brightness as described below.
- the estimation unit 120 of this modification may operate in the same manner as the estimation unit 120 described above, except that the estimation unit 120 operates as described below.
- the estimation unit 120 extracts, from the tristimulus values included in the LUT, for example, the two tristimulus values closest to the calculated tristimulus value.
- the estimation unit 120 may calculate the absolute brightness by interpolation, based on the differences between the extracted tristimulus values and the calculated tristimulus value, from the absolute brightnesses associated with the extracted tristimulus values. For example, suppose that, in the LUT, the two tristimulus values closest to the calculated tristimulus value Ya that are associated with the received aperture value and shutter speed are Yb and Yc, that the absolute brightness associated with Yb is Lb, and that the absolute brightness associated with Yc is Lc. In that case, the estimation unit 120 may calculate the absolute brightness corresponding to Ya by interpolating between Lb and Lc.
- when the LUT does not include a tristimulus value that is associated with the received aperture value and shutter speed and that matches the tristimulus value calculated from the pixel values, the estimation unit 120 may select three or more tristimulus values, each associated with an absolute brightness, from the LUT.
- the method of selecting three or more tristimulus values may be any of various existing methods, for example, a method of excluding outliers and selecting them.
- the estimation unit 120 calculates the absolute brightness L corresponding to the calculated tristimulus value Y based on the selected tristimulus value.
- the estimation unit 120 may derive a function representing the relationship between the tristimulus value and the absolute brightness by, for example, the least squares method. Such a function may be a predetermined type of curve (eg, a straight line).
- the estimation unit 120 may calculate the absolute brightness L from the calculated tristimulus value Y by the derived function.
- the estimation unit 120 may calculate, using the LUT, the combination of the absolute brightness L and the tristimulus value Y for the case where the shooting conditions are the received aperture value and shutter speed.
- the estimation unit 120 may first extract from the combinations included in the LUT the combination including the aperture value and the shutter speed, which is the closest to the received aperture value and the shutter speed.
- the F value is a value obtained by dividing the focal length of the lens by the effective aperture.
- the F value may be represented by a character string in which the character "F" is connected to a decimal number or an integer having a maximum of two significant figures, but here, the F value is represented by a real number. If the shutter speed is represented by the length of time the shutter is open, halving the shutter speed value halves the time the shutter is open.
- if the sum of the logarithm of the square of the aperture value (that is, twice the logarithm of the aperture value) and the sign-inverted logarithm of the shutter speed is the same for two shooting conditions, the brightness of the imaged object in the captured image should be the same.
- the base of the logarithm may be, for example, 2.
- the sum of the logarithm of the square of the aperture value, which is the F value, and the value obtained by reversing the sign of the logarithm of the shutter speed represented by the length of time will be referred to as a brightness value.
- the smaller the brightness value, the brighter the image taken at the aperture value and shutter speed for which the brightness value is calculated.
- the sign of the brightness value may be reversed.
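The brightness value defined above can be computed as follows. This is a sketch; the function name is hypothetical, and the logarithm base is taken to be 2 as suggested in the text. Two shooting conditions with equal brightness values should produce equally bright images of the same object.

```python
import math

def brightness_value(f_number, shutter_time):
    """Brightness value = log2(F^2) + (-log2(T))
    = 2*log2(F) - log2(T), for F-number F and shutter time T (seconds)."""
    return 2.0 * math.log2(f_number) - math.log2(shutter_time)
```

For example, F2.0 at 1/4 s and F4.0 at 1 s both give a brightness value of 4, so the exposure should be equivalent: stopping down by two stops is compensated by a four-times-longer shutter time.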
- the aperture value may be represented by, for example, the number of steps based on a predetermined aperture state.
- This number of stages is a numerical value representing the difference between the states of the two diaphragms by the magnification of the area of the opening of the diaphragm.
- the state of the aperture which is the reference, is referred to as the reference state.
- when the area of the opening of the diaphragm in one state is 2^n times (or (1/2)^n times) the area of the opening in the other state, the number of stages representing the difference between the two states is n.
- for example, when the area of the opening in a certain state (referred to as state A) is (1/2)^n times the area of the opening in the reference state, the number of stages of state A is +n.
- when the area of the opening in a certain state (referred to as state B) is 2^n times the area of the opening in the reference state, the number of stages of state B is -n.
- the number of steps is the difference between the logarithms of the two F-numbers representing the two states of the aperture, with the base being 2.
- the shutter speed may also be represented by, for example, the number of steps based on a predetermined shutter speed. This number of steps is a numerical value representing the difference between the two shutter speeds by the magnification of the length of time represented by the shutter speed.
- the shutter speed serving as the reference will be referred to as the reference shutter speed.
- when the length of time represented by one shutter speed is 2^n times (or (1/2)^n times) the length of time represented by the other shutter speed, the number of steps representing the difference between the two shutter speeds is n.
- when the length of time represented by a certain shutter speed (referred to as shutter speed A) is (1/2)^n times the length of time represented by the reference shutter speed, the number of steps of shutter speed A is -n.
- when the length of time represented by a certain shutter speed (referred to as shutter speed B) is 2^n times the length of time represented by the reference shutter speed, the number of steps of shutter speed B is +n.
- the number of steps is the difference between the logarithms of the two shutter speeds with the base of 2.
- the brightness value in this case may be a value obtained by subtracting the shutter speed represented by the number of steps from the predetermined shutter speed from the aperture value represented by the number of steps from the predetermined aperture.
- the estimation unit 120 may calculate the brightness value from the received aperture value and the shutter speed.
- the brightness value calculated from the received aperture value and shutter speed will be referred to as the target brightness value below.
- the estimation unit 120 may calculate the above-mentioned brightness value for each combination of the aperture value and the shutter speed included in the LUT.
- the brightness value calculated from the combination of the aperture value and the shutter speed included in the LUT is hereinafter referred to as the LUT brightness value.
- the estimation unit 120 may extract a LUT brightness value that matches the target brightness value calculated from the received aperture value and the shutter speed from the LUT brightness value.
- the estimation unit 120 may extract a combination of the aperture value for which the extracted LUT brightness value is calculated and the shutter speed from the LUT.
- the estimation unit 120 identifies the absolute brightness associated with the extracted combination.
- the estimation unit 120 determines the identified absolute brightness as the target absolute brightness.
- the estimation unit 120 may also select a plurality of combinations from the extracted combinations.
- the estimation unit 120 may select two combinations associated with the two tristimulus values closest to the tristimulus value calculated from the input image. Even if the estimation unit 120 calculates the absolute brightness corresponding to the tristimulus value calculated from the input image by the interpolation as described above from the two absolute brightnesses associated with the two selected tristimulus values Y. Good.
- the estimation unit 120 may select three or more combinations associated with three or more tristimulus values Y.
- the estimation unit 120 may derive a function from the tristimulus values and the absolute brightness associated with the three or more selected combinations, for example, as described above. Then, the estimation unit 120 may calculate the absolute brightness corresponding to the tristimulus value Y calculated from the input image by the derived function.
- the estimation unit 120 calculates a value p obtained by subtracting the target brightness value from the LUT brightness value.
- the estimation unit 120 may further calculate a value (hereinafter referred to as a corrected tristimulus value) obtained by multiplying by 2^p the tristimulus value Y associated with the aperture value and the shutter speed for which the LUT brightness value was calculated.
- based on the corrected tristimulus values, the estimation unit 120 may determine the absolute brightness corresponding to the tristimulus value calculated from the input image.
- the estimation unit 120 may select two corrected tristimulus values that are close to the tristimulus value calculated from the input image.
- the estimation unit 120 may calculate the absolute brightness corresponding to the tristimulus value Y calculated from the input image by interpolation as described above, based on the two selected corrected tristimulus values and the two absolute brightnesses corresponding to them.
- the estimation unit 120 may select three or more corrected tristimulus values by the same method as the above-mentioned method for calculating three or more tristimulus values.
- the estimation unit 120 may derive the absolute brightness corresponding to the tristimulus value Y calculated from the input image from the three or more selected corrected tristimulus values.
- the method for deriving the absolute brightness in that case may be the same as the above-mentioned method for deriving, based on three or more tristimulus values, the absolute brightness corresponding to the tristimulus value Y calculated from the input image.
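The corrected-tristimulus-value computation of this modification can be sketched as follows. The tuple layout of the LUT entries (brightness value, tristimulus Y, absolute L) is an illustrative assumption; the arithmetic follows the text: p is the LUT brightness value minus the target brightness value, and each Y is scaled by 2^p.

```python
def corrected_tristimulus_values(lut_entries, target_bv):
    """For each LUT entry (bv, Y, L), return (Y * 2^p, L) with
    p = bv - target_bv, i.e. Y rescaled to the target exposure."""
    return [(y * 2.0 ** (bv - target_bv), l) for (bv, y, l) in lut_entries]
```

The returned pairs can then be compared against the tristimulus value calculated from the input image to pick the nearest entries for interpolation.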
- the relationship storage unit 130 stores the relationship between the pupil size and the illuminance.
- the size of the human pupil (that is, the pupil size) changes depending on the illuminance of the eye portion.
- the relationship storage unit 130 stores, for example, experimentally determined data representing the relationship between the pupil size and the illuminance.
- the data representing the relationship between the illuminance and the pupil, which is stored in the relationship storage unit 130, is, for example, data from which the pupil size can be obtained given the illuminance of the eye portion, and from which the illuminance can be obtained given the pupil size.
- the data showing the relationship between the illuminance and the pupil will also be referred to as the illuminance size relationship below.
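One possible shape for the illuminance-size relationship data is sketched below. The numeric values are hypothetical placeholders rather than measured data, and interpolating on a log-illuminance axis is an implementation assumption; the patent only requires that the relation be usable in both directions.

```python
import numpy as np

# Hypothetical experimentally determined relation: illuminance (lx)
# versus pupil diameter (mm). The pupil shrinks as illuminance rises.
_illuminance_lx = np.array([1.0, 10.0, 100.0, 1000.0])
_pupil_mm = np.array([7.0, 5.5, 4.0, 2.5])

def pupil_size_from_illuminance(e_lx):
    """Pupil size expected when the eye-portion illuminance is e_lx."""
    return float(np.interp(np.log10(e_lx), np.log10(_illuminance_lx), _pupil_mm))

def illuminance_from_pupil_size(d_mm):
    """Inverse lookup: illuminance at which the pupil size is d_mm."""
    # np.interp needs ascending x, so reverse the descending size axis.
    log_e = np.interp(d_mm, _pupil_mm[::-1], np.log10(_illuminance_lx)[::-1])
    return float(10.0 ** log_e)
```

The two functions correspond to the two directions of use described above: size from illuminance, and illuminance from size.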
- the determination unit 140 receives the illuminance of the eye portion from the estimation unit 120.
- the determination unit 140 determines the amount of illumination to be applied to the eye portion in order for the pupil size to satisfy a predetermined condition (hereinafter referred to as a size condition) based on the illuminance of the eye portion and the illuminance size relationship.
- the size condition may be represented by, for example, a target value of pupil size.
- the size condition may be represented by a range of pupil sizes. In that case, the determination unit 140 may determine the target value of the pupil size by a predetermined method based on the size condition.
- the determination unit 140 may determine, for example, the value at the center of the size range represented by the size condition as the target value.
- the determination unit 140 may determine, for example, the largest value in the range represented by the size condition as the target value.
- the determination unit 140 may determine, for example, the smallest value in the range represented by the size condition as the target value.
- the determination unit 140 may determine another value as the target value.
- the determination unit 140 determines, for example, the illuminance at which the pupil size becomes the target value (hereinafter referred to as the target illuminance) based on the illuminance size relationship. Then, the determination unit 140 calculates the difference between the target illuminance and the received illuminance of the eye portion (hereinafter also referred to as the additional illuminance). In order to bring the illuminance of the eye portion to the target illuminance, the determination unit 140 determines the amount of light emitted toward the eye portion by the visible light illumination device 300 (hereinafter also referred to as the amount of illumination light) based on, for example, the distance between the visible light illumination device 300 and the eye portion.
- the determination unit 140 determines the amount of light emitted by the visible light illumination device 300 based on, for example, the distance between the visible light illumination device 300 and the eye portion, so that the illuminance produced by the visible light illumination device 300 at the eye portion becomes the additional illuminance.
- the determination unit 140 may determine a set value of the brightness of the visible light illumination device 300, which is necessary to irradiate the eye portion with the determined amount of light.
- the determination unit 140 may first calculate the distance from the visible light illuminating device 300 to the eye portion.
- the determination unit 140 may calculate the distance from the visible light illumination device 300 to the eye portion based on, for example, the imaging information of the image pickup device 200 and the positional relationship between the image pickup device 200 and the visible light illumination device 300.
- the determination unit 140 may obtain the positional relationship between the image pickup device 200 and the eye portion from the camera parameters of the image pickup device 200, the distance from the image pickup device 200 to the in-focus position, and the position of the eye region (that is, the region of the eye portion) in the captured image.
- the determination unit 140 may first calculate the direction of the eye region with respect to the image pickup device 200 based on the position of the eye region on the captured image and the camera parameters (for example, the angle of view).
- when the eye region is in focus in the captured image, the eye portion lies on the in-focus surface.
- the shape of the in-focus surface and its position with respect to the image pickup device 200 may be given to the determination unit 140 in advance.
- the determination unit 140 may calculate the intersection of the straight line toward the eye region and the in-focus surface as the position of the eye region.
- the determination unit 140 may calculate the relative position of the eye portion with respect to the image pickup apparatus 200 as the position of the eye portion.
- the determination unit 140 may calculate the relative position of the eye portion with respect to the visible light illumination device 300 based on the relative position of the eye portion with respect to the image pickup device 200 and the relative position between the image pickup device 200 and the visible light illumination device 300.
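The ray-plane intersection described above can be sketched as follows, with the camera at the origin looking along +z and the in-focus surface modeled as a plane. The simple pinhole model and every parameter name are illustrative assumptions, not the disclosed implementation.

```python
import math

def eye_position_from_image(px, py, width, height, hfov_deg, vfov_deg,
                            plane_point, plane_normal):
    """Intersect the viewing ray through pixel (px, py) with the in-focus plane."""
    # Angles of the ray from the optical axis, derived from the pixel offset.
    ax = math.radians((px - width / 2) / (width / 2) * hfov_deg / 2)
    ay = math.radians((py - height / 2) / (height / 2) * vfov_deg / 2)
    ray = (math.tan(ax), math.tan(ay), 1.0)
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    # Ray: r(t) = t * ray ; plane: (x - plane_point) . plane_normal = 0
    t = dot(plane_point, plane_normal) / dot(ray, plane_normal)
    return tuple(t * c for c in ray)

def relative_to_lamp(eye_pos, lamp_pos):
    """Eye position re-expressed relative to the illumination device."""
    return tuple(e - l for e, l in zip(eye_pos, lamp_pos))
```

A pixel at the image center with the in-focus plane at z = 2 m yields the point (0, 0, 2), which can then be shifted into the illumination device's frame.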
- further, based on the positional relationship between the eye portion and the visible light illumination device 300 (that is, the relative position described above) and the illumination characteristics (for example, the light distribution characteristics) of the visible light illumination device 300, the determination unit 140 calculates the amount of light of the visible light illumination device 300 at which the illuminance at the position of the eye portion becomes the target illuminance.
- the illumination characteristic of the visible light illumination device 300 may be, for example, a light distribution characteristic that represents the amount of light according to the angle from the direction in which the visible light illumination device 300 is facing.
- the light distribution characteristic may be represented by, for example, a ratio to the amount of light in the direction in which the visible light illuminating device is facing.
- the lighting characteristics of the visible light illuminating device 300 may be given in advance. Further, the shape of the eye portion may be assumed to be, for example, a flat surface. In other words, the shape of the eye region may be approximated by, for example, a plane.
- the direction of the eye region may be assumed to be, for example, the direction perpendicular to the direction of the walking path.
- the estimation unit 120 may detect a face in the acquired image and estimate the direction of the detected face. In this case, the estimation unit 120 may estimate the direction of the face with reference to the orientation of the image pickup apparatus 200.
- the determination unit 140 may regard the determined face direction as the normal of the plane that approximates the eye portion.
- the positional relationship between the eye portion and the visible light illumination device 300 may further include the relationship between the direction of the visible light illumination device 300 and the direction of the face (for example, the angle between the direction of the visible light illumination device 300 and the direction of the face), and the determination unit 140 may calculate the amount of light of the visible light illumination device 300 based on the relationship between the direction of the visible light illumination device 300 and the direction of the face.
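A point-source sketch of this calculation: the illuminance reaching the eye falls with the square of the distance, scales with the light distribution ratio at the off-axis angle, and scales with the cosine of the incidence angle against the plane approximating the eye. The cos² distribution and all names are illustrative assumptions.

```python
import math

def required_axis_intensity(additional_lx, distance_m, off_axis_deg,
                            incidence_deg, distribution):
    """On-axis luminous intensity [cd] needed so the eye receives additional_lx,
    given the lamp's light distribution and the angle between the lamp
    direction and the face direction (point-source model)."""
    # E = I * distribution(off_axis) * cos(incidence) / d^2, solved for I
    ratio = distribution(off_axis_deg)
    cos_i = math.cos(math.radians(incidence_deg))
    return additional_lx * distance_m ** 2 / (ratio * cos_i)

# Illustrative light distribution: intensity falls as cos^2 of the off-axis angle,
# expressed as a ratio to the on-axis amount of light.
cos2 = lambda a: math.cos(math.radians(a)) ** 2
```

Doubling the distance quadruples the required intensity; moving 60° off-axis with this distribution does the same, since cos²(60°) = 0.25.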
- the relationship between the position on the focused surface and the coordinates in the image captured by the image pickup device 200 is fixed.
- the position of the eye portion with respect to the image pickup apparatus 200 can be calculated based on the position of the eye portion extracted from the image acquired by the acquisition unit 110.
- the positional relationship between the image pickup apparatus 200 and the visible light illumination apparatus 300 is also fixed.
- the conversion formula and the parameters for calculating the relative position with respect to the visible light illumination device 300 from the position in the image captured by the image pickup device 200 may be derived in advance.
- the determination unit 140 may calculate the relative position with respect to the visible light illumination device 300 from the position in the captured image by using the conversion formula and the parameters derived in advance and given to the determination unit 140.
- the determination unit 140 may calculate, based on the calculated amount of light, the light distribution characteristics, and the relationship between the set value of the brightness of the visible light illumination device 300 and the amount of light emitted by the visible light illumination device 300, the set value at which the visible light illumination device 300 irradiates the calculated amount of light.
- the relationship between the set value of the brightness of the visible light illuminating device 300 and the amount of light emitted by the visible light illuminating device 300 may be given in advance.
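Such a set-value-to-light-amount relationship given in advance might be inverted as follows; the calibration table, the rounding-up choice, and the unit (lumens) are illustrative assumptions.

```python
import math
from bisect import bisect_left

# Illustrative calibration: brightness set value -> emitted luminous flux [lm]
SETTING_TO_FLUX = [(0, 0.0), (64, 120.0), (128, 260.0), (192, 420.0), (255, 600.0)]

def setting_for_flux(flux_lm: float) -> int:
    """Smallest integer set value whose emitted flux reaches flux_lm
    (linear interpolation between calibration points, rounded up)."""
    xs = [f for _, f in SETTING_TO_FLUX]
    i = bisect_left(xs, flux_lm)
    if i == 0:
        return SETTING_TO_FLUX[0][0]
    if i == len(xs):
        raise ValueError("requested flux exceeds the device range")
    (s0, f0), (s1, f1) = SETTING_TO_FLUX[i - 1], SETTING_TO_FLUX[i]
    t = (flux_lm - f0) / (f1 - f0)
    return math.ceil(s0 + t * (s1 - s0))
```

Rounding up errs on the side of slightly more light than requested, which keeps the pupil at or below the target size in this sketch.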
- the determination unit 140 sends the calculated set value to the control unit 150.
- the determination unit 140 may send the calculated light amount (specifically, a value representing the light amount) to the control unit 150.
- the control unit 150 receives the amount of light from the determination unit 140.
- the control unit 150 may calculate, based on the received amount of light, the light distribution characteristics of the visible light illumination device 300, and the relationship between the set value of the brightness of the visible light illumination device 300 and the light emitted by the visible light illumination device 300, the set value at which the visible light illumination device 300 irradiates the received amount of light.
- the determination unit 140 may change the amount of light emitted by the visible light illuminating device 300 so that the glare condition is satisfied.
- the glare condition is, for example, that the glare felt by a person irradiated with light by the visible light illuminating device 300 does not exceed a predetermined standard.
- the determination unit 140 may determine the amount of light emitted by the visible light illumination device 300 so that the glare felt by the person illuminated by the visible light illumination device 300 does not exceed a predetermined standard.
- a value indicating the degree of glare according to the illuminance of the eye portion, in other words, an index of glare (hereinafter referred to as the degree of glare), may be defined.
- the degree of glare may be, for example, an experimentally determined numerical value. In the following description, a larger value of the degree of glare indicates that the person feels the light to be more dazzling.
- the above-mentioned predetermined standard of the degree of glare is represented by the value of the degree of glare that represents the limit of permissible glare (hereinafter referred to as the upper limit value of glare).
- Information representing the relationship between the illuminance of the eye portion and the degree of glare may be stored in the relationship storage unit 130, for example.
- the relationship between the illuminance of the eye portion and the degree of glare may be expressed in the form of a look-up table, for example.
- the determination unit 140 may read the relationship between the illuminance of the eye portion and the degree of glare from the relationship storage unit 130.
- the determination unit 140 can specify the illuminance of the eye portion (hereinafter referred to as the upper limit illuminance) corresponding to the upper limit value of glare based on the relationship between the illuminance of the eye portion and the degree of glare.
- based on the information representing the relationship between the illuminance of the eye portion and the degree of glare, the determination unit 140 may determine the additional illuminance so that the degree of glare does not exceed the predetermined value (that is, the upper limit value of glare) representing the above-mentioned predetermined standard.
- for example, based on the relationship between the illuminance of the eye portion and the degree of glare, the determination unit 140 may calculate the maximum value of the illuminance (the upper limit illuminance) at which the degree of glare does not exceed the above-mentioned predetermined standard (that is, the upper limit value of glare).
- the determination unit 140 may further calculate the difference between the upper limit illuminance and the illuminance of the eye portion estimated by the estimation unit 120 as the maximum value of the additional illuminance.
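The lookup described above, and the cap it places on the additional illuminance, can be sketched as follows. The glare table and its values are illustrative assumptions, not data from this disclosure.

```python
# Illustrative glare lookup table: eye illuminance [lx] -> degree of glare
GLARE_TABLE = [(0, 0.0), (200, 1.0), (500, 2.0), (1000, 3.5), (2000, 6.0)]

def upper_limit_illuminance(glare_limit: float) -> float:
    """Largest eye illuminance whose degree of glare stays at or below glare_limit."""
    for (lx0, g0), (lx1, g1) in zip(GLARE_TABLE, GLARE_TABLE[1:]):
        if g0 <= glare_limit <= g1:
            t = (glare_limit - g0) / (g1 - g0)
            return lx0 + t * (lx1 - lx0)
    return GLARE_TABLE[-1][0]  # limit above the table: bounded by the table range

def clamp_additional_illuminance(additional_lx, current_lx, glare_limit):
    """Cap the additional illuminance so current + additional stays within the limit."""
    max_additional = upper_limit_illuminance(glare_limit) - current_lx
    return min(additional_lx, max(0.0, max_additional))
```

With an upper limit value of glare of 2.0 and 100 lx already on the eye, a requested 600 lx of additional illuminance would be clamped to 400 lx.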
- when the additional illuminance calculated based on the illuminance size relationship is larger than the maximum value of the additional illuminance, the determination unit 140 changes the value of the additional illuminance to that maximum value. Otherwise, the determination unit 140 does not need to change the value of the additional illuminance.
- the relationship between the illuminance and the degree of glare may be determined for each type of lighting.
- the relationship storage unit 130 may store the relationship between the illuminance of the eye portion and the degree of glare for each type of illumination.
- the relationship storage unit 130 may separately store, for example, the relationship between the illuminance of the eye portion due to natural light (for example, the light when the visible light illumination device 300 is not irradiating light) and the degree of glare, and the relationship between the illuminance of the eye portion due to the visible light illumination device 300 and the degree of glare.
- the degree of glare may be set so that the degree of glare when the eye portion is illuminated by two light sources can be calculated based on the degree of glare when the eye portion is irradiated by one of the two light sources and the degree of glare when the eye portion is irradiated by the other of the two light sources. In other words, the relationship between the degree of glare when the eye portion is illuminated by two light sources, the degree of glare when the eye portion is irradiated by one of the two light sources, and the degree of glare when the eye portion is irradiated by the other may be determined in advance. In the following, this relationship is described as the relationship between the three degrees of glare.
- the two light sources in this case are natural light and the visible light illumination device 300. In this case, the maximum value of the illuminance by the visible light illumination device 300 may be calculated.
- the degree of glare of the light emitted by the visible light illuminating device 300 may be determined according to the illuminance by natural light. For example, the magnitude of the illuminance due to natural light may be divided into a plurality of ranges. Then, the relationship between the illuminance by the visible light illuminating device 300 and the degree of glare may be obtained in advance for each range of the illuminance by natural light.
- the determination unit 140 specifies the range that includes the illuminance of the eye portion estimated by the estimation unit 120, and, using the relationship between the illuminance by the visible light illumination device 300 and the degree of glare determined for the specified range, can calculate the degree of glare from the illuminance by the visible light illumination device 300. Conversely, using the same relationship, the determination unit 140 can calculate, from a value of the degree of glare (for example, the maximum value of the degree of glare that satisfies the predetermined standard), the illuminance by the visible light illumination device 300 at which the degree of glare becomes that value.
- in this way, the maximum value of the illuminance of the light by the visible light illumination device 300 is calculated for the case where the degree of glare satisfies the predetermined standard and the illuminance of the eye portion when the visible light illumination device 300 is not irradiating light is the illuminance estimated by the estimation unit 120. The determination unit 140 may calculate this maximum value based on the relationship between the illuminance due to natural light and the degree of glare, the relationship between the illuminance due to the visible light illumination device 300 and the degree of glare, and the relationship between the three degrees of glare described above.
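One way such a calculation could go is sketched below, under the loudly labeled assumption that the combined degree of glare is simply the sum of the per-source degrees and that each per-source relationship is linear; the disclosure leaves the actual relationships to be determined in advance.

```python
# Assumed model of "the relationship between the three degrees of glare":
# combined glare = glare of natural light alone + glare of the lamp alone.
def glare_natural(lx):
    """Degree of glare of natural light alone (assumed linear)."""
    return lx / 400.0

def glare_lamp(lx):
    """Degree of glare of the visible light illumination device alone (assumed linear)."""
    return lx / 250.0

def max_lamp_illuminance(natural_lx, glare_limit):
    """Largest lamp illuminance keeping the combined glare within glare_limit."""
    budget = glare_limit - glare_natural(natural_lx)
    if budget <= 0:
        return 0.0  # natural light alone already exceeds the standard
    return budget * 250.0  # invert the assumed glare_lamp relationship
```

At 400 lx of natural light and a glare limit of 2.0, the lamp may contribute up to 250 lx in this model; under very bright natural light the budget drops to zero.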
- when the additional illuminance calculated based on the illuminance size relationship is larger than the calculated maximum value of the illuminance, the determination unit 140 may change the value of the additional illuminance to that maximum value. When the additional illuminance calculated based on the illuminance size relationship is not larger than the maximum value of the calculated illuminance, the determination unit 140 does not have to change the value of the additional illuminance.
- the determination unit 140 determines whether the amount of light of the visible light illumination device 300 can be controlled so that the size of the pupil satisfies the size condition. In other words, the determination unit 140 may determine whether or not the illuminance of the eye portion can be controlled to an illuminance in which the size of the pupil satisfies the size condition by irradiating the light with the visible light illumination device 300.
- the determination unit 140 may determine whether the amount of light that the visible light illuminating device 300 needs to irradiate in order to make the illuminance of the eye portion an illuminance at which the pupil size satisfies the size condition is included in the range of the amount of light that the visible light illuminating device 300 can irradiate.
- the amount of light that the visible light illuminating device 300 needs to irradiate in order to make the illuminance of the eye portion an illuminance at which the pupil size satisfies the size condition is referred to as the required light amount.
- the range of the amount of light that the visible light illuminating device 300 can irradiate is referred to as a possible range.
- if the required light amount is not included in the possible range, the determination unit 140 may determine that the light amount of the visible light illumination device 300 cannot be controlled so that the illuminance of the eye portion becomes an illuminance at which the pupil size satisfies the size condition. In other words, if the required light amount is not included in the possible range, the determination unit 140 may determine that the light amount of the visible light illumination device 300 cannot be controlled so that the size of the pupil satisfies the size condition. If the required light amount is included in the possible range, the determination unit 140 may determine that the light amount of the visible light illumination device 300 can be controlled so that the illuminance of the eye portion becomes an illuminance at which the pupil size satisfies the size condition. In other words, in that case the determination unit 140 may determine that the light amount of the visible light illumination device 300 can be controlled so that the size of the pupil satisfies the size condition.
- for example, when the illuminance of the eye portion irradiated with the maximum amount of light is smaller than the illuminance at which the pupil size satisfies the size condition, the determination unit 140 may determine that the illuminance of the eye portion cannot be controlled, by the irradiation of light by the visible light illuminating device 300, to an illuminance at which the pupil size satisfies the size condition.
- the above-mentioned maximum amount of light is the strongest amount of light that the visible light illuminating device 300 can irradiate.
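The controllability check above reduces to a range test; the sketch below also distinguishes the two failure reasons mentioned later in this description (natural light too bright, or too dark). The range values and names are illustrative assumptions.

```python
# Illustrative emittable range of the visible light illumination device [lm]
LAMP_RANGE_LM = (0.0, 600.0)

def check_controllable(required_lm, lamp_range=LAMP_RANGE_LM):
    """Return (controllable, reason).  Below the range means the eye is already
    too bright under natural light; above it means natural light is too dark
    for the device to compensate."""
    lo, hi = lamp_range
    if required_lm < lo:
        return False, "illuminance of the eye portion due to natural light is too bright"
    if required_lm > hi:
        return False, "illuminance of the eye portion due to natural light is too dark"
    return True, ""
```

When `controllable` is false, the reason string could accompany the uncontrollable notification.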
- in this case, the determination unit 140 may, via the notification unit 160, issue a notification indicating that the illuminance of the eye portion cannot be controlled by the visible light illuminating device 300 to an illuminance at which the pupil size satisfies the size condition.
- the notification indicating that the illuminance of the eye portion cannot be controlled by the visible light illuminating device 300 to the illuminance at which the pupil size satisfies the size condition is referred to as an uncontrollable notification.
- the determination unit 140 may, for example, send an instruction to transmit an uncontrollable notification to the notification unit 160.
- the notification unit 160 transmits the uncontrollable notification to a predetermined notification destination (for example, the notification destination device 500).
- when the additional illuminance calculated based on the illuminance size relationship is larger than the maximum value of the illuminance at which the glare condition is satisfied, the determination unit 140 may be configured to determine that the illuminance of the eye portion cannot be controlled to an illuminance at which the pupil size satisfies the size condition. In this case, the determination unit 140 may send an instruction to transmit an uncontrollable notification to the notification unit 160.
- the determination unit 140 may determine whether the illuminance of the eye portion can be controlled, by the irradiation of light by the visible light illuminating device 300, to an illuminance at which the pupil size satisfies the size condition, regardless of whether the glare condition is satisfied. Then, when the determination unit 140 determines that the amount of light of the visible light illumination device 300 cannot be controlled so that the size of the pupil satisfies the size condition, the determination unit 140 may send an instruction to transmit an uncontrollable notification to the notification unit 160. Further, the determination unit 140 may determine whether the additional illuminance calculated based on the illuminance size relationship is larger than the maximum value of the illuminance at which the glare condition is satisfied.
- if it is larger, the determination unit 140 may change the value of the additional illuminance to the calculated maximum value of the illuminance. Further, when the additional illuminance calculated based on the illuminance size relationship is larger than the maximum value of the illuminance at which the glare condition is satisfied, the determination unit 140 may send to the notification unit 160 an instruction to transmit a notification indicating that the glare condition cannot be satisfied.
- in that case, the determination unit 140 may determine that the illuminance of the eye portion can be controlled to an illuminance at which the pupil size satisfies the size condition.
- in that case, the determination unit 140 determines, regardless of whether the glare condition is satisfied, whether the illuminance of the eye portion can be controlled by the irradiation of light by the visible light illumination device 300 to an illuminance at which the pupil size satisfies the size condition. Further, even when the additional illuminance calculated based on the illuminance size relationship is larger than the maximum value of the illuminance at which the glare condition is satisfied, the determination unit 140 does not send an instruction to transmit a notification indicating that the glare condition cannot be satisfied.
- Control unit 150 controls the visible light illumination device 300 so that the visible light illumination device 300 irradiates the light amount determined by the determination unit 140.
- the control unit 150 receives from the determination unit 140 the set value of the visible light illumination device 300 for the visible light illumination device 300 to irradiate the amount of light determined by the determination unit 140.
- the control unit 150 sets the received set value in the visible light illumination device 300.
- the control unit 150 may send an instruction to perform imaging to the image pickup device 200 after setting the received set value in the visible light illumination device 300.
- the control unit 150 does not have to transmit an instruction to perform imaging to the imaging device 200.
- the determination unit 140 may send the calculated light amount (specifically, a value representing the light amount) to the control unit 150.
- the control unit 150 receives the amount of light from the determination unit 140.
- the control unit 150 may calculate, based on the received amount of light, the light distribution characteristics of the visible light illumination device 300, and the relationship between the set value of the brightness of the visible light illumination device 300 and the light emitted by the visible light illumination device 300, the set value at which the visible light illumination device 300 irradiates the received amount of light.
- (Notification unit 160) When the determination unit 140 sends an instruction to transmit an uncontrollable notification, the notification unit 160 receives the instruction from the determination unit 140. As described above, upon receiving the instruction, the notification unit 160 transmits the uncontrollable notification to a predetermined notification destination (for example, the notification destination device 500).
- the uncontrollable notification may be, for example, a predetermined signal. The uncontrollable notification may indicate the reason why the illuminance of the eye portion cannot be controlled to the target illuminance.
- the reason why the illuminance of the eye portion cannot be controlled to the target illuminance is, for example, that the illuminance of the eye portion due to natural light is too bright, or that the illuminance of the eye portion due to natural light is too dark.
- the notification unit 160 may receive an instruction from the determination unit 140 to transmit the above-mentioned notification indicating that the glare condition cannot be satisfied (hereinafter, referred to as a glare notification). Upon receiving the instruction to send the glare notification, the notification unit 160 transmits the glare notification to a predetermined notification destination.
- the glare notification may be a predetermined signal or the like, different from the uncontrollable notification.
- the image storage unit 170 stores the image and the imaging information acquired by the acquisition unit 110. For example, when the amount of light emitted by the visible light illuminating device 300 is controlled by the control unit 150 and the size of the pupil of the eye portion changes so as to satisfy the size condition, the size of the pupil satisfies the size condition in the image newly acquired by the acquisition unit 110 and stored in the image storage unit 170.
- the notification destination device 500 receives the uncontrollable notification transmitted by the notification unit 160.
- the notification destination device 500 produces an output in response to the received uncontrollable notification.
- the notification destination device 500 may be, for example, a terminal device held by the administrator of the imaging system 1.
- the notification destination device 500 may be, for example, a display device that displays information output from the control device 100.
- as an output corresponding to the uncontrollable notification, the notification destination device 500 may display, for example, an indication (for example, a message) that the visible light illumination device 300 cannot set the illuminance of the eye portion to an illuminance at which the pupil size satisfies the size condition.
- the notification destination device 500 may output, for example, a predetermined sound as an output in response to the uncontrollable notification.
- the notification destination device 500 may output, for example, a predetermined image or animation as an output in response to the uncontrollable notification.
- the notification destination device 500 may vibrate according to a predetermined pattern, for example, as an output in response to the uncontrollable notification.
- the notification destination device 500 may output a combination of two or more of the above examples as an output in response to the uncontrollable notification.
- the notification destination device 500 may further receive the glare notification transmitted by the notification unit 160.
- the notification destination device 500 outputs in response to the received glare notification.
- FIG. 3 is a flowchart showing an example of the operation of the control device 100 of the present embodiment.
- the acquisition unit 110 acquires an input image including the area of the eye portion (step S101).
- the estimation unit 120 estimates the illuminance of the eye portion from the acquired input image (step S102).
- the estimation unit 120 estimates the illuminance of the eye portion based on the pixel value of the sclera region in the eye region of the acquired input image.
- the determination unit 140 determines the amount of illumination light so that the size of the pupil of the eye portion satisfies the size condition (step S103).
- the determination unit 140 determines whether the amount of illumination can be controlled so that the size of the pupil satisfies the size condition (step S104).
- when the amount of illumination cannot be controlled so that the size of the pupil satisfies the size condition (NO in step S104), the control device 100 notifies, for example, the notification destination device 500.
- specifically, the determination unit 140 sends an instruction to transmit an uncontrollable notification to the notification unit 160.
- the notification unit 160 transmits the uncontrollable notification to the notification destination device 500.
- the control device 100 then performs the operation of step S106.
- if the amount of illumination can be controlled so that the size of the pupil satisfies the size condition (YES in step S104), the control device 100 then performs the operation of step S106.
- in step S106, the determination unit 140 determines whether the glare condition is satisfied when the visible light illumination device 300 irradiates the amount of light determined in step S103. When the glare condition is satisfied (YES in step S106), the control device 100 then performs the operation of step S108.
- when the glare condition is not satisfied (NO in step S106), the determination unit 140 changes the amount of light emitted by the visible light illumination device 300 so that the glare condition is satisfied (step S107). The control device 100 then performs the operation of step S108.
- in step S108, the control unit 150 controls the visible light illuminating device 300 so that the amount of light of the visible light illuminating device 300 becomes the determined amount of light.
- the determination unit 140 may determine the set value of the visible light illuminating device 300 based on the relationship between the set value and the amount of light, which is represented by, for example, a look-up table.
- the determination unit 140 transmits the determined set value to the control unit 150.
- the control unit 150 receives the set value from the determination unit 140 and sets the set value of the visible light illumination device 300 to the received set value.
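The flow of steps S101 through S108 above can be sketched as one control cycle in which every step is injected as a callable. All names are illustrative, and the notification and glare-capping behavior follows the flowchart description (notify on NO in S104, then continue; cap the light amount on NO in S106).

```python
def control_cycle(input_image, estimate_illuminance, determine_light,
                  controllable, glare_ok, cap_to_glare, set_lamp, notify):
    """One pass over steps S101-S108, with each step supplied as a callable."""
    lx = estimate_illuminance(input_image)   # S102: estimate eye illuminance
    amount = determine_light(lx)             # S103: light amount for the size condition
    if not controllable(amount):             # S104: within the device's range?
        notify("uncontrollable")             #        notify the destination device
    if not glare_ok(amount):                 # S106: glare condition satisfied?
        amount = cap_to_glare(amount)        # S107: reduce to satisfy glare condition
    set_lamp(amount)                         # S108: apply the determined amount
    return amount
```

For instance, wiring in stub callables where the determined amount violates the glare condition yields the capped amount at the lamp.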
- the present embodiment has an effect that the time required to obtain an iris image in a desired state can be shortened.
- the iris image in the desired state is, for example, an image of an iris in which the size of the pupil satisfies the size condition.
- the reason is that the estimation unit 120 estimates the illuminance of the eye portion from the acquired input image, and the determination unit 140 determines the amount of light to be irradiated by the visible light illumination device 300 so that the size of the pupil of the eye portion satisfies the size condition.
- when the control unit 150 controls the visible light illuminating device 300 so that the amount of light emitted by the visible light illuminating device 300 becomes the determined amount of light, the size of the pupil changes so that the size of the pupil of the eye portion satisfies the size condition. By imaging the eye portion after the pupil size has changed, an iris image in the desired state can be obtained.
- the image pickup system 1 of this modification is the same as the image pickup system of the first embodiment except for the difference in the control device 100 described below.
- the control device 100 of this modification is the same as the control device 100 of the first embodiment except for the following differences.
- the estimation unit 120 detects the iris region in the input image, and further estimates the iris color based on the pixel values of the pixels included in the detected iris region.
- the estimation unit 120 may estimate which of the plurality of predefined colors the iris color is, based on the pixel values of the pixels in the detected iris region.
- the estimation unit 120 sends information representing the estimated iris color to the determination unit 140.
- the relationship storage unit 130 stores the illuminance size relationship for each color of the iris.
- the relationship storage unit 130 may store, for example, a look-up table representing the relationship between the illuminance of the eye portion and the size of the pupil as the illuminance size relationship for each of the plurality of predefined colors.
- the determination unit 140 receives information representing the estimated iris color from the estimation unit 120.
- the determination unit 140 reads out the illuminance size relationship stored for the estimated iris color, and determines the amount of light emitted by the visible light illuminating device 300 based on the read illuminance size relationship.
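Selecting a per-color illuminance size relationship could be sketched as follows. The two-color table, its values, and the premise that darker irises leave a larger pupil at the same illuminance are illustrative assumptions for this sketch only.

```python
# Illustrative per-color illuminance size relationships:
# illuminance [lx] -> pupil diameter [mm], stored per predefined iris color.
RELATION_BY_COLOR = {
    "blue":  [(50, 5.5), (200, 4.0), (500, 3.0)],
    "brown": [(50, 6.0), (200, 4.8), (500, 3.8)],
}

def pupil_size_for(color: str, lx: float) -> float:
    """Interpolate the pupil diameter for the estimated iris color."""
    pts = RELATION_BY_COLOR[color]   # read the relationship for this color
    if lx <= pts[0][0]:
        return pts[0][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= lx <= x1:
            return y0 + (lx - x0) / (x1 - x0) * (y1 - y0)
    return pts[-1][1]
```

The determination unit would invert this per-color relationship, as in the earlier sketch, to find the target illuminance for the estimated color.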
- a glare condition is further obtained for each color of the iris.
- the glare condition set for each color of the iris may be stored in the relational storage unit 130.
- the determination unit 140 may read the glare condition for the estimated iris color from, for example, the relational storage unit 130.
- the determination unit 140 may store the glare condition for each color of the iris and select the glare condition for the estimated color of the iris. Based on the glare condition for the estimated iris color, the determination unit 140 may change the amount of light emitted by the visible light illuminating device 300 according to the above-mentioned degree of glare, and may issue an instruction to transmit a glare notification.
- the same glare condition may be set for each of the plurality of iris colors. Then, the relationship between the illuminance and the degree of glare may be defined for each color of the iris. In that case, the relationship storage unit 130 may store the relationship between the illuminance and the degree of glare for each color of the iris.
- the determination unit 140 may determine the degree of glare from the illuminance by using the relationship between the illuminance and the degree of glare for each color of the iris. Further, the determination unit 140 may use the relationship between the illuminance and the degree of glare for each color of the iris to determine, from a value of the degree of glare, the illuminance at which the degree of glare becomes that value.
- FIGS. 4 and 5 are flowcharts showing an example of the operation of the control device 100 of this modification.
- In steps S101 and S102 shown in FIG. 4, the control device 100 performs the same operations as steps S101 and S102 of the control device 100 of the first embodiment shown in FIG. 3.
- the estimation unit 120 estimates the color of the iris based on the input image (step S203).
- the estimation unit 120 may detect the iris region from the input image, for example.
- the estimation unit 120 may estimate the color of the iris from the pixel values of the pixels included in the iris region.
- the estimation unit 120 may estimate, for example, a color corresponding to the color of the iris in the detected region from a plurality of predetermined colors of the iris.
- the estimation unit 120 sends information representing the estimated iris color to the determination unit 140.
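- One plausible way to realize step S203 is to compare the mean pixel value of the detected iris region against a small palette of predetermined colors. This is a hypothetical sketch: the palette entries, their RGB values, and the nearest-color rule are invented for the example and are not prescribed by the specification.

```python
# Hypothetical palette of predetermined iris colors (RGB). Values are
# illustrative only.
PALETTE = {
    "blue":  (70, 110, 150),
    "green": (90, 130, 90),
    "brown": (110, 75, 50),
}

def estimate_iris_color(iris_pixels):
    """Return the palette color nearest (squared distance) to the mean pixel.

    iris_pixels: iterable of (R, G, B) tuples taken from the iris region.
    """
    n = len(iris_pixels)
    mean = tuple(sum(p[i] for p in iris_pixels) / n for i in range(3))
    return min(
        PALETTE,
        key=lambda c: sum((m - v) ** 2 for m, v in zip(mean, PALETTE[c])),
    )
```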
- the determination unit 140 receives information representing the color of the iris from the estimation unit 120.
- the determination unit 140 determines the glare condition based on the estimated color of the iris (step S204).
- the determination unit 140 may select a glare condition defined for the estimated iris color from a plurality of glare conditions.
- the determination unit 140 selects the illuminance size relationship based on the estimated iris color (step S205).
- the determination unit 140 may select an illuminance size relationship defined for the estimated iris color from a plurality of illuminance size relationships.
- The determination unit 140 determines the amount of light of the illumination (that is, of the visible light illuminating device 300) so that the size of the pupil of the eye portion satisfies the predetermined condition (that is, the above-mentioned size condition) (step S206).
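- The determination in step S206 can be reduced to a simple difference once a target illuminance is known: the ambient illuminance of the eye portion is already estimated, and the selected illuminance size relationship gives the illuminance at which the pupil falls into the required size range, so the illumination need only supply the remainder. The function names, the target-illuminance argument, and the size bounds below are assumptions made for illustration.

```python
# Hypothetical sketch of step S206. target_lux would come from inverting the
# selected illuminance size relationship; here it is passed in directly.
def required_light_amount(ambient_lux, target_lux):
    """Extra illuminance the visible light illumination must add (>= 0)."""
    return max(0.0, target_lux - ambient_lux)

def within_size_condition(pupil_mm, min_mm=2.0, max_mm=4.0):
    """Illustrative size condition used to validate the chosen light amount."""
    return min_mm <= pupil_mm <= max_mm
```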
- Next, the control device 100 performs the operation of step S104 of FIG. 5.
- The operation of the control device 100 of the present modification from step S104 to step S108 shown in FIG. 5 is the same as the operation of the control device 100 of the first embodiment from step S104 to step S108 shown in FIG. 3.
- As described above, the determination unit 140 may use the same glare condition regardless of the color of the iris, and may control the visible light illumination device 300 so that the illuminance of the eye portion becomes an illuminance at which the degree of glare, determined from the relationship between the illuminance and the degree of glare according to the color of the iris, satisfies the glare condition.
- the determination unit 140 selects the relationship between the illuminance and the degree of glare defined for the estimated iris color in step S204.
- Other operations are the same as the operation examples described above.
- This modified example has a second effect that an iris image in a desired state can be obtained regardless of the color of the iris.
- The reason is that the estimation unit 120 estimates the color of the iris from the input image, and the determination unit 140 determines the amount of light emitted by the visible light illuminating device 300 based on the illuminance size relationship according to the estimated iris color.
- This modified example has a third effect that glare can be appropriately reduced regardless of the color of the iris.
- the reason is that the determination unit 140 changes the amount of light emitted by the visible light illuminating device 300 based on the glare condition according to the estimated color of the iris.
- the determination unit 140 may change the amount of light emitted by the visible light illuminating device 300 based on the same glare conditions regardless of the color of the iris. In that case, the same effect as that of the first embodiment and the above-mentioned second effect can be obtained by this modification.
- the determination unit 140 may determine the amount of light emitted by the visible light illumination device 300 based on the same illuminance size relationship regardless of the color of the iris. In that case, the same effect as that of the first embodiment and the above-mentioned third effect can be obtained by this modification.
- FIG. 6 is a block diagram showing an example of the configuration of the imaging system 1A of this modified example.
- The imaging system 1A of this modification includes a control device 100, an imaging device 200, a visible light illumination device 300, a near-infrared light illumination device 400, a notification destination device 500, and a detection device 600.
- the detection device 600 detects a person, for example, in the above-mentioned walking path. When a person starts to be detected, the detection device 600 transmits a notification indicating that the person has started to be detected to the image pickup device 200 and the near-infrared light illumination device 400. When the person is no longer detected, the detection device 600 transmits a notification indicating that the person is no longer detected to the image pickup device 200 and the near-infrared light illumination device 400.
- the notification indicating that a person has begun to be detected (hereinafter, referred to as a detection notification) may be realized by transmitting a predetermined signal (hereinafter, also referred to as a detection signal).
- Similarly, the notification indicating that the person is no longer detected (hereinafter, a non-detection notification) may be realized by transmitting another predetermined signal (a non-detection signal).
- the detection notification may be realized by starting the transmission of a predetermined signal (hereinafter, referred to as a detection continuation signal).
- the non-detection notification may be realized by the end of the detection continuation signal transmission.
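- The edge-triggered protocol above (notify on the start of detection, notify again when detection ends) can be sketched as a small state machine. This is a hypothetical illustration; the class name, the use of plain lists as stand-ins for the imaging device and illumination device receivers, and the string event labels are all invented for the example.

```python
# Hypothetical sketch of the detection device's notification behavior.
class DetectionDevice:
    def __init__(self, receivers):
        self.receivers = receivers  # stand-ins for devices 200 and 400
        self.detected = False

    def update(self, person_present):
        """Emit a notification only on a change of detection state."""
        events = []
        if person_present and not self.detected:
            events.append("detection")
        elif not person_present and self.detected:
            events.append("non-detection")
        self.detected = person_present
        for event in events:
            for receiver in self.receivers:
                receiver.append(event)  # stand-in for signal transmission
        return events
```

Between the two notifications the imaging device keeps imaging and the near-infrared illumination keeps irradiating; outside that interval both may be suspended.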
- the detection device 600 may be realized by a sensor that detects a person or the like by, for example, near-infrared light or ultrasonic waves.
- the imaging device 200 receives a detection notification and a non-detection notification from the detection device 600.
- the image pickup apparatus 200 continuously takes an image from the time when the detection notification is received from the detection apparatus 600 until the non-detection notification is received, and transmits the image obtained by the imaging to the control device 100.
- the image pickup apparatus 200 may interrupt the imaging and the transmission of the image until the next detection notification is received.
- the near-infrared light illumination device 400 receives a detection notification and a non-detection notification from the detection device 600.
- the near-infrared light illumination device 400 continuously irradiates near-infrared light from the time when the detection notification is received from the detection device 600 to the time when the non-detection notification is received.
- the near-infrared light illumination device 400 may suspend the irradiation of the near-infrared light until the next detection notification is received.
- the imaging system 1B of the present modification is the same as the imaging system 1A of the second modification, except for the differences described below.
- FIG. 7 is a block diagram showing an example of the configuration of the imaging system 1B of this modified example.
- Like the imaging system 1A of the second modification, the imaging system 1B of this modification includes the control device 100, the imaging device 200, the visible light illumination device 300, the near-infrared light illumination device 400, the notification destination device 500, and the detection device 600.
- In FIG. 7, two imaging devices 200 are drawn. The imaging device 200 drawn above the control device 100 and the imaging device 200 drawn below the control device 100 are the same device, depicted twice.
- the image pickup device 200 is connected to each of the acquisition unit 110 and the control unit 150 of the control device 100.
- the near-infrared light illumination device 400 is connected to the control unit 150 of the control device 100.
- the detection device 600 is connected to the acquisition unit 110 of the control device 100.
- the detection device 600 detects a person, for example, in the above-mentioned walking path. In this modification, the detection device 600 transmits the detection notification and the non-detection notification to the acquisition unit 110 of the control device 100.
- the acquisition unit 110 receives the detection notification and the non-detection notification from the detection device 600.
- the acquisition unit 110 sends an instruction to transmit the detection notification to the image pickup device 200 and the near-infrared light illumination device 400 to the control unit 150.
- the acquisition unit 110 sends an instruction to transmit the non-detection notification to the image pickup device 200 and the near-infrared light illumination device 400 to the control unit 150.
- the control unit 150 receives an instruction to send a detection notification and an instruction to send a non-detection notification from the acquisition unit 110.
- the control unit 150 transmits the detection notification to the image pickup apparatus 200 and the near-infrared light illumination apparatus 400.
- the control unit 150 transmits the non-detection notification to the image pickup apparatus 200 and the near-infrared light illumination apparatus 400.
- the imaging device 200 receives a detection notification and a non-detection notification from the control unit 150.
- the image pickup apparatus 200 continuously performs imaging from the time when the detection notification is received from the control unit 150 to the time when the non-detection notification is received, and transmits the image obtained by the imaging to the control device 100.
- the image pickup apparatus 200 may interrupt the imaging and the transmission of the image until the next detection notification is received.
- the near-infrared light illumination device 400 receives a detection notification and a non-detection notification from the control unit 150.
- the near-infrared light illumination device 400 continuously irradiates near-infrared light from the time when the detection notification is received from the control unit 150 to the time when the non-detection notification is received.
- the near-infrared light illumination device 400 may suspend the irradiation of the near-infrared light until the next detection notification is received.
- the imaging system 1C of this modification is the same as the imaging system 1 of the first embodiment except for the differences described below.
- FIG. 8 is a block diagram showing the configuration of the imaging system 1C of this modified example.
- the image pickup system 1C of FIG. 8 includes an image pickup device 200A and an image pickup device 200B instead of the image pickup device 200.
- the near-infrared light illumination device 400 is connected to the image pickup device 200B.
- the image pickup device 200A and the image pickup device 200B are attached so as to be able to image the head of a person walking along the above-mentioned walking path, for example.
- The focus position of the imaging device 200A and the focus position of the imaging device 200B are fixed. They are set so that the head of a person walking along the walking path passes through the focus position of the imaging device 200A and then passes through the focus position of the imaging device 200B.
- the image pickup apparatus 200A and the image pickup apparatus 200B may continuously perform imaging.
- the near-infrared light illumination device 400 may continuously irradiate near-infrared light.
- the control unit 150 controls the visible light illumination device 300 so that the amount of light emitted by the visible light illumination device 300 becomes a determined amount of light, and then transmits an instruction to the image pickup device 200B to perform imaging.
- the image pickup apparatus 200B may start imaging according to the instruction from the control unit 150.
- the image pickup apparatus 200B may transmit a notification to start imaging to the near-infrared light illumination apparatus 400.
- When the imaging device 200B ends the imaging, it may transmit a notification to end the imaging to the near-infrared light illumination device 400.
- the near-infrared light illumination device 400 may receive a notification for starting imaging and a notification for ending imaging from the imaging device 200B.
- The near-infrared light illuminating device 400 may continue to irradiate near-infrared light while the imaging device 200B is performing imaging, for example, from receiving the notification to start imaging until receiving the notification to end imaging.
- the acquisition unit 110 of this modification receives the image captured by the image pickup device 200A.
- The control device 100 of this modification performs the same operation as the control device 100 of the first embodiment shown in FIG. 3.
- The control unit 150 of this modification may transmit an instruction to perform imaging to the imaging device 200B after controlling the illumination in step S108, that is, after setting the brightness setting value of the visible light illumination device 300.
- The imaging device 200B may start imaging upon receiving the instruction to perform imaging.
- The fifth modification is a combination of the fourth modification with at least one of the first modification, the second modification, and the third modification.
- the image pickup apparatus 200 does not have to be connected to the near-infrared light illumination apparatus 400. Even when the first modification and the fourth modification are combined, the image pickup apparatus 200 does not have to be connected to the near-infrared light illumination apparatus 400.
- The configuration of the imaging system of the fifth modification may be the same as the configuration of the imaging system 1C of the fourth modification shown in FIG. 8.
- The detection device 600 may be connected to the imaging device 200 and the near-infrared light illumination device 400, as in the example shown in FIG. 6. The detection notification and the non-detection notification are then transmitted from the detection device 600 to the imaging device 200 and the near-infrared light illumination device 400.
- the control unit 150 is connected to the image pickup apparatus 200 and the near-infrared light illumination apparatus 400, as in the example shown in FIG. 7. Then, the detection notification and the non-detection notification are transmitted from the control unit 150 to the image pickup apparatus 200 and the near-infrared light illumination apparatus 400.
- the image pickup apparatus 200B does not have to be connected to the near-infrared light illumination apparatus 400.
- the detection device 600 is connected to the image pickup device 200A, the image pickup device 200B, and the near-infrared light illumination device 400. Then, the detection notification and the non-detection notification are transmitted from the detection device 600 to the image pickup device 200A, the image pickup device 200B, and the near infrared light illumination device 400.
- the image pickup apparatus 200A and the image pickup apparatus 200B may continuously perform imaging from the reception of the detection notification to the reception of the non-detection notification.
- the near-infrared light illumination device 400 may continuously irradiate near-infrared light from the time when the detection notification is received until the time when the non-detection notification is received. The same applies to the case where the first modification, the second modification, and the fourth modification are combined.
- The control unit 150 may be connected to the imaging device 200A, the imaging device 200B, and the near-infrared light illumination device 400. Then, the detection notification and the non-detection notification may be transmitted from the control unit 150 to the imaging device 200A, the imaging device 200B, and the near-infrared light illumination device 400.
- the image pickup apparatus 200A and the image pickup apparatus 200B may continuously perform imaging from the reception of the detection notification to the reception of the non-detection notification.
- the near-infrared light illumination device 400 may continuously irradiate near-infrared light from the time when the detection notification is received until the time when the non-detection notification is received. The same applies to the case where the first modification, the third modification, and the fourth modification are combined.
- the detection device 600 may be connected to the image pickup device 200A. Then, the detection notification and the non-detection notification may be transmitted from the detection device 600 to the image pickup device 200A.
- the image pickup apparatus 200A may continuously perform imaging from the reception of the detection notification to the reception of the non-detection notification.
- the image pickup apparatus 200B may continuously perform imaging, and the near-infrared light illumination apparatus 400 may continuously irradiate near-infrared light.
- the control unit 150 may transmit an instruction to start imaging to the imaging device 200B.
- the image pickup apparatus 200B may start imaging according to an instruction from the control unit 150.
- the image pickup apparatus 200B may transmit a notification to the near-infrared light illumination apparatus 400 when the image pickup is started. Upon receiving the notification, the near-infrared light illuminator 400 may start irradiating the near-infrared light. The same applies to the case where the first modification, the second modification, and the fourth modification are combined.
- In the present embodiment, the size of the pupil of the imaged eye region is estimated from the image captured by the imaging device 200, and the illuminance of the eye portion is estimated based on the size of the pupil.
- FIG. 9 is a block diagram showing an example of the configuration of the imaging system 2 of the present embodiment.
- the imaging system 2 is the same as the imaging system 1 of the first embodiment except for the differences described below.
- the imaging system 2 includes a control device 101 instead of the control device 100.
- the control device 101 is the same as the control device 100 of the first embodiment except for the differences described below.
- the estimation unit 120 is connected to the relation storage unit 130.
- the estimation unit 120 of the present embodiment is the same as the estimation unit 120 of the first embodiment, except for the differences described below.
- the estimation unit 120 extracts the pupil region from the image (that is, the input image) acquired by the acquisition unit 110.
- The estimation unit 120 may extract the pupil region according to any of various existing methods of extracting the pupil region.
- The estimation unit 120 estimates the size of the pupil of the eye region based on the imaging information (in particular, the camera parameters) of the imaging device 200. Specifically, the estimation unit 120 may estimate the size of the pupil based on the focus position of the imaging device 200, the angle of view of the imaging device 200, the size of the image captured by the imaging device 200 (that is, the numbers of vertical and horizontal pixels), and the number of pixels in the diameter of the pupil region in the input image.
- the estimation unit 120 may read the illuminance size relationship from the relationship storage unit 130.
- the estimation unit 120 estimates the illuminance of the eye region from the estimated pupil size based on the illuminance size relationship.
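- The two estimation steps above can be sketched as follows, under stated assumptions: a pinhole-style model converts the pupil's pixel diameter to millimeters using the focus distance, horizontal angle of view, and image width, and a small monotone table stands in for the stored illuminance size relationship, which is inverted by linear interpolation. The numeric values are invented for the example.

```python
import math

def pupil_diameter_mm(focus_distance_mm, fov_deg, image_width_px, pupil_px):
    """Physical pupil diameter from camera parameters and pixel diameter."""
    # Width of the scene in focus: 2 * d * tan(half the angle of view).
    field_width_mm = 2 * focus_distance_mm * math.tan(math.radians(fov_deg) / 2)
    return pupil_px * field_width_mm / image_width_px

# Hypothetical illuminance size relationship: (lux, pupil diameter mm).
ILLUMINANCE_SIZE = [(10, 6.5), (100, 4.8), (1000, 3.2), (10000, 2.3)]

def illuminance_from_pupil(diameter_mm):
    """Invert the monotone decreasing size table by linear interpolation."""
    table = sorted(ILLUMINANCE_SIZE, key=lambda p: p[1])  # ascending by size
    if diameter_mm <= table[0][1]:
        return table[0][0]
    if diameter_mm >= table[-1][1]:
        return table[-1][0]
    for (x0, y0), (x1, y1) in zip(table, table[1:]):
        if y0 <= diameter_mm <= y1:
            t = (diameter_mm - y0) / (y1 - y0)
            return x0 + t * (x1 - x0)
```

For example, with a 500 mm focus distance, a 10 degree angle of view, and a 1000-pixel-wide image, an 80-pixel pupil corresponds to roughly 7 mm, from which the table yields a low estimated illuminance.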
- the control device 101 of the present embodiment performs the same operation as the operation of the control device 100 of the first embodiment shown in FIG. 3, except for the following differences.
- the estimation unit 120 extracts the pupil region from the acquired input image.
- the estimation unit 120 estimates the size of the extracted pupil based on the imaging information.
- step S102 of the control device 101 of the present embodiment is the same as the operation of the step to which the same reference numeral is given in the control device 100 of the first embodiment.
- FIG. 10 is a block diagram showing an example of the configuration of the control device 102 according to the present embodiment.
- the control device 102 includes an acquisition unit 110, an estimation unit 120, and a determination unit 140.
- the acquisition unit 110 acquires an input image including the eye region, which is the region of the eye portion.
- the estimation unit 120 estimates the illuminance of the eye portion from the acquired input image.
- The determination unit 140 determines the amount of visible light illuminating the eye portion so that the pupil size of the eye portion satisfies the size condition.
- FIG. 11 is a flowchart showing an example of the operation of the control device 102 of the present embodiment.
- the acquisition unit 110 acquires an input image including the area of the eye portion (step S201).
- the acquisition unit 110 may acquire an input image from, for example, the above-mentioned image pickup device 200 or an image pickup device equivalent to the image pickup device 200A.
- the estimation unit 120 estimates the illuminance of the eye portion from the acquired input image (step S202).
- the method in which the estimation unit 120 estimates the illuminance of the eye portion may be the same as the method in which the estimation unit 120 of the first embodiment estimates the illuminance of the eye portion.
- the method in which the estimation unit 120 estimates the illuminance of the eye portion may be the same as the method in which the estimation unit 120 of the second embodiment estimates the illuminance of the eye portion.
- the determination unit 140 determines the amount of visible light illumination so that the size of the pupil of the eye portion satisfies the size condition (step S203).
- the visible light illumination may be, for example, the same device as the visible light illumination device 300 in the first embodiment and the second embodiment.
- The method by which the determination unit 140 determines the amount of light of the visible light illumination may be the same as the method by which the determination unit 140 of the first embodiment and the determination unit 140 of the second embodiment determine it. In that case, the determination unit 140 may hold the above-mentioned illuminance size relationship.
- the present embodiment has the same effect as that of the first embodiment.
- the reason is the same as the reason why the effect of the first embodiment occurs.
- the control device 100, the control device 101, and the control device 102 can be realized by a computer including a memory in which a program read from a storage medium is loaded and a processor that executes the program.
- the control device 100, the control device 101, and the control device 102 can also be realized by dedicated hardware.
- the control device 100, the control device 101, and the control device 102 can also be realized by a combination of the above-mentioned computer and dedicated hardware.
- FIG. 12 is a diagram showing an example of a hardware configuration of a computer 1000 capable of realizing the control device (that is, the control device 100, the control device 101, and the control device 102) according to the above-described embodiment.
- the computer 1000 shown in FIG. 12 includes a processor 1001, a memory 1002, a storage device 1003, and an I / O (Input / Output) interface 1004.
- the computer 1000 can access the storage medium 1005.
- the memory 1002 and the storage device 1003 are storage devices such as a RAM (Random Access Memory) and a hard disk, for example.
- the storage medium 1005 is, for example, a storage device such as a RAM or a hard disk, a ROM (Read Only Memory), or a portable storage medium.
- the storage device 1003 may be a storage medium 1005.
- the processor 1001 can read and write data and programs to the memory 1002 and the storage device 1003.
- The processor 1001 can access, for example, the imaging device 200, the imaging device 200A, the imaging device 200B, the visible light illumination device 300, the near-infrared light illumination device 400, the notification destination device 500, and the detection device 600 via the I/O interface 1004.
- the processor 1001 can access the storage medium 1005.
- the storage medium 1005 stores a program for operating the computer 1000 as a control device according to the above-described embodiment.
- the processor 1001 loads the memory 1002 with a program stored in the storage medium 1005 that causes the computer 1000 to operate as the control device according to the above-described embodiment. Then, when the processor 1001 executes the program loaded in the memory 1002, the computer 1000 operates as the control device according to the above-described embodiment.
- the acquisition unit 110, the estimation unit 120, the determination unit 140, the control unit 150, and the notification unit 160 can be realized by, for example, a processor 1001 that executes a program loaded in the memory 1002.
- the relationship storage unit 130 and the image storage unit 170 can be realized by a storage device 1003 such as a memory 1002 included in the computer 1000 or a hard disk device.
- A part or all of the acquisition unit 110, the estimation unit 120, the relationship storage unit 130, the determination unit 140, the control unit 150, the notification unit 160, and the image storage unit 170 can also be realized by dedicated circuits that implement the functions of the respective units.
- a control device comprising.
- (Appendix 2) The control device according to Appendix 1, wherein the estimation means detects a sclera region of the eye region from the input image and estimates the illuminance of the eye region based on the information of the detected pixels of the sclera region.
- The estimation means detects the pupil region of the eye region from the input image, estimates the pupil size of the eye portion based on the detected pupil region, and the control device estimates the illuminance of the eye portion based on the estimated pupil size and the illuminance size relationship.
- the estimation means estimates the color of the iris based on the acquired input image.
- the control device according to Appendix 4, wherein the determination means selects the glare condition according to the estimated color of the iris.
- the estimation means estimates the color of the iris based on the acquired input image.
- The control device according to any one of Appendices 1 to 4, wherein the determining means selects the illuminance size relationship based on the estimated color of the iris from a plurality of illuminance size relationships according to the color of the iris, and determines the amount of light of the illumination based on the selected illuminance size relationship.
- the determining means determines whether it is possible to control the amount of light of the illumination so that the size of the pupil satisfies the size condition.
- The control device according to any one of Appendices 1 to 6, further comprising a notification means for notifying when it is determined that the amount of light of the illumination cannot be controlled so that the size of the pupil satisfies the size condition.
- (Appendix 10) A control method comprising: acquiring an input image including the eye region, which is the region of the eye portion; estimating the illuminance of the eye portion from the acquired input image; and determining, based on the illuminance size relationship, which is the relationship between the illuminance and the size of the pupil, the amount of visible light illuminating the eye portion so that the size of the pupil of the eye portion satisfies the size condition.
- Appendix 11 The control method according to Appendix 10, wherein the sclera region of the eye region is detected from the input image, and the illuminance of the eye portion is estimated based on the information of the detected pixels of the sclera region.
- (Appendix 12) The control method according to Appendix 10, wherein the pupil region of the eye region is detected from the input image, the pupil size of the eye portion is estimated based on the detected pupil region, and the illuminance of the eye portion is estimated based on the estimated pupil size and the illuminance size relationship.
- (Appendix 13) The control method according to any one of Appendices 10 to 12, wherein the amount of light of the illumination is determined so that the index of glare determined based on the illuminance of the eye region satisfies the glare condition.
- (Appendix 14) The control method according to Appendix 13, wherein the color of the iris is estimated based on the acquired input image, and the glare condition is selected according to the estimated color of the iris.
- (Appendix 15) The control method according to any one of Appendices 10 to 13, wherein the color of the iris is estimated, the illuminance size relationship is selected based on the estimated color of the iris from a plurality of illuminance size relationships according to the color of the iris, and the amount of light of the illumination is determined based on the selected illuminance size relationship.
- (Appendix 16) The control method according to any one of Appendices 10 to 15, wherein it is determined, based on the estimated illuminance and the illuminance size relationship, whether it is possible to control the amount of light of the illumination so that the size of the pupil satisfies the size condition, and a notification is given when it is determined that the amount of light of the illumination cannot be so controlled.
- (Appendix 17) The control method according to any one of Appendices 10 to 16, wherein the illumination is controlled so as to irradiate visible light with the determined amount of light.
- a storage medium that stores a program that causes a computer to execute the processes described above.
- (Appendix 19) The storage medium according to Appendix 18, wherein the estimation process detects the sclera region of the eye region from the input image and estimates the illuminance of the eye portion based on the pixel information of the detected sclera region.
- (Appendix 20) The storage medium according to Appendix 18, wherein the estimation process detects the pupil region of the eye region from the input image, estimates the pupil size of the eye portion based on the detected pupil region, and estimates the illuminance of the eye portion based on the estimated pupil size and the illuminance size relationship.
- (Appendix 21) The storage medium according to any one of Appendices 18 to 20, wherein the determination process determines the amount of light of the illumination so that the index of glare determined based on the illuminance of the eye region satisfies the glare condition.
- Appendix 22 The storage medium according to Appendix 21, wherein the estimation process estimates the color of the iris based on the acquired input image, and the determination process selects the glare condition according to the estimated color of the iris.
- Appendix 23 The storage medium according to any one of Appendix 18 to 21, wherein the estimation process estimates the color of the iris based on the acquired input image, and the determination process selects the illuminance size relationship, based on the estimated color of the iris, from a plurality of illuminance size relationships according to the color of the iris, and determines the amount of light of the illumination based on the selected illuminance size relationship.
- Appendix 24 The storage medium according to any one of Appendix 18 to 23, wherein the determination process determines, based on the estimated illuminance and the illuminance size relationship, whether it is possible to control the amount of light of the illumination so that the size of the pupil satisfies the size condition, and the program causes the computer to further execute a notification process of performing a notification when it is determined that the amount of light of the illumination cannot be controlled so that the size of the pupil satisfies the size condition.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Ophthalmology & Optometry (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Biomedical Technology (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Biophysics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Pathology (AREA)
- Human Computer Interaction (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Dentistry (AREA)
- Signal Processing (AREA)
- Geometry (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- Studio Devices (AREA)
- Eye Examination Apparatus (AREA)
- Collating Specific Patterns (AREA)
Abstract
Description
First, a first example embodiment of the present disclosure will be described in detail with reference to the drawings.
Fig. 1 is a block diagram showing an example of a configuration of the imaging system according to the present example embodiment.
The imaging device 200 is a camera capable of capturing moving images and still images (hereinafter collectively referred to as images). The imaging device 200 transmits the captured images to the control device 100. The imaging device 200 may include an image sensor such as a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal-Oxide-Semiconductor) image sensor. The image sensor of the imaging device 200 is sensitive to a wavelength band that includes at least part of the near-infrared band (for example, 780 nm (nanometers) to 1000 nm). The imaging device 200 may be, for example, a visible-light camera equipped with an image sensor to which no IR (Infrared) cut filter is attached. In that case, the imaging device 200 may be a camera capable of capturing color images. The imaging device 200 may be a camera capable of capturing grayscale images. The imaging device 200 may also be a near-infrared camera.
The visible light illumination device 300 is an illumination that irradiates visible light. The visible light illumination device 300 is configured so that the amount of light it irradiates can be controlled by another device such as the control device 100. In other words, the visible light illumination device 300 is configured so that the intensity of the irradiated light can be changed under the control of the control device 100. The visible light illumination device 300 may be mounted so as to irradiate the eye portion, imaged by the imaging device 200, of the face of a person imaged by the imaging device 200 (for example, a person walking along a walking path). Note that the eye portion refers to the range of a person's face that includes the eyes. Specifically, the visible light illumination device 300 is mounted so that the light it irradiates enters the pupils of the person imaged by the imaging device 200. The visible light illumination device 300 may be attached to the imaging device 200. The visible light illumination device 300 may be mounted facing the same direction as the optical axis of the imaging device 200. The visible light illumination device 300 may be mounted near the imaging device 200, facing a direction similar to that of the optical axis of the imaging device 200.
The near-infrared light illumination device 400 is a device that irradiates near-infrared light. The near-infrared light illumination device 400 may be installed so as to illuminate the position where the imaging device 200 is expected to perform imaging. The near-infrared light illumination device 400 may be configured so that, within the range where the face (particularly the eye portion) of the person imaged by the imaging device 200 can exist, the intensity of the irradiated near-infrared light falls within a predetermined range (for example, an intensity that does not damage the retina). For example, when the focus position of the imaging device 200 is fixed, the device may be configured so that the intensity of the irradiated near-infrared light falls within the predetermined range over the part of the focal plane through which a person's face can pass. The near-infrared light illumination device 400 may be implemented by a plurality of near-infrared light illumination devices arranged side by side. The near-infrared light illumination device 400 may be configured to irradiate near-infrared light continuously after the imaging system 1 starts operating, or to irradiate near-infrared light while the imaging device 200 is performing imaging. The near-infrared light illumination device 400 may be attached to the imaging device 200, or mounted near the imaging device 200, facing the same or a similar direction as the optical axis of the imaging device 200. The near-infrared light illumination device 400 may be connected to the above-described object detection sensor and configured to irradiate near-infrared light when a moving object is detected on the walking path by that sensor. In the following description of the present example embodiment, the near-infrared light illumination device 400 is configured to continue irradiating near-infrared light regardless of whether the imaging device 200 is performing imaging.
The control device 100 includes an acquisition unit 110, an estimation unit 120, a relation storage unit 130, a determination unit 140, a control unit 150, and a notification unit 160. The control device 100 may further include an image storage unit 170.
The acquisition unit 110 acquires, from the imaging device 200, an image obtained by the imaging device 200 through imaging (specifically, the data of that image). The acquisition unit 110 further acquires imaging information from the imaging device 200. When the acquisition unit 110 acquires a still image from the imaging device 200, it sends the still image and the imaging information to the estimation unit 120. When the acquisition unit 110 acquires a moving image from the imaging device 200, it may generate, for example, still-image data for each frame from the acquired moving-image data, and then send the generated still-image data and the imaging information to the estimation unit 120.
The estimation unit 120 receives an image (specifically, image data) and imaging information from the acquisition unit 110. Based on the received imaging information, the estimation unit 120 estimates the illuminance at the eye portion from the received image. As described above, the eye portion refers to the region of the face that includes the eyes. In the present example embodiment, the eye portion includes at least the iris, which contains the pupil, and the sclera. In the present example embodiment, the illuminance of the pupil is regarded as being the same as the illuminance of the eye portion. In other words, in the present example embodiment, the illuminance of the pupil is regarded as being the same as the illuminance of the sclera.
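To make the estimation step above concrete, the following is a minimal sketch of converting sclera pixel values into an illuminance estimate. The gamma model, the sclera reflectance value, and the function name are assumptions introduced here for illustration; they are not taken from this publication.

```python
import math

def estimate_eye_illuminance(sclera_pixels, exposure_scale=1.0, sclera_reflectance=0.6):
    """Estimate eye-portion illuminance (lux) from 8-bit sclera pixel values.

    Hypothetical model: linearize pixel values with a gamma of 2.2, scale by
    the camera exposure to obtain luminance, then invert the Lambertian
    relation L = R * E / pi to recover the illuminance E.
    """
    if not sclera_pixels:
        raise ValueError("no sclera pixels detected")
    # Mean linearized intensity in [0, 1].
    linear = sum((p / 255.0) ** 2.2 for p in sclera_pixels) / len(sclera_pixels)
    # Luminance under the assumed exposure scaling.
    luminance = linear * exposure_scale
    # Lambertian surface: luminance = reflectance * illuminance / pi.
    return math.pi * luminance / sclera_reflectance
```

A brighter sclera region yields a higher illuminance estimate; the actual conversion in the publication goes through the imaging information (aperture value, shutter speed) and an LUT, as described later.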
(Reference 1) Imari Sato, Yoichi Sato, Katsushi Ikeuchi, "Estimation of Light Source Environment Based on Shadows of Objects", IPSJ Transactions on Computer Vision and Image Media, Vol. 41, No. SIG 10 (CVIM 1), December 2000.
(Reference 2) Takeshi Oishi, Sonoko Okura, Rei Kawakami, Takahiko Sakano, Katsushi Ikeuchi, "Superimposition of a Person Model onto an MR System Based on Simultaneous Capture of the Light Source Environment and the Object Using an Omnidirectional Camera", Meeting on Image Recognition and Understanding (MIRU2009), July 2009.
Next, a modification of the estimation unit 120 will be described. When calculating absolute luminance using the LUT, the estimation unit 120 of this modification may calculate the absolute luminance as described below. Except where it operates as described below, the estimation unit 120 of this modification may operate in the same way as the estimation unit 120 described above. When the LUT contains no tristimulus value that is associated with the received aperture value and shutter speed and matches the tristimulus value calculated from the pixel value, the estimation unit 120 extracts from the LUT, for example, the two tristimulus values closest to the calculated tristimulus value. The estimation unit 120 may then calculate the absolute luminance for the calculated tristimulus value from the absolute luminances associated with the extracted tristimulus values, by interpolation based on the differences between the extracted tristimulus values and the calculated tristimulus value. For example, suppose that, in the LUT, the two tristimulus values associated with the received aperture value and shutter speed that are closest to the calculated tristimulus value Ya are Yb and Yc, the absolute luminance associated with Yb is Lb, and the absolute luminance associated with Yc is Lc. In this case, the estimation unit 120 may calculate the absolute luminance La of the target corresponding to the pixel whose pixel value yielded the tristimulus value Ya by, for example, the formula La = Lb + (Lc - Lb) × (Ya - Yb) / (Yc - Yb).
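The interpolation formula above can be expressed directly in code. This sketch assumes the LUT has already been restricted to the entries associated with the received aperture value and shutter speed, and that it maps tristimulus values Y to absolute luminances L; the dictionary representation is an assumption for illustration.

```python
def absolute_luminance(ya, lut):
    """Look up, or linearly interpolate, the absolute luminance for the
    calculated tristimulus value ya from a LUT of {Y: L} entries."""
    if ya in lut:
        return lut[ya]
    # Extract the two tristimulus values in the LUT closest to ya.
    yb, yc = sorted(lut, key=lambda y: abs(y - ya))[:2]
    lb, lc = lut[yb], lut[yc]
    # La = Lb + (Lc - Lb) * (Ya - Yb) / (Yc - Yb)
    return lb + (lc - lb) * (ya - yb) / (yc - yb)
```

For example, with LUT entries {10: 100.0, 20: 200.0}, a calculated tristimulus value of 15 falls between the two entries and interpolates to 150.0.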
The relation storage unit 130 stores the relationship between pupil size and illuminance. The size of the human pupil (that is, the pupil size) varies between about 2 mm (millimeters) and about 8 mm depending on the illuminance at the eye portion. The relation storage unit 130 stores data representing the relationship between pupil size and illuminance, determined experimentally, for example. The data representing the relationship between illuminance and pupil stored in the relation storage unit 130 is, for example, data from which the pupil size at a given illuminance of the eye portion can be obtained from that illuminance, and from which the illuminance corresponding to a given pupil size can be obtained from that pupil size. In the following, the data representing the relationship between illuminance and pupil is also referred to as the illuminance size relation.
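A tabulated illuminance size relation can be queried in both directions, as the paragraph above describes. The sample points below are hypothetical placeholders; an actual table would be determined experimentally, as the text notes.

```python
import bisect

# Hypothetical sample points: (illuminance in lux, pupil diameter in mm).
# Pupil diameter shrinks from about 8 mm in darkness to about 2 mm in bright light.
RELATION = [(1, 8.0), (10, 6.5), (100, 4.8), (1000, 3.2), (10000, 2.0)]

def pupil_size_from_illuminance(lux):
    """Piecewise-linear interpolation of pupil size from illuminance."""
    xs = [e for e, _ in RELATION]
    ys = [s for _, s in RELATION]
    if lux <= xs[0]:
        return ys[0]
    if lux >= xs[-1]:
        return ys[-1]
    i = bisect.bisect_right(xs, lux)
    x0, x1, y0, y1 = xs[i - 1], xs[i], ys[i - 1], ys[i]
    return y0 + (y1 - y0) * (lux - x0) / (x1 - x0)

def illuminance_from_pupil_size(mm):
    """Inverse lookup: pupil size decreases monotonically with illuminance."""
    xs = [e for e, _ in RELATION]
    ys = [s for _, s in RELATION]
    if mm >= ys[0]:
        return xs[0]
    if mm <= ys[-1]:
        return xs[-1]
    for i in range(1, len(ys)):
        if ys[i] <= mm:  # mm lies between ys[i - 1] and ys[i]
            x0, x1, y0, y1 = xs[i - 1], xs[i], ys[i - 1], ys[i]
            return x0 + (x1 - x0) * (mm - y0) / (y1 - y0)
```

The round trip between the two lookups is consistent at the tabulated points, which mirrors the requirement that the stored data be usable in both directions.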
The determination unit 140 receives the illuminance of the eye portion from the estimation unit 120. Based on the illuminance of the eye portion and the illuminance size relation, the determination unit 140 determines the amount of light of the illumination to be irradiated onto the eye portion so that the size of the pupil satisfies a predetermined condition (hereinafter referred to as the size condition). The size condition may be represented, for example, by a target value of the pupil size. The size condition may also be represented by a range of pupil sizes. In that case, the determination unit 140 may determine a target value of the pupil size by a predetermined method based on the size condition. For example, the determination unit 140 may determine the center value of the size range represented by the size condition as the target value. The determination unit 140 may determine the largest value of the range represented by the size condition as the target value. The determination unit 140 may determine the smallest value of the range represented by the size condition as the target value. The determination unit 140 may determine some other value as the target value.
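The selection of a target pupil size from a size condition can be sketched as follows; the strategy names and the function name are illustrative and not taken from the source.

```python
def target_pupil_size(size_condition, strategy="center"):
    """Pick a target pupil size (mm) from a size condition.

    size_condition is either a single target value (a number) or a
    (minimum, maximum) range; the strategy selects a point in the range.
    """
    if isinstance(size_condition, (int, float)):
        return float(size_condition)
    low, high = size_condition
    if strategy == "center":
        return (low + high) / 2.0
    if strategy == "max":
        return float(high)
    if strategy == "min":
        return float(low)
    raise ValueError(f"unknown strategy: {strategy}")
```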
The determination unit 140 may change the amount of light irradiated by the visible light illumination device 300 so that a glare condition is satisfied. The glare condition is, for example, that the glare felt by the person irradiated with light by the visible light illumination device 300 does not exceed a predetermined standard.
The determination unit 140 determines whether the amount of light of the visible light illumination device 300 can be controlled so that the size of the pupil satisfies the size condition. In other words, the determination unit 140 may determine whether the illuminance of the eye portion can be controlled, by irradiation of light from the visible light illumination device 300, to an illuminance at which the size of the pupil satisfies the size condition.
Note that the determination unit 140 may be configured to determine that the illuminance of the eye portion cannot be controlled to an illuminance at which the size of the pupil satisfies the size condition when the additional illuminance calculated based on the illuminance size relation is greater than the maximum illuminance at which the glare condition is satisfied. In this case, the determination unit 140 may send an instruction to transmit a non-settable notification to the notification unit 160.
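The decision described in the preceding paragraphs can be sketched as computing the additional illuminance needed to reach the illuminance for the target pupil size, then comparing it with the largest additional illuminance the glare condition allows. The helper name, the units, and the treatment of the already-too-bright case are assumptions for illustration.

```python
def decide_light_amount(current_lux, required_lux, glare_max_extra_lux):
    """Return the additional illuminance to irradiate, or None when the
    size condition cannot be met without violating the glare condition.

    current_lux: estimated illuminance of the eye portion.
    required_lux: illuminance at which the pupil meets the size condition.
    glare_max_extra_lux: largest additional illuminance satisfying the
    glare condition.
    """
    extra = required_lux - current_lux
    if extra < 0:
        return None  # already too bright; added illumination cannot dim the scene
    if extra > glare_max_extra_lux:
        return None  # would violate the glare condition: notify instead
    return extra
```

When the function returns None, the sketch corresponds to the case where the determination unit 140 instructs the notification unit 160 to transmit a non-settable notification.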
The control unit 150 controls the visible light illumination device 300 so that the visible light illumination device 300 irradiates light with the amount of light determined by the determination unit 140.
When the determination unit 140 sends out a non-settable notification, the notification unit 160 receives from the determination unit 140 the instruction to transmit the non-settable notification. As described above, upon receiving this instruction, the notification unit 160 transmits the non-settable notification to a predetermined notification destination (for example, the notification destination device 500). The non-settable notification may be a predetermined signal or the like. The non-settable notification may indicate the reason why the illuminance of the eye portion cannot be controlled to the target illuminance. The reason is, for example, that the illuminance of the eye portion due to natural light is too bright, or that the illuminance of the eye portion due to natural light is too dark.
The image storage unit 170 stores the image and the imaging information acquired by the acquisition unit 110. For example, when the amount of light irradiated by the visible light illumination device 300 is controlled by the control unit 150 and the size of the pupil of the eye portion changes so as to satisfy the size condition, the size of the pupil satisfies the size condition in the images newly acquired by the acquisition unit 110 and stored in the image storage unit 170.
The notification destination device 500 receives the non-settable notification transmitted by the notification unit 160. The notification destination device 500 produces an output in response to the received notification. The notification destination device 500 may be, for example, a terminal device held by an administrator of the imaging system 1, or a display device that displays information output from the control device 100. As the output in response to the non-settable notification, the notification destination device 500 may display, for example, an indication (for example, a message) stating that the illuminance of the eye portion cannot be set by the visible light illumination device 300 to an illuminance at which the size of the pupil satisfies the size condition. As the output in response to the non-settable notification, the notification destination device 500 may output a predetermined sound, output a predetermined image or animation, or vibrate in a predetermined pattern. The notification destination device 500 may also produce an output that combines two or more of the above examples.
Next, the operation of the control device 100 of the present example embodiment will be described in detail with reference to the drawings.
The present example embodiment has the effect of shortening the time required to obtain an iris image in a desired state. An iris image in a desired state is, for example, an image of an iris in which the size of the pupil satisfies the size condition. The reason is that the estimation unit 120 estimates the illuminance of the eye portion from the acquired input image, and the determination unit 140 determines the amount of light to be irradiated by the visible light illumination device 300 so that the size of the pupil of the eye portion satisfies the size condition. By the control unit 150 controlling the visible light illumination device 300 so that the amount of light it irradiates becomes the determined amount, the size of the pupil changes so as to satisfy the size condition. By imaging the eye portion after the pupil size has changed, an iris image in the desired state is obtained.
Next, a first modification will be described. The first to fifth modifications described below represent examples of modifications of the first example embodiment.
The imaging system 1 of this modification is the same as the imaging system of the first example embodiment except for the differences in the control device 100 described below. The control device 100 of this modification is the same as the control device 100 of the first example embodiment except for the following differences.
Next, the operation of the control device 100 of this modification will be described in detail with reference to the drawings.
This modification has the same effect as the first example embodiment, for the same reason that the effect of the first example embodiment arises.
Next, a second modification will be described. The imaging system 1A of this modification is the same as the imaging system 1 of the first example embodiment except for the differences described below.
Fig. 6 is a block diagram showing an example of a configuration of the imaging system 1A of this modification. The imaging system 1A of this modification includes a detection device 600 in addition to the control device 100, the imaging device 200, the visible light illumination device 300, the near-infrared light illumination device 400, and the notification destination device 500.
The control device 100 of this modification performs the same operation as in the first example embodiment.
Next, a third modification will be described. The imaging system 1B of this modification is the same as the imaging system 1A of the second modification except for the differences described below.
Fig. 7 is a block diagram showing an example of a configuration of the imaging system 1B of this modification. Like the imaging system 1A of the second modification, the imaging system 1B of this modification includes a detection device 600 in addition to the control device 100, the imaging device 200, the visible light illumination device 300, the near-infrared light illumination device 400, and the notification destination device 500. Two imaging devices 200 are drawn in Fig. 7; however, the imaging device 200 drawn above the control device 100 and the imaging device 200 drawn below the control device 100 are one and the same imaging device 200. In other words, the same imaging device 200 is drawn both above and below the control device 100. In this modification, the imaging device 200 is connected to each of the acquisition unit 110 and the control unit 150 of the control device 100. The near-infrared light illumination device 400 is connected to the control unit 150 of the control device 100. The detection device 600 is connected to the acquisition unit 110 of the control device 100.
The control device 100 of this modification performs the same operation as in the first example embodiment.
Next, a fourth modification will be described. The imaging system 1C of this modification is the same as the imaging system 1 of the first example embodiment except for the differences described below.
Fig. 8 is a block diagram showing a configuration of the imaging system 1C of this modification. The imaging system 1C of Fig. 8 includes an imaging device 200A and an imaging device 200B instead of the imaging device 200. In the example shown in Fig. 8, the near-infrared light illumination device 400 is connected to the imaging device 200B.
The control device 100 of this modification performs the same operation as the control device 100 of the first example embodiment shown in Fig. 3.
Next, a fifth modification will be described.
Next, a second example embodiment of the present disclosure will be described in detail with reference to the drawings. In the present example embodiment, the size of the pupil of the imaged eye portion is estimated from the image captured by the imaging device 200, and the illuminance of the eye portion is estimated based on the size of the pupil.
Fig. 9 is a block diagram showing an example of a configuration of the imaging system 2 of the present example embodiment. The imaging system 2 is the same as the imaging system 1 of the first example embodiment except for the differences described below.
The control device 101 of the present example embodiment performs the same operation as the control device 100 of the first example embodiment shown in Fig. 3, except for the following differences.
The present example embodiment has the same effect as the first example embodiment, for the same reason that the effect of the first example embodiment arises.
The modifications described in the first to fifth modifications above can also be applied to the present example embodiment.
Next, a third example embodiment of the present disclosure will be described in detail with reference to the drawings.
First, a configuration of the control device 102 according to the present example embodiment will be described in detail with reference to the drawings.
Next, the operation of the control device 102 of the present example embodiment will be described with reference to the drawings.
The present example embodiment has the same effect as the first example embodiment, for the same reason that the effect of the first example embodiment arises.
The control device 100, the control device 101, and the control device 102 can each be implemented by a computer that includes a memory into which a program read from a storage medium is loaded and a processor that executes the program. The control device 100, the control device 101, and the control device 102 can also be implemented by dedicated hardware, or by a combination of the above-described computer and dedicated hardware.
Some or all of the above example embodiments may also be described as in the following supplementary notes, but are not limited to the following.
an acquisition means for acquiring an input image including an eye region, which is a region of an eye portion,
an estimation means for estimating the illuminance of the eye portion from the acquired input image, and
a determination means for determining, based on an illuminance size relation, which is a relationship between illuminance and pupil size, an amount of light of visible illumination with which the eye portion is irradiated so that a size of a pupil of the eye portion satisfies a size condition.
A control device comprising the above means.
The estimation means detects a sclera region of the eye region from the input image, and estimates the illuminance of the eye portion based on pixel information of the detected sclera region.
The control device according to Appendix 1.
The estimation means detects a pupil region of the eye region from the input image, estimates the pupil size of the eye portion based on the detected pupil region, and estimates the illuminance of the eye portion based on the estimated pupil size and the illuminance size relation.
The control device according to Appendix 1.
The determination means determines the amount of light of the illumination so that a glare index determined based on the illuminance of the eye region satisfies a glare condition.
The control device according to any one of Appendix 1 to 3.
The estimation means estimates a color of an iris based on the acquired input image, and
the determination means selects the glare condition according to the estimated color of the iris.
The control device according to Appendix 4.
The estimation means estimates a color of an iris based on the acquired input image, and
the determination means selects the illuminance size relation, based on the estimated color of the iris, from a plurality of illuminance size relations according to iris color, and determines the amount of light of the illumination based on the selected illuminance size relation.
The control device according to any one of Appendix 1 to 4.
The determination means determines, based on the estimated illuminance and the illuminance size relation, whether it is possible to control the amount of light of the illumination so that the size of the pupil satisfies the size condition, and
a notification means performs a notification when it is determined that the amount of light of the illumination cannot be controlled so that the size of the pupil satisfies the size condition.
The control device according to any one of Appendix 1 to 6, further comprising the notification means.
A control means controls the illumination so that the illumination irradiates visible light with the determined amount of light.
The control device according to any one of Appendix 1 to 7, further comprising the control means.
The control device according to any one of Appendix 1 to 8,
the illumination, and an imaging device that captures the input image.
An imaging system including the above.
Acquiring an input image including an eye region, which is a region of an eye portion,
estimating the illuminance of the eye portion from the acquired input image, and
determining, based on an illuminance size relation, which is a relationship between illuminance and pupil size, an amount of light of visible illumination with which the eye portion is irradiated so that a size of a pupil of the eye portion satisfies a size condition.
A control method.
Detecting a sclera region of the eye region from the input image, and estimating the illuminance of the eye portion based on pixel information of the detected sclera region.
The control method according to Appendix 10.
Detecting a pupil region of the eye region from the input image, estimating the pupil size of the eye portion based on the detected pupil region, and estimating the illuminance of the eye portion based on the estimated pupil size and the illuminance size relation.
The control method according to Appendix 10.
Determining the amount of light of the illumination so that a glare index determined based on the illuminance of the eye region satisfies a glare condition.
The control method according to any one of Appendix 10 to 12.
Estimating a color of an iris based on the acquired input image, and
selecting the glare condition according to the estimated color of the iris.
The control method according to Appendix 13.
Estimating a color of an iris based on the acquired input image, and
selecting the illuminance size relation, based on the estimated color of the iris, from a plurality of illuminance size relations according to iris color, and determining the amount of light of the illumination based on the selected illuminance size relation.
The control method according to any one of Appendix 10 to 13.
Determining, based on the estimated illuminance and the illuminance size relation, whether it is possible to control the amount of light of the illumination so that the size of the pupil satisfies the size condition, and
performing a notification when it is determined that the amount of light of the illumination cannot be controlled so that the size of the pupil satisfies the size condition.
The control method according to any one of Appendix 10 to 15.
Controlling the illumination so that the illumination irradiates visible light with the determined amount of light.
The control method according to any one of Appendix 10 to 16.
An acquisition process of acquiring an input image including an eye region, which is a region of an eye portion,
an estimation process of estimating the illuminance of the eye portion from the acquired input image, and
a determination process of determining, based on an illuminance size relation, which is a relationship between illuminance and pupil size, an amount of light of visible illumination with which the eye portion is irradiated so that a size of a pupil of the eye portion satisfies a size condition.
A storage medium storing a program that causes a computer to execute the above processes.
The estimation process detects a sclera region of the eye region from the input image, and estimates the illuminance of the eye portion based on pixel information of the detected sclera region.
The storage medium according to Appendix 18.
The estimation process detects a pupil region of the eye region from the input image, estimates the pupil size of the eye portion based on the detected pupil region, and estimates the illuminance of the eye portion based on the estimated pupil size and the illuminance size relation.
The storage medium according to Appendix 18.
The determination process determines the amount of light of the illumination so that a glare index determined based on the illuminance of the eye region satisfies a glare condition.
The storage medium according to any one of Appendix 18 to 20.
The estimation process estimates a color of an iris based on the acquired input image, and
the determination process selects the glare condition according to the estimated color of the iris.
The storage medium according to Appendix 21.
The estimation process estimates a color of an iris based on the acquired input image, and
the determination process selects the illuminance size relation, based on the estimated color of the iris, from a plurality of illuminance size relations according to iris color, and determines the amount of light of the illumination based on the selected illuminance size relation.
The storage medium according to any one of Appendix 18 to 21.
The determination process determines, based on the estimated illuminance and the illuminance size relation, whether it is possible to control the amount of light of the illumination so that the size of the pupil satisfies the size condition, and
the program causes the computer to further execute
a notification process of performing a notification when it is determined that the amount of light of the illumination cannot be controlled so that the size of the pupil satisfies the size condition.
The storage medium according to any one of Appendix 18 to 23.
The program causes the computer to further execute
a control process of controlling the illumination so that the illumination irradiates visible light with the determined amount of light.
The storage medium according to any one of Appendix 18 to 24.
1A imaging system
1B imaging system
1C imaging system
2 imaging system
100 control device
101 control device
102 control device
110 acquisition unit
120 estimation unit
130 relation storage unit
140 determination unit
150 control unit
160 notification unit
170 image storage unit
200 imaging device
200A imaging device
200B imaging device
300 visible light illumination device
400 near-infrared light illumination device
500 notification destination device
600 detection device
1000 computer
1001 processor
1002 memory
1003 storage device
1004 I/O interface
1005 storage medium
Claims (25)
- A control device comprising: acquisition means for acquiring an input image including an eye region that is a region of an eye portion; estimation means for estimating illuminance of the eye portion from the acquired input image; and determination means for determining, based on an illuminance size relation that is a relationship between illuminance and pupil size, an amount of light of visible illumination with which the eye portion is irradiated so that a size of a pupil of the eye portion satisfies a size condition.
- The control device according to claim 1, wherein the estimation means detects a sclera region of the eye region from the input image and estimates the illuminance of the eye portion based on pixel information of the detected sclera region.
- The control device according to claim 1, wherein the estimation means detects a pupil region of the eye region from the input image, estimates the pupil size of the eye portion based on the detected pupil region, and estimates the illuminance of the eye portion based on the estimated pupil size and the illuminance size relation.
- The control device according to any one of claims 1 to 3, wherein the determination means determines the amount of light of the illumination so that a glare index determined based on the illuminance of the eye region satisfies a glare condition.
- The control device according to claim 4, wherein the estimation means estimates a color of an iris based on the acquired input image, and the determination means selects the glare condition according to the estimated color of the iris.
- The control device according to any one of claims 1 to 4, wherein the estimation means estimates a color of an iris based on the acquired input image, and the determination means selects the illuminance size relation, based on the estimated color of the iris, from a plurality of illuminance size relations according to iris color, and determines the amount of light of the illumination based on the selected illuminance size relation.
- The control device according to any one of claims 1 to 6, wherein the determination means determines, based on the estimated illuminance and the illuminance size relation, whether it is possible to control the amount of light of the illumination so that the size of the pupil satisfies the size condition, the control device further comprising notification means for performing a notification when it is determined that the amount of light of the illumination cannot be controlled so that the size of the pupil satisfies the size condition.
- The control device according to any one of claims 1 to 7, further comprising control means for controlling the illumination so that the illumination irradiates visible light with the determined amount of light.
- An imaging system including: the control device according to any one of claims 1 to 8; the illumination; and an imaging device that captures the input image.
- A control method comprising: acquiring an input image including an eye region that is a region of an eye portion; estimating illuminance of the eye portion from the acquired input image; and determining, based on an illuminance size relation that is a relationship between illuminance and pupil size, an amount of light of visible illumination with which the eye portion is irradiated so that a size of a pupil of the eye portion satisfies a size condition.
- The control method according to claim 10, comprising detecting a sclera region of the eye region from the input image and estimating the illuminance of the eye portion based on pixel information of the detected sclera region.
- The control method according to claim 10, comprising detecting a pupil region of the eye region from the input image, estimating the pupil size of the eye portion based on the detected pupil region, and estimating the illuminance of the eye portion based on the estimated pupil size and the illuminance size relation.
- The control method according to any one of claims 10 to 12, comprising determining the amount of light of the illumination so that a glare index determined based on the illuminance of the eye region satisfies a glare condition.
- The control method according to claim 13, comprising estimating a color of an iris based on the acquired input image, and selecting the glare condition according to the estimated color of the iris.
- The control method according to any one of claims 10 to 13, comprising estimating a color of an iris based on the acquired input image, selecting the illuminance size relation, based on the estimated color of the iris, from a plurality of illuminance size relations according to iris color, and determining the amount of light of the illumination based on the selected illuminance size relation.
- The control method according to any one of claims 10 to 15, comprising determining, based on the estimated illuminance and the illuminance size relation, whether it is possible to control the amount of light of the illumination so that the size of the pupil satisfies the size condition, and performing a notification when it is determined that the amount of light of the illumination cannot be controlled so that the size of the pupil satisfies the size condition.
- The control method according to any one of claims 10 to 16, comprising controlling the illumination so that the illumination irradiates visible light with the determined amount of light.
- A storage medium storing a program that causes a computer to execute: an acquisition process of acquiring an input image including an eye region that is a region of an eye portion; an estimation process of estimating illuminance of the eye portion from the acquired input image; and a determination process of determining, based on an illuminance size relation that is a relationship between illuminance and pupil size, an amount of light of visible illumination with which the eye portion is irradiated so that a size of a pupil of the eye portion satisfies a size condition.
- The storage medium according to claim 18, wherein the estimation process detects a sclera region of the eye region from the input image and estimates the illuminance of the eye portion based on pixel information of the detected sclera region.
- The storage medium according to claim 18, wherein the estimation process detects a pupil region of the eye region from the input image, estimates the pupil size of the eye portion based on the detected pupil region, and estimates the illuminance of the eye portion based on the estimated pupil size and the illuminance size relation.
- The storage medium according to any one of claims 18 to 20, wherein the determination process determines the amount of light of the illumination so that a glare index determined based on the illuminance of the eye region satisfies a glare condition.
- The storage medium according to claim 21, wherein the estimation process estimates a color of an iris based on the acquired input image, and the determination process selects the glare condition according to the estimated color of the iris.
- The storage medium according to any one of claims 18 to 21, wherein the estimation process estimates a color of an iris based on the acquired input image, and the determination process selects the illuminance size relation, based on the estimated color of the iris, from a plurality of illuminance size relations according to iris color, and determines the amount of light of the illumination based on the selected illuminance size relation.
- The storage medium according to any one of claims 18 to 23, wherein the determination process determines, based on the estimated illuminance and the illuminance size relation, whether it is possible to control the amount of light of the illumination so that the size of the pupil satisfies the size condition, and the program causes the computer to further execute a notification process of performing a notification when it is determined that the amount of light of the illumination cannot be controlled so that the size of the pupil satisfies the size condition.
- The storage medium according to any one of claims 18 to 24, wherein the program causes the computer to further execute a control process of controlling the illumination so that the illumination irradiates visible light with the determined amount of light.
Priority Applications (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2019/034714 WO2021044540A1 (ja) | 2019-09-04 | 2019-09-04 | 制御装置、制御方法及び記憶媒体 |
JP2021543856A JP7264257B2 (ja) | 2019-09-04 | 2019-09-04 | 制御装置、制御方法及びプログラム |
EP19944642.8A EP4027297A4 (en) | 2019-09-04 | 2019-09-04 | CONTROL DEVICE, CONTROL METHOD AND STORAGE MEDIA |
US17/638,356 US20220294965A1 (en) | 2019-09-04 | 2019-09-04 | Control device, control method, and storage medium |
CN201980099822.0A CN114341923A (zh) | 2019-09-04 | 2019-09-04 | 控制设备、控制方法和存储介质 |
BR112022001826A BR112022001826A2 (pt) | 2019-09-04 | 2019-09-04 | Dispositivo de controle, método de controle, e meio de armazenamento |
JP2023064693A JP7401013B2 (ja) | 2019-09-04 | 2023-04-12 | 情報処理装置、制御装置、情報処理方法及びプログラム |
JP2023205147A JP2024037793A (ja) | 2019-09-04 | 2023-12-05 | 情報処理装置、情報処理方法及びプログラム |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2019/034714 WO2021044540A1 (ja) | 2019-09-04 | 2019-09-04 | 制御装置、制御方法及び記憶媒体 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021044540A1 true WO2021044540A1 (ja) | 2021-03-11 |
Family
ID=74852107
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2019/034714 WO2021044540A1 (ja) | 2019-09-04 | 2019-09-04 | 制御装置、制御方法及び記憶媒体 |
Country Status (6)
Country | Link |
---|---|
US (1) | US20220294965A1 (ja) |
EP (1) | EP4027297A4 (ja) |
JP (3) | JP7264257B2 (ja) |
CN (1) | CN114341923A (ja) |
BR (1) | BR112022001826A2 (ja) |
WO (1) | WO2021044540A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024012593A1 (zh) * | 2022-07-15 | 2024-01-18 | 北京鹰瞳科技发展股份有限公司 | 用于近视理疗的红光照射控制方法及其相关产品 |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPWO2021124709A1 (ja) * | 2019-12-19 | 2021-06-24 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003016435A (ja) * | 2001-06-27 | 2003-01-17 | Matsushita Electric Ind Co Ltd | 個体認証装置及び方法 |
JP2004261515A (ja) * | 2003-03-04 | 2004-09-24 | Matsushita Electric Ind Co Ltd | 虹彩撮像装置 |
WO2011042989A1 (ja) * | 2009-10-09 | 2011-04-14 | Kikuchi Kouichi | 視認情景に対する視認者情感判定装置 |
WO2016080031A1 (ja) * | 2014-11-20 | 2016-05-26 | ソニー株式会社 | 制御システム、情報処理装置、制御方法、およびプログラム |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003016434A (ja) * | 2001-06-27 | 2003-01-17 | Matsushita Electric Ind Co Ltd | 個体認証装置 |
WO2015127313A1 (en) * | 2014-02-21 | 2015-08-27 | Samsung Electronics Co., Ltd. | Multi-band biometric camera system having iris color recognition |
JP6417676B2 (ja) * | 2014-03-06 | 2018-11-07 | ソニー株式会社 | 情報処理装置、情報処理方法、アイウェア端末および認証システム |
US20160282934A1 (en) * | 2015-03-25 | 2016-09-29 | Motorola Mobility Llc | Presence detection for gesture recognition and iris authentication |
KR102322029B1 (ko) * | 2015-03-27 | 2021-11-04 | 삼성전자주식회사 | 생체 정보 획득 방법 및 이를 위한 장치 |
EP3281138A4 (en) * | 2015-04-08 | 2018-11-21 | Wavefront Biometric Technologies Pty Limited | Multi-biometric authentication |
JP6530239B2 (ja) * | 2015-05-28 | 2019-06-12 | 浜松ホトニクス株式会社 | 両眼計測装置、両眼計測方法、及び両眼計測プログラム |
US10049272B2 (en) * | 2015-09-24 | 2018-08-14 | Microsoft Technology Licensing, Llc | User authentication using multiple capture techniques |
WO2017091909A1 (en) * | 2015-12-03 | 2017-06-08 | Ophthalight Digital Solutions Inc. | Portable ocular response testing device and methods of use |
JP6753934B2 (ja) * | 2015-12-18 | 2020-09-09 | エーエスエムエル ネザーランズ ビー.ブイ. | 光学システムおよび方法 |
JP2019506694A (ja) * | 2016-01-12 | 2019-03-07 | プリンストン・アイデンティティー・インコーポレーテッド | 生体測定分析のシステムおよび方法 |
US20180064576A1 (en) * | 2016-09-08 | 2018-03-08 | Amo Development, Llc | Systems and methods for obtaining iris registration and pupil centration for laser surgery |
JP6471840B2 (ja) * | 2016-11-17 | 2019-02-20 | 大日本印刷株式会社 | 照明装置 |
US20180173933A1 (en) * | 2016-12-16 | 2018-06-21 | Qualcomm Incorporated | User authentication using iris sector |
JP6614598B2 (ja) * | 2016-12-26 | 2019-12-04 | 株式会社坪田ラボ | 表示システム、電子機器及び照明システム |
DE102017103884A1 (de) * | 2017-02-24 | 2018-08-30 | Osram Opto Semiconductors Gmbh | Beleuchtungseinrichtung, elektronisches Gerät mit einer Beleuchtungseinrichtung und Verwendung einer Beleuchtungseinrichtung |
JP6550094B2 (ja) * | 2017-06-08 | 2019-07-24 | シャープ株式会社 | 認証装置および認証方法 |
US10380418B2 (en) * | 2017-06-19 | 2019-08-13 | Microsoft Technology Licensing, Llc | Iris recognition based on three-dimensional signatures |
US20220141372A1 (en) * | 2019-02-18 | 2022-05-05 | Nec Corporation | Illumination control apparatus, method, system, and computer readable medium |
-
2019
- 2019-09-04 BR BR112022001826A patent/BR112022001826A2/pt unknown
- 2019-09-04 WO PCT/JP2019/034714 patent/WO2021044540A1/ja active Application Filing
- 2019-09-04 US US17/638,356 patent/US20220294965A1/en active Pending
- 2019-09-04 CN CN201980099822.0A patent/CN114341923A/zh active Pending
- 2019-09-04 JP JP2021543856A patent/JP7264257B2/ja active Active
- 2019-09-04 EP EP19944642.8A patent/EP4027297A4/en not_active Withdrawn
-
2023
- 2023-04-12 JP JP2023064693A patent/JP7401013B2/ja active Active
- 2023-12-05 JP JP2023205147A patent/JP2024037793A/ja active Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003016435A (ja) * | 2001-06-27 | 2003-01-17 | Matsushita Electric Ind Co Ltd | 個体認証装置及び方法 |
JP2004261515A (ja) * | 2003-03-04 | 2004-09-24 | Matsushita Electric Ind Co Ltd | 虹彩撮像装置 |
WO2011042989A1 (ja) * | 2009-10-09 | 2011-04-14 | Kikuchi Kouichi | 視認情景に対する視認者情感判定装置 |
WO2016080031A1 (ja) * | 2014-11-20 | 2016-05-26 | ソニー株式会社 | 制御システム、情報処理装置、制御方法、およびプログラム |
Non-Patent Citations (3)
Title |
---|
IMARI SATO; YOICHI SATO; KATSUSHI IKEUCHI: "Estimation of Light Source Environment Based on Shadows of Objects", INFORMATION PROCESSING SOCIETY OF JAPAN JOURNAL: COMPUTER VISION AND IMAGE MEDIA, vol. 41, no. SIG 10, December 2000 (2000-12-01) |
See also references of EP4027297A4 |
TAKESHI OISHI; SONOKO OKURA; REI KAWAKAMI; TAKAHIKO SAKANO; KATSUSHI IKEUCHI: "Superimposition of Person Model Based on Simultaneous Capture Method of Light Source Environment and Object Using Omnidirectional Camera on MR System", IMAGE RECOGNITION AND UNDERSTANDING SYMPOSIUM (MIRU2009, July 2009 (2009-07-01) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024012593A1 (zh) * | 2022-07-15 | 2024-01-18 | 北京鹰瞳科技发展股份有限公司 | 用于近视理疗的红光照射控制方法及其相关产品 |
Also Published As
Publication number | Publication date |
---|---|
JP7401013B2 (ja) | 2023-12-19 |
US20220294965A1 (en) | 2022-09-15 |
JP2023093574A (ja) | 2023-07-04 |
JPWO2021044540A1 (ja) | 2021-03-11 |
BR112022001826A2 (pt) | 2022-03-29 |
EP4027297A1 (en) | 2022-07-13 |
EP4027297A4 (en) | 2022-09-07 |
CN114341923A (zh) | 2022-04-12 |
JP7264257B2 (ja) | 2023-04-25 |
JP2024037793A (ja) | 2024-03-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11132536B2 (en) | Living body detection device, living body detection method, and recording medium | |
JP7401013B2 (ja) | 情報処理装置、制御装置、情報処理方法及びプログラム | |
CN106104630B (zh) | 检测设备和检测方法 | |
JP2012508423A (ja) | ビデオ赤外線網膜像スキャナ | |
JP2016028669A (ja) | 瞳孔検出装置、および瞳孔検出方法 | |
WO2020129138A1 (ja) | 画像処理装置、画像処理方法及び記憶媒体 | |
JP2006285763A (ja) | 被写体についての陰影のない画像を生成する方法および装置、並びにそれに用いる白色板 | |
US11887349B2 (en) | Interest determination apparatus, interest determination system, interest determination method, and non-transitory computer readable medium storing program | |
JP2020021397A (ja) | 画像処理装置、画像処理方法、および画像処理プログラム | |
KR20180000580A (ko) | 조명기를 구비한 스테레오 매칭 시스템에서의 코스트 볼륨 연산장치 및 그 방법 | |
JP2021058361A (ja) | 生体情報取得装置およびプログラム | |
EP3481261B1 (en) | An apparatus for providing semantic information and a method of operating the same | |
JP2020140637A (ja) | 瞳孔検出装置 | |
JP6819653B2 (ja) | 検知装置 | |
JP6967579B2 (ja) | 眼の虹彩角膜領域のカラー像を作成および処理する方法、ならびにそのための装置 | |
JP2019129469A (ja) | 画像処理装置 | |
WO2015068495A1 (ja) | 器官画像撮影装置 | |
JP6512806B2 (ja) | 撮像装置 | |
JP2021047929A (ja) | 情報処理装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19944642 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2021543856 Country of ref document: JP |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112022001826 Country of ref document: BR |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 112022001826 Country of ref document: BR Kind code of ref document: A2 Effective date: 20220131 |
|
ENP | Entry into the national phase |
Ref document number: 2019944642 Country of ref document: EP Effective date: 20220404 |