WO2017073059A1 - Imaging Device - Google Patents
Imaging Device
- Publication number
- WO2017073059A1 (PCT/JP2016/004712)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- imaging
- image
- subject
- optical system
- light
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/73—Deblurring; Sharpening
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/18—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state for vehicle drivers or machine operators
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6887—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
- A61B5/6893—Cars
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K28/00—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions
- B60K28/02—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver
- B60K28/06—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver responsive to incapacity of driver
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B1/00—Optical elements characterised by the material of which they are made; Optical coatings for optical elements
- G02B1/10—Optical coatings produced by application to, or surface treatment of, optical elements
- G02B1/11—Anti-reflection coatings
- G02B1/113—Anti-reflection coatings using inorganic layer materials only
- G02B1/115—Multilayers
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/147—Details of sensors, e.g. sensor lenses
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V20/597—Recognising the driver's state or behaviour, e.g. attention or drowsiness
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2218/00—Aspects of pattern recognition specially adapted for signal processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30268—Vehicle interior
Definitions
- the embodiment of the present disclosure relates to an imaging apparatus for monitoring the state of a subject such as a vehicle driver.
- Patent Document 1 discloses a doze alarm device that includes an infrared LED that illuminates the driver's face with near-infrared rays, a CCD camera that captures the driver's face, a blink detection circuit that detects the open/closed state of the subject's eyes based on the face image, and a wakefulness reduction detection circuit that detects a decrease in the wakefulness of the subject.
- An imaging apparatus includes: an imaging optical system that forms an optical image of the face of a subject in a vehicle; an image sensor that captures the formed optical image to generate a captured image; and an image processing unit that determines the state of the subject based on the captured image. When the F value of the imaging optical system is F, the focal length is f, and the pixel pitch of the image sensor is p, F ≥ 0.0003·f²/p is satisfied.
- FIG. 1 is a block diagram showing a schematic configuration of the subject monitoring system according to one embodiment. FIG. 2 is a view of the vehicle provided with the subject monitoring system of FIG. 1 as seen from the left side. FIG. 3 is a cross-sectional view along the optical axis showing a schematic configuration of a lens included in the imaging optical system. FIG. 4 is a graph showing the relationship between focal length and F value. FIG. 5 is a graph showing the relationship between (focal length)²/(allowable circle of confusion diameter) and F value. FIG. 6 is a graph showing the light transmittance of the AR coating layer configurations as a function of wavelength.
- a target person monitoring system 10 according to an embodiment of the present disclosure will be described with reference to FIG.
- the subject monitoring system 10 includes a lighting device 11, an imaging device 12, and a warning device 13. Each component of the target person monitoring system 10 can transmit and receive information via the network 14.
- the network 14 may include, for example, wireless, wired, or CAN (Controller Area Network). In other embodiments, some or all of the components of the subject monitoring system 10 may be integrally configured as one device.
- the lighting device 11 is in an arbitrary position where the face of the subject 15 can be irradiated with light.
- the subject 15 may include a driver of the vehicle 16, for example.
- the lighting device 11 may be disposed on a dashboard 17 of the vehicle 16.
- the light emitted from the illumination device 11 is also referred to as illumination light.
- the imaging device 12 is in an arbitrary position where the face of the subject 15 can be imaged.
- the imaging device 12 may be disposed on the dashboard 17 of the vehicle 16, for example.
- the imaging device 12 generates a captured image in which the pupil of the eye of the subject 15 on the image is bright.
- a captured image with a bright pupil is also referred to as a bright pupil image.
- the imaging device 12 may generate a bright pupil image by arranging the illumination device 11 and the imaging device 12 close to each other.
- the imaging device 12 generates a captured image in which the pupil of the eye of the subject 15 on the image is dark.
- a captured image with a dark pupil is also referred to as a dark pupil image.
- A dark pupil image may be generated by the imaging device 12 by arranging the illumination device 11 and the imaging device 12 apart from each other.
- the warning device 13 issues a warning to the target person 15.
- the warning device 13 may be arranged at an arbitrary position in the vehicle 16.
- the warning device 13 may be arranged at an arbitrary position where the vibration is transmitted to the subject 15.
- the warning device 13 may be disposed on a driver's seat, a steering wheel, a shift knob, or a footrest in the vehicle 16.
- The imaging device 12 will be described. As shown in FIG. 1, the imaging device 12 includes an imaging optical system 22, an image sensor 23, an AFE (Analog Front End) 24, an image processing unit 25, a communication unit 26, and a camera control unit 27.
- the imaging optical system 22 may include a diaphragm, one or more lenses, and a lens barrel that holds them.
- the imaging optical system 22 forms a subject image by light passing through the imaging optical system 22.
- the imaging optical system 22 transmits light of at least a predetermined wavelength band.
- the predetermined wavelength band may include the wavelength of illumination light emitted from the illumination device 11 as described later.
- the predetermined wavelength band may be a band including the wavelength of the infrared light.
- the imaging optical system 22 may further include a filter that passes light in the predetermined wavelength band.
- The imaging optical system 22 is disposed at an arbitrary position at which it can capture reflected light of the illumination light emitted from the illumination device 11.
- the imaging optical system 22 can form a subject image including the face of the subject 15 irradiated with the illumination light emitted from the illumination device 11.
- an AR (Anti-Reflective) coating layer is formed on each of the two or more lens surfaces.
- the AR coating layer may be formed by multi-coating, for example.
- the transmittance of light in an arbitrary band can be controlled by the AR coating layer.
- the transmittance of light in the visible light band may be controlled.
- the transmittance of light in an arbitrary band may be controlled by forming an AR coating layer having different characteristics for each lens surface.
- an AR coating layer is formed on each of the two or more lens surfaces so that the transmittance of illumination light emitted from the illumination device 11 is greater than the transmittance of visible light. Details of the light transmittance of the AR coating layer will be described later.
- the F value of the lens included in the imaging optical system 22 is determined according to the pixel pitch of the imaging element 23. Details of the method for determining the F value of the lens will be described later.
- the image sensor 23 may include, for example, a CCD (Charge-Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor. A plurality of pixels may be arranged on the light receiving surface of the image sensor 23 at an arbitrary pixel pitch.
- the image sensor 23 captures a subject image formed by the imaging optical system 22 and generates a captured image.
- The image sensor 23 may itself have the functions of the AFE 24 described later.
- the AFE 24 may include, for example, CDS (Correlated Double Sampling), AGC (Auto Gain Control), and ADC (Analog-to-Digital Converter).
- the AFE 24 performs predetermined pre-stage image processing on the analog captured image generated by the image sensor 23.
- the pre-stage image processing may include correlated double sampling, gain adjustment, A / D conversion, and the like.
- the image processing unit 25 includes one or more processors.
- the processor may include a dedicated processor specialized for a specific process and a general-purpose processor that executes a specific function by reading a specific program.
- the dedicated processor may include a DSP (Digital Signal Processor) and an application specific IC (ASIC; Application Specific Integrated Circuit).
- the processor may include a programmable logic device (PLD).
- the PLD may include an FPGA (Field-Programmable Gate Array).
- The image processing unit 25 may be an SoC (System-on-a-Chip) or a SiP (System-In-a-Package) in which one or more processors cooperate.
- the image processing unit 25 performs predetermined subsequent image processing on the captured image that has been subjected to the previous image processing by the AFE 24.
- the post-stage image processing may include, for example, exposure adjustment processing.
- the post-stage image processing may include a restoration process for reducing blurring of the captured image.
- the lens included in the imaging optical system 22 has an F value of an appropriate size in order to obtain performance substantially equal to the diffraction limit in the entire region of the captured image.
- The response of the lens to a point light source has a rotationally symmetric, sinc-function-like distribution.
- the response of the lens to the point light source may be indicated by, for example, a point spread function (PSF).
- the deconvolution filter may include, for example, a Wiener filter.
- the image processing unit 25 performs restoration processing on the captured image using a single deconvolution filter corresponding to the point spread function.
- the restoration process may include a deconvolution process.
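As an illustrative sketch of such a restoration process (not the patent's implementation; the noise constant `k` and function names are assumptions), a single Wiener filter built from the point spread function can be applied in the frequency domain:

```python
import numpy as np

def wiener_deconvolve(image, psf, k=0.01):
    """Restore a blurred image with a single Wiener filter built from one PSF.

    image: 2-D array (the blurred captured image)
    psf:   2-D array of the same shape as image, with the PSF centered
    k:     small constant standing in for the noise-to-signal power ratio
    """
    # Move the PSF center to the array origin, then go to the frequency domain.
    H = np.fft.fft2(np.fft.ifftshift(psf))
    G = np.fft.fft2(image)
    # Wiener filter: conj(H) / (|H|^2 + k) approximates 1/H while damping noise.
    restored = np.fft.ifft2(G * np.conj(H) / (np.abs(H) ** 2 + k))
    return np.real(restored)

# Sanity check with a delta-function PSF: restoration is nearly the identity.
img = np.random.default_rng(0).random((32, 32))
psf = np.zeros((32, 32))
psf[16, 16] = 1.0  # delta function at the array center
out = wiener_deconvolve(img, psf, k=0.001)
```

With a delta-function PSF the filter is close to the identity, which makes the behavior easy to sanity-check; in practice the measured PSF of the lens would be substituted.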
- the image processing unit 25 detects the eye or pupil of the subject 15 on the captured image on which the subsequent image processing has been performed. Any method can be adopted for detecting the eye or the pupil. For example, a method using pattern matching or a method of extracting feature points on the captured image (for example, feature points corresponding to the contour of the face, eyes, nose, mouth, etc.) can be employed.
- the feature points on the captured image may include, for example, points corresponding to the face outline, eyes, pupils, nose, mouth, and the like of the subject 15.
- the image processing unit 25 determines the state of the subject 15 based on the detection result of the eye or pupil of the subject 15 on the captured image. For example, when the subject 15 is looking away or drowsy, the image processing unit 25 may not be able to detect the eye or pupil of the subject 15 on the captured image over a plurality of consecutive frames.
- The state in which the subject 15 is looking aside or dozing while driving is also referred to as a driving inappropriate state.
- the image processing unit 25 may determine that the subject 15 is in a driving inappropriate state when the eye or pupil of the subject 15 on the captured image is not detected over a predetermined number of consecutive frames.
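The consecutive-frame determination described above can be sketched as a small counter (a minimal illustration; the class name and threshold value are assumptions, not taken from the patent):

```python
class SubjectStateMonitor:
    """Flags a driving inappropriate state when the eye or pupil is not
    detected over a predetermined number of consecutive frames."""

    def __init__(self, threshold_frames=30):
        self.threshold = threshold_frames  # illustrative value
        self.missed = 0

    def update(self, eye_detected: bool) -> bool:
        """Feed one frame's detection result; return True when the
        consecutive-miss count reaches the threshold."""
        self.missed = 0 if eye_detected else self.missed + 1
        return self.missed >= self.threshold
```

A detection on any frame resets the counter, so only uninterrupted runs of missed detections trigger the warning.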
- the image processing unit 25 may generate a control signal for causing the warning device 13 to issue a warning according to the determination result of the state of the subject 15.
- the image processing unit 25 may output the control signal to the warning device 13 via the communication unit 26.
- the image processing unit 25 generates and outputs the control signal when it is determined that the subject 15 is in a driving inappropriate state.
- the communication unit 26 may include an interface for inputting and outputting information via the network 14.
- input of information is also referred to as acquisition or reception of information.
- the output of information is also called information transmission.
- the camera control unit 27 includes one or more processors.
- the processor may include a dedicated processor specialized for a specific process and a general-purpose processor that executes a specific function by reading a specific program.
- the dedicated processor may include an ASIC.
- the processor may include a PLD.
- the PLD may include an FPGA.
- The camera control unit 27 may be an SoC or a SiP in which one or more processors cooperate.
- The camera control unit 27 controls the overall operation of the imaging device 12. For example, the camera control unit 27 generates a synchronization signal indicating the imaging timing and outputs it to the illumination device 11 via the communication unit 26. When the synchronization signal is output, the camera control unit 27 controls the operation of the image sensor 23 to capture a subject image.
- the synchronization signal may be generated by a device other than the imaging device 12 among the devices provided in the vehicle 16.
- the synchronization signal may be generated by an ECU (Electronic Control Unit) mounted on the lighting device 11, the warning device 13, or the vehicle 16.
- the generated synchronization signal may be input to the imaging device 12 and the illumination device 11, respectively.
- the illumination device 11 includes one or more light sources 18, an illumination optical system 19, an illumination device communication unit 20, and an illumination device control unit 21.
- the light source 18 includes, for example, an LED.
- the light source 18 emits light of at least a predetermined wavelength band.
- the light emitted from the light source 18 may be continuous light emission or pulse light emission.
- the light in the predetermined wavelength band is light that can be photoelectrically converted by the image sensor 23 of the imaging device 12.
- the light source 18 may be an infrared LED that emits diffused light in the infrared band.
- the illumination optical system 19 includes a lens whose angle of view is adjusted, for example.
- Light that passes through the illumination optical system 19 is emitted as illumination light.
- the illumination light emitted from the light source 18 and transmitted through the illumination optical system 19 is applied to the entire face of the subject 15.
- the lighting device communication unit 20 includes an interface for inputting and outputting information via the network 14.
- the lighting device control unit 21 includes one or more processors.
- the processor may include a dedicated processor specialized for a specific process and a general-purpose processor that executes a specific function by reading a specific program.
- the dedicated processor may include an ASIC.
- the processor may include a PLD.
- the PLD may include an FPGA.
- the lighting device control unit 21 may be any of SoC and SiP in which one or more processors cooperate.
- the lighting device control unit 21 controls the operation of each part of the lighting device 11. For example, the illumination device control unit 21 causes the light source 18 to emit light in synchronization with the imaging timing of the imaging device 12.
- the illuminating device control unit 21 causes the light source 18 to emit light in the infrared band for a predetermined time periodically according to the synchronization signal acquired via the illuminating device communication unit 20.
- the imaging device 12 performs imaging according to the synchronization signal. For this reason, the light emission timing of the illuminating device 11 and the imaging timing of the imaging device 12 are synchronized.
- the warning device 13 will be described.
- the warning device 13 may include, for example, a speaker and a vibrator.
- When the warning device 13 receives the control signal described above from the imaging device 12, it issues a warning to the subject 15.
- The warning may be given by, for example, at least one of sound and vibration. The warning can, for example, alert a subject 15 who is driving inattentively or drowsily.
- the lens 28 included in the imaging optical system 22 of the imaging device 12 and a method for determining the F value of the lens 28 will be specifically described.
- the lens 28 forms an image of light incident from the object side on the light receiving surface position of the image sensor 23.
- the focal length of the lens 28 is f [mm]
- the F value is F
- the rear depth of field is D n [mm]
- the front depth of field is D f [mm]
- the shooting distance is S [mm]
- The depth of field (rear depth of field D_n + front depth of field D_f) is expressed by the following equation (1): D_n + D_f = 2δFf²S² / (f⁴ − δ²F²S²) … (1), where δ [mm] is the allowable circle of confusion diameter.
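Under the standard thin-lens depth-of-field expressions, this can be evaluated numerically (the sample values of f, F, S, and δ below are illustrative assumptions only, not values fixed by the patent):

```python
def depth_of_field(f_mm, F, S_mm, delta_mm):
    """Total depth of field D_n + D_f [mm] for the standard thin-lens model.

    f_mm:     focal length f [mm]
    F:        F value
    S_mm:     shooting distance S [mm]
    delta_mm: allowable circle of confusion diameter [mm]
    """
    # Rear (far-side) and front (near-side) depths of field.
    d_rear = delta_mm * F * S_mm**2 / (f_mm**2 - delta_mm * F * S_mm)
    d_front = delta_mm * F * S_mm**2 / (f_mm**2 + delta_mm * F * S_mm)
    return d_front + d_rear

# Illustrative values: f = 12 mm, F = 2.8, S = 700 mm, delta = 2 x 0.006 mm.
dof_mm = depth_of_field(12.0, 2.8, 700.0, 0.012)
```

The rear depth of field is larger than the front one, which is why the shooting range must be checked against both sides of the focus distance.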
- the imaging distance between the imaging device 12 and the subject 15 can change according to individual differences or postures of the subject 15.
- predetermined system requirements may be imposed on the imaging device 12 so that a shooting range that can be taken by the shooting distance is within the depth of field.
- the system requirements may be determined according to, for example, the vehicle constant of the vehicle 16 and the arrangement of the imaging device 12 in the vehicle 16.
- Four system requirements are considered: the shooting range, the horizontal angle of view θ, the pixel pitch p, and the allowable circle of confusion diameter δ.
- An example of each system requirement value is shown below.
- Pixel pitch p: 0.003 mm to 0.006 mm
- Allowable circle of confusion diameter δ: δ = 2p
- the focal length f that satisfies the above system requirements will be described.
- the focal length f is calculated for various combinations in which the numerical values of the system requirements are different.
- A plurality of focal lengths f are calculated by changing the pixel pitch p, the number of horizontal pixels of the image sensor 23, and the horizontal angle of view θ.
- the pixel pitch p may be changed within a range of 0.003 mm to 0.006 mm, for example.
- the number of horizontal pixels may be changed, for example, in the range of 640 pixels to 1920 pixels.
- The horizontal angle of view θ may be changed in the range of 30° to 60°, for example.
- a part of the calculated focal length f is shown in Table 1 below.
- Table 1 illustrates the focal length f calculated so as to satisfy the above system requirements for 10 combinations in which the pixel pitch p is 0.006 mm or 0.003 mm, the number of horizontal pixels of the image sensor 23 is 640, 1280, or 1920 pixels, and the horizontal angle of view θ is 30°, 50°, or 60°.
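The focal length for a given pixel pitch, horizontal pixel count, and horizontal angle of view follows from the pinhole relation f = (N·p/2) / tan(θ/2). A sketch over the ranges given in the text (the function name is illustrative):

```python
import math

def focal_length_mm(pixel_pitch_mm, n_horizontal_pixels, horizontal_fov_deg):
    """Focal length f that yields the horizontal angle of view theta for a
    sensor of width N * p: f = (N * p / 2) / tan(theta / 2)."""
    half_width = pixel_pitch_mm * n_horizontal_pixels / 2.0
    return half_width / math.tan(math.radians(horizontal_fov_deg) / 2.0)

# Sweep the example ranges from the text.
focal_lengths = {
    (p, n, theta): focal_length_mm(p, n, theta)
    for p in (0.003, 0.006)
    for n in (640, 1280, 1920)
    for theta in (30, 50, 60)
}
```

Wider angles of view and finer pixel pitches both shorten the required focal length, which in turn lowers the F bound discussed next.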
- the lower limit value of the F value that satisfies the above system requirements is calculated.
- The lower limit value of the F value is calculated using the above equation (1). For example, the lower limit value of the F value is calculated for each focal length f when the pixel pitch p is 0.006 mm or 0.003 mm, the number of horizontal pixels is 640, 1280, or 1920 pixels, and the horizontal angle of view θ is changed from 30° to 60°.
- FIG. 4 is a graph in which the calculated F value is plotted with the horizontal axis representing the focal length f and the vertical axis representing the F value.
- FIG. 5 is a graph obtained by re-plotting the lower limit values of the F value shown in FIG. 4 with the horizontal axis being f²/δ.
- The slope of the straight line passing through the plotted points in FIG. 5 is 0.0006 and the intercept is 0. Therefore, the range of the F value that satisfies the system requirements is expressed by the following equation (3): F ≥ 0.0006·f²/δ … (3). Since δ = 2p, this is equivalent to F ≥ 0.0003·f²/p.
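With δ = 2p, the fitted line gives the lower bound F ≥ 0.0003·f²/p stated in the abstract, which can be evaluated directly (an illustrative sketch; the example values are assumptions):

```python
def f_number_lower_bound(f_mm, pixel_pitch_mm):
    """Lower bound on the F value from the fitted line F >= 0.0006 * f^2 / delta.
    With delta = 2 * p this reduces to F >= 0.0003 * f^2 / p."""
    return 0.0003 * f_mm**2 / pixel_pitch_mm

# Example: f = 12 mm, p = 0.006 mm -> the F value must be at least 7.2.
F_min = f_number_lower_bound(12.0, 0.006)
```

Longer focal lengths raise the bound quadratically, so wide-angle (short-f) designs can use brighter apertures while keeping the shooting range within the depth of field.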
- the imaging device 12 that satisfies the above system requirements can be realized.
- the captured image may become dark unless the illumination light output from the illumination device 11 is increased, for example. In such a case, the detection accuracy of the state of the subject 15 may be reduced or may not be detected. On the other hand, for example, from the viewpoint of heat generation or power consumption, the output of illumination light from the illumination device 11 may not be increased. Therefore, an appropriate upper limit value of the F value may be determined.
- the lens 28 having a large F value can be easily corrected for aberrations by design and has low tolerance sensitivity. Therefore, the imaging optical system 22 including the lens 28 having a large F value can obtain performance substantially equal to the diffraction limit in the entire region of the captured image.
- the entire area of the captured image may correspond to the entire light receiving surface of the image sensor 23.
- The performance substantially equal to the diffraction limit may be evaluated in terms of resolution. Therefore, the minimum F value at which performance substantially equal to the diffraction limit is obtained may be determined, and the determined value may be set as the upper limit value of the F value.
- a method for determining the upper limit value of the F value will be specifically described.
- δ ≥ 2.44·λ·F (4)
- λ is the wavelength of light.
- Here, λ is the center wavelength of the infrared illumination light emitted by the illumination device 11.
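This condition keeps the diffraction (Airy disk) diameter 2.44·λ·F within the allowable circle of confusion, i.e. F ≤ δ/(2.44·λ). A sketch with illustrative values (δ and λ below are assumptions, not fixed by the patent):

```python
def f_number_upper_bound(delta_mm, wavelength_nm):
    """Upper bound on the F value from delta >= 2.44 * lambda * F,
    i.e. F <= delta / (2.44 * lambda)."""
    wavelength_mm = wavelength_nm * 1e-6  # nm -> mm
    return delta_mm / (2.44 * wavelength_mm)

# Illustrative: delta = 0.012 mm (= 2 x 0.006 mm pitch), lambda = 940 nm.
F_max = f_number_upper_bound(0.012, 940.0)
```

Together with the lower bound of equation (3), this brackets the usable F values for a given focal length, pixel pitch, and illumination wavelength.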
- the optical member includes a lens 28.
- Four configurations will be described, in which the number of AR coating layers formed on one lens surface is one or two, and the number of lens surfaces on which an AR coating layer is formed is four or six.
- FIG. 6 is a graph showing the light wavelength ⁇ [nm] on the horizontal axis and the light transmittance [%] on the vertical axis.
- FIG. 6 shows the relationship between wavelength and transmittance for configuration A, in which the number of AR coating layers is 1 and the number of lens surfaces is 4; configuration B, in which the number of AR coating layers is 1 and the number of lens surfaces is 6; configuration C, in which the number of AR coating layers is 2 and the number of lens surfaces is 4; and configuration D, in which the number of AR coating layers is 2 and the number of lens surfaces is 6.
- the solid line in the figure corresponds to configuration A.
- a broken line in the figure corresponds to the configuration B.
- a one-dot chain line in the figure corresponds to the configuration C.
- a two-dot chain line in the figure corresponds to the configuration D.
- the AR coating layer reduces the transmittance of light in the visible light band while maintaining the transmittance of light in the infrared band from the lighting device 11 at a relatively high value.
- the visible light band may include a wavelength band of 360 nm to 760 nm, for example.
- the infrared band may include a wavelength band of 850 nm to 940 nm, for example.
- In configurations A and B, magnesium fluoride (MgF2) is used for the AR coating layer.
- MgF2 is a low refractive index material that transmits light from the deep ultraviolet to the near infrared.
- In configuration C, MgF2 is used for the first layer and zirconium dioxide (ZrO2) is used for the second layer.
- ZrO2 is a high refractive index material that transmits light from 340 nm to 8 μm.
- In configuration D, MgF2 is used for the first layer and titanium oxide (TiO2) is used for the second layer.
- TiO2 is a high refractive index material that absorbs light in the ultraviolet band.
- In configurations A and B, the transmittance is about 80% near 420 nm, and the transmittance in the visible light band remains relatively high.
- In configuration C, the transmittance in the visible light band is reduced compared with configurations A and B.
- In configuration D, the transmittance is about 10% near 480 nm, for example, and the transmittance in the visible light band is even lower than in configuration C.
- the transmittance near 400 nm is about 85% in the configuration C, and is about 25% in the configuration D.
- the transmittance near 700 nm is about 80% in the configuration C and about 55% in the configuration D.
- the transmittance of the configuration D is further reduced over the visible light band as compared with the configuration C.
- the configuration of the imaging optical system 22 according to the present embodiment is not limited to the configuration C and the configuration D described above.
- The "type" of AR coating layer may be characterized by, for example, the materials it contains and the number of layers formed.
- For example, on one lens surface, an AR coating layer using MgF2 for the first layer and ZrO2 for the second layer may be formed, while on another lens surface, an AR coating layer using MgF2 for the first layer and TiO2 for the second layer may be formed.
- a desired wavelength characteristic can be realized by adjusting the type of the AR coating layer formed on each of two or more lens surfaces.
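The wavelength-selective behavior of a coating layer can be illustrated with a minimal thin-film model. The sketch below computes the normal-incidence reflectance of a single quarter-wave MgF2 layer on glass using the standard transfer-matrix method; the refractive indices, the 940 nm design wavelength, and the substrate are illustrative assumptions, not values from the patent.

```python
import numpy as np

def single_layer_reflectance(n0, n1, ns, d_nm, lam_nm):
    """Normal-incidence reflectance of one thin film (transfer-matrix method).

    n0: index of the incident medium, n1: film index, ns: substrate index,
    d_nm: film thickness [nm], lam_nm: wavelength [nm].
    """
    delta = 2 * np.pi * n1 * d_nm / lam_nm          # phase thickness of the film
    M = np.array([[np.cos(delta), 1j * np.sin(delta) / n1],
                  [1j * n1 * np.sin(delta), np.cos(delta)]])
    B, C = M @ np.array([1.0, ns])                  # characteristic matrix applied
    r = (n0 * B - C) / (n0 * B + C)                 # amplitude reflection coefficient
    return abs(r) ** 2

# Quarter-wave MgF2 layer on glass, designed for 940 nm (illustrative indices)
n_air, n_mgf2, n_glass = 1.0, 1.38, 1.52
d = 940.0 / (4 * n_mgf2)                            # quarter-wave optical thickness
R_design = single_layer_reflectance(n_air, n_mgf2, n_glass, d, 940.0)
R_visible = single_layer_reflectance(n_air, n_mgf2, n_glass, d, 500.0)
```

At the design wavelength the reflectance drops to roughly 1.3% (versus about 4.3% for bare glass), while away from it the reflectance rises, which is the mechanism by which stacking layers of different types shapes the transmittance curve per band.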
- The F value of the lens 28 is relatively large. For this reason, the intensity of visible-band outside light becomes relatively small after passing through the lens 28. For example, when the intensity of the illumination light from the illumination device 11 is sufficiently large compared with outside light, a practically good captured image can be obtained if the outside light is reduced to a certain extent by the plurality of AR coating layers as described above. Therefore, the visible light cut filter can be omitted, and the configuration of the imaging optical system 22 can be simplified and downsized compared with a configuration in which the imaging optical system includes a visible light cut filter.
- The F value of the lens 28 of the imaging apparatus 12 may be determined so as to satisfy F ≥ 0.0003f²/p, where F is the F value, f is the focal length, and p is the pixel pitch of the image sensor 23. With such a configuration, it is possible to realize the imaging device 12 that can be used in the special usage environment in which it is installed in the vehicle 16 and images the subject 15.
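As a quick numerical check of expression (3), the lower bound on F can be evaluated directly; the 12 mm focal length and 0.003 mm pixel pitch below are illustrative values chosen from within the ranges given later in the description, not values mandated by the patent.

```python
def f_number_lower_bound(f_mm: float, p_mm: float) -> float:
    """Lower bound on the F value from expression (3): F >= 0.0003 * f^2 / p."""
    return 0.0003 * f_mm ** 2 / p_mm

# e.g. a 12 mm lens on a sensor with 0.003 mm pixel pitch
bound = f_number_lower_bound(12.0, 0.003)   # the lens must then satisfy F >= 14.4
```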
- Consider, by way of comparison, a case in which an imaging device other than the imaging device 12 according to the embodiment is used.
- the space in the vehicle is limited and the imaging distance from the imaging device to the subject is short, so that the depth of field of the imaging device becomes shallow.
- The shooting distance can vary depending on individual differences or the posture of the subject. For this reason, if the depth of field is shallow, the subject is not necessarily in focus, and the captured image may be blurred. A blurred captured image may reduce the detection accuracy of the state of the subject.
- In contrast, with the imaging device 12, for example, a sufficient depth of field can be secured to capture the subject 15, whose posture can change within the vehicle 16. Therefore, the occurrence of blur in the captured image is suppressed, and the detection accuracy of the state of the subject 15 is improved.
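The effect of a large F value on depth of field can be sketched with the standard thin-lens formulas, using the permissible circle of confusion δ = 2p given later in the description. The focal length (12 mm), F value (30), and focus distance (560 mm) below are illustrative assumptions, not values from the patent; the check is whether the resulting depth of field spans the 400 mm to 1000 mm shooting range.

```python
def depth_of_field(f, F, p, s):
    """Near/far limits of the depth of field (thin-lens approximation).

    f: focal length [mm], F: F value, p: pixel pitch [mm],
    s: focus distance [mm]. Circle of confusion is taken as delta = 2p.
    """
    delta = 2 * p                       # permissible circle of confusion diameter
    H = f ** 2 / (F * delta) + f        # hyperfocal distance
    near = s * (H - f) / (H + s - 2 * f)
    far = s * (H - f) / (H - s) if s < H else float("inf")
    return near, far

near, far = depth_of_field(f=12.0, F=30.0, p=0.003, s=560.0)
# near is roughly 332 mm and far roughly 1778 mm under these assumptions,
# so the 400-1000 mm in-vehicle shooting range is covered.
```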
- the F value can be increased.
- the aperture of the lens 28 can be reduced.
- the lens barrel of the imaging optical system 22 can be reduced in size, and the entire imaging apparatus 12 can be reduced in size.
- Aberration correction is facilitated by increasing the F value. Therefore, for example, the number of lenses included in the imaging optical system 22 can be reduced compared with a configuration using a lens that does not satisfy the above-described expression F ≥ 0.0003f²/p, and the imaging apparatus 12 as a whole can be further miniaturized. Increasing the F value also increases the depth of focus.
- the F value of the lens 28 may be determined using the central wavelength ⁇ of the illumination light that illuminates the subject 15 so as to satisfy F ⁇ 1.64p / ⁇ , which is the above-described equation (6). With this configuration, it is possible to realize the imaging device 12 having a good balance between the resolution performance due to the diffraction limit and the brightness of the captured image.
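Expression (6) can likewise be checked numerically. The pairing below of a 940 nm center wavelength (within the 850-940 nm infrared band stated earlier) with a 0.006 mm pixel pitch is an illustrative assumption.

```python
def f_number_upper_bound(p_mm: float, lam_mm: float) -> float:
    """Upper bound on the F value from expression (6): F <= 1.64 * p / lambda."""
    return 1.64 * p_mm / lam_mm

# 940 nm illumination (940e-6 mm) with a 0.006 mm pixel pitch
fmax = f_number_upper_bound(0.006, 940e-6)   # about 10.5
```

Combined with the lower bound of expression (3), this brackets the usable F values; a larger pixel pitch or a shorter illumination wavelength relaxes the upper bound.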
- In an imaging device other than the imaging device 12 according to the embodiment, increasing the F value is a conceivable way to increase the depth of field. However, when the F value is increased, the captured image becomes darker. For this reason, the detection accuracy of the state of the subject may decrease, or detection may become impossible.
- The imaging device 12 may perform a restoration process on the captured image using a single deconvolution filter. As described above, by determining the F value of the lens 28 so as to satisfy expression (3) and making the depth of field sufficiently deep, the point spread function becomes a substantially uniform jinc-function distribution over the entire region of the captured image. In other words, by using the lens 28 having a deep depth of field, the imaging device 12 can be designed so that the spread of the point image has a constant distribution over the entire shooting distance. Specifically, since the illumination light from the illumination device 11 is infrared light having a predetermined center wavelength, chromatic aberration can be ignored in practice.
- the spread of the point image becomes a substantially uniform distribution in the entire region of the captured image.
- the lens 28 having a large F value has low tolerance sensitivity. For this reason, performance close to the design value can be obtained. Therefore, the blur of the captured image can be reduced using a single deconvolution filter corresponding to the point spread function. For this reason, for example, the processing load is reduced compared to a configuration in which the point spread function is nonuniform in the entire region of the captured image and the restoration process is performed using a plurality of filters. When the performance is close to the diffraction limit, the filter size can be reduced.
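The single-filter restoration step can be sketched as a Wiener deconvolution in pure NumPy. The Gaussian PSF below is a simplified stand-in for the rotationally symmetric jinc-shaped point spread function described in the text, and the image size, PSF width, and noise-to-signal ratio are all illustrative assumptions.

```python
import numpy as np

def wiener_deconvolve(blurred, psf, nsr=1e-3):
    """Restore an image with a single deconvolution (Wiener) filter.

    blurred: 2-D image degraded by circular convolution with `psf`.
    psf:     point spread function, same shape as `blurred`, centered at [0, 0].
    nsr:     assumed noise-to-signal power ratio (regularization term).
    """
    H = np.fft.fft2(psf)
    G = np.fft.fft2(blurred)
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)   # one filter for the whole image
    return np.real(np.fft.ifft2(W * G))

# Demo: blur a test scene with the PSF, then restore it with the single filter.
n = 64
y, x = np.mgrid[0:n, 0:n]
r2 = np.minimum(x, n - x) ** 2 + np.minimum(y, n - y) ** 2  # wrap-around radius
psf = np.exp(-r2 / (2 * 1.5 ** 2))
psf /= psf.sum()

sharp = np.zeros((n, n))
sharp[24:40, 24:40] = 1.0                                   # simple test scene
blurred = np.real(np.fft.ifft2(np.fft.fft2(sharp) * np.fft.fft2(psf)))
restored = wiener_deconvolve(blurred, psf)
```

Because the PSF is assumed uniform over the frame, one precomputed filter `W` suffices, which is the processing-load advantage the text describes relative to spatially varying restoration.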
- Different types of coating layers may be formed on two or more lens surfaces of one or more lenses included in the imaging optical system 22.
- desired wavelength characteristics can be obtained. For example, it is possible to increase the transmittance in the infrared band while reducing the transmittance in the visible light band and the ultraviolet band.
- the visible light cut filter can be omitted as compared with a configuration in which the imaging optical system has a visible light cut filter. For this reason, the configuration of the imaging optical system 22 can be simplified and downsized.
- the imaging device 12 or the like may be realized as a communication device such as a mobile phone, and may be connected to other components of the target person monitoring system 10 by wire or wirelessly.
Abstract
Description
An imaging optical system that forms an optical image of the face of a subject in a vehicle;
an image sensor that captures the formed optical image and generates a captured image; and
an image processing unit that determines the state of the subject based on the captured image, are provided,
wherein F ≥ 0.0003f²/p is satisfied, where F is the F value of the imaging optical system, f is the focal length, and p is the pixel pitch of the image sensor.
- Shooting range: 400 mm to 1000 mm
- Horizontal angle of view θ: 30° to 60°
- Pixel pitch p: 0.003 mm to 0.006 mm
- Permissible circle of confusion diameter δ: δ = 2p
h = f × tan(θ/2)  (2)
F ≥ 0.0003f²/p  (3)
φ = 2.44λF  (4)
Here, λ is the wavelength of light. In one embodiment, λ is the center wavelength of the infrared-band illumination light from the illumination device 11.
φ ≤ 4p  (5)
F ≤ 1.64p/λ  (6)
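Expressions (4) to (6) connect as follows: requiring the Airy-disk diameter φ of expression (4) to stay within the four-pixel limit of expression (5) yields the upper bound (6), since 4/2.44 ≈ 1.64:

```latex
\phi = 2.44\,\lambda F \le 4p
\quad\Longrightarrow\quad
F \le \frac{4p}{2.44\,\lambda} \approx 1.64\,\frac{p}{\lambda}
```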
11 Illumination device
12 Imaging device
13 Warning device
14 Network
15 Subject
16 Vehicle
17 Dashboard
18 Light source
19 Illumination optical system
20 Illumination device communication unit
21 Illumination device control unit
22 Imaging optical system
23 Image sensor
24 AFE
25 Image processing unit
26 Communication unit
27 Camera control unit
28 Lens
Claims (5)
- An imaging apparatus comprising: an imaging optical system that forms an optical image of the face of a subject in a vehicle; an image sensor that captures the formed optical image and generates a captured image; and an image processing unit that determines the state of the subject based on the captured image, wherein the following expression is satisfied, where F is the F value of the imaging optical system, f is the focal length, and p is the pixel pitch of the image sensor:
F ≥ 0.0003f²/p
- The imaging apparatus according to claim 1, wherein the following expression is satisfied, where λ is the wavelength of the illumination light that illuminates the subject:
F ≤ 1.64p/λ
- The imaging apparatus according to claim 1 or 2, wherein the image processing unit performs a restoration process on the captured image using a single deconvolution filter.
- The imaging apparatus according to claim 3, wherein the deconvolution filter is a filter corresponding to a point spread function having a rotationally symmetric jinc-function distribution.
- The imaging apparatus according to any one of claims 1 to 4, wherein the imaging optical system includes one or more lenses, different types of coating layers are formed on two or more lens surfaces, and the transmittance of the illumination light that illuminates the subject is higher than the transmittance of visible light.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP16859297.0A EP3370410B1 (en) | 2015-10-26 | 2016-10-26 | Imaging device |
JP2017547620A JP6431210B2 (ja) | 2015-10-26 | 2016-10-26 | 撮像装置 |
US15/769,975 US10376199B2 (en) | 2015-10-26 | 2016-10-26 | Imaging apparatus |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015210095 | 2015-10-26 | ||
JP2015-210095 | 2015-10-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017073059A1 true WO2017073059A1 (ja) | 2017-05-04 |
Family
ID=58630190
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/004712 WO2017073059A1 (ja) | 2015-10-26 | 2016-10-26 | 撮像装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US10376199B2 (ja) |
EP (1) | EP3370410B1 (ja) |
JP (1) | JP6431210B2 (ja) |
WO (1) | WO2017073059A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107346416A (zh) * | 2017-06-09 | 2017-11-14 | 湖北天业云商网络科技有限公司 | 一种基于人体拓扑结构的身体运动状态检测方法及系统 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009252094A (ja) * | 2008-04-09 | 2009-10-29 | Toyota Motor Corp | 顔画像検出装置 |
JP2009282551A (ja) * | 2009-08-31 | 2009-12-03 | Canon Inc | 撮像装置および撮像システム |
WO2015093004A1 (ja) * | 2013-12-18 | 2015-06-25 | 株式会社デンソー | 顔画像撮影装置、および運転者状態判定装置 |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08290726A (ja) | 1995-04-21 | 1996-11-05 | Mitsubishi Electric Corp | 居眠り警報装置 |
JP4290935B2 (ja) * | 2002-07-18 | 2009-07-08 | オリンパス株式会社 | 電子撮像装置 |
EP2430826A1 (en) * | 2009-05-12 | 2012-03-21 | Koninklijke Philips Electronics N.V. | Camera, system comprising a camera, method of operating a camera and method for deconvoluting a recorded image |
JP5897769B2 (ja) * | 2013-04-16 | 2016-03-30 | 富士フイルム株式会社 | 撮像装置、キャリブレーションシステム、及びプログラム |
CN203535250U (zh) | 2013-11-11 | 2014-04-09 | 成都市晶林电子技术有限公司 | 一种红外镀膜镜片 |
DE102014203444A1 (de) * | 2014-02-26 | 2015-08-27 | Application Solutions (Electronics and Vision) Ltd. | Vorrichtung und Verfahrung zur Ermittlung eines Augenzustands eines Fahrzeuginsassen |
CN106165390B (zh) * | 2014-03-28 | 2019-06-28 | 富士胶片株式会社 | 图像处理装置、摄影装置、图像处理方法 |
-
2016
- 2016-10-26 JP JP2017547620A patent/JP6431210B2/ja active Active
- 2016-10-26 US US15/769,975 patent/US10376199B2/en active Active
- 2016-10-26 EP EP16859297.0A patent/EP3370410B1/en active Active
- 2016-10-26 WO PCT/JP2016/004712 patent/WO2017073059A1/ja active Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009252094A (ja) * | 2008-04-09 | 2009-10-29 | Toyota Motor Corp | 顔画像検出装置 |
JP2009282551A (ja) * | 2009-08-31 | 2009-12-03 | Canon Inc | 撮像装置および撮像システム |
WO2015093004A1 (ja) * | 2013-12-18 | 2015-06-25 | 株式会社デンソー | 顔画像撮影装置、および運転者状態判定装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3370410A4 * |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107346416A (zh) * | 2017-06-09 | 2017-11-14 | 湖北天业云商网络科技有限公司 | 一种基于人体拓扑结构的身体运动状态检测方法及系统 |
Also Published As
Publication number | Publication date |
---|---|
US10376199B2 (en) | 2019-08-13 |
EP3370410A1 (en) | 2018-09-05 |
EP3370410B1 (en) | 2020-11-25 |
JPWO2017073059A1 (ja) | 2018-07-05 |
JP6431210B2 (ja) | 2018-11-28 |
US20180310868A1 (en) | 2018-11-01 |
EP3370410A4 (en) | 2019-04-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10097752B2 (en) | Lens barrel, imaging device body, and imaging device with apodization filters | |
US10132971B2 (en) | Vehicle camera with multiple spectral filters | |
JP2008268868A (ja) | 撮像装置 | |
EP2490060A1 (en) | Focusing device and focusing method | |
WO2004063989A3 (en) | Camera with image enhancement functions | |
JP2010213274A (ja) | 拡張された被写界深度の監視用撮像システム | |
JP2008246004A (ja) | 瞳孔検出方法 | |
KR20140068042A (ko) | 촬상 장치 및 필터 | |
US11675174B2 (en) | Single optic for low light and high light level imaging | |
JP6431210B2 (ja) | 撮像装置 | |
CN105430310B (zh) | 图像处理方法、摄像装置以及图像处理装置 | |
JP6330474B2 (ja) | 画像処理装置、画像処理装置の制御方法、撮像装置 | |
JP2010175713A (ja) | 撮像光学系および撮像装置 | |
JP2008132160A (ja) | 瞳孔検出装置及び瞳孔検出方法 | |
JP2021021892A (ja) | 撮像装置及び撮像システム | |
US20140016188A1 (en) | Lens System | |
JP2003102029A (ja) | カラー撮像装置、カラー撮像装置の光学フィルタ、及びカラー撮像装置の交換レンズ | |
JP6463975B2 (ja) | 眼の開閉状態の判定方法、画像処理装置、および判定システム | |
WO2023181630A1 (ja) | 画像処理装置、および撮像システム | |
JP2016177215A (ja) | 撮像装置 | |
JP5286495B2 (ja) | レンズ装置及び撮像装置 | |
JP2015087494A (ja) | 撮像装置 | |
JP6632314B2 (ja) | 撮像装置、その制御方法、および制御プログラム | |
JP2017068385A (ja) | 撮像装置、および監視システム | |
JP2017220811A (ja) | 撮像装置およびカメラシステム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16859297 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2017547620 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15769975 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2016859297 Country of ref document: EP |