WO2020194489A1 - Exposure control device and exposure control method - Google Patents


Info

Publication number
WO2020194489A1
WO2020194489A1 (PCT/JP2019/012738)
Authority
WO
WIPO (PCT)
Prior art keywords
exposure control
detection
edge
eye
control device
Prior art date
Application number
PCT/JP2019/012738
Other languages
French (fr)
Japanese (ja)
Inventor
吉本 忠文
昇吾 米山
篤 松本
Original Assignee
Mitsubishi Electric Corporation (三菱電機株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation
Priority to PCT/JP2019/012738
Priority to JP2021507101A (patent JP6873351B2)
Publication of WO2020194489A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Definitions

  • The present invention relates to exposure control when imaging the eyes of a vehicle occupant.
  • There is an occupant monitoring system that detects an inattentive or dozing state from the position of the occupant's eyes or the degree of eyelid opening and warns the occupant.
  • In such a system, the eyes of the occupant are photographed by a camera mounted on the vehicle, and the state of the occupant is detected using the captured image.
  • When the occupant is wearing spectacles, however, there is a problem that external light reflected on the lenses of the spectacles prevents the eyes from being imaged properly.
  • To address this problem, Patent Document 1 proposes a technique for adjusting the exposure when external light reflection is detected.
  • The present invention has been made to solve the above problem, and its object is to perform appropriate exposure control when part of the eye cannot be photographed due to external light reflection.
  • The exposure control device of the present invention includes an eye detection unit that detects the outer and inner corners of the eyes from an image of the eyes of a vehicle occupant captured by a camera; an edge detection unit that detects the edge of the eyelid between the outer and inner corners as a detection edge from the captured image; a difference detection unit that detects a difference between the detection edge and a reference edge; and an exposure control unit that performs exposure control so that the difference between the detection edge and the reference edge becomes small.
  • Since the exposure control device of the present invention performs exposure control so that the difference between the detection edge and the reference edge becomes small, appropriate exposure control can be performed even when part of the eye cannot be photographed due to external light reflection.
  • FIG. 1 is a block diagram showing the configuration of the exposure control device of Embodiment 1.
  • FIG. 2 is a flowchart showing the operation of the exposure control device of Embodiment 1. FIG. 3 shows the shooting range of the camera. FIG. 4 shows the detection edge in a state without external light reflection. FIG. 5 shows an image captured by the camera with external light reflection. FIG. 6 shows the detection edge detected by the edge detection unit from the captured image of FIG. 5. FIG. 7 shows a method of exposure control. FIG. 8 shows an image captured by the camera after exposure control. FIG. 9 is a flowchart showing the operation of the difference detection unit of Embodiment 2 in step S104 of FIG. 2.
  • FIG. 1 is a block diagram showing the configuration of the exposure control device 101 of the first embodiment.
  • The exposure control device 101 is a device that controls the exposure of the camera 21 so that the camera 21 can appropriately photograph the eyes of an occupant of the vehicle.
  • Hereinafter, the vehicle equipped with the camera 21 for which the exposure control device performs exposure control is simply referred to as the "vehicle".
  • The camera 21 is mounted on the vehicle and captures an image of the occupant's face. As shown in FIG. 2, the shooting range 31 of the camera 21 includes the face of the occupant 32. The occupant photographed by the camera 21 is typically the driver, but may be an occupant other than the driver.
  • The image acquisition unit 11 acquires the captured image from the camera 21 and outputs it to the eye detection unit 12.
  • The eye detection unit 12 acquires the captured image from the image acquisition unit 11 and detects the outer and inner corners of the occupant's eyes from it.
  • The edge detection unit 13 acquires the captured image and the detection information of the outer and inner corners of the eye from the eye detection unit 12, and from this information detects the edge shape of the eyelid between the outer and inner corners as the detection edge.
  • The difference detection unit 14 acquires the detection edge from the edge detection unit 13 and compares it with the reference edge to detect the difference between the two.
  • The reference edge is an edge prepared in advance as a typical eyelid edge shape. Since there are individual differences in the shape of the eyes, for example in the narrowness and size of the eyes, a plurality of patterns are usually prepared as reference edges, and the difference detection unit 14 extracts the reference edge closest to the detection edge and compares the two. The closer the detection edge is to the reference edge, the more correctly the camera 21 captures the eyes of the occupant, and the more appropriate the exposure can be said to be. If the occupant is wearing spectacles and the lenses reflect external light, it is difficult to accurately detect the edge of the eyelid in the area where the external light is reflected, and the detection edge deviates from the reference edge.
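As an illustrative sketch only (the patent does not disclose code), the selection of the closest reference edge and the computation of the edge difference might look as follows; the point representation, the vertical-distance metric, and all names are assumptions introduced here:

```python
# Sketch: pick, from several prepared reference-edge patterns, the one
# closest to the detection edge, then report the residual difference.
# Edges are lists of (x, y) eyelid points with matching lateral positions.

def edge_difference(detection_edge, reference_edge):
    """Mean vertical distance between corresponding points."""
    return sum(abs(dy - ry) for (_, dy), (_, ry)
               in zip(detection_edge, reference_edge)) / len(detection_edge)

def closest_reference(detection_edge, reference_patterns):
    """Return (best_pattern, difference) over all prepared patterns."""
    return min(((ref, edge_difference(detection_edge, ref))
                for ref in reference_patterns), key=lambda t: t[1])

detected = [(0, 10), (1, 6), (2, 5), (3, 6), (4, 10)]   # five detection points
patterns = [
    [(0, 10), (1, 6), (2, 5), (3, 6), (4, 10)],          # e.g. a narrower eye
    [(0, 12), (1, 5), (2, 3), (3, 5), (4, 12)],          # e.g. a wider eye
]
best, diff = closest_reference(detected, patterns)
print(diff)  # 0.0: the first pattern matches exactly
```

A small difference indicates that the exposure is appropriate; a large difference suggests that part of the edge was distorted, for example by external light reflection.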
  • The exposure control unit 15 acquires the difference between the detection edge and the reference edge from the difference detection unit 14, and controls the exposure of the camera 21 in the direction in which the difference becomes smaller.
  • FIG. 2 is a flowchart showing the operation of the exposure control device 101. The flow of FIG. 2 starts when the accessory power of the vehicle is turned on and the camera 21 starts shooting, and is repeated while the vehicle is running.
  • The image acquisition unit 11 acquires the captured image from the camera 21 (step S101).
  • FIG. 3 shows the shooting range 31 of the camera 21.
  • The shooting range 31 includes the face of the occupant 32.
  • The eye detection unit 12 detects the outer and inner corners of the eye from the captured image (step S102).
  • The edge detection unit 13 detects the edge of the eyelid between the outer and inner corners of the eye from the captured image (step S103).
  • The edge detected here is referred to as the detection edge.
  • The difference detection unit 14 compares the detection edge with the reference edge and detects the difference between the two (step S104).
  • The exposure control unit 15 controls the exposure of the camera 21 so that the difference between the detection edge and the reference edge becomes small, in other words, so that the detection edge approaches the reference edge (step S105).
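The repeated loop of steps S101-S105 can be sketched as a toy simulation; the stub camera, the ideal exposure value, and the simple hill-climb adjustment are illustrative assumptions, not the patent's method:

```python
# Toy simulation of the loop of steps S101-S105 (capture -> detect ->
# compare -> adjust). The stub's edge difference grows as the exposure
# drifts from an assumed ideal value.

class StubCamera:
    IDEAL = 5  # hypothetical exposure at which the eyelid edge is clean

    def __init__(self, exposure):
        self.exposure = exposure

    def capture_edge_difference(self):
        # Stand-in for S101-S104: difference between detection edge and
        # reference edge, modeled as distance from the ideal exposure.
        return abs(self.exposure - self.IDEAL)

def exposure_control_loop(camera, tolerance=0.5, step=1, max_iters=20):
    direction = -1
    prev = camera.capture_edge_difference()
    for _ in range(max_iters):
        if prev <= tolerance:                    # difference small: stop
            break
        camera.exposure += direction * step      # S105: adjust exposure
        diff = camera.capture_edge_difference()  # re-capture and compare
        if diff > prev:                          # difference grew: reverse
            direction = -direction
        prev = diff
    return camera.exposure

cam = StubCamera(exposure=8)
print(exposure_control_loop(cam))  # converges to 5
```

The loop adjusts the exposure in whichever direction shrinks the edge difference, which is the essence of step S105.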
  • As described above, the exposure control device 101 of the first embodiment includes the eye detection unit 12 that detects the outer and inner corners of the eyes from an image of the eyes of a vehicle occupant captured by the camera 21; the edge detection unit 13 that detects the edge of the eyelid between the outer and inner corners as the detection edge from the captured image; the difference detection unit 14 that detects the difference between the detection edge and the reference edge; and the exposure control unit 15 that controls the exposure so that the difference becomes small. Therefore, according to the exposure control device 101, even if part of the eye cannot be photographed due to external light reflection on the lenses of the spectacles worn by the occupant, the entire eye can be photographed by performing appropriate exposure control.
  • <B. Embodiment 2>
  • FIG. 4 shows the detection edge 42 in a state without external light reflection. When the lens 41 of the spectacles does not reflect external light, the entire eye appears clearly in the captured image, so the edge detection unit 13 can accurately detect the edge of the eyelid.
  • The edge detection unit 13 detects the edge of the eyelid at five detection points 421-425, and the solid line connecting the detection points 421-425 is the detection edge 42.
  • Reference points 431-435 corresponding to the detection points 421-425 and the reference edge 43 connecting them are shown by dotted lines. That is, among the points of the reference edge 43, those corresponding to the detection points 421-425 are the reference points 431-435.
  • Here, "corresponding to the detection points 421-425" means, for example, having the same lateral position as the detection points 421-425.
  • In FIG. 4, the detection points 421-425 are detected with high accuracy and coincide with the reference points 431-435. In this case, the exposure control unit 15 does not change the current exposure control.
  • FIG. 5 shows an image captured by the camera 21 in a state where there is external light reflection.
  • Region 44 is the region where external light reflection occurs, and region 45 is the region where it does not occur. Since the part of the face overlapping region 44 does not appear in the captured image, the edge detection unit 13 cannot accurately detect the edge of the eyelid there.
  • FIG. 6 shows the detection edge 42 detected by the edge detection unit 13 from the captured image of FIG. 5.
  • The detection points 421, 422, and 425 in the region 45 coincide with the corresponding reference points 431, 432, and 435, but the detection points 423 and 424 in the region 44 do not coincide with the corresponding reference points 433 and 434.
  • Accordingly, the exposure control unit 15 determines that external light reflection occurs at the detection points 423 and 424, and does not occur at the detection points 421, 422, and 425.
  • Hereinafter, the detection points 423 and 424 in the region where external light reflection occurs are referred to as first detection points, and the detection points 421, 422, and 425 in the region where external light reflection does not occur are referred to as second detection points.
  • The exposure control unit 15 controls the exposure of the camera 21 based on the brightness of the first detection points and the brightness of the second detection points.
  • The brightness of each detection point can be measured from the captured image.
  • For example, one detection point may be selected from each of the first detection points and the second detection points, and the average brightness of the selected detection points used as a reference. In FIG. 7, the detection point 423 is selected as the first detection point and the detection point 421 as the second detection point.
  • The exposure control unit 15 calculates the average brightness LA1 of the detection points 423 and 424, which are the first detection points, and the average brightness LA2 of the detection points 421, 422, and 425, which are the second detection points. Then, the exposure of the camera 21 is controlled with reference to the average value of the average brightness LA1 and the average brightness LA2. The exposure control is performed by adjusting the exposure time or the aperture value of the camera 21, or by adjusting the amount of light of the floodlight 22; a plurality of these adjustment methods may be combined.
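A minimal sketch of this calculation, assuming 8-bit luminance values sampled at the detection points (function name and sample values are invented for illustration):

```python
# Sketch: reference brightness LS as the average of the mean brightness
# of the first detection points (LA1, in the reflection region) and of
# the second detection points (LA2, outside it).

def reference_brightness(first_points, second_points):
    la1 = sum(first_points) / len(first_points)    # average brightness LA1
    la2 = sum(second_points) / len(second_points)  # average brightness LA2
    return (la1 + la2) / 2                         # reference brightness LS

# Brightness at points 423, 424 (glare) and 421, 422, 425 (no glare):
ls = reference_brightness([240, 250], [90, 100, 110])
print(ls)  # 172.5
```

The exposure would then be adjusted (exposure time, aperture, or floodlight output) so that the image brightness approaches LS.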
  • The floodlight 22 is mounted on the vehicle and irradiates the occupant with light.
  • FIG. 8 shows an image captured by the camera 21 after exposure control. Although the brightness of the entire shooting region is reduced by the exposure control, the eyes can now be detected even in the region 44 where the external light reflection occurred.
  • FIG. 9 is a flowchart showing the operation of the difference detection unit 14 of the exposure control device 102 in step S104 of FIG. 2.
  • The edge detection unit 13 detects the edge of the eyelid between the outer and inner corners of the eye at a plurality of detection points (step S103 in FIG. 2).
  • The difference detection unit 14 selects one detection point (step S1401) and performs the following processing on it. That is, the difference detection unit 14 determines whether or not the distance between the detection point and its reference point is equal to or greater than a predetermined threshold value (step S1402).
  • If the distance is equal to or greater than the threshold value, the difference detection unit 14 sets the detection point as a first detection point (step S1403). On the other hand, if the distance is less than the threshold value, the difference detection unit 14 sets the detection point as a second detection point (step S1404).
  • The difference detection unit 14 then determines whether or not the processing has been completed for all the detection points (step S1405). If there are unprocessed detection points, it returns to step S1401; if the processing is completed for all the detection points, it proceeds to step S105.
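Steps S1401-S1405 amount to a per-point threshold classification, which might be sketched as follows (the (x, y) point representation and all names are assumptions):

```python
# Sketch of steps S1401-S1405: classify each detection point as a first
# detection point (distance to its reference point >= threshold, i.e. in
# the reflection region) or a second detection point (distance < threshold).

def classify_points(detection, reference, threshold):
    first, second = [], []
    for point, ref in zip(detection, reference):   # S1401/S1405: each point
        dist = abs(point[1] - ref[1])              # S1402: vertical deviation
        (first if dist >= threshold else second).append(point)  # S1403/S1404
    return first, second

detection = [(0, 10), (1, 6), (2, 1), (3, 2), (4, 10)]   # points 421-425
reference = [(0, 10), (1, 6), (2, 5), (3, 6), (4, 10)]   # points 431-435
first, second = classify_points(detection, reference, threshold=2)
print(len(first), len(second))  # prints "2 3"
```

Here the two points whose deviation reaches the threshold play the role of points 423 and 424, and the remaining three that of points 421, 422, and 425.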
  • FIG. 10 is a flowchart showing the operation of the exposure control unit 15 of the exposure control device 102 in step S105 of FIG. 2.
  • The exposure control unit 15 calculates the average brightness LA1 of the first detection points (step S1501).
  • In the example of FIG. 6, the first detection points are the detection points 423 and 424.
  • The exposure control unit 15 then calculates the average brightness LA2 of the second detection points (step S1502).
  • In the example of FIG. 6, the second detection points are the detection points 421, 422, and 425.
  • The edge detection unit 13 detects the edge of the eyelid at a plurality of detection points. Although five detection points are shown in FIG. 4 and the like, the number of detection points is not limited to five. Further, the edge detection unit 13 may detect the edge of the lower eyelid instead of the upper eyelid, or may detect the edges of both the upper and lower eyelids.
  • In the above, the exposure control unit 15 sets the average value of the average brightness LA1 of the first detection points and the average brightness LA2 of the second detection points as the reference brightness LS, but the reference brightness LS may be set by another method.
  • For example, the exposure control unit 15 may use the median brightness of the first and second detection points taken together as the reference brightness LS, or may use the average of the median brightness of the first detection points and the median brightness of the second detection points as the reference brightness LS.
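These alternative choices of the reference brightness LS can be illustrated with Python's statistics module (the sample luminance values are invented):

```python
# Sketch of the alternative reference-luminance choices: the median over
# all detection points, or the average of the two per-group medians.
from statistics import median

first = [240, 250]       # brightness at first detection points (glare)
second = [90, 100, 110]  # brightness at second detection points (no glare)

ls_median_all = median(first + second)                    # median of all points
ls_avg_of_medians = (median(first) + median(second)) / 2  # average of medians
print(ls_median_all, ls_avg_of_medians)  # prints "110 172.5"
```

A median-based LS is less sensitive than the mean to a single extreme glare value, which is the usual motivation for such a choice.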
  • As described above, in the exposure control device 102 of the second embodiment, the edge detection unit 13 detects the detection edge at a plurality of detection points, and the exposure control unit 15 controls the exposure of the camera 21 based on the brightness of the first detection points, whose distance from the corresponding point of the reference edge is equal to or greater than the threshold value, and the brightness of the second detection points, whose distance from the corresponding point of the reference edge is less than the threshold value. This exposure control makes it possible to photograph the eyes at both the first and second detection points, so that eye information can be detected appropriately even when there is external light reflection.
  • For example, the exposure control unit 15 may control the exposure of the camera 21 based on the average value of the average brightness of the first detection points and the average brightness of the second detection points, based on the average value of the median brightness of the first detection points and the median brightness of the second detection points, or based on the median value of the brightness of the first detection points and the brightness of the second detection points. In each case, this exposure control makes it possible to photograph the eyes at both the first and second detection points.
  • FIG. 11 is a block diagram showing the configuration of the exposure control device 103 according to the third embodiment.
  • The exposure control device 103 includes an eye information detection unit 16 in addition to the components of the exposure control devices 101 and 102 of the first and second embodiments. Since the rest of the configuration of the exposure control device 103 is the same as that of the exposure control devices 101 and 102, its description is omitted.
  • The eye information detection unit 16 acquires the captured image of the camera 21 from the image acquisition unit 11 and detects eye information from the captured image.
  • The eye information is, for example, the direction of the line of sight or the degree of eyelid opening, and is used to grasp the state of the driver.
  • The driver's state is, for example, inattentive driving or drowsy driving.
  • Since the exposure control unit 15 performs appropriate exposure control, the eye information can be accurately detected from the captured image after the exposure control.
  • The eye information detection unit 16 may detect eye information only from the captured image after exposure control, or it may supplement the portion of the eye information detected from the captured image before exposure control where external light reflection occurred with eye information detected from the captured image after exposure control. This complementation of eye information is described below.
  • FIG. 12 shows an image captured before exposure control by the exposure control device 103. Since external light reflection occurs in the region 44, the eye information detection unit 16 can detect the information of the eye portion overlapping the region 45, but cannot accurately detect the information of the eye portion overlapping the region 44.
  • FIG. 13 shows a captured image after exposure control. In the captured image of FIG. 13, the eyes appear in both the region 44 and the region 45, and the eye information detection unit 16 can also detect the information of the eye portion in the region 44 with high accuracy.
  • The eye information detection unit 16 therefore complements the eye information that could not be detected from the captured image before the exposure control with information from the captured image after the exposure control. For example, when the eye information is the edge of the eyelid, the eye information is complemented by replacing the detection points 423 and 424 shown in FIG. 12 with the detection points 4231 and 4241 shown in FIG. 13.
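This replacement-based complementation might be sketched as follows; representing the edge as a mapping from lateral position to vertical edge position, and all values, are assumptions for illustration:

```python
# Sketch: complement the pre-control edge by replacing the unreliable
# points in the glare region (423, 424) with points detected at the same
# lateral positions after exposure control (4231, 4241).

def complement_edge(pre_edge, post_edge, unreliable_xs):
    merged = dict(pre_edge)
    for x in unreliable_xs:
        if x in post_edge:
            merged[x] = post_edge[x]   # take the post-control detection
    return merged

pre = {0: 10, 1: 6, 2: 1, 3: 2, 4: 10}    # points at x=2,3 distorted by glare
post = {2: 5, 3: 6}                        # points 4231, 4241 after control
print(complement_edge(pre, post, [2, 3]))  # {0: 10, 1: 6, 2: 5, 3: 6, 4: 10}
```

Only the positions flagged as unreliable are overwritten, so the well-detected pre-control points are retained.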
  • To perform this complementation, the eye information detection unit 16 estimates the region 44 where external light reflection occurs from the detection points 4231 and 4241, which are the second detection points.
  • For example, the eye information detection unit 16 may estimate a certain region including the detection points 4231 and 4241, specifically a region along the vertical direction of the lens 41, as the region 44 where external light reflection occurs.
  • Alternatively, when the edge detection unit 13 detects the edges of both the upper and lower eyelids, the eye information detection unit 16 may estimate the region connecting the second detection points extracted from both edges as the region 44 where external light reflection occurs. As a result, the region 44 where the external light reflection occurred can be accurately estimated.
  • The eye information detection unit 16 then complements the eye information of the region estimated to contain external light reflection in the captured image before exposure control with the eye information detected from the captured image after exposure control. As a result, the eye information detection unit 16 can obtain accurate eye information.
  • As described above, the exposure control device 103 of the third embodiment includes the eye information detection unit 16 that detects eye information, including eyelid shape information, from the captured image. Therefore, according to the exposure control device 103, the eye information can be detected with high accuracy even when there is external light reflection.
  • Further, the eye information detection unit 16 complements the eye information detected from the captured image before exposure control by the exposure control device 103 with the eye information detected from the captured image after exposure control. As a result, eye information can be detected with high accuracy even when there is external light reflection.
  • FIG. 14 is a block diagram showing the configuration of the exposure control device 104 according to the fourth embodiment.
  • The exposure control device 104 includes a state detection unit 17 in addition to the components of the exposure control device 103 of the third embodiment. Since the rest of the configuration of the exposure control device 104 is the same as that of the exposure control device 103, its description is omitted.
  • The state detection unit 17 detects the driver's state based on the eye information detected by the eye information detection unit 16. For example, the state detection unit 17 detects whether or not the driver is driving inattentively based on the direction of the driver's line of sight detected as eye information. Alternatively, the state detection unit 17 detects whether or not the driver is dozing based on the edge of the driver's eyelids detected as eye information. As a result, when the driver is in a state unsuitable for driving, for example, a warning can be given through a display device or an audio output device mounted on the vehicle.
  • As described above, the exposure control device 104 includes the state detection unit 17 that detects the state of the occupant based on the eye information. Therefore, even if the occupant is wearing glasses, the state of the occupant can be accurately detected.
  • The processing circuit 51 includes the image acquisition unit 11, the eye detection unit 12, the edge detection unit 13, the difference detection unit 14, the exposure control unit 15, the eye information detection unit 16, and the state detection unit 17 (hereinafter collectively referred to as "the image acquisition unit 11 and the like").
  • The processing circuit 51 may be dedicated hardware, or a processor that executes a program stored in a memory.
  • The processor is, for example, a central processing unit, a processing unit, an arithmetic unit, a microprocessor, a microcomputer, or a DSP (Digital Signal Processor).
  • When the processing circuit 51 is dedicated hardware, it may be, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these.
  • The functions of the image acquisition unit 11 and the like may each be realized by separate processing circuits 51, or may be realized collectively by one processing circuit.
  • When the processing circuit 51 is a processor, the functions of the image acquisition unit 11 and the like are realized by a combination of the processor and software (software, firmware, or software and firmware). The software is described as a program and stored in a memory. As shown in FIG. 16, the processor 52 applied to the processing circuit 51 realizes the functions of each unit by reading and executing the program stored in the memory 53. That is, the exposure control devices 101-104 include the memory 53 for storing a program that, when executed by the processing circuit 51, results in the execution of a step of detecting the outer and inner corners of the eyes from an image of the eyes of a vehicle occupant captured by the camera 21, a step of detecting the edge of the eyelid between the outer and inner corners as a detection edge from the captured image, a step of detecting the difference between the detection edge and the reference edge, and a step of performing exposure control so that the difference becomes small.
  • In other words, this program causes a computer to execute the procedures or methods of the image acquisition unit 11 and the like.
  • The memory 53 corresponds to, for example, a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory), or to an HDD (Hard Disk Drive), a magnetic disk, a flexible disk, an optical disk, a compact disk, a mini disk, a DVD (Digital Versatile Disc) and its drive device, or any storage medium to be used in the future.
  • The configuration in which each function of the image acquisition unit 11 and the like is realized by either hardware or software has been described above. However, the present invention is not limited to this; a configuration may be adopted in which a part of the image acquisition unit 11 and the like is realized by dedicated hardware and another part is realized by software or the like.
  • For example, the function of the image acquisition unit 11 can be realized by a processing circuit as dedicated hardware, while the other functions can be realized by the processing circuit 51 as a processor 52 reading and executing a program stored in the memory 53.
  • As described above, the processing circuit 51 can realize each of the above functions by hardware, software, or the like, or a combination thereof.
  • FIG. 17 shows a configuration example in which the exposure control device 103 is constituted by the vehicle 71 and the server 72.
  • In this example, the image acquisition unit 11, the exposure control unit 15, and the eye information detection unit 16 are arranged in the vehicle 71, and the eye detection unit 12, the edge detection unit 13, and the difference detection unit 14 are arranged in the server 72.
  • The embodiments can be freely combined, and each embodiment can be appropriately modified or omitted.
  • Although the present invention has been described in detail, the above description is illustrative in all aspects, and the present invention is not limited thereto. It is understood that innumerable variations not illustrated can be envisioned without departing from the scope of the invention.

Abstract

The purpose of the present invention is to perform appropriate exposure control when a portion of the eye cannot be imaged due to the reflection of external light. An exposure control device according to the present invention comprises an eye detection unit (12) for detecting the outer and inner corners of the eye from an image of the eye of a vehicle occupant captured by a camera (21), an edge detection unit (13) for detecting the edge of an eyelid between the outer and inner corners as a detection edge from the captured image, a difference detection unit (14) for detecting a difference between the detection edge and a reference edge, and an exposure control unit (15) for performing exposure control so that the difference between the detection edge and the reference edge becomes smaller.

Description

Exposure control device and exposure control method
The present invention relates to exposure control when imaging the eyes of a vehicle occupant.
There is an occupant monitoring system that detects an inattentive or dozing state from the position of the occupant's eyes or the degree of eyelid opening and warns the occupant. In such an occupant monitoring system, the eyes of the occupant are photographed by a camera mounted on the vehicle, and the state of the occupant is detected using the captured image. However, when the occupant is wearing spectacles, there is a problem that external light is reflected on the lenses of the spectacles and the eyes cannot be imaged properly.
To address this problem, Patent Document 1 proposes a technique for adjusting the exposure when external light reflection is detected.
Japanese Unexamined Patent Publication No. 2013-175914
However, in the technique of Patent Document 1, the presence or absence of external light reflection is determined by whether or not the eyes can be detected from the captured image. Therefore, even if the eyes are partially hidden by external light reflection, exposure control is not performed as long as the eyes can still be roughly detected, and the eye detection accuracy is not improved.
The present invention has been made to solve the above problem, and its object is to perform appropriate exposure control when part of the eye cannot be photographed due to external light reflection.
The exposure control device of the present invention includes an eye detection unit that detects the outer and inner corners of the eyes from an image of the eyes of a vehicle occupant captured by a camera; an edge detection unit that detects the edge of the eyelid between the outer and inner corners as a detection edge from the captured image; a difference detection unit that detects a difference between the detection edge and a reference edge; and an exposure control unit that performs exposure control so that the difference between the detection edge and the reference edge becomes small.
Since the exposure control device of the present invention performs exposure control so that the difference between the detection edge and the reference edge becomes small, appropriate exposure control can be performed even when part of the eye cannot be photographed due to external light reflection. The objects, features, aspects, and advantages of the present invention will become more apparent from the following detailed description and the accompanying drawings.
FIG. 1 is a block diagram showing the configuration of the exposure control device of Embodiment 1.
FIG. 2 is a flowchart showing the operation of the exposure control device of Embodiment 1.
FIG. 3 shows the shooting range of the camera.
FIG. 4 shows the detection edge in a state without external light reflection.
FIG. 5 shows an image captured by the camera with external light reflection.
FIG. 6 shows the detection edge detected by the edge detection unit from the captured image of FIG. 5.
FIG. 7 shows a method of exposure control.
FIG. 8 shows an image captured by the camera after exposure control.
FIG. 9 is a flowchart showing the operation of the difference detection unit of Embodiment 2 in step S104 of FIG. 2.
FIG. 10 is a flowchart showing the operation of the exposure control unit of Embodiment 2 in step S105 of FIG. 2.
FIG. 11 is a block diagram showing the configuration of the exposure control device of Embodiment 3.
FIG. 12 shows a captured image before exposure control by the exposure control device of Embodiment 3.
FIG. 13 shows a captured image after exposure control.
FIG. 14 is a block diagram showing the configuration of the exposure control device of Embodiment 4.
FIG. 15 shows the hardware configuration of the exposure control device.
FIG. 16 shows the hardware configuration of the exposure control device.
FIG. 17 shows a configuration example of the exposure control device constituted by a vehicle and a server.
<A. Embodiment 1>
<A-1. Configuration>
 FIG. 1 is a block diagram showing the configuration of the exposure control device 101 of Embodiment 1. The exposure control device 101 controls the exposure of a camera 21 so that the camera 21 can appropriately image the eyes of a vehicle occupant. Hereinafter, the vehicle equipped with the camera 21 whose exposure the exposure control device controls is simply referred to as the "vehicle".
 The camera 21 is mounted on the vehicle and captures an image of the occupant's face. As shown in FIG. 3, the imaging range 31 of the camera 21 includes the face of the occupant 32. The occupant imaged by the camera 21 is typically the driver, but may be an occupant other than the driver.
 The image acquisition unit 11 acquires the image captured by the camera 21 and outputs it to the eye detection unit 12.
 The eye detection unit 12 acquires the captured image from the image acquisition unit 11 and detects the outer and inner corners of the occupant's eyes from the captured image.
 The edge detection unit 13 acquires the captured image and the detection information of the outer and inner eye corners from the eye detection unit 12, and detects, as a detection edge, the edge shape of the eyelid between the outer and inner corners of the eye.
 The difference detection unit 14 acquires the detection edge from the edge detection unit 13 and compares it with a reference edge to detect the difference between the two. A reference edge is an edge prepared in advance as a typical eyelid edge shape. Since eye shape varies among individuals, for example in the narrowness and size of the eyes, a plurality of reference edge patterns are usually prepared, and the difference detection unit 14 extracts the reference edge closest to the detection edge and compares the two. The closer the detection edge is to the reference edge, the more correctly the camera 21 is imaging the occupant's eye, and the more appropriate the exposure can be said to be. When the occupant wears spectacles and external light reflection occurs on the spectacle lens, the eyelid edge is difficult to detect accurately in the region where the reflection occurs, so the detection edge deviates from the reference edge.
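The selection of the closest reference edge described above can be sketched as follows. This is an illustration only: the sum-of-vertical-distances metric, the representation of an edge as a list of scalar point heights, and the index-based point correspondence are assumptions, not part of the disclosure.

```python
# Hedged sketch: choosing the prepared reference edge closest to a detection
# edge. Edges are modeled as lists of point heights; points correspond when
# they share an index (i.e. the same horizontal position) -- both assumptions.

def edge_difference(detection_edge, reference_edge):
    """Total distance between corresponding points of two edges."""
    return sum(abs(d - r) for d, r in zip(detection_edge, reference_edge))

def closest_reference(detection_edge, reference_edges):
    """Return the prepared reference edge with the smallest difference."""
    return min(reference_edges,
               key=lambda ref: edge_difference(detection_edge, ref))
```

A smaller returned difference would then indicate that the current exposure is closer to appropriate.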
 The exposure control unit 15 acquires the difference between the detection edge and the reference edge from the difference detection unit 14, and controls the exposure of the camera 21 in the direction that reduces the difference.
<A-2. Operation>
 FIG. 2 is a flowchart showing the operation of the exposure control device 101. The flow of FIG. 2 starts, for example, when the accessory power of the vehicle is turned on and the camera 21 starts imaging, and is repeated while the vehicle is running.
 First, the image acquisition unit 11 acquires the image captured by the camera 21 (step S101). FIG. 3 shows the imaging range 31 of the camera 21; as shown there, the imaging range 31 includes the face of the occupant 32. Next, the eye detection unit 12 detects the outer and inner corners of the eye from the captured image (step S102). Then, the edge detection unit 13 detects the edge of the eyelid between the outer and inner eye corners from the captured image (step S103); the edge detected here is referred to as the detection edge. Next, the difference detection unit 14 compares the detection edge with the reference edge and detects the difference between the two (step S104). Finally, the exposure control unit 15 controls the exposure of the camera 21 so that the difference between the detection edge and the reference edge becomes small, in other words so that the detection edge approaches the reference edge (step S105).
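One iteration of steps S101 to S105 can be summarized as the following sketch. The callables passed in are hypothetical stand-ins for the processing units of FIG. 1; none of the names are from the disclosure.

```python
# Illustrative sketch of one pass through the flow of FIG. 2 (S101-S105).
# Each step is injected as a callable so the control structure stands alone.

def control_iteration(capture, detect_corners, detect_edge, compare, adjust):
    image = capture()                     # S101: image acquisition unit 11
    corners = detect_corners(image)       # S102: eye detection unit 12
    edge = detect_edge(image, corners)    # S103: edge detection unit 13
    difference = compare(edge)            # S104: difference detection unit 14
    adjust(difference)                    # S105: exposure control unit 15
    return difference
```

In the device, this iteration would repeat while the vehicle is running, each pass nudging the exposure so the next detection edge lands closer to the reference edge.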
<A-3. Effect>
 As described above, the exposure control device 101 of Embodiment 1 includes: the eye detection unit 12 that detects the outer and inner corners of an eye from an image of a vehicle occupant's eye captured by the camera 21; the edge detection unit 13 that detects, as a detection edge, the eyelid edge between the outer and inner eye corners from the captured image; the difference detection unit 14 that detects the difference between the detection edge and a reference edge; and the exposure control unit 15 that performs exposure control so as to reduce the difference between the detection edge and the reference edge. Therefore, according to the exposure control device 101, even when external light reflection on the lens of the spectacles worn by the occupant prevents part of the eye from being imaged, the entire eye can be imaged by performing appropriate exposure control.
<B. Embodiment 2>
<B-1. Configuration>
 The configuration of the exposure control device 102 of Embodiment 2 is the same as that of the exposure control device 101 of Embodiment 1 shown in FIG. 1.
<B-2. Operation>
 FIG. 4 shows the detection edge 42 in a state without external light reflection. If there is no external light reflection on the spectacle lens 41, the entire eye appears clearly in the captured image, so the edge detection unit 13 can accurately detect the eyelid edge. The edge detection unit 13 detects the eyelid edge at five detection points 421-425, and the solid line connecting the detection points 421-425 is the detection edge 42. In FIG. 4, the reference points 431-435 corresponding to the detection points 421-425, and the reference edge 43 connecting the reference points 431-435, are shown by dotted lines. The points of the reference edge 43 that correspond to the detection points 421-425 are the reference points 431-435; "corresponding to" a detection point means, for example, having the same horizontal position as that detection point. In FIG. 4, the detection points 421-425 are detected with high accuracy and coincide with the reference points 431-435. In this case, the exposure control unit 15 does not change the current exposure control.
 FIG. 5 shows an image captured by the camera 21 in a state with external light reflection. In FIG. 5, the region of the spectacle lens 41 where external light reflection occurs is denoted as region 44, and the region where it does not occur as region 45. Since the part of the face overlapping the region 44 does not appear in the captured image, the edge detection unit 13 cannot accurately detect the eyelid edge there.
 FIG. 6 shows the detection edge 42 detected by the edge detection unit 13 from the captured image of FIG. 5. The detection points 421, 422, and 425 in the region 45 coincide with the corresponding reference points 431, 432, and 435, but the detection points 423 and 424 in the region 44 do not coincide with the corresponding reference points 433 and 434. From this, the exposure control unit 15 determines that external light reflection occurs at the detection points 423 and 424 and does not occur at the detection points 421, 422, and 425. Hereinafter, the detection points 423 and 424 in the region where external light reflection occurs are referred to as first detection points, and the detection points 421, 422, and 425 in the region where it does not occur are referred to as second detection points.
 The exposure control unit 15 controls the exposure of the camera 21 based on the luminance of the first detection points and the luminance of the second detection points; the luminance of each detection point can be measured from the captured image. In the simplest method, one detection point is selected from each of the first and second detection points, and the average luminance of the two selected points is used as the reference. In the example of FIG. 7, the detection point 423 is selected as the first detection point, and the detection point 421 is selected as the second detection point.
 Alternatively, the exposure control unit 15 calculates the average luminance LA1 of the first detection points 423 and 424 and the average luminance LA2 of the second detection points 421, 422, and 425, and controls the exposure of the camera 21 using the mean of the average luminances LA1 and LA2 as the reference. The exposure control is performed by adjusting the exposure time or the aperture value of the camera 21, or by adjusting the light amount of the floodlight 22; these adjustment methods may also be combined. The floodlight 22 is mounted on the vehicle and irradiates the occupant with light.
 FIG. 8 shows an image captured by the camera 21 after exposure control. Although the exposure control lowers the luminance of the entire imaging region, the eye can now be detected even in the region 44 where external light reflection occurred.
 The overall operation flow of the exposure control device 102 is as shown in FIG. 2. FIG. 9 is a flowchart showing the operation of the difference detection unit 14 of the exposure control device 102 in step S104 of FIG. 2, and the operation is described below with reference to it. The edge detection unit 13 detects the eyelid edge between the inner and outer eye corners at a plurality of detection points (step S103 of FIG. 2). The difference detection unit 14 then selects one detection point (step S1401) and processes it as follows. The difference detection unit 14 determines whether the distance between the detection point and its reference point is at or above a predetermined threshold (step S1402). If the distance is at or above the threshold, the difference detection unit 14 sets the detection point as a first detection point (step S1403); if the distance is below the threshold, it sets the detection point as a second detection point (step S1404).
 Next, the difference detection unit 14 determines whether all detection points have been processed (step S1405). If an unprocessed detection point remains, the difference detection unit 14 returns to step S1401; when all detection points have been processed, it proceeds to step S105.
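The classification of steps S1401 to S1405 can be sketched as follows. For illustration only, detection and reference points are modeled as scalar vertical positions and matched by index; the disclosure does not fix such a representation.

```python
# Hedged sketch of FIG. 9 (S1401-S1405): each detection point is compared
# with its corresponding reference point, and points whose distance is at
# or above the threshold are classified as first detection points
# (reflection suspected); the rest become second detection points.

def classify_points(detection_points, reference_points, threshold):
    first, second = [], []
    for det, ref in zip(detection_points, reference_points):
        if abs(det - ref) >= threshold:   # S1402/S1403: distance >= threshold
            first.append(det)
        else:                             # S1404: distance < threshold
            second.append(det)
    return first, second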
 FIG. 10 is a flowchart showing the operation of the exposure control unit 15 of the exposure control device 102 in step S105 of FIG. 2, and the operation is described below with reference to it. The exposure control unit 15 calculates the average luminance LA1 of the first detection points (step S1501); in the example of FIG. 7, the first detection points are the detection points 423 and 424. Next, the exposure control unit 15 calculates the average luminance LA2 of the second detection points (step S1502); in the example of FIG. 7, the second detection points are the detection points 421, 422, and 425.
 Next, the exposure control unit 15 calculates the reference luminance LS = (LA1 + LA2) / 2 (step S1503). That is, the exposure control unit 15 takes as the reference luminance LS the mean of the average luminance LA1 of the first detection points and the average luminance LA2 of the second detection points. The exposure control unit 15 then controls the exposure of the camera 21 based on the reference luminance LS (step S1504).
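The computation of steps S1501 to S1503 amounts to the following; only the formula LS = (LA1 + LA2) / 2 is from the disclosure, while the function name and the use of plain luminance lists are illustrative assumptions.

```python
# Sketch of S1501-S1503 of FIG. 10: the reference luminance LS is the mean
# of LA1 (average luminance of the first detection points, where reflection
# is suspected) and LA2 (average luminance of the second detection points).

def reference_luminance(first_luminances, second_luminances):
    la1 = sum(first_luminances) / len(first_luminances)    # S1501
    la2 = sum(second_luminances) / len(second_luminances)  # S1502
    return (la1 + la2) / 2                                 # S1503
```

Because LS sits between the bright reflected region and the darker unreflected region, exposing for LS darkens the reflection enough for the eyelid edge to reappear without losing the rest of the eye.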
<B-3. Modifications>
 The above description assumed that the edge detection unit 13 detects the eyelid edge at a plurality of detection points. Although FIG. 4 and the related figures show five detection points, the number of detection points is not limited to five. Furthermore, the edge detection unit 13 may detect the edge of the lower eyelid instead of the upper eyelid, or may detect the edges of both the upper and lower eyelids.
 Also, in the above, the exposure control unit 15 takes the mean of the average luminance LA1 of the first detection points and the average luminance LA2 of the second detection points as the reference luminance LS, but the reference luminance LS may be set in other ways. For example, the exposure control unit 15 may take as the reference luminance LS the median of the luminances of the first and second detection points combined, or the mean of the median luminance of the first detection points and the median luminance of the second detection points.
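The two variant baselines just described can be sketched as follows; the function names are placeholders, and only the median/mean combinations themselves come from the text above.

```python
# Hedged sketch of the variant reference luminances: either the median of
# all detection-point luminances combined, or the mean of the two
# per-group medians.
from statistics import median

def ls_overall_median(first_luminances, second_luminances):
    return median(first_luminances + second_luminances)

def ls_mean_of_medians(first_luminances, second_luminances):
    return (median(first_luminances) + median(second_luminances)) / 2
```

Median-based baselines are less sensitive to a single outlier detection point than the averages of the previous section, which may matter when one point is badly misdetected.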
<B-4. Effects>
 In the exposure control device 102 of Embodiment 2, the edge detection unit 13 detects the detection edge at a plurality of detection points. The exposure control unit 15 then controls the exposure of the camera 21 based on the luminance of the first detection points, whose distance from the corresponding points of the reference edge is at or above the threshold, and the luminance of the second detection points, whose distance from the corresponding points of the reference edge is below the threshold. This exposure control makes it possible to image the eye at both the first and second detection points, so that eye information can be detected appropriately even when there is external light reflection.
 The exposure control unit 15 may control the exposure of the camera 21 based on the mean of the average luminance of the first detection points and the average luminance of the second detection points; based on the mean of the median luminance of the first detection points and the median luminance of the second detection points; or based on the median of the luminances of the first and second detection points combined. Each of these exposure controls likewise makes it possible to image the eye at both the first and second detection points.
<C. Embodiment 3>
<C-1. Configuration>
 FIG. 11 is a block diagram showing the configuration of the exposure control device 103 of Embodiment 3. The exposure control device 103 includes an eye information detection unit 16 in addition to the configuration of the exposure control devices 101 and 102 of Embodiments 1 and 2. Since the rest of the exposure control device 103 is the same as the exposure control devices 101 and 102, its description is omitted.
 The eye information detection unit 16 acquires the image captured by the camera 21 from the image acquisition unit 11 and detects eye information from the captured image. Eye information is, for example, the gaze direction or the opening degree of the eyelids, and is used to grasp the driver's state, such as inattentive driving or drowsy driving.
 According to the exposure control device 103, even if external light reflection occurs on the lens of the spectacles worn by the occupant, the exposure control unit 15 performs appropriate exposure control, so that eye information can be detected accurately from the captured image after exposure control.
<C-2. Operation>
 The eye information detection unit 16 may detect eye information only from the captured image after exposure control, or it may complement the part of the eye information detected from the captured image before exposure control where external light reflection occurred with the eye information detected from the captured image after exposure control. The complementing of eye information is described below.
 FIG. 12 shows a captured image before exposure control by the exposure control device 103. Since external light reflection occurs in the region 44, the eye information detection unit 16 can detect the information of the eye portion overlapping the region 45, but cannot detect the information of the eye portion overlapping the region 44 with high accuracy. FIG. 13 shows the captured image after exposure control. In the captured image of FIG. 13, the eye appears in both the regions 44 and 45, and the eye information detection unit 16 can detect the information of the eye portion in the region 45 with high accuracy.
 The eye information detection unit 16 complements the eye information that could not be detected from the captured image before exposure control using the captured image after exposure control. For example, when the eye information is the eyelid edge, the eye information is complemented by substituting the detection points 4231 and 4241 shown in FIG. 13 for the detection points 423 and 424 shown in FIG. 12.
 When the eye information includes information other than the eyelid edge, such as pupil information, the eye information detection unit 16 estimates the region 44 where external light reflection occurred from the detection points 4231 and 4241, which are second detection points. For example, the eye information detection unit 16 may estimate, as the region 44, a fixed region including the detection points 4231 and 4241, specifically a region extending along the vertical direction of the lens 41. Alternatively, when the edge detection unit 13 detects the edges of both the upper and lower eyelids, the eye information detection unit 16 may estimate, as the region 44, the region connecting the second detection points extracted from both edges. This allows the region 44 where external light reflection occurred to be estimated accurately.
 The eye information detection unit 16 then complements the eye information detected from the captured image before exposure control with the eye information detected from the region of the captured image estimated to contain external light reflection. In this way, the eye information detection unit 16 can obtain accurate eye information.
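The eyelid-edge case of this complementing step can be sketched as follows. The index-based correspondence between pre- and post-control detection points, and the scalar point representation, are illustrative assumptions.

```python
# Hedged sketch of the complement step of Embodiment 3: eyelid-edge points
# that fell in the reflection region of the pre-control image are replaced
# by the points re-detected at the same positions in the post-control image.

def complement_edge(pre_edge, post_edge, reflected_indices):
    merged = list(pre_edge)           # start from the pre-control edge
    for i in reflected_indices:
        merged[i] = post_edge[i]      # substitute the re-detected point
    return merged
```

The merged edge keeps the well-exposed points of the original image while filling in only the points lost to reflection.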
<C-3. Effects>
 The exposure control device 103 includes the eye information detection unit 16 that detects, from the captured image, eye information including eyelid shape information. According to the exposure control device 103, eye information can therefore be detected with high accuracy even when there is external light reflection.
 Furthermore, the eye information detection unit 16 complements the eye information detected from the captured image before exposure control by the exposure control device 103 with the eye information detected from the captured image after exposure control. This also enables eye information to be detected with high accuracy even when there is external light reflection.
<D. Embodiment 4>
<D-1. Configuration>
 FIG. 14 is a block diagram showing the configuration of the exposure control device 104 of Embodiment 4. The exposure control device 104 includes a state detection unit 17 in addition to the exposure control device 103 of Embodiment 3. Since the rest of the exposure control device 104 is the same as the exposure control device 103, its description is omitted.
 The state detection unit 17 detects the driver's state based on the eye information detected by the eye information detection unit 16. For example, the state detection unit 17 detects whether the driver is driving inattentively based on the driver's gaze direction detected as eye information, or detects whether the driver is dozing based on the edge of the driver's eyelid detected as eye information. Thus, when the driver is in a state unsuitable for driving, for example, a warning can be issued through a display device or an audio output device mounted on the vehicle.
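The state detection just described can be sketched as simple threshold rules on the eye information. The concrete thresholds, units, and labels below are entirely illustrative assumptions; the disclosure does not specify decision criteria.

```python
# Hedged sketch of the state detection unit 17 of Embodiment 4: classify
# the driver's state from gaze direction and eyelid opening. All thresholds
# and labels are hypothetical placeholders.

def detect_state(gaze_angle_deg, eyelid_opening_ratio,
                 gaze_limit=30.0, opening_limit=0.2):
    if abs(gaze_angle_deg) > gaze_limit:
        return "inattentive"   # gaze far from straight ahead
    if eyelid_opening_ratio < opening_limit:
        return "drowsy"        # eyelids nearly closed
    return "normal"
```

A returned non-"normal" state would then trigger the warning through the vehicle's display or audio output device.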
<D-2. Effect>
 The exposure control device 104 includes the state detection unit 17, which detects the occupant's state based on the eye information. Therefore, even when the occupant is wearing spectacles, the occupant's state can be detected accurately.
<E. Hardware configuration>
 In the exposure control devices 101-104 described above, the image acquisition unit 11, the eye detection unit 12, the edge detection unit 13, the difference detection unit 14, the exposure control unit 15, the eye information detection unit 16, and the state detection unit 17 are realized by the processing circuit 51 shown in FIG. 15. That is, the processing circuit 51 includes the image acquisition unit 11, the eye detection unit 12, the edge detection unit 13, the difference detection unit 14, the exposure control unit 15, the eye information detection unit 16, and the state detection unit 17 (hereinafter referred to as "the image acquisition unit 11 and so on"). The processing circuit 51 may be dedicated hardware, or may be a processor that executes a program stored in a memory. The processor is, for example, a central processing unit, a processing device, an arithmetic device, a microprocessor, a microcomputer, or a DSP (Digital Signal Processor).
 When the processing circuit 51 is dedicated hardware, the processing circuit 51 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof. The functions of the respective units may each be realized by separate processing circuits 51, or the functions of the units may be realized collectively by a single processing circuit.
 When the processing circuit 51 is a processor, the functions of the image acquisition unit 11 and so on are realized by software or the like (software, firmware, or a combination of software and firmware). The software or the like is written as a program and stored in memory. As shown in FIG. 16, the processor 52 serving as the processing circuit 51 realizes the functions of the respective units by reading and executing the program stored in the memory 53. That is, the exposure control devices 101-104 include the memory 53 for storing a program that, when executed by the processing circuit 51, results in the execution of: a step of detecting the outer and inner corners of an eye from an image of the eye of a vehicle occupant captured by the camera 21; a step of detecting the eyelid between the outer corner and the inner corner from the captured image as a detection edge; a step of detecting a difference between the detection edge and a reference edge; and a step of controlling the exposure of the camera so that the difference between the detection edge and the reference edge becomes small. In other words, it can be said that this program causes a computer to execute the procedures or methods of the image acquisition unit 11 and so on. Here, the memory 53 may be, for example, a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory); an HDD (Hard Disk Drive); a magnetic disk, flexible disk, optical disc, compact disc, MiniDisc, or DVD (Digital Versatile Disk) and its drive device; or any storage medium used in the future.
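As an illustrative sketch only (not part of the specification), the four program steps stored in the memory 53 amount to a feedback loop that searches for an exposure minimizing the edge difference. In the Python sketch below, `capture` is a hypothetical stand-in for the camera 21 together with the eye and eyelid-edge detection steps, and the greedy step search is an assumed control strategy; the document does not prescribe a particular search method.

```python
def edge_difference(detected, reference):
    """Sum of absolute per-point distances between the detected eyelid
    edge and the reference edge (each a sequence of y-coordinates)."""
    return sum(abs(d - r) for d, r in zip(detected, reference))

def control_exposure(capture, reference, exposure=0.5, step=0.05, iters=20):
    """Greedy search: nudge exposure in whichever direction shrinks the
    difference between the detected edge and the reference edge.
    `capture(exposure)` is a hypothetical function returning the edge
    detected at that exposure setting."""
    for _ in range(iters):
        here = edge_difference(capture(exposure), reference)
        up = edge_difference(capture(exposure + step), reference)
        down = edge_difference(capture(exposure - step), reference)
        if up < here and up <= down:
            exposure += step       # longer exposure matches better
        elif down < here:
            exposure -= step       # shorter exposure matches better
        else:
            break                  # no neighboring exposure improves the match
    return exposure
```

With a simulated capture whose detected edge coincides with the reference at one particular exposure, the loop converges to that exposure within the step tolerance.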
 The above description covers configurations in which each function of the image acquisition unit 11 and so on is realized by either hardware or software or the like. However, the configuration is not limited to this: part of the image acquisition unit 11 and so on may be realized by dedicated hardware, and another part by software or the like. For example, the function of the image acquisition unit 11 can be realized by a processing circuit serving as dedicated hardware, while the remaining functions are realized by the processing circuit 51 serving as the processor 52 reading and executing a program stored in the memory 53.
 As described above, the processing circuit can realize each of the functions described above by hardware, software or the like, or a combination thereof.
 Although the exposure control devices 101-104 have been described above as in-vehicle devices, the invention can also be applied to a system built by appropriately combining an in-vehicle device, a PND (Portable Navigation Device), a communication terminal (for example, a mobile terminal such as a mobile phone, smartphone, or tablet), the functions of applications installed on them, a server, and the like. In that case, the functions or components of the exposure control devices 101-104 described above may be distributed among the devices making up the system, or concentrated in any one of them. FIG. 17 shows a configuration example of the exposure control device 103 using a vehicle 71 and a server 72. In this configuration example, the image acquisition unit 11, the exposure control unit 15, and the eye information detection unit 16 are arranged in the vehicle 71, while the eye detection unit 12, the edge detection unit 13, and the difference detection unit 14 are arranged in the server 72.
 Within the scope of the present invention, the embodiments may be freely combined, and each embodiment may be modified or omitted as appropriate. Although the present invention has been described in detail, the above description is illustrative in all aspects, and the invention is not limited thereto. It is understood that countless variations not illustrated here can be envisioned without departing from the scope of the invention.
 11 image acquisition unit, 12 eye detection unit, 13 edge detection unit, 14 difference detection unit, 15 exposure control unit, 16 eye information detection unit, 17 state detection unit, 21 camera, 22 light projector, 101-104 exposure control devices.

Claims (9)

  1.  An exposure control device comprising:
     an eye detection unit that detects the outer corner and the inner corner of an eye of a vehicle occupant from an image of the eye captured by a camera;
     an edge detection unit that detects, from the captured image, the edge of the eyelid between the outer corner and the inner corner as a detection edge;
     a difference detection unit that detects a difference between the detection edge and a reference edge; and
     an exposure control unit that performs exposure control so that the difference between the detection edge and the reference edge becomes small.
  2.  The exposure control device according to claim 1, wherein
     the edge detection unit detects the detection edge at a plurality of detection points, and
     the exposure control unit performs exposure control of the camera based on the luminance of first detection points, which are detection points whose distance from the corresponding point on the reference edge is equal to or greater than a threshold, and the luminance of second detection points, which are detection points whose distance from the corresponding point on the reference edge is less than the threshold.
  3.  The exposure control device according to claim 2, wherein
     the exposure control unit performs exposure control of the camera based on the average of the mean luminance of the first detection points and the mean luminance of the second detection points.
  4.  The exposure control device according to claim 2, wherein
     the exposure control unit performs exposure control of the camera based on the average of the median luminance of the first detection points and the median luminance of the second detection points.
  5.  The exposure control device according to claim 2, wherein
     the exposure control unit performs exposure control of the camera based on the median of the luminances of the first detection points and the second detection points.
  6.  The exposure control device according to claim 1, further comprising
     an eye information detection unit that detects, from the captured image, eye information that is information on the eye.
  7.  The exposure control device according to claim 6, wherein
     the eye information detection unit supplements the eye information detected from the captured image before exposure control by the exposure control device with the eye information detected from the captured image after exposure control.
  8.  The exposure control device according to claim 6, further comprising
     a state detection unit that detects the state of the occupant based on the eye information.
  9.  An exposure control method comprising:
     detecting the outer corner and the inner corner of an eye of a vehicle occupant from an image of the eye captured by a camera;
     detecting, from the captured image, the eyelid between the outer corner and the inner corner as a detection edge;
     detecting a difference between the detection edge and a reference edge; and
     performing exposure control of the camera so that the difference between the detection edge and the reference edge becomes small.
PCT/JP2019/012738 2019-03-26 2019-03-26 Exposure control device and exposure control method WO2020194489A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2019/012738 WO2020194489A1 (en) 2019-03-26 2019-03-26 Exposure control device and exposure control method
JP2021507101A JP6873351B2 (en) 2019-03-26 2019-03-26 Exposure control device and exposure control method


Publications (1)

Publication Number Publication Date
WO2020194489A1 true WO2020194489A1 (en) 2020-10-01

Family

ID=72609344


Country Status (2)

Country Link
JP (1) JP6873351B2 (en)
WO (1) WO2020194489A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11183127A (en) * 1997-12-25 1999-07-09 Nissan Motor Co Ltd Eye position detector
JP2000310510A (en) * 1999-04-28 2000-11-07 Niles Parts Co Ltd Device for detecting position of eye
JP2002352229A (en) * 2001-05-30 2002-12-06 Mitsubishi Electric Corp Face region detector
JP2008224565A (en) * 2007-03-15 2008-09-25 Aisin Seiki Co Ltd Eye state discrimination device, eye state discrimination method, and eye state discrimination program
JP2009116797A (en) * 2007-11-09 2009-05-28 Aisin Seiki Co Ltd Face imaging apparatus, face imaging method, and program of the same
JP2013045317A (en) * 2011-08-25 2013-03-04 Denso Corp Face image detector
JP2013175914A (en) * 2012-02-24 2013-09-05 Denso Corp Imaging controller and program


Also Published As

Publication number Publication date
JP6873351B2 (en) 2021-05-19
JPWO2020194489A1 (en) 2021-09-13


Legal Events

121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 19921839; Country of ref document: EP; Kind code of ref document: A1
ENP Entry into the national phase
    Ref document number: 2021507101; Country of ref document: JP; Kind code of ref document: A
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 19921839; Country of ref document: EP; Kind code of ref document: A1