WO2020194489A1 - Exposure control device and exposure control method - Google Patents

Exposure control device and exposure control method

Info

Publication number
WO2020194489A1
WO2020194489A1 (PCT/JP2019/012738)
Authority
WO
WIPO (PCT)
Prior art keywords
exposure control
detection
edge
eye
control device
Prior art date
Application number
PCT/JP2019/012738
Other languages
English (en)
Japanese (ja)
Inventor
吉本 忠文
昇吾 米山
篤 松本
Original Assignee
三菱電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 filed Critical 三菱電機株式会社
Priority to JP2021507101A priority Critical patent/JP6873351B2/ja
Priority to PCT/JP2019/012738 priority patent/WO2020194489A1/fr
Publication of WO2020194489A1 publication Critical patent/WO2020194489A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Definitions

  • The present invention relates to exposure control when imaging the eyes of a vehicle occupant.
  • An occupant monitoring system is known that detects an inattentive state or dozing state from the position of the occupant's eyes or the degree of eyelid opening and warns the occupant.
  • In such a system, the eyes of the occupant are photographed by a camera mounted on the vehicle, and the state of the occupant is detected using the photographed image.
  • When the occupant is wearing spectacles, there is a problem that external light is reflected on the lens of the spectacles and an image of the eyes cannot be taken properly.
  • Patent Document 1 proposes a technique for adjusting the exposure when external light reflection is detected.
  • The present invention has been made to solve the above problem, and its object is to perform appropriate exposure control when a part of the eye cannot be photographed due to external light reflection.
  • The exposure control device of the present invention includes an eye detection unit that detects the outer and inner corners of the eyes from an image of the eyes of a vehicle occupant captured by a camera, an edge detection unit that detects the edge of the eyelid between the outer and inner corners of the eye from the captured image as a detection edge, a difference detection unit that detects the difference between the detection edge and a reference edge, and an exposure control unit that performs exposure control so that the difference between the detection edge and the reference edge becomes small.
  • Since the exposure control device of the present invention performs exposure control so that the difference between the detection edge and the reference edge becomes small, appropriate exposure control can be performed when a part of the eye cannot be photographed due to external light reflection.
  • FIG. 1 is a block diagram showing the configuration of the exposure control device of Embodiment 1.
  • FIG. 2 is a flowchart showing the operation of the exposure control device of Embodiment 1. FIG. 3 shows the shooting range of the camera. FIG. 4 shows the detection edge in a state without external light reflection. FIG. 5 shows an image taken by the camera with external light reflection. FIG. 6 shows the detection edge detected by the edge detection unit from the captured image of FIG. 5. FIG. 7 shows the method of exposure control. FIG. 8 shows an image taken by the camera after exposure control. FIG. 9 is a flowchart showing the operation of the difference detection unit of Embodiment 2 in step S104 of FIG. 2.
  • FIG. 1 is a block diagram showing the configuration of the exposure control device 101 of the first embodiment.
  • the exposure control device 101 is a device that controls the exposure of the camera 21 so that the camera 21 can appropriately photograph the eyes of the occupants of the vehicle.
  • a vehicle equipped with a camera 21 for which the exposure control device performs exposure control is simply referred to as a "vehicle".
  • The camera 21 is mounted on the vehicle and captures an image of the occupant's face. As shown in FIG. 3, the shooting range 31 of the camera 21 is a range including the face of the occupant 32. The occupant photographed by the camera 21 is typically the driver, but may be an occupant other than the driver.
  • the image acquisition unit 11 acquires the captured image of the camera 21 and outputs it to the eye detection unit 12.
  • the eye detection unit 12 acquires a photographed image from the image acquisition unit 11 and detects the outer and inner corners of the occupant's eyes from the photographed image.
  • the edge detection unit 13 acquires a captured image and detection information of the outer and inner corners of the eye from the eye detection unit 12, and detects the edge shape of the eyelid between the outer and inner corners of the eye as a detection edge from the information.
  • the difference detection unit 14 acquires the detection edge from the edge detection unit 13 and compares the detection edge with the reference edge to detect the difference between the two.
  • The reference edge is an edge prepared in advance as the edge shape of a typical eyelid. Since eye shapes differ between individuals, for example in narrowness and size, a plurality of patterns are usually prepared for the reference edge, and the difference detection unit 14 extracts the reference edge closest to the detection edge and compares the two. The closer the detection edge is to the reference edge, the more correctly the camera 21 is capturing the occupant's eyes, and the exposure can be said to be appropriate. If the occupant is wearing spectacles and the lens of the spectacles reflects external light, it is difficult to accurately detect the edge of the eyelid in the area where the external light is reflected, so the detection edge deviates from the reference edge.
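Since several reference-edge patterns are prepared and the closest one is used for the comparison, the selection can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation: edges are simplified to lists of eyelid y-coordinates at matching detection points.

```python
def closest_reference_edge(detection_edge, reference_patterns):
    # Return the prepared reference-edge pattern whose point-wise distance
    # to the detection edge is smallest; this pattern is then used as the
    # reference edge for the difference detection.
    def total_distance(pattern):
        return sum(abs(d - r) for d, r in zip(detection_edge, pattern))
    return min(reference_patterns, key=total_distance)
```

With a detection edge of `[1, 2, 2]`, the pattern `[1, 2, 1]` would be chosen over `[0, 0, 0]` because its total distance is smaller.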
  • the exposure control unit 15 acquires the difference between the detection edge and the reference edge from the difference detection unit 14, and controls the exposure of the camera 21 in the direction in which the difference becomes smaller.
  • FIG. 2 is a flowchart showing the operation of the exposure control device 101. The flow of FIG. 2 starts at the timing when the accessory power of the vehicle is turned on and the camera 21 starts shooting, and is repeated while the vehicle is running.
  • the image acquisition unit 11 acquires the captured image of the camera 21 (step S101).
  • FIG. 3 shows the shooting range 31 of the camera 21.
  • the photographing range 31 is a range including the face of the occupant 32.
  • the eye detection unit 12 detects the outer and inner corners of the eye from the captured image (step S102).
  • the edge detection unit 13 detects the edge of the eyelid between the outer and inner corners of the eye from the captured image (step S103).
  • the edge detected here is referred to as a detection edge.
  • the difference detection unit 14 compares the detection edge with the reference edge and detects the difference between the two (step S104).
  • the exposure control unit 15 controls the exposure of the camera 21 so that the difference between the detection edge and the reference edge becomes small, in other words, the detection edge approaches the reference edge (step S105).
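The loop of steps S101 through S105 can be sketched as a single control iteration. The sketch below is illustrative only: edges are represented as lists of eyelid y-coordinates at fixed detection points, and the proportional `gain` and the sign of the correction (lowering the exposure value as the difference grows, on the assumption that reflection washes out the eyelid) are hypothetical choices, not specified by the patent.

```python
def edge_difference(detection_edge, reference_edge):
    # Step S104: mean distance between corresponding detection and
    # reference points.
    return sum(abs(d - r) for d, r in zip(detection_edge, reference_edge)) / len(detection_edge)

def control_step(detection_edge, reference_edge, exposure, gain=0.1):
    # Step S105: adjust the exposure in the direction that makes the
    # difference between detection edge and reference edge smaller.
    diff = edge_difference(detection_edge, reference_edge)
    return exposure - gain * diff, diff
```

Repeating this step on successive captured images corresponds to the flow of FIG. 2 being executed while the vehicle is running.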
  • As described above, the exposure control device 101 of the first embodiment includes an eye detection unit 12 that detects the outer and inner corners of the eyes from the image of the eyes of the vehicle occupant captured by the camera 21, an edge detection unit 13 that detects the edge of the eyelid between the outer and inner corners of the eye from the captured image as the detection edge, a difference detection unit 14 that detects the difference between the detection edge and the reference edge, and an exposure control unit 15 that performs exposure control so that the difference becomes small. Therefore, according to the exposure control device 101, even if a part of the eye cannot be photographed due to external light reflection on the lens of the spectacles worn by the occupant, the entire eye can be photographed by performing appropriate exposure control.
  • Embodiment 2
  • FIG. 4 shows the detection edge 42 in the absence of external light reflection. If the lens 41 of the spectacles does not reflect external light, the entire eye is clearly captured in the captured image, so that the edge detection unit 13 can accurately detect the edge of the eyelid.
  • the edge detection unit 13 detects the edge of the eyelid at five detection points 421-425, and the solid line connecting the detection points 421-425 is the detection edge 42.
  • The reference points 431-435 corresponding to the detection points 421-425 and the reference edge 43 connecting them are shown by dotted lines; of the reference edge 43, the points corresponding to the detection points 421-425 are the reference points 431-435.
  • "Corresponding to the detection points 421-425" means, for example, that the lateral positions are equal to those of the detection points 421-425.
  • the detection points 421-425 are detected with high accuracy and coincide with the reference points 431-435. In this case, the exposure control unit 15 does not change the current exposure control.
  • FIG. 5 shows an image taken by the camera 21 in a state where there is external light reflection.
  • region 44 the region where external light reflection occurs
  • region 45 the region where external light reflection does not occur. Since the face portion overlapping the region 44 does not appear in the captured image, the edge detection unit 13 cannot accurately detect the edge of the eyelid.
  • FIG. 6 shows the detection edge 42 detected by the edge detection unit 13 from the captured image of FIG.
  • The detection points 421, 422, and 425 in the region 45 coincide with the corresponding reference points 431, 432, and 435, but the detection points 423 and 424 in the region 44 do not coincide with the corresponding reference points 433 and 434.
  • The exposure control unit 15 therefore determines that external light reflection occurs at the detection points 423 and 424, and that external light reflection does not occur at the detection points 421, 422, and 425.
  • the detection points 423 and 424 in the region where external light reflection occurs are referred to as first detection points
  • the detection points 421, 422 and 425 in the region where external light reflection does not occur are referred to as second detection points.
  • the exposure control unit 15 controls the exposure of the camera 21 based on the brightness of the first detection point and the brightness of the second detection point.
  • the brightness of each detection point can be measured from the captured image.
  • One detection point may be selected from each of the first detection points and the second detection points, and the average brightness of the selected detection points used as a reference. For example, the detection point 423 is selected from the first detection points, and the detection point 421 is selected from the second detection points.
  • The exposure control unit 15 calculates the average brightness LA1 of the detection points 423 and 424, which are the first detection points, and the average brightness LA2 of the detection points 421, 422, and 425, which are the second detection points. Then, the exposure of the camera 21 is controlled with reference to the average value of the average brightness LA1 and the average brightness LA2. Exposure control is performed by adjusting the exposure time or the aperture value of the camera 21, or by adjusting the amount of light of the floodlight 22; a plurality of these adjustment methods may be combined.
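The reference brightness described here — the average of the first-point average LA1 and the second-point average LA2 — can be computed as below. This is a minimal sketch; the brightness values are assumed to be grayscale samples read from the captured image at the detection points.

```python
def reference_brightness(first_points, second_points):
    # LA1: average brightness of the first detection points
    # (region where external light reflection occurs).
    la1 = sum(first_points) / len(first_points)
    # LA2: average brightness of the second detection points
    # (region without external light reflection).
    la2 = sum(second_points) / len(second_points)
    # Reference value: the average of the two averages.
    return (la1 + la2) / 2
```

For instance, with first-point brightnesses of 200 and 220 and second-point brightnesses of 80, 100, and 120, LA1 is 210, LA2 is 100, and the reference value is 155.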
  • the floodlight 22 is mounted on the vehicle and irradiates the occupants with light.
  • FIG. 8 shows a captured image of the camera 21 after exposure control. Although the brightness of the entire photographing region is reduced by the exposure control, it is possible to detect the eyes even in the region 44 where the external light reflection has occurred.
  • FIG. 9 is a flowchart showing the operation of the difference detection unit 14 of the exposure control device 102 in step S104 of FIG.
  • the edge detection unit 13 detects the edge of the eyelid between the inner and outer corners of the eye at a plurality of detection points (step S103 in FIG. 2).
  • the difference detection unit 14 selects one detection point (step S1401), and performs the following processing on the selected detection point. That is, the difference detection unit 14 determines whether or not the distance between the detection point and the reference point is equal to or greater than a predetermined threshold value (step S1402).
  • the difference detection unit 14 sets the detection point as the first detection point (step S1403). On the other hand, when the distance between the detection point and the reference point is less than the threshold value, the difference detection unit 14 sets the detection point as the second detection point (step S1404).
  • the difference detection unit 14 determines whether or not the processing has been completed for all the detection points (step S1405). If there are unprocessed detection points, the difference detection unit 14 returns to step S1401, and if the processing is completed for all the detection points, the difference detection unit 14 proceeds to step S105.
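The classification loop of steps S1401 through S1405 can be sketched as follows. As an assumption of this sketch, each detection point and its reference point are reduced to scalar y-coordinates, so the "distance" is a simple absolute difference.

```python
def classify_detection_points(detection_points, reference_points, threshold):
    # Steps S1401-S1404: a detection point whose distance to its
    # corresponding reference point is >= threshold becomes a first
    # detection point (external light reflection assumed); otherwise
    # it becomes a second detection point.
    first, second = [], []
    for det, ref in zip(detection_points, reference_points):
        (first if abs(det - ref) >= threshold else second).append(det)
    # Step S1405: every detection point has been processed.
    return first, second
```

With detection points `[10, 11, 25, 27, 10]` against a flat reference of 10 and a threshold of 5, the points 25 and 27 become first detection points, mirroring the detection points 423 and 424 of FIG. 6.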
  • FIG. 10 is a flowchart showing the operation of the exposure control unit 15 of the exposure control device 102 in step S105 of FIG.
  • the exposure control unit 15 calculates the average brightness LA1 of the first detection point (step S1501).
  • the first detection points are detection points 423 and 424.
  • the exposure control unit 15 calculates the average brightness LA2 of the second detection point (step S1502).
  • the second detection points are the detection points 421, 422, and 425.
  • The edge detection unit 13 detects the edge of the eyelid at a plurality of detection points. Although five detection points are shown in FIG. 4 and the like, the number of detection points is not limited to five. Further, the edge detection unit 13 may detect the edge of the lower eyelid instead of the upper eyelid, or may detect both the edge of the upper eyelid and the edge of the lower eyelid.
  • The exposure control unit 15 sets the average value of the average brightness LA1 of the first detection points and the average brightness LA2 of the second detection points as the reference brightness LS, but the reference brightness LS may be set by another method.
  • For example, the exposure control unit 15 may use the median brightness of the first detection points and the second detection points pooled together as the reference brightness LS, or may use the average of the median brightness of the first detection points and the median brightness of the second detection points as the reference brightness LS.
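The two median-based alternatives for the reference brightness LS can be sketched with the standard library as follows (illustrative only; brightness values are again assumed to be grayscale samples at the detection points):

```python
from statistics import median

def ls_median_of_all(first_points, second_points):
    # Alternative 1: median brightness over all first and second
    # detection points pooled together.
    return median(first_points + second_points)

def ls_average_of_medians(first_points, second_points):
    # Alternative 2: average of the per-group median brightnesses.
    return (median(first_points) + median(second_points)) / 2
```

With the same sample values as before (first: 200 and 220; second: 80, 100, 120), the pooled median is 120 while the average of the medians is 155, showing that the two alternatives can yield different reference values.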
  • As described above, the edge detection unit 13 detects the detection edge at a plurality of detection points. The exposure control unit 15 then controls the exposure of the camera 21 based on the brightness of the first detection points, which are detection points whose distance from the corresponding point of the reference edge is equal to or greater than a threshold value, and the brightness of the second detection points, whose distance from the corresponding point of the reference edge is less than the threshold value. This exposure control makes it possible to photograph the eyes at both the first detection points and the second detection points; therefore, even if there is external light reflection, eye information can be detected appropriately.
  • the exposure control unit 15 controls the exposure of the camera 21 based on the average value of the average brightness of the first detection point and the average brightness of the second detection point. This exposure control makes it possible to photograph the eyes at both the first detection point and the second detection point.
  • the exposure control unit 15 controls the exposure of the camera 21 based on the average value of the median brightness of the first detection point and the median brightness of the second detection point. This exposure control makes it possible to photograph the eyes at both the first detection point and the second detection point.
  • the exposure control unit 15 controls the exposure of the camera 21 based on the median value of the brightness of the first detection point and the brightness of the second detection point. This exposure control makes it possible to photograph the eyes at both the first detection point and the second detection point.
  • FIG. 11 is a block diagram showing the configuration of the exposure control device 103 according to the third embodiment.
  • the exposure control device 103 includes an eye information detection unit 16 in addition to the configurations of the exposure control devices 101 and 102 of the first and second embodiments. Since the configuration of the exposure control device 103 other than the eye information detection unit 16 is the same as that of the exposure control devices 101 and 102, the description thereof will be omitted.
  • the eye information detection unit 16 acquires the captured image of the camera 21 from the image acquisition unit 11 and detects the eye information from the captured image.
  • the eye information is, for example, the direction of the line of sight or the opening degree of the eyelids, and is used to grasp the state of the driver.
  • the driver's condition is, for example, inattentive driving or dozing driving.
  • Since the exposure control unit 15 performs appropriate exposure control, the eye information can be accurately detected from the captured image after the exposure control.
  • The eye information detection unit 16 may detect eye information only from the captured image after exposure control, or may complement the portion of the eye information detected from the captured image before exposure control where external light reflection occurs with the eye information detected from the captured image after exposure control. The complementation of eye information will be described below.
  • FIG. 12 shows an image taken before exposure control by the exposure control device 103. Since the external light reflection occurs in the region 44, the eye information detection unit 16 can detect the information of the eye portion overlapping the region 45, but cannot detect the information of the eye portion overlapping the region 44 with high accuracy.
  • FIG. 13 shows a captured image after exposure control. In the captured image of FIG. 13, eyes are reflected in both the area 44 and the area 45, and the eye information detection unit 16 can detect the information of the eye portion in the area 45 with high accuracy.
  • The eye information detection unit 16 complements the eye information that could not be detected from the captured image before the exposure control with the eye information detected from the captured image after the exposure control. For example, when the eye information is the edge of the eyelid, the eye information is complemented by replacing the detection points 423 and 424 detected from the image before exposure control with the detection points 4231 and 4241 shown in FIG. 13.
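The complementation step — substituting the points that fell in the reflection region with the points detected after exposure control — can be sketched as below. The index set that marks which points lie in the reflection region is an assumption of this sketch; in the example of FIG. 6 it would cover the detection points corresponding to 423 and 424.

```python
def complement_edge(pre_edge, post_edge, reflected_indices):
    # Keep the points detected before exposure control outside the
    # reflection region, and take the post-exposure-control points
    # (e.g. 4231 and 4241) inside it.
    merged = list(pre_edge)
    for i in reflected_indices:
        merged[i] = post_edge[i]
    return merged
```

Here `None` stands in for a point that could not be detected in the reflected region before exposure control.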
  • At this time, the eye information detection unit 16 estimates the region 44 where external light reflection occurs from the detection points 4231 and 4241, which are the second detection points.
  • the eye information detection unit 16 may estimate a certain region including the detection points 4231 and 4241, specifically, a region along the vertical direction of the lens 41 as a region 44 in which external light reflection occurs.
  • When the edge detection unit 13 detects the edges of both the upper eyelid and the lower eyelid, the eye information detection unit 16 may estimate the region connecting the second detection points extracted from both edges as the region 44 where external light is reflected. As a result, the region 44 where the external light reflection has occurred can be estimated accurately.
  • The eye information detection unit 16 complements the eye information detected from the captured image before the exposure control with the eye information detected, after the exposure control, from the region estimated to have external light reflection. As a result, the eye information detection unit 16 can obtain accurate eye information.
  • the exposure control device 103 includes an eye information detection unit 16 that detects eye information including eyelid shape information from a captured image. Therefore, according to the exposure control device 103, the eye information can be detected with high accuracy even when there is external light reflection.
  • the eye information detection unit 16 complements the eye information detected from the captured image before the exposure control by the exposure control device 103 with the eye information detected from the captured image after the exposure control. As a result, eye information can be detected with high accuracy even when there is external light reflection.
  • FIG. 14 is a block diagram showing the configuration of the exposure control device 104 according to the fourth embodiment.
  • the exposure control device 104 includes a state detection unit 17 in addition to the exposure control device 103 of the third embodiment. Since the configuration of the exposure control device 104 other than the state detection unit 17 is the same as that of the exposure control device 103, the description thereof will be omitted.
  • the state detection unit 17 detects the driver's state based on the eye information detected by the eye information detection unit 16. For example, the state detection unit 17 detects whether or not the driver is inattentive driving based on the direction of the driver's line of sight detected as eye information. Alternatively, the state detection unit 17 detects whether or not the driver is dozing based on the edge of the driver's eyelids detected as eye information. As a result, for example, when the driver is in a state unsuitable for driving, a warning can be given through a display device or a voice output device mounted on the vehicle.
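Dozing detection from the eyelid information can be sketched as a check for a long run of nearly closed eyelids across consecutive frames. The opening-degree scale, the closure threshold, and the frame count below are hypothetical parameters for illustration; the patent does not specify them.

```python
def is_dozing(openings, closed_threshold=0.2, min_closed_frames=10):
    # Track the longest run of consecutive frames in which the eyelid
    # opening degree stays below the closure threshold; a sufficiently
    # long run is taken as a sign of dozing.
    run = longest = 0
    for opening in openings:
        run = run + 1 if opening < closed_threshold else 0
        longest = max(longest, run)
    return longest >= min_closed_frames
```

An analogous check on the line-of-sight direction (e.g. gaze held away from the road for many frames) could serve the inattentive-driving detection mentioned above.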
  • the exposure control device 104 includes a state detection unit that detects the state of the occupant based on eye information. Therefore, even if the occupant is wearing glasses, the state of the occupant can be accurately detected.
  • The processing circuit 51 realizes the functions of the image acquisition unit 11, the eye detection unit 12, the edge detection unit 13, the difference detection unit 14, the exposure control unit 15, the eye information detection unit 16, and the state detection unit 17 (hereinafter referred to as "the image acquisition unit 11 and the like").
  • Dedicated hardware may be applied to the processing circuit 51, or a processor that executes a program stored in the memory may be applied.
  • the processor is, for example, a central processing unit, a processing unit, an arithmetic unit, a microprocessor, a microcomputer, a DSP (Digital Signal Processor), or the like.
  • the processing circuit 51 may be, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these.
  • The functions of the image acquisition unit 11 and the like may each be realized by a separate processing circuit 51, or the functions of each part may be collectively realized by one processing circuit.
  • When the processing circuit 51 is a processor, the functions of the image acquisition unit 11 and the like are realized by software or the like (software, firmware, or a combination of software and firmware). Software and the like are described as programs and stored in memory. As shown in FIG. 16, the processor 52 applied as the processing circuit 51 realizes the functions of each part by reading and executing the program stored in the memory 53. That is, the exposure control devices 101-104 include a memory 53 for storing a program that, when executed by the processing circuit 51, results in the execution of a step of detecting the outer and inner corners of the eyes from the image of the eyes of the vehicle occupant captured by the camera 21, a step of detecting the edge of the eyelid between the outer and inner corners of the eye from the captured image as a detection edge, a step of detecting the difference between the detection edge and the reference edge, and a step of performing exposure control so that the difference becomes small.
  • this program causes the computer to execute the procedure or method of the image acquisition unit 11 or the like.
  • The memory 53 may be, for example, a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory), or may be an HDD (Hard Disk Drive), a magnetic disk, a flexible disk, an optical disk, a compact disc, a mini disc, a DVD (Digital Versatile Disk) and its drive device, or any storage medium that may be used in the future.
  • The case where each function of the image acquisition unit 11 and the like is realized by either hardware or software has been described above.
  • However, the present invention is not limited to this, and a configuration may be adopted in which a part of the image acquisition unit 11 and the like is realized by dedicated hardware and another part is realized by software or the like.
  • For example, the image acquisition unit 11 can realize its function by a processing circuit as dedicated hardware, while the other parts can realize their functions by the processing circuit 51 as the processor 52 reading and executing the program stored in the memory 53.
  • the processing circuit can realize each of the above-mentioned functions by hardware, software, or a combination thereof.
  • FIG. 17 shows a configuration example in which the exposure control device 103 is realized by the vehicle 71 and the server 72.
  • the image acquisition unit 11, the exposure control unit 15, and the eye information detection unit 16 are arranged in the vehicle 71, and the eye detection unit 12, the edge detection unit 13, and the difference detection unit 14 are arranged in the server 72.
  • each embodiment can be freely combined, and each embodiment can be appropriately modified or omitted.
  • Although the present invention has been described in detail, the above description is illustrative in all aspects, and the invention is not limited thereto. It is understood that innumerable variations not illustrated can be envisioned without departing from the scope of the invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

The purpose of the present invention is to perform appropriate exposure control when part of the eye cannot be imaged due to the reflection of external light. To this end, the invention relates to an exposure control device comprising an eye detection unit (12) for detecting the outer corner and the inner corner of the eye from an image of the eye of a vehicle occupant captured by a camera (21); an edge detection unit (13) for detecting the edge of an eyelid between the outer corner and the inner corner as a detection edge from the captured image; a difference detection unit (14) for detecting a difference between the detection edge and a reference edge; and an exposure control unit (15) for performing exposure control such that the difference between the detection edge and the reference edge becomes smaller.
PCT/JP2019/012738 2019-03-26 2019-03-26 Exposure control device and exposure control method WO2020194489A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2021507101A JP6873351B2 (ja) 2019-03-26 2019-03-26 Exposure control device and exposure control method
PCT/JP2019/012738 WO2020194489A1 (fr) 2019-03-26 2019-03-26 Exposure control device and exposure control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/012738 WO2020194489A1 (fr) 2019-03-26 2019-03-26 Exposure control device and exposure control method

Publications (1)

Publication Number Publication Date
WO2020194489A1 true WO2020194489A1 (fr) 2020-10-01

Family

ID=72609344

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/012738 WO2020194489A1 (fr) 2019-03-26 2019-03-26 Exposure control device and exposure control method

Country Status (2)

Country Link
JP (1) JP6873351B2 (fr)
WO (1) WO2020194489A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11183127A (ja) * 1997-12-25 1999-07-09 Nissan Motor Co Ltd Eye position detection device
JP2000310510A (ja) * 1999-04-28 2000-11-07 Niles Parts Co Ltd Eye position detection device
JP2002352229A (ja) * 2001-05-30 2002-12-06 Mitsubishi Electric Corp Face part detection device
JP2008224565A (ja) * 2007-03-15 2008-09-25 Aisin Seiki Co Ltd Eye state determination device, eye state determination method, and eye state determination program
JP2009116797A (ja) * 2007-11-09 2009-05-28 Aisin Seiki Co Ltd Face image capturing device, face image capturing method, and program therefor
JP2013045317A (ja) * 2011-08-25 2013-03-04 Denso Corp Face image detection device
JP2013175914A (ja) * 2012-02-24 2013-09-05 Denso Corp Imaging control device and program

Also Published As

Publication number Publication date
JPWO2020194489A1 (ja) 2021-09-13
JP6873351B2 (ja) 2021-05-19

Similar Documents

Publication Publication Date Title
US9876950B2 (en) Image capturing apparatus, control method thereof, and storage medium
US8508652B2 (en) Autofocus method
JP5098259B2 (ja) Camera
US20080074529A1 (en) Imaging apparatus, control method of imaging apparatus, and computer program
US9531945B2 (en) Image capturing device with an auto-focusing method thereof
JP4998308B2 (ja) Focus adjustment device and imaging device
JP6415196B2 (ja) Imaging device and method for controlling imaging device
JP2013160919A (ja) Image signal processing device and image signal processing method
US20200250806A1 (en) Information processing apparatus, information processing method, and storage medium
US10594938B2 (en) Image processing apparatus, imaging apparatus, and method for controlling image processing apparatus
JP2007274587A (ja) Imaging device and control method thereof
JP2017130976A5 (ja) Image processing device and image processing method
US9781331B2 (en) Imaging apparatus, and control method thereof
US9609202B2 (en) Image pickup apparatus and control method with focus adjusting modes
JP6873351B2 (ja) Exposure control device and exposure control method
JP2019028959A (ja) Image registration device, image registration system, and image registration method
US10638042B2 (en) Electronic device, control device for controlling electronic device, control program, and control method
JP6381206B2 (ja) Image processing device, control method thereof, and program
JP2007189595A (ja) In-vehicle camera device
EP3163369B1 (fr) Commande de la mise au point automatique dans une caméra en vue d'empêcher l'oscillation
WO2019097677A1 (fr) Dispositif et procédé de commande de capture d'image et système de surveillance de conducteur comprenant un dispositif de commande de capture d'image
US20210195110A1 (en) Control apparatus, lens apparatus, image pickup apparatus, and image pickup system
JP6429724B2 (ja) Imaging device and control method thereof
US20240177463A1 (en) Image processing apparatus, control apparatus, image processing method, and storage medium
US20230077645A1 (en) Interchangeable lens and image pickup apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19921839

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021507101

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19921839

Country of ref document: EP

Kind code of ref document: A1