WO2022014068A1 - Video processing device and video processing method - Google Patents


Info

Publication number
WO2022014068A1
WO2022014068A1 (PCT/JP2020/047730)
Authority
WO
WIPO (PCT)
Prior art keywords
face
image
orientation
unit
vehicle
Prior art date
Application number
PCT/JP2020/047730
Other languages
French (fr)
Japanese (ja)
Inventor
久 小磯
Original Assignee
株式会社Jvcケンウッド
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Jvcケンウッド
Publication of WO2022014068A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/16 Anti-collision systems

Definitions

  • the present invention relates to a video processing apparatus and a video processing method.
  • In a video processing device such as a driver monitor mounted on a vehicle, the vehicle interior is photographed and video including the driver is stored in a storage device.
  • The video of the vehicle interior captured by the video processing device is used, for example, to confirm driving conditions such as the direction the driver's face is facing, and to check the behavior of passengers in the cabin.
  • Patent Document 1 describes a system composed of a camera that captures the face of an automobile driver and an image processing device that processes the face images obtained from the camera. At device start-up, a group of face models corresponding to various orientations of the driver's face is generated from the driver's frontal face captured by the camera; after this generation is complete, the driver's face images are input cyclically, and the driver's face orientation is detected by selecting the best-matching face model from the group.
  • The conventional driver face image processing device described in Patent Document 1 detects the direction of the driver's face moment by moment, but this alone is insufficient for determining, for example, whether the driver is looking away while driving. One approach is to define a standard position for the driver's face in the driver's seat, determine the front direction from that standard position within the angle of view captured by the video processing device, and judge that the driver is looking away when the face orientation deviates from that front direction.
  • In practice, however, the driver's face shifts forward, backward, left, right, up, and down from the standard position, and depending on the driver's preference the driver may drive facing slightly to the right or left of the lane center. In such cases, the driver is judged to be facing a direction different from the front direction predetermined on the video processing device side.
  • The shooting direction also changes depending on how the video processing device is mounted.
  • Even if the front direction from the standard face position is predetermined on the video processing device side, the front direction may shift when, for example, the rear-view mirror is adjusted.
  • The present invention has been made in view of such circumstances, and an object thereof is to provide a video processing device and a video processing method capable of correcting the front direction of the face of the subject being imaged.
  • One embodiment of the present invention comprises a video acquisition unit that acquires video including the face of the imaged subject, an orientation detection unit that detects the orientation of the face based on the video acquired by the video acquisition unit, and a correction processing unit that corrects the front direction of the subject's face orientation. The correction processing unit corrects the center of the face orientation, obtained from the distribution of face orientations detected multiple times by the orientation detection unit, as the front direction of the face orientation of the imaged subject.
  • The video processing method includes a video acquisition step of acquiring video including the face of the imaged subject, an orientation detection step of detecting the orientation of the face based on the video acquired in the video acquisition step, and a correction processing step of correcting the front direction of the subject's face orientation. The correction processing step corrects the center of the face orientation, obtained from the distribution of face orientations detected multiple times in the orientation detection step, as the front direction of the face orientation of the imaged subject.
  • The present invention will be described below with reference to FIGS. 1 to 5 based on a preferred embodiment.
  • The same or equivalent components and members shown in the drawings are designated by the same reference numerals, and duplicate descriptions are omitted as appropriate.
  • Some members that are not important for explaining the embodiment are omitted from the drawings.
  • FIG. 1 is a block diagram showing a configuration of a video processing device 100 according to an embodiment.
  • The video processing device 100 is, for example, a driver monitor, which is mounted on the vehicle and continuously captures, over time, the occupants in the vehicle cabin.
  • the image processing apparatus 100 may record the captured image, or may discard it without recording it.
  • the image processing device 100 is arranged, for example, on a dashboard below the windshield of a vehicle, on a rear-view mirror, or the like.
  • The video processing device 100 captures video including, for example, the driver of the vehicle as the imaged subject, and recognizes the whole face of the subject or facial parts such as the eyes.
  • The video processing device 100 detects the orientation of the recognized face of the driver (the imaged subject), acquires face-orientation data over a predetermined period, and obtains the distribution of face orientations.
  • The video processing device 100 obtains the center of the face orientation based on, for example, the distribution of the driver's face orientations, and corrects that center to be the front direction of the driver's face orientation.
  • The face of the driver, who is the imaged subject, generally faces the front of the vehicle for safe driving, but as described above, the center of the direction each driver's face faces differs due to individual differences and the like.
  • Strictly speaking, the center of the direction the driver's face faces should be called the reference direction of the face orientation, but in the present specification it is referred to as the front direction for convenience. That is, "the orientation of the driver's face" and "the orientation of the face of the imaged subject" mean the direction of the face, which changes from moment to moment; for example, if the driver faces the passenger seat in a frame captured at a certain instant by the video processing device 100, that direction is the face orientation at that instant.
  • The "front direction of the face orientation" (or simply the "front direction") refers to the direction the face of the driver or imaged subject mainly faces in the video captured by the video processing device 100 (the reference direction described above), that is, the center obtained from the distribution of the momentarily changing face orientation.
  • the "front direction of the face orientation” may be expressed by terms such as "reference direction of the face orientation” or "main direction of the face orientation”.
  • The video processing device 100 detects, for example, that the vehicle is traveling, and corrects the front direction of the driver's face orientation. Alternatively, the video processing device 100 may notify the driver that the front-direction correction is about to be performed, and correct the front direction while the driver deliberately faces forward.
  • Correcting the driver's face orientation to the front direction in the video processing device 100 means calibrating the front direction of the face orientation; by executing such calibration, the orientation of the driver's face in the video can be analyzed correctly.
  • A description of functions based on the driver's face orientation captured by the video processing device 100 is omitted here, but the video processing device 100 may have a function of outputting data on the driver's face orientation, and a warning function that detects looking away and notifies the driver. The video processing device 100 may further have a function of statistically processing, based on the distribution or the like, how far the driver's face orientation deviates from the front direction, and of diagnosing and evaluating the driver's face orientation.
  • the video processing device 100 includes an imaging unit 10, a recording unit 20, a video processing unit 30, an external input unit 40, an external output unit 41, and the like.
  • the image pickup unit 10 is a camera having a detector such as a CCD, and captures an image of a vehicle interior including, for example, a driver in a vehicle as an image subject.
  • The imaging unit 10 continuously acquires video frames over time and sends them to the video acquisition unit 31, described later.
  • the recording unit 20 is, for example, a detachable medium such as an SD card or a USB memory, a hard disk, or the like, and can record and delete the video acquired by the video acquisition unit 31.
  • the recording unit 20 may be configured to be detachably attached to the video processing device 100. In this case, the recording unit 20 can be removed from the video processing device 100 and the video can be played back on another PC or the like.
  • the external input unit 40 acquires vehicle speed information, position information, and the like from an external device. Further, the external input unit 40 may acquire data measured by an acceleration sensor, a gyro, or the like mounted on the vehicle from an external device.
  • The external output unit 41 outputs the video information processed by the video processing unit 30, data on the driver's face orientation, and the like to an external device. The face-orientation data detected by the video processing device 100 and the correction data for the front direction of the face orientation may be recorded in the recording unit 20 and then output from the external output unit 41. The video processing device 100 may also add the speed information, position information, and the like acquired by the external input unit 40 to the video, record it in the recording unit 20, and output it from the external output unit 41.
  • the video processing unit 30 includes a video acquisition unit 31, a video recognition unit 32, an orientation detection unit 33, a correction processing unit 34, and a travel determination unit 35.
  • the video processing unit 30 is configured by, for example, a CPU or the like, and operates according to a computer program to execute the processing by each of the above-mentioned units.
  • The storage unit 30a is composed of data storage devices such as a RAM (Random Access Memory), a flash memory, or a hard disk drive, and stores the computer programs executed by the video processing unit 30. The storage unit 30a also stores a recognition dictionary for recognizing the face, eyes, and the like of the imaged subject from the captured video.
  • the image acquisition unit 31 acquires the image captured by the image pickup unit 10, performs processing such as data compression, and outputs the image to the recording unit 20.
  • the image acquisition unit 31 may be configured to include the image pickup unit 10.
  • The video recognition unit 32 recognizes the face of the imaged subject and facial parts such as the eyes from the video input from the video acquisition unit 31, based on the recognition dictionary stored in the storage unit 30a, a learning-type computation model, and the like.
  • The recognition dictionary stored in the storage unit 30a includes shape data for faces, eyes, and so on; the video recognition unit 32 extracts the shape patterns appearing in the video and recognizes the face, eyes, and the like by collating them with the shape data included in the recognition dictionary.
  • The video recognition unit 32 can recognize faces, eyes, and the like from video using known image recognition methods developed in various fields; for example, a method of recognizing faces or eyes with a learning-type computation model using a neural network may be used.
  • the orientation detection unit 33 detects the orientation of the face of the image subject based on the data of the face and each part of the face recognized by the image recognition unit 32.
  • FIG. 2 is a schematic diagram for explaining the orientation of the face.
  • FIG. 2 shows, for example, the driver's face viewed from the front side of the vehicle, with the front direction of the vehicle taken as the X-axis, the left direction of the vehicle as the Y-axis, and the upward direction of the vehicle as the Z-axis.
  • the rotation around the X axis is defined as a roll
  • the rotation around the Y axis is defined as pitch
  • the rotation around the Z axis is defined as yaw.
  • The orientation of the face of the imaged subject changes through pitching and yawing. That is, by detecting pitching of the face orientation, it can be seen that the imaged subject is changing the face orientation in the vertical direction; by detecting yawing, it can be seen that the subject is changing the face orientation in the left-right direction.
  • The orientation detection unit 33 detects the pitch angle from, for example, the vertical movement of the line L1 connecting the left and right eyes shown in FIG. 2, and detects the yaw angle from the movement of the center line L2 of the nose and the positions P1 and P2 of the left and right outer eye corners. In addition to these parts, the orientation detection unit 33 can detect the pitch and yaw angles of the face based on the movement of facial parts such as the eyebrows, nose, and mouth; known techniques developed in various technical fields can be used to detect the face orientation.
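As a rough illustration of how pitch and yaw angles might be derived from the landmarks named above (the eye line L1, the nose center line L2, and the outer eye corners P1 and P2), the following sketch maps 2-D landmark offsets to angles. The geometry, the image-coordinate convention, and the neutral-pitch ratio are illustrative assumptions, not the patent's actual computation:

```python
import math

def estimate_yaw_pitch(eye_left, eye_right, nose_tip, face_width, face_height,
                       neutral_pitch_ratio=0.3):
    """Estimate face (pitch, yaw) in degrees from 2-D landmark positions.

    eye_left / eye_right are the outer eye corners P1 and P2; nose_tip lies on
    the nose center line L2 (cf. FIG. 2). All thresholds and ratios here are
    assumptions for illustration.
    """
    mid_x = (eye_left[0] + eye_right[0]) / 2.0
    mid_y = (eye_left[1] + eye_right[1]) / 2.0
    # Yaw: horizontal offset of the nose line from the eye-corner midpoint,
    # normalised by half the face width and mapped through arcsine.
    x_off = (nose_tip[0] - mid_x) / (face_width / 2.0)
    yaw = math.degrees(math.asin(max(-1.0, min(1.0, x_off))))
    # Pitch: vertical position of the eye line L1 relative to the nose tip,
    # compared against an assumed neutral eye-to-nose distance.
    y_off = (nose_tip[1] - mid_y) / face_height - neutral_pitch_ratio
    pitch = math.degrees(math.asin(max(-1.0, min(1.0, y_off))))
    return pitch, yaw
```

For a face looking straight at the camera the offsets cancel and both angles come out near zero; shifting the nose line sideways yields a non-zero yaw.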
  • the correction processing unit 34 corrects the front direction of the face orientation of the image subject based on the distribution of the face orientation of the image subject detected by the orientation detection unit 33.
  • FIG. 3 is a graph showing an example of the distribution of face orientation detected by the orientation detection unit 33.
  • In FIG. 3, the horizontal axis is the yaw angle and the vertical axis is the pitch angle; the pitch and yaw angles of the face orientation detected by the orientation detection unit 33 are plotted as shading, in a so-called heat map. That is, for the pitch- and yaw-angle data detected within a predetermined period, the number of detections is integrated for each combination of pitch angle and yaw angle, and cells with a low integrated count are drawn dark while cells with a high integrated count are drawn bright.
  • the predetermined period is, for example, several seconds to several minutes, but may be set to a period of any time length.
  • The heat map used for detection may be output as video information through the external output unit 41, in real time or non-real time, for adjusting the video processing device 100 or for a temporary operation check by the driver, or it may be recorded as data in the recording unit 20.
  • The correction processing unit 34 obtains the center of the distribution of pitch and yaw angles shown in FIG. 3, and corrects that center to be the front direction of the face orientation.
  • The center of the distribution can be determined using statistical measures such as the mean, median, or mode. For example, as shown in FIG. 3, the correction processing unit 34 obtains the position of the star mark as the center of the distribution, and corrects the direction of the star mark to be the front direction of the face orientation of the imaged subject.
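The choice of statistic for the distribution center can be sketched as follows; the `(pitch, yaw)` sample format and the per-degree binning used to make a mode well defined for continuous angles are assumptions for illustration:

```python
import statistics
from collections import Counter

def distribution_center(samples, method="median"):
    """Center of a face-orientation distribution.

    samples is a list of (pitch_deg, yaw_deg) pairs accumulated over the
    predetermined period; method selects one of the statistics named in
    the text: "mean", "median", or "mode".
    """
    pitches = [p for p, _ in samples]
    yaws = [y for _, y in samples]
    if method == "mean":
        return statistics.mean(pitches), statistics.mean(yaws)
    if method == "median":
        return statistics.median(pitches), statistics.median(yaws)
    if method == "mode":
        # Bin continuous angles to whole degrees so a mode is well defined.
        mode_of = lambda vals: Counter(round(v) for v in vals).most_common(1)[0][0]
        return mode_of(pitches), mode_of(yaws)
    raise ValueError("unknown method: " + method)
```

The median is the natural default here since, as the text notes later, isolated glances far from the front direction would otherwise pull the mean away from the true center.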
  • the travel determination unit 35 determines whether or not the vehicle on which the video processing device 100 itself is mounted is traveling.
  • The travel determination unit 35 may determine whether the vehicle is traveling based on the speed information and position information acquired by the external input unit 40, or based on the video acquired by the video acquisition unit 31.
  • FIG. 4 is a schematic diagram showing an example of an image acquired by the image acquisition unit 31.
  • The rear window A and the windows B of the driver's seat and passenger seat appear in the video shown in FIG. 4, and the travel determination unit 35 detects the flow of the scenery in each window from inter-frame differences in the video.
  • the travel determination unit 35 determines that the vehicle is traveling when the scenery in each window is flowing, and determines that the vehicle is not traveling when the scenery in each window is not flowing.
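The inter-frame difference check on the window regions might be sketched as below; the grey-level frame format, the region coordinates, and the threshold value are assumptions for illustration, not values from the specification:

```python
def scenery_flowing(prev_frame, curr_frame, window_regions, diff_threshold=10.0):
    """Decide whether the scenery seen through the windows is flowing.

    Frames are 2-D lists of grey levels; window_regions are (x0, y0, x1, y1)
    boxes covering, e.g., the rear window A and side windows B of FIG. 4.
    """
    total, count = 0.0, 0
    for x0, y0, x1, y1 in window_regions:
        for yy in range(y0, y1):
            for xx in range(x0, x1):
                total += abs(curr_frame[yy][xx] - prev_frame[yy][xx])
                count += 1
    # A large mean inter-frame difference inside the window boxes is taken to
    # mean the landscape is flowing, i.e. the vehicle is traveling.
    return count > 0 and (total / count) > diff_threshold
```

A production implementation would more likely use optical flow or block matching restricted to the window regions, but the thresholded difference above captures the determination described in the text.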
  • When the travel determination unit 35 determines that the vehicle is traveling, the video processing unit 30 preferably has the orientation detection unit 33 detect the face orientation, and has the correction processing unit 34 obtain the front direction of the face orientation and perform the correction.
  • Alternatively, the video processing unit 30 may have the orientation detection unit 33 detect the face orientation and the correction processing unit 34 obtain and correct the front direction regardless of the determination by the travel determination unit 35.
  • When the vehicle is not traveling, the driver is expected to change the face orientation with pre-driving actions such as checking the vehicle interior, and such face orientations will be detected.
  • In this case, the video processing unit 30 may notify the driver, by voice or the like, of an instruction to face the front, and correct the front direction of the driver's face orientation while the vehicle is not traveling.
  • FIG. 5 is a flowchart showing a procedure of correction processing in the front direction in the image processing apparatus 100.
  • the travel determination unit 35 of the image processing device 100 determines whether or not the vehicle is traveling (S1).
  • The travel determination unit 35 determines whether the vehicle is traveling by detecting, from inter-frame differences or the like, whether the scenery in the window portions of the video acquired by the video acquisition unit 31 is flowing, as described above. Alternatively, the travel determination unit 35 may determine whether the vehicle is traveling based on the speed information and position information acquired by the external input unit 40.
  • In step S1, when it is determined that the vehicle is not traveling (S1: NO), the determination in step S1 is repeated; when it is determined that the vehicle is traveling (S1: YES), the orientation detection unit 33 detects the orientation of the face of the imaged subject (S2).
  • the orientation detection unit 33 detects the orientation of the face of the image subject and outputs it to the correction processing unit 34.
  • the correction processing unit 34 accumulates the face orientation of the image subject input from the orientation detection unit 33 over a predetermined period of time, and acquires face orientation distribution data (S3).
  • the correction processing unit 34 calculates the center of the distribution of the face orientation based on the distribution data of the face orientation, performs correction with the calculated center as the front direction (S4), and ends the processing.
  • the center of the distribution is the mean, median, or mode.
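The S1 to S4 flow of FIG. 5 can be sketched end to end as follows, with `is_traveling` and `detect_orientation` as assumed callback stand-ins for the travel determination unit 35 and the orientation detection unit 33, and the median standing in for the distribution center:

```python
import statistics

def correct_front_direction(frames, is_traveling, detect_orientation, period=100):
    """Sketch of the correction procedure of FIG. 5 (S1-S4).

    is_traveling(frame) -> bool and detect_orientation(frame) -> (pitch, yaw)
    are assumed interfaces; period is the number of samples accumulated.
    """
    samples = []
    for frame in frames:
        if not is_traveling(frame):                # S1: repeat until traveling
            continue
        samples.append(detect_orientation(frame))  # S2: detect face orientation
        if len(samples) >= period:                 # S3: distribution over a period
            break
    if not samples:
        return None
    pitches = [p for p, _ in samples]
    yaws = [y for _, y in samples]
    # S4: the center of the distribution becomes the corrected front direction.
    return statistics.median(pitches), statistics.median(yaws)
```

Returning `None` when no traveling frames were seen mirrors the flowchart's behavior of never leaving step S1.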
  • As described above, the video processing device 100 acquires video including the face of the imaged subject, detects the face orientation, obtains the center of the distribution of the face orientations detected within a predetermined period, and can correct the obtained center to be the front direction of the face orientation.
  • By correcting the front direction of the face orientation, the video processing device 100 can be applied to a function that notifies and warns the driver when the driver looks away from the front direction. The video processing device 100 can also provide information to an external device that statistically processes, based on the distribution or the like, how far the driver's face orientation deviates from the front direction, improving the accuracy of face-orientation analysis and evaluation.
  • When the video processing device 100 corrects the front direction of the face orientation upon determining that the vehicle is traveling, the driver faces the front with a higher probability than when the vehicle is stopped, so the front-direction correction can be performed more accurately.
  • When the video processing device 100 corrects the front direction of the face orientation while the vehicle is not traveling, the probability that the driver is facing the front is low, but information for grasping what actions are performed before driving can be provided. The video processing device 100 may also notify an instruction to face the front while the vehicle is not traveling, and then correct the front direction of the driver's face orientation.
  • Since the video processing device 100 determines whether the vehicle is traveling from the flow of the scenery in the windows appearing in the video acquired by the video acquisition unit 31, the front direction of the face orientation can be corrected without depending on external speed information or position information.
  • The video processing device 100 obtains the center of the distribution from the face orientations detected within a predetermined period. If points facing directions greatly deviated from the front direction appear in the distribution shown in FIG. 3, the center of the distribution may be shifted from the true front direction by the influence of those outlying points.
  • Therefore, the correction processing unit 34 may set threshold values for the pitch and yaw angles to exclude points greatly deviated from the front direction, and may also set a frequency threshold to remove points that appear only at low frequency.
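Such angle and frequency thresholds might be applied as in the sketch below before the distribution center is computed; the threshold values themselves are illustrative assumptions, not figures from the specification:

```python
from collections import Counter

def filter_orientation_samples(samples, max_pitch=30.0, max_yaw=45.0, min_count=2):
    """Drop distribution points that would bias the center.

    Samples beyond the pitch/yaw angle thresholds are excluded first; then
    (pitch, yaw) bins whose integrated count falls below the frequency
    threshold min_count are removed as low-frequency noise.
    """
    kept = [(p, y) for p, y in samples
            if abs(p) <= max_pitch and abs(y) <= max_yaw]
    counts = Counter((round(p), round(y)) for p, y in kept)
    return [(p, y) for p, y in kept
            if counts[(round(p), round(y))] >= min_count]
```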
  • the image processing apparatus 100 may set initial values for the front direction of the face orientation of the image subject.
  • The initial values of the front direction are defined, for example, as the pitch and yaw angles of the face orientation in video captured when a person of reference height and sitting height sits at the reference seat position and faces the front.
  • the reference height and sitting height refer to, for example, the average height and sitting height of an adult, but they do not necessarily have to be exact values.
  • the reference seat position is, for example, the center position of the vertical stroke. Further, when the image processing device 100 is attached to the rear-view mirror, it is assumed that the rear-view mirror is in the reference position.
  • the reference position of the rear-view mirror is, for example, the center position in the left-right adjustment angle range of the rear-view mirror, but it does not necessarily have to be an exact value.
  • the travel determination unit 35 determines whether or not the vehicle is traveling straight, for example, using position information and map information.
  • the travel determination unit 35 may acquire information on whether or not the vehicle is traveling straight from a navigation device (not shown) mounted on the vehicle by the external input unit 40 and determine.
  • When the vehicle is traveling straight, the correction processing unit 34 corrects the front direction of the face orientation of the imaged subject.
  • the image processing apparatus 100 can correct the front direction more accurately by correcting the front direction of the face orientation of the image subject when the vehicle is traveling straight.
  • The video processing device 100 may correct the front direction of the face orientation not only for the driver but also for passengers in the passenger seat, rear seats, and the like as imaged subjects. For passenger-seat and rear-seat occupants there is no need to evaluate looking away, but it is possible to evaluate which direction the face mainly faces and how far it deviates from that direction; this can be used, for example, to devise the placement of information devices and advertisements, or the layout of the passenger seat to secure the field of view.
  • the image processing device 100 includes an image acquisition unit 31, an orientation detection unit 33, and a correction processing unit 34.
  • the image acquisition unit 31 acquires an image including the face of the image subject.
  • the orientation detection unit 33 detects the orientation of the face based on the image acquired by the image acquisition unit 31.
  • the correction processing unit 34 corrects the front direction in the direction of the face of the image subject.
  • The correction processing unit 34 corrects the center of the face orientation, obtained from the distribution of face orientations detected multiple times by the orientation detection unit 33, as the front direction of the face orientation of the imaged subject.
  • As a result, the video processing device 100 can correct the front direction of the face orientation of the imaged subject, and can provide information for improving the accuracy of face-orientation analysis and evaluation.
  • a traveling determination unit 35 for determining whether the vehicle is traveling is further provided.
  • the correction processing unit 34 corrects based on the distribution of the face orientation detected by the orientation detection unit 33 when the travel determination unit 35 determines that the vehicle is traveling. As a result, the image processing apparatus 100 can more accurately correct the front direction of the face orientation of the image subject.
  • the travel determination unit 35 determines whether the vehicle is traveling based on the flow of the scenery outside the vehicle in the image acquired by the image acquisition unit 31. As a result, the image processing apparatus 100 can correct the front direction of the face orientation without depending on the speed information and the position information from the outside.
  • the correction processing unit 34 corrects based on the distribution of the face orientation detected by the orientation detection unit 33 when the vehicle is traveling straight. As a result, the image processing apparatus 100 can more accurately correct the front direction of the face orientation of the image subject.
  • the video processing method in the video processing apparatus 100 includes a video acquisition step, an orientation detection step, and a correction processing step.
  • the image acquisition step acquires an image including the face of the image subject.
  • the orientation detection step detects the orientation of the face based on the image acquired by the image acquisition step.
  • the correction processing step corrects the front direction in the orientation of the face of the image subject.
  • the correction processing step corrects the center of the face orientation based on the distribution of the face orientations detected a plurality of times by the orientation detection step as the front direction in the face orientation of the image subject. According to this image processing method, it is possible to correct the front direction of the face of the image subject to be photographed, and it is possible to provide information for improving the accuracy of analysis and evaluation of the face orientation.
  • the video processing program in the video processing apparatus 100 causes a computer to execute a video acquisition step, an orientation detection step, and a correction processing step.
  • the image acquisition step acquires an image including the face of the image subject.
  • the orientation detection step detects the orientation of the face based on the image acquired by the image acquisition step.
  • the correction processing step corrects the front direction in the orientation of the face of the image subject.
  • the correction processing step corrects the center of the face orientation based on the distribution of the face orientations detected a plurality of times by the orientation detection step as the front direction in the face orientation of the image subject. According to this image processing program, it is possible to correct the front direction of the face of the image subject to be photographed, and it is possible to provide information for improving the accuracy of analysis and evaluation of the face orientation.
  • the present invention relates to a video processing device such as a driver monitor and a video processing method.
31 Video acquisition unit, 33 Orientation detection unit, 34 Correction processing unit, 35 Travel determination unit, 100 Video processing device.

Abstract

A video processing device (100) includes a video acquisition unit (31), a direction detection unit (33), and a correction processing unit (34). The video acquisition unit (31) acquires a video in which the face of an imaging subject is captured. The direction detection unit (33) detects the directions the face is facing on the basis of the video acquired by the video acquisition unit (31). The correction processing unit (34) corrects the front direction among the directions the face of the imaging subject is facing. The correction processing unit (34) performs the correction by using the center of a distribution of the directions the face is facing, which are detected multiple times by means of the direction detecting unit (33), so as to serve as the front direction among the directions the face of the imaging subject is facing.

Description

映像処理装置および映像処理方法Video processing equipment and video processing method
The present invention relates to a video processing device and a video processing method.
For example, a video processing device such as a driver monitor mounted on a vehicle photographs the vehicle interior and stores video including the driver in a storage device. The interior video captured by the video processing device is used, for example, to confirm the driving situation, such as the direction in which the driver's face is facing, and the behavior of occupants in the cabin.
For example, Patent Document 1 describes a system composed of a camera that photographs the face of an automobile driver and an image processing device that processes the face images obtained from the camera. When the device starts up, it generates a group of face models corresponding to various face orientations of the driver based on the driver's frontal face captured by the camera. After this generation is complete, it cyclically inputs the driver's face image and detects the driver's face orientation by selecting the best-matching face model from the group.
Japanese Unexamined Patent Application Publication No. 2003-308533
The conventional driver face image processing device described in Patent Document 1 detects the orientation of the driver's face moment by moment, but this alone is insufficient for determining, for example, whether the driver is looking away while driving. For example, one could define a standard position of the driver's face in the driver's seat, determine the front direction from that standard position within the angle of view captured by the video processing device, and judge that the driver is looking away when the face orientation deviates from that front direction.
However, due to individual differences among drivers, the actual position of the face deviates from the standard position forward, backward, left, right, up, or down, and depending on the driver's preference, the driver may drive while facing slightly to the right or left of the lane center. In such cases, the driver is judged to be facing a direction different from the front direction predetermined by the video processing device from the standard face position.
Furthermore, when the video processing device is mounted on the rear-view mirror and the mirror is adjusted left or right, the direction photographed by the device changes. As described above, if the front direction from the standard face position is predetermined on the device side, adjusting the rear-view mirror can shift that front direction.
The present invention has been made in view of such circumstances, and an object thereof is to provide a video processing device and a video processing method capable of correcting the front direction of the face of a photographed subject.
One aspect of the present embodiment is a video processing device including: a video acquisition unit that acquires video including the face of an imaged subject; an orientation detection unit that detects the orientation of the face based on the video acquired by the video acquisition unit; and a correction processing unit that corrects the front direction of the face orientation of the imaged subject. The correction processing unit performs the correction by taking the center of a distribution of face orientations detected a plurality of times by the orientation detection unit as the front direction of the face orientation of the imaged subject.
Another aspect of the present embodiment is a video processing method in a video processing device. This method includes: a video acquisition step of acquiring video including the face of an imaged subject; an orientation detection step of detecting the orientation of the face based on the video acquired in the video acquisition step; and a correction processing step of correcting the front direction of the face orientation of the imaged subject. The correction processing step performs the correction by taking the center of a distribution of face orientations detected a plurality of times in the orientation detection step as the front direction of the face orientation of the imaged subject.
According to the present embodiment, the front direction of the face of the photographed subject can be corrected.
FIG. 1 is a block diagram showing the configuration of a video processing device according to the embodiment.
FIG. 2 is a schematic diagram for explaining face orientation.
FIG. 3 is a graph showing an example of the distribution of face orientations detected by the orientation detection unit.
FIG. 4 is a schematic diagram showing an example of video acquired by the video acquisition unit.
FIG. 5 is a flowchart showing the procedure of front-direction correction processing in the video processing device.
Hereinafter, the present invention will be described with reference to FIGS. 1 to 5 based on a preferred embodiment. Identical or equivalent components and members shown in the drawings are given the same reference numerals, and duplicate descriptions are omitted as appropriate. In addition, some members not important for explaining the embodiment are omitted from the drawings.
(Embodiment)
FIG. 1 is a block diagram showing the configuration of a video processing device 100 according to the embodiment. The video processing device 100 is, for example, a driver monitor mounted on a vehicle that continuously photographs the occupants of the vehicle interior over time. The video processing device 100 may record the captured video or may discard it without recording. The video processing device 100 is placed, for example, on the dashboard below the windshield of the vehicle, on the rear-view mirror, or the like. The video processing device 100 captures video including, for example, the driver of the vehicle as the imaged subject, and recognizes, by image recognition, the whole face of the photographed subject or facial parts such as the eyes.
The video processing device 100 detects the orientation of the face of the driver (imaged subject) recognized in the video, accumulates face-orientation data over a predetermined period, and obtains a distribution of face orientations. Based on, for example, the distribution of the driver's face orientations, the video processing device 100 obtains the center of the face orientations and performs a correction that makes that center the front direction of the driver's face orientation.
Here, the face of the driver, who is the imaged subject, faces the front of the vehicle for safe driving, but as described above, the center of the direction each individual driver faces differs due to individual differences and the like. The center of the direction in which the driver's face is facing should properly be called the reference direction of the face orientation, but in this specification it is called the front direction for convenience. That is, "the orientation of the driver's face" and "the orientation of the face of the imaged subject" mean the face orientation itself, which changes from moment to moment; if, in the video of a certain instant captured by the video processing device 100, the driver faces the passenger seat, that orientation is meant.
In contrast, "the front direction of the face orientation" (or simply "the front direction") means the direction the face of the driver or imaged subject mainly faces in the video captured by the video processing device 100 (the reference direction described above), that is, the center obtained from the distribution of the moment-to-moment face orientations. The "front direction of the face orientation" may also be expressed by terms such as "reference direction of the face orientation" or "main direction of the face orientation".
The video processing device 100 detects, for example, that the vehicle is traveling, and corrects the front direction of the driver's face orientation. Further, the video processing device 100 may notify the driver that the front-direction correction will be performed, and correct the front direction of the driver's face orientation while the driver is actually facing the front. Correcting the front direction of the driver's face orientation in the video processing device 100 means calibrating the front direction of the face orientation; performing such calibration makes it possible to correctly analyze the driver's face orientation in the video.
In the present embodiment, functions based on the driver's face orientation captured by the video processing device 100 are not described in detail, but the video processing device 100 may have a function of outputting data on the driver's face orientation, a warning function of detecting and notifying inattentive driving, and the like. The video processing device 100 may also have a function of statistically processing, based on the distribution or the like, how much the driver's face orientation deviates from the front direction, and of diagnosing and evaluating the driver's face orientation.
The video processing device 100 includes an imaging unit 10, a recording unit 20, a video processing unit 30, an external input unit 40, an external output unit 41, and the like. The imaging unit 10 is a camera having a detector such as a CCD, and captures video of the vehicle interior including, for example, the driver of the vehicle as the imaged subject. The imaging unit 10 acquires video continuously in time and sends it to the video acquisition unit 31 described later.
The recording unit 20 is, for example, a removable medium such as an SD card or USB memory, a hard disk, or the like, and can record and delete the video acquired by the video acquisition unit 31. A configuration provided with the recording unit 20 is described below, but the recording unit 20 may be omitted when the video processing device 100 has no part that records video. The recording unit 20 may be configured to be detachable from the video processing device 100; in that case, it can be removed from the device and the video played back on another PC or the like.
The external input unit 40 acquires vehicle speed information, position information, and the like from external devices. The external input unit 40 may also acquire, from an external device, data measured by an acceleration sensor, a gyro, or the like mounted on the vehicle. The external output unit 41 outputs video information processed by the video processing unit 30, data on the driver's face orientation, and the like to external devices. The data on the driver's face orientation detected by the video processing device 100 and the correction data on the front direction of the face orientation may be recorded in the recording unit 20 and then output from the external output unit 41. The video processing device 100 may also add the speed information, position information, and the like acquired by the external input unit 40 to the video, record it in the recording unit 20, and output it from the external output unit 41.
The video processing unit 30 includes a video acquisition unit 31, a video recognition unit 32, an orientation detection unit 33, a correction processing unit 34, and a travel determination unit 35. The video processing unit 30 is configured by, for example, a CPU, and executes the processing of each of the above units by operating according to a computer program. The storage unit 30a is composed of data storage devices such as a RAM (Random Access Memory), flash memory, and a hard disk storage device, and stores the computer programs and the like executed by the video processing unit 30. The storage unit 30a also stores a recognition dictionary and the like for recognizing the face, eyes, and so on of the imaged subject from the captured video.
The video acquisition unit 31 acquires the video captured by the imaging unit 10, performs processing such as data compression, and outputs the result to the recording unit 20. The video acquisition unit 31 may be configured to include the imaging unit 10.
The video recognition unit 32 recognizes the face and facial parts such as the eyes of the imaged subject from the video input from the video acquisition unit 31, based on the recognition dictionary stored in the storage unit 30a, a learning-type computation model, or the like. The recognition dictionary stored in the storage unit 30a contains shape data for faces, eyes, and so on; the video recognition unit 32 extracts shape patterns appearing in the video and recognizes faces, eyes, and the like by matching them against the shape data in the recognition dictionary. The video recognition unit 32 can recognize faces, eyes, and the like from video using known image recognition techniques developed in various fields, for example a method using a learning-type computation model such as a neural network.
The orientation detection unit 33 detects the orientation of the face of the imaged subject based on the data of the face and its parts recognized by the video recognition unit 32. FIG. 2 is a schematic diagram for explaining face orientation. FIG. 2 shows, for example, the driver's face viewed from the front side of the vehicle, with the forward direction of the vehicle as the X-axis direction, the left direction of the vehicle as the Y-axis direction, and the upward direction of the vehicle as the Z-axis direction. Rotation about the X axis is roll, rotation about the Y axis is pitch, and rotation about the Z axis is yaw.
The orientation of the face of the imaged subject changes through pitching and yawing. That is, detecting pitching of the face orientation shows that the subject has moved the face up or down, and detecting yawing shows that the subject has moved the face left or right.
The orientation detection unit 33 detects the pitch angle from, for example, the vertical movement of the line L1 connecting the left and right eyes shown in FIG. 2. It detects the yaw angle from the movement of the center line L2 of the nose and the positions P1 and P2 of the outer corners of the left and right eyes. In addition to these parts, the orientation detection unit 33 can detect the pitch and yaw angles of the face based on the movement of facial parts such as the eyebrows, nose, and mouth. Known techniques developed in various technical fields can be used for detecting the face orientation.
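As a rough illustration of this kind of landmark-based detection (a simplified geometric sketch, not the method of the publication), pitch and yaw can be estimated from a few 2D landmarks; the frontal nose-drop ratio of 0.6 below is an assumed constant:

```python
import math

def estimate_pitch_yaw(left_eye, right_eye, nose_tip, frontal_nose_ratio=0.6):
    """Rough pitch/yaw in degrees from 2D landmarks (eye line L1, nose line L2).

    left_eye, right_eye, nose_tip: (x, y) pixel coordinates.
    frontal_nose_ratio: assumed drop of the nose tip below the eye line,
    as a fraction of the inter-ocular distance, when the face is frontal.
    """
    mid_x = (left_eye[0] + right_eye[0]) / 2.0
    mid_y = (left_eye[1] + right_eye[1]) / 2.0
    # Inter-ocular distance normalizes for face size / distance to camera
    iod = math.hypot(right_eye[0] - left_eye[0], right_eye[1] - left_eye[1])
    # Yaw: horizontal offset of the nose tip (line L2) from the eye midpoint
    yaw = math.degrees(math.atan2(nose_tip[0] - mid_x, iod))
    # Pitch: deviation of the nose-tip drop from its assumed frontal value
    pitch = math.degrees(
        math.atan2((nose_tip[1] - mid_y) - frontal_nose_ratio * iod, iod))
    return pitch, yaw
```

A frontal face yields angles near zero; a nose tip shifted toward one side yields a nonzero yaw of the corresponding sign.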
The correction processing unit 34 corrects the front direction of the face orientation of the imaged subject based on the distribution of face orientations detected by the orientation detection unit 33. FIG. 3 is a graph showing an example of the distribution of face orientations detected by the orientation detection unit 33. In FIG. 3, the horizontal axis is the yaw angle and the vertical axis is the pitch angle, and the pitch and yaw angles of the detected face orientations are plotted by shading as a so-called heat map. That is, for the pitch- and yaw-angle data detected within a predetermined period, the number of detections is accumulated for each pitch/yaw value; cells with a low accumulated count appear black, and cells with a high accumulated count appear white. The predetermined period is, for example, several seconds to several minutes, but may be set to any length of time. The heat map used for detection may be output as video information through the external output unit 41, in real time or otherwise, for purposes such as an adjustment mode of the video processing device 100 or a temporary operation check by the driver, or it may be recorded as data in the recording unit 20.
The correction processing unit 34 obtains the center of the distribution of pitch and yaw angles of the face orientations shown in FIG. 3 and performs a correction that makes it the front direction of the face orientation. The center of the distribution can be obtained using a statistical measure such as the mean, median, or mode. For example, as shown in FIG. 3, the correction processing unit 34 obtains the position of the star mark as the center of the distribution and corrects the direction of the star mark to be the front direction of the face orientation of the imaged subject.
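The accumulation and center computation can be sketched as follows (a simplified stand-in for the heat map and its center; the bin size and angle ranges are illustrative assumptions):

```python
import numpy as np

def orientation_heatmap(samples, bin_deg=2.0,
                        pitch_range=(-30, 30), yaw_range=(-60, 60)):
    """2D histogram of (pitch, yaw) samples in degrees, as in the FIG. 3 heat map."""
    a = np.asarray(samples, dtype=float)
    hist, _, _ = np.histogram2d(
        a[:, 0], a[:, 1],
        bins=[int((pitch_range[1] - pitch_range[0]) / bin_deg),
              int((yaw_range[1] - yaw_range[0]) / bin_deg)],
        range=[pitch_range, yaw_range],
    )
    return hist

def distribution_center(samples, method="median"):
    """Center of the face-orientation distribution (mean or median shown)."""
    a = np.asarray(samples, dtype=float)
    if method == "mean":
        return tuple(a.mean(axis=0))
    if method == "median":
        return tuple(np.median(a, axis=0))
    raise ValueError(f"unsupported method: {method}")
```

The mode could be obtained instead from the peak cell of the heat map; the median shown here is robust to moderate outliers.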
The travel determination unit 35 determines whether the vehicle in which the video processing device 100 itself is mounted is traveling. The travel determination unit 35 may determine whether the vehicle is traveling based on the speed information and position information acquired by the external input unit 40, or based on the video acquired by the video acquisition unit 31.
When determining from the video whether the vehicle is traveling, the travel determination unit 35 makes the determination from the flow of scenery in the window portions of the captured video. FIG. 4 is a schematic diagram showing an example of video acquired by the video acquisition unit 31. The video shown in FIG. 4 includes the rear window A and the driver-side and passenger-side windows B, and the travel determination unit 35 detects the flow of scenery from, for example, frame-to-frame differences of the video within each window. The travel determination unit 35 determines that the vehicle is traveling when the scenery in the windows is flowing, and that the vehicle is not traveling when the scenery is not flowing.
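A minimal frame-difference check of this kind might look as follows (the window ROIs and the threshold are assumed values; real footage would need tuning and noise handling):

```python
import numpy as np

def scenery_flowing(prev_frame, cur_frame, window_rois, threshold=10.0):
    """True if the mean absolute frame difference inside any window ROI
    exceeds the threshold, i.e. the scenery appears to be flowing.

    prev_frame, cur_frame: 2D grayscale arrays of the same shape.
    window_rois: iterable of (x, y, w, h) boxes, e.g. the rear window A
    and the side windows B of FIG. 4.
    """
    for x, y, w, h in window_rois:
        prev_roi = prev_frame[y:y + h, x:x + w].astype(np.float64)
        cur_roi = cur_frame[y:y + h, x:x + w].astype(np.float64)
        if np.abs(cur_roi - prev_roi).mean() > threshold:
            return True
    return False
```

Restricting the difference to the window regions avoids triggering on occupant movement inside the cabin.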
While the vehicle is traveling, the driver faces the front of the vehicle and drives while checking safety. Therefore, when the travel determination unit 35 determines that the vehicle is traveling, the video processing unit 30 preferably detects the face orientation with the orientation detection unit 33 and has the correction processing unit 34 obtain and correct the front direction of the face orientation.
The video processing unit 30 may also detect the face orientation with the orientation detection unit 33 and have the correction processing unit 34 obtain and correct the front direction regardless of the determination by the travel determination unit 35. While the vehicle is stopped before traveling, the driver's face orientation is expected to change with pre-driving actions such as checking the inside of the vehicle, so the video processing unit 30 would then be detecting the behavior of the driver's face before driving starts. The video processing unit 30 may also instruct the driver by voice or the like to face the front, and correct the front direction of the driver's face orientation while the vehicle is not traveling.
Next, the operation of the video processing device 100 will be described based on the front-direction correction processing of the face orientation. FIG. 5 is a flowchart showing the procedure of the front-direction correction processing in the video processing device 100. The travel determination unit 35 of the video processing device 100 determines whether the vehicle is traveling (S1). As described above, the travel determination unit 35 determines whether the vehicle is traveling by detecting, from the video acquired by the video acquisition unit 31, whether the scenery in the window portions is flowing, using frame-to-frame differences or the like. The travel determination unit 35 may also make the determination based on the speed information and position information acquired by the external input unit 40.
When it is determined in step S1 that the vehicle is not traveling (S1: NO), the determination of step S1 is repeated; when it is determined that the vehicle is traveling (S1: YES), the orientation detection unit 33 detects the face orientation of the imaged subject (S2) and outputs it to the correction processing unit 34. The correction processing unit 34 accumulates the face orientations input from the orientation detection unit 33 over a predetermined period and acquires distribution data of the face orientations (S3).
The correction processing unit 34 calculates the center of the face-orientation distribution based on the distribution data, performs a correction that takes the calculated center as the front direction (S4), and ends the processing. The center of the distribution is, for example, the mean, median, or mode.
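Steps S1 to S4 can be sketched as a simple loop (the two callables are hypothetical stand-ins for the travel determination and orientation detection units, and the median is used as the distribution center here):

```python
import statistics

def calibrate_front_direction(is_traveling, detect_orientation, num_samples=300):
    """Sketch of FIG. 5: S1 travel check, S2 detection, S3 accumulation,
    S4 center-as-front-direction correction.

    is_traveling: callable returning True while the vehicle is traveling (S1).
    detect_orientation: callable returning one (pitch, yaw) sample (S2).
    """
    samples = []
    while len(samples) < num_samples:         # S3: accumulate over a period
        if not is_traveling():                # S1: repeat until traveling
            continue
        samples.append(detect_orientation())  # S2: detect face orientation
    # S4: take the distribution center (median here) as the front direction
    front_pitch = statistics.median(p for p, _ in samples)
    front_yaw = statistics.median(y for _, y in samples)
    return front_pitch, front_yaw
```

In the publication the accumulation window is a time period rather than a fixed sample count; the count is used here only to keep the sketch self-contained.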
The video processing device 100 acquires video including the face of the imaged subject and detects the face orientation; it obtains the center of the distribution of face orientations detected within a predetermined period and can correct the obtained center to be the front direction of the face orientation. By correcting the front direction of the face orientation, the video processing device 100 can be applied, for example, to a function that notifies and warns when the driver is looking away from the front direction. The video processing device 100 can also provide information for improving the accuracy of face-orientation analysis and evaluation to an external device that statistically processes, based on the distribution or the like, how much the driver's face orientation deviates from the front direction.
By correcting the front direction of the face orientation when the vehicle is determined to be traveling, the video processing device 100 can perform the correction more accurately, since the driver faces the front with higher probability than when the vehicle is not traveling. When the video processing device 100 corrects the front direction while the vehicle is not traveling, the probability that the driver is facing the front is lower, but the device can provide information for understanding how the driver behaves before driving. The video processing device 100 may also, while the vehicle is not traveling, instruct the driver to face the front and then correct the front direction of the driver's face orientation.
Further, by judging whether the vehicle is traveling from the flow of scenery in the windows shown in the video acquired by the video acquisition unit 31, the video processing device 100 can correct the front direction of the face orientation without depending on external speed or position information.
The video processing device 100 obtains the center of the distribution from the face orientations detected within a predetermined period. If, for example, points facing directions far removed from the front direction appear in the distribution shown in FIG. 3, the center of the distribution may be pulled away from the true front direction by those points. The correction processing unit 34 may therefore set thresholds on the pitch and yaw angles to exclude points far from the front direction. The correction processing unit 34 may also set a frequency threshold and remove points that appear with low frequency.
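One way to implement both exclusions before computing the center might be the following (all cutoff values are illustrative assumptions, not values from the publication):

```python
import numpy as np

def filter_orientation_outliers(samples, max_pitch=20.0, max_yaw=30.0,
                                bin_deg=2.0, min_count=2):
    """Drop face-orientation samples far from the front direction and samples
    falling in low-frequency histogram bins, before computing the center."""
    a = np.asarray(samples, dtype=float)
    # Angle thresholds: exclude points far from the nominal front direction
    a = a[(np.abs(a[:, 0]) <= max_pitch) & (np.abs(a[:, 1]) <= max_yaw)]
    # Frequency threshold: exclude points whose (pitch, yaw) bin is rare
    bins = np.floor(a / bin_deg).astype(int)
    _, inverse, counts = np.unique(bins, axis=0,
                                   return_inverse=True, return_counts=True)
    return a[counts[inverse.ravel()] >= min_count]
```

The surviving samples can then be passed to whatever center statistic (mean, median, or mode) is used for the correction.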
 The video processing device 100 may also set initial values for the frontal direction of the subject's face orientation. The initial values are, for example, the pitch angle and yaw angle of the face orientation in the video captured when a person of reference height and sitting height sits at a reference seat position and faces forward. The reference height and sitting height may be based on, for example, average adult values, and need not be exact. The reference seat position may be, for example, the center of the seat's vertical stroke. When the video processing device 100 is mounted on the rear-view mirror, the rear-view mirror is assumed to be at its reference position. The reference position of the rear-view mirror may be, for example, the center of its left-right adjustment range, and likewise need not be exact.
(Modification)
 In the embodiment described above, the frontal direction of the face orientation is corrected when the vehicle is determined to be traveling; alternatively, the frontal direction of the face orientation may be corrected when the vehicle is traveling straight.
 The travel determination unit 35 determines whether the vehicle is traveling straight using, for example, position information and map information. The travel determination unit 35 may also make this determination by acquiring, through the external input unit 40, information on whether the vehicle is traveling straight from a navigation device (not shown) mounted on the vehicle. When the travel determination unit 35 determines that the vehicle is traveling straight, the correction processing unit 34 corrects the frontal direction of the subject's face orientation.
 When the vehicle is traveling on a curve, for example, the driver's face orientation often deviates from the frontal direction. By correcting the frontal direction of the subject's face orientation while the vehicle is traveling straight, the video processing device 100 can correct the frontal direction more accurately.
 The video processing device 100 may also correct the frontal direction of the face orientation not only for the driver but also for other vehicle occupants, such as those in the passenger seat or rear seats. Inattention need not be evaluated for those occupants, but evaluating which direction their faces mainly point, and how far they deviate from that direction, can be used, for example, to improve the placement of information devices and advertisements, or to refine the cabin layout so as to secure the occupants' field of view.
 Next, the features of the video processing device 100 and the video processing method according to the embodiments and modifications described above will be explained.
 The video processing device 100 includes a video acquisition unit 31, an orientation detection unit 33, and a correction processing unit 34. The video acquisition unit 31 acquires video including the subject's face. The orientation detection unit 33 detects the face orientation based on the video acquired by the video acquisition unit 31. The correction processing unit 34 corrects the frontal direction of the subject's face orientation. Specifically, the correction processing unit 34 sets the center of the distribution of face orientations detected multiple times by the orientation detection unit 33 as the corrected frontal direction of the subject's face orientation. The video processing device 100 can thereby correct the frontal direction of the imaged subject's face and provide information that improves the accuracy of face-orientation analysis and evaluation.
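The three-unit structure just described can be sketched in a few lines. This is an illustrative composition only: the patent specifies no detection algorithm, so the orientation detector is modeled as a caller-supplied function, and averaging is used as one possible way to take the center of the distribution.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Tuple

Angle = Tuple[float, float]  # (pitch, yaw) in degrees

@dataclass
class FrontDirectionCorrector:
    """Minimal sketch of the device's structure: frames arrive from a
    video acquisition source, an orientation detection unit produces one
    (pitch, yaw) sample per frame, and a correction processing unit takes
    the center of the accumulated distribution as the frontal direction."""
    detect_orientation: Callable[[object], Angle]  # assumed external detector
    samples: List[Angle] = field(default_factory=list)

    def process_frame(self, frame: object) -> None:
        # Orientation detection unit: record one sample per frame.
        self.samples.append(self.detect_orientation(frame))

    def corrected_front(self) -> Angle:
        # Correction processing unit: center of the orientation distribution.
        n = len(self.samples)
        pitch = sum(p for p, _ in self.samples) / n
        yaw = sum(y for _, y in self.samples) / n
        return (pitch, yaw)
```

In use, frames would stream in from the cabin camera; here any object the supplied detector understands can stand in for a frame.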
 The device further includes a travel determination unit 35 that determines whether the vehicle is traveling. The correction processing unit 34 performs the correction based on the distribution of face orientations detected by the orientation detection unit 33 while the travel determination unit 35 determines that the vehicle is traveling. The video processing device 100 can thereby correct the frontal direction of the subject's face orientation more accurately.
 The travel determination unit 35 determines whether the vehicle is traveling based on the flow of the scenery outside the vehicle in the video acquired by the video acquisition unit 31. The video processing device 100 can thereby correct the frontal direction of the face orientation without depending on speed or position information from external sources.
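One way to judge travel from scenery flow is sketched below, assuming a pre-calibrated window region of the cabin frame. The patent does not name a flow method; phase correlation between consecutive window crops and the shift threshold used here are assumptions chosen to keep the example self-contained.

```python
import numpy as np

def estimate_shift(prev, curr):
    """Estimate the dominant translation between two grayscale patches
    using phase correlation (a simple stand-in for optical flow)."""
    F1 = np.fft.fft2(prev)
    F2 = np.fft.fft2(curr)
    cross = F1 * np.conj(F2)
    cross /= np.abs(cross) + 1e-9
    corr = np.abs(np.fft.ifft2(cross))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = corr.shape
    if dy > h // 2: dy -= h   # unwrap circular shifts
    if dx > w // 2: dx -= w
    return dy, dx

def is_vehicle_moving(frames, window_roi, shift_threshold=1.5):
    """Decide whether the vehicle is traveling from the scenery flow inside
    a window region of the cabin video. window_roi = (y0, y1, x0, x1) is an
    assumed, pre-calibrated crop showing a window; shift_threshold (pixels
    per frame pair) is likewise an assumed tuning value."""
    y0, y1, x0, x1 = window_roi
    shifts = []
    for prev, curr in zip(frames, frames[1:]):
        dy, dx = estimate_shift(prev[y0:y1, x0:x1].astype(float),
                                curr[y0:y1, x0:x1].astype(float))
        shifts.append(np.hypot(dy, dx))
    # Sustained scenery flow across frame pairs implies the vehicle is moving.
    return float(np.median(shifts)) > shift_threshold
```

Because the decision is made from the video itself, no speedometer or GPS input is needed, matching the behavior described above.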
 The correction processing unit 34 may also perform the correction based on the distribution of face orientations detected by the orientation detection unit 33 while the vehicle is traveling straight. The video processing device 100 can thereby correct the frontal direction of the subject's face orientation more accurately.
 The video processing method in the video processing device 100 includes a video acquisition step, an orientation detection step, and a correction processing step. The video acquisition step acquires video including the subject's face. The orientation detection step detects the face orientation based on the video acquired in the video acquisition step. The correction processing step corrects the frontal direction of the subject's face orientation; specifically, it sets the center of the distribution of face orientations detected multiple times in the orientation detection step as the corrected frontal direction of the subject's face orientation. According to this video processing method, the frontal direction of the imaged subject's face can be corrected, and information for improving the accuracy of face-orientation analysis and evaluation can be provided.
 The video processing program in the video processing device 100 causes a computer to execute a video acquisition step, an orientation detection step, and a correction processing step. The video acquisition step acquires video including the subject's face. The orientation detection step detects the face orientation based on the video acquired in the video acquisition step. The correction processing step corrects the frontal direction of the subject's face orientation; specifically, it sets the center of the distribution of face orientations detected multiple times in the orientation detection step as the corrected frontal direction of the subject's face orientation. According to this video processing program, the frontal direction of the imaged subject's face can be corrected, and information for improving the accuracy of face-orientation analysis and evaluation can be provided.
 The above description is based on embodiments of the present invention. Those skilled in the art will understand that these embodiments are illustrative, that various modifications and changes are possible within the scope of the claims of the present invention, and that such modifications and changes also fall within the scope of the claims. The description and drawings herein should therefore be treated as illustrative rather than restrictive.
 The present invention relates to a video processing device, such as a driver monitor, and a video processing method.
 31 Video acquisition unit, 33 Orientation detection unit, 34 Correction processing unit,
 35 Travel determination unit, 100 Video processing device.

Claims (5)

  1.  A video processing device comprising:
     a video acquisition unit that acquires video including a face of a subject;
     an orientation detection unit that detects a face orientation based on the video acquired by the video acquisition unit; and
     a correction processing unit that corrects a frontal direction of the face orientation of the subject,
     wherein the correction processing unit corrects, as the frontal direction of the face orientation of the subject, a center of the face orientations based on a distribution of face orientations detected a plurality of times by the orientation detection unit.
  2.  The video processing device according to claim 1, further comprising a travel determination unit that determines whether a vehicle is traveling,
     wherein the correction processing unit performs the correction based on the distribution of face orientations detected by the orientation detection unit while the travel determination unit determines that the vehicle is traveling.
  3.  The video processing device according to claim 2, wherein the travel determination unit determines whether the vehicle is traveling based on a flow of scenery outside the vehicle in the video acquired by the video acquisition unit.
  4.  The video processing device according to any one of claims 1 to 3, wherein the correction processing unit performs the correction based on the distribution of face orientations detected by the orientation detection unit while the vehicle is traveling straight.
  5.  A video processing method in a video processing device, the method comprising:
     a video acquisition step of acquiring video including a face of a subject;
     an orientation detection step of detecting a face orientation based on the video acquired in the video acquisition step; and
     a correction processing step of correcting a frontal direction of the face orientation of the subject,
     wherein the correction processing step corrects, as the frontal direction of the face orientation of the subject, a center of the face orientations based on a distribution of face orientations detected a plurality of times in the orientation detection step.
PCT/JP2020/047730 2020-07-15 2020-12-21 Video processing device and video processing method WO2022014068A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020121188A JP2022018231A (en) 2020-07-15 2020-07-15 Video processing apparatus and video processing method
JP2020-121188 2020-07-15

Publications (1)

Publication Number Publication Date
WO2022014068A1 true WO2022014068A1 (en) 2022-01-20

Family

ID=79554555

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/047730 WO2022014068A1 (en) 2020-07-15 2020-12-21 Video processing device and video processing method

Country Status (2)

Country Link
JP (1) JP2022018231A (en)
WO (1) WO2022014068A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017004117A (en) * 2015-06-05 2017-01-05 富士通テン株式会社 Line-of-sight detection apparatus and line-of-sight detection method
JP2017208007A (en) * 2016-05-20 2017-11-24 株式会社デンソー Face orientation estimation device and face orientation estimation method
WO2018150485A1 (en) * 2017-02-15 2018-08-23 三菱電機株式会社 Driving state determination device and driving state determination method
WO2019159364A1 (en) * 2018-02-19 2019-08-22 三菱電機株式会社 Passenger state detection device, passenger state detection system, and passenger state detection method
JP2019205078A (en) * 2018-05-24 2019-11-28 株式会社ユピテル System and program

Also Published As

Publication number Publication date
JP2022018231A (en) 2022-01-27

Similar Documents

Publication Publication Date Title
JP6458734B2 (en) Passenger number measuring device, passenger number measuring method, and passenger number measuring program
JP6281492B2 (en) Passenger counting device, method and program
US9405982B2 (en) Driver gaze detection system
JP5230748B2 (en) Gaze direction determination device and gaze direction determination method
JP5207249B2 (en) Driver condition monitoring system
WO2014061195A1 (en) Passenger counting system, passenger counting method and passenger counting program
US7379559B2 (en) Method and apparatus for determining an occupant's head location in an actuatable occupant restraining system
JP6573193B2 (en) Determination device, determination method, and determination program
JP5790762B2 (en) Eyelid detection device
CN101093164A (en) Vehicle occupant detecting system, movement device control system and vehicle
CN104573622B (en) Human face detection device, method
CN101120379A (en) Image processing process, image processing system, image processing apparatus and computer program
EP3966740A1 (en) Systems, devices and methods for measuring the mass of objects in a vehicle
JP6814977B2 (en) Image processing device, detection device, learning device, image processing method, and image processing program
CN109345591A (en) A kind of vehicle itself attitude detecting method and device
JP7134364B2 (en) physique determination device and physique determination method
WO2022014068A1 (en) Video processing device and video processing method
JP2004053324A (en) Collision safety controller for automobile
US11161470B2 (en) Occupant observation device
US20230227044A1 (en) Apparatus, method, and computer program for monitoring driver
CN110199318A (en) Driver status estimating device and driver status estimate method
US11983952B2 (en) Physique determination apparatus and physique determination method
JP2009048261A (en) Camera system
WO2021171538A1 (en) Facial expression recognition device and facial expression recognition method
JP7301256B2 (en) Face direction determination device and face direction determination method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20944844

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20944844

Country of ref document: EP

Kind code of ref document: A1