WO2023286228A1 - Face information registration support device - Google Patents


Publication number
WO2023286228A1
Authority
WO
WIPO (PCT)
Prior art keywords
occupant
information
face
display device
notification
Application number
PCT/JP2021/026554
Other languages
English (en)
Japanese (ja)
Inventor
貴羅 明神
Original Assignee
三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by 三菱電機株式会社
Priority to PCT/JP2021/026554 priority Critical patent/WO2023286228A1/fr
Priority to JP2023534532A priority patent/JPWO2023286228A1/ja
Publication of WO2023286228A1 publication Critical patent/WO2023286228A1/fr


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00 specially adapted for navigation in a road network
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/16: Anti-collision systems

Definitions

  • This disclosure relates to support for registration of face information used for face authentication of vehicle occupants.
  • Facial recognition has been used as biometric authentication for vehicle occupants. This face authentication is performed not only for the driver but also for the other passengers, for the purpose of providing services optimized for each occupant in the vehicle.
  • Face authentication is performed, for example, by comparing an image of the face of a vehicle occupant with a pre-registered face image (hereinafter also referred to as a "registered image"). Therefore, it is necessary to register the occupant's face image as a preliminary step before face authentication.
  • Embodiment 2 of Patent Document 1 discloses a configuration for guiding the position or orientation of the occupant's face by an image or sound during the face image registration process when the position or orientation of the face is inappropriate.
  • The registered images are variations of images of the occupant's face taken from various directions. Consequently, the occupant bears a heavy burden when registering face images.
  • In the following, a registered image is taken as an example, but the face information used for face authentication is not limited to a face image and may be face information representing a facial feature amount or the like.
  • The present disclosure has been made to solve the above problem, and aims to reduce the burden on vehicle occupants in registering face information used for face authentication of vehicle occupants.
  • The face information registration support device of the present disclosure supports registration of face information for performing face authentication of an occupant boarding a vehicle. It includes: an acquisition unit that acquires a face image of the occupant photographed by an imaging device provided in the vehicle; an occupant information processing unit that calculates, from the occupant's face image, occupant information including at least position information of the occupant's face and a line-of-sight direction; a control information calculation unit that calculates, from the occupant information, control information including the occupant's effective visual field range; a control information determination unit that determines whether or not a display device provided in the vehicle is present within the effective visual field range; and a notification control unit that notifies the occupant to move the face.
  • The notification includes an audio notification, in which an audio output device provided in the vehicle outputs a sound prompting the occupant to visually recognize the display device, and a display notification, in which guidance information guiding the occupant to move at least one of the position and orientation of the face is displayed on the display device. The notification control unit performs the audio notification when the display device is not present within the effective visual field range, and performs the display notification when it is.
  • According to the face information registration support device of the present disclosure, it is possible to reduce the burden on the occupant in the process of registering face information used for face authentication of the occupant of the vehicle. Objects, features, aspects, and advantages of the present disclosure will become more apparent from the following detailed description and the accompanying drawings.
  • FIG. 1 is a block diagram showing the configuration of the face information registration support device according to Embodiment 1.
  • FIG. 2 is a diagram showing the arrangement of the imaging device, the display device, and the audio output device in the vehicle compartment.
  • FIG. 3 is a perspective view showing a case where the display device is not present within the occupant's effective visual field range.
  • FIG. 4 is a perspective view showing a case where the display device is present within the occupant's effective visual field range.
  • FIG. 5 is a perspective view showing the occupant's effective visual field range and the coordinates of the display device.
  • FIG. 6 is an xz-plane view showing the occupant's effective visual field range and the coordinates of the display device.
  • FIG. 7 is a diagram showing the position and orientation of the occupant's face at times t1 and t2.
  • FIG. 8 is a flowchart showing the operation of the face information registration support device of Embodiment 1.
  • FIG. 9 is a block diagram showing the configuration of the face information registration support device according to Embodiment 2.
  • FIG. 10 is a perspective view showing the occupant.
  • FIG. 11 is a flowchart showing the operation of the face information registration support device of Embodiment 2.
  • FIG. 12 and FIG. 13 are diagrams showing hardware configurations of the face information registration support device according to Embodiment 1.
  • FIG. 1 is a diagram showing the configuration of face information registration support apparatus 101 according to Embodiment 1.
  • The face information registration support device 101 is connected to the imaging device 1, the display device 2, and the audio output device 3, and is configured to be able to use them.
  • The face information registration support device 101 is a device that supports registration of face information used for face authentication of a vehicle occupant.
  • The imaging device 1, the display device 2, and the audio output device 3 are all mounted in the interior of the vehicle (hereinafter, "vehicle interior").
  • Here, a vehicle refers to a vehicle boarded by an occupant whose face information is to be registered by the face information registration support device of each embodiment, and an occupant refers to an occupant whose face information is registered by that device.
  • The face information registration support device 101 is also typically installed in the vehicle, but part of its configuration may be implemented by a server located outside the vehicle, a portable terminal carried into the vehicle by the occupant, or the like.
  • FIG. 2 is a diagram showing the arrangement of the imaging device 1, the display device 2, and the audio output device 3 in the vehicle compartment.
  • The cameras 51 and 52 in FIG. 2 are examples of the imaging device 1.
  • The camera 51 is installed at a position inside the vehicle from which it can photograph the occupant from the front and above.
  • The camera 52 is installed below the camera 51 and above the CID (Center Information Display) 21, and photographs the occupant from the front.
  • Although a plurality of cameras 51 and 52 are shown in FIG. 2, the imaging device 1 may consist of one camera or a plurality of cameras.
  • The face images of the occupants captured by the cameras 51 and 52 are used by the face information registration support device 101 for face information registration.
  • The CID 21, the meter cluster panel 22, the passenger seat display 23, and the HUD (Head Up Display) 24 are examples of the display device 2.
  • The CID 21 is installed at the front, between the driver's seat 41 and the passenger's seat 42.
  • The meter cluster panel 22 is installed in front of the steering wheel 40.
  • The passenger seat display 23 is installed in front of the passenger seat 42.
  • The HUD 24 is installed above the meter cluster panel 22.
  • The display device 2 may also include a windshield 25 used as a display. The display device 2 displays guidance information for guiding the position and orientation of the occupant's face in a specific direction when the imaging device 1 captures the occupant's face image.
  • The speakers 31 and 32 are examples of the audio output device 3.
  • The speaker 31 is installed near the driver's seat 41, and the speaker 32 is installed near the passenger seat 42.
  • The audio output device 3 outputs audio prompting the occupant to visually recognize the display device 2.
  • Next, the configuration of the face information registration support device 101 will be described with reference to FIG. 1.
  • The face information registration support device 101 includes an acquisition unit 11, an occupant information processing unit 12, a registration information determination unit 13, an occupant information storage unit 14, a control information calculation unit 15, a control information determination unit 16, an output information generation unit 17, a notification control unit 18, a unique information storage unit 19, and a registration information storage unit 20.
  • The acquisition unit 11 acquires, from the imaging device 1, the face image of the occupant captured by the imaging device 1.
  • The occupant information processing unit 12 acquires the face image of the occupant from the acquisition unit 11 and calculates occupant information from it.
  • The occupant information includes at least one of the occupant's face position, face orientation, line-of-sight direction, and degree of eye opening.
  • The registration information determination unit 13 acquires the occupant's face image from the occupant information processing unit 12 and determines whether the face image satisfies a predetermined registration condition.
  • The registration information determination unit 13 may be configured by a DMS (Driver Monitoring System), which is an existing face information registration system.
  • The occupant information storage unit 14 acquires occupant information from the occupant information processing unit 12 and stores it.
  • The occupant information storage unit 14 may store the occupant information acquired by the occupant information processing unit 12 in each individual process, or may collectively store the occupant information acquired over a plurality of processes.
  • The control information calculation unit 15 acquires occupant information from the occupant information storage unit 14 and calculates control information from it.
  • The control information includes the occupant's face movement amount and effective visual field range. The method of calculating the control information will be described in detail later.
  • The control information determination unit 16 acquires control information from the control information calculation unit 15 and determines whether the acquired control information satisfies a predetermined condition.
  • The output information generation unit 17 generates output information according to the determination result of the control information determination unit 16.
  • The output information includes either audio information that prompts visual recognition of the display device 2 or guidance information that guides the position and orientation of the face to a predetermined position and orientation.
  • The notification control unit 18 acquires the output information from the output information generation unit 17 and provides it to either the display device 2 or the audio output device 3 according to the determination result of the control information determination unit 16. In other words, the notification control unit 18 controls whether to notify the occupant by display or by voice according to that determination result.
  • The unique information storage unit 19 stores unique information used by the face information registration support device 101.
  • The unique information includes, for example, the position coordinates of the display device 2 and the threshold value of the face movement amount.
  • The registration information storage unit 20 stores, that is, registers, the face information based on the face image.
  • The face information is information used to identify the occupant's face in the face authentication process, and may be the face image itself or a score obtained by evaluating the face image with a machine learning model.
  • The registered face information is also referred to as registration information.
  • <A-2. Operation>
  • FIGS. 3 and 4 show the positional relationship between the occupant's effective visual field range and the display device.
  • In FIGS. 3 and 4, the CID 21 and the meter cluster panel 22 correspond to the display device 2.
  • The control information calculation unit 15 sets the effective visual field range A2 in consideration of the human effective visual field, using the line-of-sight direction A1 of the occupant P included in the occupant information as a reference.
  • The effective visual field range A2 is expressed as the range of a visual field angle θ1 centered on the line-of-sight direction A1.
  • In FIG. 3, neither the CID 21 nor the meter cluster panel 22 is present within the effective visual field range A2.
  • The control information determination unit 16 compares the effective visual field range A2 with the position A3 of the CID 21 to identify that the CID 21 is not within the effective visual field range A2, and likewise compares the effective visual field range A2 with the position of the meter cluster panel 22 to identify that the meter cluster panel 22 is not within it. Even if guidance information for guiding the position and orientation of the occupant's face were displayed on the CID 21 or the meter cluster panel 22 in this state, it would not be visually recognized by the occupant. Therefore, the face information registration support device 101 prompts the occupant by voice to visually recognize the display device 2.
  • For example, the speaker 31 outputs a voice saying "Move your face to the right."
  • When the occupant who hears this voice moves his or her face to the right, the occupant's effective visual field range A2 moves to the right and overlaps the CID 21, as shown in FIG. 4.
  • Here, movement of the face orientation is instructed by voice, but movement of the face position may be instructed together with, or instead of, the face orientation.
  • FIG. 4 illustrates the case where the CID 21 is present within the effective visual field range A2.
  • In this case, the face information registration support device 101 displays, on the CID 21, guidance information A4 for guiding the position and orientation of the occupant's face.
  • The occupant can view the guidance information A4 without changing the position and orientation of the face, and then change the position and orientation of the face according to the guidance information A4.
  • Let the position coordinates of the CID 21 be (x1, y1, z1), the position coordinates of the occupant's right eye be (rx1, ry1, rz1), and the position coordinates of the left eye be (lx1, ly1, lz1). Let (rx2, ry2, rz2) be the position coordinates of the intersection of the xy plane parallel to the display surface of the CID 21 with the line of sight A1r of the right eye, and similarly let (lx2, ly2, lz2) be the position coordinates of the intersection of that plane with the line of sight A1l of the left eye. Assuming that the effective visual field range A2r of the right eye is a circular area of radius r1 around the line of sight A1r, the radius r1 is calculated by Equation (1).
  • The effective visual field range of the left eye is calculated in the same way.
  • The effective visual field range A2 is obtained by combining the effective visual field range A2r of the right eye and the effective visual field range of the left eye.
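The geometry above can be sketched in Python. Equation (1) is not reproduced in this text, so the radius formula used here (eye-to-intersection distance times the tangent of the field half-angle) and all coordinate values are assumptions reconstructed from the stated geometry, not the patent's definitive method.

```python
import math

def field_radius(eye, hit, theta1_deg):
    """Radius of the effective-visual-field circle on the display plane.

    eye        -- (x, y, z) position of one eye
    hit        -- (x, y, z) intersection of the line of sight with the plane
                  parallel to the display surface
    theta1_deg -- assumed half-angle of the effective visual field
                  (the text cites roughly 4 to 20 degrees)
    """
    d = math.dist(eye, hit)  # distance from the eye along the gaze
    return d * math.tan(math.radians(theta1_deg))

def display_in_field(display_xy, hit, radius):
    """True if the display centre lies inside the circular field region."""
    dx = display_xy[0] - hit[0]
    dy = display_xy[1] - hit[1]
    return math.hypot(dx, dy) <= radius

# Illustrative values: right eye, gaze intersection, 10-degree half-angle.
r1 = field_radius((0.35, 1.2, 0.0), (0.30, 1.1, 0.8), 10.0)
right_ok = display_in_field((0.0, 0.9), (0.30, 1.1, 0.8), r1)
# The overall range A2 is the union of the right- and left-eye circles,
# so the display counts as "in view" if either per-eye check succeeds.
```

The same check, repeated for the left eye and OR-ed with the right-eye result, gives the combined determination for range A2.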
  • The face information registration support device 101 registers a plurality of face images of the occupant while changing the position or orientation of the face. Therefore, guidance of the occupant's face and photographing and registration of the face image are alternately repeated multiple times. So that the next guidance is not performed while the occupant is still moving the face according to the previous guidance, the control information calculation unit 15 calculates the occupant's face movement amount, and the control information determination unit 16 determines, based on the face movement amount, whether or not the occupant has completed the movement.
  • FIG. 7 shows the face of the occupant P at two times t1 and t2.
  • The face movement amount m1 between times t1 and t2 is represented by the sum of the movement amount of the face position and the movement amount of the face orientation between times t1 and t2.
  • Let the position coordinates of the face at time t1 be (x2, y2, z2) and the face orientation be (θx1, θy1), and let the position coordinates of the face at time t2 be (x3, y3, z3) and the face orientation be (θx2, θy2).
  • Then the face movement amount m1 is calculated by Equation (2).
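Equation (2) is not reproduced in this text; the following Python sketch reconstructs it as an assumption from the later description of step S111 (sum of squared per-axis position differences plus sum of squared orientation differences).

```python
def face_movement_amount(pos1, ang1, pos2, ang2):
    """Reconstructed m1: squared position displacement plus squared
    orientation change between two observations of the face.

    pos1/pos2 -- face position (x, y, z) at times t1 and t2
    ang1/ang2 -- face orientation (theta_x, theta_y) at times t1 and t2
    """
    dpos = sum((a - b) ** 2 for a, b in zip(pos1, pos2))  # position term
    dang = sum((a - b) ** 2 for a, b in zip(ang1, ang2))  # orientation term
    return dpos + dang

# Illustrative call with made-up coordinates:
m1 = face_movement_amount((0.0, 0.0, 0.5), (0.0, 0.0),
                          (0.1, 0.0, 0.5), (15.0, 0.0))
# Movement is judged complete when m1 exceeds a stored threshold.
```

Note that position and angle are summed in mixed units here, mirroring the text's description; a real implementation would presumably normalize or weight the two terms.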
  • FIG. 8 is a flowchart showing the operation of the face information registration support device 101. The operation will be described below according to this flowchart.
  • First, the imaging device 1 captures the face image of the occupant, and the acquisition unit 11 acquires the face image from the imaging device 1 (step S101).
  • Next, the occupant information processing unit 12 acquires the face image of the occupant from the acquisition unit 11 and calculates occupant information from it (step S102).
  • The occupant information includes at least one of the occupant's face position, face orientation, line-of-sight direction, and degree of eye opening.
  • The method for calculating the position and orientation of the face is disclosed in Patent Document 1, and the method for calculating the gaze direction is disclosed in Japanese Patent Application Laid-Open No.
  • The occupant information calculated by the occupant information processing unit 12 is stored in the occupant information storage unit 14.
  • Next, the registration information determination unit 13 determines whether face information needs to be registered (step S103). Specifically, the registration information determination unit 13 acquires the registration information from the registration information storage unit 20 and determines that registration of face information is necessary when the occupant's information is not included in the registration information. The registration information determination unit 13 may also determine that registration of face information is necessary when the occupant performs a specific input operation on a display device such as a touch panel. There may be one or more occupants whose face information is to be registered.
  • When the registration information determination unit 13 determines in step S103 that registration of face information is unnecessary, the processing of the face information registration support device 101 returns to step S101.
  • When the registration information determination unit 13 determines in step S103 that registration of face information is necessary, the processing of the face information registration support device 101 proceeds to step S104 and subsequent steps.
  • In step S104, the control information calculation unit 15 calculates the occupant's effective visual field range from the occupant information calculated in step S102. Specifically, the control information calculation unit 15 sets, as the effective visual field range, a range having a certain angle with respect to the line-of-sight direction, starting from the face position or the eye position included in the occupant information calculated in step S102.
  • The above angle is set appropriately in consideration of the effective field of view of a person with respect to the viewing direction, and is generally 4 to 20 degrees.
  • The detailed method of calculating the effective visual field range is as described above with reference to FIGS. 4 to 6. In addition to the above method, the control information calculation unit 15 may set the effective visual field range using information such as the degree of eye opening.
  • In step S105, the control information determination unit 16 determines whether or not the display device 2 is present within the effective visual field range. Specifically, the control information determination unit 16 makes this determination by comparing the effective visual field range with the position coordinates of the display device 2. If the display device 2 is not present within the effective visual field range, the processing of the face information registration support device 101 proceeds to step S106; if it is present, the processing proceeds to step S107.
  • In step S106, the face information registration support device 101 prompts the occupant by voice to visually recognize the display device 2.
  • Specifically, the output information generation unit 17 creates audio information prompting the occupant to visually recognize the display device 2, and the notification control unit 18 provides the audio information to the audio output device 3.
  • The audio output device 3 then outputs a sound prompting the occupant to visually recognize the display device 2, for example, a voice such as "Face information will be registered, so please look at the display device."
  • In step S107, the face information registration support device 101 guides the position and orientation of the occupant's face through the display.
  • Specifically, the output information generation unit 17 creates guidance information that guides movement of the position and orientation of the face, and the notification control unit 18 provides the guidance information to the display device 2.
  • The display device 2 then presents a display that guides the position and orientation of the occupant's face. This guidance display is also called a display notification.
  • For example, an image is displayed in which an animation indicating the direction of movement is added to the face image of the occupant captured by the imaging device 1; a face model created with a 3D mesh may be used instead of the face image.
  • In step S108, the registration information determination unit 13 determines whether or not registration of the variation images necessary for the face authentication process has been completed.
  • The variation images are, for example, images in which the face faces the front, up, down, left, and right. If registration of the variation images has been completed, the processing of the face information registration support device 101 ends. Otherwise, the processing proceeds to step S109.
  • In step S109, the imaging device 1 captures the occupant's face image, and the acquisition unit 11 acquires it from the imaging device 1. Thereafter, the occupant information processing unit 12 calculates occupant information from the occupant's face image acquired in step S109 (step S110).
  • The occupant information includes at least one of the occupant's face position, face orientation, line-of-sight direction, and degree of eye opening.
  • The occupant information calculated by the occupant information processing unit 12 is stored in the occupant information storage unit 14.
  • Next, the control information calculation unit 15 calculates the face movement amount as control information in order to grasp the movement status of the occupant (step S111).
  • Here, the occupant information calculated in step S110, that is, the current occupant information, is defined as first occupant information, and the previous occupant information stored in the occupant information storage unit 14 is defined as second occupant information.
  • The control information calculation unit 15 calculates, as the movement amount of the face position, the sum of the squares of the differences in the axial components between the face position in the first occupant information and the face position in the second occupant information. Similarly, it calculates, as the movement amount of the face orientation, the sum of the squares of the differences in the axial components between the face orientation in the first occupant information and the face orientation in the second occupant information. The control information calculation unit 15 then calculates the sum of the movement amount of the face position and the movement amount of the face orientation as the face movement amount.
  • The detailed method of calculating the face movement amount is as described in <A-1>.
  • Next, the control information determination unit 16 determines whether or not the face movement has been completed (step S112).
  • Specifically, the control information determination unit 16 acquires a predetermined face movement amount threshold from the unique information storage unit 19 and determines that the face movement has been completed when the face movement amount calculated in step S111 exceeds the threshold.
  • If the face movement has not been completed in step S112, the processing of the face information registration support device 101 returns to step S109; if it has been completed, the processing proceeds to step S113.
  • In step S113, the control information calculation unit 15 calculates the occupant's effective visual field range from the occupant information acquired in step S110. Then, the control information determination unit 16 determines whether or not any display device 2 is present within the effective visual field range (step S114).
  • If no display device 2 is present within the effective visual field range in step S114, the processing of the face information registration support device 101 proceeds to step S115. If a display device 2 is present within the effective visual field range in step S114, the processing proceeds to step S107, and guidance information for guiding the face to the next position and orientation is displayed.
  • In step S115, the face information registration support device 101 prompts the occupant by voice to visually recognize one of the display devices 2.
  • Specifically, the output information generation unit 17 creates audio information prompting the occupant to visually recognize the display device 2, and the notification control unit 18 provides the audio information to the audio output device 3.
  • The audio output device 3 then outputs a sound prompting the occupant to visually recognize the display device 2, for example, "Please move the position of the face to the right."
  • The output information generation unit 17 prepares in advance voice information for each pattern in which the face moves in a different direction, such as up, down, left, right, or diagonally, and the nearest neighbor method may be applied to select the voice information. Since the nearest neighbor method is publicly known, a detailed description is omitted.
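A minimal sketch of such a nearest-neighbor selection follows; the message table, reference direction vectors, and function names are all hypothetical illustrations, not taken from the patent.

```python
# Each prepared voice message is keyed by a reference direction vector
# (hypothetical values). The required face-movement direction is compared
# against each key, and the closest message is selected.
MESSAGES = {
    (1.0, 0.0): "Please move the position of the face to the right.",
    (-1.0, 0.0): "Please move the position of the face to the left.",
    (0.0, 1.0): "Please move the face up.",
    (0.0, -1.0): "Please move the face down.",
    (0.707, 0.707): "Please move the face diagonally up and to the right.",
}

def pick_message(direction):
    """Return the prepared message whose reference vector is nearest
    (smallest squared Euclidean distance) to the required direction."""
    def sq_dist(ref):
        return sum((r - d) ** 2 for r, d in zip(ref, direction))
    return MESSAGES[min(MESSAGES, key=sq_dist)]
```

For a mostly-rightward required movement such as `(0.9, 0.1)`, this selects the "to the right" message.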
  • Note that, instead of prompting the occupant by voice to visually recognize the display device 2, the face information registration support device 101 may change the display position of the guidance information within the effective visual field range.
  • the face information registration support device 101 of Embodiment 1 supports registration of face information for performing face authentication of a passenger on board a vehicle.
  • the face information registration support device 101 includes an acquisition unit 11 that acquires a facial image of the occupant photographed by an imaging device 1 provided in the vehicle; An occupant information processing unit 12 that calculates information, a control information calculation unit 15 that calculates control information including the effective visual field range of the occupant from the occupant information, and whether or not the display device 2 provided in the vehicle exists within the effective visual field range. and a notification control unit 18 that notifies the occupant to move his/her face.
  • the notification by the notification control unit 18 includes a voice notification that outputs a voice prompting the passenger to visually recognize the display device 2 from the voice output device 3 provided in the vehicle, and a voice notification that prompts the passenger to move at least one of the position and orientation of the face. and a display notification for causing the display device 2 to display guidance information to be guided.
  • the notification control unit 18 performs voice notification when the display device 2 is not present within the effective visual field range, and performs display notification when the display device 2 is present within the effective visual field range.
  • the face information registration support device 101 guides the face by display only when the occupant is visually recognizing the display device 2, and when the occupant is not visually recognizing the display device 2, the display device 2 is guided by voice. In order to encourage the visual recognition of the face information, the passenger can visually recognize the display device 2 according to the voice and then move the face according to the information displayed on the display device 2, so that the face information registration process can be performed smoothly. It reduces the burden on passengers.
  • the control information includes the occupant's face movement amount, calculated from the occupant information obtained from the face images acquired before and after the display notification.
  • the amount of movement is compared with a predetermined threshold, and the notification control unit 18 may perform the next display notification when the amount of movement of the face is equal to or greater than the threshold.
  • the face information registration support device 101 can thus issue the next display notification at the moment the occupant has finished moving the face in response to the previous display notification.
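One plausible reading of this gating step, sketched in Python. The Euclidean-distance metric and the function names are assumptions, since this excerpt does not fix a concrete distance measure:

```python
import math

def face_movement_amount(before: tuple, after: tuple) -> float:
    """Distance between the face positions measured before and after a
    display notification (Euclidean distance, one possible metric)."""
    return math.dist(before, after)

def ready_for_next_notification(before: tuple, after: tuple,
                                threshold: float) -> bool:
    """The next display notification is issued only once the face has
    moved by at least the predetermined threshold."""
    return face_movement_amount(before, after) >= threshold
```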
  • FIG. 9 is a diagram showing the configuration of face information registration support apparatus 102 according to the second embodiment.
  • the face information registration support device 102 is connected to the imaging device 1, the display device 2, and the audio output device 3, and is configured to be able to use them.
  • the face information registration support device 102 is a device that supports registration of face information used for face authentication of a vehicle occupant.
  • the face information registration support device 102 includes an acquisition unit 11, an occupant information processing unit 12, a control information calculation unit 15, a control information determination unit 16, an output information generation unit 17, a notification control unit 18, a unique information storage unit 19, and a display device control unit 20A.
  • the display device control unit 20A is attached to the display device 2 and has a mechanism that adjusts the position and orientation of the display device 2. A method for adjusting the position and orientation of a display device is disclosed in Japanese Patent Application Laid-Open No. 2007-78172, so a detailed description thereof is omitted.
  • the control information calculation unit 15 acquires occupant information from the occupant information processing unit 12 and calculates control information from the occupant information.
  • the control information includes the occupant's effective visual field range and the amount of movement of the display device.
  • the amount of movement of the display device is the amount by which the position and angle of the display device 2 must move to reach the position and angle at which the display surface is perpendicular to the occupant's line-of-sight direction.
  • the control information determination unit 16 acquires the occupant's effective visual field range from the control information calculation unit 15, and determines whether or not the effective visual field range satisfies a predetermined condition.
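A minimal way to realize this determination, assuming the effective visual field is modeled as a cone around the gaze direction. The cone model and its half-angle parameter are assumptions not stated in this excerpt:

```python
import math

def in_effective_fov(eye_pos, gaze_dir, target_pos, half_angle_deg):
    """Return True if target_pos (e.g. the display device 2) lies within
    a cone of half_angle_deg around the occupant's gaze direction."""
    to_target = [t - e for t, e in zip(target_pos, eye_pos)]
    norm_t = math.hypot(*to_target)
    norm_g = math.hypot(*gaze_dir)
    if norm_t == 0.0 or norm_g == 0.0:
        return False  # degenerate geometry: treat as "not visible"
    cos_a = sum(a * b for a, b in zip(to_target, gaze_dir)) / (norm_t * norm_g)
    cos_a = max(-1.0, min(1.0, cos_a))  # guard acos against rounding
    return math.degrees(math.acos(cos_a)) <= half_angle_deg
```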
  • the output information generation unit 17 generates output information according to the determination result of the control information determination unit 16 .
  • the notification control unit 18 switches the output destination between display and voice based on the determination result of the control information determination unit 16, and provides the output information generated by the output information generation unit 17 to either the display device 2 or the voice output device 3.
  • the unique information storage unit 19 stores unique information used by the face information registration support device 102.
  • the unique information includes, for example, the position coordinates of the display device 2.
  • the display device control unit 20A acquires the display direction of the display device 2 from the control information calculation unit 15 via the notification control unit 18, and acquires the determination result of the control information determination unit 16 via the notification control unit 18. The display device control unit 20A then changes the position and display direction of the display device 2 based on the determination result of the control information determination unit 16. Details will be described later.
  • FIG. 10 is a perspective view showing the positional relationship between the occupant P and the CID 21.
  • the horizontal and vertical directions of the display surface of the CID 21 are the x-axis direction and the y-axis direction, respectively, and the direction perpendicular to the display surface of the CID 21 is the z-axis direction.
  • FIG. 11 is an xz plan view showing the positional relationship between the occupant P and the CID 21, and FIG. 12 is a yz plan view showing the positional relationship between the occupant P and the CID 21.
  • FIGS. 11 and 12 show the change of the display direction of the CID 21 by the display device control unit 20A.
  • the x, y and z axes are set on the basis of the CID 21 before the change.
  • let (x1, y1, z1) be the absolute position coordinates, in the vehicle cabin, of the position A3 of the CID 21 before adjustment.
  • let (x1, y1a, z1) be the absolute position coordinates, in the vehicle cabin, of the position A3a of the CID 21 after adjustment.
  • let (x3, y3, z3) be the position coordinates of the midpoint A5 of the line segment connecting the occupant's left and right eyes.
  • let A6 be the straight line connecting the position A3 of the CID 21 before adjustment and the midpoint A5.
  • let θ2 be the angle between the straight line A6 and the y-axis direction.
  • let A6a be the straight line connecting the position A3a of the CID 21 after adjustment and the midpoint A5.
  • let θ2a be the angle between the straight line A6a and the vertical direction of the display surface of the CID 21 after adjustment.
  • let θ3 be the angle between the straight line A6a and the horizontal direction of the display surface of the CID 21 after adjustment.
  • let θ4 be the angular movement amount (rotation angle) of the CID 21 in the yz plane. Note that 0 ≤ θ4 ≤ maxθ4, where maxθ4 is the maximum rotation angle of the CID 21 in the yz plane.
  • let θ5 be the angle between the horizontal direction of the display surface of the CID 21 after adjustment and the x-axis, in other words, the angular movement amount (rotation angle) of the CID 21 in the xz plane.
  • the position and angle of the CID 21 are adjusted so that the angle θ2a between the straight line A6a and the vertical direction of the display surface and the angle θ3 between the straight line A6a and the horizontal direction of the display surface are each 90 degrees; the CID 21 then faces the front direction of the occupant.
  • the positional movement amount m2 of the display device 2 in the y-axis direction is calculated by equation (3), and the angular movement amount θ4 of the display device 2 in the yz plane is calculated by equation (4). Note that 0 ≤ m2 ≤ maxm2, where maxm2 is the maximum value of the positional movement amount m2 of the display device 2 in the y-axis direction.
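Equations (3) and (4) themselves are not reproduced in this excerpt, so the following is only an illustrative reconstruction of the yz-plane adjustment under the stated clamps (0 ≤ m2 ≤ maxm2, 0 ≤ θ4 ≤ maxθ4): translate the display toward the eye height first, then rotate it toward the eye midpoint A5. The concrete formulas are assumptions:

```python
import math

def display_movement(y1, z1, y3, z3, max_m2, max_theta4_deg):
    """Illustrative yz-plane adjustment: positional movement m2 along the
    y-axis (clamped), then rotation theta4 in degrees toward the eye
    midpoint. Not the patent's actual equations (3) and (4)."""
    m2 = min(abs(y3 - y1), max_m2)                 # clamp translation
    y1a = y1 + math.copysign(m2, y3 - y1)          # adjusted display height
    dz = z3 - z1                                   # eye-to-display depth
    theta4 = math.degrees(math.atan2(abs(y3 - y1a), abs(dz))) if dz else 0.0
    return m2, min(theta4, max_theta4_deg)         # clamp rotation
```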
  • FIG. 13 is a flowchart showing the operation of the face information registration support device 102. The operation of the face information registration support device 102 is described below according to the flowchart of FIG. 13.
  • the face information registration support device 102 starts face information registration processing in step S201.
  • a condition for starting the face information registration process is, for example, that the passenger performs a specific input operation on a display device such as a touch panel.
  • the occupant whose occupant information is to be registered may be one or more.
  • the imaging device 1 captures the facial image of the passenger, and the acquiring unit 11 acquires the facial image of the passenger from the imaging device 1 (step S202).
  • the occupant information processing unit 12 acquires the facial image of the occupant from the acquisition unit 11, and calculates occupant information from the facial image of the occupant (step S203).
  • the occupant information includes at least one of the occupant's face position, face direction, line-of-sight direction, and degree of eye opening.
  • the occupant information calculated by the occupant information processing unit 12 is stored in the occupant information storage unit 14.
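The occupant information items above can be grouped in a simple container for illustration. The field names are my own, not identifiers from the specification, and every field is optional because the text requires only at least one of the items:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class OccupantInfo:
    """Occupant information computed from one face image (illustrative)."""
    face_position: Optional[Vec3] = None   # face position in the cabin
    face_direction: Optional[Vec3] = None  # face orientation vector
    gaze_direction: Optional[Vec3] = None  # line-of-sight direction
    eye_openness: Optional[float] = None   # degree of eye opening
```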
  • the face information registration support device 102 notifies the occupant that the display device 2 will move by voice, and prompts the occupant to visually recognize the display device 2 (step S204).
  • specifically, the output information generation unit 17 generates voice information prompting the occupant to look at the display device 2, and the notification control unit 18 provides the voice information to the voice output device 3.
  • the voice output device 3 then outputs a voice prompting the occupant to look at the display device 2, for example, "The display device will be moved to a position where it is easy to see. Please look at the display device."
  • the audio output from the audio output device 3 may be any content that conveys to the passenger that the display device 2 is moving and prompts the passenger to visually recognize the display device 2 .
  • the control information calculation unit 15 calculates the effective visual field range of the occupant from the occupant information calculated in step S203 (step S205). The control information determination unit 16 then determines whether or not the display device 2 exists within the effective visual field range (step S206). If the display device 2 does not exist within the effective visual field range, the processing of the face information registration support device 102 returns to step S204.
  • the control information calculation unit 15 calculates the position and orientation of the display device 2 that are most visible to the occupant, that is, the optimum position and orientation, and calculates the amount of movement of the display device 2 to that position and orientation (step S207). Specifically, the control information calculation unit 15 acquires the occupant's eye position information from the occupant information, and calculates, as the display device movement amount, the amount of movement to the position and angle at which the display surface of the display device 2 is perpendicular to the occupant's line-of-sight direction. The detailed calculation method of the display device movement amount is as described with reference to FIGS. 10 to 12.
  • the display device control unit 20A moves the display device 2 by the display device movement amount acquired from the control information calculation unit 15 (step S208).
  • the face information registration support device 102 guides the position and orientation of the occupant's face by display (step S209). This step is the same as step S107 in FIG.
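The flow of steps S202 through S209 can be sketched as dependency-injected control logic. Every callable below is a stand-in for the corresponding unit (imaging device, occupant information processing unit, and so on), and the retry bound is an assumption, since the flowchart as described simply loops back to step S204:

```python
def registration_flow(get_face_image, compute_occupant_info, display_in_fov,
                      prompt_by_voice, move_display, guide_by_display,
                      max_voice_retries=3):
    """Sketch of FIG. 13: S202 capture, S203 occupant info, S204-S206
    voice prompt until the display is in the effective visual field,
    S207-S208 move the display, S209 guide the face by display."""
    image = get_face_image()                 # S202
    info = compute_occupant_info(image)      # S203
    for _ in range(max_voice_retries):
        prompt_by_voice()                    # S204
        if display_in_fov(info):             # S205-S206
            break
    else:
        return False  # occupant never looked toward the display
    move_display(info)                       # S207-S208
    guide_by_display(info)                   # S209
    return True
```

Injecting the units as callables keeps the control flow testable without any in-vehicle hardware.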
  • when the display device 2 exists within the effective visual field range, the face information registration support device 102 of Embodiment 2 rotates the display device 2, by means of the display device control unit 20A, so that the display surface of the display device 2 is perpendicular to the occupant's line-of-sight direction. The notification control unit 18 then performs the display notification after the display device 2 has been rotated, which improves the visibility of the display notification to the occupant.
  • the face information registration support apparatuses 101 and 102 are composed of computers, and the computers have a processor 81 and memories 82 and 83 as shown in FIG.
  • the computer stores a program for causing the computer to function as the acquisition unit 11, the occupant information processing unit 12, the registration information determination unit 13, the control information calculation unit 15, the control information determination unit 16, the output information generation unit 17, the notification control unit 18, and the unique information storage unit 19 (hereinafter also collectively referred to as "the acquisition unit 11 etc.").
  • the functions of the acquisition unit 11 etc. are realized by the processor 81 reading and executing the program stored in the memory 82.
  • the memory 83 realizes the functions of the occupant information storage unit 14 and the registration information storage unit 20.
  • the face information registration support devices 101 and 102 may have a memory 83 and a processing circuit 84.
  • the functions of the acquisition unit 11 and the like may be implemented by the processing circuit 84 .
  • the face information registration support devices 101 and 102 may have a processor 81 , memories 82 and 83 and a processing circuit 84 .
  • part of the functions of the acquisition unit 11 etc. may be realized by the processor 81 and the memory 82, and the remaining functions may be realized by the processing circuit 84.
  • the processor 81 is, for example, a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), a microprocessor, a microcontroller, or a DSP (Digital Signal Processor).
  • the memories 82 and 83 are, for example, semiconductor memories or magnetic disks. More specifically, the memory 82 is a RAM (Random Access Memory), ROM (Read Only Memory), flash memory, EPROM (Erasable Programmable Read Only Memory), EEPROM (Electrically Erasable Programmable Read Only Memory), SSD (Solid State Drive), or HDD (Hard Disk Drive).
  • the processing circuit 84 is, for example, an ASIC (Application Specific Integrated Circuit), PLD (Programmable Logic Device), FPGA (Field-Programmable Gate Array), SoC (System-on-a-Chip), or system LSI (Large-Scale Integration).
  • 1 Imaging device, 2 Display device, 3 Voice output device, 11 Acquisition unit, 12 Occupant information processing unit, 13 Registration information determination unit, 14 Occupant information storage unit, 15 Control information calculation unit, 16 Control information determination unit, 17 Output information generation unit, 18 Notification control unit, 19 Unique information storage unit, 20 Registration information storage unit, 20A Display device control unit, 21 CID, 22 Meter cluster panel, 23 Passenger seat display, 25 Windshield, 31, 32 Speakers, 40 Steering wheel, 41 Driver's seat, 42 Passenger seat, 51, 52 Cameras, 81 Processor, 82, 83 Memories, 84 Processing circuit, 101, 102 Face information registration support device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)

Abstract

The object of the present disclosure is to reduce the burden on a vehicle occupant during a face information registration process used for face authentication of the occupant. To this end, a face information registration support device (101) comprises: an occupant information processing unit (12) that calculates, from a facial image of an occupant, occupant information including at least a line-of-sight direction and position information concerning the occupant's face; a control information calculation unit (15) that calculates, from the occupant information, control information including the occupant's effective visual field range; a control information determination unit (16) that determines whether a display device (2) is within the effective visual field range; and a notification control unit (18) that issues a notification prompting the occupant to move the face. The notification includes a voice notification that outputs a voice prompting the occupant to look at the display device (2), and a display notification that causes the display device (2) to display guidance information for guiding the occupant to move the face. The notification control unit (18) issues the voice notification when the display device (2) is not within the effective visual field range, and issues the display notification when the display device (2) is within the effective visual field range.
PCT/JP2021/026554 2021-07-15 2021-07-15 Dispositif d'aide à l'enregistrement d'informations faciales WO2023286228A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2021/026554 WO2023286228A1 (fr) 2021-07-15 2021-07-15 Dispositif d'aide à l'enregistrement d'informations faciales
JP2023534532A JPWO2023286228A1 (fr) 2021-07-15 2021-07-15

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/026554 WO2023286228A1 (fr) 2021-07-15 2021-07-15 Dispositif d'aide à l'enregistrement d'informations faciales

Publications (1)

Publication Number Publication Date
WO2023286228A1 true WO2023286228A1 (fr) 2023-01-19

Family

ID=84919761

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/026554 WO2023286228A1 (fr) 2021-07-15 2021-07-15 Dispositif d'aide à l'enregistrement d'informations faciales

Country Status (2)

Country Link
JP (1) JPWO2023286228A1 (fr)
WO (1) WO2023286228A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003098982A (ja) * 2001-09-26 2003-04-04 Alpine Electronics Inc Display device
JP2006134038A (ja) * 2004-11-05 2006-05-25 Omron Corp In-vehicle face data registration device and face data registration method in a vehicle
JP2019012046A (ja) * 2017-06-30 2019-01-24 株式会社デンソーテン Notification device, notification system, and notification method
JP2019010929A (ja) * 2017-06-29 2019-01-24 株式会社デンソーテン Driving support device and driving support method


Also Published As

Publication number Publication date
JPWO2023286228A1 (fr) 2023-01-19

Similar Documents

Publication Publication Date Title
JP5966341B2 Image processing device, image processing method, program for image processing device, and image display device
CN111665513B Facial feature detection device and facial feature detection method
EP3033999A1 Apparatus and method for determining the state of a driver
US9606623B2 Gaze detecting apparatus and method
JP2006522397A Multi-view display
CN111873911B Method, apparatus, medium, and electronic device for adjusting a rearview mirror
JP6573193B2 Determination device, determination method, and determination program
JP6479272B1 Gaze direction calibration device, gaze direction calibration method, and gaze direction calibration program
JP2009012652A Vehicle periphery monitoring device
JP2017502876A Vehicle mirror adjustment
JP2013216286A Monitor device for checking vehicle surroundings
US11488319B2 Three-dimensional position estimation device and three-dimensional position estimation method
JP2009183473A Gaze direction detection device and gaze direction detection method
JP6669182B2 Occupant monitoring device
JP2016115117A Determination device and determination method
KR100982171B1 Facial image capturing device
CN114007054A Method and device for correcting projection of an in-vehicle screen image
WO2023286228A1 Face information registration support device
JP2016115120A Eye open/closed determination device and eye open/closed determination method
JP4855278B2 Camera parameter acquisition device
JP2018101212A In-vehicle device and face frontality calculation method
JP5082620B2 Looking-aside determination device
WO2017217044A1 Gaze direction estimation device
JP2016115118A Downward gaze determination device and downward gaze determination method
JP5049304B2 Device for displaying images of a vehicle's surroundings

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21950167

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2023534532

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE