CN116797652A - Sight line calibration method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN116797652A
CN116797652A
Authority
CN
China
Prior art keywords
sight
calibration
target
coordinates
projection area
Legal status
Pending
Application number
CN202310703858.3A
Other languages
Chinese (zh)
Inventor
胡恒恒
王晶晶
谢迪
浦世亮
Current Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Original Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Application filed by Hangzhou Hikvision Digital Technology Co Ltd
Priority: CN202310703858.3A
Publication: CN116797652A


Abstract

An embodiment of the present application provides a sight line calibration method and device, an electronic device, and a storage medium. The electronic device can acquire target line-of-sight data captured by a cabin camera; determine the coordinates of a calibration mark in the cabin camera coordinate system, based on the coordinates of the projection area in that coordinate system and the coordinates of the calibration mark within the projection area, and take these as the predicted line-of-sight drop-point coordinates; determine the target line-of-sight drop-point coordinates in the cabin camera coordinate system based on the target line-of-sight data; and determine the sight line calibration parameters from the target and predicted drop-point coordinates. Since the coordinates of the calibration mark within the projection area are known, the electronic device can accurately determine the mark's coordinates in the cabin camera coordinate system, i.e. the predicted line-of-sight drop-point coordinates. The calibration parameters can then be accurately determined from the predicted and target drop-point coordinates, so that line-of-sight calibration is achieved with high accuracy.

Description

Sight line calibration method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of line of sight calibration technologies, and in particular, to a line of sight calibration method, device, electronic apparatus, and storage medium.
Background
Most traffic accidents are currently related to fatigue driving or driver distraction, so driver monitoring systems (DMS) are widely used. A DMS tracks the driver's line of sight in real time and judges from it whether the driver is fatigued or distracted; when either occurs, the driver can be alerted, for example by voice broadcast, thereby reducing traffic accidents.
The accuracy of gaze tracking depends on the accuracy of gaze calibration. In current calibration schemes, the driver is usually guided to gaze at certain fixed points, the driver's gaze data for those points is captured, and calibration parameters are determined from that gaze data and the predicted gaze data corresponding to the fixed points. However, in an actual cabin environment the coordinates of the fixed points are difficult to determine accurately, so line-of-sight calibration is difficult to achieve.
Disclosure of Invention
The embodiment of the application aims to provide a sight line calibration method, a sight line calibration device, electronic equipment and a storage medium, so as to realize sight line calibration. The specific technical scheme is as follows:
in a first aspect, an embodiment of the present application provides a line-of-sight calibration method, including:
Acquiring target sight data of a user for a calibration mark, wherein the target sight data is captured by a cockpit camera, the calibration mark is displayed in a projection area of the cockpit, the calibration mark is used for indicating the position where the user looks at the calibration mark, and the projection area is an area which is positioned in front of the cockpit and used for displaying projection information;
determining the coordinate of the calibration mark under the cabin camera coordinate system as a predicted line-of-sight landing point coordinate based on the coordinate of the pre-calibrated projection area under the cabin camera coordinate system and the coordinate of the calibration mark in the projection area;
determining target line-of-sight landing coordinates of the user for the calibration mark in the cabin camera coordinate system based on the target line-of-sight data;
and determining a sight calibration parameter according to the target sight falling point coordinates and the predicted sight falling point coordinates.
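The second of the steps above turns a mark's known 2D position inside the projection area into cabin-camera 3D coordinates. A minimal sketch, assuming the pre-calibrated area is parameterized by one corner point and its two in-plane edge vectors in the camera frame (the patent does not fix a parameterization; the function name and arguments are illustrative):

```python
import numpy as np

def predicted_drop_point(area_origin, area_u_edge, area_v_edge, mark_uv):
    """Map a calibration mark's normalized (u, v) position inside the
    projection area to 3D cabin-camera coordinates, i.e. the predicted
    line-of-sight drop-point coordinates. The area is assumed given by
    one corner and its two edge vectors in the camera frame."""
    u, v = mark_uv
    return (np.asarray(area_origin, float)
            + u * np.asarray(area_u_edge, float)
            + v * np.asarray(area_v_edge, float))
```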
Optionally, before the step of acquiring the user's target line of sight data for calibration identification captured by the cockpit camera, the method further comprises:
acquiring a first target image of the projection area acquired by the cabin camera after being reflected by a plane mirror;
determining coordinates of the reflected projection area under a world coordinate system according to the parameters of the cabin camera calibrated in advance and pixel coordinates of the projection area in the first target image;
Determining the coordinates of the projection area under the world coordinate system according to the distance between the projection area and the plane mirror, the coordinates of the reflected projection area under the world coordinate system and a plane reflection imaging principle;
and converting the coordinates of the projection area under the world coordinate system into the coordinates of the projection area under the cabin camera coordinate system according to the conversion relation between the pre-calibrated world coordinate system and the cabin camera coordinate system.
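The conversion in the last step is a standard rigid transform. A sketch, assuming the pre-calibrated "conversion relation" is a rotation R and translation t with X_cam = R·X_world + t (the text does not specify the form):

```python
import numpy as np

def world_to_camera(points_world, R, t):
    """Convert projection-area coordinates from the world coordinate
    system to the cabin camera coordinate system with a pre-calibrated
    rigid transform. points_world: (n, 3); R: 3x3 rotation; t: length-3
    translation."""
    P = np.atleast_2d(np.asarray(points_world, float))
    return P @ np.asarray(R, float).T + np.asarray(t, float)
```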
Optionally, before the step of acquiring the user's target line of sight data for calibration identification captured by the cockpit camera, the method further comprises:
acquiring a second target image of the projection area, on which a first target is displayed, acquired by the cabin camera after reflection by a plane mirror;
determining coordinates of the reflected projection area in the cabin camera coordinate system according to the pre-calibrated parameters of the cabin camera and the pixel coordinates of the first target in the second target image;
acquiring a third target image of the plane mirror, which is acquired by the cabin camera and has a second target attached to the surface;
determining coordinates of the plane mirror under a cabin camera coordinate system according to the parameters of the cabin camera calibrated in advance and the pixel coordinates of the second target in the third target image;
And determining the coordinates of the projection area under the coordinate system of the cabin camera according to the coordinates of the plane mirror under the coordinate system of the cabin camera, the coordinates of the reflected projection area under the coordinate system of the cabin camera and the plane reflection imaging principle.
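Both pre-calibration routes rest on the plane-reflection imaging principle: a real point and its mirror image are symmetric about the mirror plane. A sketch, assuming the mirror plane is known as a point on its surface plus a unit normal (as it would be after locating the second target in the third target image):

```python
import numpy as np

def unreflect(points, mirror_point, mirror_normal):
    """Recover the real projection-area coordinates from the observed
    (virtual, mirrored) coordinates by reflecting each point across the
    mirror plane."""
    P = np.atleast_2d(np.asarray(points, float))
    n = np.asarray(mirror_normal, float)
    n = n / np.linalg.norm(n)
    signed = (P - np.asarray(mirror_point, float)) @ n  # signed distance to plane
    return P - 2.0 * signed[:, None] * n                # mirror-symmetric points
```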
Optionally, the step of acquiring the target line-of-sight data of the user for calibration identification captured by the cabin camera includes:
controlling the projection area of the cabin to display calibration marks, and recording the number of the displayed calibration marks;
acquiring line-of-sight data of a user captured by the cockpit camera for the displayed calibration mark;
determining whether the number of recorded displayed calibration marks reaches a preset number;
if the number of the recorded and displayed calibration marks does not reach the preset number, returning to the projection area of the control cabin to display the calibration marks, and recording the number of the displayed calibration marks;
and if the number of the recorded and displayed calibration marks reaches the preset number, determining the acquired sight line data as target sight line data of the user for the calibration marks.
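The acquisition steps above can be sketched as a loop; `display_mark` and `capture_gaze` are hypothetical callbacks standing in for the HUD projector and the cabin camera:

```python
def collect_target_gaze(display_mark, capture_gaze, preset_count):
    """Display a calibration mark, capture the user's gaze data for it,
    and repeat until the recorded number of displayed marks reaches the
    preset number; the accumulated data is the target line-of-sight
    data."""
    gaze_data = []
    displayed = 0
    while displayed < preset_count:
        display_mark(displayed)            # show the next calibration mark
        displayed += 1                     # record the number displayed
        gaze_data.append(capture_gaze())   # gaze data for this mark
    return gaze_data
```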
Optionally, before the step of determining the target line of sight landing coordinates of the user for the calibration identification in the cabin camera coordinate system based on the target line of sight data, the method further comprises:
Performing data cleaning on the target sight line data to obtain cleaned target sight line data;
the step of determining the target line of sight landing point coordinates of the user for the calibration mark under the cabin camera coordinate system based on the target line of sight data comprises the following steps:
and determining the target sight drop point coordinates of the user for the calibration mark under the cabin camera coordinate system based on the cleaned target sight data.
Optionally, the sight line data includes a plurality of sight line position coordinates of the user captured by the cabin camera, which are acquired within a preset time period after the calibration identifier is displayed;
the step of performing data cleaning on the target sight line data to obtain cleaned target sight line data comprises the step of performing data cleaning by adopting one of the following modes to obtain cleaned target sight line data:
clustering the plurality of sight line position coordinates to determine a clustering center; determining the cluster center as cleaned target sight data;
clustering the plurality of sight line position coordinates, and determining the category with the largest number of sight line position coordinates; determining the sight line position coordinates included in the determined category as cleaned target sight line data;
Determining a sight focus set of the user based on capture timing corresponding to the plurality of sight position coordinates; determining the end position coordinates of the sight line focus set as cleaned target sight line data;
determining a sight focus set of the user based on capture timing corresponding to the plurality of sight position coordinates; and determining the sight line position coordinates with the distance between the sight line position coordinates and the end point position coordinates of the sight line focus set being smaller than a preset threshold value as cleaned target sight line data.
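The last two cleaning modes follow the gaze trajectory to its end. A sketch of the fourth mode, assuming the captured coordinates are already ordered by capture timing; with a vanishing threshold it degenerates to the third mode (end position only):

```python
import numpy as np

def clean_by_trajectory_end(coords, threshold):
    """Keep only the gaze-position coordinates within a preset distance
    of the trajectory's end position; early samples, captured while the
    eye was still travelling toward the calibration mark, are
    discarded."""
    X = np.asarray(coords, float)            # ordered by capture time
    end = X[-1]                              # end of the gaze focus track
    keep = np.linalg.norm(X - end, axis=1) < threshold
    return X[keep]
```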
Optionally, the step of determining the line of sight calibration parameter according to the difference between the target line of sight landing point coordinate and the predicted line of sight landing point coordinate includes:
and calculating a sight line calibration matrix R according to the following formula based on the target sight line falling point coordinates and the predicted sight line falling point coordinates:
wherein n is the number of target line-of-sight drop-point coordinates, V_{3×n} is the matrix of column vectors formed from the components of the predicted drop-point coordinates, V_{n×3} is the matrix of row vectors formed from the components of the predicted drop-point coordinates, and V′_{n×3} is the matrix of row vectors formed from the components of the target drop-point coordinates.
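The formula itself appears as an image in the original and is not reproduced in the text; the matrices it names (a 3×n predicted column matrix, its n×3 transpose, and an n×3 target matrix) match the shape of the standard closed-form least-squares fit sketched below. This is an assumption, not the patent's verbatim formula:

```python
import numpy as np

def gaze_calibration_matrix(pred_points, target_points):
    """Estimate a 3x3 calibration matrix R minimizing ||R P - T|| in the
    least-squares sense, where P (3xn) stacks the predicted drop-point
    coordinates as columns and T (3xn) the target ones.
    Closed form: R = T P^T (P P^T)^{-1}."""
    P = np.asarray(pred_points, float).T    # 3 x n predicted drop points
    T = np.asarray(target_points, float).T  # 3 x n target drop points
    return T @ P.T @ np.linalg.inv(P @ P.T)
```

Calibrated gaze data would then be the product of R with the current gaze vector, matching the later step of "determining the product of the sight line calibration matrix and the current sight line data".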
Optionally, after the step of determining the gaze calibration parameter based on the difference between the target gaze point coordinates and the predicted gaze point coordinates, the method further comprises:
Acquiring current sight line data of the user;
and determining the product of the sight line calibration matrix and the current sight line data as calibrated sight line data.
Optionally, before the step of acquiring the user's target line of sight data for calibration identification captured by the cockpit camera, the method further comprises:
controlling a projection area of the cockpit to display sight line calibration prompt information, wherein the sight line calibration prompt information is used for prompting a user to watch a calibration mark displayed in the projection area, and/or,
and controlling the cockpit audio output device to play a sight line calibration prompt voice, wherein the sight line calibration prompt voice is used for prompting a user to watch the calibration mark displayed in the projection area.
Optionally, the calibration mark is displayed in a projection area of a head-up display (HUD) in the cabin.
In a second aspect, embodiments of the present application provide a gaze calibration apparatus, the apparatus comprising:
a sight line data acquisition module, configured to acquire target line-of-sight data of a user for a calibration mark, captured by a cockpit camera, wherein the calibration mark is displayed in a projection area of the cockpit, the calibration mark is used for indicating the position at which the user gazes, and the projection area is an area located in front of the cockpit and used for displaying projection information;
The prediction line of sight coordinate determining module is used for determining the coordinate of the calibration mark under the cabin camera coordinate system as the prediction line of sight falling point coordinate based on the coordinate of the projection area under the cabin camera coordinate system calibrated in advance and the coordinate of the calibration mark in the projection area;
a target sight line coordinate determining module, configured to determine, based on the target sight line data, a target sight line drop point coordinate of the user for the calibration identifier in the cabin camera coordinate system;
and the sight calibration parameter determining module is used for determining the sight calibration parameter according to the target sight falling point coordinates and the predicted sight falling point coordinates.
Optionally, the apparatus further includes:
the first target image acquisition module is used for acquiring a first target image, which is acquired by the cabin camera and is reflected by the plane mirror, of the projection area;
the first projection area coordinate determining module is used for determining coordinates of the reflected projection area under a world coordinate system according to the parameters of the cabin camera calibrated in advance and pixel coordinates of the projection area in the first target image;
the second projection area coordinate determining module is used for determining the coordinates of the projection area under the world coordinate system according to the distance between the projection area and the plane mirror, the coordinates of the reflected projection area under the world coordinate system and the plane reflection imaging principle;
The third projection area coordinate determining module is used for converting the coordinate of the projection area under the world coordinate system into the coordinate of the projection area under the cabin camera coordinate system according to the conversion relation between the world coordinate system and the cabin camera coordinate system which are calibrated in advance;
the apparatus further comprises:
the second target image acquisition module is used for acquiring a second target image, acquired by the cabin camera, of the projection area, wherein the projection area is displayed with the first target, and the second target image is reflected by the plane mirror;
the fourth projection area coordinate determining module is used for determining the coordinate of the reflected projection area under the coordinate system of the cabin camera according to the parameters of the cabin camera calibrated in advance and the pixel coordinate of the first target in the second target image;
the third target image acquisition module is used for acquiring a third target image of the plane mirror, the surface of which is attached with a second target, acquired by the cabin camera;
the plane mirror coordinate determining module is used for determining coordinates of the plane mirror under a cabin camera coordinate system according to the parameters of the cabin camera calibrated in advance and the pixel coordinates of the second target in the third target image;
The fifth projection area coordinate determining module is used for determining the coordinate of the projection area under the coordinate system of the cabin camera according to the coordinate of the plane mirror under the coordinate system of the cabin camera, the coordinate of the reflected projection area under the coordinate system of the cabin camera and the plane reflection imaging principle;
the target sight line coordinate determination module includes:
the first calibration mark display sub-module is used for controlling the projection area of the cabin to display calibration marks and recording the number of the displayed calibration marks;
the sight line data acquisition sub-module is used for acquiring sight line data of the displayed calibration mark, which is captured by the cabin camera, of a user;
the quantity judging sub-module is used for determining whether the quantity of the recorded and displayed calibration marks reaches a preset quantity;
a second calibration mark display sub-module, configured to return to the projection area of the control cabin to display calibration marks and record the number of displayed calibration marks if the number of recorded and displayed calibration marks does not reach the preset number;
a target sight line data determining sub-module, configured to determine the acquired sight line data as the user's target sight line data for the calibration marks if the number of recorded and displayed calibration marks reaches the preset number;
The apparatus further comprises:
the data cleaning module is used for cleaning the data of the target sight line data to obtain cleaned target sight line data;
the target sight line coordinate determination module includes:
a target line of sight falling point coordinate sub-module, configured to determine, based on the cleaned target line of sight data, a target line of sight falling point coordinate of the user with respect to the calibration identifier in the cabin camera coordinate system;
the sight line data comprise a plurality of sight line position coordinates of a user captured by a cabin camera, wherein the sight line position coordinates are acquired within a preset time period after a calibration mark is displayed;
the data cleaning module comprises:
the first cleaning submodule is used for clustering the plurality of sight line position coordinates and determining a clustering center; determining the cluster center as cleaned target sight data;
the second cleaning submodule is used for clustering the plurality of sight line position coordinates and determining the category with the largest number of sight line position coordinates; determining the sight line position coordinates included in the determined category as cleaned target sight line data;
a third cleaning sub-module, configured to determine a sight focal point set of the user based on capture timings corresponding to the plurality of sight position coordinates; determining the end position coordinates of the sight line focus set as cleaned target sight line data;
A fourth cleaning sub-module, configured to determine a sight focal point set of the user based on capture timings corresponding to the plurality of sight position coordinates; determining sight line position coordinates with the distance between the sight line position coordinates and the end point position coordinates of the sight line focus set being smaller than a preset threshold value as cleaned target sight line data;
the sight line calibration parameter determination module comprises:
the calibration matrix calculation sub-module is used for calculating a sight calibration matrix R according to the following formula based on the target sight falling point coordinates and the predicted sight falling point coordinates:
wherein n is the number of target line-of-sight drop-point coordinates, V_{3×n} is the matrix of column vectors formed from the components of the predicted drop-point coordinates, V_{n×3} is the matrix of row vectors formed from the components of the predicted drop-point coordinates, and V′_{n×3} is the matrix of row vectors formed from the components of the target drop-point coordinates;
the apparatus further comprises:
the current sight line data acquisition module is used for acquiring current sight line data of the user;
the sight line data calibration module is used for determining the product of the sight line calibration matrix and the current sight line data as calibrated sight line data;
the apparatus further comprises:
The sight calibration prompt information display module is used for controlling a projection area of the cabin to display sight calibration prompt information, wherein the sight calibration prompt information is used for prompting a user to watch a calibration mark displayed in the projection area;
and the sight calibration prompt voice playing module is used for controlling the cabin audio output equipment to play a sight calibration prompt voice, wherein the sight calibration prompt voice is used for prompting a user to watch the calibration mark displayed in the projection area.
The calibration marks are displayed in a projection area of a head-up display (HUD) in the cabin.
In a third aspect, an embodiment of the present application provides an electronic device, including:
a memory for storing a computer program;
a processor configured to implement the method according to any one of the first aspect when executing a program stored in the memory.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium having a computer program stored therein, which when executed by a processor implements the method of any of the first aspects.
The embodiment of the application has the beneficial effects that:
in the scheme provided by the embodiment of the application, the electronic equipment can acquire the target sight data of the user for the calibration mark, which is captured by the cockpit camera, wherein the calibration mark is displayed in a projection area of the cockpit, the calibration mark is used for indicating the position where the user looks at the calibration mark, and the projection area is an area which is positioned in front of the cockpit and used for displaying projection information; determining the coordinate of the calibration mark under the cabin camera coordinate system based on the coordinate of the pre-calibrated projection area under the cabin camera coordinate system and the coordinate of the calibration mark in the projection area, and taking the coordinate as a predicted line-of-sight falling point coordinate; determining target sight drop point coordinates of a user for calibration identification under a cabin camera coordinate system based on the target sight data; and determining the sight calibration parameters according to the target sight falling point coordinates and the predicted sight falling point coordinates. Because the electronic device can control the projection area to display the calibration mark, and the coordinates of the calibration mark in the projection area are known, the electronic device can accurately determine the coordinates of the calibration mark in the cabin camera coordinate system, namely the predicted line-of-sight landing point coordinates according to the coordinates of the pre-calibrated projection area in the cabin camera coordinate system and the coordinates of the calibration mark in the projection area. 
Therefore, the electronic equipment can accurately determine the calibration parameters according to the difference between the predicted line-of-sight falling point coordinates and the target line-of-sight falling point coordinates, line-of-sight calibration can be realized, and the accuracy of the line-of-sight calibration is higher.
Of course, it is not necessary for any product or method embodying the application to achieve all of the advantages described above simultaneously.
Drawings
In order to more clearly illustrate the embodiments of the application or the technical solutions in the prior art, the drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the application; for those skilled in the art, other drawings may be obtained from them without creative effort.
FIG. 1 is a flow chart of a line-of-sight calibration method according to an embodiment of the present application;
FIG. 2 is a schematic illustration of a calibration mark displayed based on the projection area of the embodiment of FIG. 1;
FIG. 3 is a flow chart of a manner of determining coordinates of a projection area in a cabin camera coordinate system based on the embodiment of FIG. 1;
FIG. 4 is a schematic illustration of a projection area imaged by mirror reflection based on the embodiment of FIG. 1;
FIG. 5 is another flow chart of a manner of determining coordinates of a projected area in a cabin camera coordinate system based on the embodiment of FIG. 1;
FIG. 6 is a specific flowchart of step S101 in the embodiment shown in FIG. 1;
FIG. 7 is a flow chart of a line-of-sight calibration scheme based on the embodiment of FIG. 1;
FIG. 8 is a flow chart of a manner of determining gaze calibration parameters based on the embodiment of FIG. 1;
fig. 9 is a schematic structural diagram of a sight line calibration apparatus according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. Based on the embodiments of the present application, all other embodiments obtained by a person of ordinary skill in the art without creative effort fall within the scope of protection of the present application.
In order to achieve line-of-sight calibration, embodiments of the present application provide a line-of-sight calibration method, apparatus, electronic device, computer-readable storage medium, and computer program product. A line-of-sight calibration method provided by an embodiment of the present application will be first described below.
The sight line calibration method provided by the embodiment of the application can be applied to any electronic equipment needing to calibrate the sight line, for example, a processor of a cabin, a sight line calibration device and the like, and is not particularly limited. For clarity of description, hereinafter referred to as an electronic device.
As shown in fig. 1, a line-of-sight calibration method includes:
s101, acquiring target sight data of a user for calibration identification, which is captured by a cabin camera;
the calibration mark is displayed in a projection area of the cabin, the calibration mark is used for indicating the position where a user looks at the calibration mark, and the projection area is an area which is positioned in front of the cabin and used for displaying projection information.
S102, determining the coordinate of the calibration mark in the cabin camera coordinate system based on the coordinate of the projection area in the cabin camera coordinate system calibrated in advance and the coordinate of the calibration mark in the projection area, and taking the coordinate of the calibration mark in the cabin camera coordinate system as a predicted line-of-sight falling point coordinate;
s103, determining target sight drop point coordinates of the user for the calibration mark under the cabin camera coordinate system based on the target sight data;
s104, determining the sight calibration parameters according to the target sight falling point coordinates and the predicted sight falling point coordinates.
In the scheme provided by the embodiment of the application, the electronic equipment can acquire the target sight line data of the user for the calibration mark, which is captured by the cockpit camera, wherein the calibration mark is displayed in the projection area of the cockpit, the calibration mark is used for indicating the position where the user looks at the calibration mark, and the projection area is the area which is positioned in front of the cockpit and used for displaying projection information; determining the coordinate of the calibration mark under the cabin camera coordinate system based on the coordinate of the pre-calibrated projection area under the cabin camera coordinate system and the coordinate of the calibration mark in the projection area, and taking the coordinate as a predicted line-of-sight falling point coordinate; determining target sight drop point coordinates of a user for calibration identification under a cabin camera coordinate system based on the target sight data; and determining the sight calibration parameters according to the target sight falling point coordinates and the predicted sight falling point coordinates. Because the electronic device can control the projection area to display the calibration mark, and the coordinates of the calibration mark in the projection area are known, the electronic device can accurately determine the coordinates of the calibration mark in the cabin camera coordinate system, namely the predicted line-of-sight landing point coordinates according to the coordinates of the pre-calibrated projection area in the cabin camera coordinate system and the coordinates of the calibration mark in the projection area. 
Therefore, the electronic equipment can accurately determine the calibration parameters according to the difference between the predicted line-of-sight falling point coordinates and the target line-of-sight falling point coordinates, line-of-sight calibration can be realized, and the accuracy of the line-of-sight calibration is higher.
The projection area in the embodiment of the application may be the projection area corresponding to a head-up display (HUD), which can project information the user would otherwise have to look down to read onto the projection area in front of the cockpit. The projection area may be used to display projection information, which may include navigation information, incoming-call information, social software information, and the like, and is not specifically limited herein.
In one embodiment, the heads-up display may project the projected information onto a solid carrier in front of the cockpit, which may be a translucent resin panel, for example. In this case, the projection area is a solid area on the solid carrier in which the projection information is displayed. In another embodiment, the heads-up display may project the projected information onto a virtual plane in front of the cockpit. In this case, the projection area is a virtual area in which projection information is displayed in front of the cockpit.
The electronic device may control the projection area to display the calibration mark, so that when the user looks at the calibration mark, the cockpit camera may capture the target line of sight data of the user looking at the calibration mark, i.e. perform step S101. The calibration mark may be any pattern capable of indicating the position where the user looks at the calibration mark, and may be set according to the line-of-sight calibration requirement, where parameters such as shape, display position, color, and size of the calibration mark are not specifically limited.
As shown in fig. 2, the calibration mark 201 may be a black dot, the calibration mark 201 is displayed in the projection area 202, the position of the projection area 202 is located in front of the cabin, the projection area 202 and the driver's eyes 203 are respectively located at two sides of the front windshield 204, and the cabin camera 205 may capture the target line of sight data of the driver's eyes 203 with respect to the calibration mark 201.
In the embodiment of the present application, the format of the target sight line data is not particularly limited. For example, the target line of sight data may be an image captured by the cockpit camera when the user gazes at the calibration mark, line of sight direction coordinates when the user gazes at the calibration mark, and so on.
Since the cabin camera faces the user while the projection area is located in front of the cockpit, neither the projection area nor the calibration mark displayed in it can be photographed directly by the cabin camera. Therefore, in order to determine the coordinates of the calibration mark displayed in the projection area in the cabin camera coordinate system, the coordinates of the projection area in the cabin camera coordinate system can be calibrated in advance.
Thus, since the coordinates of the calibration mark in the projection area are known, the electronic device can determine the coordinates of the calibration mark in the cabin camera coordinate system as the predicted line-of-sight landing point coordinates from the coordinates of the projection area in the cabin camera coordinate system and the coordinates of the calibration mark in the projection area, i.e., perform step S102.
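As a sketch of step S102, under the assumption that the pre-calibrated projection area is parameterized by an origin and two in-plane axis vectors in the cabin camera coordinate system (a hypothetical parameterization; the patent does not fix one), the predicted line-of-sight falling point can be computed as:

```python
import numpy as np

def mark_to_camera_coords(mark_uv, area_origin, area_u_axis, area_v_axis):
    """Predicted line-of-sight falling point: map the calibration mark's
    known 2D position inside the projection area into the cabin-camera
    frame, given the pre-calibrated area origin and in-plane axes."""
    u, v = mark_uv
    return (np.asarray(area_origin, float)
            + u * np.asarray(area_u_axis, float)
            + v * np.asarray(area_v_axis, float))
```

For example, a mark at (0.5, 0.25) inside an area whose origin sits at (1, 0, 2) in the camera frame, with unit axes along x and y, lands at (1.5, 0.25, 2).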
If the user's line of sight has no deviation, the position the user gazes at when looking at a given calibration mark is exactly the predicted line-of-sight falling point described above. The predicted line-of-sight falling point coordinates can therefore be used to represent the sight direction of a user gazing at the calibration mark in the case of no deviation of the user's line of sight.
In order to determine the deviation between the gaze direction coordinates and the predicted gaze location coordinates when the user actually gazes at the calibration marker, the electronic device may determine the target gaze location coordinates of the user for the calibration marker in the cabin camera coordinate system based on the obtained target gaze data, i.e. execute step S103.
In one embodiment, the target line of sight data is an image captured by the cockpit camera when the user gazes at the calibration mark. The electronic device can identify the eyeball position and eyeball direction of the user in the image, determine the conversion relationship between the cabin camera coordinate system and the image coordinate system according to the internal parameters of the cabin camera, and then determine the eyeball position and eyeball direction in the cabin camera coordinate system according to this conversion relationship and the corresponding coordinates of the eyeball position and eyeball direction in the image, where the eyeball direction is a ray in the cabin camera coordinate system. The coordinates of the eyeball position in the cabin camera coordinate system can also be obtained through a depth camera. The electronic device can then determine the intersection point of the eyeball direction and the projection plane in the cabin camera coordinate system, and determine the coordinates of this intersection point as the target line-of-sight falling point coordinates of the user for the calibration mark.
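The final step of this embodiment, intersecting the eyeball-direction ray with the projection plane in the cabin camera coordinate system, can be sketched as follows; the plane is assumed to be given by a point on it and its normal vector:

```python
import numpy as np

def gaze_landing_point(eye_pos, gaze_dir, plane_point, plane_normal):
    """Intersect the gaze ray (eye_pos + t * gaze_dir) with the projection
    plane, all quantities expressed in the cabin-camera coordinate system.
    Returns None when the ray is parallel to the plane or points away."""
    eye_pos = np.asarray(eye_pos, float)
    gaze_dir = np.asarray(gaze_dir, float)
    n = np.asarray(plane_normal, float)
    denom = gaze_dir @ n
    if abs(denom) < 1e-9:
        return None  # gaze ray parallel to the projection plane
    t = ((np.asarray(plane_point, float) - eye_pos) @ n) / denom
    if t < 0:
        return None  # projection plane is behind the eye
    return eye_pos + t * gaze_dir
```

For instance, an eye at the origin looking along +z hits a plane at z = 2 at the point (0, 0, 2).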
In another embodiment, the target gaze data is an eye position and an eye direction of a user captured by the head mounted gaze capturing device when looking at the calibration marker. For example, the head-mounted gaze capturing device may be an eye-tracking device. The eye tracker captures the eyeball position and the eyeball direction under the coordinate system of the eye tracker, and the electronic equipment can convert the eyeball position and the eyeball direction from the coordinate system of the eye tracker to the coordinate system of the cabin camera according to the conversion relation between the coordinate system of the eye tracker and the coordinate system of the cabin camera, which are calibrated in advance. Therefore, the electronic equipment can determine the intersection point of the eyeball direction and the projection plane under the cabin camera coordinate system, and the coordinate of the intersection point is determined as the target sight falling point coordinate of the user for the calibration mark.
For the same calibration mark, the target line-of-sight falling point coordinates represent the actual sight direction of the user gazing at the calibration mark, while the predicted line-of-sight falling point coordinates represent the sight direction the user would have when gazing at that calibration mark in the case of no deviation of the user's line of sight. Therefore, the electronic device may determine the sight calibration parameter according to the target line-of-sight falling point coordinates and the predicted line-of-sight falling point coordinates, i.e. execute step S104.
The sight line calibration parameters are used for calibrating the actual sight line data of the user so as to correct deviation of the sight line of the user. In the first embodiment, the electronic device may calculate a mapping relationship between the target line-of-sight falling point coordinates and the predicted line-of-sight falling point coordinates, and further use the mapping relationship as the line-of-sight calibration parameter.
In a second embodiment, the electronic device may determine, according to the target gaze point coordinates and the eyeball position of the user, a target gaze direction vector of the user gaze calibration marker; and determining a predicted line of sight direction vector when the user gazes at the calibration mark under the condition that the user line of sight is not deviated according to the predicted line of sight falling point coordinates and the eyeball position of the user. In this way, the electronic device can determine the gaze calibration parameters based on the difference between the target gaze direction vector and the predicted gaze direction vector.
In a third embodiment, the electronic device may determine, according to the target line-of-sight falling point coordinates and the eyeball position of the user, the target line-of-sight direction vector of the user gazing at the calibration mark; and determine a predicted line-of-sight direction vector of the user gazing at the calibration mark in the case of no deviation of the user's line of sight, according to the predicted line-of-sight falling point coordinates and the eyeball position of the user. In this way, the electronic device may determine the sight calibration parameters based on the user's eyeball position together with the target line-of-sight direction vector, and the user's eyeball position together with the predicted line-of-sight direction vector.
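One plausible realization of the mapping relationship in the first embodiment is a least-squares 2D affine map from the target falling point coordinates to the predicted falling point coordinates; this concrete form is an assumption, since the patent does not specify the mapping:

```python
import numpy as np

def fit_affine_calibration(target_pts, predicted_pts):
    """Least-squares 2x3 affine map taking actual (target) falling points
    to predicted falling points; the matrix serves as the calibration
    parameter. Needs at least 3 non-collinear calibration marks."""
    T = np.asarray(target_pts, float)     # shape (N, 2), actual points
    P = np.asarray(predicted_pts, float)  # shape (N, 2), predicted points
    X = np.hstack([T, np.ones((len(T), 1))])   # homogeneous inputs
    A, *_ = np.linalg.lstsq(X, P, rcond=None)  # (3, 2) solution
    return A.T                                 # (2, 3) affine matrix

def apply_calibration(A, pt):
    """Correct an actual falling point with the fitted calibration map."""
    return A @ np.append(np.asarray(pt, float), 1.0)
```

After fitting on the calibration marks, `apply_calibration` corrects any subsequently observed falling point.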
It can be seen that, in the embodiment of the present application, since the electronic device may control the projection area to display the calibration mark, and the coordinates of the calibration mark in the projection area are known, the electronic device may accurately determine the coordinates of the calibration mark in the cabin camera coordinate system, that is, the predicted line-of-sight landing coordinates, according to the coordinates of the projection area under the cabin camera coordinate system that is calibrated in advance and the coordinates of the calibration mark in the projection area. Therefore, the electronic equipment can accurately determine the calibration parameters according to the difference between the predicted line-of-sight falling point coordinates and the target line-of-sight falling point coordinates, line-of-sight calibration can be realized, and the accuracy of the line-of-sight calibration is higher.
As shown in fig. 3, before the step of obtaining the target line-of-sight data of the user for calibration identification captured by the cabin camera, the method may further include:
s301, acquiring a first target image, which is acquired by the cabin camera and reflected by a plane mirror, of the projection area;
because the projection area is in front of the cockpit and the cabin camera faces the direction of the driver's seat, a plane mirror can be erected in the cockpit so that the cabin camera can indirectly photograph the projection area: the image formed by the projection area after reflection in the plane mirror is located in the direction of the driver's seat. In this way, the cabin camera can acquire a first target image of this mirror image, namely a first target image of the projection area after reflection by the plane mirror, and the electronic device can acquire the first target image acquired by the cabin camera.
A schematic view of reflection imaging of the projection area through a plane mirror may be shown in fig. 4, where the projection area 401 is located in front of the cockpit, and the cockpit camera 402 is oriented in the direction of the driving seat in the cockpit. An image 404 formed by reflection of the projection area 401 by the plane mirror 403 is located in the direction of the driving position in the cockpit, and the cockpit camera 402 may acquire a first target image for this image 404.
S302, determining coordinates of the reflected projection area under a world coordinate system according to the parameters of the cabin camera calibrated in advance and pixel coordinates of the projection area in the first target image;
in order to determine the conversion relationship between the pixel coordinate system and the world coordinate system, the internal parameters of the cabin camera may be calibrated in advance. The internal parameters may include scale factors of the cabin camera in two coordinate axis directions of the image coordinate system, principal point coordinates with respect to the image coordinate system, focal length, optical center position, tilt parameters of the coordinate axes of the image coordinate system, and distortion parameters. As an embodiment, the internal parameters of the cockpit camera may be determined by a Zhang Zhengyou calibration method.
In this way, the electronic device can determine the coordinates of the reflected projection area in the world coordinate system according to the parameters of the cabin camera and the pixel coordinates of the projection area in the first target image.
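A minimal illustration of how the pre-calibrated internal parameters relate pixel coordinates to viewing directions in the camera frame, assuming an undistorted pinhole model with intrinsic matrix K (distortion correction, which the internal parameters also cover, is omitted for brevity):

```python
import numpy as np

def pixel_to_ray(pixel, K):
    """Back-project a pixel through the intrinsic matrix K into a unit
    viewing-ray direction in the camera frame (pinhole model)."""
    u, v = pixel
    x = np.linalg.solve(np.asarray(K, float), np.array([u, v, 1.0]))
    return x / np.linalg.norm(x)
```

The principal point back-projects to the optical axis: with K = [[1000, 0, 640], [0, 1000, 360], [0, 0, 1]], the pixel (640, 360) maps to the ray (0, 0, 1).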
S303, determining the coordinates of the projection area under the world coordinate system according to the distance between the projection area and the plane mirror, the coordinates of the reflected projection area under the world coordinate system and a plane reflection imaging principle;
the reflected projection area is formed by reflecting the projection area in the plane mirror, so the distance between the reflected projection area and the plane mirror equals the distance between the projection area and the plane mirror. Once the coordinates of the reflected projection area in the world coordinate system have been determined, the electronic device can determine the coordinates of the projection area in the world coordinate system according to the coordinates of the reflected projection area in the world coordinate system, the distance between the projection area and the plane mirror, and the plane reflection imaging principle.
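The plane reflection imaging principle used here amounts to mirroring points across the plane of the mirror; a small sketch, assuming the mirror plane is given by a point on it and its unit normal:

```python
import numpy as np

def reflect_across_mirror(point, mirror_point, mirror_normal):
    """Mirror a 3D point across the plane of the flat mirror: the real
    area and its mirror image are symmetric about the mirror plane,
    lying at equal distances on opposite sides of it."""
    p = np.asarray(point, float)
    n = np.asarray(mirror_normal, float)
    n = n / np.linalg.norm(n)
    d = (p - np.asarray(mirror_point, float)) @ n  # signed distance to mirror
    return p - 2.0 * d * n
```

Because the operation is its own inverse, applying it to the coordinates of the reflected projection area recovers the projection area itself.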
S304, converting the coordinate of the projection area under the world coordinate system into the coordinate of the projection area under the cabin camera coordinate system according to the conversion relation between the world coordinate system and the cabin camera coordinate system calibrated in advance.
In order to determine the coordinates of the projection area under the cabin camera coordinate system, the conversion relationship between the cabin camera coordinate system and the world coordinate system may be calibrated in advance. Specifically, the external parameters of the cabin camera may be calibrated in advance, and may include a translation vector and a rotation matrix. As an embodiment, the external parameters of the cockpit camera may be determined by a Zhang Zhengyou calibration method.
In this way, the electronic device can convert the coordinates of the projection area in the world coordinate system into the coordinates of the projection area in the cabin camera coordinate system according to the conversion relation between the world coordinate system and the cabin camera coordinate system.
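The world-to-camera conversion using the pre-calibrated external parameters (rotation matrix R and translation vector t) can be sketched as:

```python
import numpy as np

def world_to_camera(pts_world, R, t):
    """Apply the extrinsics to move projection-area coordinates from the
    world frame into the cabin-camera frame: X_cam = R @ X_world + t."""
    pts = np.asarray(pts_world, float).reshape(-1, 3)
    return (R @ pts.T).T + np.asarray(t, float)
```

For example, with an identity rotation and a translation of (1, 0, 0), the world points (0, 0, 0) and (0, 2, 0) become (1, 0, 0) and (1, 2, 0) in the camera frame.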
Therefore, in the embodiment of the application, the projection area is positioned in front of the cabin, and the cabin camera faces the direction of the driver seat where the user is positioned, so that the plane mirror can be erected in the cabin. The cockpit camera may capture a first target image of the projected area after reflection off of the planar mirror. In order to determine the conversion relationship between the pixel coordinate system and the world coordinate system and the conversion relationship between the world coordinate system and the cabin camera coordinate system, the internal parameters and the external parameters of the cabin camera may be calibrated in advance. In this way, the electronic device can determine the coordinates of the reflected projection area under the world coordinate system according to the pixel coordinates of the projection area in the first target image and the internal reference of the cabin camera. The electronic device can determine the coordinates of the projection area in the world coordinate system according to the distance between the projection area and the plane mirror, the coordinates of the reflected projection area in the world coordinate system and the plane reflection imaging principle. Further, the coordinates of the projection area in the world coordinate system are converted into the coordinates of the projection area in the cabin camera coordinate system according to the conversion relation between the world coordinate system and the cabin camera coordinate system. Therefore, the coordinates of the projection area under the cabin camera coordinate system can be accurately determined, and further, the sight calibration is realized.
As shown in fig. 5, before the step of obtaining the target line-of-sight data of the user for calibration identification captured by the cabin camera, the method may further include:
s501, acquiring a second target image, acquired by the cabin camera, of the projection area, wherein the second target image is displayed on the projection area and reflected by a plane mirror;
because the projection area is in front of the cockpit and the cabin camera faces the direction of the driver's seat where the user is located, a plane mirror with a fixed position can be erected in the cockpit so that the cabin camera can indirectly photograph the projection area: the image formed by the projection area after reflection in the plane mirror is located in the direction of the driver's seat. In this way, the cabin camera can acquire a second target image of this mirror image, namely a second target image of the projection area after reflection by the plane mirror, and the electronic device can acquire the second target image acquired by the cabin camera.
The electronic device may control the projection area to display a first target, which may be one of a checkerboard, a circular target, and a random target, which is not particularly limited herein. In this way, the electronic device can determine the coordinates of the first target in the cabin camera coordinate system, and since the first target is displayed on the projection area, the coordinates of the first target in the cabin camera coordinate system can be regarded as the coordinates of the projection area in the cabin camera coordinate system.
S502, determining coordinates of the reflected projection area under a cabin camera coordinate system according to the parameters of the cabin camera calibrated in advance and the pixel coordinates of the first target in the second target image;
in order to determine the conversion relationship between the image coordinate system and the cabin camera coordinate system, the internal parameters of the cabin camera may be calibrated in advance. The internal parameters may include scale factors of the cabin camera in two coordinate axis directions of the image coordinate system, principal point coordinates with respect to the image coordinate system, focal length, optical center position, tilt parameters of the coordinate axes of the image coordinate system, and distortion parameters. As an embodiment, the internal parameters of the cockpit camera may be determined by a Zhang Zhengyou calibration method.
In this way, the electronic device can determine the coordinates of the reflected projection area in the cabin camera coordinate system according to the parameters of the cabin camera and the pixel coordinates of the first target in the second target image.
S503, acquiring a third target image of the plane mirror, wherein the surface of the third target image is attached with a second target, and the third target image is acquired by the cockpit camera;
after obtaining the coordinates of the reflected projection area in the cabin camera coordinate system, and since the reflected projection area is formed by reflecting the projection area in the plane mirror, the electronic device may first determine the coordinates of the plane mirror in the cabin camera coordinate system in order to then determine the coordinates of the projection area in the cabin camera coordinate system.
In particular, the electronic device may acquire the third target image, acquired by the cockpit camera, of the plane mirror with the second target attached to its surface. The second target may be a checkerboard, a circular target, or a random target. In this way, the electronic device can determine the coordinates of the second target in the cabin camera coordinate system, and since the second target is attached to the surface of the plane mirror, the coordinates of the second target in the cabin camera coordinate system can be regarded as the coordinates of the plane mirror in the cabin camera coordinate system.
S504, determining coordinates of the plane mirror under a cabin camera coordinate system according to the parameters of the cabin camera calibrated in advance and pixel coordinates of the second target in the third target image;
in order to determine the coordinates of the plane mirror in the cabin camera coordinate system, and thus the coordinates of the projection area in the cabin camera coordinate system, the electronic device may determine the coordinates of the plane mirror in the cabin camera coordinate system according to the pre-calibrated internal reference of the cabin camera and the pixel coordinates of the second target in the third target image.
As an embodiment, if the cabin camera captures the second target image and the third target image simultaneously, the position of the plane mirror does not need to be fixed. Since the reflected projection area is formed by reflecting the projection area in the plane mirror, its position depends on the position of the plane mirror, so it must be ensured that the plane mirror occupies the same position when the second target image is captured as when the third target image is captured. If the cabin camera captures the second target image of the reflected projection area and the third target image of the plane mirror at the same time, the plane mirror cannot have moved between the two captures, and its position therefore does not need to be fixed.
S505, determining the coordinates of the projection area under the coordinate system of the cabin camera according to the coordinates of the plane mirror under the coordinate system of the cabin camera, the coordinates of the reflected projection area under the coordinate system of the cabin camera and the plane reflection imaging principle.
Because the reflected projection area is formed after the projection area is reflected by the plane mirror, the projection area and the reflected projection area still satisfy the plane reflection imaging principle in the cabin camera coordinate system. The electronic device can therefore determine the distance from the reflected projection area to the plane mirror in the cabin camera coordinate system according to the coordinates of the plane mirror and the coordinates of the reflected projection area in the cabin camera coordinate system.
Next, the electronic device may determine the coordinates of the projection area in the cabin camera coordinate system based on this distance, the plane reflection imaging principle, and the coordinates of the plane mirror in the cabin camera coordinate system.
As an embodiment, the coordinates of the projection area in the cabin camera coordinate system may be determined by the auxiliary camera. And erecting a pre-calibrated auxiliary camera at a position including a projection area in a shooting range, and acquiring an image acquired by the auxiliary camera for the projection area. The coordinates of the projection area in the image coordinate system are converted into coordinates in the auxiliary camera coordinate system according to the internal parameters of the auxiliary camera, and the coordinates of the projection area in the auxiliary camera coordinate system are converted into coordinates in the world coordinate system according to the external parameters of the auxiliary camera. Furthermore, the coordinates of the projection area in the world coordinate system are converted into the coordinates of the projection area in the cabin camera coordinate system according to the position relation of the cabin camera and the auxiliary camera in the world coordinate system and the external parameters of the cabin camera.
In the embodiment of the application, the electronic device can acquire the second target image acquired by the cabin camera and reflected by the plane mirror in the projection area displayed with the first target; determining coordinates of the reflected projection area under a cabin camera coordinate system according to parameters of a cabin camera calibrated in advance and pixel coordinates of a first target in a second target image; acquiring a third target image of a plane mirror, the surface of which is attached with a second target, acquired by a cabin camera; determining coordinates of the plane mirror under a cabin camera coordinate system according to parameters of a cabin camera calibrated in advance and pixel coordinates of a second target in a third target image; and determining the coordinates of the projection area under the cabin camera coordinate system according to the coordinates of the plane mirror under the cabin camera coordinate system, the coordinates of the reflected projection area under the cabin camera coordinate system and the plane reflection imaging principle. Because the projection area after reflection is formed after the projection area is reflected by the plane mirror, the projection area under the cabin camera coordinate system and the projection area after reflection also meet the plane reflection imaging principle, and therefore the electronic equipment can determine the coordinate of the projection area under the cabin camera coordinate system according to the plane reflection imaging principle, the coordinate of the plane mirror under the cabin camera coordinate system and the coordinate of the projection area after reflection under the cabin camera coordinate system. 
The above process only uses the internal parameters of the cabin camera, and does not need to use the external parameters of the cabin camera or convert the coordinates into the world coordinate system, so that the coordinates of the projection area in the cabin camera coordinate system can be rapidly determined.
As shown in fig. 6, the step of obtaining the target line of sight data of the user for calibration identification, which is captured by the cabin camera, may include:
s601, controlling a projection area of a cabin to display calibration marks, and recording the number of the displayed calibration marks;
in order to determine whether the number of calibration marks displayed reaches the preset number, the electronic device may record the number of calibration marks displayed while controlling the projection area of the cockpit to display the calibration marks.
For example, if the electronic device has displayed two calibration marks before the calibration marks are displayed in the projection area of the control cabin, the number of displayed calibration marks recorded by the electronic device is 2.
For another example, if the electronic device does not display the calibration marks before the calibration marks are displayed in the projection area of the control pod, then the number of displayed calibration marks recorded by the electronic device is 0.
In the embodiment of the application, the number of calibration marks and the movement track are not particularly limited. For example, the calibration marks may be dots displayed one by one in a circular track, fixed-position dots, or the like.
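For illustration, calibration-mark positions displayed one by one along a circular track could be generated as follows; the normalized projection-area coordinates and the center and radius values are assumptions, not taken from the patent:

```python
import math

def circular_calibration_points(n, cx=0.5, cy=0.5, r=0.3):
    """n calibration-mark positions, displayed one by one along a
    circular track, in normalized projection-area coordinates."""
    return [(cx + r * math.cos(2 * math.pi * k / n),
             cy + r * math.sin(2 * math.pi * k / n)) for k in range(n)]
```

With n = 4 the first mark appears at (0.8, 0.5), the second at (0.5, 0.8), and so on around the circle.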
S602, acquiring sight line data of a user for the displayed calibration mark, which is captured by the cockpit camera;
to determine the direction of the line of sight of the user looking at the calibration markers, a cockpit camera may be used to capture line of sight data of the user for the displayed calibration markers. In this way, the electronic device can acquire gaze data captured by the cockpit camera. For example, the gaze data may be a user image of the user looking at the calibration identification.
As one implementation of embodiments of the application, whenever the electronic device controls the projection area to display a calibration mark, the cockpit camera will capture line of sight data for the user for that calibration mark. For example, the electronic device controls the projection area to display a calibration mark for which the cockpit camera captures line of sight data of the user. In the case where the electronic device next controls the projection area to display the calibration mark, the cockpit camera will also capture line of sight data of the user for the next calibration mark.
S603, determining whether the number of the recorded and displayed calibration marks reaches a preset number; if the number of the recorded and displayed calibration marks does not reach the preset number, returning to the step S601; if the number of recorded and displayed calibration marks reaches the preset number, step S604 is performed.
In order to further improve the accuracy of the sight calibration, the electronic device may preset the number of calibration marks and perform sight calibration for the user according to this preset number of calibration marks. The electronic device may thus determine whether the recorded number of displayed calibration marks reaches the preset number.
For example, if the preset number is 5 and the number of displayed calibration marks is 4, the electronic device will return to execute step S601 to control the projection area of the cockpit to display the 5 th calibration mark.
And S604, determining the acquired sight line data as target sight line data of the user for the calibration identification.
In the case where the recorded number of displayed calibration marks reaches the preset number, sufficient line-of-sight data has been acquired. The electronic device may then determine the acquired sight line data as the target sight line data of the user for the calibration marks.
Continuing the above example, after the electronic device controls the projection area of the cabin to display the 5th calibration mark, the electronic device may acquire the line-of-sight data, captured by the cabin camera, of the user for the 5th calibration mark. It then judges that the number of displayed calibration marks has reached the preset number, and can determine the sight line data corresponding to the 5 displayed calibration marks as the target sight line data of the user for those 5 calibration marks.
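The display-and-capture loop of steps S601 to S604 can be sketched as follows, where `show_mark` and `capture_gaze` are hypothetical callbacks standing in for the projection control and the cockpit camera:

```python
def collect_target_gaze_data(show_mark, capture_gaze, preset_count):
    """Sketch of the S601-S604 loop: display calibration marks until the
    preset number is reached, capturing gaze data for each one."""
    gaze_data = []
    shown = 0
    while shown < preset_count:          # S603: check against preset number
        mark = show_mark(shown)          # S601: display next mark
        shown += 1                       # S601: record the displayed count
        gaze_data.append(capture_gaze(mark))  # S602: capture gaze data
    return gaze_data                     # S604: samples = target gaze data
```

Plugging in stub callbacks shows the loop yields one gaze sample per displayed mark.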
It can be seen that, in the embodiment of the present application, the electronic device may control the projection area of the cabin to display the calibration marks, and record the number of the displayed calibration marks; acquiring sight line data of a user for the displayed calibration mark, which is captured by a cabin camera; determining whether the number of recorded displayed calibration marks reaches a preset number; if the recorded number of the displayed calibration marks does not reach the preset number, returning to the step of controlling the projection area of the cabin to display the calibration marks and recording the number of the displayed calibration marks; and if the number of the recorded and displayed calibration marks reaches the preset number, determining the acquired sight line data as target sight line data of the user for the calibration marks. Because the electronic device can acquire the sight line data corresponding to the preset number of calibration marks and takes the sight line data corresponding to the preset number of calibration marks as the target sight line data, the sight line calibration of the user can be performed on the basis of multiple groups of target sight line data. In this way, the accuracy of the line-of-sight calibration can be further improved.
As an implementation manner of the embodiment of the present application, before the step of determining, based on the target line of sight data, the target line of sight landing coordinate of the user for the calibration mark in the cabin camera coordinate system, the method may further include:
And performing data cleaning on the target sight line data to obtain cleaned target sight line data.
Since the user's gaze typically moves from elsewhere to the calibration mark during the gazing process, the target sight line data may include data collected while the user's gaze was still moving. The user may also be distracted, i.e. the user's gaze point may fall far from the calibration mark. Therefore, in order to determine the target sight line data corresponding to the user actually gazing at the calibration mark, the electronic device can perform data cleaning on the target sight line data to obtain cleaned target sight line data.
In one embodiment, the electronic device may perform data cleansing on a set of target line-of-sight data after acquiring the set of target line-of-sight data. For example, the electronic device may perform data cleansing on the first set of target line-of-sight data after acquiring the first set of target line-of-sight data. After the second set of target line of sight data is acquired, data cleansing is performed on the second set of target line of sight data.
In another embodiment, the electronic device may first acquire a preset number of sets of target line-of-sight data, and perform data cleansing on each set of target line-of-sight data. For example, the electronic device may acquire a first set of target line-of-sight data and a second set of target line-of-sight data, and then perform data cleansing for the first set of target line-of-sight data and the second set of target line-of-sight data, respectively.
The step of determining, based on the target line of sight data, the target line of sight landing point coordinates of the user for the calibration mark in the cabin camera coordinate system may include:
and determining the target sight drop point coordinates of the user for the calibration mark under the cabin camera coordinate system based on the cleaned target sight data.
Because the cleaned target sight line data can more accurately reflect the position of the user looking at the calibration mark, after the cleaned target sight line data is obtained, the electronic equipment can determine the target sight line drop point coordinates of the user for the calibration mark under the cabin camera coordinate system based on the cleaned target sight line data.
Therefore, in the embodiment of the application, the electronic equipment can perform data cleaning on the target sight line data to obtain cleaned target sight line data; and determining the target line-of-sight falling point coordinates of the user for the calibration mark under the cabin camera coordinate system based on the cleaned target line-of-sight data. Because the target sight line data may include a case that the sight line of the user does not fall on the calibration mark, the electronic device may perform data cleaning on the target sight line data. Therefore, the cleaned target sight line data can more accurately reflect the position of the user gazing at the calibration mark, and the accuracy of sight line calibration can be further improved.
As an implementation manner of the embodiment of the present application, the line-of-sight data may include a plurality of line-of-sight position coordinates of the user captured by the cabin camera acquired within a preset time period after the calibration identifier is displayed. In this case, the step of performing data cleansing on the target line-of-sight data to obtain cleansed target line-of-sight data may include performing data cleansing in one of the following manners to obtain cleansed target line-of-sight data:
clustering the plurality of sight line position coordinates to determine a clustering center; determining the cluster center as cleaned target sight data;
clustering the plurality of sight line position coordinates, and determining the category with the largest number of sight line position coordinates; determining the sight line position coordinates included in the determined category as cleaned target sight line data;
determining a sight focus set of the user based on capture timing corresponding to the plurality of sight position coordinates; determining the end position coordinates of the sight line focus set as cleaned target sight line data;
determining a sight focus set of the user based on capture timing corresponding to the plurality of sight position coordinates; and determining the sight line position coordinates with the distance between the sight line position coordinates and the end point position coordinates of the sight line focus set being smaller than a preset threshold value as cleaned target sight line data.
In the first embodiment, two observations motivate the clustering approach. First, in the case that the user is distracted, the distance between the corresponding sight line position coordinate and most of the other sight line position coordinates is large. Second, in the process of the sight line sliding toward the calibration mark, the duration of the sliding is far shorter than the time the sight line finally stays on the calibration mark, so the number of sight line position coordinates captured during the sliding is far smaller than the number captured while the sight line stays on the calibration mark.
Based on the above reasons, the electronic device may cluster the plurality of line-of-sight position coordinates acquired within the preset time period after the calibration identifier is displayed, and determine a cluster center corresponding to the plurality of line-of-sight position coordinates. And further determining the cluster center as the cleaned target line-of-sight data.
For example, in the cabin camera coordinate system, the plurality of line-of-sight position coordinates are (10,21,5), (13,21,5), and (7,21,5), respectively, and the electronic device may cluster the plurality of line-of-sight position coordinates to determine the cluster center as (10,21,5). Further, the cluster center (10,21,5) is determined as the cleaned target line-of-sight data.
As an implementation manner, the electronic device may cluster the plurality of line-of-sight position coordinates acquired within the preset time period after the calibration identifier is displayed, determine a cluster center corresponding to the plurality of line-of-sight position coordinates, and determine the line-of-sight position coordinates with a distance from the cluster center smaller than a preset threshold as the cleaned target line-of-sight data.
For example, in the cabin camera coordinate system, the plurality of line-of-sight position coordinates are (24,15,32), (25,15,32), (23,15,32), (10,21,5), (13,21,5), and (7,21,5), respectively. The electronic device may cluster the plurality of gaze location coordinates, determining a cluster center as (24,15,32). If the preset threshold is 2, the electronic device may determine that the line-of-sight position coordinates with a distance from the cluster center less than the preset threshold are (23,15,32), (25,15,32). (23,15,32), (25,15,32) and (24,15,32) are determined as post-cleaning target line-of-sight data.
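The clustering-based cleaning described above can be sketched as follows. This is a minimal illustrative sketch, not part of the patent: it uses a tiny k-means written with NumPy, and the function name `clean_by_cluster_center` and its parameters (cluster count, iteration count, seed) are hypothetical choices for the example.

```python
import numpy as np

def clean_by_cluster_center(points, k=2, iters=20, seed=0):
    """Cluster 3-D gaze coordinates with a tiny k-means and return the
    center of the most populated cluster as the cleaned gaze sample."""
    pts = np.asarray(points, dtype=float)
    rng = np.random.default_rng(seed)
    centers = pts[rng.choice(len(pts), size=k, replace=False)]
    for _ in range(iters):
        # assign every coordinate to its nearest center
        labels = np.argmin(((pts[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = pts[labels == j].mean(axis=0)
    # the dwell points dominate, so keep the largest cluster's center
    largest = np.bincount(labels, minlength=k).argmax()
    return centers[largest]

# Coordinates from the example above; with a single cluster the center
# is simply their mean, (10, 21, 5).
gaze = [(10, 21, 5), (13, 21, 5), (7, 21, 5)]
print(clean_by_cluster_center(gaze, k=1))
```

A production implementation would more likely use a library clustering routine; the point is only that the cluster center of the dominant group serves as the cleaned sample.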
In the second embodiment, since the cabin camera continuously collects the line of sight position coordinates of the user during the display of the calibration mark, the number of line of sight position coordinates concentrated near the calibration mark is large after the line of sight landing point of the user stabilizes at the calibration mark. The electronic device may cluster the plurality of gaze location coordinates to obtain a plurality of categories. The category having the largest number of sight line position coordinates is determined, and the sight line position coordinates included in the category are determined as the cleaned target sight line data.
For example, after clustering the plurality of line-of-sight position coordinates, the electronic device obtains three categories, namely a category one, a category two and a category three. Category one includes line of sight position coordinates (7,10,32), (8,10,31) and (7,11,30); the line of sight position coordinates included in category two are (15,16,31), (16,15,31) and (14,17,30); category three includes line of sight position coordinates (26,16,31), (25,15,31), (24,15,32), (25,15,32) and (25,16,30). Since the number of line-of-sight position coordinates included in the category three is the largest, the electronic apparatus may use the line-of-sight position coordinates included in the category three as the post-cleaning target line-of-sight data.
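The second cleaning manner, keeping every coordinate of the most populated category, can be sketched as below. This sketch assumes the cluster centers have already been produced by some clustering step; the helper name `largest_cluster_points` and the approximate center values are hypothetical.

```python
import numpy as np

def largest_cluster_points(points, centers):
    """Assign each gaze coordinate to its nearest cluster center and return
    the coordinates belonging to the most populated category."""
    pts = np.asarray(points, dtype=float)
    ctr = np.asarray(centers, dtype=float)
    labels = np.argmin(((pts[:, None] - ctr) ** 2).sum(-1), axis=1)
    biggest = np.bincount(labels, minlength=len(ctr)).argmax()
    return pts[labels == biggest]

# The three categories from the example; category three has five coordinates.
pts = [(7, 10, 32), (8, 10, 31), (7, 11, 30),
       (15, 16, 31), (16, 15, 31), (14, 17, 30),
       (26, 16, 31), (25, 15, 31), (24, 15, 32), (25, 15, 32), (25, 16, 30)]
centers = [(7.3, 10.3, 31.0), (15.0, 16.0, 30.7), (25.0, 15.4, 31.2)]
print(len(largest_cluster_points(pts, centers)))  # 5
```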
In a third embodiment, since the user is looking at the calibration mark, the line-of-sight position coordinates start to move from somewhere and finally stop at the calibration mark. Therefore, in order to clean the sight line position coordinates when the sight line of the user moves, the electronic device can determine the sight line focus set of the user based on the capturing time sequences corresponding to the plurality of sight line position coordinates. The sight focal point set is a sequence in which a plurality of sight focal points are arranged according to the sequence of the acquisition time.
Because the end point of the user's sight focus set is the sight position coordinate of the user's gaze calibration mark, the electronic device may determine the end point position coordinate of the sight focus set as the cleaned target sight data. Namely, data cleaning is performed by a data cleaning mode of time sequence analysis.
For example, under the cabin camera coordinate system, the electronic device determines the start point position coordinates and the end point position coordinates of the user's gaze focus set as (10,15,21) and (16,25,31), respectively, and then the electronic device may determine the end point position coordinates (16,25,31) as cleaned target gaze data.
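The third manner, taking the end point of the time-ordered focus set, reduces to sorting by capture time and keeping the last coordinate. The sketch below assumes each sample is a (timestamp, coordinate) pair, which is an assumption of this illustration rather than a format stated in the text.

```python
def endpoint_of_focus_set(samples):
    """samples: list of (timestamp, (x, y, z)) gaze focus captures.
    Order by capture time and return the end-point coordinate, which
    corresponds to the gaze finally resting on the calibration mark."""
    ordered = sorted(samples, key=lambda s: s[0])
    return ordered[-1][1]

# Start point (10, 15, 21), end point (16, 25, 31) as in the example.
track = [(0.0, (10, 15, 21)), (0.1, (12, 18, 25)), (0.2, (16, 25, 31))]
print(endpoint_of_focus_set(track))  # (16, 25, 31)
```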
In the fourth embodiment, since the end position of the line-of-sight focus set corresponds to the line-of-sight position coordinates when the user gazes at the calibration mark, the electronic apparatus can determine the line-of-sight focus set of the user based on the capturing timings corresponding to the plurality of line-of-sight position coordinates. Further, line-of-sight position coordinates having a distance from the end position coordinates of the line-of-sight focus set smaller than a preset threshold value are determined as the cleaned target line-of-sight data.
For example, the gaze focus position coordinates acquired by the electronic device are (12,13,20), (14,15,21), (24,15,32), (25,15,32), and (25,16,30), where the end point position of the sight focus set is (25,16,30). If the preset threshold is 3, the electronic device may determine that the sight position coordinates whose distance from the end point position of the sight focus set is smaller than the preset threshold are (24,15,32) and (25,15,32). Accordingly, the electronic device can use (24,15,32), (25,15,32), and (25,16,30) as the cleaned target sight line data.
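The fourth manner can be sketched as below; the threshold value is an assumed parameter, and `near_endpoint` is a hypothetical helper name for this illustration.

```python
import math

def near_endpoint(coords, threshold=3.0):
    """coords: capture-ordered gaze coordinates. Keep the end point and
    every coordinate whose Euclidean distance to it is below threshold."""
    end = coords[-1]
    return [c for c in coords if math.dist(c, end) < threshold]

coords = [(12, 13, 20), (14, 15, 21), (24, 15, 32), (25, 15, 32), (25, 16, 30)]
print(near_endpoint(coords))  # [(24, 15, 32), (25, 15, 32), (25, 16, 30)]
```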
As an implementation, the electronic device may perform data cleansing in a manner that eliminates outliers. For example, the electronic device may calculate distances between the line of sight position coordinate and other line of sight position coordinates for the acquired plurality of line of sight position coordinates, and if there is a distance greater than a preset distance among the minimum preset number of distances, determine the line of sight position coordinate as an outlier, and delete the line of sight position coordinate.
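The outlier-elimination variant can be sketched as below, reading the rule as: a coordinate is an outlier when even its k nearest neighbours are farther away than a preset distance. The parameters k and max_dist are assumed values for illustration.

```python
import math

def remove_outliers(coords, k=2, max_dist=5.0):
    """Drop any coordinate whose k smallest distances to the other
    coordinates are not all within max_dist (an isolated sample)."""
    kept = []
    for i, p in enumerate(coords):
        dists = sorted(math.dist(p, q) for j, q in enumerate(coords) if j != i)
        if all(d <= max_dist for d in dists[:k]):
            kept.append(p)
    return kept

# The far-away sample is isolated and therefore deleted.
samples = [(24, 15, 32), (25, 15, 32), (23, 15, 32), (60, 60, 60)]
print(remove_outliers(samples))
```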
Therefore, in the embodiment of the application, the electronic equipment can cluster a plurality of sight line position coordinates to determine a cluster center; determining the clustering center as cleaned target sight data; clustering the plurality of sight line position coordinates, and determining the category with the largest number of sight line position coordinates; determining the sight line position coordinates included in the determined category as cleaned target sight line data; determining a sight focus set of the user based on capture time sequences corresponding to the plurality of sight position coordinates; determining the end position coordinates of the sight line focus set as cleaned target sight line data; determining a sight focus set of the user based on capture timing corresponding to the plurality of sight position coordinates; and determining the sight line position coordinates with the distance between the sight line position coordinates and the end point position coordinates of the sight line focus set being smaller than a preset threshold value as cleaned target sight line data. Because the distance between the corresponding sight line position coordinate and most other sight line position coordinates is far under the condition that the user is distracted, and the number of the corresponding sight line position coordinates of the user in the sight line sliding process is far smaller than the number of the corresponding sight line position coordinates when the final sight line stays at the calibration mark, the electronic equipment can conduct data cleaning in a clustering mode. Since the user gazes at the calibration mark will eventually stop at the calibration mark, the end point position coordinates of the line-of-sight focus set are determined as the cleaned target line-of-sight data. 
Since the number of line-of-sight position coordinates concentrated near the calibration mark is large after the line-of-sight landing point of the user is stabilized at the calibration mark, the line-of-sight position coordinates in the category with the largest number of coordinates obtained by clustering are determined as the cleaned target line-of-sight data. Since the end position of the sight line focus set corresponds to the sight line position coordinate when the user gazes at the calibration mark, the sight line position coordinates whose distance from the end position coordinate of the sight line focus set is smaller than the preset threshold are determined as the cleaned target sight line data. Therefore, the cleaned target sight line data can more accurately reflect the position of the user gazing at the calibration mark, and the sight line calibration accuracy can be further improved. In the case where the number of cleaned line-of-sight position coordinates is plural, since plural target line-of-sight position coordinates are available for performing line-of-sight calibration, the accuracy of line-of-sight calibration can be improved.
As an implementation manner of the embodiment of the present application, the step of determining the gaze calibration parameter according to the difference between the target gaze landing point coordinate and the predicted gaze landing point coordinate may include:
and calculating a sight line calibration matrix R according to the following formula based on the target sight line falling point coordinates and the predicted sight line falling point coordinates:
R_{3×3} = ((V′_{3×n} · V′ᵀ_{n×3})⁻¹ · V′_{3×n} · Vᵀ_{n×3})ᵀ
wherein n is the number of target line-of-sight drop point coordinates; V′_{3×n} is the 3×n matrix whose columns are the predicted line-of-sight drop point coordinates; V′ᵀ_{n×3} is its transpose, the n×3 matrix whose rows are the predicted line-of-sight drop point coordinates; and Vᵀ_{n×3} is the transpose of V_{3×n}, the 3×n matrix whose columns are the target line-of-sight drop point coordinates.
Since the target line of sight landing point coordinates and the predicted line of sight landing point coordinates are identified for the same calibration, the number of target line of sight landing point coordinates and predicted line of sight landing point coordinates is the same. Since the target line of sight landing point coordinates and the predicted line of sight landing point coordinates are both in the cabin camera coordinate system, the target line of sight landing point coordinates and the predicted line of sight landing point coordinates both contain three components.
If v_i represents the target line-of-sight drop point coordinates of the user for the i-th calibration mark and v_i′ represents the predicted line-of-sight drop point coordinates for the i-th calibration mark, and the number of target and predicted drop point coordinates is n (i.e. i takes the values 1, 2, …, n), then the n target line-of-sight drop point coordinates and the n predicted line-of-sight drop point coordinates can be expressed in matrix form as [v_1, v_2, …, v_n]_{3×n} and [v_1′, v_2′, …, v_n′]_{3×n}, respectively.
If the gaze calibration parameter is a gaze calibration matrix R, the electronic device may establish the relationship between the matrices corresponding to the target and predicted line-of-sight drop point coordinates as [v_1, v_2, …, v_n]_{3×n} = R_{3×3} [v_1′, v_2′, …, v_n′]_{3×n}. This is an over-determined equation, which the electronic device can solve by the least squares method; that is, the electronic device can solve the gaze calibration matrix R according to the formula R_{3×3} = ((V′_{3×n} · V′ᵀ_{n×3})⁻¹ · V′_{3×n} · Vᵀ_{n×3})ᵀ.
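The least-squares solution above can be verified numerically. The sketch below is illustrative only: it assumes NumPy, a synthetic ground-truth matrix, and noise-free drop points, in which case the recovered R matches the ground truth.

```python
import numpy as np

def solve_calibration_matrix(V, Vp):
    """Least-squares solution of V = R @ Vp for R, following
    R = ((Vp Vp^T)^-1 Vp V^T)^T.
    V  : 3xn matrix of target drop-point coordinates (columns).
    Vp : 3xn matrix of predicted drop-point coordinates (columns)."""
    V = np.asarray(V, dtype=float)
    Vp = np.asarray(Vp, dtype=float)
    return (np.linalg.inv(Vp @ Vp.T) @ Vp @ V.T).T

# Synthetic check with an assumed ground-truth calibration matrix.
rng = np.random.default_rng(1)
R_true = np.array([[1.0, 0.1, 0.0],
                   [0.0, 0.9, 0.2],
                   [0.1, 0.0, 1.1]])
Vp = rng.normal(size=(3, 8))   # 8 predicted drop points (columns)
V = R_true @ Vp                # noise-free measured drop points
R = solve_calibration_matrix(V, Vp)
print(np.allclose(R, R_true))  # True
```

In practice `numpy.linalg.lstsq` would be preferred over an explicit inverse for numerical stability; the explicit form is shown here because it mirrors the closed-form expression in the text.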
It can be seen that, in the embodiment of the present application, the electronic device may calculate the line-of-sight calibration matrix R based on the target line-of-sight drop point coordinates and the predicted line-of-sight drop point coordinates according to the formula R_{3×3} = ((V′_{3×n} · V′ᵀ_{n×3})⁻¹ · V′_{3×n} · Vᵀ_{n×3})ᵀ, where n is the number of target line-of-sight drop point coordinates, V′_{3×n} is the matrix whose columns are the predicted line-of-sight drop point coordinates, V′ᵀ_{n×3} is its transpose, and Vᵀ_{n×3} is the transpose of the matrix whose columns are the target line-of-sight drop point coordinates. Since the target and predicted drop point coordinates correspond to the same calibration marks, their numbers are the same; and since both are expressed in the cabin camera coordinate system, both contain three components. The corresponding matrices may be expressed as [v_1, v_2, …, v_n]_{3×n} and [v_1′, v_2′, …, v_n′]_{3×n}. The electronic device can establish the equation [v_1, v_2, …, v_n]_{3×n} = R_{3×3} [v_1′, v_2′, …, v_n′]_{3×n} and then solve the line-of-sight calibration matrix R by the least squares method. Thus, the solved line-of-sight calibration matrix can be accurate.
As shown in fig. 7, after the step of determining the sight calibration parameter according to the difference between the target sight landing point coordinate and the predicted sight landing point coordinate, the method may further include:
s701, acquiring current sight line data of the user;
after determining the gaze calibration matrix, the electronic device may obtain current gaze data of the user for gaze calibration of the user. The current line of sight data may be current line of sight coordinates of the user in a cabin camera coordinate system.
S702, determining the product of the sight line calibration matrix and the current sight line data as calibrated sight line data.
After acquiring the current line of sight data v′ of the user, the electronic device may multiply the line-of-sight calibration matrix R by the current line of sight data v′ according to the formula v = R·v′, thereby obtaining the calibrated line of sight data v.
As an embodiment, the electronic device may determine, according to the calibrated line-of-sight data of the user, whether the line-of-sight direction of the user deviates from the front of the cabin for a long period of time, and further determine whether the user is in a state of fatigue driving, distraction, or the like. If the user is in the state, the electronic device can control the cabin audio output device to play the audio for reminding the driver, so that traffic accidents are reduced.
As another implementation, the electronic device may determine whether the gaze point location of the user is within the wake-up area based on the user's calibrated gaze data. In the case that the gaze point position of the user is located in a certain wake-up area, the electronic device may wake up a device in the cabin corresponding to the wake-up area. The device may include at least one of a vehicle screen, a sound, an air conditioner, a wiper, and an electric heating device, which is not particularly limited herein.
In one embodiment, the gaze calibration matrix is a 3 x 3 matrix and the current gaze data may be represented as a three-dimensional column vector. The electronic device may multiply the line-of-sight calibration matrix with the current line-of-sight data, the resulting product also being a three-dimensional column vector, i.e. the calibrated line-of-sight data.
In the embodiment of the application, the electronic device can acquire the current sight line data of the user; and determining the product of the sight line calibration matrix and the current sight line data as calibrated sight line data. To calibrate the user's gaze, the electronic device may acquire current gaze data of the user. After acquiring the current line of sight data of the user, the electronic device may multiply the line of sight calibration matrix with the current line of sight data, and use the product as the calibrated line of sight data. In this way, line-of-sight calibration can be achieved, and the accuracy of line-of-sight calibration is high.
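Applying the calibration is a single matrix-vector product. The sketch below assumes NumPy, a 3×3 calibration matrix, and current sight line data represented as a three-component vector; the example matrix values are assumed for illustration.

```python
import numpy as np

def calibrate_gaze(R, v_raw):
    """Apply the 3x3 gaze calibration matrix R to a raw 3-D gaze vector,
    returning the calibrated gaze vector v = R @ v_raw."""
    return np.asarray(R, dtype=float) @ np.asarray(v_raw, dtype=float)

# Example: a calibration that slightly tilts the gaze direction.
R = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.1, 0.0, 1.0]])
print(calibrate_gaze(R, (10.0, 20.0, 30.0)))  # [10. 20. 31.]
```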
As an implementation manner of the embodiment of the present application, before the step of acquiring the target line-of-sight data of the user for calibration identification captured by the cabin camera, the method may further include:
controlling a projection area of the cockpit to display sight line calibration prompt information, wherein the sight line calibration prompt information is used for prompting a user to watch a calibration mark displayed in the projection area, and/or,
and controlling the cockpit audio output device to play a sight line calibration prompt voice, wherein the sight line calibration prompt voice is used for prompting a user to watch the calibration mark displayed in the projection area.
In a first embodiment, in order to guide the user to perform the gaze calibration, i.e. to look at the calibration marker of the projection area, the electronic device may control the projection area of the cockpit to display gaze calibration cues. For example, the gaze calibration prompt may be the text "please look at calibration identification".
In a second embodiment, to guide the user in line of sight calibration, i.e. looking at the calibration identification of the projection area, the electronic device may control the cabin audio output device to play line of sight calibration prompt speech. For example, the content of the gaze calibration alert voice may be "please look at the calibration logo displayed in the front projection area".
In the embodiment of the application, the electronic device may control the projection area of the cabin to display the sight calibration prompt information, where the sight calibration prompt information is used to prompt the user to watch the calibration identifier displayed in the projection area, and/or control the audio output device of the cabin to play the sight calibration prompt voice, where the sight calibration prompt voice is used to prompt the user to watch the calibration identifier displayed in the projection area. In order to guide the user to perform vision calibration and further obtain target vision data of the user, the electronic device may control the projection area of the cabin to display vision calibration prompt information and/or control the audio output device of the cabin to play vision calibration prompt voice. Therefore, the user can be guided, the user looks at the calibration mark of the projection area, and the sight calibration is realized.
As an implementation of the embodiment of the application, the calibration mark is displayed in a projection area of a head-up display HUD in the cabin.
Since the HUD is an existing device in the cabin, no additional hardware is required. The location of the projection area corresponding to the HUD is known, so the location of the calibration marks in the projection area can be accurately determined. Moreover, the HUD can display any number of calibration marks at any position in the projection area, which improves the flexibility of calibration mark display. In addition, the HUD projects the calibration mark clearly, making it easy for the user to watch. Therefore, using the HUD to project the calibration mark enables in-cabin sight line calibration with high accuracy.
As a mode of implementation of the embodiment of the present application, a flowchart of a manner of determining the gaze calibration parameters may be as shown in fig. 8. The method specifically comprises the following steps:
s801, guiding a user to start calibration;
in order for the user to look at the calibration logo displayed in the projection area, the electronic device may guide the user to start calibration. Specifically, the electronic device may guide the user to start the gaze calibration by controlling the projection area to display the gaze calibration prompt information, controlling the cockpit audio output device to play the gaze calibration prompt voice, and displaying a pattern for guiding the user to start the gaze calibration in the projection area. It should be noted that, the manner of guiding the user to start calibration in the embodiment of the present application is not particularly limited.
The calibration identification number i may be used to represent the number of calibration identifications that have currently been displayed. At the start of line-of-sight calibration, the initial calibration identification number i is 0.
S802, controlling the projection area to display an ith calibration mark, and guiding a user to watch the calibration mark displayed in the projection area;
in order for the cockpit camera to capture target line of sight data for the calibration marks by the user, the electronic device may control the projection area to display the ith calibration mark while directing the user to look at the calibration mark displayed in the projection area.
S803, synchronously collecting data;
while the user looks at the calibration mark, the cockpit camera may capture target line of sight data of the user for the calibration mark, and further, the electronic device may collect the target line of sight data captured by the cockpit camera.
S804, data cleaning;
after the electronic device collects the target sight line data of the ith calibration mark of the user, the electronic device can perform data cleaning on the collected target sight line data. In this way, the cleaned target sight line data can be used for more accurately representing the position of the user looking at the calibration mark.
S805, judging whether the calibration identification number i is larger than or equal to a preset number N; if not, returning to step S802, and letting i=i+1; if so, step S806 is performed.
In order to circularly collect N sets of target line-of-sight data corresponding to the N calibration identifiers, the electronic device may determine whether the calibration identifier number i is greater than or equal to a preset number N.
S806, solving and outputting the sight line calibration parameters.
The predicted line-of-sight drop point coordinates are the predicted positions at which the user gazes at a given calibration mark, and the electronic device can determine and output the line-of-sight calibration parameters according to the target line-of-sight drop point coordinates and the predicted line-of-sight drop point coordinates.
It can be seen that, in the embodiment of the present application, the electronic device may guide the user to start calibration; controlling the projection area to display the ith calibration mark, and guiding the user to watch the calibration mark displayed in the projection area; synchronously collecting data; data cleaning; judging whether the calibration identification number i is larger than or equal to a preset number N; if not, displaying the next calibration mark; if so, the gaze calibration parameters are solved and output. Because the electronic device can acquire the sight line data corresponding to the calibration marks of the preset quantity and takes the sight line data corresponding to the calibration marks of the preset quantity as the target sight line data, the sight line calibration can be performed on the user based on multiple groups of target sight line data. In this way, line-of-sight calibration can be achieved with a high degree of accuracy.
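The S801–S806 loop can be sketched as below. All callbacks (marker display, gaze capture, cleaning, solving) are hypothetical placeholders for the cabin-specific implementations; only the control flow mirrors the steps described above.

```python
def run_calibration(display_marker, capture_gaze, clean, solve, n_markers=5):
    """Sketch of the S801-S806 flow: show each calibration marker in turn,
    collect and clean the synchronized gaze samples, then solve the
    calibration parameters once all markers have been shown."""
    targets, predictions = [], []
    for i in range(n_markers):
        marker_xyz = display_marker(i)     # S802: show the i-th marker
        samples = capture_gaze(i)          # S803: synchronized data capture
        targets.append(clean(samples))     # S804: data cleaning
        predictions.append(marker_xyz)     # predicted drop point is known
    # S805 is the loop bound above; S806: solve and output the parameters
    return solve(targets, predictions)

# Stubbed usage showing only the control flow.
result = run_calibration(
    display_marker=lambda i: (i, i, i),
    capture_gaze=lambda i: [(i, i, i)],
    clean=lambda s: s[0],
    solve=lambda t, p: (len(t), len(p)),
    n_markers=3)
print(result)  # (3, 3)
```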
In the technical scheme of the application, the related operations of acquiring, storing, using, processing, transmitting, providing, disclosing and the like of the personal information of the user are all performed under the condition of obtaining the authorization of the user.
Corresponding to the sight line calibration method, the embodiment of the application also provides a sight line calibration device, and the sight line calibration device provided by the embodiment of the application is described below.
As shown in fig. 9, a gaze calibration apparatus, the apparatus comprising:
the sight line data acquisition module 901 is configured to acquire target sight line data of a calibration identifier, captured by a cockpit camera, of a user, where the calibration identifier is displayed in a projection area of the cockpit, the calibration identifier is used to instruct the user to watch a position where the calibration identifier is located, and the projection area is an area located in front of the cockpit and used to display projection information;
a predicted line-of-sight coordinate determining module 902, configured to determine, as a predicted line-of-sight landing coordinate, a coordinate of the calibration marker in the cabin camera coordinate system based on a coordinate of the projection region in the cabin camera coordinate system calibrated in advance and a coordinate of the calibration marker in the projection region;
a target line of sight coordinate determination module 903, configured to determine, based on the target line of sight data, a target line of sight landing point coordinate of the user for the calibration identifier in the cabin camera coordinate system;
the sight line calibration parameter determination module 904 is configured to determine a sight line calibration parameter according to the target sight line drop point coordinate and the predicted sight line drop point coordinate.
In the scheme provided by the embodiment of the application, the electronic device can acquire target sight data of the user for the calibration mark, captured by the cabin camera, wherein the calibration mark is displayed in the projection area of the cabin, the calibration mark is used for indicating the position at which the user is to gaze, and the projection area is an area located in front of the cabin for displaying projection information. The electronic device determines the coordinates of the calibration mark in the cabin camera coordinate system as the predicted sight drop point coordinates, based on the pre-calibrated coordinates of the projection area in the cabin camera coordinate system and the coordinates of the calibration mark within the projection area; determines, based on the target sight data, the target sight drop point coordinates of the user for the calibration mark in the cabin camera coordinate system; and determines the sight calibration parameter according to the target sight drop point coordinates and the predicted sight drop point coordinates. Because the electronic device controls the projection area to display the calibration mark, and the coordinates of the calibration mark within the projection area are known, the electronic device can accurately determine the coordinates of the calibration mark in the cabin camera coordinate system, namely the predicted sight drop point coordinates, from the pre-calibrated coordinates of the projection area in the cabin camera coordinate system and the coordinates of the calibration mark within the projection area.
Therefore, the electronic device can accurately determine the calibration parameter from the difference between the predicted sight drop point coordinates and the target sight drop point coordinates, so that sight calibration is realized with high accuracy.
As an implementation manner of the embodiment of the present application, the foregoing apparatus may further include:
the first target image acquisition module is used for acquiring a first target image of the projection area, captured by the cabin camera after reflection by the plane mirror;
the first projection area coordinate determining module is used for determining coordinates of the reflected projection area under a world coordinate system according to the parameters of the cabin camera calibrated in advance and pixel coordinates of the projection area in the first target image;
the second projection area coordinate determining module is used for determining the coordinates of the projection area under the world coordinate system according to the distance between the projection area and the plane mirror, the coordinates of the reflected projection area under the world coordinate system and the plane reflection imaging principle;
and the third projection area coordinate determining module is used for converting the coordinate of the projection area under the world coordinate system into the coordinate of the projection area under the cabin camera coordinate system according to the conversion relation between the pre-calibrated world coordinate system and the cabin camera coordinate system.
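The conversion performed by the third projection area coordinate determining module is a standard rigid transform from the world coordinate system into the cabin camera coordinate system. A minimal sketch under hypothetical values (the rotation, translation, and corner point below are illustrative, not the calibration described here):

```python
import numpy as np

def world_to_camera(p_world, R_wc, t_wc):
    """Apply a pre-calibrated rigid transform (rotation R_wc, translation t_wc)
    taking a world-coordinate point into the cabin camera coordinate system."""
    return R_wc @ p_world + t_wc

# Hypothetical conversion relation: a 90-degree rotation about z plus an offset.
R_wc = np.array([[0.0, -1.0, 0.0],
                 [1.0,  0.0, 0.0],
                 [0.0,  0.0, 1.0]])
t_wc = np.array([0.1, 0.0, 0.5])
corner_world = np.array([1.0, 0.0, 0.0])   # a projection-area corner, in metres
corner_camera = world_to_camera(corner_world, R_wc, t_wc)
print(np.allclose(corner_camera, [0.1, 1.0, 0.5]))  # -> True
```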
As an implementation manner of the embodiment of the present application, the foregoing apparatus may further include:
The second target image acquisition module is used for acquiring a second target image of the projection area on which a first target is displayed, captured by the cabin camera after reflection by the plane mirror;
the fourth projection area coordinate determining module is used for determining the coordinate of the reflected projection area under the coordinate system of the cabin camera according to the parameters of the cabin camera calibrated in advance and the pixel coordinate of the first target in the second target image;
the third target image acquisition module is used for acquiring a third target image of the plane mirror, the surface of which is attached with a second target, acquired by the cabin camera;
the plane mirror coordinate determining module is used for determining coordinates of the plane mirror under a cabin camera coordinate system according to the parameters of the cabin camera calibrated in advance and the pixel coordinates of the second target in the third target image;
and the fifth projection area coordinate determining module is used for determining the coordinates of the projection area under the coordinate system of the cabin camera according to the coordinates of the plane mirror under the coordinate system of the cabin camera, the coordinates of the reflected projection area under the coordinate system of the cabin camera and the plane reflection imaging principle.
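The plane reflection imaging principle invoked by the modules above amounts to mirroring a point across the mirror plane: the real projection area is the mirror image of the virtual area the camera observes. A minimal sketch, assuming the mirror plane is given by a point on it and its unit normal (the example plane z = 0 and the corner value are hypothetical):

```python
import numpy as np

def reflect_across_plane(p, p0, n):
    """Mirror image of point p across the plane through p0 with normal n.
    By plane-mirror imaging, applying this to the virtual (reflected)
    projection-area point recovers the real one, and vice versa."""
    n = n / np.linalg.norm(n)
    return p - 2.0 * np.dot(p - p0, n) * n

# Hypothetical mirror: the plane z = 0; the virtual corner sits behind it.
virtual_corner = np.array([0.2, 0.1, -0.5])
real_corner = reflect_across_plane(virtual_corner, np.zeros(3),
                                   np.array([0.0, 0.0, 1.0]))
print(real_corner)
```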
As an implementation manner of the embodiment of the present application, the target line of sight coordinate determining module 903 may include:
The first calibration mark display sub-module is used for controlling the projection area of the cabin to display calibration marks and recording the number of the displayed calibration marks;
the sight line data acquisition sub-module is used for acquiring sight line data of the displayed calibration mark, which is captured by the cabin camera, of a user;
the quantity judging sub-module is used for determining whether the quantity of the recorded and displayed calibration marks reaches a preset quantity;
a second calibration mark display sub-module, configured to, if the number of recorded displayed calibration marks has not reached the preset number, return to the step of controlling the projection area of the cabin to display a calibration mark and recording the number of displayed calibration marks;
and the target sight line data determination sub-module is used for determining the acquired sight line data as the target sight line data of the user for the calibration marks if the number of recorded displayed calibration marks reaches the preset number.
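The display-and-collect loop formed by the sub-modules above can be sketched as follows; `display_mark` and `capture_gaze` are hypothetical placeholders for the HUD controller and the cabin-camera gaze estimator:

```python
def collect_calibration_data(marks, display_mark, capture_gaze, preset_number):
    """Display calibration marks one at a time, record how many have been
    shown, capture gaze data for each, and stop once the preset number is
    reached; returns None if too few marks are available."""
    gaze_data = []
    displayed = 0
    while displayed < preset_number and displayed < len(marks):
        display_mark(marks[displayed])    # control the projection area
        displayed += 1                    # record the number displayed
        gaze_data.append(capture_gaze())  # gaze data for this mark
    return gaze_data if displayed >= preset_number else None

# Toy stand-ins: three marks, gaze samples numbered 0, 1, 2.
samples = iter(range(10))
result = collect_calibration_data([(0, 0), (1, 0), (0, 1)],
                                  lambda m: None, samples.__next__, 3)
print(result)  # -> [0, 1, 2]
```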
As an implementation manner of the embodiment of the present application, the foregoing apparatus may further include:
the data cleaning module is used for cleaning the data of the target sight line data to obtain cleaned target sight line data;
the target line-of-sight coordinate determination module 903 may include:
And the target sight drop point coordinate sub-module is used for determining the target sight drop point coordinate of the user for the calibration mark under the cabin camera coordinate system based on the cleaned target sight data.
As an implementation manner of the embodiment of the present application, the line-of-sight data may include a plurality of line-of-sight position coordinates of the user captured by the cabin camera acquired within a preset time period after the calibration identifier is displayed;
the data cleaning module may include:
the first cleaning submodule is used for clustering the plurality of sight line position coordinates and determining a clustering center; determining the cluster center as cleaned target sight data;
the second cleaning submodule is used for clustering the plurality of sight line position coordinates and determining the category with the largest number of sight line position coordinates; determining the sight line position coordinates included in the determined category as cleaned target sight line data;
a third cleaning sub-module, configured to determine a sight focus set of the user based on the capture times corresponding to the plurality of sight position coordinates, and determine the end position coordinates of the sight focus set as the cleaned target sight data;
a fourth cleaning sub-module, configured to determine a sight focus set of the user based on the capture times corresponding to the plurality of sight position coordinates, and determine the sight position coordinates whose distance from the end position coordinates of the sight focus set is smaller than a preset threshold as the cleaned target sight data.
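The second cleaning mode above (keep the points of the most populated cluster) might be sketched with a simple greedy one-pass clustering; the clustering scheme, radius parameter, and sample values are hypothetical, not the patent's specified algorithm:

```python
import numpy as np

def clean_by_largest_cluster(points, radius):
    """Greedy one-pass clustering: each point joins the first cluster whose
    centre lies within `radius`, otherwise it starts a new cluster; the
    points of the most populated cluster are kept as the cleaned gaze data."""
    clusters = []  # lists of point indices
    centres = []   # running cluster centres
    for i, p in enumerate(points):
        for k, c in enumerate(centres):
            if np.linalg.norm(p - c) <= radius:
                clusters[k].append(i)
                centres[k] = points[clusters[k]].mean(axis=0)
                break
        else:
            clusters.append([i])
            centres.append(p.copy())
    best = max(clusters, key=len)
    return points[best]

# Hypothetical gaze samples: three fixations near the mark plus one outlier.
samples = np.array([[0.0, 0.0], [0.01, 0.0], [0.0, 0.02], [1.0, 1.0]])
cleaned = clean_by_largest_cluster(samples, radius=0.1)
print(cleaned.shape)  # -> (3, 2)
```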
As an implementation manner of the embodiment of the present application, the above-mentioned gaze calibration parameter determining module 904 may include:
the calibration matrix calculation sub-module is used for calculating a sight calibration matrix R according to the following formula based on the target sight falling point coordinates and the predicted sight falling point coordinates:
R_{3×3} = ((V′_{3×n} V′_{n×3})^{-1} V′_{3×n} V_{n×3})^T
wherein n is the number of target sight drop point coordinates, V′_{3×n} is the matrix whose columns are the components of the predicted sight drop point coordinates, V′_{n×3} is the matrix whose rows are the components of the predicted sight drop point coordinates (so that V′_{n×3} = V′_{3×n}^T), and V_{n×3} is the matrix whose rows are the components of the target sight drop point coordinates.
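Reading the formula above as the transposed least-squares solution of V′X = V (one consistent interpretation of the published notation), it can be sketched and sanity-checked as follows; the test rotation and random points are hypothetical:

```python
import numpy as np

def gaze_calibration_matrix(v_pred, v_target):
    """Fit R per the formula above: with V' the n x 3 matrix whose rows are
    predicted drop points and V the n x 3 matrix whose rows are target drop
    points, R = ((V'^T V')^{-1} V'^T V)^T."""
    v_pred = np.asarray(v_pred, dtype=float)
    v_target = np.asarray(v_target, dtype=float)
    X = np.linalg.inv(v_pred.T @ v_pred) @ v_pred.T @ v_target
    return X.T

# Sanity check: if one point set is an exact rotation of the other,
# the fit recovers that rotation.
theta = np.deg2rad(5.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
v_pred = np.random.default_rng(0).normal(size=(8, 3))
v_target = v_pred @ R_true.T
R = gaze_calibration_matrix(v_pred, v_target)
print(np.allclose(R, R_true))  # -> True
```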
As an implementation manner of the embodiment of the present application, the foregoing apparatus may further include:
the current sight line data acquisition module is used for acquiring current sight line data of the user;
and the sight line data calibration module is used for determining the product of the sight line calibration matrix and the current sight line data as calibrated sight line data.
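Applying the calibration is then a single matrix-vector product, as the module above describes; the calibration matrix and gaze vector here are hypothetical values for illustration:

```python
import numpy as np

# Hypothetical sight calibration matrix R and current gaze direction.
R = np.array([[ 1.0,  0.02, 0.0],
              [-0.02, 1.0,  0.0],
              [ 0.0,  0.0,  1.0]])
current_gaze = np.array([0.1, -0.05, 1.0])

# Calibrated sight data is the product of R and the current sight data.
calibrated_gaze = R @ current_gaze
print(np.allclose(calibrated_gaze, [0.099, -0.052, 1.0]))  # -> True
```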
As an implementation manner of the embodiment of the present application, the foregoing apparatus may further include:
the sight calibration prompt information display module is used for controlling a projection area of the cabin to display sight calibration prompt information, wherein the sight calibration prompt information is used for prompting a user to watch a calibration mark displayed in the projection area;
and the sight calibration prompt voice playing module is used for controlling the cabin audio output equipment to play a sight calibration prompt voice, wherein the sight calibration prompt voice is used for prompting a user to watch the calibration mark displayed in the projection area.
As an implementation manner of the embodiment of the present application, the calibration mark is displayed in a projection area of a head-up display (HUD) in the cabin.
The embodiment of the application also provides an electronic device, as shown in fig. 10, including:
a memory 1001 for storing a computer program;
the processor 1002 is configured to implement the sight line calibration method steps described in any of the above embodiments when executing the program stored in the memory 1001.
The electronic device may further comprise a communication bus and/or a communication interface, and the processor 1002, the communication interface, and the memory 1001 communicate with one another via the communication bus.
In the scheme provided by the embodiment of the application, the electronic device can acquire target sight data of the user for the calibration mark, captured by the cabin camera, wherein the calibration mark is displayed in the projection area of the cabin, the calibration mark is used for indicating the position at which the user is to gaze, and the projection area is an area located in front of the cabin for displaying projection information. The electronic device determines the coordinates of the calibration mark in the cabin camera coordinate system as the predicted sight drop point coordinates, based on the pre-calibrated coordinates of the projection area in the cabin camera coordinate system and the coordinates of the calibration mark within the projection area; determines, based on the target sight data, the target sight drop point coordinates of the user for the calibration mark in the cabin camera coordinate system; and determines the sight calibration parameter according to the target sight drop point coordinates and the predicted sight drop point coordinates. Because the electronic device controls the projection area to display the calibration mark, and the coordinates of the calibration mark within the projection area are known, the electronic device can accurately determine the coordinates of the calibration mark in the cabin camera coordinate system, namely the predicted sight drop point coordinates.
Therefore, the electronic device can accurately determine the calibration parameter from the difference between the predicted sight drop point coordinates and the target sight drop point coordinates, so that sight calibration is realized with high accuracy.
The communication bus of the above electronic device may be a Peripheral Component Interconnect (PCI) bus or an Extended Industry Standard Architecture (EISA) bus, etc. The communication bus may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in the figure, but this does not mean that there is only one bus or only one type of bus.
The communication interface is used for communication between the electronic device and other devices.
The memory may include Random Access Memory (RAM) or Non-Volatile Memory (NVM), such as at least one disk storage. Optionally, the memory may also be at least one storage device located remotely from the aforementioned processor.
The processor may be a general-purpose processor, including a central processing unit (Central Processing Unit, CPU), a network processor (Network Processor, NP), etc.; it may also be a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
In yet another embodiment of the present application, a computer readable storage medium is also provided, in which a computer program is stored, which when executed by a processor, implements the steps of any of the line-of-sight calibration methods described above.
In yet another embodiment of the present application, there is also provided a computer program product containing instructions that, when run on a computer, cause the computer to perform any of the line-of-sight calibration methods of the above embodiments.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the flows or functions described in the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, by wired (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium may be any available medium accessible by a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a Solid State Disk (SSD), etc.
It is noted that relational terms such as first and second are used herein solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between those entities or actions. Moreover, the terms "comprises," "comprising," and any variations thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus comprising a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
In this specification, the embodiments are described in a related manner, and for identical or similar parts among the embodiments, reference may be made to one another; each embodiment focuses on its differences from the other embodiments. In particular, the apparatus, electronic device, computer-readable storage medium, and computer program product embodiments are described relatively briefly, since they are substantially similar to the method embodiments; for the relevant parts, refer to the description of the method embodiments.
The foregoing description is only of the preferred embodiments of the present application and is not intended to limit the scope of the present application. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application are included in the protection scope of the present application.

Claims (14)

1. A line-of-sight calibration method, the method comprising:
acquiring target sight data of a user for a calibration mark, captured by a cabin camera, wherein the calibration mark is displayed in a projection area of the cabin, the calibration mark is used for indicating the position at which the user is to gaze, and the projection area is an area located in front of the cabin for displaying projection information;
determining the coordinate of the calibration mark under the cabin camera coordinate system as a predicted line-of-sight landing point coordinate based on the coordinate of the pre-calibrated projection area under the cabin camera coordinate system and the coordinate of the calibration mark in the projection area;
determining target line-of-sight landing coordinates of the user for the calibration mark in the cabin camera coordinate system based on the target line-of-sight data;
and determining a sight calibration parameter according to the target sight falling point coordinates and the predicted sight falling point coordinates.
2. The method of claim 1, wherein prior to the step of acquiring the target sight data of the user for the calibration mark captured by the cabin camera, the method further comprises:
acquiring a first target image of the projection area acquired by the cabin camera after being reflected by a plane mirror;
determining coordinates of the reflected projection area under a world coordinate system according to the parameters of the cabin camera calibrated in advance and pixel coordinates of the projection area in the first target image;
determining the coordinates of the projection area under the world coordinate system according to the distance between the projection area and the plane mirror, the coordinates of the reflected projection area under the world coordinate system and a plane reflection imaging principle;
and converting the coordinates of the projection area under the world coordinate system into the coordinates of the projection area under the cabin camera coordinate system according to the conversion relation between the pre-calibrated world coordinate system and the cabin camera coordinate system.
3. The method of claim 1, wherein prior to the step of acquiring the target sight data of the user for the calibration mark captured by the cabin camera, the method further comprises:
acquiring a second target image of the projection area on which a first target is displayed, captured by the cabin camera after reflection by a plane mirror;
determining coordinates of the reflected projection area under a cabin camera coordinate system according to the parameters of the cabin camera calibrated in advance and the pixel coordinates of the first target in the second target image;
acquiring a third target image of the plane mirror, which is acquired by the cabin camera and has a second target attached to the surface;
determining coordinates of the plane mirror under a cabin camera coordinate system according to the parameters of the cabin camera calibrated in advance and the pixel coordinates of the second target in the third target image;
and determining the coordinates of the projection area under the coordinate system of the cabin camera according to the coordinates of the plane mirror under the coordinate system of the cabin camera, the coordinates of the reflected projection area under the coordinate system of the cabin camera and the plane reflection imaging principle.
4. The method of claim 1, wherein the step of acquiring the target sight data of the user for the calibration mark captured by the cabin camera comprises:
controlling the projection area of the cabin to display calibration marks, and recording the number of the displayed calibration marks;
Acquiring line-of-sight data of a user captured by the cockpit camera for the displayed calibration mark;
determining whether the number of recorded displayed calibration marks reaches a preset number;
if the number of recorded displayed calibration marks does not reach the preset number, returning to the step of controlling the projection area of the cabin to display a calibration mark and recording the number of displayed calibration marks;
and if the number of the recorded and displayed calibration marks reaches the preset number, determining the acquired sight line data as target sight line data of the user for the calibration marks.
5. The method of claim 4, wherein prior to the step of determining target line-of-sight landing coordinates for the user for the calibration identification in the cabin camera coordinate system based on the target line-of-sight data, the method further comprises:
performing data cleaning on the target sight line data to obtain cleaned target sight line data;
the step of determining the target line of sight landing point coordinates of the user for the calibration mark under the cabin camera coordinate system based on the target line of sight data comprises the following steps:
and determining the target sight drop point coordinates of the user for the calibration mark under the cabin camera coordinate system based on the cleaned target sight data.
6. The method of claim 5, wherein the gaze data comprises a plurality of gaze location coordinates of a user captured by a cabin camera acquired within a preset time period after displaying the calibration identification;
the step of performing data cleaning on the target sight line data to obtain cleaned target sight line data comprises the step of performing data cleaning by adopting one of the following modes to obtain cleaned target sight line data:
clustering the plurality of sight line position coordinates to determine a clustering center; determining the cluster center as cleaned target sight data;
clustering the plurality of sight line position coordinates, and determining the category with the largest number of sight line position coordinates; determining the sight line position coordinates included in the determined category as cleaned target sight line data;
determining a sight focus set of the user based on capture timing corresponding to the plurality of sight position coordinates; determining the end position coordinates of the sight line focus set as cleaned target sight line data;
determining a sight focus set of the user based on capture timing corresponding to the plurality of sight position coordinates; and determining the sight line position coordinates with the distance between the sight line position coordinates and the end point position coordinates of the sight line focus set being smaller than a preset threshold value as cleaned target sight line data.
7. The method of any one of claims 1-6, wherein the step of determining a sight calibration parameter according to the target sight drop point coordinates and the predicted sight drop point coordinates comprises:
and calculating a sight line calibration matrix R according to the following formula based on the target sight line falling point coordinates and the predicted sight line falling point coordinates:
R_{3×3} = ((V′_{3×n} V′_{n×3})^{-1} V′_{3×n} V_{n×3})^T
wherein n is the number of target sight drop point coordinates, V′_{3×n} is a matrix whose columns are the components of the predicted sight drop point coordinates, V′_{n×3} is a matrix whose rows are the components of the predicted sight drop point coordinates (so that V′_{n×3} = V′_{3×n}^T), and V_{n×3} is a matrix whose rows are the components of the target sight drop point coordinates.
8. The method of claim 7, wherein after the step of determining a sight calibration parameter according to the target sight drop point coordinates and the predicted sight drop point coordinates, the method further comprises:
acquiring current sight line data of the user;
and determining the product of the sight line calibration matrix and the current sight line data as calibrated sight line data.
9. The method of any of claims 1-6, wherein prior to the step of acquiring the target sight data of the user for the calibration mark captured by the cabin camera, the method further comprises:
Controlling a projection area of the cockpit to display sight line calibration prompt information, wherein the sight line calibration prompt information is used for prompting a user to watch a calibration mark displayed in the projection area, and/or,
and controlling the cockpit audio output device to play a sight line calibration prompt voice, wherein the sight line calibration prompt voice is used for prompting a user to watch the calibration mark displayed in the projection area.
10. The method according to any of claims 1-6, wherein the calibration mark is displayed in a projection area of a head-up display (HUD) within the cabin.
11. A gaze calibration apparatus, the apparatus comprising:
the system comprises a sight line data acquisition module, a calibration identification module and a display module, wherein the sight line data acquisition module is used for acquiring target sight line data of a user for the calibration identification, which is captured by a cockpit camera, wherein the calibration identification is displayed in a projection area of the cockpit, the calibration identification is used for indicating the position where the user looks at the calibration identification, and the projection area is an area which is positioned in front of the cockpit and used for displaying projection information;
the prediction line of sight coordinate determining module is used for determining the coordinate of the calibration mark under the cabin camera coordinate system as the prediction line of sight falling point coordinate based on the coordinate of the projection area under the cabin camera coordinate system calibrated in advance and the coordinate of the calibration mark in the projection area;
A target sight line coordinate determining module, configured to determine, based on the target sight line data, a target sight line drop point coordinate of the user for the calibration identifier in the cabin camera coordinate system;
and the sight calibration parameter determining module is used for determining the sight calibration parameter according to the target sight falling point coordinates and the predicted sight falling point coordinates.
12. The apparatus of claim 11, wherein the apparatus further comprises:
the first target image acquisition module is used for acquiring a first target image, which is acquired by the cabin camera and is reflected by the plane mirror, of the projection area;
the first projection area coordinate determining module is used for determining coordinates of the reflected projection area under a world coordinate system according to the parameters of the cabin camera calibrated in advance and pixel coordinates of the projection area in the first target image;
the second projection area coordinate determining module is used for determining the coordinates of the projection area under the world coordinate system according to the distance between the projection area and the plane mirror, the coordinates of the reflected projection area under the world coordinate system and the plane reflection imaging principle;
the third projection area coordinate determining module is used for converting the coordinate of the projection area under the world coordinate system into the coordinate of the projection area under the cabin camera coordinate system according to the conversion relation between the world coordinate system and the cabin camera coordinate system which are calibrated in advance;
The apparatus further comprises:
the second target image acquisition module is used for acquiring a second target image, acquired by the cabin camera, of the projection area, wherein the projection area is displayed with the first target, and the second target image is reflected by the plane mirror;
the fourth projection area coordinate determining module is used for determining the coordinate of the reflected projection area under the coordinate system of the cabin camera according to the parameters of the cabin camera calibrated in advance and the pixel coordinate of the first target in the second target image;
the third target image acquisition module is used for acquiring a third target image of the plane mirror, the surface of which is attached with a second target, acquired by the cabin camera;
the plane mirror coordinate determining module is used for determining coordinates of the plane mirror under a cabin camera coordinate system according to the parameters of the cabin camera calibrated in advance and the pixel coordinates of the second target in the third target image;
the fifth projection area coordinate determining module is used for determining the coordinate of the projection area under the coordinate system of the cabin camera according to the coordinate of the plane mirror under the coordinate system of the cabin camera, the coordinate of the reflected projection area under the coordinate system of the cabin camera and the plane reflection imaging principle;
The target sight line coordinate determination module includes:
the first calibration mark display sub-module is used for controlling the projection area of the cabin to display calibration marks and recording the number of the displayed calibration marks;
the sight line data acquisition sub-module is used for acquiring sight line data of the displayed calibration mark, which is captured by the cabin camera, of a user;
the quantity judging sub-module is used for determining whether the quantity of the recorded and displayed calibration marks reaches a preset quantity;
a second calibration mark display sub-module, configured to, if the number of recorded displayed calibration marks has not reached the preset number, return to the step of controlling the projection area of the cabin to display a calibration mark and recording the number of displayed calibration marks;
a target sight line data determining sub-module, configured to determine the acquired sight line data as target sight line data of the calibration marks for the user if the number of the recorded and displayed calibration marks reaches the preset number;
the apparatus further comprises:
the data cleaning module is used for cleaning the data of the target sight line data to obtain cleaned target sight line data;
the target sight line coordinate determination module includes:
A target line of sight falling point coordinate sub-module, configured to determine, based on the cleaned target line of sight data, a target line of sight falling point coordinate of the user with respect to the calibration identifier in the cabin camera coordinate system;
the sight line data comprise a plurality of sight line position coordinates of a user captured by a cabin camera, wherein the sight line position coordinates are acquired within a preset time period after a calibration mark is displayed;
the data cleaning module comprises:
a first cleaning sub-module, configured to cluster the plurality of sight line position coordinates, determine a cluster center, and determine the cluster center as the cleaned target sight line data;
a second cleaning sub-module, configured to cluster the plurality of sight line position coordinates, determine the category containing the largest number of sight line position coordinates, and determine the sight line position coordinates in that category as the cleaned target sight line data;
a third cleaning sub-module, configured to determine the user's sight line focus set based on the capture times corresponding to the plurality of sight line position coordinates, and determine the end position coordinates of the sight line focus set as the cleaned target sight line data;
a fourth cleaning sub-module, configured to determine the user's sight line focus set based on the capture times corresponding to the plurality of sight line position coordinates, and determine the sight line position coordinates whose distance from the end position coordinates of the sight line focus set is smaller than a preset threshold as the cleaned target sight line data;
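The second cleaning strategy above (keep the cluster with the most sight line position coordinates) can be sketched with a simple proximity-based grouping. The patent does not specify a clustering algorithm, so the greedy region-growing used here, and the `eps` threshold, are illustrative assumptions:

```python
import numpy as np

def clean_by_largest_cluster(points, eps=0.05):
    """Keep only the sight line position coordinates belonging to the
    largest proximity group (a stand-in for the clustering step).

    points: (N, 3) gaze landing coordinates in the cabin camera frame.
    eps:    distance threshold deciding cluster membership.
    """
    points = np.asarray(points, dtype=float)
    unassigned = set(range(len(points)))
    clusters = []
    while unassigned:
        seed = unassigned.pop()
        members = [seed]
        # greedy growth: absorb every point within eps of any member
        changed = True
        while changed:
            changed = False
            for i in list(unassigned):
                dists = np.linalg.norm(points[members] - points[i], axis=1)
                if dists.min() < eps:
                    members.append(i)
                    unassigned.discard(i)
                    changed = True
        clusters.append(members)
    largest = max(clusters, key=len)
    return points[sorted(largest)]
```

The first strategy's cluster center would then be `cleaned.mean(axis=0)` over the returned points; the third and fourth strategies differ in selecting by capture time rather than by cluster size.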
The sight line calibration parameter determination module comprises:
a calibration matrix calculation sub-module, configured to calculate, based on the target sight line landing point coordinates and the predicted sight line landing point coordinates, the sight line calibration matrix R according to the following formula:
R_{3×3} = ((V′_{3×n} V′_{n×3})⁻¹ V′_{3×n} V_{n×3})ᵀ
where n is the number of target sight line landing point coordinates; V′_{3×n} is the matrix whose column vectors are the predicted sight line landing point coordinates; V′_{n×3} is the matrix whose row vectors are the predicted sight line landing point coordinates (the transpose of V′_{3×n}); and V_{n×3} is the matrix whose row vectors are the target sight line landing point coordinates;
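This is the closed-form least-squares solution of V = R V′. A minimal NumPy sketch, using synthetic predicted points and a known rotation as the ground truth (both illustrative, not from the patent), shows that the formula recovers R:

```python
import numpy as np

# n predicted landing points as columns of Vp_3xn, and matching targets,
# per the formula R = ((V'_{3xn} V'_{nx3})^{-1} V'_{3xn} V_{nx3})^T
rng = np.random.default_rng(0)
Vp_3xn = rng.normal(size=(3, 8))            # predicted points, one per column
R_true = np.array([[0., -1., 0.],
                   [1.,  0., 0.],
                   [0.,  0., 1.]])          # a known rotation for the demo
V_3xn = R_true @ Vp_3xn                     # targets satisfy V = R V'

Vp_nx3 = Vp_3xn.T                           # row-vector form V'_{nx3}
V_nx3 = V_3xn.T                             # row-vector form V_{nx3}

# sight line calibration matrix per the patent's formula
R = (np.linalg.inv(Vp_3xn @ Vp_nx3) @ Vp_3xn @ V_nx3).T
```

Since V_{n×3} = V′_{n×3} Rᵀ for noise-free data, the expression reduces to ((V′V′ᵀ)⁻¹ V′V′ᵀ Rᵀ)ᵀ = R, so the solved matrix matches the ground truth exactly up to floating-point error.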
the apparatus further comprises:
a current sight line data acquisition module, configured to acquire current sight line data of the user;
a sight line data calibration module, configured to determine the product of the sight line calibration matrix and the current sight line data as the calibrated sight line data;
the apparatus further comprises:
a sight line calibration prompt information display module, configured to control the projection area of the cabin to display sight line calibration prompt information, wherein the prompt information prompts the user to gaze at the calibration mark displayed in the projection area;
a sight line calibration prompt voice playing module, configured to control the cabin audio output device to play a sight line calibration prompt voice, wherein the prompt voice prompts the user to gaze at the calibration mark displayed in the projection area;
The calibration marks are displayed in a projection area of a head-up display (HUD) in the cabin.
13. An electronic device, comprising:
a memory for storing a computer program;
a processor, configured to implement the method of any one of claims 1-10 when executing the program stored in the memory.
14. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a computer program which, when executed by a processor, implements the method of any of claims 1-10.
CN202310703858.3A 2023-06-14 2023-06-14 Sight line calibration method and device, electronic equipment and storage medium Pending CN116797652A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310703858.3A CN116797652A (en) 2023-06-14 2023-06-14 Sight line calibration method and device, electronic equipment and storage medium


Publications (1)

Publication Number Publication Date
CN116797652A true CN116797652A (en) 2023-09-22

Family

ID=88034106

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310703858.3A Pending CN116797652A (en) 2023-06-14 2023-06-14 Sight line calibration method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116797652A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117152411A (en) * 2023-11-01 2023-12-01 安徽蔚来智驾科技有限公司 Sight line calibration method, control device and storage medium


Similar Documents

Publication Publication Date Title
EP2927634B1 (en) Single-camera ranging method and system
CN107533761B (en) Image processing apparatus and image processing system
US8947533B2 (en) Parameter determining device, parameter determining system, parameter determining method, and recording medium
US11010646B2 (en) Object tracking assisted with hand or eye tracking
US10168787B2 (en) Method for the target recognition of target objects
EP2683154A2 (en) Image pickup apparatus and lens apparatus
US20160371843A1 (en) Tracking objects in bowl-shaped imaging systems
KR20210104107A (en) Gaze area detection method, apparatus and electronic device
US20170344110A1 (en) Line-of-sight detector and line-of-sight detection method
CN106778641B (en) Sight estimation method and device
WO2022088103A1 (en) Image calibration method and apparatus
CN109993713B (en) Image distortion correction method and device for vehicle-mounted head-up display system
CN110006634B (en) Viewing field angle measuring method, viewing field angle measuring device, display method and display equipment
CN116797652A (en) Sight line calibration method and device, electronic equipment and storage medium
US11140312B2 (en) Long-range optical device with image capturing channel
US10061995B2 (en) Imaging system to detect a trigger and select an imaging area
EP3496041A1 (en) Method and apparatus for estimating parameter of virtual screen
WO2019135281A1 (en) Line-of-sight direction calibration device, line-of-sight direction calibration method, and line-of-sight direction calibration program
CN102761684A (en) An image capturing device and an image data recording method
CN112130325A (en) Parallax correction system and method for vehicle-mounted head-up display, storage medium and electronic device
CN114415826A (en) Data processing method and equipment thereof
US11587247B1 (en) Synchronous event driven readout of pixels in a detector for direct time-of-flight depth sensing
JP2005261728A (en) Line-of-sight direction recognition apparatus and line-of-sight direction recognition program
CN115018942A (en) Method and apparatus for image display of vehicle
US11321874B2 (en) Calibration of mobile electronic devices connected to headsets wearable by users

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination