CN110458104B - Human eye sight direction determining method and system of human eye sight detection system - Google Patents


Info

Publication number
CN110458104B
Authority
CN
China
Prior art keywords
coordinate system
coordinates
human eye
dimensional world
world coordinate
Prior art date
Legal status
Active
Application number
CN201910741288.0A
Other languages
Chinese (zh)
Other versions
CN110458104A (en)
Inventor
李连吉
黄晗
郭彦东
王哲灿
Current Assignee
Guangzhou Xiaopeng Motors Technology Co Ltd
Original Assignee
Guangzhou Xiaopeng Motors Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Xiaopeng Motors Technology Co Ltd filed Critical Guangzhou Xiaopeng Motors Technology Co Ltd
Priority to CN201910741288.0A priority Critical patent/CN110458104B/en
Publication of CN110458104A publication Critical patent/CN110458104A/en
Application granted granted Critical
Publication of CN110458104B publication Critical patent/CN110458104B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/59Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/597Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/193Preprocessing; Feature extraction

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)

Abstract

The embodiment of the invention provides a method and a system for determining the human eye sight direction of a human eye sight detection system, wherein the human eye sight detection system comprises: a plurality of cameras, a steering engine pan-tilt, and a laser ranging device arranged on the steering engine pan-tilt. The method comprises: determining the coordinates of a light spot emitted by the laser ranging device in a three-dimensional world coordinate system with the rotation center of the steering engine pan-tilt as the coordinate origin; acquiring a plurality of user images collected by the plurality of cameras to determine the coordinates of the human eye in the three-dimensional world coordinate system; and determining the direction vector of the human eye sight line in the three-dimensional world coordinate system based on the spot coordinates and the human eye coordinates. By providing the steering engine pan-tilt, human eye sight image data with continuous direction angles can be collected, and the system is applicable to various scenes, for example a narrow automobile cab.

Description

Human eye sight direction determining method and system of human eye sight detection system
Technical Field
The invention relates to the technical field of computer vision, in particular to a human eye sight direction determining method of a human eye sight detection system and the human eye sight detection system.
Background
At present, human eye sight direction detection has become a prominent problem in the field of computer vision, and with the development of deep learning technology, appearance-based sight direction detection has gradually become the mainstream approach. However, collecting human eye sight direction data is very difficult.
Existing collection schemes include the following. 1. The collected subject keeps the head position fixed and looks at several cameras at fixed positions while the cameras record image data. 2. The collected subject looks at a number of points at fixed positions; data are recorded by a multi-view camera, and the sight direction is obtained by computing the spatial positions of the eyes and the fixed points. 3. Two distinguishable points are randomly displayed on a large screen; the subject's face is oriented toward one point while the eyes look at the other; data are recorded by a camera array, and the sight direction is obtained from the relative spatial coordinates of the eyes and the random points. Schemes 1 and 2 can acquire sight data in different scenes (such as the narrow space of an automobile cab), but the sight direction angles are discrete and do not conform to the natural behavior of gaze. Scheme 3 can collect sight data with continuous direction angles, but is greatly limited in terms of collection scenes.
Disclosure of Invention
In view of the above problems, embodiments of the present invention are proposed to provide a human eye gaze direction determination method of a human eye gaze detection system and a corresponding human eye gaze detection system that overcome or at least partially solve the above problems.
In order to solve the above problems, an embodiment of the present invention discloses a method for determining the human eye sight direction of a human eye sight detection system, where the human eye sight detection system includes: a plurality of cameras, a steering engine pan-tilt, and a laser ranging device arranged on the steering engine pan-tilt. The method includes the following steps:
determining the coordinates of the light spots emitted by the laser ranging device in a three-dimensional world coordinate system taking the rotation center of the steering engine holder as the origin of coordinates;
acquiring a plurality of user images acquired by the plurality of cameras to determine coordinates of human eyes in the three-dimensional world coordinate system;
and determining the direction vector of the human eye sight in the three-dimensional world coordinate system based on the spot coordinate and the human eye coordinate.
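Once both coordinates are known, the claimed direction vector reduces to simple vector arithmetic. A minimal sketch; the subtraction order (eye toward spot) and the unit-length normalization are assumptions of this illustration, not taken from the patent:

```python
import numpy as np

def gaze_direction(spot_xyz, eye_xyz):
    """Direction vector of the sight line, pointing from the eye toward
    the laser spot, expressed in the three-dimensional world coordinate
    system and normalized to unit length."""
    v = np.asarray(spot_xyz, dtype=float) - np.asarray(eye_xyz, dtype=float)
    return v / np.linalg.norm(v)

# Example (hypothetical coordinates in meters):
# eye at (0.1, 0.3, -0.5), spot on an obstacle at (0.4, 0.0, 2.0)
d = gaze_direction((0.4, 0.0, 2.0), (0.1, 0.3, -0.5))
```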
Optionally, the steering engine pan-tilt is composed of two steering engines whose rotation directions are mutually perpendicular; and the determining of the coordinates of the light spot emitted by the laser ranging device in a three-dimensional world coordinate system with the rotation center of the steering engine pan-tilt as the coordinate origin includes:
acquiring the rotation angle of the steering engine holder when the laser ranging device emits light spots;
determining the distance from the light spot to the rotation center of the steering engine holder;
and determining the coordinates of the light spot in a three-dimensional world coordinate system taking the rotation center of the steering engine holder as the origin of coordinates by adopting the distance and the rotation angle.
Optionally, the acquiring a plurality of user images captured by the plurality of cameras to determine coordinates of the human eye in the three-dimensional world coordinate system includes:
determining coordinates of a plurality of human eye images in the plurality of user images in a pixel coordinate system;
converting the coordinates of the human eye images in the pixel coordinate system into the coordinates of the human eye images in the three-dimensional world coordinate system;
converting origin coordinates of the camera coordinate systems of the cameras into a plurality of camera origin coordinates in a three-dimensional world coordinate system;
and determining the coordinates of human eyes in the three-dimensional world coordinate system according to the coordinates of the origin of the cameras and the coordinates of the corresponding human eye images in the three-dimensional world coordinate system.
Optionally, the converting of the coordinates of the plurality of human eye images in the pixel coordinate system into the coordinates of the plurality of human eye images in the three-dimensional world coordinate system includes:
converting the coordinates of the plurality of human eye images in the pixel coordinate system into the coordinates of the plurality of human eye images in the image coordinate system by adopting a conversion relation between a preset pixel coordinate system and an image coordinate system;
converting the coordinates of the human eye images in the image coordinate system into the coordinates of the human eye images in the camera coordinate system by adopting a conversion relation between a preset image coordinate system and a camera coordinate system;
and converting the coordinates of the plurality of human eye images in the camera coordinate system into the coordinates of the plurality of human eye images in the three-dimensional world coordinate system by adopting a conversion relation between a preset camera coordinate system and the three-dimensional world coordinate system.
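The three-stage conversion above corresponds to back-projection through a pinhole camera model. A hedged sketch of the pixel → image → camera → world chain; the intrinsic matrix K and the extrinsic pose (R, t) stand in for the patent's "preset conversion relations" and are assumptions of this illustration. Note that a single pixel only determines a ray, which is why the patent intersects rays from several cameras:

```python
import numpy as np

def pixel_to_world_ray(uv, K, R, t):
    """Back-project a pixel to a ray in the world frame.

    K:    3x3 intrinsic matrix (pixel <-> image <-> camera conversion).
    R, t: rotation (3x3) and translation (3,) such that
          X_world = R @ X_cam + t.
    Returns (origin, direction): the camera center in world coordinates
    and a unit direction vector; the eye lies somewhere along this ray.
    """
    u, v = uv
    # pixel -> camera coordinates at depth 1 (pinhole model)
    x_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    origin = np.asarray(t, dtype=float)     # camera center: X_cam = 0
    direction = R @ x_cam
    return origin, direction / np.linalg.norm(direction)
```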
Optionally, the determining the coordinates of the human eyes in the three-dimensional world coordinate system according to the coordinates of the plurality of camera origin points and the coordinates of the corresponding human eye images in the three-dimensional world coordinate system further includes:
determining a plurality of straight lines according to the coordinates of the origin points of the plurality of cameras and the coordinates of the corresponding human eye images in the three-dimensional world coordinate system;
calculating the coordinates of the closest point of the plurality of straight lines, where the closest point is the point whose total distance to the plurality of straight lines is the smallest;
and determining the coordinates of the closest point of the plurality of straight lines as the coordinates of the human eye in the three-dimensional world coordinate system.
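The closest point to several 3D lines has a standard least-squares closed form; a sketch under the assumption that each line is given by a camera origin and a direction toward the corresponding eye image point:

```python
import numpy as np

def closest_point_to_lines(origins, directions):
    """Least-squares point minimizing the sum of squared distances to a
    set of 3D lines (each line: origin + s * direction).  Solves the
    normal equations  (sum_i P_i) x = sum_i P_i a_i  with the
    perpendicular projector  P_i = I - d_i d_i^T  for unit d_i."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for a, d in zip(origins, directions):
        d = np.asarray(d, dtype=float)
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)
        A += P
        b += P @ np.asarray(a, dtype=float)
    return np.linalg.solve(A, b)
```

With exactly intersecting lines the solution is their intersection; with noisy rays from several cameras it is the point closest to all of them, which the patent takes as the eye coordinate.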
Optionally, the method further includes:
and generating a human eye sight image by adopting the direction vector of the human eye sight in the three-dimensional world coordinate system and the user image.
Optionally, the human eye sight line detection system further comprises an upper computer and a steering engine control board, wherein the upper computer is connected with the steering engine control board and the laser ranging device, and the steering engine pan-tilt is connected with the steering engine control board;
the human eye sight direction determining method of the human eye sight detection system is applied to the upper computer, and the rotation angle of the steering engine pan-tilt is controlled through the steering engine control board.
Optionally, the method further includes:
acquiring positional relationship information between the plurality of cameras;
and determining the conversion relation between the camera coordinate systems of the cameras and the three-dimensional world coordinate system by adopting the position relation information among the cameras.
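The patent leaves this conversion implicit; one standard reading is that the positional relationship between cameras is a rigid transform, and each camera-to-world conversion is obtained by composing such transforms. A hedged sketch (the transform conventions are assumptions of this illustration):

```python
import numpy as np

def compose_pose(R_ab, t_ab, R_bw, t_bw):
    """Compose rigid transforms.  If  X_b = R_ab @ X_a + t_ab  maps
    camera-a coordinates to camera-b coordinates (the inter-camera
    positional relationship), and  X_w = R_bw @ X_b + t_bw  maps
    camera-b coordinates to the three-dimensional world coordinate
    system, the composition maps camera-a coordinates directly to
    world coordinates."""
    R_aw = R_bw @ R_ab
    t_aw = R_bw @ t_ab + t_bw
    return R_aw, t_aw
```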
The embodiment of the invention also discloses a human eye sight line detection system, which comprises: an upper computer, a plurality of cameras, a steering engine pan-tilt, and a laser ranging device arranged on the steering engine pan-tilt, wherein the upper computer includes:
the spot coordinate determination module is used for determining the coordinates of the spots emitted by the laser ranging device in a three-dimensional world coordinate system taking the rotation center of the steering engine holder as the origin of coordinates;
the human eye coordinate determination module is used for acquiring a plurality of user images acquired by the plurality of cameras so as to determine the coordinates of human eyes in the three-dimensional world coordinate system;
and the direction vector determining module is used for determining the direction vector of the human eye sight in the three-dimensional world coordinate system based on the light spot coordinates and the human eye coordinates.
Optionally, the steering engine pan-tilt is composed of two steering engines with mutually perpendicular rotation directions; the spot coordinate determination module includes:
the rotation angle acquisition submodule is used for acquiring the rotation angle of the steering engine pan-tilt when the laser ranging device emits light spots;
the distance determination submodule is used for determining the distance from the light spot to the rotation center of the steering engine pan-tilt;
and the light spot coordinate determination submodule is used for determining the coordinates of the light spots in a three-dimensional world coordinate system taking the rotation center of the steering engine holder as the origin of coordinates by adopting the distance and the rotation angle.
Optionally, the eye coordinate determination module includes:
a first eye coordinate determination submodule for determining coordinates of a plurality of eye images of the plurality of user images in a pixel coordinate system;
the human eye coordinate conversion sub-module is used for converting the coordinates of the human eye images in a pixel coordinate system into the coordinates of the human eye images in the three-dimensional world coordinate system;
the origin coordinate conversion sub-module is used for converting origin coordinates of the camera coordinate systems of the cameras into a plurality of camera origin coordinates in a three-dimensional world coordinate system;
and the second human eye coordinate determination submodule is used for determining the coordinates of human eyes in the three-dimensional world coordinate system according to the coordinates of the camera origin points and the coordinates of the corresponding human eye images in the three-dimensional world coordinate system.
Optionally, the human eye coordinate transformation submodule includes:
the first coordinate conversion unit is used for converting the coordinates of the human eye images in the pixel coordinate system into the coordinates of the human eye images in the image coordinate system by adopting a conversion relation between a preset pixel coordinate system and an image coordinate system;
the second coordinate conversion unit is used for converting the coordinates of the human eye images in the image coordinate system into the coordinates of the human eye images in the camera coordinate system by adopting a conversion relation between a preset image coordinate system and a camera coordinate system;
and the third coordinate conversion unit is used for converting the coordinates of the plurality of human eye images in the camera coordinate system into the coordinates of the plurality of human eye images in the three-dimensional world coordinate system by adopting a conversion relation between a preset camera coordinate system and the three-dimensional world coordinate system.
Optionally, the second human eye coordinate determination sub-module includes:
the straight line determining unit is used for determining a plurality of straight lines according to the coordinates of the origin points of the cameras and the coordinates of the corresponding human eye images in the three-dimensional world coordinate system;
a coordinate calculation unit, configured to calculate the coordinates of the closest point of the plurality of straight lines, where the closest point is the point whose total distance to the plurality of straight lines is the smallest;
and a coordinate determination unit, configured to determine the coordinates of the closest point of the plurality of straight lines as the coordinates of the human eye in the three-dimensional world coordinate system.
Optionally, the method further includes:
and the human eye sight image generation module is used for generating a human eye sight image by adopting the direction vector of the human eye sight in the three-dimensional world coordinate system and the user image.
Optionally, the human eye sight line detection system further comprises a steering engine control board, wherein the upper computer is connected with the steering engine control board and the laser ranging device, and the steering engine pan-tilt is connected with the steering engine control board;
the rotation angle of the steering engine pan-tilt is controlled by the upper computer through the steering engine control board.
Optionally, the method further includes:
the relation acquisition module is used for acquiring the position relation information among the cameras;
and the conversion relation determining module is used for determining the conversion relation between the camera coordinate systems of the cameras and the three-dimensional world coordinate system by adopting the position relation information among the cameras.
The embodiment of the invention also discloses a device, which comprises: a processor, a memory and a computer program stored on the memory and capable of running on the processor, the computer program, when executed by the processor, implementing the steps of the method for determining a direction of a human eye gaze detection system according to an embodiment of the invention.
The embodiment of the invention also discloses a computer readable storage medium, wherein a computer program is stored on the computer readable storage medium, and when the computer program is executed by a processor, the steps of the human eye sight direction determining method of the human eye sight detection system are realized.
The embodiment of the invention has the following advantages:
in the embodiment of the invention, the coordinates of the light spot emitted by the laser ranging device in the three-dimensional world coordinate system with the rotation center of the steering engine pan-tilt as the coordinate origin are determined; a plurality of user images collected by a plurality of cameras are acquired to determine the coordinates of the human eye in the three-dimensional world coordinate system; and the direction vector of the human eye sight line in the three-dimensional world coordinate system is determined based on the spot coordinates and the human eye coordinates. By providing the steering engine pan-tilt, human eye sight image data with continuous direction angles can be collected, and the system is applicable to various scenes, for example a narrow automobile cab.
Drawings
FIG. 1 is a flowchart illustrating steps of an embodiment of a method for determining a direction of a human eye gaze in a human eye gaze detection system of the present invention;
FIG. 2 is a schematic diagram of a human eye gaze detection system of the present invention;
FIG. 3 is a schematic illustration of a human eye gaze detection system of the present invention in a simplified installation in a laboratory setting;
FIG. 4 is an image of a human eye's gaze captured in accordance with the present invention;
fig. 5 is a block diagram of an embodiment of a human eye gaze detection system of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
Referring to fig. 1, a flowchart illustrating steps of an embodiment of a method for determining a human eye gaze direction of a human eye gaze detection system of the present invention is shown, the human eye gaze detection system comprising: the device comprises a plurality of cameras, a steering engine holder and a laser ranging device arranged on the steering engine holder.
Fig. 2 shows a schematic structural diagram of a human eye sight line detection system of the present invention. In fig. 2, the human eye sight line detection system includes: the device comprises an upper computer (1), a steering engine control device (2), a laser ranging device (3), two steering engines (4 and 5) and three cameras (6, 7 and 8).
The upper computer can serve as the control center of the human eye sight detection system; it executes commands and performs a series of calculations. The upper computer (1) can be connected with the steering engine control device (2) through a TTL serial port, with a fixed communication protocol between the two. According to the protocol, the upper computer can perform control operations on the steering engine pan-tilt such as rotating by an arbitrary angle, calibrating and compensating rotation errors, determining the rotation range and rotation mode, and controlling the rotation speed; the upper computer controls the steering engines by sending commands to the steering engine control device.
It should be noted that the steering engine control device (2) can control a plurality of steering engines simultaneously, for example 6 steering engines: the steering engine control device (2) has 6 groups of numbered connection pins, and the correspondingly controlled steering engine can be identified by its number, which is not limited in the embodiment of the present invention. In the embodiment of the present invention, two steering engines are controlled by the steering engine control device (2). The steering engine control device (2) is connected with the two steering engines (4 and 5) through two power supply wires and a signal wire, and the rotation directions of the two steering engines are mutually perpendicular.
The laser ranging device (3) is arranged on the steering engine pan-tilt; at installation, the laser beam passes exactly through the rotation axes of the two steering engines (4 and 5), and its direction coincides with the 0-degree direction of the two steering engines. The laser ranging device (3) is directly connected with the upper computer (1) through a TTL serial port, also with a fixed communication protocol between the two; the upper computer can perform three simple control operations on the laser ranging device (3): opening the beam, closing the beam, and measuring distance. The three cameras (6, 7, 8) are fixed at suitable positions and are directly controlled by the upper computer (1).
When the direction of the sight of the human eyes is collected, the collected object faces the camera and is seated behind the steering engine pan-tilt. Fig. 3 is a schematic diagram showing a human eye sight detection system of the present invention which is easily installed in a laboratory scene.
The method for determining the human eye sight direction of the human eye sight detection system specifically comprises the following steps:
and 101, determining the coordinates of the light spot emitted by the laser ranging device in a three-dimensional world coordinate system taking the rotation center of the steering engine holder as the origin of coordinates.
The steering engine pan-tilt rotates about a fixed point, which is its rotation center. The laser ranging device arranged on the pan-tilt rotates with it, so laser can be emitted at many angles. In the embodiment of the invention, a three-dimensional world coordinate system can be established with the rotation center of the steering engine pan-tilt as the coordinate origin.
The laser ranging device may emit a spot of light and measure a distance between the spot of light and the laser ranging device.
The coordinates of the light spot in the three-dimensional world coordinate system can be calculated from the distance between the light spot and the laser ranging device, as measured by the device. It should be noted that, for convenience of calculation, the laser ranging device may be placed at the rotation center of the steering engine pan-tilt; in that case the device can be considered to measure the distance between the light spot and the rotation center directly, and this distance can be used to calculate the coordinates of the light spot in the three-dimensional world coordinate system.
In addition, even if the laser ranging device is not placed at the rotation center of the steering engine pan-tilt, the coordinates of the light spot in the three-dimensional world coordinate system can be corrected using the distance from the laser ranging device to the rotation center.
In a preferred embodiment of the present invention, the steering engine pan-tilt head may include two steering engines whose rotation planes are perpendicular to each other; for example, it includes a horizontally rotating steering engine and a vertically rotating steering engine.
The step 101 may comprise the following sub-steps:
acquiring the rotation angle of the steering engine holder when the laser ranging device emits light spots; determining the distance from the light spot to the rotation center of the steering engine holder; and determining the coordinates of the light spot in a three-dimensional world coordinate system taking the rotation center of the steering engine holder as the origin of coordinates by adopting the distance and the rotation angle.
In the embodiment of the invention, the laser ranging device is arranged on the two steering engines, and when the two steering engines rotate, the laser ranging device can be driven to move, so that the projection position of a light spot emitted by the laser ranging device is changed.
Specifically, two rotation angles θ and φ may be randomly generated. If the steering engine step lengths corresponding to a 90-degree rotation are N_a and N_b respectively, then at rotation angles θ and φ the step values V_1 and V_2 of the two steering engines satisfy the following formulas:

V_1 = θ ÷ 90 × N_a

V_2 = φ ÷ 90 × N_b

When the two steering engines are controlled by the steering engine control device, the step values V_1 and V_2 can be encoded into a command statement for the steering engines to generate a rotation command, which is transmitted to the steering engine control device through the serial port so that the two steering engines of the steering engine pan-tilt rotate; at the same time, the rotation angles θ and φ are recorded.
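The angle-to-step conversion can be sketched as follows; rounding to an integer step count is an assumption of this illustration:

```python
def angle_to_steps(theta_deg, phi_deg, n_a, n_b):
    """Convert the two rotation angles (in degrees) into steering engine
    step values, where n_a and n_b are the step counts corresponding to
    a 90-degree rotation of the horizontal and vertical steering engine
    respectively (V = angle / 90 * N)."""
    v1 = theta_deg / 90.0 * n_a
    v2 = phi_deg / 90.0 * n_b
    return round(v1), round(v2)
```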
after the two steering engines complete rotation at random angles, a ranging command can be sent to the laser ranging device, the laser ranging device receives the ranging command and then emits laser (namely, emits a light spot), and the laser can form the light spot on an obstacle after encountering the obstacle. The laser ranging device can further measure the distance L between the light spot and the rotation center of the steering engine holder.
In the embodiment of the invention, the distance L between the light spot and the rotation center of the steering engine pan-tilt and the recorded rotation angles θ and φ can be used to calculate the coordinates of the light spot in the three-dimensional world coordinate system.

Specifically, the coordinates of the light spot in the three-dimensional world coordinate system satisfy the following formula:

P = (L · cos φ · sin θ, L · sin φ, L · cos φ · cos θ)

where θ and φ are the rotation angles of the steering engine pan-tilt, L is the distance between the light spot and the rotation center of the steering engine pan-tilt, and P is the coordinate of the light spot in the three-dimensional world coordinate system.

Substituting the measured distance L and the recorded angles θ and φ into the above formula yields the coordinates of the light spot in the three-dimensional world coordinate system.
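A sketch of the spot-coordinate computation; the axis convention here (θ as the horizontal angle, φ as the vertical angle, with the 0-degree direction along +Z) is an assumption about the installation:

```python
import math

def spot_world_coords(L, theta_deg, phi_deg):
    """Spot coordinates in the world frame centered at the pan-tilt's
    rotation center, assuming theta is the horizontal (yaw) rotation,
    phi the vertical (pitch) rotation, and 0 degrees points along +Z.
    This is the standard spherical-to-Cartesian conversion."""
    th = math.radians(theta_deg)
    ph = math.radians(phi_deg)
    x = L * math.cos(ph) * math.sin(th)
    y = L * math.sin(ph)
    z = L * math.cos(ph) * math.cos(th)
    return (x, y, z)
```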
In a preferred embodiment of the invention, the human eye sight line detection system further comprises an upper computer and a steering engine control board, wherein the upper computer is connected with the steering engine control board and the laser ranging device, and the steering engine pan-tilt is connected with the steering engine control board. The human eye sight direction determining method of the human eye sight detection system is applied to the upper computer, and the rotation angle of the steering engine pan-tilt is controlled through the steering engine control board.
And 102, acquiring a plurality of user images acquired by the plurality of cameras to determine the coordinates of human eyes in the three-dimensional world coordinate system.
When the laser ranging device emits a light spot, the deployed cameras each acquire a user image. During acquisition, the collected subject gazes at the light spot left on the obstacle in front, and each camera captures a user image to complete the shooting task.
To ensure that the eyes of the collected subject are indeed gazing at the light spot when the user images are acquired, the light spot can be set to flash a random number of times on each acquisition, and the subject is required to state how many times it flashed, thereby confirming that the gazing action actually occurred.
In the embodiment of the invention, the coordinates of the human eye in the three-dimensional world coordinate system can be determined by adopting the human eye image in the plurality of user images. The human eye image may be an image of a user including a human eye part in the acquired user image.
In a preferred embodiment of the present invention, the step 102 may include the following sub-steps:
a substep S11 of determining coordinates of a plurality of human eye images among the plurality of user images in a pixel coordinate system;
in the embodiment of the present invention, to determine the coordinates of the human eye in the three-dimensional world coordinate system, the coordinates of the plurality of human eye images in the pixel coordinate system of the plurality of user images may be determined first.
The origin of the pixel coordinate system is the upper left corner of the user image; the positive X-axis direction runs rightward from the upper left corner and the positive Y-axis direction runs downward from it. The position P of the eye in the human eye image can be labeled manually, giving the coordinates of P in the pixel coordinate system. In a specific implementation, the position P of the eye in the human eye image may also be labeled automatically according to some rule, for example as the center position of the human eye image; the embodiment of the present invention places no limitation on this.
A substep S12, converting the coordinates of the plurality of human eye images in the pixel coordinate system into the coordinates of the plurality of human eye images in the three-dimensional world coordinate system;
after determining the coordinates of the plurality of human eye images in the pixel coordinate system in the plurality of user images, the coordinates of the plurality of human eye images in the pixel coordinate system may be further converted into the coordinates of the plurality of human eye images in the three-dimensional world coordinate system according to a preset conversion relationship.
In a preferred embodiment of the present invention, the sub-step S12 may include the following steps:
converting the coordinates of the plurality of human eye images in the pixel coordinate system into the coordinates of the plurality of human eye images in the image coordinate system by adopting a conversion relation between a preset pixel coordinate system and an image coordinate system; converting the coordinates of the human eye images in the image coordinate system into the coordinates of the human eye images in the camera coordinate system by adopting a conversion relation between a preset image coordinate system and a camera coordinate system; and converting the coordinates of the plurality of human eye images in the camera coordinate system into the coordinates of the plurality of human eye images in the three-dimensional world coordinate system by adopting a conversion relation between a preset camera coordinate system and the three-dimensional world coordinate system.
The original point of the camera coordinate system is a pinhole of the camera, the direction parallel to the X axis of the pixel coordinate system is the direction of the X axis of the camera coordinate system, and the direction parallel to the Y axis of the pixel coordinate system is the direction of the Y axis of the camera coordinate system.
In a specific implementation, the pixel coordinate system and the image coordinate system both lie on the imaging plane; they differ only in their origins and measurement units. The origin of the image coordinate system is usually the midpoint of the imaging plane, and its unit is the physical unit mm. The unit of the pixel coordinate system is the pixel, and a pixel is usually described by its row and column indices. The conversion between the two is as follows: dx and dy denote how many mm each column and each row respectively represent, i.e. 1 pixel corresponds to dx mm along the column direction and dy mm along the row direction.
The conversion relationship between the preset pixel coordinate system and the image coordinate system, the conversion relationship between the preset image coordinate system and the camera coordinate system, and the conversion relationship between the preset camera coordinate system and the three-dimensional world coordinate system are conversion relationships between coordinate systems preset for each camera. Specifically, the parameters of each camera can be determined through camera calibration, and the conversion relation between the coordinate systems can be determined by using the parameters of each camera.
In the embodiment of the invention, the coordinates of a plurality of human eye images in the pixel coordinate system can be converted into the coordinates of the plurality of human eye images in the image coordinate system by adopting a conversion relation between a preset pixel coordinate system and an image coordinate system; converting the coordinates of the plurality of human eye images in the image coordinate system into the coordinates of the plurality of human eye images in the camera coordinate system by adopting a conversion relation between a preset image coordinate system and a camera coordinate system; and converting the coordinates of the plurality of human eye images in the camera coordinate system into the coordinates of the plurality of human eye images in the three-dimensional world coordinate system by adopting a conversion relation between a preset camera coordinate system and the three-dimensional world coordinate system.
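The chained conversion described above can be sketched with the standard pinhole model, folding the pixel-to-image and image-to-camera steps into a single intrinsic matrix K. This is a hedged illustration: the patent gives no numeric parameters, and the extrinsic convention p_cam = R * p_world + T is assumed here:

```python
import numpy as np

def pixel_to_world(u, v, depth, K, R, T):
    """Back-project pixel (u, v) at a known depth (metres along the
    camera Z axis) into the three-dimensional world coordinate system.

    K is the 3x3 intrinsic matrix (the pixel->image and image->camera
    conversions folded together); R and T are the camera's rotation
    matrix and translation vector, assuming p_cam = R @ p_world + T."""
    pixel_h = np.array([u, v, 1.0])              # homogeneous pixel coords
    p_cam = depth * np.linalg.inv(K) @ pixel_h   # camera coordinates
    p_world = np.linalg.inv(R) @ (p_cam - T)     # world coordinates
    return p_world
```

A pixel at the principal point back-projects onto the camera's optical axis; with identity extrinsics it lands at (0, 0, depth) in the world frame.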
In an embodiment of the present invention, the method may further include the following steps:
acquiring positional relationship information between the plurality of cameras; and respectively determining the conversion relation between the camera coordinate systems of the cameras and the three-dimensional world coordinate system by adopting the position relation information among the cameras.
Each camera has its own camera coordinate system, and these may differ from camera to camera, so the conversion relationship between each camera's coordinate system and the three-dimensional world coordinate system needs to be determined separately.

The conversion relationship between a camera's coordinate system and the three-dimensional world coordinate system can be determined by determining the pose of that camera in the three-dimensional world coordinate system. The pose comprises translation information and rotation information; since the human eye sight line detection system uses no fewer than three cameras, the pose of each camera in the three-dimensional world coordinate system must be determined.

Specifically, the distances between the laser ranging device and the respective cameras, as measured by the laser ranging device, can be acquired, together with the rotation angles of the steering engine pan-tilt at the moments those distances were measured; using these distances and rotation angles, the translation information and rotation information relating each camera coordinate system to the three-dimensional world coordinate system can be solved, thereby determining the pose of each camera coordinate system in the three-dimensional world coordinate system.
In an embodiment of the invention, the laser ranging device may measure the distance between itself and each camera.

In a specific implementation, the light spot can be emitted toward a camera, so that the measured spot-to-device distance is exactly the distance between that camera and the laser ranging device. To aim the light spot at a camera, the steering engines of the steering engine pan-tilt need to rotate by certain angles, driving the laser ranging device with them; the rotation angles of the steering engines at that moment can be acquired, and the measured distance together with these rotation angles is used to determine the pose of the camera. This semi-automatic camera-pose calibration greatly simplifies the camera calibration process and improves the efficiency of gaze-direction determination in the human eye sight detection system.
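The patent does not spell out how the translation and rotation are solved from the distance and angle measurements. One standard way to recover a rigid pose, once several point correspondences between the world frame and a camera frame are available, is the least-squares Kabsch/SVD method, sketched below purely as an illustration (at least three non-collinear correspondences are needed, in line with the text's remark about using at least three cameras):

```python
import numpy as np

def solve_pose(world_pts, cam_pts):
    """Least-squares rigid pose (R, T) with cam ~= R @ world + T, via
    the Kabsch/SVD method.  world_pts and cam_pts are N x 3 arrays of
    corresponding points; N >= 3 non-collinear points are required."""
    wc = world_pts.mean(axis=0)
    cc = cam_pts.mean(axis=0)
    H = (world_pts - wc).T @ (cam_pts - cc)   # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])                # guard against reflections
    R = Vt.T @ D @ U.T
    T = cc - R @ wc
    return R, T
```

Given exact correspondences the method returns the generating rotation and translation; with noisy measurements it returns the least-squares best fit.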
A substep S13 of converting origin coordinates of the camera coordinate systems of the plurality of cameras into a plurality of camera origin coordinates in a three-dimensional world coordinate system;
in an embodiment of the present invention, the coordinates of the pinhole (i.e., the origin of the camera coordinate system) of each camera in the three-dimensional world coordinates may be determined.
Specifically, the origin coordinates of the camera coordinate systems of the plurality of cameras may be converted into a plurality of camera origin coordinates in the three-dimensional world coordinate system by using a conversion relationship between a preset camera coordinate system and the three-dimensional world coordinate system.
And a substep S14 of determining coordinates of human eyes in the three-dimensional world coordinate system according to the coordinates of the camera origin points and the coordinates of the corresponding human eye images in the three-dimensional world coordinate system respectively.
In the embodiment of the invention, the coordinates of the human eye in the three-dimensional world coordinate system can be respectively determined according to the coordinates of each camera origin and the coordinates of the corresponding human eye image in the three-dimensional world coordinate system.
Specifically, a straight line may be determined from each camera origin's coordinates and the coordinates of the corresponding human eye image in the three-dimensional world coordinate system, and the coordinates of the human eye in the three-dimensional world coordinate system may then be determined from the resulting set of straight lines. Within each straight line, the two coordinates must come from the same camera: the human eye image coordinates in the three-dimensional world coordinate system belong to the user image shot by that camera, and the origin coordinates are those of that same camera's coordinate system.
In a preferred embodiment of the present invention, the sub-step S14 may include the following steps:
determining a plurality of straight lines according to the coordinates of the origin points of the plurality of cameras and the coordinates of the corresponding human eye images in the three-dimensional world coordinate system; calculating coordinates of closest points of the plurality of straight lines; the closest point is the point with the shortest straight line distance with the straight lines; and determining the coordinates of the closest points of the straight lines as the coordinates of the human eyes in the three-dimensional world coordinate system.
According to the imaging principle of the pinhole camera, an image (human eye image), a pinhole and an object (human eyes) are on the same spatial straight line under a camera coordinate system.
In the embodiment of the present invention, after the coordinates of the human eye image in the three-dimensional world coordinate system and the camera origin coordinates of the pinhole (the origin of the camera coordinate system) in the three-dimensional world coordinate system have been determined, a straight line can be determined from these two coordinates.
Specifically, for the first camera, let the human eye be the point P_c in the camera coordinate system, and let the origin of the camera coordinate system be the point O_c. The point P_w of the human eye in the three-dimensional world coordinate system and the point O_w of the camera origin in the three-dimensional world coordinate system satisfy the following formulas:

P_w = (A*R)^(-1) * (P_c - T) = (x_w1, y_w1, z_w1);

O_w = (A*R)^(-1) * (O_c - T) = (x_w2, y_w2, z_w2).
A is the internal reference (intrinsic) matrix of the camera, R is the rotation matrix of the camera, and T is the translation vector of the camera; the A, R and T corresponding to each camera can be determined through camera calibration.
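The relation P_w = (A*R)^(-1)*(P_c - T) stated above can be transcribed directly; the sketch below is a plain NumPy transcription, with A, R and T supplied from camera calibration:

```python
import numpy as np

def to_world(p, A, R, T):
    """Transcription of the patent's stated relation
    P_w = (A*R)^(-1) * (P_c - T), mapping a camera-side point p to the
    three-dimensional world coordinate system.  A, R and T are the
    calibrated intrinsic matrix, rotation matrix and translation
    vector of the camera."""
    return np.linalg.inv(A @ R) @ (np.asarray(p, dtype=float) - T)
```

Applying the same function to P_c and to O_c yields P_w and O_w, the two points that define the sight line through the pinhole.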
Passing through these two points, the straight line between the human eye and the pinhole satisfies the following formula:

f1: (x - x_w1)/(x_w2 - x_w1) = (y - y_w1)/(y_w2 - y_w1) = (z - z_w1)/(z_w2 - z_w1).
When there are multiple cameras, the equations f2, f3, etc. satisfied by each camera's corresponding straight line can be determined in the same way.

In the embodiment of the invention, the spatial straight lines constructed for the several cameras should in theory intersect, because each of them passes through the point of the human eye position in the three-dimensional world coordinate system. In practice, calculation error may leave the straight lines disjoint; in that case, the point with the shortest total distance to the straight lines can be computed by the least squares method and taken as the position of the human eye. When the shortest distance to these straight lines is 0, the straight lines do intersect.
In a specific implementation, a linear equation set can be generated according to the equation relationship satisfied by each straight line, and the coordinates of the human eye in the three-dimensional world coordinate system can be obtained by solving the linear equation set.
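The least-squares closest point to several 3-D lines has a closed form: each line contributes the projector I - d d^T onto the plane perpendicular to its unit direction d, and summing the resulting normal equations gives a 3x3 linear system. A minimal sketch (function and variable names are illustrative):

```python
import numpy as np

def nearest_point_to_lines(origins, directions):
    """Least-squares point minimising the summed squared perpendicular
    distance to a set of 3-D lines, each given by an origin o and a
    direction d.  For each line, I - d d^T projects onto the plane
    normal to d; summing these projectors yields the normal equations."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # perpendicular projector
        A += P
        b += P @ o
    return np.linalg.solve(A, b)
```

For the two lines through the origin with direction (1, 1, 1) and through (1, 1, 0) with direction (0, 0, 1), the result is their intersection (1, 1, 1); when the lines are truly concurrent the residual distance is zero, matching the text's remark above.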
And 103, determining a direction vector of the human eye sight in the three-dimensional world coordinate system based on the spot coordinate and the human eye coordinate.
In the embodiment of the invention, the coordinates of the light spot in the three-dimensional world coordinate system and the coordinates of the human eye in the three-dimensional world coordinate system can be adopted to determine the direction vector of the human eye sight line in the world coordinate system.
Specifically, the direction vector of the line of sight of the human eye in the three-dimensional world coordinate system can be obtained by subtracting the coordinates of the light spot in the three-dimensional world coordinate system from the coordinates of the human eye in the world coordinate system.
In a preferred embodiment of the embodiments of the present invention, the method may further include the steps of:
and generating a human eye sight image by adopting the direction vector of the human eye sight in the three-dimensional world coordinate system and the user image.
Specifically, the direction vector of the human eye sight line in the three-dimensional world coordinate system is converted into the direction vector of the human eye sight line in the pixel coordinate system through the conversion relationship between the preset camera coordinate system and the three-dimensional world coordinate system, the conversion relationship between the preset image coordinate system and the camera coordinate system, and the conversion relationship between the preset pixel coordinate system and the image coordinate system, so that the user image collected by the camera and the direction vector of the human eye sight line in the pixel coordinate system are adopted to generate the human eye sight line image.
In addition, the coordinates of the light spot in the three-dimensional world coordinate system can be converted into the coordinates of the light spot in the camera coordinate system and the coordinates of the human eye in the three-dimensional world coordinate system can be converted into the coordinates of the human eye in the camera coordinate system through the conversion relation between the camera coordinate system and the three-dimensional world coordinate system; and determining the direction vector of the human eye in the camera coordinate system by adopting the coordinates of the light spot in the camera coordinate system and the coordinates of the human eye in the camera coordinate system. And further converting the direction vector of the human eye sight line in the camera coordinate system into the direction vector of the human eye sight line in the pixel coordinate system according to the conversion relationship between the preset image coordinate system and the camera coordinate system and the conversion relationship between the preset pixel coordinate system and the image coordinate system, so that the user image collected by the camera and the direction vector of the human eye sight line in the pixel coordinate system are adopted to generate the human eye sight line image.
Specifically, for the first camera, let the coordinates of the human eye in the camera coordinate system be E_c and its coordinates in the three-dimensional world coordinate system be E_w; let the coordinates of the light spot in the camera coordinate system be Q_c and its coordinates in the world coordinate system be Q_w. Then E_c and E_w, and Q_c and Q_w, respectively satisfy formulas relating the two coordinate systems (given in the original as equation images), wherein R is the rotation matrix of the camera and T is the translation vector of the camera.

From the formulas satisfied by E_c and E_w and by Q_c and Q_w, the coordinates Q_c of the light spot in the camera coordinate system and the coordinates E_c of the human eye in the camera coordinate system can be determined respectively. Subtracting E_c from Q_c yields the direction vector of the human eye sight line in the camera coordinate system.
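Putting the last two steps together: assuming the common extrinsic model p_cam = R * p_world + T (the patent's own equations appear only as images), the eye and spot positions can be moved into the camera frame and differenced to give the gaze direction. A minimal sketch:

```python
import numpy as np

def gaze_in_camera(E_w, Q_w, R, T):
    """Transform the eye position E_w and spot position Q_w from world
    to camera coordinates with p_cam = R @ p_world + T (an assumed
    standard extrinsic model), then form the unit gaze direction from
    the eye toward the spot in the camera frame."""
    E_c = R @ E_w + T
    Q_c = R @ Q_w + T
    v = Q_c - E_c
    return v / np.linalg.norm(v)   # unit gaze direction, camera frame
```

With identity extrinsics, an eye at the origin and a spot 2 m along the optical axis give the gaze direction (0, 0, 1).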
Fig. 4 shows a collected human eye sight line image according to the present invention; in fig. 4, human eye sight lines in different directions are marked on the collected image. With the human eye sight direction determining method of the human eye sight detection system, human eye sight lines in a variety of directions can be collected.
In the embodiment of the invention, the coordinates of the light spot emitted by the laser ranging device are determined in a three-dimensional world coordinate system taking the rotation center of the steering engine pan-tilt as the origin of coordinates; a plurality of user images collected by the plurality of cameras are acquired to determine the coordinates of the human eye in the three-dimensional world coordinate system; and the direction vector of the human eye sight line in the three-dimensional world coordinate system is determined based on the spot coordinates and the human eye coordinates. By providing the steering engine pan-tilt, human eye sight line image data can be collected over continuous direction angles, and the system can also be applied to a variety of scenes, such as a narrow automobile cabin.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the illustrated order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Referring to fig. 5, a block diagram of a human eye sight line detection system according to an embodiment of the present invention is shown. The human eye sight line detection system comprises: an upper computer, a plurality of cameras, a steering engine pan-tilt, and a laser ranging device arranged on the steering engine pan-tilt; the upper computer may specifically include the following modules:
the light spot coordinate determination module 501 is configured to determine coordinates of a light spot emitted by the laser distance measurement device in a three-dimensional world coordinate system using a rotation center of the steering engine pan-tilt as an origin of coordinates;
a human eye coordinate determination module 502, configured to acquire a plurality of user images acquired by the plurality of cameras to determine coordinates of human eyes in the three-dimensional world coordinate system;
a direction vector determining module 503, configured to determine a direction vector of the human eye sight line in the three-dimensional world coordinate system based on the spot coordinates and the human eye coordinates.
In a preferred embodiment of the invention, the steering engine pan-tilt is composed of two steering engines whose rotation directions are mutually perpendicular; the light spot coordinate determination module 501 includes:
the rotation angle acquisition submodule is used for acquiring the rotation angle of the steering engine pan-tilt when the laser ranging device emits light spots;
the distance determination submodule is used for determining the distance from the light spot to the rotation center of the steering engine pan-tilt;
and the light spot coordinate determination submodule is used for determining the coordinates of the light spots in a three-dimensional world coordinate system taking the rotation center of the steering engine holder as the origin of coordinates by adopting the distance and the rotation angle.
In a preferred embodiment of the present invention, the human eye coordinate determination module 502 includes:
a first eye coordinate determination submodule for determining coordinates of a plurality of eye images of the plurality of user images in a pixel coordinate system;
the human eye coordinate conversion sub-module is used for converting the coordinates of the human eye images in a pixel coordinate system into the coordinates of the human eye images in the three-dimensional world coordinate system;
the origin coordinate conversion sub-module is used for converting origin coordinates of the camera coordinate systems of the cameras into a plurality of camera origin coordinates in a three-dimensional world coordinate system;
and the second human eye coordinate determination submodule is used for determining the coordinates of human eyes in the three-dimensional world coordinate system according to the coordinates of the camera origin points and the coordinates of the corresponding human eye images in the three-dimensional world coordinate system.
In a preferred embodiment of the present invention, the human eye coordinate transformation submodule includes:
the first coordinate conversion unit is used for converting the coordinates of the human eye images in the pixel coordinate system into the coordinates of the human eye images in the image coordinate system by adopting a conversion relation between a preset pixel coordinate system and an image coordinate system;
the second coordinate conversion unit is used for converting the coordinates of the human eye images in the image coordinate system into the coordinates of the human eye images in the camera coordinate system by adopting a conversion relation between a preset image coordinate system and a camera coordinate system;
and the third coordinate conversion unit is used for converting the coordinates of the plurality of human eye images in the camera coordinate system into the coordinates of the plurality of human eye images in the three-dimensional world coordinate system by adopting a conversion relation between a preset camera coordinate system and the three-dimensional world coordinate system.
In a preferred embodiment of the present invention, the second human eye coordinate determination submodule includes:
the straight line determining unit is used for determining a plurality of straight lines according to the coordinates of the origin points of the cameras and the coordinates of the corresponding human eye images in the three-dimensional world coordinate system;
a coordinate calculation unit for calculating coordinates of closest points of the plurality of straight lines; the closest point is the point with the shortest straight line distance with the straight lines;
and the coordinate determination unit is used for determining the coordinates of the closest points of the straight lines as the coordinates of the human eyes in the three-dimensional world coordinate system.
In a preferred embodiment of the present invention, the method further includes:
and the human eye sight image generation module is used for generating a human eye sight image by adopting the direction vector of the human eye sight in the three-dimensional world coordinate system and the user image.
In a preferred embodiment of the invention, the human eye sight line detection system further comprises a steering engine control board; the upper computer is connected with the steering engine control board and the laser ranging device, and the steering engine pan-tilt is connected with the steering engine control board;

the human eye sight direction determining method of the human eye sight detection system is applied to the upper computer, and the rotation angle of the steering engine pan-tilt is controlled through the steering engine control board.
In a preferred embodiment of the present invention, the method further includes:
the relation acquisition module is used for acquiring the position relation information among the cameras;
and the conversion relation determining module is used for determining the conversion relation between the camera coordinate systems of the cameras and the three-dimensional world coordinate system by adopting the position relation information among the cameras.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
An embodiment of the present invention further provides an apparatus, including:
the method comprises a processor, a memory and a computer program which is stored in the memory and can run on the processor, wherein when the computer program is executed by the processor, each process of the embodiment of the method for determining the direction of the human eye sight detection system is realized, the same technical effect can be achieved, and the process is not repeated here to avoid repetition.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when being executed by a processor, the computer program implements each process of the embodiment of the method for determining a direction of a human eye gaze of the human eye gaze detection system, and can achieve the same technical effect, and is not described herein again to avoid repetition.
Each embodiment in the present specification is described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts in each embodiment are referred to each other.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
The method for determining the direction of the human eye sight detection system and the human eye sight detection system provided by the invention are described in detail, specific examples are applied in the text to explain the principle and the implementation mode of the invention, and the description of the examples is only used for helping to understand the method and the core idea of the invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present invention.

Claims (11)

1. A human eye sight direction determination method of a human eye sight detection system, characterized in that the human eye sight detection system comprises: a plurality of cameras, a steering engine pan-tilt, and a laser ranging device arranged on the steering engine pan-tilt, and the method comprises the following steps:
determining the coordinates of the light spots emitted by the laser ranging device in a three-dimensional world coordinate system taking the rotation center of the steering engine holder as the origin of coordinates;
acquiring a plurality of user images acquired by the plurality of cameras to determine coordinates of human eyes in the three-dimensional world coordinate system; wherein, when the plurality of user images are acquired, the captured subject gazes at the light spot;
determining a direction vector of a human eye sight in the three-dimensional world coordinate system based on the light spot coordinate and the human eye coordinate;
wherein, the determining the direction vector of the human eye sight line in the three-dimensional world coordinate system based on the light spot coordinate and the human eye coordinate comprises:
and subtracting the coordinates of the light spots from the coordinates of the human eyes to obtain a direction vector of the sight of the human eyes in the three-dimensional world coordinate system.
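The subtraction step above is a plain vector difference. As an illustrative sketch only (not part of the claims), assuming both points are already expressed in the same three-dimensional world coordinate system, the direction vector can be computed as follows; the spot-minus-eye order is used here so the vector points from the eye toward the gazed spot (the opposite subtraction order merely flips the sign).

```python
import numpy as np

def gaze_direction(eye_xyz, spot_xyz):
    """Unit direction vector from the eye toward the gazed light spot.

    Both inputs are 3-D points in the world coordinate system whose
    origin is the pan-tilt rotation center.
    """
    eye = np.asarray(eye_xyz, dtype=float)
    spot = np.asarray(spot_xyz, dtype=float)
    v = spot - eye                    # eye-to-spot vector
    return v / np.linalg.norm(v)      # normalize to unit length

# Example: eye at (0, 0, 0.5) m, spot at (1, 0, 0.5) m -> gaze along +x
print(gaze_direction([0, 0, 0.5], [1, 0, 0.5]))  # [1. 0. 0.]
```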
2. The method according to claim 1, wherein the steering engine pan-tilt consists of two steering engines whose rotation directions are mutually perpendicular; and wherein determining the coordinates of the light spot emitted by the laser ranging device in the three-dimensional world coordinate system whose origin is the rotation center of the steering engine pan-tilt comprises:
acquiring the rotation angles of the steering engine pan-tilt at the time the laser ranging device emits the light spot;
determining the distance from the light spot to the rotation center of the steering engine pan-tilt;
determining, from the distance and the rotation angles, the coordinates of the light spot in the three-dimensional world coordinate system whose origin is the rotation center of the steering engine pan-tilt.
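One plausible reading of the distance-plus-angles computation above, sketched for illustration only (the axis convention and angle definitions are assumptions, not taken from the patent): with the measured distance and the two servo rotation angles, the light spot coordinates follow from a spherical-to-Cartesian conversion about the pan-tilt rotation center.

```python
import math

def spot_world_coords(distance, pan_rad, tilt_rad):
    """Light spot position in the world frame centered at the pan-tilt
    rotation center, from the laser-ranged distance and two servo angles.

    Axis convention (assumed): pan rotates about z, tilt measures
    elevation from the x-y plane; pan = tilt = 0 aims along +x.
    """
    x = distance * math.cos(tilt_rad) * math.cos(pan_rad)
    y = distance * math.cos(tilt_rad) * math.sin(pan_rad)
    z = distance * math.sin(tilt_rad)
    return (x, y, z)

# Zero angles: the spot lies straight ahead on the +x axis.
print(spot_world_coords(2.0, 0.0, 0.0))  # (2.0, 0.0, 0.0)
```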
3. The method of claim 1, wherein acquiring the plurality of user images captured by the plurality of cameras to determine the coordinates of the human eye in the three-dimensional world coordinate system comprises:
determining the coordinates, in a pixel coordinate system, of a plurality of human eye images within the plurality of user images;
converting the coordinates of the plurality of human eye images in the pixel coordinate system into coordinates of the plurality of human eye images in the three-dimensional world coordinate system;
converting the origin coordinates of the camera coordinate systems of the plurality of cameras into a plurality of camera origin coordinates in the three-dimensional world coordinate system;
determining the coordinates of the human eye in the three-dimensional world coordinate system from the plurality of camera origin coordinates and the coordinates of the corresponding human eye images in the three-dimensional world coordinate system.
4. The method of claim 3, wherein converting the coordinates of the plurality of human eye images in the pixel coordinate system into the coordinates of the plurality of human eye images in the three-dimensional world coordinate system comprises:
converting the coordinates of the plurality of human eye images in the pixel coordinate system into coordinates in the image coordinate system, using a preset conversion relation between the pixel coordinate system and the image coordinate system;
converting the coordinates of the plurality of human eye images in the image coordinate system into coordinates in the camera coordinate system, using a preset conversion relation between the image coordinate system and the camera coordinate system;
converting the coordinates of the plurality of human eye images in the camera coordinate system into coordinates in the three-dimensional world coordinate system, using a preset conversion relation between the camera coordinate system and the three-dimensional world coordinate system.
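The chain of conversions above (pixel → image → camera → world) is the standard pinhole back-projection. A minimal sketch, assuming a 3×3 intrinsic matrix K and camera-to-world extrinsics R, t (these symbols are illustrative; the patent speaks only of preset conversion relations):

```python
import numpy as np

def pixel_to_world_ray(u, v, K, R, t, depth=1.0):
    """Back-project a pixel through the chain:
    pixel -> (normalized) image -> camera -> world coordinates.

    K is the 3x3 camera intrinsic matrix; R (3x3) and t (3,) map
    camera coordinates to the world frame: X_w = R @ X_c + t.
    `depth` is an arbitrary scale, since one pixel only fixes a ray.
    """
    pix = np.array([u, v, 1.0])
    x_cam = depth * (np.linalg.inv(K) @ pix)   # camera coordinates
    return R @ x_cam + t                        # world coordinates

# Identity intrinsics/extrinsics: the principal pixel maps to (0, 0, 1).
K = np.eye(3); R = np.eye(3); t = np.zeros(3)
print(pixel_to_world_ray(0.0, 0.0, K, R, t))  # [0. 0. 1.]
```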
5. The method of claim 3, wherein determining the coordinates of the human eye in the three-dimensional world coordinate system from the plurality of camera origin coordinates and the coordinates of the corresponding human eye images in the three-dimensional world coordinate system further comprises:
determining a plurality of straight lines from the plurality of camera origin coordinates and the coordinates of the corresponding human eye images in the three-dimensional world coordinate system;
calculating the coordinates of the closest point of the plurality of straight lines, the closest point being the point whose total distance to the plurality of straight lines is shortest;
determining the coordinates of the closest point of the plurality of straight lines as the coordinates of the human eye in the three-dimensional world coordinate system.
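The closest-point step above is a small linear least-squares problem: for lines given by points p_i and unit directions d_i, the point x minimizing the summed squared distances satisfies Σ(I − d_i d_iᵀ) x = Σ(I − d_i d_iᵀ) p_i. A sketch under that formulation (the patent does not specify a particular solver):

```python
import numpy as np

def closest_point_to_lines(points, dirs):
    """Least-squares point minimizing the summed squared distance to a
    set of 3-D lines, each given by a point on it and a direction.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(points, dirs):
        d = np.asarray(d, dtype=float)
        d = d / np.linalg.norm(d)
        M = np.eye(3) - np.outer(d, d)   # projector orthogonal to the line
        A += M
        b += M @ np.asarray(p, dtype=float)
    return np.linalg.solve(A, b)

# Two camera rays that intersect at (1, 1, 0):
pts  = [[0, 0, 0], [2, 0, 0]]
dirs = [[1, 1, 0], [-1, 1, 0]]
print(closest_point_to_lines(pts, dirs))  # ~[1. 1. 0.]
```

With noisy, non-intersecting rays (the usual case with real cameras), the same solve returns the point of mutual closest approach rather than an exact intersection.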
6. The method of claim 3, further comprising:
generating a human eye sight image using the direction vector of the human eye sight line in the three-dimensional world coordinate system and the user image.
7. The method according to claim 1, wherein the human eye sight detection system further comprises an upper computer and a steering engine control board, the upper computer being connected to the steering engine control board and to the laser ranging device, and the steering engine pan-tilt being connected to the steering engine control board;
the human eye sight direction determining method of the human eye sight detection system is applied to the upper computer, and the rotation angle of the steering engine pan-tilt is controlled by the steering engine control board.
8. The method of claim 1, further comprising:
acquiring positional relationship information among the plurality of cameras;
determining the conversion relations between the camera coordinate systems of the plurality of cameras and the three-dimensional world coordinate system using the positional relationship information among the plurality of cameras.
9. A human eye sight detection system, comprising: an upper computer, a plurality of cameras, a steering engine pan-tilt, and a laser ranging device arranged on the steering engine pan-tilt, the upper computer comprising:
a spot coordinate determining module, configured to determine the coordinates of a light spot emitted by the laser ranging device in a three-dimensional world coordinate system whose origin is the rotation center of the steering engine pan-tilt;
a human eye coordinate determining module, configured to acquire a plurality of user images captured by the plurality of cameras to determine the coordinates of the human eye in the three-dimensional world coordinate system; wherein the captured subject gazes at the light spot while the plurality of user images are acquired;
a direction vector determining module, configured to determine a direction vector of the human eye sight line in the three-dimensional world coordinate system based on the light spot coordinates and the human eye coordinates;
wherein the direction vector determining module is specifically configured to:
subtract the light spot coordinates from the human eye coordinates to obtain the direction vector of the human eye sight line in the three-dimensional world coordinate system.
10. An apparatus, comprising: a processor, a memory, and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the human eye sight direction determining method of a human eye sight detection system as claimed in any one of claims 1-8.
11. A computer-readable storage medium, on which a computer program is stored, the computer program, when executed by a processor, implementing the steps of the human eye sight direction determining method of a human eye sight detection system as claimed in any one of claims 1-8.
CN201910741288.0A 2019-08-12 2019-08-12 Human eye sight direction determining method and system of human eye sight detection system Active CN110458104B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910741288.0A CN110458104B (en) 2019-08-12 2019-08-12 Human eye sight direction determining method and system of human eye sight detection system


Publications (2)

Publication Number Publication Date
CN110458104A CN110458104A (en) 2019-11-15
CN110458104B true CN110458104B (en) 2021-12-07

Family

ID=68486028

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910741288.0A Active CN110458104B (en) 2019-08-12 2019-08-12 Human eye sight direction determining method and system of human eye sight detection system

Country Status (1)

Country Link
CN (1) CN110458104B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111723716B (en) * 2020-06-11 2024-03-08 深圳地平线机器人科技有限公司 Method, device, system, medium and electronic equipment for determining target object orientation
CN114449319B (en) * 2020-11-04 2024-06-04 深圳Tcl新技术有限公司 Video picture dynamic adjustment method and device, intelligent terminal and storage medium
CN113701710B (en) * 2021-08-31 2024-05-17 高新兴科技集团股份有限公司 Laser spot positioning method, ranging method, medium and equipment applied to security monitoring
CN117928383A (en) * 2024-03-09 2024-04-26 广州泰宣科技有限公司 Image pickup measurement method and system

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4829141B2 (en) * 2007-02-09 2011-12-07 株式会社東芝 Gaze detection apparatus and method
CN102749991B (en) * 2012-04-12 2016-04-27 广东百泰科技有限公司 A kind of contactless free space sight tracing being applicable to man-machine interaction
US9785233B2 (en) * 2014-04-11 2017-10-10 Facebook, Inc. Systems and methods of eye tracking calibration
WO2016142489A1 (en) * 2015-03-11 2016-09-15 SensoMotoric Instruments Gesellschaft für innovative Sensorik mbH Eye tracking using a depth sensor
EP3353630B1 (en) * 2015-09-24 2021-05-26 Tobii AB Eye-tracking enabled wearable devices
JP2018004950A (en) * 2016-07-01 2018-01-11 フォーブ インコーポレーテッド Video display system, video display method, and video display program
CN206161864U (en) * 2016-10-27 2017-05-10 武汉理工大学 3D laser scanner device
CN108229284B (en) * 2017-05-26 2021-04-09 北京市商汤科技开发有限公司 Sight tracking and training method and device, system, electronic equipment and storage medium
WO2019135281A1 (en) * 2018-01-05 2019-07-11 三菱電機株式会社 Line-of-sight direction calibration device, line-of-sight direction calibration method, and line-of-sight direction calibration program
CN108985172A (en) * 2018-06-15 2018-12-11 北京七鑫易维信息技术有限公司 A kind of Eye-controlling focus method, apparatus, equipment and storage medium based on structure light

Also Published As

Publication number Publication date
CN110458104A (en) 2019-11-15

Similar Documents

Publication Publication Date Title
CN110458104B (en) Human eye sight direction determining method and system of human eye sight detection system
EP3665506B1 (en) Apparatus and method for generating a representation of a scene
CN112655024B (en) Image calibration method and device
AU2020417796B2 (en) System and method of capturing and generating panoramic three-dimensional images
CN110572630B (en) Three-dimensional image shooting system, method, device, equipment and storage medium
CN110377148B (en) Computer readable medium, method of training object detection algorithm, and training apparatus
CN106027887B (en) For the method, apparatus and electronic equipment of the rifle ball linkage control of rotating mirror holder
JP2010113720A (en) Method and apparatus for combining range information with optical image
JP2013502558A (en) Generation of depth data based on spatial light patterns
CN207766424U (en) A kind of filming apparatus and imaging device
US9990739B1 (en) Method and device for fisheye camera automatic calibration
CN105306922A (en) Method and device for obtaining depth camera reference diagram
JP2019100995A (en) Measurement image display control unit, measurement image display control method, and program for measurement image display control
US20240179416A1 (en) Systems and methods for capturing and generating panoramic three-dimensional models and images
CN111699412B (en) Method for calculating three-dimensional driving value of three-dimensional numerical driving control instrument by using driving measurement of laser tracking distance meter
CN107659772B (en) 3D image generation method and device and electronic equipment
TWI502162B (en) Twin image guiding-tracking shooting system and method
KR102298047B1 (en) Method of recording digital contents and generating 3D images and apparatus using the same
CN208752459U (en) Movable equipment
JP2005258792A (en) Apparatus, method and program for generating image
KR102373572B1 (en) Surround view monitoring system and method thereof
CN110930460B (en) Full-automatic calibration method and device for structured light 3D vision system
WO2023139988A1 (en) Measurement system, information processing device, measurement method and program
EP4113251A1 (en) Calibration method of a system comprising an eye tracking device and a computing device comprising one or multiple screens
CN116385537A (en) Positioning method and device for augmented reality

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant