CN109040736B - Method, device, equipment and storage medium for calibrating spatial position of human eye


Info

Publication number: CN109040736B
Authority: CN (China)
Prior art keywords: angle, camera, eye, distance, target
Legal status: Active
Application number: CN201810897442.9A
Other languages: Chinese (zh)
Other versions: CN109040736A (en)
Inventor: 夏正国
Current Assignee: Zhangjiagang Kangdexin Optronics Material Co Ltd
Original Assignee: Zhangjiagang Kangdexin Optronics Material Co Ltd
Filing date: 2018-08-08
Application filed by Zhangjiagang Kangdexin Optronics Material Co Ltd
Priority: CN201810897442.9A (2018-08-08); PCT/CN2018/106385 (WO2020029373A1)
Publication: CN109040736A (2018-12-18); granted as CN109040736B (2020-11-03)

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30: Image reproducers
    • H04N13/302: Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/366: Image reproducers using viewer tracking

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Eye Examination Apparatus (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The embodiment of the invention discloses a method, a device, equipment and a storage medium for calibrating the spatial position of human eyes. The method comprises the following steps: determining the pixel distance between a camera and an image plane according to the field angle of the camera and the preset resolution of the image shot by the camera; determining the imaging coordinates of the viewer's left and right eyes in the image plane, within a horizontal plane perpendicular to the image plane; calculating, from the pixel distance and the imaging coordinates, a first angle by which the left eye deviates from the camera's optical axis and a second angle by which the right eye deviates from it; and determining the spatial positions of the left and right eyes according to the first angle, the second angle and the actual interpupillary distance of the left and right eyes. This scheme improves the accuracy of calibrating the spatial position of the human eyes: the eye position is tracked accurately during naked-eye 3D display, so the layout parameters can be adjusted precisely and the naked-eye 3D display effect improved.

Description

Method, device, equipment and storage medium for calibrating spatial position of human eye
Technical Field
The embodiment of the invention relates to the technical field of naked eye 3D, in particular to a method, a device, equipment and a storage medium for calibrating the spatial position of human eyes.
Background
Naked-eye 3D display technology presents a three-dimensional image that an observer can view directly, without wearing special 3D glasses. If no eye tracking is performed and a fixed light distribution is used, the user must search front-to-back and left-to-right for a suitable viewing position before the ideal stereoscopic effect can be seen. At an unsuitable viewing position, light intended for the left eye may enter the right eye; the right eye then sees both the left and right images, which readily produces crosstalk and a poor user experience. The eye position therefore needs to be tracked in real time, and the layout parameters of the display content adjusted according to the collected eye positions.
At present, eye-tracking methods assume that the lines connecting the viewer's left and right eyes to the camera are parallel, and construct a relationship between the first imaging distance, produced on the image plane by the eyes' actual positions, and the second imaging distance, produced when the eyes directly face the camera. The second imaging distance is then computed from the first imaging distance captured in real time, and the spatial positions of the viewer's left and right eyes are determined from the second imaging distance and the viewer's actual interpupillary distance.
However, in practice the lines connecting the viewer's left and right eyes to the camera form a certain angle; they are not parallel. If the calculation assumes parallel lines, the computed spatial positions of the eyes approximate their actual positions only when the viewer is sufficiently far from the display screen. When the viewer is close to the screen, the computed positions of the left and right eyes are inaccurate and deviate from the actual ones. The prior-art scheme for calculating the spatial position of the human eye is therefore of limited applicability.
Disclosure of Invention
The embodiment of the invention provides a method, a device, equipment and a storage medium for calibrating the spatial position of human eyes, which improve the calibration accuracy, help adjust the layout parameters precisely during naked-eye 3D display, and improve the naked-eye 3D display effect.
In a first aspect, an embodiment of the present invention provides a method for calibrating a spatial position of a human eye, where the method includes:
determining a pixel distance between a camera and an image plane according to the field angle of the camera and the preset resolution of an image shot by the camera;
determining imaging coordinates of a left eye and a right eye of a viewer in the image plane, respectively, in a horizontal plane perpendicular to the image plane;
respectively calculating a first angle of a left eye deviating from the optical axis of the camera and a second angle of a right eye deviating from the optical axis of the camera according to the pixel distance and the imaging coordinate;
and determining the spatial positions of the left eye and the right eye according to the first angle, the second angle and the actual interpupillary distance of the left eye and the right eye.
In a second aspect, an embodiment of the present invention further provides a device for calibrating a spatial position of a human eye, where the device includes:
the pixel distance determining module is used for determining the pixel distance between the camera and an image plane according to the field angle of the camera and the preset resolution of an image shot by the camera;
the imaging coordinate determination module is used for determining the imaging coordinates of the left eye and the right eye of the viewer in the image plane in a horizontal plane perpendicular to the image plane;
the angle calculation module is used for respectively calculating a first angle of a left eye deviating from the optical axis of the camera and a second angle of a right eye deviating from the optical axis of the camera according to the pixel distance and the imaging coordinate;
and the spatial position determining module is used for determining the spatial positions of the left eye and the right eye according to the first angle, the second angle and the actual interpupillary distance of the left eye and the right eye.
In a third aspect, an embodiment of the present invention further provides an apparatus, where the apparatus includes:
one or more processors;
a storage device for storing one or more programs,
when the one or more programs are executed by the one or more processors, the one or more processors implement the method for calibrating the spatial position of the human eye provided by any embodiment of the invention.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the method for calibrating the spatial position of the human eye according to any embodiment of the present invention.
According to the technical scheme of the embodiment of the invention, the pixel distance between the camera and the image plane can be determined from the field angle of the camera and the preset resolution of the image shot by the camera; the imaging coordinates of the viewer's left and right eyes in the image plane can be determined in a horizontal plane perpendicular to the image plane; from the pixel distance and the imaging coordinates, a first angle by which the left eye deviates from the camera's optical axis and a second angle by which the right eye deviates from it can be calculated; and the spatial positions of the left and right eyes can then be determined from the first angle, the second angle and the actual interpupillary distance, in agreement with the eyes' actual positions in space. This scheme improves the tracking accuracy of the eye position, so that the layout parameters can be adjusted precisely during naked-eye 3D display and the naked-eye 3D display effect improved.
Drawings
Fig. 1 is a flowchart of a method for calibrating a spatial position of an eye according to an embodiment of the present invention;
fig. 2a is a schematic view of a geometric relationship between a camera and an image plane according to a first embodiment of the present invention;
fig. 2b is a schematic diagram of a geometric relationship between a camera and a viewer according to a first embodiment of the present invention;
fig. 3 is a flowchart of a method for calibrating a spatial position of a human eye according to a second embodiment of the present invention;
fig. 4 is a flowchart of a method for calibrating a spatial position of a human eye according to a third embodiment of the present invention;
fig. 5 is a schematic view of a geometric relationship between the midpoint positions of the two eyes of the viewer in a third direction and the camera according to a third embodiment of the present invention;
fig. 6 is a block diagram of a calibration apparatus for a spatial position of a human eye according to a fourth embodiment of the present invention;
fig. 7 is a schematic structural diagram of an apparatus according to a fifth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a flowchart of a method for calibrating the spatial position of a human eye according to the first embodiment of the present invention. The method may be performed by a device for calibrating the spatial position of a human eye; the device may be implemented in software and/or hardware and may be integrated in a control device that plays the display content. Referring to fig. 1, the method of this embodiment specifically includes:
and S110, determining the pixel distance between the camera and the image plane according to the field angle of the camera and the preset resolution of the image shot by the camera.
The camera is preferably the camera module of the display screen: it may be fixedly mounted on the display screen, is communicatively connected to the control device, and faces the viewing area of the display screen. It should be noted that the cameras in the embodiments of the present invention are all perpendicular to the display screen. If the angle between the camera's optical axis and the display screen is not 90°, the camera must first be calibrated to ensure it is perpendicular to the display screen.
Fig. 2a is a schematic diagram illustrating the geometric relationship between the camera and the image plane according to the first embodiment of the present invention. As shown in fig. 2a, the plane containing AB is the image plane, and A, B are the left and right boundary points of the image. The field angle of the camera is ∠ACB, preferably 60° in this embodiment. The resolution of the image captured by the camera may be preset according to actual requirements, for example 640×480.
Specifically, as shown in fig. 2a, once the resolution of the captured image is determined, the length of AO is known: one half of the number of pixels along the image's length direction. Likewise, once the field angle is determined, ∠ACO, one half of the field angle, is known. From the length of AO and the size of ∠ACO, the length of CO, i.e. the pixel distance between the camera and the image plane, can be determined. For example, if the field angle is 60°, then ∠ACO is 30°; if the image is 640 pixels long, AO is 320 and, accordingly, CO is 554.256.
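As a minimal sketch of this step (assuming the 60° field angle and 640-pixel image width from the example above; the function name is illustrative, not from the patent):

```python
import math

def pixel_distance(fov_deg: float, width_px: int) -> float:
    """Length CO: distance from camera C to the image plane, in pixel units.

    AO = width_px / 2 is half the image width, and angle ACO is half the
    field angle, so CO = AO / tan(fov / 2).
    """
    half_width = width_px / 2.0
    half_fov = math.radians(fov_deg) / 2.0
    return half_width / math.tan(half_fov)

print(pixel_distance(60.0, 640))  # ~554.256, matching the example above
```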
And S120, determining the imaging coordinates of the viewer's left and right eyes in the image plane, in a horizontal plane perpendicular to the image plane.
Because the spatial positions of the viewer's left and right eyes are three-dimensional coordinates, the calculation can project the three-dimensional space into a two-dimensional plane, compute the two-dimensional coordinates first, and then determine the third coordinate from that result.
For example, in this embodiment, the imaging coordinates of the viewer's left and right eyes in the image plane can first be calculated in the horizontal plane perpendicular to the image plane. Fig. 2b is a schematic diagram of the geometric relationship between the camera and the viewer according to the first embodiment of the present invention. As shown in fig. 2b, A, B are the left and right boundary points of the image, and the plane containing EF corresponds to the viewer's actual range of motion. L1 and R1 are the actual positions in space of the viewer's left and right eyes, respectively. The actual interpupillary distance, i.e. the distance between L1 and R1, is typically 6.4 cm. Xl and Xr are the imaging coordinates of the viewer's left and right eyes in the image plane, respectively; their values can be determined by recognizing the eyes in the image captured by the camera.
And S130, respectively calculating a first angle of the left eye deviating from the optical axis of the camera and a second angle of the right eye deviating from the optical axis of the camera according to the pixel distance and the imaging coordinate.
As shown in fig. 2b, from the imaging coordinates Xl and Xr of the left and right eyes in the image plane and the pixel distance CO, a first angle ∠OCXl by which the left eye deviates from the camera's optical axis and a second angle ∠OCXr by which the right eye deviates from it can be calculated.
Specifically, in this embodiment, the midpoint of AB is used as the origin of coordinates for convenience of calculation. When the imaging software places the default origin at the upper-left corner of the image, 320 may be subtracted from each of the identified values of Xl and Xr, moving the origin from the upper-left corner to the imaging center, i.e. the location of point O in fig. 2b. The imaging coordinates of the left and right eyes on the image plane are then (Xl-320) and (Xr-320), respectively.
Since the length of the pixel distance CO was calculated in step S110, the two angles can be obtained by calculating a first quotient of the abscissa in the left eye's imaging coordinates and the pixel distance, and a second quotient of the abscissa in the right eye's imaging coordinates and the pixel distance; the angle corresponding to the arctangent of the first quotient is taken as the first angle ∠OCXl, and the angle corresponding to the arctangent of the second quotient as the second angle ∠OCXr. Specifically, the first angle ∠OCXl and the second angle ∠OCXr can be calculated by the following formulas:
∠OCXl=arctan[OXl/OC]=arctan[(Xl-320)/554];
∠OCXr=arctan[OXr/OC]=arctan[(Xr-320)/554]。
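A sketch of the same computation, reusing pixel_distance from the sketch above (the eye abscissas xl and xr are hypothetical detected values, not from the patent; 320 is half of the 640-pixel image width):

```python
import math

def off_axis_angle(x_px: float, width_px: int, co: float) -> float:
    """Angle (radians) by which an eye deviates from the optical axis.

    Shifting by width_px / 2 moves the origin to the imaging center O;
    the angle is then arctan(abscissa / pixel distance CO).
    """
    return math.atan((x_px - width_px / 2.0) / co)

co = 554.256                      # pixel distance from the example above
xl, xr = 300.0, 356.0             # hypothetical detected eye abscissas (px)
first_angle = off_axis_angle(xl, 640, co)   # ∠OCXl
second_angle = off_axis_angle(xr, 640, co)  # ∠OCXr
```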
and S140, determining the spatial positions of the left eye and the right eye according to the first angle, the second angle and the actual interpupillary distance of the left eye and the right eye.
Illustratively, as shown in fig. 2b, to calculate the spatial positions of the left and right eyes, auxiliary lines CN, CL2, CR2 and CM may be drawn. L2 and R2 are the spatial positions of the left and right eyes when the viewer directly faces the camera, and Q, W are the corresponding left- and right-eye imaging coordinates. N and M are the midpoints of the viewer's two eyes at different positions. In this embodiment, the position of the target midpoint between the left and right eyes is taken as the spatial position of the eyes. As shown in fig. 2b, the coordinates of the target midpoint N in the horizontal plane perpendicular to the image plane can be determined by calculating the distances MN and CM, the two legs of the right triangle ΔCMN in fig. 2b; their values can be obtained from the length of CN, where CN is the height of the isosceles triangle ΔCR1L1. It can be understood that once the first angle, the second angle and the imaging coordinates of the two eyes are known, there are various ways to calculate this height; this embodiment preferably uses the trigonometric relationship between the first angle, the second angle and L1N to calculate MN and CM, i.e. the abscissa and ordinate of N in the horizontal plane perpendicular to the image plane.
Further, the eye spatial position also includes the coordinate of the target midpoint along the vertical direction of the display screen, where the vertical direction is perpendicular to the horizontal plane. From the imaging coordinates of the target midpoint in the image plane, the vertical distance between the midpoint's image and the camera's optical axis is calculated; from this distance, combined with the pixel distance, the angle by which the midpoint's image deviates from the optical axis is obtained. Once the distance CM in fig. 2b is determined, the coordinate of the target midpoint along the vertical direction of the display screen can be calculated from this angle.
According to the technical scheme of the embodiment of the invention, the pixel distance between the camera and the image plane can be determined from the field angle of the camera and the preset resolution of the image shot by the camera; the imaging coordinates of the viewer's left and right eyes in the image plane can be determined in a horizontal plane perpendicular to the image plane; from the pixel distance and the imaging coordinates, a first angle by which the left eye deviates from the camera's optical axis and a second angle by which the right eye deviates from it can be calculated; and the spatial positions of the left and right eyes can then be determined from the first angle, the second angle and the actual interpupillary distance. During naked-eye 3D display, this scheme accurately tracks the actual spatial positions of the viewer's left and right eyes, so that the layout parameters are adjusted precisely and the naked-eye 3D display effect is improved.
Example two
Fig. 3 is a flowchart of a method for calibrating the spatial position of a human eye according to the second embodiment of the present invention, which is optimized on the basis of the first embodiment. In this embodiment, the target midpoint between the left and right eyes is projected into the horizontal plane perpendicular to the image plane, and its abscissa and ordinate in that plane are calculated. Explanations of terms identical or corresponding to those of the embodiment above are omitted. Referring to fig. 3, the method provided in this embodiment includes:
and S210, determining the pixel distance between the camera and the image plane according to the field angle of the camera and the preset resolution of the image shot by the camera.
And S220, determining the imaging coordinates of the viewer's left and right eyes in the image plane, in a horizontal plane perpendicular to the image plane.
And S230, respectively calculating a first angle of the left eye deviating from the optical axis of the camera and a second angle of the right eye deviating from the optical axis of the camera according to the pixel distance and the imaging coordinate.
S240, calculating the linear distance between the target midpoint between the left eye and the right eye and the camera according to the first trigonometric function relation between the first angle, the second angle and the actual interpupillary distance between the left eye and the right eye.
Illustratively, as shown in fig. 2b, the linear distance from the target midpoint between the left and right eyes to the camera is CN, the height of the isosceles triangle ΔCR1L1. The absolute value of the difference between the first angle and the second angle is calculated as the included angle between the first line segment and the second line segment, namely:
∠L1CR1 = |∠OCXl - ∠OCXr|; where the first line segment is the line L1C connecting the camera and the left eye, and the second line segment is the line R1C connecting the camera and the right eye.
Dividing one half of the actual interpupillary distance of the left eye and the right eye by the tangent value of the target included angle to obtain the linear distance between the target midpoint between the left eye and the right eye and the camera; wherein, the target included angle is half of the included angle between the first line segment and the second line segment, and the specific formula is as follows:
CN=3.2/tan(|∠OCXl-∠OCXr|/2),
where, since R1L1 = 6.4 cm, L1N = 3.2 cm.
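A sketch of this step under the same assumptions (6.4 cm interpupillary distance as stated above; angles in radians, as returned by the earlier sketch):

```python
import math

INTERPUPILLARY_CM = 6.4  # typical distance between L1 and R1

def midpoint_distance(first_angle: float, second_angle: float) -> float:
    """Straight-line distance CN (cm) from the camera to the eye midpoint.

    CN is the height of the isosceles triangle C-L1-R1: the apex angle is
    |first - second|, the half-base is L1N = 3.2 cm, so
    CN = 3.2 / tan(apex / 2).
    """
    apex = abs(first_angle - second_angle)
    return (INTERPUPILLARY_CM / 2.0) / math.tan(apex / 2.0)
```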
And S250, determining a first target coordinate of the target midpoint on the horizontal plane according to the straight-line distance and a second trigonometric function relationship between the first angle and the second angle.
For example, as shown in fig. 2b, taking the camera as the origin, the rightward direction as the positive x-axis and the direction perpendicular to the screen as the positive y-axis, the coordinates of the midpoint of the two eyes are (MN, CM). In the right triangle ΔCMN, since the length of CN has already been calculated, the lengths of CM and MN can be obtained once ∠MCN is determined.
For example, ∠MCN may be calculated as follows: one half of the sum of the first angle and the second angle is taken as the third angle, by which the target midpoint deviates from the camera's optical axis within the horizontal plane, namely:
∠MCN=(∠OCXl+∠OCXr)/2。
after the third angle is determined, calculating a first product between the cosine value of the third angle and the linear distance to obtain a first vertical distance (CM) from the target midpoint to the optical axis of the camera, and taking the first vertical distance as the longitudinal coordinate of the target midpoint in the first direction of the horizontal plane; a second product between the sine of the third angle and the linear distance is calculated, the second product being used as the abscissa of the target midpoint in a second direction of the horizontal plane, i.e. MN, wherein the second direction is perpendicular to the first direction.
Specifically, the calculation formulas of CM and MN are as follows:
CM=CN*cos[(∠OCXl+∠OCXr)/2];
MN=CN*sin[(∠OCXl+∠OCXr)/2].
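A sketch of this step, continuing the illustrative helpers above (returns the abscissa MN and ordinate CM of the eye midpoint in the horizontal plane, in cm):

```python
import math

def first_target_coordinate(cn: float, first_angle: float,
                            second_angle: float) -> tuple[float, float]:
    """(MN, CM): coordinates of the eye midpoint in the horizontal plane."""
    third_angle = (first_angle + second_angle) / 2.0  # ∠MCN
    cm = cn * math.cos(third_angle)  # ordinate: depth along the optical axis
    mn = cn * math.sin(third_angle)  # abscissa: lateral offset from the axis
    return mn, cm
```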
On the basis of the above technical solution, the spatial positions of the viewer's left and right eyes consist of the coordinates in the first and second directions of the horizontal plane and the coordinate in the third direction, along the display screen. This embodiment mainly calculates the ordinate in the first direction and the abscissa in the second direction. Because the angle between the camera and the lines to the viewer's left and right eyes is treated here as it actually is, rather than approximated as 0°, the calculation provided by this embodiment is more accurate than the prior-art approach, which computes the viewer's horizontal-plane coordinates under the assumption that the lines connecting the viewer's eyes to the camera are parallel.
EXAMPLE III
Fig. 4 is a flowchart of a method for calibrating the spatial position of a human eye according to the third embodiment of the present invention, optimized on the basis of the foregoing embodiments. It mainly calculates the second target coordinate of the target midpoint between the left and right eyes in a third direction, the vertical direction of the display screen, which is perpendicular to the horizontal plane discussed above. Explanations of terms identical or corresponding to those of the embodiments above are omitted. Referring to fig. 4, the method provided in this embodiment includes:
and S310, determining the pixel distance between the camera and the image plane according to the field angle of the camera and the preset resolution of the image shot by the camera.
And S320, determining the imaging coordinates of the viewer's left and right eyes in the image plane, in a horizontal plane perpendicular to the image plane.
S330, respectively calculating a first angle of the left eye deviating from the optical axis of the camera and a second angle of the right eye deviating from the optical axis of the camera according to the pixel distance and the imaging coordinate.
And S340, calculating the linear distance between the target midpoint between the left eye and the right eye and the camera according to the first trigonometric function relation between the first angle, the second angle and the actual interpupillary distance of the left eye and the right eye.
And S350, determining a first target coordinate of the target midpoint on the horizontal plane according to the straight-line distance and a second trigonometric function relationship between the first angle and the second angle.
And S360, taking the center point of the image plane as a reference point, and taking the direction through the reference point perpendicular to the image plane as the direction of the camera's optical axis. In the image plane, the second vertical distance, along the vertical direction, from the target midpoint to the camera's optical axis is determined.
Illustratively, the second vertical distance may be determined by identifying an image containing both eyes of the viewer.
And S370, calculating a second target coordinate according to a third trigonometric function relation between the second vertical distance and the ordinate of the first target coordinate.
Illustratively, since the second vertical distance lies in the image plane, the tangent of a fourth angle, by which the target midpoint deviates from the camera's optical axis, is obtained as the quotient of the second vertical distance and the pixel distance. The same tangent value holds in the spatial coordinate system. Fig. 5 is a schematic diagram of the geometric relationship between the midpoint of the viewer's two eyes in the third direction and the camera according to the third embodiment of the present invention. As shown in fig. 5, C is the camera position, N is the target midpoint of the left and right eyes, and the second target coordinate is the distance GN. In this embodiment the fourth angle is ∠GCN, whose tangent is GN/CG. Since CG equals the ordinate of the first target coordinate, the second target coordinate, i.e. GN, can be obtained as the third product of the tangent of the fourth angle and the ordinate of the first target coordinate, with the specific formula:
GN=CG*tan(∠GCN)。
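A sketch of this last step (y_px, the midpoint's vertical image offset in pixels, is an assumed output of the eye detection and its name is illustrative; co is the pixel distance from S310 and cm is the ordinate CG of the first target coordinate):

```python
def second_target_coordinate(y_px: float, co: float, cm: float) -> float:
    """Vertical coordinate GN of the eye midpoint.

    tan(∠GCN) = y_px / co in the image plane, and the same angle holds in
    space, so GN = CG * tan(∠GCN) with CG = cm.
    """
    return cm * (y_px / co)
```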
according to the technical scheme of the embodiment, after the second target coordinate is determined, the space actual positions of the left eye and the right eye of the viewer can be determined through the first target coordinate and the second target coordinate, and by adopting the scheme provided by the embodiment, high tracking accuracy can be obtained, and the method has a good effect on a display system for tracking naked eyes by human eyes for 3 d.
Example four
Fig. 6 is a block diagram of a device for calibrating the spatial position of a human eye according to the fourth embodiment of the present invention. As shown in fig. 6, the device includes: a pixel distance determination module 410, an imaging coordinate determination module 420, an angle calculation module 430, and a spatial position determination module 440. Wherein:
a pixel distance determining module 410, configured to determine a pixel distance between a camera and an image plane according to a field angle of the camera and a preset resolution of an image captured by the camera;
an imaging coordinate determination module 420 for determining the imaging coordinates of the left eye and the right eye of the viewer in the image plane respectively in a horizontal plane perpendicular to the image plane;
the angle calculation module 430 is configured to calculate a first angle at which the left eye deviates from the optical axis of the camera and a second angle at which the right eye deviates from the optical axis of the camera according to the pixel distance and the imaging coordinate;
and a spatial position determining module 440, configured to determine spatial positions of the left eye and the right eye according to the first angle, the second angle, and an actual interpupillary distance between the left eye and the right eye.
According to the technical scheme of the embodiment of the invention, the pixel distance between the camera and the image plane can be determined from the field angle of the camera and the preset resolution of the image shot by the camera; the imaging coordinates of the viewer's left and right eyes in the image plane can be determined in a horizontal plane perpendicular to the image plane; from the pixel distance and the imaging coordinates, a first angle by which the left eye deviates from the camera's optical axis and a second angle by which the right eye deviates from it can be calculated; and the spatial positions of the left and right eyes can then be determined from the first angle, the second angle and the actual interpupillary distance. During naked-eye 3D display, this scheme accurately tracks the actual spatial positions of the viewer's left and right eyes, so that the layout parameters are adjusted precisely and the naked-eye 3D display effect is improved.
On the basis of the foregoing embodiment, the angle calculating module 430 is specifically configured to:
calculating a first quotient value of an abscissa in the imaging coordinates of the left eye and the pixel distance and a second quotient value of an abscissa in the imaging coordinates of the right eye and the pixel distance;
and taking an angle corresponding to the arctangent value of the first quotient as a first angle, and taking an angle corresponding to the arctangent value of the second quotient as a second angle.
On the basis of the above embodiment, the spatial position determining module 440 includes:
the linear distance calculation unit is used for calculating the linear distance between the target midpoint between the left eye and the right eye and the camera according to a first trigonometric function relation among the first angle, the second angle and the actual interpupillary distance of the left eye and the right eye;
and the first target coordinate determination unit is used for determining a first target coordinate of the target midpoint on the horizontal plane according to the straight-line distance and a second trigonometric function relationship between the first angle and the second angle.
On the basis of the foregoing embodiment, the straight-line distance calculating unit is specifically configured to:
calculating the absolute value of the difference value between the first angle and the second angle to be used as the included angle between the first line segment and the second line segment; the first line segment is a connection line between the camera and the left eye, and the second line segment is a connection line between the camera and the right eye;
dividing one half of the actual interpupillary distance of the left eye and the right eye by the tangent value of the target included angle to obtain the linear distance between the target midpoint between the left eye and the right eye and the camera; and the target included angle is half of the included angle between the first line segment and the second line segment.
On the basis of the foregoing embodiment, the first target coordinate determination unit is specifically configured to:
taking one half of the sum of the first angle and the second angle as a third angle of the target midpoint deviating from the optical axis of the camera in the horizontal plane;
calculating a first product between the cosine value of the third angle and the linear distance to obtain a first vertical distance between the target midpoint and the optical axis of the camera, and taking the first vertical distance as a vertical coordinate of the target midpoint in the first direction of the horizontal plane;
calculating a second product between the sine of the third angle and the linear distance, the second product being taken as the abscissa of the target midpoint in a second direction of the horizontal plane, wherein the second direction is perpendicular to the first direction.
On the basis of the above embodiment, the spatial position further includes a second target coordinate of the target midpoint in a third direction, where the third direction is a vertical direction in which the display screen is located, and the vertical direction is perpendicular to the horizontal plane;
correspondingly, the device further comprises:
the second vertical distance determining module is used for taking the center point of the image plane as a reference point and taking the direction which is vertical to the image plane through the reference point as the optical axis direction of the camera; determining a second vertical distance from the target midpoint to the camera optical axis in the vertical direction in the image plane;
and the second target coordinate calculation module is used for calculating a second target coordinate according to a third trigonometric function relation between the second vertical distance and the ordinate of the first target coordinate.
On the basis of the foregoing embodiment, the second target coordinate calculation module is specifically configured to:
the pixel distance and the second vertical distance are subjected to quotient operation to obtain a tangent value of a fourth angle of the target midpoint deviating from the optical axis of the camera on the image plane;
and calculating a third product of the tangent value of the fourth angle and the ordinate of the first target coordinate, and taking the third product as a second target coordinate.
The calibration device for the spatial position of the human eye provided by the embodiment of the invention can execute the calibration method for the spatial position of the human eye provided by any embodiment of the invention, and has corresponding functional modules and beneficial effects of the execution method. Technical details that are not described in detail in the above embodiments may be referred to a method for calibrating a spatial position of a human eye according to any embodiment of the present invention.
EXAMPLE five
Fig. 7 is a schematic structural diagram of an apparatus according to a fifth embodiment of the present invention. Fig. 7 illustrates a block diagram of an exemplary device 12 suitable for use in implementing embodiments of the present invention. The device 12 shown in fig. 7 is only an example and should not bring any limitation to the function and scope of use of the embodiments of the present invention.
As shown in FIG. 7, device 12 is in the form of a general purpose computing device. The components of device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including the system memory 28 and the processing unit 16.
Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
Device 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM)30 and/or cache memory 32. Device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 7, and commonly referred to as a "hard drive"). Although not shown in FIG. 7, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. Memory 28 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 40 having a set (at least one) of program modules 42 may be stored, for example, in memory 28, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. Program modules 42 generally carry out the functions and/or methodologies of the described embodiments of the invention.
Device 12 may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), with one or more devices that enable a user to interact with device 12, and/or with any devices (e.g., network card, modem, etc.) that enable device 12 to communicate with one or more other computing devices. Such communication may be through an input/output (I/O) interface 22. Also, the device 12 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet) via the network adapter 20. As shown, the network adapter 20 communicates with the other modules of the device 12 via the bus 18. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with device 12, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processing unit 16 executes various functional applications and data processing by executing programs stored in the system memory 28, for example, to implement a method for calibrating a spatial position of a human eye according to any embodiment of the present invention, the method including:
determining a pixel distance between a camera and an image plane according to the field angle of the camera and the preset resolution of an image shot by the camera;
determining imaging coordinates of a left eye and a right eye of a viewer in the image plane, respectively, in a horizontal plane perpendicular to the image plane;
respectively calculating a first angle of a left eye deviating from the optical axis of the camera and a second angle of a right eye deviating from the optical axis of the camera according to the pixel distance and the imaging coordinate;
and determining the spatial positions of the left eye and the right eye according to the first angle, the second angle and the actual interpupillary distance of the left eye and the right eye.
EXAMPLE six
An embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements a method for calibrating a spatial position of a human eye according to any embodiment of the present invention, where the method includes:
determining a pixel distance between a camera and an image plane according to the field angle of the camera and the preset resolution of an image shot by the camera;
determining imaging coordinates of a left eye and a right eye of a viewer in the image plane, respectively, in a horizontal plane perpendicular to the image plane;
respectively calculating a first angle of a left eye deviating from the optical axis of the camera and a second angle of a right eye deviating from the optical axis of the camera according to the pixel distance and the imaging coordinate;
and determining the spatial positions of the left eye and the right eye according to the first angle, the second angle and the actual interpupillary distance of the left eye and the right eye.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk or C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. A calibration method of a human eye space position is applied to a display screen provided with a camera, and is characterized by comprising the following steps:
determining a pixel distance between a camera and an image plane according to the field angle of the camera and the preset resolution of an image shot by the camera;
determining imaging coordinates of a left eye and a right eye of a viewer in the image plane, respectively, in a horizontal plane perpendicular to the image plane;
respectively calculating a first angle of a left eye deviating from the optical axis of the camera and a second angle of a right eye deviating from the optical axis of the camera according to the pixel distance and the imaging coordinate;
and determining the spatial positions of the left eye and the right eye according to the first angle, the second angle and the actual interpupillary distance of the left eye and the right eye.
2. The method of claim 1, wherein calculating a first angle of the left eye from the optical axis of the camera and a second angle of the right eye from the optical axis of the camera based on the pixel distance and the imaging coordinates, respectively, comprises:
calculating a first quotient value of an abscissa in the imaging coordinates of the left eye and the pixel distance and a second quotient value of an abscissa in the imaging coordinates of the right eye and the pixel distance;
and taking an angle corresponding to the arctangent value of the first quotient as a first angle, and taking an angle corresponding to the arctangent value of the second quotient as a second angle.
3. The method of claim 1, wherein determining the spatial positions of the left and right eyes from the first angle, the second angle, and the actual interpupillary distance of the left and right eyes comprises:
calculating the linear distance between the target midpoint between the left eye and the right eye and the camera according to the first trigonometric function relationship among the first angle, the second angle and the actual interpupillary distance of the left eye and the right eye;
and determining a first target coordinate of the target midpoint on the horizontal plane according to the straight-line distance and a second trigonometric function relationship between the first angle and the second angle.
4. The method of claim 3, wherein calculating a linear distance of a target midpoint between the left and right eyes from the camera based on a first trigonometric relationship between the first angle, the second angle, and the actual interpupillary distance of the left and right eyes comprises:
calculating the absolute value of the difference value between the first angle and the second angle to be used as the included angle between the first line segment and the second line segment; the first line segment is a connection line between the camera and the left eye, and the second line segment is a connection line between the camera and the right eye;
dividing one half of the actual interpupillary distance of the left eye and the right eye by the tangent value of the target included angle to obtain the linear distance between the target midpoint between the left eye and the right eye and the camera; and the target included angle is half of the included angle between the first line segment and the second line segment.
5. The method of claim 3, wherein determining a first target coordinate of the target midpoint in the horizontal plane based on a second trigonometric relationship between the linear distance, the first angle, and the second angle comprises:
taking one half of the sum of the first angle and the second angle as a third angle of the target midpoint deviating from the optical axis of the camera in the horizontal plane;
calculating a first product between the cosine value of the third angle and the linear distance to obtain a first vertical distance between the target midpoint and the optical axis of the camera, and taking the first vertical distance as a vertical coordinate of the target midpoint in the first direction of the horizontal plane;
calculating a second product between the sine of the third angle and the linear distance, the second product being taken as the abscissa of the target midpoint in a second direction of the horizontal plane, wherein the second direction is perpendicular to the first direction.
6. The method of claim 3, wherein the spatial location further comprises second target coordinates of the target midpoint in a third direction, wherein the third direction is a vertical direction in which the display screen is located, the vertical direction being perpendicular to the horizontal plane;
correspondingly, the method further comprises the following steps:
taking the center point of the image plane as a reference point, and taking the direction which is vertical to the image plane through the reference point as the direction of the optical axis of the camera;
determining a second vertical distance from the target midpoint to the camera optical axis in the vertical direction in the image plane;
and calculating the second target coordinate according to a third trigonometric function relation between the second vertical distance and the vertical coordinate of the first target coordinate.
7. The method of claim 6, wherein calculating the second target coordinate based on a third trigonometric relationship between the second perpendicular distance and the ordinate of the first target coordinate comprises:
the pixel distance and the second vertical distance are subjected to quotient operation to obtain a tangent value of a fourth angle of the target midpoint deviating from the optical axis of the camera on the image plane;
and calculating a third product of the tangent value of the fourth angle and the ordinate of the first target coordinate, and taking the third product as a second target coordinate.
8. A calibration device for the spatial position of a human eye is characterized by comprising:
the pixel distance determining module is used for determining the pixel distance between the camera and an image plane according to the field angle of the camera and the preset resolution of an image shot by the camera;
the imaging coordinate determination module is used for determining the imaging coordinates of the left eye and the right eye of the viewer in the image plane in a horizontal plane perpendicular to the image plane;
the angle calculation module is used for respectively calculating a first angle of a left eye deviating from the optical axis of the camera and a second angle of a right eye deviating from the optical axis of the camera according to the pixel distance and the imaging coordinate;
and the spatial position determining module is used for determining the spatial positions of the left eye and the right eye according to the first angle, the second angle and the actual interpupillary distance of the left eye and the right eye.
9. An apparatus for calibrating the spatial position of a human eye, the apparatus comprising:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method for calibrating the spatial position of a human eye as claimed in any one of claims 1 to 7.
10. A computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the method for calibrating the spatial position of a human eye as claimed in any one of claims 1 to 7.
CN201810897442.9A 2018-08-08 2018-08-08 Method, device, equipment and storage medium for calibrating spatial position of human eye Active CN109040736B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201810897442.9A CN109040736B (en) 2018-08-08 2018-08-08 Method, device, equipment and storage medium for calibrating spatial position of human eye
PCT/CN2018/106385 WO2020029373A1 (en) 2018-08-08 2018-09-19 Method, apparatus and device for determining spatial positions of human eyes, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810897442.9A CN109040736B (en) 2018-08-08 2018-08-08 Method, device, equipment and storage medium for calibrating spatial position of human eye

Publications (2)

Publication Number Publication Date
CN109040736A (en) 2018-12-18
CN109040736B (en) 2020-11-03

Family

ID=64632373

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810897442.9A Active CN109040736B (en) 2018-08-08 2018-08-08 Method, device, equipment and storage medium for calibrating spatial position of human eye

Country Status (2)

Country Link
CN (1) CN109040736B (en)
WO (1) WO2020029373A1 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110300297A (en) * 2019-06-04 2019-10-01 宁波视睿迪光电有限公司 The image adjusting method and device of teaching demonstration
CN110308790A (en) * 2019-06-04 2019-10-08 宁波视睿迪光电有限公司 The image adjusting device and system of teaching demonstration
CN110969984A (en) * 2019-12-02 2020-04-07 英华达(上海)科技有限公司 Dynamic brightness adjusting method, system, equipment and storage medium of display equipment
CN111652959B (en) 2020-05-29 2022-01-18 京东方科技集团股份有限公司 Image processing method, near-to-eye display device, computer device, and storage medium
CN111860292B (en) * 2020-07-16 2024-06-07 科大讯飞股份有限公司 Monocular camera-based human eye positioning method, device and equipment
CN111885367A (en) * 2020-07-20 2020-11-03 上海青研科技有限公司 Display device and application method
CN111918052B (en) * 2020-08-14 2022-08-23 广东申义实业投资有限公司 Vertical rotary control device and method for converting plane picture into 3D image
CN112040316B (en) * 2020-08-26 2022-05-20 深圳创维-Rgb电子有限公司 Video image display method, device, multimedia equipment and storage medium
CN113534490B (en) * 2021-07-29 2023-07-18 深圳市创鑫未来科技有限公司 Stereoscopic display device and stereoscopic display method based on user eyeball tracking
CN114390267A (en) * 2022-01-11 2022-04-22 宁波视睿迪光电有限公司 Method and device for synthesizing stereo image data, electronic equipment and storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014055689A1 (en) * 2012-10-02 2014-04-10 Reald Inc. Stepped waveguide autostereoscopic display apparatus with a reflective directional element
CN104503092B (en) * 2014-11-28 2018-04-10 深圳市魔眼科技有限公司 Different angle and apart from adaptive 3 D displaying method and equipment
EP3040759A1 (en) * 2014-12-30 2016-07-06 Shenzhen Estar Technology Group Co., Ltd 3D display apparatus and dynamic grating
US10346950B2 (en) * 2016-10-05 2019-07-09 Hidden Path Entertainment, Inc. System and method of capturing and rendering a stereoscopic panorama using a depth buffer

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104683786A (en) * 2015-02-28 2015-06-03 上海玮舟微电子科技有限公司 Human eye tracking method and device of naked eye 3D equipment
CN106713894A (en) * 2015-11-17 2017-05-24 深圳超多维光电子有限公司 Tracking stereo display method and device
CN108234994A (en) * 2017-12-29 2018-06-29 上海玮舟微电子科技有限公司 A kind of position of human eye determines method and device

Also Published As

Publication number Publication date
CN109040736A (en) 2018-12-18
WO2020029373A1 (en) 2020-02-13

Similar Documents

Publication Publication Date Title
CN109040736B (en) Method, device, equipment and storage medium for calibrating spatial position of human eye
US20190012804A1 (en) Methods and apparatuses for panoramic image processing
CN108989785B (en) Naked eye 3D display method, device, terminal and medium based on human eye tracking
US20080050042A1 (en) Hardware-in-the-loop simulation system and method for computer vision
JP6008397B2 (en) AR system using optical see-through HMD
CN106815869B (en) Optical center determining method and device of fisheye camera
US8155387B2 (en) Method and system for position determination using image deformation
WO2020019548A1 (en) Glasses-free 3d display method and apparatus based on human eye tracking, and device and medium
WO2017080280A1 (en) Depth image composition method and apparatus
Fathi et al. Multistep explicit stereo camera calibration approach to improve Euclidean accuracy of large-scale 3D reconstruction
Yao et al. A flexible calibration approach for cameras with double-sided telecentric lenses
US20220405968A1 (en) Method, apparatus and system for image processing
Jiang et al. An accurate and flexible technique for camera calibration
CN113793387A (en) Calibration method, device and terminal of monocular speckle structured light system
CN113496503B (en) Point cloud data generation and real-time display method, device, equipment and medium
CN113763478A (en) Unmanned vehicle camera calibration method, device, equipment, storage medium and system
CN113689508B (en) Point cloud labeling method and device, storage medium and electronic equipment
Chen et al. A closed-form solution to single underwater camera calibration using triple wavelength dispersion and its application to single camera 3D reconstruction
US20180276879A1 (en) Finite aperture omni-directional stereo light transport
Wang et al. An onsite inspection sensor for the formation of hull plates based on active binocular stereovision
Shim et al. Performance evaluation of time-of-flight and structured light depth sensors in radiometric/geometric variations
CN109982074B (en) Method and device for obtaining inclination angle of TOF module and assembling method
CN110415196A (en) Method for correcting image, device, electronic equipment and readable storage medium storing program for executing
Córdova-Esparza et al. A panoramic 3D reconstruction system based on the projection of patterns
CN115267251A (en) Stereoscopic particle image speed measuring method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20200408

Address after: 215634 north side of Chengang road and west side of Ganghua Road, Jiangsu environmental protection new material industrial park, Zhangjiagang City, Suzhou City, Jiangsu Province

Applicant after: ZHANGJIAGANG KANGDE XIN OPTRONICS MATERIAL Co.,Ltd.

Address before: 201203, room 5, building 690, No. 202 blue wave road, Zhangjiang hi tech park, Shanghai, Pudong New Area

Applicant before: WZ TECHNOLOGY Inc.

GR01 Patent grant