CN108259887B - Method and device for calibrating a fixation point, and method and device for determining a fixation point - Google Patents

Publication number: CN108259887B (other version: CN108259887A)
Application number: CN201810330258.6A
Authority: CN (China)
Original language: Chinese (zh)
Legal status: Active
Inventors: 赵静, 韦海成
Assignees: North Minzu University, Ningxia University
Application filed by North Minzu University and Ningxia University; priority to CN201810330258.6A

Landscapes

  • Eye Examination Apparatus (AREA)

Abstract

The invention provides a fixation point calibration method and device, and a fixation point determination method and device. The calibration method uses multiple groups of binocular disparity images; the left-eye image and the right-eye image in each group are obtained from the same scene, and each scene contains a calibration point with known coordinates in the scene coordinate system. During calibration, a plurality of left-eye images are played to the left eye in sequence and the left-eye fixation point is calibrated, yielding a left-eye mapping relation between left-eye features and calibration-point coordinates; a plurality of right-eye images are likewise played to the right eye in sequence and the right-eye fixation point is calibrated, yielding a right-eye mapping relation between right-eye features and calibration-point coordinates. Because the binocular disparity images are stereo images, calibrating the left eye and the right eye separately with the left-eye and right-eye images produces mapping relations that match the stereo images with higher precision.

Description

Method and device for calibrating a fixation point, and method and device for determining a fixation point
Technical Field
The invention relates to the technical field of visual tracking, and in particular to a fixation point calibration method and device and a fixation point determination method and device.
Background
The eye movement analyzer is an important instrument for current virtual reality, hot-spot capture and detection of everyday human behavior, and is widely used. Before an eye movement analyzer can be used to determine a fixation point, it must first be calibrated. The fixation point is the point on an object with which the line of sight is aligned during visual perception. Calibration means sequentially displaying calibration points with known coordinates on a plane, photographing the eye features with an eye camera while the eye observes each calibration point, and finding the mapping relation between the eye features and the calibration-point coordinates. Determining the fixation point means that, when the eye movement analyzer is in use, the position of the fixation point on the plane is computed from the eye features photographed by the eye camera and the mapping relation determined during calibration, giving the fixation-point coordinates.
However, current calibration methods do not take human visual characteristics into account: they use a single planar image for calibration, ignore the difference between a person's left and right vision, and therefore cannot accurately reflect the correspondence between human vision and actual objects, which limits them.
As shown in FIG. 1a, when an object is seen by the left and right eyes simultaneously, its position in each eye's view differs because the two eyes see it from different angles. As shown in FIG. 1b, the human visual field is also limited, theoretically a sector of approximately 240 degrees; that is, a normal person has no stereoscopic perception to the side or behind, and the range of stereoscopic perception is only about 150 degrees in front of the eyes. As shown in FIG. 1c, the image actually observed by a person is not a complete 360-degree circular area but a roughly heart-shaped area enclosed by the dotted line in the figure.
In summary, the image a person observes overlaps the actual image only in part of the spatial region, and the images observed by the two eyes do not completely overlap. The conventional approach of using a single planar image, whether the square region of a rectangular coordinate system or the circular image of a polar coordinate system, therefore cannot find an accurate mapping relation and cannot mark the fixation point completely and accurately.
Disclosure of Invention
The invention aims to provide a fixation point calibration method and device and a fixation point determination method and device, which calibrate the fixation points of the left and right eyes separately based on binocular vision, find the mapping relations between the eye features of each eye and the calibration-point coordinates separately, and thereby improve the accuracy of fixation-point determination.
In order to achieve the above object, the embodiments of the present invention provide the following technical solutions:
In one aspect, an embodiment of the present invention provides a method for calibrating a fixation point. The method uses multiple groups of binocular disparity images; each group includes a left-eye image and a right-eye image obtained from the same scene, and each scene contains a point with known coordinates in the scene coordinate system, which serves as the calibration point.
the method for calibrating the fixation point specifically comprises the following steps:
acquiring the eye features of the left eye while the left eye of a tester gazes at the calibration point in each left-eye image, and, from the coordinates of the calibration points and the acquired left-eye features, obtaining a mapping relation between the left-eye features and the calibration-point coordinates as the left-eye mapping relation;
and acquiring the eye features of the right eye while the right eye of the tester gazes at the calibration point in each right-eye image, and, from the coordinates of the calibration points and the acquired right-eye features, obtaining a mapping relation between the right-eye features and the calibration-point coordinates as the right-eye mapping relation.
In another aspect, an embodiment of the present invention provides a fixation point calibration apparatus, including:
the virtual reality VR system is used for respectively displaying the left eye image and the right eye image of the binocular parallax image to the left eye and the right eye of a tester;
the eye feature acquisition device is used for acquiring eye feature pictures of the tester's left eye during left-eye fixation-point calibration and eye feature pictures of the tester's right eye during right-eye fixation-point calibration; and
the central control module is used for calculating a left-eye mapping relation according to the coordinates of the calibration points in the scene coordinate system and the left-eye characteristic pictures; and calculating a right eye mapping relation according to the coordinates of the calibration points in the scene coordinate system and the right eye feature pictures.
Optionally, the fixation point calibration device further comprises a binocular camera device for acquiring the binocular disparity images.
Optionally, the fixation point calibration device further comprises a communication module, through which the central control module can receive binocular disparity images transmitted remotely by the binocular camera device and pass them to the virtual reality VR system.
In another aspect, an embodiment of the present invention provides a method for determining a fixation point, which specifically comprises:
acquiring the left-eye and right-eye features of a tester while the tester's left eye and right eye simultaneously observe a group of binocular disparity images, wherein the left-eye and right-eye images of the binocular disparity images are acquired from the same scene, and the scene has a scene coordinate system;
calculating the coordinates of the left-eye fixation point in the scene coordinate system according to the left-eye features and the left-eye mapping relation;
calculating the coordinates of the right-eye fixation point in the scene coordinate system according to the right-eye features and the right-eye mapping relation;
and determining, from these two coordinates, that the coordinates of the tester's fixation point in the scene coordinate system lie between the left-eye fixation-point coordinates and the right-eye fixation-point coordinates.
In another aspect, an embodiment of the present invention provides a fixation point determination apparatus, including:
the virtual reality VR system is used for respectively displaying the left eye image and the right eye image of the binocular parallax image to the left eye and the right eye of the tester;
the eye feature acquisition device is used for acquiring an eye feature picture of the left eye while the left eye observes the left-eye image, and an eye feature picture of the right eye while the right eye observes the right-eye image; and
the central control module is used for calculating the coordinates of the left-eye fixation point in the scene coordinate system according to the left-eye features and the left-eye mapping relation, calculating the coordinates of the right-eye fixation point in the scene coordinate system according to the right-eye features and the right-eye mapping relation, and determining that the coordinates of the tester's fixation point in the scene coordinate system lie between the left-eye and right-eye fixation-point coordinates.
Optionally, the fixation point determination device further comprises a binocular camera device for acquiring the binocular disparity images.
Optionally, the fixation point determination device further comprises a communication module, through which the central control module can receive binocular disparity images transmitted remotely by the binocular camera device and pass them to the virtual reality VR system.
Compared with the prior art, the invention is beneficial in that, because the binocular disparity images are stereo images, calibrating the left eye and the right eye separately with the left-eye and right-eye images yields left-eye and right-eye mapping relations that match the stereo images with higher precision. When these mapping relations are then used to determine the fixation point, the determination error is smaller.
Drawings
To more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required in the embodiments are briefly described below. The following drawings illustrate only some embodiments of the invention and should not be considered limiting; those skilled in the art can obtain other relevant drawings from them without inventive effort.
Fig. 1a shows a reference picture relating to the background art.
Fig. 1b shows a reference picture relating to the background art.
Fig. 1c shows a reference picture relating to the background art.
FIG. 2 is a flowchart illustrating a method for calibrating a gaze point according to a preferred embodiment of the present invention.
Fig. 3 is a schematic diagram of a shooting method of a binocular camera device in a real object scene.
FIG. 4 is a schematic diagram of a scene and calibration points.
Fig. 5 is a schematic diagram of another scene and its calibration points.
Fig. 6 is a schematic structural diagram of a gazing point calibration apparatus provided in an embodiment.
Fig. 7 is a schematic structural diagram of another fixation point calibration device provided in an embodiment.
Fig. 8 is a schematic structural diagram of another fixation point calibration device provided in an embodiment.
Fig. 9 is a schematic structural diagram of a left eye feature acquisition device provided in the embodiment.
Fig. 10 is a flowchart illustrating a method for determining a fixation point according to a preferred embodiment of the present invention.
Fig. 11a is a schematic diagram of a method for determining the binocular fixation point provided in an embodiment.
Fig. 11b is a schematic diagram of another method for determining the binocular fixation point provided in an embodiment.
The reference numbers in the figures illustrate:
10 - infrared light source, 20 - eye camera, 30 - first optical filter, 40 - second optical filter.
Detailed Description
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the invention without inventive step, are within the scope of the invention.
Referring to fig. 2, the present embodiment provides a method for calibrating a fixation point.
The fixation point calibration method uses multiple groups of binocular disparity images. Each group comprises a left-eye image and a right-eye image obtained from the same scene; each scene contains a point with known coordinates in the scene coordinate system, which serves as the calibration point.
the method for calibrating the fixation point specifically comprises the following steps:
acquiring the eye features of the left eye while the left eye of a tester gazes at the calibration point in each left-eye image, and, from the coordinates of the calibration points and the acquired left-eye features, obtaining a mapping relation between the left-eye features and the calibration-point coordinates as the left-eye mapping relation;
and acquiring the eye features of the right eye while the right eye of the tester gazes at the calibration point in each right-eye image, and, from the coordinates of the calibration points and the acquired right-eye features, obtaining a mapping relation between the right-eye features and the calibration-point coordinates as the right-eye mapping relation.
Based on the above method, specific embodiments are given below; where not contradictory, they can be combined with one another to form a specific method.
For example, referring to FIG. 3, the scene may be a physical scene, and the left-eye and right-eye images of the binocular disparity images may be captured from it with a binocular camera device. The position of the binocular camera device in the scene coordinate system is fixed while the image sets are captured, and the calibration point jumps to new positions in the scene coordinate system under human control; the coordinates of the calibration point are known after each jump. For each jump, a pair of images comprising a left-eye image and a right-eye image is captured, and the calibration point should fall into both. FIG. 3 shows several sets of shots, each corresponding to one jump.
For example, the scene may instead be a virtual scene synthesized using existing virtual-reality graphics generation techniques. The left-eye and right-eye images of the binocular disparity images are then generated directly by computer. The coordinates of the calibration point corresponding to each group of binocular disparity images in the virtual scene's coordinate system are known, and the calibration point should fall into both the left-eye image and the right-eye image.
For example, referring to FIG. 4, whether the scene is a physical scene or a virtual scene, it may include a plane evenly divided into cells. The calibration points have different coordinates in the scene coordinate system, lie in the plane, and are located at the vertices of the cells.
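As a minimal sketch of this layout (the function name `calibration_grid` and the plane dimensions are hypothetical, not from the patent), the vertex coordinates of an evenly divided plane can be enumerated as candidate calibration-point positions:

```python
def calibration_grid(width, height, cols, rows):
    """Vertices of a plane evenly divided into cols x rows cells,
    usable as calibration-point coordinates in the scene coordinate system."""
    dx, dy = width / cols, height / rows
    return [(c * dx, r * dy) for r in range(rows + 1) for c in range(cols + 1)]
```

During calibration the point would jump through these vertices in some order, with each coordinate known in advance.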
For example, referring to fig. 5, whether the scene is a physical scene or a virtual scene, the jumping position of the calibration point need not be limited to one plane; even so, the coordinates of the calibration point after each jump should still be known.
For example, a left-eye image may be played to the left eye and a right-eye image to the right eye via a virtual reality VR system. For example, multiple groups of binocular disparity images are stored in the VR system, and the left-eye and right-eye images of each group enter the tester's left and right eyes respectively through the VR system. During left-eye fixation-point calibration, the VR system blacks out the right-eye field of view and plays the left-eye images to the left eye in sequence. During right-eye fixation-point calibration, the VR system blacks out the left-eye field of view and plays the right-eye images to the right eye in sequence. For example, the VR system may be a virtual reality head-mounted display device, such as VR glasses.
If the entire eyeball were a standard sphere rotating about its center, the reflected light would be stationary; in reality, however, the cornea bulges from the surface of the eyeball, so as the eyeball moves, light strikes the cornea at varying angles and is reflected in different directions.
For example, the left-eye features may be obtained by other conventional methods, such as the pupil-cornea reflection vector method, the sclera-iris edge method, or the dual Purkinje image method. Since these are conventional methods, this embodiment does not describe them further.
For example, specific methods for extracting left-eye feature data from the acquired left-eye features include, but are not limited to, the bright/dark pupil method, the Starburst algorithm, the circumference difference algorithm, and other existing methods.
For example, a classical pupil-cornea reflected-spot difference fit may be used to solve the left-eye mapping relation. A quadratic regression model is established:

xv = a0 + a1·xe + a2·ye + a3·xe·ye + a4·xe² + a5·ye²
yv = b0 + b1·ye + b2·xe + b3·xe·ye + b4·ye² + b5·xe²

where (xv, yv) is the coordinate position of the fixation point in the scene coordinate system, and (xe, ye) is the vector of pupil-cornea coordinate values in the eye (or eye-camera) coordinate system, obtainable by the corneal reflection method described above. Before this quadratic regression model can serve as the left-eye mapping relation, the 12 parameters a0 to a5 and b0 to b5 must be computed. To solve for them, 12 left-eye images are played to the left eye in sequence; while the left eye observes the calibration point in each image, the infrared light source and eye camera are used to compute 12 sets of (xe, ye) coordinates by the corneal reflection method. From the 12 known (xv, yv) coordinates of the corresponding calibration points in the 12 left-eye images, together with the 12 sets of (xe, ye) coordinates, the 12 parameters are calculated. Substituting these parameters into the quadratic regression model gives the left-eye mapping relation. It should be understood that the right-eye mapping relation can be found in the same way, and that the order of left-eye and right-eye calibration is not limited.
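The parameter-solving step above can be sketched as a least-squares fit. The code below is a self-contained illustration, not the patent's implementation: `fit_eye_mapping` and `solve_linear` are hypothetical names, and a plain normal-equations solver stands in for whatever fitting routine a real system would use. Given paired observations of (xe, ye) and the known calibration-point coordinates (xv, yv), it recovers a0..a5 and b0..b5:

```python
def solve_linear(A, y):
    """Solve the square system A x = y by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [y[i]] for i, row in enumerate(A)]  # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_eye_mapping(eye_pts, scene_pts):
    """Fit the 12 coefficients a0..a5 and b0..b5 of the quadratic regression
    model by least squares (normal equations).

    eye_pts   : list of (xe, ye) pupil-cornea vectors, one per calibration point
    scene_pts : list of (xv, yv) known calibration-point coordinates
    """
    # Basis for xv: [1, xe, ye, xe*ye, xe^2, ye^2]; for yv the xe/ye roles swap,
    # matching the two equations of the model above.
    rows_x = [[1.0, xe, ye, xe * ye, xe * xe, ye * ye] for xe, ye in eye_pts]
    rows_y = [[1.0, ye, xe, xe * ye, ye * ye, xe * xe] for xe, ye in eye_pts]

    def lstsq(rows, targets):
        # Normal equations: (R^T R) coeff = R^T t
        n = len(rows[0])
        AtA = [[sum(r[i] * r[j] for r in rows) for j in range(n)] for i in range(n)]
        Aty = [sum(r[i] * t for r, t in zip(rows, targets)) for i in range(n)]
        return solve_linear(AtA, Aty)

    a = lstsq(rows_x, [p[0] for p in scene_pts])
    b = lstsq(rows_y, [p[1] for p in scene_pts])
    return a, b
```

With exactly 12 well-spread calibration points the system is fully determined; with more points the same code gives a least-squares estimate, which is why practical calibrations often use more than the minimum.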
For example, the left-eye fixation point may also be calibrated with other existing calibration methods, such as least-squares fitting or Purkinje-spot-based gaze direction detection. It should be understood that the right-eye mapping relation can be found in the same way, and that the order of left-eye and right-eye calibration is not limited.
The existing conventional calibration method uses a single planar image to calibrate both eyes and ignores the parallax between the left and right eyes, so the mapping relation it obtains has a large matching error with respect to a three-dimensional scene.
Referring to fig. 6, based on the above fixation point calibration method, the present embodiment also provides a fixation point calibration apparatus.
The gaze point calibration apparatus includes:
the virtual reality VR system is used for respectively displaying the left eye image and the right eye image of the binocular parallax image to the left eye and the right eye of a tester;
the eye feature acquisition device is used for acquiring eye feature pictures of the tester's left eye during left-eye fixation-point calibration and eye feature pictures of the tester's right eye during right-eye fixation-point calibration; and
the central control module is used for calculating a left-eye mapping relation according to the coordinates of the calibration points in the scene coordinate system and the left-eye characteristic pictures; and calculating a right eye mapping relation according to the coordinates of the calibration points in the scene coordinate system and the right eye feature pictures.
Based on the above device, specific embodiments are given below; where not contradictory, they can be combined with one another to form a specific device.
For example, the virtual reality VR system may select a virtual reality head mounted display device, such as VR glasses.
For example, multiple groups of binocular disparity images are stored in the virtual reality VR system, or in another electronic device embedded in it; the images may come from a physical scene photographed in advance or from a virtual scene synthesized by computer in advance. During left-eye or right-eye fixation-point calibration, the VR system retrieves the left-eye and right-eye images of the binocular disparity images directly from this storage and plays them to the left and right eyes respectively.
For example, as shown in fig. 7, the fixation point calibration device may further include a binocular camera device for collecting the binocular disparity images. The binocular camera device transmits the collected images to the central control module through an image acquisition card, and the central control module sends them to the virtual reality VR system. During left-eye fixation-point calibration, the calibration point jumps through the physical scene in sequence; each time it jumps, the binocular camera device captures a left-eye image and transmits it to the VR system, which plays it to the tester's left eye. During right-eye fixation-point calibration, the calibration point likewise jumps in sequence; each time it jumps, the binocular camera device captures a right-eye image and transmits it to the VR system, which plays it to the tester's right eye.
For example, as shown in fig. 8, the fixation point calibration device may further include a communication module. The binocular camera device can then be connected to the central control module through various wireless transmission methods, and a remote physical scene can be used to calibrate the fixation points of the left and right eyes. For example, a dedicated calibration laboratory may be provided, containing the physical scene and the binocular camera device used for calibration; during calibration, the user's central control module remotely receives, through the communication module, the binocular disparity images captured by the binocular camera device in the calibration laboratory, and performs the calibration. The communication module may, for example, be a 3G, 4G or 5G communication module.
For example, the eye feature acquisition device includes a left-eye feature acquisition device and a right-eye feature acquisition device. Referring to fig. 9, the left-eye feature acquisition device includes an infrared light source 10, an eye camera 20, a first optical filter 30 and a second optical filter 40. The first optical filter 30 reflects infrared light; it is located in front of the eye, below the infrared light source 10 and the eye camera 20, and must cover the field of the eye camera 20 without obstructing the left eye's view of the left-eye image, so that the eye camera 20 can collect clear left-eye feature images. The second optical filter 40 transmits infrared light; it is mounted on the lens of the eye camera 20 and filters out other light entering the eye camera 20, ensuring that most of the light entering it is infrared. In use, infrared light emitted by the infrared light source 10 is reflected by the first optical filter 30 onto the area where the eye is located; after being reflected by the eye and reflected a second time by the first optical filter 30, it passes through the second optical filter 40 to the lens, and the eye camera 20 collects a clear left-eye feature image and sends it to the central control module.
For example, the central control module may be a microprocessor, and its programs for calculating the left-eye and right-eye mapping relations may be written according to existing calculation methods, such as the bright/dark pupil method, the Starburst algorithm, or the circumference difference algorithm.
Referring to fig. 10, based on the above calibration method, the present embodiment further provides a method for determining a fixation point.
The fixation point determination method specifically comprises the following steps:
acquiring the left-eye and right-eye features of a tester while the tester's left eye and right eye simultaneously observe a group of binocular disparity images, wherein the left-eye and right-eye images of the binocular disparity images are acquired from the same scene, and the scene has a scene coordinate system;
calculating the coordinates of the left-eye fixation point in the scene coordinate system according to the left-eye features and the left-eye mapping relation;
calculating the coordinates of the right-eye fixation point in the scene coordinate system according to the right-eye features and the right-eye mapping relation;
and determining, from these two coordinates, that the coordinates of the tester's fixation point in the scene coordinate system lie between the left-eye fixation-point coordinates and the right-eye fixation-point coordinates.
Based on the above method, specific embodiments are given below; where not contradictory, they can be combined with one another to form a specific method.
For example, the binocular disparity images may be derived from a physical scene having a coordinate system, and are captured by a binocular camera device in that scene. The binocular camera device can be connected to the virtual reality VR system through a wireless or wired communication module, and the VR system plays the binocular disparity images to the left and right eyes. In use, the binocular camera device in the physical scene is fixed, while the VR system worn on the tester's head can move freely with the head.
As another example, the binocular disparity images are captured by a binocular camera device in the physical scene, with the binocular camera device fixedly connected to the virtual reality VR system, which plays the images to the left and right eyes. In this case, either the tester's head is kept still so that the binocular camera device also remains still relative to the scene coordinate system, or the binocular camera device may move freely with the tester's head, but an additional fixed camera and a computing unit are then required to calculate the displacement of the binocular camera device relative to the scene coordinate system.
For example, the binocular disparity images may be derived from a real scene that has been stored in the virtual reality (VR) system in advance, so no binocular camera device is needed during calibration.
For example, the binocular disparity images may be derived from a virtual scene synthesized by a computer in advance; the virtual scene has its own coordinate system, and each group of virtual images consists of a left-eye image and a right-eye image.
For example, the left-eye features may be obtained by the aforementioned methods, such as corneal reflection, the pupil-corneal reflection vector, the sclera-iris boundary, the dual Purkinje image, and so on. The left-eye feature data can be extracted by methods such as the bright/dark pupil method, the Starburst algorithm, or the circumference difference algorithm. The right eye is handled in the same way as the left eye.
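As an illustrative sketch (not part of the patent text): with the pupil-corneal reflection method, the eye feature is simply the vector from the corneal glint centre to the pupil centre in the eye-camera image. The function name and the sample coordinates below are hypothetical; the pupil and glint centres are assumed to come from an upstream detector such as a Starburst-style fitter.

```python
# Illustrative sketch only: the pupil-corneal reflection vector used as the
# eye feature. Pupil and glint centres are assumed to be supplied by an
# upstream detector; the coordinates below are made-up examples.

def pupil_cr_vector(pupil_center, glint_center):
    """Return the pupil-corneal-reflection vector (x_e, y_e) in pixels."""
    px, py = pupil_center
    gx, gy = glint_center
    return (px - gx, py - gy)

# Hypothetical example: pupil at (312.0, 240.5), corneal glint at (305.5, 244.0)
x_e, y_e = pupil_cr_vector((312.0, 240.5), (305.5, 244.0))
# (x_e, y_e) is what the per-eye mapping relation is calibrated against
```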
For example, the extracted left-eye feature data, i.e., the pupil-cornea vector ZY(x_e, y_e), is substituted into the left-eye mapping relation:

x_v = a_0 + a_1·x_e + a_2·y_e + a_3·x_e·y_e + a_4·x_e^2 + a_5·y_e^2
y_v = b_0 + b_1·y_e + b_2·x_e + b_3·x_e·y_e + b_4·y_e^2 + b_5·x_e^2

in which the 12 parameters a_0…a_5 and b_0…b_5 were obtained during calibration; this yields the coordinates ZY(x_v, y_v) of the left-eye fixation point in the scene coordinate system. The right eye is handled in the same way as the left eye, yielding the coordinates YY(x_v, y_v) of the right-eye fixation point in the scene coordinate system.
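The second-order polynomial mapping described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the coefficient vectors a and b (a_0…a_5 and b_0…b_5, 12 parameters in total) are assumed to have already been found during calibration, e.g. by least squares over the calibration points.

```python
# Minimal sketch of the quadratic mapping from an eye-feature vector
# (x_e, y_e) to scene coordinates (x_v, y_v); the coefficient ordering
# follows the two formulas in the text above.

def map_gaze(x_e, y_e, a, b):
    """a, b: the six coefficients a0..a5 and b0..b5 of one eye's mapping."""
    x_v = (a[0] + a[1] * x_e + a[2] * y_e + a[3] * x_e * y_e
           + a[4] * x_e ** 2 + a[5] * y_e ** 2)
    y_v = (b[0] + b[1] * y_e + b[2] * x_e + b[3] * x_e * y_e
           + b[4] * y_e ** 2 + b[5] * x_e ** 2)
    return x_v, y_v

# With identity-like coefficients the feature vector maps to itself:
# map_gaze(6.5, -3.5, [0, 1, 0, 0, 0, 0], [0, 1, 0, 0, 0, 0]) -> (6.5, -3.5)
```

Fitting the 12 parameters needs at least six calibration points per eye, since each point contributes one equation to each of the two six-unknown linear systems.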
For example, once ZY(x_v, y_v) and YY(x_v, y_v) have been obtained: referring to fig. 11a, the position of the final binocular fixation point SY in the scene coordinate system may be chosen as the midpoint of the segment joining ZY(x_v, y_v) and YY(x_v, y_v). Alternatively, as shown in fig. 11b, the position of the final binocular fixation point SY may be chosen to fall within the sphere whose diameter is the segment joining ZY(x_v, y_v) and YY(x_v, y_v). It should be understood that the way of determining the position of the final binocular fixation point SY in the scene coordinate system is not limited to these two examples.
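The two selection rules can be sketched as follows (illustrative only, in 2-D scene coordinates; the function names are hypothetical): rule (a) takes the midpoint of the segment joining ZY(x_v, y_v) and YY(x_v, y_v); rule (b) accepts any candidate point lying within the sphere, here a circle in 2-D, whose diameter is that segment.

```python
import math

def midpoint(zy, yy):
    """Rule (a): binocular fixation point SY as the midpoint of ZY and YY."""
    return ((zy[0] + yy[0]) / 2.0, (zy[1] + yy[1]) / 2.0)

def within_diameter_sphere(p, zy, yy):
    """Rule (b): does p lie within the sphere whose diameter is ZY-YY?

    A point is inside that sphere iff its distance to the centre of the
    segment is at most half of the ZY-YY distance.
    """
    centre = midpoint(zy, yy)
    radius = math.dist(zy, yy) / 2.0
    return math.dist(p, centre) <= radius

# Example: ZY = (0, 0), YY = (4, 2) gives SY = (2.0, 1.0) under rule (a)
```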
Based on the above fixation point method, the present embodiment also provides a corresponding fixation point device; please refer to the hardware structure shown in fig. 6.
The gaze point calibration apparatus includes:
the virtual reality VR system is used for respectively displaying the left eye image and the right eye image of the binocular parallax image to the left eye and the right eye of the tester;
the eye feature acquisition device is used for respectively acquiring an eye feature picture of the left eye and an eye feature picture of the right eye while the left eye observes the left-eye image and the right eye observes the right-eye image; and
the central control module is used for calculating the coordinates of the left-eye fixation point in the scene coordinate system according to the left-eye features and the left-eye mapping relation, calculating the coordinates of the right-eye fixation point in the scene coordinate system according to the right-eye features and the right-eye mapping relation, and determining that the fixation point of the tester in the scene coordinate system lies between the left-eye and right-eye fixation point coordinates.
Based on the above-described device, embodiments are given below; provided they do not contradict one another, these embodiments can be combined with each other to form further specific devices.
For example, the virtual reality VR system may select a virtual reality head mounted display device, such as VR glasses.
For example, multiple sets of binocular parallax images are stored in the virtual reality VR system, or in another electronic device embedded in the VR system. The binocular parallax images may be derived from a real scene photographed in advance or from a virtual scene synthesized by a computer in advance. During calibration, the VR system plays the left-eye image and the right-eye image simultaneously to the left eye and the right eye, respectively.
For example, referring to the hardware structure shown in fig. 7, the gaze point calibration apparatus may further include a binocular camera device for acquiring the binocular parallax image. The binocular camera device transmits the acquired binocular parallax images to the central control module through the image acquisition card, and then the images are sent to the virtual reality VR system through the central control module.
For example, referring to the hardware structure shown in fig. 8, the gaze point calibration device may further include a communication module. The binocular camera device can then be connected to the central control module through various wireless transmission methods, so that a tester can perform fixation point calibration on remote objects, scenes, and the like, removing the need to be physically on site. A 3G, 4G or 5G module can be selected as the communication module.
For example, the eye feature acquisition devices include a left-eye feature acquisition device and a right-eye feature acquisition device; the specific structure of each may follow the hardware structure shown in fig. 9.
For example, the central control module may be a microprocessor that stores the left-eye mapping relation and the right-eye mapping relation obtained by the above fixation point calibration method. The above description covers only embodiments of the present invention, but the scope of the present invention is not limited thereto; any change or substitution that a person skilled in the art can easily conceive within the technical scope disclosed herein shall be covered by the scope of the present invention.

Claims (10)

1. A fixation point calibration method, characterized in that the method uses a plurality of groups of binocular disparity images, each group comprising a left-eye image and a right-eye image obtained from the same scene, each scene containing points whose coordinates in the scene coordinate system are known, these points serving as calibration points;
the method for calibrating the fixation point specifically comprises the following steps:
acquiring the eye characteristics of the left eye while the left eye of the tester gazes at the calibration point in each left-eye image, and obtaining, from the coordinates of the plurality of calibration points and the acquired left-eye characteristics, the mapping relation between the left-eye characteristics and the calibration point coordinates as the left-eye mapping relation;
and acquiring the eye characteristics of the right eye while the right eye of the tester gazes at the calibration point in each right-eye image, and obtaining, from the coordinates of the plurality of calibration points and the acquired right-eye characteristics, the mapping relation between the right-eye characteristics and the calibration point coordinates as the right-eye mapping relation.
2. The fixation point calibration method according to claim 1, wherein the scenes corresponding to the plurality of groups of binocular disparity images are partly the same and partly different;
wherein the sameness is that each scene contains a plane whose coordinates in the scene coordinate system are identical across the scenes, each plane is equally divided into identical cells, and the calibration point lies in the plane at a cell vertex;
the difference is that, for each scene, the calibration point is located at a different cell vertex in the plane.
3. The fixation point calibration method according to claim 1, wherein the left-eye image and the right-eye image of each group of binocular disparity images enter the left eye and the right eye of the tester respectively through a virtual reality (VR) system, and the VR system blacks out the right-eye field of view during left-eye fixation point calibration and blacks out the left-eye field of view during right-eye fixation point calibration.
4. A fixation point calibration device, comprising:
a virtual reality VR system for displaying the left-eye image and the right-eye image of the binocular disparity image of claim 1 to the left eye and the right eye of the tester, respectively;
the eye characteristic acquisition device is used for acquiring an eye characteristic picture of the left eye of the tester during left-eye fixation point calibration and an eye characteristic picture of the right eye of the tester during right-eye fixation point calibration; and
the central control module is used for calculating a left-eye mapping relation according to the coordinates of the calibration points in the scene coordinate system and the left-eye characteristic pictures; and calculating a right eye mapping relation according to the coordinates of the calibration points in the scene coordinate system and the right eye feature pictures.
5. The gaze point calibration device of claim 4, further comprising a binocular camera device for acquiring the binocular parallax images.
6. The device of claim 5, further comprising a communication module, wherein the central control module is capable of receiving the binocular disparity images remotely transmitted by the binocular camera device through the communication module and transmitting the binocular disparity images to the virtual reality VR system.
7. A fixation point calibration method, specifically comprising:
acquiring the left-eye features and the right-eye features of a tester while the left eye and the right eye of the tester simultaneously observe a same group of binocular parallax images, wherein the left-eye image and the right-eye image of the binocular parallax images are obtained from the same scene, and the scene has a scene coordinate system;
calculating the coordinates of the left eye fixation point in a scene coordinate system according to the left eye features and the left eye mapping relation;
calculating the coordinates of the right-eye fixation point in the scene coordinate system according to the right-eye features and the right-eye mapping relation;
and determining, from the left-eye fixation point coordinates and the right-eye fixation point coordinates in the scene coordinate system, that the fixation point of the tester in the scene coordinate system lies between the two.
8. A fixation point calibration device, comprising:
the virtual reality VR system is used for respectively displaying the left eye image and the right eye image of the binocular parallax image to the left eye and the right eye of the tester;
the eye feature acquisition device is used for respectively acquiring an eye feature picture of the left eye and an eye feature picture of the right eye while the left eye observes the left-eye image and the right eye observes the right-eye image; and
the central control module is used for calculating the coordinates of the left-eye fixation point in the scene coordinate system according to the left-eye features and the left-eye mapping relation, calculating the coordinates of the right-eye fixation point in the scene coordinate system according to the right-eye features and the right-eye mapping relation, and determining that the fixation point of the tester in the scene coordinate system lies between the left-eye and right-eye fixation point coordinates.
9. The device for calibrating a fixation point of claim 8, further comprising a binocular camera device for acquiring the binocular parallax image.
10. The device for calibrating a fixation point of claim 9, further comprising a communication module, wherein the central control module is capable of receiving the binocular disparity image remotely transmitted by the binocular camera device through the communication module and transmitting the binocular disparity image to the virtual reality VR system.
CN201810330258.6A 2018-04-13 2018-04-13 Method and device for calibrating fixation point and method and device for calibrating fixation point Active CN108259887B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810330258.6A CN108259887B (en) 2018-04-13 2018-04-13 Method and device for calibrating fixation point and method and device for calibrating fixation point


Publications (2)

Publication Number Publication Date
CN108259887A CN108259887A (en) 2018-07-06
CN108259887B true CN108259887B (en) 2020-01-31

Family

ID=62748203

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810330258.6A Active CN108259887B (en) 2018-04-13 2018-04-13 Method and device for calibrating fixation point and method and device for calibrating fixation point

Country Status (1)

Country Link
CN (1) CN108259887B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109044263A (en) * 2018-07-13 2018-12-21 北京七鑫易维信息技术有限公司 Detection method, device, equipment and the storage medium of eye sight line
CN109032351B (en) * 2018-07-16 2021-09-24 北京七鑫易维信息技术有限公司 Fixation point function determination method, fixation point determination device and terminal equipment
CN109240497B (en) * 2018-08-28 2021-07-13 北京航空航天大学青岛研究院 Automatic calibration method for eye tracking in virtual reality scene
CN112101064B (en) * 2019-06-17 2024-07-05 北京七鑫易维科技有限公司 Sight tracking method, device, equipment and storage medium
CN110399930B (en) * 2019-07-29 2021-09-03 北京七鑫易维信息技术有限公司 Data processing method and system
CN112578901A (en) * 2019-09-30 2021-03-30 Oppo广东移动通信有限公司 Eyeball tracking calibration method and related equipment
CN113283402B (en) * 2021-07-21 2021-11-05 北京科技大学 Differential two-dimensional fixation point detection method and device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010259605A (en) * 2009-05-01 2010-11-18 Nippon Hoso Kyokai <Nhk> Visual line measuring device and visual line measuring program
CN106659380A (en) * 2014-06-12 2017-05-10 SR Labs S.r.l. Device and method of calibration for an eye tracker and eye control equipment comprising said calibration device
CN107450720A (en) * 2016-05-31 2017-12-08 Fove股份有限公司 Line-of-sight detection systems

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6495457B2 (en) * 2014-12-16 2019-04-03 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Gaze tracking system with improved calibration, accuracy compensation, and gaze localization smoothing


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Electrooculogram-based virtual reality game control using blink detection and gaze calibration; Devender Kumar et al.; 2016 International Conference on Advances in Computing, Communications and Informatics (ICACCI); 2016-09-24; full text *
Research on fixation point positioning technology for eye trackers; Zhang Xianyong; China Master's Theses Full-text Database; 2011-08-31; full text *

Also Published As

Publication number Publication date
CN108259887A (en) 2018-07-06

Similar Documents

Publication Publication Date Title
CN108259887B (en) Method and device for calibrating fixation point and method and device for calibrating fixation point
CN109558012B (en) Eyeball tracking method and device
CN113808160B (en) Sight direction tracking method and device
CN103513421B (en) Image processor, image treatment method and image processing system
CN110967166B (en) Detection method, detection device and detection system of near-eye display optical system
Hennessey et al. Noncontact binocular eye-gaze tracking for point-of-gaze estimation in three dimensions
CN103605208A (en) Content projection system and method
CN109429060B (en) Pupil distance measuring method, wearable eye equipment and storage medium
WO2015051606A1 (en) Locating method and locating system
CN110251066A (en) Based on the not positive system and method for subjective distance measuring measurement ophthalmic refractive
WO2015035822A1 (en) Pickup of objects in three-dimensional display
WO2005063114A1 (en) Sight-line detection method and device, and three- dimensional view-point measurement device
CN110537897B (en) Sight tracking method and device, computer readable storage medium and electronic equipment
WO2015051605A1 (en) Image collection and locating method, and image collection and locating device
CN113208884A (en) Visual detection and visual training equipment
CN106293100A (en) The determination method of sight line focus and virtual reality device in virtual reality device
CN109044263A (en) Detection method, device, equipment and the storage medium of eye sight line
CN109964230A (en) Method and apparatus for eyes measurement acquisition
CN108537103B (en) Living body face detection method and device based on pupil axis measurement
CN109008937A (en) Method for detecting diopter and equipment
CN109828663A (en) Determination method and device, the operating method of run-home object of aiming area
KR20230150934A (en) System for providing educational information of surgical techniques and skills and surgical guide system based on machine learning using 3 dimensional image
CN115409774A (en) Eye detection method based on deep learning and strabismus screening system
CN112336301B (en) Strabismus measuring equipment
CN107788946A (en) Subjective formula optometry equipment and subjective formula optometry program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant