CN108446018A - An augmented reality eye movement interaction system based on binocular vision technology - Google Patents

An augmented reality eye movement interaction system based on binocular vision technology Download PDF

Info

Publication number
CN108446018A
CN108446018A (application CN201810147714.3A)
Authority
CN
China
Prior art keywords
image
coordinate
eye movement
front-mounted
target region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810147714.3A
Other languages
Chinese (zh)
Inventor
杜煜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Green Technology Co Ltd
Original Assignee
Shanghai Green Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Green Technology Co Ltd filed Critical Shanghai Green Technology Co Ltd
Priority to CN201810147714.3A
Publication of CN108446018A
Legal status: Pending

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality

Abstract

The present invention provides an augmented reality eye movement interaction system based on binocular vision technology, comprising wearable glasses, two front-mounted image acquisition devices, an eye movement acquisition device, a controller, and a processor. The two front-mounted image acquisition devices are mounted on the front of the wearable glasses, and the eye movement acquisition device is mounted on the side of the wearable glasses facing the eyes. The controller receives the image data acquired by the front-mounted image acquisition devices and the eye movement acquisition device and transmits its calculation results to the processor. The processor calculates the gaze point of the two eyes in the 3D target region and the spatial coordinates of a specific target in the 3D target region; when the gaze point coincides with the spatial coordinates of the specific target, the instruction corresponding to that target is triggered. The above augmented reality eye movement interaction system based on binocular vision technology enables interaction between the gaze point and specific targets in the 3D target region.

Description

An augmented reality eye movement interaction system based on binocular vision technology
Technical field
The invention belongs to the field of wearable devices, and more particularly relates to an augmented reality eye movement interaction system based on binocular vision technology.
Background technology
Binocular stereo vision (Binocular Stereo Vision) is an important form of machine vision. Based on the parallax principle, it uses imaging devices to acquire two images of the measured object from different positions and obtains three-dimensional geometric information of the object by calculating the positional deviation between corresponding points in the images. Fusing the images obtained by the two eyes and observing the differences between them gives us a clear sense of depth and establishes correspondences between features. Binocular stereo vision measurement offers high efficiency, suitable accuracy, a simple system structure, and low cost.
Augmented reality (Augmented Reality Technique, abbreviated AR) has been one of the hot technologies in science and technology in recent years; it can present virtual information in the real environment through the window of a head-mounted device. As augmented reality technology has gradually become widespread, research fusing eye-control interaction into augmented reality devices has begun to appear.
With the development of technology, new interaction modes are continuously put into use, and contactless control has emerged in recent years. The currently mature and widely applied technology is mainly voice control, but the greatest disadvantage of voice-based operation is its vulnerability to environmental noise. An eye movement interaction system applying augmented reality can well overcome this shortcoming of voice control, enhancing the interference resistance of contactless control and improving interaction sensitivity and the convenience of interacting with 3D content. Applying binocular vision technology to the eye movement interaction system can improve the interaction accuracy of the system.
Summary of the invention
The technical solution of the present invention enables interaction between the gaze point and specific targets in the 3D target region.
The specific technical solution of the present invention is: an augmented reality eye movement interaction system based on binocular vision technology, comprising wearable glasses, two front-mounted image acquisition devices, an eye movement acquisition device, a controller, and a processor. The two front-mounted image acquisition devices are mounted on the front of the wearable glasses, and the eye movement acquisition device is mounted on the side of the wearable glasses facing the eyes. The controller receives the image data acquired by the front-mounted image acquisition devices and the eye movement acquisition device and transmits its calculation results to the processor. The processor calculates the gaze point of the two eyes in the 3D target region and the spatial coordinates of a specific target in the 3D target region; when the gaze point coincides with the spatial coordinates of the specific target in the 3D target region, the instruction corresponding to that target is triggered.
As a further optimization of the present invention, the controller includes a pupil and Purkinje image positioning unit, a 3D target recognition unit, a binocular vision unit, and an eye movement data analysis unit.
As a further optimization of the present invention, the pupil and Purkinje image positioning unit receives the eye movement image data output by the eye movement acquisition device and outputs the image coordinates of the pupil center and the Purkinje image. The 3D target recognition unit receives the image data output by the two front-mounted image acquisition devices and outputs the image coordinates of the specific target in the 3D target region. The binocular vision unit receives the binocular calibration data, the image coordinates of the pupil center, the image coordinates of the Purkinje image, and the image coordinates of the specific target in the 3D target region, and outputs the three-dimensional coordinates of the pupil center, of the Purkinje image, and of the specific target in the 3D target region. The eye movement data analysis unit receives the three-dimensional coordinates of the pupil center, of the Purkinje image, and of the specific target in the 3D target region, and outputs the eye movement vector, the eye state, and the gaze point in the 3D target region.
As a further optimization of the present invention, the determination of the coordinates of the Purkinje image and the pupil center can be divided into six steps:
Step 1: Image preprocessing; the eye image is smoothed using Gaussian filtering;
Step 2: A thresholding method that adapts to the changes of each frame is used to obtain the region where the pupil is located;
Step 3: The center of the Purkinje image is extracted;
Step 4: Pupil contour feature points are obtained with a gradient method;
Step 5: An adaptive threshold is sought for determining the pupil position and size;
Step 6: The pupil center is determined by fitting the pupil contour with clustering.
As a further optimization of the present invention, the calculation of the image coordinates of the specific target in the 3D target region consists of two steps: Step 1: A Region Proposal Network (RPN) extracts candidate regions; Step 2: RoIPool is used to extract features from the candidate regions for category classification and coordinate regression, and for each RoI (Region of Interest) the algorithm outputs a binary mask.
As a further optimization of the present invention, the determination of the three-dimensional coordinates of the specific target in the 3D target region is divided into two steps:
Step 1: Determine the coordinates of the specific target P in the 3D target region in the coordinate system of the front-mounted image acquisition devices. The distance between the projection centers of the two front-mounted image acquisition devices is b, and the origin of each front-mounted image acquisition device's coordinate system is at the optical center of its lens. The imaging plane of each front-mounted image acquisition device lies behind the optical center of the lens; the left and right imaging planes are drawn at a distance f in front of the optical centers, with the u and v axes of the virtual image plane coordinate system O1uv aligned with the x and y axes of the front-mounted image acquisition device coordinate system. The origins of the left and right image coordinate systems are the intersections O1 and O2 of the camera optical axes with the planes. A point P in space has corresponding coordinates P1(u1, v1) in the left image and P2(u2, v2) in the right image. Assuming the images of the two front-mounted image acquisition devices lie in the same plane, the Y coordinates of the image coordinates of point P are identical, i.e. v1 = v2. From the triangle geometry:
u1 = f·xc/zc, u2 = f·(xc - b)/zc, v1 = v2 = f·yc/zc
In the above, (xc, yc, zc) are the coordinates of point P in the coordinate system of the left front-mounted image acquisition device, b is the baseline distance, f is the focal length of the front-mounted image acquisition devices, and (u1, v1) and (u2, v2) are the coordinates of point P in the left and right images, respectively. The disparity is the positional difference between the corresponding points of the same point in the two images: d = u1 - u2 = f·b/zc. From this, the coordinates of a point P in space in the coordinate system of the left front-mounted image acquisition device can be calculated as:
xc = b·u1/d, yc = b·v1/d, zc = b·f/d
Step 2: After determining the corresponding points of the specific target P in the coordinate systems of the left and right front-mounted image acquisition devices, the three-dimensional coordinates of point P can be determined from the intrinsic and extrinsic parameters of the left and right front-mounted image acquisition devices.
As a further optimization of the present invention, the eye movement acquisition device includes eight infrared light sources and two infrared eye movement cameras for each eye.
The beneficial effects of the present invention are that it can enhance the interference resistance of the system and improve the accuracy of the interaction between the eyes and a specific target in the 3D target region.
The augmented reality eye movement interaction system based on binocular vision technology disclosed by the present invention enables interaction between the gaze point and specific targets in the 3D target region.
Description of the drawings
Fig. 1 is an operational flow block diagram of the augmented reality eye movement interaction system based on binocular vision technology of the present invention;
Fig. 2 is an operational flow block diagram of the controller of the augmented reality eye movement interaction system based on binocular vision technology of the present invention;
Fig. 3 is a schematic diagram of the interaction between the eyes and a specific target in the 3D target region in the augmented reality eye movement interaction system based on binocular vision technology of the present invention;
Fig. 4 is a schematic diagram of the principle of calculating the coordinates of the specific target in the augmented reality eye movement interaction system based on binocular vision technology of the present invention;
Fig. 5 is a schematic diagram of the front structure of the wearable glasses of the augmented reality eye movement interaction system based on binocular vision technology of the present invention;
Fig. 6 is a schematic diagram of the back structure of the wearable glasses of the augmented reality eye movement interaction system based on binocular vision technology of the present invention.
Detailed description of the embodiments
To describe the technical solution of the present invention in detail, preferred embodiments of the invention are described below in conjunction with the drawings.
An embodiment of the invention discloses an augmented reality eye movement interaction system based on binocular vision technology. As shown in Fig. 1 to Fig. 5, it includes wearable glasses 10, two front-mounted image acquisition devices 101, an eye movement acquisition device 102, a controller, and a processor. The two front-mounted image acquisition devices 101 are mounted on the front of the wearable glasses to obtain foreground data in front of the glasses; the eye movement acquisition device 102 is mounted on the side of the wearable glasses facing the eyes to acquire image data of the two eyes. The controller receives the image data acquired by the front-mounted image acquisition devices and the eye movement acquisition device and transmits its calculation results to the processor. The processor calculates the gaze point of the two eyes in the 3D target region and the spatial coordinates of a specific target in the 3D target region; when the gaze point coincides with the spatial coordinates of the specific target in the 3D target region, the instruction corresponding to that target is triggered.
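The trigger condition above — the gaze point "coinciding" with a target's spatial coordinate — can be sketched as follows. This is an illustrative sketch only, not code from the patent: the function names, the tolerance value, and the target list are our own assumptions, since an exact floating-point coincidence of two measured 3D points never occurs in practice and some distance tolerance must be chosen.

```python
import math

def distance(p, q):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def check_gaze_trigger(gaze_point, targets, tolerance=0.05):
    """Return the instruction of the first target whose spatial coordinate
    coincides with the gaze point (within `tolerance` metres), or None
    if the gaze rests on no target."""
    for coord, instruction in targets:
        if distance(gaze_point, coord) <= tolerance:
            return instruction
    return None

# Hypothetical targets in the 3D target region: (coordinate, instruction).
targets = [((0.10, 0.05, 1.20), "open_menu"),
           ((-0.30, 0.00, 2.00), "select_item")]
print(check_gaze_trigger((0.11, 0.05, 1.21), targets))  # open_menu
print(check_gaze_trigger((0.50, 0.50, 0.50), targets))  # None
```

In a real system the tolerance would depend on the calibrated gaze-estimation error rather than a fixed constant.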
As shown in Fig. 4 and Fig. 5, the two front-mounted image acquisition devices consist of two front cameras, which obtain image data of the 3D target region in front of the glasses; the eye movement acquisition device includes eight infrared light sources and two infrared eye movement cameras for each eye, for acquiring image data of the two eyes.
As shown in Fig. 1 to Fig. 3, the controller includes a pupil and Purkinje image positioning unit, a 3D target recognition unit, a binocular vision unit, and an eye movement data analysis unit. The pupil and Purkinje image positioning unit receives the eye movement image data output by the eye movement acquisition device and outputs the image coordinates of the pupil center and the Purkinje image. The 3D target recognition unit receives the image data output by the two front-mounted image acquisition devices and outputs the image coordinates of the specific target in the 3D target region. The binocular vision unit receives the binocular calibration data, the image coordinates of the pupil center, the image coordinates of the Purkinje image, and the image coordinates of the specific target in the 3D target region, and outputs the three-dimensional coordinates of the pupil center, of the Purkinje image, and of the specific target in the 3D target region. The eye movement data analysis unit receives the three-dimensional coordinates of the pupil center, of the Purkinje image, and of the specific target in the 3D target region, and outputs the eye movement vector, the eye state, and the gaze point in the 3D target region.
The determination of the coordinates of the Purkinje image and the pupil center can be divided into six steps:
Step 1: Image preprocessing; the eye image is smoothed using Gaussian filtering;
Step 2: A thresholding method that adapts to the changes of each frame is used to obtain the region where the pupil is located;
Step 3: The center of the Purkinje image is extracted;
Step 4: Pupil contour feature points are obtained with a gradient method;
Step 5: An adaptive threshold is sought for determining the pupil position and size;
Step 6: The pupil center is determined by fitting the pupil contour with clustering.
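The smoothing and per-frame adaptive thresholding of steps 1 and 2 can be sketched as below. This is only a rough illustration under our own assumptions: a box filter stands in for the Gaussian, a simple min/max threshold rule stands in for the adaptive thresholding, and a plain centroid stands in for the cluster-based contour fit of step 6; the patent does not specify any of these details.

```python
import numpy as np

def smooth(img, k=3):
    """Crude k x k box smoothing (a stand-in for Gaussian filtering)."""
    padded = np.pad(img.astype(float), k // 2, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def pupil_centroid(eye_img):
    """Threshold the darkest pixels of this frame (the rule adapts to each
    frame's intensity range) and return the centroid of the resulting
    region as a rough pupil centre."""
    s = smooth(eye_img)
    thresh = s.min() + 0.25 * (s.max() - s.min())  # per-frame adaptive rule
    ys, xs = np.nonzero(s <= thresh)
    return xs.mean(), ys.mean()

# Synthetic eye image: bright background with a dark "pupil" blob.
img = np.full((40, 40), 200.0)
img[10:20, 15:25] = 20.0
cx, cy = pupil_centroid(img)
print(round(cx, 1), round(cy, 1))  # pupil centre near (19.5, 14.5)
```

A production implementation would use a true Gaussian kernel, gradient-based contour points, and an ellipse fit, as the six steps describe.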
The calculation of the image coordinates of the specific target in the 3D target region consists of two steps: Step 1: A Region Proposal Network (RPN) extracts candidate regions; Step 2: RoIPool is used to extract features from the candidate regions for category classification and coordinate regression, and for each RoI (Region of Interest) the algorithm outputs a binary mask.
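The RoIPool operation named in step 2 can be illustrated with a minimal max-pooling sketch. This is our own simplification, not the patent's code: it assumes integer RoI coordinates and a single 2D feature map, whereas real Mask R-CNN style pipelines pool learned multi-channel feature maps with quantized or interpolated sub-bins.

```python
import numpy as np

def roi_pool(feature_map, roi, output_size=2):
    """Divide the RoI into an output_size x output_size grid and
    max-pool each cell. feature_map: 2D array; roi: (x0, y0, x1, y1)
    in feature-map coordinates."""
    x0, y0, x1, y1 = roi
    region = feature_map[y0:y1, x0:x1]
    h, w = region.shape
    out = np.zeros((output_size, output_size))
    for i in range(output_size):
        for j in range(output_size):
            # Each cell covers a contiguous block of the region (at least 1 px).
            y_lo, y_hi = i * h // output_size, max((i + 1) * h // output_size, i * h // output_size + 1)
            x_lo, x_hi = j * w // output_size, max((j + 1) * w // output_size, j * w // output_size + 1)
            out[i, j] = region[y_lo:y_hi, x_lo:x_hi].max()
    return out

fm = np.arange(36).reshape(6, 6)
print(roi_pool(fm, (0, 0, 4, 4)))  # [[ 7.  9.] [19. 21.]]
```

The fixed-size output is what allows candidate regions of different shapes to feed one classification and regression head.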
The determination of the three-dimensional coordinates of the specific target in the 3D target region is divided into two steps:
Step 1: As shown in Fig. 4, determine the coordinates of the specific target P in the 3D target region in the coordinate system of the front-mounted image acquisition devices. The distance between the projection centers of the two front-mounted image acquisition devices is b, and the origin of each front-mounted image acquisition device's coordinate system is at the optical center of its lens. The imaging plane of each front-mounted image acquisition device lies behind the optical center of the lens; the left and right imaging planes are drawn at a distance f in front of the optical centers, with the u and v axes of the virtual image plane coordinate system O1uv aligned with the x and y axes of the front-mounted image acquisition device coordinate system. The origins of the left and right image coordinate systems are the intersections O1 and O2 of the camera optical axes with the planes. A point P in space has corresponding coordinates P1(u1, v1) in the left image and P2(u2, v2) in the right image. Assuming the images of the two front-mounted image acquisition devices lie in the same plane, the Y coordinates of the image coordinates of point P are identical, i.e. v1 = v2. From the triangle geometry:
u1 = f·xc/zc, u2 = f·(xc - b)/zc, v1 = v2 = f·yc/zc
In the above, (xc, yc, zc) are the coordinates of point P in the coordinate system of the left front-mounted image acquisition device, b is the baseline distance, f is the focal length of the front-mounted image acquisition devices, and (u1, v1) and (u2, v2) are the coordinates of point P in the left and right images, respectively. The disparity is the positional difference between the corresponding points of the same point in the two images: d = u1 - u2 = f·b/zc. From this, the coordinates of a point P in space in the coordinate system of the left front-mounted image acquisition device can be calculated as:
xc = b·u1/d, yc = b·v1/d, zc = b·f/d
Step 2: After determining the corresponding points of the specific target P in the coordinate systems of the left and right front-mounted image acquisition devices, the three-dimensional coordinates of point P can be determined from the intrinsic and extrinsic parameters of the left and right front-mounted image acquisition devices.
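The step-1 triangulation can be sketched directly from the standard rectified-stereo relations zc = b·f/d, xc = b·u1/d, yc = b·v1/d with disparity d = u1 - u2 (variable names follow the text above; the numeric example is our own, with image coordinates taken relative to the principal points):

```python
def triangulate(u1, v1, u2, b, f):
    """Recover (xc, yc, zc) in the left camera frame from a rectified
    stereo correspondence: disparity d = u1 - u2, depth zc = b*f/d."""
    d = u1 - u2
    if d <= 0:
        raise ValueError("disparity must be positive for a point in front of the cameras")
    zc = b * f / d
    xc = b * u1 / d
    yc = b * v1 / d
    return xc, yc, zc

# A point at (0.2, 0.1, 2.0) m seen with baseline b = 0.1 m and f = 500 px
# projects to u1 = f*xc/zc = 50, v1 = f*yc/zc = 25, u2 = f*(xc-b)/zc = 25.
print(triangulate(50, 25, 25, b=0.1, f=500))  # approximately (0.2, 0.1, 2.0)
```

Note that depth falls as 1/d, so a one-pixel disparity error matters far more for distant targets than for near ones.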
Although the content of the present invention has been described in detail through the above preferred embodiments, it should be understood that the above description should not be considered a limitation of the present invention. Various modifications and substitutions of the present invention will be apparent to those skilled in the art after reading the above. Therefore, the protection scope of the present invention should be limited only by the appended claims.

Claims (7)

1. An augmented reality eye movement interaction system based on binocular vision technology, comprising wearable glasses, two front-mounted image acquisition devices, an eye movement acquisition device, a controller, and a processor, wherein the two front-mounted image acquisition devices are mounted on the front of the wearable glasses and the eye movement acquisition device is mounted on the side of the wearable glasses facing the eyes, characterized in that the controller receives the image data acquired by the two front-mounted image acquisition devices and the eye movement acquisition device and transmits the calculation results to the processor; the processor calculates the gaze point of the two eyes in the 3D target region and the spatial coordinates of a specific target in the 3D target region; and when the gaze point coincides with the spatial coordinates of the specific target in the 3D target region, the instruction corresponding to the specific target is triggered.
2. The augmented reality eye movement interaction system based on binocular vision technology according to claim 1, wherein the controller includes a pupil and Purkinje image positioning unit, a 3D target recognition unit, a binocular vision unit, and an eye movement data analysis unit.
3. The augmented reality eye movement interaction system based on binocular vision technology according to claim 2, wherein the pupil and Purkinje image positioning unit receives the eye movement image data output by the eye movement acquisition device and outputs the image coordinates of the pupil center and the Purkinje image; the 3D target recognition unit receives the image data output by the two front-mounted image acquisition devices and outputs the image coordinates of the specific target in the 3D target region; the binocular vision unit receives the binocular calibration data, the image coordinates of the pupil center, the image coordinates of the Purkinje image, and the image coordinates of the specific target in the 3D target region, and outputs the three-dimensional coordinates of the pupil center, of the Purkinje image, and of the specific target in the 3D target region; and the eye movement data analysis unit receives the three-dimensional coordinates of the pupil center, of the Purkinje image, and of the specific target in the 3D target region, and outputs the eye movement vector, the eye state, and the gaze point in the 3D target region.
4. The augmented reality eye movement interaction system based on binocular vision technology according to claim 3, wherein the determination of the coordinates of the Purkinje image and the pupil center can be divided into six steps:
Step 1: Image preprocessing; the eye image is smoothed using Gaussian filtering;
Step 2: A thresholding method that adapts to the changes of each frame is used to obtain the region where the pupil is located;
Step 3: The center of the Purkinje image is extracted;
Step 4: Pupil contour feature points are obtained with a gradient method;
Step 5: An adaptive threshold is sought for determining the pupil position and size;
Step 6: The pupil center is determined by fitting the pupil contour with clustering.
5. The augmented reality eye movement interaction system based on binocular vision technology according to claim 3, wherein the calculation of the image coordinates of the specific target in the 3D target region consists of two steps: Step 1: A Region Proposal Network (RPN) extracts candidate regions; Step 2: RoIPool is used to extract features from the candidate regions for category classification and coordinate regression, and for each RoI (Region of Interest) the algorithm outputs a binary mask.
6. The augmented reality eye movement interaction system based on binocular vision technology according to claim 3, wherein the determination of the three-dimensional coordinates of the specific target in the 3D target region is divided into two steps:
Step 1: Determine the coordinates of the specific target P in the 3D target region in the coordinate system of the front-mounted image acquisition devices. The distance between the projection centers of the two front-mounted image acquisition devices is b, and the origin of each front-mounted image acquisition device's coordinate system is at the optical center of its lens. The imaging plane of each front-mounted image acquisition device lies behind the optical center of the lens; the left and right imaging planes are drawn at a distance f in front of the optical centers, with the u and v axes of the virtual image plane coordinate system O1uv aligned with the x and y axes of the front-mounted image acquisition device coordinate system. The origins of the left and right image coordinate systems are the intersections O1 and O2 of the camera optical axes with the planes. A point P in space has corresponding coordinates P1(u1, v1) in the left image and P2(u2, v2) in the right image. Assuming the images of the two front-mounted image acquisition devices lie in the same plane, the Y coordinates of the image coordinates of point P are identical, i.e. v1 = v2. From the triangle geometry:
u1 = f·xc/zc, u2 = f·(xc - b)/zc, v1 = v2 = f·yc/zc
In the above, (xc, yc, zc) are the coordinates of point P in the coordinate system of the left front-mounted image acquisition device, b is the baseline distance, f is the focal length of the front-mounted image acquisition devices, and (u1, v1) and (u2, v2) are the coordinates of point P in the left and right images, respectively. The disparity is the positional difference between the corresponding points of the same point in the two images: d = u1 - u2 = f·b/zc. From this, the coordinates of a point P in space in the coordinate system of the left front-mounted image acquisition device can be calculated as:
xc = b·u1/d, yc = b·v1/d, zc = b·f/d
Step 2: After determining the corresponding points of the specific target P in the coordinate systems of the left and right front-mounted image acquisition devices, the three-dimensional coordinates of point P can be determined from the intrinsic and extrinsic parameters of the left and right front-mounted image acquisition devices.
7. The augmented reality eye movement interaction system based on binocular vision technology according to claim 1, wherein the eye movement acquisition device includes eight infrared light sources and two infrared eye movement cameras for each eye.
CN201810147714.3A 2018-02-12 2018-02-12 An augmented reality eye movement interaction system based on binocular vision technology Pending CN108446018A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810147714.3A CN108446018A (en) 2018-02-12 2018-02-12 An augmented reality eye movement interaction system based on binocular vision technology

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810147714.3A CN108446018A (en) 2018-02-12 2018-02-12 An augmented reality eye movement interaction system based on binocular vision technology

Publications (1)

Publication Number Publication Date
CN108446018A true CN108446018A (en) 2018-08-24

Family

ID=63192292

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810147714.3A Pending CN108446018A (en) An augmented reality eye movement interaction system based on binocular vision technology

Country Status (1)

Country Link
CN (1) CN108446018A (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102981616A (en) * 2012-11-06 2013-03-20 中兴通讯股份有限公司 Identification method and identification system and computer capable of enhancing reality objects
CN103983186A (en) * 2014-04-17 2014-08-13 内蒙古大学 Binocular vision system correcting method and device
CN105812778A (en) * 2015-01-21 2016-07-27 成都理想境界科技有限公司 Binocular AR head-mounted display device and information display method therefor
US20170052595A1 (en) * 2015-08-21 2017-02-23 Adam Gabriel Poulos Holographic Display System with Undo Functionality
WO2017219195A1 (en) * 2016-06-20 2017-12-28 华为技术有限公司 Augmented reality displaying method and head-mounted display device


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Amer Al-Rahayfeh et al.: "Eye Tracking and Head Movement Detection: A State-of-Art Survey", IEEE Journal of Translational Engineering in Health and Medicine, Volume 1 *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110033652A (en) * 2019-03-11 2019-07-19 中国民航大学 A kind of radar dynamic object scaling method and system based on eye movement tracking
CN110033652B (en) * 2019-03-11 2021-06-04 中国民航大学 Radar dynamic target calibration method and system based on eye tracking
CN110215186A (en) * 2019-05-09 2019-09-10 南京览视医疗科技有限公司 One kind being automatically aligned to positioning fundus camera and its working method
CN110389664A (en) * 2019-06-25 2019-10-29 浙江大学 A kind of fire scenario sunykatuib analysis device and method based on augmented reality
CN112330747A (en) * 2020-09-25 2021-02-05 中国人民解放军军事科学院国防科技创新研究院 Multi-sensor combined detection and display method based on unmanned aerial vehicle platform
CN112330747B (en) * 2020-09-25 2022-11-11 中国人民解放军军事科学院国防科技创新研究院 Multi-sensor combined detection and display method based on unmanned aerial vehicle platform
CN113961068A (en) * 2021-09-29 2022-01-21 北京理工大学 Close-distance real object eye movement interaction method based on augmented reality helmet
CN113961068B (en) * 2021-09-29 2023-01-06 北京理工大学 Close-range real object eye movement interaction method based on augmented reality helmet

Similar Documents

Publication Publication Date Title
CN108446018A (en) An augmented reality eye movement interaction system based on binocular vision technology
US11749025B2 (en) Eye pose identification using eye features
CN104317391B (en) A kind of three-dimensional palm gesture recognition exchange method and system based on stereoscopic vision
CN106503671B (en) The method and apparatus for determining human face posture
KR101874494B1 (en) Apparatus and method for calculating 3 dimensional position of feature points
JP2008535116A (en) Method and apparatus for three-dimensional rendering
KR20160042564A (en) Apparatus and method for tracking gaze of glasses wearer
JP2018163481A (en) Face recognition device
CN110909571B (en) High-precision face recognition space positioning method
KR20160081775A (en) Method and apparatus for generating three dimension image
US20220405500A1 (en) Computationally efficient and robust ear saddle point detection
JP2022133133A (en) Generation device, generation method, system, and program
CN112101261A (en) Face recognition method, device, equipment and storage medium
KR20210147075A (en) Method and apparatus for measuring local power and/or power distribution of spectacle lenses
CN113140046A (en) AR (augmented reality) cross-over control method and system based on three-dimensional reconstruction and computer readable medium
Zhang et al. A 3D Face Modeling and Recognition Method Based on Binocular Stereo Vision and Depth-Sensing Detection
CN110189267A (en) A kind of real-time location method and device based on machine vision
US20220395176A1 (en) System and method for digital optician measurements
CN116883472B (en) Face nursing system based on face three-dimensional image registration
Kim et al. Confocal disparity estimation and recovery of pinhole image for real-aperture stereo camera systems
CN112052827B (en) Screen hiding method based on artificial intelligence technology
Zhang Acquisition and preprocessing of face recognition data
WO2024037722A1 (en) Devices, methods and computer programs for virtual eyeglasses try-on
Luo et al. Tracking of moving heads in cluttered scenes from stereo vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20180824
