CN106214118A - An eye movement monitoring system based on virtual reality - Google Patents

An eye movement monitoring system based on virtual reality

Info

Publication number
CN106214118A
Authority
CN
China
Prior art keywords
image
module
lens
virtual reality
movement based
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610658291.2A
Other languages
Chinese (zh)
Inventor
刘志勇
刘甘林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BEIJING ISEN TECHNOLOGY TRADE CO LTD
Original Assignee
BEIJING ISEN TECHNOLOGY TRADE CO LTD
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BEIJING ISEN TECHNOLOGY TRADE CO LTD filed Critical BEIJING ISEN TECHNOLOGY TRADE CO LTD
Publication of CN106214118A
Pending legal-status Critical Current


Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/103Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining refraction, e.g. refractometers, skiascopes
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B25/00Eyepieces; Magnifying glasses
    • G02B25/001Eyepieces
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted

Landscapes

  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Molecular Biology (AREA)
  • Medical Informatics (AREA)
  • Optics & Photonics (AREA)
  • Biophysics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • General Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Human Computer Interaction (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

An eye movement monitoring system based on virtual reality, comprising a virtual image display module and an eye movement trajectory image acquisition module. The virtual image display module includes a virtual imaging eyepiece device and an imaging positioning module; the eye movement trajectory image acquisition module includes a pupil localization processing module, an eyeball deflection angle mapping calculation module and a data acquisition module. By combining the virtual imaging eyepiece with computer software, the system reduces external interference in the detection process, standardizes dynamic visual acuity testing, obtains reliable eye movement characteristics, and provides a scientific basis for subsequent dynamic vision evaluation.

Description

An eye movement monitoring system based on virtual reality
Technical field
The present invention relates to an eye movement monitoring system, and in particular to a virtual reality eye movement monitoring system based on a virtual imaging eyepiece.
Background technology
At present, the common vision testing method uses a static visual acuity chart, while common dynamic visual acuity testing mainly detects the relative motion between the subject and a visual target. The latter, however, requires a doctor's assistance, places high demands on the test environment, involves many interference factors, and its results carry a high proportion of subjective influence.
Dynamic visual acuity testing equipment should effectively exclude environmental disturbances, present various simple or complex visual test signals on demand, and in particular use virtual imaging technology in the eyepiece to test further indices such as depth of field and stereoscopic vision. It should also capture eye images during the visual acuity test, identify whether the subject's eye is fixating on the target, and thereby judge the subject's ability to track the visual target. The device should be highly portable, conform to ergonomic principles, be comfortable to wear, and fit the facial contours of most people.
The dynamic visual acuity testing and assessment software should manage the subject's basic information concisely and conveniently, manage the subject's related test data and results, allow simple or complex visual acuity test signals to be configured and presented, and display the subject's eye images and related information in real time during the test.
Summary of the invention
To solve the above technical problems, the present invention provides an eye movement monitoring system based on virtual reality. By combining a virtual imaging eyepiece with computer software, it reduces external interference in the detection process, standardizes dynamic visual acuity testing, obtains reliable eye movement characteristics, and provides a scientific basis for subsequent dynamic vision evaluation.
To achieve the above object, the present invention provides the following technical scheme:
An eye movement monitoring system based on virtual reality, comprising: a virtual image display module, an eye movement trajectory image acquisition module, and a data fitting and analysis module.
Further, the virtual image display module includes a virtual imaging eyepiece device and an imaging positioning module. Inside the eyepiece device are a lens assembly, eye test zones, a micro display, a camera assembly and a data transmission component. The lens assembly consists of two lens groups arranged left and right; the micro display is located behind the lens assembly; two eye test zones, left and right, are correspondingly provided in front of the lens assembly; the camera assembly comprises two cameras, each facing one of the two eye test zones; the data transmission component connects the micro display and the camera assembly to a controller for data transmission.
Further, each lens group is composed of two or more spherical or aspherical lenses, at least one of which is a convex lens with a diameter of 20-40 mm and a center thickness of 10-18 mm; another is a concave lens with a diameter of 23-43 mm and a center thickness of 4-10 mm.
Further, the convex lens is made of polymethyl methacrylate (PMMA) or another transparent material; the concave lens is made of optical polyester resin (OKP1) or another transparent material.
Further, the distance between the eye test zone and the lens group is 4-30 mm; the optical axis of the lens group is perpendicular to the plane of the micro display, and the distance between the lens group and the micro display is 18-32 mm.
Further, the two cameras of the camera assembly are correspondingly positioned below the left and right eye test zones; a reflecting mirror inclined at 33-38° from the horizontal is provided in front of the lens groups so that the camera assembly can record images of the eyeball; the axial distance between a camera and the center axis of its lens group is 9-13 mm, and the horizontal distance between a camera and its lens group is 13-17 mm.
Further, the two cameras of the camera assembly are correspondingly positioned behind or in front of the two lens groups; a reflecting mirror inclined at 33-38° from the horizontal is provided behind the camera assembly so that the camera assembly can record images of the eyeball; the axial distance between a camera and the center axis of its lens group is 23-27 mm, and the horizontal distance between a camera and its lens group is 16-20 mm.
Further, a face support device is provided in front of the eye test zones, 140-180 mm long and 120-160 mm wide; the face support device is made of thermoplastic elastomer (TPE).
Further, the positioning method of the imaging positioning module comprises the following steps:
Step 1: read the left and right images P1 and P2 to be displayed on the micro display;
Step 2: calculate the displayable areas of the left and right halves of the micro display screen;
Step 3: scale the images P1 and P2 in proportion to the size of their displayable areas to obtain images P3 and P4;
Step 4: determine the upper-left coordinate reference point of each image from step 3;
Step 5: determine the initial display positions of images P3 and P4 on the micro display screen from the reference points obtained in step 4.
Further, the eye movement trajectory image acquisition module includes a pupil localization processing module, an eyeball deflection angle mapping calculation module and a data acquisition module.
Further, the pupil localization processing module comprises the following steps:
Step 1: read image information;
Step 2: apply Gaussian blur to the image and binarize it;
Step 3: perform edge detection on the image with a Canny edge detection operator using thresholds of 43 and 131 to obtain a Canny image;
Step 4: apply image erosion;
Step 5: find all contours;
Step 6: filter contours by contour area to obtain a result set;
Step 7: fit ellipses to all contours in the result set;
Step 8: check whether each fitted ellipse satisfies the major/minor axis ratio and ellipse area constraints;
Step 9: the resulting ellipse center coordinate is the pupil coordinate pn(xn, yn).
Further, the eyeball deflection angle mapping calculation module comprises the following steps:
Step 1: display a stimulus image signal at the center position;
Step 2: continuously acquire multiple image frames of pupil coordinates pn(xn, yn), and calculate the average pupil coordinate p0(x0, y0) over n frames;
Step 3: display a stimulus image signal with a horizontal deflection angle of A;
Step 4: continuously acquire multiple image frames of pupil coordinates pn(xn, yn), and calculate the average pupil coordinate pr(xr, yr) over n frames;
Step 5: display a stimulus image signal with a vertical deflection angle of B;
Step 6: continuously acquire multiple image frames of pupil coordinates pn(xn, yn), and calculate the average pupil coordinate pu(xu, yu) over n frames;
Step 7: derive the mapping equations;
Step 8: proceed to the data acquisition module.
Further, the data acquisition module comprises the following steps:
Step 1: from the mapping equations and the pupil coordinate, calculate the pupil's horizontal deflection angle and vertical deflection angle in any image frame;
Step 2: judge whether to continue acquisition;
Step 3: if yes, repeat the steps from the pupil localization processing module to the eyeball deflection angle mapping calculation module; if no, store the data.
Further, the system includes a data fitting and analysis module, which obtains real-time eye movement data from the eye movement trajectory image acquisition module and fits it into a motion characteristic curve.
The above technical scheme has the following advantages:
First, the eyepiece of the present invention uses the virtual image principle to fix the distance between the virtual image plane and the eyes, standardizing the conditions of vision testing.
Second, the eyepiece adopts an ergonomic design, meeting the convenience and comfort requirements of most users.
Third, the detection method is clear and simple; the eye movement trajectory image acquisition module continuously obtains the motion coordinates of the subject's pupil in real time, the algorithm is simple, and the data are accurate and reliable.
Fourth, the eye movement monitoring system of the present invention can fit the subject's eye movement data into a motion trajectory curve, so that test results can be presented intuitively, facilitating subsequent analysis and evaluation.
Brief description of the drawings
Fig. 1 is a structural schematic diagram of the virtual imaging eyepiece in embodiment 1;
Fig. 2 is a structural schematic diagram of the virtual imaging eyepiece in embodiment 2;
Fig. 3 is a flow chart of the method in the pupil localization processing module;
Fig. 4 is a flow chart of the method in the eyeball deflection angle mapping calculation module;
Fig. 5 is the overall technical roadmap of the eye movement trajectory image acquisition module.
In the figures, the reference numerals have the following meanings:
1: virtual image plane, 2: micro display, 3: lens group, 4: reflector, 5: camera assembly, 6: human eye.
Detailed description of the invention
To make the purpose, technical scheme and advantages of the present invention clearer, the present invention is further elaborated below with reference to the accompanying drawings and embodiments. It should be understood that the structural diagrams and specific embodiments described herein are only intended to explain the present invention, not to limit it.
The invention provides an eye movement monitoring system based on virtual reality, comprising: a virtual image display module, an eye movement trajectory image acquisition module, and a data fitting and analysis module. The virtual image display module includes a hardware virtual imaging eyepiece device and software-based image positioning.
Embodiment 1
Fig. 1 is a structural schematic diagram of the virtual imaging eyepiece device of one embodiment of the invention. As shown in Fig. 1, inside the eyepiece device are a lens assembly 3, eye test zones, a micro display 2, a camera assembly 5 and a data transmission component. The lens assembly consists of two lens groups arranged left and right; the micro display is located behind the lens assembly; two eye test zones, left and right, are correspondingly provided in front of the lens assembly; the camera assembly comprises two cameras, each facing one of the two eye test zones; the data transmission component connects the micro display and the camera assembly to a controller for data transmission.
Each lens group is composed of two aspherical lenses, one of which is a convex lens with a diameter of 20-40 mm and a center thickness of 10-18 mm; the other is a concave lens with a diameter of 23-43 mm and a center thickness of 4-10 mm. The convex lens is made of polymethyl methacrylate (PMMA) or another transparent material; the concave lens is made of optical polyester resin (OKP1) or another transparent material. The distance between the eye test zone and the lens group is 4-30 mm; the optical axis of the lens group is perpendicular to the plane of the micro display, and the distance between the lens group and the micro display is 18-32 mm.
In this embodiment, the two cameras of the camera assembly are correspondingly positioned below the left and right eye test zones; a reflecting mirror inclined at 35° from the horizontal is provided in front of the lens groups so that the camera assembly can record images of the eyeball. The axial distance between a camera and the center axis of its lens group is 9-13 mm, and the horizontal distance between a camera and its lens group is 13-17 mm. This design is structurally simple, but has the drawbacks of a small viewing angle, increased thickness, and reduced wearing comfort.
Preferably, a face support device is provided in front of the eye test zones, 140-180 mm long and 120-160 mm wide, and made of thermoplastic elastomer (TPE). This design conforms to ergonomics, reduces the effort the subject must exert to support the virtual eyepiece, and improves comfort during testing.
Embodiment 2
Fig. 2 is a structural schematic diagram of the virtual imaging eyepiece device of another embodiment of the invention. It differs from embodiment 1 in that the two cameras of the camera assembly are correspondingly positioned behind the two lens groups; a reflecting mirror inclined at 36° from the horizontal is provided behind the cameras so that the camera assembly can record images of the eyeball. The axial distance between a camera and the center axis of its lens group is 23-27 mm, and the horizontal distance between a camera and its lens group is 16-20 mm. This design provides a large viewing angle, reduces the thickness and improves wearing comfort, but requires a specific optical design to correct image distortion, making the optical design complex.
Embodiment 3
The micro display, magnified by the lens groups, forms a virtual image plane. Suppose a target B is displayed at positions Bl and Br on the display; on the virtual image plane the displayed positions of target B become L1 and R1, and the distance between them is the parallax P. The distance between the virtual image plane and the eyes is L, and the interpupillary distance is D. The fused target finally observed by both eyes is the viewpoint A; if the stereoscopic perceived depth of viewpoint A is V, the relationship between these variables is expressed by the following formula:
V = L * P / (P - D)
According to the above formula, assume the distance between the virtual image plane of the virtual reality display module and the eyes is defined as 1000 mm and the micro display resolution is W * H. If the rendered stereoscopic target is required to lie on the virtual image screen, i.e. V = 0, then the parallax P of target B on the virtual image plane should be 0. Combining this with the interpupillary distance parameter, the spacing of Bl and Br is then W/2 + (D/62) * 10 pixels.
The positioning method of the imaging positioning module comprises the following steps:
Step 1: read the left and right images P1 and P2 to be displayed on the micro display;
Step 2: calculate the displayable areas of the left and right halves of the micro display screen;
Step 3: scale the images P1 and P2 in proportion to the size of their displayable areas to obtain images P3 and P4;
Step 4: determine the upper-left coordinate reference point of each image from step 3;
Step 5: determine the initial display positions of images P3 and P4 on the micro display screen from the reference points obtained in step 4.
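A minimal sketch of these five steps, assuming OpenCV/NumPy image arrays, a side-by-side split of the micro display into left and right halves, and centring of each scaled image within its half (the function names and the centring choice are assumptions; the patent only fixes the upper-left reference points):

    import cv2

    def fit_to_area(img, area_w, area_h):
        """Step 3: proportionally scale an image so it fits inside a displayable area."""
        h, w = img.shape[:2]
        scale = min(area_w / w, area_h / h)
        return cv2.resize(img, (int(w * scale), int(h * scale)))

    def position_stereo_images(p1, p2, screen_w, screen_h):
        """Steps 1-5: scale the left/right images P1, P2 to the two halves of the
        micro display and return their upper-left display coordinates."""
        half_w = screen_w // 2                                 # step 2: displayable area of each half
        p3 = fit_to_area(p1, half_w, screen_h)                 # step 3
        p4 = fit_to_area(p2, half_w, screen_h)
        # steps 4-5: upper-left reference points, here centred within each half (assumption)
        left_origin = ((half_w - p3.shape[1]) // 2, (screen_h - p3.shape[0]) // 2)
        right_origin = (half_w + (half_w - p4.shape[1]) // 2, (screen_h - p4.shape[0]) // 2)
        return p3, p4, left_origin, right_origin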
Embodiment 4
The eye movement trajectory image acquisition module includes a pupil localization processing module, an eyeball deflection angle mapping calculation module and a data acquisition module. Its overall technical roadmap is shown in Fig. 5. The pupil localization processing module obtains the pupil coordinates in real time; the eyeball deflection angle mapping calculation module computes the mapping equations; the pupil localization processing module then again obtains the real-time pupil coordinates, from which the eyeball deflection angle is calculated and the data are stored.
Fig. 3 is the flow chart of the pupil localization processing module. The pupil localization method comprises the following steps:
Step 1: read the image from memory;
Step 2: apply Gaussian blur to the image with a 3x3 kernel;
Step 3: binarize the image;
Step 4: perform edge detection with a Canny operator using thresholds of 43 and 131 to obtain a Canny image;
Step 5: apply erosion to the Canny image;
Step 6: find all contours;
Step 7: filter contours by contour area to obtain result set A;
Step 8: fit ellipses to all contours in A;
Step 9: check whether each fitted ellipse satisfies the major/minor axis ratio and ellipse area constraints;
Step 10: the resulting ellipse center coordinate is the pupil coordinate pn(xn, yn).
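A compact sketch of this pipeline, assuming OpenCV 4 (cv2) and a grayscale input image; the binarization threshold and the ellipse acceptance limits are illustrative placeholders, since the patent names the constraints but not their values:

    import cv2
    import numpy as np

    def locate_pupil(gray):
        """Pupil localization following steps 1-10: 3x3 Gaussian blur, binarize,
        Canny with thresholds 43/131, erode, find contours, filter by area,
        fit ellipses, and return the center of the first acceptable ellipse."""
        blurred = cv2.GaussianBlur(gray, (3, 3), 0)
        _, binary = cv2.threshold(blurred, 60, 255, cv2.THRESH_BINARY_INV)   # threshold value assumed
        edges = cv2.Canny(binary, 43, 131)
        eroded = cv2.erode(edges, np.ones((3, 3), np.uint8))
        contours, _ = cv2.findContours(eroded, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
        for c in contours:
            if len(c) < 5 or not 100 < cv2.contourArea(c) < 20000:           # area filter (values assumed)
                continue
            (cx, cy), axes, _ = cv2.fitEllipse(c)
            major, minor = max(axes), min(axes)
            area = np.pi * major * minor / 4
            if major / max(minor, 1e-6) < 1.5 and 100 < area < 20000:        # axis-ratio / area checks (values assumed)
                return cx, cy                                                # pupil coordinate pn(xn, yn)
        return None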
Embodiment 5
The flow chart of the algorithm in the eyeball deflection angle mapping calculation module is shown in Fig. 4.
The method comprises the following steps:
Step 1: display a stimulus image signal at the center position;
Step 2: capture an eye image;
Step 3: obtain the pupil coordinate using the pupil processing algorithm;
Step 4: repeat steps 2 and 3 n times, and calculate the average pupil coordinate p0(x0, y0) over the n frames;
Step 5: display a stimulus image signal with a horizontal deflection angle of A;
Step 6: capture an eye image;
Step 7: obtain the pupil coordinate using the pupil processing algorithm;
Step 8: repeat steps 6 and 7 n times, and calculate the average pupil coordinate pr(xr, yr) over the n frames;
Step 9: display a stimulus image signal with a vertical deflection angle of B;
Step 10: capture an eye image;
Step 11: obtain the pupil coordinate using the pupil processing algorithm;
Step 12: repeat steps 10 and 11 n times, and calculate the average pupil coordinate pu(xu, yu) over the n frames;
Horizontal mapping ratio: k1 = A / (xr - x0);
Vertical mapping ratio: k2 = B / (yu - y0).
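A minimal sketch of this calibration flow, assuming a frame source and a locate_pupil-style function such as the one sketched in embodiment 4 (the helper names display_stimulus and capture_frames are illustrative, not from the patent):

    def average_pupil(capture_frames, locate_pupil, n):
        """Steps 2-4 / 6-8 / 10-12: average the pupil coordinate over n captured frames."""
        coords = [locate_pupil(frame) for frame in capture_frames(n)]
        xs, ys = zip(*coords)
        return sum(xs) / n, sum(ys) / n

    def calibrate(display_stimulus, capture_frames, locate_pupil, A, B, n=10):
        """Derive the horizontal/vertical mapping ratios k1, k2 from three stimuli."""
        display_stimulus(h_angle=0, v_angle=0)                   # step 1: center stimulus
        x0, y0 = average_pupil(capture_frames, locate_pupil, n)
        display_stimulus(h_angle=A, v_angle=0)                   # step 5: horizontal deflection A
        xr, _ = average_pupil(capture_frames, locate_pupil, n)
        display_stimulus(h_angle=0, v_angle=B)                   # step 9: vertical deflection B
        _, yu = average_pupil(capture_frames, locate_pupil, n)
        k1 = A / (xr - x0)                                       # horizontal mapping ratio
        k2 = B / (yu - y0)                                       # vertical mapping ratio
        return (x0, y0), k1, k2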
Embodiment 6
The method in the data acquisition module, with reference to Fig. 5, comprises the following steps:
Step 1: from the mapping equations and the pupil coordinate, calculate the pupil's horizontal and vertical deflection angles in any image frame; the horizontal deflection angle is k1 * (xn - x0) and the vertical deflection angle is k2 * (yn - y0);
Step 2: judge whether to continue acquisition;
Step 3: if yes, repeat the steps from the pupil localization processing module to the eyeball deflection angle mapping calculation module; if no, store the data.
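A sketch of the acquisition loop under the same assumptions, converting each frame's pupil coordinate to deflection angles with the calibrated ratios (helper names are illustrative):

    def acquire(capture_frame, locate_pupil, origin, k1, k2, keep_going):
        """Steps 1-3: map each pupil coordinate to horizontal/vertical deflection
        angles and collect the samples until acquisition is stopped."""
        x0, y0 = origin
        samples = []
        while keep_going():                            # step 2: continue acquiring?
            xn, yn = locate_pupil(capture_frame())     # step 1: pupil coordinate of the current frame
            h_angle = k1 * (xn - x0)                   # horizontal deflection angle
            v_angle = k2 * (yn - y0)                   # vertical deflection angle
            samples.append((h_angle, v_angle))
        return samples                                 # step 3: store the data when acquisition ends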
Embodiment 7
The present invention may also preferably include a data fitting and analysis module, which obtains real-time eye movement data from the eye movement trajectory image acquisition module and fits it into a motion characteristic curve.
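As an illustration of the fitting step, the sketch below assumes NumPy and a simple polynomial model; the patent does not specify the fitting method, so the polynomial choice and its degree are assumptions:

    import numpy as np

    def fit_motion_curve(timestamps, angles, degree=3):
        """Fit a smooth motion characteristic curve to sampled deflection angles."""
        coeffs = np.polyfit(timestamps, angles, degree)
        return np.poly1d(coeffs)

    # Usage: curve = fit_motion_curve(t, horizontal_angles); curve(t_query) evaluates the fit.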
The embodiments described above only express several implementations of the present invention; although described concretely and in detail, they are not to be construed as limiting the scope of the claims. It should be noted that those of ordinary skill in the art may make various deformations and improvements without departing from the inventive concept, and these all fall within the protection scope of the present invention. Therefore, the protection scope of this patent shall be defined by the appended claims.

Claims (14)

1. An eye movement monitoring system based on virtual reality, characterized by comprising: a virtual image display module and an eye movement trajectory image acquisition module.
2. The eye movement monitoring system based on virtual reality according to claim 1, characterized in that: the virtual image display module includes a virtual imaging eyepiece device and an imaging positioning module; inside the eyepiece device are a lens assembly, eye test zones, a micro display, a camera assembly and a data transmission component; the lens assembly consists of two lens groups arranged left and right; the micro display is located behind the lens assembly; two eye test zones, left and right, are correspondingly provided in front of the lens assembly; the camera assembly comprises two cameras, each facing one of the two eye test zones; the data transmission component connects the micro display and the camera assembly to a computer, wired or wirelessly, for data transmission.
3. The eye movement monitoring system based on virtual reality according to claim 2, characterized in that: each lens group is composed of two or more spherical or aspherical lenses, at least one of which is a convex lens with a diameter of 20-40 mm and a center thickness of 10-18 mm; another is a concave lens with a diameter of 23-43 mm and a center thickness of 4-10 mm.
4. The eye movement monitoring system based on virtual reality according to claim 3, characterized in that: the convex lens is made of polymethyl methacrylate (PMMA) or another transparent material; the concave lens is made of optical polyester resin (OKP1) or another transparent material.
5. The eye movement monitoring system based on virtual reality according to claim 2, characterized in that: the distance between the eye test zone and the lens group is 4-30 mm; the optical axis of the lens group is perpendicular to the plane of the micro display, and the distance between the lens group and the micro display is 18-32 mm.
6. The eye movement monitoring system based on virtual reality according to claim 1, characterized in that the two cameras of the camera assembly are correspondingly positioned below the left and right eye test zones; a reflecting mirror inclined at 33-38° from the horizontal is provided in front of the lens groups so that the camera assembly can record images of the eyeball; the axial distance between a camera and the center axis of its lens group is 9-13 mm, and the horizontal distance between a camera and its lens group is 13-17 mm.
7. The eye movement monitoring system based on virtual reality according to claim 2, characterized in that the two cameras of the camera assembly are correspondingly positioned behind or in front of the two lens groups; a reflecting mirror inclined at 33-38° from the horizontal is provided behind the camera assembly so that the camera assembly can record images of the eyeball; the axial distance between a camera and the center axis of its lens group is 23-27 mm, and the horizontal distance between a camera and its lens group is 16-20 mm.
8. The eye movement monitoring system based on virtual reality according to claim 2, characterized in that a face support device is provided in front of the eye test zones, 140-180 mm long and 120-160 mm wide; the face support device is made of thermoplastic elastomer (TPE).
9. The eye movement monitoring system based on virtual reality according to claim 2, characterized in that the positioning method of the imaging positioning module comprises the following steps:
Step 1: read the left and right images P1 and P2 to be displayed on the micro display;
Step 2: calculate the displayable areas of the left and right halves of the micro display screen;
Step 3: scale the images P1 and P2 in proportion to the size of their displayable areas to obtain images P3 and P4;
Step 4: determine the upper-left coordinate reference point of each image from step 3;
Step 5: determine the initial display positions of images P3 and P4 on the micro display screen from the reference points obtained in step 4.
10. The eye movement monitoring system based on virtual reality according to claim 1, characterized in that the eye movement trajectory image acquisition module includes a pupil localization processing module, an eyeball deflection angle mapping calculation module and a data acquisition module.
11. The eye movement monitoring system based on virtual reality according to claim 10, characterized in that the pupil localization processing module comprises the following steps:
Step 1: read image information;
Step 2: apply Gaussian blur to the image and binarize it;
Step 3: perform edge detection on the image with a Canny edge detection operator using thresholds of 43 and 131 to obtain a Canny image;
Step 4: apply image erosion;
Step 5: find all contours;
Step 6: filter contours by contour area to obtain a result set;
Step 7: fit ellipses to all contours in the result set;
Step 8: check whether each fitted ellipse satisfies the major/minor axis ratio and ellipse area constraints;
Step 9: the resulting ellipse center coordinate is the pupil coordinate pn(xn, yn).
12. The eye movement monitoring system based on virtual reality according to claim 10, characterized in that the eyeball deflection angle mapping calculation module comprises the following steps:
Step 1: display a stimulus image signal at the center position;
Step 2: continuously acquire multiple image frames of pupil coordinates pn(xn, yn), and calculate the average pupil coordinate p0(x0, y0) over n frames;
Step 3: display a stimulus image signal with a horizontal deflection angle of A;
Step 4: continuously acquire multiple image frames of pupil coordinates pn(xn, yn), and calculate the average pupil coordinate pr(xr, yr) over n frames;
Step 5: display a stimulus image signal with a vertical deflection angle of B;
Step 6: continuously acquire multiple image frames of pupil coordinates pn(xn, yn), and calculate the average pupil coordinate pu(xu, yu) over n frames;
Step 7: derive the mapping equations;
Step 8: proceed to the data acquisition module.
13. The eye movement monitoring system based on virtual reality according to claim 10, characterized in that the data acquisition module comprises the following steps:
Step 1: from the mapping equations and the pupil coordinate, calculate the pupil's horizontal deflection angle and vertical deflection angle in any image frame;
Step 2: judge whether to continue acquisition;
Step 3: if yes, repeat the steps from the pupil localization processing module to the eyeball deflection angle mapping calculation module; if no, store the data.
14. The eye movement monitoring system based on virtual reality according to claim 1, further comprising a data fitting and analysis module, characterized in that the data fitting and analysis module obtains real-time eye movement data from the eye movement trajectory image acquisition module and fits it into a motion characteristic curve.
CN201610658291.2A 2016-01-28 2016-08-11 An eye movement monitoring system based on virtual reality Pending CN106214118A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2016100587683 2016-01-28
CN201610058768 2016-01-28

Publications (1)

Publication Number Publication Date
CN106214118A true CN106214118A (en) 2016-12-14

Family

ID=57548111

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610658291.2A Pending CN106214118A (en) An eye movement monitoring system based on virtual reality

Country Status (1)

Country Link
CN (1) CN106214118A (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106908951A (en) * 2017-02-27 2017-06-30 阿里巴巴集团控股有限公司 Virtual reality helmet
CN106923785A (en) * 2017-02-17 2017-07-07 大辅科技(北京)有限公司 Vision screening system based on virtual reality technology
CN107174195A (en) * 2017-05-16 2017-09-19 上海展志光学仪器有限公司 Visual chart projecting method and VR spectacle vision table projecting apparatus based on VR technologies
CN107451551A (en) * 2017-07-24 2017-12-08 武汉秀宝软件有限公司 A kind of optimization method and system for preventing float
WO2018153368A1 (en) * 2017-02-27 2018-08-30 阿里巴巴集团控股有限公司 Virtual reality head-mounted apparatus
CN108742510A (en) * 2018-06-20 2018-11-06 首都医科大学附属北京儿童医院 Slant visibility and horizontal torsion angle detector suitable for low age infant
CN109839742A (en) * 2017-11-29 2019-06-04 深圳市掌网科技股份有限公司 A kind of augmented reality device based on Eye-controlling focus
CN110208947A (en) * 2019-06-03 2019-09-06 歌尔股份有限公司 Display equipment and display methods based on human eye tracking
CN111429567A (en) * 2020-03-23 2020-07-17 成都威爱新经济技术研究院有限公司 Digital virtual human eyeball real environment reflection method
WO2020216106A1 (en) * 2019-04-24 2020-10-29 洪浛檩 Wearable computing device and human-computer interaction method
CN113288044A (en) * 2021-05-27 2021-08-24 北京大学第三医院(北京大学第三临床医学院) Dynamic vision testing system and method
CN113633257A (en) * 2021-07-29 2021-11-12 佛山市第一人民医院(中山大学附属佛山医院) Vestibular function examination method, system, device and medium based on virtual reality
CN116642670A (en) * 2023-07-27 2023-08-25 武汉精立电子技术有限公司 Optical imaging method and device for micro display detection

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000307973A (en) * 2000-01-01 2000-11-02 Atr Media Integration & Communications Res Lab Head-mounted display
US20070177103A1 (en) * 2004-02-04 2007-08-02 Migliaccio Americo A Method and apparatus for three-dimensional video-oculography
CN101273880A (en) * 2008-04-23 2008-10-01 中国人民解放军空军航空医学研究所 Method for examining opto-kinetic reflex and ocular kinetic reflex using virtual vision target
CN102068237A (en) * 2004-04-01 2011-05-25 威廉·C·托奇 Controllers and Methods for Monitoring Eye Movement, System and Method for Controlling Calculation Device
CN104539932A (en) * 2014-12-18 2015-04-22 青岛歌尔声学科技有限公司 3D glasses camera
CN104732191A (en) * 2013-12-23 2015-06-24 北京七鑫易维信息技术有限公司 Device and method for achieving eye-tracking of virtual display screens by means of crossratio invariability
CN104793741A (en) * 2015-04-03 2015-07-22 深圳市虚拟现实科技有限公司 Imaging system and method for guiding eyeballs to trace virtual reality
CN204515250U (en) * 2015-02-28 2015-07-29 鲍宇曦 A kind of novel wear-type image documentation equipment
US20150301599A1 (en) * 2014-04-18 2015-10-22 Magic Leap, Inc. Eye tracking systems and method for augmented or virtual reality
US20150335278A1 (en) * 2008-10-09 2015-11-26 Neuro Kinetics, Inc. Noninvasive rapid screening of mild traumatic brain injury using combination of subject's objective oculomotor, vestibular and reaction time analytic variables
CN105142498A (en) * 2013-03-15 2015-12-09 感知技术有限公司 Enhanced optical and perceptual digital eyewear

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000307973A (en) * 2000-01-01 2000-11-02 Atr Media Integration & Communications Res Lab Head-mounted display
US20070177103A1 (en) * 2004-02-04 2007-08-02 Migliaccio Americo A Method and apparatus for three-dimensional video-oculography
CN102068237A (en) * 2004-04-01 2011-05-25 威廉·C·托奇 Controllers and Methods for Monitoring Eye Movement, System and Method for Controlling Calculation Device
CN101273880A (en) * 2008-04-23 2008-10-01 中国人民解放军空军航空医学研究所 Method for examining opto-kinetic reflex and ocular kinetic reflex using virtual vision target
US20150335278A1 (en) * 2008-10-09 2015-11-26 Neuro Kinetics, Inc. Noninvasive rapid screening of mild traumatic brain injury using combination of subject's objective oculomotor, vestibular and reaction time analytic variables
CN105142498A (en) * 2013-03-15 2015-12-09 感知技术有限公司 Enhanced optical and perceptual digital eyewear
CN104732191A (en) * 2013-12-23 2015-06-24 北京七鑫易维信息技术有限公司 Device and method for achieving eye-tracking of virtual display screens by means of crossratio invariability
US20150301599A1 (en) * 2014-04-18 2015-10-22 Magic Leap, Inc. Eye tracking systems and method for augmented or virtual reality
CN104539932A (en) * 2014-12-18 2015-04-22 青岛歌尔声学科技有限公司 3D glasses camera
CN204515250U (en) * 2015-02-28 2015-07-29 鲍宇曦 A kind of novel wear-type image documentation equipment
CN104793741A (en) * 2015-04-03 2015-07-22 深圳市虚拟现实科技有限公司 Imaging system and method for guiding eyeballs to trace virtual reality

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106923785A (en) * 2017-02-17 2017-07-07 大辅科技(北京)有限公司 Vision screening system based on virtual reality technology
CN106908951A (en) * 2017-02-27 2017-06-30 阿里巴巴集团控股有限公司 Virtual reality helmet
WO2018153368A1 (en) * 2017-02-27 2018-08-30 阿里巴巴集团控股有限公司 Virtual reality head-mounted apparatus
WO2018153371A1 (en) * 2017-02-27 2018-08-30 阿里巴巴集团控股有限公司 Virtual reality head-mounted apparatus
US10996477B2 (en) 2017-02-27 2021-05-04 Advanced New Technologies Co., Ltd. Virtual reality head-mounted apparatus
TWI659229B (en) * 2017-02-27 2019-05-11 香港商阿里巴巴集團服務有限公司 Virtual reality headset
US11442270B2 (en) 2017-02-27 2022-09-13 Advanced New Technologies Co., Ltd. Virtual reality head-mounted apparatus with a partial-reflection partial-transmission wedge
CN107174195A (en) * 2017-05-16 2017-09-19 上海展志光学仪器有限公司 Visual chart projecting method and VR spectacle vision table projecting apparatus based on VR technologies
CN107174195B (en) * 2017-05-16 2019-05-14 上海展志光学仪器有限公司 Visual chart projecting method and VR spectacle vision table projector based on VR technology
CN107451551A (en) * 2017-07-24 2017-12-08 武汉秀宝软件有限公司 A kind of optimization method and system for preventing float
CN109839742A (en) * 2017-11-29 2019-06-04 深圳市掌网科技股份有限公司 A kind of augmented reality device based on Eye-controlling focus
CN108742510B (en) * 2018-06-20 2023-06-06 首都医科大学附属北京儿童医院 Oblique vision and horizontal torsion angle detector suitable for children with low age
CN108742510A (en) * 2018-06-20 2018-11-06 首都医科大学附属北京儿童医院 Slant visibility and horizontal torsion angle detector suitable for low age infant
WO2020216106A1 (en) * 2019-04-24 2020-10-29 洪浛檩 Wearable computing device and human-computer interaction method
CN110208947A (en) * 2019-06-03 2019-09-06 歌尔股份有限公司 Display equipment and display methods based on human eye tracking
CN110208947B (en) * 2019-06-03 2021-10-08 歌尔光学科技有限公司 Display device and display method based on human eye tracking
CN111429567A (en) * 2020-03-23 2020-07-17 成都威爱新经济技术研究院有限公司 Digital virtual human eyeball real environment reflection method
CN113288044B (en) * 2021-05-27 2022-01-07 北京大学第三医院(北京大学第三临床医学院) Dynamic vision testing system and method
CN113288044A (en) * 2021-05-27 2021-08-24 北京大学第三医院(北京大学第三临床医学院) Dynamic vision testing system and method
CN113633257A (en) * 2021-07-29 2021-11-12 佛山市第一人民医院(中山大学附属佛山医院) Vestibular function examination method, system, device and medium based on virtual reality
CN113633257B (en) * 2021-07-29 2023-12-05 佛山市第一人民医院(中山大学附属佛山医院) Vestibular function checking method, system, equipment and medium based on virtual reality
CN116642670A (en) * 2023-07-27 2023-08-25 武汉精立电子技术有限公司 Optical imaging method and device for micro display detection
CN116642670B (en) * 2023-07-27 2024-02-27 武汉精立电子技术有限公司 Optical imaging method and device for micro display detection

Similar Documents

Publication Publication Date Title
CN106214118A (en) An eye movement monitoring system based on virtual reality
CN109690553A (en) The system and method for executing eye gaze tracking
Morimoto et al. Detecting eye position and gaze from a single camera and 2 light sources
CN201307266Y (en) Binocular sightline tracking device
CN106840112B (en) A kind of space geometry measuring method measured using free space eye gaze point
Hennessey et al. Noncontact binocular eye-gaze tracking for point-of-gaze estimation in three dimensions
CN106796449A (en) Eye-controlling focus method and device
KR20150117553A (en) Method, apparatus and computer readable recording medium for eye gaze tracking
CN105354825B (en) The intelligent apparatus of reading matter position and its application in automatic identification read-write scene
CN103366157A (en) Method for judging line-of-sight distance of human eye
EP3644826A1 (en) A wearable eye tracking system with slippage detection and correction
WO2020020022A1 (en) Method for visual recognition and system thereof
CN105739707A (en) Electronic equipment, face identifying and tracking method and three-dimensional display method
CN100569176C (en) Method for examining opto-kinetic reflex and ocular kinetic reflex using a virtual visual target
CN105354822B (en) The intelligent apparatus of read-write element position and application in automatic identification read-write scene
CN104615978A (en) Sight direction tracking method and device
CN106218409A (en) A naked-eye 3D automobile instrument display method and device capable of human eye tracking
US20190196221A1 (en) System and Method of Obtaining Fit and Fabrication Measurements for Eyeglasses Using Simultaneous Localization and Mapping of Camera Images
US20200393896A1 (en) System and method for gaze estimation
JP5016959B2 (en) Visibility determination device
CN103517060A (en) Method and device for display control of terminal device
CN108537103B (en) Living body face detection method and device based on pupil axis measurement
CN108985291A (en) A kind of eyes tracing system based on single camera
WO2016101861A1 (en) Head-worn display device
CN105354828A (en) Intelligent identification method of three-dimensional coordinates of book in reading and writing scene and application thereof

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20161214