CN112860057A - VR (virtual reality) equipment-based searching method for tracking sea surface ship by eye movement - Google Patents

VR (virtual reality) equipment-based searching method for tracking sea surface ship by eye movement

Info

Publication number
CN112860057A
CN112860057A (application CN202011616601.7A)
Authority
CN
China
Prior art keywords
eye
point
searching method
equipment
calibrating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011616601.7A
Other languages
Chinese (zh)
Inventor
陈奕钪
黄小卫
张维佳
吴聪
蔡驰
李晓骏
郭强
陈诚
赵刚
蒋道宇
张劲中
金卫华
孙成军
范大宏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Abas Information Technology Co ltd
Guangzhou Bureau of Extra High Voltage Power Transmission Co
Original Assignee
Nanjing Abas Information Technology Co ltd
Guangzhou Bureau of Extra High Voltage Power Transmission Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Abas Information Technology Co ltd, Guangzhou Bureau of Extra High Voltage Power Transmission Co filed Critical Nanjing Abas Information Technology Co ltd
Priority to CN202011616601.7A
Publication of CN112860057A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/13 Satellite images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris

Abstract

The invention discloses a VR-equipment-based searching method for tracking sea surface ships by eye movement, which comprises the following steps: (1) two high-magnification cameras are mounted on the vessel and signal-connected to a computer; (2) the picture output by the high-resolution camera is calibrated and its kappa angle is calibrated; (3) the inner product of the gradient vector and the position vector is computed at each point of the image, and the point with the maximum average value is taken as the pupil center, where Xi ranges over all points in the image; (4) the fixation point is estimated and calibrated; (5) the computer calculates the coordinates of the fixation point through the mapping function; using eye tracking, the VR equipment quickly and accurately measures the sea surface direction watched by the user, an AI algorithm learns from the measured data, and after training it gives reasonable suggestions on the operator's search skills. The method has a high level of automation, accurate positioning and sufficient evidence, overcomes the shortcomings of manual inspection, and is of great significance for the safe and stable operation of submarine cables.

Description

VR (virtual reality) equipment-based searching method for tracking sea surface ship by eye movement
Technical Field
The invention relates to a VR (virtual reality) equipment-based method for searching sea surface vessels by eye tracking, and belongs to the field of navigation.
Background
Submarine cables are complex in structure and large in size, and cost more than one million yuan per kilometre. Once a laid cable is hooked and broken, a large amount of manpower and material resources must be spent on recovery and repair; the work takes a long time and brings huge economic losses to power enterprises and users, so the damage caused by anchoring accidents is enormous. After an anchoring accident occurs, identifying the offending ship in time and organizing accident claims is very important for determining responsibility and recovering the economic loss. Because of the influence of waves and ocean currents, the distance between the anchoring point and the position of the ship is uncertain, and especially when more than one ship is in the area, the offending ship cannot be accurately identified by the naked eye of the operator on duty, which makes identification very difficult.
Disclosure of Invention
The purpose of the invention is as follows: in order to overcome the defects of the prior art, the invention provides a VR-equipment-based searching method for tracking sea surface ships by eye movement, which can search for the offending ship accurately and in time and greatly improve identification capability.
The technical scheme is as follows: in order to solve the above technical problem, the VR-equipment-based method for searching sea vessels by eye tracking of the present invention comprises the following steps:
(1) two high-magnification cameras are mounted on the vessel and are in signal connection with a computer;
(2) calibrating the picture output by the high-resolution camera and calibrating its kappa angle, specifically as follows:
(21) the user gazes at a fixed point Pe on the screen, and the optical axis direction OP0 is calculated as in the preceding process;
(22) the eyeball center O is connected to the fixed point Pe to obtain the line-of-sight direction OPe;
(23) the azimuth angles of the two lines, the line of sight and the optical axis, are solved respectively;
(3) calculating, at each point Xi of the image, the inner product of the gradient vector gi and the unit position vector di = (Xi − c)/|Xi − c| taken from a candidate center c, and selecting the candidate with the maximum average value as the pupil center:
c* = argmax over c of (1/N) Σi ⟨gi, di⟩,
where Xi ranges over all N points in the image (a code sketch of this step follows the list below);
(4) estimating and calibrating the fixation point: 6 calibration points are determined, and the fitted polynomials are
Xf = A0 + A1·ΔX + A2·ΔY + A3·ΔX·ΔY + A4·ΔX² + A5·ΔY²,
Yf = B0 + B1·ΔX + B2·ΔY + B3·ΔX·ΔY + B4·ΔX² + B5·ΔY².
The eye tracking system is started and passes through the 6 calibration points in sequence to obtain 6 groups of eye movement parameters; substituting these into the formulas above yields the coefficient matrices A and B and hence the mapping function;
(5) the computer calculates the coordinates of the fixation point through the mapping function; using eye tracking, the VR equipment quickly and accurately measures the sea surface direction watched by the user, an AI algorithm learns from the measured data, and after training it gives reasonable suggestions on the operator's search skills.
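The following minimal Python sketch illustrates step (3). It is only a sketch of the means-of-gradients idea described above: the coarse candidate grid, the gradient-magnitude mask and all function names are assumptions, not part of the patent.

```python
import numpy as np

def pupil_center(gray: np.ndarray) -> tuple:
    """Step (3) sketch: for every candidate center c, average the inner
    products between the image gradient g_i at each point X_i and the unit
    displacement vector d_i = (X_i - c) / |X_i - c|; return the candidate
    with the maximum average value as the pupil center."""
    gy, gx = np.gradient(gray.astype(np.float64))   # image gradients (rows, cols)
    h, w = gray.shape
    ys, xs = np.mgrid[0:h, 0:w]
    mag = np.hypot(gx, gy)
    mask = mag > np.percentile(mag, 90)             # keep strong edges only (assumed)
    px, py = xs[mask], ys[mask]
    ux, uy = gx[mask] / mag[mask], gy[mask] / mag[mask]
    best, best_score = (0, 0), -np.inf
    for cy in range(0, h, 4):                       # coarse 4-px candidate grid (assumed)
        for cx in range(0, w, 4):
            dx, dy = px - cx, py - cy
            dn = np.hypot(dx, dy)
            dn[dn == 0] = 1.0                       # avoid division by zero
            score = np.mean((dx / dn) * ux + (dy / dn) * uy)
            if score > best_score:
                best_score, best = score, (cx, cy)
    return best                                     # (x, y) of the pupil center
```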
Preferably, the high-magnification camera is an eye camera.
Preferably, the processor performs eye-state recognition on the images from the eye camera using a vertical integral projection method, specifically comprising the following steps:
(1) acquiring one frame of the eye image and performing graying and denoising on it;
(2) computing the gray-level histogram of the eye image and smoothing it;
(3) taking the value corresponding to the first peak as the binarization threshold;
(4) calculating the distance D between the two inflection points using vertical integral projection;
(5) if D > ELen the frame is a closed-eye image, and if D < ELen it is an open-eye image.
Preferably, in step (2), mean filtering or Gaussian filtering may be used for image smoothing.
Preferably, in step (4), the vertical integral projection comprises the following steps:
(41) smoothing the histogram;
(42) pupil threshold segmentation: the gray value corresponding to the first trough in the histogram is selected, and the gray value corresponding to the first peak is used as the binarization threshold;
(43) projecting the binarized eye image.
Preferably, the value of ELen is 130.
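A minimal OpenCV/Python sketch of this eye-state check is given below. It assumes a single-channel camera frame converted from BGR; the histogram smoothing window, the peak search, and the reading of "inflection points" as the boundaries of the thresholded projection are illustrative interpretations, while ELen = 130 is taken from the text above.

```python
import cv2
import numpy as np

ELEN = 130  # value given in the text

def eye_state(eye_bgr: np.ndarray) -> str:
    """Classify one eye frame as open or closed via vertical integral
    projection, following steps (1)-(5) above."""
    gray = cv2.cvtColor(eye_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (5, 5), 0)                 # step (1): denoise
    hist = cv2.calcHist([gray], [0], None, [256], [0, 256]).ravel()
    hist = np.convolve(hist, np.ones(5) / 5, mode="same")    # step (2): smooth histogram
    peaks = [i for i in range(1, 255)
             if hist[i] > hist[i - 1] and hist[i] >= hist[i + 1]]
    thresh = peaks[0] if peaks else 128                      # step (3): first peak
    _, binary = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY_INV)
    proj = binary.sum(axis=0)                                # step (4): vertical projection
    cols = np.flatnonzero(proj > 0)
    d = int(cols[-1] - cols[0]) if cols.size else 0          # distance between boundaries
    return "closed" if d > ELEN else "open"                  # step (5)
```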
Preferably, in step (5), an AIS monitoring system records information on ships passing above the submarine cable in real time and stores it in a database as historical data; when an alarm occurs, ship information within a certain range around the geographic coordinates of the alarm point is extracted, and all ships in that range are listed as suspicious ships.
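A sketch of this suspicious-ship extraction is shown below; the record layout, the haversine distance and the 5 km default radius are assumptions for illustration (the patent only speaks of "a certain range").

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS-84 points, in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def suspicious_ships(ais_records, alarm_lat, alarm_lon, radius_km=5.0):
    """List every AIS record within radius_km of the alarm point."""
    return [rec for rec in ais_records
            if haversine_km(rec["lat"], rec["lon"], alarm_lat, alarm_lon) <= radius_km]
```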
Preferably, the computer receives the longitude and latitude of the submarine cable fault, the fault time and the AIS monitoring data, and uses the VR equipment to focus on a certain area and quickly form an imaging scene.
When the human eye looks at screen H, although the whole screen is visible, only region B, covered by foveal vision, is seen sharply, while regions A and C are imaged blurrily. Therefore, during picture rendering only the small foveal range needs to be rendered sharply, and peripheral vision can be rendered blurrily. As the eyeball rotates, the high-definition rendering region B moves with the fixation point, so a high-definition visual experience is maintained while the extra performance overhead and resource waste of rendering everything in high definition are avoided, as in the sketch below.
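The sketch below shows the core of such foveated selection: a mask of the pixels belonging to the high-definition region B around the current gaze point. The pixel radius is an illustrative assumption, not a value from the patent.

```python
import numpy as np

def foveation_mask(height: int, width: int, gaze_xy: tuple, fovea_px: int = 120) -> np.ndarray:
    """Boolean mask of pixels inside the high-definition region B centred
    on the gaze point; everything outside would be rendered at reduced
    detail. fovea_px is an assumed foveal radius in pixels."""
    ys, xs = np.mgrid[0:height, 0:width]
    gx, gy = gaze_xy
    return (xs - gx) ** 2 + (ys - gy) ** 2 <= fovea_px ** 2
```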
First, the eye is treated as an eyeball model; the person's visual axis (i.e. the direction being watched) is reconstructed from the pupil and glint information in the eye image captured by the camera, and the visual axis is then intersected with the 3D screen to obtain the point of focus.
Because the direction of the visual axis is relatively independent of changes in head position, this approach avoids the limitation of 2D models, in which the head position must not change, and good tracking accuracy can be maintained under free head movement.
Because the center of the retina does not lie exactly on the geometric axis of the eyeball, there is an angle between the gaze direction and the optical axis of the eye. In the horizontal direction this angle is about 5 degrees (of opposite sign for the two eyes, roughly +5 and −5 degrees), in the vertical direction it is about 1.5 degrees, and it differs by up to 3 degrees between users, so the kappa angle of the device must be calibrated at the time of use. The specific steps are as follows:
1) the user gazes at a fixed point Pe on the screen, and the optical axis direction OP0 is calculated as in the preceding process;
2) the eyeball center O is connected to the fixed point Pe to obtain the line-of-sight direction OPe;
3) the azimuth angles of the two lines, the line of sight and the optical axis, are solved respectively (a numeric sketch follows);
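A minimal sketch of steps 1)-3), assuming 3-D points for O, P0 and Pe and a camera-style coordinate convention (z forward, y up); the convention and function names are assumptions, not part of the patent.

```python
import numpy as np

def azimuth_elevation(v: np.ndarray) -> tuple:
    """Azimuth and elevation (degrees) of a 3-D direction vector,
    assuming z points forward and y points up."""
    x, y, z = v / np.linalg.norm(v)
    return np.degrees(np.arctan2(x, z)), np.degrees(np.arcsin(y))

def kappa_angle(O, P0, Pe) -> tuple:
    """Steps 1)-3): compare the optical axis O->P0 with the line of sight
    O->Pe and return their horizontal and vertical angular differences."""
    az0, el0 = azimuth_elevation(np.asarray(P0, float) - np.asarray(O, float))
    aze, ele = azimuth_elevation(np.asarray(Pe, float) - np.asarray(O, float))
    return az0 - aze, el0 - ele   # (horizontal kappa, vertical kappa)
```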
in the invention, when eyes watch each point, the relative offset between the corresponding pupil and the position of the facula is measured, and then the mapping relation between the change of the watching point and the offset caused by the rotation of the eyes is determined. Due to the fact that the shape, the size and the structure of human eyes have individual differences, the position of a projection point of a point on the spherical surface of the eyes in a camera reference system and the rotation angle of the eyes have a nonlinear relation, and the sight line estimation direction and the real sight line direction have model errors, the sight line tracking system needs a calibration link.
Before starting to record eye movement data, the user first performs a calibration procedure. During this process the built-in software measures the characteristics of the user's eyes and combines them with an internal eye model to compute gaze data. The model contains shape, light refraction and reflection information for different parts of the eye (e.g. the cornea and the position of the fovea). During calibration the user observes points that appear at specific locations on the screen, called calibration points, while the eye tracker analyses several captured images of the eye. The information from this analysis is then combined with the eye model, and the gaze point is computed for each image. The user does not need to hold the head completely still during calibration, as long as the focus of the gaze follows the moving calibration point. During calibration the eye tracker also tests both bright-pupil and dark-pupil modes to identify the tracking mode best suited to the current light conditions and the user's eye characteristics.
During the calibration process, the user is asked to fix their attention on known targets on the sea-surface or sea-floor screen, such as a seamount, a submarine cable or a school of coral fish, to obtain a set of corresponding points. The image acquisition module converts the input signals from one or more sensors into a suitable signal form. Typically, an infrared camera photographs one eye and captures an image of appropriate resolution, such as 640 x 480; to reduce noise and processing cost, smaller images may also be generated.
In addition, in systems with special hardware, the image acquisition module is responsible for de-interleaving the video signal and generating bright-pupil and dark-pupil images separately. In bright-pupil tracking the light source and the imaging device share the same optical path, so the pupil appears brightened (the same effect as red-eye in photographs); in dark-pupil tracking the light source is placed away from the imaging device (not on the same optical path), producing a pupil that is darker than the iris (a strong contrast).
In general, the bright-pupil effect occurs when the axis of the infrared light source is coaxial with the camera lens; conversely, when the two are not coaxial, the pupil appears darker than the rest of the eye.
Because the boundary between the pupil and the iris is not sharp, light sources in different directions are used alternately to emit near-infrared light towards the eye in order to improve the precision of this step; bright- and dark-pupil images are thus obtained in adjacent frames, so that the pupil can be segmented more clearly and parameters such as its centroid and shape can be calculated, as sketched below.
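A short sketch of this difference-image segmentation, assuming two aligned single-channel frames; the fixed threshold of 40 gray levels is illustrative, not a patent value.

```python
import cv2

def pupil_from_difference(bright, dark):
    """Subtract the dark-pupil frame from the bright-pupil frame taken in
    adjacent frames under alternating NIR illumination, threshold the
    difference and return the pupil centroid (or None if none is found)."""
    diff = cv2.absdiff(bright, dark)                   # pupil stands out in the difference
    _, binary = cv2.threshold(diff, 40, 255, cv2.THRESH_BINARY)
    m = cv2.moments(binary, binaryImage=True)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])  # (x, y) centroid
```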
Near-infrared light is used because the human eye cannot perceive it, so there is no glare and the user is not disturbed. The beams are weak, and as long as the researcher seats the user at the distance indicated in the eye tracker's specification (e.g. more than 60 cm from the device), the user faces no radiation risk even after 8 hours in front of a working eye tracker.
This module processes the eye image from the digital eye-movement video. Pupil center coordinates and corneal-reflection (glint) center coordinates are extracted from the captured eye image. First the pupil contour is detected and feature points are obtained; the contour is then verified and fitted, and the pupil center coordinates are determined. Because the pupil center coordinates are the basis of all subsequent work, the quality of the localization algorithm in this extraction step directly affects the accuracy of the whole eye tracking system. Cases in which blinking or the eyelid shadows the pupil must also be handled.
To determine the locations of the pupil and the corneal highlight, an image of the eye is first taken with an infrared camera; the infrared image is then segmented, and the resulting eye parts are parameterized analytically. Generally, the eye image is converted to grayscale, and the pupil is then detected using a thresholding method or by searching for connected regions in the image.
After a candidate pupil is detected, it is confirmed using anthropometric constraints. The pupil is then parameterized to eliminate the occlusion of the pupil area by eyelashes and Purkinje images; a double-ellipse fitting method removes such noise well.
The position of the pupil center and the pupil radius are first determined roughly; this coarse pupil localization provides the basis for the subsequent precise computation of the pupil center coordinates. On the basis of the coarse localization, the pupil edge is detected, the pupil contour is fitted, and finally the exact position of the pupil center is determined, as in the sketch below.
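The sketch below shows a coarse-to-fine pupil fit of the kind described: threshold, largest connected contour, then an ellipse fit to refine the center. The threshold value is assumed, and the double-ellipse noise rejection mentioned above is not reproduced.

```python
import cv2

def fit_pupil_ellipse(gray):
    """Coarse localization by thresholding, then refinement by fitting an
    ellipse to the largest contour; returns (center, semi-axes, angle) or
    None if no usable contour is found."""
    _, binary = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY_INV)  # 60 is illustrative
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)   # largest dark region = pupil candidate
    if len(pupil) < 5:                           # cv2.fitEllipse needs >= 5 points
        return None
    (cx, cy), (w, h), angle = cv2.fitEllipse(pupil)
    return (cx, cy), (w / 2, h / 2), angle
```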
Advantageous effects: the VR-equipment-based eye-tracking sea surface ship searching method of the invention can accurately identify the offending ship using AIS information and an offending-ship confirmation algorithm. A lens-deflection adjustment algorithm records images of the ship that caused the accident in a timely and accurate manner, and the AIS information is recorded as well, providing sufficient evidence for settling the accident. The method has a high level of automation, accurate positioning and sufficient evidence, overcomes the shortcomings of manual inspection, and is of great significance for the safe and stable operation of submarine cables. The method of confirming the ship that caused a submarine cable anchoring accident is applicable to confirming ships that cause anchoring damage to any submarine pipeline in which optical fibre can be laid.
Detailed Description
The invention discloses a VR (virtual reality) equipment-based searching method for tracking sea surface ships by eye movement, which comprises the following steps:
(1) two high-magnification cameras are mounted on the vessel and are in signal connection with a computer;
(2) calibrating the picture output by the high-resolution camera and calibrating its kappa angle, specifically as follows:
(21) the user gazes at a fixed point Pe on the screen, and the optical axis direction OP0 is calculated as in the preceding process;
(22) the eyeball center O is connected to the fixed point Pe to obtain the line-of-sight direction OPe;
(23) the azimuth angles of the two lines, the line of sight and the optical axis, are solved respectively;
(3) calculating, at each point Xi of the image, the inner product of the gradient vector gi and the unit position vector di = (Xi − c)/|Xi − c| taken from a candidate center c, and selecting the candidate with the maximum average value as the pupil center:
c* = argmax over c of (1/N) Σi ⟨gi, di⟩,
where Xi ranges over all N points in the image;
(4) estimating and calibrating the fixation point: 6 calibration points are determined, and the fitted polynomials are
Xf = A0 + A1·ΔX + A2·ΔY + A3·ΔX·ΔY + A4·ΔX² + A5·ΔY²,
Yf = B0 + B1·ΔX + B2·ΔY + B3·ΔX·ΔY + B4·ΔX² + B5·ΔY².
The eye tracking system is started and passes through the 6 calibration points in sequence to obtain 6 groups of eye movement parameters; substituting these into the formulas above yields the coefficient matrices A and B and hence the mapping function (see the sketch after this list);
(5) the computer calculates the coordinates of the fixation point through the mapping function; using eye tracking, the VR equipment quickly and accurately measures the sea surface direction watched by the user, an AI algorithm learns from the measured data, and after training it gives reasonable suggestions on the operator's search skills.
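A minimal numpy sketch of the step (4) calibration fit: with 6 calibration points and 6 unknowns per axis, the least-squares solve reduces to an exact solve. Representing the "eye movement parameters" as pupil-glint offsets is an assumption consistent with the description above.

```python
import numpy as np

def fit_gaze_mapping(deltas, screen_pts):
    """Fit Xf = A0 + A1*dX + A2*dY + A3*dX*dY + A4*dX^2 + A5*dY^2 (and the
    analogous Yf with coefficients B) from 6 calibration points.
    deltas: 6x2 pupil-glint offsets (dX, dY); screen_pts: 6x2 gaze targets."""
    d = np.asarray(deltas, float)
    s = np.asarray(screen_pts, float)
    dx, dy = d[:, 0], d[:, 1]
    M = np.column_stack([np.ones_like(dx), dx, dy, dx * dy, dx ** 2, dy ** 2])
    A, *_ = np.linalg.lstsq(M, s[:, 0], rcond=None)   # coefficients A0..A5
    B, *_ = np.linalg.lstsq(M, s[:, 1], rcond=None)   # coefficients B0..B5
    return A, B

def map_gaze(A, B, dx, dy):
    """Apply the fitted mapping to a new pupil-glint offset."""
    feats = np.array([1.0, dx, dy, dx * dy, dx * dx, dy * dy])
    return float(feats @ A), float(feats @ B)
```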
In the present invention, the high-magnification camera is an eye camera. The processor performs eye-state recognition on the images from the eye camera using a vertical integral projection method, specifically as follows:
(1) acquiring one frame of the eye image and performing graying and denoising on it;
(2) computing the gray-level histogram of the eye image and smoothing it;
(3) taking the value corresponding to the first peak as the binarization threshold;
(4) calculating the distance D between the two inflection points using vertical integral projection;
(5) if D > ELen the frame is a closed-eye image, and if D < ELen it is an open-eye image; the value of ELen is 130.
In step (2), mean filtering or Gaussian filtering may be used for image smoothing. In step (4), the vertical integral projection comprises the following steps:
(41) smoothing the histogram;
(42) pupil threshold segmentation: the gray value corresponding to the first trough in the histogram is selected, and the gray value corresponding to the first peak is used as the binarization threshold;
(43) projecting the binarized eye image.
In the present invention, use is made of the fovea of the human retina, which is densely packed with photoreceptor cells and provides high-resolution vision at the center of the visual field. When the eye looks directly at an object, photons from the object fall on the foveal region of the retina. In short, the fovea is the only part of the retina with high resolution, and it is a small fraction of the whole retina. In a field of view of more than 150 degrees, the region of highest resolution spans only about 3 degrees, and resolution drops by an order of magnitude within 10 degrees of the foveal center. Because our brain maintains a model of the surroundings and fills in missing details while rapidly moving the fovea to any object of interest, we are unaware that we cannot see that object clearly. In fact, we have only a small high-resolution field of view, about the size of a thumbnail held at arm's length, and our perception of everything around it is blurred. This is a disadvantage when searching for the offending vessel of a submarine cable accident, because the sea surface is wide and it is difficult to concentrate on a particular point. However, the eyeball-identification technology built into the VR headset, combined with the longitude and latitude of the submarine cable fault, the fault time and the AIS monitoring data, helps users continuously improve their ability to search for offending ships on a wide sea surface: high-resolution sea-surface and sea-floor scenes are presented in the region seen by the fovea, while the detail of the imaged scene is greatly reduced in peripheral vision, so the user's gaze center can be identified quickly and accurately. The system also builds a skill-improvement search library for every operator engaged in submarine cable operation and maintenance; when an operator searches for the offending ship with the VR eye-tracking head-mounted display, the imaging range of the foveal gaze can be adjusted, and the operator is guided to adjust the gaze focus using detailed fault data, gradually improving the ability to identify the offending ship.
The above description covers only the preferred embodiments of the present invention. It should be noted that those skilled in the art can make various modifications and adaptations without departing from the principles of the invention, and these are intended to fall within the scope of the invention.

Claims (8)

1. A VR device-based eye tracking sea surface vessel searching method is characterized by comprising the following steps:
(1) two high-magnification cameras are mounted on the vessel and are in signal connection with a computer;
(2) calibrating the picture output by the high-resolution camera and calibrating its kappa angle, specifically as follows:
(21) the user gazes at a fixed point Pe on the screen, and the optical axis direction OP0 is calculated as in the preceding process;
(22) the eyeball center O is connected to the fixed point Pe to obtain the line-of-sight direction OPe;
(23) the azimuth angles of the two lines, the line of sight and the optical axis, are solved respectively;
(3) calculating, at each point Xi of the image, the inner product of the gradient vector gi and the unit position vector di = (Xi − c)/|Xi − c| taken from a candidate center c, and selecting the candidate with the maximum average value as the pupil center:
c* = argmax over c of (1/N) Σi ⟨gi, di⟩,
where Xi ranges over all N points in the image;
(4) estimating and calibrating the fixation point: 6 calibration points are determined, and the fitted polynomials are
Xf = A0 + A1·ΔX + A2·ΔY + A3·ΔX·ΔY + A4·ΔX² + A5·ΔY²,
Yf = B0 + B1·ΔX + B2·ΔY + B3·ΔX·ΔY + B4·ΔX² + B5·ΔY².
The eye tracking system is started and passes through the 6 calibration points in sequence to obtain 6 groups of eye movement parameters; substituting these into the formulas above yields the coefficient matrices A and B and hence the mapping function;
(5) the computer calculates the coordinates of the fixation point through the mapping function; using eye tracking, the VR equipment quickly and accurately measures the sea surface direction watched by the user, an AI algorithm learns from the measured data, and after training it gives reasonable suggestions on the operator's search skills.
2. The VR device-based eye tracking marine vessel searching method of claim 1, wherein: the high-magnification camera is an eye camera.
3. The VR device-based eye tracking marine vessel searching method of claim 1, wherein: the processor performs eye-state recognition on the images from the eye camera using a vertical integral projection method, specifically comprising the following steps:
(1) acquiring one frame of the eye image and performing graying and denoising on it;
(2) computing the gray-level histogram of the eye image and smoothing it;
(3) taking the value corresponding to the first peak as the binarization threshold;
(4) calculating the distance D between the two inflection points using vertical integral projection;
(5) if D > ELen the frame is a closed-eye image, and if D < ELen it is an open-eye image.
4. The VR device-based eye tracking marine vessel searching method of claim 3, wherein: in step (2), mean filtering or Gaussian filtering may be used for image smoothing.
5. The VR device-based eye tracking marine vessel searching method of claim 3, wherein: in step (4), the vertical integral projection comprises the following steps:
(41) smoothing the histogram;
(42) pupil threshold segmentation: the gray value corresponding to the first trough in the histogram is selected, and the gray value corresponding to the first peak is used as the binarization threshold;
(43) projecting the binarized eye image.
6. The VR device-based eye tracking marine vessel searching method of claim 3, wherein: the value of ELen is 130.
7. The VR device-based eye tracking marine vessel searching method of claim 1, wherein: in step (5), an AIS monitoring system records information on ships passing above the submarine cable in real time and stores it in a database as historical data; when an alarm occurs, ship information within a certain range around the geographic coordinates of the alarm point is extracted, and all ships in that range are classified as suspicious ships.
8. The VR device-based eye tracking marine vessel searching method of claim 7, wherein: the computer receives the longitude and latitude of the submarine cable fault, the fault time and the AIS monitoring data, and uses the VR equipment to focus on a certain area and quickly form an imaging scene.
CN202011616601.7A 2020-12-31 2020-12-31 VR (virtual reality) equipment-based searching method for tracking sea surface ship by eye movement Pending CN112860057A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011616601.7A CN112860057A (en) 2020-12-31 2020-12-31 VR (virtual reality) equipment-based searching method for tracking sea surface ship by eye movement

Publications (1)

Publication Number Publication Date
CN112860057A 2021-05-28

Family

ID=75998767

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011616601.7A Pending CN112860057A (en) 2020-12-31 2020-12-31 VR (virtual reality) equipment-based searching method for tracking sea surface ship by eye movement

Country Status (1)

Country Link
CN (1) CN112860057A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100302233A1 (en) * 2009-05-26 2010-12-02 Holland David Ames Virtual Diving System and Method
CN109544640A (en) * 2018-11-05 2019-03-29 北京科技大学 A kind of sight line tracking system Kappa angle scaling method and caliberating device
CN110809148A (en) * 2019-11-26 2020-02-18 大连海事大学 Sea area search system and three-dimensional environment immersive experience VR intelligent glasses
CN112069986A (en) * 2020-09-04 2020-12-11 江苏慧明智能科技有限公司 Machine vision tracking method and device for eye movements of old people

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
张闯 et al., "A gaze estimation method based on stereo vision," 《电子学报》 (Acta Electronica Sinica) *
李兵 et al., "Gaze tracking system based on artificial neural networks," 《计算机技术与发展》 (Computer Technology and Development) *

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20210528