WO2015198023A1 - Outil de simulation oculaire - Google Patents

Outil de simulation oculaire (Ocular simulation tool)

Info

Publication number
WO2015198023A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
movement
eye
ocular condition
ocular
Prior art date
Application number
PCT/GB2015/051811
Other languages
English (en)
Inventor
Luke Anderson
Original Assignee
Swansea Medical Apps Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Swansea Medical Apps Limited
Publication of WO2015198023A1


Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B23/00: Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B23/28: Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes, for medicine

Definitions

  • This invention relates generally to medical tools, and more particularly to computer simulation techniques for assisting in the training of medical students and practitioners.
  • the invention is particularly suited for use in teaching how to diagnose and/or treat eye-related conditions such as ocular motility problems, pupil abnormalities, lid abnormalities and neurological problems concerning the eye(s) and face.
  • Ophthalmologists and neuro-ophthalmologists are specialist physicians who are trained to diagnose and treat patients with a variety of conditions which affect and/or arise from the eye. These could include ocular motility problems, pupil abnormalities, lid abnormalities and neurological problems. The symptoms of such conditions may be manifested in the patient's eye(s) and/or face, and the physician must be able to recognise the symptom(s), diagnose the cause, and recommend a corrective solution.
  • the solution may take a variety of forms, including medication, physiotherapy or surgery. A variety of tests can be used in clinical practice to assess the eye condition.
  • training usually consists of clinical exposure to patients who have eye conditions. This may be undertaken as part of a medical residency, where the student shadows a trained and qualified practitioner. The student observes and practices diagnoses of conditions, and the tests that can be used in clinical practice. The more patients the student is exposed to, and the greater the variety and spectrum of ocular conditions encountered, the faster the student will learn and the deeper the student's understanding will be.
  • the LUMA Optical system, also provided by Eyemaginations, provides a patient education tool for use in reception areas and waiting rooms of eye care specialists' offices. However, this is a passive system, which merely presents information to the patient without interaction. It does not enable the viewer to control or influence the performance of a simulated ocular assessment or treatment.
  • the 'retinoscopy simulator' from eyedocs.co.uk provides a computer-based simulator which enables a user to replicate a retinoscopy on a virtual eye displayed on the screen.
  • Retinoscopy is a technique used by practitioners to obtain an objective measurement of the refractive error of a patient's eyes.
  • the examiner uses a retinoscope to shine light into the patient's eye and observes the reflection (reflex) off the patient's retina. While moving the streak or spot of light across the pupil the examiner observes the relative movement of the reflex then uses a phoropter or manually places lenses over the eye (using a trial frame and trial lenses) to "neutralize" the reflex.
  • the 'retinoscopy simulator' displays a virtual streak or beam of light over the eye, which the user can then move using a mouse so as to observe the simulated patient's response.
  • the position and movement of the 'light' are constrained by the input device used to control its movement, and 3-dimensional movement is not possible.
  • the distance of the light source from the eye is 'hard wired' into the simulator and cannot be altered during use by a user. In clinical practice, however, the practitioner would be able to move his hand and the light source in any direction he chooses relative to the patient's eye - up, down, towards or away relative to the eye.
  • Such a solution should be relatively cheap to supply, easy and intuitive to use, and accessible at any time.
  • Such a solution may provide the student with exposure to a spectrum of ocular conditions, to enhance the student's clinical learning experience.
  • the solution would provide a simulation-based training experience which allows a more realistic interaction with the user than previously provided by the prior art.
  • US 2011/0091856 which relates to a Virtual Reality (VR) system for simulating the use of an ophthalmoscope.
  • the user wears a pair of head-mounted display devices which are associated with a pair of cameras.
  • the user manually positions an ophthalmoscope lens in front of a simulated patient's head.
  • the cameras display an image of the lens dummy on the display devices in front of the user's eyes.
  • the system comprises means for recognising the position of the dummy lens relative to the user and simulated patient.
  • the system displays a virtual retina to the user.
  • US 2011/0091856 does not provide a solution for simulation of a patient's facial/ocular response to the user's gestures or movement, and does not provide a mechanism for simulating patient responses in accordance with different eye conditions.
  • the phrase 'movement in free space' may be interpreted as meaning 'gestures'.
  • the movement may be motion in any direction. It may be physical motion made in air. It may be distinguished, therefore, from movement which is made via contact with an object, e.g. in contrast to a finger moving across or on a touch screen.
  • the free movement may be made without the aid of a physical device such as a pointing or computer input device e.g. a mouse, trackball, haptic, touch screen etc.
  • the present invention enables the user to provide input to control the simulator's behaviour by moving his body or a part thereof in 3-dimensional space, without the use of a pointing or tracking device such as a mouse.
  • the user's detected movement may be in any direction or angle relative to a sensor device and/or a device on which the invention is installed for execution.
  • the system further comprises motion sensing means arranged to detect the user's movement in free space.
  • the motion sensing means may be a motion sensing input device or apparatus. It may detect motion visually. It may comprise one or more 3-D depth sensors.
  • the motion sensing means may be a sensing system comprising a camera and suitably arranged software. The sensing means may measure the user's movement relative to a reference position such as the position of the sensing means. Data relating to the user's movement may be captured. Such data may relate to the speed, direction and plane of movement. The movement-related data may then be used to change what is shown on a screen. It may be used to reconfigure the virtual face or portion thereof (e.g. by causing the eye(s) to move in response to the user's movement) or may be used to reposition a virtual (simulated) tool or piece of equipment, such as a scalpel, aiming beam or lens.
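As a rough illustration of how captured movement data might drive the display, the sketch below maps a sensed hand position to a gaze direction for the virtual eyes. All names, and the 0.15 m travel-to-gaze scaling, are illustrative assumptions rather than anything specified in the patent:

```python
from dataclasses import dataclass

@dataclass
class HandSample:
    """One motion-sensor reading: hand position in metres, relative to the sensor."""
    x: float  # left/right
    y: float  # up/down
    z: float  # towards/away from the sensor

def gaze_from_hand(sample: HandSample, max_angle_deg: float = 30.0):
    """Map a sensed hand position to (horizontal, vertical) gaze angles in degrees.

    The hand's lateral offset steers the virtual eyes; the result is clamped
    so the eyes never rotate beyond a plausible range.
    """
    scale = max_angle_deg / 0.15  # 0.15 m of hand travel spans the full gaze range

    def clamp(v):
        return max(-max_angle_deg, min(max_angle_deg, v))

    return clamp(sample.x * scale), clamp(sample.y * scale)

# A hand held 7.5 cm to the right of the sensor, level with it:
h, v = gaze_from_hand(HandSample(x=0.075, y=0.0, z=0.2))  # h = 15.0, v = 0.0
```

The same mapping could equally reposition a simulated tool (e.g. an aiming beam) instead of the gaze.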
  • the motion sensing apparatus may enable the user to interact with the invention through a natural user interface using gestures and/or spoken commands.
  • the system may also comprise voice recognition technology to facilitate the user's input via spoken commands.
  • the invention may provide a hands-free interface for the input of user commands.
  • the system may further comprise at least one display device for displaying the virtual face or portion thereof.
  • the display device may be a screen. It may be a touch screen.
  • the display device may be provided as part of a computer-based device such as a tablet computer, smart phone, laptop, PC etc. Additionally or alternatively, at least one of the display devices may be provided on a head-mounted support such as a helmet or head gear, goggles or glasses.
  • the head mounted support may form part of a Virtual Reality system.
  • the invention may comprise VR technologies and techniques to provide an enhanced experience for the user.
  • the user may be able to provide input to the software via one or more other input devices in addition to the motion sensing means.
  • the user may be able to provide input via a touch screen, and/or by clicking on something shown on the screen.
  • Voice recognition may be used to receive the user's input.
  • the predetermined ocular condition may be selected from a plurality of predetermined ocular conditions.
  • the system may be configured to simulate more than one ocular condition.
  • the user may select which ocular condition is to be simulated.
  • the software may select the ocular condition for simulation.
  • the training system may be described as a medical simulation tool.
  • the software may be arranged to control the response of the virtual face according to the predetermined ocular condition.
  • the movement or presentation of the virtual face may be displayed such that it exhibits and mimics symptoms expected from a real patient having the predetermined condition.
  • the training system may simulate at least a portion of the face of a real (i.e. live) patient who has at least one ocular-related condition.
  • the virtual face may be a graphical representation of a human face. It may include (virtual) eyes, nose and/or mouth.
  • the software may be configured to present the virtual face as part of a human head.
  • the software may be configured to present the virtual face and/or head from a variety of perspectives or angles.
  • the image of the head may be rotatable so that it can be viewed from a variety of angles.
  • the virtual face may be a graphical representation or an image.
  • the representation may depict a view of a human face. It may graphically replicate what a trained practitioner would see before him in practice. It may be an external view of a human face. It may be a 2-D or 3-D graphical representation.
  • the tool may simulate the experience of a medical practitioner who is looking at a live patient being treated or assessed in clinical practice. This is in contrast to some prior art arrangements which display only a representation of the internal, anatomical structure of the eye and/or face.
  • the present invention more closely simulates the experience of assessing and/or treating a real patient in practice.
  • the user may be a student e.g. medical student or other medical practitioner who wishes to learn about the treatment and/or diagnosis of ocular-related conditions.
  • the software may be configured to execute on any kind of computing device, including a tablet computer, laptop, desktop computer, smartphone etc.
  • the computing device may comprise a screen.
  • the screen may be a touch screen.
  • the virtual face or portion thereof may be presented on a screen associated with a computing device upon which the software is installed for execution.
  • 'Ocular condition' may be used to refer to any condition which pertains to or affects the eye in any way.
  • it could be an ocular motility problem, or a pupil abnormality, or a lid abnormality, or a neurological problem concerning the eye and face. It could be any condition which would fall within the medical fields of ophthalmology or neuro-ophthalmology.
  • the condition may be caused by an individual's ocular anatomy or physiology, or an eye disease.
  • the predetermined ocular condition may be selected from a plurality of pre-determined ocular conditions.
  • the ocular conditions may be stored in computer memory. Thus, during a training exercise a particular ocular condition may be selected for presentation to the user using the virtual face.
  • the virtual face or portion thereof may be reconfigured during use such that the virtual face exhibits one or more symptoms known to relate to the predetermined ocular condition.
  • the user's task may be to diagnose (identify) and/or assess the condition by observing the symptom(s) displayed.
  • the behaviour symptomatic of the ocular condition may be demonstrated in a variety of ways. It may be manifested in a graphical representation - for example, an eye (of the virtual face) may be depicted as bloodshot, or an eye lid may be shown as drooping.
  • the symptom may be shown in an animated form - for example, a virtual eye may not track a moving finger as expected in a healthy individual, or a virtual pupil may not respond as expected in a healthy eye.
  • the symptom may be associated with one or both of the eyes of the virtual face, and/or another part of the virtual face e.g. the mouth or lips or cheeks.
  • the user may be able to input a diagnosis of the at least one pre-determined ocular condition.
  • the user's input diagnosis may be received in a variety of ways. For example, the user may speak his diagnosis into a microphone, or may type it via a keyboard, or may select it on the screen using a pointing device such as a mouse.
  • the software may be arranged to determine whether the user has correctly assessed the predetermined ocular condition by comparing the user's inputted assessment with the predetermined ocular condition.
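The comparison described above could be as simple as the following sketch (hypothetical function name; a real tool would likely also accept synonyms or menu selections rather than exact text):

```python
def check_diagnosis(user_input: str, predetermined: str) -> bool:
    """Return True if the user's inputted diagnosis matches the simulated condition.

    A simple case- and whitespace-insensitive comparison.
    """
    return user_input.strip().lower() == predetermined.strip().lower()

result = check_diagnosis("  Left Ptosis ", "left ptosis")  # True
```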
  • One or more rules may be stored in association with the pre-determined ocular condition. These may be stored in volatile and/or non-volatile memory associated with the device upon which the software is arranged to execute.
  • the one or more rules may specify how the virtual face is to be displayed and/or manipulated in accordance with the pre-determined ocular condition.
  • the rules may specify how the virtual face is to respond to the user's input in accordance with the predetermined condition.
  • the rules may specify how the software is to reconfigure the virtual face in response to the user's motion in accordance with behaviour symptomatic of the predetermined condition and the user's movement.
  • the invention may comprise one or more rule sets, each rule set specifying behaviour associated with a particular ocular condition and how the face or eyes are to be reconfigured in response to the user's input.
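One way such per-condition rule sets might be organised is sketched below. The condition names, rule keys and the simplified "affected eye does not track" behaviour are illustrative assumptions, not the patent's actual implementation:

```python
# Hypothetical rule sets: each ocular condition maps to parameters that
# constrain how the virtual face may respond to the user's input.
RULE_SETS = {
    "healthy": {
        "right_eye_tracks": True, "left_eye_tracks": True,
        "pupil_reacts_to_light": True, "lid_droops": False,
    },
    # Simplified illustration: the affected eye fails to follow the target.
    "left gaze palsy (illustrative)": {
        "right_eye_tracks": True, "left_eye_tracks": False,
        "pupil_reacts_to_light": True, "lid_droops": False,
    },
}

def eye_response(condition: str, target_deg: float, eye: str) -> float:
    """Gaze angle an eye actually reaches when asked to look at target_deg."""
    rules = RULE_SETS[condition]
    return target_deg if rules[f"{eye}_eye_tracks"] else 0.0
```

Selecting a different condition then changes the face's behaviour without changing any rendering code.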
  • the system may be arranged to show or hide the name or other content identifying the ocular condition upon instruction from the user.
  • the system may be arranged so that the identity (eg name) of the ocular condition is communicated to the user. For example, it may be displayed on the screen for the user to see. It may also be arranged so that the identity of the ocular condition is not communicated to the user, so that the user has to determine which ocular condition is being demonstrated via the reconfiguration and/or behaviour of the virtual face.
  • the system may provide a mechanism, e.g. a button or icon, via which the user can choose to have the condition communicated or not communicated. E.g. the name of the condition may be displayed or hidden by clicking on a button on a screen, or selecting an option from a menu.
  • the user may choose to know the name of the condition and then observe the relevant symptoms with this knowledge in mind, or may choose to observe the symptoms and attempt to diagnose the condition.
  • the identity of the condition may then be revealed after the user has inputted a diagnosis.
  • the software may be configured to move the eyes in a smooth motion or saccade movement.
  • the user's input may be used to control whether smooth or saccade eye movement is shown.
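The two movement modes might be generated along these lines. This is only a sketch: the 400°/sec saccade velocity is the figure used elsewhere in this description, while the 30°/sec pursuit speed and the 60 fps frame rate are assumed values:

```python
def eye_positions(kind: str, start: float, target: float, fps: int = 60):
    """Per-frame gaze angles (degrees) for a saccade or a smooth movement.

    A saccade jumps to the target at a fast, fixed velocity; a smooth
    movement follows the target slowly.
    """
    speed = 400.0 if kind == "saccade" else 30.0  # degrees per second
    frames = max(1, round(abs(target - start) / speed * fps))
    step = (target - start) / frames
    return [start + step * (i + 1) for i in range(frames)]

saccade = eye_positions("saccade", 0.0, 20.0)  # 3 frames at 60 fps
smooth = eye_positions("smooth", 0.0, 20.0)    # 40 frames at 60 fps
```

A tap would trigger the saccade path and a drag the smooth path, matching the touch behaviour described for the main screen.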
  • the movement of the simulated beam of light or medical equipment may be influenced by the user's movement in free space.
  • the system may be arranged to enable the user to simulate performance of a corrective procedure.
  • the corrective procedure may be laser surgery. It may be retinoscopy.
  • the system may be arranged to provide a medical training tool.
  • the system may comprise teaching modules or components arranged to guide a student through a learning exercise.
  • the system may comprise stored content for use in assessing the student's ability to diagnose, assess and/or treat one or more ocular conditions.
  • the system may comprise metrics against which the student's (user's) performance may be assessed. Data relating to the student's performance may be stored for future reference and/or sent to a destination via a network.
  • a medical training system arranged to simulate facial and/or ocular behaviour associated with a variety of ocular conditions, the system comprising:
  • the above-described motion sensing arrangement may also apply to this aspect of the invention. Also in accordance with the invention, there is provided a method corresponding to the above-described system. Thus, there is provided a medical training method comprising the step:
  • 'virtual face' may be interpreted as meaning a simulation or image which represents a head or face.
  • the method may further comprise any or all of the following steps:
  • the system may further comprise a motion sensing means arranged to detect the user's gestures (movement made in free space).
  • the software may be arranged to display a variety of virtual faces to the user.
  • the tool may be configured to simulate a variety of 'real' patients. This provides a more realistic and enhanced training experience.
  • the different virtual faces may be displayed sequentially or concurrently.
  • Figure 1 provides a screen shot of a medical training tool in accordance with an embodiment of the invention.
  • Figure 8 illustrates how an embodiment of the invention could be used to assess a virtual patient's light response.
  • the motion sensing apparatus could comprise a 3D motion sensor such as the Leap Motion Controller from Leap Motion, Inc.
  • Virtual Reality technology such as the Oculus Rift from Oculus VR, LLC, may be utilized to provide an enhanced visual experience.
  • Such technologies may be used in combination.
  • 3D motion sensing technology may be used in combination with the VR headset. The motion sensing technology may enable the user to control the virtual patient's response via the user's movements, and this may then be displayed in a realistic manner by a VR display arrangement worn in front of the user's eyes (eg on a head-mounted support, goggles, or glasses).
  • the invention generates a representation of a patient's face, head or portion thereof 1.
  • an external view of the patient's entire face is shown, from a front- facing perspective, as if the patient were sitting facing a practitioner in clinical practice.
  • the representation can be termed a 'virtual patient' because the representation simulates a real patient's head/face 1 in a computer-implemented form.
  • This enables the invention to be used to simulate the performance of an eye examination, test or procedure.
  • the examination could include testing visual acuity, refraction, pupil function, ocular motility, visual field (confrontation) testing, external examination, slit lamp testing, or retinal examination.
  • the image of the head or face 1 may be rotated so as to change the perspective on the screen, as shown in figure 3 in which the head is shown rotated slightly to one side.
  • Rotation of the head may be achieved by the user moving a finger across a touch screen.
  • the user may control the orientation of the virtual head/face via the motion sensing apparatus 2, and/or by using some pointing device.
  • the rotation of the head can be used to test the oculocephalic reflex, and to differentiate between incomitant and comitant strabismus types.
  • the user may indicate that the head is to be rotated by selecting a 'rotate head' option 3 provided on the screen.
  • Various controls, buttons and menus are provided on the screen to allow the user to select options and control how the training session is to be conducted. These options may include an occluder option 4 which enables the user to simulate covering an eye with a paddle as shown in figures 5 and 7.
  • An illustrative main screen is shown in Figure 4.
  • a single touch event (tap) on the touch screen generates saccade eye movement. Dragging a finger generates a smooth movement.
  • the 'near' button 5 option changes the fixation distance of the eye to 'near', thus altering the program's response to user input.
  • the 'far' button 6 changes the fixation distance of the eye to 'far', again altering the software's response accordingly.
  • Touch events can also generate characteristic changes in the lids, dependent on the disease or condition being simulated.
  • the 'reset' button 8 can be used to return the face or head to the default orientation and configuration as shown in figure 1.
  • a list of ocular conditions 13 can be provided on the screen from which the user can choose.
  • the user can alter the way in which the virtual patient will respond to the simulated eye test(s) such as the cornea reflection shown in figure 6.
  • the software executes a set of behavioural rules associated with the selected condition. Examples of these rules are provided further below. These options can be hidden so that the user is not told which condition is being simulated, so that the user's task is to observe the response to the simulated test and recognize (diagnose) the condition.
  • Figure 6 shows a corneal reflection being simulated using an embodiment of the invention. If the corneal reflection is not in the centre of the eye it can be determined that it is not the fixating eye.
  • Figure 7 shows the virtual patient wearing glasses. Putting glasses on the patient enables the user to test for the accommodative/refractive element of strabismus. The user can add or remove glasses using the 'glasses' icon or button 7 on the screen. Other assessment options are shown in Figures 6 and 8. In ocular physiology, 'adaptation' is the ability of the eye to adjust to various levels of darkness and light. An ambient light switch can be used to test the response of a patient's pupils to light, or direct light can be used to test pupil response. Figure 8 shows direct light being used to test pupil response. This option can be selected using the 'direct light' button 11.
  • the invention provides the ability to simulate switching ambient light levels. In clinical practice, the practitioner would alter the light setting in the room within which the test is being performed. If the room lights are switched off, the pupils should enlarge.
  • the invention simulates this by providing an 'ambient light' switch 10 which the user can use to change the ambient light conditions and observe the pupil response.
  • the switch 10 has two settings - I for 'ambient light on' and 0 for 'ambient light off'.
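An illustrative pupil model consistent with this behaviour is sketched below. The specific diameters are plausible assumed values, not figures from the patent; the `reacts_to_light` flag stands in for a condition whose rules disable the light response:

```python
def pupil_diameter_mm(ambient_light_on: bool, direct_light: bool = False,
                      reacts_to_light: bool = True) -> float:
    """Illustrative pupil size for the virtual eye (all values assumed).

    A healthy pupil constricts in light and dilates in darkness; a condition
    whose rules disable the light response shows a fixed pupil.
    """
    if not reacts_to_light:
        return 5.0                             # fixed pupil, no response
    if direct_light:
        return 2.0                             # maximal constriction under a direct beam
    return 3.5 if ambient_light_on else 6.5    # room lights on vs off
```

Flipping the ambient light switch 10 would then simply re-render the pupils at the returned diameter, so switching the lights off enlarges the pupils as described.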
  • the user is able to simulate shining a beam of light into the virtual patient's eye and/or onto the face, as shown in figure 6.
  • This allows the user to simulate the performance of a cornea reflection test which is performed by shining a light in the person's eyes and observing where the light reflects off the corneas.
  • the light reflex lies slightly nasal from the centre of the cornea (approximately 11 prism diopters, or 0.5mm, from the pupillary axis), as a result of the cornea acting as a temporally-turned convex mirror to the observer.
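This figure is consistent with the classical Hirschberg ratio of roughly 22 prism diopters of deviation per millimetre of reflex decentration, which a simulator could use to relate reflex position to the angle of deviation. The function below is a sketch with assumed names:

```python
def hirschberg_deviation_pd(decentration_mm: float, pd_per_mm: float = 22.0) -> float:
    """Estimate ocular deviation (prism diopters) from corneal reflex decentration.

    Uses the classical Hirschberg ratio of ~22 prism diopters per millimetre
    of reflex displacement, matching the ~11 PD per 0.5 mm figure quoted above.
    """
    return decentration_mm * pd_per_mm

deviation = hirschberg_deviation_pd(0.5)  # 11.0 prism diopters
```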
  • Movement of the beam of light can be controlled in all directions by the user's gestures.
  • the motion sensing apparatus detects the position, angle and relative displacement of the user's hand and coordinates the movement of the simulated beam accordingly.
  • An algorithm is used to calculate how the beam of light would be affected in response to the user's movement.
  • Another algorithm then adapts the face/eyes on the display so that the virtual patient responds to the moving light. The response is tailored so that the virtual patient's response is indicative of a given condition.
  • Touching the screen causes both eyes to move to the corresponding location at 400°/sec.
  • if the tablet/computing device indicates looking up 30°, this will be the fixation position.
  • Intorsion is the eye rotating inwards and extorsion is outwards. Therefore the right eye will rotate clockwise when intorting and anticlockwise when extorting.
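This mirror-image relationship between the two eyes can be captured in a few lines (a sketch with assumed names; directions are as seen by the observer facing the virtual patient):

```python
def torsion_direction(eye: str, movement: str) -> str:
    """Observer-view rotation direction for a torsional eye movement.

    The right eye rotates clockwise when intorting and anticlockwise when
    extorting; the left eye is the mirror image.
    """
    clockwise = (eye == "right") == (movement == "intorsion")
    return "clockwise" if clockwise else "anticlockwise"
```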
  • the invention provides the user with a more effective and realistic training tool than prior art techniques.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Medicinal Chemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Algebra (AREA)
  • Health & Medical Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Medical Informatics (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Physics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Theoretical Computer Science (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

A medical training system comprising software arranged to reconfigure a virtual face, or a portion thereof, in response to a user's movement made in free space and in accordance with one or more symptoms relating to a predetermined ocular condition. The system further comprises motion sensing means arranged to detect the user's movement made in free space. The user's gestures can therefore be used to input commands via a hands-free input interface. The invention simulates one or more ocular conditions, which are displayed on a screen. The user, for example a medical student, is able to interact with the system via all kinds of input means, notably movements in free space, so as to produce a response in the virtual patient's face, the response being symptomatic of one or more ocular conditions. For example, the eyes may follow the user's finger in a particular manner. The invention also relates to a simulation tool for training users in a corrective procedure, for example laser surgery.
PCT/GB2015/051811 2014-06-23 2015-06-23 Outil de simulation oculaire WO2015198023A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1411134.8 2014-06-23
GB1411134.8A GB2527514A (en) 2014-06-23 2014-06-23 Ocular simulation tool

Publications (1)

Publication Number Publication Date
WO2015198023A1 (fr) 2015-12-30

Family

ID=51409984

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2015/051811 WO2015198023A1 (fr) 2014-06-23 2015-06-23 Outil de simulation oculaire

Country Status (2)

Country Link
GB (1) GB2527514A (fr)
WO (1) WO2015198023A1 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106448399A (zh) * 2016-08-31 2017-02-22 刘锦宏 Minimally invasive surgery simulation method based on virtual reality technology
CN106530880A (zh) * 2016-08-31 2017-03-22 徐丽芳 Experiment simulation method based on virtual reality technology
CN109559573A (zh) * 2019-01-07 2019-04-02 王登芹 Physical diagnosis teaching device for diagnostics teaching
US10810907B2 (en) 2016-12-19 2020-10-20 National Board Of Medical Examiners Medical training and performance assessment instruments, methods, and systems
KR20220015811 (ko) * 2020-07-31 2022-02-08 전남대학교산학협력단 Virtual reality-based simulation system for strabismus diagnosis education

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070207448A1 (en) * 2006-03-03 2007-09-06 The National Retina Institute Method and system for using simulation techniques in ophthalmic surgery training
DE102008027832A1 (de) * 2008-06-11 2009-12-17 Vrmagic Gmbh Ophthalmoscope simulator


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
YOUSUF M KHALIFA ET AL: "Virtual Reality in Ophthalmology Training", SURVEY OF OPHTHALMOLOGY, SURVEY OF OPHTHALMOLOGY INC, XX, vol. 51, no. 3, 1 May 2006 (2006-05-01), pages 259 - 273, XP002647063, ISSN: 0039-6257, [retrieved on 20060425], DOI: 10.1016/J.SURVOPHTHAL.2006.02.005 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106448399A (zh) * 2016-08-31 2017-02-22 刘锦宏 Minimally invasive surgery simulation method based on virtual reality technology
CN106530880A (zh) * 2016-08-31 2017-03-22 徐丽芳 Experiment simulation method based on virtual reality technology
US10810907B2 (en) 2016-12-19 2020-10-20 National Board Of Medical Examiners Medical training and performance assessment instruments, methods, and systems
CN109559573A (zh) * 2019-01-07 2019-04-02 王登芹 Physical diagnosis teaching device for diagnostics teaching
KR20220015811A (ko) * 2020-07-31 2022-02-08 전남대학교산학협력단 Virtual reality-based simulation system for strabismus diagnosis education
KR102406472B1 (ko) * 2020-07-31 2022-06-07 전남대학교산학협력단 Virtual reality-based simulation system for strabismus diagnosis education

Also Published As

Publication number Publication date
GB201411134D0 (en) 2014-08-06
GB2527514A (en) 2015-12-30

Similar Documents

Publication Publication Date Title
US20240099575A1 (en) Systems and methods for vision assessment
KR102669685B1 (ko) 광 필드 프로세서 시스템
US10083631B2 (en) System, method and computer program for training for ophthalmic examinations
Khademi et al. Comparing “pick and place” task in spatial augmented reality versus non-immersive virtual reality for rehabilitation setting
WO2015198023A1 (fr) Outil de simulation oculaire
RU2634682C1 (ru) Портативное устройство для исследования зрительных функций
CN104244859A (zh) 通用微手术模拟器
US20210045628A1 (en) Methods, systems, and computer readable media for testing visual function using virtual mobility tests
Soto et al. AR stereoscopic 3D human eye examination app
Nguyen et al. An experimental training support framework for eye fundus examination skill development
CN114402378B (zh) 手术模拟器系统和方法
US11337605B2 (en) Simulator for the evaluation of a concussion from signs displayed during a visual cranial nerve assessment
US11768594B2 (en) System and method for virtual reality based human biological metrics collection and stimulus presentation
Uribe-Quevedo et al. Physical and physiological data for customizing immersive VR training
US20230293004A1 (en) Mixed reality methods and systems for efficient measurement of eye function
US20230404388A1 (en) Method and apparatus for measuring relative afferent pupillary defects
Hvass et al. A preliminary exploration in the correlation of cybersickness and gaze direction in VR
Pezzei Visual and Oculomotoric Assessment with an Eye-Tracking Head-Mounted Display
WO2023172768A1 (fr) Procédés, systèmes et supports lisibles par ordinateur d'évaluation de fonction visuelle en utilisant des tests de mobilité virtuels
Yendigeri Development of Customizable Head Mounted Device for Virtual Reality Applications
Weiss Evaluation of Augmented Reality and Wearable Sensors to Assess Neurovestibular and Sensorimotor Performance in Astronauts for Extravehicular Activity Readiness
Chan Development and comparison of augmented and virtual reality interactions for direct ophthalmoscopy
Acosta et al. DEVELOPMENT OF A SMARTPHONE AUGMENTED REALITY EYE EXAMINATION TOOL
Gupta Head Mounted Eye Tracking Aid for Central Visual Field Loss

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15732900

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 11.05.2017)

122 Ep: pct application non-entry in european phase

Ref document number: 15732900

Country of ref document: EP

Kind code of ref document: A1