WO2019013563A1 - 동체 시력 검사 방법 및 시스템 - Google Patents

동체 시력 검사 방법 및 시스템 (Dynamic Visual Acuity Test Method and System)

Info

Publication number
WO2019013563A1
Authority
WO
WIPO (PCT)
Prior art keywords
marker
user
head
gaze
gaze tracking
Prior art date
Application number
PCT/KR2018/007886
Other languages
English (en)
French (fr)
Korean (ko)
Inventor
권순철
김정호
손호준
황이환
이재현
이승현
홍성대
Original Assignee
광운대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 광운대학교 산학협력단 filed Critical 광운대학교 산학협력단
Priority to JP2020501481A priority Critical patent/JP2020528304A/ja
Publication of WO2019013563A1 publication Critical patent/WO2019013563A1/ko

Links

Images

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/113: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/103: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining refraction, e.g. refractometers, skiascopes
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/14: Arrangements specially adapted for eye photography

Definitions

  • The present invention relates to a dynamic visual acuity testing technique, in which dynamic visual acuity is tested based on the gaze of a user looking at targets on a test chart output to an optical see-through head-mounted display and on the voice of the user reading those targets aloud.
  • Patent Document 1 Korean Patent Laid-Open No. 10-2012-0052224
  • A dynamic visual acuity test method performed by a control device connected to a head-mounted display having a gaze tracking sensor and a speech recognition sensor comprises: (a) generating a test chart and providing it to the head-mounted display; (b) providing the head-mounted display with a marker indicating a specific target in the test chart to be read by the user; (c) receiving, as the marker is displayed, gaze tracking data according to the movement of the user's gaze obtained from the gaze tracking sensor, and speech recognition data obtained from the speech recognition sensor as the user reads the target indicated by the marker; and (d) generating a dynamic visual acuity test result based on the gaze tracking data and the speech recognition data.
  • The method may further include, before step (a), calibrating the gaze tracking data obtained from the gaze tracking sensor with the image output to the head-mounted display.
  • The calibrating step comprises: generating a calibration marker and providing it to the head-mounted display; acquiring gaze tracking data of the user gazing at the calibration marker and obtaining, from the gaze tracking data, the coordinates of the user's gaze position in the display image output from the head-mounted display; and matching the coordinates of the gaze position with the coordinates of the calibration marker in the display image output from the head-mounted display.
  • Step (b) includes setting an effective area for the specific target in order to determine whether the user's gaze reaches the specific target, and displaying the effective area as the marker. The specific target to be read by the user is determined sequentially in the test chart, and a marker indicating each determined target may be sequentially provided to the head-mounted display.
  • Step (d) includes: determining, from the gaze tracking data, whether the user's line of sight is located within the effective area; and recording the time when the user's line of sight reaches the effective area.
  • Step (d) may include recording the time when the speech recognition data is received.
  • Step (d) includes calculating the time difference between the time when the user's gaze reaches the effective area and the time when the speech recognition data is received; when markers are provided sequentially and a plurality of time differences are calculated, an average of the plurality of time differences can be calculated.
  • Step (d) may include detecting errors for each of the vertical test and the horizontal test based on the speech recognition data and the targets indicated by the markers on the test chart, and calculating the total number of errors.
  • A dynamic visual acuity test system comprises: a head-mounted display including a gaze tracking sensor and a speech recognition sensor; and a control device connected to the head-mounted display. The control device comprises: a test chart transmitting unit that generates a test chart and provides it to the head-mounted display; a marker providing unit that provides the head-mounted display with a marker indicating a specific target in the test chart to be read by the user; a data receiving unit that receives, as the marker is displayed, the gaze tracking data according to the movement of the user's gaze obtained from the gaze tracking sensor and the speech recognition data obtained from the speech recognition sensor as the user reads the target indicated by the marker; and a dynamic visual acuity measuring unit that generates a dynamic visual acuity test result based on the gaze tracking data and the speech recognition data.
  • The control device may further include a calibration unit for calibrating the gaze tracking data obtained from the gaze tracking sensor with the image output to the head-mounted display.
  • The calibration unit generates a calibration marker and provides it to the head-mounted display, acquires gaze tracking data of the user gazing at the calibration marker, obtains from the gaze tracking data the coordinates of the user's gaze position in the display image output from the head-mounted display, and matches the gaze position coordinates with the coordinates of the calibration marker in that display image.
  • The marker providing unit sets an effective area for the specific target in order to determine whether the user's gaze reaches the specific target, and displays the effective area as the marker. The specific target to be read by the user is determined sequentially in the test chart, and a marker indicating each determined target may be sequentially provided to the head-mounted display.
  • The dynamic visual acuity measuring unit may determine, from the gaze tracking data, whether the user's line of sight is located within the effective area, and record the time when the user's line of sight reaches the effective area.
  • The dynamic visual acuity measuring unit may record the time when the speech recognition data is received.
  • The dynamic visual acuity measuring unit calculates the time difference between the time when the user's gaze reaches the effective area and the time when the speech recognition data is received; when markers are provided sequentially and a plurality of time differences are calculated, the average of the plurality of time differences can be calculated.
  • The dynamic visual acuity measuring unit may detect errors for each of the vertical test and the horizontal test based on the speech recognition data and the targets indicated by the markers on the test chart, thereby calculating the total number of errors.
  • Since the dynamic visual acuity test chart is output through the optical see-through head-mounted display, a dynamic visual acuity test close to everyday life can be performed.
  • Because the virtual test chart is mixed into the real world using the optical see-through head-mounted display, the visual activities a user performs in daily life can be brought into the dynamic visual acuity test environment; a dynamic visual acuity test system close to real life can thus be implemented, which enhances the practicality and usability of the test system.
  • FIG. 1 is a block diagram of a dynamic visual acuity test system according to a preferred embodiment of the present invention.
  • FIG. 2 is a view for explaining a head-mounted display according to an embodiment.
  • FIG. 3 is a block diagram of a control device according to an embodiment.
  • FIG. 4 is a flowchart illustrating a dynamic visual acuity test method according to an embodiment.
  • FIG. 5 is a diagram illustrating an environment in which a dynamic visual acuity test method according to an embodiment is performed.
  • FIG. 6 is a schematic diagram showing the Landolt C target.
  • FIG. 8 is an exemplary diagram for explaining how a marker is provided.
  • FIG. 9 is an exemplary diagram for explaining how the time at which the line of sight reaches the effective area is measured.
  • Although the terms first, second, etc. are used to describe various elements, components and/or sections, these elements, components and/or sections are of course not limited by the terms. The terms are only used to distinguish one element, component or section from another. Therefore, a first element, component or section mentioned below may also be a second element, component or section within the technical spirit of the present invention.
  • The step identification codes (e.g., (a), (b), (c)) are used for convenience of explanation and do not describe the order of the steps; unless a specific order is stated, the steps may occur in a different order from that written. That is, the steps may be performed in the stated order, substantially concurrently, or in reverse order.
  • Dynamic visual acuity refers to the ability to see an object clearly when the object or the observer is in motion. Related types of eye movement include saccadic eye movement, pursuit eye movement, the vestibulo-ocular reflex, visual fixation, and the like. In this test system, saccadic eye movements are measured.
  • Saccadic eye movement refers to the movement of the eye that positions the image of a moving object at the center of the retina.
  • The eye movement that occurs when the gaze shifts from an object already being watched to another object is a saccadic eye movement.
  • In other words, saccadic eye movements bring the image of an object located away from the center of the visual field onto the center of the retina; reading is a typical saccadic eye movement.
  • Clinically, normal saccadic eye movements, at roughly 300 degrees per second, are faster than pursuit eye movements, which are roughly 30 degrees per second.
  • The larger the saccade, the higher its peak speed and the longer its duration; for example, rotating the eye by 40 degrees is faster and takes longer than rotating it by 5 degrees.
  • The movement starts quickly and slows down toward its end.
  • When saccadic eye movement is impaired, especially while reading, words in the middle are read repeatedly, lines are skipped, and the reading position is lost.
  • The user may also move the head while reading, or fail to catch or hit a ball when playing ball games because the eyes and hands do not coordinate.
  • FIG. 1 is a block diagram of a dynamic visual acuity test system according to a preferred embodiment of the present invention.
  • The dynamic visual acuity test system 100 includes a head-mounted display 110 and a control device 120, which are connected to each other via a network.
  • The head-mounted display (HMD) 110 is a device mounted on the user's head to display virtual reality or augmented reality, and it outputs the test chart for the dynamic visual acuity test.
  • The head-mounted display 110 may include a gaze tracking sensor 111, e.g., a camera capable of tracking the user's gaze, and a speech recognition sensor 112, e.g., a microphone.
  • The gaze tracking sensor 111 senses and tracks the movement of the gaze of the user wearing the head-mounted display 110, and the speech recognition sensor 112 can sense and recognize the voice of the user reading the targets.
  • Two gaze tracking sensors 111 may be provided to track the gaze of the user's right eye and left eye, respectively.
  • In addition, the head-mounted display 110 may include a GPS, a geomagnetic sensor, a gyroscope sensor, and the like.
  • The head-mounted display 110 corresponds to an optical see-through head-mounted display and can output the test chart as augmented reality.
  • The head-mounted display 110 displays the test chart provided from the control device 120 through an image output unit (for example, a mini projector).
  • The test chart output through the image output unit can be mixed with the image of the real environment passing through a prism and displayed to the user. That is, the user sees an augmented reality image in which the test chart is mixed into the real environment.
  • Although the speech recognition sensor 112 is described here as included in the head-mounted display 110, it may also be implemented as a separate component or device coupled to the head-mounted display 110 or to the control device 120.
  • The control device 120 is connected to the head-mounted display 110 and performs the dynamic visual acuity test.
  • The control device 120 generates a test chart for the dynamic visual acuity test and provides it to the head-mounted display 110, measures dynamic visual acuity based on the gaze tracking data and the speech recognition data obtained from the gaze tracking sensor 111 and the speech recognition sensor 112, and generates a dynamic visual acuity test result.
  • FIG. 3 is a block diagram of the control device according to an embodiment.
  • The control device 120 includes a test chart transmitting unit 121, a marker providing unit 122, a data receiving unit 123, a dynamic visual acuity measuring unit 124, and a control unit 125.
  • The control unit 125 controls the operation of the test chart transmitting unit 121, the marker providing unit 122, the data receiving unit 123, and the dynamic visual acuity measuring unit 124, and the flow of data among them.
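  • To make this division of roles concrete, the following is a minimal sketch of the control device and its units as just described; it is illustrative only, and all class, method, and attribute names are assumptions rather than anything specified in this publication.

      # Hypothetical skeleton of the control device 120; all names are illustrative only.
      class ControlDevice:
          def __init__(self, hmd, gaze_sensor, speech_sensor):
              self.hmd = hmd                      # head-mounted display 110
              self.gaze_sensor = gaze_sensor      # gaze tracking sensor 111
              self.speech_sensor = speech_sensor  # speech recognition sensor 112

          def send_test_chart(self, chart):
              """Test chart transmitting unit 121: provide the chart to the HMD."""
              self.hmd.display(chart)

          def provide_marker(self, target):
              """Marker providing unit 122: mark the target the user should read next."""
              self.hmd.display_marker(target)

          def receive_data(self):
              """Data receiving unit 123: collect gaze tracking and speech recognition data."""
              return self.gaze_sensor.read(), self.speech_sensor.read()

          def measure_dynamic_acuity(self, gaze_data, speech_data):
              """Dynamic visual acuity measuring unit 124: generate the test result."""
              raise NotImplementedError  # timing and error analysis, sketched further below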
  • Hereinafter, the dynamic visual acuity test method performed by the control device 120 will be described in detail with reference to the figures.
  • The user wears the head-mounted display 110 and can take the dynamic visual acuity test while viewing, through the head-mounted display 110, the test chart output as augmented reality over the image of the real environment.
  • The test chart may be provided as, for example, a King-Devick chart or a DEM (Developmental Eye Movement) chart, and the size of the targets may be determined based on the Landolt C chart (a ring with a 1.5 mm opening), which defines visual acuity by the ability to discriminate the opening and the time taken to do so, applied in consideration of the built-in convex lens or prism magnification when the test chart is viewed through the head-mounted display 110.
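  • For context on the Landolt C sizing mentioned above: decimal visual acuity is conventionally the reciprocal of the smallest discriminable gap angle in minutes of arc, and a 1.5 mm gap viewed from 5 m subtends roughly one minute of arc, corresponding to an acuity of 1.0. This is the standard Landolt C relationship, not a value stated in this publication, and the apparent size inside the head-mounted display must additionally account for the built-in lens or prism magnification.

      \theta = 2\arctan\!\left(\frac{1.5\ \mathrm{mm}}{2 \times 5000\ \mathrm{mm}}\right) \approx 1.03',
      \qquad V \approx \frac{1}{\theta\,[\mathrm{arcmin}]} \approx 1.0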
  • OTF (OpenType Font): the OTF format draws glyph outlines using cubic Bézier curves; although the calculation process is more complex and rendering is slower, detailed curves can be expressed.
  • The control device 120 may further include a calibration unit (not shown) for calibrating the gaze tracking data obtained from the gaze tracking sensor 111 with the image output to the head-mounted display 110.
  • The calibration unit generates calibration markers and provides them to the head-mounted display 110.
  • For example, the calibration unit can generate 5x5 calibration markers, and the generated calibration markers are sequentially provided to the head-mounted display 110 (see FIG. 7(b)).
  • The calibration unit acquires the gaze tracking data of the user gazing at a calibration marker, obtains from the gaze tracking data the coordinates of the user's gaze position in the display image output from the head-mounted display 110, and matches the gaze position coordinates with the coordinates of the calibration marker in that display image.
  • For example, as shown in FIG. 7(b), when calibration marker No. 1 is provided, the user gazes at calibration marker No. 1; from the obtained gaze tracking data the calibration unit acquires the gaze position coordinates indicating where the user is looking and matches them with the position coordinates of marker No. 1. The same matching is performed for marker No. 2 and the remaining calibration markers, so that calibration between the region the user looks at and the head-mounted display 110 can be completed.
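  • A minimal sketch of the calibration just described, assuming the raw gaze coordinates are related to display coordinates by an affine transform fitted over the 5x5 calibration markers; the function names and the least-squares fit are assumptions for illustration, not the procedure specified in this publication.

      # Hypothetical calibration sketch: fit a mapping from raw gaze coordinates to
      # display-image coordinates using the 25 calibration markers.
      import numpy as np

      def calibrate(marker_positions, gaze_samples):
          """marker_positions: (25, 2) display coordinates of the 5x5 calibration markers.
          gaze_samples: (25, 2) mean raw gaze coordinates recorded while gazing at each marker.
          Returns a (2, 3) affine transform mapping raw gaze coordinates to display coordinates."""
          src = np.hstack([gaze_samples, np.ones((len(gaze_samples), 1))])       # homogeneous coords
          transform, *_ = np.linalg.lstsq(src, marker_positions, rcond=None)     # least-squares fit
          return transform.T

      def to_display_coords(transform, gaze_xy):
          """Map one raw gaze sample into display-image coordinates."""
          x, y = gaze_xy
          return transform @ np.array([x, y, 1.0])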
  • The test chart transmitting unit 121 generates a test chart and provides it to the head-mounted display 110 (step S410).
  • The marker providing unit 122 provides the head-mounted display 110 with a marker indicating the specific target in the test chart to be read by the user (step S420).
  • The marker providing unit 122 may set an effective area for the specific target and display the effective area as the marker, in order to determine whether the user's gaze reaches the specific target.
  • For example, the marker providing unit 122 sets effective areas, indicated by red boxes, for the targets "5" and "9" in the test chart, and markers corresponding to the red boxes may be provided to the head-mounted display 110. That is, the user sees on the head-mounted display 110 a test chart in which the specific targets are marked, as shown in FIG. 8.
  • The specific target to be read by the user is determined sequentially in the test chart, and markers indicating each determined target are sequentially provided to the head-mounted display 110.
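  • As an illustration of the effective-area markers described above, the sketch below builds a box slightly larger than a target (like the red boxes in FIG. 8) and tests whether a gaze point falls inside it; the EffectiveArea type, the margin value, and all names are hypothetical, not defined by this publication.

      # Hypothetical effective-area sketch for a single target.
      from dataclasses import dataclass

      @dataclass
      class EffectiveArea:
          x: float       # left edge in display coordinates
          y: float       # top edge in display coordinates
          width: float
          height: float

          def contains(self, px: float, py: float) -> bool:
              """True if the gaze point (px, py) lies inside the effective area."""
              return (self.x <= px <= self.x + self.width
                      and self.y <= py <= self.y + self.height)

      def make_effective_area(target_x, target_y, target_size, margin=0.2):
          """Build an effective area slightly larger than the target itself."""
          pad = target_size * margin
          return EffectiveArea(target_x - pad, target_y - pad,
                               target_size + 2 * pad, target_size + 2 * pad)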
  • As the marker is displayed, the data receiving unit 123 receives from the gaze tracking sensor 111 the gaze tracking data according to the movement of the user's gaze, and from the speech recognition sensor 112 the speech recognition data obtained as the user reads the target indicated by the marker (step S430).
  • The dynamic visual acuity measuring unit 124 generates a dynamic visual acuity test result based on the gaze tracking data and the speech recognition data (step S440).
  • The dynamic visual acuity measuring unit 124 determines from the gaze tracking data whether the user's line of sight is located within the effective area, records the time when the user's line of sight reaches the effective area, and may also record the time when the speech recognition data is received.
  • The recorded time can be measured in units of 1/100 second and recorded in the form HH:MM'SS''.ss.
  • The dynamic visual acuity measuring unit 124 can determine whether the user's line of sight has reached the effective area based on the gaze tracking data, corresponding to the movement of the user's gaze, received by the data receiving unit 123. For example, as shown in FIG. 9, when the user's line of sight, indicated by a black dot, enters the effective area indicated by a red box, it can be determined that the line of sight has reached the effective area. When the user's line of sight reaches the effective area, the dynamic visual acuity measuring unit 124 records the time.
  • For example, the dynamic visual acuity measuring unit 124 records the time at which the user's gaze reaches the effective area of the target; 00:00'35''.99 may be recorded, for instance.
  • Likewise, when the user reads the target aloud, the dynamic visual acuity measuring unit 124 can record the time at which the speech recognition data is received.
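  • Continuing the hypothetical EffectiveArea sketch above, the following illustrates recording, with 1/100-second resolution, the moment the gaze first enters the effective area; the HH:MM'SS''.ss formatting follows the description above, while the function names and the streaming interface are assumptions.

      # Hypothetical sketch of timestamp recording for the dynamic visual acuity test.
      import time

      def format_hundredths(elapsed_seconds: float) -> str:
          """Format an elapsed time as HH:MM'SS''.ss (1/100-second units)."""
          hours, rem = divmod(int(elapsed_seconds), 3600)
          minutes, seconds = divmod(rem, 60)
          hundredths = int((elapsed_seconds - int(elapsed_seconds)) * 100)
          return f"{hours:02d}:{minutes:02d}'{seconds:02d}''.{hundredths:02d}"

      def record_gaze_arrival(effective_area, gaze_stream, start_time):
          """Return the elapsed time at which the gaze first enters the effective area."""
          for gx, gy in gaze_stream:                    # calibrated gaze samples
              if effective_area.contains(gx, gy):
                  return time.monotonic() - start_time  # e.g. formatted as 00:00'35''.99
          return None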
  • The dynamic visual acuity measuring unit 124 may calculate, using Equation (1) below, the time difference between the time when the user's line of sight reaches the effective area and the time when the speech recognition data is received.
  • When markers are provided sequentially and a plurality of time differences are calculated, the dynamic visual acuity measuring unit 124 may calculate the average of the time differences using Equation (2) below.
  • adr time: the time when the speech recognition data is received
  • tdr time: the time when the line of sight reaches the effective area
  • diff time: the time difference
  • cn: the number of targets
  • average difftime: the average value of the time differences
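  • The bodies of Equations (1) and (2) do not survive in this text; given the variable definitions above, a plausible reconstruction (an assumption, not the published formulas) is:

      \mathrm{diff\ time}_i = \mathrm{adr\ time}_i - \mathrm{tdr\ time}_i \qquad (1)

      \mathrm{average\ difftime} = \frac{1}{cn} \sum_{i=1}^{cn} \mathrm{diff\ time}_i \qquad (2)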
  • the moving body visual acuity measuring unit 124 may calculate the total number of errors by detecting an error for each of the vertical inspection and the horizontal inspection based on the voice recognition data and the index indicated by the marker on the inspection chart.
  • the moving body visual acuity measuring unit 124 can convert the voice in the voice recognition data into text using the STT (Speech To Text) algorithm.
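  • The publication states only that an STT algorithm is used; as a hedged illustration, the snippet below substitutes an off-the-shelf recognizer from the Python speech_recognition package, which is not the patent's implementation. The WAV-file input and Korean language code are likewise assumptions.

      # Illustrative STT conversion using the speech_recognition package (an assumed substitute).
      import speech_recognition as sr

      def speech_to_text(wav_path: str, language: str = "ko-KR") -> str:
          recognizer = sr.Recognizer()
          with sr.AudioFile(wav_path) as source:   # recorded speech recognition data
              audio = recognizer.record(source)
          return recognizer.recognize_google(audio, language=language)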
  • The dynamic visual acuity test consists of a vertical test using a vertical test chart and a horizontal test using a horizontal test chart.
  • The dynamic visual acuity test result can be generated on the basis of the time required for reading (the test execution time), the number read incorrectly as calculated by comparing the speech data with the test chart, the number read repeatedly, and the number omitted.
  • The control device 120 can calculate the total number of errors by detecting errors for each of the vertical test and the horizontal test based on the test chart and the speech recognition data received from the speech recognition sensor 112.
  • The total number of errors can be calculated by Equation (3) below, where Total Errors is the total number of errors, s is the number of targets read incorrectly, o is the number of targets omitted, a is the number of targets read additionally, and t is the number of targets transposed.
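  • Equation (3) likewise does not survive here; a plausible form, consistent with the variable definitions above and with the usual DEM error-counting convention (an assumption, not the published equation), sums the four error types:

      \mathrm{Total\ Errors} = s + o + a + t \qquad (3)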
  • The control device 120 may calculate an adjusted time based on the test execution time and the errors detected for each of the vertical test and the horizontal test.
  • ADJ Time: the adjusted time
  • Time: the test execution time
  • cn: the number of targets
  • o: the number of targets omitted
  • a: the number of additional readings
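  • The adjusted-time equation is also missing from this text; a form consistent with the listed variables and with the standard DEM adjusted-time formula (an assumption, not the published equation) is:

      \mathrm{ADJ\ Time} = \mathrm{Time} \times \frac{cn}{cn - o + a}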
  • The number of errors and the adjusted time analyzed on this basis can be provided as the dynamic visual acuity test result.
  • According to the present invention, it is possible not only to measure dynamic visual acuity but also to judge whether the user's reaction is slow despite good dynamic visual acuity, or whether the dynamic visual acuity itself is poor.
  • The dynamic visual acuity test method described above can also be implemented as computer-readable code on a computer-readable recording medium.
  • A computer-readable recording medium includes all kinds of recording devices in which data that can be read by a computer system is stored.
  • Examples of the computer-readable recording medium include ROM, RAM, CD-ROM, magnetic tape, hard disks, floppy disks, removable storage devices, nonvolatile memory, and optical data storage devices.
  • The computer-readable recording medium may also be distributed over computer systems connected via a computer network, and stored and executed as code that can be read in a distributed manner.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Human Computer Interaction (AREA)
  • Eye Examination Apparatus (AREA)
  • User Interface Of Digital Computer (AREA)
PCT/KR2018/007886 2017-07-13 2018-07-12 동체 시력 검사 방법 및 시스템 WO2019013563A1 (ko)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2020501481A JP2020528304A (ja) 2017-07-13 2018-07-12 動体視力検査方法およびシステム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2017-0089067 2017-07-13
KR1020170089067A KR101922343B1 (ko) 2017-07-13 2017-07-13 동체 시력 검사 방법 및 시스템

Publications (1)

Publication Number Publication Date
WO2019013563A1 true WO2019013563A1 (ko) 2019-01-17

Family

ID=64602880

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2018/007886 WO2019013563A1 (ko) 2017-07-13 2018-07-12 동체 시력 검사 방법 및 시스템

Country Status (3)

Country Link
JP (1) JP2020528304A (ja)
KR (1) KR101922343B1 (ja)
WO (1) WO2019013563A1 (ja)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111143640A (zh) * 2019-12-28 2020-05-12 中国银行股份有限公司 一种交易数据记录方法及装置
WO2021130743A1 (en) * 2019-12-25 2021-07-01 Shamir Optical Industry Ltd. System and method for automatically evaluating a vision of a user
CN113080842A (zh) * 2021-03-15 2021-07-09 青岛小鸟看看科技有限公司 一种头戴式视力检测设备、视力检测方法及电子设备
CN113508331A (zh) * 2019-03-29 2021-10-15 豪雅镜片泰国有限公司 测量被测试者的眼球的旋转特性的测量方法及渐进屈光力镜片的设计方法

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110377157B (zh) * 2019-07-22 2023-05-26 北京七鑫易维信息技术有限公司 一种应用于眼动追踪的校准方法、装置及系统
KR102226198B1 (ko) * 2019-08-27 2021-03-10 단국대학교 산학협력단 양안 흐름을 이용한 시력 진단 시스템 및 방법
DE102021202451A1 (de) * 2021-03-12 2022-09-15 Rodenstock Gmbh Verfahren, System und Computerprogrammprodukt zur Bestimmung optometrischer Parameter

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000126128A (ja) * 1998-10-26 2000-05-09 Nidek Co Ltd 動体視力検査装置
JP2003126036A (ja) * 2001-10-22 2003-05-07 Techno Network Shikoku Co Ltd 視覚検査設備、視覚検査装置および視覚検査方法
KR20100015538A (ko) * 2007-04-13 2010-02-12 나이키 인코포레이티드 근거리 및 원거리 시력의 검사 및/또는 훈련을 위한 시스템 및 방법
JP2016202716A (ja) * 2015-04-24 2016-12-08 株式会社コーエーテクモゲームス プログラム及び記録媒体
KR20170048072A (ko) * 2015-10-26 2017-05-08 에스케이플래닛 주식회사 웨어러블을 통한 안구상태 검사 및 관련 컨텐츠 제공 서비스

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4818234A (en) * 1986-06-25 1989-04-04 Redington Dana J Psychophysiological reflex arc training simulator
US9788714B2 (en) * 2014-07-08 2017-10-17 Iarmourholdings, Inc. Systems and methods using virtual reality or augmented reality environments for the measurement and/or improvement of human vestibulo-ocular performance
US11426069B2 (en) * 2013-03-13 2022-08-30 The Henry M. Jackson Foundation For The Advancement Of Military Medicine, Inc. Enhanced neuropsychological assessment with eye tracking

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000126128A (ja) * 1998-10-26 2000-05-09 Nidek Co Ltd 動体視力検査装置
JP2003126036A (ja) * 2001-10-22 2003-05-07 Techno Network Shikoku Co Ltd 視覚検査設備、視覚検査装置および視覚検査方法
KR20100015538A (ko) * 2007-04-13 2010-02-12 나이키 인코포레이티드 근거리 및 원거리 시력의 검사 및/또는 훈련을 위한 시스템 및 방법
JP2016202716A (ja) * 2015-04-24 2016-12-08 株式会社コーエーテクモゲームス プログラム及び記録媒体
KR20170048072A (ko) * 2015-10-26 2017-05-08 에스케이플래닛 주식회사 웨어러블을 통한 안구상태 검사 및 관련 컨텐츠 제공 서비스

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113508331A (zh) * 2019-03-29 2021-10-15 豪雅镜片泰国有限公司 测量被测试者的眼球的旋转特性的测量方法及渐进屈光力镜片的设计方法
EP3951482A4 (en) * 2019-03-29 2022-04-20 Hoya Lens Thailand Ltd. MEASUREMENT METHOD FOR MEASURING A ROTATION CHARACTERISTIC OF AN EYEBALL OF A SUBJECT AND ADJUSTMENT METHOD FOR PROGRESSIVE POWER LENS
CN113508331B (zh) * 2019-03-29 2023-12-15 豪雅镜片泰国有限公司 测量被测试者的眼球的旋转特性的测量方法及渐进屈光力镜片的设计方法
WO2021130743A1 (en) * 2019-12-25 2021-07-01 Shamir Optical Industry Ltd. System and method for automatically evaluating a vision of a user
CN111143640A (zh) * 2019-12-28 2020-05-12 中国银行股份有限公司 一种交易数据记录方法及装置
CN111143640B (zh) * 2019-12-28 2023-11-21 中国银行股份有限公司 一种交易数据记录方法及装置
CN113080842A (zh) * 2021-03-15 2021-07-09 青岛小鸟看看科技有限公司 一种头戴式视力检测设备、视力检测方法及电子设备
US11744462B2 (en) 2021-03-15 2023-09-05 Qingdao Pico Technology Co., Ltd. Head-mounted vision detection equipment, vision detection method and electronic device

Also Published As

Publication number Publication date
KR101922343B1 (ko) 2018-11-26
JP2020528304A (ja) 2020-09-24

Similar Documents

Publication Publication Date Title
WO2019013563A1 (ko) 동체 시력 검사 방법 및 시스템
US9004687B2 (en) Eye tracking headset and system for neuropsychological testing including the detection of brain damage
US7506979B2 (en) Image recording apparatus, image recording method and image recording program
US20130171596A1 (en) Augmented reality neurological evaluation method
WO2019208848A1 (ko) 3차원 안구 움직임 측정 방법 및 전자동 딥러닝 기반 어지럼 진단 시스템
CN103429141B (zh) 用于确定优势眼的方法
US8150118B2 (en) Image recording apparatus, image recording method and image recording program stored on a computer readable medium
WO2009145596A2 (ko) 얼굴 분석 서비스 제공 방법 및 장치
KR20140046652A (ko) 학습 모니터링 장치 및 학습 모니터링 방법
JP2010523290A (ja) 一元的視覚検査センター
EP3651457A1 (en) Pupillary distance measurement method, wearable eye equipment and storage medium
JP2022002761A (ja) 人の視覚挙動を検査する機器及びそのような装置を用いて眼鏡レンズの少なくとも1個の光学設計パラメータを決定する方法
WO2021071335A1 (ko) 스마트 안경의 시선 트래킹 시스템 및 그 방법
JP2005253778A (ja) 視線検出方法及び同装置
JP2019215688A (ja) 自動キャリブレーションを行う視線計測装置、視線計測方法および視線計測プログラム
US20110279665A1 (en) Image recording apparatus, image recording method and image recording program
JP2018180090A (ja) 作業教育システム
WO2021215800A1 (ko) 3차원 영상을 활용한 술기 교육 시스템 및 머신 러닝 기반의 수술 가이드 시스템
CN100443041C (zh) 具有测量功能的内窥镜系统及其测量方法
JP2005261728A (ja) 視線方向認識装置及び視線方向認識プログラム
JP3777439B2 (ja) 視覚検査設備および視覚検査方法
JPH0449943A (ja) 眼球運動分析装置
WO2019172503A1 (ko) 안구추적을 통한 시야장애 평가 장치, 이를 이용한 시야장애 평가 방법 및 컴퓨터 판독 가능한 저장 매체
US20210345924A1 (en) Evaluation device, evaluation method, and non-transitory compter-readable recording medium
JP2014149794A (ja) 視線分析装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18831447

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020501481

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18831447

Country of ref document: EP

Kind code of ref document: A1