WO2022138102A1 - Ophthalmic apparatus and control program for ophthalmic apparatus - Google Patents
Ophthalmic apparatus and control program for ophthalmic apparatus
- Publication number
- WO2022138102A1 (PCT/JP2021/044669)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- eye
- inspected
- image
- face image
- unit
- Prior art date
Links
- 238000000034 method Methods 0.000 claims abstract description 45
- 230000001815 facial effect Effects 0.000 claims abstract description 35
- 230000008569 process Effects 0.000 claims abstract description 18
- 238000007689 inspection Methods 0.000 claims description 69
- 238000001514 detection method Methods 0.000 claims description 45
- 238000003384 imaging method Methods 0.000 claims description 24
- 238000013459 approach Methods 0.000 claims 1
- 238000004458 analytical method Methods 0.000 abstract description 15
- 230000003287 optical effect Effects 0.000 description 75
- 210000001747 pupil Anatomy 0.000 description 59
- 238000005259 measurement Methods 0.000 description 31
- 210000004087 cornea Anatomy 0.000 description 10
- 239000000834 fixative Substances 0.000 description 8
- 238000010586 diagram Methods 0.000 description 7
- 238000010191 image analysis Methods 0.000 description 6
- 230000004907 flux Effects 0.000 description 4
- 230000007246 mechanism Effects 0.000 description 4
- 206010015995 Eyelid ptosis Diseases 0.000 description 2
- 230000000694 effects Effects 0.000 description 2
- 210000004709 eyebrow Anatomy 0.000 description 2
- 210000000720 eyelash Anatomy 0.000 description 2
- 230000006870 function Effects 0.000 description 2
- 238000005286 illumination Methods 0.000 description 2
- 230000004410 intraocular pressure Effects 0.000 description 2
- 238000012014 optical coherence tomography Methods 0.000 description 2
- 230000002035 prolonged effect Effects 0.000 description 2
- 201000003004 ptosis Diseases 0.000 description 2
- 230000004397 blinking Effects 0.000 description 1
- 238000004364 calculation method Methods 0.000 description 1
- 230000008859 change Effects 0.000 description 1
- 230000002950 deficient Effects 0.000 description 1
- 238000012850 discrimination method Methods 0.000 description 1
- 210000001061 forehead Anatomy 0.000 description 1
- 210000004209 hair Anatomy 0.000 description 1
- 238000010801 machine learning Methods 0.000 description 1
- 230000002093 peripheral effect Effects 0.000 description 1
- 230000009467 reduction Effects 0.000 description 1
- 238000007493 shaping process Methods 0.000 description 1
- 230000009466 transformation Effects 0.000 description 1
- 230000001052 transient effect Effects 0.000 description 1
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/14—Arrangements specially adapted for eye photography
- A61B3/15—Arrangements specially adapted for eye photography with means for aligning, spacing or blocking spurious reflection ; with means for relaxing
- A61B3/152—Arrangements specially adapted for eye photography with means for aligning, spacing or blocking spurious reflection ; with means for relaxing for aligning
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/0016—Operational features thereof
- A61B3/0041—Operational features thereof characterised by display arrangements
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/0075—Apparatus for testing the eyes; Instruments for examining the eyes provided with adjusting devices, e.g. operated by control lever
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/0083—Apparatus for testing the eyes; Instruments for examining the eyes provided with means for patient positioning
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/102—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for optical coherence tomography [OCT]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/12—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for looking at the eye fundus, e.g. ophthalmoscopes
Definitions
- the present disclosure relates to an ophthalmic apparatus for inspecting an eye to be inspected and a control program for the ophthalmic apparatus.
- a technique is known in which an ophthalmic apparatus having an inspection unit for inspecting an eye to be inspected takes a facial image including the left and right eyes to be inspected and analyzes the photographed facial image to detect the position of the eye to be inspected.
- automatic alignment then shifts to precise alignment based on an image of the anterior segment of the eye.
- depending on the condition of the subject's face and the examination environment, the position of the eye to be inspected may not be detected, or may be detected erroneously.
- the detection of the eye to be inspected may also take a long time. In such cases, there is a problem that alignment is not performed well.
- the technical subject of the present disclosure is to provide an ophthalmic apparatus and a control program for the ophthalmic apparatus that can perform good alignment.
- An ophthalmic apparatus including an inspection unit for inspecting an eye to be inspected comprises: a driving means for moving the inspection unit three-dimensionally relative to the eye to be inspected; a first imaging means for taking a facial image including the left and right eyes to be inspected; a second imaging means for capturing an image of the anterior segment of the eye to be inspected; and a control means for executing an adjustment process including a position acquisition step of acquiring the position of the eye to be inspected specified based on the facial image, a first drive control step of controlling the driving means, based on the acquired position, so that the eye to be inspected is included in the imaging range of the second imaging means, and a second drive control step of controlling the driving means based on the anterior segment image after the first drive control step and adjusting the relative position of the inspection unit with respect to the eye to be inspected. The position of the eye to be inspected is detected by analyzing the facial image.
- The control program executed in an ophthalmic apparatus provided with such imaging and driving means is executed by the control unit of the apparatus and causes the apparatus to execute the adjustment process, including the position acquisition step, the first drive control step, and the second drive control step of controlling the driving means based on the anterior segment image and adjusting the relative position of the inspection unit with respect to the eye to be inspected.
- The ophthalmic apparatus is further made to execute a first acquisition step of detecting the position of the eye to be inspected by analyzing the facial image and acquiring the position specified by the detection as a detection position, and a second acquisition step of accepting an input operation of the position of the eye to be inspected with respect to the facial image and acquiring the position specified based on the input operation as a designated position.
- FIGS. 1 to 12 are diagrams illustrating the configuration of the ophthalmic apparatus according to the embodiment.
- the ophthalmic apparatus (for example, the ophthalmic apparatus 1) includes an optometry means (for example, an examination unit 2) for inspecting an eye to be inspected, a driving means (for example, a driving unit 4), a first imaging means (for example, a face photographing unit 3), a second imaging means (for example, an anterior segment photographing optical system 60), a control means (for example, a control unit 70), a first acquisition means (for example, the control unit 70), and a second acquisition means (for example, the control unit 70).
- the optometry means includes an inspection optical system (for example, a measurement optical system 20) for inspecting (including measurement) the eye to be inspected.
- the ophthalmic apparatus may include at least one of a base (for example, a base 5) on which the optometry means is mounted, a face support means (for example, a face support unit 9), a detection means (for example, the control unit 70), an input means (for example, the operation unit 8), a notification means (for example, the display 7), a display means (for example, the display 7), and a discriminating means (for example, the control unit 70).
- the face support means is configured to support the subject's face in a certain positional relationship with respect to the base.
- the face support means may include a chin rest on which the subject's chin rests.
- the driving means is configured to move the optometry means three-dimensionally relative to the eye to be inspected.
- the optometry means is installed so that it can move in the X direction (horizontal direction), the Y direction (vertical direction), and the Z direction (front-back direction) with respect to the subject's eye supported by the face support means.
- the driving means moves the optometry means with respect to the base in the X, Y, and Z directions.
- the ophthalmic apparatus may include a chin rest driving means (for example, a chin rest driving unit 12).
- the chin rest driving means is provided in the face support means and drives the chin rest in the Y direction.
- the driving means may include the chin rest driving means as a configuration for moving the optometry means relative to the eye to be inspected in the Y direction.
- the first photographing means is configured to capture a facial image including the left and right eyes to be inspected.
- the second imaging means is configured to capture an image of the anterior segment of the eye to be inspected.
- the second photographing means is configured to take an image of the anterior segment of the eye to be inspected at a higher magnification than the photographing magnification of the first photographing means.
- the first photographing means may also be used as the second photographing means, and the photographing magnification may be changed.
- the detecting means accurately detects the alignment state of the optometry means with respect to the eye to be inspected based on the image of the anterior eye portion by the second photographing means.
- the display means displays a face image taken by the first shooting means.
- the display means may display an anterior eye portion image taken by the second photographing means.
- the display means may display the face image and the anterior eye portion image at the same time.
- the control means may display the facial image larger than the anterior eye image when performing alignment based on the facial image.
- the control means may display the anterior eye portion image larger than the facial image when the alignment shifts to the alignment based on the anterior eye portion image. For example, alignment based on the facial image adjusts the positional relationship between the eye to be inspected and the optometry means until alignment based on the anterior segment image becomes possible.
- the alignment based on the anterior ocular segment image aligns the positional relationship between the eye to be inspected and the optometry means to a predetermined positional relationship.
- the predetermined positional relationship is, for example, a positional relationship between the eye to be inspected and the optometry means in which the eye can be inspected by the optometry means.
- the input means is provided in the ophthalmology device so that the examiner can specify and input the position corresponding to the eye to be inspected with respect to the face image obtained by the first photographing means.
- the input means can input a designated position for roughly aligning the optometry means with respect to the eye to be inspected.
- the input means comprises a pointing device (eg, at least one human interface such as a touch panel, joystick, mouse, keyboard, trackball, button, etc.).
- the input means can input a designated position to the face image displayed on the display means at the time of rough alignment.
- the face image for inputting the designated position may be a moving image taken by the first shooting means or a still image.
- control means executes an adjustment process for adjusting the relative position of the examination unit with respect to the eye to be inspected by controlling the drive means.
- the adjustment process executed by the control means includes a position acquisition step, a first drive control step, and a second drive control step.
- the position acquisition step is a step of acquiring the position of the eye to be inspected specified based on the facial image.
- the first drive control step is a step of controlling the drive means so that the acquired position of the eye to be inspected is included in the imaging range of the second imaging means.
- the first drive control step may be a rough alignment for roughly aligning the inspection unit with respect to the eye to be inspected.
- the second drive control step is a step of controlling the drive means based on the image of the anterior eye portion after the first drive control step and adjusting the relative position of the examination unit with respect to the eye to be inspected.
- the second drive control step may be a precise alignment for precisely aligning the inspection unit with respect to the eye to be inspected.
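- as an informal illustration (not part of the disclosure), the Python sketch below outlines how the position acquisition step, the first drive control step (coarse alignment), and the second drive control step (precision alignment) could be chained; all callables passed in (acquire_eye_position, coarse_move, anterior_offset, fine_move) are hypothetical placeholders for device-specific routines.

```python
from typing import Callable, Optional, Tuple

Point2D = Tuple[float, float]            # (x, y) in face-image pixel coordinates
Offset3D = Tuple[float, float, float]    # (dx, dy, dz) drive increments


def adjustment_process(
    acquire_eye_position: Callable[[], Optional[Point2D]],  # detection result or examiner input
    coarse_move: Callable[[Point2D], None],                 # first drive control step
    anterior_offset: Callable[[], Optional[Offset3D]],      # misalignment from anterior image
    fine_move: Callable[[Offset3D], None],                  # second drive control step
    tolerance: float = 0.05,
    max_iterations: int = 50,
) -> bool:
    """Two-stage alignment: coarse alignment from the face image, then
    precision alignment from the anterior segment image."""
    eye_pos = acquire_eye_position()
    if eye_pos is None:
        return False  # acquisition error; error processing / examiner input handled elsewhere

    coarse_move(eye_pos)  # bring the eye into the anterior camera's imaging range

    for _ in range(max_iterations):
        offset = anterior_offset()
        if offset is None:
            continue  # alignment index not yet detected; keep trying
        if max(abs(c) for c in offset) < tolerance:
            return True  # aligned within tolerance; measurement can start
        fine_move(offset)
    return False
```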
- the first acquisition means detects the position of the eye to be inspected by analyzing the face image, and acquires the position of the eye to be inspected specified by the detection as the detection position.
- the second acquisition means accepts an input operation of the position of the eye to be inspected with respect to the face image, and acquires the position of the eye to be inspected specified based on the input operation as a designated position.
- the second acquisition means may acquire the designated position based on the input operation of the eye position with respect to the face image displayed on the display means in the position acquisition step.
- the control means executes the adjustment process of the first drive control step based on the position of the eye to be inspected acquired by the second acquisition means.
- the second acquisition means may detect the position of the eye to be inspected by analyzing a part of the face image based on the coordinates specified by the input operation, and acquire the position of the eye to be inspected specified by that detection as the designated position.
- a part of the face image to be analyzed is within a predetermined area with respect to the designated coordinates.
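- the following is a minimal sketch of limiting the analysis to a window around the designated coordinates; it assumes a grayscale face image held in a NumPy array with the tap lying inside the image, and the darkest-pixel search merely stands in for the actual pupil detection referenced above.

```python
import numpy as np


def detect_pupil_near(face_img: np.ndarray, tap_xy, half_size: int = 60):
    """Search for the pupil only inside a square window centred on the
    coordinates designated by the examiner's input operation."""
    if face_img.ndim == 3:                       # reduce a colour image to grayscale
        face_img = face_img.mean(axis=2)
    x, y = int(tap_xy[0]), int(tap_xy[1])
    h, w = face_img.shape[:2]
    x0, x1 = max(0, x - half_size), min(w, x + half_size)
    y0, y1 = max(0, y - half_size), min(h, y + half_size)
    window = face_img[y0:y1, x0:x1]

    # Crude stand-in for pupil detection: take the darkest pixel in the window.
    # A real implementation would use edge/shape analysis as referenced in the text.
    iy, ix = np.unravel_index(np.argmin(window), window.shape)
    return (x0 + int(ix), y0 + int(iy))          # pupil position in face-image coordinates
```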
- the display means may display the candidate for the position of the eye to be specified.
- the second acquisition means acquires a candidate designated from a plurality of candidates as a designated position by an input operation of the examiner.
- this allows alignment to be performed well even when the first acquisition means cannot detect the position of the eye to be inspected, or when the detection takes a long time, because of the condition of the subject's face (for example, the subject's mascara, eyelashes, or ptosis) or the examination environment captured by the first imaging means (for example, the background behind the subject's face, illumination light, or ambient light).
- when the control means accepts an input operation of a designated position after the adjustment process using the detection position has started, it may perform the first drive control step based on the designated position, regardless of the detection position from the first acquisition means. This makes it possible to perform good alignment even when detection of the eye to be inspected by the first acquisition means takes a long time.
- the control means may acquire the positions of the left eye and the right eye all at once in the position acquisition step for whichever of the left and right eyes is to be examined first. When the positions of the left eye and the right eye are acquired collectively in the position acquisition step, an input operation may be required for each eye to be inspected. As a result, the examiner can grasp that the positions of both the left eye and the right eye need to be input, which prevents a forgotten input.
- the control means performs the first drive control step according to a predetermined procedure using the designated position corresponding to one eye to be inspected, and after the examination of that eye is completed, may perform the first drive control step using the designated position corresponding to the opposite eye.
- the ophthalmic apparatus may be provided with a discriminating means (for example, a control unit 70) for discriminating which of the left eye and the right eye corresponds to the two designated positions.
- the discriminating means determines whether each of the two designated positions corresponds to the left eye or the right eye based on the positional relationship between the two designated positions, or on the positional relationship of the designated positions with respect to the face image. The control means then determines, based on the left/right discrimination result, which of the two designated positions corresponds to the eye to be inspected first, and starts executing the first drive control step using that designated position. This makes it possible to perform good alignment in the examination of both eyes.
- the predetermined procedure is, for example, a procedure that determines which of the left and right eyes is examined first. The predetermined procedure may also be defined, for example, so that the examination starts from the eye whose designated position was input first.
- the control means may wait to execute the first drive control step until both the left eye and right eye positions have been input. For example, if the position of one of the left and right eyes has been input and the position of the opposite eye is not input within a predetermined time, a notification means (for example, the display 7) that notifies the examiner and prompts the input operation of the eye position may be provided in the ophthalmic apparatus. Alternatively, for example, if the opposite eye position is not input within the predetermined time, the control means may start executing the first drive control step using the one eye position that was input.
- the discriminating means discriminates between the left and right eyes to be inspected based on whether the designated position exists in the right eye region or the left eye region in the face image when there is one designated position input by the input operation. It may be configured to do so.
- the control means shifts to the second drive control step after performing the first drive control step based on the input designated position. After the inspection of one of the left and right eyes is completed, the left-right direction in which the inspection unit should be moved relative to the uninspected eye is determined based on the discrimination result of the discriminating means, the drive means is controlled so as to bring the inspection unit closer to the uninspected eye in the determined direction, and the process shifts to the second drive control step. As a result, even when only one designated position has been input, good alignment can be performed in continuous inspection of both the left and right eyes.
- when acquisition of the detection position fails in the adjustment process using the detection position (that is, when an acquisition error occurs), the notification means notifies the examiner so as to prompt input of the designated position.
- the examiner can thus know that detection of the eye position by the first acquisition means has failed, and good alignment can be performed by having the examiner input the position of the eye to be inspected.
- control program software that performs the functions of the above-described embodiment may be supplied to the system or device via a network or various storage media. Then, a control unit (for example, a CPU or the like) of the system or the device can read and execute the program.
- the control program executed in the ophthalmic apparatus may be executed by the control unit of the ophthalmic apparatus to cause the apparatus to execute a control step that performs the adjustment process for adjusting the relative position of the examination unit with respect to the eye to be inspected.
- control step includes various processing steps as described above performed by the control means.
- in the following, an eye refractive power measuring apparatus will be described as an example of the ophthalmic apparatus, but the present disclosure can also be applied to other ophthalmic devices such as a corneal curvature measuring device, an intraocular pressure measuring device, a fundus camera, an OCT (Optical Coherence Tomography) device, an SLO (Scanning Laser Ophthalmoscope), and a microperimeter.
- the ophthalmic appliance of this embodiment objectively measures, for example, the refractive power of the eye to be inspected.
- the ophthalmologic device of this embodiment may be a device that performs measurement for each eye or may perform measurement for both eyes at the same time (binocular vision).
- the ophthalmic apparatus mainly includes, for example, an examination unit, an imaging unit, a drive unit, and a control unit.
- the ophthalmic apparatus 1 of the present embodiment mainly includes an examination unit 2, a face photographing unit 3, and a driving unit 4.
- the inspection unit 2 inspects the eye to be inspected.
- the inspection unit 2 may include, for example, an optical system for measuring the optical power of the eye to be inspected, the curvature of the cornea, the intraocular pressure, and the like. Further, the inspection unit 2 may be provided with an optical system or the like for photographing the anterior eye portion, the fundus, etc. of the eye to be inspected. In this embodiment, the inspection unit 2 for measuring the refractive power will be described as an example.
- the face photographing unit 3 photographs, for example, the face of the eye to be inspected.
- the face photographing unit 3 photographs, for example, a face including the left and right eyes to be inspected.
- the drive unit 4 moves, for example, the inspection unit 2 and the face photographing unit 3 in the vertical, horizontal, front-back directions (three-dimensional directions) with respect to the base 5.
- the ophthalmic appliance 1 of the present embodiment may include, for example, a housing 6, a display 7, an operation unit 8, a face support unit 9, and the like.
- the housing 6 houses an inspection unit 2, a face photographing unit 3, a driving unit 4, and the like.
- the display 7 displays, for example, a face image If by the face photographing unit 3, an anterior eye portion image Ia by the anterior eye portion photographing optical system 60, a measurement result, and the like.
- the display of the display 7 is switched from the face image If to the anterior eye image Ia when the precision alignment (see FIG. 4) described later is started.
- the face image If and the anterior eye portion image Ia may be displayed on the display 7 at the same time.
- the control unit 70 may display on the display 7 whichever of the face image If and the anterior segment image Ia is selected by the examiner.
- the ophthalmologic apparatus 1 is provided with a selection means (for example, a switch) for selecting whether to display the facial image If or the anterior eye portion image Ia.
- the display 7 may be provided integrally with the ophthalmologic device 1, or may be provided separately from the device, for example.
- the ophthalmic appliance 1 may include an operation unit 8.
- the operation unit 8 includes a pointing device capable of designating a position with respect to the screen of the display 7.
- the pointing device may be various human interfaces such as a touch panel, a joystick, a mouse, a keyboard, a trackball, and a button.
- Various operation instructions by the examiner are input to the operation unit 8.
- when the display 7 has a touch function, the display 7 also serves as the operation unit 8. That is, the examiner operates the ophthalmic apparatus 1 by touching the display 7.
- the operation unit 8 is used for various settings of the ophthalmic appliance 1 and operations at the start of measurement. Further, in the present embodiment, the operation unit 8 is used by the examiner to specify the coordinates (position) on the face image If displayed on the display 7. For example, a two-dimensional coordinate axis (x-axis and y-axis) is set in advance on the display 7, and a position (designated point) touched by the examiner is recognized by the control unit 70 (see FIG. 2).
- the two-dimensional coordinate axes (x-axis and y-axis) on the display 7 are names used for explanation, and are distinct from the X direction (horizontal direction), Y direction (vertical direction), and Z direction (front-back direction) in which the inspection unit 2 is driven.
- the manner in which the control unit 70 recognizes touch input is the same as that described in Japanese Patent Application Laid-Open No. 2014-205878, which may be referred to for details.
- the face support portion 9 may include, for example, a forehead pad 10 and a chin rest 11.
- the chin rest 11 may be moved in the vertical direction by driving the chin rest driving unit 12.
- the ophthalmic appliance 1 includes a control unit 70.
- the control unit 70 controls various controls of the ophthalmic appliance 1.
- the control unit 70 includes, for example, a general CPU (Central Processing Unit) 71, a flash ROM 72, a RAM 73, and the like.
- the flash ROM 72 stores a control program, initial values, and the like for controlling the ophthalmic appliance 1.
- the RAM 73 temporarily stores various types of information.
- the control unit 70 is connected to an inspection unit 2, a face photographing unit 3, a drive unit 4, a display 7, an operation unit 8, a chin rest drive unit 12, a storage unit (for example, a non-volatile memory) 74, and the like.
- the storage unit 74 is, for example, a non-transient storage medium capable of retaining the stored contents even when the power supply is cut off.
- a hard disk drive, a detachable USB flash memory, or the like can be used as the storage unit 74.
- the face photographing unit 3 can photograph a face including the left and right eyes to be inspected, for example.
- the face photographing unit 3 of this embodiment includes, for example, a photographing optical system 3A for photographing the face of the subject.
- the photographing optical system 3A mainly includes, for example, an image pickup element 3Aa and an image pickup lens 3Ab.
- the face photographing unit 3 is, for example, a non-telecentric optical system. As a result, for example, it is not necessary to provide a telecentric lens or the like, and the configuration can be simplified. In addition, the shooting range can be widened as compared with the telecentric optical system.
- the face photographing unit 3 of this embodiment is moved together with the inspection unit 2 by the driving unit 4.
- the face photographing unit 3 may be fixed to the base 5 and may not move, for example.
- the face photographing unit 3 is provided above the inspection unit 2, but the position of the face photographing unit 3 is not limited to this.
- the face photographing unit 3 may be provided below the inspection unit 2 or may be provided laterally.
- in the left-right direction, the position of the face photographing unit 3 (the optical axis of the photographing optical system 3A) is the same as the optical axis L2 of the inspection unit 2, but the position is not limited to this. For example, assuming that measurement is started from the subject's right eye, the initial position of the examination unit 2 in the left-right direction may be located on the right eye side as viewed from the subject, while the face photographing unit 3 may be positioned with reference to the chin rest 11.
- the face photographing unit 3 may be provided so that the measurement optical axis of the inspection unit 2 and the photographing optical axis of the face photographing unit 3 are coaxial. Further, the face photographing unit 3 may be arranged independently of the movement of the inspection unit 2.
- the face photographing unit 3 may be provided on the base 5 so as to be three-dimensionally drivable, and may be driven three-dimensionally with respect to the eye to be inspected by a drive unit (second drive unit) different from the drive unit (first drive unit) 4.
- the first drive unit for moving the inspection unit 2 and the second drive unit for moving the face photographing unit 3 may be used in combination.
- the inspection unit 2 measures, inspects, photographs, etc. the eye to be inspected.
- the inspection unit 2 may include, for example, a measurement optical system for measuring the refractive power of the eye to be inspected.
- the inspection unit 2 may include a measurement optical system 20, a fixation target presentation optical system 40, an alignment index projection optical system 50, and an observation optical system (anterior segment photographing optical system) 60.
- the measurement optical system 20 includes a projection optical system 20a and a light receiving optical system 20b.
- the projection optical system 20a projects a luminous flux onto the fundus Ef through the pupil of the eye to be inspected.
- the light receiving optical system 20b takes out the reflected light flux (reflected light from the fundus) from the fundus Ef through the peripheral portion of the pupil in a ring shape, and captures a ring-shaped reflected image of the fundus mainly used for measuring the refractive power.
- the projection optical system 20a has a light source 21, a relay lens 22, a hole mirror 23, and an objective lens 24 on the optical axis L1.
- the light source 21 projects a spot-shaped light source image from the relay lens 22 onto the fundus Ef via the objective lens 24 and the central portion of the pupil.
- the light source 21 is moved in the optical axis L1 direction by the moving mechanism 33.
- the hole mirror 23 is provided with an opening through which the light flux from the light source 21 is passed through the relay lens 22.
- the hole mirror 23 is arranged at a position optically conjugate with the pupil of the eye to be inspected.
- the light receiving optical system 20b shares the hole mirror 23 and the objective lens 24 with the projection optical system 20a. Further, the light receiving optical system 20b includes a relay lens 26 and a total reflection mirror 27. Further, the light receiving optical system 20b has a light receiving diaphragm 28, a collimator lens 29, a ring lens 30, and an image pickup element 32 on the optical axis L2 in the reflection direction of the hole mirror 23. A two-dimensional light receiving element such as an area CCD can be used for the image pickup element 32.
- the light receiving diaphragm 28, the collimator lens 29, the ring lens 30, and the image pickup element 32 are moved integrally with the light source 21 of the projection optical system 20a in the optical axis L2 direction by the moving mechanism 33. When the light source 21 is arranged at a position optically conjugate with the fundus Ef by the moving mechanism 33, the light receiving diaphragm 28 and the image pickup element 32 are also arranged at a position optically conjugate with the fundus Ef.
- the ring lens 30 is an optical element for shaping the fundus reflected light guided from the objective lens 24 via the collimator lens 29 into a ring shape.
- the ring lens 30 has a ring-shaped lens portion and a light-shielding portion.
- the ring lens 30 is arranged at a position optically conjugate with the pupil of the eye to be inspected.
- the image pickup device 32 receives the ring-shaped fundus reflected light (hereinafter referred to as a ring image) via the ring lens 30.
- the image sensor 32 outputs the image information of the received ring image to the control unit 70.
- the control unit 70 displays the ring image on the display 7 and calculates the refractive power based on the ring image.
- the dichroic mirror 39 is arranged between the objective lens 24 and the eye to be inspected.
- the dichroic mirror 39 transmits the light emitted from the light source 21 and the fundus reflected light corresponding to the light from the light source 21. Further, the dichroic mirror 39 guides the light flux from the fixation target presentation optical system 40, which will be described later, to the eye to be inspected. Further, the dichroic mirror 39 reflects the anterior segment reflected light of the light from the alignment index projection optical system 50 described later, and guides it to the anterior segment photographing optical system 60.
- an alignment index projection optical system 50 is arranged in front of the eye to be inspected.
- the alignment index projection optical system 50 mainly projects an index image used for alignment of the optical system with respect to the eye to be inspected to the anterior eye portion.
- the alignment index projection optical system 50 includes a ring index projection optical system 51 and an index projection optical system 52.
- the ring index projection optical system 51 projects diffused light onto the cornea of the subject's eye E to project the ring index 51a.
- the ring index projection optical system 51 is also used as anterior segment illumination for illuminating the anterior segment of the subject's eye E in the ophthalmic apparatus 1 of the present embodiment.
- the index projection optical system 52 projects parallel light onto the cornea of the eye to be inspected and projects the infinity index 52a.
- the fixation target presentation optical system 40 includes a light source 41, a fixation target 42, a relay lens 43, and a reflection mirror 46 arranged along the optical axis L4.
- the fixation target 42 is used to make the eye to be inspected fixate when measuring the objective refractive power.
- the fixation target 42 is illuminated by the light source 41 and presented to the eye to be inspected.
- the light source 41 and the fixation target 42 are moved integrally in the direction of the optical axis L4 by the drive mechanism 48.
- the presentation position (presentation distance) of the fixation target may be changed by moving the light source 41 and the fixation target 42. This makes it possible to measure the refractive power while fogging the eye to be inspected.
- the anterior segment photographing optical system 60 includes an image pickup lens 61 and an image pickup element 62 on the optical axis L3 in the reflection direction of the half mirror 63.
- the image sensor 62 is arranged at a position optically conjugate with the anterior segment of the eye to be inspected.
- the image sensor 62 photographs the anterior eye portion illuminated by the ring index projection optical system 51.
- the output from the image sensor 62 is input to the control unit 70.
- the anterior eye portion image Ia of the eye to be inspected taken by the image sensor 62 is displayed on the display 7 (see FIG. 2).
- the image sensor 62 also photographs the alignment index image (in this embodiment, the ring index 51a and the infinity index 52a) formed on the cornea of the eye to be inspected by the alignment index projection optical system 50.
- the control unit 70 can detect the alignment index image based on the image pickup result of the image sensor 62. Further, the control unit 70 can determine the suitability of the alignment state based on the position where the alignment index image is detected.
- the operation of the ophthalmic appliance 1 of this embodiment will be described with reference to the flowcharts of FIGS. 4 to 5.
- in this embodiment, detection of the pupil position of the eye to be inspected based on the face image If, coarse alignment based on the detected pupil position, and precision alignment based on the anterior segment image Ia are performed automatically.
- the control unit 70 performs measurement (inspection) when the precise alignment is completed.
- the flowcharts of FIGS. 4 to 5 are cases where the right eye and the left eye of the subject are continuously inspected.
- although the position of the pupil is detected as the position of the eye to be inspected in this embodiment, the present disclosure is not limited to this, and other characteristic sites such as the inner corner of the eye may be detected.
- the inspection unit 2 is located at the rear position away from the subject and at the center on the left and right with respect to the base 5.
- the face image If including the left and right eyes of the subject is photographed by the face photographing unit 3, and the control unit 70 acquires the face image If.
- the control unit 70 analyzes the acquired facial image If and detects the positions of the left and right pupils (pupil EPR of the right eye and pupil EPL of the left eye) (S101).
- the pupil position of the eye to be inspected specified by the detection is acquired as a detection position.
- the control unit 70 may detect the edge of the face image and detect the position of the pupil from the face image based on the shape or the like.
- for the method of detecting the pupil, refer to the method described in JP-A-2017-196304.
- in S104, a method of setting the direction in which the control unit 70 moves the inspection unit 2 will be briefly described.
- the eye to be examined first is set in the program in advance. For example, the apparatus is preset to start the examination from the right eye.
- the control unit 70 identifies the pupil EPR of the right eye among the pupil positions of both eyes detected by the analysis of the facial image If.
- the control unit 70 performs a calculation based on the identified position (x, y) (two-dimensional coordinates) of the pupil EPR of the right eye, and obtains the direction (three-dimensional coordinates) in which the pupil EPR of the right eye exists with respect to the inspection unit 2.
- the control unit 70 controls the drive of the drive unit 4 and moves the inspection unit 2 in the required direction to perform rough alignment with respect to the right eye (S105).
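- the conversion from two-dimensional pupil coordinates on the face image to a movement direction is device specific and not detailed here; the sketch below only illustrates the idea under a simplifying assumption of a fixed pixel-to-millimetre scale (the scale and step values are invented for illustration).

```python
def coarse_move_vector(pupil_xy, image_center_xy, mm_per_pixel=0.1, z_step_mm=20.0):
    """Very rough conversion of a pupil position on the face image into an
    (X, Y, Z) movement of the examination unit.  The scale factor is a stand-in;
    the real mapping depends on the face camera's geometry and working distance."""
    dx_px = pupil_xy[0] - image_center_xy[0]
    dy_px = pupil_xy[1] - image_center_xy[1]
    move_x = dx_px * mm_per_pixel      # horizontal approach toward the pupil
    move_y = -dy_px * mm_per_pixel     # image y grows downward, stage Y grows upward
    move_z = z_step_mm                 # advance toward the subject by a fixed step
    return (move_x, move_y, move_z)
```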
- control unit 70 shifts from coarse alignment to precision alignment (S109).
- the control unit 70 analyzes the anterior eye portion image Ia imaged by the anterior eye portion photographing optical system 60.
- the control unit 70 shifts from the coarse alignment to the precise alignment.
- the control unit 70 controls the drive of the drive unit 4 based on the alignment indices projected by the alignment index projection optical system 50, performing alignment in the XY directions that brings the measurement optical axis of the inspection unit 2 into coincidence with the center of the cornea or the center of the pupil, and alignment in the Z direction that adjusts the distance between the examination unit and the cornea of the eye to be inspected to a predetermined working distance.
- as the alignment in the XY directions, the control unit 70 changes the position of the inspection unit 2 so that the measurement optical axis of the inspection unit 2 and the center of the cornea coincide, based on the ring index 51a projected by the ring index projection optical system 51. For example, the center of the cornea and the measurement optical axis are aligned by changing the position of the inspection unit 2 in the XY directions so that the center of the ring of the ring index 51a coincides with the measurement optical axis.
- as the alignment in the Z direction, the control unit 70 detects the alignment state in the working distance direction based on the ratio between the spacing of the infinity indices 52a projected by the index projection optical system 52 and the diameter of the ring index 51a projected by the ring index projection optical system 51, and moves the inspection unit 2 based on the detection result.
- as the working distance alignment technique, for example, the technique described in Japanese Patent Application Laid-Open No. 10-127581 can be used; refer to it for details.
- the alignment method based on these indexes is an example of precision alignment, and is not limited to this configuration and method.
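- purely as an illustration of the two quantities named above (the offset of the ring index 51a from the measurement optical axis for XY, and the ratio of the infinity-index spacing to the ring-index diameter for Z), a hedged sketch follows; the gains and the target ratio are placeholders, not calibration values of any actual device.

```python
def precision_alignment_offsets(ring_center_xy, axis_xy, infinity_sep_px,
                                ring_diameter_px, target_ratio=0.5,
                                xy_gain_mm_per_px=0.02, z_gain_mm=5.0):
    """XY offset: distance between the ring-index centre and the measurement axis.
    Z offset: derived from the ratio of the infinity-index spacing to the
    ring-index diameter (the ratio changes with working distance).
    All gains and the target ratio are illustrative, not device calibration values."""
    dx = (ring_center_xy[0] - axis_xy[0]) * xy_gain_mm_per_px
    dy = (ring_center_xy[1] - axis_xy[1]) * xy_gain_mm_per_px
    ratio = infinity_sep_px / ring_diameter_px
    dz = (ratio - target_ratio) * z_gain_mm   # sign/scale depend on the optical design
    return dx, dy, dz
```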
- the control unit 70 automatically emits a trigger signal for starting measurement, and the measurement optical system 20 of the inspection unit 2 starts measurement of the eye to be inspected.
- the inspection unit 2 measures the refractive power of the eye to be inspected (S110).
- the control unit 70 controls the drive of the drive unit 4 based on the position corresponding to the unmeasured left eye pupil EPL among the positions of the left and right pupils acquired in S101, and performs rough alignment (S111).
- control unit 70 performs precision alignment (S112) in the same manner as the method performed on the measured eye to be inspected, and when the alignment is completed, automatically emits a trigger signal to perform measurement (S113).
- the control unit 70 starts detecting the pupil positions of the left and right eyes by analyzing the face image If captured by the face photographing unit 3 (S101).
- the position of the subject's eye may be erroneously detected due to the subject's mascara, eyelashes, ptosis, etc., and as a result, pupil detection by facial image analysis fails (that is, an acquisition error occurs).
- the control unit 70 determines whether the pupil detection by the face image analysis is successful within a predetermined time (S103). Further, if the position of the pupil is not acquired within a predetermined time (timeout), the control unit 70 determines that the pupil detection has failed (acquisition error).
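- the timeout handling of S103 could be sketched as follows (illustrative only; detect_once is a hypothetical callable that returns both pupil positions or None).

```python
import time


def detect_pupils_with_timeout(detect_once, timeout_s=5.0, poll_s=0.1):
    """Repeat face-image analysis until both pupil positions are found or the
    predetermined time elapses; a timeout is treated as an acquisition error
    (S103 -> S106), after which the examiner is asked to designate the positions."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        result = detect_once()        # returns (right_xy, left_xy) or None
        if result is not None:
            return result
        time.sleep(poll_s)
    return None                        # acquisition error: fall back to manual input
```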
- the control unit 70 performs error processing (S106).
- the control unit 70 displays on the display 7 a notification that detection of the position (pupil position) of the eye to be inspected has failed and that the position corresponding to the eye to be inspected needs to be designated via the operation unit 8.
- the control unit 70 displays a message 201 prompting the examiner to touch the pupil on the face image displayed on the display 7 (see FIG. 8).
- the message 201 is erased by touching the "OK" display 201a, and the apparatus shifts to a state of accepting input on the display 7.
- the control unit 70 treats the position on the face image touched by the examiner as the pupil position, acquires that position in place of the pupil position that would have been detected by the face image analysis of S101, and performs rough alignment by controlling the drive unit 4 using the input designated position.
- the touch to the display 7 is an example of a method of designating the pupil position on the face image, and is not limited to this.
- the pupil position on the face image may instead be designated by another method, such as with a mouse or a button.
- two designated positions, a designated position SP1 corresponding to the subject's right eye and a designated position SP2 corresponding to the left eye, can be input on the face image If.
- the control unit 70 waits for the start of coarse alignment until the second designated position is input. If the second designated position is not input within the predetermined time, the control unit 70 may display a message to that effect to prompt the input of the second designated position.
- the display of this message is an example of the notification means, and the display is not limited to this, and the examiner may be prompted to input the designated position by other means such as voice or blinking of a light.
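- a simple sketch of this wait-and-prompt behaviour is shown below; get_taps and prompt are hypothetical callables, and the fallback to a single position after a second timeout reflects the alternative described earlier.

```python
import time


def collect_designated_positions(get_taps, prompt, timeout_s=10.0, poll_s=0.1):
    """Hold off coarse alignment until two designated positions exist.  If the
    second tap does not arrive in time, prompt the examiner; if it still does
    not arrive, fall back to starting alignment with the single position."""
    for attempt in range(2):
        deadline = time.monotonic() + timeout_s
        while time.monotonic() < deadline:
            taps = get_taps()          # list of (x, y) positions input so far
            if len(taps) >= 2:
                return taps[:2]
            time.sleep(poll_s)
        if attempt == 0:
            prompt("Please touch the second eye position on the face image.")
    return get_taps()[:1]              # proceed with the one position that was input
```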
- the face image If displayed on the display 7 when the examiner inputs a designated position corresponding to the eye to be inspected may be a moving image taken by the face photographing unit 3 or a still image.
- the control unit 70 may display the still image of the face image If at the time when the designated position is input on the display 7. For example, the control unit 70 may temporarily switch the display of the display 7 from a moving image to a still image while the examiner is touching the display 7 to input a designated position.
- the designated position on the still image may be changed by the input of the examiner.
- the control unit 70 may stop the pupil detection process based on the face image while the examiner changes the designated position on the still image.
- control unit 70 acquires the coordinates of the designated position and obtains the direction in which the inspection unit 2 is moved based on the acquired coordinates (S107).
- the drive unit 4 is controlled to perform rough alignment of the inspection unit 2 (S108).
- a method of performing rough alignment based on the input of the designated position will be described with reference to the flowchart of FIG.
- the control unit 70 acquires the coordinates on the face image If displayed on the display 7 for each of the designated positions (S201).
- the control unit 70 may control the display of the display 7 so that the locations touched by the examiner can be identified, superimposing visually distinguishable marks on the touched positions SP1 and SP2. Further, so that the examiner does not confuse the left and right of the subject's eyes in the face image If displayed on the display 7, the control unit 70 may show a left/right display 76 in which a right-eye indication 76R is displayed on the left side of the face image If and a left-eye indication 76L is displayed on the right side of the face image If. Further, the control unit 70 may display a clear button 78 so that the input can be redone when the examiner touches a wrong position.
- when the clear button 78 is operated, the control unit 70 discards the input position and accepts the input from the examiner again. Further, the control unit 70 may store in the storage unit 74 the face image If together with the marks of the input positions at the time the designated positions were input to the face image If. In that case, the control unit 70 may allow the designated positions to be confirmed by recalling the image and displaying it on the display 7.
- alternatively, the control unit 70 may acquire the most recently input position as the designated position. In that case, the control unit 70 may use the discriminating means described later to detect that a position for the same left or right eye has been input more than once.
- the control unit 70 determines which of the two designated positions corresponds to the positions of the left and right eyes to be inspected (S202). For example, the control unit 70 determines the left and right of the eye to be inspected based on the positional relationship between the two designated positions. That is, the control unit 70 refers to the coordinates on the display 7 for each of the two designated positions and compares the positions in the x direction. According to this, the control unit 70 can obtain the left-right relationship of the two designated positions. For example, the designated position on the left side corresponds to the eye to be inspected on the right, and the designated position on the right side corresponds to the eye to be inspected on the left.
- the left and right discrimination regarding the two designated positions may be performed based on the positional relationship of the designated positions with respect to the face image If. That is, it is determined that the designated position existing in the region on the left side of the face image If corresponds to the right eye and the designated position existing in the region on the right side corresponds to the left eye.
- the left side region and the right side region are, for example, experimentally predetermined regions.
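- the left/right discrimination described in S202 (and the single-position case described later for S302) can be illustrated with the following sketch, which relies only on the fact that the subject's right eye appears on the left side of a frontal face image.

```python
def assign_left_right(tap_a, tap_b):
    """Decide which designated position corresponds to the subject's right eye
    and which to the left eye.  In a face image taken from the front, the
    subject's right eye appears on the left side of the image, so the tap with
    the smaller x coordinate is taken as the right eye."""
    right_eye, left_eye = sorted([tap_a, tap_b], key=lambda p: p[0])
    return {"right": right_eye, "left": left_eye}


def classify_single_tap(tap, image_width):
    """When only one position is designated, decide left/right from which half
    of the face image was touched."""
    return "right" if tap[0] < image_width / 2 else "left"
```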
- based on the left/right discrimination result of S202, the control unit 70 determines which of the two designated positions corresponds to the eye predetermined to be measured first, and determines the direction in which the inspection unit 2 is to be moved based on the coordinates of that designated position (S203).
- that is, the two-dimensional coordinates (x, y) on the face image that would otherwise be the pupil position detected by the face image analysis used in S105 described above may simply be replaced with the coordinates (x, y) of the designated position specified by the examiner.
- the control unit 70 associates the left and right discrimination results with the left and right measurement results.
- the control unit 70 controls the drive unit 4 based on the designated position and moves the inspection unit 2 in the required direction to perform rough alignment (S108).
- the position of the chin rest 11 may be adjusted based on the designated position with respect to the face image If. For example, when the position of the eye to be inspected corresponding to the designated position in the Y direction is outside the movement range of the inspection unit 2 (optical axis L1) by the drive unit 4, the control unit 70 may control the drive of the chin rest driving unit 12 and move the chin rest 11 in the Y direction so that the eye position corresponding to the designated position comes within the moving range of the inspection unit 2.
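- a hedged sketch of this chin rest adjustment is given below; the Y range limits and the sign convention (positive shift = chin rest up) are assumptions for illustration.

```python
def adjust_chinrest_if_needed(eye_y_mm, unit_y_min_mm, unit_y_max_mm, move_chinrest):
    """If the eye position corresponding to the designated position lies outside
    the Y range the drive unit can reach, shift the chin rest so the eye comes
    back into range (positive shift = chin rest up).  Values are illustrative."""
    if eye_y_mm < unit_y_min_mm:
        move_chinrest(unit_y_min_mm - eye_y_mm)   # raise the face until the eye is reachable
    elif eye_y_mm > unit_y_max_mm:
        move_chinrest(unit_y_max_mm - eye_y_mm)   # lower the face (negative shift)
    # otherwise the eye is already within reach and the chin rest stays put
```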
- as in the case where the position of the eye to be inspected is normally detected by the face image analysis described above, when the alignment index projected onto the anterior segment by the alignment index projection optical system 50 is detected by analysis of the anterior segment image, the control unit 70 shifts from the control for coarse alignment to the control for precision alignment.
- the control unit 70 controls the drive of the drive unit 4 to perform precise alignment based on the alignment index based on the front eye portion image (S109).
- the control unit 70 measures the eye to be inspected (S110).
- control unit 70 performs rough alignment on the unmeasured eye among the left and right eyes to be inspected (S111).
- the control unit 70 moves the inspection unit 2 using the coordinates (x, y) corresponding to the other unmeasured eye to be inspected among the coordinates of the designated position with respect to the face image If acquired in S107. Decide the direction to make it.
- the control unit 70 performs rough alignment by moving the inspection unit 2 in the required direction (S111).
- control unit 70 performs precision alignment (S112) and measurement (S113) in the same manner as the method used for the eye to be measured first.
- the ophthalmic apparatus 1 is provided with an examination-eye selection switch (eye selection means, not shown) with which a binocular examination mode for continuously inspecting the subject's right and left eyes, a right eye examination mode for selectively inspecting only the right eye, and a left eye examination mode for inspecting only the left eye can be selected.
- the examiner may mistakenly recognize the left and right sides of the eye to be inspected and specify the position of the inspected eye in the wrong direction.
- the control unit 70 guides the examiner to input the designated position for the selected eye on the face image.
- for example, the control unit 70 controls the display of the display 7 to perform a guidance display so that the examiner can recognize which eye should be touched.
- for example, the control unit 70 displays the face image If on the display 7 with a difference in brightness between the region to the left of the left-right center of the face image and the region to the right of it. With such a guidance display, the examiner is guided to touch and designate the position of the eye to be inspected contained in the brighter (emphasized) side of the face image.
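- one possible way to render such a brightness-difference guidance display, assuming the face image is held as a NumPy array, is sketched below (the dimming factor is arbitrary).

```python
import numpy as np


def emphasize_selected_side(face_img: np.ndarray, selected: str, dim: float = 0.4) -> np.ndarray:
    """Return a copy of the face image in which the half that does NOT contain
    the selected eye is darkened.  Remember that the subject's right eye appears
    on the left half of the image."""
    out = face_img.astype(np.float32).copy()
    mid = out.shape[1] // 2
    if selected == "right":
        out[:, mid:] *= dim    # dim the image's right half (subject's left eye)
    else:
        out[:, :mid] *= dim    # dim the image's left half (subject's right eye)
    return np.clip(out, 0, 255).astype(np.uint8)
```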
- the guidance display is not limited to the example shown in the figure.
- the display of the eye selected by the eye selection switch may be displayed so as to be distinguishable from the other.
- the control unit 70 changes the color corresponding to the selected eye to a predetermined color (for example, orange).
- alternatively, a part of the face image based on the designated positions (SP1, SP2) may be set as the analysis target, and the control may be such that the drive unit 4 is driven based on the pupil positions (EPR, EPL) detected by analyzing that partial region (see FIG. 11).
- this control method is suitably applied to cases where pupil position detection by face image analysis goes wrong and an incorrect part is detected as the pupil (for example, hair, the shadow of a mask covering the nose and mouth, dark areas in the background outside the face, reflected-light noise due to makeup, moles, eye patches, scars, eyebrows, eyelashes, and the like may cause false detection of the pupil). Further, this control method is also suitably applied when the examiner touches the display 7 with a finger but, because of a low skill level, cannot accurately touch and designate the position of the pupil.
- when no pupil can be detected in that partial region, the control unit 70 may perform alignment with respect to the designated positions (SP1, SP2) themselves. In that case, the control unit 70 may display a message on the display 7 to notify the examiner that the eye to be inspected could not be detected and that alignment is performed with respect to the designated positions (SP1, SP2).
- when detecting the pupil position, the control unit 70 may obtain a reliability based on the characteristics of the detected part (for example, the difference in contrast with the surroundings and the shape).
- the reliability is a value for evaluating the characteristics of the detection site. For example, if the reliability is lower than a predetermined threshold value, the control unit 70 may determine that an erroneous portion has been detected as a pupil and request the examiner to input a designated position.
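- a minimal sketch of this reliability check follows; the candidate format and the threshold value are assumptions, and returning None stands for the request for a designated-position input.

```python
def pupil_or_manual(candidates, threshold=0.7):
    """Each candidate is (x, y, reliability), where reliability scores how
    pupil-like the detected part is (contrast with surroundings, shape, ...).
    If no candidate clears the threshold, return None so that the examiner is
    asked to input a designated position instead."""
    best = max(candidates, key=lambda c: c[2], default=None)
    if best is None or best[2] < threshold:
        return None
    return best[:2]
```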
- The control for performing rough alignment using the designated positions may also be performed as follows. That is, for example, when an erroneous part is detected as the pupil in the pupil position detection by face image analysis, a plurality of candidate regions 75a, 75b, 75c, 75d, 75e, 75f, and 75g detected as pupil candidates are displayed on the display 7, as shown in FIG. 12.
- The examiner checks the displayed candidate regions 75a to 75g and designates, by touching, the position of the region to be identified as the pupil.
- The control unit 70 limits the pupil position detection target to the region of the designated position, controls the drive unit 4 based on the pupil position detected by the analysis, and performs rough alignment.
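A possible way to realize this candidate selection, with an assumed data layout for the candidate regions 75a to 75g (a sketch only):

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    cx: float   # center x of the candidate region (pixels)
    cy: float   # center y
    w: float    # region width
    h: float    # region height

def select_candidate(candidates: list[Candidate],
                     touch_x: float, touch_y: float) -> Candidate:
    """Return the candidate region nearest to the examiner's touch; pupil
    detection is then limited to this region before rough alignment."""
    return min(candidates,
               key=lambda c: (c.cx - touch_x) ** 2 + (c.cy - touch_y) ** 2)
```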
- In the above example, the designated position for the face image is input when detection of the position of the eye to be inspected fails (in the case of an acquisition error), but the input is not limited to this case.
- For example, when the control unit 70 accepts an input of a designated position via the touch panel or the like while the position of the eye to be inspected is being detected based on the face image, rough alignment may be performed based on the input designated position, regardless of the position detection based on the face image.
- In the above example, rough alignment is performed after the examiner designates two pupil positions on the face image If, but rough alignment may instead be performed after a single designated position is input. Alternatively, for example, when the second designated position is not input within a predetermined time, rough alignment using the first input designated position may be started. The control in that case will be described with reference to the flowcharts of FIGS. 4 and 6.
- the control unit 70 acquires the coordinates of the designated position in S107, and uses the result to obtain the direction in which the inspection unit 2 is moved.
- The process of S107 when only one designated position is input is shown in the flowchart of FIG. 6.
- The control unit 70 waits for a predetermined time for an input designating one place as the pupil position, and acquires the coordinates of the designated position (S301). If no input is made within the predetermined time, for example, the device times out. When a time-out occurs, for example, the control unit 70 may notify the examiner to touch the face image displayed on the display 7 and input the pupil position.
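The time-out handling of S301 might be organized like the following sketch, where `wait_for_touch` is a hypothetical non-blocking read of the touch panel and the 5-second limit is an assumed value:

```python
import time
from typing import Callable, Optional, Tuple

def acquire_designated_position(
        wait_for_touch: Callable[[], Optional[Tuple[int, int]]],
        timeout_s: float = 5.0) -> Optional[Tuple[int, int]]:
    """Poll the touch panel for up to timeout_s seconds and return the touched
    coordinates, or None on time-out (the examiner is then prompted again)."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        pos = wait_for_touch()      # returns (x, y) or None if no touch yet
        if pos is not None:
            return pos
        time.sleep(0.05)            # avoid busy-waiting
    return None
```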
- The control unit 70 determines which of the left and right eyes to be inspected corresponds to the input designated position (S302).
- For example, the control unit 70 determines that the position of the right eye to be inspected has been specified when a point on the left half of the face image, relative to its horizontal center axis, is touched, and determines that the position of the left eye to be inspected has been specified when a point on the right half is touched.
- The horizontal center is only an example of the criterion for dividing the face image, and the criterion may differ depending on the configuration of the apparatus.
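The left/right discrimination of S302 then reduces to comparing the touched x coordinate with the dividing criterion; the sketch below uses the horizontal image center, as in the example above:

```python
def classify_designated_eye(touch_x: int, image_width: int) -> str:
    """Return which eye a single designated position corresponds to.

    In a face image taken from in front of the subject, the subject's right
    eye appears on the left half of the image.
    """
    return "right" if touch_x < image_width / 2 else "left"
```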
- the direction in which the inspection unit 2 is moved is obtained (S303), and rough alignment is performed (S108 in FIG. 4).
- As the method of rough alignment, the same method can be used as when the positions of both eyes to be inspected are designated and rough alignment is performed on one of them. That is, rough alignment can be performed by replacing the alignment control based on the coordinates (x, y) obtained by analyzing the face image with alignment control based on the designated coordinates (x, y) on the face image.
- The control unit 70 performs precision alignment in the same manner as described above (S109), and measures the aligned eye (S110).
- The control unit 70 then performs rough alignment on the unmeasured eye to be inspected (S111).
- When only one designated position is input for the eye to be inspected, after measurement of the eye for which the designated position was input is completed, the eye for which no designated position was input is identified based on the left/right discrimination result of S302, and the inspection unit 2 is moved in the direction of that eye to perform rough alignment.
- Since the left and right eyes to be inspected are at substantially the same height in the Y direction, the control unit 70 does not move the inspection unit 2 in the Y direction, but controls the drive unit 4 to move it in the X direction. At this time, the control unit 70 moves the inspection unit 2 in the direction in which the other eye is located, based on the left/right discrimination result. Because the positions of the left and right eyes are approximately symmetric about the center of the subject's face, the amount by which the inspection unit 2 should be moved in the X direction can be obtained from the amount of movement of the inspection unit 2 relative to the reference position (center position) of the base 5 at the time the first eye was measured.
- the amount of movement of the inspection unit 2 with respect to the reference position (center position) of the base 5 in the X direction can be obtained from the drive data of the drive unit 4 in the X direction.
- Since the position of the inspection unit 2 in the Z direction with respect to the eye to be inspected can be regarded as substantially the same for the left and right eyes, the Z-direction position of the inspection unit 2 at the time the first eye was measured can be reused.
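A sketch of how the target position for the unmeasured eye could be derived under these assumptions (symmetry about the base center, equal eye height and working distance); the type and function names are illustrative:

```python
from dataclasses import dataclass

@dataclass
class StagePosition:
    x: float   # lateral offset of the inspection unit from the base center
    y: float   # height
    z: float   # working distance

def target_for_other_eye(first_eye_pos: StagePosition) -> StagePosition:
    """Mirror the X offset about the base reference position and reuse the
    Y and Z positions recorded when the first eye was measured."""
    return StagePosition(x=-first_eye_pos.x,
                         y=first_eye_pos.y,
                         z=first_eye_pos.z)
```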
- The control unit 70 performs rough alignment with respect to the other eye to be inspected, analyzes the anterior segment image Ia acquired by the anterior segment imaging optical system 60, and, if the alignment index is detected, performs precision alignment based on the detection result.
- The control unit 70 then automatically issues a trigger signal to execute measurement with the measurement optical system 20.
- the control unit 70 outputs the left and right measurement results corresponding to the left and right discrimination results of S302.
- In this way, alignment can be performed well even when only one designated position is input.
- the face image If displayed on the display 7 may be enlarged or reduced.
- For example, the examiner may be able to enlarge or reduce the image by pinching on the touch panel.
- Further, the examiner may be able to specify the position of the eye to be inspected in the Z direction based on the amount of enlargement or reduction of the face image.
- The ophthalmic apparatus 1 may be configured to use machine learning to improve the accuracy with which the control unit 70 detects the eye to be inspected from the face image If in subsequent examinations, based on the information of the positions designated on the face image If. In that case, the control unit 70 acquires, as success factors, parameters such as the coordinates of the designated position on the face image and the contrast difference between the designated position and its surroundings. For example, based on the acquired success factors, the control unit 70 improves the accuracy of detecting the eye to be inspected by face image analysis from the next examination onward.
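The success-factor idea could be realized, for example, by logging the designated coordinates and the local contrast each time a manual designation leads to a completed measurement, and feeding simple statistics back into later detections; everything below is an illustrative sketch, not the disclosed method:

```python
from dataclasses import dataclass, field
from statistics import mean
from typing import List, Tuple

@dataclass
class SuccessFactorStore:
    contrasts: List[float] = field(default_factory=list)
    positions: List[Tuple[int, int]] = field(default_factory=list)

    def record(self, designated_xy: Tuple[int, int], local_contrast: float) -> None:
        """Store the parameters of a designation that led to a valid measurement."""
        self.positions.append(designated_xy)
        self.contrasts.append(local_contrast)

    def suggested_contrast_threshold(self, default: float = 30.0) -> float:
        """Bias the pupil-detection contrast threshold toward contrast levels
        that actually worked in previous manual designations."""
        if not self.contrasts:
            return default
        return 0.8 * mean(self.contrasts)
```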
Description
(2) A control program executed in an ophthalmic apparatus including driving means for three-dimensionally moving an inspection unit, which inspects an eye to be inspected, relative to the eye; first imaging means for capturing a face image including the left and right eyes to be inspected; and second imaging means for capturing an anterior segment image of the eye to be inspected, the control program, when executed by a control unit of the ophthalmic apparatus, causing the ophthalmic apparatus to execute: a control step of executing an adjustment process including a position acquisition step of acquiring the position of the eye to be inspected specified based on the face image, a first drive control step of controlling the driving means so that the acquired position of the eye is included in the imaging range of the anterior segment image captured by the second imaging means, and a second drive control step of controlling the driving means based on the anterior segment image after the first drive control step to adjust the relative position of the inspection unit with respect to the eye; and a first acquisition step of detecting the position of the eye by analyzing the face image and acquiring, as a detected position, the position of the eye specified by the detection; and further causing the ophthalmic apparatus to execute a second acquisition step of accepting an input operation of the position of the eye on the face image and acquiring, as a designated position, the position of the eye specified based on the input operation.
For example, the ophthalmic apparatus (for example, ophthalmic apparatus 1) includes optometry means for inspecting the eye to be inspected (for example, inspection unit 2), driving means (for example, drive unit 4), first imaging means (for example, face imaging unit 3), second imaging means (for example, anterior segment imaging optical system 60), control means (for example, control unit 70), first acquisition means (for example, control unit 70), and second acquisition means (for example, control unit 70).
The ophthalmic apparatus according to the present disclosure will be described with reference to the drawings. In the following description, an eye refractive power measurement apparatus is taken as an example of the ophthalmic apparatus, but the disclosure is also applicable to other ophthalmic apparatuses such as a corneal curvature measurement apparatus, an intraocular pressure measurement apparatus, a fundus camera, OCT (Optical Coherence Tomography), SLO (Scanning Laser Ophthalmoscope), and a micro perimeter.
The operation of the ophthalmic apparatus 1 of this embodiment will be described below with reference to the flowcharts of FIGS. 4 and 5. In this embodiment, for example, detection of the pupil position of the eye to be inspected based on the face image If, rough alignment based on the detected pupil position, and precision alignment based on the anterior segment image Ia are performed automatically. When precision alignment is completed, the control unit 70 performs measurement (inspection). The flowcharts of FIGS. 4 and 5 assume the case where the subject's right eye and left eye are inspected continuously.
First, the case where the position of the eye to be inspected is detected normally by analyzing the face image captured by the face imaging unit 3 will be described.
Next, the operation when a designated position is input for the face image If will be described, focusing on the differences from the operation when the position of the eye to be inspected is detected normally (when no designated position is input).
Next, an example of single-eye examination (examining only one of the left and right eyes) will be described. For example, the ophthalmic apparatus 1 is provided with an eye selection switch (eye selection means, not shown) that allows selection among a binocular examination mode for continuously examining the subject's right and left eyes, a right-eye examination mode for selectively examining only the right eye, and a left-eye examination mode for examining only the left eye.
In the above embodiment, as an example of performing rough alignment using the designated positions (SP1, SP2), when the designated positions (SP1, SP2) are input via the operation unit 8, control of the drive unit 4 is switched from control based on the detection result of the position of the eye to be inspected obtained by analyzing the face image to control based on the designated positions; however, the control is not limited to this. For example, when a designated position is input via the operation unit 8, pupil position detection by face image analysis may be performed within predetermined regions (AP1, AP2) referenced to the designated positions (SP1, SP2), and the drive unit 4 may be driven based on the pupil positions (EPR, EPL) detected by the analysis within those regions (see FIG. 11). This control method is suitably applied to cases of poor pupil position detection by face image analysis in which a wrong part is detected as the pupil (for example, the pupil may be falsely detected from hair, the shadow of a mask covering the nose and mouth, dark background areas outside the face, reflection noise caused by makeup, moles, eye patches, scars, eyebrows, eyelashes, and the like). This control method is also suitably applied when the examiner touches the display 7 with a finger but, because of low skill, cannot accurately touch and specify the pupil position.
2 Inspection unit
3 Face imaging unit
4 Drive unit
5 Base
7 Display
8 Operation unit
20 Measurement optical system
70 Control unit
Claims (9)
- An ophthalmic apparatus comprising an inspection unit for inspecting an eye to be inspected, the apparatus comprising:
driving means for three-dimensionally moving the inspection unit relative to the eye to be inspected;
first imaging means for capturing a face image including the left and right eyes to be inspected;
second imaging means for capturing an anterior segment image of the eye to be inspected;
control means for executing an adjustment process including a position acquisition step of acquiring the position of the eye to be inspected specified based on the face image, a first drive control step of controlling the driving means, based on the acquired position of the eye, so that the eye is included in the imaging range of the second imaging means, and a second drive control step of controlling the driving means based on the anterior segment image after the first drive control step to adjust the relative position of the inspection unit with respect to the eye;
first acquisition means for detecting the position of the eye to be inspected by analyzing the face image and acquiring, as a detected position, the position of the eye specified by the detection; and
second acquisition means for accepting an input operation of the position of the eye to be inspected on the face image and acquiring, as a designated position, the position of the eye specified based on the input operation. - The ophthalmic apparatus according to claim 1, wherein
when the input operation of the designated position is accepted after the adjustment process using the detected position has started, the control means performs the first drive control step based on the designated position regardless of the detected position. - The ophthalmic apparatus according to claim 1 or 2, further comprising
display means for displaying the face image captured by the first imaging means,
wherein the second acquisition means acquires, in the position acquisition step, the designated position based on an input operation of the position of the eye to be inspected on the face image displayed on the display means. - The ophthalmic apparatus according to any one of claims 1 to 3, wherein
when the left eye and the right eye are inspected continuously, the control means is capable of collectively acquiring the respective positions of the left eye and the right eye in the position acquisition step of the adjustment process for the one of the left and right eyes that is inspected first, and
when the respective positions of the left eye and the right eye are collectively acquired in the position acquisition step, the input operation can be requested for each eye. - The ophthalmic apparatus according to claim 4, wherein
the control means performs the first drive control step using the designated position corresponding to the one eye according to a predetermined procedure, and, after the inspection of the one eye is completed, performs the first drive control step using the designated position corresponding to the opposite eye. - The ophthalmic apparatus according to any one of claims 1 to 5, wherein
the second acquisition means detects the position of the eye to be inspected by analyzing a part of the face image referenced to the coordinates specified by the input operation on the face image, and acquires, as the designated position, the position of the eye specified by the detection. - The ophthalmic apparatus according to any one of claims 1 to 6, further comprising
notification means for performing a notification requesting the examiner to perform the input operation of the designated position when an acquisition error of the detected position occurs in the adjustment process using the detected position. - The ophthalmic apparatus according to claim 1, further comprising
discrimination means for discriminating, when only one designated position is input by the input operation, whether the eye to be inspected is the left eye or the right eye based on whether the designated position is located in the right-eye region or the left-eye region of the image captured by the first imaging means,
wherein the control means:
performs the first drive control step based on the input designated position and then proceeds to the second drive control step,
determines, based on the discrimination result of the discrimination means, to which of the left and right the inspection unit is to be moved relative to the eye in order to inspect the uninspected eye after the inspection of one of the left and right eyes is completed, and
controls the driving means so that the inspection unit approaches the uninspected eye based on the determined side, and proceeds to the second drive control step. - A control program executed in an ophthalmic apparatus including driving means for three-dimensionally moving an inspection unit, which inspects an eye to be inspected, relative to the eye, first imaging means for capturing a face image including the left and right eyes to be inspected, and second imaging means for capturing an anterior segment image of the eye to be inspected,
the control program, when executed by a control unit of the ophthalmic apparatus, causing the ophthalmic apparatus to execute:
a control step of executing an adjustment process including a position acquisition step of acquiring the position of the eye to be inspected specified based on the face image, a first drive control step of controlling the driving means so that the acquired position of the eye is included in the imaging range of the anterior segment image captured by the second imaging means, and a second drive control step of controlling the driving means based on the anterior segment image after the first drive control step to adjust the relative position of the inspection unit with respect to the eye; and
a first acquisition step of detecting the position of the eye to be inspected by analyzing the face image and acquiring, as a detected position, the position of the eye specified by the detection;
and further causing the ophthalmic apparatus to execute
a second acquisition step of accepting an input operation of the position of the eye to be inspected on the face image and acquiring, as a designated position, the position of the eye specified based on the input operation.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202180086676.5A CN116669614A (zh) | 2020-12-25 | 2021-12-06 | 眼科装置及眼科装置的控制程序 |
JP2022572074A JPWO2022138102A1 (ja) | 2020-12-25 | 2021-12-06 | |
US18/340,388 US20230329551A1 (en) | 2020-12-25 | 2023-06-23 | Ophthalmic apparatus and non-transitory computer-readable storage medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020217515 | 2020-12-25 | ||
JP2020-217515 | 2020-12-25 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/340,388 Continuation US20230329551A1 (en) | 2020-12-25 | 2023-06-23 | Ophthalmic apparatus and non-transitory computer-readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022138102A1 true WO2022138102A1 (ja) | 2022-06-30 |
Family
ID=82159563
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/044669 WO2022138102A1 (ja) | 2020-12-25 | 2021-12-06 | 眼科装置および眼科装置の制御プログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US20230329551A1 (ja) |
JP (1) | JPWO2022138102A1 (ja) |
CN (1) | CN116669614A (ja) |
WO (1) | WO2022138102A1 (ja) |
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10216089A (ja) * | 1997-02-10 | 1998-08-18 | Nidek Co Ltd | 眼科装置 |
JP2006288610A (ja) * | 2005-04-08 | 2006-10-26 | Tomey Corporation | 眼科装置 |
JP2009015518A (ja) * | 2007-07-03 | 2009-01-22 | Panasonic Corp | 眼画像撮影装置及び認証装置 |
JP2015064700A (ja) * | 2013-09-24 | 2015-04-09 | ホヤ レンズ タイランド リミテッドHOYA Lens Thailand Ltd | 眼鏡装用パラメータ測定装置、眼鏡装用パラメータ測定プログラムおよび位置指定方法 |
JP2017080151A (ja) * | 2015-10-29 | 2017-05-18 | キヤノン株式会社 | 制御装置、眼科装置、システム、制御方法およびプログラム |
WO2017171058A1 (ja) * | 2016-03-31 | 2017-10-05 | 株式会社ニデック | 眼科装置用コントローラ、および眼科装置 |
JP2020156555A (ja) * | 2019-03-25 | 2020-10-01 | 株式会社トプコン | 眼科装置 |
JP2021040793A (ja) * | 2019-09-09 | 2021-03-18 | 株式会社トプコン | 眼科装置及び眼科装置の制御方法 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023199933A1 (ja) * | 2022-04-13 | 2023-10-19 | キヤノン株式会社 | 医用画像撮影装置の制御装置、タッチパネル端末、医用画像撮影システム、医用画像撮影装置の制御プログラム |
WO2024009841A1 (ja) * | 2022-07-05 | 2024-01-11 | 株式会社ニデック | 被検者情報取得装置および眼科システム |
Also Published As
Publication number | Publication date |
---|---|
US20230329551A1 (en) | 2023-10-19 |
JPWO2022138102A1 (ja) | 2022-06-30 |
CN116669614A (zh) | 2023-08-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6238551B2 (ja) | 眼科装置、眼科装置の制御方法、プログラム | |
JP6716752B2 (ja) | 眼科装置 | |
JP6071304B2 (ja) | 眼科装置及びアライメント方法 | |
WO2022138102A1 (ja) | 眼科装置および眼科装置の制御プログラム | |
JP6641730B2 (ja) | 眼科装置、および眼科装置用プログラム | |
JP2014079496A (ja) | 眼科装置および眼科制御方法並びにプログラム | |
JP7221587B2 (ja) | 眼科装置 | |
JP7271976B2 (ja) | 眼科装置 | |
JP6499884B2 (ja) | 眼科装置 | |
JP6892540B2 (ja) | 眼科装置 | |
EP3150111B1 (en) | Ophthalmic apparatus and control program for the ophthalmic apparatus | |
JP6705313B2 (ja) | 眼科装置 | |
JP7249097B2 (ja) | 眼科装置及び検眼システム | |
JP6825338B2 (ja) | 自覚式検眼装置および自覚式検眼プログラム | |
JP6480748B2 (ja) | 眼科装置 | |
JP6823339B2 (ja) | 眼科装置 | |
JP7248770B2 (ja) | 眼科装置 | |
JP6930841B2 (ja) | 眼科装置 | |
JP7434729B2 (ja) | 眼科装置、および眼科装置制御プログラム | |
JP6927389B2 (ja) | 眼科装置 | |
JP7423912B2 (ja) | 眼科装置、および眼科装置制御プログラム | |
JP7187769B2 (ja) | 眼科装置、および眼科装置制御プログラム | |
WO2020250820A1 (ja) | 眼科装置、および眼科装置制御プログラム | |
JP7101578B2 (ja) | 眼科装置及びその作動方法 | |
JP7043758B2 (ja) | 眼科装置、およびそれに用いるジョイスティック |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21910252 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2022572074 Country of ref document: JP Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 202180086676.5 Country of ref document: CN |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 21910252 Country of ref document: EP Kind code of ref document: A1 |