CN115100729A - Ophthalmic imaging method and system for automatic retina feature detection - Google Patents


Info

Publication number: CN115100729A
Application number: CN202210687615.0A
Authority: CN (China)
Prior art keywords: fundus; mydriatic state; acquired; mydriatic; acquiring
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventors: 汤德林, 汤锦海, 潘爱霞
Current and original assignee: Shanghai New Eyes Medical Inc
Application filed by Shanghai New Eyes Medical Inc
Priority to CN202210687615.0A
Publication of CN115100729A
Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 - Eye characteristics, e.g. of the iris
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 - Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/12 - Objective types for looking at the eye fundus, e.g. ophthalmoscopes
    • A61B3/14 - Arrangements specially adapted for eye photography

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Surgery (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The invention discloses an ophthalmic imaging method and system for automatic retinal feature detection, relating to the technical field of ophthalmic examination. The invention obtains a plurality of fundus images acquired in the non-mydriatic state; obtains, for each, the corresponding fundus image acquired in the mydriatic state; derives, from these image pairs, a mapping between the image features of the non-mydriatic fundus images and the image features of the mydriatic fundus images; acquires in real time a fundus image to be examined in the non-mydriatic state; and extracts the image features of that image from the real-time acquisition. By controlling the detection light beam, the invention reduces the discomfort and harm the beam causes to the eye.

Description

Ophthalmic imaging method and system for automatic retina feature detection
Technical Field
The invention belongs to the technical field of ophthalmic examination, and in particular relates to an ophthalmic imaging method and system for automatic retinal feature detection.
Background
Once a fundus lesion develops, a fundus examination is required. The conventional approach is to dilate the pupil (mydriasis), illuminate the fundus, and capture fundus images from which retinal features are extracted.
After the pupil is dilated by instilling a mydriatic agent, the ciliary muscle of the eye is completely paralyzed, and prolonged exposure to intense light causes irritation and discomfort. The detection light beam therefore needs to be controlled according to the actual condition of the patient's fundus, to reduce the discomfort and harm the beam causes to the eye.
Disclosure of Invention
The invention aims to provide an ophthalmic imaging method and system for automatic retinal feature detection that reduce the discomfort and harm the detection light beam causes to the eye by controlling the beam.
To solve the above technical problems, the invention is realized by the following technical solutions:
The invention provides an ophthalmic imaging method for automatic retinal feature detection, comprising the following steps:
acquiring a plurality of fundus images taken in the non-mydriatic state;
acquiring, for each non-mydriatic fundus image, the corresponding fundus image taken in the mydriatic state;
deriving, from the non-mydriatic fundus images and the corresponding mydriatic fundus images, a mapping between the image features of the non-mydriatic fundus images and the image features of the mydriatic fundus images;
acquiring, in real time, a fundus image to be examined in the non-mydriatic state;
extracting the image features of the fundus image to be examined from the real-time non-mydriatic acquisition;
predicting, from those image features and the mapping, the image features the fundus image would show in the mydriatic state;
determining, from the predicted mydriatic image features, the key fundus region to be examined;
during acquisition of the fundus image in the mydriatic state, imaging the key fundus region first, and thereby acquiring the mydriatic fundus image;
extracting the retinal features from the fundus image acquired in the mydriatic state.
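The mapping-and-prediction steps above can be sketched in code. This is a minimal illustration only: the patent does not specify how the mapping between non-mydriatic and mydriatic image features is represented, so a per-feature linear least-squares fit is assumed here, and the feature name is hypothetical.

```python
# Hypothetical sketch of the claimed pipeline: learn a mapping from
# non-mydriatic image features to mydriatic image features, then use it
# to predict the mydriatic features of a new non-mydriatic image.
# The linear per-feature model and the feature name are illustrative
# assumptions, not the patent's method.

def fit_linear_map(xs, ys):
    """Least-squares fit of y = a*x + b for one scalar feature."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    return a, my - a * mx

def fit_feature_mapping(non_myd, myd):
    """non_myd, myd: lists of {feature_name: value} dicts for paired images."""
    mapping = {}
    for name in non_myd[0]:
        xs = [f[name] for f in non_myd]
        ys = [f[name] for f in myd]
        mapping[name] = fit_linear_map(xs, ys)
    return mapping

def predict_mydriatic_features(mapping, features):
    """Apply the learned mapping to a new non-mydriatic feature vector."""
    return {name: a * features[name] + b for name, (a, b) in mapping.items()}

# Paired historical images (features only), with invented values.
non_myd = [{"vessel_contrast": 0.2}, {"vessel_contrast": 0.4}, {"vessel_contrast": 0.6}]
myd     = [{"vessel_contrast": 0.5}, {"vessel_contrast": 0.9}, {"vessel_contrast": 1.3}]
m = fit_feature_mapping(non_myd, myd)
pred = predict_mydriatic_features(m, {"vessel_contrast": 0.5})
```

The predicted mydriatic features would then drive the choice of the key fundus region to image first, as the later steps describe.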
In one embodiment of the invention, the step of deriving the mapping between the image features of the non-mydriatic fundus images and the image features of the mydriatic fundus images comprises:
extracting the image features of the non-mydriatic fundus images from the non-mydriatic fundus images;
extracting the image features of the mydriatic fundus images from the mydriatic fundus images;
and deriving the mapping from each pair of non-mydriatic and corresponding mydriatic image features.
In one embodiment of the invention, the step of imaging the key fundus region first during mydriatic acquisition comprises:
irradiating the eye in the non-mydriatic state with a low-brightness positioning beam to obtain the shape and position of the fundus;
when emission of the detection beam begins, obtaining the predicted mydriatic image features;
and directing a high-brightness detection beam into the eye in the mydriatic state according to the fundus shape and position, so that the beam illuminates the fundus and the mydriatic fundus image is captured.
In one embodiment of the invention, the step of directing the high-brightness detection beam into the mydriatic eye according to the fundus shape and position comprises:
placing electrode patches on the subject's head, on the scalp surface over the brain's visual imaging area;
and gradually increasing the brightness of the detection beam entering the mydriatic eye, according to the current detected by the electrode patches, until the brightness limit is reached.
In one embodiment of the invention, the step of gradually increasing the brightness of the detection beam until the brightness limit is reached comprises:
directing a safe, low-brightness detection beam into the eye in the mydriatic state;
acquiring the current detected by the electrode patches at that safe low brightness;
deriving, from the safe low-brightness beam and the detected current values, the functional relation between the brightness of the beam entering the mydriatic eye and the detected current;
and obtaining the maximum permissible brightness of the beam from this functional relation and a preset maximum value of the detected current.
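The brightness-limiting procedure above can be sketched as follows. This is a hedged illustration: the patent only states that a "functional relation" between beam brightness and detected electrode current is obtained, so a linear fit is assumed here, and all numbers are invented for the example.

```python
# Sketch of the brightness-limiting step: probe the eye with safe
# low-brightness beams, record the current detected at the electrode
# patches, fit the brightness -> current relation, then invert it at the
# preset maximum safe current. The linear model is an assumption.

def fit_line(brightness, current):
    """Least-squares fit: current ~= a * brightness + b0."""
    n = len(brightness)
    mb = sum(brightness) / n
    mc = sum(current) / n
    a = sum((b - mb) * (c - mc) for b, c in zip(brightness, current)) / \
        sum((b - mb) ** 2 for b in brightness)
    return a, mc - a * mb

def max_safe_brightness(brightness, current, current_limit):
    """Invert the fitted relation at the preset maximum detected current."""
    a, b0 = fit_line(brightness, current)
    return (current_limit - b0) / a

# Probe currents (units illustrative) at three safe brightness levels.
levels = [10.0, 20.0, 30.0]
probes = [3.0, 5.0, 7.0]          # behaves like current = 0.2 * brightness + 1.0
limit_b = max_safe_brightness(levels, probes, current_limit=9.0)
```

In use, the instrument would ramp the beam up toward `limit_b` and never beyond it, which is the eye-protection effect the embodiment describes.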
In one embodiment of the invention, the step of acquiring the current detected by the electrode patches at the safe low brightness comprises:
detecting the bioelectric current on the scalp surface over the visual imaging area using a plurality of electrode patches;
and adjusting the weight of each electrode patch according to its distance from the visual imaging area.
In one embodiment of the invention, the step of adjusting the weight of each electrode patch according to its distance from the visual imaging area comprises:
dividing each electrode patch into a plurality of planar units;
dividing the visual imaging area into a plurality of volume units;
and adjusting the weight of each patch's detected current according to the distances between the planar units of that patch and the volume units of the visual imaging area.
In one embodiment of the invention, the step of adjusting the weight according to those distances comprises:
obtaining the centre point of each planar unit;
and adjusting the weight of each patch's detected current according to the distances between the centre points of its planar units and the volume units.
In one embodiment of the invention, the step of adjusting the weight according to the distances between planar-unit centre points and volume units comprises:
obtaining the centre point of each volume unit;
and adjusting the weight of each patch's detected current according to the distances between the centre points of its planar units and the centre points of the volume units.
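The planar-unit/volume-unit weighting above might look like the following sketch. The inverse-mean-distance rule and the normalisation are assumptions; the patent does not fix the exact weighting function, and all coordinates are illustrative.

```python
# Illustrative sketch of the weighting scheme: each electrode patch is
# split into planar units and the visual imaging area into volume units;
# a patch's weight falls off with the distance between its planar-unit
# centres and the volume-unit centres. Inverse mean distance is an
# assumed rule, not the patent's stated formula.
import math

def mean_centre_distance(plane_centres, volume_centres):
    """Mean Euclidean distance over all planar-centre / volume-centre pairs."""
    return sum(math.dist(p, v) for p in plane_centres for v in volume_centres) \
        / (len(plane_centres) * len(volume_centres))

def patch_weights(patches, volume_centres):
    """patches: list of lists of planar-unit centre points (x, y, z)."""
    raw = [1.0 / mean_centre_distance(p, volume_centres) for p in patches]
    total = sum(raw)
    return [w / total for w in raw]   # normalised so the weights sum to 1

volume = [(0.0, 0.0, 0.0)]           # one volume-unit centre, illustrative
patches = [
    [(1.0, 0.0, 0.0)],               # close patch -> larger weight
    [(3.0, 0.0, 0.0)],               # distant patch -> smaller weight
]
w = patch_weights(patches, volume)
```

The weighted sum of the per-patch currents would then stand in for the single detected current value used by the brightness-limiting step.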
In one embodiment of the invention, the plurality of fundus images acquired in the non-mydriatic state are obtained by a fundus camera, which proceeds as follows:
S1: move the lens of the fundus camera into alignment with the pupil in the non-mydriatic state;
S2: move the lens of the fundus camera toward the eyeball to acquire an image picture, and adjust the image picture presented in the camera;
S3: according to the distance between the lens and the eyeball, select the adjusted image pictures presented in the camera at different focal lengths, obtaining the plurality of non-mydriatic fundus images; wherein S3 further includes:
analyzing whether the image picture reaches a preset sharpness; if not, adjusting its sharpness; once the preset sharpness is reached, judging whether a light spot appears in the picture;
if a light spot appears, adjusting its position until it coincides with the centre of the picture, then saving the adjusted picture at the current focal length, changing the focal length, and analyzing sharpness again; this cycle repeats until adjusted pictures at the target number of different focal lengths have been obtained.
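The S3 acquisition loop (sharpness check, light-spot centring, focal-length change) can be sketched as below. Every camera call is a hypothetical stand-in; a real fundus camera SDK would supply its own API, so a small fake camera is included only to make the sketch runnable.

```python
# Sketch of the S3 loop: keep adjusting until the picture is sharp,
# centre the light spot, save the frame, change focal length, and repeat
# until the target number of frames is collected. All camera methods are
# hypothetical stand-ins for a fundus camera SDK.

def acquire_frames(camera, target_count, sharpness_threshold):
    frames = []
    while len(frames) < target_count:
        frame = camera.capture()
        if frame.sharpness < sharpness_threshold:
            camera.adjust_sharpness()          # picture not yet sharp enough
            continue
        if frame.spot_offset != (0, 0):        # light spot off-centre
            camera.centre_spot()
            continue
        frames.append(frame)                   # frame accepted at this focal length
        camera.next_focal_length()             # move on to a new focal length
    return frames

class FakeCamera:
    """Deterministic stand-in: per focal length, the first capture is blurry
    and the second has an off-centre spot, so the third is accepted."""
    def __init__(self):
        self.sharp = False
        self.centred = False
    def capture(self):
        frame = type("Frame", (), {})()
        frame.sharpness = 1.0 if self.sharp else 0.0
        frame.spot_offset = (0, 0) if self.centred else (5, 5)
        return frame
    def adjust_sharpness(self):
        self.sharp = True
    def centre_spot(self):
        self.centred = True
    def next_focal_length(self):
        self.sharp = False
        self.centred = False

frames = acquire_frames(FakeCamera(), target_count=3, sharpness_threshold=0.5)
```

Collecting the accepted frames at several focal lengths yields the plurality of non-mydriatic fundus images the embodiment calls for.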
In one embodiment of the invention, the detection beam is emitted by a detection instrument. Before emitting the beam, the instrument analyzes it and starts emission only when the beam meets the requirement. Analyzing the beam includes estimating its intensity to obtain the estimated light-intensity data:
[Formula image BDA0003700251640000051 from the original is not reproduced here; it expresses R in terms of C and the Bessel function J_α(X).]
where R denotes the estimated light intensity of the detection beam; C denotes the central intensity of the brightest spot in the central area of the beam; J_α(X) denotes the Bessel function of order α, with X given by a second formula image (BDA0003700251640000052, not reproduced); α is 1; π is the circle constant; r is the aperture of the instrument's emission outlet; l is the wavelength; t is the angle between the emergent ray and the main optical axis; and d is the image distance;
determining the analysis result from the estimated light intensity:
[Formula image BDA0003700251640000053 from the original is not reproduced here; it defines G as T when R falls within W and F otherwise.]
where W denotes the range of beam intensities to which the human eye is sensitive, and G denotes the analysis result: when G is T, the detection beam meets the requirement and is emitted; when G is F, the beam does not meet the requirement and is not emitted.
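The emission gate described above reduces to a range test once R is known. Since the intensity formula itself is only available as an image in the original, this sketch takes the estimate R as an input and implements only the G = T / G = F decision.

```python
# Minimal sketch of the emission gate: the estimated light intensity R is
# compared against W, the range of intensities the eye is sensitive to,
# and the beam is emitted only when R falls inside W. The numeric range
# is invented for illustration.

def analyse_beam(r_estimate, w_range):
    """Return True ('G = T': emit the beam) iff r_estimate lies within W."""
    lo, hi = w_range
    return lo <= r_estimate <= hi

W = (0.1, 0.8)                       # illustrative sensitivity range
ok = analyse_beam(0.5, W)            # within range: beam may be emitted
too_bright = analyse_beam(0.95, W)   # outside range: beam is withheld
```

The instrument would run this check before every emission, withholding any beam whose estimated intensity falls outside the eye's sensitivity range.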
The invention also discloses an ophthalmic imaging system for automatic retinal feature detection, comprising:
a history collection unit, for acquiring a plurality of fundus images taken in the non-mydriatic state;
acquiring, for each non-mydriatic fundus image, the corresponding fundus image taken in the mydriatic state;
and deriving, from the image pairs, a mapping between the image features of the non-mydriatic fundus images and the image features of the mydriatic fundus images;
and a detection unit, for acquiring, in real time, a fundus image to be examined in the non-mydriatic state;
extracting the image features of the fundus image to be examined from the real-time non-mydriatic acquisition;
predicting, from those image features and the mapping, the image features the fundus image would show in the mydriatic state;
determining, from the predicted mydriatic image features, the key fundus region to be examined;
during acquisition of the fundus image in the mydriatic state, imaging the key fundus region first, and thereby acquiring the mydriatic fundus image;
and extracting the retinal features from the fundus image acquired in the mydriatic state.
By controlling the detection beam, the invention reduces the discomfort and harm the beam causes to the eye.
Of course, it is not necessary for any product in which the invention is practiced to achieve all of the above-described advantages at the same time.
Drawings
To illustrate the technical solutions of the embodiments more clearly, the drawings used in describing the embodiments are briefly introduced below. The drawings described here show only some embodiments of the invention; those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic flow chart illustrating an exemplary embodiment of an ophthalmic imaging method for automatic retinal feature detection according to the present invention;
FIG. 2 is a flowchart of step S3 in one embodiment of the invention: deriving the mapping between the image features of the non-mydriatic fundus images and those of the mydriatic fundus images from the non-mydriatic images and their corresponding mydriatic images;
FIG. 3 is a flowchart of step S8 in one embodiment of the invention: during mydriatic acquisition, imaging the key fundus region first to obtain the mydriatic fundus image;
FIG. 4 is a flowchart of step S83 in one embodiment of the invention: directing a high-brightness detection beam into the eye according to the fundus shape and position, so that the beam illuminates the fundus and the mydriatic fundus image is captured;
FIG. 5 is a flowchart of step S832 in one embodiment of the invention: gradually increasing the brightness of the detection beam entering the mydriatic eye, according to the detected electrode-patch current, until the brightness limit is reached;
FIG. 6 is a block diagram of an ophthalmic imaging system for automated retinal feature detection according to one embodiment of the present invention.
In the drawings, the components represented by the respective reference numerals are listed below:
1-history collection unit, 2-detection unit.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, in the ophthalmic imaging method for automatic retinal feature detection of the invention, step S1 may be performed first, to acquire a plurality of fundus images taken in the non-mydriatic state, and step S2 next, to acquire the corresponding fundus images taken in the mydriatic state. Step S3 may then derive, from the image pairs, the mapping between the image features of the non-mydriatic fundus images and those of the mydriatic fundus images. Step S4 may then acquire, in real time, a fundus image to be examined in the non-mydriatic state, and step S5 may extract its image features from the real-time acquisition. Step S6 may then predict, from those features and the mapping, the image features the fundus image would show in the mydriatic state. Step S7 may then determine the key fundus region to be examined from the predicted mydriatic features, and step S8 may image that key region first during mydriatic acquisition, obtaining the mydriatic fundus image.
Step S9 may finally extract the retinal features from the mydriatic fundus image. In this way, the discomfort and harm the detection beam causes to the eye are reduced.
Referring to fig. 2, to derive the mapping between the image features of the non-mydriatic and mydriatic fundus images in step S3, step S31 may be executed first, extracting the image features of the non-mydriatic fundus images. Step S32 may then extract the image features of the mydriatic fundus images. Finally, step S33 may derive the mapping from each pair of non-mydriatic and corresponding mydriatic image features.
Referring to fig. 3, to obtain the fundus image acquired in the mydriatic state in step S8, step S81 may be executed first, irradiating the eye in the non-mydriatic state with a low-brightness positioning beam to obtain the shape and position of the fundus. Step S82 may then begin obtaining the predicted mydriatic image features when emission of the detection beam starts. Finally, step S83 may direct a high-brightness detection beam into the mydriatic eye according to the fundus shape and position, so that the beam illuminates the fundus and the mydriatic fundus image is captured.
Referring to fig. 4, within step S83, step S831 may be executed first, placing electrode patches on the subject's head, on the scalp surface over the brain's visual imaging area. Step S832 may then gradually increase the brightness of the detection beam entering the mydriatic eye, according to the current detected by the electrode patches, until the brightness limit is reached. The mydriatic fundus image is acquired in this way.
As shown in fig. 5, to protect the eye in step S832, step S8321 may be executed first, directing a safe, low-brightness detection beam into the eye in the mydriatic state. Step S8322 may then acquire the current detected by the electrode patches at that safe low brightness. Step S8323 may then derive, from the safe low-brightness beam and the detected current values, the functional relation between the brightness of the beam entering the mydriatic eye and the detected current. Finally, step S8324 may obtain the maximum permissible beam brightness from this functional relation and the preset maximum value of the detected current. Capping the brightness in this way protects the eye.
In one embodiment of the present disclosure, the step of acquiring the current detected by the electrode patches at the safe low brightness may first detect the bioelectric current on the scalp surface over the visual imaging area using a plurality of electrode patches, and then adjust the weight of each patch according to its distance from the visual imaging area.
In one embodiment of the present disclosure, the step of adjusting the weights may first divide each electrode patch into a plurality of planar units, then divide the visual imaging area into a plurality of volume units, and then adjust the weight of each patch's detected current according to the distances between its planar units and the volume units.
In one embodiment of the present disclosure, this weight adjustment may first obtain the centre point of each planar unit, and then adjust the weight of each patch's detected current according to the distances between those centre points and the volume units.
In one embodiment of the present disclosure, the adjustment may further obtain the centre point of each volume unit, and then adjust the weight of each patch's detected current according to the distances between the planar-unit centre points and the volume-unit centre points.
In one embodiment of the present disclosure, the plurality of fundus images acquired in the non-mydriatic state are obtained by a fundus camera, which proceeds as follows:
S1: move the lens of the fundus camera into alignment with the pupil in the non-mydriatic state;
S2: move the lens of the fundus camera toward the eyeball to acquire an image picture, and adjust the image picture presented in the camera;
S3: according to the distance between the lens and the eyeball, select the adjusted image pictures presented in the camera at different focal lengths, obtaining the plurality of non-mydriatic fundus images; wherein S3 further includes:
analyzing whether the image picture reaches a preset sharpness; if not, adjusting its sharpness; once the preset sharpness is reached, judging whether a light spot appears in the picture;
if a light spot appears, adjusting its position until it coincides with the centre of the picture, then saving the adjusted picture at the current focal length, changing the focal length, and analyzing sharpness again; this cycle repeats until adjusted pictures at the target number of different focal lengths have been obtained.
If the facula phenomenon appears in the image picture, the focal length is directly adjusted, so that the position of the facula phenomenon is adjusted after the facula phenomenon appears in the image picture appearing in the fundus camera. The above-mentioned fundus image that not only can the high accuracy obtain through the fundus camera, conveniently adjust to the image picture in the process of obtaining the fundus image moreover, and make the image picture that obtains present the fundus situation more accurately through definition adjustment and facula phenomenon adjustment, thereby make and obtain more accurate fundus information in the image picture, in addition, the image picture that appears in the fundus camera after adjusting under through different focuses obtains the fundus image that a plurality of do not gather under the mydriasis state and can effectively eliminate accidental situation, reduce the information error, improve the accuracy of automatic retina characteristic detection.
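The capture loop of steps S1–S3 can be sketched as follows. The camera interface (`set_focal_length()`, `frame()`, `adjust_sharpness()`, `center_spot()`) and the frame attributes are hypothetical names, since the text does not define a programming interface:

```python
def capture_non_mydriatic_images(camera, focal_lengths,
                                 sharpness_threshold, max_tries=20):
    """For each focal length, adjust the frame until it reaches the preset
    sharpness, then re-center any light spot before keeping the frame.
    `camera` is assumed to expose set_focal_length(), frame(),
    adjust_sharpness() and center_spot(); these names are illustrative."""
    images = []
    for f in focal_lengths:
        camera.set_focal_length(f)
        for _ in range(max_tries):           # bound the adjustment loop
            frame = camera.frame()
            if frame.sharpness < sharpness_threshold:
                camera.adjust_sharpness()    # not yet at preset sharpness
            elif frame.has_spot and not frame.spot_centered:
                camera.center_spot()         # move the light spot to center
            else:
                images.append(frame)         # adjusted frame at this focus
                break
    return images
```

One frame is kept per focal length, so the returned list is the "plurality of fundus images acquired in the non-mydriatic state" described above.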
In one embodiment of the present disclosure, the detection beam is emitted by a detection instrument. When emitting the detection beam, the detection instrument first analyzes it, and begins emission only when the beam meets the requirement. Analyzing the detection beam includes estimating its intensity to obtain light-intensity estimation data of the detection beam:
[Equation image not reproduced in the text: R expressed in terms of C and the Bessel term J_α(X).]
wherein R represents the light-intensity estimation data of the detection beam; C represents the central intensity of the maximum bright spot in the central region of the detection beam; and J_α(X) denotes the Bessel function of order α, where X is given by
[Equation image not reproduced in the text: the expression for X in terms of r, l, t and d.]
α is taken as 1; π denotes the circle constant; r denotes the aperture of the emission outlet of the detection instrument; l denotes the wavelength; t denotes the included angle between the emergent ray and the principal optical axis; and d denotes the image distance;
determining an analysis result from the light-intensity estimation data of the detection beam:
[Equation image not reproduced in the text; per the definitions that follow, G takes the value T when R falls within the sensitivity range W, and F otherwise.]
wherein W represents the sensitivity range of the human eye to light beams, and G represents the analysis result: when G is T, the detection beam meets the requirement and is emitted; when G is F, it does not meet the requirement and is not emitted. By analyzing the detection beam before emitting it, the detection instrument ensures that the emitted beam is better suited for detection while preventing it from damaging the eye; the instantaneous impact of the beam on the eye is reduced, the error between the predicted graphic features and the actual condition of the fundus image acquired in the mydriatic state is reduced, and the accuracy of automatic retinal feature detection is improved.
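The symbol definitions above (central intensity C, first-order Bessel function, aperture r, wavelength l, angle t) match the classical Airy diffraction pattern, so the sketch below assumes that form. Because the equation images are not reproduced in the text, both the exact expression for X and the form of R here are assumptions:

```python
import math

def bessel_j1(x, steps=2000):
    """First-order Bessel function via the integral representation
    J1(x) = (1/pi) * integral_0^pi cos(tau - x*sin(tau)) dtau,
    evaluated with the trapezoidal rule."""
    h = math.pi / steps
    total = 0.0
    for k in range(steps + 1):
        tau = k * h
        w = 0.5 if k in (0, steps) else 1.0  # trapezoidal end weights
        total += w * math.cos(tau - x * math.sin(tau))
    return total * h / math.pi

def beam_intensity(C, r, l, t):
    """Airy-pattern estimate R = C * (2*J1(X)/X)**2 with the assumed
    argument X = 2*pi*r*sin(t)/l; the patent's exact X is not
    recoverable from the text."""
    X = 2.0 * math.pi * r * math.sin(t) / l
    if abs(X) < 1e-12:
        return C  # limit of (2*J1(X)/X)**2 as X -> 0 is 1
    return C * (2.0 * bessel_j1(X) / X) ** 2

def beam_ok(R, W):
    """G = T when the estimate falls inside the eye-safe sensitivity
    range W = (low, high); the beam is emitted only when this holds."""
    low, high = W
    return low <= R <= high
```

On the optical axis (t = 0) the estimate reduces to the central intensity C, and `beam_ok` plays the role of the G = T/F decision described above.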
The present disclosure also provides an ophthalmic imaging system for automatic retinal feature detection, which may include a history collection unit 1 and a detection unit 2.

The history collection unit 1 is used for acquiring a plurality of fundus images acquired in a non-mydriatic state; acquiring, for each of them, the corresponding fundus image acquired in a mydriatic state; and acquiring, from these image pairs, the mapping relation between the graphic features of fundus images acquired in the non-mydriatic state and the graphic features of fundus images acquired in the mydriatic state.

The detection unit 2 is used for acquiring, in real time, a fundus image to be detected that is acquired in the non-mydriatic state; extracting the image features of that image; obtaining the predicted graphic features of the fundus image acquired in the mydriatic state from those image features and the mapping relation; determining the key region of the fundus to be inspected from the predicted graphic features; preferentially acquiring that key region while capturing the fundus image in the mydriatic state; and acquiring the retinal features from the fundus image acquired in the mydriatic state.
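A minimal sketch of this two-unit split might look as follows. The nearest-neighbour "mapping" and the key-region rule are placeholder stand-ins, since the text does not specify how the mapping relation is learned or how the key region is derived from the predicted features:

```python
from dataclasses import dataclass, field

@dataclass
class HistoryCollectionUnit:
    """Stores paired (non-mydriatic, mydriatic) feature vectors and
    predicts mydriatic-state features for new input.  The stored-pair
    lookup here is illustrative; a real system would fit a regression
    or registration model instead."""
    pairs: list = field(default_factory=list)

    def add_pair(self, non_myd_features, myd_features):
        self.pairs.append((non_myd_features, myd_features))

    def predict(self, non_myd_features):
        # nearest-neighbour lookup over the collected pairs
        best = min(self.pairs,
                   key=lambda p: sum((a - b) ** 2
                                     for a, b in zip(p[0], non_myd_features)))
        return best[1]

@dataclass
class DetectionUnit:
    history: HistoryCollectionUnit

    def key_region(self, non_myd_features):
        """Predict the mydriatic-state features and derive the region to
        image first (here simply the index of the largest feature)."""
        predicted = self.history.predict(non_myd_features)
        return max(range(len(predicted)), key=lambda i: predicted[i])
```

The detection unit would then instruct the camera to capture the returned region first when the mydriatic-state fundus image is acquired.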
In summary, by controlling the detection beam, discomfort and harm to eyes caused by the detection beam are reduced.
The above description of illustrated embodiments of the invention, including what is described in the abstract of the specification, is not intended to be exhaustive or to limit the invention to the precise forms disclosed herein. While specific embodiments of, and examples for, the invention are described herein for illustrative purposes only, various equivalent modifications are possible within the spirit and scope of the present invention, as those skilled in the relevant art will recognize and appreciate. As indicated, these modifications may be made to the present invention in light of the foregoing description of illustrated embodiments of the present invention and are to be included within the spirit and scope of the present invention.
The systems and methods have been described herein in general terms as the details aid in understanding the invention. Furthermore, various specific details have been given to provide a general understanding of the embodiments of the invention. One skilled in the relevant art will recognize, however, that an embodiment of the invention can be practiced without one or more of the specific details, or with other apparatus, systems, assemblies, methods, components, materials, parts, and/or the like. In other instances, well-known structures, materials, and/or operations are not specifically shown or described in detail to avoid obscuring aspects of embodiments of the invention.
Thus, although the present invention has been described herein with reference to particular embodiments thereof, a latitude of modification, various changes, and substitutions are intended in the foregoing disclosures, and it will be appreciated that in some instances some features of the invention will be employed without a corresponding use of other features, without departing from the scope and spirit of the invention as set forth. Thus, many modifications may be made to adapt a particular situation or material to the essential scope and spirit of the present invention. It is intended that the invention not be limited to the particular terms used in the following claims and/or to the particular embodiment disclosed as the best mode contemplated for carrying out this invention, but that the invention will include any and all embodiments and equivalents falling within the scope of the appended claims. Accordingly, the scope of the invention is to be determined solely by the appended claims.

Claims (10)

1. An ophthalmic imaging method for automated retinal feature detection, comprising,
acquiring a plurality of fundus images acquired in a non-mydriatic state;
acquiring a fundus image acquired in a mydriatic state corresponding to a fundus image acquired in a non-mydriatic state;
acquiring a mapping relation between the graphic features of the fundus images acquired in the non-mydriatic state and the graphic features of the fundus images acquired in the mydriatic state according to the fundus images acquired in the non-mydriatic state and the corresponding fundus images acquired in the mydriatic state;
acquiring, in real time, a fundus image to be detected acquired in a non-mydriatic state;
acquiring image features of the fundus image to be detected according to the fundus image to be detected acquired in real time in the non-mydriatic state;
acquiring predicted graphic features of the fundus image acquired in the mydriatic state according to the image features of the fundus image to be detected and the mapping relation between the graphic features of fundus images acquired in the non-mydriatic state and the graphic features of fundus images acquired in the mydriatic state;
acquiring a key region of the fundus to be inspected according to the predicted graphic features of the fundus image acquired in the mydriatic state;
in the process of acquiring the fundus image in the mydriatic state, preferentially acquiring the key region of the fundus to be inspected, to obtain the fundus image acquired in the mydriatic state;
acquiring retinal features according to the fundus image acquired in the mydriatic state.
2. The method according to claim 1, wherein the step of acquiring a mapping relationship between the pattern feature of the fundus image acquired in the non-mydriatic state and the pattern feature of the fundus image acquired in the mydriatic state based on the fundus image acquired in the non-mydriatic state and the corresponding fundus image acquired in the mydriatic state includes,
acquiring the graphic characteristics of the fundus image acquired in the non-mydriatic state according to the fundus image acquired in the non-mydriatic state;
acquiring the graphic characteristics of the fundus image acquired in the mydriatic state according to the fundus image acquired in the mydriatic state;
and acquiring a mapping relation between the graphic features of the fundus images acquired in the non-mydriatic state and the graphic features of the fundus images acquired in the mydriatic state according to the graphic features of the fundus images acquired in the non-mydriatic state and the corresponding graphic features of the fundus images acquired in the mydriatic state.
3. The method according to claim 2, wherein the step of preferentially acquiring the key region of the fundus to be inspected in the process of acquiring the fundus image in the mydriatic state, to obtain the fundus image acquired in the mydriatic state, includes,
irradiating the eye in a non-mydriatic state by using low-brightness positioning light beams to acquire shape and position information of the eye fundus;
when the detection light beam starts to be emitted, acquiring the predicted graphic characteristics of the fundus image acquired in a mydriatic state;
high-brightness detection light beams are emitted to the eye in a mydriatic state according to the shape and position information of the eye fundus, so that the detection light beams are irradiated to the eye fundus to acquire an eye fundus image acquired in the mydriatic state.
4. The method according to claim 3, wherein the step of irradiating the eye in the mydriatic state with a detection beam of high brightness based on the shape and position information of the eye fundus so that the detection beam is irradiated to the eye fundus for acquiring the image of the eye fundus acquired in the mydriatic state includes,
arranging an electrode patch on the head of the subject, wherein the electrode patch is arranged on the scalp surface corresponding to the brain visual imaging area;
and gradually increasing the brightness of the detection light beam injected into the eye in the mydriatic state until the brightness limit is reached according to the detection current value of the electrode patch.
5. The method according to claim 4, wherein the step of gradually increasing the brightness of the detection beam incident to the eye in the mydriatic state until reaching the brightness limit according to the detection current value of the electrode patch comprises,
the safe low-brightness detection light beam is emitted to the eyes in the mydriatic state;
acquiring a detection current value of the electrode patch corresponding to the brightness of the safe low-brightness detection light beam which is emitted into eyes in a mydriatic state;
acquiring a functional relation between the brightness of the detection beam incident to the eye in the mydriatic state and the detection current value of the electrode patch, according to the safe low-brightness detection beam incident to the eye in the mydriatic state and the obtained detection current value of the electrode patch;
and obtaining the maximum brightness of the detection beam incident to the eye in the mydriatic state according to the functional relation and a preset maximum of the detection current value of the electrode patch.
6. The method according to claim 5, wherein the step of obtaining the detected current value of the electrode patch corresponding to the brightness of the safe low-brightness detection beam incident to the eye in the mydriatic state includes,
detecting a bioelectric current value of a scalp surface corresponding to the brain visual imaging area using a plurality of electrode patches;
adjusting the corresponding weight according to the distance between each electrode patch and the brain visual imaging area;
wherein the step of adjusting the corresponding weight according to the distance between each electrode patch and the brain vision imaging area comprises,
dividing each electrode patch into a plurality of plane units;
dividing a brain vision imaging region into a plurality of volume units;
and adjusting the weight of the detection current value of each electrode patch according to the distance between each plane unit in each electrode patch and the volume unit in the brain visual imaging area.
7. The method of claim 6, wherein the step of adjusting the weight of the detected current values of the electrode patches according to the distance between each planar cell in each electrode patch and the volume cell in the brain vision imaging area comprises,
acquiring the central point position of each plane unit;
adjusting the weight of the detection current value of each electrode patch according to the distance between the central point of each plane unit in each electrode patch and the volume unit in the brain visual imaging area;
wherein the step of adjusting the weight of the detected current value of the electrode patch according to the distance between the center point of each planar unit in each electrode patch and the center point of the volume unit in the brain vision imaging area comprises,
acquiring the central point position of each volume unit;
and adjusting the weight of the detection current value of each electrode patch according to the distance between the central point of each plane unit in each electrode patch and the central point of the volume unit in the brain visual imaging area.
8. The method according to claim 3, wherein the detection beam is emitted by a detection instrument, the detection instrument further analyzes the detection beam when emitting it and begins emitting the detection beam when the detection beam meets the requirement, and wherein analyzing the detection beam comprises: estimating the intensity of the detection beam to obtain light-intensity estimation data of the detection beam:
[Equation image not reproduced in the text: R expressed in terms of C and the Bessel term J_α(X).]
wherein R represents the light-intensity estimation data of the detection beam; C represents the central intensity of the maximum bright spot in the central region of the detection beam; and J_α(X) denotes the Bessel function of order α, where X is given by
[Equation image not reproduced in the text: the expression for X in terms of r, l, t and d.]
α is taken as 1; π denotes the circle constant; r denotes the aperture of the emission outlet of the detection instrument; l denotes the wavelength; t denotes the included angle between the emergent ray and the principal optical axis; and d denotes the image distance;
determining an analysis result from the light-intensity estimation data of the detection beam:
[Equation image not reproduced in the text; per the definitions that follow, G takes the value T when R falls within the sensitivity range W, and F otherwise.]
wherein W represents the sensitivity range of the human eye to light beams, and G represents the analysis result: when G is T, the detection beam meets the requirement and is emitted; when G is F, the detection beam does not meet the requirement and is not emitted.
9. The method according to claim 1, wherein the plurality of fundus images acquired in the non-mydriatic state are obtained by a fundus camera, the fundus camera performing the following steps when obtaining the fundus images:
moving the lens of the fundus camera so that it is aligned with the pupil in the non-mydriatic state;
controlling the lens of the fundus camera to approach the eyeball to acquire an image frame, and adjusting the image frame presented in the fundus camera;
selecting adjusted image frames displayed in the fundus camera at different focal lengths, according to the distance between the lens of the fundus camera and the eyeball, to obtain the plurality of fundus images acquired in the non-mydriatic state, which comprises:
analyzing whether the image frame reaches a preset sharpness; when it does not, adjusting the sharpness of the image frame; when it does, judging whether a light spot appears in the image frame;
if a light spot appears in the image frame, adjusting its position until it coincides with the center of the image frame, recording the adjusted image frame at the current focal length, then adjusting the focal length and analyzing the sharpness again, cycling in this way until adjusted image frames at a target number of different focal lengths are obtained.
10. An ophthalmic imaging system for automated retinal feature detection, comprising,
a history collection unit for acquiring a plurality of fundus images acquired in a non-mydriatic state;
acquiring a fundus image acquired in a mydriatic state corresponding to a fundus image acquired in a non-mydriatic state;
acquiring a mapping relation between the graphic features of the fundus images acquired in the non-mydriatic state and the graphic features of the fundus images acquired in the mydriatic state according to the fundus images acquired in the non-mydriatic state and the corresponding fundus images acquired in the mydriatic state;
the detection unit is used for acquiring an eye fundus image to be detected collected in a non-mydriatic state in real time;
acquiring image characteristics of the fundus image to be detected acquired in the non-mydriatic state according to the fundus image to be detected acquired in the non-mydriatic state acquired in real time;
acquiring a predicted graph characteristic of the fundus image acquired in the mydriatic state according to the image characteristic of the fundus image to be detected acquired in the non-mydriatic state and the mapping relation between the graph characteristic of the fundus image acquired in the non-mydriatic state and the graph characteristic of the fundus image acquired in the mydriatic state;
acquiring a key region of the fundus to be inspected according to the predicted graphic features of the fundus image acquired in the mydriatic state;
in the process of acquiring the fundus image in the mydriatic state, preferentially acquiring the key region of the fundus to be inspected, to obtain the fundus image acquired in the mydriatic state;
retinal features are acquired according to fundus images acquired in a mydriatic state.
CN202210687615.0A 2022-06-17 2022-06-17 Ophthalmic imaging method and system for automatic retina feature detection Pending CN115100729A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210687615.0A CN115100729A (en) 2022-06-17 2022-06-17 Ophthalmic imaging method and system for automatic retina feature detection


Publications (1)

Publication Number Publication Date
CN115100729A 2022-09-23

Family

ID=83290391

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210687615.0A Pending CN115100729A (en) 2022-06-17 2022-06-17 Ophthalmic imaging method and system for automatic retina feature detection

Country Status (1)

Country Link
CN (1) CN115100729A (en)

Similar Documents

Publication Publication Date Title
US20210161378A1 (en) Photorefraction Ocular Screening Device and Methods
US6260968B1 (en) Pupilometer with pupil irregularity detection capability
US10716704B2 (en) Ophthalmoscope having a laser device
Markowitz et al. Microperimetry and clinical practice: an evidence-based review
US7331667B2 (en) Iris pattern recognition and alignment
US7614745B2 (en) System for analyzing eye responses to automatically determine impairment of a subject
AU2018438719A1 (en) Fundus image automatic analysis and comparison method and storage device
US9980635B2 (en) System and device for preliminary diagnosis of ocular diseases
JP2005342107A (en) Perimeter
EP3769283A1 (en) Pupil edge detection in digital imaging
US20180263491A1 (en) Ophthalmic apparatus, and treatment site measuring method for the apparatus
JP3851824B2 (en) Contrast sensitivity measuring device
EP3402388B1 (en) System and method for performing objective perimetry and diagnosis of patients with retinitis pigmentosa and other ocular diseases
CN115100729A (en) Ophthalmic imaging method and system for automatic retina feature detection
CN105411523A (en) Cornea image processing method
CN115868920A (en) Quantitative evaluation method and device for conjunctival hyperemia and storage medium
CN108230287A (en) A kind of detection method and device of the crystalline region of anterior segment image
Thomson Retinal topography with the Heidelberg retina tomograph
Cassanelli et al. A new screening system for the estimation of ocular anterior chamber angle width
KR101731972B1 (en) Automatic focusing apparatus of ophthalmometer and methdo thereof
JP7163039B2 (en) Diagnosis support device, diagnosis support method and program
CN115414002A (en) Eye detection method based on video stream and strabismus screening system
CN118526156A (en) Functional multispectral pupil measuring device
CN117243560A (en) View meter system for view detection and method thereof
CN116849602A (en) Fundus image shooting method and device and main control equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination