CN117942024A - Eye parameter detection method, device and system

Eye parameter detection method, device and system

Info

Publication number
CN117942024A
CN117942024A
Authority
CN
China
Prior art keywords
detection
detector
eye
parameter
instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410131739.XA
Other languages
Chinese (zh)
Inventor
赵国良 (Zhao Guoliang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Doctor Eye Optometry Technology Research Center
Original Assignee
Harbin Doctor Eye Optometry Technology Research Center
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Doctor Eye Optometry Technology Research Center
Priority to CN202410131739.XA
Publication of CN117942024A
Legal status: Pending


Abstract

An eye parameter detection method, device, and system relate to eye parameter detection technology and aim to solve the problem that existing eye examination equipment is single-function, which makes the patient's examination process cumbersome. The method comprises the following steps: in response to receiving an eye parameter detection instruction, sequentially controlling each detector on an electric translation stage to move in front of the detection stage for eye parameter detection; extracting eye parameters from the detection data obtained by the detectors; and outputting the eye parameters to the user. The device comprises a detection module, a data processing module, and an output module. The system comprises an electric translation stage, a detection stage, a control center, and a plurality of detectors positioned on the electric translation stage; the detectors are used to detect eye data; the control center is used to receive user instructions and control the electric translation stage and the detectors accordingly. The application enables one-key operation and greatly improves detection efficiency.

Description

Eye parameter detection method, device and system
Technical Field
The present application relates to eye parameter detection technology.
Background
Eye problems affect people of all ages: hyperopia and amblyopia, myopia and astigmatism among teenagers and middle-aged people, and glaucoma and cataract among the elderly, while fundus diseases can occur at any age. Examining eye problems requires professional instruments, and a single instrument can detect only one type of problem. To obtain a comprehensive picture of the eyes, a patient must take a number and queue for each instrument in turn, and after the examination results are obtained must queue again for the doctor to review them, making the whole process very cumbersome.
Disclosure of Invention
A first aspect of the application provides an eye parameter detection method, comprising the following steps: in response to receiving an eye parameter detection instruction, sequentially controlling each detector on the electric translation stage to move in front of the detection stage for eye parameter detection; extracting eye parameters from the detection data obtained by the detector; and outputting the eye parameters to the user.
In one possible implementation, sequentially controlling each detector on the electric translation stage to move in front of the detection stage for eye parameter detection comprises: acquiring the ID of the detector currently positioned in front of the detection stage and sending a parameter detection instruction to that detector; upon receiving a parameter detection completion instruction returned by the detector, sending a movement instruction to the electric translation stage so that all detectors move to the next position in sequence, and recording the ID of the detector now positioned in front of the detection stage; judging whether all detectors have completed the detection; if so, ending the detection and prompting the user that detection is complete; if not, returning to the step of acquiring the ID of the detector currently positioned in front of the detection stage and sending a detection instruction to it.
In one possible implementation, sequentially controlling each detector on the electric translation stage to move in front of the detection stage for eye parameter detection further comprises: upon receiving an adjustment instruction returned by the detector, sending a control instruction to the electric translation stage according to the adjustment distance and adjustment direction carried by the adjustment instruction, so that the electric translation stage moves by that distance in that direction.
In one possible implementation, extracting eye parameters from the detection data acquired by the detector comprises: when the detection data is an image, extracting a specific target from the image, measuring the size of the extracted target, and taking the measurement result as an eye parameter.
In one possible implementation, the method further comprises: analyzing the eye parameters and forming prompt information according to the analysis results; and outputting the prompt information to the user.
A second aspect of the present application provides an eye parameter detection device, comprising: a detection module adapted to, in response to receiving an eye parameter detection instruction, sequentially control each detector on the electric translation stage to move in front of the detection stage for eye parameter detection; a data processing module adapted to extract eye parameters from the detection data acquired by the detector; and an output module adapted to output the eye parameters to the user.
In one possible implementation, when the detection data is an image, the data processing module extracts a specific target from the image, measures the size of the extracted target, and takes the measurement result as an eye parameter.
A third aspect of the present application provides an eye parameter detection system, comprising an electric translation stage, a detection stage, a control center, and a plurality of detectors positioned on the electric translation stage; the detection stage is positioned at one side of the electric translation stage; the plurality of detectors are respectively used for detecting different types of eye data; the control center is used to receive user instructions and control the electric translation stage and the detectors accordingly.
In one possible implementation, the electric translation stage includes a coarse translation stage and a plurality of fine-tuning translation stages; each detector is mounted on one fine-tuning translation stage, and the fine-tuning translation stages are all mounted on the coarse translation stage.
A fourth aspect of the present application provides an electronic device, including: a memory; a processor; a computer program; wherein the computer program is stored in the memory and configured to be executed by the processor to implement the method of any one of claims 1 to 5.
With the eye parameter detection method, device, and system provided by the application, a user need only place the head on the detection stage to have multiple eye parameters detected in a single session; one-key operation is achieved, repeated queuing is eliminated, and detection efficiency is greatly improved.
Drawings
Fig. 1 is a schematic diagram of the mechanical structure of an eye parameter detection system 100 according to an embodiment of the present invention;
Fig. 2 is a schematic electrical diagram of the eye parameter detection system 100;
Fig. 3 is a schematic structural diagram of an electronic device 300 according to an embodiment of the present invention;
Fig. 4 is a flow chart of an eye parameter detection method 400 according to an embodiment of the invention;
Fig. 5 is a flowchart of step S410 according to an embodiment of the present invention;
Fig. 6 is a schematic structural diagram of an eye parameter detection device 600 according to an embodiment of the present invention.
Detailed Description
To make the technical solutions and advantages of the embodiments of the present application clearer, exemplary embodiments of the present application are described in detail below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. It should be noted that, where no conflict arises, the embodiments of the present application and the features within them may be combined with each other.
To address the problems that existing eye examination equipment is single-function and makes the patient examination process cumbersome, the invention provides an eye parameter detection method, device, and system to solve or mitigate these problems.
Figs. 1 and 2 are schematic structural diagrams of an eye parameter detection system 100 according to an embodiment of the present invention. As shown in Fig. 1, the eye parameter detection system 100 includes a motorized (electric) translation stage 110, a detection stage 120, a control center 130, and a plurality of detectors 140 positioned on the translation stage. Each detector 140 is assigned an ID to distinguish it from the others. Each detector 140 detects one type of eye parameter; for example, a first detector 140 may check the refractive status of both eyes, a second may check for cataract, a third may check the condition of the fundus, and so on. As shown in Fig. 1, the electric translation stage 110 includes a coarse translation stage 111 and fine-tuning translation stages 112. The coarse translation stage 111 preferably uses a circular or rectangular track; a plurality of fine-tuning translation stages 112 are mounted on it, and one detector 140 is mounted on each fine-tuning translation stage 112. Under the control of the control center 130, each fine-tuning translation stage 112 can move a minute distance relative to the coarse translation stage 111, and the fine-tuning translation stages 112 can also move as a whole along the circular or rectangular track of the coarse translation stage 111. The detection stage 120 is located at one side of the electric translation stage 110, and the user sits at the detection stage 120. After detection starts, under the control of the control center 130, the detectors 140 move along the track to the front of the user in sequence, and the precise position of each detector is adjusted by its fine-tuning translation stage 112 so that the user's eyes are exactly at the center of the detector 140's field of view before eye parameter detection is performed.
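To make this arrangement concrete, the following minimal Python sketch models the component hierarchy just described: a coarse translation stage carrying several fine-tuning stages, each holding one detector with a unique ID. The patent specifies hardware only; all class and field names here are hypothetical illustration.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Detector:
    detector_id: str       # unique ID used to distinguish detectors, e.g. "001"
    parameter_type: str    # e.g. "refraction", "cataract", "fundus"

@dataclass
class FineStage:
    detector: Detector     # each fine-tuning stage 112 carries one detector 140
    x_cm: float = 0.0      # current horizontal fine offset
    y_cm: float = 0.0      # current vertical fine offset

@dataclass
class CoarseStage:
    track: str             # "circular" or "rectangular" track of stage 111
    fine_stages: List[FineStage] = field(default_factory=list)

# Example: three detectors mounted at equal intervals on a circular track
stage = CoarseStage(track="circular", fine_stages=[
    FineStage(Detector("001", "refraction")),
    FineStage(Detector("002", "cataract")),
    FineStage(Detector("003", "fundus")),
])
```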
The control center 130 is implemented with an electronic device, and the eye parameter detection method provided by the application is executed in that electronic device. The electronic device may be any device having storage and computing capabilities; it may be implemented, for example, as a server or a workstation, as a personal computer such as a desktop or notebook computer, or as a terminal device such as a mobile phone or tablet computer, but is not limited thereto.
Fig. 3 shows a schematic diagram of an electronic device 300 according to an embodiment of the invention. It should be noted that the electronic device 300 shown in Fig. 3 is only an example. In practice, the electronic device implementing the eye parameter detection method of the present invention may be of any type, and its hardware configuration may be the same as or different from that of the electronic device 300 shown in Fig. 3; hardware components may be added to or removed from those shown, and the present invention places no limit on the specific hardware configuration of the electronic device.
As shown in Fig. 3, the electronic device 300 generally includes one or more processors 310, a memory 320, a bus 330, and an interface 340. The processor 310 may be any type of processor, including but not limited to a microprocessor, a microcontroller, a digital signal processor, or any combination thereof. The memory 320 may be any type of memory, including but not limited to volatile memory (such as RAM), non-volatile memory (such as ROM or flash memory), or any combination thereof. The memory 320 stores computer program instructions for performing the eye parameter detection method of the present invention, which instruct the processor 310 to perform the method. The electronic device 300 also includes an interface 340 for connecting to external devices (e.g., a printer). The processor 310, memory 320, and interface 340 communicate via the bus 330.
An eye parameter detection method 400 according to an embodiment of the present invention includes: in response to receiving an eye parameter detection instruction, sequentially controlling each detector on the electric translation stage to move in front of the detection stage for eye parameter detection; extracting eye parameters from the detection data obtained by the detector; and outputting the eye parameters to the user.
Fig. 4 is a flow chart of an eye parameter detection method 400 according to an embodiment of the invention. As shown in fig. 4, the method 400 begins at step S410.
In step S410, in response to receiving the eye parameter detection command, each detector 140 on the motorized translation stage 110 is sequentially controlled to move in front of the detection stage 120 for eye parameter detection.
The user places the head on the detection stage 120 and then issues an instruction to start detection through the human-computer interaction interface of the control center 130. Before sending the instruction, the user may enter personal information such as name, gender, and age through the interface. A log is then created in the memory of the control center 130; the log first records the current user's personal information (name, gender, age, etc.), and if the user did not enter this information, it is simply not recorded. The electric translation stage 110 then drives each detector 140 in turn to the front of the detection stage 120 to detect the user's eye parameters. The specific detection process is shown in Fig. 5.
In step S411, the ID of the detector 140 currently located in front of the detection stage 120 is acquired, and a detection instruction is transmitted to the detector 140.
The ID of the detector 140 currently located in front of the detection stage 120 can be obtained from the previous log file. Each time the method 400 is executed, a new log file is created, named with its creation time (year, month, day, hour, minute, second). The log file created during the previous detection can therefore be found quickly by file name, and the ID of the detector 140 that last measured eye parameters can be read from it.
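A minimal sketch of this log-file convention, assuming a timestamp-based file name and a plain-text `detector_id=...` line format (the patent does not specify the storage format):

```python
import os
from datetime import datetime

LOG_DIR = "logs"  # hypothetical storage location for log files

def create_log() -> str:
    """Create a new log file named after its creation time."""
    os.makedirs(LOG_DIR, exist_ok=True)
    name = datetime.now().strftime("%Y%m%d_%H%M%S") + ".log"
    path = os.path.join(LOG_DIR, name)
    open(path, "w", encoding="utf-8").close()
    return path

def last_detector_id() -> str | None:
    """Find the most recent log by file name; read the last detector ID it records."""
    if not os.path.isdir(LOG_DIR):
        return None
    logs = sorted(f for f in os.listdir(LOG_DIR) if f.endswith(".log"))
    if not logs:
        return None
    with open(os.path.join(LOG_DIR, logs[-1]), encoding="utf-8") as fh:
        ids = [line.split("=", 1)[1].strip()
               for line in fh if line.startswith("detector_id=")]
    return ids[-1] if ids else None
```

Because the timestamped names are zero-padded, lexicographic sorting finds the newest file directly.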
Before the eye parameters are detected, it is necessary to ensure that the user's eyes are exactly at the center of the detector 140's field of view. The control center 130 therefore acquires the ID of the detector 140 currently in front of the detection stage 120 and sends it a position detection instruction. On receiving this instruction, the detector 140 checks the positions of both eyes. Specifically, the detector 140 may prompt the user to look straight ahead, capture images of both eyes after 3-5 seconds, and then determine the deviation between the midpoint of the line connecting the two eyes and the center of its field of view. If the deviation is smaller than a preset threshold, the eyes are considered centered; otherwise, the detector returns an adjustment instruction to the control center 130 containing its ID, the position deviation between the eye-line midpoint and the field-of-view center, and the direction of the deviation (e.g., right or left). On receiving the adjustment instruction, the control center 130 sends a control instruction to the fine-tuning translation stage 112 of that detector 140 according to the deviation and direction carried by the instruction; the control instruction contains the distance and direction of the fine adjustment, and the fine-tuning translation stage 112 adjusts the position of the detector 140 accordingly. For example, if the detector 140 with ID 001 finds that the midpoint of the user's eye line lies 1cm to the left of its field-of-view center (right being the positive direction and left the negative direction), it returns the adjustment instruction "ID001; -1cm" to the control center 130, which then issues a control instruction to the detector's fine-tuning translation stage 112, for example "ID001; -1cm", meaning that the fine-tuning translation stage 112 of detector 001 is moved 1cm to the left. Typically, 1 to 2 rounds of fine tuning bring the detector 140 into the proper position.
Further, the fine-tuning translation stage 112 may comprise a horizontal fine-tuning translation stage and a vertical fine-tuning translation stage, the horizontal stage being disposed on the vertical stage. If the detector 140 finds that the midpoint of the eye line deviates from the center of its field of view, the adjustment instruction sent to the control center 130 should include both the horizontal (X-direction) and vertical (Y-direction) deviations. For example, if the detector 140 with ID 001 finds that the midpoint of the user's eye line lies 1cm to the left (right positive, left negative) and 1cm above (up positive, down negative) its field-of-view center, it returns the adjustment instruction "ID001; X-1cm; Y1cm" to the control center 130, which then issues a control instruction to the detector's fine-tuning translation stage 112, for example "ID001; X-1cm; Y1cm", meaning that the fine-tuning translation stage 112 of detector 001 is moved 1cm to the left and 1cm upward. Typically, 1 to 2 rounds of fine tuning bring the detector 140 into the proper position.
After each fine adjustment, it is necessary to check again whether the user's eyes are at the center of the detector 140's field of view. Once this is confirmed, the detector 140 returns an adjustment completion instruction to the control center 130.
At the same time, the time of each position adjustment, together with its distance and direction, is recorded in the newly created log file.
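The adjustment handshake can be sketched as follows. The semicolon-separated message format ("ID001; X-1cm; Y1cm") is taken from the examples above, but the parsing details, the "OK" reply, and the detector/stage method names are assumptions, since the patent does not define a concrete protocol:

```python
import re
from datetime import datetime

def parse_adjustment(msg: str) -> tuple[str, float, float]:
    """Parse an adjustment instruction such as 'ID001; X-1cm; Y1cm'.
    Returns (detector_id, dx_cm, dy_cm); X positive is right, Y positive is up."""
    parts = [p.strip() for p in msg.split(";")]
    det_id = parts[0].removeprefix("ID")
    dx = dy = 0.0
    for p in parts[1:]:
        m = re.fullmatch(r"([XY])(-?\d+(?:\.\d+)?)cm", p)
        if m and m.group(1) == "X":
            dx = float(m.group(2))
        elif m:
            dy = float(m.group(2))
    return det_id, dx, dy

def center_eyes(detector, fine_stage, log, max_rounds: int = 2):
    """Fine-tune until the detector reports that the eyes are centered.
    Per the description, 1-2 rounds normally suffice."""
    for _ in range(max_rounds):
        reply = detector.check_position()     # hypothetical detector API
        if reply == "OK":                     # adjustment complete
            return True
        _, dx, dy = parse_adjustment(reply)
        fine_stage.move(dx, dy)               # hypothetical stage API
        log.write(f"adjust {datetime.now()} dx={dx}cm dy={dy}cm\n")
    return detector.check_position() == "OK"
```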
After receiving the adjustment completion instruction, the control center 130 sends a parameter detection instruction to the detector 140, which then performs parameter detection on both of the user's eyes. After the detection is complete, the detector 140 returns a detection completion instruction to the control center 130.
Next, in step S412, upon receiving the detection completion instruction returned by the detector 140, the control center 130 issues a movement instruction to the electric translation stage 110 so that all detectors 140 move to the next position in sequence; after the movement is complete, the ID of the detector 140 now in front of the detection stage 120 is recorded.
Note that all fine-tuning translation stages 112 should be mounted on the coarse translation stage 111 at equal intervals. Each time one detector 140 completes parameter detection, every detector 140 advances to the position previously occupied by the next detector 140.
Next, in step S413, it is determined whether all the detectors 140 have completed the detection.
Let the number of detectors 140 be N. In this step, it is determined whether the number of detectors 140 that have completed the current detection equals N. If it is less than N, the process returns to step S411; if it equals N, the user is prompted that detection is complete.
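Putting steps S411 through S413 together, the control-center loop might look like the sketch below, reusing center_eyes from the sketch above; the detector and stage objects and their methods (detector_at_front, fine_stage_of, step, send, wait) are hypothetical stand-ins for whatever bus or network commands the hardware actually uses:

```python
def run_detection(coarse_stage, detectors, log):
    """Run every detector past the detection stage in turn (steps S411-S413)."""
    n = len(detectors)           # total number of detectors, N
    done = 0                     # detectors that have completed this detection
    while done < n:
        det = coarse_stage.detector_at_front()       # S411: detector at the front
        log.write(f"detector_id={det.detector_id}\n")
        center_eyes(det, coarse_stage.fine_stage_of(det), log)  # handshake (Fig. 5)
        det.send("PARAM_DETECT")                     # parameter detection instruction
        det.wait("PARAM_DONE")                       # parameter detection completion
        done += 1
        if done < n:                                 # S413: not all detectors done yet
            coarse_stage.step()                      # S412: all detectors advance one slot
    print("Detection complete")                      # prompt the user
```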
Next, in step S420, eye parameters are extracted from the detection data acquired by the detector 140.
The detection data is the data obtained directly by the detector 140 and falls mainly into two forms: text data and image data. Text data may be, for example, data describing the refractive status; image data may be, for example, an image taken of the fundus.
Extracting eye parameters from text data may mean annotating the data in Chinese so that it becomes eye parameters the user can readily view and understand.
Extracting eye parameters from image data may mean, for example, extracting certain specific targets from the image and measuring their size. For example: (1) if a yellow patch appears in the image, extract the patch, determine its edge, measure its size, and take the measured size as an eye parameter; (2) extract the diameters of the arterial and venous blood vessels in the image and calculate their ratio.
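As a hedged illustration of example (1), the OpenCV sketch below isolates a yellowish patch in a fundus image and measures its extent. The HSV threshold range and the pixel-to-millimetre scale are illustrative assumptions, not values taken from the patent:

```python
import cv2
import numpy as np

def measure_yellow_patch(fundus_bgr: np.ndarray, mm_per_px: float = 0.01):
    """Extract the largest yellow patch from a fundus image and measure it.
    Returns a dict of sizes in millimetres, or None if no patch is found."""
    hsv = cv2.cvtColor(fundus_bgr, cv2.COLOR_BGR2HSV)
    # Assumed HSV range for "yellow"; a real system would need clinical calibration.
    mask = cv2.inRange(hsv, (20, 60, 60), (35, 255, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    patch = max(contours, key=cv2.contourArea)   # largest candidate patch
    _, _, w, h = cv2.boundingRect(patch)         # bounding box of the patch edge
    return {"width_mm": w * mm_per_px,
            "height_mm": h * mm_per_px,
            "area_mm2": cv2.contourArea(patch) * mm_per_px ** 2}
```

Example (2), the artery-to-vein diameter ratio, would follow the same pattern: segment the vessels, estimate the two diameters, and report their ratio as the eye parameter.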
All the extracted eye parameters are assembled into a single file.
In addition, suggestions for the user can be added to the file. A suggestion should be provided separately for the detection data of each detector. For diopter (refraction) data, the suggestion may be, for example, to wear glasses, together with specific parameters for both lenses, including the degrees of myopia, hyperopia, and astigmatism; for fundus data, the suggestion may be to have the results reviewed by a doctor or to undergo further examination.
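Assembling the extracted parameters and per-detector suggestions into one file might look like this minimal sketch; the parameter names, suggestion texts, and sample values are illustrative only:

```python
def build_report(results: dict[str, dict], suggestions: dict[str, str]) -> str:
    """Format all extracted eye parameters, plus a suggestion per detector type,
    into a single plain-text report."""
    lines = ["Eye Parameter Report", "=" * 40]
    for dtype, params in results.items():
        lines.append(f"\n[{dtype}]")
        for name, value in params.items():
            lines.append(f"  {name}: {value}")
        if dtype in suggestions:
            lines.append(f"  Suggestion: {suggestions[dtype]}")
    return "\n".join(lines)

# Illustrative usage with made-up values
report = build_report(
    {"refraction": {"OD sphere": "-2.00 D", "OS sphere": "-1.75 D"},
     "fundus": {"artery/vein diameter ratio": 0.67}},
    {"refraction": "Glasses recommended; see lens parameters above.",
     "fundus": "Review by a doctor or further examination advised."},
)
print(report)
```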
Next, in step S430, the eye parameters are output to the user.
In this embodiment, the eye parameters are output mainly in two ways: as an electronic file that the user can download and view, or as a paper document printed for the user on a printer.
By adopting the system 100 and method 400 of this embodiment, the efficiency of eye parameter detection can be significantly improved, making examination much more convenient for the user.
This embodiment also provides an eye parameter detection device capable of executing each step of the method 400.
Fig. 6 is a schematic structural diagram of an eye parameter detection device 600 according to an embodiment of the present invention. As shown in Fig. 6, the device 600 includes a detection module 610, a data processing module 620, and an output module 630.
In response to receiving an eye parameter detection instruction, the detection module 610 sequentially controls each detector on the electric translation stage to move in front of the detection stage for eye parameter detection.
The data processing module 620 extracts the eye parameters from the detection data obtained by the detector.
The output module 630 outputs the eye parameters to the user.
In one possible implementation, when the detection data is an image, the data processing module extracts a specific target from the image, measures the size of the extracted target, and takes the measurement result as an eye parameter.
The various techniques described herein may be implemented in connection with hardware or software or a combination of both. Thus, the methods and apparatus of the present invention, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media such as removable hard drives, USB flash drives, floppy disks, CD-ROMs, or any other machine-readable storage medium, wherein, when the program is loaded into and executed by a machine such as a computer, the machine becomes an apparatus for practicing the invention.
In the case of program code execution on programmable computers, the electronic device will generally include a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. The memory is configured to store the program code; the processor is configured to execute the eye parameter detection method of the invention according to the instructions in the program code stored in the memory.
By way of example, and not limitation, readable media comprise readable storage media and communication media. The readable storage medium stores information such as computer readable instructions, data structures, program modules, or other data. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. Combinations of any of the above are also included within the scope of readable media.
In the description provided herein, algorithms and displays are not inherently related to any particular computer, virtual system, or other apparatus. Various general-purpose systems may also be used with the examples of the invention, and the structure required to construct such a system is apparent from the description above. Moreover, the present invention is not directed to any particular programming language; it should be appreciated that the teachings of the invention described herein may be implemented in a variety of programming languages, and the above description of specific languages is given to disclose preferred embodiments of the invention.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof to streamline the disclosure and aid understanding of one or more of the inventive aspects. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Those skilled in the art will appreciate that the modules, units, or components of the devices in the examples disclosed herein may be arranged in a device as described in this embodiment, or alternatively located in one or more devices different from those in this example. The modules in the foregoing examples may be combined into one module or further divided into a plurality of sub-modules.
Those skilled in the art will appreciate that the modules in the apparatus of the embodiments may be adaptively changed and disposed in one or more apparatuses different from the embodiments. The modules or units or components of the embodiments may be combined into one module or unit or component and, furthermore, they may be divided into a plurality of sub-modules or sub-units or sub-components. Any combination of all features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or units of any method or apparatus so disclosed, may be used in combination, except insofar as at least some of such features and/or processes or units are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features but not others included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. Furthermore, some of the embodiments are described herein as methods or combinations of method elements that may be implemented by a processor of a computer system or by other means of performing the functions. Thus, a processor with the necessary instructions for implementing the described method or method element forms a means for implementing the method or method element. Furthermore, the elements of the apparatus embodiments described herein are examples of the following apparatus: the apparatus is for carrying out the functions performed by the elements for carrying out the objects of the invention.
As used herein, unless otherwise specified the use of the ordinal terms "first," "second," "third," etc., to describe a general object merely denote different instances of like objects, and are not intended to imply that the objects so described must have a given order, either temporally, spatially, in ranking, or in any other manner.
While the invention has been described with respect to a limited number of embodiments, those skilled in the art, having benefit of the above description, will appreciate that other embodiments are contemplated within the scope of the invention as described herein. Furthermore, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present application without departing from the spirit or scope of the application. Thus, it is intended that the present application also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (10)

1. An eye parameter detection method, comprising:
in response to receiving an eye parameter detection instruction, sequentially controlling each detector on an electric translation stage to move in front of a detection stage for eye parameter detection;
extracting eye parameters from detection data obtained by the detector; and
outputting the eye parameters to a user.
2. The eye parameter detection method of claim 1, wherein sequentially controlling each detector on the electric translation stage to move in front of the detection stage for eye parameter detection comprises:
acquiring the ID of the detector currently positioned in front of the detection stage, and sending a parameter detection instruction to the detector;
when a parameter detection completion instruction returned by the detector is received, sending a movement instruction to the electric translation stage so that all detectors move to the next position in sequence, and recording the ID of the detector currently positioned in front of the detection stage;
judging whether all detectors have completed the detection;
if yes, ending the detection and prompting the user that detection is complete; and
if not, returning to the step of acquiring the ID of the detector currently positioned in front of the detection stage and sending a detection instruction to the detector.
3. The eye parameter detection method of claim 2, wherein sequentially controlling each detector on the electric translation stage to move in front of the detection stage for eye parameter detection further comprises:
when receiving an adjustment instruction returned by the detector, sending a control instruction to the electric translation stage according to adjustment distance and adjustment direction information carried by the adjustment instruction, so that the electric translation stage moves according to the adjustment distance and the adjustment direction.
4. The eye parameter detection method of claim 1, wherein extracting the eye parameters from the detection data obtained by the detector comprises:
when the detection data is an image, extracting a specific target from the image, performing size measurement on the extracted target, and taking the measurement result as an eye parameter.
5. The eye parameter detection method of claim 1, further comprising:
analyzing the eye parameters and forming prompt information according to the analysis results; and
outputting the prompt information to a user.
6. An eye parameter detection device, comprising:
a detection module adapted to, in response to receiving an eye parameter detection instruction, sequentially control each detector on the electric translation stage to move in front of the detection stage for eye parameter detection;
a data processing module adapted to extract eye parameters from detection data acquired by the detector; and
an output module adapted to output the eye parameters to a user.
7. The eye parameter detection device of claim 6, wherein, when the detection data is an image, the data processing module extracts a specific target from the image, measures the size of the extracted target, and takes the measurement result as an eye parameter.
8. An eye parameter detection system, characterized by comprising an electric translation stage, a detection stage, a control center, and a plurality of detectors positioned on the electric translation stage;
the detection stage is positioned at one side of the electric translation stage;
the plurality of detectors are respectively used for detecting different types of eye data; and
the control center is used for receiving a user instruction and controlling the electric translation stage and the detectors to work according to the instruction.
9. The eye parameter detection system of claim 8, wherein the electric translation stage includes a coarse translation stage and a plurality of fine-tuning translation stages, each detector being mounted on one fine-tuning translation stage, and the fine-tuning translation stages all being mounted on the coarse translation stage.
10. An electronic device, comprising:
a memory;
a processor; and
a computer program,
wherein the computer program is stored in the memory and configured to be executed by the processor to implement the method of any one of claims 1 to 5.
Priority Applications (1)

Application Number: CN202410131739.XA · Priority/Filing Date: 2024-01-30 · Title: Eye parameter detection method, device and system

Publications (1)

Publication Number: CN117942024A · Publication Date: 2024-04-30 · Status: Pending

Family

ID=90796066

Country Status (1)

Country: CN · Document: CN117942024A (en)


Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination