CN114202499B - Method and device for measuring refractive information and computer readable storage medium - Google Patents

Method and device for measuring refractive information and computer readable storage medium Download PDF

Info

Publication number
CN114202499B
CN114202499B CN202110694938.8A CN202110694938A CN114202499B CN 114202499 B CN114202499 B CN 114202499B CN 202110694938 A CN202110694938 A CN 202110694938A CN 114202499 B CN114202499 B CN 114202499B
Authority
CN
China
Prior art keywords
fundus image
definition
model
data
parameters
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110694938.8A
Other languages
Chinese (zh)
Other versions
CN114202499A (en
Inventor
黄叶权
郭静云
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Shengda Tongze Technology Co ltd
Original Assignee
Shenzhen Shengda Tongze Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Shengda Tongze Technology Co ltd filed Critical Shenzhen Shengda Tongze Technology Co ltd
Priority to CN202110694938.8A priority Critical patent/CN114202499B/en
Publication of CN114202499A publication Critical patent/CN114202499A/en
Application granted granted Critical
Publication of CN114202499B publication Critical patent/CN114202499B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • G06T7/0014Biomedical image inspection using an image reference approach
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30041Eye; Retina; Ophthalmic

Abstract

The invention discloses a method for measuring refraction information, which comprises the following steps: acquiring fundus images of a current human eye to be detected and shooting parameters of the fundus images, wherein the number of the fundus images is multiple; calculating the definition of the fundus image, and extracting a data point set of the fundus image according to the definition of the fundus image and the shooting parameters; inputting the extracted data point set into a prior model, and estimating corresponding model parameters in the prior model; and substituting the model parameters into a diopter calculation function to calculate the diopter information of human eyes. The invention also discloses a refraction information measuring device and a computer readable storage medium, and provides a refraction information calculation scheme based on a small number of fundus images with different definition.

Description

Refractive information measuring method, device and computer readable storage medium
Technical Field
The invention relates to the technical field of fundus refraction measurement, in particular to a method and a device for measuring refraction information and a computer readable storage medium.
Background
The incidence of myopia has increased worldwide in recent years. There have been several studies that suggest that the peripheral refractive state of the retina may be related to the onset and progression of central myopia. The accurate and rapid measurement and evaluation of peripheral refractive conditions using appropriate techniques is the basis of intensive research. The existing refraction measuring methods on the market at present comprise an optometry method, a windowing optometry method and an aberration measuring method, but the methods are complex to operate, long in measuring time and time-consuming and labor-consuming. Moreover, in the measurement process of the refraction information, because a plurality of images need to be shot, the exposure time is long, even if infrared light which can avoid stimulating human eyes is adopted, the conditions of eye shake and pupil contraction still occur in the shooting process, and the measurement precision of the refraction information is greatly influenced.
The above is only for the purpose of assisting understanding of the technical aspects of the present invention, and does not represent an admission that the above is prior art.
Disclosure of Invention
The invention mainly aims to provide a method and a device for measuring refraction information and a computer readable storage medium, aiming at solving the technical problems that when the refraction information of human eyes is detected, the exposure time for shooting a plurality of fundus images is longer, the eye state in the shooting process is influenced, and the measurement precision of the refraction information is reduced.
To achieve the above object, the present invention provides a refractive information measuring method including:
acquiring fundus images of the current human eyes to be detected and a plurality of photographing parameters of the fundus images;
calculating the definition of the fundus image, and extracting a data point set of the fundus image according to the definition of the fundus image and the shooting parameters;
inputting the extracted data point set into a prior model, and estimating corresponding model parameters in the prior model;
and substituting the model parameters into a diopter calculation function to calculate the diopter information of the fundus image.
Optionally, the step of calculating the fundus image clarity of the fundus image, and extracting the data point set of the fundus image according to the fundus image clarity and the shooting parameters includes:
confirming target position points with definition to be calculated in the fundus image, and calculating the definition of the fundus image by using the target position points, wherein the number of the target position points is multiple;
and extracting the definition and shooting parameters of the target position point to generate a data point set of the fundus image.
Optionally, the step of calculating the fundus image sharpness with the target location point includes:
defining a preset neighborhood range by taking the target position point as a center;
and calculating the definition of the fundus image according to the preset neighborhood range.
Optionally, the step of extracting the sharpness and the shooting parameters of the target position point to generate the data point set of the fundus image includes:
confirming definition data of the target position point based on the shooting parameters, forming a data point of the target position point by using the shooting parameters and the definition data of the target position point of the fundus image, and forming the data point set based on a plurality of data points obtained by the fundus image at the target position point;
polling the fundus image for a plurality of target location points to obtain a set of data points for each target location point.
Optionally, the refractive information measurement method further comprises:
acquiring training sample data, fitting the training sample data based on a candidate definition model, and selecting a target candidate definition model from the candidate definition model according to a fitting result;
and forming a prior model by using the target candidate definition model, and taking the model parameters of the prior model as diopter calculation functions of independent variables.
Optionally, before the step of acquiring training sample data, the method further includes:
acquiring shooting parameters of the training sample data;
and adjusting the current dioptric cartograph to shoot the training sample data according to the shooting parameters.
Optionally, before the step of forming the prior model with the target candidate sharpness model, the method further includes:
acquiring an optical model and optical parameters of the current dioptric map instrument;
creating the candidate sharpness models, one or more, with the optical model and optical parameters.
Optionally, the step of forming a prior model from the target candidate sharpness model includes:
extracting a data point set of the training sample data, and fitting the data point set by using the definition model;
and determining a prior model according to the fitting result.
Further, to achieve the above object, the present invention also provides a refractive information measuring apparatus comprising: a memory, a processor, the memory having stored thereon a computer program capable of being invoked by the processor, the computer program, when executed by the processor, implementing the steps of the refractive information measurement method as described above.
The present invention also provides a computer-readable storage medium having stored thereon a refractive information measuring program which, when executed by a processor, implements the steps of the refractive information measuring method as described above.
The embodiment of the invention provides a method for measuring refraction information, which comprises the steps of obtaining fundus images of human eyes to be measured at present and shooting parameters of the fundus images, wherein the number of the fundus images is multiple; calculating the definition of the fundus image, and extracting a data point set of the fundus image according to the definition of the fundus image and the shooting parameters; inputting the extracted data point set into a prior model, and estimating corresponding model parameters in the prior model; and substituting the model parameters into a diopter calculation function to calculate the diopter information of human eyes. According to the invention, by shortening the image acquisition time and completing exposure shooting within 250ms of the typical blinking period of human eyes, the interference of eye shaking, pupil contraction and the like can be better inhibited, so that visible light shooting can be adopted, refraction measurement approaching the real visual effect of human eyes can be realized, refraction information measurement errors under the abnormal condition of the eye shooting state are avoided, and the eye refraction information measurement precision is improved.
Drawings
FIG. 1 is a schematic diagram of a terminal \ device structure of a hardware operating environment according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of a first embodiment of the method for measuring refractive information according to the present invention;
FIG. 3 is a schematic flow chart of a second embodiment of the method for measuring refractive information according to the present invention;
FIG. 4 is a schematic view of an optical system for acquiring fundus images;
FIG. 5 is a schematic view of a sharpness model.
The implementation, functional features and advantages of the present invention will be further described with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The main solution of the embodiment of the invention is as follows: acquiring fundus images of the current human eyes to be detected and a plurality of photographing parameters of the fundus images; calculating the fundus image definition of the fundus image, and extracting a data point set of the fundus image according to the fundus image definition and the shooting parameters; inputting the extracted data point set into a prior model, and estimating corresponding model parameters in the prior model; and substituting the model parameters into a diopter calculation function to calculate the diopter information of the fundus image.
Because when present detect eye ground refractive information, need shoot many eye images, and when shooing many images, can be because exposure time is longer, make the eye state of shooting process influenced and then reduced refractive information measurement accuracy.
The invention provides a solution, which can better inhibit the interference of eye shake, pupil contraction and the like by shortening the image acquisition time and completing exposure shooting within 250ms of the typical blinking period of human eyes, so that visible light shooting can be adopted, thereby realizing refraction measurement approaching the real visual effect of the human eyes, avoiding the refraction information measurement error under the abnormal condition of the eye shooting state and improving the eye refraction information measurement precision.
As shown in fig. 1, fig. 1 is a schematic terminal structure diagram of a hardware operating environment according to an embodiment of the present invention.
The terminal in the embodiment of the invention can be a mobile or non-mobile terminal device such as a PC, a smart phone, a tablet computer and a portable computer.
As shown in fig. 1, the terminal may include: a processor 1001, such as a CPU, a network interface 1004, a user interface 1003, a memory 1005, a communication bus 1002. Wherein a communication bus 1002 is used to enable connective communication between these components. The user interface 1003 may include a Display screen (Display), an input unit such as a Keyboard (Keyboard), and the optional user interface 1003 may also include a standard wired interface, a wireless interface. The network interface 1004 may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface). The memory 1005 may be a high-speed RAM memory, a non-volatile memory (non-volatile memory), such as a magnetic disk memory, a network memory, or a cloud storage. The memory 1005 may alternatively be a storage device separate from the processor 1001 described previously.
Those skilled in the art will appreciate that the terminal structure shown in fig. 1 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
As shown in fig. 1, the memory 1005, which is a kind of computer storage medium, may include therein an operating system, a network communication module, a user interface module, and a refractive information measuring program.
In the terminal shown in fig. 1, the network interface 1004 is mainly used for connecting to a backend server and performing data communication with the backend server; the user interface 1003 is mainly used for connecting a client (user side) and performing data communication with the client; and the processor 1001 may be configured to call the refractive information measurement program stored in the memory 1005 and perform the following operations:
acquiring fundus images of the current human eyes to be detected and a plurality of photographing parameters of the fundus images;
calculating the definition of the fundus image, and extracting a data point set of the fundus image according to the definition of the fundus image and the shooting parameters;
inputting the extracted data point set into a prior model, and estimating corresponding model parameters in the prior model;
and substituting the model parameters into a diopter calculation function to calculate the diopter information of the fundus image.
Further, the processor 1001 may call the refractive information measurement program stored in the memory 1005, and also perform the following operations:
confirming target position points with definition to be calculated in the fundus image, and calculating the definition of the fundus image by using the target position points, wherein the number of the target position points is multiple;
and extracting the definition and shooting parameters of the target position point to generate a data point set of the fundus image.
Further, the processor 1001 may call the refractive information measurement program stored in the memory 1005, and also perform the following operations:
defining a preset neighborhood range by taking the target position point as a center;
and calculating the definition of the fundus image according to the preset neighborhood range.
Further, the processor 1001 may call the refractive information measurement program stored in the memory 1005, and also perform the following operations:
confirming definition data of the target position point based on the shooting parameters, forming a data point of the target position point by using the shooting parameters and the definition data of the target position point of the fundus image, and forming the data point set based on a plurality of data points obtained by the fundus image at the target position point;
polling the fundus image for a plurality of target location points to obtain a set of data points for each target location point.
Further, the processor 1001 may call the refractive information measurement program stored in the memory 1005, and also perform the following operations:
acquiring training sample data, fitting the training sample data based on a candidate definition model, and selecting a target candidate definition model from the candidate definition model according to a fitting result;
and forming a prior model by using the target candidate definition model, and taking the model parameters of the prior model as diopter calculation functions of independent variables.
Further, the processor 1001 may call the refractive information measurement program stored in the memory 1005, and also perform the following operations:
acquiring shooting parameters of the training sample data;
and adjusting the current dioptric cartograph to shoot the training sample data according to the shooting parameters.
Further, the processor 1001 may call the refractive information measurement program stored in the memory 1005, and also perform the following operations:
acquiring an optical model and optical parameters of the current dioptric map instrument;
creating the candidate sharpness models, one or more, with the optical model and optical parameters.
Further, the processor 1001 may call the refractive information measurement program stored in the memory 1005, and also perform the following operations:
extracting a data point set of the training sample data, and fitting the data point set by using the definition model;
and determining a prior model according to the fitting result.
Referring to fig. 2, fig. 2 is a schematic flow chart of a first embodiment of the refractive information measuring method of the present invention, which includes:
step S10, acquiring fundus images of the current human eye to be detected and shooting parameters of the fundus images, wherein the number of the fundus images is multiple;
the method comprises the steps that according to the refractive information detection requirement of a current user, fundus images of the user are shot through fundus image shooting equipment, the fundus image shooting equipment refers to a refraction map instrument, when the fundus images of the user are shot through the refraction map instrument, fundus image shooting operation is rapidly completed within a typical blinking period of 250ms of human eyes, the shot fundus images need to be shot according to shooting parameters set by the fundus image shooting equipment, the fundus images of the current human eyes to be detected need to be shot, when the shooting parameters of the fundus image shooting equipment are set, the shooting parameters comprise a detection surface position and an image space focal length, and when the shooting parameters are different parameter values of a plurality of parameters, the plurality of fundus images are shot. When the fundus images are taken by setting the shooting parameters of the current fundus image shooting device, the relationship between the fundus images taken by the shooting parameters can be seen in fig. 4, fig. 4 is a schematic view of an optical system for acquiring the fundus images, and as shown in fig. 4, by changing the shooting parameters of the fundus imaging optical system, fundus images of different blurring degrees can be obtained on the detection surface. The specific principle is as follows: p is a point of the eye fundus of the eye to be measured, a corresponding conjugate image point P' exists in the image space of the imaging light path of the eye fundus camera, and when the conjugate image point is just positioned on the image detection surface, the clearest eye fundus image, namely the clearest eye fundus image, can be obtained in the point P and the smaller neighborhood. 
When the detection plane position L is not changed and only the image side focal length f of the fundus camera is changed, the conjugate point P 'of the point P may deviate from the detection plane, the image of the point P on the detection plane may become blurred, resulting in a blurred image P ", and the sharpness at the point P ″ on the image may be reduced compared to the image at the conjugate point P'. The greater the change in the image side focal length f, the lower the sharpness at P ". When the image side focal length f is kept constant and only the position L of the detection plane is changed, the detection plane may be deviated from the conjugate point P 'of the point P, the image of the point P on the detection plane may become blurred, resulting in a blurred image P ", and the sharpness at the point P ″ on the image may be reduced compared to the image at the conjugate point P'. The greater the change in the position L of the detection surface, the lower the sharpness at P ".
Step S20, calculating the fundus image definition of the fundus image, and extracting a data point set of the fundus image according to the fundus image definition and the shooting parameters;
in this embodiment, when the definition of the fundus image is calculated, a scheme for calculating the definition of the fundus image by defining target position points of the fundus image, that is, the definition of the fundus image is calculated, and a data point set of the fundus image is extracted using the definition of the fundus image and the shooting parameters, including:
confirming target position points with definition to be calculated in the fundus image, and calculating the definition of the fundus image by using the target position points, wherein the number of the target position points is multiple;
and extracting the definition and shooting parameters of the target position point to generate a data point set of the fundus image.
According to the currently shot fundus image, when the definition of the fundus image is calculated, a target position point of the fundus image for calculating the definition needs to be confirmed. When the target is confirmed to be a fulcrum, the dimension MxN of the photographed fundus image is selected, and a target position point for calculating definition is selected based on the matrix of the MxN dimension, and the method for selecting the target position point can be selected as follows:
a. all MxN points are used as target position points;
b. selecting position points at constant intervals, for example, selecting a target position point every 10 in the transverse direction and the longitudinal direction;
c. several points are selected as target location points in the two-dimensional matrix at unequal intervals.
As described above, based on the target position point currently selected in the fundus image, each of the selected position points has coordinates corresponding to the MxN matrix, and the coordinates of the target position point can be defined as (x, y), since the fundus image is based on the matrix of MxN dimensions; as described above, when there are a plurality of fundus images from the currently confirmed target position points of the fundus image, the resolution of each fundus image based on the selected target position point is calculated sequentially from the 1 st to the Q th fundus images, and the method is specifically implemented as follows: q (Q is 1, 2, …, Q) th picture I q The sharpness based on the target location point (x, y) on the image can be calculated. The definition adopts a calculation method in the field of image processing, such as gradient, gray variance, entropy function and the like, and can select a proper calculation method and calculation parameters according to a specific device and an application scene.
Since the data of the fundus image is excessively large, in view of the efficiency of the sharpness calculation, the sharpness calculation range may be defined based on the confirmed target position point to improve the sharpness calculation efficiency, that is, the step of calculating the sharpness of the fundus image with the confirmed target position point includes:
defining a preset neighborhood range by taking the target position point as a center;
and calculating the definition of the fundus image according to the preset neighborhood range.
According to the currently determined target position point (x, y) based on each fundus image, selecting a proper neighborhood range by taking the target position point as a center, and calculating the definition in the neighborhood range, wherein the definition of the target position point is marked as C q (x,y)。
When capturing the data point set of the fundus image according to the calculated definition of the fundus image, capturing the data point set by using the target position point as a reference, namely, the step of extracting the definition and the shooting parameters of the target position point to generate the data point set of the fundus image comprises the following steps:
confirming definition data of the target position point based on the shooting parameters, forming a data point of the target position point by the shooting parameters and the definition data of the target position point on each image, and forming a data point set by data points obtained by the shot images at the target position point;
polling the fundus image for a plurality of target location points to obtain a set of data points for each target location point.
Confirming definition data based on a target position point in the fundus image according to the calculated definition of the current fundus image, obtaining shooting parameters of the target position point on the definition data, namely corresponding shooting parameters of the definition of the target position point, obtaining a data point of the target position point according to the definition of the target position point and the shooting parameters, and forming a data point set of the target position point based on the definition and the shooting parameters of the target position point on all fundus images under the condition that the fundus images are multiple. In the event that multiple target location points are included in the fundus image, polling a set of data points for the multiple target location points in the fundus image generates a set of data points for the target location points with a set of data points for all target location points in the fundus image.
Step S30, inputting the extracted data point set into a prior model, and estimating corresponding model parameters in the prior model;
the method comprises the steps of inputting a data point set to a prior model according to the data point set of a currently extracted fundus image, estimating corresponding model parameters from the prior model by the data point set according to a data structure of the prior model, wherein the prior model comprises data parameters applied by various data structures, generally, the data parameters applied by the data structure based on the prior model are defined as the model parameters, the estimation operation of estimating the corresponding model parameters based on the data point set is used for fitting the data point set of the fundus image based on the data structure in the prior model, and the model parameters corresponding to the data point set are determined according to the fitting result. In addition, when the model parameters corresponding to the data point set are determined according to the fitting result, and when a plurality of model parameters are available in the prior model, the model parameter with the optimal fitting result is used as the model parameter corresponding to the data point set, and the optimal fitting result is defined as the fitting result with the minimum deviation as the optimal fitting result.
Step S40, substituting the model parameters into a diopter calculation function to calculate diopter information of the fundus image.
Substituting the model parameters into a diopter calculation function according to the model parameters determined by the data point set of the current fundus image, and calculating the diopter information of the fundus image by using the diopter calculation function, wherein the diopter calculation function is a model of an imaging optical system formed on the basis of the model parameters and is used for expressing the corresponding diopter of the light vergence of the fundus retina at the exit position of the outer side of the cornea.
In the embodiment, in a small number of currently acquired fundus images with different definitions, the refractive information of the fundus images is calculated according to the data model parameters in the prior model, so that the image acquisition time is shortened, and the eye refractive information measurement precision is also improved.
Further, referring to fig. 3, fig. 3 is a schematic flow chart of a second embodiment of the refractive information measuring method according to the present invention, and based on the first embodiment shown in fig. 2, the refractive information measuring method further includes:
step S50, training sample data is obtained, fitting is carried out on the training sample data based on the candidate definition model, and a target candidate definition model is selected from the candidate definition model according to a fitting result;
and step S60, forming a prior model by using the target candidate definition model, and taking the model parameters of the prior model as diopter calculation functions of independent variables.
In this embodiment, a refraction information calculation model based on a current refraction map instrument, that is, a prior model, is created based on application of the current refraction map instrument, in the prior model, a data model parameter set of the prior model is related to an optical system parameter for shooting a refraction topographic map, so that training sample data shot by the current refraction topographic map is obtained, the training sample data is fundus images obtained based on a plurality of training entities, and in consideration of diversity of the training sample data, shooting parameters of the training sample data during shooting are different, and therefore, before the step of obtaining the training sample data, the method further includes:
acquiring shooting parameters for the training sample data;
and adjusting the current dioptric topographer to capture the training sample data with the shooting parameters.
The training sample data required for establishing the current prior model are determined according to the establishment requirements of that model, and the shooting parameters are confirmed on the basis of the training sample data. The shooting parameters are the data items that directly influence the definition of the training sample data; they can be obtained from a large amount of experimental data, or continuously updated and optimized according to the current data requirements, so as to obtain training sample data with various degrees of definition. The current dioptric topographer is therefore adjusted to capture the training sample data with the obtained shooting parameters. Because the training sample data must cover multiple definitions, there are multiple shooting parameters, including the position of the detection surface, the image-space focal distance and other optical parameters that may affect image definition. When there are several shooting parameters, a single parameter may be varied, or several of them may be varied simultaneously. When the dioptric topographer captures the training sample data, its device parameters are set to values consistent with the shooting parameters. A shooting-parameter condition can further be defined, namely a point in the shooting-parameter space formed by the several shooting parameters.
Meanwhile, when the training sample data are captured with these shooting parameters, the capture can be performed on the eyes of subjects. For model generalization, eye data of a plurality of subjects can be collected; when the training sample data are captured from the eyes of a single subject, a sufficient sampling density in the shooting-parameter space must be ensured, at least covering the sample points near the definition extremum.
After the training sample data are obtained, they are fitted with the preset candidate definition models, and the definition model that can form the prior model is obtained from the fitting result. The definition model is a data structure built on the law governing image definition and on the device parameters of the dioptric topographer. That is, before the step of forming the prior model from the target candidate definition model, the method further includes:
acquiring the optical model and optical parameters of the current dioptric topographer;
and creating one or more candidate definition models from the optical model and the optical parameters.
According to the imaging law of the current dioptric topographer when capturing fundus images, the shooting parameters involved include the position of the detection surface, the image-space focal distance and possibly other optical parameters that affect definition. Changing the shooting parameters changes the definition that is presented. Ideally, the optical-system model and the physical action of its parameters determine the functional relation between definition and shooting parameters, i.e. the definition model; see fig. 5, which is a schematic diagram of the definition model. As shown in fig. 5, on two detection planes at the same distance from the focal plane at μ (e.g. v_i and v_j), the radii of the blurred image are equal, and in the ideal case the image definition on detection planes v_i and v_j is also equal. The relation between image definition and detection-surface position can therefore be expressed by a relatively simple symmetric definition model, such as a Gaussian model, an even-order polynomial function model or a cosine function model.
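The symmetric candidate definition models named above can be sketched as simple functions of the shooting parameter L; the parameterizations below are illustrative choices, each symmetric about the peak position μ:

```python
import numpy as np

def gaussian_model(L, A, mu, sigma):
    """Gaussian candidate: definition peaks at shooting parameter L = mu."""
    return A * np.exp(-(L - mu) ** 2 / (2 * sigma ** 2))

def even_poly_model(L, A, mu, c2, c4):
    """Even-order polynomial candidate, symmetric about L = mu."""
    d = L - mu
    return A - c2 * d ** 2 - c4 * d ** 4

def cosine_model(L, A, mu, w):
    """Cosine candidate, symmetric about L = mu (valid near the peak)."""
    return A * np.cos(w * (L - mu))
```

Each candidate takes its maximum A at L = μ, matching the symmetric definition-versus-detection-plane behavior described above.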
For example, the Gaussian model can be expressed as

S(L) = A · exp(−(L − μ)² / (2σ²))
In the expression shown above, L is the shooting parameter and A, μ, σ are the model parameters. The physical meaning of the model parameter μ, which corresponds to the maximum definition, is that the conjugate point P′ of the point P lies exactly on the detection plane. In the optical system in which the model is located, the position μ corresponds to a specific human-eye diopter; that is, the model parameter μ can be combined with the imaging optical system and its optical parameters to calculate the corresponding human-eye diopter.
In practice, however, owing to optical-system machining and alignment errors, optical interference, electronic noise and the like, the definition at each data point may contain fixed interference and random disturbance; the dashed part of fig. 5 expresses the presence of this interference. Because these fixed and random disturbances prevent the true definition model from being obtained directly, a plurality of sample data points must be acquired to identify the definition model. That is, several candidate definition models based on the optical system and the shooting parameters are created, and the candidate definition models are used to fit and identify the definition model from the training sample data. The step of generating the prior model from the fitting result of the training sample data and the definition model accordingly includes:
extracting a data point set of the training sample data, and fitting the data point set by using the definition model;
and determining a prior model according to the fitting result.
A data point set of the training sample data is extracted, where the data point set is the set of the definitions of a target position point of the fundus image together with the shooting parameters. The data point set is fitted with the created candidate definition models, the target candidate definition model is determined from the fitting result, and the data structure of the target candidate definition model is extracted to form the prior model.
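As a sketch of the fitting step for the Gaussian candidate, the data point set can be fitted in closed form by a log-domain quadratic least-squares fit (one possible estimator; the patent does not specify the identification algorithm):

```python
import numpy as np

def fit_gaussian_sharpness(L, S):
    """Fit S(L) = A * exp(-(L - mu)^2 / (2 sigma^2)) to a data point set.

    Taking logs turns the Gaussian into a quadratic in L, which np.polyfit
    solves by least squares. Requires S > 0 and a concave log-fit (c2 < 0).
    """
    L = np.asarray(L, dtype=float)
    S = np.asarray(S, dtype=float)
    c2, c1, c0 = np.polyfit(L, np.log(S), 2)   # log S ≈ c2*L² + c1*L + c0
    mu = -c1 / (2.0 * c2)                      # peak position
    sigma = np.sqrt(-1.0 / (2.0 * c2))
    A = np.exp(c0 - c1 ** 2 / (4.0 * c2))      # peak definition
    return A, mu, sigma
```

On noise-free Gaussian data this recovers (A, μ, σ) exactly; with the fixed interference and random disturbances discussed above, the residual of each candidate's fit can serve as the criterion for selecting the target candidate definition model.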
According to the technical content shown above, after the prior model is generated from the candidate definition model, a diopter calculation function taking the model parameters of the prior model as independent variables is generated. Since the prior model is obtained from the definition model, the model parameters A₀, μ₀, σ₀ corresponding to the maximum definition S_max of the definition model are solved, and the model parameters A₀, μ₀, σ₀ are combined with the model of the imaging optical system to express the diopter of the vergence of light from the point to be measured on the fundus retina at its exit on the outer side of the cornea, i.e. the human-eye diopter calculation function

D = f(A₀, μ₀, σ₀),

where f is determined by the model of the imaging optical system and its optical parameters.
Furthermore, an embodiment of the present invention further provides a computer-readable storage medium, on which a refractive information measurement program is stored, where the refractive information measurement program, when executed by a processor, implements the following operations:
acquiring fundus images of the human eye currently to be measured and the shooting parameters of the fundus images, wherein there are a plurality of fundus images;
calculating the definition of the fundus image, and extracting a data point set of the fundus image according to the definition of the fundus image and the shooting parameters;
inputting the extracted data point set into a prior model, and estimating corresponding model parameters in the prior model;
and substituting the model parameters into a diopter calculation function to calculate the diopter information of the fundus image.
Further, the refractive information measurement program when executed by the processor further performs the following operations:
confirming the target position points in the fundus image whose definition is to be calculated, and calculating the definition of the fundus image at the target position points, wherein there are a plurality of target position points;
and extracting the definition and shooting parameters of the target position point to generate a data point set of the fundus image.
Further, the refractive information measurement program when executed by the processor further performs the following operations:
defining a preset neighborhood range by taking the target position point as a center;
and calculating the definition of the fundus image according to the preset neighborhood range.
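The neighborhood-based definition computation can be sketched as follows; the square window and the gradient-energy metric are illustrative choices, as the text does not commit to a particular definition measure:

```python
import numpy as np

def neighborhood_sharpness(image, point, radius=8):
    """Definition of `image` in a square neighborhood centred on `point`
    (row, col), measured as the mean gradient energy over the patch."""
    r, c = point
    patch = image[max(r - radius, 0):r + radius + 1,
                  max(c - radius, 0):c + radius + 1].astype(float)
    gy, gx = np.gradient(patch)            # finite-difference gradients
    return float(np.mean(gx ** 2 + gy ** 2))
```

A sharp structure inside the neighborhood yields high gradient energy, while a defocused (blurred) patch yields a value near zero.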
Further, the refractive information measurement program when executed by the processor further performs the following operations:
confirming definition data of the target position point based on the shooting parameters, forming a data point of the target position point by using the shooting parameters and the definition data of the target position point of the fundus image, and forming the data point set based on a plurality of data points obtained by the fundus image at the target position point;
and polling a plurality of target position points of the fundus image to obtain the data point set of each target position point.
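Polling every target position point across the captured fundus images to assemble the per-point data point sets might look like this sketch, where `sharpness_fn` stands in for whatever local definition measure is used:

```python
def build_data_point_sets(images, shooting_params, target_points, sharpness_fn):
    """Return {target point: [(shooting parameter, definition), ...]},
    polling each target position point across all fundus images."""
    point_sets = {p: [] for p in target_points}
    for img, L in zip(images, shooting_params):
        for p in target_points:                 # poll every target point
            point_sets[p].append((L, sharpness_fn(img, p)))
    return point_sets
```

Each target position point then owns its own (shooting parameter, definition) series, ready to be fitted by a candidate definition model.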
Further, the refractive information measurement program when executed by the processor further performs the following operations:
acquiring training sample data, fitting the training sample data based on a candidate definition model, and selecting a target candidate definition model from the candidate definition model according to a fitting result;
and forming a prior model by using the target candidate definition model, and taking the model parameter of the prior model as the diopter calculation function of an independent variable.
Further, the refractive information measurement program when executed by the processor further performs the following operations:
acquiring shooting parameters of the training sample data;
and adjusting the current dioptric topographer to capture the training sample data with the shooting parameters.
Further, the refractive information measurement program when executed by the processor further performs the following operations:
acquiring the optical model and optical parameters of the current dioptric topographer;
and creating one or more candidate definition models from the optical model and the optical parameters.
Further, the refractive information measurement program when executed by the processor further performs the following operations:
extracting a data point set of the training sample data, and fitting the data point set by using the definition model;
and determining a prior model according to the fitting result.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or system that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solution of the present invention or the portions contributing to the prior art may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) as described above and includes several instructions for enabling a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (7)

1. A refractive information measurement method, characterized in that the refractive information measurement method comprises the following steps:
acquiring fundus images of the human eye currently to be measured and the shooting parameters of the fundus images, wherein there are a plurality of fundus images;
calculating the definition of the fundus image, and extracting a data point set of the fundus image according to the definition of the fundus image and the shooting parameters;
inputting the extracted data point set into a prior model, and estimating corresponding model parameters in the prior model;
substituting the model parameters into a diopter calculation function to calculate the diopter information of the fundus image;
wherein the refractive information measurement method further comprises:
acquiring the optical model and optical parameters of the current dioptric topographer;
creating one or more candidate definition models by using the optical model and the optical parameters;
acquiring training sample data, fitting the training sample data based on the candidate definition model, and selecting a target candidate definition model from the candidate definition model according to a fitting result;
extracting a data point set of the training sample data, and fitting the data point set by using the target candidate definition model;
and determining a prior model according to the fitting result, and taking the model parameters of the prior model as diopter calculation functions of independent variables.
2. The refractive information measurement method of claim 1, wherein the step of calculating the definition of the fundus image, and extracting the data point set of the fundus image from the definition of the fundus image and the shooting parameters, comprises:
confirming the target position points in the fundus image whose definition is to be calculated, and calculating the definition of the fundus image at the target position points, wherein there are a plurality of target position points;
and extracting the definition and shooting parameters of the target position point to generate a data point set of the fundus image.
3. The refractive information measurement method of claim 2, wherein the step of calculating the definition of the fundus image from the target position point comprises:
defining a preset neighborhood range by taking the target position point as a center;
and calculating the definition of the fundus image according to the preset neighborhood range.
4. The refractive information measurement method of claim 2, wherein the step of extracting the definition and shooting parameters of the target position point to generate the data point set of the fundus image comprises:
confirming definition data of the target position point based on the shooting parameters, forming a data point of the target position point by using the shooting parameters and the definition data of the target position point of the fundus image, and forming the data point set based on a plurality of data points obtained by the fundus image at the target position point;
and polling a plurality of target position points of the fundus image to obtain the data point set of each target position point.
5. The refractive information measurement method of claim 1, wherein the step of acquiring training sample data is preceded by:
acquiring shooting parameters of the training sample data;
and adjusting the current dioptric topographer to capture the training sample data with the shooting parameters.
6. A refractive information measuring apparatus, characterized by comprising: a memory, a processor, and a refractive information measurement program stored on the memory and executable on the processor, the refractive information measurement program when executed by the processor implementing the steps of the refractive information measurement method of any one of claims 1 to 5.
7. A computer-readable storage medium, characterized in that the computer-readable storage medium includes a refractive information measurement program stored in the computer-readable storage medium, the refractive information measurement program when executed implementing the steps of the refractive information measurement method according to any one of claims 1 to 5.
CN202110694938.8A 2021-06-22 2021-06-22 Method and device for measuring refractive information and computer readable storage medium Active CN114202499B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110694938.8A CN114202499B (en) 2021-06-22 2021-06-22 Method and device for measuring refractive information and computer readable storage medium


Publications (2)

Publication Number Publication Date
CN114202499A CN114202499A (en) 2022-03-18
CN114202499B true CN114202499B (en) 2022-09-09

Family

ID=80645804

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110694938.8A Active CN114202499B (en) 2021-06-22 2021-06-22 Method and device for measuring refractive information and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN114202499B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108335757A (en) * 2018-02-05 2018-07-27 王雁 A method of diopter adjusted value in prediction SMILE operations
CN109008937A (en) * 2018-07-26 2018-12-18 上海鹰瞳医疗科技有限公司 Method for detecting diopter and equipment

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020501191A (en) * 2016-12-26 2020-01-16 シェンジェン ロイオル テクノロジーズ カンパニー リミテッドShenzhen Royole Technologies Co., Ltd. Optical system and diopter adjustment method
CN111358421B (en) * 2020-03-16 2021-02-09 深圳盛达同泽科技有限公司 Dioptric pattern generation method and device and computer-readable storage medium
CN112599244A (en) * 2020-12-16 2021-04-02 温州医科大学 Intraocular lens refractive power calculation system based on machine learning


Also Published As

Publication number Publication date
CN114202499A (en) 2022-03-18

Similar Documents

Publication Publication Date Title
JP6894925B2 (en) Lens meter without fixtures and how to operate it
CN109740491B (en) Human eye sight recognition method, device, system and storage medium
CN107357429B (en) Method, apparatus, and computer-readable storage medium for determining gaze
CN104809424B (en) Method for realizing sight tracking based on iris characteristics
CN113227747B (en) There is not fixing device to examine mirror appearance system
CN113939851A (en) Method and system for estimating eye-related geometrical parameters of a user
EP3270098B2 (en) Measurement system for eyeglasses-wearing parameter, measurement program, measurement method therefor, and manufacturing method for eyeglasses lens
CN105748033A (en) Image Processing Apparatus And Image Processing Method
CN111344222A (en) Method of performing an eye examination test
JP2014128367A (en) Image processor
US20200124496A1 (en) Fixtureless lensmeter system
KR101938361B1 (en) Method and program for predicting skeleton state by the body ouline in x-ray image
CN111358421B (en) Dioptric pattern generation method and device and computer-readable storage medium
JP5719216B2 (en) Gaze measurement apparatus and gaze measurement program
CN114202499B (en) Method and device for measuring refractive information and computer readable storage medium
CN113723293A (en) Sight direction determination method and device, electronic equipment and storage medium
CN110598652B (en) Fundus data prediction method and device
CN110766631A (en) Face image modification method and device, electronic equipment and computer readable medium
CN114189623B (en) Light field-based refraction pattern generation method, device, equipment and storage medium
EP4185184A1 (en) Method for determining a coronal position of an eye relative to the head
JP2023533839A (en) Method and system for evaluating human vision
JP2018049566A (en) Image processing apparatus, image processing method, and program
CN112674714A (en) Mobile phone image examination optometry method combining filter and peripheral equipment
JP6419249B2 (en) Image processing apparatus, image processing method, and image processing program
JP2015123262A (en) Sight line measurement method using corneal surface reflection image, and device for the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant