WO2021184452A1 - Refractive pattern generation method, device and computer-readable storage medium - Google Patents

Refractive pattern generation method, device and computer-readable storage medium

Info

Publication number
WO2021184452A1
WO2021184452A1 PCT/CN2020/083552 CN2020083552W WO2021184452A1 WO 2021184452 A1 WO2021184452 A1 WO 2021184452A1 CN 2020083552 W CN2020083552 W CN 2020083552W WO 2021184452 A1 WO2021184452 A1 WO 2021184452A1
Authority
WO
WIPO (PCT)
Prior art keywords
refractive
sharpness
fundus image
generating
target
Prior art date
Application number
PCT/CN2020/083552
Other languages
English (en)
French (fr)
Inventor
王宁利
冬雪川
黄叶权
崔焱
郭静云
Original Assignee
深圳盛达同泽科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳盛达同泽科技有限公司 filed Critical 深圳盛达同泽科技有限公司
Priority to CN202080069140.8A priority Critical patent/CN114980800B/zh
Priority to CA3154850A priority patent/CA3154850A1/en
Priority to EP20925559.5A priority patent/EP4023141A4/en
Publication of WO2021184452A1 publication Critical patent/WO2021184452A1/zh
Priority to US17/717,289 priority patent/US20220233070A1/en

Links

Images

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/103Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining refraction, e.g. refractometers, skiascopes
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/02Subjective types, i.e. testing apparatus requiring the active assistance of the patient
    • A61B3/028Subjective types, i.e. testing apparatus requiring the active assistance of the patient for testing visual acuity; for determination of refraction, e.g. phoropters
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/12Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for looking at the eye fundus, e.g. ophthalmoscopes
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/14Arrangements specially adapted for eye photography

Definitions

  • the present invention relates to the technical field of refractive graph generation, and in particular to a method, device and computer-readable storage medium for generating refractive graphs.
  • an ordinary refractometer measures only the refractive power of the macular region of the human eye, whereas a refractive topography instrument can measure the refractive power of a large-field-of-view fundus region of the human eye.
  • previously, ophthalmoscopes were used to obtain large-field fundus refractive information by manually performing retinoscopy at different viewing angles of the human eye. This method is time-consuming and labor-intensive and can only obtain the horizontal and vertical fundus refractive distributions, which cannot meet the technical requirements of current human-eye examination.
  • the main purpose of the present invention is to provide a method, device and computer-readable storage medium for generating refractive patterns, aiming to solve the technical problem that the prior-art approach of obtaining large-field fundus refractive information with an ophthalmoscope and manual retinoscopy at different viewing angles of the human eye is extremely time-consuming and labor-intensive and yields only the horizontal and vertical fundus refractive distributions, which cannot meet detection requirements.
  • to this end, the present invention provides a method for generating a refractive pattern, which is applied to a refractive topography instrument, and the refractive pattern generation method includes the following:
  • a refractive matrix is generated according to the acquired refractive information, and a refractive graph is generated through the refractive matrix.
  • before the step of capturing a fundus image of the human eye currently to be tested and obtaining refractive information corresponding to the fundus image, the method further includes:
  • the refractive scan range includes a dynamic range or a fixed range.
  • before the step of calculating the sharpness of the fundus image and generating the sharpness sequence of the fundus image according to the calculated sharpness, the method further includes:
  • the step of calculating the sharpness of the fundus image is performed.
  • the method further includes:
  • the step of calculating the sharpness of the fundus image is performed through the registered fundus image.
  • the step of calculating the sharpness of the fundus image and generating the sharpness sequence of the fundus image according to the calculated sharpness includes:
  • the step of calculating the sharpness of the fundus image based on the confirmed target position point includes:
  • the step of confirming the target definition in the generated definition sequence and obtaining the refractive compensation value of the fundus image corresponding to the target definition includes:
  • the outliers of the sharpness sequence are eliminated, and the target sharpness is confirmed according to the sharpness sequence from which the outliers have been eliminated.
  • the method further includes:
  • the step of confirming the outliers in the clarity sequence includes:
  • if the sharpness is less than or equal to the preset sharpness threshold, it is confirmed that the sharpness is an outlier.
  • the method further includes:
  • the step of confirming the target definition in the generated definition sequence and obtaining the refractive compensation value of the fundus image corresponding to the target definition includes:
  • the target sharpness is defined as the maximum sharpness.
  • before the step of capturing the fundus image of the human eye currently to be tested and obtaining the refractive compensation value corresponding to the fundus image, where there are multiple fundus images, the method further includes:
  • the shooting threshold range of the fundus image is set based on the refractive value.
  • the step of generating a refractive matrix according to the acquired refractive information, and generating a refractive graph through the refractive matrix includes:
  • the refractive information is written into a preset matrix in a preset format, and the refractive matrix is generated according to the preset matrix in which the refractive information is written.
  • the step of generating a refractive matrix according to the acquired refractive information, and generating a refractive graph through the refractive matrix includes:
  • the generation format is used to generate the corresponding refractive pattern from the refractive matrix.
  • the type of refractive pattern includes a point map, a block map, a three-dimensional map, a statistical map, a simulated visual map, a naked eye defocus curve, and a simulated curve diagram of wearing a lens.
  • the present invention also provides a refraction figure generating device, the refraction figure generating device includes: a memory, a processor, the memory stores a computer program that can be called by the processor, and When the computer program is executed by the processor, the steps of the method for generating a refractive figure as described above are realized.
  • the present invention also provides a computer-readable storage medium storing a refractive pattern generation program which, when executed by a processor, implements the steps of the method for generating a refractive pattern as described above.
  • An embodiment of the present invention provides a method for generating a refractive pattern: capture fundus images of the current human eye to be tested and obtain the refractive information corresponding to the fundus images, where there are multiple fundus images; calculate the sharpness of the fundus images and generate a sharpness sequence of the fundus images from the calculated sharpness; confirm the target sharpness values in the generated sharpness sequence and obtain the refractive information of the fundus images corresponding to the target sharpness values, where there are multiple target sharpness values; generate a refractive matrix according to the acquired refractive information, and generate a refractive pattern from the refractive matrix.
  • the present application determines the target refractive information from the mapping relationship between refractive information and sharpness to form a refractive matrix, which realizes a one-pass measurement of the refractive information of the entire fundus area in a short time and improves measurement efficiency and measurement accuracy.
  • Figure 1 is a schematic diagram of the terminal/device structure of the hardware operating environment involved in the solution of the embodiment of the present invention
  • FIG. 2 is a schematic flowchart of the first embodiment of the method for generating refractive graphs according to the present invention
  • Figure 3 is a schematic diagram of the coordinate direction of the refractive compensation value
  • Figure 4 is a schematic diagram of clarity based on target location points
  • Figure 5 is a schematic diagram of target clarity confirmation in the clarity sequence.
  • the main solution of the embodiment of the present invention is to: capture fundus images of the human eye currently to be tested and obtain the refractive information corresponding to the fundus images, where there are multiple fundus images; calculate the sharpness of the fundus images and generate a sharpness sequence of the fundus images from the calculated sharpness; confirm the target sharpness values in the generated sharpness sequence and obtain the refractive information of the fundus images corresponding to the target sharpness values, where there are multiple target sharpness values; generate a refractive matrix according to the acquired refractive information, and generate a refractive pattern from the refractive matrix.
  • the present invention provides a solution.
  • the target refractive information is determined by the mapping relationship between refractive information and sharpness to form a refractive matrix, which realizes a one-pass measurement of the refractive information of the entire fundus area in a short time and improves measurement efficiency and measurement accuracy.
  • Fig. 1 is a schematic diagram of a terminal structure of a hardware operating environment involved in a solution of an embodiment of the present invention.
  • the terminal in the embodiment of the present invention may be a portable or non-removable terminal device such as a PC, a smart phone, a tablet computer, an e-book reader, and a portable computer.
  • the terminal may include: a processor 1001, such as a CPU, a network interface 1004, a user interface 1003, a memory 1005, and a communication bus 1002.
  • the communication bus 1002 is used to implement connection and communication between these components.
  • the user interface 1003 may include a display screen (Display) and an input unit such as a keyboard (Keyboard), and the optional user interface 1003 may also include a standard wired interface and a wireless interface.
  • the network interface 1004 may optionally include a standard wired interface and a wireless interface (such as a WI-FI interface).
  • the memory 1005 may be a high-speed RAM memory or a stable non-volatile memory, such as a magnetic disk memory.
  • the memory 1005 may also be a storage device independent of the aforementioned processor 1001.
  • the terminal structure shown in FIG. 1 does not constitute a limitation on the terminal, which may include more or fewer components than shown in the figure, combine certain components, or use a different arrangement of components.
  • the memory 1005 which is a computer storage medium, may include an operating system, a network communication module, a user interface module, and a refractive image generation program.
  • the network interface 1004 is mainly used to connect to a back-end server and communicate with the back-end server;
  • the user interface 1003 is mainly used to connect to a client (user side) to communicate with the client;
  • the processor 1001 can be used to call the refractive image generation program stored in the memory 1005 and perform the following operations:
  • a refractive matrix is generated according to the acquired refractive information, and a refractive graph is generated through the refractive matrix.
  • processor 1001 may call the refractive image generation program stored in the memory 1005, and also perform the following operations:
  • processor 1001 may call the refractive image generation program stored in the memory 1005, and also perform the following operations:
  • processor 1001 may call the refractive pattern generation program stored in the memory 1005, and also perform the following operations:
  • processor 1001 may call the refractive image generation program stored in the memory 1005, and also perform the following operations:
  • the outliers of the sharpness sequence are eliminated, and the target sharpness is confirmed according to the sharpness sequence from which the outliers have been eliminated.
  • processor 1001 may call the refractive image generation program stored in the memory 1005, and also perform the following operations:
  • if the sharpness is less than or equal to the preset sharpness threshold, it is confirmed that the sharpness is an outlier.
  • processor 1001 may call the refractive image generation program stored in the memory 1005, and also perform the following operations:
  • processor 1001 may call the refractive image generation program stored in the memory 1005, and also perform the following operations:
  • processor 1001 may call the refractive image generation program stored in the memory 1005, and also perform the following operations:
  • the target sharpness is defined as the maximum sharpness.
  • processor 1001 may call the refractive image generation program stored in the memory 1005, and also perform the following operations:
  • the shooting threshold range of the fundus image is set based on the refractive value.
  • processor 1001 may call the refractive image generation program stored in the memory 1005, and also perform the following operations:
  • the refractive information is written into a preset matrix in a preset format, and the refractive matrix is generated according to the preset matrix in which the refractive information is written.
  • processor 1001 may call the refractive image generation program stored in the memory 1005, and also perform the following operations:
  • the generation format is used to generate the corresponding refractive pattern from the refractive matrix.
  • processor 1001 may call the refractive image generation program stored in the memory 1005, and also perform the following operations:
  • refractive graphs include point maps, block graphs, three-dimensional graphs, statistical graphs, simulated vision graphs, naked eye defocus curves, and simulated curve graphs of wearing lenses.
  • Fig. 2 is a schematic flow chart of the first embodiment of the method for generating a refractive figure according to the present invention.
  • the method for generating a refractive figure is applied to a refractive topography instrument, and the method for generating a refractive figure includes:
  • Step S10 taking a fundus image of the human eye currently to be tested, and acquiring refractive information corresponding to the fundus image, where there are multiple fundus images;
  • according to the current human-eye examination requirement, the fundus image of the current human eye to be tested is captured.
  • the fundus image may be captured by a refractive topography instrument, a fundus camera, or another control device connected to the refractive topography instrument / fundus camera, and the refractive information corresponding to the fundus image is obtained from the captured fundus image.
  • in practical applications, when the fundus image is captured, the fundus camera uses a motor to move the relative position of the focusing optical module and captures the fundus image after this refractive adjustment.
  • the moving distance of the focusing optical module corresponds to a certain range of refractive compensation values, that is, the refractive scan range.
  • the refractive scan range may be a fixed range or a dynamic range.
  • the refractive scan range may be defined based on refractive information previously verified for the patient to be tested, and the dynamic refractive range corresponding to the fundus image is then set based on that refractive information.
  • that is, before the step of capturing the fundus image of the current human eye to be tested and obtaining the refractive compensation value corresponding to the fundus image, the method further includes:
  • the shooting threshold range of the fundus image is set based on the refractive value.
  • the refractive value of the human eye to be tested is acquired, so as to pre-set the shooting threshold range of the fundus image based on the refractive value, that is, the refractive scan range.
  • the refractive scan range may be a fixed range or a dynamic range, so that the fundus camera captures the fundus image according to the set shooting threshold range, that is, the refractive scan range.
  • the fundus refractive scan performed by the focusing optical module when capturing the fundus images is carried out at equal intervals of fixed refractive power, or at non-uniform refractive intervals within a certain range; thus, based on the captured fundus images, there are multiple fundus images.
  • when the refractive information corresponding to the fundus image is acquired, since the fundus image is captured within a certain refractive scan range, the refractive information corresponding to the fundus image can be derived from the set refractive scan range; in this way, the refractive information corresponding to the fundus image is determined from the specific refractive scan range value at which the fundus image was captured.
  • Step S20 Calculate the sharpness of the fundus image, and generate a sharpness sequence of the fundus image according to the calculated sharpness;
  • the sharpness calculation method and calculation parameters, such as gradient, gray-level variance or entropy function, can be selected according to the capture device and application scene and used to calculate the corresponding sharpness from the fundus image.
  • based on the refractive threshold range within which the fundus image is currently captured and on the capture time, it is necessary to confirm whether the captured fundus image needs to be registered.
  • if the refractive scan corresponding to the currently defined refractive threshold range takes only a short time, so that the change in magnification of the fundus image relative to the fundus is small, the image registration operation of the fundus image may be omitted, and whether to perform registration may be decided based on a currently set preset registration time.
  • the relevant technical content of the fundus image obtained based on the refractive information and the calculation of the sharpness based on the fundus image all belong to the protection scope of the present invention. In this way, according to the calculated sharpness of the fundus image, in the case of multiple fundus images, there are multiple values of the sharpness.
  • the ordering of the sharpness sequence may follow the change of the refractive compensation value over the refractive threshold range within which the fundus image is currently captured, and the ordering may differ between coordinates.
  • the coordinate directions include the X, Y and Z directions, which correspond to different refractive compensation values; the coordinate definition is shown in Fig. 3, which is a schematic diagram of the coordinate directions of the refractive compensation value.
  • when calculating the sharpness of the fundus image, the sharpness can be calculated with reference to defined sharp points; that is, the step of calculating the sharpness of the fundus image and generating the sharpness sequence of the fundus image according to the calculated sharpness includes:
  • the target position points of the fundus image at which sharpness is to be calculated are selected based on the MxN dimensional matrix of the image. The target position points may be selected in the following ways:
  • a. all MxN points are used as target position points;
  • b. position points are selected at a constant interval, for example one target position point every 10 pixels horizontally and vertically;
  • c. several points are selected at non-uniform intervals in the two-dimensional matrix.
  • each selected position point has coordinates in the MxN matrix, and the coordinates of a target position point can be defined as (x, y); in this way, according to the currently confirmed target position points of the fundus image, when there are multiple fundus images, the sharpness of each fundus image is calculated in turn from the first to the Qth fundus image.
  • the sharpness is calculated with methods from the image-processing field, such as gradient, gray-level variance or entropy function, and a suitable calculation method and calculation parameters can be selected according to the specific device and application scene.
  • the step of calculating the sharpness of the fundus image through the confirmed target position points includes:
  • a suitable neighborhood range is selected with the target position point as its center, and the sharpness within the neighborhood range is calculated.
  • the sharpness of the target position point is denoted as Cq(x, y); the way the sharpness is confirmed over the neighborhood range can be as shown in Fig. 4, which is a schematic diagram of sharpness based on a target position point.
  • Step S30 confirming the target definition in the generated definition sequence, and acquiring refractive information of the fundus image corresponding to the target definition, where there are multiple target definitions;
  • the sharpness sequence formed from the calculated sharpness values is used to confirm the optimal sharpness in the sharpness sequence.
  • the optimal sharpness can be defined as the maximum value or the best sharpness, etc., and may be defined as the target sharpness. Among the currently captured fundus images, the sharpness calculated at the target position point (x, y) differs between the Q fundus images.
  • the target sharpness can be computed by taking the maximum of the sharpness values at the target position points over the Q fundus images and defining it as the target sharpness. Further, since there are multiple sharpness sequences at different positions, there are also multiple target sharpness values defined from the optimal sharpness; that is, there are multiple target sharpness values. The way the target sharpness is confirmed is shown in Figure 5, which is a schematic diagram of target sharpness confirmation in a sharpness sequence.
  • a fitting operation can be performed based on the sharpness sequence, and the target sharpness is confirmed from the sharpness sequence according to the fitting result.
  • that is, the step of confirming the target sharpness in the generated sharpness sequence and obtaining the refractive compensation value of the fundus image corresponding to the target sharpness includes:
  • the outliers of the sharpness sequence are eliminated, and the target sharpness is confirmed according to the sharpness sequence from which the outliers have been eliminated.
  • the operation of identifying outliers based on the clarity sequence can be determined by a preset clarity threshold, that is, the step of confirming the outliers in the clarity sequence includes:
  • if the sharpness is less than or equal to the preset sharpness threshold, it is confirmed that the sharpness is an outlier.
  • the outliers in the sharpness sequence can be confirmed by comparing each sharpness value in the sequence with a preset value: when a sharpness value is confirmed to be greater than the preset value, it is confirmed to be normal; when it is confirmed to be less than the preset value, it is confirmed to be an outlier and is removed from the sharpness sequence.
  • the sharpness sequence with outliers removed is defined as the vector Q'; the vector Q' is fitted with a polynomial function, and the target sharpness is confirmed according to the fitting result.
  • from the sharpness sequence calculated at the target position point (x, y) confirmed for the current fundus image, after the target sharpness has been confirmed from the sharpness sequence, the refractive information of the fundus image corresponding to the target sharpness is confirmed; that is, the refractive information is the refractive information at the time of capture of the fundus image corresponding to the target sharpness.
  • the refractive information can also be defined as the refractive compensation value, and a value obtained from the refractive compensation value by negation, linear transformation or some other function can be defined as the refractive information.
  • the refractive information can also be defined as fundus data based on the fundus image, such as visual acuity values, refractive values and other data related to the fundus image.
  • Step S40 generating a refractive matrix according to the acquired refractive information, and generating a refractive graph through the refractive matrix.
  • a refractive matrix is generated using the refractive information.
  • the refractive information may be written according to a preset matrix format to form the refractive matrix; that is, the step of generating a refractive matrix according to the acquired refractive information and generating a refractive pattern through the refractive matrix includes:
  • the refractive information is written into a preset matrix in a preset format, and the refractive matrix is generated according to the preset matrix in which the refractive information is written.
  • the refractive information is written into a preset matrix in a preset format.
  • a matrix is defined as a set of complex or real numbers arranged in a rectangular array.
  • the preset matrix is formed by adapting the existing matrix definition, and the adaptation can be set by the relevant technical personnel so that the refractive information can be written into it.
  • based on specific clinical application requirements, secondary numerical operations can be performed on the refractive matrix and the results displayed graphically. Typical applications of the numerical operations are:
  • from the refractive information, a matrix of refractive compensation values is obtained; applying a negation operation yields the absolute refractive value at each position point. To study how the peripheral refraction changes relative to the central refraction, the mean of the central refraction is calculated and then subtracted from the whole refractive matrix to obtain the change of the periphery relative to the center. The refractive information in units of D is converted into values in units of spectacle prescription power, and the topographic map type of the current refractive topographic map is confirmed.
  • the topographic map types include point maps, block maps, stereo maps, and statistical maps.
  • Point map (fov-raw): provides, for field-of-view angles of 10°, 20°, 30° and 40°, the difference between the refractive value at each position and the refractive value at the macula.
  • Block map (grid-mean): the point map is divided evenly into 8*8 blocks, and the numerical value represents the average refraction within each block.
  • Three-dimensional map (3d-color): converts the point refractive values into a 3D form.
  • the XY directions of the 3D map are consistent with the previous two maps; the abscissa runs temporal--nasal (Nasal--Temporal) and the ordinate runs inferior--superior (Inferior--Superior).
  • the value at each point represents the defocus value at that position, that is, the diopter difference from the macula.
  • the diopter value at the macula is taken as the 0 reference, and the other positions are expressed as differences from the macula, where "+" means the diopter is greater than at the macular position and "-" means the diopter is smaller than at the macular position.
  • Simulated vision map (visual): simulates, from the refractive topographic map data, the sharpness with which an actual scene is seen at different positions of the eye. The macular position is considered sharp, and the absolute difference between each position and the macula is used as the measure of sharpness: the larger the absolute difference, the more blurred; the smaller the absolute difference, the sharper. Smoothing filtering is used here to realize this local blurring operation.
  • the filter coefficient at each position is related to its corresponding absolute difference; the larger the absolute difference, the larger the filter coefficient.
  • Naked-eye defocus curve (naked-eye): as shown in Figure 5, the diopter differences between the 15°, 30° and 45° positions and the macula are calculated.
  • a "+" difference indicates hyperopic defocus, that is, the defocus curve lies outside the eyeball; a "-" difference indicates myopic defocus, that is, the defocus curve lies inside the eyeball.
  • Defocus curve when wearing ordinary spectacles (frame-glass): taking the naked eye as a reference, the defocus curve when wearing spectacles is simulated. Wearing spectacles increases the original degree of defocus, that is, the curvature of the defocus surface becomes smaller and it deviates further from the surface of the eyeball.
  • Defocus curve when wearing an OK (orthokeratology) lens (ok-glass): after an OK lens is worn, the defocus surface returns toward the surface of the eyeball.
  • the target refractive information is determined by the mapping relationship between refractive information and sharpness to form a refractive matrix, which realizes the measurement of refractive information of the entire fundus area at one time in a short time. Operation improves the measurement efficiency and measurement accuracy.
  • the embodiment of the present invention also provides a computer-readable storage medium, the computer-readable storage medium stores a refraction figure generation program, and the following operations are implemented when the refraction figure generation program is executed by a processor:
  • a refractive matrix is generated according to the acquired refractive information, and a refractive graph is generated through the refractive matrix.
  • the outliers of the sharpness sequence are eliminated, and the target sharpness is confirmed according to the sharpness sequence from which the outliers have been eliminated.
  • if the sharpness is less than or equal to the preset sharpness threshold, it is confirmed that the sharpness is an outlier.
  • the target sharpness is defined as the maximum sharpness.
  • the shooting threshold range of the fundus image is set based on the refractive value.
  • the refractive information is written into a preset matrix in a preset format, and the refractive matrix is generated according to the preset matrix in which the refractive information is written.
  • the generation format is used to generate the corresponding refractive pattern from the refractive matrix.
  • refractive graphs include point maps, block graphs, three-dimensional graphs, statistical graphs, simulated vision graphs, naked eye defocus curves, and simulated curve graphs of wearing lenses.
  • the technical solution of the present invention, in essence or in the part that contributes to the existing technology, can be embodied in the form of a software product. The computer software product is stored in a storage medium as described above (such as a ROM/RAM, magnetic disk or optical disk) and includes several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, etc.) to execute the method described in each embodiment of the present invention.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

A refractive pattern generation method, a device and a computer-readable storage medium. The method includes: capturing fundus images of the human eye currently under test and acquiring the refractive information corresponding to the fundus images, where there are multiple fundus images (S10); calculating the sharpness of the fundus images and generating a sharpness sequence of the fundus images from the calculated sharpness (S20); confirming target sharpness values in the generated sharpness sequence and acquiring the refractive information of the fundus images corresponding to the target sharpness values, where there are multiple target sharpness values (S30); and generating a refractive matrix from the acquired refractive information and generating a refractive pattern from the refractive matrix (S40). By constructing a refractive pattern calculation method in which the target refractive information is determined from the mapping between refractive information and sharpness to form a refractive matrix, the refractive information of the entire fundus region can be measured in one pass within a short time, improving measurement efficiency and measurement accuracy.

Description

Refractive pattern generation method, device and computer-readable storage medium
This application claims priority to the Chinese patent application filed on March 16, 2020, with application number 202010185058.3 and entitled "Refractive pattern generation method, device and computer-readable storage medium", the entire contents of which are incorporated herein by reference.
Technical Field
The present invention relates to the technical field of refractive pattern generation, and in particular to a refractive pattern generation method and device and a computer-readable storage medium.
Background
With the growing need for myopia prevention and control in adolescents, methods for measuring the refraction of the peripheral retina have been proposed. An ordinary refractometer measures only the refraction of the macular region of the human eye, whereas a refractive topography instrument can measure the refraction of a large-field-of-view fundus region. The traditional approach used an ophthalmoscope and manual retinoscopy at different viewing angles of the eye to obtain large-field fundus refractive information. This approach is time-consuming and labor-intensive, yields only the horizontal and vertical fundus refractive distributions, and cannot meet the technical requirements of current human-eye examination.
The above content is intended only to assist in understanding the technical solutions of the present invention and does not constitute an admission that it is prior art.
Summary of the Invention
Technical Problem
Solution to the Problem
Technical Solution
The main purpose of the present invention is to provide a refractive pattern generation method and device and a computer-readable storage medium, aiming to solve the technical problem that the prior-art approach of using an ophthalmoscope and manual retinoscopy at different viewing angles of the human eye to obtain large-field fundus refractive information is extremely time-consuming and labor-intensive, yields only the horizontal and vertical fundus refractive distributions, and cannot meet detection requirements.
To achieve the above purpose, the present invention provides a refractive pattern generation method applied to a refractive topography instrument, the refractive pattern generation method including the following:
capturing fundus images of the human eye currently under test and acquiring refractive information corresponding to the fundus images, where there are multiple fundus images;
calculating the sharpness of the fundus images and generating a sharpness sequence of the fundus images from the calculated sharpness;
confirming target sharpness values in the generated sharpness sequence and acquiring the refractive information of the fundus images corresponding to the target sharpness values, where there are multiple target sharpness values;
generating a refractive matrix from the acquired refractive information, and generating a refractive pattern from the refractive matrix.
Optionally, before the step of capturing fundus images of the human eye currently under test and acquiring the refractive information corresponding to the fundus images, the method further includes:
acquiring a preset relative position of a focusing optical module;
setting, according to the acquired relative position, the refractive scan range for the fundus images currently being captured.
Optionally, the refractive scan range includes a dynamic range or a fixed range.
Optionally, before the step of calculating the sharpness of the fundus images and generating the sharpness sequence of the fundus images from the calculated sharpness, the method further includes:
confirming the scan time of the refractive scan process in which the fundus images are captured;
when the scan time is confirmed to be less than a preset registration time, performing the step of calculating the sharpness of the fundus images.
Optionally, after the step of confirming the scan time of the refractive scan process in which the fundus images are captured, the method further includes:
when the scan time is confirmed to be greater than or equal to the preset registration time, registering the fundus images;
performing the step of calculating the sharpness of the fundus images using the registered fundus images.
Optionally, the step of calculating the sharpness of the fundus images and generating the sharpness sequence of the fundus images from the calculated sharpness includes:
confirming target position points of the fundus images at which sharpness is to be calculated;
calculating the sharpness of the fundus images from the confirmed target position points.
Optionally, the step of calculating the sharpness of the fundus images from the confirmed target position points includes:
delimiting a preset neighborhood range centered on each target position point;
calculating the sharpness of the fundus images within the preset neighborhood range.
Optionally, the step of confirming the target sharpness in the generated sharpness sequence and acquiring the refractive compensation value of the fundus image corresponding to the target sharpness includes:
identifying outliers in the sharpness sequence;
removing the outliers from the sharpness sequence, and confirming the target sharpness from the sharpness sequence with the outliers removed.
Optionally, after the step of identifying outliers in the sharpness sequence, the method further includes:
counting the number of outliers;
when the number of outliers is confirmed to be greater than the number of non-outliers, generating a new sharpness sequence from the non-outliers.
Optionally, the step of identifying outliers in the sharpness sequence includes:
comparing the sharpness values of the sharpness sequence with a preset sharpness threshold;
if a sharpness value is less than or equal to the preset sharpness threshold, confirming that the sharpness value is an outlier.
Optionally, after the step of comparing the sharpness values of the sharpness sequence with the preset sharpness threshold, the method further includes:
if a sharpness value exceeds the preset sharpness threshold, confirming that the sharpness value is a non-outlier.
Optionally, the step of confirming the target sharpness in the generated sharpness sequence and acquiring the refractive compensation value of the fundus image corresponding to the target sharpness includes:
fitting the sharpness sequence, and confirming the target sharpness from the fitting result.
Optionally, the target sharpness is defined as the maximum sharpness.
Optionally, before the step of capturing fundus images of the human eye currently under test and acquiring the refractive compensation values corresponding to the fundus images, where there are multiple fundus images, the method further includes:
acquiring the refractive value of the human eye under test;
setting the capture threshold range of the fundus images based on the refractive value.
Optionally, the step of generating a refractive matrix from the acquired refractive information and generating a refractive pattern from the refractive matrix includes:
writing the refractive information into a preset matrix in a preset format, and generating the refractive matrix from the preset matrix into which the refractive information has been written.
Optionally, the step of generating a refractive matrix from the acquired refractive information and generating a refractive pattern from the refractive matrix includes:
confirming the refractive pattern type, and determining a generation format according to the refractive pattern type;
generating the corresponding refractive pattern from the refractive matrix using the generation format.
Optionally, the refractive pattern types include a point map, a block map, a three-dimensional map, a statistical map, a simulated vision map, a naked-eye defocus curve and simulated defocus curves with lenses worn.
In addition, to achieve the above purpose, the present invention also provides a refractive pattern generation device, including a memory and a processor, the memory storing a computer program callable by the processor, the computer program, when executed by the processor, implementing the steps of the refractive pattern generation method described above.
The present invention also provides a computer-readable storage medium storing a refractive pattern generation program which, when executed by a processor, implements the steps of the refractive pattern generation method described above.
An embodiment of the present invention provides a refractive pattern generation method: capturing fundus images of the human eye currently under test and acquiring the refractive information corresponding to the fundus images, where there are multiple fundus images; calculating the sharpness of the fundus images and generating a sharpness sequence of the fundus images from the calculated sharpness; confirming target sharpness values in the generated sharpness sequence and acquiring the refractive information of the fundus images corresponding to the target sharpness values, where there are multiple target sharpness values; and generating a refractive matrix from the acquired refractive information and generating a refractive pattern from the refractive matrix. By constructing a refractive pattern calculation method in which the target refractive information is determined from the mapping between refractive information and sharpness to form a refractive matrix, the present application measures the refractive information of the entire fundus region in one pass within a short time, improving measurement efficiency and measurement accuracy.
Beneficial Effects of the Invention
Brief Description of the Drawings
Description of the Drawings
Fig. 1 is a schematic structural diagram of the terminal/device of the hardware operating environment involved in the solutions of the embodiments of the present invention;
Fig. 2 is a schematic flowchart of the first embodiment of the refractive pattern generation method of the present invention;
Fig. 3 is a schematic diagram of the coordinate directions of the refractive compensation value;
Fig. 4 is a schematic diagram of sharpness based on a target position point;
Fig. 5 is a schematic diagram of target sharpness confirmation in a sharpness sequence.
The realization of the purpose, functional features and advantages of the present invention will be further described with reference to the accompanying drawings in conjunction with the embodiments.
Embodiments of the Invention
Modes for Carrying Out the Invention
It should be understood that the specific embodiments described herein are only intended to explain the present invention and are not intended to limit it.
The main solution of the embodiments of the present invention is: capturing fundus images of the human eye currently under test and acquiring the refractive information corresponding to the fundus images, where there are multiple fundus images; calculating the sharpness of the fundus images and generating a sharpness sequence of the fundus images from the calculated sharpness; confirming target sharpness values in the generated sharpness sequence and acquiring the refractive information of the fundus images corresponding to the target sharpness values, where there are multiple target sharpness values; and generating a refractive matrix from the acquired refractive information and generating a refractive pattern from the refractive matrix.
The prior art uses an ophthalmoscope and manual retinoscopy at different viewing angles of the human eye to obtain large-field fundus refractive information, which is extremely time-consuming and labor-intensive, yields only the horizontal and vertical fundus refractive distributions, and cannot meet detection requirements.
The present invention provides a solution: by constructing a refractive pattern calculation method in which the target refractive information is determined from the mapping between refractive information and sharpness to form a refractive matrix, the refractive information of the entire fundus region is measured in one pass within a short time, improving measurement efficiency and measurement accuracy.
As shown in Fig. 1, Fig. 1 is a schematic structural diagram of the terminal of the hardware operating environment involved in the solutions of the embodiments of the present invention.
The terminal in the embodiments of the present invention may be a movable or non-movable terminal device such as a PC, a smartphone, a tablet computer, an e-book reader or a portable computer.
As shown in Fig. 1, the terminal may include a processor 1001 such as a CPU, a network interface 1004, a user interface 1003, a memory 1005 and a communication bus 1002. The communication bus 1002 is used to implement connection and communication between these components. The user interface 1003 may include a display (Display) and an input unit such as a keyboard (Keyboard); optionally, the user interface 1003 may also include a standard wired interface and a wireless interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (such as a WI-FI interface). The memory 1005 may be a high-speed RAM memory or a stable non-volatile memory, such as a magnetic disk memory. Optionally, the memory 1005 may also be a storage device independent of the aforementioned processor 1001.
Those skilled in the art will understand that the terminal structure shown in Fig. 1 does not constitute a limitation on the terminal, which may include more or fewer components than shown, combine certain components, or have a different component arrangement.
As shown in Fig. 1, the memory 1005, as a computer storage medium, may include an operating system, a network communication module, a user interface module and a refractive pattern generation program.
In the terminal shown in Fig. 1, the network interface 1004 is mainly used to connect to a back-end server and exchange data with it; the user interface 1003 is mainly used to connect to a client (user side) and exchange data with it; and the processor 1001 may be used to call the refractive pattern generation program stored in the memory 1005 and perform the following operations:
capturing fundus images of the human eye currently under test and acquiring the refractive information corresponding to the fundus images, where there are multiple fundus images;
calculating the sharpness of the fundus images and generating a sharpness sequence of the fundus images from the calculated sharpness;
confirming target sharpness values in the generated sharpness sequence and acquiring the refractive information of the fundus images corresponding to the target sharpness values, where there are multiple target sharpness values;
generating a refractive matrix from the acquired refractive information, and generating a refractive pattern from the refractive matrix.
Further, the processor 1001 may call the refractive pattern generation program stored in the memory 1005 and also perform the following operations:
acquiring a preset relative position of the focusing optical module;
setting, according to the acquired relative position, the refractive scan range for the fundus images currently being captured.
Further, the processor 1001 may call the refractive pattern generation program stored in the memory 1005 and also perform the following operations:
confirming target position points of the fundus images at which sharpness is to be calculated;
calculating the sharpness of the fundus images from the confirmed target position points.
Further, the processor 1001 may call the refractive pattern generation program stored in the memory 1005 and also perform the following operations:
delimiting a preset neighborhood range centered on each target position point;
calculating the sharpness of the fundus images within the preset neighborhood range.
Further, the processor 1001 may call the refractive pattern generation program stored in the memory 1005 and also perform the following operations:
identifying outliers in the sharpness sequence;
removing the outliers from the sharpness sequence, and confirming the target sharpness from the sharpness sequence with the outliers removed.
Further, the processor 1001 may call the refractive pattern generation program stored in the memory 1005 and also perform the following operations:
comparing the sharpness values of the sharpness sequence with a preset sharpness threshold;
if a sharpness value is less than or equal to the preset sharpness threshold, confirming that the sharpness value is an outlier.
Further, the processor 1001 may call the refractive pattern generation program stored in the memory 1005 and also perform the following operations:
if a sharpness value exceeds the preset sharpness threshold, confirming that the sharpness value is a non-outlier.
Further, the processor 1001 may call the refractive pattern generation program stored in the memory 1005 and also perform the following operations:
fitting the sharpness sequence, and confirming the target sharpness from the fitting result.
Further, the processor 1001 may call the refractive pattern generation program stored in the memory 1005 and also perform the following operations:
the target sharpness is defined as the maximum sharpness.
Further, the processor 1001 may call the refractive pattern generation program stored in the memory 1005 and also perform the following operations:
acquiring the refractive value of the human eye under test;
setting the capture threshold range of the fundus images based on the refractive value.
Further, the processor 1001 may call the refractive pattern generation program stored in the memory 1005 and also perform the following operations:
writing the refractive information into a preset matrix in a preset format, and generating the refractive matrix from the preset matrix into which the refractive information has been written.
Further, the processor 1001 may call the refractive pattern generation program stored in the memory 1005 and also perform the following operations:
confirming the refractive pattern type, and determining a generation format according to the refractive pattern type;
generating the corresponding refractive pattern from the refractive matrix using the generation format.
Further, the processor 1001 may call the refractive pattern generation program stored in the memory 1005 and also perform the following operations:
the refractive pattern types include a point map, a block map, a three-dimensional map, a statistical map, a simulated vision map, a naked-eye defocus curve and simulated defocus curves with lenses worn.
Referring to Fig. 2, Fig. 2 is a schematic flowchart of the first embodiment of the refractive pattern generation method of the present invention. The refractive pattern generation method is applied to a refractive topography instrument, and the refractive pattern generation method includes:
Step S10: capturing fundus images of the human eye currently under test and acquiring the refractive information corresponding to the fundus images, where there are multiple fundus images;
According to the current human-eye examination requirement, fundus images of the human eye currently under test are captured. The fundus images may be captured by a refractive topography instrument, a fundus camera, or another control device connected to the refractive topography instrument / fundus camera, and the refractive information corresponding to the fundus images is acquired from the captured fundus images. In practical applications, when a fundus image is captured, the fundus camera drives the focusing optical module to a new relative position by means of a motor and captures the fundus image after this refractive adjustment. Further, the distance over which the focusing optical module moves corresponds to a certain range of refractive compensation values, i.e. the refractive scan range. The refractive scan range may be a fixed range or a dynamic range; its definition may be based on refractive information previously verified for the patient under test, from which a dynamic refractive range corresponding to the fundus images is then set. That is, before the step of capturing fundus images of the human eye currently under test and acquiring the refractive compensation values corresponding to the fundus images, where there are multiple fundus images, the method further includes:
acquiring the refractive value of the human eye under test;
setting the capture threshold range of the fundus images based on the refractive value.
Based on the refractive value detected for the human eye currently under test, the refractive value of the eye is acquired so that the capture threshold range of the fundus images, i.e. the refractive scan range, can be set in advance from it. The refractive scan range may be a fixed range or a dynamic range, so that the fundus camera captures the fundus images according to the set capture threshold range, i.e. the refractive scan range. The fundus refractive scan performed by the focusing optical module when capturing the fundus images is carried out either at equal intervals of fixed refractive power or at non-uniform refractive intervals within a certain range; in this way, based on the captured fundus images, there are multiple fundus images.
Therefore, when the refractive information corresponding to the captured fundus images is acquired, since the fundus images are captured within a certain refractive scan range, the refractive information corresponding to each fundus image can be derived from the set refractive scan range; in this way, the refractive information corresponding to a fundus image is determined from the specific refractive scan range value at which that fundus image was captured.
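As a minimal illustration of this correspondence (not part of the disclosure), the sketch below maps the index of each captured image to a refractive compensation value, assuming a fixed scan from -3.0 D to +3.0 D in 0.1 D steps; the endpoints and step size are hypothetical parameters chosen only to be consistent with the 0.1 D per image used later in this description.

```python
import numpy as np

def scan_compensation_values(d_min=-3.0, d_max=3.0, step=0.1):
    """Refractive compensation value (in diopters) associated with each
    fundus image of one refractive scan, for a fixed, equally spaced range.

    d_min, d_max and step are illustrative defaults only; a dynamic scan
    range would derive them from a previously measured refractive value.
    """
    n_images = int(round((d_max - d_min) / step)) + 1
    return d_min + step * np.arange(n_images)

# Example: the q-th captured image (0-based) corresponds to compensation comp[q]
comp = scan_compensation_values()
print(len(comp), comp[0], comp[-1])   # 61 images covering -3.0 D ... +3.0 D
```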
Step S20: calculating the sharpness of the fundus images and generating a sharpness sequence of the fundus images from the calculated sharpness;
The sharpness of the fundus images is calculated from the currently captured fundus images and the refractive information corresponding to them. When calculating the sharpness, the calculation method and calculation parameters can be chosen according to the current fundus image capture device and the application scene; gradient, gray-level variance, entropy function and similar measures are calculation methods that can compute the corresponding sharpness from a fundus image. In addition, before computing on the fundus images, it must be confirmed, based on the refractive threshold range within which the fundus images are currently captured and on the capture time, whether the captured fundus images need to be registered. If the refractive scan corresponding to the currently defined refractive threshold range takes only a short time, so that the change in magnification of the fundus images relative to the fundus is small, the image registration operation of the fundus images may be omitted; whether registration is performed may be decided against a currently set preset registration time. Moreover, technical content relating to fundus images obtained on the basis of refractive information and to sharpness calculated from fundus images falls within the protection scope of the present invention. In this way, according to the calculated sharpness of the fundus images, when there are multiple fundus images there are also multiple sharpness values, and a sharpness sequence based on the fundus images is generated from the multiple calculated sharpness values. Further, when the sharpness sequence contains multiple sharpness values, its ordering may follow the change of the refractive compensation value over the refractive threshold range within which the fundus images are currently captured, and the ordering may differ between coordinates: the coordinate directions include the X, Y and Z directions, each corresponding to different refractive compensation values. The coordinate definition is shown in Fig. 3, which is a schematic diagram of the coordinate directions of the refractive compensation value.
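Purely for illustration, the three sharpness measures named above (gradient, gray-level variance, entropy function) could be computed on an 8-bit fundus image as in the following sketch; the exact formulations and the 8-bit range are assumptions, since the description leaves the choice of method and parameters to the specific device and scene.

```python
import numpy as np

def gradient_sharpness(img):
    """Sum of squared finite-difference gradients (higher = sharper)."""
    gy, gx = np.gradient(np.asarray(img, dtype=np.float64))
    return float(np.sum(gx ** 2 + gy ** 2))

def variance_sharpness(img):
    """Gray-level variance of the image (higher = sharper)."""
    return float(np.var(np.asarray(img, dtype=np.float64)))

def entropy_sharpness(img, bins=256):
    """Shannon entropy of the gray-level histogram (assumes 8-bit data)."""
    hist, _ = np.histogram(np.asarray(img), bins=bins, range=(0, 256))
    p = hist[hist > 0] / hist.sum()
    return float(-np.sum(p * np.log2(p)))
```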
When calculating the sharpness of the fundus images, the sharpness can be calculated with reference to defined sharp points; that is, the step of calculating the sharpness of the fundus images and generating the sharpness sequence of the fundus images from the calculated sharpness includes:
confirming target position points of the fundus images at which sharpness is to be calculated;
calculating the sharpness of the fundus images from the confirmed target position points.
When calculating the sharpness of the currently captured fundus images, the target position points at which sharpness is to be calculated must first be confirmed. When confirming the target position points, the captured fundus image has dimensions M x N, and the target position points for the sharpness calculation are selected on the basis of this M x N matrix. The target position points may be selected in any of the following ways (see the sketch after this list):
a. all M x N points are used as target position points;
b. position points are selected at a constant interval, for example one target position point every 10 pixels horizontally and vertically;
c. several points are selected at non-uniform intervals in the two-dimensional matrix as target position points.
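A minimal sketch of the three selection options above, assuming the image is handled as an M x N NumPy array and points are returned as (x, y) pairs; the 10-pixel interval in option b follows the example in the text, everything else is an illustrative choice.

```python
import numpy as np

def target_points_all(m, n):
    """Option a: every one of the M x N pixels is a target position point."""
    ys, xs = np.mgrid[0:m, 0:n]
    return np.stack([xs.ravel(), ys.ravel()], axis=1)     # rows of (x, y)

def target_points_regular(m, n, step=10):
    """Option b: one target position point every `step` pixels in x and y."""
    ys, xs = np.mgrid[0:m:step, 0:n:step]
    return np.stack([xs.ravel(), ys.ravel()], axis=1)

def target_points_custom(points):
    """Option c: an arbitrary, non-uniformly spaced list of (x, y) points."""
    return np.asarray(points, dtype=int)
```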
As described above, based on the target position points currently selected in the fundus image, and since the fundus image is referenced to an M x N matrix, every selected position point has coordinates in the M x N matrix; the coordinates of a target position point can be defined as (x, y). Then, according to the confirmed target position points of the fundus images, when there are multiple fundus images, the sharpness of each fundus image at the selected target position points is calculated in turn from the 1st to the Q-th fundus image. Concretely, for the q-th image Iq (q = 1, 2, ..., Q), the sharpness at the target position point (x, y) can be calculated on that image. The sharpness is calculated with methods from the image-processing field, such as gradient, gray-level variance or entropy function, and a suitable calculation method and calculation parameters can be chosen according to the specific device and application scene.
Since the fundus image data are large, and considering the efficiency of the sharpness calculation, a sharpness calculation range can be delimited around the confirmed target position points to improve efficiency; that is, the step of calculating the sharpness of the fundus images from the confirmed target position points includes:
delimiting a preset neighborhood range centered on each target position point;
calculating the sharpness of the fundus images within the preset neighborhood range.
Based on the target position point (x, y) determined for each fundus image, a suitable neighborhood range is selected with the target position point as its center, and the sharpness within that neighborhood range is calculated; the sharpness of this target position point is denoted Cq(x, y). The way the sharpness is confirmed over the neighborhood range is shown in Fig. 4, which is a schematic diagram of sharpness based on a target position point.
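The following sketch illustrates one way to obtain Cq(x, y), using a gradient-based measure evaluated in a square neighborhood around the target position point; the neighborhood radius and the choice of gradient measure are assumptions, not requirements of the method.

```python
import numpy as np

def neighborhood_sharpness(img, x, y, radius=15):
    """Sharpness C_q(x, y): a gradient-based measure evaluated on a small
    window centred on the target position point rather than on the whole
    image.  `radius` is an illustrative choice for the neighbourhood size."""
    img = np.asarray(img, dtype=np.float64)
    y0, y1 = max(0, y - radius), min(img.shape[0], y + radius + 1)
    x0, x1 = max(0, x - radius), min(img.shape[1], x + radius + 1)
    gy, gx = np.gradient(img[y0:y1, x0:x1])
    return float(np.sum(gx ** 2 + gy ** 2))

def sharpness_sequence(images, x, y, radius=15):
    """The sequence C_1(x, y), ..., C_Q(x, y) over the Q images of one scan."""
    return np.array([neighborhood_sharpness(im, x, y, radius) for im in images])
```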
Step S30: confirming target sharpness values in the generated sharpness sequence and acquiring the refractive information of the fundus images corresponding to the target sharpness values, where there are multiple target sharpness values;
From the fundus images whose sharpness has been calculated, and using the sharpness sequence formed from those sharpness values, the optimal sharpness in the sharpness sequence is confirmed; the optimal sharpness may be defined as the maximum value, the best sharpness, etc., and can be defined as the target sharpness. Among the currently captured fundus images, the sharpness calculated at the target position point (x, y) differs between the Q fundus images; since each fundus image has corresponding refractive information, there exists some optimal sharpness, i.e. the target sharpness. One way of computing the target sharpness is to take the maximum of the sharpness values at the target position point over the Q fundus images and define it as the target sharpness. Further, since there are multiple sharpness sequences at different positions, there are also multiple target sharpness values defined from the optimal sharpness; that is, there are multiple target sharpness values. The way the target sharpness is confirmed is shown in Fig. 5, which is a schematic diagram of target sharpness confirmation in a sharpness sequence. In addition, because the captured fundus images are affected by optical noise, electronic noise, software processing and many other factors, and therefore exhibit a certain amount of disturbance, a fitting operation can be performed on the sharpness sequence and the target sharpness confirmed from the sharpness sequence according to the fitting result. That is, the step of confirming the target sharpness in the generated sharpness sequence and acquiring the refractive compensation value of the fundus image corresponding to the target sharpness includes:
fitting the sharpness sequence, and confirming the target sharpness from the fitting result.
In addition, for the sharpness confirmed at each pixel of the current image, and considering that the subject may blink during the examination, which causes abnormal images, the step of confirming the target sharpness in the generated sharpness sequence and acquiring the refractive compensation value of the fundus image corresponding to the target sharpness includes:
identifying outliers in the sharpness sequence;
removing the outliers from the sharpness sequence, and confirming the target sharpness from the sharpness sequence with the outliers removed.
During the refractive scan, phenomena such as the patient blinking or the eyelid drooping cause the sharpness calculated for some fundus images to deviate markedly from the sharpness trend of the whole fundus image sequence. Therefore, considering the sharpness values of the whole fundus image sequence, the outliers must be removed from the sharpness sequence generated from the current sharpness values; alternatively, when most of the sharpness values in the sharpness sequence are abnormal, i.e. when many of them are outliers, only part of the sharpness data in the sharpness sequence may be selected to form a new sharpness sequence, from which the target sharpness is then confirmed.
As described above, the identification of outliers from the sharpness sequence can be decided using a preset sharpness threshold; that is, the step of identifying outliers in the sharpness sequence includes:
comparing the sharpness values of the sharpness sequence with a preset sharpness threshold;
if a sharpness value is less than or equal to the preset sharpness threshold, confirming that the sharpness value is an outlier.
The outliers identified in the sharpness sequence can be confirmed by comparing each sharpness value in the sequence with a preset value: when a sharpness value is confirmed to be greater than the preset value, the sharpness value is confirmed to be normal; when a sharpness value is confirmed to be less than the preset value, the sharpness value is confirmed to be an outlier and is removed from the sharpness sequence. Following the removal of outliers from the sharpness sequence as described above, the sharpness sequence with outliers removed is defined as the vector Q'; a polynomial function is used to fit the vector Q', and from the fitting result the target sharpness in the sharpness sequence, i.e. the maximum sharpness Gmax, is confirmed. The fundus image with the maximum sharpness is given the number nmax, and the refractive compensation value of the target position point corresponding to this sharpness is defined as Dx,y, with Dx,y = (nmax - 30) * 0.1.
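A hedged sketch of this step for a single target position point, combining the threshold-based outlier removal, the polynomial fit of the vector Q' and the mapping Dx,y = (nmax - 30) * 0.1; the polynomial degree and the fallback used when too few inliers remain are illustrative assumptions.

```python
import numpy as np

def compensation_at_point(seq, clarity_threshold, fit_degree=4):
    """From the sharpness sequence of one target position point: reject
    outliers, fit a polynomial, locate the sharpest image and map its index
    to a refractive compensation value D_x,y = (n_max - 30) * 0.1.

    The threshold, the polynomial degree and the 61-image / 0.1 D mapping
    are illustrative assumptions consistent with the example in the text.
    """
    seq = np.asarray(seq, dtype=np.float64)
    idx = np.arange(len(seq))

    keep = seq > clarity_threshold           # values <= threshold are outliers
    if keep.sum() < fit_degree + 1:          # too few inliers for a stable fit
        keep[:] = True

    coeffs = np.polyfit(idx[keep], seq[keep], fit_degree)
    fitted = np.polyval(coeffs, idx)

    n_max = int(np.argmax(fitted))           # image number with peak sharpness
    g_max = float(fitted[n_max])             # target sharpness G_max
    d_xy = (n_max - 30) * 0.1                # refractive compensation value
    return g_max, n_max, d_xy
```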
Thus, from the sharpness sequence calculated at the target position point (x, y) confirmed for the current fundus images, after the target sharpness has been confirmed from the sharpness sequence, the refractive information of the fundus image corresponding to the target sharpness is confirmed; that is, the refractive information is the refractive information at the time of capture of the fundus image corresponding to the target sharpness. In addition, the refractive information may also be defined as a refractive compensation value, and a value obtained from the refractive compensation value by negation, linear transformation or some other function may be defined as the refractive information. Further, the refractive information may also be defined as fundus data based on the fundus images, such as visual acuity values, refractive values and so on; technical content in which image application data of the fundus images are determined from the sharpness of the fundus images falls within the protection scope of the present invention.
Step S40: generating a refractive matrix from the acquired refractive information, and generating a refractive pattern from the refractive matrix.
A refractive matrix is generated from the acquired refractive information. In the generation operation, the refractive information may be written according to a preset matrix format to form the refractive matrix; that is, the step of generating a refractive matrix from the acquired refractive information and generating a refractive pattern from the refractive matrix includes:
writing the refractive information into a preset matrix in a preset format, and generating the refractive matrix from the preset matrix into which the refractive information has been written.
The acquired refractive information is written into a preset matrix in a preset format. A matrix is defined as a set of complex or real numbers arranged in a rectangular array, originally derived from the square arrays formed by the coefficients and constants of systems of equations. The preset matrix disclosed in this embodiment is formed by adapting the existing matrix definition, and the adaptation can be set by the relevant technical personnel so that the refractive information can be written into it.
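As one possible reading of the "preset format", the sketch below simply writes each per-point compensation value into an M x N rectangular array; the fill value for unmeasured positions is an assumption.

```python
import numpy as np

def build_refractive_matrix(points, d_values, shape, fill=np.nan):
    """Write per-point compensation values D_x,y into a preset M x N matrix.

    `points` are (x, y) target position points and `d_values` the matching
    compensation values; unmeasured positions keep the `fill` value.  The
    rectangular-array layout is only one possible "preset format".
    """
    matrix = np.full(shape, fill, dtype=np.float64)
    for (x, y), d in zip(points, d_values):
        matrix[y, x] = d
    return matrix
```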
Further, depending on the specific clinical application requirements, secondary numerical operations can be performed on the refractive matrix, and the results can be displayed graphically. Typical applications of the numerical operations are:
Starting from the refractive information matrix, a matrix of refractive compensation values is obtained from the refractive information; applying a negation operation yields the absolute refractive value at each position point. To study how the peripheral refraction changes relative to the central refraction, the mean of the central refraction can first be calculated and then subtracted from the whole refractive matrix, giving the change of the periphery relative to the center. The refractive information in units of D can be converted into values in units of spectacle prescription power, and the topographic map type of the current refractive topographic map is confirmed; the topographic map types include a point map, a block map, a three-dimensional map, a statistical map and a simulated vision map. According to the confirmed topographic map type, the step of generating the refractive topographic map from the images and the corresponding refractive compensation value map is performed. In practical applications, different topographic map types have different forms of presentation, the specifics of which are as follows:
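A minimal sketch of the centre-referencing operation described above (subtracting the mean central refraction from the whole matrix); the size of the central window is an assumed parameter.

```python
import numpy as np

def relative_to_center(refractive_matrix, center_radius=5):
    """Express peripheral refraction relative to the central (macular) region:
    the mean over a small square around the matrix centre (the radius is an
    assumption) is subtracted from the whole matrix, giving the change of
    the periphery compared to the centre."""
    m = np.asarray(refractive_matrix, dtype=np.float64)
    cy, cx = m.shape[0] // 2, m.shape[1] // 2
    center = m[cy - center_radius:cy + center_radius + 1,
               cx - center_radius:cx + center_radius + 1]
    return m - np.nanmean(center)
```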
Point map (fov-raw): provides, for field-of-view angles of 10°, 20°, 30° and 40°, the difference between the refractive value at each azimuthal position point and the refractive value at the macular position.
Block map (grid-mean): the point map is divided evenly into 8*8 blocks, and the numerical value represents the average refraction within each block.
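An illustrative sketch of the grid-mean computation, dividing the point map evenly into 8 x 8 blocks and averaging each block.

```python
import numpy as np

def grid_mean(defocus_map, blocks=8):
    """Block map (grid-mean): average the point map over an 8 x 8 grid of
    equal blocks; each output cell is the mean refraction of one block."""
    m = np.asarray(defocus_map, dtype=np.float64)
    h, w = m.shape
    out = np.empty((blocks, blocks))
    for i in range(blocks):
        for j in range(blocks):
            out[i, j] = np.nanmean(m[i * h // blocks:(i + 1) * h // blocks,
                                     j * w // blocks:(j + 1) * w // blocks])
    return out
```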
Three-dimensional map (3d-color): the point refractive values are converted into a 3D form. The XY directions of the 3D map are consistent with the previous two maps: the abscissa runs temporal--nasal (Nasal--Temporal) for the right and left eye, and the ordinate runs inferior--superior (Inferior--Superior). The value at each point represents the defocus value at that position, i.e. the difference in refraction from the macula; the refraction at the macula is taken as the 0 reference and the other positions are expressed as differences from the macula, where "+" means the refraction is greater than at the macular position and "-" means the refraction is smaller than at the macular position.
Statistics map (statistics): as shown in Fig. 4, statistical parameters are calculated for different fundus regions from the refractive topographic map data, where TRDV is the mean defocus value of the total region, representing the mean deviation of the surface in the 3D map from the horizontal plane; RDV-15, RDV-30 and RDV-45 represent the mean defocus values within different ranges of distance from the macula; and RDV-S, RDV-N, RDV-I and RDV-T represent the mean defocus values of the superior, nasal, inferior and temporal quadrants respectively.
Simulated vision map (visual): simulates, from the refractive topographic map data, the sharpness with which an actual scene is seen at different positions of the eye. The macular position is considered sharp, and the absolute difference between each position and the macula is used as the measure of sharpness: the larger the absolute difference, the more blurred, and the smaller the absolute difference, the sharper. Smoothing filtering is used here to realize this local blurring operation; the filter coefficient at each position is related to its corresponding absolute difference, and the larger the absolute difference, the larger the filter coefficient.
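The sketch below is one hedged interpretation of this local blurring: a box (mean) filter whose radius grows linearly with the absolute defocus at each position. The filter type, the linear scaling and the assumption that the scene image and the defocus map share the same sampling grid are all illustrative choices standing in for the unspecified smoothing filter.

```python
import numpy as np

def simulated_visual_map(scene, defocus_map, max_radius=7):
    """Simulated vision: blur each pixel of a scene image by an amount that
    grows with the absolute defocus at that position (macula taken as sharp).

    A box (mean) filter with a position-dependent radius stands in for the
    "smooth filtering" of the description; the linear mapping from absolute
    defocus to filter radius is an illustrative assumption."""
    scene = np.asarray(scene, dtype=np.float64)
    dmap = np.abs(np.asarray(defocus_map, dtype=np.float64))
    radii = np.round(dmap / (dmap.max() + 1e-9) * max_radius).astype(int)

    out = np.empty_like(scene)
    for y in range(scene.shape[0]):
        for x in range(scene.shape[1]):
            r = radii[y, x]
            patch = scene[max(0, y - r):y + r + 1, max(0, x - r):x + r + 1]
            out[y, x] = patch.mean()
    return out
```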
Naked-eye defocus curve (naked-eye): as shown in Fig. 5, the differences in refraction between the 15°, 30° and 45° positions and the macula are calculated. A "+" difference indicates hyperopic defocus, i.e. the defocus curve lies outside the eyeball; a "-" difference indicates myopic defocus, i.e. the defocus curve lies inside the eyeball.
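A hedged sketch of the naked-eye defocus curve: the mean relative defocus is sampled on rings at 15, 30 and 45 degrees of eccentricity. The linear pixel-to-angle mapping, the 45-degree field assumption and the ring width are illustrative and not specified by the description.

```python
import numpy as np

def defocus_curve(defocus_map, angles_deg=(15, 30, 45), fov_deg=45, ring=5.0):
    """Naked-eye defocus curve: mean defocus (relative to the macula) on rings
    at 15, 30 and 45 degrees of eccentricity.  Positive values indicate
    hyperopic defocus (outside the eyeball), negative values myopic defocus.
    The linear pixel-to-angle mapping and the ring width are assumptions."""
    m = np.asarray(defocus_map, dtype=np.float64)
    cy, cx = (m.shape[0] - 1) / 2.0, (m.shape[1] - 1) / 2.0
    ys, xs = np.mgrid[0:m.shape[0], 0:m.shape[1]]
    ecc = np.hypot(ys - cy, xs - cx) / min(cy, cx) * fov_deg   # pixels -> degrees

    return {a: float(np.nanmean(m[(ecc > a - ring / 2) & (ecc <= a + ring / 2)]))
            for a in angles_deg}
```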
In addition, simulated curves with lenses worn are also included, as follows:
Defocus curve when wearing ordinary spectacles (frame-glass): taking the naked eye as the reference, the defocus curve when wearing spectacles is simulated. Wearing spectacles increases the original degree of defocus, i.e. the curvature of the defocus surface becomes smaller and it deviates further from the surface of the eyeball.
Defocus curve when wearing an orthokeratology (OK) lens (ok-glass): after an OK lens is worn, the defocus surface returns toward the surface of the eyeball.
In this embodiment, by constructing a refractive pattern calculation method in which the target refractive information is determined from the mapping between refractive information and sharpness to form a refractive matrix, the refractive information of the entire fundus region is measured in one pass within a short time, improving measurement efficiency and measurement accuracy.
In addition, an embodiment of the present invention also provides a computer-readable storage medium storing a refractive pattern generation program which, when executed by a processor, implements the following operations:
capturing fundus images of the human eye currently under test and acquiring the refractive information corresponding to the fundus images, where there are multiple fundus images;
calculating the sharpness of the fundus images and generating a sharpness sequence of the fundus images from the calculated sharpness;
confirming target sharpness values in the generated sharpness sequence and acquiring the refractive information of the fundus images corresponding to the target sharpness values, where there are multiple target sharpness values;
generating a refractive matrix from the acquired refractive information, and generating a refractive pattern from the refractive matrix.
Further, when executed by the processor, the refractive pattern generation program also implements the following operations:
acquiring a preset relative position of the focusing optical module;
setting, according to the acquired relative position, the refractive scan range for the fundus images currently being captured.
Further, when executed by the processor, the refractive pattern generation program also implements the following operations:
confirming target position points of the fundus images at which sharpness is to be calculated;
calculating the sharpness of the fundus images from the confirmed target position points.
Further, when executed by the processor, the refractive pattern generation program also implements the following operations:
delimiting a preset neighborhood range centered on each target position point;
calculating the sharpness of the fundus images within the preset neighborhood range.
Further, when executed by the processor, the refractive pattern generation program also implements the following operations:
identifying outliers in the sharpness sequence;
removing the outliers from the sharpness sequence, and confirming the target sharpness from the sharpness sequence with the outliers removed.
Further, when executed by the processor, the refractive pattern generation program also implements the following operations:
comparing the sharpness values of the sharpness sequence with a preset sharpness threshold;
if a sharpness value is less than or equal to the preset sharpness threshold, confirming that the sharpness value is an outlier.
Further, when executed by the processor, the refractive pattern generation program also implements the following operations:
if a sharpness value exceeds the preset sharpness threshold, confirming that the sharpness value is a non-outlier.
Further, when executed by the processor, the refractive pattern generation program also implements the following operations:
fitting the sharpness sequence, and confirming the target sharpness from the fitting result.
Further, when executed by the processor, the refractive pattern generation program also implements the following operations:
the target sharpness is defined as the maximum sharpness.
Further, when executed by the processor, the refractive pattern generation program also implements the following operations:
acquiring the refractive value of the human eye under test;
setting the capture threshold range of the fundus images based on the refractive value.
Further, when executed by the processor, the refractive pattern generation program also implements the following operations:
writing the refractive information into a preset matrix in a preset format, and generating the refractive matrix from the preset matrix into which the refractive information has been written.
Further, when executed by the processor, the refractive pattern generation program also implements the following operations:
confirming the refractive pattern type, and determining a generation format according to the refractive pattern type;
generating the corresponding refractive pattern from the refractive matrix using the generation format.
Further, when executed by the processor, the refractive pattern generation program also implements the following operations:
the refractive pattern types include a point map, a block map, a three-dimensional map, a statistical map, a simulated vision map, a naked-eye defocus curve and simulated defocus curves with lenses worn.
It should be noted that, as used herein, the terms "include", "comprise" or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article or system that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article or system. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article or system that includes that element.
The above serial numbers of the embodiments of the present invention are for description only and do not represent the superiority or inferiority of the embodiments.
Through the description of the above embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by means of software plus the necessary general-purpose hardware platform, and of course also by hardware, but in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, in essence or in the part that contributes to the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium as described above (such as a ROM/RAM, magnetic disk or optical disk) and includes several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, etc.) to execute the methods described in the various embodiments of the present invention.
The above are only preferred embodiments of the present invention and do not therefore limit the patent scope of the present invention. Any equivalent structural or process transformation made using the contents of the description and drawings of the present invention, or any direct or indirect application in other related technical fields, is likewise included within the patent protection scope of the present invention.

Claims (15)

  1. A refractive pattern generation method, wherein the refractive pattern generation method comprises the following steps:
    capturing fundus images of the human eye currently under test and acquiring refractive information corresponding to the fundus images, wherein there are multiple fundus images;
    calculating the sharpness of the fundus images, and generating a sharpness sequence of the fundus images from the calculated sharpness;
    confirming target sharpness values in the generated sharpness sequence, and acquiring the refractive information of the fundus images corresponding to the target sharpness values, wherein there are multiple target sharpness values;
    generating a refractive matrix from the acquired refractive information, and generating a refractive pattern from the refractive matrix.
  2. The refractive pattern generation method according to claim 1, wherein before the step of capturing fundus images of the human eye currently under test and acquiring the refractive information corresponding to the fundus images, the method further comprises:
    acquiring a preset relative position of a focusing optical module;
    setting, according to the acquired relative position, the refractive scan range for the fundus images currently being captured.
  3. The refractive pattern generation method according to claim 1, wherein the step of calculating the sharpness of the fundus images and generating the sharpness sequence of the fundus images from the calculated sharpness comprises:
    confirming target position points of the fundus images at which sharpness is to be calculated;
    calculating the sharpness of the fundus images from the confirmed target position points.
  4. The refractive pattern generation method according to claim 3, wherein the step of calculating the sharpness of the fundus images from the confirmed target position points comprises:
    delimiting a preset neighborhood range centered on each target position point;
    calculating the sharpness of the fundus images within the preset neighborhood range.
  5. The refractive pattern generation method according to claim 1, wherein the step of confirming the target sharpness in the generated sharpness sequence and acquiring the refractive compensation value of the fundus image corresponding to the target sharpness comprises:
    identifying outliers in the sharpness sequence;
    removing the outliers from the sharpness sequence, and confirming the target sharpness from the sharpness sequence with the outliers removed.
  6. The refractive pattern generation method according to claim 5, wherein the step of identifying outliers in the sharpness sequence comprises:
    comparing the sharpness values of the sharpness sequence with a preset sharpness threshold;
    if a sharpness value is less than or equal to the preset sharpness threshold, confirming that the sharpness value is an outlier.
  7. The refractive pattern generation method according to claim 6, wherein after the step of comparing the sharpness values of the sharpness sequence with the preset sharpness threshold, the method further comprises:
    if a sharpness value exceeds the preset sharpness threshold, confirming that the sharpness value is a non-outlier.
  8. The refractive pattern generation method according to claim 1, wherein the step of confirming the target sharpness in the generated sharpness sequence and acquiring the refractive compensation value of the fundus image corresponding to the target sharpness comprises:
    fitting the sharpness sequence, and confirming the target sharpness from the fitting result.
  9. The refractive pattern generation method according to claim 8, wherein the target sharpness is defined as the maximum sharpness.
  10. The refractive pattern generation method according to claim 1, wherein before the step of capturing fundus images of the human eye currently under test and acquiring the refractive compensation values corresponding to the fundus images, wherein there are multiple fundus images, the method further comprises:
    acquiring the refractive value of the human eye under test;
    setting the capture threshold range of the fundus images based on the refractive value.
  11. The refractive pattern generation method according to claim 1, wherein the step of generating a refractive matrix from the acquired refractive information and generating a refractive pattern from the refractive matrix comprises:
    writing the refractive information into a preset matrix in a preset format, and generating the refractive matrix from the preset matrix into which the refractive information has been written.
  12. The refractive pattern generation method according to claim 1, wherein the step of generating a refractive matrix from the acquired refractive information and generating a refractive pattern from the refractive matrix comprises:
    confirming a refractive pattern type, and determining a generation format according to the refractive pattern type;
    generating the corresponding refractive pattern from the refractive matrix using the generation format.
  13. The refractive pattern generation method according to claim 12, wherein the refractive pattern types include a point map, a block map, a three-dimensional map, a statistical map, a simulated vision map, a naked-eye defocus curve and simulated defocus curves with lenses worn.
  14. A refractive pattern generation device, wherein the refractive pattern generation device comprises: a memory, a processor, and a refractive pattern generation program stored in the memory and executable on the processor, the refractive pattern generation program, when executed by the processor, implementing the steps of the refractive pattern generation method according to any one of claims 1 to 13.
  15. A computer-readable storage medium, wherein the computer-readable storage medium comprises a refractive pattern generation program stored on the computer-readable storage medium, and the refractive pattern generation program, when executed, implements the steps of the refractive pattern generation method according to any one of claims 1 to 13.
PCT/CN2020/083552 2020-03-16 2020-04-07 Refractive pattern generation method, device and computer-readable storage medium WO2021184452A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN202080069140.8A CN114980800B (zh) 2020-03-16 2020-04-07 屈光图形生成方法、装置及计算机可读存储介质
CA3154850A CA3154850A1 (en) 2020-03-16 2020-04-07 Method and device for generating refractive pattern, and computer-readable storage medium
EP20925559.5A EP4023141A4 (en) 2020-03-16 2020-04-07 METHOD AND DEVICE FOR GENERATING A REFRACTIVE PATTERN AND COMPUTER-READABLE STORAGE MEDIUM
US17/717,289 US20220233070A1 (en) 2020-03-16 2022-04-11 Method and device for generating refractive pattern, and computer-readable storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010185058.3A CN111358421B (zh) 2020-03-16 2020-03-16 屈光图形生成方法、装置及计算机可读存储介质
CN202010185058.3 2020-03-16

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/717,289 Continuation US20220233070A1 (en) 2020-03-16 2022-04-11 Method and device for generating refractive pattern, and computer-readable storage medium

Publications (1)

Publication Number Publication Date
WO2021184452A1 true WO2021184452A1 (zh) 2021-09-23

Family

ID=71199918

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/083552 WO2021184452A1 (zh) 2020-03-16 2020-04-07 屈光图形生成方法、装置及计算机可读存储介质

Country Status (5)

Country Link
US (1) US20220233070A1 (zh)
EP (1) EP4023141A4 (zh)
CN (2) CN111358421B (zh)
CA (1) CA3154850A1 (zh)
WO (1) WO2021184452A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114202499B (zh) * 2021-06-22 2022-09-09 深圳盛达同泽科技有限公司 屈光信息测量方法、装置及计算机可读存储介质
CN116058786B (zh) * 2023-03-06 2023-07-07 北京鹰瞳科技发展股份有限公司 眼底屈光地形图的确定方法、装置、电子设备及存储介质

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080079899A1 (en) * 2006-09-29 2008-04-03 Nidek Co., Ltd. Ophthalmic apparatus
WO2014169148A1 (en) * 2013-04-10 2014-10-16 Eyenetra, Inc. Methods and apparatus for refractive condition assessment
JP2015226729A (ja) * 2014-06-03 2015-12-17 株式会社トーメーコーポレーション 眼屈折力測定装置
CN105496351A (zh) * 2015-12-30 2016-04-20 深圳市莫廷影像技术有限公司 一种双目验光装置及验光方法
CN109512380A (zh) * 2018-11-02 2019-03-26 爱尔眼科医院集团股份有限公司 基于波前传感技术制作全视网膜屈光地形图的方法
CN110022756A (zh) * 2016-09-29 2019-07-16 伟伦公司 离焦视网膜图像捕获
CN110461213A (zh) * 2016-12-21 2019-11-15 奥克塞拉有限公司 基于家庭眼科应用的小型移动低成本光学相干断层扫描系统
CN110575134A (zh) * 2019-07-25 2019-12-17 北京爱诺斯科技有限公司 一种基于偏心摄影的近视度数的计算方法

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5740815A (en) * 1995-06-07 1998-04-21 Alpins; Noel A. Method for surgically achieving minimum astigmatism measured refractively and topographically
DE1060703T1 (de) * 1999-06-16 2001-09-06 20 10 Perfect Vision Optische Vorrichtung und Verfahren zur Vorkompensierung von Brechungseigenschaften des menschlichen Auges mit adaptativer optischer Steuerung der Rückkopplung
AU2001260197A1 (en) * 2000-04-11 2001-10-23 Thomas Bende Selective corneal aberrometry
JP4458937B2 (ja) * 2004-06-02 2010-04-28 株式会社ニデック 眼屈折力測定装置
CN101203795B (zh) * 2005-05-05 2015-09-30 卡尔蔡司视觉澳大利亚控股有限公司 渐进多焦点眼科镜片元件阵列
JP5073377B2 (ja) * 2007-06-22 2012-11-14 株式会社ニデック 眼科測定装置
CN107249431A (zh) * 2014-06-30 2017-10-13 上海国通视光医疗科技发展有限公司 使用利用视网膜照相机测量的屈光参数的视觉问题诊断
JP6496019B2 (ja) * 2015-05-26 2019-04-03 ホヤ レンズ タイランド リミテッドHOYA Lens Thailand Ltd 累進屈折力レンズの設計方法及び設計装置
US9734396B2 (en) * 2015-09-15 2017-08-15 G&A Technical Software, Inc. Method of refraction angle mapping of earth'S limb
CN108567405A (zh) * 2017-04-28 2018-09-25 分界线(天津)网络技术有限公司 一种近视屈光度的测量系统及方法
CN109645956B (zh) * 2018-12-25 2021-08-06 重庆远视科技有限公司 眼睛屈光度测量装置

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080079899A1 (en) * 2006-09-29 2008-04-03 Nidek Co., Ltd. Ophthalmic apparatus
WO2014169148A1 (en) * 2013-04-10 2014-10-16 Eyenetra, Inc. Methods and apparatus for refractive condition assessment
JP2015226729A (ja) * 2014-06-03 2015-12-17 株式会社トーメーコーポレーション 眼屈折力測定装置
CN105496351A (zh) * 2015-12-30 2016-04-20 深圳市莫廷影像技术有限公司 一种双目验光装置及验光方法
CN110022756A (zh) * 2016-09-29 2019-07-16 伟伦公司 离焦视网膜图像捕获
CN110461213A (zh) * 2016-12-21 2019-11-15 奥克塞拉有限公司 基于家庭眼科应用的小型移动低成本光学相干断层扫描系统
CN109512380A (zh) * 2018-11-02 2019-03-26 爱尔眼科医院集团股份有限公司 基于波前传感技术制作全视网膜屈光地形图的方法
CN110575134A (zh) * 2019-07-25 2019-12-17 北京爱诺斯科技有限公司 一种基于偏心摄影的近视度数的计算方法

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4023141A4 *

Also Published As

Publication number Publication date
CN111358421B (zh) 2021-02-09
CA3154850A1 (en) 2021-09-23
EP4023141A4 (en) 2023-12-27
CN111358421A (zh) 2020-07-03
CN114980800B (zh) 2023-04-25
US20220233070A1 (en) 2022-07-28
CN114980800A (zh) 2022-08-30
EP4023141A1 (en) 2022-07-06

Similar Documents

Publication Publication Date Title
JP6612959B2 (ja) 主観的距離計測に基づく眼の屈折異常の測定のためのシステムおよび方法
JP6894925B2 (ja) 固定具なしのレンズメータ及びその操作方法
US20220233070A1 (en) Method and device for generating refractive pattern, and computer-readable storage medium
US9195078B2 (en) Method for determining, in a natural posture, at least one geometric/physiognomic parameter associated with the mounting of an ophthalmic lens in a spectacle frame
CN112399817A (zh) 测量眼睛屈光
US11181978B2 (en) System and method for gaze estimation
CN113227747B (zh) 无固定装置检镜仪系统
JP2008264341A (ja) 眼球運動計測方法および眼球運動計測装置
CN113240592A (zh) 基于ar-hud动态眼位下计算虚像平面的畸变矫正方法
WO2020019286A1 (zh) 眼睑下垂检测方法及系统
US11585724B2 (en) Fixtureless lensmeter system
CN111699432B (zh) 使用沉浸式系统确定眼睛的屈光力的方法及其电子设备
CN110119258A (zh) 一种显示屏与光学系统位置测试与调整方法和系统
JP3711053B2 (ja) 視線測定装置及びその方法と、視線測定プログラム及びそのプログラムを記録した記録媒体
WO2014061294A1 (ja) 瞳孔位置測定方法及び両用レンズの作製方法
CN113587895A (zh) 双目测距方法及装置
CN114202499B (zh) 屈光信息测量方法、装置及计算机可读存储介质
CN116974369B (zh) 术中医学影像操作方法、系统、设备及存储介质
CN117422758A (zh) 虹膜直径的获取方法、装置、电子设备及存储介质
CN116091354A (zh) 一种图像畸变校准方法、产品、介质和设备
CN117045191A (zh) 基于vr的自动验光配镜方法、装置、智能眼镜及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20925559

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020925559

Country of ref document: EP

Effective date: 20220330

ENP Entry into the national phase

Ref document number: 3154850

Country of ref document: CA

NENP Non-entry into the national phase

Ref country code: DE