CN111658145A - ICL implantation surgical robot system - Google Patents


Info

Publication number
CN111658145A
CN111658145A · Application CN202010549579.2A
Authority
CN
China
Prior art keywords: time, eye, data, real-time, three-dimensional model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010549579.2A
Other languages
Chinese (zh)
Other versions
CN111658145B (en)
Inventor
王开杰
万修华
王进达
张景尚
李猛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Tongren Hospital
Original Assignee
Beijing Tongren Hospital
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Tongren Hospital
Priority to CN202010549579.2A
Publication of CN111658145A
Application granted
Publication of CN111658145B
Legal status: Active
Anticipated expiration

Classifications

    All classifications fall under A61 (Medical or veterinary science; hygiene):
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B34/20 Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B34/30 Surgical robots
    • A61F2/1662 Instruments for inserting intraocular lenses into the eye
    • A61F9/007 Methods or devices for eye surgery
    • A61B2034/101 Computer-aided simulation of surgical operations
    • A61B2034/105 Modelling of the patient, e.g. for ligaments or bones
    • A61B2034/107 Visualisation of planned trajectories or target regions
    • A61B2034/108 Computer-aided selection or customisation of medical implants or cutting guides
    • A61B2034/2055 Optical tracking systems
    • A61B2034/2063 Acoustic tracking systems, e.g. using ultrasound
    • A61B2034/2065 Tracking using image or pattern recognition

Abstract

The invention provides a surgical robot system for phakic posterior chamber intraocular lens (ICL) implantation that can be used together with existing ophthalmic surgical robots. The system judges the state of the operation by monitoring intraoperative eye images in real time and comparing them with a model preset before the operation. It does not control the operation directly; instead it monitors and evaluates the surgical process at successive moments, accurately detecting the implantation path, the implantation position, and the unfolding direction and angle of the intraocular lens, and provides a real-time reference for the chief surgeon so that deviations during the operation can be found and corrected in time, ensuring the success of the implantation and reducing damage to surrounding tissue. The system evaluates the surgical state in a timely and accurate manner with highly reliable results, and is well suited for wide application in intraocular lens implantation surgery.

Description

ICL implantation surgical robot system
Technical Field
The invention belongs to the technical field of surgical robots, and particularly relates to an ICL implantation surgical robot system.
Background
Phakic posterior chamber intraocular lens implantation is considered a recent alternative to LASIK, PRK and other ablative procedures for refractive correction. Phakic posterior chamber intraocular lenses mainly include implantable contact lenses (ICLs) and phakic refractive lenses. ICL implantation is currently the mainstream phakic intraocular lens procedure both in China and abroad. With the cornea left intact, an intraocular lens with refractive power is implanted into the ciliary sulcus of the posterior chamber to correct high myopia, hyperopia and astigmatism. It can effectively improve the patient's postoperative vision and visual quality, and offers patients with high and ultra-high myopia a new way to do without glasses. ICL implantation is a high-precision intraocular refractive procedure with few complications and a wide range of indications, but the fineness of the operation directly affects how quickly the patient's postoperative vision recovers. There is therefore an urgent need for an ICL implantation surgical robot system for microscopic ophthalmology that assists the surgeon with surgical path tracking, trajectory prediction, online ICL calibration and distance feedback, with the aim of going beyond the physiological limits of manual operation and perception, reducing the difficulty of the operation, reducing iatrogenic injury, and improving surgical accuracy and safety.
Disclosure of Invention
In order to solve the above technical problems, the invention provides an ICL implantation surgical robot system.
The specific technical scheme of the invention is as follows:
the invention provides an ICL implantation surgery robot system, which comprises at least one processor and a memory, wherein the memory stores instructions, and when the instructions are executed by the at least one processor, the following method is implemented:
respectively collecting eye original data and eye real-time data before and during an operation;
respectively constructing a preoperative eye three-dimensional model and an intraoperative eye three-dimensional model according to the original data and the real-time data;
respectively extracting three-dimensional data from the eye three-dimensional models before and during the operation, and comparing the three-dimensional data to obtain a real-time operation path;
and comparing the real-time surgical path with an expected surgical path, and judging and monitoring the real-time state of the surgery.
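The four steps above can be sketched as a minimal monitoring loop. This is purely illustrative: the function and class names, the landmark-point data layout, and the tolerance value are all assumptions for exposition and are not specified by the disclosure.

```python
from dataclasses import dataclass

# Illustrative sketch of the four-step monitoring method. All names and
# data shapes are assumptions; the patent does not specify an implementation.

@dataclass
class EyeModel:
    points: list  # (x, y, z) landmark coordinates of the eye model

def build_model(raw_points):
    """Step 2: construct a 3-D eye model from collected point data."""
    return EyeModel(points=list(raw_points))

def extract_path(pre: EyeModel, live: EyeModel):
    """Step 3: compare pre- and intra-operative models; the displacement
    of each landmark approximates the real-time surgical path."""
    return [tuple(l - p for l, p in zip(lp, pp))
            for lp, pp in zip(live.points, pre.points)]

def judge_state(real_path, expected_path, tol=0.5):
    """Step 4: compare the real-time path with the expected path and
    report whether every displacement is within tolerance."""
    return all(
        all(abs(r - e) <= tol for r, e in zip(rv, ev))
        for rv, ev in zip(real_path, expected_path)
    )

# Toy data: one landmark moved 1.0 mm along x during the operation,
# against an expected displacement of 1.2 mm.
pre_op = build_model([(0.0, 0.0, 0.0)])
intra  = build_model([(1.0, 0.0, 0.0)])
path   = extract_path(pre_op, intra)
ok     = judge_state(path, [(1.2, 0.0, 0.0)], tol=0.5)
```

The real system would build the models from OCT/ultrasound data rather than bare landmark lists; the skeleton only shows how the four steps chain together.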
Further, when the processor executes the instructions to construct the three-dimensional model of the eye, the following method is implemented:
obtaining the eye original data before operation, converting the eye original data into data which can be used for constructing a three-dimensional model, and constructing the eye original three-dimensional model; in the intraocular lens implantation process, acquiring the eye real-time data at regular time, converting the eye real-time data into data capable of being used for building a three-dimensional model, and building eye real-time three-dimensional models at different moments;
constructing an artificial lens three-dimensional model which is in equal proportion to the eye original three-dimensional model according to preset parameter information of the artificial lens;
and splicing the artificial lens three-dimensional model with the original three-dimensional model according to a preset implantation path, and generating a GIF animation model of the complete splicing process.
Further, when the processor executes the instructions to obtain the surgical path, the following method is implemented:
converting the extracted three-dimensional data into data which can be used for generating a real-time surgical path;
theoretical parameter information is obtained from the eye original three-dimensional model and real-time parameter information from the eye real-time three-dimensional model; after comparison, a real-time implantation path is generated for each data-acquisition moment, and the relative position of the artificial lens and the surrounding tissue is judged so as to determine the accuracy of the implantation.
Further, when the processor executes the instructions to collect the eye raw data and the eye real-time data, the following method is implemented:
acquiring original ocular optical data of a plurality of angles for an eye before an operation, and acquiring real-time ocular optical data from the plurality of angles at regular time in an intraocular lens implantation process;
emitting ultrasonic waves with the frequency of 40-100 MHz to the eyes before operation, and collecting original acoustic data at a plurality of positions of the eyes;
at this time, the eye original data includes the original eye optical data and the original acoustic data, and the eye real-time data is the real-time eye optical data.
Further, when the processor executes the instructions to construct the three-dimensional model of the eye, the following method is implemented:
acquiring slice data through the original eye optical data, and acquiring depth data through the original acoustic data, thereby constructing the original three-dimensional model of the eye;
and in the intraocular lens implantation process, slice data is obtained through the real-time eye optical data at regular time, and accordingly, real-time three-dimensional models of the eyes at different moments are constructed.
Further, when the processor executes the three-dimensional eye model built by the instructions to build a GIF model, the following method is implemented:
and splicing the artificial lens three-dimensional model with the eye original three-dimensional model, selecting a plurality of time points along the preset implantation path, extracting the coordinate information at each time point, constructing simulated three-dimensional models for those time points, and fitting and filling the gaps between them to generate a complete GIF ultrasound model.
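"Fitting and filling the interrupted part" between the selected time points could, for example, be done by interpolating lens coordinates between keyframes. The sketch below assumes simple linear interpolation and an invented keyframe format; the patent does not name a fitting method.

```python
# Hypothetical sketch: the lens pose is known only at a few selected
# time points (keyframes), and the intermediate frames of the GIF model
# are filled in by linear interpolation between consecutive keyframes.

def lerp(a, b, t):
    """Linear interpolation between two coordinate tuples at 0 <= t < 1."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def fill_frames(keyframes, steps_between):
    """Expand sparse keyframe coordinates into a dense frame sequence."""
    frames = []
    for a, b in zip(keyframes, keyframes[1:]):
        for i in range(steps_between):
            frames.append(lerp(a, b, i / steps_between))
    frames.append(keyframes[-1])
    return frames

# Two keyframes of a lens landmark, with 4 steps between them.
dense = fill_frames([(0.0, 0.0, 0.0), (2.0, 0.0, 1.0)], steps_between=4)
```

A real implementation would likely interpolate full model poses (rotation as well as translation), but the gap-filling idea is the same.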
Further, when the processor executes the instructions to obtain a real-time implantation path, the following method is implemented:
theoretical parameter information is obtained from the eye original three-dimensional model, real-time parameter information is obtained from the eye real-time three-dimensional model, a real-time implantation path at each data acquisition time is generated after comparison, and the relative position of the intraocular lens and surrounding tissues is judged.
Further, when the processor executes the instructions to compare the real-time surgical path to an expected surgical path, the following method is implemented:
and comparing the real-time implantation path with the preset implantation path, recording the deviation condition of each data acquisition moment, and calculating a deviation value according to the relative position of the intraocular lens and the surrounding tissues.
Further, when the processor executes the instructions to judge and monitor the real-time state of the operation, the following method is implemented:
and comparing the calculated deviation value with a preset allowable deviation range, and sending a reminder to the chief surgeon according to the comparison result.
Further, when the processor executes the instructions to compare the calculated deviation value with the preset allowable deviation range, the following method is implemented:
setting an allowable deviation range for each of a plurality of links of the preset implantation path and assigning a judgment weight to each link, wherein each allowable deviation range comprises at least one deviation direction and a deviation distance in that direction; and comparing the deviation values of the links up to and including the current link with their corresponding allowable deviation ranges, extracting all links whose deviation values exceed the allowable range, calculating the total deviation according to the weights, and judging the surgical state of the current link from the total deviation.
The invention has the following beneficial effects: the invention provides an ICL implantation surgical robot system which can be used in cooperation with the existing ophthalmic surgical robot, can judge the state of operation by monitoring an eye image in the operation process in real time and comparing the eye image with a model preset before the operation, does not directly control the operation, but monitors and judges the operation process in real time at different moments, accurately detects and judges an implantation path, an implantation position and an unfolding direction and angle of an artificial crystal, provides real-time reference for a doctor, so that the deviation in the operation process can be found and corrected in time, the success of the implantation operation is ensured, and the damage of surrounding tissues is reduced; meanwhile, the current operation information can be stored in the system and used as historical data to provide reference for the subsequent operation. The system has the advantages of strong timeliness and accuracy in operation state evaluation, high result reliability and suitability for popularization and application in the intraocular lens implantation operation.
Drawings
FIG. 1 is a flowchart illustrating operation of an ICL surgical robotic implantation system according to an exemplary embodiment;
FIG. 2 is a flowchart of a method for constructing a three-dimensional model by the ICL surgical robot according to an exemplary embodiment.
Detailed Description
The present invention will be described in further detail with reference to the following examples and drawings.
Examples
As shown in fig. 1, the present embodiment provides an ICL implantation surgical robot system, which includes at least one processor 1 and a memory 2, where the memory 2 stores instructions, and when the instructions are executed by the at least one processor 1, the following method is implemented:
respectively collecting eye original data and eye real-time data before and during an operation;
respectively constructing a preoperative eye three-dimensional model and an intraoperative eye three-dimensional model according to the original data and the real-time data;
respectively extracting three-dimensional data from the eye three-dimensional models before and during the operation, and comparing the three-dimensional data to obtain a real-time operation path;
and comparing the real-time surgical path with an expected surgical path, and judging and monitoring the real-time state of the surgery.
The robot system can be used together with existing ophthalmic surgical robots. It judges the state of the operation by monitoring the intraoperative eye image in real time and comparing it with a model preset before the operation; it does not control the operation directly, but monitors the surgical process in real time and provides a real-time reference for the chief surgeon, so that deviations during the operation can be found and corrected in time. The information from the current operation can also be stored in the system as historical data to provide a reference for subsequent operations.
In some particular embodiments, when the processor 1 executes the instructions to construct the three-dimensional model of the eye, the following method is implemented:
obtaining eye original data before an operation, converting the eye original data into data which can be used for constructing a three-dimensional model, and constructing the eye original three-dimensional model; in the intraocular lens implantation process, regularly acquiring eye real-time data, converting the eye real-time data into data capable of being used for building a three-dimensional model, and building eye real-time three-dimensional models at different moments;
constructing an artificial lens three-dimensional model which is in equal proportion to the original eye three-dimensional model according to preset parameter information of the artificial lens;
and splicing the artificial lens three-dimensional model and the original three-dimensional model according to a preset implantation path, and generating the GIF model by the complete splicing process.
The GIF model completely simulates the process of implanting the intraocular lens three-dimensional model into the original eye three-dimensional model along the predetermined implantation path, and thus fully exhibits the ideal state of the implantation surgery. By acquiring images at multiple moments during the operation, constructing real-time three-dimensional models, and determining the real-time surgical state at the different sampling moments, each moment of the actual procedure can be judged separately against the GIF model as a reference, given a known eye structure and known parameters. This ensures the timeliness and accuracy of the surgical-state evaluation.
In some specific embodiments, when the processor 1 executes the instructions to obtain the surgical path, the following method is implemented:
converting the extracted three-dimensional data into data which can be used for generating a real-time surgical path;
theoretical parameter information is obtained from the original three-dimensional model of the eye and real-time parameter information from the real-time three-dimensional model of the eye; after comparison, a real-time implantation path is generated for each data-acquisition moment, and the relative position of the artificial lens and the surrounding tissue is judged so as to determine the accuracy of the implantation.
In some specific embodiments, when the processor 1 executes the instructions to collect the eye raw data and the eye real-time data, the following method is implemented:
acquiring original eye optical data of a plurality of angles for an eye before an operation, and acquiring real-time eye optical data of the plurality of angles at regular time in the process of implanting the artificial lens;
emitting ultrasonic waves with the frequency of 40-100 MHz to the eyes before operation, and collecting original acoustic data at a plurality of positions of the eyes;
at this time, the eye original data includes original eye optical data and original acoustic data, and the eye real-time data is real-time eye optical data.
The optical data can be acquired by optical coherence tomography (OCT). OCT is a low-loss, high-resolution, non-invasive medical imaging technique based on the principle of weak coherent light: it detects the signals back-reflected or scattered by different depth layers of biological tissue from an incident weakly coherent beam, yielding micron-scale cross-sectional fundus images or three-dimensional images. Its drawback is insufficient penetration depth, so it is used together with ultrasound, which penetrates well but images at lower resolution, to acquire eye data more accurately.
The acoustic data can be obtained by scanning with an ultrasound biomicroscope (UBM). UBM images the reflection and scattering of ultrasonic waves by the tissue and can clearly display the different tissue structures of the eye in vivo. The technique is non-invasive, accurate, repeatable, dynamic, and simple to perform; its resolution is high, comparable to that of an optical microscope, and it is not affected by corneal opacity. However, the examination requires an eye cup and coupling agent to be applied, which cannot be done during the operation, so UBM can only be used for pre- and post-operative examinations.
As shown in fig. 2, on the basis of simultaneously adopting an optical method and an acoustic method to acquire eye data, when the processor 1 executes instructions to construct a three-dimensional model of an eye, the following methods are implemented:
acquiring slice data through original eye optical data, acquiring depth data through original acoustic data, and constructing an original three-dimensional model of the eye according to the depth data;
in the intraocular lens implantation process, slice data are acquired regularly through real-time eye optical data, and accordingly real-time three-dimensional models of eyes at different moments are built.
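Combining optical slice data with acoustic depth data to build a volume can be illustrated as follows. The slice/depth data layout is invented for exposition; real OCT and UBM data would of course be dense images with calibrated geometry.

```python
# Assumed illustration of fusing OCT slice data with acoustic depth data:
# each optical slice is a 2-D grid, and the acoustically measured depth
# value places the slice along the depth axis to form a simple volume.

def stack_slices(slices, depths):
    """Order 2-D slices by their acoustically measured depth (mm) and
    stack them into a list-of-grids volume, shallowest first."""
    ordered = sorted(zip(depths, slices), key=lambda pair: pair[0])
    return [grid for _, grid in ordered]

# Three 2x2 slices captured out of order, with UBM-style depth readings.
slices = [[[2, 2], [2, 2]], [[0, 0], [0, 0]], [[1, 1], [1, 1]]]
depths = [2.4, 0.3, 1.1]
volume = stack_slices(slices, depths)
```

The depth readings supply exactly what a pure slice stack lacks: the position of each slice along the axis the optical scan cannot penetrate.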
In some particular embodiments, when the processor 1 executes instructions to construct the GIF model using the three-dimensional model of the eye, the following method is implemented:
the artificial crystal three-dimensional model and the eye original three-dimensional model are spliced together, a plurality of time points is selected along the preset implantation path, the coordinate information at each time point is extracted, simulated three-dimensional models are constructed for those time points, and the gaps between them are fitted and filled to generate a complete GIF ultrasound model.
In some specific embodiments, when the processor 1 executes the instructions to obtain the real-time implantation path, the following method is implemented:
theoretical parameter information is obtained from the original three-dimensional model of the eye, real-time parameter information is obtained from the real-time three-dimensional model of the eye, a real-time implantation path at each data acquisition time is generated after comparison, and the relative position of the artificial crystal and the surrounding tissues is judged.
In some particular embodiments, when the processor 1 executes instructions to compare the real-time surgical path with the expected surgical path, the following method is implemented:
and comparing the real-time implantation path with a preset implantation path, recording the deviation condition of each data acquisition time, and calculating a deviation value according to the relative position of the intraocular lens and the surrounding tissues.
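One simple way to turn the relative position of the lens and surrounding tissue into a deviation value is the Euclidean distance between the lens landmark's real-time position and its position on the preset path at the same acquisition moment. This scalar-distance definition is an assumption; the patent does not fix a formula.

```python
import math

# Sketch: deviation value at one acquisition moment, defined here as
# the Euclidean distance (mm) between the real-time lens position and
# the corresponding point on the preset implantation path.

def deviation(real_pos, preset_pos):
    """Euclidean deviation between real-time and preset positions."""
    return math.dist(real_pos, preset_pos)

# The lens sits 0.3 mm off the preset path along one axis.
d = deviation((1.0, 0.5, 2.0), (1.0, 0.2, 2.0))
```

In practice the deviation would likely be resolved into the "deviation directions" mentioned later, but a distance per moment suffices to drive the range comparison.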
In some specific embodiments, when the processor 1 executes the instructions to judge and monitor the real-time state of the operation, the following method is implemented:
and comparing the calculated deviation value with a preset allowable deviation range, and sending a reminder to the chief surgeon according to the comparison result.
The reminder can be issued as an audible alarm or as a flashing icon on the monitoring screen; the specific comparison results can also be displayed on the monitoring screen for the surgeon to observe and judge.
In some specific embodiments, when the processor 1 executes the instructions to compare the calculated deviation value with the preset allowable deviation range, the following method is implemented:
setting an allowable deviation range for each of a plurality of links of the preset implantation path and assigning a judgment weight to each link, wherein each allowable deviation range comprises at least one deviation direction and a deviation distance in that direction; and comparing the deviation values of the links up to and including the current link with their corresponding allowable deviation ranges, extracting all links whose deviation values exceed the allowable range, calculating the total deviation according to the weights, and judging the surgical state of the current link from the total deviation.
The allowable deviation range represents the degree of deviation that is acceptable during intraocular lens implantation. Since the procedure cannot follow the preset implantation path 100% exactly, the comparison template must be widened appropriately while still guaranteeing safety and effect. Small deviations that remain within the widened template do not materially affect the outcome of the operation and need not be counted when accumulating deviations. Deviations that exceed the allowable range affect the surgical outcome differently depending on where they occur (the physical position in the eye tissue, and the temporal and logical position within the implantation path); this is reflected by the preset weights. The total deviation of the operation at the current moment (current link) can therefore be calculated from the weights and the specific deviation values of all links up to and including the current one, from which the surgical state at the current moment is judged.
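The weighted judgment just described can be sketched as follows. The numbers, the linear weighting rule, and the alert threshold are illustrative assumptions; only the structure (per-link range, per-link weight, weighted sum over out-of-range links) comes from the text above.

```python
# Sketch of the weighted link-by-link judgment: each link of the
# implantation path has an allowable deviation and a weight; only links
# whose deviation exceeds their range contribute, and the weighted sum
# of the excesses gives the total deviation at the current link.

def total_deviation(links):
    """links: list of (deviation, allowed, weight) per completed link.
    Returns the weighted sum of out-of-range excesses."""
    return sum(w * (d - allowed)
               for d, allowed, w in links
               if d > allowed)

def judge(links, threshold):
    """Surgical state for the current link: 'ok' or 'alert'."""
    return "ok" if total_deviation(links) <= threshold else "alert"

# Three links so far: only the second exceeds its range (by 0.2 mm),
# and its weight of 2.0 marks it as a more critical stage.
history = [(0.1, 0.3, 1.0), (0.5, 0.3, 2.0), (0.2, 0.3, 1.5)]
state = judge(history, threshold=0.5)
```

Note how the in-range links (0.1 and 0.2 against an allowance of 0.3) contribute nothing, exactly as the text prescribes for small deviations inside the widened template.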
The above-mentioned embodiments only express several implementations of the present invention, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the present invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the inventive concept, and these fall within the scope of the present invention. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. An ICL implant surgical robotic system comprising at least one processor and a memory, said memory storing instructions that when executed by the at least one processor implement a method comprising:
respectively collecting eye original data and eye real-time data before and during an operation;
respectively constructing a preoperative eye three-dimensional model and an intraoperative eye three-dimensional model according to the original data and the real-time data;
respectively extracting three-dimensional data from the eye three-dimensional models before and during the operation, and comparing the three-dimensional data to obtain a real-time operation path;
and comparing the real-time surgical path with an expected surgical path, and judging and monitoring the real-time state of the surgery.
2. An ICL implant surgical robotic system as claimed in claim 1, wherein when said processor executes said instructions to construct said three-dimensional model of the eye, the following method is implemented:
obtaining the eye original data before operation, converting the eye original data into data which can be used for constructing a three-dimensional model, and constructing the eye original three-dimensional model; in the intraocular lens implantation process, acquiring the eye real-time data at regular time, converting the eye real-time data into data capable of being used for building a three-dimensional model, and building eye real-time three-dimensional models at different moments;
constructing an artificial lens three-dimensional model which is in equal proportion to the eye original three-dimensional model according to preset parameter information of the artificial lens;
and splicing the artificial lens three-dimensional model with the original three-dimensional model according to a preset implantation path, and generating a GIF animation model of the complete splicing process.
3. An ICL implant surgical robotic system as claimed in claim 2, wherein when said processor executes said instructions to acquire said surgical path, the following method is implemented:
converting the extracted three-dimensional data into data which can be used for generating a real-time surgical path;
theoretical parameter information is obtained from the eye original three-dimensional model and real-time parameter information from the eye real-time three-dimensional model; after comparison, a real-time implantation path is generated for each data-acquisition moment, and the relative position of the artificial lens and the surrounding tissue is judged so as to determine the accuracy of the implantation.
4. An ICL implant surgical robotic system as claimed in claim 3, wherein when said processor executes said instructions to acquire said eye raw data and said eye real-time data, the following method is implemented:
acquiring original optical eye data from multiple angles before the operation, and acquiring real-time optical eye data from the same angles at regular intervals during the intraocular lens implantation;
emitting ultrasonic waves with a frequency of 40-100 MHz to the eye before the operation, and collecting original acoustic data at multiple positions of the eye;
in this case, the original eye data comprises the original optical eye data and the original acoustic data, and the real-time eye data is the real-time optical eye data.
5. An ICL implant surgical robotic system as claimed in claim 4, wherein when said processor executes said instructions to construct said three-dimensional model of the eye, the following method is implemented:
acquiring slice data from the original optical eye data and depth data from the original acoustic data, and constructing the original three-dimensional model of the eye from them;
and during the intraocular lens implantation, acquiring slice data from the real-time optical eye data at regular intervals and constructing real-time three-dimensional models of the eye at different moments accordingly.
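One simple way to turn slice data plus a depth spacing into a three-dimensional model is to stack the 2-D slice contours along the depth axis. A sketch under that assumption (the patent does not specify the reconstruction method; the contour format is hypothetical):

```python
def slices_to_points(slices, slice_spacing):
    """Stack 2-D slice contours (lists of (x, y) points) into a 3-D point
    cloud, using a fixed spacing along z derived from the depth data."""
    cloud = []
    for k, contour in enumerate(slices):
        z = k * slice_spacing  # depth of the k-th slice
        cloud.extend((x, y, z) for (x, y) in contour)
    return cloud

slices = [[(0.0, 0.0), (1.0, 0.0)], [(0.0, 0.1)]]
cloud = slices_to_points(slices, 0.5)  # 0.5 mm between slices, illustrative
```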
6. An ICL implant surgical robotic system as claimed in claim 5, wherein when said processor executes said instructions to construct a GIF model using said three-dimensional model of the eye, the following method is implemented:
splicing the intraocular lens three-dimensional model and the original three-dimensional model of the eye, selecting multiple time points along the preset implantation path, extracting the coordinate information of each point at those time points, constructing simulated three-dimensional models for the selected time points, and fitting and filling the interrupted parts to generate a complete GIF model.
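The "fitting and filling the interrupted parts" between the selected time points can be sketched as linear interpolation between keyframe point sets, producing the dense frame sequence from which the animated model would be rendered. A minimal sketch; the frame format and interpolation scheme are assumptions, not taken from the patent:

```python
def fill_keyframes(frames, steps_between):
    """Linearly interpolate between successive keyframe point sets so the
    gaps between selected time points are filled with intermediate frames."""
    dense = []
    for a, b in zip(frames, frames[1:]):
        dense.append(a)
        for s in range(1, steps_between + 1):
            t = s / (steps_between + 1)  # interpolation fraction in (0, 1)
            dense.append([tuple(pa + t * (pb - pa) for pa, pb in zip(p, q))
                          for p, q in zip(a, b)])
    dense.append(frames[-1])
    return dense

key = [[(0.0, 0.0, 0.0)], [(2.0, 0.0, 0.0)]]
dense = fill_keyframes(key, 1)  # one interpolated frame between keyframes
```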
7. An ICL implant surgical robotic system as claimed in claim 6, wherein when said processor executes said instructions to acquire a real-time implant path, the following method is implemented:
obtaining theoretical parameter information from the original three-dimensional model of the eye and real-time parameter information from the real-time three-dimensional model of the eye, generating a real-time implantation path for each data-acquisition moment after comparison, and judging the relative position of the intraocular lens and the surrounding tissues.
8. An ICL implant surgical robotic system as claimed in any one of claims 1 to 7, wherein when said processor executes said instructions to compare said real-time surgical path to an expected surgical path, the method is implemented as follows:
comparing the real-time implantation path with the preset implantation path, recording the deviation at each data-acquisition moment, and calculating a deviation value from the relative position of the intraocular lens and the surrounding tissues.
9. An ICL implant surgical robotic system as claimed in claim 8, wherein when said processor executes said instructions to determine and monitor the real-time status of the procedure, the following method is implemented:
comparing the calculated deviation value with a preset allowable deviation range, and sending reminder information to the chief surgeon according to the comparison result.
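The comparison-and-reminder step can be sketched as a simple range check. The numeric range and the message text are invented for illustration; the patent only specifies that a reminder is sent when the deviation leaves the allowable range:

```python
ALLOWED_DEVIATION_MM = (0.0, 0.15)  # illustrative range, not from the patent

def check_deviation(deviation_mm):
    """Compare a computed deviation with the preset allowable range and
    return a reminder message for the chief surgeon when it is exceeded."""
    low, high = ALLOWED_DEVIATION_MM
    if low <= deviation_mm <= high:
        return "OK"
    return f"ALERT: deviation {deviation_mm:.2f} mm outside allowed range"

status = check_deviation(0.30)
```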
10. An ICL implant surgical robotic system as claimed in claim 9, wherein when said processor executes said instructions to compare said calculated deviation value with said preset allowable deviation range, the following method is implemented:
setting an allowable deviation range for each of a plurality of links of the preset implantation path, and setting a judgment weight for each link, wherein each allowable deviation range comprises at least one deviation direction and a deviation distance in that direction; comparing the deviation values of the links up to and including the current link with the corresponding allowable deviation ranges, extracting all links whose deviation values exceed their allowable ranges, calculating a total deviation according to the weights, and judging the operation state of the current link from the total deviation.
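The weighted, per-link judgment in claim 10 can be sketched as follows. The link names, weights, and per-direction allowances are invented for illustration; only the structure (per-link allowance per direction, weighted accumulation of excesses) comes from the claim:

```python
# Illustrative per-link allowances (mm, per deviation direction) and weights.
LINKS = {
    "incision":  {"weight": 0.5, "allowed": {"x": 0.2, "y": 0.2, "z": 0.1}},
    "insertion": {"weight": 0.3, "allowed": {"x": 0.3, "y": 0.3, "z": 0.2}},
    "unfolding": {"weight": 0.2, "allowed": {"x": 0.1, "y": 0.1, "z": 0.1}},
}

def total_deviation(measured):
    """measured: {link: {direction: deviation_mm}}. Accumulate the
    weighted excess of every direction that exceeds its allowance."""
    total = 0.0
    for link, directions in measured.items():
        spec = LINKS[link]
        for direction, value in directions.items():
            excess = value - spec["allowed"][direction]
            if excess > 0:  # only out-of-range directions contribute
                total += spec["weight"] * excess
    return total

# Only x in the incision link exceeds its allowance (by 0.1 mm).
score = total_deviation({"incision": {"x": 0.3, "y": 0.1, "z": 0.05}})
```

The operation state of the current link would then be judged by comparing `score` against a threshold.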
CN202010549579.2A 2020-06-16 2020-06-16 ICL implantation operation robot system Active CN111658145B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010549579.2A CN111658145B (en) 2020-06-16 2020-06-16 ICL implantation operation robot system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010549579.2A CN111658145B (en) 2020-06-16 2020-06-16 ICL implantation operation robot system

Publications (2)

Publication Number Publication Date
CN111658145A true CN111658145A (en) 2020-09-15
CN111658145B CN111658145B (en) 2022-06-21

Family

ID=72387734

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010549579.2A Active CN111658145B (en) 2020-06-16 2020-06-16 ICL implantation operation robot system

Country Status (1)

Country Link
CN (1) CN111658145B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112043383A (en) * 2020-09-30 2020-12-08 复旦大学附属眼耳鼻喉科医院 Ophthalmic surgery navigation system and electronic equipment
CN112259192A (en) * 2020-10-22 2021-01-22 华志微创医疗科技(北京)有限公司 Surgical operation system and control method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105592829A (en) * 2013-07-29 2016-05-18 拜尔普泰戈恩公司 Procedural optical coherence tomography (OCT) for surgery and related systems and methods
CN110559087A (en) * 2019-09-02 2019-12-13 清华大学深圳研究生院 Safety monitoring system for corneal surgery
CN111128362A (en) * 2020-01-22 2020-05-08 复旦大学附属华山医院 Intelligent control system for ophthalmic surgery

Also Published As

Publication number Publication date
CN111658145B (en) 2022-06-21

Similar Documents

Publication Publication Date Title
US20210137738A1 (en) Laser methods and systems for addressing conditions of the lens
US11311418B2 (en) Apparatus for individual therapy planning and positionally accurate modification of an optical element
US20230043966A1 (en) System and apparatus for treating the lens of an eye
CN105451638B (en) For the integrated OCT dioptrics meter systems of eye biometrics
RU2500374C2 (en) System for performing ophthalmologic refractive operation
US8452372B2 (en) System for laser coagulation of the retina from a remote location
AU2020289827B2 (en) Laser methods and systems for the aligned insertion of devices into a structure of the eye
CN111658145B (en) ICL implantation operation robot system
AU2021200326B2 (en) Laser methods and systems for addressing conditions of the lens
US9037217B1 (en) Laser coagulation of an eye structure or a body surface from a remote location
US20230329909A1 (en) Systems and methods for determining the characteristics of structures of the eye including shape and positions
CN110559087B (en) Safety monitoring system for corneal surgery
US7347554B2 (en) Determining criteria for phakic intraocular lens implant procedures
Steele et al. Effects of different ocular fixation conditions on A‐scan ultrasound biometry measurements
Lazzaro et al. High frequency ultrasound evaluation of radial keratotomy incisions
CN111166530A (en) Method for predicting postoperative position of artificial lens
CN216365091U (en) Speculum
US20240008811A1 (en) Using artificial intelligence to detect and monitor glaucoma
CN111554377A (en) System and method for monitoring anterior segment of eye in real time medical image
Agarwal et al. Diagnostic and Imaging Techniques in Ophthalmology
JPH0351166B2 (en)
CA2270273A1 (en) Methods of ocular biometry

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant