CN112053600B - Orbit endoscope navigation surgery training method, device, equipment and system - Google Patents

Orbit endoscope navigation surgery training method, device, equipment and system

Info

Publication number
CN112053600B
Authority
CN
China
Prior art keywords
path
video
orbital
normal
movement data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010895178.2A
Other languages
Chinese (zh)
Other versions
CN112053600A (en)
Inventor
宋雪霏
李寅炜
李伦昊
邓远
李政康
范先群
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ninth Peoples Hospital Shanghai Jiaotong University School of Medicine
Original Assignee
Ninth Peoples Hospital Shanghai Jiaotong University School of Medicine
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ninth Peoples Hospital Shanghai Jiaotong University School of Medicine filed Critical Ninth Peoples Hospital Shanghai Jiaotong University School of Medicine
Priority to CN202010895178.2A priority Critical patent/CN112053600B/en
Publication of CN112053600A publication Critical patent/CN112053600A/en
Application granted granted Critical
Publication of CN112053600B publication Critical patent/CN112053600B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 9/00: Simulators for teaching or training purposes
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F: FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F 9/00: Methods or devices for treatment of the eyes; Devices for putting-in contact lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
    • A61F 9/007: Methods or devices for eye surgery

Abstract

The application provides a method, a device, equipment and a system for orbital endoscopic navigation surgery training, in which a normal orbital surgery video and a path orbital surgery video marked with the operator's eyeball movement data are acquired; the normal orbital surgery video and the path orbital surgery video are displayed in the same interface; and a visual path of a learner is acquired and displayed on the normal orbital surgery video for comparison with the operator's eyeball movement data on the path orbital surgery video. The application enables a beginner to fully understand the operator's surgical intention, supports sufficient, quantifiable training of the beginner, and improves the training efficiency of orbital endoscopic navigation surgery.

Description

Orbit endoscope navigation surgery training method, device, equipment and system
Technical Field
The application relates to the technical field of orbit navigation operation training, in particular to an orbit endoscope navigation operation training method, device, equipment and system.
Background
Orbital navigation surgery is one of the more complex procedures in the surgical field, and orbital endoscopic navigation technology can assist it safely and effectively. However, the orbital endoscopic navigation surgery videos currently used for learning and training lack effectively marked regions of interest, so a beginner who relies only on watching the surgery video cannot accurately understand the operator's intention; as a result, learning this type of surgery is costly and difficult.
Disclosure of Invention
In view of the above-mentioned shortcomings of the prior art, the technical problem to be solved by the present application is to provide an orbital endoscopic navigation surgery training method, device, apparatus and system for solving at least one problem in the prior art.
To achieve the above and other related objects, the present application provides an orbital endoscopic navigation surgery training method, comprising: acquiring a normal orbital surgery video and a path orbital surgery video marked with eyeball movement data of an operator; displaying the normal orbital surgery video and the path orbital surgery video in the same interface; and acquiring a visual path of the learner and displaying the visual path on the normal orbital surgery video for comparison with eyeball movement data of the operator on the path orbital surgery video.
In one embodiment of the present application, the normal orbital surgery video is an image of the surgical area in an orbital surgery captured by an endoscope; the path orbital surgery video is obtained through an endoscope, a display and an eye tracker, and the corresponding operator's real-time eyeball movement data are synchronously marked on the basis of the normal orbital surgery video.
In an embodiment of the present application, the method for generating the path orbital surgery video with the operator eyeball movement data marked thereon includes: the method comprises the steps of collecting an operation area image in an orbit operation through an endoscope, and playing the operation area image in real time through a display; synchronously acquiring eyeball movement information of an operator in front of the display in real time by using an eye tracker so as to obtain the visual field coordinate position and the watching duration of the operator on a screen of the display correspondingly through conversion; and synchronously marking real-time eyeball movement data of the corresponding operator on the normal orbit surgery video according to the visual field coordinate position and the watching duration.
In an embodiment of the present application, the method for generating the path orbital surgery video marked with the operator's eyeball movement data includes: in the eyeball movement data, the position gazed at by the operator is represented by a dot, and the gazing duration is correspondingly represented by the diameter of the dot; and the dots marked on each frame of the normal orbital surgery video are superposed and recorded to generate the eyeball movement data corresponding to the operator.
In one embodiment of the present application, the acquiring a visual path of a learner and displaying the visual path on the normal orbital surgery video includes: the learner manually draws circles at a plurality of locations in the normal orbital surgery video to characterize a visual path, or manually draws lines in the normal orbital surgery video to characterize a visual path; or, the learner's eyeball movement information is acquired through the eye tracker, and the visual path corresponding to the learner's real-time eyeball movement data is synchronously marked on the normal orbital surgery video.
In an embodiment of the present application, the comparison with the operator's eyeball movement data on the path orbital surgery video includes: hiding the eyeball movement data in the path orbital surgery video or setting a certain transparency for it; and, after the learner's visual path is acquired and displayed on the normal orbital surgery video, displaying the eyeball movement data in the path orbital surgery video or cancelling the transparency, so that the learner's visual path can be compared with the operator's eyeball movement data on the path orbital surgery video.
To achieve the above and other related objects, the present application provides an electronic device, comprising: the acquisition module is used for acquiring a normal orbital surgery video and a path orbital surgery video marked with eyeball movement data of an operator; a display module for displaying the normal orbital surgery video and the path orbital surgery video in the same interface; and the processing module is used for acquiring a visual path of a learner and displaying the visual path on the normal orbital surgery video for comparison with eyeball movement data of an operator on the path orbital surgery video.
To achieve the above and other related objects, there is provided a computer apparatus, comprising: a memory, and a processor; the memory is to store computer instructions; the processor executes computer instructions to implement the method as described above.
In one embodiment of the present application, an orbital endoscopic navigation surgery training system, the system comprising: the display is used for acquiring a normal orbital surgery video and a path orbital surgery video marked with eyeball movement data of an operator; displaying the normal orbital surgery video and the path orbital surgery video in the same interface; the computer device is used for acquiring a visual path of a learner and displaying the visual path on the normal orbital surgery video for comparison with eyeball movement data of a surgeon on the path orbital surgery video.
In an embodiment of the present application, the system further includes: and the eye tracker is used for acquiring the eyeball movement information of the learner and synchronously marking a visual path corresponding to the real-time eyeball movement data of the learner on the normal orbit surgery video.
In summary, the present application provides an orbital endoscopic navigation surgery training method, device, apparatus and system, in which a normal orbital surgery video and a path orbital surgery video marked with the operator's eyeball movement data are acquired; the normal orbital surgery video and the path orbital surgery video are displayed in the same interface; and a visual path of a learner is acquired and displayed on the normal orbital surgery video for comparison with the operator's eyeball movement data on the path orbital surgery video.
The application has the following beneficial effects:
it enables a beginner to fully understand the operator's surgical intention, supports sufficient, quantifiable training of the beginner, and improves the training efficiency of orbital endoscopic navigation surgery.
Drawings
Fig. 1 is a schematic flow chart illustrating an orbital endoscopic navigation surgery training method according to an embodiment of the present application.
Fig. 2 is a schematic view of a scene of the capture system for the path orbital surgery video according to an embodiment of the present application.
Fig. 3 is a schematic view of a scene of dots in the path orbital surgery video according to an embodiment of the present application.
Fig. 4 is a schematic view of a scene of eyeball movement data in the path orbital surgery video according to an embodiment of the present application.
Fig. 5 is a schematic diagram of an interface showing the normal orbital surgery video and the path orbital surgery video in the same interface according to an embodiment of the present application.
Figs. 6A-6C are schematic diagrams of an interface in which the learner's visual path is displayed on the normal orbital surgery video according to an embodiment of the present application.
Fig. 7 is a block diagram of an electronic device according to an embodiment of the present application.
Fig. 8 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Fig. 9 is a schematic structural diagram of an orbital endoscopic navigation surgery training system according to an embodiment of the present application.
Fig. 10 is a schematic view of a training system for orbital endoscopic navigation surgery according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application is provided by way of specific examples, and other advantages and effects of the present application will be readily apparent to those skilled in the art from the disclosure herein. The present application is capable of other and different embodiments and its several details are capable of modifications and/or changes in various respects, all without departing from the spirit of the present application. It is to be noted that the features in the following embodiments and examples may be combined with each other without conflict.
It should be noted that the drawings provided in the following embodiments are only schematic illustrations of the basic idea of the present application. The drawings show only the components related to the present application and are not drawn according to the number, shape and size of the components in an actual implementation; in practice the type, quantity and proportion of the components may vary, and the layout of the components may be more complex.
Throughout the specification, when a part is referred to as being "connected" to another part, this includes not only a case of being "directly connected" but also a case of being "indirectly connected" with another element interposed therebetween. In addition, when a certain part is referred to as "including" a certain component, unless otherwise stated, other components are not excluded, but it means that other components may be included.
The terms first, second, third, etc. are used herein to describe various elements, components, regions, layers and/or sections, but are not limited thereto. These terms are only used to distinguish one element, component, region, layer or section from another element, component, region, layer or section. Thus, a first element, component, region, layer or section discussed below could be termed a second element, component, region, layer or section without departing from the scope of the present application.
Also, as used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context indicates otherwise. It will be further understood that the terms "comprises", "comprising" and/or "including", when used in this specification, specify the presence of stated features, operations, elements, components, items, species and/or groups, but do not preclude the presence or addition of one or more other features, operations, elements, components, items, species and/or groups thereof. The terms "or" and "and/or" as used herein are to be construed as inclusive, meaning any one or any combination. Thus, "A, B or C" or "A, B and/or C" means "any of the following: A; B; C; A and B; A and C; B and C; A, B and C". An exception to this definition occurs only when a combination of elements, functions or operations is inherently mutually exclusive in some way.
As noted above, the orbital endoscopic navigation surgery videos currently used for learning and training lack effectively marked regions of interest, so a beginner who relies only on watching the surgery video cannot accurately understand the operator's intention; as a result, learning this type of surgery is costly and difficult. To solve these problems, the present application provides an orbital endoscopic navigation surgery training method, device, equipment and system that enable a beginner to fully understand the operator's surgical intention, provide sufficient quantifiable training to the beginner, and improve the training efficiency of orbital endoscopic navigation surgery.
Fig. 1 is a schematic flow chart of the orbital endoscopic navigation surgery training method according to an embodiment of the present application. As shown, the method comprises:
step S101: and acquiring a normal orbital surgery video and a path orbital surgery video marked with eyeball movement data of the operator.
Wherein the normal orbital surgery video is an image of a surgical area in an orbital surgery collected by an endoscope.
The path orbital surgery video is obtained through an endoscope, a display and an eye tracker, and the corresponding operator's real-time eyeball movement data are synchronously marked on the basis of the normal orbital surgery video.
Specifically, the method for generating the path orbit surgery video with the operator eyeball movement data labeled thereon includes:
A. the method comprises the steps of collecting an operation area image in an orbit operation through an endoscope, and playing the operation area image in real time through a display;
B. synchronously acquiring eyeball movement information of an operator in front of the display in real time by using an eye tracker so as to obtain the visual field coordinate position and the watching duration of the operator on a screen of the display correspondingly through conversion;
C. and synchronously marking real-time eyeball movement data of the corresponding operator on the normal orbit surgery video according to the visual field coordinate position and the watching duration.
Fig. 2 is a schematic view of a scene of the capture system for the path orbital surgery video according to an embodiment of the present application. As shown, the system comprises: endoscope 201, display 202, and eye tracker 203.
The endoscope 201 is used for collecting the operation area images in the orbital surgery and playing the operation area images in real time through the display 202. For example, a light source and a micro-camera are disposed inside the endoscope 201 so as to collect clear images of the operation region.
And the display 202 is used for playing or displaying, in real time, the surgical area images acquired by the endoscope 201. For example, the display 202 is a movable display commonly used beside the operating table and dedicated to displaying the image of the surgical field during the operation.
The eye tracker 203 is used for acquiring eye movement data of an operating surgeon facing the display 202 for observation in real time, converting the eye movement data to obtain a visual field coordinate position and a watching duration of the operating surgeon correspondingly falling on a screen of the display 202, and synchronously marking real-time eyeball movement data of the corresponding operator on the normal orbit surgery video according to the visual field coordinate position and the watching duration.
For example, the eye tracker 203 is disposed on the display 202 and electrically connected to it; it may be placed in an upper or lower region of the display 202 depending on the surgical posture of the doctor during the operation, such as standing or sitting.
Preferably, the display 202 and the eye tracker 203 may also be designed as a single unit, such as a commercially available display with a built-in eye tracker, so that high-quality eye movement data can be collected for various viewing angles of the doctor in front of the display 202 and various positions on its screen.
For example, in the present application, the eye tracker 203 can be connected to the display 202 via USB 3.0 and fixed in a position facing the operator. During the operation, while the doctor observes the image of the surgical area collected by the endoscope 201 on the display 202, the eye tracker 203 collects the doctor's eyeball movement data in real time, recording data such as the gaze direction, gazing duration and spatial coordinates of the pupil; these data are converted by eye-tracking technology into the corresponding position on the screen of the display 202, and the corresponding operator's real-time eyeball movement data are synchronously marked on the normal orbital surgery video according to the visual field coordinate position and the gazing duration.
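A minimal sketch of this conversion step is given below. The eye-tracker sample format (normalized gaze coordinates plus a per-sample time step), the screen resolution and the fixation-grouping radius are all illustrative assumptions; the actual SDK of the eye tracker 203 is not specified in this application.

```python
# Minimal sketch (assumption, not the patented implementation): map eye-tracker
# gaze samples onto display-screen coordinates and group them into fixations
# carrying a gazing duration.

SCREEN_W, SCREEN_H = 1920, 1080   # assumed resolution of display 202

def gaze_to_screen(sample):
    """Convert a normalized gaze sample (values in [0, 1]) to pixel coordinates."""
    return int(sample["norm_x"] * SCREEN_W), int(sample["norm_y"] * SCREEN_H)

def accumulate_fixations(samples, radius_px=40):
    """Group consecutive samples that stay within radius_px of the previous
    fixation into a single fixation, summing their dwell time in seconds."""
    fixations = []                                     # each: {"x", "y", "duration"}
    for s in samples:
        x, y = gaze_to_screen(s)
        if fixations and (x - fixations[-1]["x"]) ** 2 + (y - fixations[-1]["y"]) ** 2 <= radius_px ** 2:
            fixations[-1]["duration"] += s["dt"]       # still gazing at the same spot
        else:
            fixations.append({"x": x, "y": y, "duration": s["dt"]})
    return fixations
```

Each fixation produced by such a step carries the visual field coordinate position and gazing duration that the method uses to mark the video.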
In an embodiment of the present application, the method for generating the path orbital surgery video with the operator eyeball movement data marked thereon includes:
A. In the eyeball movement data, the positions gazed at by the operator are represented by dots, and the gazing durations are correspondingly represented by the diameters of the dots: the longer the fixation time, the larger the diameter of the dot, as shown in Fig. 3.
B. The dots marked on each frame of the normal orbital surgery video are superposed and recorded to generate the eyeball movement data corresponding to the operator. Referring to Fig. 4, the dots a, b, c and d marked on different frames of the image are recorded by superposition to generate the operator's eyeball movement data, or visual path. The diameter of dot d being larger than those of dots a, b and c means that the operator's gazing duration at the position of dot d is longer than at the positions of the other dots.
In this embodiment, the eye tracker in the capture system for the path orbital surgery video shown in Fig. 2 records the gaze positions of the operator during the surgery, and each gaze position is marked with a dot whose diameter grows with the time the operator watches that position. The dots marked on each frame are superposed and recorded to generate the operator's eyeball movement data, or visual path, which improves the understandability of the surgery video. From the operator's gaze positions and visual path, others can gain a deep understanding of the significance of each surgical action, forming a surgical video with a true 'first-person perspective'.
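A minimal sketch of this dot-overlay step follows, using OpenCV. The scaling of dot diameter with gazing duration and the per-frame fixation lists are illustrative assumptions rather than the exact rendering used in this application.

```python
import cv2

def draw_fixation_dots(frame, fixations, px_per_second=30, color=(0, 0, 255)):
    """Draw one filled dot per fixation; the diameter grows with gazing duration,
    as with dots a, b, c, d in Fig. 4 (the scaling factor is an assumption)."""
    for f in fixations:
        radius = max(3, int(f["duration"] * px_per_second / 2))
        cv2.circle(frame, (f["x"], f["y"]), radius, color, thickness=-1)
    return frame

def annotate_video(in_path, out_path, fixations_per_frame):
    """Superpose the dots accumulated so far onto every subsequent frame, so the
    operator's visual path builds up over the normal orbital surgery video."""
    cap = cv2.VideoCapture(in_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 25
    w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    out = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))
    history = []                                   # all fixations seen so far
    for fixations in fixations_per_frame:          # one list of fixations per frame
        ok, frame = cap.read()
        if not ok:
            break
        history.extend(fixations)
        out.write(draw_fixation_dots(frame, history))
    cap.release()
    out.release()
```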
Step S102: displaying the normal orbital surgery video and the path orbital surgery video in the same interface.
In this embodiment, for the normal orbital surgery video and the path orbital surgery video displayed in the same interface, any one or more of playing, pausing, dragging the progress bar, fast forwarding and slow playing can be performed synchronously.
Fig. 5 is a schematic interface diagram showing the normal orbital surgery video and the path orbital surgery video in the same interface in one embodiment of the present application. For example, the interface schematic may be implemented in one or more embodiments as surgical training software corresponding to the methods or systems described herein.
For example, in the single interface of the training software shown in Fig. 5, the left screen displays the normal orbital surgery video, i.e., the normal mode, and the right screen displays the path orbital surgery video marked with the operator's eyeball movement data, i.e., the eye movement mode. The two videos are played synchronously in the initial state, and basic functions such as playing, pausing, dragging, fast forwarding and slow playing of both videos can be controlled through the progress bar below them, so that a surgical learner can conveniently compare their own understanding of the surgery with the operator's surgical intention, improving learning efficiency.
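The sketch below shows one way such frame-locked, side-by-side playback could be implemented with OpenCV; the window layout, key bindings and pause handling are illustrative assumptions, not the interface of Fig. 5 itself.

```python
import cv2
import numpy as np

def play_side_by_side(normal_path, path_video_path):
    """Show the normal video (left, normal mode) and the annotated path video
    (right, eye movement mode) frame-locked in one window; space pauses, q quits."""
    cap_a, cap_b = cv2.VideoCapture(normal_path), cv2.VideoCapture(path_video_path)
    fps = cap_a.get(cv2.CAP_PROP_FPS) or 25
    paused = False
    while True:
        if not paused:
            ok_a, frame_a = cap_a.read()
            ok_b, frame_b = cap_b.read()
            if not (ok_a and ok_b):
                break                               # one of the videos ended
            frame_b = cv2.resize(frame_b, (frame_a.shape[1], frame_a.shape[0]))
            cv2.imshow("orbital surgery training", np.hstack([frame_a, frame_b]))
        key = cv2.waitKey(int(1000 / fps)) & 0xFF
        if key == ord(' '):
            paused = not paused                     # synchronous pause/resume
        elif key == ord('q'):
            break
    cap_a.release()
    cap_b.release()
    cv2.destroyAllWindows()
```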
Step S103: and acquiring a visual path of the learner and displaying the visual path on the normal orbital surgery video for comparison with eyeball movement data of the operator on the path orbital surgery video.
In one embodiment of the present application, the acquiring a visual path of a learner and displaying the visual path on the normal orbital surgery video includes:
A. the learner manually draws circles at a plurality of locations in the normal orbital surgery video to characterize a visual path, or manually draws lines in the normal orbital surgery video to characterize a visual path;
For example, on the basis of the interface shown in Fig. 5, when the normal orbital surgery video in the left-hand normal mode is paused, the surgical learner or beginner can use the drawing tool in the upper-left corner of the interface to circle what they understand to be the surgical focus area, as shown in Fig. 6A, or to mark the subsequent surgical path as they understand it, as shown in Fig. 6B. These markings can then be compared with the operator's actual operation area and the surgical path corresponding to the eyeball movement data displayed on the right side. A code sketch of this manual marking step is given after the alternatives below.
Alternatively,
B. and acquiring eyeball movement information of the learner through an eye tracker, and synchronously marking a visual path corresponding to real-time eyeball movement data of the learner on the normal orbit surgery video.
In this embodiment, the method or system may further use an eye tracker to obtain the learner's eyeball movement information while the learner watches the normal orbital surgery video, and the visual path corresponding to the learner's real-time eyeball movement data is synchronously marked on the normal orbital surgery video. For the specific steps of acquiring the eyeball movement information and generating the visual path through the eye tracker, reference may be made to the description of the eye tracker in Fig. 2, which is not repeated here.
For example, a surgical learner or beginner may use the surgical training software on computer hardware equipped with an eye tracker. On the basis of Fig. 5, the beginner can enable the eye-tracking function in the upper-left corner of the interface and select a surgical segment of interest to watch. The software then acquires the learner's eyeball movement information through the eye tracker, so that a visual path corresponding to the learner's real-time eyeball movement data is synchronously marked on the normal orbital surgery video and displayed on the left screen. The beginner can review the recording with their own eye movement data from a third-person perspective and compare it with the operator's surgical intention, as shown in Fig. 6C.
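As referenced above, the following is a minimal sketch of option A, the manual marking of a visual path on a paused frame. The mouse-click interaction, circle size and colours are illustrative assumptions; option B would instead reuse the eye-tracker fixation pipeline sketched earlier.

```python
import cv2

def collect_learner_path(frame, window="normal mode (paused)"):
    """Let the learner click on a paused frame of the normal orbital surgery
    video to mark circles and connect them into a visual path (cf. Figs. 6A/6B)."""
    clicks = []

    def on_mouse(event, x, y, flags, _param):
        if event == cv2.EVENT_LBUTTONDOWN:
            clicks.append((x, y))
            cv2.circle(frame, (x, y), 20, (0, 255, 0), 2)                # learner's circle
            if len(clicks) > 1:
                cv2.line(frame, clicks[-2], clicks[-1], (0, 255, 0), 1)  # connect into a path
            cv2.imshow(window, frame)

    cv2.imshow(window, frame)
    cv2.setMouseCallback(window, on_mouse)
    cv2.waitKey(0)                 # press any key when finished marking
    cv2.destroyWindow(window)
    return clicks                  # the learner's visual path, for later comparison
```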
In an embodiment of the present application, the comparing with the eyeball movement data of the surgeon on the path orbit surgery video includes:
A. hiding the eyeball movement data in the path orbital surgery video or setting a certain transparency for it;
B. after the learner's visual path has been acquired and displayed on the normal orbital surgery video, displaying the eyeball movement data in the path orbital surgery video or cancelling the transparency, so that the learner's visual path can be compared with the operator's eyeball movement data on the path orbital surgery video.
For example, based on the interface shown in Fig. 5, for the path orbital surgery video in the right-hand eye movement mode, the operator's eyeball movement data can be shown or hidden and its transparency can be adjusted to suit the learning habits of different surgical learners.
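A minimal sketch of this show/hide and transparency control is given below, assuming the annotated and unannotated versions of each frame are both available; the alpha-blending approach is an illustrative choice, not necessarily how the training software implements it.

```python
import cv2

def blend_overlay(raw_frame, annotated_frame, alpha):
    """Blend the operator's eye-movement overlay into the displayed frame.
    alpha = 0.0 hides the overlay entirely, 1.0 shows it fully, and values in
    between give the adjustable transparency described above."""
    return cv2.addWeighted(annotated_frame, alpha, raw_frame, 1.0 - alpha, 0)

# Example usage: hide the operator's data while the learner marks their own path,
# then reveal it for comparison.
# hidden  = blend_overlay(raw_frame, annotated_frame, alpha=0.0)
# faded   = blend_overlay(raw_frame, annotated_frame, alpha=0.4)
# visible = blend_overlay(raw_frame, annotated_frame, alpha=1.0)
```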
It should be noted that an advantage of the method and system described in the present application is that a learner's visual path can be added to the normal orbital surgery video: the learner can directly mark their understanding of the surgical intention in the normal orbital surgery video or on the current frame, for example by drawing a circle or a line to represent a visual path, and the eyeball movement data in the path orbital surgery video can be hidden, displayed or given a certain transparency. These operations help the beginner compare their own understanding with the operator's surgical intention, so that the beginner can fully understand that intention, receive sufficient quantifiable training, and the training efficiency of orbital endoscopic navigation surgery is improved.
Fig. 7 is a block diagram of an electronic device according to an embodiment of the present application. As shown, the apparatus 700 includes:
an obtaining module 701, configured to obtain a normal orbital surgery video and a path orbital surgery video marked with data of eyeball movement of an operator;
a display module 702 configured to display the normal orbital surgery video and the path orbital surgery video in the same interface;
the processing module 703 is configured to obtain a visual path of the learner and display the visual path on the normal orbital surgery video for comparison with the eyeball movement data of the operator on the path orbital surgery video.
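A minimal sketch of how these three modules could be organized in software follows; the class and method names are illustrative assumptions, not taken from this application, and each module would internally reuse the steps sketched above.

```python
class OrbitalTrainingDevice:
    """Illustrative skeleton of apparatus 700 with its three modules."""

    def __init__(self, acquisition, display, processing):
        self.acquisition = acquisition   # module 701: loads both surgery videos
        self.display = display           # module 702: same-interface synchronized playback
        self.processing = processing     # module 703: learner visual path and comparison

    def run_session(self, normal_path, path_video_path):
        normal, annotated = self.acquisition.load(normal_path, path_video_path)
        self.display.show_side_by_side(normal, annotated)
        learner_path = self.processing.capture_learner_path(normal)
        return self.processing.compare(learner_path, annotated)
```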
It should be noted that, because the contents of information interaction, execution process, and the like between the modules/units of the apparatus are based on the same concept as the method embodiment described in the present application, the technical effect brought by the contents is the same as the method embodiment of the present application, and specific contents may refer to the description in the foregoing method embodiment of the present application, and are not described herein again.
It should be further noted that the division of the above apparatus into modules is only a logical division; in an actual implementation the modules may be wholly or partially integrated into one physical entity or may be physically separate. These units may all be implemented as software invoked by a processing element, or all in hardware, or some modules may be implemented as software invoked by a processing element while others are implemented in hardware. For example, the processing module 703 may be a separately installed processing element, may be integrated into a chip of the system, or may be stored in a memory of the system in the form of program code that a processing element of the apparatus calls to execute the functions of the processing module 703. The other modules are implemented similarly. In addition, all or some of the modules may be integrated together or implemented independently. The processing element described herein may be an integrated circuit with signal processing capability. In implementation, each step of the above method, or each of the above modules, may be completed by an integrated logic circuit of hardware in a processor element or by instructions in the form of software.
For example, the above modules may be one or more integrated circuits configured to implement the above methods, such as one or more application-specific integrated circuits (ASICs), one or more digital signal processors (DSPs), or one or more field-programmable gate arrays (FPGAs). As another example, when one of the above modules is implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a central processing unit (CPU) or another processor capable of calling program code. As a further example, these modules may be integrated together and implemented in the form of a system-on-a-chip (SoC).
Fig. 8 is a schematic structural diagram of a computer device according to an embodiment of the present invention. As shown, the computer device 800 includes: a memory 801, and a processor 802; the memory 801 is used for storing computer instructions; the processor 802 executes computer instructions to implement the method described in fig. 1.
In some embodiments, the number of the memories 801 in the computer device 800 may be one or more, the number of the processors 802 may be one or more, and fig. 8 is taken as an example.
In an embodiment of the present application, the processor 802 in the computer device 800 loads one or more instructions corresponding to the processes of the application program into the memory 801 according to the steps described in fig. 1, and the processor 802 executes the application program stored in the memory 801, thereby implementing the method described in fig. 1.
The memory 801 may include a random access memory (RAM) or a non-volatile memory, such as at least one disk memory. The memory 801 stores an operating system and operating instructions, executable modules or data structures, or a subset or an extended set thereof, where the operating instructions may include various operating instructions for implementing various operations. The operating system may include various system programs for implementing various basic services and handling hardware-based tasks.
The processor 802 may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP) and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
In some specific applications, the various components of the computer device 800 are coupled together by a bus system that may include, in addition to a data bus, a power bus, a control bus, a status signal bus, and the like. For clarity, however, the various buses are collectively referred to as the bus system in Fig. 8.
Fig. 9 is a schematic diagram of the orbital endoscopic navigation surgery training system according to an embodiment of the present application. As shown, the system 900 includes:
a display 901, configured to obtain a normal orbital surgery video and a path orbital surgery video marked with data of eyeball movement of the operator; and displaying the normal orbital surgery video and the path orbital surgery video in the same interface.
The computer apparatus 902 of fig. 8 is used for obtaining a learner's visual path and displaying it on the normal orbital surgery video for comparison with the operator's eye movement data on the path orbital surgery video.
In an embodiment of the present application, the system 900 further includes: and the eye tracker 903 is used for acquiring eyeball movement information of the learner, and synchronously marking a visual path corresponding to real-time eyeball movement data of the learner on the normal orbit surgery video.
Preferably, the eye tracker 903 is disposed on the display 901 and electrically connected to the display, or the display 901 and the eye tracker 903 are integrally designed, for example, the eye tracker 903 may be disposed in an upper area or a lower area of the display 901 according to different surgical postures such as standing posture or sitting posture of a doctor during surgery, so as to collect high-quality eye movement data at various viewing angles of the doctor or at various positions on the display 901.
Preferably, the display 901 and the eye tracker 903 may be designed as a single unit, such as a commercially available display with a built-in eye tracker, so that high-quality eye movement data can be collected for various viewing angles of the doctor in front of the display 901 and various positions on its screen.
For example, a schematic view of a scene of the eyetracker-based intraorbital endoscopic navigation surgery training system described herein may be found with reference to fig. 10.
It should be noted that, for the information interaction, execution process, and other contents between the devices of the system, since the same concept is based on the method embodiment described in the present application, the technical effect brought by the method embodiment is the same as that of the method embodiment of the present application, and specific contents may refer to the description in the foregoing method embodiment of the present application, and are not described herein again.
In summary, the present application provides an orbital endoscopic navigation surgery training method, device, apparatus and system that enable a beginner to fully understand the operator's surgical intention, allow the beginner to be trained sufficiently and quantifiably, and improve the training efficiency of orbital endoscopic navigation surgery.
The application effectively overcomes various defects in the prior art and has high industrial utilization value.
The above embodiments are merely illustrative of the principles and utilities of the present application and are not intended to limit the application. Any person skilled in the art can modify or change the above-described embodiments without departing from the spirit and scope of the present application. Accordingly, it is intended that all equivalent modifications or changes which can be made by those skilled in the art without departing from the spirit and technical concepts disclosed in the present application shall be covered by the claims of the present application.

Claims (7)

1. An orbital endoscopic navigation surgery training method, the method comprising:
acquiring a normal orbital surgery video and a path orbital surgery video marked with eyeball movement data of an operator; wherein the normal orbital surgery video is an image of a surgical area in an orbital surgery collected by an endoscope; the path orbital surgery video is obtained through an endoscope, a display and an eye tracker, and real-time eyeball movement data of the corresponding operator are synchronously marked on the basis of the normal orbital surgery video; in the eyeball movement data, the gazing position of the operator is represented by a dot, and the gazing duration is correspondingly represented by the diameter of the dot; and the dots marked on each frame of the normal orbital surgery video are superposed and recorded to generate the eyeball movement data corresponding to the operator;
displaying the normal orbital surgery video and the path orbital surgery video in the same interface;
hiding the eyeball movement data in the path orbital surgery video or setting a certain transparency for it, acquiring a visual path of a learner through manual drawing or an eye tracker and displaying the visual path on the normal orbital surgery video, and displaying the eyeball movement data or cancelling the transparency for comparison with the eyeball movement data of the operator on the path orbital surgery video.
2. The method of claim 1, wherein the method for generating the orbital surgery video of the path marked with the eyeball movement data of the operator comprises:
the method comprises the steps of collecting an operation area image in an orbit operation through an endoscope, and playing the operation area image in real time through a display;
synchronously acquiring eyeball movement information of an operator in front of the display in real time by using an eye tracker so as to obtain the visual field coordinate position and the watching duration of the operator on a screen of the display correspondingly through conversion;
and synchronously marking real-time eyeball movement data of the corresponding operator on the normal orbit surgery video according to the visual field coordinate position and the watching duration.
3. The method of claim 1, wherein the capturing and displaying a learner's visual path on the normal orbital surgery video comprises:
a learner manually drawing circles at a plurality of locations in the normal orbital surgery video to characterize a visual path or manually drawing lines in the normal orbital surgery video to characterize a visual path;
alternatively,
and acquiring eyeball movement information of the learner through an eye tracker, and synchronously marking a visual path corresponding to real-time eyeball movement data of the learner on the normal orbit surgery video.
4. An electronic device, the device comprising:
the acquisition module is used for acquiring a normal orbital surgery video and a path orbital surgery video marked with eyeball movement data of an operator; wherein the normal orbital surgery video is an image of a surgical area in an orbital surgery collected by an endoscope; the path orbital surgery video is obtained through an endoscope, a display and an eye tracker, and real-time eyeball movement data of the corresponding operator are synchronously marked on the basis of the normal orbital surgery video; in the eyeball movement data, the gazing position of the operator is represented by a dot, and the gazing duration is correspondingly represented by the diameter of the dot; and the dots marked on each frame of the normal orbital surgery video are superposed and recorded to generate the eyeball movement data corresponding to the operator;
a display module for displaying the normal orbital surgery video and the path orbital surgery video in the same interface;
and the processing module is used for hiding the eyeball movement data in the path orbital surgery video or setting a certain transparency for it, acquiring a visual path of a learner through manual drawing or an eye tracker and displaying the visual path on the normal orbital surgery video, and displaying the eyeball movement data or cancelling the transparency for comparison with the eyeball movement data of the operator on the path orbital surgery video.
5. A computer device, the device comprising: a memory, and a processor; the memory is to store computer instructions; the processor executes computer instructions to implement the method of any one of claims 1 to 3.
6. An orbital endoscopic navigation surgery training system, the system comprising:
the display is used for acquiring a normal orbital surgery video and a path orbital surgery video marked with eyeball movement data of an operator; displaying the normal orbital surgery video and the path orbital surgery video in the same interface;
the computer apparatus of claim 5, wherein the computer apparatus is used for obtaining a learner's visual path and displaying the obtained visual path on the normal orbital surgery video for comparison with the operator's eye movement data on the path orbital surgery video.
7. The system of claim 6, further comprising: and the eye tracker is used for acquiring the eyeball movement information of the learner and synchronously marking a visual path corresponding to the real-time eyeball movement data of the learner on the normal orbit surgery video.
CN202010895178.2A 2020-08-31 2020-08-31 Orbit endoscope navigation surgery training method, device, equipment and system Active CN112053600B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010895178.2A CN112053600B (en) 2020-08-31 2020-08-31 Orbit endoscope navigation surgery training method, device, equipment and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010895178.2A CN112053600B (en) 2020-08-31 2020-08-31 Orbit endoscope navigation surgery training method, device, equipment and system

Publications (2)

Publication Number Publication Date
CN112053600A CN112053600A (en) 2020-12-08
CN112053600B true CN112053600B (en) 2022-05-03

Family

ID=73606975

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010895178.2A Active CN112053600B (en) 2020-08-31 2020-08-31 Orbit endoscope navigation surgery training method, device, equipment and system

Country Status (1)

Country Link
CN (1) CN112053600B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016148765A (en) * 2015-02-12 2016-08-18 国立大学法人大阪大学 Surgery training device
CN107277463A (en) * 2017-08-03 2017-10-20 苏州医视医疗科技有限公司 Video acquisition platform based on intelligent glasses
WO2020020022A1 (en) * 2018-07-25 2020-01-30 卢帆 Method for visual recognition and system thereof

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070072159A1 (en) * 2005-09-27 2007-03-29 Olson Jerald J Sports practice video training system and method
CN101947157B (en) * 2009-12-18 2012-02-15 中国科学院光电技术研究所 Eye self-adaptive optical visual perception learning and training instrument
CN204010351U (en) * 2014-05-26 2014-12-10 侯志伟 A kind of bore hole 3D anatomy experiment teaching demonstration system
CN106297472A (en) * 2016-10-25 2017-01-04 深圳市科创数字显示技术有限公司 The cornea intelligent operation training system that AR with VR combines
EP4170631A1 (en) * 2017-05-11 2023-04-26 Applied Medical Resources Corporation Camera navigation training system
US10553207B2 (en) * 2017-12-29 2020-02-04 Facebook, Inc. Systems and methods for employing predication in computational models
CN110866936B (en) * 2018-08-07 2023-05-23 创新先进技术有限公司 Video labeling method, tracking device, computer equipment and storage medium
CN109979600A (en) * 2019-04-23 2019-07-05 上海交通大学医学院附属第九人民医院 Orbital Surgery training method, system and storage medium based on virtual reality
CN111353429A (en) * 2020-02-28 2020-06-30 深圳壹账通智能科技有限公司 Interest degree method and system based on eyeball turning

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016148765A (en) * 2015-02-12 2016-08-18 国立大学法人大阪大学 Surgery training device
CN107277463A (en) * 2017-08-03 2017-10-20 苏州医视医疗科技有限公司 Video acquisition platform based on intelligent glasses
WO2020020022A1 (en) * 2018-07-25 2020-01-30 卢帆 Method for visual recognition and system thereof

Also Published As

Publication number Publication date
CN112053600A (en) 2020-12-08

Similar Documents

Publication Publication Date Title
WO2020042345A1 (en) Method and system for acquiring line-of-sight direction of human eyes by means of single camera
RU2740259C2 (en) Ultrasonic imaging sensor positioning
JP2018535568A (en) Camera system and method for registering images and presenting a series of aligned images
JP5689850B2 (en) Video analysis apparatus, video analysis method, and gaze point display system
RU2015102613A (en) OVERLAYING AND COMBINING PREOPERATIVE DATA ON VIDEO IN REAL TIME USING A PORTABLE DEVICE
Lin et al. A first-person mentee second-person mentor AR interface for surgical telementoring
US8902305B2 (en) System and method for managing face data
Marinoiu et al. Pictorial human spaces: How well do humans perceive a 3d articulated pose?
JP6397277B2 (en) Support device for interpretation report creation and control method thereof
Rebol et al. Mixed reality communication for medical procedures: teaching the placement of a central venous catheter
CN112053600B (en) Orbit endoscope navigation surgery training method, device, equipment and system
CN107256375A (en) Human body sitting posture monitoring method before a kind of computer
CN104679226B (en) Contactless medical control system, method and Medical Devices
Wieringa et al. Improved depth perception with three-dimensional auxiliary display and computer generated three-dimensional panoramic overviews in robot-assisted laparoscopy
US11715425B2 (en) Display method, display device, display system and storage medium
Mehrubeoglu et al. Capturing reading patterns through a real-time smart camera iris tracking system
Green et al. Microanalysis of video from a robotic surgical procedure: implications for observational learning in the robotic environment
WO2021192704A1 (en) Cognitive impairment diagnostic device and cognitive impairment diagnostic program
CN114882742A (en) Ear endoscope operation simulation teaching method, system, equipment and medium based on VR technology
Lee et al. Comparison of six display modes for a multi-resolution foveated laparoscope
Fuster-Guilló et al. 3D technologies to acquire and visualize the human body for improving dietetic treatment
JP3951592B2 (en) MEDICAL IMAGE DISPLAY METHOD, MEDICAL IMAGE DISPLAY DEVICE, AND MEDICAL DIAGNOSIS SYSTEM
Xu et al. Ravengaze: A dataset for gaze estimation leveraging psychological experiment through eye tracker
JP7049636B1 (en) Information processing equipment, programs, systems, and information processing methods
JP2022131735A (en) User interface device, control method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant