CN114041874B - Interface display control method and device, computer equipment and system and medium - Google Patents

Interface display control method and device, computer equipment and system and medium

Info

Publication number
CN114041874B
CN114041874B (Application CN202111316069.1A)
Authority
CN
China
Prior art keywords
interface
area
field
view
visual field
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111316069.1A
Other languages
Chinese (zh)
Other versions
CN114041874A (en)
Inventor
黄帆
王牌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Edge Medical Co Ltd
Original Assignee
Shenzhen Edge Medical Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Edge Medical Co Ltd filed Critical Shenzhen Edge Medical Co Ltd
Priority to CN202111316069.1A priority Critical patent/CN114041874B/en
Priority to CN202310888179.8A priority patent/CN116849803A/en
Publication of CN114041874A publication Critical patent/CN114041874A/en
Application granted granted Critical
Publication of CN114041874B publication Critical patent/CN114041874B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/25: User interfaces for surgical systems
    • A61B 1/00009: Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A61B 1/00045: Display arrangement
    • A61B 1/00147: Holding or positioning arrangements
    • A61B 1/00174: Optical arrangements characterised by the viewing angles
    • A61B 1/04: Instruments for performing medical examinations of the interior of cavities or tubes of the body, combined with photographic or television appliances
    • A61B 34/20: Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/30: Surgical robots
    • A61B 34/70: Manipulators specially adapted for use in surgery
    • A61B 90/361: Image-producing devices, e.g. surgical cameras
    • A61B 90/37: Surgical systems with images on a monitor during operation
    • A61B 2034/2065: Tracking using image or pattern recognition
    • A61B 2034/302: Surgical robots specifically adapted for manipulations within body cavities, e.g. within abdominal or thoracic cavities

Abstract

Embodiments of the present application provide an interface display control method and apparatus, a computer device and system, and a computer storage medium. The method includes: determining the current display area of a target instrument in a surgical environment field-of-view interface; determining a free field-of-view area in the interface according to that display area; and acquiring the current position of an interface icon and, if the icon is determined from that position to occlude the field of view, adjusting the icon into the free field-of-view area. The position of interface icons is thus optimized in real time, so that the interface can present a richer set of icons, representing the positions and execution states of more instrument types, while ensuring that the icons do not block the doctor's view as the operation progresses, allowing the doctor to perform the surgery better.

Description

Interface display control method and device, computer equipment and system and medium
Technical Field
The present application relates to the field of surgical robots, and in particular, to an interface display control method and apparatus, a computer device and system, and a computer readable storage medium.
Background
Minimally invasive surgery is performed inside the human body using modern medical instruments such as laparoscopes, thoracoscopes and related devices. Compared with traditional open surgery, it offers smaller wounds, less pain and faster recovery. As the technology has matured, minimally invasive surgical robots have come into wide use. Such a robot generally comprises a master console and slave operating devices: the master console translates the doctor's operations into control commands for the slave operating devices, and the slave operating devices respond to those commands by performing the corresponding surgical actions. The doctor observes the surgical environment field of view in real time through a display interface on the master console, and various icons are usually shown in that interface to represent the positions, states and other information of the corresponding instruments, helping the doctor obtain the information needed during the operation.
However, as the surgery progresses, the part of the display interface on which the doctor is focusing also changes dynamically, so an icon may come to block that focus and interfere with the doctor's work.
Disclosure of Invention
To solve this technical problem, the present application provides an interface display control method and apparatus, a computer device and system, and a computer readable storage medium, which can dynamically optimize the positions of interface icons.
In order to achieve the above object, the technical solution of the embodiment of the present application is as follows:
In a first aspect, an embodiment of the present application provides an interface display control method, applied to a computer device, including:
determining a current display area of a target instrument in a surgical environment field-of-view interface;
determining a free field-of-view area in the surgical environment field-of-view interface according to the current display area of the target instrument;
and acquiring the current position of an interface icon and, if it is determined from that position that the icon occludes the field of view, adjusting the icon into the free field-of-view area.
In a second aspect, an embodiment of the present application provides an interface display control apparatus, including:
a determining module, configured to determine the current display area of a target instrument in the surgical environment field-of-view interface;
a field-of-view dividing module, configured to determine a free field-of-view area in the surgical environment field-of-view interface according to the current display area of the target instrument;
and an adjustment module, configured to acquire the current position of an interface icon and, if the icon is determined from that position to occlude the field of view, adjust the icon into the free field-of-view area.
In a third aspect, an embodiment of the present application provides a computer device, including a processor, a memory connected to the processor, and a computer program stored in the memory and executable by the processor; when executed by the processor, the computer program implements the interface display control method of any embodiment of the present application.
In a fourth aspect, an embodiment of the present application provides a teleoperational medical system, including a computer device according to any embodiment of the present application and a slave operating device connected to the computer device, where the slave operating device includes a plurality of instruments of different types and a driving assembly for driving the instruments to perform specified actions.
In a fifth aspect, an embodiment of the present application further provides a computer readable storage medium storing a computer program that, when executed by a processor, implements the interface display control method of any embodiment of the present application.
According to the interface display control method and apparatus provided by the above embodiments, the current display area of the target instrument in the surgical environment field-of-view interface is determined; a free field-of-view area in the interface is determined from that display area; and the current position of the interface icon is acquired and, if the icon is found from that position to occlude the field of view, the icon is adjusted into the free area. In other words, the free field-of-view area is distinguished according to the part the doctor is currently focusing on, and whenever an icon is detected to occlude the view it is relocated there, optimizing icon positions in real time. The interface can therefore present a richer set of icons, representing the positions and execution states of more instrument types, while ensuring that the icons do not block the doctor's view as the operation progresses, allowing the doctor to perform the surgery better.
The computer device, system and computer readable storage medium embodiments above share the same concept as the corresponding interface display control method embodiments and therefore achieve the same technical effects; they are not described again here.
Drawings
FIG. 1 is a schematic diagram of an optional application scenario of an interface display control method according to an embodiment;
FIG. 2 is a flow chart of an interface display control method according to an embodiment;
FIG. 3 is a schematic diagram of an interface icon in an embodiment;
FIG. 4 is a schematic view of a surgical environment field-of-view interface in one embodiment;
FIG. 5 is a schematic illustration of a surgical environment field interface determining a working field of view area and a free field of view area based on a display area of a target instrument in one embodiment;
FIG. 6 is a flow chart of an interface display control method in an alternative embodiment;
FIG. 7 is a schematic diagram of an interface display control device according to an embodiment;
FIG. 8 is a schematic structural diagram of a computer device in an embodiment.
Detailed Description
The technical solution of the present application is elaborated below with reference to the accompanying drawings and specific embodiments.
To make the objects, technical solutions and advantages of the present application clearer, the described embodiments should not be construed as limiting the application; all other embodiments obtained by those skilled in the art without inventive effort fall within its scope of protection.
In the following description, the expression "some embodiments" describes a subset of all possible embodiments; note that "some embodiments" may refer to the same subset or to different subsets of all possible embodiments, and that these may be combined with one another when not in conflict.
In the following description, the terms "first", "second", "third" and the like are used merely to distinguish similar objects and do not imply a particular ordering; where permitted, the specific order or sequence may be interchanged, so that the embodiments of the application described herein can be practiced in orders other than those illustrated or described.
Referring to FIG. 1, a schematic diagram of an optional application scenario of the interface display control method provided in an embodiment of the present application: the method is applied to a teleoperational medical system that includes a master operation device 10 and a slave operation device 12 connected to the master operation device 10. The slave operation device comprises a plurality of instruments 121 of different types and a driving assembly 122 for driving the instruments 121 to perform specified actions. Typically the instruments 121 include an endoscope and surgical tools used in performing procedures, such as electrocautery tools, forceps, staplers, shears and ultrasound probes. The endoscope acquires images of the surgical environment field of view and transmits them to the master operation device 10 for display. The master operation device 10 serves as the doctor-side master console and generally includes a client through which the doctor S remotely observes the current progress of the operation and remotely manages the slave operation device; the client may be an application client (such as a mobile-phone APP) or a web client, which is not limited here. The slave operation device 12 receives the operation instructions issued by the doctor through the client and performs the corresponding actions. The driving assembly 122 may include an articulating component (e.g., an articulation assembly) coupled to the instrument 121, so that the position and orientation of the instrument 121 can be manipulated with one or more mechanical degrees of freedom relative to the instrument shaft.
Optionally, an instrument 121 such as a pair of jaws may also have additional functional mechanical degrees of freedom that change its form, for example opening and closing.
Referring to FIG. 2, an embodiment of the present application provides an interface display control method applied to a computer device, where the computer device may comprise one or more physically independent intelligent devices with computing and processing capability. In an alternative specific example, the computer device includes the master operation device shown in FIG. 1, and the interface display control method includes the following steps:
s101, determining the current display area of the target instrument in the surgical environment visual field interface.
The target instrument comprises one or more instruments determined in advance (for example, predetermined instrument 1 and instrument 2) or determined according to a predetermined rule, for example one or more activated instruments identified by whether an instrument is in use. An activated instrument is an instrument currently being used in the operation, and may be one or more of the instruments included in the slave operation device. The surgical environment field of view contains the visual scene information through which the doctor observes the current state of the operation in real time, such as the position of an activated instrument in the body, the action the activated instrument is currently executing, and the characteristics of the surgical site.
Optionally, taking the computer device that executes the interface display control method as the master operation device: the master operation device collects images of the surgical environment field of view through the endoscope, and the surgical environment field-of-view interface may be a display page in the client of the teleoperational medical system that shows the images collected in real time by the endoscope imaging system. Determining the current display area of the activated instrument in the interface comprises determining, from the position at which the activated instrument is imaged in the interface, the imaging region of a designated part of the instrument as its current display area. The imaging region of the designated part may be the region containing the instrument's end effector, or a region obtained by expanding outward from the end effector according to a predetermined strategy.
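The expansion of the end-effector region "according to a predetermined strategy" could be realized, for example, as a fixed margin around the detected box, clamped to the interface bounds. The following sketch is purely illustrative; the function name, the margin parameter and the rectangle convention `(x, y, w, h)` are assumptions, not the patent's actual implementation.

```python
# Hypothetical sketch of step S101: expand the detected end-effector
# bounding box by a fixed margin to obtain the instrument's current
# display area, clamped to the interface bounds.

def expand_display_area(box, margin, view_w, view_h):
    """box is (x, y, w, h) in interface pixels; returns the expanded area."""
    x, y, w, h = box
    x0 = max(0, x - margin)
    y0 = max(0, y - margin)
    x1 = min(view_w, x + w + margin)
    y1 = min(view_h, y + h + margin)
    return (x0, y0, x1 - x0, y1 - y0)
```

Clamping matters near the image border: an end effector detected close to the edge should not produce a display area that extends outside the interface.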
S103, determining a free field-of-view area in the surgical environment field-of-view interface according to the current display area of the target instrument.
The free field-of-view area is the part of the surgical environment field-of-view image displayed in the interface to which the doctor is currently paying relatively little attention. The master operation device displays the field-of-view image acquired by the endoscope system in the interface of the client so that the doctor can clearly observe the relevant information in the current surgical environment. As the operation proceeds and changes dynamically, the doctor's attention to the image information in different regions of the interface may shift; for example, the doctor may gradually move from initially focusing on the central part of the interface to focusing on a part further to the right, in which case the free field-of-view area shifts from the periphery surrounding the centre towards the left side of the interface.
In an optional embodiment, the determining the free field-of-view area in the surgical environment field-of-view interface according to the current display area of the target instrument includes:
determining a working field-of-view area in the surgical environment field-of-view interface according to the current display area of the target instrument;
and determining the free field-of-view area in the surgical environment field-of-view interface based on the working field-of-view area.
The working field-of-view area is determined based on a prediction of the part the doctor is focusing on while performing the surgical operation. As the doctor's attention shifts between regions of the interface during the operation, the position of the working field-of-view area in the interface displayed by the client of the master operation device changes accordingly. The free field-of-view area comprises the portions of the interface other than the working field-of-view area.
That is, the computer device predicts, from the current display area of the target instrument, the part the doctor is focusing on during the operation, and thereby determines the working field-of-view area and the free field-of-view area in the surgical environment field-of-view interface.
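One simple way to model this partition, assuming rectangular regions, is to take the working field-of-view area to be the instrument's display area and return the remainder of the interface as up to four surrounding bands. This is an illustrative assumption for exposition; the patent does not prescribe a particular region shape.

```python
# Hypothetical sketch of step S103: split the interface into a working
# field-of-view area (the instrument's display area) and free bands
# around it. Rectangles are (x, y, w, h) in interface pixels.

def partition_view(view_w, view_h, work):
    """work is (x, y, w, h); returns (working_area, free_bands)."""
    x, y, w, h = work
    bands = [
        (0, 0, view_w, y),                     # band above the working area
        (0, y + h, view_w, view_h - (y + h)),  # band below
        (0, y, x, h),                          # band to the left
        (x + w, y, view_w - (x + w), h),       # band to the right
    ]
    free = [b for b in bands if b[2] > 0 and b[3] > 0]  # drop empty bands
    return work, free
```

When the working area touches an edge of the interface, the corresponding band degenerates to zero width or height and is dropped, so the free area always consists only of genuinely visible regions.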
S105, acquiring the current position of the interface icon and, if it is determined from that position that the icon occludes the field of view, adjusting the icon into the free field-of-view area.
Interface icons are virtual icons in the surgical environment field-of-view interface used for human-machine interaction, logical operation, interface aesthetics, and indicating the status of the slave operation device and/or its instruments. For example, an endoscope icon represents state information such as the rotation angle and the mirror (viewing) angle of the endoscope. Referring to FIG. 3, which shows an endoscope icon in an alternative specific example: when the icon's view points upward, it indicates that the mirror angle is upward, as in FIG. 3a and FIG. 3b; when the view points downward, it indicates that the mirror angle is downward, as in FIG. 3c and FIG. 3d. FIG. 3a and FIG. 3c show a rotation of 0 degrees, while FIG. 3b and FIG. 3d show a non-zero rotation (the rotation does not exceed 90 degrees). Optionally, the mirror angle may be obtained from a parameter set by the user, and the rotation angle is calculated from the position of the rotation motor that drives the endoscope. Through the state of the endoscope icon, the doctor can grasp the state of the endoscope in the surgical environment in real time. The computer device judges, from the current position of an interface icon, whether the icon occludes the doctor's field of view, and if so, adjusts the icon into the free field-of-view area of the surgical environment field-of-view interface.
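The icon's two state components described above could be derived as follows: the mirror angle from a user-set parameter, and the rotation angle from the rotation motor's encoder position. This is a hedged illustration only; the encoder counts-per-degree scale is an invented example value, and the 90-degree clamp mirrors the text's statement that rotation does not exceed 90 degrees.

```python
# Illustrative derivation of the endoscope icon's state. The encoder
# resolution below is an assumed example value, not from the patent.

COUNTS_PER_DEGREE = 100  # assumed encoder counts per degree of rotation

def endoscope_icon_state(mirror_angle_up: bool, motor_counts: int) -> dict:
    """mirror angle from a user parameter; rotation from the motor encoder,
    clamped to the stated +/-90 degree range."""
    rotation = max(-90.0, min(90.0, motor_counts / COUNTS_PER_DEGREE))
    return {
        "mirror": "up" if mirror_angle_up else "down",
        "rotation_deg": rotation,
    }
```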
In the above embodiment, the free field-of-view area is distinguished within the surgical environment field-of-view interface according to the part the doctor is currently focusing on, and an icon found to occlude the field of view is moved into that free area. The position of interface icons is thus optimized in real time: the interface can present a richer set of icons, representing the positions and execution states of more instrument types, while ensuring that the icons do not block the doctor's view as the operation progresses, so that the doctor can perform the surgery better.
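The occlusion test and the adjustment of step S105 can be sketched under the simplifying assumption that both the icon and the working field-of-view area are axis-aligned rectangles `(x, y, w, h)`; the placement policy (largest free region, top-left corner) is likewise an illustrative choice, not the claimed algorithm.

```python
# Hypothetical sketch of step S105: if the icon overlaps the working
# field-of-view area, relocate it into the largest free area.

def overlaps(a, b):
    """Axis-aligned rectangle intersection test for (x, y, w, h) tuples."""
    return (a[0] < b[0] + b[2] and b[0] < a[0] + a[2] and
            a[1] < b[1] + b[3] and b[1] < a[1] + a[3])

def place_icon(icon, working_area, free_areas):
    """Move the icon into the largest free area if it occludes the
    working area; otherwise leave it where it is."""
    if not overlaps(icon, working_area):
        return icon
    best = max(free_areas, key=lambda r: r[2] * r[3])
    return (best[0], best[1], icon[2], icon[3])
```

A practical implementation would likely also avoid overlapping other icons already placed in the free area, but that refinement is outside the scope of this sketch.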
In some embodiments, the determining the free field-of-view area in the surgical environment field-of-view interface from the current display area of the target instrument comprises:
acquiring a surgical environment field-of-view image, performing target detection on the image, and determining the position and size of the target instrument contained in it;
and determining the current display area of the target instrument from that position and size, and determining the working and free field-of-view areas in the surgical environment field-of-view interface based on the current display area.
The computer device may determine the position and size of the target instrument in the surgical environment field-of-view image by performing target detection on it, using any known target detection algorithm for detecting whether an image contains a specified object. Optionally, with the master operation device as the computer device executing the interface display control method, acquiring the field-of-view image may comprise the endoscope collecting video data of the surgical environment field of view and sending it to the master operation device, which extracts one or more image frames from the video data.
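The frame extraction step can be reduced to its essentials without committing to any video library: sample every n-th frame from whatever frame source the endoscope pipeline provides. The sampling interval is an assumed parameter; the patent does not specify how frames are selected.

```python
# Library-free sketch of extracting image frames from the endoscope's
# video stream: yield every n-th frame from any iterable of frames.

def sample_frames(frames, every_n: int):
    """Yield every n-th frame (0-based) from a frame iterable."""
    for i, frame in enumerate(frames):
        if i % every_n == 0:
            yield frame
```

In a real pipeline `frames` would be decoded video frames; sampling keeps the detector's workload bounded while still tracking the instrument's motion.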
The target instrument may comprise an activated instrument, and/or a designated non-activated instrument.
In the above embodiment, the computer device acquires the surgical environment field-of-view image and performs target detection on it to determine the position and size of the target instrument, thereby determining the instrument's current display area; from that area it predicts the part the doctor is focusing on during the operation, and so determines the working and free field-of-view areas in the surgical environment field-of-view interface.
Optionally, the acquiring a surgical environment field-of-view image, performing target detection on it, and determining the position and size of the target instrument contained in it comprises:
performing target detection on the surgical environment field-of-view image through a neural network model to determine the position and size of the target instrument contained in it.
The neural network model used for target detection on the surgical environment field-of-view image may be obtained by training a known neural network architecture, such as Fast R-CNN (a region-based convolutional neural network) or the Fast YOLO (You Only Look Once) variant. With such a model, detection of the position and size of the target instrument contained in the image is performed end to end, so that target detection and recognition can be carried out quickly while maintaining high accuracy.
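YOLO-family detectors commonly report a box as a normalized centre `(cx, cy)` plus width and height, while the interface works in pixel corners. A small conversion helper illustrates the post-processing step; actual model output formats vary between implementations, so this is an assumption for exposition.

```python
# Convert a normalized centre-format detection box to (x, y, w, h)
# in interface pixels.

def yolo_box_to_pixels(cx, cy, w, h, img_w, img_h):
    """cx, cy, w, h are in [0, 1] relative to the image size."""
    pw, ph = w * img_w, h * img_h
    px = cx * img_w - pw / 2
    py = cy * img_h - ph / 2
    return (round(px), round(py), round(pw), round(ph))
```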
In some embodiments, before the surgical environment field-of-view image is acquired and the position and size of the target instrument are determined through the neural network model, the method comprises:
constructing an initial neural network model;
and training the initial neural network model on a training sample set of surgical environment field-of-view images carrying target object annotations, to obtain the trained neural network model; a target object annotation comprises the position and size of a target instrument contained in the image.
The neural network model includes a model that can be obtained through deep learning and used to extract key features indicating whether a target object is present in an image. Deep learning (DL) is a research direction in the field of machine learning (ML) that brings machine learning closer to its original goal, artificial intelligence (AI). Deep learning learns the inherent regularities and representation hierarchies of sample data, and the information obtained in this process helps interpret data such as text, images, and sound. Its ultimate goal is to give machines human-like analytical and learning capabilities for recognizing text, image, and sound data. By imitating human activities such as audio-visual perception and thinking, deep learning has achieved results in search, data mining, machine learning, machine translation, natural language processing, speech recognition, recommendation and personalization, and other related fields, and has solved many complex pattern-recognition problems, greatly advancing artificial intelligence. The initial neural network model can be a known convolutional neural network, including a convolutional layer for extracting image features, a pooling layer for dimensionality reduction and removal of redundant information from the extracted features, and an output layer for classifying, recognizing, and outputting the target object based on the features output by the pooling layer.
The training sample set may include positive sample images and negative sample images, where the positive sample images are surgical environment visual field images containing a target object annotation, and the negative sample images are surgical environment visual field images containing no target object, images with erroneous target object annotations, or other images.
Based on the training sample set, the neural network model may be trained as follows. First, the sample images are labeled by category, using label information that uniquely characterizes the category identity, position, and size of the instrument. For example, an image containing designated instrument 1 is labeled category 1, with the position and size of the instrument in the image annotated; an image containing designated instrument 2 is labeled category 2, likewise annotated; and an image containing no designated instrument is labeled category 0. This yields sample images containing target object annotations. Each annotated sample image is then input into the neural network model, which predicts the category of the target object carried by the image. The predicted category is compared with the standard target category, the value of the model's loss function is determined from the difference between them, and the loss is back-propagated through each layer of the model, updating the parameters of each layer by stochastic gradient descent (SGD) until the loss function converges, thereby completing the training. Optionally, the initial neural network model may further include a regression layer, and a back-propagation neural network may be used; through training on sample data, the network weights and thresholds are continually adjusted so that the error function decreases along the negative gradient direction, approaching the desired output.
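The training procedure above — forward pass, loss from the difference between the predicted and standard category, back-propagated gradient, SGD update until convergence — can be sketched with a toy logistic unit standing in for the convolutional network (all names and the toy model are illustrative assumptions, not the patent's network):

```python
import math

def train_sgd(samples, labels, lr=0.1, epochs=200):
    """Toy stand-in for the described training loop: predict, compute the
    loss gradient, and update parameters by SGD, repeated until the loss
    settles. A single logistic unit replaces the convolutional network."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # predicted category probability
            g = p - y                         # d(log-loss)/dz
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]  # SGD update
            b -= lr * g
    return w, b

def predict(w, b, x):
    """Probability that sample x carries the target object."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))
```

A real implementation would use a deep-learning framework's optimizer in place of the hand-written update, but the gradient-descent mechanics are the same.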
The trained neural network model performs feature extraction on the surgical environment visual field image acquired by the main operation equipment, forms feature vectors from the extracted image features, and performs classification prediction to determine the corresponding class labels, thereby outputting the type of the target instrument contained in the image along with its position and size information. In one optional specific example, the target instrument position and size information includes the coordinates and size of a target frame containing the active instrument; with the active instrument as the target instrument, the corresponding display area is determined from its position and size, and the operation visual field area and the idle visual field area are then distinguished so that the interface icon does not occlude the active instrument. In another optional specific example, the position and size information may include the coordinates and size of a target frame containing a designated inactive instrument; with the designated inactive instrument as the target instrument, the corresponding display area is determined from its position and size, and the operation visual field area and the idle visual field area are distinguished so that the interface icon does not occlude the designated inactive instrument. The target instrument may also include both an active instrument and a designated inactive instrument, thereby ensuring that the interface icon does not occlude any instrument.
In this embodiment, the neural network model is obtained by training. A training sample set can be constructed and continually enriched with surgical environment visual field images collected in practical use, and as the sample data grows, the model can be updated and upgraded through its own learning and iteration, so that its recognition of the surgical environment visual field image keeps improving.
The method for determining the current display area of the target instrument in the surgical environment visual field interface is not limited to image recognition, for example, in another optional embodiment, the determining the free visual field area in the surgical environment visual field interface according to the current display area of the target instrument includes:
determining the relative distance between a target part on the target instrument and the mirror surface of the endoscope;
determining the position and the size of the target instrument contained in the view field image according to the relative distance and the conversion relation between an image coordinate system corresponding to the view field of the surgical environment of the endoscope and a world coordinate system;
and determining a current display area of the target instrument according to the position and the size of the target instrument, and determining a working visual field area and an idle visual field area in the visual field interface of the surgical environment based on the current display area.
The target site of the target instrument may be any predetermined reference site that characterizes the location of the target instrument, such as the tip of the target instrument. Different types of target instruments may have different sites preset as their corresponding target sites. The world coordinate system is a reference coordinate system established from the surgical environment scene for measuring the relative positional relationships among entities; for example, it can be established from the plane of the endoscope mirror surface, so that the relative distance between the target site on the target instrument and the endoscope mirror surface can be determined in it. The image coordinate system is a coordinate system established from the relationship between an entity and its imaging point when the image acquisition device captures the surgical environment visual field image, such as a three-dimensional coordinate system with the focusing center of the image acquisition device as the origin and its optical axis as the longitudinal axis. The conversion between the image coordinate system and the world coordinate system can generally be represented by a rotation matrix and a translation matrix.
The main operation device acquires the relative distance between the target site on the target instrument and the endoscope mirror surface, and, using the projection imaging relationship at the time the surgical environment visual field image is acquired together with the conversion relationship between the image coordinate system corresponding to the endoscope's surgical environment field of view and the world coordinate system, determines the position and size of the target instrument contained in the visual field image. The current display area of the target instrument in the surgical environment visual field interface is then determined from its position and size in the image.
In the above embodiment, the current display area of the target instrument in the surgical environment visual field interface is determined from the distance between the target site of the target instrument and the endoscope mirror surface, using the optical projection imaging relationship, which provides a further alternative way of determining the operation visual field area and the idle visual field area. During execution of the interface display control method, the current display area may be determined both by performing target detection on the surgical environment visual field image and by calculating the instrument's position and size from the optical projection imaging relationship, and the two results may be combined to correct each other, so that the operation visual field area and the idle visual field area are determined more accurately.
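A minimal sketch of the projection step: assuming a pinhole model with rotation matrix `R`, translation vector `t`, and focal length `f` (these simplified intrinsics are assumptions, not the patent's calibration), a world-frame point on the instrument maps to the image plane as follows:

```python
def project_to_image(point_world, R, t, f):
    """Map a 3-D point from the world coordinate system into the image
    plane: p_camera = R @ p_world + t, followed by perspective division.
    R is a 3x3 rotation matrix, t a length-3 translation, f the focal
    length — a simplified pinhole model, assumed for illustration."""
    pc = [sum(R[i][j] * point_world[j] for j in range(3)) + t[i]
          for i in range(3)]
    x, y, z = pc
    return (f * x / z, f * y / z)  # image-plane coordinates
```

Projecting two points (e.g. the instrument tip and the far end of its target frame) in this way yields the on-screen position and size used to delimit the display area.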
Wherein the relative distance between the target site on the target instrument and the endoscope mirror surface can be determined by different technical means. In some embodiments, the determining the relative distance of the target site on the target instrument to the endoscope mirror surface comprises:
obtaining structural parameters of an endoscope;
Determining the type of the target instrument, and determining instrument structure parameters of the target instrument according to the type of the target instrument;
obtaining driving parameters corresponding to the target instrument;
and determining the relative distance between the end part of the target instrument and the endoscope mirror surface according to the endoscope structure parameter, the instrument structure parameter and the driving parameter.
The endoscope structural parameters may include the shape, size, and the like of the endoscope's component structures, used to determine parameters related to the position of the endoscope mirror surface. The instrument structural parameters of the target instrument may include parameters related to the shape and size of the instrument's target site, such as the shape and size of the end of the active instrument. The driving parameters corresponding to the target instrument can be obtained from the driving assembly that drives the target instrument to perform the corresponding actions, such as the number of rotations of the shaft of a driving motor; from these parameters the posture of the target instrument, such as the rotation angle and extension distance of the active instrument, can be determined accordingly. During execution of the interface display control method, the relative distance between the end of the target instrument and the endoscope mirror surface can be calculated from the endoscope structural parameters, the instrument structural parameters of the target instrument, and the corresponding driving parameters.
In the above embodiment, the real-time posture of the target instrument is determined from the driving parameters currently driving it, and combined with the instrument structural parameters and the endoscope structural parameters to calculate the distance between the target site of the target instrument and the endoscope mirror surface; the current display area of the target instrument in the surgical environment visual field interface is then determined using the optical projection imaging relationship, providing further alternative ways of determining the operation visual field area and the idle visual field area.
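A hedged sketch of this distance computation: assuming, purely for illustration, a lead-screw drive whose tip advance equals motor revolutions × screw lead, the relative distance could be recovered as below. Every parameter name here is an assumption, not taken from the patent:

```python
def tip_to_mirror_distance(scope_insert_depth, instr_insert_base,
                           motor_revs, lead_mm):
    """Illustrative kinematic sketch (assumed lead-screw drive):
    - scope_insert_depth: mirror-plane depth from the endoscope
      structural parameters (mm)
    - instr_insert_base:  instrument tip depth at zero drive (mm),
      from the instrument structural parameters
    - motor_revs, lead_mm: driving parameters — shaft revolutions and
      advance per revolution
    Returns how far the tip sits ahead of the mirror plane."""
    instr_tip = instr_insert_base + motor_revs * lead_mm
    return instr_tip - scope_insert_depth
```

A real system would use the full kinematic chain of the driving assembly; the point is only that structure plus drive parameters determine the tip pose without any image processing.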
In some embodiments, the obtaining the current position of the interface icon, if it is determined that the interface icon has a view shielding according to the position, adjusting the interface icon to be in the free view area includes:
acquiring the current position of an interface icon;
determining whether the interface icon and the operation visual field area overlap or not according to the position of the interface icon, and/or determining whether the idle visual field ratio of the area where the interface icon is located is smaller than a set threshold value or not;
if yes, the interface icon is adjusted to be in the free visual field area.
The condition for judging whether the interface icon occludes the field of view may be that the interface icon at least partially overlaps the operation visual field area, or that the idle-field ratio of the region where the interface icon is located is smaller than a set threshold. During execution of the interface display control method, the two conditions can be judged separately, and the interface icon is considered to occlude the field of view when either of them is met. The idle-field ratio may be the area ratio of the idle visual field within the region where the icon is located to the total area of that region, in which case the corresponding set threshold is a set area-ratio threshold; or it may be the minimum distance between the edge of the non-idle area within the region and the region boundary, in which case the corresponding set threshold is a set distance threshold.
In the above embodiment, the position control of the interface icon in the visual interface of the surgical environment is optimized by optimizing the judgment condition for judging whether the visual field of the interface icon is blocked, so as to ensure the integrity of the visual field of the doctor.
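The two judgment conditions above can be sketched as follows, with rectangles as `(x0, y0, x1, y1)` tuples and a default ratio threshold of 0.5 chosen arbitrarily for illustration:

```python
def rect_overlap(a, b):
    """Overlap area of two axis-aligned (x0, y0, x1, y1) rectangles."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(0, w) * max(0, h)

def icon_occludes(icon, work_area, free_ratio, ratio_threshold=0.5):
    """Either condition suffices: the icon intersects the operation
    visual field area, or the idle-field ratio of its region falls
    below the set threshold (0.5 here is an illustrative default)."""
    return rect_overlap(icon, work_area) > 0 or free_ratio < ratio_threshold
```

When `icon_occludes` is true, the icon is moved into the idle visual field area, matching step S14/S141 described later.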
In some embodiments, the adjusting the interface icon to be within the free field of view area includes:
forming a plurality of idle vision blocks from the idle vision area according to the relative position of the idle vision area relative to the working vision area in the surgical environment vision interface;
calculating the idle field duty ratio of a plurality of the idle field blocks;
and adjusting the interface icon to be in a target idle field block with the idle field duty ratio meeting the preset requirement.
Referring to fig. 4, which shows an optional division of the idle visual field area, the idle visual field area is divided into eight blocks — upper left, upper, upper right, left, right, lower left, lower, and lower right — according to their relative positions with respect to the operation visual field area. It should be noted that, as the operation dynamically changes and the position of the operation visual field area in the surgical environment visual field interface changes, the number of idle visual field blocks may decrease and the size of each block may increase or decrease. Depending on the design of the surgical environment visual field interface, virtual keys for the doctor's operations may be arranged in different idle visual field blocks. When adjusting the position of the interface icon, the main operation equipment can calculate the idle-field ratio of each of the idle visual field blocks according to the real-time situation, and adjust the interface icon into a target idle visual field block whose idle-field ratio meets a preset requirement — for example, the block with the largest idle-field ratio — thereby reducing the frequency of interface icon position adjustments.
In the above embodiment, the idle visual field area is divided into a plurality of idle visual field blocks, and when the position of the interface icon is adjusted, the block with the largest idle-field ratio is selected to hold the interface icon. This optimizes the overall layout of the surgical environment visual field interface, reduces the frequency of interface icon position adjustments, and avoids frequent jumps of the interface icon position caused by frequent changes in the idle visual field area.
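A minimal sketch of the block-selection rule, assuming each candidate block's idle-field ratio is simply the fraction of its area not covered by the operation visual field area (rectangle layout and names are assumptions):

```python
def overlap(a, b):
    """Overlap area of two axis-aligned (x0, y0, x1, y1) rectangles."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(0, w) * max(0, h)

def best_free_block(blocks, work_area):
    """Return the candidate block least covered by the operation visual
    field area, i.e. the block with the largest idle-field ratio."""
    def idle_ratio(b):
        area = (b[2] - b[0]) * (b[3] - b[1])
        return 1.0 - overlap(b, work_area) / area
    return max(blocks, key=idle_ratio)
```

Hysteresis (only moving when the ratio crosses a set threshold, as described later for the dynamic mode) would be layered on top of this selection to avoid jitter.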
In some embodiments, before the determining the current display area of the target instrument in the surgical environment visual field interface, the method comprises:
judging whether the icon display mode is in a dynamic mode or a fixed mode;
if the operation environment visual field interface is in the dynamic mode, executing the step of determining the current display area of the target instrument in the operation environment visual field interface;
and if the interface icon is in the fixed mode, waiting for an interface icon adjusting instruction, and correspondingly adjusting the position of the interface icon based on the interface icon adjusting instruction.
A dynamic mode and a fixed mode are provided for adjusting the position of the interface icon, and the doctor can select either according to the current need. When the dynamic mode is selected, the interface icon is adjusted automatically by determining the current display area of the target instrument in the surgical environment visual field interface and determining the operation visual field area and the idle visual field area. When the fixed mode is selected, the interface icon is adjusted according to the doctor's specific adjustment operation on its position, such as manually moving the interface icon to a certain position in the surgical environment visual field interface. The interface icon adjustment instruction may be an instruction formed by the doctor dragging the interface icon to a certain position in the surgical environment visual field interface.
In the above embodiment, the dynamic mode and the fixed mode are set for the doctor to select according to the requirements in different application scenarios, so as to meet the requirements of more different application scenarios.
In some embodiments, the free field of view area includes a plurality of free field of view tiles located at different orientations within the surgical environment field of view interface, the adjusting the position of the interface icon based on the interface icon adjustment instruction includes:
acquiring a selection instruction of an idle view field block of a to-be-selected azimuth in the surgical environment view field interface, and adjusting the interface icon to be in the idle view field block of the corresponding azimuth according to the selection instruction.
In the fixed mode, adjustment selection keys corresponding to the idle visual field blocks are arranged according to the positions and number of the blocks, and the interface icon can be adjusted into a given block by clicking the corresponding key. For example, with eight idle visual field blocks, the surgical environment visual field interface is correspondingly provided with eight adjustment selection keys; when the user clicks one of them, a selection instruction for the idle visual field block of the azimuth to be selected is obtained according to the user's selection operation, and the interface icon is adjusted into the idle visual field block of the corresponding azimuth according to the selection instruction.
In the above embodiment, a fixed mode for adjusting the position of the interface icon is provided, and for the relative position of the idle field area with respect to the operation field area, the selection of the idle field area with multiple directions is provided, so that the user can customize the display position of the interface icon to avoid the situation of shielding the field of view.
In some embodiments, after the adjusting the interface icon to be within the free field of view area, the method further comprises:
and updating and storing the current position of the interface icon.
Under the condition that the display position of the interface icon in the surgical environment visual field interface is changed, the current position of the interface icon is updated and stored, so that the current position of the interface icon is updated in real time. The display position of the interface icon in the surgical environment field interface is changed, which includes a case where the interface icon automatically adjusts the position according to the shielding of the field of view in the dynamic mode, and a case where the interface icon adjusts the position based on the adjustment instruction of the user in the fixed mode.
In the above embodiment, the position change of the interface icon is updated and recorded in real time, so as to ensure that the latest position of the interface icon can be obtained in real time in the process of adjusting the position of the interface icon in the surgical environment visual field interface.
Optionally, before the determining the current display area of the target instrument in the surgical environment visual field interface, the method includes:
acquiring an operation environment view field image acquired by an endoscope, and displaying the operation environment view field image on the operation environment view field interface;
wherein the interface icon comprises an endoscope icon.
To provide a more general understanding of the interface display control method provided by the embodiment of the present application, please refer to fig. 5 and 6, which take an endoscope icon as an optional specific example of an interface icon. The endoscope icon represents the current posture of the endoscope and is displayed in the surgical environment visual field interface while the doctor controls the endoscope; it can indicate the current rotation angle and mirror angle of the endoscope in real time. The interface display control method includes the following steps:
S10, start controlling the endoscope;
S11, judge whether the position adjustment mode of the interface icon is the fixed mode or the dynamic mode; if the dynamic mode, execute S121; if the fixed mode, execute S122;
S121, determine the operation visual field area and the idle visual field area in the surgical environment visual field interface; taking the target instrument being an active instrument as an example, in the dynamic mode, the display area of the active instrument in the surgical environment visual field interface is determined from its position and size, and the operation visual field area and the idle visual field area are determined based on that display area;
S13, acquire the position of the interface icon;
S14, judge whether the interface icon occludes the field of view; if yes, execute S141; if not, execute S142 and keep the position of the interface icon unchanged;
S141, determine the idle visual field block with the largest idle-field ratio in the idle visual field area, and adjust the interface icon into that block; after S141, execute S15;
S122, judge whether an interface icon adjustment instruction for adjusting the position of the interface icon has been received; if yes, execute S123; if not, execute S142 and keep the position of the interface icon unchanged;
S123, adjust the interface icon into the corresponding idle visual field block according to the adjustment azimuth indicated by the interface adjustment instruction; after S123, execute S15;
S15, update and store the position of the interface icon;
S16, judge whether to stop controlling the endoscope; if yes, execute S161; if not, return to S11;
S161, remove the endoscope icon.
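One decision pass of the S11–S15 flow can be sketched as a function returning the action to take (the mode names and action strings are illustrative assumptions):

```python
def icon_step(mode, occluded, adjust_cmd=None):
    """One pass of the decision flow: in the dynamic mode the icon is
    auto-adjusted when it occludes the field of view (S14/S141); in the
    fixed mode it waits for an explicit adjustment instruction (S122/S123);
    otherwise its position is kept unchanged (S142)."""
    if mode == "dynamic":
        return "move_to_largest_free_block" if occluded else "keep_position"
    # fixed mode: only an explicit instruction moves the icon
    if adjust_cmd is not None:
        return f"move_to_{adjust_cmd}"
    return "keep_position"
```

The surrounding loop (start/stop controlling the endoscope, storing the updated position) would call this once per frame or per event.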
In the above embodiment, the interface display control method at least has the following characteristics:
First, a fixed mode and a dynamic mode for adjusting the UI display position are supported, ensuring that the UI does not block the operator's working field of view while the surgical field moves or the endoscope view moves. In the fixed mode, multiple choices corresponding to different idle visual field blocks are provided, increasing the options for UI display azimuth so that the user can customize the UI display position to avoid occluding the field of view;
Second, the dynamic mode can intelligently identify the user's operation area and then automatically and dynamically move the UI to a position that does not block the field of view. The operator can thus operate in the area currently displaying the UI without moving the field of view (because the UI automatically moves to an idle visual field area), reducing the doctor's workload when frequently switching operation areas. Moreover, the rule for UI movement in the dynamic mode is that the idle-field ratio must be larger than a set threshold, which avoids frequent jitter of the UI position caused by frequent changes of the largest idle block;
Third, by optimizing the position of the interface icons in the surgical environment visual field interface in real time, the interface can present a richer set of icons representing the positions and execution states of more types of instruments, while the doctor's field of view remains unobstructed by interface icons throughout the operation, so that the doctor can perform the surgery better.
In some embodiments, step S105 above, adjusting the interface icon to be within the free field area includes:
and acquiring the sizes of the interface icon and the free visual field area.
And judging whether the size of the interface icon is smaller than the size of the free visual field area.
If yes, the interface icon is adjusted to be in the free visual field area.
And if not, adjusting the size of the interface icon to be smaller than the size of the free visual field area, and adjusting the interface icon to be in the free visual field area.
Wherein adjusting the size of the interface icon to be smaller than the size of the free field of view comprises: the interface icon is resized to a proportional value of the size of the free field of view area, e.g., 0.5, 0.6, 0.7, 0.8, 0.9, or any other multiple less than 1.
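A sketch of this scaling rule, with the 0.8 default proportion chosen arbitrarily from the listed example multiples:

```python
def fit_icon(icon_size, free_size, scale=0.8):
    """If the icon already fits in the idle visual field area, keep its
    size; otherwise shrink it to a proportion (< 1) of the free-area
    size, per the scaling rule above. scale=0.8 is one of the listed
    example values, chosen arbitrarily here."""
    w, h = icon_size
    fw, fh = free_size
    if w <= fw and h <= fh:
        return (w, h)
    return (fw * scale, fh * scale)
```

After resizing, the icon is placed inside the idle visual field area as in the earlier steps.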
In some embodiments, the interface display control method may further include the steps of:
a current display area of the target instrument in the surgical environment field of view interface is determined.
And determining a working visual field area and an idle visual field area in the visual field interface of the surgical environment according to the current display area of the target instrument.
Acquiring the current position of an interface icon, acquiring the sizes of the interface icon and the idle visual field area if the position falls into the operation visual field area, and judging whether the size of the interface icon is smaller than the size of the idle visual field area.
If yes, the interface icon is adjusted to be in the free visual field area.
If not, the size of the interface icon is adjusted in situ to reduce the shielding of the operation visual field area.
Wherein adjusting the size of the interface icon in situ comprises: resizing the interface icon to a proportion of the original interface icon's size, e.g., 0.5, 0.6, 0.7, 0.8, 0.9, or any other multiple less than 1. Adjusting the size of the interface icon in situ further comprises keeping the center position of the interface icon unchanged.
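A sketch of the in-situ adjustment: scaling the icon rectangle about its own center so the center position stays unchanged (the `(x0, y0, x1, y1)` rectangle layout is an assumption):

```python
def shrink_in_place(rect, factor=0.5):
    """Scale an (x0, y0, x1, y1) icon rectangle about its own center,
    keeping the center position unchanged as described above. factor
    is one of the listed example multiples (< 1)."""
    x0, y0, x1, y1 = rect
    cx, cy = (x0 + x1) / 2, (y0 + y1) / 2   # center is preserved
    hw = (x1 - x0) / 2 * factor
    hh = (y1 - y0) / 2 * factor
    return (cx - hw, cy - hh, cx + hw, cy + hh)
```

Because only the half-extents are scaled, the occlusion of the operation visual field area shrinks while the icon stays where the user expects it.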
Referring to fig. 7, in another aspect of the embodiment of the present application, an interface display control device is provided, including: a determining module 211, configured to determine a current display area of the target instrument in the surgical environment field interface; the visual field dividing module 212 is configured to determine an idle visual field area in the surgical environment visual field interface according to a current display area of the target instrument; and the adjusting module 213 is configured to obtain a current position of the interface icon, and adjust the interface icon to be in the free field area if it is determined that the interface icon has a field of view occlusion according to the position.
Optionally, the view dividing module 212 is specifically configured to determine, according to a current display area of the target instrument, a working view area in the surgical environment view interface; and determining an idle visual field area in the surgical environment visual field interface based on the operation visual field area.
Optionally, the view dividing module 212 is specifically configured to acquire a view image of a surgical environment, perform object detection on the view image of the surgical environment, and determine a position and a size of the target instrument included in the view image of the surgical environment; and determining a current display area of the target instrument according to the position and the size of the target instrument, and determining a working visual field area and an idle visual field area in the visual field interface of the surgical environment based on the current display area.
Optionally, the view dividing module 212 is further configured to acquire a view image of a surgical environment through a neural network model, perform object detection on the view image of the surgical environment, and determine a position and a size of the target instrument included in the view image of the surgical environment.
Optionally, the visual field dividing module 212 is further configured to construct an initial neural network model, and train the initial neural network model based on a training sample set of surgical environment visual field images carrying target object annotations, to obtain a trained neural network model; the target object annotation includes an annotation of the position and size of the target instrument contained in the visual field image.
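The training step can be outlined structurally. The patent does not specify a network architecture, so the sketch below uses a trivial stand-in regressor purely to show the shape of the loop (iterate over annotated samples, compute a prediction error, update parameters); in the embodiment the samples would pair visual field images with (position, size) annotations of the target instrument and the model would be a detector.

```python
def train(model_params, samples, lr=0.1, epochs=50):
    """Structural sketch of supervised training on annotated samples.

    samples: list of (feature, target) pairs; in the patent's setting
    the target would be the annotated instrument geometry. The linear
    model here is a placeholder, not the claimed detector.
    """
    w, b = model_params
    for _ in range(epochs):
        for x, y in samples:
            pred = w * x + b
            err = pred - y
            # stochastic gradient step on squared error
            w -= lr * err * x
            b -= lr * err
    return w, b
```

After training on two points of the line y = 2x, the stand-in model predicts close to 6 at x = 3.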
Optionally, the visual field dividing module 212 is further configured to determine a relative distance between a target site on the target instrument and the endoscope mirror surface; determine, according to the relative distance and the conversion relation between an image coordinate system corresponding to the surgical environment visual field of the endoscope and a world coordinate system, the position and size of the target instrument contained in the visual field image; determine a current display area of the target instrument according to the position and size of the target instrument; and determine a working visual field area and a free visual field area in the surgical environment visual field interface based on the current display area.
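The conversion between the relative distance and the on-screen position and size follows standard pinhole-camera geometry. The sketch below is illustrative: the intrinsic parameters (focal length in pixels, principal point) are assumed values, not parameters given in the embodiment.

```python
def project_size(physical_size_mm, depth_mm, focal_px):
    """Apparent size in pixels of an object of physical_size_mm seen at
    depth_mm by a camera with focal length focal_px (in pixels)."""
    return physical_size_mm * focal_px / depth_mm

def project_point(x_mm, y_mm, depth_mm, fx, fy, cx, cy):
    """Map a point in the camera (world) frame to image pixel coordinates
    using the pinhole model: u = fx * X / Z + cx, v = fy * Y / Z + cy."""
    u = fx * x_mm / depth_mm + cx
    v = fy * y_mm / depth_mm + cy
    return u, v
```

With an assumed 1000 px focal length, a 10 mm instrument tip at 50 mm depth occupies about 200 px, which directly sizes its current display area.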
Optionally, the visual field dividing module 212 is further configured to obtain endoscope structural parameters; determine the type of the target instrument and, according to the type, determine instrument structural parameters of the target instrument; obtain driving parameters corresponding to the target instrument; and determine the relative distance between the end of the target instrument and the endoscope mirror surface according to the endoscope structural parameters, the instrument structural parameters and the driving parameters.
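One plausible reading of this computation, sketched with made-up parameter names and simple one-dimensional geometry along the common insertion axis: the tip position follows from the drive parameter (shaft insertion depth) plus a per-type structural tip offset, and the relative distance is that position minus the endoscope's insertion depth.

```python
# Illustrative per-type structural parameter (tip offset from the shaft
# reference point, in mm). The values are assumptions, not patent data.
INSTRUMENT_TIP_OFFSET_MM = {"electrocautery": 18.0, "shears": 22.0}

def tip_to_lens_distance(endoscope_insertion_mm, shaft_insertion_mm,
                         instrument_type):
    """Estimate the distance from the endoscope lens to the instrument tip
    along the shared view axis, from structural and drive parameters."""
    tip = shaft_insertion_mm + INSTRUMENT_TIP_OFFSET_MM[instrument_type]
    return tip - endoscope_insertion_mm
```

For example, an endoscope inserted 100 mm and shears whose shaft is driven 120 mm deep would place the tip 42 mm in front of the lens under these assumed offsets.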
Optionally, the adjusting module 213 is specifically configured to obtain the current position of the interface icon; determine, according to the position of the interface icon, whether the interface icon overlaps the working visual field area and/or whether the free visual field ratio of the area where the interface icon is located is smaller than a set threshold; and if so, adjust the interface icon to be within the free visual field area.
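The two occlusion tests named above (overlap with the working area, and a free-visual-field ratio checked against a set threshold) can be sketched with axis-aligned rectangles. The rectangle representation, the 0.5 threshold, and the function names are assumptions of this illustration.

```python
def rects_overlap(a, b):
    """True if rectangles a and b, each (x, y, w, h), intersect."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def free_ratio(icon, working):
    """Fraction of the icon's region NOT covered by the working area."""
    ix, iy, iw, ih = icon
    wx, wy, ww, wh = working
    ox = max(0, min(ix + iw, wx + ww) - max(ix, wx))
    oy = max(0, min(iy + ih, wy + wh) - max(iy, wy))
    return 1.0 - (ox * oy) / (iw * ih)

def needs_relocation(icon, working, threshold=0.5):
    """Either test triggering means the icon should move to a free area."""
    return rects_overlap(icon, working) or free_ratio(icon, working) < threshold
```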
Optionally, the adjusting module 213 is further configured to divide the free visual field area into a plurality of free visual field blocks according to the relative position of the free visual field area with respect to the working visual field area in the surgical environment visual field interface; calculate the free visual field ratio of each of the free visual field blocks; and adjust the interface icon to be within a target free visual field block whose free visual field ratio meets a preset requirement.
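The block-selection step can be illustrated as follows; the interpretation of the ratio as block area over frame area, the minimum-ratio requirement, and the choice of the largest qualifying block are assumptions of this sketch.

```python
def pick_free_block(blocks, frame_area, min_ratio=0.05):
    """blocks: dict name -> (x, y, w, h). Return the name of the largest
    block whose area ratio meets min_ratio, or None if none qualifies."""
    ratios = {name: (r[2] * r[3]) / frame_area for name, r in blocks.items()}
    qualifying = {name: q for name, q in ratios.items() if q >= min_ratio}
    if not qualifying:
        return None
    return max(qualifying, key=qualifying.get)
```

With a 400-pixel-wide left strip and a 100-pixel-tall top strip on a 1920x1080 frame, the left block wins; raising the requirement to 0.5 disqualifies both.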
Optionally, the interface display control device further includes a judging module, configured to judge whether an icon display mode is a dynamic mode or a fixed mode; if the icon display mode is the dynamic mode, the step of determining the current display area of the target instrument in the surgical environment visual field interface is executed; and if the icon display mode is the fixed mode, an interface icon adjustment instruction is awaited, and the position of the interface icon is adjusted accordingly based on the interface icon adjustment instruction.
Optionally, the free visual field area includes a plurality of free visual field blocks located in different orientations in the surgical environment visual field interface, and the adjusting module 213 is further configured to obtain a selection instruction for a free visual field block in a to-be-selected orientation in the surgical environment visual field interface, and to adjust the interface icon to be within the free visual field block in the corresponding orientation according to the selection instruction.
Optionally, the view dividing module 212 is further configured to update and save the current location of the interface icon.
Optionally, the interface display control device further comprises an acquisition module, configured to acquire a surgical environment visual field image captured by an endoscope and display the surgical environment visual field image on the surgical environment visual field interface; wherein the interface icon comprises an endoscope icon.
Optionally, the target instrument comprises an activated instrument and/or a designated non-activated instrument.
Optionally, the adjusting module 213 is further configured to obtain the sizes of the interface icon and the free visual field area; judge whether the size of the interface icon is smaller than that of the free visual field area; if so, adjust the interface icon to be within the free visual field area; or, if not, reduce the size of the interface icon to be smaller than that of the free visual field area and then adjust the interface icon to be within the free visual field area.
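The size check and shrink step can be sketched as a fit function that scales the icon down when it does not fit the free visual field area. The names and the aspect-ratio-preserving policy are assumptions of this illustration.

```python
def fit_icon(icon_size, area_size):
    """icon_size, area_size: (w, h) in pixels. Return the (possibly
    shrunk) icon size that fits inside the free visual field area,
    preserving the icon's aspect ratio."""
    iw, ih = icon_size
    aw, ah = area_size
    if iw <= aw and ih <= ah:
        return icon_size  # already fits, keep original size
    scale = min(aw / iw, ah / ih)
    return (int(iw * scale), int(ih * scale))
```

A 400x200 icon placed into a 200x200 area is scaled by 0.5 to 200x100 before placement.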
It should be noted that the interface display control device provided in the above embodiment is described, when adjusting and optimizing the display position of the interface icon, only by way of example of the division into the above program modules; in practical applications, the processing may be allocated to different program modules as needed, that is, the internal structure of the device may be divided into different program modules to complete all or part of the method steps described above. In addition, the interface display control device and the interface display control method provided in the foregoing embodiments belong to the same concept; their specific implementation processes are detailed in the method embodiments and are not repeated here.
Referring to fig. 8, in another aspect of the embodiments of the present application, a computer device is provided, which includes a processor 21, a memory 22 connected to the processor 21, and a computer program stored on the memory 22 and executable by the processor 21; when executed by the processor 21, the computer program implements the interface display control method according to any of the embodiments of the present application. It should be noted that the processor 21 may include one or more physically separate processors, and the plurality of processors, or intelligent devices including the processors, are communicatively connected to cooperatively execute the interface display control method of the embodiments of the present application. Accordingly, the memory 22 may also include one or more storage media of the same or different types that are physically separate from each other.
In another aspect of the embodiments of the application, a teleoperational medical system is further provided, which comprises the above computer device and a slave operation device connected with the computer device, wherein the slave operation device comprises a plurality of instruments of different types and a driving component for driving the instruments to execute specified actions. The instruments may include an endoscope for capturing surgical environment visual field images and transmitting them to the computer device for display. The instruments comprise at least one of the following: an electrocautery device, a forceps holder, a stapler, shears, and an ultrasonic probe.
The embodiments of the application also provide a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the processes of the above interface display control method embodiments and can achieve the same technical effects, which are not repeated here. The computer-readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or the like.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by means of software plus a necessary general-purpose hardware platform, or by means of hardware, although in many cases the former is preferred. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) comprising instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods of the embodiments of the present application.
The foregoing is merely illustrative of the present application and does not limit it; any variation or substitution readily conceivable by a person skilled in the art within the technical scope disclosed herein shall fall within the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (19)

1. An interface display control method applied to a computer device, comprising the following steps:
Determining a current display area of a target instrument in a surgical environment visual field interface;
determining an idle visual field area in the surgical environment visual field interface according to the current display area of the target instrument;
acquiring the current position of an interface icon, and if it is determined according to the position that the interface icon occludes the visual field, adjusting the interface icon to be within the free visual field area;
wherein the determining, according to the current display area of the target instrument, an idle field of view area in the surgical environment field of view interface includes:
determining the relative distance between a target part on the target instrument and the mirror surface of the endoscope;
determining the position and the size of the target instrument contained in a view field image according to the relative distance and the conversion relation between an image coordinate system corresponding to the view field of the surgical environment of the endoscope and a world coordinate system;
and determining a current display area of the target instrument according to the position and the size of the target instrument, and determining a working visual field area and an idle visual field area in the visual field interface of the surgical environment based on the current display area.
2. The interface display control method according to claim 1, wherein the determining a work field of view area and an idle field of view area in the surgical environment field of view interface based on the current display area includes:
Determining a working visual field area in the surgical environment visual field interface according to the current display area of the target instrument;
and determining an idle visual field area in the surgical environment visual field interface based on the operation visual field area.
3. The interface display control method according to claim 2, wherein the determining an idle field of view area in the surgical environment field of view interface according to the current display area of the target instrument includes:
acquiring a surgical environment view field image, performing target detection on the surgical environment view field image, and determining the position and the size of the target instrument contained in the surgical environment view field image;
and determining a current display area of the target instrument according to the position and the size of the target instrument, and determining a working visual field area and an idle visual field area in the visual field interface of the surgical environment based on the current display area.
4. The interface display control method according to claim 3, wherein the acquiring a surgical environment field image, performing object detection on the surgical environment field image, determining a position and a size of the target instrument included in the surgical environment field image, includes:
Acquiring a surgical environment field image, performing target detection on the surgical environment field image through a neural network model, and determining the position and the size of the target instrument contained in the surgical environment field image.
5. The interface display control method according to claim 4, wherein before the acquiring of a surgical environment field image, the performing of object detection on the surgical environment field image through a neural network model, and the determining of the position and the size of the target instrument included in the surgical environment field image, the method includes:
constructing an initial neural network model;
training the initial neural network model based on a training sample set of surgical environment visual field images carrying target object annotations, to obtain a trained neural network model; the target object annotation includes an annotation of the position and size of the target instrument contained in the visual field image.
6. The interface display control method of claim 1, wherein said determining the relative distance of the target site on the target instrument to the endoscope mirror surface comprises:
obtaining structural parameters of an endoscope;
determining the type of the target instrument, and determining instrument structure parameters of the target instrument according to the type of the target instrument;
Obtaining driving parameters corresponding to the target instrument;
and determining the relative distance between the end part of the target instrument and the endoscope mirror surface according to the endoscope structure parameter, the instrument structure parameter and the driving parameter.
7. The method of claim 1, wherein the acquiring of the current position of the interface icon and, if it is determined according to the position that the interface icon occludes the visual field, the adjusting of the interface icon to be within the free visual field area comprise:
acquiring the current position of an interface icon;
determining, according to the position of the interface icon, whether the interface icon overlaps the working visual field area, and/or whether the free visual field ratio of the area where the interface icon is located is smaller than a set threshold;
if yes, the interface icon is adjusted to be in the free visual field area.
8. The interface display control method according to claim 7, wherein the adjusting the interface icon to be within the free-view area includes:
dividing the free visual field area into a plurality of free visual field blocks according to the relative position of the free visual field area with respect to the working visual field area in the surgical environment visual field interface;
calculating the free visual field ratio of each of the plurality of free visual field blocks;
and adjusting the interface icon to be within a target free visual field block whose free visual field ratio meets a preset requirement.
9. The interface display control method of any one of claims 1 to 8, wherein before the determining of the current display area of the target instrument in the surgical environment field of view interface, the method comprises:
judging whether the icon display mode is in a dynamic mode or a fixed mode;
if the icon display mode is the dynamic mode, executing the step of determining the current display area of the target instrument in the surgical environment visual field interface;
and if the icon display mode is the fixed mode, waiting for an interface icon adjustment instruction, and correspondingly adjusting the position of the interface icon based on the interface icon adjustment instruction.
10. The interface display control method of claim 9, wherein the free field of view area comprises a plurality of free field of view tiles positioned at different orientations within the surgical environment field of view interface, the adjusting the position of the interface icon based on the interface icon adjustment instruction comprises:
acquiring a selection instruction of an idle view field block of a to-be-selected azimuth in the surgical environment view field interface, and adjusting the interface icon to be in the idle view field block of the corresponding azimuth according to the selection instruction.
11. The interface display control method according to any one of claims 1 to 8, characterized by further comprising, after the interface icon is adjusted to be within the free-view area:
and updating and storing the current position of the interface icon.
12. The interface display control method of any one of claims 1 to 8, wherein before the determining of the current display area of the target instrument in the surgical environment field of view interface, the method further comprises:
acquiring an operation environment view field image acquired by an endoscope, and displaying the operation environment view field image on the operation environment view field interface;
wherein the interface icon comprises an endoscope icon.
13. The interface display control method of claim 1, wherein the target instrument comprises an activated instrument and/or a designated non-activated instrument.
14. The interface display control method according to claim 1, wherein the adjusting the interface icon to be within the free-view area includes:
acquiring the sizes of the interface icon and the free visual field area;
judging whether the size of the interface icon is smaller than the size of the free visual field area or not;
If yes, adjusting the interface icon to be in the free visual field area; or,
and if not, adjusting the size of the interface icon to be smaller than the size of the free visual field area, and adjusting the interface icon to be in the free visual field area.
15. An interface display control apparatus, comprising:
the determining module is used for determining the current display area of the target instrument in the surgical environment visual field interface;
the visual field dividing module is used for determining an idle visual field area in the surgical environment visual field interface according to the current display area of the target instrument;
the adjustment module is used for acquiring the current position of the interface icon, and adjusting the interface icon to the idle visual field area if the visual field shielding exists in the interface icon according to the position;
the visual field dividing module is specifically used for determining the relative distance between a target part on the target instrument and the mirror surface of the endoscope; determining the position and the size of the target instrument contained in a view field image according to the relative distance and the conversion relation between an image coordinate system corresponding to the view field of the surgical environment of the endoscope and a world coordinate system; and determining a current display area of the target instrument according to the position and the size of the target instrument, and determining a working visual field area and an idle visual field area in the visual field interface of the surgical environment based on the current display area.
16. A computer device comprising a processor, a memory coupled to the processor, and a computer program stored on the memory and executable by the processor, the computer program when executed by the processor implementing the interface display control method of any one of claims 1 to 14.
17. A teleoperational medical system comprising a computer device according to claim 16 and a slave operating device connected to the computer device, the slave operating device comprising a plurality of instruments of different types and a drive assembly for driving the instruments to perform a specified action.
18. The teleoperational medical system of claim 17, wherein the instruments comprise at least one of: an electrocautery device, a forceps holder, a stapler, shears, an ultrasonic probe.
19. A computer-readable storage medium, wherein a computer program is stored thereon, which when executed by a processor, implements the interface display control method according to any one of claims 1 to 14.
CN202111316069.1A 2021-11-08 2021-11-08 Interface display control method and device, computer equipment and system and medium Active CN114041874B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111316069.1A CN114041874B (en) 2021-11-08 2021-11-08 Interface display control method and device, computer equipment and system and medium
CN202310888179.8A CN116849803A (en) 2021-11-08 2021-11-08 Interface display control method and device, computer equipment and system and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111316069.1A CN114041874B (en) 2021-11-08 2021-11-08 Interface display control method and device, computer equipment and system and medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202310888179.8A Division CN116849803A (en) 2021-11-08 2021-11-08 Interface display control method and device, computer equipment and system and medium

Publications (2)

Publication Number Publication Date
CN114041874A CN114041874A (en) 2022-02-15
CN114041874B true CN114041874B (en) 2023-08-22

Family

ID=80207835

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202111316069.1A Active CN114041874B (en) 2021-11-08 2021-11-08 Interface display control method and device, computer equipment and system and medium
CN202310888179.8A Pending CN116849803A (en) 2021-11-08 2021-11-08 Interface display control method and device, computer equipment and system and medium

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202310888179.8A Pending CN116849803A (en) 2021-11-08 2021-11-08 Interface display control method and device, computer equipment and system and medium

Country Status (1)

Country Link
CN (2) CN114041874B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102821671A (en) * 2010-03-31 2012-12-12 富士胶片株式会社 Endoscope observation supporting system and method, and device and programme
CN105808230A (en) * 2014-12-31 2016-07-27 深圳Tcl新技术有限公司 Method and device for moving suspension icon
CN108778180A (en) * 2016-03-02 2018-11-09 柯惠Lp公司 System and method for removing the occlusion objects in operative image and/or video
CN110584778A (en) * 2018-06-12 2019-12-20 上海舍成医疗器械有限公司 Method and device for adjusting object posture and application of device in automation equipment
CN111954486A (en) * 2017-10-27 2020-11-17 深圳迈瑞生物医疗电子股份有限公司 Monitor, display method, display device and storage medium applied to monitor
CN112168358A (en) * 2015-03-17 2021-01-05 直观外科手术操作公司 System and method for screen recognition of instruments in teleoperational medical systems
CN112274250A (en) * 2015-03-17 2021-01-29 直观外科手术操作公司 System and method for presenting screen identification of instrument in teleoperational medical system
CN112580474A (en) * 2020-12-09 2021-03-30 云从科技集团股份有限公司 Target object detection method, system, device and medium based on computer vision
CN112971996A (en) * 2021-02-03 2021-06-18 上海微创医疗机器人(集团)股份有限公司 Computer-readable storage medium, electronic device, and surgical robot system
CN112971688A (en) * 2021-02-07 2021-06-18 杭州海康慧影科技有限公司 Image processing method and device and computer equipment

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102117273B1 (en) * 2013-03-21 2020-06-01 삼성전자주식회사 Surgical robot system and method for controlling the same
US10426339B2 (en) * 2016-01-13 2019-10-01 Novartis Ag Apparatuses and methods for parameter adjustment in surgical procedures
CN112672709A (en) * 2018-07-31 2021-04-16 直观外科手术操作公司 System and method for tracking the position of a robotically-manipulated surgical instrument
JP7080861B2 (en) * 2019-07-29 2022-06-06 株式会社メディカロイド Surgical system
GB2588829B (en) * 2019-11-11 2023-11-29 Cmr Surgical Ltd Method of controlling a surgical robot

Also Published As

Publication number Publication date
CN116849803A (en) 2023-10-10
CN114041874A (en) 2022-02-15

Similar Documents

Publication Publication Date Title
US20240115333A1 (en) Surgical system with training or assist functions
KR102014359B1 (en) Method and apparatus for providing camera location using surgical video
KR102298412B1 (en) Surgical image data learning system
CN112220562A (en) Method and system for enhancing surgical tool control during surgery using computer vision
US20190325574A1 (en) Surgical simulator providing labeled data
CN112804958A (en) Indicator system
KR101926123B1 (en) Device and method for segmenting surgical image
US20140012793A1 (en) System and method for predicting surgery progress stage
KR102427736B1 (en) Camera controller robot based on surgical image recognition and method for adjusting view of camera using the same
EP3819867A1 (en) Surgical scene assessment based on computer vision
JP7235212B2 (en) Handedness Tool Detection System for Surgical Video
US20220104887A1 (en) Surgical record creation using computer recognition of surgical events
CN114041874B (en) Interface display control method and device, computer equipment and system and medium
US20220409301A1 (en) Systems and methods for identifying and facilitating an intended interaction with a target object in a surgical space
KR20180100831A (en) Method for controlling view point of surgical robot camera and apparatus using the same
US20230045686A1 (en) Fusion of spatial and temporal context for location determination for visualization systems
WO2019222480A1 (en) Confidence-based robotically-assisted surgery system
CN114327046B (en) Method, device and system for multi-mode human-computer interaction and intelligent state early warning
US20220354586A1 (en) Robotic surgery
WO2022263430A1 (en) Joint identification and pose estimation of surgical instruments
CN116940301A (en) Tool type agnostic auxiliary function for surgical procedures
CN115376676A (en) Surgical instrument adjustment method, surgical system, and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant