CN114041874A - Interface display control method and device, computer equipment and system, and medium - Google Patents

Interface display control method and device, computer equipment and system, and medium Download PDF

Info

Publication number
CN114041874A
CN114041874A (application number CN202111316069.1A)
Authority
CN
China
Prior art keywords
interface
view
field
idle
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111316069.1A
Other languages
Chinese (zh)
Other versions
CN114041874B (en)
Inventor
黄帆
王牌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Edge Medical Co Ltd
Original Assignee
Shenzhen Edge Medical Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Edge Medical Co Ltd filed Critical Shenzhen Edge Medical Co Ltd
Priority to CN202111316069.1A priority Critical patent/CN114041874B/en
Priority to CN202310888179.8A priority patent/CN116849803A/en
Publication of CN114041874A publication Critical patent/CN114041874A/en
Application granted granted Critical
Publication of CN114041874B publication Critical patent/CN114041874B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/00009 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A61B 1/00045 Display arrangement
    • A61B 1/00147 Holding or positioning arrangements
    • A61B 1/00174 Optical arrangements characterised by the viewing angles
    • A61B 1/04 Instruments combined with photographic or television appliances
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/25 User interfaces for surgical systems
    • A61B 34/30 Surgical robots
    • A61B 34/70 Manipulators specially adapted for use in surgery
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/361 Image-producing devices, e.g. surgical cameras
    • A61B 90/37 Surgical systems with images on a monitor during operation
    • A61B 2034/2065 Tracking using image or pattern recognition
    • A61B 2034/302 Surgical robots specifically adapted for manipulations within body cavities, e.g. within abdominal or thoracic cavities

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • General Health & Medical Sciences (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Optics & Photonics (AREA)
  • Robotics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Gynecology & Obstetrics (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present application provide an interface display control method and apparatus, a computer device and system, and a computer storage medium. The method comprises the following steps: determining the current display area of a target instrument in a surgical environment field-of-view interface; determining an idle field-of-view region in the interface according to that display area; and acquiring the current position of an interface icon and, if it is determined from that position that the icon occludes the view, moving the icon into the idle region. The position of the interface icon in the surgical environment field-of-view interface is thus optimized in real time: the interface can present a rich set of icons representing the positions and execution states of many types of instruments, while ensuring that the icons do not block the surgeon's view during the operation, so that the surgeon can operate better.

Description

Interface display control method and device, computer equipment and system, and medium
Technical Field
The present application relates to the field of surgical robot technology, and in particular, to an interface display control method and apparatus, a computer device and system, and a computer-readable storage medium.
Background
Minimally invasive surgery is performed inside the human body with modern medical instruments such as laparoscopes and thoracoscopes and related equipment. Compared with traditional open surgery, it offers smaller wounds, less pain, and faster recovery. With the progress of science and technology, minimally invasive surgical robots have gradually matured and become widely used. Such a robot generally comprises a master console and a slave operation device: the master console sends control commands to the slave device according to the surgeon's operation, and the slave device performs the corresponding surgical actions in response. The surgeon observes the surgical environment field of view in real time through the display interface of the master console, and to help the surgeon obtain the required information, various icons are usually displayed in the interface to represent the position, state, and so on of the corresponding instruments.
However, as the operation dynamically progresses, the region of the display interface that the surgeon focuses on also changes dynamically, so the focal region may become blocked by an icon, which can interfere with the surgeon's work.
Disclosure of Invention
To solve the above technical problems, the present application provides an interface display control method and apparatus, a computer device and system, and a computer-readable storage medium that can dynamically optimize the position of interface icons.
In order to achieve the above purpose, the technical solution of the embodiment of the present application is implemented as follows:
in a first aspect, an embodiment of the present application provides an interface display control method, which is applied to a computer device, and includes:
determining the current display area of the target instrument in a surgical environment field-of-view interface;
determining an idle field-of-view region in the surgical environment field-of-view interface according to the current display area of the target instrument;
and acquiring the current position of the interface icon and, if it is determined from that position that the icon occludes the view, adjusting the icon into the idle field-of-view region.
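The three claimed steps can be sketched as a small control routine. This is an illustrative reading, not the patent's implementation: the `Rect` type, the corner-selection strategy, and all helper names are assumptions introduced here.

```python
# Hypothetical sketch of steps S101-S105; the Rect type and the
# "move to the farthest interface corner" strategy are illustrative
# assumptions, not taken from the patent text.
from dataclasses import dataclass

@dataclass
class Rect:
    x: float  # left edge
    y: float  # top edge
    w: float  # width
    h: float  # height

    def intersects(self, other: "Rect") -> bool:
        # Axis-aligned rectangle overlap test.
        return not (self.x + self.w <= other.x or other.x + other.w <= self.x or
                    self.y + self.h <= other.y or other.y + other.h <= self.y)

def control_icon_position(instrument_area: Rect, interface: Rect,
                          icon: Rect) -> Rect:
    """If the icon overlaps the working (instrument) area, move it into
    an idle corner of the field-of-view interface; otherwise leave it."""
    if not icon.intersects(instrument_area):
        return icon  # no view occlusion: keep the icon where it is
    # Pick the interface corner farthest from the instrument area's centre
    # (one simple idle-region choice; the patent leaves the strategy open).
    cx = instrument_area.x + instrument_area.w / 2
    cy = instrument_area.y + instrument_area.h / 2
    new_x = interface.x if cx > interface.x + interface.w / 2 \
        else interface.x + interface.w - icon.w
    new_y = interface.y if cy > interface.y + interface.h / 2 \
        else interface.y + interface.h - icon.h
    return Rect(new_x, new_y, icon.w, icon.h)
```

An icon that overlaps the instrument's display area is relocated to a corner outside it; an icon already clear of the working area is returned unchanged.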
In a second aspect, an embodiment of the present application provides an interface display control apparatus, including:
the determining module, configured to determine the current display area of the target instrument in the surgical environment field-of-view interface;
the field-of-view dividing module, configured to determine an idle field-of-view region in the surgical environment field-of-view interface according to the current display area of the target instrument;
and the adjusting module, configured to acquire the current position of the interface icon and, if it is determined from that position that the icon occludes the view, adjust the icon into the idle field-of-view region.
In a third aspect, an embodiment of the present application provides a computer device, including a processor, a memory connected to the processor, and a computer program stored on the memory and executable by the processor, where the computer program, when executed by the processor, implements the interface display control method according to any embodiment of the present application.
In a fourth aspect, embodiments of the present application provide a teleoperational medical system, including a computer device according to any of the embodiments of the present application and a slave operation device connected to the computer device, where the slave operation device includes a plurality of instruments of different types and a driving component for driving the instruments to perform a specified action.
In a fifth aspect, an embodiment of the present application further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the interface display control method according to any embodiment of the present application.
In the interface display control method and apparatus provided by the above embodiments, the current display area of the target instrument in the surgical environment field-of-view interface is determined; an idle field-of-view region is determined according to that display area; the current position of the interface icon is acquired; and if the icon is determined to occlude the view at that position, it is moved into the idle region. By distinguishing an idle region around the surgeon's current focus in the interface and relocating any occluding icon into it, the icon's position is optimized in real time: the interface can present a rich set of icons representing the positions and execution states of many instrument types, while ensuring that the icons do not block the surgeon's view during the operation, so that the surgeon can operate better.
The computer device, system, and computer-readable storage medium embodiments above share the same inventive concept as the corresponding method embodiments, and therefore achieve the same technical effects, which are not repeated here.
Drawings
FIG. 1 is an architecture diagram of an optional application scenario of an interface display control method in an embodiment;
FIG. 2 is a flow diagram of a method for controlling interface display in one embodiment;
FIG. 3 is a diagram of an interface icon in accordance with an embodiment;
FIG. 4 is a schematic view of a surgical environment field of view interface in one embodiment;
FIG. 5 is a schematic view of a surgical environment field of view interface determining a working field of view region and an idle field of view region based on a display area of a target instrument in one embodiment;
FIG. 6 is a flow chart of an alternative exemplary interface display control method;
FIG. 7 is a schematic diagram of an embodiment of an interface display control apparatus;
FIG. 8 is a block diagram of a computer device in an embodiment.
Detailed Description
The technical solution of the present application is further described in detail with reference to the drawings and specific embodiments of the specification.
To make the objectives, technical solutions, and advantages of the present application clearer, the application is described in further detail below with reference to the drawings. The described embodiments should not be considered limiting; all other embodiments obtained by a person of ordinary skill in the art without creative effort fall within the protection scope of the present application.
In the following description, the expression "some embodiments" describes a subset of all possible embodiments. Note that "some embodiments" may refer to the same subset or to different subsets, and the embodiments may be combined with one another where no conflict arises.
References to the terms "first", "second", and "third" are only used to distinguish similar items and do not denote a particular order; where appropriate, the order of such items may be interchanged, so that the embodiments of the application described herein can be practiced in orders other than those illustrated or described.
Referring to fig. 1, in an optional application scenario the interface display control method provided in the embodiments of the present application is applied to a teleoperational medical system comprising a master operation device 10 and a slave operation device 12 connected to it. The slave operation device includes a plurality of instruments 121 of different types and a drive assembly 122 for driving the instruments 121 to perform designated actions. Typically, the instruments 121 include an endoscope and surgical tools used to perform the operation, such as an electrocautery, forceps, a stapler, scissors, and an ultrasonic probe. The endoscope collects the field-of-view image of the surgical environment and sends it to the master operation device 10 for display. The master operation device 10 may serve as the surgeon-side console; for example, it may host a client through which a user remotely observes the current surgical progress and remotely manages the slave operation device. The client may be an application client (such as a mobile-phone app) or a web client, which is not limited here. The slave operation device 12 receives the operation instructions the surgeon issues through the client and executes the corresponding actions. The drive assembly 122 may include an articulating component (e.g., a joint assembly) coupled to an instrument 121, so that the instrument's position and orientation can be moved in one or more mechanical degrees of freedom relative to the instrument axis.
Optionally, an instrument 121 may also provide functional mechanical degrees of freedom with further morphological changes, such as jaws that can open and close.
Referring to fig. 2, an embodiment of the present application provides an interface display control method applied to a computer device, which may comprise one or more physically independent intelligent devices with computing and processing capability. In an optional specific example, the computer device comprises the master operation device shown in fig. 1, and the interface display control method includes the following steps:
s101, determining a current display area of the target instrument in the operation environment visual field interface.
The target instrument comprises one or more predetermined instruments (for example, predetermined instruments 1 and 2), or one or more instruments determined according to a predetermined rule, such as the activated instruments determined by whether an instrument is in use. An activated instrument is one currently being used in the operation, and may be any one or more of the instruments of the slave operation device. The surgical environment field of view contains the visual scene information with which the surgeon observes the current state of the operation in real time, such as the position of the activated instrument in the body, the action it is currently executing, and the features of the operated body site.
Optionally, taking the master operation device as the computer device executing the interface display control method of this embodiment: the master device acquires the surgical environment field-of-view image through the endoscope, and the field-of-view interface may be the page of the teleoperation client that displays, in real time, the image collected by the endoscope imaging system. Determining the current display area of the activated instrument comprises the master device taking the imaging region of a designated part of the activated instrument, located from the instrument's image in the interface, as the instrument's current display area. It should be noted that the imaging region of the designated part may be the region containing the distal-end execution portion of the activated instrument, or that region expanded outward according to a predetermined strategy.
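The "expanded outward according to a predetermined strategy" step can be sketched as a bounding-box inflation. The fixed 25% margin ratio and the default frame size below are illustrative assumptions; the patent only says the strategy is predetermined.

```python
def expand_display_area(box, margin_ratio=0.25, frame_w=1920, frame_h=1080):
    """Expand the distal-end bounding box (x, y, w, h) outward by a
    fixed margin ratio, clamped to the frame edges. The 25% ratio is an
    illustrative choice, not specified by the patent."""
    x, y, w, h = box
    dx, dy = w * margin_ratio, h * margin_ratio
    nx, ny = max(0.0, x - dx), max(0.0, y - dy)
    nw = min(frame_w, x + w + dx) - nx
    nh = min(frame_h, y + h + dy) - ny
    return (nx, ny, nw, nh)
```

Clamping keeps the expanded region inside the interface even when the instrument tip sits near a frame edge.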
S103, determining an idle field-of-view region in the surgical environment field-of-view interface according to the current display area of the target instrument.
The idle field-of-view region is the part of the displayed surgical environment image that the surgeon is currently paying relatively little attention to. The master operation device displays the image collected by the endoscope system in the client's field-of-view interface, so that the surgeon can clearly and completely observe the indicators of the current surgical environment. As the operation continuously and dynamically changes, the surgeon's attention to different regions of the interface shifts; for example, it may move from the initially attended central part of the interface to a part further to the right, in which case the idle region changes from the periphery around the centre to the parts away from the newly attended region.
In an optional embodiment, determining an idle field-of-view region in the surgical environment field-of-view interface according to the current display area of the target instrument comprises:
determining a working field-of-view region in the interface according to the current display area of the target instrument;
determining an idle field-of-view region in the interface based on the working field-of-view region.
The working field-of-view region is determined by predicting the site the surgeon performing the operation is focusing on. As noted above, the surgeon's attention to different regions of the interface changes as the operation progresses; for example, it may shift from the central part of the interface to a part further to the right, moving the working region accordingly within the interface displayed by the master device's client. The idle field-of-view region comprises the part of the interface other than the working region.
From the current display area of the target instrument, the computer device predicts the site the surgeon is focusing on, and thereby determines the working and idle field-of-view regions in the surgical environment field-of-view interface.
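One way to derive idle regions from the working region is to partition the interface into cells and keep those that do not touch the working rectangle. The 3x3 grid below is an illustrative partition of my own; the patent does not prescribe one.

```python
def idle_cells(frame_w, frame_h, working, rows=3, cols=3):
    """Split the interface into a rows x cols grid and return the cells
    that do not overlap the working field-of-view rectangle (x, y, w, h).
    The 3x3 grid is an assumed strategy, not taken from the patent."""
    wx, wy, ww, wh = working
    cw, ch = frame_w / cols, frame_h / rows
    cells = []
    for r in range(rows):
        for c in range(cols):
            cx, cy = c * cw, r * ch
            # Axis-aligned overlap test between this cell and the working area.
            overlap = not (cx + cw <= wx or wx + ww <= cx or
                           cy + ch <= wy or wy + wh <= cy)
            if not overlap:
                cells.append((cx, cy, cw, ch))
    return cells
```

Any returned cell is a candidate destination for a relocated icon, since by construction it lies entirely outside the working region.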
S105, acquiring the current position of the interface icon and, if it is determined from that position that the icon occludes the view, adjusting the icon into the idle field-of-view region.
Interface icons are the virtual icons in the surgical environment field-of-view interface used for human-machine interaction, logical operations, interface aesthetics, and indicating slave-device or instrument status, for example the endoscope icon, which represents state information such as the endoscope's rotation angle and mirror angle. Referring to fig. 3, which shows an optional endoscope icon: an upward field of view in the icon indicates an upward mirror angle, as in figs. 3.a and 3.b; a downward field of view indicates a downward mirror angle, as in figs. 3.c and 3.d. Figs. 3.a and 3.c show 0 degrees of rotation; figs. 3.b and 3.d show several degrees of rotation (rotation does not exceed ±90 degrees). Optionally, the mirror angle is obtained from parameters set by the user, and the rotation angle is calculated from the position of the rotation motor that drives the endoscope. Through the state of the endoscope icon in the interface, the surgeon can track the endoscope's state in the surgical environment in real time. By acquiring the interface icon's current position, the computer device judges whether the icon blocks the surgeon's view; if so, it adjusts the icon into an idle field-of-view region of the interface.
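The endoscope icon's state, as described above, pairs a clamped rotation angle with an up/down mirror direction. A minimal sketch follows; the encoder scale `counts_per_degree` is a hypothetical parameter, since the patent gives no conversion between motor position and angle.

```python
def endoscope_icon_state(motor_counts, counts_per_degree=100.0,
                         scope_angle_up=True):
    """Map the rotation motor position to the icon's displayed rotation,
    clamped to +/-90 degrees per the figure description, and pair it with
    the user-configured mirror direction. counts_per_degree is an assumed
    encoder scale, not a value from the patent."""
    angle = motor_counts / counts_per_degree
    angle = max(-90.0, min(90.0, angle))  # rotation does not exceed ±90°
    return {"rotation_deg": angle, "view": "up" if scope_angle_up else "down"}
```

The returned dictionary is enough to select among the four icon variants of fig. 3 (up/down view, zero/non-zero rotation).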
In this embodiment, the idle field-of-view region is distinguished within the interface according to the position the surgeon is currently focusing on, and when an interface icon occludes the view it is adjusted into the idle region. The icon's position in the interface is thus optimized in real time: the interface can present a rich set of icons representing the positions and execution states of many instrument types, while ensuring that the icons do not block the surgeon's view during the operation, so that the surgeon can operate better.
In some embodiments, determining an idle field-of-view region in the surgical environment field-of-view interface from the current display area of the target instrument comprises:
acquiring a surgical environment field-of-view image, performing target detection on it, and determining the position and size of the target instrument it contains;
determining the current display area of the target instrument from that position and size, and determining the working and idle field-of-view regions in the interface based on the current display area.
The computer device determines the position and size of the target instrument in the field-of-view image by performing target detection on it; any known target-detection algorithm that detects whether an image contains a specified object may be used. Optionally, with the master operation device as the computer device executing the method, acquiring the field-of-view image may comprise the endoscope collecting field-of-view video data and sending it to the master device, which extracts one or more image frames from the video.
The target instrument may comprise an activated instrument, and/or a designated inactive instrument.
In the above embodiment, the computer device acquires the surgical environment field-of-view image, determines the position and size of the target instrument in it by target detection, and thus determines the instrument's current display area; from that display area it predicts the site the surgeon is focusing on, and determines the working and idle field-of-view regions in the interface.
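Turning the detector's output into a single display area can be sketched as a box union over the detections that match the target instrument. The `(label, x, y, w, h)` tuple format is an assumed detector output, not one specified by the patent.

```python
def display_area_from_detections(detections, target_labels):
    """Combine the detected boxes of the target instrument(s) into one
    current display area. 'detections' is a list of (label, x, y, w, h)
    tuples, an assumed output format for the detector described above."""
    boxes = [(x, y, w, h) for label, x, y, w, h in detections
             if label in target_labels]
    if not boxes:
        return None  # target instrument not currently in view
    x0 = min(x for x, _, _, _ in boxes)
    y0 = min(y for _, y, _, _ in boxes)
    x1 = max(x + w for x, _, w, _ in boxes)
    y1 = max(y + h for _, y, _, h in boxes)
    return (x0, y0, x1 - x0, y1 - y0)
```

When several target instruments are active at once, the union box covers all of them, so the working region encloses every attended instrument tip.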
Optionally, acquiring the surgical environment field-of-view image, performing target detection on it, and determining the position and size of the target instrument it contains includes:
acquiring the surgical environment field-of-view image and performing target detection on it through a neural network model, to determine the position and size of the target instrument contained in the image.
Target detection on the surgical environment field-of-view image is performed with an image-recognition neural network model, which may be obtained by training a known architecture such as Faster R-CNN, or a fast recognition algorithm such as YOLO (You Only Look Once). Such a model detects the position and size of the target instrument in the image end to end, achieving fast detection and recognition with high accuracy.
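Both Faster R-CNN and YOLO-style detectors share a post-processing step, non-maximum suppression, which deduplicates overlapping boxes for the same instrument. A minimal greedy implementation, added here as general background rather than anything claimed by the patent:

```python
def nms(boxes, scores, iou_threshold=0.5):
    """Greedy non-maximum suppression: keep the highest-scoring box and
    drop overlapping lower-scoring duplicates. Boxes are (x1, y1, x2, y2);
    returns the indices of the kept boxes."""
    def iou(a, b):
        # Intersection-over-union of two axis-aligned boxes.
        ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
        ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
        inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
        area_a = (a[2] - a[0]) * (a[3] - a[1])
        area_b = (b[2] - b[0]) * (b[3] - b[1])
        return inter / (area_a + area_b - inter) if inter else 0.0

    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    for i in order:
        if all(iou(boxes[i], boxes[j]) < iou_threshold for j in keep):
            keep.append(i)
    return keep
```

Without this step a single instrument tip would typically yield several near-identical boxes and an inflated working region.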
In some embodiments, before acquiring the surgical environment field-of-view image through the neural network model, performing target detection on the field-of-view image, and determining the position and size of the target instrument contained in the field-of-view image, the method includes:
constructing an initial neural network model;
training the initial neural network model based on a training sample set of the operation environment field-of-view image containing the target object label to obtain a trained neural network model; the target object annotation includes a position and size annotation of a target instrument contained in the field of view image.
The neural network model comprises a model capable of deep learning, which extracts key features from the image to characterize whether the target object is present. Deep Learning (DL) is a research direction in the field of Machine Learning (ML); it was introduced to bring machine learning closer to its original goal, Artificial Intelligence (AI). Deep learning learns the intrinsic laws and representation levels of sample data, and the information obtained in the learning process is very helpful for interpreting data such as text, images and sound. Its ultimate aim is to give machines the same analytical and learning capability as humans, able to recognize data such as text, images and sound. By enabling machines to simulate human brain activities such as audio-visual perception and thinking, deep learning has achieved many results in search technology, data mining, machine learning, machine translation, natural language processing, speech recognition, recommendation and personalization technologies and other related fields, has solved many complex pattern recognition problems, and has made great progress in artificial-intelligence-related technologies. The initial neural network model can be a known convolutional neural network, comprising a convolutional layer for extracting image features, a pooling layer for dimension reduction and removal of redundant information from the extracted features, and an output layer for classifying, identifying and outputting the target object based on the features output by the pooling layer.
The training sample set may include positive sample images and negative sample images. In this embodiment, a positive sample image is a surgical environment field-of-view image containing a target object label, and a negative sample image is a surgical environment field-of-view image that does not contain the target object, a surgical environment field-of-view image with an erroneous target object label, or another image.
Based on the training sample set, the neural network model can be trained in the following way. First, the sample images are category-labeled with label information that uniquely represents the category identity, position and size of the instrument: for example, an image containing the designated instrument 1 receives category label 1 together with annotations of the instrument's position and size in the image; an image containing the designated instrument 2 receives category label 2 together with the corresponding position and size annotations; and an image containing no designated instrument receives category label 0. This yields sample images containing target object labels. Each labeled sample image is then input into the neural network model, which predicts the category of the target object it carries; the prediction is compared with the standard target category, the loss function value of the model is determined from the difference between them, the loss is back-propagated into each layer of the model, and the parameters of each layer are updated by Stochastic Gradient Descent (SGD) until the loss function converges, completing the training of the neural network model. Optionally, the initial neural network model may further include a regression layer, and a back-propagation neural network may be adopted: through training on sample data, the network weights and thresholds are continuously corrected so that the error function decreases along the negative gradient direction, approaching the expected output.
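As a minimal illustration of the SGD update described above — forward pass, loss gradient, parameter update — the following toy uses a single linear layer with a sigmoid in place of the convolutional model. It is a stand-in for the training loop only, not the patented network; all names and hyperparameters are assumptions for the example.

```python
import math
import random

def train_sgd(samples, lr=0.5, epochs=300, seed=0):
    """samples: list of (feature_vector, label) with label in {0, 1}.
    Stand-in for the CNN: one linear layer + sigmoid, trained with
    stochastic gradient descent on the log loss."""
    random.seed(seed)
    dim = len(samples[0][0])
    w = [0.0] * dim
    b = 0.0
    for _ in range(epochs):
        random.shuffle(samples)          # "stochastic" part of SGD
        for x, y in samples:
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1 / (1 + math.exp(-z))   # predicted probability
            grad = p - y                 # dLoss/dz for the log loss
            # Update each parameter along the negative gradient.
            w = [wi - lr * grad * xi for wi, xi in zip(w, x)]
            b -= lr * grad
    return w, b

def predict(w, b, x):
    """Classify by the sign of the linear score."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if z > 0 else 0
```

On a linearly separable toy set the loop converges to a separating boundary within a few epochs; a real detector would replace the linear layer with the convolutional architecture and the scalar label with class-plus-box targets.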
The trained neural network model performs feature extraction on the operation environment view field image acquired by the main operation equipment, forms feature vectors based on the extracted image features to perform classification prediction so as to determine corresponding classification labels, and accordingly outputs the type, position and size information of the target instrument contained in the operation environment view field image. In an alternative specific example, the target instrument position and size information includes coordinate and size information of a target frame including the activated instrument, and the activated instrument is used as the target instrument, and the corresponding display area is determined based on the position and size of the activated instrument, so that the working visual field area and the idle visual field area are distinguished, and the interface icon is ensured not to block the activated instrument. In another optional specific example, the target instrument position and size information may also include coordinate and size information of a target frame including the designated inactive instrument, and by using the designated inactive instrument as the target instrument, the corresponding display area is determined based on the position and size of the inactive instrument, so as to distinguish the working view area from the idle view area, thereby ensuring that the interface icon does not block the designated inactive instrument. The target instrument may also include both an active instrument and a designated inactive instrument, thereby ensuring that the interface icon does not obscure any instruments.
In this embodiment, after the neural network model is trained, the training sample set can be constructed and continuously enriched with surgical environment field-of-view images collected in practical use. Through self-learning and iteration, the neural network model can be upgraded and updated as sample data grows, so that its recognition of surgical environment field-of-view images becomes increasingly accurate.
The manner of determining the current display area of the target instrument in the surgical environment field-of-view interface is not limited to image recognition. For example, in another optional embodiment, the determining the idle field-of-view area in the surgical environment field-of-view interface according to the current display area of the target instrument includes:
determining the relative distance between a target part on the target instrument and the mirror surface of the endoscope;
determining the position and the size of the target instrument contained in the view field image according to the relative distance and the conversion relation between the image coordinate system corresponding to the surgical environment view field of the endoscope and a world coordinate system;
determining a current display area of the target instrument according to the position and the size of the target instrument, and determining a working visual field area and an idle visual field area in the operation environment visual field interface based on the current display area.
The target site of the target instrument may be any reference site preset to characterize the location of the target instrument, such as the tip of the target instrument. Different target instruments of different types may be preset with different sites as target sites corresponding thereto. The world coordinate system comprises a reference coordinate system which is established based on the operation environment scene and is convenient for measuring the relative position relation between the entities, for example, the world coordinate system can be established based on the plane of the endoscope mirror surface, and the relative distance between the target part on the target instrument and the endoscope mirror surface under the world coordinate system is determined. The image coordinate system comprises a coordinate system established based on the relation between the entity and the imaging point when the image acquisition equipment acquires the view field image of the operation environment, such as a three-dimensional coordinate system established by taking the focus center of the image acquisition equipment as an origin and taking the optical axis of the image acquisition equipment as a longitudinal axis. The conversion relationship between the image coordinate system and the world coordinate system can be generally expressed by a rotation matrix and a translation matrix. 
The main operation equipment acquires the relative distance between the target site on the target instrument and the mirror surface of the endoscope and, using the projection imaging relationship at the time the surgical environment field-of-view image was acquired, determines the position and size of the target instrument contained in the field-of-view image from that relative distance and the conversion relationship between the image coordinate system corresponding to the endoscope's surgical environment field of view and the world coordinate system. It then determines the current display area of the target instrument in the surgical environment field-of-view interface from the instrument's position and size in the image.
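The coordinate conversion described above — rotation matrix plus translation, followed by projection — can be sketched with a standard pinhole-camera model. The function name and the intrinsic parameters (`fx`, `fy`, `cx`, `cy`) are illustrative assumptions; a real endoscope would use calibrated values.

```python
def project_to_pixel(point_world, R, t, fx, fy, cx, cy):
    """Map a 3-D point in the world frame into pixel coordinates.
    R (3x3 rotation, nested lists) and t (3-vector) encode the
    world-to-camera transform; fx, fy, cx, cy are the pinhole
    intrinsics of the image sensor."""
    # World frame -> camera frame: Xc = R @ Xw + t
    xc = [sum(R[i][j] * point_world[j] for j in range(3)) + t[i]
          for i in range(3)]
    # Perspective division, then scaling/offset by the intrinsics.
    u = fx * xc[0] / xc[2] + cx
    v = fy * xc[1] / xc[2] + cy
    return u, v
```

With this, the instrument tip's world position (derived from its distance to the mirror surface) maps to an image location, and its physical size maps to an on-screen extent, giving the current display area.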
In the above embodiments, by obtaining the distance between the target site of the target instrument and the mirror surface of the endoscope and determining the current display area of the target instrument in the surgical environment field-of-view interface using the optical projection imaging relationship, further alternative ways of determining the working field-of-view area and the idle field-of-view area are provided. During execution of the interface display control method, the current display area of the target instrument can be determined by simultaneously performing target detection on the surgical environment field-of-view image and calculating the instrument's position and size from the optical projection imaging relationship; the results of the two approaches can be combined to correct each other, so that the working field-of-view area and the idle field-of-view area in the surgical environment field-of-view interface are determined more accurately.
Wherein the relative distance of the target site on the target instrument from the endoscope mirror may be determined by different technical means. In some embodiments, said determining a relative distance of a target site on said target instrument from a mirror of an endoscope comprises:
acquiring structural parameters of the endoscope;
determining the type of the target instrument, and determining the instrument structure parameters of the target instrument according to the type of the target instrument;
acquiring a driving parameter corresponding to the target instrument;
and determining the relative distance between the end part of the target instrument and the endoscope mirror surface according to the endoscope structural parameters, the instrument structural parameters and the driving parameters.
The endoscope structure parameters can include the shape, size and the like of the structure of each part of the endoscope, so as to be used for determining the related parameters of the mirror surface position of the endoscope. The instrument configuration parameters of the target instrument may include relevant parameters such as the shape, size, etc. of the target site of the instrument, such as the shape and size of the tip of the activation instrument. The driving parameters corresponding to the target instrument can be obtained from a driving component for driving the target instrument to execute corresponding actions, such as the number of rotation cycles of a motor shaft of a driving motor, and the posture of the target instrument can be correspondingly determined through the driving parameters, such as the rotation angle, the extension distance and the like of the activation instrument. In the execution process of the interface display control method, the relative distance between the end part of the target instrument and the mirror surface of the endoscope can be calculated according to the structural parameters of the endoscope, the structural parameters of the instrument of the target instrument and the corresponding driving parameters.
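An illustrative calculation of the tip-to-mirror distance from the three parameter groups listed above. The linear lead-screw model and all parameter names are assumptions for the sketch; real instrument kinematics depend on the specific drive train.

```python
def tip_to_mirror_distance(scope_len_mm, instrument_base_offset_mm,
                           screw_lead_mm, motor_revolutions):
    """Estimate the axial distance between the instrument tip and the
    endoscope mirror surface.  The driving parameter (motor-shaft
    revolutions) is converted to linear travel via the lead-screw
    pitch; the structural parameters supply the fixed offsets."""
    extension = screw_lead_mm * motor_revolutions      # driven travel, mm
    tip_position = instrument_base_offset_mm + extension
    return tip_position - scope_len_mm
```

For example, with a 50 mm scope length, a 40 mm base offset, a 2 mm lead screw and 10 motor revolutions, the tip sits 10 mm beyond the mirror surface.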
In the above embodiment, the real-time posture of the target instrument is determined from the driving parameters of its current action combined with the instrument's structural parameters and the endoscope's structural parameters. From this, the distance between the target site of the target instrument and the mirror surface of the endoscope can be calculated, and the current display area of the target instrument in the surgical environment field-of-view interface can be determined using the optical projection imaging relationship, providing further optional ways of determining the working field-of-view area and the idle field-of-view area in the surgical environment field-of-view interface.
In some embodiments, the obtaining the current position of the interface icon, and if it is determined that the interface icon has view shielding according to the position, adjusting the interface icon to the idle view area includes:
acquiring the current position of an interface icon;
determining whether the interface icon is overlapped with the operation visual field area or not according to the position of the interface icon, and/or determining whether the idle visual field ratio of the area where the interface icon is located is smaller than a set threshold value or not;
and if so, adjusting the interface icon into the idle view field.
The condition for determining whether the interface icon blocks the view may be that the interface icon at least partially overlaps the working view area, or that the idle view ratio of the region where the interface icon is located is smaller than a set threshold. During execution of the interface display control method, the two conditions can be judged separately, and the interface icon is considered to block the view when either of them is met. The idle view ratio may be the area ratio of the idle view area within the region to the whole region, in which case the corresponding set threshold is a set area-ratio threshold; or it may be the minimum distance between the edge of a non-idle area within the region and the region boundary, in which case the corresponding set threshold is a set distance threshold.
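The two judgment conditions can be sketched directly: an axis-aligned overlap test against the working area, OR-ed with an idle-ratio threshold check. The rectangle convention and threshold value are assumptions for the example.

```python
def rects_overlap(a, b):
    """Axis-aligned overlap test; rectangles are (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def icon_occluded(icon, working_area, idle_ratio, ratio_threshold=0.5):
    """The icon is considered to block the view if it overlaps the
    working field-of-view area, or if the idle-view ratio of the
    region it sits in falls below the set threshold."""
    return rects_overlap(icon, working_area) or idle_ratio < ratio_threshold
```

When `icon_occluded` returns true, the icon is moved into the idle field-of-view area as described in the following steps.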
In the embodiment, the judgment condition for judging whether the interface icon has the view shielding is optimized, so that the position control of the interface icon in the visual interface of the operation environment is optimized, and the integrity of the view of a doctor is ensured.
In some embodiments, said adjusting said interface icon into said free field of view area comprises:
forming a plurality of idle visual field blocks on the idle visual field area according to the relative position of the idle visual field area relative to the operation visual field area in the operation environment visual field interface;
calculating the idle visual field ratio of a plurality of idle visual field blocks;
and adjusting the interface icon into a target idle-view area block with an idle-view ratio meeting preset requirements.
Referring to fig. 4, which shows an optional division result of the free view area, the free view area is divided into eight blocks according to its relative position with respect to the working view area: upper left, upper, upper right, left, right, lower left, lower, and lower right. It should be noted that, as the operation changes dynamically and the position of the working view area in the surgical environment view interface changes, the number of idle view blocks may decrease and the size of each block may increase or decrease. Depending on the design of the surgical environment view interface, virtual keys for the doctor to operate may be arranged in different idle view blocks. When adjusting the position of the interface icon, the main operation device may calculate the idle view ratio of each idle view block under the current real-time conditions and adjust the interface icon into a target idle view block whose idle view ratio meets the preset requirement, for example the block with the largest idle view ratio, so that the frequency of adjusting the position of the interface icon can be reduced.
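The eight-block division around the working area can be sketched as a 3x3 grid with the centre cell removed; blocks that collapse to zero size (working area touching a frame edge) are dropped, matching the note that the number of blocks may decrease. Names and the area-based ratio are illustrative choices.

```python
def eight_blocks(frame_w, frame_h, working):
    """Split the region around the working area (x, y, w, h) into up to
    eight idle blocks: upper-left, upper, upper-right, left, right,
    lower-left, lower, lower-right."""
    wx, wy, ww, wh = working
    cols = [(0, wx), (wx, ww), (wx + ww, frame_w - wx - ww)]
    rows = [(0, wy), (wy, wh), (wy + wh, frame_h - wy - wh)]
    names = [["upper-left", "upper", "upper-right"],
             ["left", None, "right"],          # centre = working area
             ["lower-left", "lower", "lower-right"]]
    blocks = {}
    for i, (ry, rh) in enumerate(rows):
        for j, (cx, cw) in enumerate(cols):
            name = names[i][j]
            if name and cw > 0 and rh > 0:     # drop degenerate blocks
                blocks[name] = (cx, ry, cw, rh)
    return blocks

def largest_block(blocks):
    """Target block: the one with the largest area (idle-view ratio)."""
    return max(blocks, key=lambda k: blocks[k][2] * blocks[k][3])
```

For a 1000x800 frame with the working area at (400, 300, 200, 200), all eight blocks exist and the corner blocks share the largest area.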
In the above embodiment, the free view area is divided into a plurality of free view blocks, and when the position of the interface icon is adjusted, the free view block with the largest free view ratio is selected to set the interface icon, so as to optimize the overall layout of the surgical environment view interface, reduce the frequency of adjusting the position of the interface icon, and avoid the phenomenon that the position of the interface icon frequently jumps due to frequent change of the free view block.
In some embodiments, the determining that the target instrument is in front of the current display area in the surgical environment field-of-view interface comprises:
judging whether the icon display mode is in a dynamic mode or a fixed mode;
if in the dynamic mode, executing the step of determining the current display area of the target instrument in the surgical environment field-of-view interface;
and if in the fixed mode, waiting for an interface icon adjustment instruction and adjusting the position of the interface icon accordingly based on that instruction.
When the dynamic mode is selected, the interface icon is adjusted automatically by determining the current display area of the target instrument in the surgical environment view interface and, from it, the working view area and the idle view area. When the fixed mode is selected, the interface icon is adjusted according to the doctor's specific adjustment operation on its position, for example the doctor manually moving the interface icon to a certain position in the surgical environment view interface. The interface icon adjustment instruction can be the instruction formed when the doctor drags the interface icon to a position in the surgical environment view interface, the drag constituting the corresponding adjustment operation.
In the embodiment, a doctor can select the dynamic mode and the fixed mode according to the requirements of different application scenes, so that the requirements of more different application scenes can be met.
In some embodiments, the free field of view region includes a plurality of free field of view blocks located at different positions within the surgical environment field of view interface, and the adjusting the position of the interface icon based on the interface icon adjustment instruction includes:
and acquiring a selection instruction of an idle view area block of a to-be-selected direction in the operation environment view field interface, and adjusting the interface icon into the idle view area block of the corresponding direction according to the selection instruction.
In the fixed mode, adjustment selection keys corresponding to the idle view blocks are arranged according to the positions and number of the blocks, and the interface icon can be selectively adjusted into a given idle view block by clicking the designated adjustment selection key. The free view area comprises a plurality of free view blocks at different orientations in the surgical environment view interface. For example, according to the relative positions of the free view areas with respect to the working view area, the surgical environment view interface can be divided into eight free view blocks: upper left, upper, upper right, left, right, lower left, lower, and lower right; the interface is correspondingly provided with upper-left, upper, upper-right, left, right, lower-left, lower, and lower-right adjustment selection keys, and by clicking one of these keys the user can adjust the interface icon into the free view block in the corresponding orientation according to the selection instruction.
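Once a selection key has named a block, placing the icon is a simple geometric step. Centring the icon within the chosen block is an assumption for this sketch; the disclosure only requires moving the icon into the block.

```python
def place_icon_in_block(icon_size, block):
    """Fixed mode: after the user clicks an adjustment selection key,
    centre the icon inside the selected idle block.
    icon_size = (w, h); block = (x, y, w, h).
    Returns the icon's new top-left corner."""
    iw, ih = icon_size
    bx, by, bw, bh = block
    return (bx + (bw - iw) / 2, by + (bh - ih) / 2)
```

For a 40x40 icon and the "right" block (600, 300, 400, 200), the icon lands at (780, 380), centred in that block.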
In the above embodiment, a fixed mode for adjusting the position of the interface icon is provided. According to the relative positions of the free view blocks with respect to the working view area, a choice among free view blocks in multiple orientations is offered, so that the user can customize the display position of the interface icon and avoid blocking the view.
In some embodiments, after the adjusting the interface icon into the free field of view region, the method further includes:
and updating and storing the current position of the interface icon.
When the display position of the interface icon in the surgical environment view interface changes, the current position of the interface icon is updated and stored, ensuring that the recorded current position stays up to date. Changes of the icon's display position include automatic position adjustments triggered by view blocking in the dynamic mode and position adjustments based on the user's adjustment command in the fixed mode.
In the above embodiment, the position change of the interface icon is updated and recorded in real time, so that the latest position of the interface icon can be obtained in real time in the process of adjusting the position of the interface icon in the surgical environment visual field interface.
Optionally, the determining that the target instrument is in front of the current display area in the surgical environment field-of-view interface includes:
acquiring an operation environment field image acquired by an endoscope, and displaying the operation environment field image on an operation environment field interface;
wherein the interface icon comprises an endoscope icon.
To give a fuller understanding of the interface display control method provided by the embodiment of the present application, please refer to fig. 5 and fig. 6, which take the interface icon being an endoscope icon as an optional specific example. The endoscope icon is an icon representing the current posture of the endoscope, displayed in the surgical environment field-of-view interface while the doctor controls the endoscope; it can indicate the current rotation angle and mirror angle of the endoscope in real time. The interface display control method includes the following steps:
S10, start controlling the endoscope;
S11, judge whether the position adjustment mode of the interface icon is the fixed mode or the dynamic mode; if the dynamic mode, execute S121; if the fixed mode, execute S122;
S121, determine the working view area and the idle view area in the surgical environment view interface; taking the target instrument being an activated instrument as an example, in the dynamic mode, determine the display area of the activated instrument in the surgical environment view interface by determining its position and size, and determine the working view area and the idle view area based on that display area;
S13, acquire the position of the interface icon;
S14, judge whether the interface icon blocks the view; if yes, execute S141; if not, execute S142 and keep the position of the interface icon unchanged;
S141, determine the idle view block with the largest idle view ratio in the idle view area, and adjust the interface icon into that block; after S141, execute S15;
S122, judge whether an interface icon adjustment instruction for adjusting the position of the interface icon has been received; if yes, execute S123; if not, execute S142 and keep the position of the interface icon unchanged;
S123, adjust the interface icon into the corresponding idle view block according to the adjustment direction indicated by the interface adjustment instruction; after S123, execute S15;
S15, update and store the position of the interface icon;
S16, judge whether to stop controlling the endoscope; if yes, execute S161; if not, return to S11;
S161, remove the endoscope icon.
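One pass of the S11-S15 branch logic can be sketched as a single function. This is a schematic reduction of the flowchart, not the patented controller; how occlusion is detected and how blocks are represented are assumptions carried over from the earlier sketches.

```python
def control_loop_step(mode, icon_pos, occluded, blocks, adjust_cmd=None):
    """One pass of the S11-S15 flow: returns the (possibly new) icon
    position.  blocks maps a direction name to an (x, y, w, h) block.
    In dynamic mode the icon jumps to the largest idle block when it
    blocks the view; in fixed mode it moves only on a user adjustment
    command naming a block direction."""
    if mode == "dynamic":
        if occluded:                                   # S14 -> S141
            best = max(blocks, key=lambda k: blocks[k][2] * blocks[k][3])
            x, y, w, h = blocks[best]
            return (x, y)          # new position: target block's corner
        return icon_pos                                # S142: unchanged
    # Fixed mode: S122 -> S123
    if adjust_cmd is not None and adjust_cmd in blocks:
        x, y, w, h = blocks[adjust_cmd]
        return (x, y)
    return icon_pos                                    # S142: unchanged
```

After each pass, the returned position would be stored (S15) and the loop repeated until endoscope control stops (S16).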
In the above embodiment, the interface display control method at least has the following characteristics:
First, both a fixed mode and a dynamic mode for adjusting the UI display position are supported, ensuring that the UI does not block the view the operator is focused on while the surgical field or the endoscope view moves. In the fixed mode, multiple choices corresponding to different idle view blocks are provided, increasing the choice of UI display orientations and allowing the user to customize the UI display position to avoid blocking the view;
Second, the dynamic mode can intelligently identify the user's operating area and automatically move the UI to a position where the view is not blocked, so that the operator can work in the area where the UI is currently displayed without moving the view (because the UI automatically moves to the idle view area), reducing the doctor's operations when the operating area is switched frequently. Furthermore, the UI movement rule in the dynamic mode requires the idle view ratio to exceed a set threshold, which avoids frequent jumps of the UI position caused by frequent changes of the largest idle block;
Third, by optimizing the position of the interface icon in the surgical environment view interface in real time, the interface can present a richer set of icons representing the positions and execution states of more types of instruments, while ensuring that the interface icons do not block the doctor's view during the operation, so that the doctor can operate better.
In some embodiments, the step S105 of adjusting the interface icon into the idle view area includes:
and acquiring the sizes of the interface icon and the idle view area.
And judging whether the size of the interface icon is smaller than that of the idle view field.
And if so, adjusting the interface icon into the idle view field.
If not, adjusting the size of the interface icon to be smaller than the size of the idle visual field area, and then adjusting the interface icon to be in the idle visual field area.
Wherein resizing the interface icon to be smaller than the size of the free field-of-view region comprises: resizing the interface icon to a proportional value of the size of the free field-of-view region, for example 0.5, 0.6, 0.7, 0.8, or 0.9 times the size of the free field-of-view region, or any other multiple less than 1.
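The fit-or-shrink logic of the steps above can be sketched as follows; the 0.8 factor is one of the example multiples from the text, and the size-tuple convention is an assumption.

```python
def fit_icon(icon_size, idle_size, scale=0.8):
    """If the icon already fits inside the idle area, keep its size;
    otherwise shrink it to a fixed fraction (< 1) of the idle area
    before moving it there.  Sizes are (w, h) tuples."""
    iw, ih = icon_size
    aw, ah = idle_size
    if iw < aw and ih < ah:
        return icon_size
    return (aw * scale, ah * scale)
```

A 50x50 icon fits a 200x100 idle area unchanged; a 300x50 icon is too wide and is reduced to 0.8 of the idle area.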
In some embodiments, the interface display control method may further include the steps of:
a current display area of the target instrument in the surgical environment field of view interface is determined.
And determining a working visual field area and an idle visual field area in the surgical environment visual field interface according to the current display area of the target instrument.
The current position of the interface icon is obtained; if that position falls within the working view area, the sizes of the interface icon and the idle view area are obtained, and it is judged whether the size of the interface icon is smaller than the size of the idle view area.
And if so, adjusting the interface icon into the idle view field.
If not, the size of the interface icon is adjusted in situ to reduce the shielding of the operation view field.
Wherein adjusting the size of the interface icon in situ comprises: adjusting the size of the interface icon to a proportional value of its original size, for example 0.5, 0.6, 0.7, 0.8, or 0.9 times the original interface icon size, or any other multiple less than 1. Adjusting the size of the interface icon in situ further comprises keeping the center position of the interface icon unchanged.
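Keeping the center fixed while shrinking is a scale-about-center operation, sketched below with an assumed `(x, y, w, h)` rectangle convention and an example factor of 0.5.

```python
def shrink_in_place(icon, factor=0.5):
    """Scale the icon rectangle (x, y, w, h) about its own centre so
    that the centre position does not change while the icon shrinks."""
    x, y, w, h = icon
    cx, cy = x + w / 2, y + h / 2        # current centre, preserved
    nw, nh = w * factor, h * factor
    return (cx - nw / 2, cy - nh / 2, nw, nh)
```

Shrinking a (100, 100, 200, 100) icon by half yields (150, 125, 100, 50): the icon occupies a quarter of its former area, but its centre stays at (200, 150).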
Referring to fig. 7, in another aspect of the present application, an interface display control apparatus is provided, including: a determination module 211, configured to determine a current display area of the target instrument in the surgical environment field-of-view interface; a field of view partitioning module 212, configured to determine an idle field of view region in the surgical environment field of view interface according to a current display region of the target instrument; the adjusting module 213 is configured to obtain a current position of an interface icon, and adjust the interface icon to the idle view area if it is determined that the interface icon has view shielding according to the position.
Optionally, the visual field dividing module 212 is specifically configured to determine a working visual field region in the surgical environment visual field interface according to the current display region of the target instrument; determining an idle field of view region in the surgical environment field of view interface based on the working field of view region.
Optionally, the view dividing module 212 is specifically configured to acquire a surgical environment view image, perform target detection on the surgical environment view image, and determine a position and a size of the target instrument included in the surgical environment view image; determining a current display area of the target instrument according to the position and the size of the target instrument, and determining a working visual field area and an idle visual field area in the operation environment visual field interface based on the current display area.
Optionally, the visual field dividing module 212 is further configured to acquire a surgical environment field-of-view image, perform target detection on the surgical environment field-of-view image through a neural network model, and determine the position and size of the target instrument contained in the surgical environment field-of-view image.
Optionally, the visual field dividing module 212 is further configured to construct an initial neural network model; training the initial neural network model based on a training sample set of the operation environment field-of-view image containing the target object label to obtain a trained neural network model; the target object annotation includes a position and size annotation of a target instrument contained in the field of view image.
Optionally, the visual field dividing module 212 is further configured to determine a relative distance between a target site on the target instrument and a mirror surface of the endoscope; determining the position and the size of the target instrument contained in the view field image according to the relative distance and the conversion relation between the image coordinate system corresponding to the surgical environment view field of the endoscope and a world coordinate system; determining a current display area of the target instrument according to the position and the size of the target instrument, and determining a working visual field area and an idle visual field area in the operation environment visual field interface based on the current display area.
Optionally, the field of view dividing module 212 is further configured to acquire endoscope structure parameters; determining the type of the target instrument, and determining the instrument structure parameters of the target instrument according to the type of the target instrument; acquiring a driving parameter corresponding to the target instrument; and determining the relative distance between the end part of the target instrument and the endoscope mirror surface according to the endoscope structural parameters, the instrument structural parameters and the driving parameters.
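One way to relate the kinematically derived distance to an on-screen size is a pinhole-camera approximation. The patent only states that a conversion between the endoscope's image coordinate system and a world coordinate system is used, so the pinhole model and the focal-length-in-pixels parameter here are assumptions:

```python
def projected_instrument_size(real_width_mm, distance_mm, focal_px):
    """Pinhole-model sketch: an instrument of physical width
    `real_width_mm`, at `distance_mm` in front of the endoscope lens,
    projects to roughly focal_px * real_width_mm / distance_mm pixels
    on the image plane."""
    if distance_mm <= 0:
        raise ValueError("instrument must be in front of the lens")
    return focal_px * real_width_mm / distance_mm
```

Under this model a 10 mm wide instrument tip 50 mm from a lens with a 1000 px focal length occupies about 200 px, from which a display area can be estimated.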
Optionally, the adjusting module 213 is specifically configured to obtain a current position of the interface icon; determining whether the interface icon is overlapped with the operation visual field area or not according to the position of the interface icon, and/or determining whether the idle visual field ratio of the area where the interface icon is located is smaller than a set threshold value or not; and if so, adjusting the interface icon into the idle view field.
Optionally, the adjusting module 213 is further configured to form a plurality of idle view blocks from the idle view region according to a relative position of the idle view region with respect to the working view region in the surgical environment view interface; calculating the idle visual field ratio of a plurality of idle visual field blocks; and adjusting the interface icon into a target idle-view area block with an idle-view ratio meeting preset requirements.
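The idle-view-ratio selection could look like the following sketch, where each block's ratio is its area over the whole interface area and the "preset requirement" is modeled as a minimum-ratio threshold. Both the threshold value and the largest-block tie-breaking rule are assumptions:

```python
def pick_idle_block(blocks, interface_area, min_ratio=0.05):
    """From (x, y, w, h) idle-view blocks, compute each block's
    idle-view ratio (block area / interface area) and return the
    block with the largest ratio, provided it meets `min_ratio`."""
    best = max(blocks, key=lambda b: b[2] * b[3], default=None)
    if best is None:
        return None
    ratio = best[2] * best[3] / interface_area
    return best if ratio >= min_ratio else None
```

If no block is large enough, `None` is returned; a caller could then fall back to shrinking the icon as described below.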
Optionally, the interface display control device further includes a judging module, configured to judge whether the icon display mode is a dynamic mode or a fixed mode; if in the dynamic mode, the step of determining the current display area of the target instrument in the surgical environment field-of-view interface is executed; and if in the fixed mode, the device waits for an interface icon adjustment instruction and correspondingly adjusts the position of the interface icon based on the interface icon adjustment instruction.
Optionally, the idle view area includes a plurality of idle view blocks located in different directions within the surgical environment view field interface, and the adjusting module 213 is further configured to obtain a selection instruction for an idle view block in a to-be-selected direction in the surgical environment view field interface, and adjust the interface icon into an idle view block in a corresponding direction according to the selection instruction.
Optionally, the view dividing module 212 is further configured to update and store the current position of the interface icon.
Optionally, the system further comprises an acquisition module, configured to acquire an operation environment field image acquired by an endoscope, and display the operation environment field image on the operation environment field interface; wherein the interface icon comprises an endoscope icon.
Optionally, the target instrument comprises an activation instrument and/or a designated deactivation instrument.
Optionally, the adjusting module 213 is further configured to obtain the size of the interface icon and the size of the free view area; judging whether the size of the interface icon is smaller than that of the idle view field or not; if so, adjusting the interface icon into the idle view field; or, if not, adjusting the size of the interface icon to be smaller than the size of the idle visual field area, and adjusting the interface icon to be in the idle visual field area.
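The size check and shrink-to-fit step can be sketched as below; uniform scaling (preserving the icon's aspect ratio) is an assumption the patent does not specify:

```python
def fit_icon_to_idle_area(icon_w, icon_h, idle_w, idle_h):
    """If the icon already fits inside the idle-view block, keep its
    size; otherwise scale it down uniformly so it just fits."""
    if icon_w <= idle_w and icon_h <= idle_h:
        return icon_w, icon_h
    scale = min(idle_w / icon_w, idle_h / icon_h)  # tightest dimension governs
    return icon_w * scale, icon_h * scale
```

For example, a 100x100 icon placed into a 50x80 idle block is scaled by 0.5 to 50x50, after which it can be moved into the block.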
It should be noted that: in the process of implementing the adjustment and optimization of the display position of the interface icon, the interface display control apparatus provided in the above embodiment is only illustrated by dividing the program modules, and in practical applications, the processing may be allocated to different program modules as needed, that is, the internal structure of the apparatus may be divided into different program modules, so as to complete all or part of the method steps described above. In addition, the interface display control device and the interface display control method provided by the above embodiments belong to the same concept, and specific implementation processes thereof are detailed in the method embodiments and are not described herein again.
Referring to fig. 8, in another aspect of the embodiment of the present application, a computer device is further provided, which includes a processor 21, a memory 22 connected to the processor 21, and a computer program stored on the memory 22 and executable by the processor 21, and when the computer program is executed by the processor 21, the computer device implements the interface display control method according to any embodiment of the present application. It should be noted that the processor 21 may include one or more processors physically separated from each other, and the multiple processors or the intelligent devices including the processors are communicatively connected to cooperatively execute the interface display control method according to the embodiment of the present application. Accordingly, the memory 22 may also include one or more storage media of the same type or different types that are physically separate from each other.
In another aspect of the embodiments of the present application, there is also provided a teleoperational medical system, including the computer device according to the embodiments of the present application and a slave operation device connected to the computer device, where the slave operation device includes a plurality of instruments of different types and a driving component for driving the instruments to perform a specified action. The instruments may include an endoscope for acquiring surgical environment field-of-view images and transmitting them to the computer device for display. Wherein the instruments comprise at least one of: an electrocautery, forceps, a stapler, scissors, and an ultrasound probe.
The embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the foregoing interface display control method embodiments and can achieve the same technical effects; to avoid repetition, details are not repeated here. The computer-readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (20)

1. An interface display control method is applied to computer equipment and is characterized by comprising the following steps:
determining a current display area of the target instrument in a surgical environment field-of-view interface;
determining an idle view area in the surgical environment view field interface according to a current display area of the target instrument;
and acquiring the current position of the interface icon, and if the interface icon is determined to have view shielding according to the position, adjusting the interface icon to the idle view area.
2. The interface display control method of claim 1, wherein said determining an idle field of view region in the surgical environment field of view interface based on a current display region of the target instrument comprises:
determining a working view field area in the surgical environment view field interface according to the current display area of the target instrument;
determining an idle field of view region in the surgical environment field of view interface based on the working field of view region.
3. The interface display control method of claim 2, wherein said determining an idle field of view region in the surgical environment field of view interface based on a current display region of the target instrument comprises:
acquiring a surgical environment field-of-view image, performing target detection on the surgical environment field-of-view image, and determining the position and size of the target instrument contained in the surgical environment field-of-view image;
determining a current display area of the target instrument according to the position and the size of the target instrument, and determining a working visual field area and an idle visual field area in the operation environment visual field interface based on the current display area.
4. The interface display control method according to claim 3, wherein the acquiring a surgical environment field-of-view image, performing target detection on the surgical environment field-of-view image, and determining the position and size of the target instrument included in the surgical environment field-of-view image includes:
acquiring a surgical environment field-of-view image, performing target detection on the surgical environment field-of-view image through a neural network model, and determining the position and size of the target instrument contained in the surgical environment field-of-view image.
5. The interface display control method according to claim 4, wherein the performing target detection on the surgical environment field-of-view image through the neural network model and determining the position and size of the target instrument contained in the surgical environment field-of-view image comprises:
constructing an initial neural network model;
training the initial neural network model based on a training sample set of the operation environment field-of-view image containing the target object label to obtain a trained neural network model; the target object annotation includes a position and size annotation of a target instrument contained in the field of view image.
6. The interface display control method of claim 2, wherein said determining an idle field of view region in the surgical environment field of view interface based on a current display region of the target instrument comprises:
determining the relative distance between a target part on the target instrument and the mirror surface of the endoscope;
determining the position and the size of the target instrument contained in the view field image according to the relative distance and the conversion relation between the image coordinate system corresponding to the surgical environment view field of the endoscope and a world coordinate system;
determining a current display area of the target instrument according to the position and the size of the target instrument, and determining a working visual field area and an idle visual field area in the operation environment visual field interface based on the current display area.
7. The interface display control method of claim 6, wherein said determining a relative distance of a target site on the target instrument from a mirror of an endoscope comprises:
acquiring structural parameters of the endoscope;
determining the type of the target instrument, and determining the instrument structure parameters of the target instrument according to the type of the target instrument;
acquiring a driving parameter corresponding to the target instrument;
and determining the relative distance between the end part of the target instrument and the endoscope mirror surface according to the endoscope structural parameters, the instrument structural parameters and the driving parameters.
8. The interface display control method according to claim 2, wherein the acquiring a current position of the interface icon, and if it is determined that there is view occlusion in the interface icon according to the position, adjusting the interface icon into the idle view area includes:
acquiring the current position of an interface icon;
determining whether the interface icon is overlapped with the operation visual field area or not according to the position of the interface icon, and/or determining whether the idle visual field ratio of the area where the interface icon is located is smaller than a set threshold value or not;
and if so, adjusting the interface icon into the idle view field.
9. The interface display control method of claim 8, wherein said adjusting the interface icon into the idle-view region comprises:
forming a plurality of idle visual field blocks on the idle visual field area according to the relative position of the idle visual field area relative to the operation visual field area in the operation environment visual field interface;
calculating the idle visual field ratio of a plurality of idle visual field blocks;
and adjusting the interface icon into a target idle-view area block with an idle-view ratio meeting preset requirements.
10. The interface display control method of any one of claims 1-9, wherein before the determining a current display area of the target instrument in the surgical environment field-of-view interface, the method further comprises:
judging whether the icon display mode is in a dynamic mode or a fixed mode;
if in the dynamic mode, executing the step of determining a current display area of the target instrument in the surgical environment field-of-view interface;
and if in the fixed mode, waiting for an interface icon adjustment instruction, and correspondingly adjusting the position of the interface icon based on the interface icon adjustment instruction.
11. The interface display control method of claim 10, wherein the free field of view region includes a plurality of free field of view blocks at different locations within the surgical environment field of view interface, and wherein the adjusting the position of the interface icon based on the interface icon adjustment instruction comprises:
and acquiring a selection instruction of an idle view area block of a to-be-selected direction in the operation environment view field interface, and adjusting the interface icon into the idle view area block of the corresponding direction according to the selection instruction.
12. The interface display control method of any one of claims 1 to 9, further comprising, after adjusting the interface icon into the free field of view region:
and updating and storing the current position of the interface icon.
13. The interface display control method of any one of claims 1-9, wherein before the determining a current display area of the target instrument in the surgical environment field-of-view interface, the method further comprises:
acquiring an operation environment field image acquired by an endoscope, and displaying the operation environment field image on an operation environment field interface;
wherein the interface icon comprises an endoscope icon.
14. The interface display control method of claim 1, wherein the target instrument comprises an active instrument and/or a designated inactive instrument.
15. The interface display control method of claim 1, wherein said adjusting the interface icon into the idle-view region comprises:
acquiring the size of the interface icon and the size of the idle view area;
judging whether the size of the interface icon is smaller than that of the idle view field or not;
if so, adjusting the interface icon into the idle view field; or,
if not, adjusting the size of the interface icon to be smaller than the size of the idle visual field area, and adjusting the interface icon to be in the idle visual field area.
16. An interface display control apparatus, comprising:
the determination module is used for determining the current display area of the target instrument in the operation environment field-of-view interface;
the visual field dividing module is used for determining an idle visual field area in the operation environment visual field interface according to the current display area of the target instrument;
and the adjusting module is used for acquiring the current position of the interface icon, and adjusting the interface icon to the idle view area if the view shielding of the interface icon is determined according to the position.
17. A computer device comprising a processor, a memory connected to the processor, and a computer program stored on the memory and executable by the processor, the computer program, when executed by the processor, implementing the interface display control method of any one of claims 1 to 15.
18. A teleoperational medical system comprising a computer device according to claim 17 and a slave manipulator device coupled to the computer device, the slave manipulator device comprising a plurality of instruments of different types and a drive assembly for driving the instruments to perform a specified action.
19. The teleoperational medical system of claim 18, wherein the instrument comprises at least one of: electrocautery, forceps, stapler, scissors, ultrasound probe.
20. A computer-readable storage medium, characterized in that a computer program is stored thereon, which when executed by a processor implements the interface display control method according to any one of claims 1 to 15.
CN202111316069.1A 2021-11-08 2021-11-08 Interface display control method and device, computer equipment and system and medium Active CN114041874B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111316069.1A CN114041874B (en) 2021-11-08 2021-11-08 Interface display control method and device, computer equipment and system and medium
CN202310888179.8A CN116849803A (en) 2021-11-08 2021-11-08 Interface display control method and device, computer equipment and system and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111316069.1A CN114041874B (en) 2021-11-08 2021-11-08 Interface display control method and device, computer equipment and system and medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202310888179.8A Division CN116849803A (en) 2021-11-08 2021-11-08 Interface display control method and device, computer equipment and system and medium

Publications (2)

Publication Number Publication Date
CN114041874A true CN114041874A (en) 2022-02-15
CN114041874B CN114041874B (en) 2023-08-22

Family

ID=80207835

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202310888179.8A Pending CN116849803A (en) 2021-11-08 2021-11-08 Interface display control method and device, computer equipment and system and medium
CN202111316069.1A Active CN114041874B (en) 2021-11-08 2021-11-08 Interface display control method and device, computer equipment and system and medium

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202310888179.8A Pending CN116849803A (en) 2021-11-08 2021-11-08 Interface display control method and device, computer equipment and system and medium

Country Status (1)

Country Link
CN (2) CN116849803A (en)

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102821671A (en) * 2010-03-31 2012-12-12 富士胶片株式会社 Endoscope observation support system and method, and device and program
US20140288413A1 (en) * 2013-03-21 2014-09-25 Samsung Electronics Co., Ltd. Surgical robot system and method of controlling the same
CN105808230A (en) * 2014-12-31 2016-07-27 深圳Tcl新技术有限公司 Method and device for moving suspension icon
US20170196453A1 (en) * 2016-01-13 2017-07-13 Novartis Ag Apparatuses and methods for parameter adjustment in surgical procedures
US20180064499A1 (en) * 2015-03-17 2018-03-08 Intuitive Surgical Operations, Inc. Systems and Methods for Onscreen Identification of Instruments in a Teleoperational Medical System
CN108778180A (en) * 2016-03-02 2018-11-09 柯惠Lp公司 System and method for removing the occlusion objects in operative image and/or video
CN110584778A (en) * 2018-06-12 2019-12-20 上海舍成医疗器械有限公司 Method and device for adjusting object posture and application of device in automation equipment
CN111954486A (en) * 2017-10-27 2020-11-17 深圳迈瑞生物医疗电子股份有限公司 Monitor, display method, display device and storage medium applied to monitor
CN112274250A (en) * 2015-03-17 2021-01-29 直观外科手术操作公司 System and method for presenting screen identification of instrument in teleoperational medical system
US20210030257A1 (en) * 2019-07-29 2021-02-04 Medicaroid Corporation Surgical system
CN112580474A (en) * 2020-12-09 2021-03-30 云从科技集团股份有限公司 Target object detection method, system, device and medium based on computer vision
US20210137619A1 (en) * 2019-11-11 2021-05-13 Cmr Surgical Limited Method of controlling a surgical robot
CN112971688A (en) * 2021-02-07 2021-06-18 杭州海康慧影科技有限公司 Image processing method and device and computer equipment
CN112971996A (en) * 2021-02-03 2021-06-18 上海微创医疗机器人(集团)股份有限公司 Computer-readable storage medium, electronic device, and surgical robot system
US20210290317A1 (en) * 2018-07-31 2021-09-23 Intuitive Surgical Operations, Inc. Systems and methods for tracking a position of a robotically-manipulated surgical instrument

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102821671A (en) * 2010-03-31 2012-12-12 富士胶片株式会社 Endoscope observation support system and method, and device and program
US20140288413A1 (en) * 2013-03-21 2014-09-25 Samsung Electronics Co., Ltd. Surgical robot system and method of controlling the same
CN105808230A (en) * 2014-12-31 2016-07-27 深圳Tcl新技术有限公司 Method and device for moving suspension icon
CN112168358A (en) * 2015-03-17 2021-01-05 直观外科手术操作公司 System and method for screen recognition of instruments in teleoperational medical systems
US20180064499A1 (en) * 2015-03-17 2018-03-08 Intuitive Surgical Operations, Inc. Systems and Methods for Onscreen Identification of Instruments in a Teleoperational Medical System
CN112274250A (en) * 2015-03-17 2021-01-29 直观外科手术操作公司 System and method for presenting screen identification of instrument in teleoperational medical system
US20170196453A1 (en) * 2016-01-13 2017-07-13 Novartis Ag Apparatuses and methods for parameter adjustment in surgical procedures
CN108778180A (en) * 2016-03-02 2018-11-09 柯惠Lp公司 System and method for removing the occlusion objects in operative image and/or video
CN111954486A (en) * 2017-10-27 2020-11-17 深圳迈瑞生物医疗电子股份有限公司 Monitor, display method, display device and storage medium applied to monitor
CN110584778A (en) * 2018-06-12 2019-12-20 上海舍成医疗器械有限公司 Method and device for adjusting object posture and application of device in automation equipment
US20210290317A1 (en) * 2018-07-31 2021-09-23 Intuitive Surgical Operations, Inc. Systems and methods for tracking a position of a robotically-manipulated surgical instrument
US20210030257A1 (en) * 2019-07-29 2021-02-04 Medicaroid Corporation Surgical system
US20210137619A1 (en) * 2019-11-11 2021-05-13 Cmr Surgical Limited Method of controlling a surgical robot
CN112580474A (en) * 2020-12-09 2021-03-30 云从科技集团股份有限公司 Target object detection method, system, device and medium based on computer vision
CN112971996A (en) * 2021-02-03 2021-06-18 上海微创医疗机器人(集团)股份有限公司 Computer-readable storage medium, electronic device, and surgical robot system
CN112971688A (en) * 2021-02-07 2021-06-18 杭州海康慧影科技有限公司 Image processing method and device and computer equipment

Also Published As

Publication number Publication date
CN114041874B (en) 2023-08-22
CN116849803A (en) 2023-10-10

Similar Documents

Publication Publication Date Title
US11751957B2 (en) Surgical system with training or assist functions
KR102014359B1 (en) Method and apparatus for providing camera location using surgical video
KR102298412B1 (en) Surgical image data learning system
CN112220562A (en) Method and system for enhancing surgical tool control during surgery using computer vision
WO2019204777A1 (en) Surgical simulator providing labeled data
KR101926123B1 (en) Device and method for segmenting surgical image
CN109690688A (en) System and method for preventing operation mistake
US20140012793A1 (en) System and method for predicting surgery progress stage
CA2827927A1 (en) Surgical robot system for performing surgery based on displacement information determined by user indication and method for controlling the same
KR20210110961A (en) Camera controller robot based on surgical image recognition and method for adjusting view of camera using the same
CN113380413A (en) Method and device for constructing invalid re-circulation (FR) prediction model
US20230316545A1 (en) Surgical task data derivation from surgical video data
US20220104887A1 (en) Surgical record creation using computer recognition of surgical events
CN114041874B (en) Interface display control method and device, computer equipment and system and medium
KR102276862B1 (en) Method, apparatus and program for controlling surgical image play
WO2023046630A1 (en) Surgical microscope system and corresponding system, method and computer program for a surgical microscope system
CN114631892B (en) Intelligent dermatological medical robot system for automatic diagnosis and treatment
US20220160433A1 (en) Al-Based Automatic Tool Presence And Workflow/Phase/Activity Recognition
JP2021527272A (en) Dominant hand tool detection system for surgical videos
US20220409301A1 (en) Systems and methods for identifying and facilitating an intended interaction with a target object in a surgical space
US20230045686A1 (en) Fusion of spatial and temporal context for location determination for visualization systems
US20230248464A1 (en) Surgical microscope system and system, method, and computer program for a microscope of a surgical microscope system
CN114327046A (en) Multi-mode man-machine interaction and state intelligent early warning method, device and system
CN115624387A (en) Stitching control method, control system, readable storage medium and robot system
CN115376676A (en) Surgical instrument adjustment method, surgical system, and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant