CN115607284B - Intraoperative navigation control device for endoscope, device, and storage medium - Google Patents

Intraoperative navigation control device for endoscope, device, and storage medium

Info

Publication number
CN115607284B
CN115607284B (application CN202211547510.1A)
Authority
CN
China
Prior art keywords
endoscope
route
unit
target
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211547510.1A
Other languages
Chinese (zh)
Other versions
CN115607284A (en)
Inventor
何进雄
谭有余
谭文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhuhai Seesheen Medical Technology Co ltd
Original Assignee
Zhuhai Seesheen Medical Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhuhai Seesheen Medical Technology Co ltd filed Critical Zhuhai Seesheen Medical Technology Co ltd
Priority to CN202211547510.1A
Publication of CN115607284A
Application granted
Publication of CN115607284B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 Surgical robots
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 Surgical robots
    • A61B34/32 Surgical robots operating autonomously
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/107 Visualisation of planned trajectories or target regions
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2065 Tracking using image or pattern recognition

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Robotics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Endoscopes (AREA)

Abstract

The invention belongs to the technical field of medical robots, and discloses an intraoperative navigation control method, device, equipment, and storage medium for an endoscope. Lumen images are collected and analysed in real time while the endoscope moves toward the destination position; if the color of the in-vivo mucosa at any position is identified to be abnormal, prompt information is output, and if a marking instruction from the user is then received within a first specified duration, that position is marked as a stop position. When the endoscope reaches the destination position, a return route is planned according to the stop positions marked along the whole route, and the endoscope is controlled to stop at each stop position during the return, which makes it convenient for the doctor user to observe or treat the in-vivo mucosa at that position. The navigation control mode is thus more flexible, flexibility is improved, and use is more convenient.

Description

Intraoperative navigation control device for endoscope, apparatus, and storage medium
Technical Field
The invention belongs to the technical field of medical robots, and in particular relates to an intraoperative navigation control method, device, equipment, and storage medium for an endoscope.
Background
When a medical robot is used to perform minimally invasive surgery, an automatic control route toward a specified lesion is generally planned in advance on a virtual endoscope (i.e., a virtual platform); a control signal is then output to make the robotic arm of the surgical robot drive the optical endoscope to the position of the specified lesion, and the surgical field is adjusted automatically by an algorithm, which lowers the requirements on the operator.
Path planning generally involves acquiring two-dimensional CT images, identifying lesion tissue based on machine vision, performing three-dimensional modeling to obtain the virtual platform, and then using the virtual platform to plan the intraoperative automatic control route.
However, a two-dimensional CT image cannot capture the real color of the in-vivo mucosa, and once the intraoperative automatic control route is determined it cannot be changed. For example, if the doctor notices during the procedure that the mucosa color looks wrong and wants to observe other areas outside the specified lesion, the automatic control route can neither be paused nor re-selected; the doctor can only stop the automatic control mode manually and switch to manual control of the optical endoscope, which makes the operation inconvenient. The current intraoperative endoscope navigation control method is therefore not flexible enough and is inconvenient to use.
Disclosure of Invention
The invention aims to provide an intraoperative navigation control method and device of an endoscope, equipment and a storage medium, which can improve the flexibility and are more convenient to use.
A first aspect of the invention discloses an intraoperative navigation control method for an endoscope, which comprises the following steps:
controlling the endoscope to acquire a plurality of cavity images in real time in the process that the endoscope moves to the end position;
when the color of the mucosa in the body at the first position is identified to be abnormal according to the plurality of cavity images, outputting prompt information for representing the abnormality of the mucosa in the body at the first position;
if a marking instruction input by a user is detected within a first specified duration, marking the first position as a staying position;
when the endoscope moves to the end position, determining a return route of the endoscope from the end position according to the position information of the stop position, wherein the stop position is positioned on the return route;
controlling the endoscope to move and return according to the return route;
and in the return process, when the endoscope moves to the stop position, controlling the endoscope to stop.
The second aspect of the present invention discloses an intraoperative navigation control device for an endoscope, comprising:
the camera shooting unit is used for controlling the endoscope to collect a plurality of cavity images in real time in the process that the endoscope moves to the end position;
the prompting unit is used for outputting prompting information for representing the abnormality of the mucosa in the body at the first position when the color of the mucosa in the body at the first position is identified to be abnormal according to the plurality of cavity images;
the marking unit is used for marking the first position as a staying position if a marking instruction input by a user is detected within a first specified duration after the prompting unit outputs prompting information for representing mucosa abnormality in the first position;
the navigation unit is used for determining, when the endoscope moves to the end position, a return route along which the endoscope moves back from the end position according to the position information of the stop position, the stop position being located on the return route;
the return unit is used for controlling the endoscope to move and return according to the return route;
and the stopping unit is used for controlling the endoscope to stop when the endoscope moves to the stopping position in the return process.
A third aspect of the invention discloses an electronic device comprising a memory storing executable program code and a processor coupled to the memory; the processor calls the executable program code stored in the memory for executing the intra-operative navigation control method of the endoscope disclosed in the first aspect.
A fourth aspect of the present invention discloses a computer-readable storage medium storing a computer program, wherein the computer program causes a computer to execute the intraoperative navigation control method of an endoscope disclosed in the first aspect.
With the endoscope navigation control method, device, equipment, and storage medium described above, the lumen images collected in real time while the endoscope moves toward the end position are analysed; if the color of the internal mucosa at any position is identified to be abnormal, prompt information is output, and if a marking instruction from the user is received within a first specified duration, that position is marked as a stopping position. When the endoscope reaches the end position, a return route is planned according to the stopping positions marked along the whole route, and the endoscope is controlled to stop at each stopping position during the return, which makes it convenient for the doctor user to observe or treat the internal mucosa at that position. The navigation control mode is therefore more flexible, flexibility is improved, and the method is more convenient to use.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles and effects of the invention.
Unless otherwise specified or defined, the same reference numerals in different figures represent the same or similar technical features, and different reference numerals may be used for the same or similar technical features.
FIG. 1 is a flowchart of an intraoperative navigation control method of an endoscope, disclosed by an embodiment of the invention;
FIG. 2 is a flow chart of another method for controlling intra-operative navigation of an endoscope, in accordance with an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of an intraoperative navigation control device of an endoscope according to the embodiment of the present invention;
fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Description of reference numerals:
301. a camera unit; 302. a prompting unit; 303. a labeling unit; 304. a navigation unit; 305. a return unit; 306. a stop unit; 401. a memory; 402. a processor.
Detailed Description
In order that the invention may be readily understood, specific embodiments thereof will be described in more detail below with reference to the accompanying drawings.
Unless specifically stated or otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. When the technical solutions of the present invention are applied in a real scenario, the technical and scientific terms used herein may also have meanings consistent with the purpose of achieving those technical solutions. As used herein, "first", "second", and the like are used merely to distinguish between names and do not denote a particular quantity or order. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
It will be understood that when an element is referred to as being "secured to" another element, it can be directly secured to the other element or intervening elements may also be present; when an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may also be present; when an element is referred to as being "mounted on" another element, it can be directly on the other element or intervening elements may also be present. When an element is referred to as being "on" another element, it can be directly on the other element or intervening elements may also be present.
As used herein, unless otherwise specified or defined, the terms "comprises", "comprising", and "including" are used interchangeably and are to be understood in an open-ended sense, i.e., "including but not limited to".
It is needless to say that technical contents or technical features which are contrary to the object of the present invention or are clearly contradictory should be excluded.
The intraoperative navigation control method of the endoscope disclosed by the invention can be implemented by computer programming, and the entity executing the method may be an electronic device such as a computer, a laptop computer, or a tablet computer, or an intraoperative navigation control device of the endoscope embedded in such an electronic device, which the invention does not limit. In the embodiments of the present invention, an electronic device is taken as an example for explanation.
As shown in FIG. 1, the embodiment of the invention discloses an intraoperative navigation control method of an endoscope, which comprises the following steps of S110-S180:
and S110, when the target focus position input by the user is received, the electronic equipment takes the target focus position as an end position, and determines the current operation and control route of the endoscope.
In the embodiment of the invention, the electronic device can be in communication connection with the surgical robot through a wireless or wired network, and the mechanical arm of the surgical robot holds an optical endoscope (simply called endoscope), so that the electronic device outputs a control signal to control the mechanical arm of the surgical robot to drive the endoscope to move. The electronic device is generally configured with an electronic display screen, a user (e.g., a doctor) can input a multi-plane CT two-dimensional image of an operation object (i.e., a patient) on the electronic display screen before an operation, and then the electronic device reconstructs the multi-plane CT two-dimensional image to obtain a tomographic image, and segments the tomographic image, so as to identify a position of a suspected lesion of the patient as a target lesion position. Alternatively, the user may input the location of the suspected lesion of the patient as the location of the target lesion by himself/herself, which is not limited by the present invention. Wherein, the optical endoscope can be a gastroscope, an enteroscope, a cystoscope, a bronchoscope, a thoracoscope or a laparoscope and the like.
When a target lesion position input by the user is received, i.e., when the position of the suspected lesion has been determined, the electronic device can control the endoscope to enter an automatic control mode. In this mode, the electronic device can receive a setting instruction input by the user, determine the set position as the designated starting point position according to the setting instruction, take the target lesion position as the end point position, and, in combination with a pre-learned human-body environment, initially plan a path comprising the designated starting point position and the end point position as the current control route. The designated starting point position may be a position set by the user, for example the entrance of the main bronchus, or a branch bronchus near the suspected lesion in the bronchial tree.
Specifically, optionally, the determining, by the electronic device, the current manipulation route of the endoscope may include the following steps S111 to S116:
and S111, displaying the preoperative sectional image of the patient on an electronic display screen by the electronic equipment.
The image pixel point corresponding to the designated starting point position in the tomographic image is marked as a first marking point, and the image pixel point corresponding to the target lesion position is marked as a second marking point.
S112, the electronic device determines a region between a first marking point corresponding to the designated starting point position and a second marking point corresponding to the target lesion position in the tomographic image as a candidate region.
S113, the electronic equipment responds to a click instruction of the user for the candidate area, and determines the pixel point clicked by the user within the candidate area as a target pixel point.
S114, the electronic equipment performs quadratic curve fitting according to the first annotation point, the second annotation point and the target pixel point to obtain a pre-planned path.
The quadratic curve fitting may adopt an algorithm such as the least squares method or the least absolute deviation method; for example, the coordinate information (x, y) of the first annotation point, the second annotation point, and the target pixel point is substituted into the following formula (1) to determine the parameters a, b, and c, thereby obtaining an expression of the pre-planned path:

y = a·x² + b·x + c    (1)
S115, the electronic equipment uniformly divides the pre-planned path into a specified number of sub-paths, and responds to correction pixel points which are selected by the user and located around at least one sub-path.
The distance between each correction pixel point and its nearest sub-path should not exceed a specified distance threshold, and the specific value of the specified distance threshold may be preset by the developer, for example 5, 8, or 10 pixels.
And S116, the electronic equipment corrects the sub-path closest to each correction pixel point according to that correction pixel point, obtaining the current control route.
By implementing the above steps, curve fitting of the pre-planned path is achieved through human-computer interaction, and the fitted pre-planned path is then fine-tuned and corrected, also through human-computer interaction, so that path planning is both immediate and more accurate. If finer correction is required, the specified number can be set flexibly: the larger the specified number of sub-paths, the higher the precision; conversely, the smaller the number, the lower the precision.
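As an illustration only (not part of the patent text), the fitting, subdivision, and correction of steps S111-S116 could be sketched as follows, assuming the marking points and correction pixels are two-dimensional pixel coordinates; the function names and the use of NumPy are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def fit_preplanned_path(start_point, target_pixel, lesion_point):
    """Quadratic least-squares fit y = a*x^2 + b*x + c (formula (1)) through the
    first marking point, the user-clicked target pixel point and the second
    marking point."""
    pts = np.array([start_point, target_pixel, lesion_point], dtype=float)
    a, b, c = np.polyfit(pts[:, 0], pts[:, 1], deg=2)
    return a, b, c

def split_into_subpaths(a, b, c, x_start, x_end, specified_number=20):
    """Uniformly divide the fitted curve into the specified number of sub-paths,
    represented here by their sampled end points."""
    xs = np.linspace(x_start, x_end, specified_number + 1)
    ys = a * xs ** 2 + b * xs + c
    return np.stack([xs, ys], axis=1)

def apply_correction(path_points, correction_pixel, distance_threshold=10.0):
    """Move the sub-path point closest to a user-selected correction pixel onto
    that pixel, provided it lies within the specified distance threshold (pixels)."""
    d = np.linalg.norm(path_points - np.asarray(correction_pixel, dtype=float), axis=1)
    nearest = int(np.argmin(d))
    if d[nearest] <= distance_threshold:
        path_points[nearest] = correction_pixel
    return path_points
```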
And S120, the electronic equipment controls the endoscope to move from the designated starting position to the destination position according to the current control route.
Before executing step S120, the electronic device may further determine whether the endoscope, entering from outside the patient, needs to pass through a body lumen to reach the designated starting point position. For example, if the designated starting point position is the entrance of the main bronchus (close to the larynx), it is determined that the endoscope can reach the designated starting point position without passing through a lumen; if the designated starting point position is a branch bronchus near the suspected lesion in the bronchial tree, it is determined that the endoscope can reach the designated starting point position only after passing through a body lumen. If the endoscope needs to pass through a body lumen, the electronic device can control the endoscope to move accurately to the designated starting point position based on real-time path guidance through human-computer interaction, thereby avoiding injury to the patient and improving intraoperative safety.
Specifically, if it is determined that the endoscope needs to pass through a body lumen to reach the designated starting point, and the endoscope is detected at the entrance of the lumen (i.e., it has entered the patient), the electronic device displays the preoperative tomographic image of the patient on the electronic display screen and repeatedly performs the following steps: respond to a click instruction of the user on the displayed tomographic image, determine the clicked pixel point and take its position as a node position; control the endoscope to capture an image toward the node position to obtain a node image; output the node image together with forward query information; and, if a forward confirmation instruction of the user for the forward query information is received, control the endoscope to move to the node position. These steps are repeated until the endoscope has moved to the designated starting point position.
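For illustration only, this interactive node-by-node guidance loop could be sketched as follows; `ui` and `scope` are assumed interfaces (not defined in the patent) standing for the display/interaction layer and the robot-held endoscope.

```python
def guide_to_start_position(ui, scope, start_position, tolerance=1.0):
    """Human-guided entry: the doctor clicks the next node on the tomographic
    image, the endoscope previews it, and it only advances after confirmation."""
    while scope.distance_to(start_position) > tolerance:
        node_position = ui.wait_for_click()                # node chosen by the doctor
        node_image = scope.capture_towards(node_position)  # shoot toward the node
        ui.show(node_image, prompt="Advance to this node?")
        if ui.wait_for_confirmation():
            scope.move_to(node_position)                   # advance one node
        # otherwise hold position and wait for a new click
```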
And S130, in the process that the endoscope moves to the end point position, the electronic equipment controls the endoscope to collect a plurality of cavity images in real time.
The tip of the endoscope is provided with an image sensor, so that in the moving process of the endoscope, the image sensor can be controlled to continuously acquire image signals at a preset frequency to obtain a plurality of cavity images and transmit the cavity images to the electronic equipment, and the electronic equipment can perform image identification and operative field tracking or adjustment on the cavity images.
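A minimal sketch of this real-time acquisition, assuming a `sensor` object with a `read()` method and a standard queue handing frames to the recognition pipeline; the 30 Hz preset frequency is only an example, not a value from the patent.

```python
import time

def acquire_cavity_images(sensor, frame_queue, frequency_hz=30.0, stop_event=None):
    """Poll the image sensor at the tip of the endoscope at a preset frequency
    and pass each cavity image on for recognition and surgical-field tracking."""
    period = 1.0 / frequency_hz
    while stop_event is None or not stop_event.is_set():
        frame = sensor.read()        # one cavity image
        frame_queue.put(frame)       # consumed by the recognition / tracking stage
        time.sleep(period)
```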
S140, when the color of the mucous membrane in the body at the first position is identified to be abnormal according to the plurality of cavity images, the electronic equipment outputs prompt information for representing the abnormality of the mucous membrane in the body at the first position.
In this step, the electronic equipment first preprocesses each cavity image by normalization and Gaussian low-pass filtering to remove the noise caused by over-exposure and/or under-exposure, obtaining a preprocessed image, and then classifies the preprocessed image using a pre-trained binary classification model to obtain a classification prediction result. The electronic equipment then judges whether the classification prediction result is an abnormal result or a normal result; if it is abnormal, abnormal-region detection is performed on the preprocessed image, and the detection result is taken as the first position. The first position is the current position within the lumen through which the endoscope is passing.
The binary classification model can be obtained by collecting sample images in advance, preprocessing them by normalization and Gaussian low-pass filtering to obtain candidate images, computing weights of the candidate images to obtain feature vectors, dividing the feature vectors into a training set and a test set, labeling each feature vector, and then selecting a support vector machine for training.
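A rough sketch of this preprocessing and binary classification under stated assumptions: OpenCV for the normalization and Gaussian filtering, a resized-and-flattened image as a stand-in for the feature/weight computation (which the patent does not detail), and scikit-learn's SVC as the support vector machine.

```python
import cv2
import numpy as np
from sklearn.svm import SVC

def preprocess(cavity_image_bgr, ksize=5, sigma=1.5):
    """Normalize and Gaussian low-pass filter a cavity image, then return a
    fixed-length feature vector (illustrative feature computation)."""
    img = cavity_image_bgr.astype(np.float32) / 255.0       # normalization
    img = cv2.GaussianBlur(img, (ksize, ksize), sigma)       # low-pass filtering
    return cv2.resize(img, (64, 64)).flatten()               # simple feature vector

def train_binary_model(sample_images, labels):
    """Train an SVM on labelled sample images (1 = abnormal mucosa color, 0 = normal)."""
    features = np.stack([preprocess(img) for img in sample_images])
    return SVC(kernel="rbf").fit(features, labels)

def classify(model, cavity_image_bgr):
    """Binary prediction for one cavity image."""
    return int(model.predict(preprocess(cavity_image_bgr)[None, :])[0])
```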
S150, if the marking instruction input by the user is detected within the first specified duration, the electronic equipment marks the first position as the stopping position.
Because an abnormal color of the mucosa in the body may indicate problems such as inflammation or ulceration, when the electronic equipment identifies that the color of the mucosa in the body at the first position is abnormal, it outputs the prompt information and simultaneously starts a timer to obtain a timed duration. If the timed duration reaches the first specified duration without a marking instruction input by the user being received, no processing is performed; if a marking instruction input by the user is received within the first specified duration (for example 8 seconds, 10 seconds, or 15 seconds), the first position is marked as the staying position. The ways in which the user can input the marking instruction include, but are not limited to, interactive means such as voice, touch, gesture, or text input.
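For illustration only, the timed wait for a marking instruction could look like the sketch below; `ui.poll_marking_instruction()` is an assumed non-blocking check, not an interface defined in the patent.

```python
import time

def wait_for_marking(ui, first_specified_duration=10.0):
    """After the abnormality prompt is output, wait at most the first specified
    duration for a marking instruction (voice, touch, gesture or text).
    Returns True if the first position should be marked as a staying position."""
    deadline = time.monotonic() + first_specified_duration
    while time.monotonic() < deadline:
        if ui.poll_marking_instruction():
            return True
        time.sleep(0.05)          # poll at 20 Hz
    return False                  # timeout: no marking, no further processing
```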
And S160, when the endoscope moves to the end position, the electronic equipment determines, according to the position information of the stop position, a return route which includes the stop position and along which the endoscope moves back from the end position.
While the endoscope is moving toward the end position, the electronic equipment may complete the labeling of several positions with abnormal mucosa color, forming a plurality of stop positions. When the endoscope reaches the end position, the electronic equipment can therefore plan the return route of the endoscope from the end position according to the position information of these stop positions, with each stop position located on the return route.
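A simplified geometric sketch of this return-route planning, assuming the forward route is available as an ordered list of 2-D waypoints; a real system would work on the lumen centreline, so this only illustrates the ordering idea.

```python
def plan_return_route(forward_route, stop_positions):
    """Order the marked stop positions by where they occur along the forward
    route, then reverse that order, so the return route leaves the end position
    and passes through every stop position on the way back to the start."""
    def index_on_route(position):
        # index of the forward-route waypoint nearest to this stop position
        return min(range(len(forward_route)),
                   key=lambda i: (forward_route[i][0] - position[0]) ** 2
                                 + (forward_route[i][1] - position[1]) ** 2)
    ordered_stops = sorted(stop_positions, key=index_on_route, reverse=True)
    return [forward_route[-1]] + ordered_stops + [forward_route[0]]
```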
S170, the electronic equipment controls the endoscope to move and return according to the return route.
Since the endoscope generally needs to stay at the end position for a sufficient operation time to perform the operation, in step S170, the electronic device may specifically control the endoscope to move back according to the return route after the endoscope has been controlled to process the suspected lesion (operation target) and when a return control instruction input by the user is received.
And S180, during the return, when the endoscope moves to a stop position, the electronic equipment controls the endoscope to stay at that stop position.
The electronic device can preset a stay duration; each time the endoscope is controlled to stay at a stop position, it is held there only for that stay duration and then continues moving back toward the next stop position.
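Putting the return and the timed stays together, for illustration only; `scope` is again an assumed endoscope/robot interface, and the 30-second stay duration is just an example value.

```python
import time

def return_along_route(scope, return_route, stop_positions, stay_seconds=30.0, tolerance=1.0):
    """Drive the endoscope back along the return route, holding at each marked
    stop position for the preset stay duration before continuing."""
    remaining = list(stop_positions)
    for waypoint in return_route:
        scope.move_to(waypoint)
        for stop in list(remaining):
            if scope.distance_to(stop) <= tolerance:   # reached a stop position
                time.sleep(stay_seconds)               # hold for observation/treatment
                remaining.remove(stop)
```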
In summary, after the endoscope has been controlled to treat the suspected lesion (the surgical target), it is controlled to return along the return route and to stop at each stop position it passes on the way, which makes it convenient for the doctor to observe or treat the mucosa at each stop position. The navigation control mode is therefore more flexible, flexibility is improved, and use is more convenient. Moreover, problems other than the surgical target are handled efficiently within a single procedure, which avoids the redundant injury to the patient caused by repeated insertions of the endoscope and improves efficiency.
As shown in FIG. 2, the embodiment of the invention discloses another intraoperative navigation control method of an endoscope, which comprises the following steps S210-S280:
S210 to S230. For the descriptions of steps S210-S230, please refer to the descriptions of steps S110-S130, which are not repeated here.
And S240, when the second position is identified to correspond to the plurality of route options according to the plurality of cavity channel images, the electronic equipment outputs selection inquiry information.
Like the first position, the second position may be the current position within the lumen through which the endoscope is passing. When a cavity image shows that the second position is at a bifurcation, i.e., the second position corresponds to a straight channel (forward or backward) or to a turning port (left or right), the second position corresponds to a plurality of route options, and the doctor is prompted to select a route.
And S250, the electronic equipment judges whether the route selection information input by the user for the selection inquiry information is detected within a second specified time. If yes, executing steps S260-S280; otherwise, S290 is performed.
And S260, if the route selection information input by the user aiming at the selection inquiry information is detected within the second specified time, the electronic equipment determines a first target route option selected by the user from the plurality of route options according to the route selection information.
After the electronic device outputs the selection query information, it can monitor by timing whether route selection information input by the user (e.g., the doctor) through voice, text, touch, gesture, or another operation is received within the second specified duration (for example 8 seconds, 10 seconds, or 15 seconds). For example, the user may touch the electronic display screen to click an option, speak a voice input, or input the route selection by waving or swinging a gesture within the field of view of the camera of the electronic device.
In addition, the electronic device may output the selection query information by voice and/or text. More preferably, the electronic device may output the selection query information in the following manner: a screenshot marked with the route positions corresponding to the several route options is output on the electronic display screen, the route positions in the screenshot are each given a code such as A, B, or C, and voice and/or text prompting the user to make a selection is output at the same time.
The route options may specifically be "leftward", "centering", "rightward", and the like, and the route positions are the next positions reached after the "leftward", "centering", or "rightward" operation at the current position. When the received route selection information designates a target code or a route position, such as B or the next position after the "centering" operation, "centering" is determined to be the target route option. After a target route option is determined, further route options such as "forward" and "backward" can be output and the target route option selected by the user determined again, for example "backward", which makes it convenient for the doctor to control the endoscope back to a position it has just passed for further viewing.
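A small sketch of how the coded options in the screenshot could be mapped back to a route option; `ui` is an assumed interface and, for simplicity, the reply is assumed to be one of the displayed codes.

```python
def ask_route_choice(ui, route_options):
    """Label each route option with a code (A, B, C, ...), show the annotated
    screenshot, and map the user's reply back to the chosen route option."""
    codes = {chr(ord("A") + i): option for i, option in enumerate(route_options)}
    ui.show_labelled_screenshot(codes)          # e.g. A: leftward, B: centering, C: rightward
    reply = ui.wait_for_route_selection()       # voice / touch / gesture / text
    return codes.get(reply)                     # None if no valid selection was made
```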
S270, the electronic equipment adjusts the current control route of the endoscope according to the first target route option to obtain a target control route.
Because the goal remains moving to the end position, the end position is unchanged by default; when it is detected that the user has adjusted the current control route, a new control route can be determined and used as the target control route. The target control route still takes the end position as its destination. It should be noted that the target control route may change in real time, that is, each time the user inputs a target route option, the current control route is adjusted according to that option. It is of course not excluded that the adjusted target control route is the same as the current control route before adjustment.
Further, in step S240, when a plurality of route options corresponding to the second position are identified from the cavity images and before the selection query information is output, the electronic device may first determine, from the plurality of cavity images, the target cavity image with the highest matching degree to the second position, and then search the identity database corresponding to the currently logged-in user account for tag data corresponding to the target cavity image. If no such tag data exists, the selection query information is output to prompt the doctor to select a route; if the tag data exists, a second target route option matching the tag data is determined from the plurality of route options according to the tag data. Step S270 is then implemented as: adjusting the current control route of the endoscope according to the second target route option to obtain the target control route.
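For illustration only, this lookup-before-asking behaviour could be sketched as below; `db` and `ui` are assumed interfaces, and hashing the image bytes is just one possible way to key the identity database (the patent does not specify one).

```python
import hashlib

def image_key(image_bytes):
    """Illustrative stable key for a target cavity image: a content hash."""
    return hashlib.sha256(image_bytes).hexdigest()

def resolve_route_option(db, user_account, target_image_bytes, route_options, ui,
                         second_specified_duration=10.0):
    """Reuse stored tag data for this user and image if it matches a route
    option; otherwise ask the doctor and remember the answer."""
    key = image_key(target_image_bytes)
    tag = db.get_tag(user_account, key)
    if tag in route_options:
        return tag                                    # second target route option
    choice = ui.prompt_route_selection(route_options,
                                       timeout=second_specified_duration)
    if choice is None:
        return None                                   # keep the current control route
    db.store_tag(user_account, key, choice)           # record the user's habit
    return choice
```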
And S280, the electronic equipment controls the endoscope to move from the second position to the end position according to the target control route.
Since the endoscope is currently located at the second position, once the interaction is completed the endoscope is controlled to continue moving toward the end position along the target control route, with the second position as the starting point.
As an optional implementation manner, after the electronic device determines the first target route option selected by the user from the plurality of route options according to the route selection information, the following steps S261 to S262 may be further performed:
s261, the electronic equipment marks the first target route option as label data of the target cavity channel image.
And S262, the electronic equipment stores the target cavity image and the label data in an identity database corresponding to the current login user account in an associated manner.
By recording and storing the target route option selected by the user, a route option matching the user's personal habits can be selected automatically in subsequent procedures.
And S290, if the route selection information input by the user for the selection inquiry information is not detected within the second specified time, the electronic equipment controls the endoscope to continue to move from the second position to the end position according to the current control route.
Therefore, by implementing this embodiment of the invention, the doctor's route selection can be obtained while the automatic control mode remains active, without switching from the automatic control mode to the manual control mode. This solves the problem that, once the automatic control route is determined, the automatic control strategy cannot be changed, and a doctor who wants to observe other regions outside the specified lesion can only stop the automatic control mode and switch to manual control.
As shown in fig. 3, the embodiment of the present invention discloses an intraoperative navigation control device of an endoscope, comprising a camera unit 301, a prompt unit 302, a labeling unit 303, a navigation unit 304, a return unit 305, and a stop unit 306, wherein,
the camera unit 301 is used for controlling the endoscope to collect a plurality of cavity images in real time in the process that the endoscope moves to the end point position;
the prompting unit 302 is used for outputting prompting information for representing the abnormality of the internal mucosa at the first position when the color of the internal mucosa at the first position is identified to be abnormal according to the plurality of cavity images;
a labeling unit 303, configured to label the first location as a staying location if a labeling instruction input by a user is detected within a first specified duration after the prompting unit 302 outputs prompting information for characterizing an abnormality of a mucous membrane in the first location;
a navigation unit 304 for determining, when the endoscope moves to the end position, a return route which includes the stop position and along which the endoscope moves back from the end position, based on the position information of the stop position;
a return unit 305 for controlling the endoscope to move back along a return path;
and a stop unit 306 for controlling the endoscope to stop when the endoscope moves to the stop position during the return stroke.
As an alternative embodiment, the intra-operative navigation control device of the endoscope may further include the following units, not shown:
the endoscope comprises a planning unit, a display unit and a control unit, wherein the planning unit is used for determining a current control route of the endoscope by taking a target focus position as an end point position when the target focus position input by a user is received, and the current control route comprises a designated starting point position and an end point position;
and the first moving unit is used for controlling the endoscope to move from the specified starting position to the end position according to the current control route and triggering the camera unit 301 to execute the operation of controlling the endoscope to acquire a plurality of cavity images in real time.
As an alternative embodiment, the intra-operative navigation control device of the endoscope may further include the following units, not shown:
the query unit is used for outputting selection query information when a plurality of route options corresponding to the second position are identified according to the plurality of cavity images after the camera unit 301 controls the endoscope to collect the plurality of cavity images in real time;
the selection unit is used for determining a first target route option selected by the user from the plurality of route options according to the route selection information if the route selection information input by the user aiming at the selection inquiry information is detected within a second designated time after the inquiry unit outputs the selection inquiry information;
the adjusting unit is used for adjusting the current control route of the endoscope according to the first target route option to obtain a target control route;
and the second moving unit is used for controlling the endoscope to move from the second position to the end position according to the target control route and triggering the camera unit 301 to execute the operation of controlling the endoscope to acquire a plurality of cavity images in real time.
As an alternative embodiment, the intra-operative navigation control device of the endoscope may further include the following units, not shown:
the matching unit is used for determining a target cavity image with the highest matching degree with a second position from the plurality of cavity images when a plurality of route options corresponding to the second position are identified according to the plurality of cavity images after the camera unit 301 controls the endoscope to collect the plurality of cavity images in real time;
the retrieval unit is used for judging whether label data corresponding to the target cavity image is retrieved in an identity database corresponding to the current login user account;
the query unit is specifically configured to output selection query information when the retrieval unit determines that the tag data corresponding to the target cavity image is not retrieved in the identity database corresponding to the current login user account.
As an alternative embodiment, the intra-operative navigation control device of the endoscope may further include an index unit, not shown, for determining a second target route option matching the tag data from the plurality of route options according to the tag data when the retrieval unit determines that the tag data corresponding to the target lumen image is retrieved from the identity database corresponding to the currently logged-in user account;
correspondingly, the adjusting unit is specifically configured to adjust the current operation route of the endoscope according to the second target route option to obtain the target operation route.
As an alternative embodiment, the intra-operative navigation control device of the endoscope may further include the following units, not shown:
the marking unit is used for marking the first target route option selected by the user as the label data of the target cavity image when the searching unit judges that the label data corresponding to the target cavity image is not searched in the identity database corresponding to the current login user account and the selecting unit determines the first target route option selected by the user from the plurality of route options according to the route selection information;
and the storage unit is used for storing the target cavity image and the label data into an identity database corresponding to the current login user account in a correlated manner.
Optionally, the first moving unit is further configured to, after the query unit outputs the selection query information, control the endoscope to continue moving from the second position to the destination position according to the current manipulation route if the route selection information input by the user for the selection query information is not detected within the second specified time period.
As shown in fig. 4, an embodiment of the present invention discloses an electronic device, which includes a memory 401 storing executable program codes and a processor 402 coupled to the memory 401;
the processor 402 calls the executable program code stored in the memory 401 to execute the intra-operative navigation control method of the endoscope described in the above embodiments.
The embodiment of the invention also discloses a computer readable storage medium which stores a computer program, wherein the computer program enables a computer to execute the intraoperative navigation control method of the endoscope described in the embodiments.
The above embodiments are provided to illustrate, reproduce and deduce the technical solutions of the present invention, and to fully describe the technical solutions, the objects and the effects of the present invention, so as to make the public more thoroughly and comprehensively understand the disclosure of the present invention, and not to limit the protection scope of the present invention.
The above examples are not intended to be exhaustive of the invention and there may be many other embodiments not listed. Any alterations and modifications without departing from the spirit of the invention are within the scope of the invention.

Claims (6)

1. An intraoperative navigation control device for an endoscope, comprising:
the system comprises a planning unit, a processing unit and a display unit, wherein the planning unit is used for determining a current control route of an endoscope by taking a target focus position as an end point position when receiving the target focus position input by a user, and the current control route comprises a designated starting point position and the end point position;
a first moving unit configured to control the endoscope to move from the designated start position to the end position according to the current manipulation route;
the camera shooting unit is used for controlling the endoscope to collect a plurality of cavity images in real time in the process that the endoscope moves to the end point position;
the prompting unit is used for outputting prompting information for representing the abnormality of the mucosa in the body at the first position when the color of the mucosa in the body at the first position is identified to be abnormal according to the cavity images;
the marking unit is used for marking the first position as a staying position if a marking instruction input by a user is detected within a first specified duration after the prompting unit outputs prompting information for representing mucosa abnormality in the first position;
the navigation unit is used for determining a return route of the endoscope from the end position according to the position information of the stop position when the endoscope moves to the end position, and the stop position is positioned on the return route;
the return unit is used for controlling the endoscope to move and return according to the return route;
the stopping unit is used for controlling the endoscope to stop when the endoscope moves to the stopping position in the return process;
the planning unit is specifically configured to determine a current manipulation route of the endoscope in a manner of:
the planning unit is used for displaying a preoperative sectional image of the patient on an electronic display screen, and determining a region between a first marking point corresponding to the designated starting point position and a second marking point corresponding to the target focus position in the sectional image as a candidate region; responding to a click instruction of the user for the candidate region, and determining the pixel point clicked by the user in the candidate region as a target pixel point; performing quadratic curve fitting according to the first marking point, the second marking point and the target pixel point to obtain a pre-planned path; uniformly dividing the pre-planned path into a specified number of sub-paths, and responding to correction pixel points which are selected by the user and are positioned around at least one sub-path; correcting the sub-path closest to each correction pixel point according to that correction pixel point to obtain the current control route; and the first marking point and the second marking point are image pixel points in the sectional image.
2. The intra-operative navigation control device of an endoscope according to claim 1, further comprising:
the query unit is used for outputting selection query information when a plurality of route options corresponding to a second position are identified according to the plurality of cavity images after the camera unit controls the endoscope to acquire the plurality of cavity images in real time;
the selection unit is used for determining a first target route option selected by a user from a plurality of route options according to the route selection information if the route selection information input by the user aiming at the selection inquiry information is detected within a second specified time after the inquiry unit outputs the selection inquiry information;
the adjusting unit is used for adjusting the current control route of the endoscope according to the first target route option to obtain a target control route;
and the second moving unit is used for controlling the endoscope to move from the second position to the end position according to the target control route and triggering the camera shooting unit to execute the operation of controlling the endoscope to acquire a plurality of cavity images in real time.
3. The intra-operative navigation control device of an endoscope according to claim 2, further comprising:
the matching unit is used for determining a target cavity image with the highest matching degree with a second position from the cavity images when a plurality of route options corresponding to the second position are identified according to the cavity images after the camera shooting unit controls the endoscope to collect the plurality of cavity images in real time;
the retrieval unit is used for judging whether label data corresponding to the target cavity image is retrieved in an identity database corresponding to the current login user account;
and the query unit is specifically used for outputting selection query information when the retrieval unit judges that the label data corresponding to the target cavity image is not retrieved in the identity database corresponding to the current login user account.
4. The intra-operative navigation control device of an endoscope according to claim 3, further comprising an indexing unit configured to determine a second target route option matching the tag data from the plurality of route options according to the tag data when the retrieving unit determines that the tag data corresponding to the target lumen image is retrieved from the identity database corresponding to the currently logged-in user account;
and the adjusting unit is further used for adjusting the current control route of the endoscope according to the second target route option to obtain a target control route.
5. An electronic device comprising a memory storing executable program code and a processor coupled to the memory; the processor calls the executable program code stored in the memory for performing the steps of:
when a target focus position input by a user is received, determining a current control route of the endoscope by taking the target focus position as an end point position, wherein the current control route comprises a designated starting point position and the end point position; controlling the endoscope to move from the designated starting position to the end position according to the current manipulation route;
controlling the endoscope to collect a plurality of cavity images in real time in the process that the endoscope moves to the end position; when the color of the mucosa in the body at the first position is identified to be abnormal according to the plurality of cavity images, outputting prompt information for representing the abnormality of the mucosa in the body at the first position; if a marking instruction input by a user is detected within a first specified duration, marking the first position as a staying position;
when the endoscope moves to the end position, determining a return route of the endoscope from the end position according to the position information of the stop position, wherein the stop position is positioned on the return route; controlling the endoscope to move and return according to the return route; in the return process, when the endoscope moves to the stop position, controlling the endoscope to stop;
the mode of determining the current control route of the endoscope specifically comprises the following steps:
displaying a preoperative sectional image of the patient on an electronic display screen, and determining a region between a first marking point corresponding to the designated starting point position and a second marking point corresponding to the target focus position in the sectional image as a candidate region; responding to a click instruction of the user for the candidate region, and determining the pixel point clicked by the user in the candidate region as a target pixel point; performing quadratic curve fitting according to the first marking point, the second marking point and the target pixel point to obtain a pre-planned path; uniformly dividing the pre-planned path into a specified number of sub-paths, and responding to correction pixel points which are selected by the user and are positioned around at least one sub-path; correcting the sub-path closest to each correction pixel point according to that correction pixel point to obtain the current control route; and the first marking point and the second marking point are image pixel points in the sectional image.
6. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program, wherein the computer program causes a computer to execute the steps of:
when a target focus position input by a user is received, determining a current control route of the endoscope by taking the target focus position as an end point position, wherein the current control route comprises a designated starting point position and the end point position; controlling the endoscope to move from the designated starting position to the end position according to the current manipulation route;
controlling the endoscope to collect a plurality of cavity images in real time in the process that the endoscope moves to the end position; when the color of the mucosa in the body at the first position is identified to be abnormal according to the cavity images, outputting prompt information for representing the abnormality of the mucosa in the body at the first position; if a marking instruction input by a user is detected within a first specified duration, marking the first position as a staying position;
when the endoscope moves to the end point position, determining a return route of the endoscope from the end point position according to the position information of the stop position, wherein the stop position is positioned on the return route; controlling the endoscope to move and return according to the return route; in the return process, when the endoscope moves to the stop position, controlling the endoscope to stop;
the mode of determining the current control route of the endoscope specifically comprises the following steps:
displaying a preoperative sectional image of the patient on an electronic display screen, and determining a region between a first marking point corresponding to the designated starting point position and a second marking point corresponding to the target focus position in the sectional image as a candidate region; responding to a click instruction of the user for the candidate region, and determining the pixel point clicked by the user in the candidate region as a target pixel point; performing quadratic curve fitting according to the first marking point, the second marking point and the target pixel point to obtain a pre-planned path; uniformly dividing the pre-planned path into a specified number of sub-paths, and responding to correction pixel points which are selected by the user and are positioned around at least one sub-path; correcting the sub-path closest to each correction pixel point according to that correction pixel point to obtain the current control route; and the first marking point and the second marking point are image pixel points in the sectional image.
CN202211547510.1A 2022-12-05 2022-12-05 Intraoperative navigation control device for endoscope, device, and storage medium Active CN115607284B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211547510.1A CN115607284B (en) 2022-12-05 2022-12-05 Intraoperative navigation control device for endoscope, device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211547510.1A CN115607284B (en) 2022-12-05 2022-12-05 Intraoperative navigation control device for endoscope, device, and storage medium

Publications (2)

Publication Number Publication Date
CN115607284A CN115607284A (en) 2023-01-17
CN115607284B (en) 2023-03-14

Family

ID=84880403

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211547510.1A Active CN115607284B (en) 2022-12-05 2022-12-05 Intraoperative navigation control device for endoscope, device, and storage medium

Country Status (1)

Country Link
CN (1) CN115607284B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106256331A (en) * 2015-06-19 2016-12-28 柯惠有限合伙公司 For the navigation system and method by the air flue in virtual bronchoscopic view
CN113155116A (en) * 2020-01-22 2021-07-23 无锡祥生医疗科技股份有限公司 Ultrasonic scanning navigation method and device and storage medium
EP3991632A1 (en) * 2020-10-28 2022-05-04 FUJI-FILM Corporation Endoscope system and endoscope device
CN114916898A (en) * 2022-07-20 2022-08-19 广州华友明康光电科技有限公司 Automatic control inspection method, system, equipment and medium for magnetic control capsule

Also Published As

Publication number Publication date
CN115607284A (en) 2023-01-17

Similar Documents

Publication Publication Date Title
US20240176579A1 (en) Vocally actuated surgical control system
CN107405079B (en) Method and system for content management of video images of anatomical regions
US20120130171A1 (en) Endoscope guidance based on image matching
JP2023544360A (en) Interactive information overlay on multiple surgical displays
EP1685787B1 (en) Insertion support system
JP2023544593A (en) collaborative surgical display
JP2023544594A (en) Display control of layered systems based on capacity and user operations
US20160007827A1 (en) Device and method for asissting laparoscopic surgery - rule based approach
US20190269390A1 (en) Device and method for assisting laparoscopic surgery - rule based approach
JP7160033B2 (en) Input control device, input control method, and surgical system
JP2009077765A (en) Endoscopic system
RU2007132734A (en) DEVICE AND METHOD OF DIRECTING A CATHETER IN ELECTROPHYSIOLOGICAL RESEARCH
CN112672709A (en) System and method for tracking the position of a robotically-manipulated surgical instrument
CN109863553A (en) The operation control system of voice activation
US11561762B2 (en) Vocally actuated surgical control system
US20220277461A1 (en) Method for generating learning model and program
CN115607284B (en) Intraoperative navigation control device for endoscope, device, and storage medium
CN113366583A (en) Camera control system and method for computer-assisted surgery system
WO2022054847A1 (en) Medical imaging apparatus, learning model generation method, and learning model generation program
US20190231167A1 (en) System and method for guiding and tracking a region of interest using an endoscope
CN115553925B (en) Endoscope control model training method and device, equipment and storage medium
EP3840629B1 (en) Image correction of a surgical endoscope video stream
CN111950338A (en) Monitoring processing of objects
US20220361739A1 (en) Image processing apparatus, image processing method, and endoscope apparatus
US10694929B2 (en) Medical equipment system and operation method of medical equipment system

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant