CN114022548A - Endoscope collision detection method, device, equipment and storage medium

Endoscope collision detection method, device, equipment and storage medium

Info

Publication number
CN114022548A
CN114022548A (application CN202111080471.4A)
Authority
CN
China
Prior art keywords
collision
early warning
endoscope
grid structure
collision early
Prior art date
Legal status
Pending
Application number
CN202111080471.4A
Other languages
Chinese (zh)
Inventor
李凌 (Li Ling)
徐强 (Xu Qiang)
辜嘉 (Gu Jia)
李文超 (Li Wenchao)
Current Assignee
Suzhou Zhongkehuaying Health Technology Co., Ltd.
Original Assignee
Suzhou Zhongkehuaying Health Technology Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Suzhou Zhongkehuaying Health Technology Co., Ltd.
Priority to CN202111080471.4A
Publication of CN114022548A
Current legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G06T 17/00: Three-dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/20: Finite element generation, e.g. wire-frame surface description, tessellation
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10068: Endoscopic image
    • G06T 2210/00: Indexing scheme for image generation or computer graphics
    • G06T 2210/21: Collision detection, intersection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Endoscopes (AREA)

Abstract

The application discloses an endoscope collision detection method, an endoscope collision detection device, equipment and a storage medium. The method comprises the following steps: acquiring three-dimensional data of a region to be detected; performing grid structure processing on the three-dimensional data based on a preset grid planning rule to obtain a corresponding three-dimensional grid structure; acquiring a collision early warning section corresponding to the endoscope lens; if the collision early warning section and the grid structure corresponding to its current position in the three-dimensional grid structure do not meet a preset collision condition, acquiring the current orientation of the collision early warning section; determining a target position of the collision early warning section based on a preset advancing rule corresponding to the endoscope lens and the current orientation; and if, when the collision early warning section is located at the target position, the collision early warning section and the corresponding target grid structure in the three-dimensional grid structure meet the preset collision condition, acquiring a target collision distance based on the target position and sending a collision distance early warning. The application can quickly and accurately perform collision detection and collision early warning while the endoscope is in use.

Description

Endoscope collision detection method, device, equipment and storage medium
Technical Field
The present application relates to the field of intelligent medical technology, and in particular, to a method, an apparatus, a device, and a storage medium for detecting endoscope collision.
Background
The endoscope is a detection instrument integrating traditional optics, ergonomics, precision machinery, modern electronics, mathematics and software. It carries an image sensor, an optical lens, a light source for illumination, mechanical components and the like, and can enter the stomach through the mouth or enter the body through other natural orifices. Since an endoscope can show lesions that cannot be displayed by X-ray, it is very useful to the doctor; for example, with the aid of an endoscope, an ulcer or tumor in the stomach can be observed and an optimal treatment plan can be developed on that basis.
Modern minimally invasive surgery cannot do without endoscopic technology; as minimally invasive techniques increasingly replace traditional operations and this change grows day by day, the application of endoscopic technology is of particular significance. Known as the "third eye" of humans, the endoscope is a fiber-optic, non-invasive device for otorhinolaryngological diagnosis and treatment that integrates examination, diagnosis and treatment; it is one of the most advanced technologies in the field of international otorhinolaryngology, and its use of optical fiber was a breakthrough in the history of human medicine.
The endoscope has a regular cylindrical structure whose direction is changed by a snake-bone section at the front end, and the endoscope body can be a rigid endoscope (which cannot be bent) or a flexible endoscope (which can be bent); collision detection in endoscopic surgery therefore differs from collision detection in other environments.
In endoscopic surgery, collision detection and collision early warning are very important: the doctor needs to judge in advance how far the endoscope can continue along its current direction before it collides with human tissue, so that an impassable route can be avoided in advance and a judgment can be made conveniently. However, in the prior art the application of collision detection technology to endoscopes is rather limited, and its accuracy and speed can hardly meet surgical requirements.
Disclosure of Invention
In order to solve the above technical problems, the application discloses an endoscope collision detection method which can rapidly and accurately perform collision detection and collision early warning while the endoscope is in use, so that an impassable route can be avoided in advance and a doctor can conveniently avoid surgical risks in advance according to the collision early warning result.
In order to achieve the above object, the present application provides an endoscope collision detection method, including:
acquiring three-dimensional data of a region to be detected;
based on a preset grid planning rule, carrying out grid structure processing on the three-dimensional data to obtain a corresponding three-dimensional grid structure;
acquiring a collision early warning section corresponding to the endoscope lens;
if the collision early warning section and a grid structure corresponding to the current position of the collision early warning section in the three-dimensional grid structure do not meet a preset collision condition, acquiring the current orientation of the collision early warning section;
determining a target position of the collision early warning section based on a preset advancing rule corresponding to the endoscope lens and the current orientation;
and if the collision early warning section is located at the target position, and the collision early warning section and the corresponding target grid structure in the three-dimensional grid structure meet preset collision conditions, acquiring a target collision distance based on the target position, and sending collision distance early warning.
In some embodiments, the acquiring a collision warning section corresponding to the endoscope lens includes:
acquiring the current coordinate of the lens center of the endoscope lens;
determining an Euler angle of the lens center according to the current coordinate;
and determining a corresponding collision early warning tangent plane based on the current coordinate and the Euler angle.
In some embodiments, the determining a corresponding collision warning tangent plane based on the current coordinates and the euler angle includes:
determining a corresponding quadrilateral area by taking the current coordinate of the center of the lens as the center, n times of the diameter of the endoscope as the side length and a nutation angle in the Euler angle as an inclination angle, wherein n is more than 1 and is a natural number;
and taking the quadrilateral area as the collision early warning tangent plane.
In some embodiments, before the acquiring the current orientation of the collision early warning section if the collision early warning section and the grid structure corresponding to its current position in the three-dimensional grid structure do not meet the preset collision condition, the method further includes:
judging whether the collision early warning section and the current position of the collision early warning section have a cross point in a corresponding grid structure in the three-dimensional grid structure;
if not, judging that the collision early warning section and a grid structure corresponding to the current position of the collision early warning section in the three-dimensional grid structure do not meet the preset collision condition.
In some embodiments, the obtaining the current orientation of the collision warning tangent plane includes:
acquiring the current coordinate of the lens center of the endoscope lens;
determining an Euler angle of the lens center according to the current coordinate of the lens center;
and determining the current orientation of the collision early warning tangent plane according to the Euler angle of the center of the lens.
In some embodiments, the determining the target position of the collision warning section based on the preset advance rule corresponding to the endoscope lens and the current orientation includes:
and updating the position of the collision early warning tangent plane along the current direction according to a preset advancing step length by taking the current position of the collision early warning tangent plane as an advancing starting point to obtain the target position of the collision early warning tangent plane.
In some embodiments, before the acquiring the target collision distance based on the target position and sending the collision distance early warning if, when the collision early warning section is located at the target position, the collision early warning section and the corresponding target grid structure in the three-dimensional grid structure meet the preset collision condition, the method further includes:
if the collision early warning section is located at the target position, judging whether a cross point exists between the collision early warning section and a corresponding target grid structure in the three-dimensional grid structure;
and if so, judging that the collision early warning section and the corresponding target grid structure in the three-dimensional grid structure meet a preset collision condition.
In some embodiments, said obtaining a target collision distance based on said target position comprises:
acquiring a distance difference value between the target position of the collision early warning tangent plane and the current position of the collision early warning tangent plane;
and taking the distance difference value as a target collision distance.
In some embodiments, before the acquiring the three-dimensional data of the region to be detected, the method further includes:
acquiring a medical image;
performing three-dimensional reconstruction based on the medical image to obtain a three-dimensional structure diagram corresponding to the medical image;
and segmenting the three-dimensional data of the area to be detected from the three-dimensional structure chart.
In some embodiments, the obtaining a target collision distance based on the target position and issuing a collision distance warning further includes:
and planning a path of the endoscope lens by taking the current position of the collision early warning tangent plane as a starting point.
The present application further provides an endoscope collision detection device, the device comprising:
the first acquisition module is used for acquiring three-dimensional data of a region to be detected;
the grid planning module is used for carrying out grid structure processing on the three-dimensional data based on a preset grid planning rule to obtain a corresponding three-dimensional grid structure;
the second acquisition module is used for acquiring a collision early warning section corresponding to the endoscope lens;
a third obtaining module, configured to obtain a current orientation of the collision early warning tangent plane if the collision early warning tangent plane and a grid structure corresponding to the current position of the collision early warning tangent plane in the three-dimensional grid structure do not meet a preset collision condition;
the determining module is used for determining the target position of the collision early warning tangent plane based on a preset advancing rule corresponding to the endoscope lens and the current orientation;
and the collision early warning module is used for acquiring a target collision distance based on the target position and sending out collision distance early warning if the collision early warning section is located at the target position and the collision early warning section and a target grid structure corresponding to the collision early warning section in the three-dimensional grid structure meet preset collision conditions.
The present application further provides an endoscope collision detection apparatus comprising a processor and a memory, the memory having stored therein at least one instruction or at least one program, the at least one instruction or the at least one program being loaded and executed by the processor to implement the endoscope collision detection method as described above.
The present application further provides a computer-readable storage medium, wherein at least one instruction or at least one program is stored in the storage medium, and the at least one instruction or the at least one program is loaded by a processor and executes the endoscope collision detection method as described above.
The embodiment of the application has the following beneficial effects:
the endoscope collision detection method can be used for rapidly and accurately performing collision detection and collision early warning detection in the using process of the endoscope so as to avoid the non-advanceable route in advance, and a doctor can conveniently avoid operation risks in advance according to collision early warning results.
Drawings
In order to more clearly illustrate the endoscope collision detection method, apparatus, device and storage medium described in the present application, the drawings required for the embodiments are briefly described below. It is apparent that the drawings in the following description illustrate only some embodiments of the present application, and that other drawings can be obtained from these drawings by those skilled in the art without inventive effort.
Fig. 1 is a schematic flowchart of an endoscope collision detection method according to an embodiment of the present application;
fig. 2 is a schematic flowchart of a method for determining a collision warning section according to an embodiment of the present disclosure;
fig. 3 is an exemplary schematic diagram of a collision warning area of an endoscope lens advancing in a collision detection process according to an embodiment of the present application;
FIG. 4 is a schematic structural diagram of an endoscope collision detection device provided by an embodiment of the application;
fig. 5 is a schematic structural diagram of an endoscope collision detection device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used are interchangeable under appropriate circumstances, such that the embodiments of the application described herein are capable of being practiced in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, system, article, or apparatus.
Before the embodiments of the present application are described in further detail, the terms and expressions referred to in the embodiments of the present application are explained as follows.
Euler angles: a set of three independent angular parameters used to determine the orientation of a rigid body rotating about a fixed point, consisting of the nutation angle theta, the precession angle psi and the spin (rotation) angle phi; they were first proposed by Euler.
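By way of illustration only (this is not part of the original patent text), the orientation described by these Euler angles can be converted into a rotation matrix. The sketch below assumes the classical z-x-z convention (precession psi about z, nutation theta about the new x, spin phi about the new z); the function name and the convention are assumptions made for this example.

```python
import numpy as np

def euler_zxz_to_matrix(psi: float, theta: float, phi: float) -> np.ndarray:
    """Rotation matrix for classical Euler angles (precession psi,
    nutation theta, spin phi), z-x-z convention, angles in radians."""
    def rz(a):
        c, s = np.cos(a), np.sin(a)
        return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

    def rx(a):
        c, s = np.cos(a), np.sin(a)
        return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

    # Intrinsic z-x-z: precession, then nutation, then spin.
    return rz(psi) @ rx(theta) @ rz(phi)

# Example: the lens axis (body z-axis) expressed in world coordinates.
R = euler_zxz_to_matrix(np.deg2rad(30), np.deg2rad(15), 0.0)
lens_axis_world = R @ np.array([0.0, 0.0, 1.0])
```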
The collision detection method of the present application is described below with reference to fig. 1. It can be applied to the field of intelligent medical treatment, and in particular to endoscope collision detection during surgery; the method can be applied to a human tissue region requiring diagnosis, in order to judge whether the endoscope collides with the human tissue region and whether a collision risk exists.
Referring to fig. 1, which is a schematic flowchart of an endoscope collision detection method provided by an embodiment of the present application, the present specification provides the method steps as described in the embodiments or the flowchart, but more or fewer steps may be included based on routine effort. The order of steps recited in the embodiments is only one of many possible execution orders and does not represent the only execution order; the endoscope collision detection method can be executed in the order shown in the embodiments or the drawings. Specifically, as shown in fig. 1, the method includes:
s101, acquiring three-dimensional data of a to-be-detected area;
it should be noted that, in the embodiment of the present application, the three-dimensional data of the region to be detected may be a three-dimensional structure diagram including a region to be diagnosed, which needs to be detected in the surgical procedure;
in this embodiment of the present application, the method for acquiring three-dimensional data of a region to be detected may include, but is not limited to:
acquiring a medical image;
in the embodiment of the present application, the medical image may be acquired by using a medical imaging device, for example, a CT image;
performing three-dimensional reconstruction based on the medical image to obtain a three-dimensional structure diagram corresponding to the medical image;
in the embodiment of the application, the existing three-dimensional reconstruction method can be adopted to carry out three-dimensional reconstruction on the medical image to obtain a three-dimensional structure chart corresponding to the medical image;
segmenting three-dimensional data of a region to be detected from the three-dimensional structure chart;
specifically, three-dimensional reconstruction is performed through the acquired CT image data, and a three-dimensional structure diagram of the bronchial region is segmented.
In the embodiment of the application, the three-dimensional structure chart can be subjected to region segmentation, and three-dimensional data of the region to be detected, namely the three-dimensional structure chart of the region to be detected, is segmented based on treatment requirements.
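For illustration only, the following is a minimal sketch of how a reconstructed CT volume might be turned into a triangulated surface of the region to be detected. It assumes that a binary segmentation mask of the region is already available and that scikit-image is installed; the function names, the toy example and the spacing values are assumptions and are not taken from the patent.

```python
import numpy as np
from skimage import measure  # assumed dependency for surface extraction

def mask_to_surface(mask: np.ndarray, spacing=(1.0, 1.0, 1.0)):
    """Extract a triangle mesh (vertices, faces) from a binary 3-D mask,
    e.g. a segmented bronchial region of a reconstructed CT volume."""
    verts, faces, _normals, _values = measure.marching_cubes(
        mask.astype(np.float32), level=0.5, spacing=spacing)
    return verts, faces

# Toy example: a spherical "region to be detected".
zz, yy, xx = np.mgrid[:64, :64, :64]
mask = ((xx - 32) ** 2 + (yy - 32) ** 2 + (zz - 32) ** 2) < 20 ** 2
verts, faces = mask_to_surface(mask, spacing=(0.8, 0.8, 0.8))
```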
S103, carrying out grid structure processing on the three-dimensional data based on a preset grid planning rule to obtain a corresponding three-dimensional grid structure;
In the embodiment of the application, the surface of the three-dimensional structure diagram represented by the three-dimensional data can be gridded based on an existing grid planning rule, so that the three-dimensional curved surface of the three-dimensional structure diagram is represented by a grid structure. This design avoids the high complexity and large amount of calculation incurred when the three-dimensional curved surface is used directly for collision calculation, thereby reducing the calculation amount of collision calculation and improving the efficiency of collision detection.
In the embodiment of the application, the size of the grid in the three-dimensional grid structure can be dynamically planned according to the size of the three-dimensional data;
specifically, the planning may be performed according to the number of triangular patches in the three-dimensional data.
S105, acquiring a collision early warning section corresponding to the endoscope lens;
in the embodiment of the application, the collision early warning section may be a cross-sectional area associated with the lens of the endoscope;
specifically, in the embodiment of the present application, this cross-sectional area is larger than the actual lens cross-section of the endoscope.
In the embodiment of the present application, as shown in fig. 2, which is a schematic flowchart of the method for determining a collision early warning section provided in the embodiment of the present application, the steps are specifically as follows:
s201, acquiring the current coordinate of the lens center of the endoscope lens;
in the embodiment of the application, a pose sensor arranged on an endoscope lens is adopted to obtain the current coordinate of the lens center; wherein the pose sensor can be used to detect its position and three-dimensional pose.
Specifically, if the pose sensor is arranged at the center of the lens of the endoscope, the coordinates of the pose sensor acquired by the pose sensor are the current coordinates of the center of the lens;
If the pose sensor is arranged elsewhere on the endoscope lens and the distance between the pose sensor and the lens center is m, the current coordinate of the lens center can be obtained by calculation based on the coordinate acquired by the pose sensor and the distance m.
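A small, hedged sketch of this second case follows. It assumes that the sensor sits a known distance m behind the lens center along the lens axis and that the axis is given by z-x-z Euler angles; the offset direction and all names are assumptions introduced only for illustration.

```python
import numpy as np

def lens_axis_from_euler(psi: float, theta: float) -> np.ndarray:
    """Lens axis (body z-axis) in world coordinates for z-x-z Euler angles;
    the spin angle phi does not change the axis direction."""
    return np.array([np.sin(psi) * np.sin(theta),
                     -np.cos(psi) * np.sin(theta),
                     np.cos(theta)])

def lens_center_from_sensor(sensor_pos, psi, theta, m):
    """Assumed geometry: the pose sensor lies a distance m behind the lens
    center along the lens axis, so center = sensor_pos + m * axis."""
    return np.asarray(sensor_pos, dtype=float) + m * lens_axis_from_euler(psi, theta)

center = lens_center_from_sensor([10.0, 5.0, 2.0],
                                 np.deg2rad(30.0), np.deg2rad(15.0), m=4.0)
```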
S203, determining an Euler angle of the center of the lens according to the current coordinate;
in the embodiment of the application, the pose sensor can determine the euler angle of the lens center based on the acquired coordinates of the lens center;
and S205, determining a corresponding collision early warning tangent plane based on the current coordinate and the Euler angle.
In the embodiment of the application, a corresponding quadrilateral area is determined by taking the current coordinate of the center of a lens as the center, taking n times of the diameter of an endoscope as the side length and taking a nutation angle in an Euler angle as an inclination angle, wherein n is more than 1 and is a natural number;
and taking the quadrilateral area as a collision early warning tangent plane.
In the embodiment of the application, the included angle between the plane in which the quadrilateral area lies and the horizontal plane is the inclination angle;
preferably, n can be 1.2, that is, the quadrilateral area is planned with the side length being 1.2 times of the diameter of the endoscope;
preferably, the quadrilateral area can be a square plane; that is, the collision early warning tangent plane in the application is a square plane.
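To make the geometry concrete, the following is a hedged sketch of constructing such a square warning section: a square of side n times the endoscope diameter, centered at the lens center and lying in the lens cross-section plane (perpendicular to the lens axis), whose tilt relative to the horizontal plane then equals the nutation angle. The interpretation of the inclination angle, the choice of in-plane axes and all names are assumptions; only the n = 1.2 default follows the preferred value above.

```python
import numpy as np

def warning_square(center, psi, theta, diameter, n=1.2):
    """Return the 4 corners (in consecutive order) of the square collision
    early warning section, assumed to lie in the lens cross-section plane."""
    axis = np.array([np.sin(psi) * np.sin(theta),      # lens axis
                     -np.cos(psi) * np.sin(theta),
                     np.cos(theta)])
    u = np.array([np.cos(psi), np.sin(psi), 0.0])      # in-plane direction 1
    v = np.cross(axis, u)                               # in-plane direction 2
    h = 0.5 * n * diameter                              # half side length
    c = np.asarray(center, dtype=float)
    return np.array([c + h * u + h * v, c - h * u + h * v,
                     c - h * u - h * v, c + h * u - h * v])

corners = warning_square([10.0, 5.0, 6.0], np.deg2rad(30.0), np.deg2rad(15.0),
                         diameter=6.0)  # side length 1.2 * 6 (units assumed)
```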
S107, if the collision early warning section and a grid structure corresponding to the current position of the collision early warning section in the three-dimensional grid structure do not meet a preset collision condition, acquiring the current orientation of the collision early warning section;
in the embodiment of the application, the collision early warning tangent plane and the three-dimensional grid structure are positioned under the same coordinate system; the preset collision condition can be that a collision early warning tangent plane and a three-dimensional grid structure have a cross point.
In the embodiment of the application, before the current orientation of the collision early warning section is acquired when the collision early warning section and the grid structure corresponding to its current position in the three-dimensional grid structure do not meet the preset collision condition, the method further includes:
judging whether the collision early warning section and a grid structure corresponding to the current position of the collision early warning section in the three-dimensional grid structure meet a preset collision condition or not;
specifically, whether the collision early warning section and the corresponding grid structure of the current position of the collision early warning section in the three-dimensional grid structure have a cross point or not can be judged;
that is, whether a cross point exists between the section area of the collision early warning section at the current position and the grid structure corresponding to the current position of the collision early warning section in the three-dimensional grid structure can be judged;
if no intersection exists, the collision early warning section and the grid structure corresponding to the current position of the collision early warning section in the three-dimensional grid structure can be judged not to meet the preset collision condition.
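The patent defines the collision condition simply as the existence of an intersection point between the warning section and the local grid structure. Below is a hedged sketch of one way such a test might be implemented: clip each candidate triangle's edges against the plane of the warning square and check whether any crossing point falls inside the square. It is a partial test (the symmetric check of the square's edges against each triangle, and the coplanar case, are omitted for brevity); the candidate triangles would typically come from the grid cells the square overlaps, and all names and tolerances are assumptions.

```python
import numpy as np

def edge_crosses_square(p0, p1, c, u, v, h, eps=1e-9):
    """True if segment p0-p1 crosses the plane of the square (center c,
    in-plane unit axes u and v, half side h) at a point inside the square."""
    normal = np.cross(u, v)
    d0, d1 = np.dot(p0 - c, normal), np.dot(p1 - c, normal)
    if d0 * d1 > 0 or abs(d0 - d1) < eps:          # same side, or parallel/coplanar
        return False
    t = d0 / (d0 - d1)
    hit = p0 + t * (p1 - p0)
    return abs(np.dot(hit - c, u)) <= h and abs(np.dot(hit - c, v)) <= h

def square_hits_mesh(corners, verts, faces, candidate_tris):
    """Partial intersection test between the warning square (4 consecutive
    corners) and the candidate triangles, e.g. those from nearby grid cells."""
    c = corners.mean(axis=0)
    u = corners[0] - corners[1]
    v = corners[0] - corners[3]
    h = 0.5 * np.linalg.norm(u)
    u, v = u / (2.0 * h), v / (2.0 * h)
    for t in candidate_tris:
        a, b, d = verts[faces[t]]
        if (edge_crosses_square(a, b, c, u, v, h) or
                edge_crosses_square(b, d, c, u, v, h) or
                edge_crosses_square(d, a, c, u, v, h)):
            return True
    return False
```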
In the embodiment of the application, under the condition that the collision early warning section and a grid structure corresponding to the current position of the collision early warning section in the three-dimensional grid structure do not meet the preset collision condition, the current orientation of the collision early warning section is obtained;
in the embodiment of the application, the current orientation of the collision early warning tangent plane can be the current orientation of the endoscope lens;
in particular, the current orientation of the lens may be determined from the Euler angle of the endoscope lens.
In this embodiment of the application, obtaining the current orientation of the collision warning tangent plane may include:
acquiring a current coordinate of a lens center of an endoscope lens;
determining an Euler angle of the center of the lens according to the current coordinate of the center of the lens;
and determining the current orientation of the collision early warning tangent plane according to the Euler angle of the center of the lens.
Specifically, in the embodiment of the present application, the orientation of the endoscope lens at the current position may be calculated by a nutation angle in the euler angle, so as to determine the current orientation of the collision warning tangent plane.
In another implementation of the present application, if the collision early warning section and the grid structure corresponding to its current position in the three-dimensional grid structure have an intersection point, that is, if they meet the preset collision condition, the endoscope stops advancing; in particular, the endoscope may exit the three-dimensional grid structure or change its travel route.
S109, determining a target position of a collision early warning section based on a preset advancing rule corresponding to the endoscope lens and the current orientation;
in the embodiment of the present application, the preset advance rule may be to advance according to a preset step length;
the target position can be a virtual position; that is, the position of the collision early warning section is updated according to the preset advancing rule corresponding to the endoscope lens and the current orientation, and collision early warning detection is subsequently performed for the endoscope lens based on the virtual position.
In this embodiment of the present application, the method for determining the target position of the collision warning tangent plane may include:
and updating the position of the collision early warning tangent plane along the current direction according to a preset advancing step length by taking the current position of the collision early warning tangent plane as an advancing starting point to obtain the target position of the collision early warning tangent plane.
Specifically, the collision early warning section is advanced along the current orientation by a preset step length, and its position is updated after each advance, so that an updated position of the collision early warning section, namely a target position, is obtained;
when the collision early warning section is located at a target position, a collision judgment is performed between the collision early warning section and the grid structure corresponding to that position in the three-dimensional grid structure, and the updating of the position ends once the collision early warning section and the corresponding target grid structure in the three-dimensional grid structure meet the preset collision condition. When obtaining target positions of the collision early warning section, the total step length advanced along the current orientation is kept less than or equal to a preset step length threshold; that is, when the total advance along the current orientation reaches the preset step length threshold, the advance is stopped.
Based on the obtained target position of the collision early warning section, it is judged whether, when the collision early warning section is located at the target position, the collision early warning section and the corresponding target grid structure in the three-dimensional grid structure meet the preset collision condition.
S111, if the collision early warning section is located at the target position, the collision early warning section and a target grid structure corresponding to the collision early warning section in the three-dimensional grid structure meet preset collision conditions, acquiring a target collision distance based on the target position, and sending collision distance early warning;
In the embodiment of the application, before the target collision distance is acquired based on the target position and the collision distance early warning is sent when the collision early warning section is located at the target position and the collision early warning section and the corresponding target grid structure in the three-dimensional grid structure meet the preset collision condition, the method further includes:
if the collision early warning section is located at the target position, judging whether the collision early warning section and a target grid structure corresponding to the collision early warning section in the three-dimensional grid structure have intersections;
specifically, it may be determined whether a cross point exists between a section area of the collision early-warning section at the target position and a grid structure corresponding to the target position of the collision early-warning section in the three-dimensional grid structure;
and if so, judging that the collision early warning section and the corresponding target grid structure in the three-dimensional grid structure meet the preset collision condition.
In the embodiment of the application, when the collision early warning section and the corresponding target grid structure in the three-dimensional grid structure meet the preset collision condition, the target collision distance is obtained;
specifically, the target collision distance is determined based on the current position of the collision early warning tangent plane and the target position;
in the embodiment of the present application, acquiring a target collision distance based on a target position includes:
acquiring a distance difference value between the target position of the collision early warning tangent plane and the current position of the collision early warning tangent plane;
the distance difference is taken as the target collision distance.
In the embodiment of the present application, the target collision distance is less than or equal to the preset step threshold.
In the embodiment of the application, after the collision distance early warning is sent out, the path planning can be performed on the endoscope lens by taking the current position of the collision early warning tangent plane as a starting point.
In another embodiment of the present application, if, when the collision early warning section is located at the target position, the collision early warning section and the corresponding target grid structure in the three-dimensional grid structure do not meet the preset collision condition and the target collision distance is smaller than the preset step length threshold, the step of determining a target position and the step of judging whether, at the target position, the collision early warning section and the corresponding target grid structure in the three-dimensional grid structure meet the preset collision condition are repeated
until the collision early warning section and the corresponding target grid structure in the three-dimensional grid structure meet the preset collision condition, and the target collision distance is then acquired; in this embodiment the target collision distance is always less than or equal to the preset step length threshold.
In another embodiment of the application, if, when the collision early warning section is located at the target position, the collision early warning section and the corresponding target grid structure in the three-dimensional grid structure do not meet the preset collision condition and the target collision distance is equal to the preset step length threshold, it is determined that the endoscope lens has no collision risk, no collision early warning needs to be sent, and the endoscope lens is controlled to advance along the current orientation by the distance of the preset step length threshold.
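Putting the preceding steps together, the following is a hedged end-to-end sketch of the early warning loop described in S107 to S111: assuming the section at its current position is already collision-free, a virtual copy of the warning section is advanced along the current orientation in preset steps and tested against the local mesh at each virtual position; the loop either reports the collision distance or, once the total advance reaches the preset step length threshold, reports that there is no collision risk. Function names, defaults, and the reuse of square_hits_mesh from the earlier sketch are assumptions, not the patent's reference implementation; a fuller version would also re-query the uniform grid for candidate triangles at every virtual position.

```python
import numpy as np

def collision_early_warning(corners, axis, step, max_advance,
                            verts, faces, candidate_tris):
    """Advance the warning section along `axis` in increments of `step`,
    capping the total advance at `max_advance` (the preset step length
    threshold).  Returns (collision_expected, distance)."""
    advanced = 0.0
    virtual = np.array(corners, dtype=float)       # virtual (target) position
    while advanced < max_advance:
        virtual = virtual + step * axis            # move to the next target position
        advanced += step
        # square_hits_mesh is the partial intersection test sketched earlier;
        # for brevity the candidate set is fixed instead of re-queried per step.
        if square_hits_mesh(virtual, verts, faces, candidate_tris):
            return True, advanced                  # target collision distance
    return False, max_advance                       # no collision risk within threshold

# Example use (names from the earlier sketches are assumed):
# hit, dist = collision_early_warning(corners, lens_axis, step=1.0, max_advance=10.0,
#                                     verts=verts, faces=faces,
#                                     candidate_tris=range(len(faces)))
# if hit:
#     print(f"collision expected after {dist:.1f} units along the current direction")
```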
In a preferred embodiment of the present application, fig. 3 is a schematic view illustrating an exemplary forward progress of the collision warning region of the endoscope lens during collision detection;
in the schematic diagram, 1 represents a collision early warning section, 2 represents a pose sensor, 3 represents a three-dimensional grid structure, 4 represents an endoscope, and a white arrow represents the direction of motion of the collision early warning section along the current direction;
the collision early warning section is of a quadrilateral structure at the position A and the position B, and is arranged at the center of the lens
Specifically, as shown in fig. 3, when the collision early-warning section advances from the current position a to the target position B, the collision early-warning section and the target grid structure corresponding to the collision early-warning section in the three-dimensional grid structure meet the preset collision condition.
As can be seen from the embodiments of the endoscope collision detection method, apparatus, device and storage medium provided by the present application, the embodiments of the present application acquire three-dimensional data of a region to be detected; perform grid structure processing on the three-dimensional data based on a preset grid planning rule to obtain a corresponding three-dimensional grid structure; acquire a collision early warning section corresponding to the endoscope lens; acquire the current orientation of the collision early warning section if the collision early warning section and the grid structure corresponding to its current position in the three-dimensional grid structure do not meet the preset collision condition; determine a target position of the collision early warning section based on a preset advancing rule corresponding to the endoscope lens and the current orientation; and, if the collision early warning section located at the target position and the corresponding target grid structure in the three-dimensional grid structure meet the preset collision condition, acquire a target collision distance based on the target position and send a collision distance early warning. By using the technical solution provided by the embodiments of this specification, collision detection and collision early warning can be performed rapidly and accurately while the endoscope is in use, so that an impassable route can be avoided in advance and a doctor can conveniently avoid surgical risks in advance according to the collision early warning result.
An endoscope collision detection device is further provided in an embodiment of the present application, as shown in fig. 4, which is a schematic structural diagram of the endoscope collision detection device provided in the embodiment of the present application; specifically, the device comprises:
a first obtaining module 410, configured to obtain three-dimensional data of a region to be detected;
the grid planning module 420 is configured to perform grid structure processing on the three-dimensional data based on a preset grid planning rule to obtain a corresponding three-dimensional grid structure;
the second obtaining module 430 is configured to obtain a collision early warning section corresponding to the endoscope lens;
a third obtaining module 440, configured to obtain a current orientation of the collision early warning section if the collision early warning section and a grid structure corresponding to the current position of the collision early warning section in the three-dimensional grid structure do not meet a preset collision condition;
the determining module 450 is configured to determine a target position of the collision early warning section based on a preset advance rule corresponding to the endoscope lens and the current orientation;
and a collision early warning module 460, configured to acquire a target collision distance based on the target position and send a collision distance early warning if, when the collision early warning tangent plane is located at the target position, the collision early warning tangent plane and the corresponding target grid structure in the three-dimensional grid structure meet the preset collision condition.
In this embodiment of the present application, the second obtaining module 430 includes:
a first acquisition unit for acquiring a current coordinate of a lens center of an endoscope lens;
the first determining unit is used for determining the Euler angle of the lens center according to the current coordinate;
and the second determining unit is used for determining a corresponding collision early warning tangent plane based on the current coordinate and the Euler angle.
In an embodiment of the present application, the second determination unit includes:
the first determining subunit is used for determining a corresponding quadrilateral area by taking the current coordinate of the center of the lens as the center, taking n times of the diameter of the endoscope as the side length and taking a nutation angle in an Euler angle as an inclination angle, wherein n is more than 1 and is a natural number;
and the processing subunit is used for taking the quadrilateral area as a collision early warning tangent plane.
In the embodiment of the present application, the method further includes:
the first judgment module is used for judging whether the collision early warning section and a grid structure corresponding to the current position of the collision early warning section in the three-dimensional grid structure have a cross point or not;
and the first judging module is used for judging that the collision early warning section and the grid structure corresponding to the current position of the collision early warning section in the three-dimensional grid structure do not meet the preset collision condition if no intersection point exists between the collision early warning section and the grid structure corresponding to the current position of the collision early warning section in the three-dimensional grid structure.
In this embodiment of the present application, the third obtaining module 440 includes:
a second acquisition unit that acquires a current coordinate of a lens center of the endoscope lens;
a third determining unit, configured to determine an euler angle of the lens center according to the current coordinate of the lens center;
and the fourth determining unit is used for determining the current orientation of the collision early warning tangent plane according to the Euler angle of the lens center.
In the embodiment of the present application, the determining module 450 includes:
and the fifth determining unit is used for updating the position of the collision early warning tangent plane along the current direction according to a preset advancing step length by taking the current position of the collision early warning tangent plane as an advancing starting point to obtain the target position of the collision early warning tangent plane.
In the embodiment of the present application, the method further includes:
the second judgment module is used for judging whether the collision early warning section and a corresponding target grid structure in the three-dimensional grid structure have a cross point or not if the collision early warning section is located at the target position;
and the second judgment module is used for judging that the collision early warning section and the corresponding target grid structure in the three-dimensional grid structure meet the preset collision condition if the collision early warning section and the corresponding target grid structure in the three-dimensional grid structure have a cross point.
In the embodiment of the present application, the collision warning module 460 includes:
the third acquisition unit is used for acquiring a distance difference value between the target position of the collision early warning tangent plane and the current position of the collision early warning tangent plane;
and the processing unit is used for taking the distance difference value as the target collision distance.
In the embodiment of the present application, the method further includes:
a fourth acquisition module for acquiring a medical image;
the three-dimensional reconstruction module is used for performing three-dimensional reconstruction based on the medical image to obtain a three-dimensional structure chart corresponding to the medical image;
and the image segmentation module is used for segmenting the three-dimensional data of the region to be detected from the three-dimensional structure chart.
In the embodiment of the present application, the method further includes:
and the path planning module is used for planning a path of the endoscope lens by taking the current position of the collision early warning tangent plane as a starting point.
The embodiment of the application provides an endoscope collision detection device, which comprises a processor and a memory, wherein at least one instruction or at least one program is stored in the memory, and the at least one instruction or the at least one program is loaded by the processor and executed to realize the endoscope collision detection method according to the embodiment of the method.
The memory may be used to store software programs and modules, and the processor executes various functional applications and data processing by running the software programs and modules stored in the memory. The memory can mainly comprise a program storage area and a data storage area, wherein the program storage area can store an operating system, application programs needed by functions and the like, and the data storage area can store data created according to the use of the device and the like. Further, the memory may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. Accordingly, the memory may also include a memory controller to provide the processor with access to the memory.
Fig. 5 is a schematic structural diagram of an endoscope collision detection device provided in an embodiment of the present application, and internal configurations of the endoscope collision detection device may include, but are not limited to: a processor, a network interface and a memory, wherein the processor, the network interface and the memory in the endoscope collision detection device can be connected through a bus or other means, and the bus connection is taken as an example in fig. 5 shown in the embodiment of the present specification.
The processor (or CPU) is a computing core and a control core of the endoscope collision detection device. The network interface may optionally include a standard wired interface, a wireless interface (e.g., WI-FI, mobile communication interface, etc.). The Memory (Memory) is a Memory device in the endoscope collision detection apparatus for storing programs and data. It is understood that the memory herein may be a high-speed RAM storage device, or may be a non-volatile storage device (non-volatile memory), such as at least one magnetic disk storage device; optionally, at least one memory device located remotely from the processor. The memory provides storage space that stores the operating system of the endoscope collision detection device, which may include, but is not limited to: windows system (an operating system), Linux (an operating system), etc., which are not limited in this application; also, one or more instructions, which may be one or more computer programs (including program code), are stored in the memory space and are adapted to be loaded and executed by the processor. In the embodiment of the present application, the processor loads and executes one or more instructions stored in the memory to implement the endoscope collision detection method provided by the above method embodiment.
Embodiments of the present application also provide a computer-readable storage medium that may be disposed in an endoscope collision detection device to store at least one instruction, at least one program, a set of codes, or a set of instructions related to implementing an endoscope collision detection method in method embodiments, which may be loaded and executed by a processor of an electronic device to implement the endoscope collision detection method provided by the above-described method embodiments.
Optionally, in this embodiment, the storage medium may include, but is not limited to: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
It should be noted that: the sequence of the embodiments of the present application is only for description, and does not represent the advantages and disadvantages of the embodiments. And specific embodiments thereof have been described above. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
According to an aspect of the application, a computer program product or computer program is provided, comprising computer instructions, the computer instructions being stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the method provided in the various alternative implementations described above.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, as for the device and server embodiments, since they are substantially similar to the method embodiments, the description is simple, and the relevant points can be referred to the partial description of the method embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above disclosure is only a preferred embodiment of the present application and certainly does not limit the scope of the present application; the application is therefore intended to cover all modifications and equivalents falling within the scope of the claims.

Claims (13)

1. A method of endoscope collision detection, the method comprising:
acquiring three-dimensional data of a region to be detected;
based on a preset grid planning rule, carrying out grid structure processing on the three-dimensional data to obtain a corresponding three-dimensional grid structure;
acquiring a collision early warning section corresponding to the endoscope lens;
if the collision early warning section and a grid structure corresponding to the current position of the collision early warning section in the three-dimensional grid structure do not meet a preset collision condition, acquiring the current orientation of the collision early warning section;
determining a target position of the collision early warning section based on a preset advancing rule corresponding to the endoscope lens and the current orientation;
and if the collision early warning section is located at the target position, and the collision early warning section and the corresponding target grid structure in the three-dimensional grid structure meet preset collision conditions, acquiring a target collision distance based on the target position, and sending collision distance early warning.
2. The endoscope collision detection method according to claim 1, wherein the acquiring a collision warning section corresponding to the endoscope lens comprises:
acquiring the current coordinate of the lens center of the endoscope lens;
determining an Euler angle of the lens center according to the current coordinate;
and determining a corresponding collision early warning tangent plane based on the current coordinate and the Euler angle.
3. The endoscope collision detection method according to claim 2, wherein the determining a corresponding collision warning tangent plane based on the current coordinates and the euler angle comprises:
determining a corresponding quadrilateral area by taking the current coordinate of the center of the lens as the center, n times of the diameter of the endoscope as the side length and a nutation angle in the Euler angle as an inclination angle, wherein n is more than 1 and is a natural number;
and taking the quadrilateral area as the collision early warning tangent plane.
4. The endoscope collision detection method according to claim 1, wherein before the acquiring the current orientation of the collision early warning section if the collision early warning section and the grid structure corresponding to its current position in the three-dimensional grid structure do not satisfy the preset collision condition, the method further comprises:
judging whether the collision early warning section and the current position of the collision early warning section have a cross point in a corresponding grid structure in the three-dimensional grid structure;
if not, judging that the collision early warning section and a grid structure corresponding to the current position of the collision early warning section in the three-dimensional grid structure do not meet the preset collision condition.
5. The endoscope collision detection method according to claim 1, wherein the acquiring the current orientation of the collision warning section comprises:
acquiring the current coordinate of the lens center of the endoscope lens;
determining an Euler angle of the lens center according to the current coordinate of the lens center;
and determining the current orientation of the collision early warning tangent plane according to the Euler angle of the center of the lens.
6. The endoscope collision detection method according to claim 1, wherein the determining the target position of the collision warning tangent plane based on the preset advance rule corresponding to the endoscope lens and the current orientation comprises:
and updating the position of the collision early warning tangent plane along the current direction according to a preset advancing step length by taking the current position of the collision early warning tangent plane as an advancing starting point to obtain the target position of the collision early warning tangent plane.
7. The endoscope collision detection method according to claim 1, wherein before the acquiring a target collision distance based on the target position and issuing a collision distance warning if, when the collision warning section is located at the target position, the collision warning section and the corresponding target grid structure in the three-dimensional grid structure meet the preset collision condition, the method further comprises:
if the collision early warning section is located at the target position, judging whether a cross point exists between the collision early warning section and a corresponding target grid structure in the three-dimensional grid structure;
and if so, judging that the collision early warning section and the corresponding target grid structure in the three-dimensional grid structure meet a preset collision condition.
8. The endoscope collision detection method according to claim 1, wherein the acquiring the target collision distance based on the target position comprises:
acquiring the distance difference between the target position of the collision early warning section and the current position of the collision early warning section;
and taking the distance difference as the target collision distance.
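Illustrative note (not part of the claims): under the step-wise advance assumed above, the distance difference is simply the Euclidean distance between the two section positions (equivalently, the number of steps taken times the step length).

import numpy as np

def target_collision_distance(target_position, current_position):
    # Distance difference between the target and current positions of the section.
    return float(np.linalg.norm(np.asarray(target_position, dtype=float)
                                - np.asarray(current_position, dtype=float)))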
9. The endoscope collision detection method according to claim 1, wherein before the acquiring three-dimensional data of the region to be detected, the method further comprises:
acquiring a medical image;
performing three-dimensional reconstruction based on the medical image to obtain a three-dimensional structure diagram corresponding to the medical image;
and segmenting the three-dimensional data of the region to be detected from the three-dimensional structure diagram.
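Illustrative note (not part of the claims): the claim does not tie the reconstruction and segmentation steps to any particular algorithm. As one common possibility, a surface mesh of the region to be detected could be extracted from a segmented binary mask with marching cubes, e.g. via scikit-image; the mask itself is assumed here to come from an upstream segmentation step.

import numpy as np
from skimage import measure

def mesh_from_segmentation(roi_mask, spacing=(1.0, 1.0, 1.0)):
    # roi_mask: boolean 3-D array marking the region to be detected in the medical volume.
    # Returns the vertices and triangular faces of the reconstructed surface.
    verts, faces, _normals, _values = measure.marching_cubes(
        roi_mask.astype(np.float32), level=0.5, spacing=spacing)
    return verts, faces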
10. The endoscope collision detection method according to claim 1, wherein the acquiring the target collision distance based on the target position and issuing the collision distance early warning further comprises:
performing path planning for the endoscope lens by taking the current position of the collision early warning section as a starting point.
11. An endoscope collision detection device, characterized in that the device comprises:
a first acquisition module, used for acquiring three-dimensional data of a region to be detected;
a grid planning module, used for carrying out grid structure processing on the three-dimensional data based on a preset grid planning rule to obtain a corresponding three-dimensional grid structure;
a second acquisition module, used for acquiring a collision early warning section corresponding to the endoscope lens;
a third acquisition module, used for acquiring the current orientation of the collision early warning section if the collision early warning section and the grid structure corresponding to its current position in the three-dimensional grid structure do not meet the preset collision condition;
a determining module, used for determining the target position of the collision early warning section based on a preset advancing rule corresponding to the endoscope lens and the current orientation;
and a collision early warning module, used for acquiring a target collision distance based on the target position and issuing a collision distance early warning if the collision early warning section is located at the target position and the collision early warning section and the corresponding target grid structure in the three-dimensional grid structure meet the preset collision condition.
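Illustrative note (not part of the claims): a rough driver combining the sketches above, showing how such modules could cooperate to report a collision distance early warning. The stopping limit max_steps and all function names are assumptions for this example only.

import numpy as np

def collision_warning_distance(roi_mask, lens_center, euler_zxz, n, diameter,
                               step_length, max_steps=200):
    # Build the mesh, place the warning section, then advance it along its orientation
    # until it intersects the mesh; return the collision distance, or None if no
    # intersection is found within max_steps.
    verts, faces = mesh_from_segmentation(roi_mask)
    triangles = verts[faces]                          # (m, 3, 3) triangle vertex array
    corners = warning_section_corners(lens_center, euler_zxz, n, diameter)
    if section_hits_mesh(corners, triangles):
        return 0.0                                    # already in contact at the current position
    direction = section_orientation(euler_zxz)
    position = np.asarray(lens_center, dtype=float)
    for k in range(1, max_steps + 1):
        position = advance_section(position, direction, step_length)
        if section_hits_mesh(corners + k * step_length * direction, triangles):
            return target_collision_distance(position, lens_center)
    return None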
12. An endoscope collision detection apparatus, characterized in that the apparatus comprises a processor and a memory, in which at least one instruction or at least one program is stored, which is loaded and executed by the processor to implement the endoscope collision detection method according to any one of claims 1 to 10.
13. A computer-readable storage medium, characterized in that at least one instruction or at least one program is stored in the storage medium, and the at least one instruction or the at least one program is loaded and executed by a processor to implement the endoscope collision detection method according to any one of claims 1 to 10.
CN202111080471.4A 2021-09-15 2021-09-15 Endoscope collision detection method, device, equipment and storage medium Pending CN114022548A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111080471.4A CN114022548A (en) 2021-09-15 2021-09-15 Endoscope collision detection method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114022548A (en) 2022-02-08

Family

ID=80054163

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111080471.4A Pending CN114022548A (en) 2021-09-15 2021-09-15 Endoscope collision detection method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114022548A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023160723A1 (en) * 2022-02-28 2023-08-31 上海安翰医疗技术有限公司 Tof camera module-based anti-collision system and method for capsule endoscope control device
CN115859411A (en) * 2022-12-09 2023-03-28 腾讯科技(深圳)有限公司 Volume rendering collision detection method, device, equipment and storage medium
CN117074708A (en) * 2023-10-12 2023-11-17 深圳市帝迈生物技术有限公司 Sample detection device and control method thereof
CN117074708B (en) * 2023-10-12 2024-03-22 深圳市帝迈生物技术有限公司 Sample detection device and control method thereof

Similar Documents

Publication Publication Date Title
CN114022548A (en) Endoscope collision detection method, device, equipment and storage medium
CN106659373B (en) Dynamic 3D lung map view for tool navigation inside the lung
US8116847B2 (en) System and method for determining an optimal surgical trajectory
US7809176B2 (en) Device and method for automated planning of an access path for a percutaneous, minimally invasive intervention
CN114022547B (en) Endoscopic image detection method, device, equipment and storage medium
EP2901934B1 (en) Method and device for generating virtual endoscope image, and program
EP4177664A1 (en) Program, information processing method, and endoscope system
CN117372661B (en) Surgical navigation system, surgical robot system and registration method
CN115944388B (en) Surgical endoscope position guiding method, device, computer equipment and storage medium
WO2020183936A1 (en) Inspection device, inspection method, and storage medium
CN112545647A (en) Operation support device and operation navigation system
US20220249174A1 (en) Surgical navigation system, information processing device and information processing method
EP3165192B1 (en) Updating a volumetric map
JP2007236629A (en) Medical image processor and medical image processing method
JP2009273644A (en) Medical imaging apparatus, medical image processing device, and medical image processing program
CN114022538A (en) Route planning method and device for endoscope, terminal and storage medium
JP7172086B2 (en) Surgery simulation device and surgery simulation program
CN113317874A (en) Medical image processing device and medium
CN114027974B (en) Endoscope path planning method, device and terminal for multiple lesion sites
WO2024018713A1 (en) Image processing device, display device, endoscope device, image processing method, image processing program, trained model, trained model generation method, and trained model generation program
JP7495216B2 (en) Endoscopic surgery support device, endoscopic surgery support method, and program
WO2021199294A1 (en) Information processing device, display method, and non-transitory computer-readable medium having program stored therein
US20240090741A1 (en) Endoscopic examination support apparatus, endoscopic examination support method, and recording medium
WO2024029502A1 (en) Endoscopic examination assistance device, endoscopic examination assistance method, and recording medium
US20230210627A1 (en) Three-dimensional instrument pose estimation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination