CN116433874A - Bronchoscope navigation method, device, equipment and storage medium

Info

Publication number
CN116433874A
Authority
CN
China
Prior art keywords
bifurcation
virtual
real
bronchoscope
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111679733.9A
Other languages
Chinese (zh)
Inventor
陈日清
刘润南
徐宏
余坤璋
张欢
李楠宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Kunbo Biotechnology Co Ltd
Original Assignee
Hangzhou Kunbo Biotechnology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Kunbo Biotechnology Co Ltd filed Critical Hangzhou Kunbo Biotechnology Co Ltd
Priority to CN202111679733.9A priority Critical patent/CN116433874A/en
Priority to PCT/CN2022/138717 priority patent/WO2023124978A1/en
Publication of CN116433874A publication Critical patent/CN116433874A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/003 Navigation within 3D models or images
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/267 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for the respiratory tract, e.g. laryngoscopes, bronchoscopes
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G06T7/0014 Biomedical image inspection using an image reference approach
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G06T7/344 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving models
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74 Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G06T2207/10021 Stereoscopic video; Stereoscopic image sequence
    • G06T2207/10072 Tomographic images
    • G06T2207/10081 Computed x-ray tomography [CT]
    • G06T2210/00 Indexing scheme for image generation or computer graphics
    • G06T2210/41 Medical

Abstract

The application belongs to the medical field and provides a bronchoscope navigation method, device, equipment and storage medium. The method comprises the following steps: monitoring the similarity between a virtual bronchoscope image corresponding to a virtual bronchoscope and a real bronchoscope image acquired by a real bronchoscope; acquiring the real bifurcation image, acquired by the real bronchoscope, corresponding to any bifurcation; matching the real bifurcation image with a virtual image set to obtain a matching result; and determining, according to the matching result, a virtual bifurcation partial image matched with the real bifurcation image, and determining the real pose of the real bronchoscope in the real bronchial tree according to the determined virtual bifurcation partial image. In this way, when a large error arises between the pose of the virtual bronchoscope and the pose of the real bronchoscope, the virtual bronchoscope image can be rapidly and effectively re-synchronized with the real bronchoscope image.

Description

Bronchoscope navigation method, device, equipment and storage medium
Technical Field
The application belongs to the medical field, and particularly relates to a bronchoscope navigation method, device, equipment and storage medium.
Background
A bronchoscope is a medical instrument that can be introduced into the patient's lower respiratory tract through the mouth or nose and is used for observation, biopsy sampling, and bacteriological and cytological examination of lesions of the lobar, segmental and subsegmental bronchi; used together with a display system, it can perform optical navigation, photography, teaching and dynamic recording.
Before optical navigation with a bronchoscope, offline calculation may be performed in advance to reconstruct a virtual three-dimensional bronchial tree from CT data. When real-time navigation starts, the real bronchoscope pose and the virtual bronchoscope pose are adjusted to be as close to identical as possible. During real-time tracking, the real bronchoscope image display window shows the intra-airway picture acquired by the bronchoscope in real time, while the virtual bronchoscope image display window shows, in real time, a virtual bronchoscope image rendered from the virtual bronchial tree by a renderer that is as similar as possible to the currently displayed real bronchoscope image. However, when the real bronchoscope moves too fast, contacts the airway wall, or for other reasons captures few features in the real bronchoscope image, a large error may arise between the real-time tracked pose of the virtual bronchoscope and the pose of the real bronchoscope, and the virtual bronchoscope image can no longer be synchronized with the real bronchoscope image.
Disclosure of Invention
In view of this, the embodiments of the present application provide a bronchoscope navigation method, device, equipment and storage medium, so as to solve the prior-art problem that, when the real bronchoscope moves too fast, contacts the airway wall or for other reasons captures few features in the real bronchoscope image, a large error arises between the real-time tracked pose of the virtual bronchoscope and the pose of the real bronchoscope, and the virtual bronchoscope image cannot be synchronized with the real bronchoscope image.
A first aspect of an embodiment of the present application provides a bronchoscope navigation method, the method including:
monitoring the similarity between a virtual bronchoscope image corresponding to the virtual bronchoscope and a real bronchoscope image acquired by a real bronchoscope;
when the similarity is smaller than a preset similarity threshold and the real bronchoscope is moved to any bifurcation of the real bronchial tree, acquiring the real bifurcation image, acquired by the real bronchoscope, corresponding to that bifurcation;
matching the real bifurcation images with a virtual image set to obtain a matching result, wherein the virtual image set comprises a plurality of virtual bifurcation partial images;
and determining a virtual bifurcation partial image matched with the real bifurcation image according to the matching result, and determining the real pose of the real bronchoscope in the real bronchial tree according to the determined virtual bifurcation partial image, wherein the real pose is used for registering the virtual bronchoscope with the real bronchoscope.
With reference to the first aspect, in a first possible implementation manner of the first aspect, matching the real bifurcation image with the virtual image set to obtain a matching result includes:
determining a matching range of the virtual image set corresponding to any bifurcation;
and matching the real bifurcation image with the virtual bifurcation partial image in the matching range to obtain a matching result.
With reference to the first possible implementation manner of the first aspect, in a second possible implementation manner of the first aspect, determining the matching range of the virtual image set corresponding to the bifurcation includes:
determining the current branch of the virtual bronchial tree at which the virtual bronchoscope was located when the virtual bronchoscope image was acquired;
searching a sub-branch, a parent branch and an adjacent branch which are adjacent to the current branch in the virtual bronchial tree;
determining bifurcation ports corresponding to the current branch, the searched sub-branch, the adjacent branch and the parent branch;
and determining the virtual bifurcation partial image in the matching range as the virtual bifurcation partial image associated with the determined bifurcation in the virtual image set.
With reference to the first possible implementation manner of the first aspect, in a third possible implementation manner of the first aspect, matching the real bifurcation image with the virtual bifurcation partial image in the matching range includes:
determining the moving speed of the real bronchoscope and/or the number of features in the real bronchoscope image, and determining, accordingly, the order in which the virtual bifurcation partial images in the matching range are matched with the real bifurcation image.
With reference to the first possible implementation manner of the first aspect, in a fourth possible implementation manner of the first aspect, matching the real bifurcation image with the virtual bifurcation partial image in the matching range includes:
determining, in the virtual bifurcation partial image set corresponding to each bifurcation in the matching range, a first virtual bifurcation partial image having the highest similarity with the real bifurcation image;
and sorting the first virtual bifurcation partial images corresponding to the respective bifurcations by similarity, and selecting a preset number of the first virtual bifurcation partial images.
With reference to the fourth possible implementation manner of the first aspect, in a fifth possible implementation manner of the first aspect, determining, in the virtual bifurcation partial image set corresponding to each bifurcation in the matching range, the first virtual bifurcation partial image having the highest similarity with the real bifurcation image includes:
respectively determining a plurality of initial poses corresponding to any bifurcation in the matching range, and rendering according to the initial poses to obtain the virtual bifurcation partial image set corresponding to that bifurcation;
adjusting the pose of any virtual bifurcation partial image in the set according to the similarity between that virtual bifurcation partial image and the real bifurcation image;
regenerating a virtual bifurcation partial image according to the adjusted pose, calculating the similarity between the regenerated virtual bifurcation partial image and the real bifurcation image, and continuing to adjust the pose of the virtual bifurcation partial image according to the similarity until a preset iteration requirement is met, so as to obtain the adjusted pose corresponding to each initial pose of the bifurcation;
and selecting, according to the similarity between the virtual bifurcation partial images corresponding to the plurality of adjusted poses of the bifurcation and the real bifurcation image, the pose with the highest similarity so as to determine the first virtual bifurcation partial image.
With reference to the fourth possible implementation manner of the first aspect, in a sixth possible implementation manner of the first aspect, determining, in the virtual bifurcation partial image set corresponding to each bifurcation in the matching range, the first virtual bifurcation partial image having the highest similarity with the real bifurcation image includes:
respectively determining a plurality of initial poses corresponding to any bifurcation in the matching range, and obtaining the virtual bifurcation partial image set corresponding to that bifurcation according to the initial poses;
and registering the real bifurcation image with the virtual bifurcation partial images contained in the virtual bifurcation partial image set, and taking the virtual bifurcation partial image with the highest similarity to the real bifurcation image as the first virtual bifurcation partial image.
With reference to the first aspect, in a seventh possible implementation manner of the first aspect, before navigating, the method further includes:
an offline calculation process: reconstructing a virtual bronchial tree, determining the airway branch centerlines of the virtual bronchial tree, and determining the virtual image set corresponding to the virtual bronchial tree;
a navigation registration process: when navigation starts, adjusting the pose of the real bronchoscope and/or the pose of the virtual bronchoscope so that the pose of the real bronchoscope is identical to the pose of the virtual bronchoscope.
A second aspect of embodiments of the present application provides a bronchoscope navigation system, the system comprising:
the tracking preparation module is used for carrying out three-dimensional reconstruction according to CT data to obtain a virtual bronchial tree, determining a virtual image set corresponding to bifurcation ports in the bronchial tree, and enabling the real bronchoscope pose to be identical to the virtual bronchoscope pose by adjusting the real bronchoscope and/or the virtual bronchoscope pose when navigation starts, wherein the virtual image set comprises a plurality of virtual bifurcation port partial images;
the real-time tracking module is used for acquiring the real bronchoscope image acquired by the real bronchoscope and the virtual bronchoscope image corresponding to the virtual bronchoscope in real time in the navigation process, and monitoring the similarity between the acquired real bronchoscope image and the virtual bronchoscope image corresponding to the virtual bronchoscope;
and the missing positioning module is used for: when the similarity monitored by the real-time tracking module is smaller than a preset similarity threshold and the real bronchoscope is moved to any bifurcation of the real bronchial tree, searching for a matched virtual bifurcation partial image by means of the real bifurcation image acquired by the real bronchoscope, determining the real pose of the real bronchoscope in the real bronchial tree according to the found virtual bifurcation partial image, and registering the virtual bronchoscope with the real bronchoscope according to the real pose.
A third aspect of embodiments of the present application provides a bronchoscope navigation device, said device comprising:
the image monitoring unit is used for monitoring the similarity between the virtual bronchoscope image corresponding to the virtual bronchoscope and the real bronchoscope image acquired by the real bronchoscope;
a real bifurcation image acquisition unit, configured to acquire the real bifurcation image, acquired by the real bronchoscope, corresponding to any bifurcation when the similarity is smaller than a predetermined similarity threshold and the real bronchoscope is moved to any bifurcation of the real bronchial tree;
the image matching unit is used for matching the real bifurcation images with a virtual image set to obtain a matching result, wherein the virtual image set comprises a plurality of virtual bifurcation partial images;
and the real pose determining unit is used for determining a virtual bifurcation partial image matched with the real bifurcation image according to the matching result, determining the real pose of the real bronchoscope in the real bronchial tree according to the determined virtual bifurcation partial image, and registering the virtual bronchoscope with the real bronchoscope according to the real pose.
A fourth aspect of the embodiments of the present application provides a bronchoscope navigation device comprising a memory, a processor and a computer program stored in said memory and executable on said processor, said processor implementing the steps of the method according to any one of the first aspects when said computer program is executed.
A fifth aspect of the embodiments of the present application provides a computer readable storage medium storing a computer program which, when executed by a processor, implements the steps of the method according to any of the first aspects.
Compared with the prior art, the embodiments of the present application have the following beneficial effects: the similarity between the virtual bronchoscope image corresponding to the virtual bronchoscope and the real bronchoscope image acquired by the real bronchoscope is monitored; when the similarity is smaller than a preset similarity threshold and the real bronchoscope has been moved to a bifurcation of the real bronchial tree, the real bifurcation image of that bifurcation acquired by the real bronchoscope is acquired, the real bifurcation image is matched with a virtual image set, and the real pose of the real bronchoscope is determined according to the matching result and used to register the real bronchoscope with the virtual bronchoscope. Thus, when a large error arises between the pose of the virtual bronchoscope and the pose of the real bronchoscope, synchronization of the virtual bronchoscope image and the real bronchoscope image can be achieved rapidly and effectively on the basis of a real bifurcation image containing more features.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required for the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of a virtual bronchoscope tracking failure provided in an embodiment of the present application;
fig. 2 is a schematic implementation flow chart of a bronchoscope navigation method according to an embodiment of the present application;
FIG. 3 is a schematic representation of a reconstructed three-dimensional virtual bronchial tree provided by embodiments of the present application;
FIG. 4 is a schematic representation of a virtual bronchial tree defining centerlines and bifurcation orifices provided in accordance with embodiments of the present application;
FIG. 5 is a schematic view of a real bronchoscope image and a virtual bronchoscope image in an initial state according to an embodiment of the present application;
FIG. 6 is a schematic illustration of a determined bifurcation range provided by an embodiment of the present application;
FIG. 7 is a schematic view of a virtual bifurcation partial image operator interface according to an embodiment of the present application;
Fig. 8 is a schematic implementation flow diagram of a method for matching a partial image of a virtual bifurcation with a real bronchoscope image according to an embodiment of the present application;
FIG. 9 is a schematic diagram of a bronchoscope navigation device provided in an embodiment of the present application;
fig. 10 is a schematic diagram of a bronchoscope navigation device provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system configurations, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to illustrate the technical solutions described in the present application, the following description is made by specific examples.
When a bronchoscope is used for optical navigation, if the real bronchoscope moves too fast or the number of features (such as airways, airway openings, bifurcations, folds and the like) in the real bronchoscope image is small, an obvious deviation arises between the pose of the real bronchoscope and the pose of the virtual bronchoscope, and the real bronchoscope image acquired by the real bronchoscope differs greatly from the virtual bronchoscope image displayed for the virtual bronchoscope. At this point virtual bronchoscope tracking fails, which is unfavorable for accurately guiding the operator to complete the relevant bronchoscope navigation operations. Fig. 1 is a schematic diagram of a virtual bronchoscope tracking failure, in which the left image is the real bronchoscope image and the right image is the virtual bronchoscope image. Because the bronchoscope moved too fast, the real and virtual bronchoscope images differ significantly: in the left image the real bronchoscope is in the middle of an airway, while in the right image the virtual bronchoscope is at a bifurcation of the bronchial airway.
In order to overcome the above problems, an embodiment of the present application proposes a bronchoscope navigation method, as shown in fig. 2, including:
In S201, the similarity between the virtual bronchoscope image corresponding to the virtual bronchoscope and the real bronchoscope image acquired by the real bronchoscope is monitored.
According to the bronchoscope navigation method of the present application, the operator acquires real bronchoscope images within the real bronchial tree using the real bronchoscope; combined with the virtual bronchoscope image corresponding to the pose of the virtual bronchoscope in the virtual bronchial tree, the pose of the real bronchoscope and the pose of the virtual bronchoscope are registered by calculating the similarity between the real bronchoscope image and the virtual bronchoscope image, so that the operator can see the pose of the virtual bronchoscope in the virtual bronchial tree, which makes it convenient to complete the relevant operations with the real bronchoscope.
Before navigating with the bronchoscope, an offline calculation process and a navigation registration process can be completed in advance to prepare the data and the initial pose for bronchoscope navigation. The offline calculation process obtains the offline data required by the subsequent steps by reading the CT data. Before navigation starts, the navigation registration process can be carried out first so that the initial pose of the virtual bronchoscope is identical to the initial pose of the real bronchoscope. Each of these is described in detail below.
The offline calculation process may include three-dimensional reconstruction of the virtual bronchial tree, determination of the airway branch centerlines in the virtual bronchial tree, and construction of the virtual image sets corresponding to the bifurcations.
The three-dimensional reconstruction of the virtual bronchial tree may include extracting the main bronchi by an adaptive three-dimensional spatial region-growing method, extracting the remaining bronchi by an image feature extraction method, and stitching the main bronchi to the remaining bronchi by a fuzzy connectivity method, so as to obtain the three-dimensional virtual bronchial tree shown in Fig. 3.
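A minimal sketch of the adaptive region-growing step is given below, assuming the CT volume is available as a NumPy array in Hounsfield units and that the seed voxel lies inside the trachea; the threshold-relaxation rule and the leakage test are illustrative assumptions rather than the exact procedure of this application.

```python
import numpy as np
from scipy import ndimage

def grow_airway(ct_hu: np.ndarray, seed: tuple, hu_start: float = -950.0,
                hu_step: float = 10.0, max_voxels: int = 2_000_000) -> np.ndarray:
    """Adaptive 3D region growing from a seed voxel inside the trachea (illustrative)."""
    structure = ndimage.generate_binary_structure(3, 1)    # 6-connectivity
    prev_mask = np.zeros(ct_hu.shape, dtype=bool)
    prev_size = 1
    threshold = hu_start
    while True:
        candidate = ct_hu < threshold                      # air-like voxels
        labels, _ = ndimage.label(candidate, structure=structure)
        mask = labels == labels[seed]                      # connected component containing the seed
        size = int(mask.sum())
        # Stop when the region explodes (leakage into the parenchyma) or gets too large.
        if size > max_voxels or (prev_size > 1000 and size > 3 * prev_size):
            break
        prev_mask, prev_size = mask, size
        threshold += hu_step                               # adaptively relax the air threshold
    return prev_mask
```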
When determining the airway branch centerlines in the virtual bronchial tree, the position of the centerline of each airway can be determined from the spatial positions of the airways of the reconstructed three-dimensional virtual bronchial tree, and the airway bifurcations can be determined from the intersection points of the airway centerlines. A schematic representation of the virtual bronchial tree with the centerlines 41 and bifurcations 42 so determined is shown in Fig. 4.
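One possible realization of this step skeletonizes the reconstructed airway mask and treats every skeleton voxel with three or more skeleton neighbours as a bifurcation candidate; the use of scikit-image and this neighbour criterion are assumptions made for illustration, not the exact algorithm of the application.

```python
import numpy as np
from scipy import ndimage
from skimage.morphology import skeletonize  # handles 3D masks in recent scikit-image versions

def centerline_and_bifurcations(airway_mask: np.ndarray):
    """Extract a voxel-thin centerline and candidate bifurcation points from a binary airway mask."""
    skeleton = skeletonize(airway_mask)
    kernel = np.ones((3, 3, 3), dtype=int)
    # Number of skeleton voxels in each 3x3x3 neighbourhood, minus the voxel itself.
    neighbour_count = ndimage.convolve(skeleton.astype(int), kernel, mode="constant") - 1
    bifurcations = np.argwhere(skeleton & (neighbour_count >= 3))
    return skeleton, bifurcations
```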
After the bifurcations of the virtual bronchial tree have been determined, poses can be sampled within a predetermined range around each bifurcation according to its position. At each bifurcation, a pose set consisting of a plurality of different angles and positions can be obtained by varying the pose. The plurality of poses corresponding to each bifurcation can be input into a renderer to obtain virtual bronchoscope images at the corresponding poses in the virtual three-dimensional bronchial tree, so that a virtual image set can be formed from the plurality of poses. Establishing the bifurcations and their pose sets makes it convenient to determine the accurate pose of the bronchoscope in the real bronchial tree when the bronchoscope is near a bifurcation.
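The construction of the virtual image set could then look roughly as follows, assuming a `render_virtual_image` helper that wraps whatever renderer is used; the sampling radius, the number of positions and the roll angles are illustrative parameters, not values taken from the application.

```python
import numpy as np

def build_virtual_image_set(bifurcations, render_virtual_image,
                            radius_mm: float = 5.0, n_positions: int = 8, n_roll: int = 4):
    """Sample poses around each bifurcation and render a virtual bifurcation partial image per pose."""
    rng = np.random.default_rng(0)
    image_set = {}
    for b_id, center in enumerate(bifurcations):
        center = np.asarray(center, dtype=float)
        entries = []
        for _ in range(n_positions):
            offset = rng.normal(size=3)
            offset *= radius_mm / np.linalg.norm(offset)       # point on a sphere around the bifurcation
            position = center + offset
            view_dir = (center - position) / np.linalg.norm(center - position)  # look at the bifurcation
            for roll in np.linspace(0.0, 2 * np.pi, n_roll, endpoint=False):
                pose = {"position": position, "direction": view_dir, "roll": roll}
                entries.append((pose, render_virtual_image(pose)))
        image_set[b_id] = entries
    return image_set
```

The returned dictionary associates each bifurcation with its pose set and the corresponding virtual bifurcation partial images, which is the structure later matched against the real bifurcation image.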
Navigation registration is performed on the real bronchoscope so that the pose of the real bronchoscope matches the pose of the virtual bronchoscope; at the beginning of bronchoscope navigation, the two therefore have the same initial pose. The navigation registration may be performed in a variety of ways, for example:
Mode one: the operator moves the real bronchoscope to any pose in the pulmonary bronchial tree, for example a position and orientation near the bifurcation of the main airway with the entire bifurcation of the main airway within the field of view of the lens, and the virtual bronchoscope is then adjusted to the pose of the real bronchoscope at that moment.
Mode two: the virtual bronchoscope is moved to any position in the pulmonary bronchial tree where a bifurcation can be seen, or one pose is selected from the virtual image set, for example a pose close to the bifurcation of the main airway, and the operator then adjusts the real bronchoscope to the same pose as the virtual bronchoscope at that moment.
Mode three: the real bronchoscope is moved to any position in the pulmonary bronchial tree where a bifurcation can be seen, the airway branch of the bronchial tree in which the real bronchoscope is located at that moment is determined, and the virtual bronchoscope is registered to the pose of the real bronchoscope at that moment; the real bronchoscope and the virtual bronchoscope are thereby synchronized, giving the real bronchoscope image and the virtual bronchoscope image in the initial state shown in Fig. 5, where the left image is the real bronchoscope image in the initial state and the right image is the virtual bronchoscope image in the initial state.
After the navigation registration is completed, real bronchoscope navigation can begin, i.e. the pose of the real bronchoscope is determined from the real bronchoscope images acquired by the real bronchoscope. The virtual bronchoscope tracks the pose of the real bronchoscope in real time, i.e. the pose of the virtual bronchoscope is matched with the pose of the real bronchoscope in real time. During real-time tracking, the real bronchoscope image display window shows in real time the intra-airway picture produced as the operator operates the real bronchoscope, while the virtual bronchoscope image display window shows in real time a virtual bronchoscope image, rendered from the virtual bronchial tree by a renderer, that is as similar as possible to the currently displayed real bronchoscope image. The renderer refers to a rendering engine for rendering a three-dimensional model file, including, for example, an Arnold renderer or a source renderer. In this scheme, the pose data of any point can be input into a preset renderer to obtain the virtual bronchoscope image of the bronchoscope at that point output by the renderer.
When real-time tracking is performed, the pose of the real bronchoscope can be tracked in real time by means of key-frame detection, frame-to-frame matching, image matching and fence filtering, and the pose of the virtual bronchoscope is displayed at the corresponding position in the virtual bronchial tree. However, in some cases (for example when the real bronchoscope moves too fast or contacts the airway wall, causing too few features in the real bronchoscope image), a large error arises in real-time tracking that cannot be corrected automatically, so the virtual bronchoscope image can no longer stay synchronized with the real bronchoscope image; whether tracking has failed can be determined from the similarity between the virtual bronchoscope image and the real bronchoscope image.
When calculating the similarity between the virtual bronchoscope image and the real bronchoscope image, the regions of interest of the images to be compared can be acquired first, and the similarity then calculated over the regions of interest. The regions of interest in the virtual bronchoscope image and the real bronchoscope image can be determined by feature recognition in the images, for example by a trained neural network model. When training this neural network, the region of interest in the virtual bronchoscope image of each sample and the region of interest in the corresponding real bronchoscope image can be annotated first, and the parameters of the neural network model adjusted according to the difference between the region of interest output by the network and the annotated region of interest.
The similarity between the virtual bronchoscope image and the real bronchoscope image may be determined using, for example, a hash similarity algorithm, a structural similarity algorithm or a histogram similarity algorithm.
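As a hedged sketch of two of these measures, computed only inside the regions of interest, the function below uses scikit-image's structural similarity and OpenCV's histogram correlation; these libraries and the fixed 64-bin histogram are assumptions standing in for whatever implementation the system actually uses.

```python
import cv2
import numpy as np
from skimage.metrics import structural_similarity

def roi_similarity(real_img: np.ndarray, virtual_img: np.ndarray, roi, method: str = "ssim") -> float:
    """Similarity between real and virtual bronchoscope frames inside a (row_slice, col_slice) ROI."""
    real_roi = cv2.cvtColor(np.ascontiguousarray(real_img[roi]), cv2.COLOR_BGR2GRAY)
    virt_roi = cv2.cvtColor(np.ascontiguousarray(virtual_img[roi]), cv2.COLOR_BGR2GRAY)
    virt_roi = cv2.resize(virt_roi, (real_roi.shape[1], real_roi.shape[0]))
    if method == "ssim":
        return structural_similarity(real_roi, virt_roi)       # structural similarity measure
    # Otherwise: histogram correlation, higher means more similar.
    h1 = cv2.calcHist([real_roi], [0], None, [64], [0, 256])
    h2 = cv2.calcHist([virt_roi], [0], None, [64], [0, 256])
    return float(cv2.compareHist(h1, h2, cv2.HISTCMP_CORREL))
```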
In S202, when the similarity is smaller than a predetermined similarity threshold and the real bronchoscope is moved to any bifurcation of the real bronchial tree, the real bifurcation image, acquired by the real bronchoscope, corresponding to that bifurcation is acquired.
The similarity threshold may be determined based on similarity statistics collected in advance over samples, and may take different values for different similarity algorithms.
When the similarity is smaller than the preset similarity threshold, the currently displayed real bronchoscope image differs obviously from the virtual bronchoscope image and the position of the real bronchoscope differs obviously from that of the virtual bronchoscope; missing positioning then needs to be performed to find the pose in the virtual bronchial tree corresponding to the real bronchoscope.
When performing missing positioning, the real bronchoscope captures few features in the middle of an airway segment, which does not help the accuracy of missing positioning, so the real bronchoscope can be moved to a bifurcation of the real bronchial tree. This movement can be completed by the operator, and the system can also determine, from the images acquired by the real bronchoscope, whether the real bronchoscope has been moved to a bifurcation of the real bronchial tree. To distinguish it in the description from the real bronchoscope image acquired before moving to the bifurcation, the image acquired at the bifurcation is called the real bifurcation image. For example, the real bifurcation images acquired by the real bronchoscope can be classified by a trained neural network model to determine whether the real bronchoscope has been moved to a bifurcation of the real bronchial tree. The pre-trained neural network model may be a convolutional neural network with a binary classification head, such as the deep convolutional networks ResNet-18, VGGNet, GoogLeNet and DenseNet, or a Transformer. The training process of the neural network model is as follows: first, a bifurcation-adjacency image dataset can be constructed from image frames contained in video data obtained by bronchoscopy of different patients' lungs. Each image frame is annotated with a label characterizing "near a bifurcation" or "not near a bifurcation"; for example, label 1 may indicate a frame adjacent to a bifurcation and label 0 a frame not adjacent to one. During training, the annotated and randomly shuffled dataset is divided into a training set and a test set according to a certain proportion, and the model is trained on the training set. The loss function of the model measures the difference between the model prediction and the actual annotation of the image.
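A minimal PyTorch sketch of such a binary "near a bifurcation / not near a bifurcation" classifier is shown below. The ResNet-18 backbone matches one of the architectures named above, while the dataset object, batch size, learning rate and number of epochs are assumptions made for illustration.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import models

def train_bifurcation_classifier(train_dataset, epochs: int = 10, lr: float = 1e-4):
    """Train a ResNet-18 to predict 1 = frame adjacent to a bifurcation, 0 = not adjacent."""
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)  # torchvision >= 0.13 API
    model.fc = nn.Linear(model.fc.in_features, 2)          # two-class head
    model = model.to(device)

    loader = DataLoader(train_dataset, batch_size=32, shuffle=True)   # labelled, shuffled frames
    criterion = nn.CrossEntropyLoss()                      # difference between prediction and annotation
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)

    model.train()
    for _ in range(epochs):
        for frames, labels in loader:                      # frames: (B, 3, H, W), labels: (B,)
            optimizer.zero_grad()
            loss = criterion(model(frames.to(device)), labels.to(device))
            loss.backward()
            optimizer.step()
    return model
```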
This movement may move the real bronchoscope forward (in the direction of the main bronchi) or backward (in the direction opposite to the main bronchi). In general, if the loss was caused by the real bronchoscope moving too fast, moving the real bronchoscope backward can be chosen.
In a possible implementation, a matching range for searching the bifurcations in the virtual bronchial tree can be determined, and the pose corresponding to the real bronchoscope can be determined within this matching range of bifurcations, thereby improving the efficiency and accuracy of missing positioning of the real bronchoscope. The matching range of bifurcations should be neither too large nor too small: if it is too large, it contains more bifurcations and a mismatch is more likely to occur; if it is too small, a virtual bifurcation partial image matching the real bifurcation image may not be found.
In S203, the real bifurcation image is matched with a virtual image set, so as to obtain a matching result, where the virtual image set includes a plurality of virtual bifurcation partial images.
In one implementation of the present application, the sub-branches, the parent branch and the adjacent branches of the current branch may be determined according to the current branch of the virtual bronchial tree in which the virtual bronchoscope is located, and the bifurcations corresponding to the current branch, the sub-branches, the parent branch and the adjacent branches are taken as the matching range of the virtual image set. The real bifurcation image then only needs to be matched with the virtual bifurcation partial images in the virtual image set corresponding to these bifurcations in order to determine the virtual bifurcation partial image matching the real bifurcation image, rather than with all virtual bifurcation partial images of the virtual bronchial tree, so the efficiency of missing positioning of the real bronchoscope can be improved.
The bifurcation corresponding to a branch may be the bifurcation at the rear or distal end of the branch, i.e. the bifurcation at the end far from the main bronchus. For example, Fig. 6 is a schematic diagram of the determined bifurcation range: the bifurcation corresponding to the current branch is the intersection point of its sub-branches, and the bifurcation corresponding to the parent branch is the intersection point of the adjacent branches and the current branch. A sketch of this range selection is given below.
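Assuming the virtual bronchial tree is stored as a simple branch graph in which every branch records its parent, its children and the bifurcation at its distal end, the matching range could be collected roughly as follows; the `Branch` structure and its field names are hypothetical and introduced only for this sketch.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Branch:
    branch_id: int
    parent: Optional["Branch"] = None
    children: list["Branch"] = field(default_factory=list)
    bifurcation_id: Optional[int] = None     # bifurcation at the distal end of this branch

def matching_range(current: Branch) -> set:
    """Bifurcations of the current branch, its sub-branches, its parent branch and the adjacent branches."""
    branches = [current] + current.children
    if current.parent is not None:
        branches.append(current.parent)                                        # parent branch
        branches += [b for b in current.parent.children if b is not current]   # adjacent branches
    return {b.bifurcation_id for b in branches if b.bifurcation_id is not None}
```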
After the matching range of bifurcations used for matching calculation or similarity calculation with the real bronchoscope image has been determined, the virtual image set corresponding to the pose set of each bifurcation in the matching range can be extracted, and the similarity between the virtual bifurcation partial images in the virtual image set and the real bifurcation image acquired by the real bronchoscope at the bifurcation is calculated. A corresponding similarity value can be calculated for each of the plurality of virtual bifurcation partial images at each bifurcation, and the virtual bifurcation partial image with the highest similarity can be selected as the virtual bifurcation partial image of that bifurcation that may match the pose of the real bronchoscope. The virtual bifurcation partial images determined for the plurality of bifurcations may then be ranked by similarity, and a predetermined number, for example three, may be presented for the operator to choose from, as in the sketch below. In the schematic view of the operation interface shown in Fig. 7, when the operator selects any one of the virtual bifurcation partial images, a comparison between the selected virtual bifurcation partial image and the real bronchoscope image can be displayed in real time on the left, and the pose of the currently selected virtual bifurcation partial image in the virtual bronchial tree is also displayed in real time.
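Selecting the best match per bifurcation and keeping the top candidates for the operator could look roughly like the sketch below, which reuses the per-bifurcation image set and the hypothetical `roi_similarity` helper from the earlier sketches (here passed in as a parameter).

```python
def top_candidates(image_set, real_bifurcation_img, roi_similarity, roi, top_k: int = 3):
    """For every bifurcation keep its best-matching virtual bifurcation partial image,
    then return the top_k candidates ordered by decreasing similarity."""
    best_per_bifurcation = []
    for b_id, entries in image_set.items():
        scored = [(roi_similarity(real_bifurcation_img, img, roi), pose, img)
                  for pose, img in entries]
        best_per_bifurcation.append((b_id, *max(scored, key=lambda s: s[0])))
    best_per_bifurcation.sort(key=lambda item: item[1], reverse=True)      # sort by similarity
    return best_per_bifurcation[:top_k]                                    # e.g. three candidates for the operator
```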
When calculating the similarity between a virtual bifurcation partial image and the real bifurcation image, the pose of the virtual bifurcation partial image, or the poses in the pose set, can be adjusted; for example, iterative calculation using a gradient-free optimization method can continuously adjust the pose, so that the virtual bifurcation partial image rendered from the adjusted pose matches the real bifurcation image more reliably. The matching process between the poses and the real bifurcation image may be as shown in Fig. 8 and includes:
In S801, a plurality of initial poses corresponding to any bifurcation in the matching range are respectively determined, and the virtual bifurcation partial image set corresponding to that bifurcation is rendered according to the plurality of initial poses.
The correspondence between each bifurcation and its pose set may be preset, and according to this correspondence the plurality of poses contained in the pose set corresponding to the bifurcation can be determined. Before adjustment, the poses in the pose set are the initial poses.
The initial poses are input into the renderer corresponding to the virtual bronchial tree for rendering, so as to obtain the virtual bifurcation partial image set corresponding to the bifurcation, which contains one or more virtual bifurcation partial images.
In S802, the pose of any virtual bifurcation partial image is adjusted according to the similarity between that virtual bifurcation partial image in the set and the real bifurcation image.
In a possible implementation, the regions of interest of the virtual bifurcation partial image and the real bifurcation image can be selected according to the structural features inside the airway using a region-of-interest screening function, and the similarity between the real bifurcation image and the virtual bifurcation partial image within the region of interest is then calculated using a similarity measure function.
The similarity measure function may be a loss function, for example an L1 loss function, an L2 loss function, a structural similarity loss function or a multi-scale structural similarity loss function.
In S803, a virtual bifurcation partial image is regenerated according to the adjusted pose, the similarity between the regenerated virtual bifurcation partial image and the real bifurcation image is calculated, and the pose is adjusted again according to the similarity, until a preset iteration requirement is met, so as to obtain the adjusted pose corresponding to each initial pose of the bifurcation.
The iterative computation can be performed by a gradient-free optimization method to determine the optimal pose corresponding to the virtual bifurcation partial image.
In the present application, the measure function of the similarity between the real bifurcation image and a virtual bifurcation partial image can be written as f(x), where x denotes the pose of the virtual bronchoscope. The similarity value between the real bifurcation image and the virtual bifurcation partial image under a given similarity measure can be denoted y, so that the similarity is a function of the virtual bronchoscope pose, namely y = f(x). In order to find the virtual bifurcation partial image most similar to the real bifurcation image, the maximum of this function must be found. Extremum-solving methods include analytic methods, which can directly solve simple functions and determine their extrema; for complex nonlinear functions the extremum can be solved numerically, while for nonlinear functions that are complex and difficult to solve, a gradient-free optimization method can be used.
The process of performing the iterative computation by a gradient-free optimization method and determining the optimal pose corresponding to the virtual bifurcation partial image may include the following steps:
S1: input the pose x_i into the renderer and render the virtual bifurcation partial image corresponding to x_i. The pose x_i may be the initial pose x_0 or the updated pose obtained in step S3; i is an integer greater than or equal to 0, and x_i denotes the pose obtained after i iterations.
S2: calculate the similarity between the real bifurcation image of the current frame and the virtual bifurcation partial image rendered from the pose x_i.
S3: adjust the pose x_i according to the similarity calculated in step S2 to obtain an updated pose.
The matching range may contain N bifurcations, and each bifurcation may have M initial poses x_0, where N and M are integers greater than or equal to 1. Steps S1 to S3 are executed in a loop for each initial pose of each bifurcation in the matching range, until the change of the pose is smaller than a certain threshold (i.e. the difference between the pose obtained in this iteration and the pose obtained in the previous iteration is smaller than the threshold, which is taken by default to mean that a local optimum has been reached) or the number of loop iterations has reached a manually set maximum, at which point the iteration stops. In this way, the similarity between the virtual bifurcation partial image corresponding to each iteratively adjusted initial pose and the real bifurcation image is obtained; a sketch of this loop follows.
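The S1 to S3 loop can be expressed with an off-the-shelf gradient-free optimizer. The sketch below uses SciPy's Nelder-Mead method as one possible choice (the application does not name a specific algorithm); the 6-vector pose parameterization and the `render_virtual_image` and `roi_similarity` helpers, here assumed to accept such a vector, are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def refine_pose(initial_pose: np.ndarray, real_bifurcation_img,
                render_virtual_image, roi_similarity, roi,
                max_iter: int = 50, tol: float = 1e-3):
    """Iteratively adjust one initial pose x_0 so that the rendered virtual bifurcation
    partial image best matches the real bifurcation image (maximize y = f(x))."""
    def negative_similarity(pose_vec: np.ndarray) -> float:
        virtual_img = render_virtual_image(pose_vec)                        # S1: render pose x_i
        return -roi_similarity(real_bifurcation_img, virtual_img, roi)      # S2: similarity of the two images

    result = minimize(negative_similarity, initial_pose,                    # S3: update the pose until converged
                      method="Nelder-Mead",
                      options={"maxiter": max_iter, "xatol": tol, "fatol": tol})
    return result.x, -result.fun        # adjusted pose and its similarity
```

In practice this refinement would be run for each of the M initial poses of each of the N bifurcations in the matching range, and the adjusted pose with the highest similarity would be kept per bifurcation.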
In S804, according to the similarity between the virtual bifurcation partial images corresponding to the plurality of adjusted poses of the bifurcation and the real bifurcation image, the pose with the highest similarity is selected to determine the first virtual bifurcation partial image.
After performing the iterative optimization process several times, the system obtains a plurality of poses that approximate local optima. The similarity between the virtual bifurcation partial images corresponding to these poses and the real bifurcation image of the current frame is calculated within the region of interest, and the pose with the highest similarity is finally selected as the pose corresponding to the current frame.
Alternatively, in a possible implementation, a plurality of initial poses corresponding to any bifurcation can first be determined and rendered to obtain a corresponding plurality of virtual bifurcation partial images. The similarity between the rendered virtual bifurcation partial images and the real bifurcation image is then calculated directly, yielding a plurality of similarity values, and the virtual bifurcation partial image corresponding to the pose with the highest similarity is selected as the first virtual bifurcation partial image. In this way the first virtual bifurcation partial image can be determined more efficiently.
In S204, a virtual bifurcation partial image matching with the real bifurcation image is determined according to the matching result, and a real pose of the real bronchoscope in the real bronchial tree is determined according to the determined virtual bifurcation partial image.
Wherein the real pose is used to register the virtual bronchoscope with the real bronchoscope.
If the matching result is a predetermined number of virtual bifurcation partial images, the operator can choose among them. For example, the predetermined number of virtual bifurcation partial images, or those images transformed by a pose transformation matrix, may be displayed on a display device, and the operator can switch between and select the virtual bifurcation partial image matching the current real bronchoscope image through a foot pedal, for example switching with a short press of the pedal and confirming the selection with a long press.
Alternatively, the virtual bifurcation partial image matching the current real bronchoscope image may be determined automatically by the system, or the system may determine a virtual bifurcation partial image whose similarity is greater than a predetermined matching threshold.
After the virtual bifurcation partial image matching the real bronchoscope image has been determined, the pose of the matched virtual bifurcation partial image can be used as the real pose of the real bronchoscope, and the pose of the virtual bronchoscope in the virtual bronchial tree can be updated according to the matching result.
When the virtual bifurcation partial image is an image that has undergone pose transformation, the pose determined from the pose of the virtual bifurcation partial image after the pose transformation can be used as the pose of the virtual bronchoscope in the virtual bronchial tree.
According to the present application, when real-time tracking fails, the pose of the virtual bronchoscope does not need to be adjusted manually, and the virtual bronchoscope can still be matched to the pose of the real bronchoscope. Compared with manually adjusting the pose of the virtual bronchoscope, this greatly shortens the repositioning time and reduces the consumption of human resources, because manual adjustment requires a person other than the one operating the real bronchoscope to move the virtual bronchoscope to the corresponding position in the virtual bronchial tree using mouse and keyboard operations. The bronchoscope navigation method of the present application can be operated entirely by a single person, which effectively improves bronchoscope navigation efficiency.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
Fig. 9 is a schematic diagram of a bronchoscope navigation device according to an embodiment of the present application, as shown in fig. 9, the device includes:
the image monitoring unit 901 is used for monitoring the similarity between a virtual bronchoscope image corresponding to the virtual bronchoscope and a real bronchoscope image acquired by the real bronchoscope;
a real bifurcation image acquiring unit 902, configured to acquire the real bifurcation image, acquired by the real bronchoscope, corresponding to any bifurcation when the similarity is smaller than a predetermined similarity threshold and the real bronchoscope is moved to any bifurcation of the real bronchial tree;
an image matching unit 903, configured to match the real bifurcation image with a virtual image set, to obtain a matching result, where the virtual image set includes a plurality of virtual bifurcation partial images;
a real pose determining unit 904, configured to determine a virtual bifurcation partial image matching the real bifurcation image according to the matching result, and determine the real pose of the real bronchoscope in the real bronchial tree according to the determined virtual bifurcation partial image, where the real pose is used for registering the virtual bronchoscope with the real bronchoscope.
The bronchoscope navigation device shown in fig. 9 corresponds to the bronchoscope navigation method shown in fig. 2.
In addition, the embodiment of the application also provides a bronchoscope navigation system, which comprises:
the tracking preparation module is used for carrying out three-dimensional reconstruction according to CT data to obtain a virtual bronchial tree, determining a virtual image set corresponding to bifurcation ports in the bronchial tree, and enabling the real bronchoscope pose to be identical to the virtual bronchoscope pose by adjusting the real bronchoscope and/or the virtual bronchoscope pose when navigation starts, wherein the virtual image set comprises a plurality of virtual bifurcation port partial images;
the real-time tracking module is used for acquiring the real bronchoscope image acquired by the real bronchoscope and the virtual bronchoscope image corresponding to the virtual bronchoscope in real time in the navigation process, and monitoring the similarity between the acquired real bronchoscope image and the virtual bronchoscope image corresponding to the virtual bronchoscope;
and the missing positioning module is used for: when the similarity monitored by the real-time tracking module is smaller than a preset similarity threshold and the real bronchoscope is moved to any bifurcation of the real bronchial tree, searching the virtual image set for a matched virtual bifurcation partial image by means of the real bifurcation image obtained by the real bronchoscope, determining the real pose of the real bronchoscope in the real bronchial tree according to the found virtual bifurcation partial image, and registering the virtual bronchoscope with the real bronchoscope according to the real pose.
Wherein the missing positioning module corresponds to the bronchoscope navigation device shown in fig. 9.
Fig. 10 is a schematic diagram of a bronchoscope navigation device according to an embodiment of the present application. As shown in fig. 10, the bronchoscope navigation apparatus 10 of this embodiment includes: a processor 100, a memory 101, and a computer program 102, such as a bronchoscope navigation program, stored in the memory 101 and executable on the processor 100. The processor 100, when executing the computer program 102, implements the steps of the various bronchoscope navigation method embodiments described above. Alternatively, the processor 100, when executing the computer program 102, performs the functions of the modules/units of the apparatus embodiments described above.
By way of example, the computer program 102 may be partitioned into one or more modules/units that are stored in the memory 101 and executed by the processor 100 to implement the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, and the instruction segments are used to describe the execution of the computer program 102 in the bronchoscope navigation device 10.
The bronchoscope navigation device 10 may be a desktop computer, a notebook computer, a palm computer, a cloud server, or the like. The bronchoscope navigation device may include, but is not limited to, the processor 100 and the memory 101. It will be appreciated by those skilled in the art that fig. 10 is merely an example of the bronchoscope navigation device 10 and does not constitute a limitation of it; the device may include more or fewer components than shown, combine certain components, or use different components. For example, the bronchoscope navigation device may further include an input/output device, a network access device, a bus, and the like.
The processor 100 may be a central processing unit (Central Processing Unit, CPU), but may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 101 may be an internal storage unit of the bronchoscope navigation device 10, such as a hard disk or a memory of the bronchoscope navigation device 10. The memory 101 may also be an external storage device of the bronchoscope navigation device 10, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card provided on the bronchoscope navigation device 10. Further, the memory 101 may include both an internal storage unit and an external storage device of the bronchoscope navigation device 10. The memory 101 is used for storing the computer program as well as other programs and data required by the bronchoscope navigation device. The memory 101 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
Each of the foregoing embodiments places emphasis on a different aspect; for parts that are not described or illustrated in detail in a particular embodiment, reference may be made to the related descriptions of the other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other manners. For example, the apparatus/terminal device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical function division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. With this understanding, the present application may implement all or part of the flow of the methods of the above embodiments through a computer program instructing the relevant hardware; the computer program may be stored in a computer-readable storage medium, and when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. It should be noted that the content included in the computer-readable medium may be added to or removed from as required by legislation and patent practice in the relevant jurisdiction; for example, in some jurisdictions, the computer-readable medium does not include electrical carrier signals and telecommunications signals.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (11)

1. A bronchoscope navigation method, said method comprising:
monitoring the similarity between a virtual bronchoscope image corresponding to a virtual bronchoscope and a real bronchoscope image acquired by a real bronchoscope;
when the similarity is smaller than a preset similarity threshold and the real bronchoscope has been moved to any bifurcation of a real bronchial tree, acquiring a real bifurcation image of that bifurcation acquired by the real bronchoscope;
matching the real bifurcation image with a virtual image set to obtain a matching result, wherein the virtual image set comprises a plurality of virtual bifurcation partial images;
and determining a virtual bifurcation partial image matching the real bifurcation image according to the matching result, and determining the real pose of the real bronchoscope in the real bronchial tree according to the determined virtual bifurcation partial image, wherein the real pose is used for registering the virtual bronchoscope and the real bronchoscope.
2. The method of claim 1, wherein matching the real bifurcation image with a virtual image set to obtain a matching result comprises:
determining a matching range in the virtual image set corresponding to the any bifurcation;
and matching the real bifurcation image with the virtual bifurcation partial images within the matching range to obtain the matching result.
3. The method of claim 2, wherein determining a matching range in the virtual image set corresponding to the any bifurcation comprises:
determining the current branch of the virtual bronchoscope in the virtual bronchial tree at the time the virtual bronchoscope image is acquired;
searching the virtual bronchial tree for the sub-branches, the parent branch, and the adjacent branches of the current branch;
determining the bifurcations corresponding to the current branch and to the found sub-branches, adjacent branches, and parent branch;
and determining, in the virtual image set, the virtual bifurcation partial images associated with the determined bifurcations as the virtual bifurcation partial images within the matching range.
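As an illustration of this neighbourhood search, the sketch below collects the bifurcations of the current branch, its sub-branches, its parent branch, and its adjacent (sibling) branches, and restricts the virtual image set accordingly; the dictionary-based tree and image-set layouts are assumptions of the example, not data structures defined by the claims.

```python
def matching_range(tree, current_branch, bifurcations_of, virtual_image_set):
    """Restrict the virtual image set to bifurcations near the current branch.

    tree:              {branch_id: {"parent": branch_id or None, "children": [branch_ids]}}
    bifurcations_of:   {branch_id: [bifurcation_ids]}
    virtual_image_set: {bifurcation_id: [virtual bifurcation partial images]}
    """
    neighbours = {current_branch}
    neighbours.update(tree[current_branch]["children"])          # sub-branches (children)
    parent = tree[current_branch]["parent"]
    if parent is not None:
        neighbours.add(parent)                                   # parent branch
        neighbours.update(                                       # adjacent (sibling) branches
            b for b in tree[parent]["children"] if b != current_branch
        )
    in_range = {b for branch in neighbours for b in bifurcations_of.get(branch, [])}
    return {b: virtual_image_set[b] for b in in_range if b in virtual_image_set}
```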
4. The method of claim 2, wherein matching the real bifurcation image with the virtual bifurcation partial images within the matching range comprises:
determining, in the virtual bifurcation partial image set corresponding to each bifurcation within the matching range, a first virtual bifurcation partial image having the highest similarity with the real bifurcation image;
and sorting the first virtual bifurcation partial images corresponding to the respective bifurcations according to their similarity, and selecting a preset number of the first virtual bifurcation partial images.
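The ranking step above amounts to keeping one best-scoring candidate per bifurcation and then taking the top few overall, roughly as in the sketch below; the tuple layout and the default of three candidates are assumptions made for this example.

```python
def select_candidates(first_images, preset_number=3):
    """first_images: one (bifurcation_id, similarity, virtual_partial_image) entry per
    bifurcation in the matching range, namely its highest-similarity image.
    Returns the preset number of entries with the highest similarity overall."""
    ranked = sorted(first_images, key=lambda item: item[1], reverse=True)
    return ranked[:preset_number]
```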
5. The method of claim 4, wherein determining, in the virtual bifurcation partial image set corresponding to each bifurcation in the matching range, the first virtual bifurcation partial image having the highest similarity with the real bifurcation image comprises:
determining a plurality of initial poses corresponding to any bifurcation in the matching range, and rendering, according to the initial poses, the virtual bifurcation partial image set corresponding to that bifurcation;
adjusting the pose of any virtual bifurcation partial image according to the similarity between that virtual bifurcation partial image and the real bifurcation image;
regenerating a virtual bifurcation partial image according to the adjusted pose, calculating the similarity between the regenerated virtual bifurcation partial image and the real bifurcation image, and continuing to adjust the pose according to the similarity until a preset iteration requirement is met, so as to obtain the adjusted pose corresponding to each initial pose of the bifurcation;
and selecting, according to the similarity between the virtual bifurcation partial images corresponding to the multiple adjusted poses of the bifurcation and the real bifurcation image, the pose with the highest similarity to determine the first virtual bifurcation partial image.
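One way to realize this render-and-compare refinement is a simple coordinate-wise search over the pose parameters, as sketched below; the six-parameter pose layout, the step-halving schedule, and the render and similarity callables are assumptions of this example rather than the claimed procedure itself.

```python
import numpy as np

def refine_pose(initial_pose, real_image, render, similarity,
                step=0.5, max_iters=50, tol=1e-4):
    """Iteratively adjust one initial pose: perturb each pose parameter, re-render the
    virtual bifurcation partial image, and keep any change that raises the similarity
    to the real bifurcation image, until no parameter improves or the budget runs out.

    initial_pose: 1-D array, e.g. (x, y, z, yaw, pitch, roll)
    render:       pose -> virtual bifurcation partial image (assumed to exist)
    similarity:   (virtual image, real image) -> float
    """
    pose = np.asarray(initial_pose, dtype=float)
    best_score = similarity(render(pose), real_image)
    for _ in range(max_iters):
        improved = False
        for i in range(pose.size):
            for delta in (step, -step):
                candidate = pose.copy()
                candidate[i] += delta
                score = similarity(render(candidate), real_image)
                if score > best_score + tol:
                    pose, best_score, improved = candidate, score, True
        if not improved:
            step *= 0.5          # shrink the search step once no axis improves
            if step < 1e-3:
                break
    return pose, best_score
```

Running such a refinement for each initial pose of a bifurcation and keeping the highest-scoring result would then yield the first virtual bifurcation partial image for that bifurcation.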
6. The method of claim 4, wherein determining, in the virtual bifurcation partial image set corresponding to each bifurcation in the matching range, the first virtual bifurcation partial image having the highest similarity with the real bifurcation image comprises:
determining a plurality of initial poses corresponding to any bifurcation in the matching range, and obtaining, according to the initial poses, the virtual bifurcation partial image set corresponding to that bifurcation;
and registering the real bifurcation image with the virtual bifurcation partial images contained in the virtual bifurcation partial image set, and taking the virtual bifurcation partial image with the highest similarity to the real bifurcation image as the first virtual bifurcation partial image.
7. The method of claim 1, wherein, prior to navigation, the method further comprises:
an offline calculation process: reconstructing the virtual bronchial tree, determining the airway branch centerlines of the virtual bronchial tree, and determining the virtual image set corresponding to the virtual bronchial tree;
and a navigation registration process: when navigation starts, adjusting the pose of the real bronchoscope and/or the pose of the virtual bronchoscope so that the pose of the real bronchoscope is identical to the pose of the virtual bronchoscope.
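The offline calculation described above could, for instance, pre-render the virtual image set from a handful of candidate viewing directions at every bifurcation point on the extracted centerlines; the pose parameterization and the render callable in the sketch below are assumptions of this example.

```python
import numpy as np

def build_virtual_image_set(bifurcation_points, viewing_directions, render):
    """Offline preparation: for each centerline bifurcation point, render virtual
    bifurcation partial images from several candidate viewing directions so that
    they can later be matched against real bifurcation images.

    bifurcation_points: {bifurcation_id: (x, y, z) coordinate on the airway centerline}
    viewing_directions: list of unit 3-vectors to look along at each bifurcation
    render:             (position, direction) -> virtual partial image (assumed to exist)
    """
    virtual_image_set = {}
    for bif_id, position in bifurcation_points.items():
        images = []
        for direction in viewing_directions:
            pose = (np.asarray(position, dtype=float), np.asarray(direction, dtype=float))
            images.append((pose, render(*pose)))
        virtual_image_set[bif_id] = images
    return virtual_image_set
```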
8. A bronchoscope navigation system, said system comprising:
the tracking preparation module is used for performing three-dimensional reconstruction from CT data to obtain a virtual bronchial tree, determining a virtual image set corresponding to the bifurcations in the bronchial tree, and, when navigation starts, making the real bronchoscope pose identical to the virtual bronchoscope pose by adjusting the pose of the real bronchoscope and/or the virtual bronchoscope, wherein the virtual image set comprises a plurality of virtual bifurcation partial images;
the real-time tracking module is used for acquiring, in real time during navigation, the real bronchoscope image acquired by the real bronchoscope and the virtual bronchoscope image corresponding to the virtual bronchoscope, and monitoring the similarity between the acquired real bronchoscope image and the virtual bronchoscope image;
and the missing positioning module is used for, when the similarity monitored by the real-time tracking module is smaller than a preset similarity threshold and the real bronchoscope has been moved to any bifurcation of the real bronchial tree, searching the virtual image set for the virtual bifurcation partial image that matches the real bifurcation image obtained by the real bronchoscope, determining the real pose of the real bronchoscope in the real bronchial tree according to the found virtual bifurcation partial image, and registering the virtual bronchoscope with the real bronchoscope according to the real pose.
9. A bronchoscope navigation device, said device comprising:
the image monitoring unit is used for monitoring the similarity between the virtual bronchoscope image corresponding to the virtual bronchoscope and the real bronchoscope image acquired by the real bronchoscope;
a real bifurcation image acquisition unit, used for acquiring a real bifurcation image of any bifurcation, acquired by the real bronchoscope, when the similarity is smaller than a predetermined similarity threshold and the real bronchoscope has been moved to that bifurcation of a real bronchial tree;
the image matching unit is used for matching the real bifurcation image with a virtual image set to obtain a matching result, wherein the virtual image set comprises a plurality of virtual bifurcation partial images;
and the real pose determining unit is used for determining, according to the matching result, a virtual bifurcation partial image matching the real bifurcation image, determining the real pose of the real bronchoscope in the real bronchial tree according to the determined virtual bifurcation partial image, and registering the virtual bronchoscope with the real bronchoscope according to the real pose.
10. A bronchoscope navigation device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the method according to any one of claims 1 to 7 when executing the computer program.
11. A computer readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the steps of the method according to any one of claims 1 to 7.
CN202111679733.9A 2021-12-31 2021-12-31 Bronchoscope navigation method, device, equipment and storage medium Pending CN116433874A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111679733.9A CN116433874A (en) 2021-12-31 2021-12-31 Bronchoscope navigation method, device, equipment and storage medium
PCT/CN2022/138717 WO2023124978A1 (en) 2021-12-31 2022-12-13 Bronchoscope navigation method and apparatus, and device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111679733.9A CN116433874A (en) 2021-12-31 2021-12-31 Bronchoscope navigation method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116433874A true CN116433874A (en) 2023-07-14

Family

ID=86997737

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111679733.9A Pending CN116433874A (en) 2021-12-31 2021-12-31 Bronchoscope navigation method, device, equipment and storage medium

Country Status (2)

Country Link
CN (1) CN116433874A (en)
WO (1) WO2023124978A1 (en)

Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102883651A (en) * 2010-01-28 2013-01-16 宾夕法尼亚州研究基金会 Image-based global registration system and method applicable to bronchoscopy guidance
US20140051986A1 (en) * 2012-08-14 2014-02-20 Intuitive Surgical Operations, Inc. Systems and Methods for Registration of Multiple Vision Systems
CN104105439A (en) * 2012-02-06 2014-10-15 皇家飞利浦有限公司 Invisible bifurcation detection within vessel tree images
US20150196228A1 (en) * 2013-04-15 2015-07-16 Olympus Medical Systems Corp. Endoscope system
CN105608687A (en) * 2014-10-31 2016-05-25 株式会社东芝 Medical image processing method and medical image processing device
US20170071504A1 (en) * 2015-09-16 2017-03-16 Fujifilm Corporation Endoscope position identifying apparatus, endoscope position identifying method, and recording medium having an endoscope position identifying program recorded therein
CN106572794A (en) * 2014-07-02 2017-04-19 柯惠有限合伙公司 System and method for navigating within the lung
US20170296032A1 (en) * 2015-03-06 2017-10-19 Fujifilm Corporation Branching structure determination apparatus, method, and program
US20180271358A1 (en) * 2017-05-23 2018-09-27 Parseh Intelligent Surgical System Navigating an imaging instrument in a branched structure
US20190005687A1 (en) * 2017-06-29 2019-01-03 Covidien Lp System and method for identifying, marking and navigating to a target using real time two dimensional fluoroscopic data
CN111724364A (en) * 2020-06-12 2020-09-29 深圳技术大学 Method and device based on lung lobes and trachea trees, electronic equipment and storage medium
CN111815780A (en) * 2020-06-30 2020-10-23 北京市商汤科技开发有限公司 Display method, display device, equipment and computer readable storage medium
CN112641514A (en) * 2020-12-17 2021-04-13 罗雄彪 Minimally invasive interventional navigation system and method
CN112741692A (en) * 2020-12-18 2021-05-04 上海卓昕医疗科技有限公司 Rapid navigation method and system for realizing device navigation to target tissue position
CN113034700A (en) * 2021-03-05 2021-06-25 广东工业大学 Anterior cruciate ligament reconstruction surgery navigation method and system based on mobile terminal
CN113112609A (en) * 2021-03-15 2021-07-13 同济大学 Navigation method and system for lung biopsy bronchoscope
CN113100943A (en) * 2020-12-31 2021-07-13 杭州堃博生物科技有限公司 Navigation processing method, device, system, equipment and medium in physiological channel
WO2021179745A1 (en) * 2020-03-11 2021-09-16 中国科学院深圳先进技术研究院 Environment reconstruction method and device
CN113610826A (en) * 2021-08-13 2021-11-05 推想医疗科技股份有限公司 Puncture positioning method and device, electronic device and storage medium
CN113616333A (en) * 2021-09-13 2021-11-09 上海微创医疗机器人(集团)股份有限公司 Catheter movement assistance method, catheter movement assistance system, and readable storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
BRPI1007726A2 (en) * 2009-05-18 2017-01-31 Koninl Philips Electronics Nv Image-to-image registration method, Image-to-image registration system, Guided endoscopy camera position calibration method and Guided endoscopy camera calibration system
CN110478050A (en) * 2019-08-23 2019-11-22 北京仁馨医疗科技有限公司 3-D image and scope image fusing method, apparatus and system based on CT/MRI data

Patent Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102883651A (en) * 2010-01-28 2013-01-16 宾夕法尼亚州研究基金会 Image-based global registration system and method applicable to bronchoscopy guidance
CN104105439A (en) * 2012-02-06 2014-10-15 皇家飞利浦有限公司 Invisible bifurcation detection within vessel tree images
US20140051986A1 (en) * 2012-08-14 2014-02-20 Intuitive Surgical Operations, Inc. Systems and Methods for Registration of Multiple Vision Systems
US20150196228A1 (en) * 2013-04-15 2015-07-16 Olympus Medical Systems Corp. Endoscope system
CN106572794A (en) * 2014-07-02 2017-04-19 柯惠有限合伙公司 System and method for navigating within the lung
CN105608687A (en) * 2014-10-31 2016-05-25 株式会社东芝 Medical image processing method and medical image processing device
US20170296032A1 (en) * 2015-03-06 2017-10-19 Fujifilm Corporation Branching structure determination apparatus, method, and program
US20170071504A1 (en) * 2015-09-16 2017-03-16 Fujifilm Corporation Endoscope position identifying apparatus, endoscope position identifying method, and recording medium having an endoscope position identifying program recorded therein
US20180271358A1 (en) * 2017-05-23 2018-09-27 Parseh Intelligent Surgical System Navigating an imaging instrument in a branched structure
US20190005687A1 (en) * 2017-06-29 2019-01-03 Covidien Lp System and method for identifying, marking and navigating to a target using real time two dimensional fluoroscopic data
WO2021179745A1 (en) * 2020-03-11 2021-09-16 中国科学院深圳先进技术研究院 Environment reconstruction method and device
CN111724364A (en) * 2020-06-12 2020-09-29 深圳技术大学 Method and device based on lung lobes and trachea trees, electronic equipment and storage medium
CN111815780A (en) * 2020-06-30 2020-10-23 北京市商汤科技开发有限公司 Display method, display device, equipment and computer readable storage medium
CN112641514A (en) * 2020-12-17 2021-04-13 罗雄彪 Minimally invasive interventional navigation system and method
CN112741692A (en) * 2020-12-18 2021-05-04 上海卓昕医疗科技有限公司 Rapid navigation method and system for realizing device navigation to target tissue position
CN113100943A (en) * 2020-12-31 2021-07-13 杭州堃博生物科技有限公司 Navigation processing method, device, system, equipment and medium in physiological channel
CN113116524A (en) * 2020-12-31 2021-07-16 杭州堃博生物科技有限公司 Detection compensation method and device, navigation processing method and device and navigation system
CN113116475A (en) * 2020-12-31 2021-07-16 杭州堃博生物科技有限公司 Method, apparatus, medium, device and navigation system for navigation processing through a catheter
CN113034700A (en) * 2021-03-05 2021-06-25 广东工业大学 Anterior cruciate ligament reconstruction surgery navigation method and system based on mobile terminal
CN113112609A (en) * 2021-03-15 2021-07-13 同济大学 Navigation method and system for lung biopsy bronchoscope
CN113610826A (en) * 2021-08-13 2021-11-05 推想医疗科技股份有限公司 Puncture positioning method and device, electronic device and storage medium
CN113616333A (en) * 2021-09-13 2021-11-09 上海微创医疗机器人(集团)股份有限公司 Catheter movement assistance method, catheter movement assistance system, and readable storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
程渊 (Cheng Yuan): "Virtual bronchoscopy technology" [虚拟支气管镜技术], 中国医学前沿杂志(电子版), no. 12 *
罗慧娉; 方锐; 蒋家琪 (Luo Huiping; Fang Rui; Jiang Jiaqi): "3D-printed bronchial models for simulation training in rigid bronchoscopy" [3D打印支气管模型用于硬支气管镜检查的模拟训练], 中国眼耳鼻喉科杂志, no. 03 *

Also Published As

Publication number Publication date
WO2023124978A1 (en) 2023-07-06

Similar Documents

Publication Publication Date Title
WO2021203795A1 (en) Pancreas ct automatic segmentation method based on saliency dense connection expansion convolutional network
CN111667478B (en) Method and system for identifying carotid plaque through CTA-MRA cross-modal prediction
CN111696089A (en) Arteriovenous determining method, device, equipment and storage medium
US10366488B2 (en) Image processing used to estimate abnormalities
CN113011509B (en) Lung bronchus classification method and device, electronic equipment and storage medium
CN112037146B (en) Automatic correction method and device for medical image artifacts and computer equipment
CN115345938B (en) Global-to-local-based head shadow mark point positioning method, equipment and medium
CN113017702B (en) Method and system for identifying extension length of small probe of ultrasonic endoscope and storage medium
CN116503607B (en) CT image segmentation method and system based on deep learning
CN112734776A (en) Minimally invasive surgical instrument positioning method and system
CN111681247B (en) Lung lobe lung segment segmentation model training method and device
CN111563550A (en) Sperm morphology detection method and device based on image technology
CN113327225A (en) Method for providing airway information
CN114332132A (en) Image segmentation method and device and computer equipment
CN114176775B (en) Calibration method, device, equipment and medium for ERCP selective bile duct intubation
WO2022155454A1 (en) Methods and apparatuses for generating anatomical models using diagnostic images
US8831301B2 (en) Identifying image abnormalities using an appearance model
CN117237322A (en) Organ segmentation modeling method and terminal based on medical image
CN116433874A (en) Bronchoscope navigation method, device, equipment and storage medium
CN112488982A (en) Ultrasonic image detection method and device
CN116228731A (en) Multi-contrast learning coronary artery high-risk plaque detection method, system and terminal
CN113192099B (en) Tissue extraction method, device, equipment and medium
Lou et al. WS-SfMLearner: Self-supervised Monocular Depth and Ego-motion Estimation on Surgical Videos with Unknown Camera Parameters
Jiang et al. An automatic and fast centerline extraction algorithm for virtual colonoscopy
CN114419061A (en) Method and system for segmenting pulmonary artery and vein blood vessels

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination