WO2011062035A1 - Biopsy support system - Google Patents

Biopsy support system

Info

Publication number
WO2011062035A1
Authority
WO
WIPO (PCT)
Prior art keywords
biopsy
image
tumor
ultrasonic
target tissue
Prior art date
Application number
PCT/JP2010/068973
Other languages
English (en)
Japanese (ja)
Inventor
Junichi Onishi (大西 順一)
Original Assignee
Olympus Medical Systems Corp. (オリンパスメディカルシステムズ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Medical Systems Corp.
Priority to CN201080003323.6A (CN102231965B)
Priority to EP10827691.6A (EP2430979B1)
Priority to JP2011505721A (JP4733243B2)
Priority to US13/027,592 (US20110237934A1)
Publication of WO2011062035A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/12: Diagnosis using ultrasonic, sonic or infrasonic waves in body cavities or body tracts, e.g. by using catheters
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 10/00: Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements
    • A61B 10/02: Instruments for taking cell samples or for biopsy
    • A61B 10/04: Endoscopic instruments
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/06: Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A61B 5/065: Determining position of the probe employing exclusively positioning means located on or in the probe, e.g. using position sensors arranged on the probe
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/13: Tomography
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/46: Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461: Displaying means of special interest
    • A61B 8/463: Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046: Tracking techniques
    • A61B 2034/2063: Acoustic tracking systems, e.g. using ultrasound
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/42: Details of probe positioning or probe attachment to the patient
    • A61B 8/4245: Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
    • A61B 8/4254: Details of probe positioning or probe attachment to the patient involving determining the position of the probe using sensors mounted on the probe

Definitions

  • The present invention relates to a biopsy support system that supports a procedure for performing a biopsy of a target tissue such as a tumor, using an ultrasonic tomographic image obtained when an ultrasonic probe is inserted into a bronchus or the like.
  • Ultrasonic probes have been widely used for various examinations and treatments.
  • For example, an ultrasonic probe is used to image the peripheral bronchi as ultrasonic tomographic images.
  • When a tissue such as a tumor appears in an ultrasonic tomographic image, it may be necessary to biopsy the tissue and perform tissue diagnosis.
  • Japanese Patent Application Laid-Open No. 2004-499, as a first conventional example, discloses an ultrasonic medical system that outputs information on the position of a target tissue such as a tumor, acquired by transmission and reception of ultrasonic waves with an ultrasonic transducer, as appropriate information so that it can be used by a separate X-ray irradiation apparatus or the like.
  • In the first conventional example, treatment by X-ray irradiation toward the tumor is performed by the X-ray irradiation apparatus.
  • As a second conventional example, an insertion support system is disclosed that includes virtual image generation means for generating a virtual image of a body cavity in a subject based on image data of a three-dimensional region of the subject, route start point setting means for setting the start point of the route along which an endoscope is inserted into the body cavity, region-of-interest setting means for setting a region of interest in the subject, and route end point extraction means for extracting the end point of the insertion route of the endoscope into the body cavity based on the region of interest.
  • In the second conventional example, the surgeon can set the endoscope or the like near the region of interest according to the insertion route; however, since there is no disclosure of how tissue is actually to be collected from the target tissue such as a tumor with the biopsy device after that, it is difficult to perform tissue collection.
  • That is, the operator can set the distal end portion of the endoscope near the target tissue, but has insufficient information for attempting to collect tissue from the target tissue through the distal end portion of the endoscope. For this reason, a biopsy support system is desired that has a function of supporting biopsy by displaying information on a biopsy range that facilitates tissue sampling with a biopsy device, such as the range in which the target tissue exists as seen from the tip of the endoscope.
  • The present invention has been made in view of the above points, and its object is to provide a biopsy support system that can display a biopsy range, or information on it, that makes it possible to easily collect tissue from a target tissue such as a tumor.
  • A biopsy support system according to the present invention includes: a virtual shape image generation unit that generates a virtual shape image of a body cavity from image data of a three-dimensional region of a subject; an endoscope that is inserted into the body cavity and is provided with a channel through which a biopsy treatment tool can be inserted; an ultrasonic probe whose distal end is provided with an ultrasonic transducer and a position sensor for detecting a three-dimensional position; an ultrasonic image generation unit that generates an ultrasonic tomographic image from the output of the ultrasonic transducer; a position detection unit that, while the tip of the ultrasonic probe protruding from the tip of the channel is moved, detects the three-dimensional position of the position sensor at each of the two end positions of a target tissue to be biopsied as they are detected on the ultrasonic tomographic image; and an image display control unit that superimposes and displays, on the virtual shape image at the position corresponding to the three-dimensional positions of the two end positions detected by the position detection unit, a biopsy range for performing biopsy of the target tissue with the biopsy treatment tool.
  • FIG. 1 is a diagram illustrating an overall configuration of a biopsy support system according to a first embodiment of the present invention.
  • FIG. 2 is a diagram showing the configuration of the tip side of the ultrasonic probe.
  • FIG. 3 is a flowchart illustrating an example of an operation procedure for performing a biopsy according to the first embodiment.
  • FIG. 4 is a flowchart showing processing for generating and displaying a bronchial virtual image.
  • FIG. 5 is a block diagram showing a detailed configuration of the insertion support apparatus and the like in FIG.
  • FIG. 6 is a view showing a state in which the distal end portion of the endoscope is inserted into the distal portion side of the bronchus.
  • FIG. 7 is a view showing a state in which a distal end side of an ultrasonic probe inserted into a channel of an endoscope is projected to a tumor side.
  • FIG. 8 is a flowchart showing an example of a processing procedure for calculating an appropriate range as a tumor position detection and biopsy range.
  • FIG. 9 is a diagram illustrating a state in which a distal end portion of an ultrasonic probe is moved and a region where a tumor exists is detected using an ultrasonic tomographic image.
  • FIG. 10 is an explanatory diagram of an example of calculating the appropriate range.
  • FIG. 11 is a diagram illustrating an example of an image state when a tumor is not detected on an ultrasonic tomographic image and an image when it is detected.
  • FIG. 12 is a diagram showing a display example of a virtual shape image in a bronchial virtual image displayed by superimposing information such as an appropriate range.
  • FIG. 13 is a diagram showing an overall configuration of a biopsy support system according to a second embodiment of the present invention.
  • FIG. 14 is a flowchart illustrating an example of a processing procedure for performing a biopsy according to the second embodiment.
  • FIG. 15 is an explanatory diagram showing a state in which ultrasonic tomographic image data is generated by moving the ultrasonic probe.
  • FIG. 16 is a flowchart showing an example of a processing procedure for calculating an appropriate range along a direction perpendicular to the traveling direction of the bronchus in the modification.
  • FIG. 17 is a diagram showing ultrasonic tomographic image data acquired by an ultrasonic transducer.
  • FIG. 18 is a diagram showing the center of gravity position and the appropriate range of a tumor calculated using the ultrasonic tomographic image data of FIG.
  • FIG. 19 is a diagram showing a display example of a virtual shape image in a virtual image of a bronchus and a state in which tissue is collected from a tumor by projecting a biopsy needle from an endoscope channel.
  • As shown in FIG. 1, a biopsy support system 1 mainly comprises an ultrasonic observation system 3 including an ultrasonic probe 2 that is inserted into, for example, a bronchus as a tubular body cavity, an insertion support device 4 used together with the ultrasonic observation system 3, and an endoscope device 5.
  • The ultrasonic observation system 3 includes the ultrasonic probe 2, an ultrasonic observation device 6 that is connected to the ultrasonic probe 2 and includes an ultrasonic image generation unit generating an ultrasonic tomographic image, and a monitor 7 for displaying the ultrasonic tomographic image generated by the ultrasonic observation device 6.
  • the ultrasonic probe 2 has an elongated probe body 11.
  • FIG. 2 is an enlarged view of the tip of the ultrasonic probe 2.
  • The probe tip 12, as the tip of the ultrasonic probe 2, is provided with an ultrasonic transducer 13 and with a position sensor 14 arranged to detect the position of the probe tip 12 or of the ultrasonic transducer 13.
  • the position detection device 8 (see FIG. 5) actually detects the three-dimensional position of the position sensor 14.
  • The detected three-dimensional position can be approximated as the three-dimensional position of the ultrasonic transducer 13 or of the probe tip 12. Note that since the distance between the position sensor 14 and the ultrasonic transducer 13 is known, the three-dimensional position of the ultrasonic transducer 13 can be detected accurately from the three-dimensional position of the position sensor 14 by taking this distance into account.
  • The position sensor 14, composed of a coil, detects the fluctuating magnetic field emitted from an antenna (not shown) connected to the position detection device 8, and the induced current is detected by the position detection device 8.
  • the position detection device 8 can estimate the position of the position sensor 14 from the fluctuating magnetic field output from the antenna and the current value of the position sensor 14.
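  • The position estimation just described can be illustrated with a toy model. The sketch below assumes a point-dipole field in which the induced coil current falls off with the cube of the antenna-to-sensor distance; the constant and current values are invented for illustration and are not taken from this disclosure.

```python
# Toy model (an assumption, not from this disclosure): the induced coil
# current I falls off with the cube of the antenna-to-sensor distance r,
# I(r) = k / r**3, as for a point dipole.
def distance_from_current(current_amps, k=8.0e-9):
    """Invert I = k / r**3 to estimate the antenna-to-sensor distance in metres."""
    return (k / current_amps) ** (1.0 / 3.0)

# Readings from three antennas at known positions would give three such
# distances, constraining the sensor position by trilateration.
r = distance_from_current(1.0e-6)  # 1 uA induced current -> about 0.2 m
```

In practice the field model, calibration constant, and number of transmit coils depend on the tracking hardware; this only illustrates the principle of inferring position from measured coil current.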
  • Here, a magnetic position detection device using a coil has been described, but the position sensor 14 and the position detection device 8 are not limited to this configuration.
  • The position sensor 14 is disposed in the probe distal end portion 12 together with the ultrasonic transducer 13. As will be described in detail with reference to FIG. 5, the insertion support apparatus 4 has a position detection unit 15 that detects the three-dimensional position of the position sensor 14 when a tomographic image of a tumor as the target tissue is detected on the ultrasonic tomographic image, and that detects, based on this three-dimensional position information, the boundary positions or both end positions of the range in which the tumor exists in a predetermined direction (the traveling direction of the body cavity).
  • the position detection unit 15 may be defined to include a function of detecting a region where a tumor exists in a direction orthogonal to the traveling direction of the body cavity.
  • So that the processing procedure for performing a biopsy of a tumor as the target tissue can proceed smoothly, the insertion support apparatus 4 has a virtual image generation unit 16 that generates, based on CT image data, a virtual endoscopic image of the bronchus as a body cavity (hereinafter referred to as a VBS image) and a virtual three-dimensional shape image (virtual shape image) of the bronchus.
  • The VBS image, as a virtual endoscopic image of the bronchus, is the endoscopic image of the bronchus that would virtually be obtained by an endoscope placed in the bronchus.
  • The insertion support device 4 has a function of combining the endoscope image (real image), obtained as a moving image by the endoscope device 5, with the VBS image and displaying the synthesized image on the monitor 17 when the operator inserts the endoscope 18 into the bronchus.
  • The endoscope apparatus 5 includes the endoscope 18, a light source device that supplies illumination light to the endoscope 18, a camera control unit (abbreviated as CCU) that performs signal processing on the imaging signal from the endoscope 18, and the like.
  • The endoscope 18 incorporates an image pickup means 19c (see FIG. 6).
  • the endoscope 18 is provided with a channel 19a in the longitudinal direction of the endoscope insertion portion 19 (see FIG. 6).
  • a biopsy device such as a biopsy needle as a biopsy treatment tool for performing biopsy can be inserted into the channel 19a.
  • A bendable bending portion is provided adjacent to the proximal end of the distal end portion 19b of the endoscope insertion portion 19; by operating this bending portion, the surgeon can bend the distal end side of the endoscope and insert it into a bent body cavity tract (in this specific example, the bronchus). Note that the ultrasonic probe 2 inserted into the channel 19a of the endoscope 18 in the present embodiment has a sufficiently small diameter and size.
  • the ultrasonic transducer 13 built in the ultrasonic probe 2 used in this embodiment is a radial scan type that transmits and receives ultrasonic waves in the circumferential direction around the longitudinal direction of the ultrasonic probe 2.
  • Not only a radial scan type but also a type that performs a sector scan over part of the circumferential direction may be used.
  • FIG. 5 is a block diagram of the system; this will be explained first.
  • The virtual image generation unit 16 constituting the insertion support apparatus 4 includes a CT image data capturing unit 21 that captures, via a portable storage medium such as a DVD (Digital Versatile Disk), three-dimensional image data generated by a known CT apparatus (not shown) that takes X-ray tomographic images of a patient.
  • The virtual image generation unit 16 also includes a CT image data storage unit 22 that stores the CT image data captured by the CT image data capturing unit 21, and an MPR image generation unit 23 that generates an MPR image (multi-planar reconstruction image) based on the CT image data stored in the CT image data storage unit 22.
  • the virtual image generation unit 16 includes an image processing unit 27 that receives an imaging signal from the endoscope apparatus 5 as an input signal, generates a real image, and synthesizes it with a VBS image.
  • the image processing unit 27 is connected to a memory 28 that temporarily stores image data and the like for image processing.
  • the image processing unit 27 includes a virtual shape image generation unit 27 a that generates a virtual shape image of the bronchus from the CT image data stored in the CT image data storage unit 22.
  • the virtual shape image generation unit 27a stores the generated virtual shape image data of the bronchi in the image data storage unit 27b therein.
  • the image data storage unit 27b forms a virtual shape image storage unit that stores virtual shape images.
  • the virtual shape image generation unit 27 a may be provided outside the image processing unit 27.
  • the insertion support apparatus 4 is provided with an image display control unit 29 that performs control for causing the monitor 17 to display the route setting screen generated by the route setting unit 24 and the insertion support image generated by the image processing unit 27.
  • an input device 30 including a keyboard and a pointing device for inputting setting information to the route setting unit 24 is provided.
  • The CT image data storage unit 22 and the VBS image storage unit 26 for storing VBS images may be configured from a single hard disk, and the MPR image generation unit 23, the route setting unit 24, the VBS image generation unit 25 for generating VBS images, and the image processing unit 27 may be configured from a single arithmetic processing circuit.
  • The CT image data capturing unit 21 has been described as capturing CT image data via a portable storage medium such as a DVD. However, when a CT apparatus, or a hospital server storing CT image data, is connected to an in-hospital LAN, the CT image data capturing unit 21 may instead be configured as an interface circuit connectable to the in-hospital LAN, and the CT image data may be captured through the in-hospital LAN.
  • The position detection unit 15 includes a tumor position detection control unit 31 that detects the boundary positions of a tumor as the target tissue using the three-dimensional position information detected by the position detection device 8, a parameter storage unit 32 that stores an appropriate-range parameter used when determining a range suitable for biopsy (an appropriate range), and a position information memory 33 that stores the detected position information.
  • the target tissue is not limited to a tumor (tissue), and may be an affected part (tissue) to be biopsied by an operator.
  • The tumor position detection control unit 31 is connected to a tumor detection switch A34 for inputting a tumor detection signal when one end of a tumor is detected along the predetermined direction in which the probe tip 12 is moved, and to a tumor detection switch B35 for inputting a tumor-not-detected (tumor-free) signal.
  • The tumor detection switch A34 and the tumor detection switch B35 are operated by the operator.
  • Alternatively, the apparatus side constituting the biopsy support system may generate the tumor detection signal and the tumor-free signal.
  • Since the ultrasonic tomographic image acquired using the ultrasonic transducer 13 built into the tip of the ultrasonic probe 2 constituting the ultrasonic observation system 3 is displayed on the monitor 7, the operator can confirm whether or not a tumor is detected (displayed) in the tomographic image.
  • The operator inserts and moves the ultrasonic probe 2 along the traveling direction of the bronchus, for example toward the distal side, and can confirm on the ultrasonic tomographic image the presence or absence of a tumor outside the lumen (body cavity) of the bronchus that cannot be observed optically.
  • When the operator detects one end (for example, the upper end) of the tumor on the ultrasonic tomographic image while moving the ultrasonic probe 2, the operator operates the tumor detection switch A34 to send the tumor detection signal to the tumor position detection control unit 31.
  • the tumor position detection control unit 31 stores the three-dimensional position of the distal end portion of the ultrasonic probe 2, that is, the probe distal end portion 12 output by the position detection device 8 in the position information memory 33.
  • The tumor detection switch B35 is provided so that the operator can input to the tumor position detection control unit 31, as a tumor-free signal, the fact that the other end of the tumor has been passed (that is, that the tumor is no longer detected beyond the other end).
  • Also in this case, the tumor position detection control unit 31 stores the information on the three-dimensional position of the probe tip 12 detected by the position detection device 8 in the position information memory 33. In this way, as described later, it is possible to detect the range in which the tumor exists, or its both end positions, at least in the predetermined direction (the traveling direction of the bronchus into which the probe tip 12 is inserted).
  • The tumor position detection control unit 31 performs processing for calculating, as the biopsy range in the tumor, an appropriate range suitable for biopsy with a biopsy treatment tool, using the information on the three-dimensional position of the probe tip 12. The tumor position detection control unit 31 then outputs the information on the three-dimensional position of the probe tip 12 and the information on the appropriate range in the tumor to the image display control unit 29.
  • the information on the three-dimensional position of the probe tip 12 and the position information on the appropriate range or the biopsy target position are stored in a position information memory 33 that constitutes a storage means for storing the position information.
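  • The calculation of the appropriate range from the stored end positions might look like the following sketch. It assumes the range is taken along the straight segment between the two detected end positions and shrunk by a margin parameter standing in for the appropriate-range parameter in the parameter storage unit 32; the 20% margin and all coordinates are assumed values.

```python
# Hypothetical sketch: derive a biopsy range from the two 3-D end positions
# of the tumor recorded via the tumor detection switches. The margin that
# trims the range away from the tumor boundaries is an assumed parameter.
def biopsy_range(p_a, p_b, margin=0.2):
    """Return (center, start, end) of the appropriate range between the
    3-D end positions p_a and p_b of the tumor."""
    center = tuple((a + b) / 2.0 for a, b in zip(p_a, p_b))
    start = tuple(a + (b - a) * margin for a, b in zip(p_a, p_b))
    end = tuple(a + (b - a) * (1.0 - margin) for a, b in zip(p_a, p_b))
    return center, start, end

# Example with made-up positions 20 mm apart along the bronchus axis.
center, start, end = biopsy_range((0.0, 0.0, 10.0), (0.0, 0.0, 30.0))
# center is the midpoint; start/end trim the margin from each boundary.
```

The center could serve as the default biopsy target position, which the operator may override from the input device.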
  • The image display control unit 29 associates the three-dimensional position information of the probe tip 12 with the virtual image so that the operator can easily recognize from which part of the bronchus the ultrasound image has been acquired. That is, the sensor coordinate system obtained by the position sensor 14 is associated with the CT coordinate system constituting the virtual image. To associate the two coordinate systems, the parameters necessary for the conversion are acquired or set in advance based on a plurality of known positions.
  • For example, with the patient set in a predetermined posture and the traveling direction of the bronchus thereby defined, the surgeon instructs that the probe tip 12 be set at one known point near the entrance of the bronchus and that the sensor coordinate system be associated with the CT coordinate system at that point. Further, the operator instructs that the probe tip 12 be inserted sequentially into the bronchus by a plurality of known insertion lengths, associating the sensor coordinate system with the CT coordinate system at each position. In this way, the system 1 can acquire the parameters needed to convert between the two coordinate systems.
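  • The association of the two coordinate systems can be sketched as follows. This is a deliberately simplified, translation-only estimate: it assumes the predetermined patient posture already aligns the axes of the sensor and CT coordinate systems, so only an offset remains to be estimated from the matched known points; a full implementation would also estimate rotation (for example, by point-set registration). All coordinates below are made-up values.

```python
# Hypothetical sketch: estimate the sensor-to-CT conversion from pairs of
# matching known points, assuming the axes are already aligned so only a
# translation offset needs to be found.
def estimate_offset(sensor_pts, ct_pts):
    """Average the per-point difference (ct - sensor) to get the offset."""
    n = len(sensor_pts)
    return tuple(
        sum(c[i] - s[i] for s, c in zip(sensor_pts, ct_pts)) / n
        for i in range(3)
    )

def sensor_to_ct(p, offset):
    """Map a sensor-coordinate reading into the CT coordinate system."""
    return tuple(p[i] + offset[i] for i in range(3))

# Known points: e.g. the bronchus entrance plus positions reached at known
# insertion lengths (all coordinates here are invented for illustration).
sensor_pts = [(0.0, 0.0, 0.0), (0.0, 0.0, 25.0), (0.0, 0.0, 50.0)]
ct_pts = [(120.0, 80.0, 40.0), (120.0, 80.0, 65.0), (120.0, 80.0, 90.0)]
offset = estimate_offset(sensor_pts, ct_pts)
# Any new sensor reading can now be mapped into the CT coordinate system.
ct_position = sensor_to_ct((0.0, 0.0, 30.0), offset)
```

Averaging over several known points, rather than using a single one, reduces the effect of measurement noise in each individual reading.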
  • The image display control unit 29 superimposes and displays the position of the probe tip 12 and the like at the corresponding position on the bronchus virtual image displayed on the monitor 17. That is, the image display control unit 29 superimposes information such as the position of the probe tip 12 and the biopsy range on the virtual image of the bronchus.
  • An input device 36, such as a keyboard serving as a data input means, is provided so that the operator can input to the tumor position detection control unit 31 data specifying a biopsy target position, such as the approximate center position of the region in which the tumor is present.
  • The surgeon can instruct the tumor position detection control unit 31, from the input device 36, to record in the position information memory 33 the information on the three-dimensional position detected by the position detection device 8.
  • A representative example of the procedure for performing a biopsy using the biopsy support system 1 is shown in FIG. 3.
  • Hereinafter, the operation and configuration of the biopsy support system 1 will be described in detail.
  • The surgeon acquires 3D CT image data of the patient, generates a virtual image of the patient using the insertion support device 4, and displays it on the monitor 17. This processing is performed by the virtual image generation unit 16, and its details are shown in FIG. 4.
  • The surgeon acquires the three-dimensional CT image data by CT scanning of the patient before the tissue-collection procedure.
  • The three-dimensional CT image data is taken into the CT image data storage unit 22 in the insertion support device 4 via the CT image data capturing unit 21 in the virtual image generation unit 16 incorporated in the insertion support device 4.
  • The three-dimensional CT image data can easily be transferred via, for example, a DVD.
  • the image processing unit 27 performs a bronchus extraction process, for example, by a process of extracting a body cavity portion where air exists.
  • the image processing unit 27 generates an extracted virtual shape image of the bronchus.
  • the image processing unit 27 stores the generated image data of the virtual shape image of the bronchus in the image data storage unit such as the memory 28 and sends it to the image display control unit 29.
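  • The air-extraction step above can be sketched as a threshold plus region grow. The Hounsfield threshold, the seed voxel, and the tiny volume below are assumptions for illustration only, not values from this disclosure.

```python
from collections import deque

# Toy sketch of extracting the body cavity portion where air exists:
# voxels below an assumed air threshold (about -900 HU) are candidate
# airway, and a breadth-first region grow from a seed voxel in the trachea
# keeps only the connected airway, rejecting disconnected air regions.
AIR_HU = -900

def extract_airway(volume, seed):
    """volume: dict {(x, y, z): HU}; returns the set of connected air voxels."""
    airway, queue = set(), deque([seed])
    while queue:
        v = queue.popleft()
        if v in airway or volume.get(v, 0) >= AIR_HU:
            continue  # already visited, or not an air voxel
        airway.add(v)
        x, y, z = v
        queue.extend([(x + 1, y, z), (x - 1, y, z), (x, y + 1, z),
                      (x, y - 1, z), (x, y, z + 1), (x, y, z - 1)])
    return airway

# Tiny made-up volume: a 3-voxel air column plus one disconnected air voxel.
vol = {(0, 0, 0): -1000, (0, 0, 1): -1000, (0, 0, 2): -950, (5, 5, 5): -1000}
airway = extract_airway(vol, (0, 0, 0))  # excludes the (5, 5, 5) voxel
```

A real implementation would run on the full CT volume and then mesh the extracted voxel set into the virtual shape image.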
  • The virtual image generation unit 16 includes a route setting unit 24 that generates a route setting screen including the MPR image (multi-planar reconstruction image) generated by the MPR image generation unit 23 and sets the insertion route of the endoscope 18 into the bronchus.
  • the virtual image generation unit 16 includes a VBS image generation unit 25 that generates a VBS image according to the CT image data stored in the CT image data storage unit 22 and the route set by the route setting unit 24.
  • The CT image data in the CT image data storage unit 22 is output to the MPR image generation unit 23. While viewing the MPR image generated there, the operator sets the route with the input device 30. From the route setting information and the MPR image data, the route setting unit 24 determines the route. Based on this route information, the VBS image generation unit 25 generates a VBS image from the CT image data (step S14).
  • the VBS image generated by the VBS image generation unit 25 is stored in the VBS image storage unit 26.
  • the image display control unit 29 displays a virtual image of the bronchus on the monitor 17.
  • The VBS image generated by the VBS image generation unit 25 is also synthesized and displayed on the monitor 17. Accordingly, a virtual image composed of a bronchial VBS image and a virtual shape image is displayed on the monitor 17. In this manner, a virtual image can be generated and displayed by the flow shown in FIG. 4.
  • In step S2 of FIG. 3, the surgeon refers to the bronchus virtual image and inserts the endoscope 18 toward the distal end side of the target bronchus.
  • FIG. 6 shows this state, in which the distal end portion 19b of the endoscope 18 has been inserted to the vicinity of the target site close to the tumor 42 on the distal side of the bronchus 41.
  • In step S3, the operator inserts the ultrasonic probe 2 into the channel 19a of the endoscope 18 and causes the distal end side of the ultrasonic probe 2 to protrude from the distal end opening of the channel 19a.
  • FIG. 7 shows a state in which the distal end side of the ultrasonic probe 2 protrudes from the distal end opening of the channel 19a of the endoscope 18 to the vicinity of the position where the tumor 42 can be observed.
  • a broken line indicates the ultrasonic probe 2 and shows a state where the ultrasonic probe 2 is inserted into the channel 19 a of the endoscope 18 through the guide tube 43.
  • the guide tube 43 is set so that the distal end opening is positioned near the proximal end of the probe distal end portion 12 of the ultrasonic probe 2, for example.
  • the guide tube 43 is moved in the channel 19 a together with the ultrasonic probe 2.
  • This guide tube 43 can be used for positioning on the distal end side when performing a tissue sampling procedure using a biopsy device as a treatment tool for performing a biopsy procedure described below.
  • the biopsy support system 1 performs an operation of displaying an ultrasonic tomographic image by the ultrasonic observation system 3 and detecting a three-dimensional position using the position detection device 8 of the insertion support device 4.
  • The position detection device 8 detects the three-dimensional position of the probe tip 12. Specifically, if the position of the position detection sensor 14 is P1, the position of the ultrasonic transducer 13 is P2, and the distance between the position detection sensor 14 and the ultrasonic transducer 13 is l, then when the distance l is small, for example l < 1 mm, the approximation P1 ≈ P2 can be used.
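  • The relation between P1 and P2 can be written out as a short check, assuming the sensor-to-transducer offset l lies along the probe's longitudinal axis; the axis direction and tolerance below are assumed values.

```python
import math

# P2 = P1 + l * u, where u is the unit vector along the probe axis.
# If l is below a tolerance (e.g. 1 mm), P1 itself is a good approximation.
def transducer_position(p1, axis, l, tol_mm=1.0):
    if l < tol_mm:
        return p1  # P1 ~= P2 when the sensor-transducer distance is tiny
    norm = math.sqrt(sum(a * a for a in axis))
    u = tuple(a / norm for a in axis)
    return tuple(p + l * c for p, c in zip(p1, u))

# With an assumed 4 mm offset along +z, P2 is shifted from P1 accordingly;
# with l < 1 mm the function simply returns P1.
p2 = transducer_position((10.0, 20.0, 30.0), (0.0, 0.0, 1.0), 4.0)
```

In practice the probe axis direction at the tip would itself come from the tracking data (for example, from successive sensor positions).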
  • In step S5, the surgeon observes the ultrasonic tomographic image produced by the ultrasonic transducer 13 in the probe distal end portion 12 protruding from the distal end opening of the channel 19a, and confirms the position of the tumor 42 as the target tissue. At this time, the position detection unit 15 derives an appropriate range for biopsy based on the inputs of the tumor detection switches A34 and B35.
  • The tumor position detection control unit 31 sends the information on the appropriate biopsy range calculated in step S5 to the image display control unit 29. The image display control unit 29 displays the appropriate biopsy range (or information on it) superimposed at the corresponding position on the virtual image generated by the image processing unit 27. The tumor position detection control unit 31 also sends the information on the three-dimensional position of the probe tip 12 to the image display control unit 29. The image display control unit 29 performs image processing to superimpose the appropriate range and the like (or its information) and the three-dimensional position of the probe tip 12 of the ultrasonic probe 2 on the virtual image of the bronchus 41, and displays the superimposed image on the monitor 17. The surgeon then determines the position of the guide tube based on the information displayed on the monitor 17.
  • FIG. 8 shows a processing flow including tumor position detection by the biopsy support system 1 in the procedure from steps S4 to S6 in FIG. Next, a procedure including the position detection of the tumor 42 as the target tissue in step S5 of FIG. 3 will be described with reference to FIG.
  • the position detection device 8 acquires the three-dimensional position information of the position sensor 14 as shown in step S21. Then, as shown in step S22, the position detection device 8 detects the three-dimensional position of the position sensor 14, in other words, the three-dimensional position of the probe tip portion 12. Further, as shown in step S23, the ultrasonic transducer 13 built into the probe tip 12 transmits and receives ultrasonic waves, and the ultrasonic tomographic image generated by the ultrasonic image generation unit inside the ultrasonic observation device 6 is displayed on the monitor 7. The operator moves the probe tip 12 toward the distal side along the traveling direction of the bronchus 41 while observing the ultrasonic tomographic image displayed on the monitor 7.
  • the operator brings the probe tip 12 into contact with the inner wall surface of the bronchus 41 so that an ultrasonic tomographic image can be obtained.
  • a balloon containing an ultrasonic transmission medium may be attached to the probe distal end portion 12.
  • the tumor position detection control unit 31 waits for input from the tumor detection switch A34, and the processing from step S21 to step S24 is repeated until a tumor detection signal is input by the tumor detection switch A34.
  • As shown in FIG. 7, when the ultrasonic transducer 13 at the probe tip 12 reaches the boundary position where the tumor 42 starts to be observed, an ultrasonic tomographic image (also simply referred to as a tomographic image) of the tumor 42 can be obtained on the monitor 7.
  • as shown in step S7, the operator pulls out the ultrasonic probe 2 while leaving the guide tube 43 in the channel 19a, and inserts the biopsy device into the guide tube 43.
  • In step S8, the surgeon projects the biopsy device from the distal end of the positioned guide tube 43 and performs biopsy of the tumor 42 as the target tissue, that is, tissue sampling.
  • After tissue sampling, the operator pulls out the endoscope 18 from the bronchus 41 together with the biopsy device as shown in step S9, and the target tissue collecting treatment for the tumor 42 ends.
  • FIG. 9 shows an enlarged view of the periphery of the tumor 42, together with the probe tip 12.
  • FIG. 11 schematically shows an ultrasonic tomographic image displayed on the monitor 7 in this case.
  • the tumor 42 can be identified from the luminance level on the ultrasonic tomographic image because the acoustic impedance is different from that of the normal tissue.
  • the tomographic image 42a of the tumor 42 is displayed as shown on the right side.
  • the ultrasonic tomographic image in FIG. 11 corresponds to a case where the ultrasonic transducer 13 performs a radial scan, in which ultrasonic waves are sequentially emitted in the circumferential direction about the longitudinal direction of the ultrasonic probe 2 as an axis.
  • In FIG. 11, the direction perpendicular to the image plane is the traveling direction DL, and the small circle represents the ultrasonic transmission and reception position of the ultrasonic transducer.
  • the operator operates the tumor detection switch A34 to input a tumor detection signal.
  • In step S25 of FIG. 8, the tumor position detection control unit 31 stores the three-dimensional position of the probe tip 12, in other words the three-dimensional upper end position A obtained by projecting the upper end position a of the tumor 42 in the traveling direction DL, in the position information memory 33 serving as position information storage means. The surgeon then continues to move the probe tip 12 in the traveling direction DL of the bronchus 41 while watching the ultrasonic tomographic image, to find the other end of the tumor 42. In response to this operation, the processing from step S26 to step S29 is performed. Steps S26, S27, and S28 are the same processes as steps S21, S22, and S23, respectively. In step S29, the tumor position detection control unit 31 waits for input from the tumor detection switch B35, and repeats the processing from step S26 to step S29 until a tumor-free signal is input by the tumor detection switch B35.
  • the ultrasonic tomographic image changes from the right side state in FIG. 11 to the left side state.
  • the operator operates the tumor detection switch B35 to input a tumor-free signal.
  • In step S30 of FIG. 8, the tumor position detection control unit 31 stores the three-dimensional position of the probe tip 12, that is, the three-dimensional lower end position B on the traveling direction DL corresponding to the lower end position b of the tumor 42, in the position information memory 33.
  • In step S31, the surgeon inputs the direction in which the tumor 42 exists from the input device 36, and the tumor position detection control unit 31 stores information on that direction in the position information memory 33 in addition to the three-dimensional position information of the upper end position A and the lower end position B.
  • the direction in which the tumor 42 exists is not yet determined at this point. Specifically, it is not known in which circumferential direction around the line segment AB connecting the upper end position A and the lower end position B the tumor 42 exists. Therefore, while moving the probe tip 12 in the traveling direction DL of the bronchus 41, the surgeon inputs the direction in which the tomographic image 42a of the tumor 42 appears. For example, in the case of FIG. 11, the 9 o'clock direction is input (as the width direction DT in which the tumor 42 exists).
  • the width direction DT in which the tumor 42 exists is determined.
  • the input operation in the width direction DT may be performed between step S24 and step S29.
  • the width direction DT may be determined (input) by image processing as in Example 2 described later.
  • a substantially central position c′ of the tumor 42 in the width direction DT may also be determined as follows (alternatively, as in the modification of the second embodiment described later, the width or range along the width direction DT in which the tumor 42 exists about the axis of the traveling direction DL may be detected, so that the biopsy can also be supported in the width direction DT).
  • a two-dot chain line in FIG. 11 shows a tomographic image 42b of the tumor 42 obtained when the probe tip 12 is set near the center position between the upper end position A and the lower end position B.
  • the operator designates the center position c′ along the width direction DT using a pointing device (not shown) provided in the ultrasonic observation apparatus 6 (in the second embodiment described later, the ultrasonic observation apparatus 6 may designate the center position c′ by image processing).
  • the ultrasonic observation apparatus 6 calculates the actual distance Wc from the distance Wc ′ on the tomographic image 42b from the ultrasonic transducer to the central position c ′.
  • the distance Wc ′ on the ultrasonic tomographic image can be converted into the actual distance Wc by using values of parameters and the like used for displaying the tomographic image.
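The conversion from the on-image distance Wc′ to the actual distance Wc is a rescaling by the display parameters of the tomographic image. A minimal Python sketch, assuming a linear scale given by an imaging depth and a display radius (hypothetical parameters, since the patent only says "values of parameters and the like used for displaying the tomographic image"):

```python
def image_to_actual_distance(wc_image_px, display_depth_mm, display_radius_px):
    """Convert an on-image distance Wc' (pixels from the transducer) into
    the actual distance Wc (mm), assuming the display radius in pixels
    corresponds linearly to the imaging depth in millimetres."""
    mm_per_px = display_depth_mm / display_radius_px
    return wc_image_px * mm_per_px
```

A real ultrasound console would expose its own calibration values; the point is only that the on-image distance and the physical distance differ by a known, fixed scale.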
  • the surgeon inputs an actual distance Wc from the input device 36 (the ultrasonic observation device 6 may specify the distance Wc in the second embodiment described later).
  • Information on the distance Wc is stored in the position information memory 33. Further, the three-dimensional position information of the center position c of the tumor 42 may also be stored in the position information memory 33, and this center position c may be set as the target position. Alternatively, the ultrasonic observation apparatus 6 may automatically calculate the actual distance Wc and output this information to the tumor position detection control unit 31, which then stores the information on the distance Wc and the like, and possibly the position information of the center position c of the tumor 42, in the position information memory 33.
  • the tumor position detection control unit 31 reads an appropriate range parameter for biopsy from the parameter storage unit 32.
  • the tumor position detection control unit 31 calculates an appropriate range suitable for biopsy from the three-dimensional position information of the upper end position A and the lower end position B stored in the position information memory 33 and the appropriate range parameter.
  • FIG. 10 is basically the same diagram as FIG. 9, and is an explanatory diagram for calculating the appropriate range K based on the upper end position A and the lower end position B by the ultrasonic transducer 13 in FIG. 9.
  • As shown in FIG. 10, the length of the line segment AB between the upper end position A and the lower end position B, corresponding to the upper end position a and the lower end position b between which the tumor 42 exists along the traveling direction DL of the bronchus 41, becomes the length L of the tumor 42 along the direction DL.
  • the appropriate range K can be set as the length L of the tumor 42.
  • the tumor 42 may have a shape different from the substantially ellipsoidal shape shown in FIG. 10, for example a crescent shape with a small thickness. Therefore, when the appropriate range parameter is set so that the appropriate range K lies inside the detected end positions, the tissue of the tumor 42 can be collected more reliably when performing the biopsy.
  • the appropriate range K is set on the inner side or the center side of the detected line segment AB.
  • the upper end position A′ and the lower end position B′ of the appropriate range K are calculated by the following equations, for example:
  • A′ = A + (B − A) × t / 100
  • B′ = B − (B − A) × t / 100
  • t as an appropriate range parameter is a constant, and can be set to about 10 to several tens, for example. If the tumor is nearly spherical, it may be set to 10 or less.
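As a concrete illustration, the end positions A′ and B′ of the appropriate range K can be computed from the detected three-dimensional end positions A and B and the parameter t, shrinking the span inward by t percent at each end. A minimal Python sketch (names are illustrative):

```python
def appropriate_range(A, B, t):
    """Appropriate range K: shrink the detected span [A, B] inward by
    t percent at each end, i.e. A' = A + (B - A) * t/100 and
    B' = B - (B - A) * t/100. A and B are 3-D points (mm)."""
    if not 0 <= t < 50:
        raise ValueError("t must be in [0, 50) so the inner range is non-empty")
    delta = tuple((b - a) * t / 100.0 for a, b in zip(A, B))
    a_prime = tuple(a + d for a, d in zip(A, delta))
    b_prime = tuple(b - d for b, d in zip(B, delta))
    return a_prime, b_prime
```

With t around 10 to a few tens, as the text suggests, the computed range excludes the thin extremities of the tumor where a biopsy pass might miss tissue.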
  • In step S34 of FIG. 8, the tumor position detection control unit 31 transfers the calculated data of the appropriate range K, together with the information on the width direction DT in which the tumor 42 exists or the center position c, to the image display control unit 29.
  • the image display control unit 29 superimposes information such as the appropriate range K and the width direction D T (or the center position c) in which the tumor 42 exists on the virtual image (the virtual shape image) of the bronchus 41 as biopsy range information. And displayed on the monitor 17.
  • FIG. 12 shows one display example in which the appropriate range K and the like are displayed on the virtual shape image Ib of the virtual image of the bronchus 41 on the monitor 17. As shown in FIG. 12, the appropriate range K is displayed at the corresponding position on the virtual shape image Ib, and the width direction DT in which the tumor 42 exists is displayed in the corresponding direction.
  • the center position c of the tumor 42 in the width direction DT can be displayed.
  • a two-dot chain line with the tip at the reference position P in FIG. 12 indicates the insertion route of the endoscope 18.
  • the position detection device 8 continues to acquire position information of the position sensor 14.
  • the position detection device 8 detects the three-dimensional position of the probe tip 12 and sends information on the three-dimensional position to the image display control unit 29.
  • the three-dimensional position of the probe tip 12 is displayed superimposed on the corresponding position on the virtual image Ib.
  • In step S37, the operator places the probe distal end portion 12 at the distal end surface of the endoscope 18 (which becomes the distal end opening of the channel 19a).
  • the three-dimensional position of the probe tip 12 is acquired by the position detection device 8.
  • the position corresponding to the three-dimensional position of the distal end surface (of the distal end opening of the channel 19a) of the endoscope 18 is displayed as the reference position P, for example.
  • the position of the distal end opening of the guide tube 43 may be displayed on the virtual shape image Ib of the virtual image as the reference position P.
  • since the biopsy range as shown in FIG. 12 and the information related thereto are displayed on the monitor 17, the operator can be supported so that the biopsy device projected from the distal end opening of the channel 19a of the endoscope 18 can be smoothly positioned to collect tissue from the tumor 42.
  • step S38 subsequent to step S37 in FIG. 8, the tumor position detection control unit 31 enters a state of waiting for resetting the appropriate range K. If this operation is not performed, the processing in steps S35 to S38 is repeated.
  • the operator may continue this operation state until, for example, the biopsy procedure for collecting tissue from the tumor 42 is completed. The appropriate range K is then reset at the end of tissue collection, and the position detection operation in FIG. 8 ends. After the process of acquiring information such as the appropriate range K for facilitating tissue collection by the biopsy device using the ultrasonic probe 2, the width direction DT in which the tumor 42 exists, and its center position c, and of superimposing that information on the virtual image, is finished, the operator pulls out the ultrasonic probe 2 from the channel 19a. That is, as described in step S7 of FIG. 3, the operator pulls out the ultrasonic probe 2 from the channel 19a while leaving the guide tube 43, and inserts the biopsy device.
  • the surgeon refers to the virtual shape image Ib of the virtual image displayed in FIG. 12 and, as shown in step S8, causes the biopsy device to protrude from the distal end opening of the channel 19a of the endoscope 18, so that tissue is collected from the tumor 42.
  • the surgeon projects the biopsy device from the position of the distal end opening of the channel 19a of the endoscope 18 in the direction of the center position c indicated by the one-dot chain line in the enlarged view, so that the tissue can be collected from the tumor 42.
  • if a position sensor is provided at the distal end of the biopsy device, its position can be detected by the position detection device 8 and also displayed on the virtual shape image Ib of the virtual image of the bronchus 41, so that the surgeon can perform the biopsy procedure more easily.
  • the position information of the biopsy device may be stored in the position information memory 33.
  • In this way, a virtual image of the bronchus 41 is generated, and information including the appropriate range K as the biopsy range, calculated from the range in which the tumor as the target tissue to be biopsied exists along at least a predetermined direction (the traveling direction of the body cavity), is superimposed and displayed on the virtual image, so that the biopsy treatment can be facilitated.
  • FIG. 13 shows the configuration of a biopsy support system 1B according to the second embodiment of the present invention.
  • This biopsy support system 1B has a configuration that does not include, for example, the tumor detection switch A34 and the tumor detection switch B35 in the biopsy support system 1 of FIG.
  • the biopsy support system 1B automatically performs, by image processing, the functions of the tumor detection switch A34 and the tumor detection switch B35, which were operated by the operator in the first embodiment.
  • the image data of the ultrasonic tomographic image generated by the ultrasonic image generation unit in the ultrasonic observation apparatus 6 is input to the tumor position detection control unit 31, and the tumor position detection control unit 31 performs the following operations on the ultrasonic tomographic image.
  • FIG. 14 shows a flowchart of the position detection operation in this embodiment.
  • step S24 and step S29 in FIG. 8 are changed to the following step S24′ and step S29′. That is, in step S24′ subsequent to step S23, the tumor position detection control unit 31 determines, by detecting the tomographic image of the tumor 42 on the ultrasonic tomographic image, whether or not it is time to start detecting the tumor 42, that is, whether or not a tumor detection signal is detected. If no tumor detection signal is detected, the process returns to step S21. If it is determined that it is time to start detecting the tomographic image of the tumor 42, that is, if a tumor detection signal is detected, the process proceeds to step S25.
  • In step S29′, the tumor position detection control unit 31 determines, by detecting the absence of the tomographic image of the tumor 42 on the ultrasonic tomographic image, whether or not a tumor-free signal is detected. If no tumor-free signal is detected, the process returns to step S26. If it is determined that a tumor-free signal is detected, the process proceeds to step S30.
  • the input of the width direction DT in which the tumor 42 is present in step S31 is performed by the biopsy support system 1B based on a tumor detection signal obtained by image processing (instead of by the operator), and the same applies to its recording.
  • Other operations are the same as those in the first embodiment. According to the present embodiment, the burden on the operator can be reduced as compared with the first embodiment. The other effects are the same as those of the first embodiment. Next, a modification of this embodiment will be described.
  • FIG. 15 shows an explanatory diagram in this case.
  • FIG. 16 shows a processing procedure in this case.
  • As shown in step S41, the surgeon moves the probe tip 12 of the ultrasonic probe 2, as in FIG. 9, in the traveling direction DL from the upper end position A to the lower end position B.
  • During this movement, the tumor position detection control unit 31 stores the position detection information obtained by the position detection device 8 via the position sensor 14 in the position information memory 33.
  • Meanwhile, the ultrasonic image generation unit in the ultrasonic observation apparatus 6 sequentially records the (ultrasonic) tomographic image data obtained by the ultrasonic transducer 13 in the internal memory of the ultrasonic observation apparatus 6.
  • The position detection information stored in the position information memory 33 by the position sensor 14 and the tomographic image data stored in the memory in the ultrasonic observation apparatus 6 are stored in association with each other.
  • FIG. 15 shows representative tomographic image data (also referred to as slice data) obtained when the user moves the ultrasonic probe 2 in the traveling direction DL of the bronchus 41, together with the tumor portion of the tumor 42 to be detected.
  • For the volume data formed by the entire set of tomographic image data stored in the memory, the ultrasonic image generation unit calculates by image processing, along the width direction DT perpendicular to the traveling direction DL, the end position e with the smallest distance from the ultrasonic transducer 13 and the end position f with the largest distance.
  • The end positions e and f are positions on the ultrasonic tomographic image corresponding to the end positions E and F of the tumor 42 that are closest to and farthest from the ultrasonic transducer 13.
  • In the next step S44, the distances We and Wf on the tomographic image from the ultrasonic transducer 13 to the end positions e and f are calculated. The ultrasonic image generation unit then outputs the calculated distances We and Wf on the tomographic image to the tumor position detection control unit 31.
  • the tumor position detection control unit 31 calculates the end positions E and F of the tumor 42 from the distances We and Wf on the tomographic image, using the three-dimensional position information of the ultrasonic transducer 13.
  • In the following step S46, the tumor position detection control unit 31 uses the information on the end positions E and F to calculate, as in the case of the appropriate range K along the traveling direction DL, an appropriate range J in the width direction DT perpendicular to the traveling direction DL.
  • the tumor position detection control unit 31 outputs the calculated information of the appropriate range J in the width direction DT to the image display control unit 29.
  • the image display control unit 29 superimposes and displays the appropriate ranges K and J on the virtual image of the bronchus 41 on the display surface of the monitor 17.
  • Since the appropriate range J in the width direction DT can be displayed in addition to the appropriate range K in the traveling direction DL, the surgeon can perform the biopsy treatment with the biopsy device more easily.
  • In the above, the appropriate range J in the width direction DT perpendicular to the traveling direction DL is calculated in addition to the appropriate range K in the traveling direction DL.
  • Alternatively, as described below, a two-dimensional region serving as an appropriate range in the circumferential direction about the traveling direction DL may be calculated in addition to the appropriate range K in the traveling direction DL.
  • the ultrasonic image generation unit inside the ultrasonic observation apparatus 6 calculates, for example, the two-dimensional region where the tomographic image data shown in FIG. 17 exists. In the example of FIG. 17, the traveling direction DL (the coordinate axis along the traveling direction DL in the position detection device 8) is set as the Z coordinate, and the two-dimensional region in which each piece of tomographic image data of the tumor 42 exists on the tomographic image plane (X, Y plane) perpendicular to it is detected.
  • the ultrasonic image generation unit detects this two-dimensional region for each piece of tomographic image data from the upper end position A to the lower end position B, and sequentially outputs the distribution information of the two-dimensional region in each piece of tomographic image data to the tumor position detection control unit 31.
  • the tumor position detection control unit 31 converts the two-dimensional distribution on the tomographic image of each tomographic image data into the two-dimensional distribution in the position coordinate detection system using the position of the Z-axis coordinate from which each tomographic image data is acquired. .
  • The tumor position detection control unit 31 performs this processing from the upper end position A to the lower end position B, and calculates the three-dimensional distribution region of the tumor 42 in the circumferential direction about the axis of the traveling direction DL.
  • the tumor position detection control unit 31 calculates the center of gravity position G after calculating the three-dimensional distribution region of the tumor 42.
  • as shown in FIG. 18, the tumor position detection control unit 31 calculates, as an appropriate range M, the spherical portion of the three-dimensional distribution region of the tumor 42 that lies within a predetermined distance r from the centroid position G.
  • the predetermined distance r is set to be inside the three-dimensional distribution region of the tumor 42.
  • an appropriate range N that is inside at a predetermined distance or a predetermined ratio from the periphery of the three-dimensional distribution region of the tumor 42 may be calculated.
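The centroid-based restriction just described can be sketched in Python. Here the tumor's three-dimensional distribution region is assumed to be given as a list of sample points (e.g. voxel centres in millimetres); the centroid G and the spherical appropriate range M within distance r of G then follow directly (illustrative names, not the patent's implementation):

```python
import math

def centroid(points):
    """Centroid G of the tumor's three-dimensional distribution region,
    given as a list of (x, y, z) sample points."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(3))

def appropriate_range_m(points, r):
    """Appropriate range M: the portion of the distribution region lying
    within a predetermined distance r of the centroid G."""
    g = centroid(points)
    return [p for p in points if math.dist(p, g) <= r]
```

Choosing r small enough that the sphere stays inside the distribution region, as the text requires, guarantees that every point of M belongs to the tumor.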
  • the tumor position detection control unit 31 outputs the calculated center-of-gravity position G and three-dimensional position information of the appropriate range M or N to the image display control unit 29.
  • the image display control unit 29 superimposes and displays, for example, the appropriate range K, the gravity center position G and the appropriate range M or N, and the reference position P of the distal end surface of the endoscope 18 on the virtual image of the bronchus.
  • the image display control unit 29 may additionally display the appropriate range M or N as the second appropriate range together with the appropriate range K or J described above, or the appropriate range instead of the appropriate range K or J. Only M or N may be displayed.
  • the upper side of FIG. 19 shows a display example of the virtual shape image Ib of the virtual image in this state. The lower side of FIG. 19 shows a state in which tissue is collected from the tumor 42 by actually projecting, for example, a biopsy needle 51 as a biopsy device from the distal end opening of the channel 19a of the endoscope 18. In this example, a position sensor 52 is provided near the tip of the biopsy needle 51. When the position sensor 52 is provided in this way, its position Q can be superimposed and displayed on the virtual shape image Ib of the virtual image. For example, in the case of FIG. 19, the direction PQ from the reference position P to the position Q deviates below the centroid position G, so the bending portion of the endoscope 18 is manipulated (slightly curved upward in FIG. 19), and the tissue can be collected from the tumor 42 by projecting the biopsy needle 51 in that direction.
  • the surgeon can easily perform the treatment of collecting the tissue from the tumor 42 by referring to the virtual shape image Ib of the upper virtual image in FIG. Further, since the position detection means is incorporated in the biopsy device, positioning by the guide tube 43 is not necessary, and guidance based on the position information of the biopsy device is possible.
  • the above-described embodiment may be further modified.
  • In the above description, the upper end position A and the lower end position B were detected as three-dimensional positions of the ultrasonic transducer 13 along the traveling direction DL of the bronchus 41.
  • Instead, the upper end position a and the lower end position b of the tumor 42 itself may be detected or calculated.
  • The positions a and b can be calculated by referring to the distance information from the ultrasonic transducer 13 on the tomographic image and the three-dimensional position information of the ultrasonic transducer 13 at the time the tomographic image was generated.
  • In this case, an appropriate range extending from the upper end position a and the lower end position b of the tumor 42 toward the inside (center side) of both ends may be set as the biopsy range. The center position between both ends may also be calculated at the same time. In this case as well, the biopsy treatment can be effectively supported. Further, superimposing the range in which the tumor 42 exists along the traveling direction DL of the bronchus 41 or the like, for example the line segment AB, on the virtual shape image Ib of the virtual image of the bronchus 41 as a biopsy range serving as a guide for performing the biopsy also belongs to the present invention. That is, the range between the two end positions where the target tissue such as the tumor 42 exists in the traveling direction of the bronchus into which the probe tip 12 is inserted and moved may be used as the biopsy range.
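As a concrete sketch of this calculation, a tumor end position in world coordinates can be obtained by stepping from the transducer's three-dimensional position along the radial scan direction by the actual distance recovered from the tomographic image. In the Python sketch below, the radial direction is assumed to already be expressed in world coordinates (a real system would derive it from the probe orientation); names are illustrative:

```python
import math

def tumor_end_position(transducer_pos, radial_dir, actual_distance_mm):
    """Estimate a tumor end position (e.g. upper end a or lower end b)
    from the transducer's 3-D position, the radial direction of the scan
    line on which the tumor boundary was seen, and the actual distance
    to the boundary along that line (mm)."""
    norm = math.sqrt(sum(c * c for c in radial_dir))
    unit = tuple(c / norm for c in radial_dir)
    return tuple(p + actual_distance_mm * u
                 for p, u in zip(transducer_pos, unit))
```

This is the same composition the text describes: an on-image distance, converted to a physical distance, anchored at the transducer's detected three-dimensional position.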
  • the virtual image generation unit 16 that generates a virtual image includes a VBS image as a virtual endoscopic image of a bronchus as a body cavity and a virtual three-dimensional shape image of a bronchus.
  • the present invention can also be applied to a configuration in which the virtual image generation unit 16 is a virtual shape image generation unit that generates only a virtual shape image of a body cavity path.
  • the present invention can also be applied to the case where the virtual image generation unit 16 is a virtual endoscopic image generation unit that generates only a virtual endoscopic image of a body cavity path.
  • the present invention is not limited to this case, and is also applicable to a case where the probe distal end portion 12 is inserted into and moved within the upper digestive tract or the lower digestive tract. Moreover, embodiments configured by partially combining the above-described embodiments, and other modifications that do not change the gist of the invention, also belong to the present invention.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Physics & Mathematics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Human Computer Interaction (AREA)
  • Endoscopes (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The present invention provides a biopsy support system comprising: a virtual shape image generator that generates a virtual shape image of the interior of a body cavity into which insertion is to be performed, from image data of a three-dimensional region of a subject under examination; an endoscope provided with an endoscope insertion portion and a channel; an ultrasonic probe that is inserted into the channel and is provided, at its distal end, with an ultrasonic transducer and a position sensor that detects three-dimensional positions; an ultrasonic image generator that generates an ultrasonic tomographic image; a position detector that detects three-dimensional positions corresponding to the two end positions of the target tissue in the ultrasonic tomographic image of the ultrasonic probe projecting from the channel; and an image display controller that displays biopsy range information based on the end positions by superimposing said information on the virtual shape image.
PCT/JP2010/068973 2009-11-17 2010-10-26 Système de support de biopsie WO2011062035A1 (fr)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN201080003323.6A CN102231965B (zh) 2009-11-17 2010-10-26 活检辅助系统
EP10827691.6A EP2430979B1 (fr) 2009-11-17 2010-10-26 Système de support de biopsie
JP2011505721A JP4733243B2 (ja) 2009-11-17 2010-10-26 生検支援システム
US13/027,592 US20110237934A1 (en) 2009-11-17 2011-02-15 Biopsy support system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009262314 2009-11-17
JP2009-262314 2009-11-17

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/027,592 Continuation US20110237934A1 (en) 2009-11-17 2011-02-15 Biopsy support system

Publications (1)

Publication Number Publication Date
WO2011062035A1 true WO2011062035A1 (fr) 2011-05-26

Family

ID=44059518

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/068973 WO2011062035A1 (fr) 2009-11-17 2010-10-26 Biopsy support system

Country Status (5)

Country Link
US (1) US20110237934A1 (fr)
EP (1) EP2430979B1 (fr)
JP (1) JP4733243B2 (fr)
CN (1) CN102231965B (fr)
WO (1) WO2011062035A1 (fr)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014151102A (ja) * 2013-02-13 2014-08-25 Olympus Corp Relative position detection system for tubular devices and endoscope apparatus
CN104107069A (zh) * 2013-04-16 2014-10-22 Nanfang Hospital, Southern Medical University Ultrasonic endoscopy system for mediastinal lesions
US10278680B2 (en) 2014-03-19 2019-05-07 Covidien Lp Devices, systems, and methods for navigating a biopsy tool to a target location and obtaining a tissue sample using the same
CN104013423B (zh) * 2014-05-09 2016-07-13 Yang Song B-mode ultrasound scanning probe, scanning system, and scanning method
CN105376503B (zh) * 2015-12-14 2018-07-20 北京医千创科技有限公司 Surgical image processing apparatus and method
EP3402408B1 (fr) * 2016-01-15 2020-09-02 Koninklijke Philips N.V. Automated probe steering to clinical views using annotations in a fused image guidance system
US10925629B2 (en) 2017-09-18 2021-02-23 Novuson Surgical, Inc. Transducer for therapeutic ultrasound apparatus and method
GB2588102B (en) * 2019-10-04 2023-09-13 Darkvision Tech Ltd Surface extraction for ultrasonic images using path energy
US20210196230A1 (en) * 2019-12-29 2021-07-01 Biosense Webster (Israel) Ltd. Position registered sideview ultrasound (us) imager inserted into brain via trocar
CN114098844A (zh) * 2021-11-15 2022-03-01 Beijing Tiantan Hospital, Capital Medical University Ultra-thin bronchoscope

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0422192B2 (fr) 1983-03-21 1992-04-15 Arizona Chem
JP2000116655A (ja) * 1998-10-14 2000-04-25 Olympus Optical Co Ltd Diagnostic apparatus
JP2001198125A (ja) * 2000-01-18 2001-07-24 Olympus Optical Co Ltd Image diagnostic apparatus
JP2004000499A (ja) 2002-03-27 2004-01-08 Aloka Co Ltd Ultrasonic medical system
JP2004097696A (ja) * 2002-09-12 2004-04-02 Olympus Corp Endoscope observation device
WO2007055032A1 (fr) * 2005-11-14 2007-05-18 Olympus Medical Systems Corp. Endoscopic diagnosis or treatment method, and medical device
JP4022192B2 (ja) * 2003-10-31 2007-12-12 Olympus Corp Insertion support system
JP2009262314A (ja) 2008-04-30 2009-11-12 Techno First Kk Perforator for foamed resin board

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4802487A (en) * 1987-03-26 1989-02-07 Washington Research Foundation Endoscopically deliverable ultrasound imaging system
JPH0422192A (ja) * 1990-05-17 1992-01-27 Fujitsu Ltd Mounting structure for plate member
US6716166B2 (en) * 2000-08-18 2004-04-06 Biosense, Inc. Three-dimensional reconstruction using ultrasound
WO2004042546A1 (fr) * 2002-11-04 2004-05-21 V-Target Technologies Ltd. Apparatus and methods for imaging and attenuation correction
JP4698966B2 (ja) * 2004-03-29 2011-06-08 Olympus Corp Procedure support system
EP1890601A4 (fr) * 2005-06-17 2015-04-22 Orthosoft Inc Method and apparatus for computer-assisted femoral head resurfacing
US8821376B2 (en) * 2007-03-12 2014-09-02 David Tolkowsky Devices and methods for performing medical procedures in tree-like luminal structures

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2430979A4 *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10448861B2 (en) 2013-09-06 2019-10-22 Covidien Lp System and method for light based lung visualization
US11864829B2 (en) 2013-09-06 2024-01-09 Covidien Lp Microwave ablation catheter, handle, and system
US9867665B2 (en) 2013-09-06 2018-01-16 Covidien Lp Microwave ablation catheter, handle, and system
JP2018094438A (ja) * 2013-09-06 2018-06-21 Covidien LP System and method for lung visualization using ultrasound
US10448862B2 (en) 2013-09-06 2019-10-22 Covidien Lp System and method for light based lung visualization
US10098566B2 (en) 2013-09-06 2018-10-16 Covidien Lp System and method for lung visualization using ultrasound
JP2016529062A (ja) * 2013-09-06 2016-09-23 Covidien LP System and method for lung visualization using ultrasound
US10201265B2 (en) 2013-09-06 2019-02-12 Covidien Lp Microwave ablation catheter, handle, and system
US10098565B2 (en) 2013-09-06 2018-10-16 Covidien Lp System and method for lung visualization using ultrasound
US10561463B2 (en) 2013-09-06 2020-02-18 Covidien Lp Microwave ablation catheter, handle, and system
US11931139B2 (en) 2013-09-06 2024-03-19 Covidien Lp System and method for lung visualization using ultrasound
US11925452B2 (en) 2013-09-06 2024-03-12 Covidien Lp System and method for lung visualization using ultrasound
US11324551B2 (en) 2013-09-06 2022-05-10 Covidien Lp Microwave ablation catheter, handle, and system
JP2015107268A (ja) * 2013-12-05 2015-06-11 National University Corporation Nagoya University Endoscopic observation support device
US10814128B2 (en) 2016-11-21 2020-10-27 Covidien Lp Electroporation catheter
WO2022059197A1 (fr) * 2020-09-18 2022-03-24 Olympus Medical Systems Corp. Method for collecting biological tissue, and biopsy support system

Also Published As

Publication number Publication date
US20110237934A1 (en) 2011-09-29
EP2430979A4 (fr) 2012-05-23
CN102231965A (zh) 2011-11-02
JP4733243B2 (ja) 2011-07-27
EP2430979A1 (fr) 2012-03-21
JPWO2011062035A1 (ja) 2013-04-04
CN102231965B (zh) 2014-03-12
EP2430979B1 (fr) 2015-12-16

Similar Documents

Publication Publication Date Title
JP4733243B2 (ja) Biopsy support system
CN110831498B (zh) Biopsy apparatus and system
KR102558061B1 (ko) Robotic system for luminal network navigation that compensates for physiological noise
JP6200152B2 (ja) Tracking method and apparatus in medical procedures
US7951070B2 (en) Object observation system and method utilizing three dimensional imagery and real time imagery during a procedure
EP1691666B1 (fr) Catheterscope 3D guidance and interface system
JP4724262B2 (ja) Endoscope apparatus
JP6600690B2 (ja) Insertion body support system
JP6270026B2 (ja) Endoscope observation support device
JP4334839B2 (ja) Endoscope observation device
JP4022114B2 (ja) Endoscope apparatus
CN116096309A (zh) Endoluminal robot (ELR) systems and methods
JP6203456B2 (ja) Ultrasonic observation device, method for operating ultrasonic observation device, and program for operating ultrasonic observation device
WO2015091226A1 (fr) Extended laparoscopic view with X-ray vision
JP2006218239A (ja) Procedure support system
WO2022059197A1 (fr) Method for collecting biological tissue, and biopsy support system
US20240112407A1 (en) System, methods, and storage mediums for reliable ureteroscopes and/or for imaging
Fai et al. Tool for transbronchial biopsies of peripheral lung nodules
JP4354353B2 (ja) Insertion support system
JP2003175042A (ja) Ultrasonic image diagnostic apparatus
JP2005211529A (ja) Procedure support system
WO2024081745A2 (fr) Localization and targeting of small pulmonary lesions
JP2005021354A (ja) Remote surgery support device
JP2005211530A (ja) Procedure support system
JP2004337463A (ja) Virtual image display device and virtual image display method

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase; Ref document number: 201080003323.6; Country of ref document: CN
ENP Entry into the national phase; Ref document number: 2011505721; Country of ref document: JP; Kind code of ref document: A
WWE Wipo information: entry into national phase; Ref document number: 2010827691; Country of ref document: EP
NENP Non-entry into the national phase; Ref country code: DE