WO2022230160A1 - Endoscope system, lumen structure calculation system, and method for creating lumen structure information - Google Patents
- Publication number
- WO2022230160A1 (application PCT/JP2021/017141)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- lumen
- dimension
- dimensional structure
- specific part
- Prior art date
Classifications
- A61B1/000094 — Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope, extracting biological structures
- A61B1/000095 — Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope, for image enhancement
- A61B1/0004 — Operational features of endoscopes provided with input arrangements for the user for electronic operation
- A61B1/0005 — Display arrangement combining images, e.g. side-by-side, superimposed or tiled
- A61B1/00097 — Sensors (insertion part of the endoscope body, distal tip features)
- A61B1/00194 — Optical arrangements adapted for three-dimensional imaging
- A61B1/05 — Endoscopes combined with photographic or television appliances, characterised by the image sensor, e.g. camera, being in the distal end portion
- A61B5/065 — Determining position of the probe employing exclusively positioning means located on or in the probe, e.g. using position sensors arranged on the probe
- A61B5/1076 — Measuring physical dimensions for measuring dimensions inside body cavities, e.g. using catheters
- G06T17/00 — Three-dimensional [3D] modelling, e.g. data description of 3D objects
- G06T7/60 — Image analysis; analysis of geometric attributes
- G06T7/73 — Determining position or orientation of objects or cameras using feature-based methods
- G06T7/90 — Determination of colour characteristics
- G06T2207/10068 — Endoscopic image (image acquisition modality)
- G06T2210/41 — Medical (indexing scheme for image generation or computer graphics)
Definitions
- the present invention relates to an endoscope system, a lumen structure calculation system, a method for creating lumen structure information, and the like.
- Patent Literature 1 discloses a measurement endoscope apparatus that performs measurement by image processing using endoscope images obtained through a pair of objective lenses.
- An endoscope having a plurality of optical systems, such as that disclosed in Patent Literature 1, has a large outer diameter and can be used only in limited applications.
- When a monocular endoscope is used, near objects appear larger and distant objects appear smaller, so the actual size of an observed site cannot be determined from the image alone.
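The fact that a single monocular image cannot distinguish a small near object from a large far one can be made concrete with a pinhole-projection calculation (a textbook illustration with made-up numbers, not taken from the disclosure): a lesion twice as large at twice the distance projects to exactly the same image size.

```python
def projected_size_px(focal_length_px: float, object_size_mm: float,
                      distance_mm: float) -> float:
    """Pinhole camera model: image size in pixels of an object seen face-on."""
    return focal_length_px * object_size_mm / distance_mm

f = 500.0  # illustrative focal length in pixels
small_near = projected_size_px(f, 5.0, 10.0)   # 5 mm lesion, 10 mm away
large_far = projected_size_px(f, 10.0, 20.0)   # 10 mm lesion, 20 mm away
print(small_near, large_far)  # both 250.0: indistinguishable in the image
```

This ambiguity is why some additional source of real-scale information is needed before actual dimensions can be reported.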
- One aspect of the present disclosure relates to an endoscope system including: an insertion section that is inserted into a lumen serving as a subject; a monocular imaging section that is provided in the insertion section and acquires a captured image of the subject; an actual dimension determination information acquisition section that acquires actual dimension determination information, which is information for determining the actual dimension of at least part of the lumen; a lumen structure calculation section that calculates, based on the captured image and the actual dimension determination information, a three-dimensional structure of the lumen and three-dimensional structure dimension information, which is information for determining the actual dimension of at least part of the three-dimensional structure; and a dimension estimation section that outputs, based on the three-dimensional structure dimension information, specific part dimension information, which is the dimension of a specific part in the three-dimensional structure.
- Another aspect of the present disclosure relates to a lumen structure calculation system including: an acquisition section that acquires a captured image of a subject obtained by a monocular imaging section provided in an insertion section inserted into a lumen serving as the subject, and actual dimension determination information, which is information for determining the actual dimension of at least part of the lumen; a lumen structure calculation section that calculates, based on the captured image and the actual dimension determination information, a three-dimensional structure of the lumen and three-dimensional structure dimension information, which is information for determining the actual dimension of at least part of the three-dimensional structure; and a dimension estimation section that outputs, based on the three-dimensional structure dimension information, specific part dimension information, which is the dimension of a specific part in the three-dimensional structure.
- Still another aspect of the present disclosure relates to a method of creating lumen structure information, the method including: acquiring a captured image of a subject by a monocular imaging section provided in an insertion section inserted into a lumen of the subject; acquiring actual dimension determination information, which is information for determining the actual dimension of at least part of the lumen; calculating, based on the captured image and the actual dimension determination information, a three-dimensional structure of the lumen and three-dimensional structure dimension information, which is information for determining the actual dimension of at least part of the three-dimensional structure; and outputting, based on the three-dimensional structure dimension information, specific part dimension information, which is the dimension of a specific part in the three-dimensional structure.
- A block diagram illustrating a configuration example of an endoscope system.
- A perspective view illustrating an example of an endoscope.
- A diagram showing a display example of a monitor to which the technique of the present embodiment is applied.
- A block diagram illustrating another configuration example of the endoscope system.
- A diagram showing a display example of the monitor for explaining a specific site.
- A block diagram illustrating another configuration example of the endoscope system.
- A diagram explaining a magnetic sensor.
- A block diagram illustrating another configuration example of the endoscope system.
- A flowchart explaining an example of processing for acquiring lumen structure information.
- A diagram explaining an example of lumen structure information.
- A flowchart explaining another example of processing for acquiring lumen structure information.
- A schematic diagram explaining the relationship between a plurality of feature points and the position and orientation of the distal end.
- A flowchart explaining an example of processing for calculating three-dimensional structure dimension information.
- A diagram showing a display example of the monitor explaining a comparison target.
- A flowchart explaining another example of processing for calculating three-dimensional structure dimension information.
- A diagram explaining a TOF sensor.
- A diagram showing another display example of the monitor to which the method of the present embodiment is applied.
- A diagram explaining a method of calculating the distance between measurement points.
- A diagram showing another display example of the monitor to which the method of the present embodiment is applied.
- A diagram explaining a method of measuring the dimensions of a specific site.
- FIG. 1 is a block diagram illustrating a configuration example of an endoscope system 1 of this embodiment.
- As shown in FIG. 1, the endoscope system 1 includes an insertion section 2b and a lumen structure calculation system 100.
- The insertion section 2b includes an imaging section 30.
- The lumen structure calculation system 100 includes an acquisition section 110, a lumen structure calculation section 120, and a dimension estimation section 130.
- The acquisition section 110 includes an actual dimension determination information acquisition section 112. That is, the endoscope system 1 of this embodiment includes the insertion section 2b, the imaging section 30, the acquisition section 110, the lumen structure calculation section 120, and the dimension estimation section 130.
- the configuration of the endoscope system 1 is not limited to that shown in FIG. 1, and various modifications such as addition of other components are possible, the details of which will be described later.
- FIG. 2 is a perspective view illustrating an example of the endoscope 2 of this embodiment.
- the endoscope 2 has an operation section 2a, a flexible insertion section 2b, and a universal cable 2c including signal lines and the like.
- The endoscope 2 is an insertion device whose tubular insertion portion 2b is inserted into a lumen.
- a connector is provided at the tip of the universal cable 2c, and the endoscope 2 is detachably connected to a light source device 4 or the like, which will be described later with reference to FIG. 6, by the connector.
- a light guide is inserted through the universal cable 2c, and the endoscope 2 emits illumination light from the light source device through the light guide from the distal end of the insertion portion 2b.
- The operator, for example a doctor such as an endoscopist, inserts the endoscope 2 into the large intestine of the patient Pa; however, the place into which the endoscope 2 is inserted is not limited to the large intestine.
- the insertion portion 2b has a distal end portion 11, a bendable bending portion 12, and a flexible tube portion 13 from the distal end to the proximal end of the insertion portion 2b.
- the insertion portion 2b is inserted into the lumen of the patient Pa who is a subject.
- the proximal end of the distal end portion 11 is connected to the distal end of the bending portion 12
- the proximal end of the bending portion 12 is connected to the distal end of the flexible tube portion 13 .
- a distal end portion 11 of the insertion portion 2b is the distal end portion of the endoscope 2 and is a hard distal end rigid portion.
- the bending portion 12 can be bent in a desired direction according to the operation of the bending operation member 14 provided on the operation portion 2a.
- the bending operation member 14 includes, for example, a horizontal bending operation knob 14a and a vertical bending operation knob 14b.
- the bending section 12 has a plurality of bending pieces connected along the longitudinal axis of the insertion section 2b. Therefore, the operator can observe the inside of the large intestine of the patient Pa by bending the bending section 12 in various directions while pushing the insertion section 2b into the lumen of the large intestine or pulling it out from the lumen.
- The horizontal bending operation knob 14a and the vertical bending operation knob 14b pull and loosen operation wires inserted through the insertion portion 2b in order to bend the bending portion 12.
- the bending operation member 14 further has a fixing knob 14c that fixes the position of the bending portion 12 that is bent.
- the operation portion 2a is also provided with various operation buttons such as a release button and an air/water supply button.
- the flexible tube portion 13 is flexible and bends according to external force.
- the flexible tube portion 13 is a tubular member extending from the operation portion 2a.
- a monocular imaging unit 30 is provided at the distal end portion 11 of the insertion portion 2b.
- The monocular imaging unit 30 is an imaging device that has one imaging optical system and captures an image of the subject without parallax. That is, instead of using two optical systems having parallax with each other, such as a stereo optical system, it forms a single image of the subject with a single optical system and captures the formed image with an image sensor.
- the imaging unit 30 captures an image of the observation site in the large intestine illuminated by the illumination light from the light source device 4 .
- the monocular imaging unit 30 is provided at the distal end portion 11 of the insertion portion 2b, and images the interior of the subject at a plurality of time points to obtain captured images at a plurality of time points.
- the position where the imaging unit 30 is provided is not limited to the distal end portion 11 of the insertion portion 2b.
- For example, the imaging section 30 may be provided at a position closer to the proximal end than the distal end portion 11, with light from the subject guided to it.
- The acquisition unit 110 acquires a captured image of the subject captured by the monocular imaging unit 30 provided in the insertion section 2b inserted into the lumen of the subject. Specifically, for example, an imaging signal obtained by the imaging unit 30 is transmitted to the image processing device 3 described later, and the captured image data obtained there is transmitted to the acquisition unit 110.
- the acquisition unit 110 may include an image processing module and may be configured to perform image processing based on the imaging signal obtained by the imaging unit 30, and various modifications are possible.
- the actual dimension determination information acquisition unit 112 is an interface that acquires actual dimension determination information.
- the actual dimension determination information is information that determines the actual dimension of at least a portion of the lumen.
- The actual size of at least part of the lumen means the actual size in the real space in which the lumen, i.e., the subject, exists; the actual dimension determination information is, for example, information based on data transmitted from a predetermined external sensor.
- the predetermined external sensor is, for example, the magnetic sensor 16 described later with reference to FIG. 6, but may be a position sensor or the like.
- The actual dimension determination information acquisition unit 112 may include an image processing module or the like and acquire the actual dimension determination information by performing processing for obtaining actual dimensions based on the captured image of the subject acquired by the imaging unit 30. The details of the method of acquiring the actual dimension determination information will be described later.
- the lumen structure calculation unit 120 calculates the three-dimensional structure of the lumen and three-dimensional structure dimension information based on the captured image and the actual dimension determination information transmitted from the acquisition unit 110 .
- the three-dimensional structure of the lumen is a three-dimensional structure model of the lumen constructed in virtual space based on the captured two-dimensional lumen image.
- The lumen structure calculation unit 120 constructs the three-dimensional structure of the lumen from the captured two-dimensional lumen images by a method described later. However, the dimensions of that structure are dimensions in the constructed virtual space and cannot, by themselves, be grasped as actual dimensions in real space. Therefore, the lumen structure calculation unit 120 of the present embodiment further calculates three-dimensional structure dimension information based on the captured image and the actual dimension determination information.
- The three-dimensional structure dimension information is information for determining the actual dimensions of at least part of the three-dimensional structure; in other words, it is information in which the dimensions of at least part of the three-dimensional structure in the virtual space have been converted into actual dimensions using the actual dimension determination information.
- The aforementioned actual dimension determination information is information that associates dimensions in the virtual space with actual dimensions in the real space, that is, information for converting dimensions in the virtual space into actual dimensions in the real space.
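The role of the actual dimension determination information can be sketched as a single scale factor: if the real length of some reference in the lumen is known (here a hypothetical sensor-derived value; the disclosure describes the actual sources later), every virtual-space length in the three-dimensional structure can be converted to a real dimension. This is an illustrative sketch, not the patented calculation:

```python
def real_scale_mm(reference_real_mm: float, reference_virtual: float) -> float:
    """Scale factor (mm per virtual unit) linking virtual and real space."""
    return reference_real_mm / reference_virtual

def to_real_mm(virtual_length: float, scale_mm: float) -> float:
    """Convert a length measured in the virtual 3D structure to millimetres."""
    return virtual_length * scale_mm

# Hypothetical values: a reference segment spans 2.0 units in the
# reconstructed model and 40.0 mm according to an external sensor.
scale = real_scale_mm(40.0, 2.0)   # 20.0 mm per virtual unit
print(to_real_mm(0.5, scale))      # a 0.5-unit feature corresponds to 10.0 mm
```

One reference of known real size thus fixes the scale of the entire reconstructed structure, which is why the information may cover an area wider than the specific part itself.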
- The actual dimension determination information may be information from which the actual size of the specific part 200 described later can be grasped, or information from which the actual size of an area wider than the area including the specific part 200 can be grasped.
- The specific part 200 is, for example, a lesion such as a cancer or a polyp, but it is not limited to a lesion as long as it is a site in the lumen that the operator wants to observe or whose actual size the operator wants to know.
- the dimension estimation unit 130 outputs specific part dimension information, which is the actual dimension of the specific part 200 in the three-dimensional structure, based on the three-dimensional structure dimension information. For example, the dimension estimator 130 outputs the dimension of the specific site to the monitor 150 based on the three-dimensional structure dimension information transmitted from the lumen structure calculator 120 .
- the dimension estimation unit 130 can be implemented by a display processor that displays various images on the monitor 150 or a display module that operates on the display processor.
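Given a three-dimensional structure whose scale is fixed, specific part dimension information can be estimated, for instance, as the distance between two measurement points placed on the part, converted with the model's scale. This is a minimal sketch with hypothetical coordinates and a hypothetical scale; the disclosure's measurement method is described with the later figures:

```python
import math

def specific_part_size_mm(p1, p2, scale_mm_per_unit: float) -> float:
    """Euclidean distance between two measurement points of the virtual
    3D model, converted to millimetres using the real-dimension scale."""
    return math.dist(p1, p2) * scale_mm_per_unit

# Hypothetical measurement points on opposite edges of a polyp, with a
# hypothetical scale of 20 mm per virtual unit.
a = (1.0, 2.0, 0.5)
b = (1.0, 2.3, 0.9)
print(round(specific_part_size_mm(a, b, 20.0), 1))  # 10.0 mm
```

A value like this is what could then be rendered on the monitor 150 alongside the lumen image.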
- Each part of the lumen structure calculation system 100 is configured with the following hardware.
- Each section of the lumen structure calculation system 100 specifically includes an acquisition section 110 , an actual dimension determination information acquisition section 112 , a lumen structure calculation section 120 and a dimension estimation section 130 .
- each section of the lumen structure calculation system 100 may include a specific site setting section 140 described later with reference to FIG. 4, or may include a feature point extraction section 122 and a three-dimensional position calculation section 124 described later with reference to FIG.
- the hardware may include circuitry for processing digital signals and/or circuitry for processing analog signals.
- the hardware may consist of one or more circuit devices or one or more circuit elements mounted on a circuit board.
- the one or more circuit devices are, for example, ICs (Integrated Circuits), FPGAs (field-programmable gate arrays), or the like.
- the one or more circuit elements are, for example, resistors, capacitors, and the like.
- The lumen structure calculation system 100 includes a memory that stores information and a processor that operates based on the information stored in the memory.
- the information is, for example, programs and various data.
- a processor includes hardware.
- Various processors such as a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and a DSP (Digital Signal Processor) can be used as the processor.
- the memory may be a semiconductor memory such as SRAM (Static Random Access Memory) or DRAM (Dynamic Random Access Memory), a register, or a magnetic storage device such as HDD (Hard Disk Drive).
- it may be an optical storage device such as an optical disc device.
- The memory stores computer-readable instructions, and when the instructions are executed by the processor, some or all of the functions of each section of the lumen structure calculation system 100 are realized as processes.
- the instruction here may be an instruction set that constitutes a program, or an instruction that instructs a hardware circuit of a processor to perform an operation.
- all or part of each part of the lumen structure calculation system 100 can be realized by cloud computing, and each process described later can be performed on cloud computing.
- each part of the lumen structure calculation system 100 of this embodiment may be implemented as a module of a program that runs on a processor.
- the acquisition unit 110 is implemented as an image acquisition module.
- the lumen structure calculation unit 120 is realized as an acquisition module for information required for calculation of the lumen structure and a calculation module that performs calculations based on the information.
- the dimension estimation unit 130 is implemented as a display processing module.
- the program that implements the processing performed by each unit of the lumen structure calculation system 100 of this embodiment can be stored in, for example, an information storage device that is a computer-readable medium.
- the information storage device can be implemented by, for example, an optical disc, memory card, HDD, semiconductor memory, or the like.
- a semiconductor memory is, for example, a ROM.
- the lumen structure calculation system 100 performs various processes of this embodiment based on programs stored in the information storage device. That is, the information storage device stores programs for causing a computer to function as each part of the lumen structure calculation system 100 .
- a computer is a device that includes an input device, a processing unit, a storage unit, and an output unit.
- the program according to the present embodiment is a program for causing a computer to execute each step described later with reference to FIG. 9 and the like.
- FIG. 3 is a display example of the monitor 150 when the endoscope system 1 of this embodiment is applied.
- the operator uses the endoscope system 1 to perform an endoscopic examination on the patient Pa lying supine on the bed, and the image captured by the endoscope 2 is displayed on the monitor 150.
- the monitor 150 displays a screen including a lumen image captured by the imaging section 30 provided in the insertion section 2b.
- Since the imaging unit 30 is monocular, the operator can perceive the lumen image only as a two-dimensional image. Furthermore, in the lumen image, objects that are closer look larger and objects that are farther away look smaller.
- Therefore, the size of the specific part 200 as judged by the operator from the impression of viewing the monitor 150 may differ greatly from the actual size of the specific part 200.
- In the present embodiment, however, the actual size of the specific part 200 can be measured, so it is possible to display a screen showing the actual size information of the specific part 200, as indicated by A2, for example.
- the operator and the patient Pa may be allowed to see the screen of the monitor 150 together. Thereby, the operator and the patient Pa can share detailed information about the specific region 200 with each other.
- For example, one monitor 150 may display one type of screen at a time, switched for each content, or a plurality of monitors 150 may be provided so that each content is displayed separately.
- the endoscope system 1 of the present embodiment includes the insertion section 2b, the imaging section 30, the actual dimension determination information acquisition section 112, the lumen structure calculation section 120, and the dimension estimation section 130.
- the insertion portion 2b is inserted into a lumen that is a subject.
- the imaging unit 30 is monocular, is provided in the insertion unit 2b, and acquires a captured image of a subject.
- the actual dimension determination information acquisition unit 112 acquires actual dimension determination information that is information for determining the actual dimension of at least part of the lumen.
- The lumen structure calculation unit 120 calculates, based on the captured image and the actual dimension determination information, the three-dimensional structure of the lumen and three-dimensional structure dimension information, which is information for determining the actual dimension of at least part of the three-dimensional structure.
- the dimension estimation unit 130 outputs specific part dimension information, which is the actual dimensions of the specific part 200 in the three-dimensional structure, based on the three-dimensional structure dimension information. By doing so, the size of the specific region 200 can be measured using an endoscope having a conventional monocular imaging system. Since the imaging unit 30 of the endoscope 2 is generally monocular, it can capture only a two-dimensional image of the target observation site.
- a compound-eye imaging unit 30 would increase the outer diameter of the insertion portion, so it could be used only in extremely limited applications. Therefore, an endoscope system that uses a monocular optical system and acquires three-dimensional information of an observation site has not been proposed so far.
- the endoscope system 1 capable of measuring the size of the specific site 200 can be used for a wider range of applications.
- the burden on doctors can be reduced.
- the method of the present embodiment may be implemented as the lumen structure calculation system 100. That is, the lumen structure calculation system 100 of this embodiment includes an acquisition unit 110, a lumen structure calculation unit 120, and a dimension estimation unit 130.
- the acquisition unit 110 acquires a captured image of the subject obtained by the monocular imaging unit 30 provided in the insertion unit 2b inserted into the lumen serving as the subject, and actual dimension determination information, which is information for determining the actual size of at least part of the lumen.
- the lumen structure calculation unit 120 calculates the three-dimensional structure of the lumen and three-dimensional structural dimension information that is information for determining the actual dimension of at least a part of the three-dimensional structure.
- the dimension estimation unit 130 outputs specific part dimension information, which is the actual dimensions of the specific part 200 in the three-dimensional structure, based on the three-dimensional structure dimension information. By doing so, an effect similar to that described above can be obtained.
- the method of the present embodiment may be implemented as a method of creating lumen structure information. That is, the method of creating lumen structure information according to the present embodiment includes obtaining a captured image of a subject obtained by the monocular imaging unit 30 provided in the insertion section 2b inserted into the lumen of the subject. Further, the method includes obtaining actual dimension determination information, which is information for determining the actual dimension of at least part of the lumen. In addition, the method includes calculating, based on the captured image and the actual dimension determination information, the three-dimensional structure of the lumen and three-dimensional structure dimension information, which is information for determining the actual dimensions of at least a part of the three-dimensional structure.
- the method of creating lumen structure information of this embodiment includes outputting specific portion dimension information, which is the actual size of the specific portion 200 in the three-dimensional structure, based on the three-dimensional structure dimension information. By doing so, an effect similar to that described above can be obtained.
- a specific part structure 200A which is a three-dimensional structural model of the specific part 200, may be further output to the monitor 150. That is, the dimension estimation unit 130 performs processing for outputting the specific part structure 200A based on the three-dimensional structure of the lumen and the specific part dimension information. That is, in the endoscope system 1 of the present embodiment, the dimension estimating section 130 outputs the specific region dimension information and the information related to the three-dimensional structure of the specific region in association with each other. By doing so, the shape and size of the specific part 200 can be grasped, so that the user can grasp the details of the specific part 200 more accurately.
- the user here is the operator, but may include the patient Pa. For example, as shown in FIG. 3, when both the operator who operates the endoscope 2 and the patient Pa on the bed see the size information of the specific region 200 displayed on the monitor 150, the operator and the patient Pa can accurately match their recognition of the specific region 200.
- it is sufficient that the actual dimension determination information enables the actual dimension of the specific part 200 to be grasped, but the actual dimension determination information may also be information from which the actual dimensions of the entire three-dimensional structure of the lumen portion obtained by the imaging unit 30 can be determined. That is, the lumen structure calculator 120 of the endoscope system 1 of the present embodiment calculates three-dimensional structure information in which actual dimensions are set based on the actual dimension determination information.
- the situation regarding the specific portion 200 can be grasped in more detail. For example, since the size of the lesion, which is the specific site 200, can be compared with the size of the observed lumen portion, the operator can more accurately grasp the severity of the lesion.
- the dimension estimating unit 130 extracts a predetermined region including the specific part structure 200A from the three-dimensional structure of the lumen, and outputs the predetermined region and the specific part dimension information to the monitor 150.
- the displays shown in A2 and A3 in FIG. 3 can be realized.
- the configuration of the endoscope system 1 of this embodiment is not limited to that shown in FIG.
- the endoscope system 1 of this embodiment may further include a specific site setting section 140 as shown in FIG. That is, the specific part setting section 140 of the endoscope system 1 of this embodiment sets the specific part 200 based on the captured image transmitted from the acquisition section 110 .
- the specific part setting unit 140 can be realized by a processor or an image processing module similar to the acquisition unit 110, the actual size determination information acquisition unit 112, the lumen structure calculation unit 120, and the dimension estimation unit 130.
- FIG. 5 is a display example of the monitor 150 for explaining the specific part setting unit 140.
- in FIG. 5, assume that the operator performs an endoscopy on the patient Pa and finds a suspected lesion in the captured image. The operator makes a selection to set the range of the specific part 200 on the screen indicated by B1 on the monitor 150. Specifically, for example, the operator selects whether to set the specific region 200 in the manual mode or in the automatic mode. That is, in the endoscope system 1 of the present embodiment, the specific part setting section 140 has a first setting mode and a second setting mode in which the method of setting the specific part 200 differs from that in the first setting mode.
- the operator selects one setting mode from a plurality of setting modes in which the setting methods of the specific part 200 are different from each other.
- the specific site setting section 140 can select from a plurality of setting modes. By doing so, the specific portion 200 can be appropriately set. It should be noted that there may be a plurality of types of manual mode and automatic mode.
- the specific part range display Ex surrounding the specific part 200 is displayed, for example, as indicated by B3.
- the specific part setting unit 140 generates information specifying the range of the lesion, which is the specific part 200, and transmits the information to the dimension estimating unit 130.
- the dimension estimating unit 130 displays the result on the monitor 150, so that the display shown in B3 can be realized. That is, in the endoscope system 1 of the present embodiment, the specific part setting unit 140 presents the specific part on the captured image so that the user can visually recognize it.
- the user here is, for example, an operator. By doing so, the boundary between the specific portion 200 and the portion other than the specific portion 200 is clarified, so the size of the specific portion 200 can be clarified.
- the setting of the specific part 200 in the manual mode is performed, for example, by the operator operating an input unit (not shown) to display the specific part range display Ex. That is, in the manual mode, each operator empirically judges the boundary between the specific site 200 and the area other than the specific site 200 based on the color, brightness, smoothness, shape, and the like of the observed lumen, and sets the specific part range display Ex accordingly.
- in the setting of the specific part 200 in the automatic mode, for example, the boundary between the specific part 200 and the area other than the specific part 200 is determined by inference processing based on a learned model, which includes a program in which an inference algorithm is described and the parameters used by that inference algorithm.
- the specific part setting unit 140 can automatically generate the information of the specific part range display Ex, and the dimension estimating unit 130 can automatically display the specific part range display Ex on the monitor 150 . That is, the specific part setting unit 140 includes a memory (not shown) in which the learned model is stored, and when the operator selects the automatic mode, the learned model is read from the memory.
- a learned model is generated by a learning device (not shown) that exists outside the endoscope system 1 , but the learning device may be included in the endoscope system 1 .
- the learned model is updated by, for example, inputting the captured image including the specific part 200 and the data of the specific part range display Ex determined in the manual mode to the learning device for each case.
- a neural network can be adopted as an inference algorithm.
- the weighting factors of the connections between nodes in the neural network are the parameters.
- a neural network consists of an input layer that receives image data, an intermediate layer that performs arithmetic processing on the data input through the input layer, and an output layer that outputs recognition results based on the operation results output from the intermediate layer.
- a convolutional neural network (CNN) is suitable as a neural network used for image recognition processing, but other neural network techniques may be employed.
- the inference algorithm is not limited to neural networks, and various machine learning techniques used for image recognition can be adopted. Since these techniques are publicly known, the description thereof will be omitted.
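As an illustrative sketch, not part of the disclosed embodiment, the arithmetic processing of a CNN intermediate layer described above (a convolution followed by an activation) can be written in a few lines of Python; the kernel values here stand in for learned parameters:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D convolution: the core arithmetic of a CNN intermediate layer."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = np.sum(image[y:y + kh, x:x + kw] * kernel)
    return out

def relu(x):
    """Typical activation applied after the convolution."""
    return np.maximum(x, 0.0)

# Toy "captured image" patch with a vertical edge, and a hand-written
# edge-detecting kernel standing in for learned parameters.
patch = np.array([[0., 0., 1., 1.],
                  [0., 0., 1., 1.],
                  [0., 0., 1., 1.],
                  [0., 0., 1., 1.]])
kernel = np.array([[-1., 1.],
                   [-1., 1.]])
feature_map = relu(conv2d(patch, kernel))  # responds only along the edge
```

In an actual learned model, many such kernels would be stacked in successive layers and their values learned from annotated captured images, rather than written by hand.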
- the specific part setting unit 140 has a discriminator that automatically sets the specific part 200 .
- the range of the specific part 200 is automatically set without depending on the operator's judgment, so that the setting error of the specific part range display Ex can be reduced.
- the burden on the operator can be reduced.
- the plurality of pieces of color information may be separated into first color information and second color information different from the first color information, and the first color information may be used as a parameter of the inference algorithm. More specifically, for example, when it is known that the color of the lumen is close to red and the lesion that is the specific part 200 is close to blue, blue is used as the first color information, that is, as a parameter for inferring the area of the specific part 200. Similarly, red is used as the second color information, that is, as a parameter for inferring regions other than the specific part 200.
- the captured image can be separated into a plurality of pieces of color information, and the specific part setting section 140 uses the first color information included in the plurality of pieces of color information to set the specific part.
- the boundary between the specific part 200 and the part other than the specific part 200 can be inferred more accurately, so the specific part range display Ex in the automatic mode can be displayed more accurately.
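The first/second color information separation described above can be sketched as follows. The channel assignments (blue as first color information, red as second) follow the example in the text, while the simple dominance rule used to propose candidate pixels is a hypothetical illustration, not the patent's inference algorithm:

```python
import numpy as np

def separate_color_info(rgb):
    """Split an RGB captured image into first color information (blue channel)
    and second color information (red channel), per the example in the text."""
    first = rgb[..., 2].astype(float)   # blue: used to infer the specific part
    second = rgb[..., 0].astype(float)  # red: used to infer everything else
    return first, second

def candidate_mask(first, second):
    """Mark pixels where the first (blue) information dominates the second (red).
    An illustrative stand-in for the learned-model inference."""
    return first > second

# Toy 2x2 image: one bluish (lesion-like) pixel, three reddish (lumen) pixels
rgb = np.array([[[200, 40, 60], [60, 40, 200]],
                [[180, 50, 70], [190, 45, 65]]], dtype=np.uint8)
first, second = separate_color_info(rgb)
mask = candidate_mask(first, second)
```

A real discriminator would feed the separated channels into the trained model rather than threshold them directly.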
- when the specific part range display Ex is displayed, as shown in B4, it is displayed that the range of the specific part 200 has been set, and the operator is requested to confirm whether the specific part range display Ex is properly displayed. For example, when the operator determines that the specific part range display Ex is appropriately displayed, the operator performs an input operation to that effect. When the specific part range display Ex is not properly displayed, the operator can choose to set the range of the specific part 200 again.
- a case where the specific part range display Ex is not properly displayed is, for example, a case where the area indicated by the specific part range display Ex appears at a position significantly different from the area that the operator judges on the monitor 150 to be the specific part 200.
- the operator can set the range of the specific part 200 again in another setting mode for the same image.
- that is, the specific part setting unit 140 can redo the setting of the specific part 200 in the second setting mode for the same captured image as the captured image in which the specific part 200 was set in the first setting mode.
- the operator can also select to set the range of the specific part 200 again using another image. This is because, if the aspect of the display of the specific part 200 changes by performing imaging again, the setting result of the specific part 200 in the automatic mode may change.
- that is, the specific part setting unit 140 can redo the setting of the specific part 200 on a second captured image different from the first captured image.
- the second captured image is a captured image in which the same specific site 200 as the specific site 200 captured in the first captured image is captured.
- the specific part setting section 140 can redo the setting of the specific part 200 in a different setting mode for the same image or in a different image. By doing so, the range of the specific part 200 can be set more appropriately.
- the configuration of the endoscope system 1 of this embodiment is not limited to the above.
- a configuration including the magnetic sensor 16 and the magnetic field generator 7 may be employed.
- the image processing device 3 is a video processor that performs predetermined image processing on the received imaging signal to generate a captured image.
- a video signal of the generated captured image is output from the image processing device 3 to the monitor 150 , and the live captured image is displayed on the monitor 150 .
- the operator can observe the inside of the large intestine of the patient Pa when the distal end portion 11 of the insertion portion 2b is inserted through the anus of the patient Pa.
- a magnetic sensor 16 is arranged at the distal end portion 11 of the insertion portion 2b.
- the magnetic sensor 16 is a detection device that is arranged near the imaging section 30 of the distal end portion 11 and detects the position and orientation of the viewpoint of the imaging section 30 .
- the magnetic sensor 16 has two coils 16a and 16b, for example, as shown in FIG. Two central axes of the two cylindrical coils 16a and 16b are orthogonal to each other. Therefore, the magnetic sensor 16 is a 6-axis sensor and detects the position coordinates and orientation of the distal end portion 11 . Orientation here refers to Euler angles.
- the magnetic sensor 16 is not limited to the example shown in FIG. 7, and modifications such as changing the combination and arrangement of the coils 16a and 16b are possible.
- although the performance and the number of detection axes may be limited, the angle at which the two coils 16a and 16b intersect may also be changed.
- the magnetic sensor 16 may be composed of one coil 16a.
- a signal line 2 e of the magnetic sensor 16 extends from the endoscope 2 and is connected to the lumen structure detection device 5 .
- the magnetic field generator 7 generates a predetermined magnetic field.
- the magnetic sensor 16 detects the magnetic field generated by the magnetic field generator 7 .
- the magnetic field generator 7 is connected to the lumen structure detector 5 via a signal line 7a.
- a detection signal of the magnetic field is supplied from the endoscope 2 to the lumen structure detection device 5 via the signal line 2e.
- alternatively, a magnetic field generating element may be provided at the distal end portion 11 instead of the magnetic sensor 16, and a magnetic sensor may be provided outside the patient Pa instead of the magnetic field generator 7, to detect the position and posture of the distal end portion 11.
- the magnetic sensor 16 detects the position and orientation of the distal end portion 11, in other words, the position and orientation of the viewpoint of the captured image acquired by the imaging section 30 in real time.
- the light source device 4 is a light source device capable of emitting normal light for the normal light observation mode.
- alternatively, the light source device 4 may selectively emit normal light for the normal light observation mode and special light for the special light observation mode.
- the light source device 4 emits either normal light or special light as illumination light in accordance with the state of a changeover switch provided in the image processing device 3 for switching observation modes.
- the lumen structure detection device 5 includes a processor 51 , a storage device 52 , an interface 53 , an image acquisition section 54 , a position/orientation detection section 55 and a drive circuit 56 . Each part of the lumen structure detection device 5 is connected to each other by a bus 58 .
- the processor 51 is a control unit that has a CPU and a memory and controls the processing of each unit within the lumen structure detection device 5 .
- the memory is a storage unit including ROM, RAM, and the like. Various processing programs executed by the CPU and various data are stored in the ROM. The CPU can read and execute various programs stored in the ROM and storage device 52 .
- the storage device 52 stores a lumen structure calculation program.
- the lumen structure calculation program is a software program for calculating lumen structure information from the position and orientation information of the distal end portion 11 and the captured image.
- by the CPU reading out and executing the lumen structure calculation program, the processor 51 constitutes a lumen structure calculation unit that calculates the three-dimensional structure of the lumen based on the captured image obtained by the imaging unit 30 and the three-dimensional arrangement of the distal end portion 11 detected by the magnetic sensor 16.
- the interface 53 outputs lumen structure information calculated by the processor 51 to the lumen structure calculation system 100 .
- the interface 53 is a communication interface that communicates with the lumen structure calculation system 100, for example.
- the image capturing unit 54 is a processing unit that captures the captured images obtained by the image processing device 3 at regular intervals. For example, the image capturing unit 54 acquires 30 captured images per second from the image processing device 3, matching the frame rate of the endoscope 2. The captured images may instead be captured at a cycle longer than the frame rate; for example, the image capturing unit 54 may capture three captured images per second.
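The sampling behavior of the image capturing unit 54 (keeping frames at a period equal to or longer than the frame interval) can be sketched as follows; the function name and interface are illustrative, not taken from the patent:

```python
def captured_frame_indices(frame_rate_hz, period_s, duration_s):
    """Indices of the frames the image capturing unit would keep when sampling
    a frame_rate_hz stream once every period_s seconds, for duration_s seconds."""
    step = int(round(frame_rate_hz * period_s))  # frames to skip between captures
    total = int(frame_rate_hz * duration_s)
    return list(range(0, total, step))

# A 30 fps stream sampled every 0.5 s for 2 s keeps every 15th frame
indices = captured_frame_indices(30, 0.5, 2.0)
```

With period_s equal to the frame interval (1/30 s), every frame is kept, matching the 30-images-per-second case in the text.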
- the position/orientation detection unit 55 controls the drive circuit 56 that drives the magnetic field generator 7 to cause the magnetic field generator 7 to generate a predetermined magnetic field.
- the position/orientation detection unit 55 detects the magnetic field with the magnetic sensor 16, and generates, from the detection signal of the detected magnetic field, data of the position coordinates (x, y, z) and the orientation (vx, vy, vz) of the imaging unit 30. Orientation refers to Euler angles. That is, the position/orientation detection unit 55 is a detection device that detects the position and orientation of the imaging unit 30 based on the detection signal from the magnetic sensor 16. More specifically, the position/orientation detection unit 55 detects three-dimensional arrangement time change information, which is information on changes in the three-dimensional arrangement over time. Accordingly, the position/orientation detection unit 55 acquires three-dimensional arrangement information of the insertion portion 2b at a plurality of times.
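Since the orientation is given as Euler angles, converting it into a viewing direction for the imaging unit requires fixing a rotation convention. The sketch below assumes a Z-Y-X (yaw-pitch-roll) convention and a +z optical axis; both are assumptions made for illustration, as the patent does not specify them:

```python
import numpy as np

def euler_to_rotation(roll, pitch, yaw):
    """Z-Y-X Euler angles to a 3x3 rotation matrix (one common convention;
    the patent does not fix a particular one)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def viewing_direction(roll, pitch, yaw):
    """Direction the imaging unit faces, taking +z as the optical axis."""
    return euler_to_rotation(roll, pitch, yaw) @ np.array([0.0, 0.0, 1.0])

d = viewing_direction(0.0, 0.0, 0.0)  # unrotated tip looks along +z
```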
- FIG. 8 is a block diagram showing another configuration example of the endoscope system 1.
- the lumen structure calculation unit 120 includes a feature point extraction unit 122 that extracts a plurality of feature points F in each captured image, and a three-dimensional position calculation unit 124 that calculates the position of each feature point F in three-dimensional space based on those feature points.
- the feature point extraction unit 122 and the three-dimensional position calculation unit 124 can be realized by a processor equivalent to the processor 51 in FIG. 6 and program modules operating on that processor. In FIG. 8, illustration of the image processing device 3, the light source device 4, the magnetic field generator 7, and the like is omitted.
- although the following description of FIG. 9 assumes that each unit of the endoscope system 1 shown in FIG. 6 performs the processing, it can also be realized by the endoscope system 1 shown in FIG. 8.
- each process of the flowcharts of FIGS. 9 and 11 can be implemented by the processor 51 as described below, but it can also be realized by a processor constituting the lumen structure calculation unit 120, the feature point extraction unit 122, the three-dimensional position calculation unit 124, and the like, performing the processing as appropriate.
- FIG. 9 is a flow chart showing an example of the flow of lumen structure calculation processing.
- the operator places the distal end portion 11 of the insertion portion 2b at a predetermined position and performs a predetermined operation on an input device (not shown). For example, when the lumen is the large intestine as shown in FIG. 10, the predetermined position is the anus indicated by C1.
- the processor 51 sets the position and orientation data from the position and orientation detection unit 55 as the reference position and reference orientation of the distal end portion 11 when calculating the lumen structure (step S1).
- the operator sets the reference position and the reference orientation of the distal end portion 11 at a predetermined position in the three-dimensional space as initial values while the distal end portion 11 is applied to a predetermined position.
- the lumen structure calculated by the following processing is calculated based on the reference position and reference posture set here.
- the specific position is, for example, the deepest part of the large intestine indicated by C2 in FIG. 10. From the state where the distal end portion 11 of the insertion portion 2b is at the specific position indicated by C2, air is supplied to expand the inside of the large intestine, and the operator pulls the insertion portion 2b to move it toward the predetermined position indicated by C1.
- the inner wall of the large intestine is observed by bending the bending portion 12 in various directions while stopping the withdrawal of the insertion portion 2b in the middle.
- the lumen structure of the large intestine is calculated while the operator observes the inner wall of the large intestine.
- the image capturing unit 54 acquires captured images at predetermined intervals ⁇ t from captured images supplied from the image processing device 3 every 1/30 second (step S2).
- the period ⁇ t is, for example, 0.5 seconds.
- the CPU acquires the position and orientation information of the distal end portion 11 output from the position and orientation detection section 55 when the captured image was acquired (step S3).
- the processor 51 calculates position information in three-dimensional space of a plurality of feature points F and the like in the one captured image acquired in step S2 and one or more captured images acquired before it (step S4).
- the set of calculated position information of the plurality of feature points F and the like becomes the information on the lumen structure.
- the position information of each feature point F can be calculated from the image information using methods such as SLAM (Simultaneous Localization and Mapping) or SfM (Structure from Motion), or can be calculated using the principle of triangulation. A method of calculating the position of each feature point F will be described later.
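The principle of triangulation referred to here can be illustrated with a minimal linear (DLT) triangulation of one feature point from two viewpoints. The projection matrices below are synthetic stand-ins for poses that, in the embodiment, would come from the magnetic sensor:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two pixel observations
    x1, x2 and the 3x4 projection matrices P1, P2 of the two viewpoints."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                       # null vector of A (homogeneous point)
    return X[:3] / X[3]

def project(P, X):
    """Perspective projection of a 3D point with projection matrix P."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Two tip poses: identity, and a 1-unit translation of the camera along x
K = np.eye(3)
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.5, 0.2, 4.0])   # a feature point on the lumen wall
X_est = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
```

With noiseless observations the estimate recovers the true point exactly; in practice the result seeds the bundle adjustment described later.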
- when the first captured image is acquired, there is no previously acquired captured image, so the process of step S4 is not executed until a predetermined number of captured images has been acquired.
- the processor 51 creates or updates lumen structure information by adding position information such as the calculated plurality of feature points F (step S5).
- the set of one or more feature points F in the region observed by the endoscope 2 constitutes the lumen structure information created in step S5.
- the lumen structure information is 3D data.
- FIG. 10 represents an image of lumen structure information viewed from a given viewpoint. For example, when displaying lumen structure information, the user can confirm the structure of the lumen when viewed from a desired direction of 360 degrees by inputting an instruction to change the viewpoint position.
- FIG. 10 illustrates the lumen structure information that takes into consideration even the unevenness of the lumen, but the lumen structure information may be more simplified information.
- lumen structure information may be a cylindrical model.
- by simplifying the lumen structure in this way, it is possible to reduce the processing load. In a configuration using a sensor such as the magnetic sensor 16 as well, the effect of reducing the amount of calculation by making the shape of the lumen cylindrical is large.
- as a method of simplification, for example, only straight lumens without bends or lumens with simple bends may be assumed, or structural models that differ from a standard lumen structure only in size, such as the length and diameter of each part, may be used.
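One possible reading of the simplified cylindrical model is a least-squares fit of a cylinder to the computed feature points. The sketch below assumes the lumen axis is aligned with the z axis and uses a Kåsa circle fit in the x-y plane; both the alignment assumption and the fitting method are illustrative choices, not the patent's method:

```python
import numpy as np

def fit_cylinder_z(points):
    """Kåsa least-squares circle fit in the x-y plane, assuming the lumen axis
    is (approximately) the z axis -- a deliberately simplified model."""
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
    b = x**2 + y**2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(c + cx**2 + cy**2)
    return (cx, cy), r

# Synthetic feature points on a lumen wall of radius 15 centred at (2, -1)
theta = np.linspace(0, 2 * np.pi, 40, endpoint=False)
z = np.linspace(0, 100, 40)
pts = np.column_stack([2 + 15 * np.cos(theta), -1 + 15 * np.sin(theta), z])
center, radius = fit_cylinder_z(pts)
```

Fitting only a center and radius per segment is far cheaper than maintaining a full 3D mesh, which is the processing-load reduction the text refers to.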
- the interface 53 of the lumen structure detection device 5 outputs the generated lumen structure information to the lumen structure calculation system 100 (step S6). Further, in step S6, the interface 53 may perform control to display the lumen structure information on the monitor 150.
- the processor 51 determines whether or not the insertion portion 2b has been removed from the patient (step S7). For example, when the user withdraws the insertion portion 2b, the user performs user input indicating the end of observation using an input device (not shown). The processor 51 makes the determination shown in S7 based on the user input. If the removal has not been performed (No in step S7), the process returns to step S2.
- the lumen structure calculation unit 120 calculates the three-dimensional structure of the lumen from the position of each feature point F in the calculated three-dimensional space. By doing so, it is possible to generate a three-dimensional structure based on the captured image.
- the processor 51 may use methods such as SLAM and SfM to calculate the positions of feature points F on a plurality of consecutive images.
- in generating the lumen structure information, it is possible to apply bundle adjustment, which optimizes the intrinsic parameters, the extrinsic parameters, and the world-coordinate point cloud from the images using the nonlinear least squares method. For example, using each estimated parameter, the world coordinate points of the plurality of extracted feature points F are subjected to perspective projection transformation, and each parameter and each world-coordinate point group are obtained so that the reprojection error is minimized.
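The perspective projection transformation and the reprojection error it produces can be sketched as follows; the intrinsic matrix values are arbitrary illustrative numbers, not parameters from the embodiment:

```python
import numpy as np

def project(K, R, t, X):
    """Perspective projection of world points X (N,3) with intrinsic
    parameters K and extrinsic parameters R, t of the tip."""
    Xc = (R @ X.T).T + t
    x = (K @ Xc.T).T
    return x[:, :2] / x[:, 2:3]

def reprojection_error(K, R, t, X, observed):
    """Mean squared pixel distance between projected world points and observed
    feature points -- the quantity bundle adjustment minimises."""
    diff = project(K, R, t, X) - observed
    return np.mean(np.sum(diff**2, axis=1))

K = np.array([[500., 0., 320.], [0., 500., 240.], [0., 0., 1.]])
R, t = np.eye(3), np.zeros(3)
X = np.array([[0.0, 0.0, 2.0], [0.1, -0.05, 2.5]])
obs = project(K, R, t, X)                                   # perfect observations
E0 = reprojection_error(K, R, t, X, obs)                    # zero at the optimum
E1 = reprojection_error(K, R, t + np.array([0.01, 0, 0]), X, obs)  # pose perturbed
```

Bundle adjustment searches over R, t, and X (and optionally K) so that this error is minimized across all captured images simultaneously.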
- the extrinsic parameters for the tip 11 are calculated by solving the five-point and eight-point algorithms.
- the position of the feature point F is calculated according to the position of the tip 11 and the triangulation method.
- the error E between the coordinates of the 3D points projected onto the image plane and the corresponding feature points F, that is, the reprojection error, is expressed by the following equation (1):
- E = (1/L) Σ_{i=1}^{L} ||Psi − Pi||²  ... (1)
- here, L is the number of feature points F on the K images, Psi is the coordinate position on the image plane of the 3D point estimated from the triangulation and the parameters of the tip 11, and Pi is the coordinate position of the corresponding feature point F on the image.
- the position coordinates of the tip 11 are calculated using the LM (Levenberg-Marquardt) method so as to minimize the error E of equation (1).
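A minimal Levenberg-Marquardt loop of the kind referred to here, using a numeric Jacobian and a toy observation model, might look like the following. It is a sketch of the optimization scheme under simplified assumptions (known feature points, unknown tip translation only), not the patent's implementation:

```python
import numpy as np

def lm_minimize(residual_fn, p0, iters=50, lam=1e-3):
    """Minimal Levenberg-Marquardt loop with a numeric Jacobian."""
    p = np.asarray(p0, dtype=float)
    for _ in range(iters):
        r = residual_fn(p)
        J = np.zeros((len(r), len(p)))
        eps = 1e-6
        for j in range(len(p)):
            dp = np.zeros_like(p); dp[j] = eps
            J[:, j] = (residual_fn(p + dp) - r) / eps   # forward differences
        H = J.T @ J + lam * np.eye(len(p))              # damped normal equations
        step = np.linalg.solve(H, -J.T @ r)
        if np.sum(residual_fn(p + step)**2) < np.sum(r**2):
            p, lam = p + step, lam * 0.5                # accept: trust model more
        else:
            lam *= 10.0                                 # reject: damp harder
    return p

# Toy problem: recover an unknown tip translation from point observations
X = np.array([[0.0, 0.0, 2.0], [0.3, -0.1, 2.5], [-0.2, 0.2, 3.0]])
t_true = np.array([0.05, -0.02, 0.1])

def residuals(t):
    proj = (X + t) / (X + t)[:, 2:3]           # normalised-camera projection
    target = (X + t_true) / (X + t_true)[:, 2:3]
    return (proj[:, :2] - target[:, :2]).ravel()

t_est = lm_minimize(residuals, np.zeros(3))
```

The damping parameter interpolates between gradient descent (large lam) and Gauss-Newton (small lam), which is what makes LM robust for reprojection-error problems like equation (1).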
- the lumen structure calculator 120 may use the second color information, which is included in the plurality of pieces of color information and does not include the first color information, to calculate the three-dimensional structure and the three-dimensional structure dimension information. By doing so, the three-dimensional structure and the like are generated based on an image having a color close to that of the lumen to be observed, so the accuracy of estimating the three-dimensional structure can be improved.
- FIG. 11 is a flowchart of a method of calculating the position of each feature point F in the three-dimensional space by bundle adjustment.
- the processor 51 sets the time t to t0 when the predetermined position is set as the initial position, and sets the count value n of the software counter to 0 (step S11).
- the processor 51 acquires the captured image at time t0 and information on the position and orientation of the distal end portion 11 (step S12).
- a captured image is obtained from the image processing device 3 .
- Information on the position and orientation of the distal end portion 11 is obtained from the position and orientation detection section 55 .
- the processor 51 determines the position and orientation of the distal end portion 11 at the initial position (step S13). For example, the predetermined position (x, y, z) is determined to be (0, 0, 0), and the orientation (vx, vy, vz) is determined to be (0, 1, 0). Steps S11 and S13 correspond to step S1 in FIG.
- the processor 51 acquires the captured image at the time (t0+n ⁇ t) and information on the position and orientation of the distal end portion 11 (step S14). Steps S12 and S14 correspond to step S2 in FIG. Information on the position and orientation of the distal end portion 11 may be corrected. For example, using a Kalman filter, the path that the tip 11 has traveled in the past is corrected, and the past position of the tip 11 is corrected based on the corrected path.
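The Kalman-filter correction mentioned here can be illustrated with a one-dimensional constant-velocity filter applied to one coordinate of the tip path; the process and measurement noise values are assumed for illustration, not taken from the patent:

```python
import numpy as np

def kalman_1d(measurements, dt=0.5, q=1e-3, r=0.05):
    """Constant-velocity Kalman filter over one coordinate of the tip-11 path.
    q: process noise, r: measurement noise (illustrative values)."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
    H = np.array([[1.0, 0.0]])              # we observe position only
    Q = q * np.eye(2)
    R = np.array([[r]])
    x = np.array([measurements[0], 0.0])    # state: [position, velocity]
    P = np.eye(2)
    filtered = []
    for z in measurements:
        x = F @ x                           # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R                 # update with the new measurement
        Kg = P @ H.T @ np.linalg.inv(S)
        x = x + Kg @ (np.array([z]) - H @ x)
        P = (np.eye(2) - Kg @ H) @ P
        filtered.append(x[0])
    return np.array(filtered)

noisy = [0.0, 0.52, 0.98, 1.49, 2.03, 2.51]   # tip x-positions at 0.5 s steps
smoothed = kalman_1d(noisy)
```

In the embodiment this kind of smoothing would be applied to the sensor-derived path, and past tip positions corrected from the smoothed trajectory.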
- the processor 51 extracts a plurality of feature points F in each captured image, and given the position and orientation of the tip 11 at k points in time, that is, the three-dimensional arrangement of the tip 11, calculates the positions of the m feature points F included in the obtained captured images by the bundle adjustment described above (step S15). Accordingly, the process of extracting a plurality of feature points F from each endoscopic image in step S15 constitutes a feature point extraction unit that extracts a plurality of feature points F in each captured image.
- in step S15, feature points F that appear in common in the captured images at a plurality of time points are extracted.
- in step S15, the position of each feature point F in three-dimensional space is calculated based on the positions of the extracted feature points F in the captured images and the three-dimensional arrangement of the insertion portion 2b; this processing constitutes a three-dimensional position calculation unit that calculates positions in three-dimensional space. More specifically, the position of each feature point F in three-dimensional space is determined by bundle adjustment.
- the feature point extracting unit 122 extracts feature points commonly appearing in captured images at a plurality of time points.
- the three-dimensional position calculation unit 124 acquires three-dimensional arrangement information of the insertion portion 2b at a plurality of time points based on the output of the magnetic sensor 16, which is a position sensor that extracts at least part of the information on the position and orientation of the imaging unit 30.
- the three-dimensional position calculation unit 124 calculates the position of each feature point F in three-dimensional space based on the three-dimensional arrangement information of the insertion portion 2b at the plurality of time points and the positions, on the captured images, of the feature points F commonly shown in the captured images at those time points.
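The calculation of a feature point's 3-D position from sensor-supplied poses can be sketched with two-view midpoint triangulation. This is an illustrative simplification under stated assumptions (two views, known camera centers and viewing rays; all names are hypothetical); the embodiment described above instead bundle-adjusts over many views.

```python
import numpy as np

def triangulate_midpoint(c1, d1, c2, d2):
    """Triangulate a 3-D feature point from two camera centers c1, c2 and
    viewing rays d1, d2 (poses assumed known from the position sensor).

    Returns the midpoint of the shortest segment between the two rays.
    Assumes the rays are not parallel (denominator would vanish).
    """
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    w0 = c1 - c2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    s = (b * e - c * d) / denom       # parameter along ray 1
    t = (a * e - b * d) / denom       # parameter along ray 2
    p1 = c1 + s * d1                  # closest point on ray 1
    p2 = c2 + t * d2                  # closest point on ray 2
    return (p1 + p2) / 2
```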
- FIG. 12 is a schematic diagram for explaining the relationship between feature points F on a plurality of consecutively acquired captured images and the position and orientation of the distal end portion 11.
- the white triangle Pw indicates the actual position and orientation of the tip 11
- the black triangle Pb indicates the estimated position and orientation of the tip 11.
- the tip 11 has actually moved along the solid line.
- the estimated position of the tip 11 moves along the dotted line. With the lapse of time, the position of the distal end portion 11 moves and its posture changes.
- the white square pw indicates the actual position of the feature point F
- the black square pb indicates the estimated, i.e., calculated, position of the feature point F.
- the feature point F is, for example, a portion that has a characteristic shape and color in the captured image and that can be easily identified or tracked.
- the coordinates of a plurality of feature points F on the inner wall of the intestinal canal of the large intestine are obtained, and a three-dimensional model is generated from the set of obtained coordinates or by connecting these coordinates. That is, the three-dimensional structure of the lumen is determined from the calculated position of each feature point F in three-dimensional space.
- the information on the position and orientation of the tip 11 at each point in time includes information for six axes, so the information on the position and orientation of the tip 11 at k points in time includes 6k pieces of information. Since the position of each feature point F includes information for three axes, information on the positions of m feature points F includes 3m pieces of information. Therefore, when using methods such as SLAM and SfM, the number of parameters to be determined is (6k+3m).
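The (6k+3m) parameter count above can be captured in a small helper; the `poses_known` flag mirrors the case, described next, in which the magnetic sensor 16 supplies the 6k pose parameters. Names are illustrative, not from the disclosure.

```python
def slam_parameter_count(k_poses, m_points, poses_known=False):
    """Number of unknowns in the bundle adjustment described above:
    6 per camera pose (3 position + 3 orientation) and 3 per feature point.
    If a position sensor supplies the poses, only the 3m point
    coordinates remain to be optimized."""
    pose_params = 0 if poses_known else 6 * k_poses
    return pose_params + 3 * m_points
```

For example, with k = 100 poses and m = 500 feature points the optimization has 2100 unknowns, which the sensor reduces to 1500.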
- the magnetic sensor 16 is provided at the distal end portion 11 of the endoscope 2, and the lumen structure detection device 5 may include a position and orientation detection unit 55 that acquires the position and orientation information detected by the magnetic sensor 16.
- in that case, the 6k parameters corresponding to the position and orientation of the tip 11 are known. Since the optimization calculation by the processor 51 is then limited to the calculation of the 3m remaining parameters, the processing amount of the optimization calculation can be reduced and processing can be sped up. In addition, since the reduced number of parameters suppresses the accumulation of detection errors, deviation of the generated three-dimensional model structure can be suppressed.
- even when the distal end portion 11 of the insertion portion 2b of the endoscope 2 is pressed against the inner wall of the lumen, is submerged in dirty washing water, or the captured image is blurred, so that appropriate consecutive captured images cannot be obtained, information on the position and orientation of the distal end portion 11 can still be obtained. Therefore, even if consecutive captured images cannot be obtained, the likelihood that the 3m parameters can be calculated increases. As a result, the robustness of the lumen structure calculation is improved.
- Step S16 corresponds to step S5 in FIG.
- the processor 51 corrects the previously calculated position information of the feature point F (step S17).
- the position information of the feature points F calculated in the past is corrected using the newly calculated position information, for example by taking the average value. Note that the processing of step S17 may be omitted, and the previously calculated position information of each feature point F may instead be replaced with the newly calculated position information of the feature point F.
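The averaging correction of step S17 might be sketched as follows. The function name and the running-average form are assumptions for illustration; the disclosure states only that an average value may be used.

```python
def update_feature_position(history, new_pos):
    """Correct a previously computed feature-point position (step S17 sketch):
    append the newly calculated position and return the average of all
    estimates gathered so far for this feature point."""
    history.append(new_pos)
    n = len(history)
    # component-wise mean of the accumulated (x, y, z) estimates
    return tuple(sum(c[i] for c in history) / n for i in range(3))
```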
- after step S17, the processor 51 increments n by 1 (step S18) and determines whether a command to end the examination has been input (step S19).
- the command to end the examination is, for example, a predetermined command that the operator inputs to the input device after the insertion portion 2b is removed from the large intestine.
- when the command is input (YES in step S19), the process ends.
- when the command to end the examination is not input (NO in step S19), the process returns to step S14.
- the processor 51 acquires a captured image after the cycle ⁇ t from the last acquisition time of the captured image (step S14), and executes the processing after step S15.
- lumen structure information is output.
- since the lumen structure information obtained by these methods is based only on the relative positional relationship between the feature points F, information on absolute dimension values cannot be obtained.
- the method of calculating the lumen structure described with reference to FIGS. 9 to 12 can be further combined with the method of setting the specific part range display Ex described with reference to FIGS.
- the range of the specific site 200 is set based on the captured image, and since feature points F are extracted by the lumen structure calculation processing, the feature points F within the range of the specific site 200 are also extracted.
- the position of the feature point F inside the specific site 200 is also calculated.
- the position of the specific part 200 is calculated by associating the extracted feature points F of the specific part 200 with the feature points F in the three-dimensional structure.
- the endoscope system 1 of the present embodiment further includes the specific part setting unit 140, which extracts the specific part 200 from the captured image and extracts the feature points F from the captured image showing the specific part 200; the specific part setting unit 140 sets the position of the specific part 200 in the three-dimensional structure by associating the extracted feature points F with the feature points F in the three-dimensional structure.
- FIG. 13 is a flowchart illustrating an example of processing for calculating three-dimensional structure dimension information.
- the lumen structure calculation system 100 acquires a lumen image (step S21). Specifically, for example, the acquisition unit 110 acquires the captured image transmitted from the image processing device 3. After that, the lumen structure calculation system 100 generates a three-dimensional structure (step S22). Specifically, positional information in three-dimensional space, such as that of the plurality of feature points F, is calculated using any of the methods described above with reference to FIGS. 9 to 12, and a three-dimensional structure of the lumen is generated.
- the lumen structure calculation system 100 calibrates the actual dimensions (step S23). Specifically, for example, a comparative object whose dimension information is at least partly known in advance is arranged in the imaging range.
- the comparison object here is, for example, a treatment instrument that is inserted into the lumen from the distal end portion 11 and used for observation, diagnosis, treatment, or the like, but it may also be a cap or the like attached to the distal end portion 11; at least a part of it should be within the imaging field of view.
- the O-ring 300 may be used as a comparison object.
- the method of arranging the O-ring 300 is not limited to the example of FIG. 14; for example, the O-ring 300 may be arranged in the vicinity of the specific portion 200 without surrounding the specific portion 200. Also, the size of the O-ring 300 may be smaller than the size of the specific portion 200.
- the lumen image in step S21 described above includes the image of O-ring 300
- the three-dimensional structure in step S22 includes the three-dimensional structure of O-ring 300.
- the actual dimension determination information acquisition unit 112 acquires the known actual dimension of the O-ring 300 as the actual dimension determination information based on the captured image of the lumen
- based on the actual dimension determination information, the lumen structure calculation unit 120 performs a process of calibrating the actual dimensions of the three-dimensional structure generated in step S22.
- alternatively, the lumen structure calculation system 100 may perform the processing of step S23 by having the operator press a wire-shaped, knife-shaped, or rod-shaped treatment tool against the inner wall of the lumen. This is because, as described above, the comparison object placed around the specific site 200 only needs to have known dimensions.
- the actual dimension determination information acquisition unit of the endoscope system 1 of the present embodiment acquires, as the actual dimension determination information, the size on the captured image of a comparison object whose dimensions are known. By doing so, the absolute dimension values of the three-dimensional structure can be grasped.
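The calibration of step S23 amounts to a uniform rescaling of the relative reconstruction by the ratio of the comparison object's known physical size to its size in model units. A minimal sketch under that assumption (names are illustrative):

```python
def calibrate_scale(model_points, known_real_length, model_length):
    """Rescale a relative 3-D reconstruction to absolute units using a
    comparison object (e.g. an O-ring) whose real size is known.

    model_length: the object's size measured in model units;
    known_real_length: its physical size (e.g. in mm).
    """
    s = known_real_length / model_length          # uniform scale factor
    return [(x * s, y * s, z * s) for (x, y, z) in model_points]
```

For instance, if the O-ring spans 2 model units but is known to be 10 mm across, every reconstructed coordinate is multiplied by 5.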
- the lumen structure calculation system 100 determines whether or not further construction of the three-dimensional structure is necessary (step S24). If necessary (YES in step S24), the process returns to step S21; if unnecessary (NO in step S24), distance information is calculated (step S25). For example, when the operator discovers a lesion that is the specific site 200, observation is continued until it is determined that the lesion can be imaged optimally, so the three-dimensional structure of the lumen continues to be constructed. Then, when the operator determines that the lesion can be optimally imaged, the specific part 200 is calculated by the method described above with reference to FIG.
- the predetermined period is, for example, the period from one deformation to the next of the large intestine, the object of imaging, which periodically deforms due to peristaltic motion, pendular motion, segmental motion, or the like. This is because if the large intestine or the like deforms before the processing of steps S21 to S23 is completed, an appropriate three-dimensional structure cannot be constructed.
- the lumen structure calculation unit 120 outputs the three-dimensional structure and the three-dimensional structure dimension information based on the captured images taken within the predetermined period and the actual dimension determination information acquired within that period. By doing so, the three-dimensional structure of the lumen can be formed appropriately. In addition, if the object deforms after the three-dimensional structure including absolute dimension values has been obtained, the state of deformation of the observed object can be followed.
- FIG. 15 is a flow chart explaining another processing example for calculating the three-dimensional structure dimension information.
- the processing example of FIG. 15 is a processing example of obtaining a captured image and absolute dimension information and obtaining a three-dimensional structure including the absolute dimension.
- the lumen structure calculation system 100 acquires a lumen image (step S31), as in step S21 of FIG. 13. After that, the lumen structure calculation system 100 determines whether or not it is time to acquire the detection result of the sensor and, when it is, acquires the detection result (step S33). Details of the processing in step S33 will be described later.
- the lumen structure calculation system 100 calculates the distance between the feature points F (step S34). Then, the lumen structure calculation system 100 generates a three-dimensional structure by the same method as in step S22 of FIG. 13 (step S35).
- the processing example of FIG. 15 differs from the processing example of FIG. 13 in that the generated three-dimensional structure includes absolute dimension information at the time of step S35.
- the lumen structure calculation system 100 determines whether or not construction of the three-dimensional structure is necessary, and if necessary (YES in step S36), the process proceeds to step S31. Returning, if unnecessary (NO in step S36), distance information is calculated (step S37).
- the sensor in step S33 is, for example, the magnetic sensor 16, which is a position sensor.
- the lumen structure calculation unit 120 obtains information about the position and orientation of the distal end portion 11, which is the actual dimension determination information, based on the output information from the magnetic sensor 16, which is the position sensor. In other words, information about the position or orientation of the imaging unit 30 is obtained through the actual dimension determination information acquisition unit 112.
- the lumen structure calculation unit 120 generates a three-dimensional structure of the lumen by incorporating the information about the position or orientation of the imaging unit 30 into the feature points F extracted from the lumen image acquired in step S31.
- the actual dimension determination information acquisition unit 112 acquires, as the actual dimension determination information, information related to the output of the magnetic sensor 16, which is a position sensor that extracts at least part of the information on the position and orientation of the imaging unit 30. By doing so, absolute dimension information can be added to a three-dimensional structure that could otherwise only be obtained in relative terms.
- the position and orientation of the distal end portion 11 may be detected using not only the magnetic sensor 16 but also a shape sensor and an insertion amount/torsion amount sensor.
- the shape sensor is a fiber sensor that is a bending sensor that detects the amount of bending from the curvature of a specific portion using an optical fiber, for example.
- the insertion amount/torsion amount sensor has a cylindrical shape with a hole through which the insertion portion 2b can be inserted, and includes an encoder for detecting the amount of insertion of the insertion portion 2b and an encoder for detecting the amount of rotation of the insertion portion 2b about its axis.
- the sensor in step S33 may be, for example, a ranging sensor.
- FIG. 16 is a perspective view of the distal end surface of the distal end portion 11 of the insertion portion 2b having the distance measuring sensor.
- An observation window 41, two illumination windows 42, a forceps opening 43, and a TOF (Time Of Flight) sensor 60, which is a distance measuring sensor, are arranged on the distal end surface 11a1 of the distal end portion 11.
- the TOF sensor 60 detects a distance image, measuring distance from the flight time of light.
- a TOF function is embedded in each pixel of the image sensor. Therefore, the TOF sensor 60 obtains distance information for each pixel.
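The per-pixel distance values from the TOF sensor 60 can be turned into camera-frame 3-D points with a standard pinhole back-projection. This sketch assumes calibrated intrinsics (fx, fy, cx, cy) and a depth image holding z-distance per pixel; neither of these details is specified in the disclosure.

```python
import numpy as np

def tof_to_points(depth, fx, fy, cx, cy):
    """Back-project a per-pixel TOF distance image into camera-frame 3-D
    points under a pinhole model (intrinsics assumed known).

    depth: (h, w) array of z-distances per pixel (a common TOF convention).
    Returns an (h, w, 3) array of (x, y, z) points.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1)
```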
- the TOF sensor 60 is provided at the distal end portion 11 of the insertion portion 2b and detects the distance from the distal end portion 11 to the inner wall of the lumen. Accordingly, information on the distance from the imaging unit 30 to the subject can be obtained.
- the distance information is, for example, distance distribution information, but may be distance information for one point.
- the lumen structure calculation unit 120 acquires the information on the distance from the distal end portion 11 to the inner wall of the lumen to be imaged from the TOF sensor 60, via the actual dimension determination information acquisition unit 112, as the actual dimension determination information.
- the lumen structure calculation unit 120 generates a three-dimensional structure of the lumen containing actual dimensions by incorporating the information on the distance from the distal end portion 11 to the inner wall of the lumen into the feature points F extracted from the lumen image acquired in step S31.
- the distance measurement sensor is not limited to the TOF sensor 60, and may be a sensor of another type such as LiDAR (Light Detection and Ranging/Laser Imaging Detection and Ranging).
- the actual dimension determination information acquisition unit 112 of the endoscope system 1 of the present embodiment acquires, as the actual dimension determination information, information related to the output of the TOF sensor 60, which is a distance measurement unit that extracts information on the distance from the imaging unit 30 to the subject.
- the method of obtaining the distance distribution from the imaging unit 30 to the subject is not limited to methods using a sensor; for example, the distance distribution from the imaging unit 30 to the subject may be obtained by image processing such as the Shape From Shading method.
- the Shape From Shading method is a technique for obtaining a three-dimensional shape based on the shading on the surface of an object; by solving the relation between shading and surface orientation, the three-dimensional shape can be calculated. A more detailed description is omitted because the method is well known.
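As a toy illustration of the shading-to-shape relation underlying Shape From Shading: assuming a Lambertian surface lit along the viewing axis, the observed intensity equals the cosine of the surface tilt, so the slope magnitude can be read off as tan(theta) = sqrt(1/I^2 - 1). Real Shape From Shading solves a PDE over the whole image; this one-liner is only the pointwise relation, and the names are hypothetical.

```python
import math

def slope_from_shading(intensity):
    """For a Lambertian surface lit head-on (light along the viewing axis),
    observed intensity I = cos(theta) of the surface tilt, so the slope
    magnitude is tan(theta) = sqrt(1/I**2 - 1).

    intensity: iterable of normalized intensities in (0, 1].
    """
    return [math.sqrt(1.0 / i**2 - 1.0) for i in intensity]
```

A fully lit pixel (I = 1) is flat (slope 0), while I = cos(45°) corresponds to a 45° tilt (slope 1).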
- the method of this embodiment is not limited to the above, and various modifications are possible.
- for example, information between two measurement points MA and MB specified by the user may be presented to the user.
- the information between two points includes the X-direction, Y-direction, and Z-direction components between the two points, the length between the two points, the projection length between the two points, and the like.
- the screen display indicated by E2 in FIG. 17 can be realized by the following method. For example, as shown in FIG. 18, it is assumed that feature points F forming a three-dimensional structure with known absolute dimension values are arranged.
- the lumen structure calculation unit 120 extracts the feature point FA closest to the measurement point MA and the feature point FB closest to the measurement point MB, obtains information such as the coordinates of the measurement points MA and MB and the distance between them based on the coordinate information of the feature points FA and FB, and performs processing for transmitting that information to the dimension estimation unit 130.
- the lumen structure calculator 120 may obtain the coordinates of the measurement points MA and MB and the information between the two points using the coordinate information of a plurality of feature points F positioned near the measurement points MA and MB. Then, the dimension estimation unit 130 performs processing for outputting screen data to the monitor 150 based on the information.
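The nearest-feature-point lookup and two-point distance described above can be sketched as follows. Function names are hypothetical, and a real implementation might interpolate over several nearby feature points rather than taking a single nearest one.

```python
import math

def nearest_feature(point, features):
    """Return the feature point closest to a user-selected measurement point."""
    return min(features, key=lambda f: math.dist(f, point))

def distance_between_measurements(ma, mb, features):
    """Approximate the real distance between measurement points MA and MB by
    the distance between their nearest feature points FA and FB (one possible
    reading of the procedure described around FIG. 18)."""
    fa = nearest_feature(ma, features)
    fb = nearest_feature(mb, features)
    return math.dist(fa, fb)
```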
- the endoscope system 1 may present a three-dimensional specific part structure 200A to the user while changing the viewpoint direction, as shown in G1 of FIG. Further, as indicated by G2, the major axis D1, minor axis D2, and height H of the three-dimensional specific part structure 200A may be presented to the user on the monitor 150. By doing so, the operator and the patient Pa, who are the users, can appropriately understand the specific region 200.
- since the specific part structure 200A is a three-dimensional structure whose appearance may change depending on the viewpoint and direction, size measurements would vary if performed without specifying a line-of-sight direction. Therefore, the three-dimensional structure calculation unit of the endoscope system 1 of the present embodiment calculates the major axis D1, minor axis D2, and height H of the specific part structure 200A based on the object projected onto a plane parallel to the inner wall of the lumen around the specific site 200, in other words, a plane perpendicular to the normal vector N shown in FIGS.
- the height H is the height based on the plane parallel to the inner wall, and is the length of the specific site structure 200A along the direction parallel to the normal vector N, as shown in FIG.
- the major axis D1 and the minor axis D2 of the specific site structure 200A are determined.
- the dimension estimation unit 130 outputs display data based on these pieces of information to the monitor 150.
- the dimension estimation unit 130 of the endoscope system 1 of the present embodiment outputs, as the specific part dimension information, the major axis D1 and the minor axis D2, which are the actual dimensions of the object projected onto a plane parallel to the lumen around the specific site 200.
- the dimension estimation unit 130 of the endoscope system 1 of the present embodiment also outputs, as the specific part dimension information, the height H, which is the actual dimension of the object in the direction perpendicular to the inner wall of the lumen around the specific part 200.
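One plausible realization of the D1/D2/H measurement (projection onto the plane perpendicular to the normal vector N, principal-axis extents for the diameters, extent along N for the height) is sketched below. The principal-axis choice via SVD is an assumption for illustration; the disclosure does not specify how the axes of the projected object are found.

```python
import numpy as np

def lesion_dimensions(points, normal):
    """Estimate major axis D1, minor axis D2, and height H of a lesion
    point cloud: project the points onto the plane perpendicular to the
    wall normal N, take principal-axis extents for D1/D2, and take the
    extent along N for H."""
    n = np.asarray(normal, float)
    n /= np.linalg.norm(n)
    p = np.asarray(points, float)
    heights = p @ n                            # signed distance along N
    H = heights.max() - heights.min()
    in_plane = p - np.outer(heights, n)        # project onto the plane
    in_plane -= in_plane.mean(axis=0)
    _, _, vt = np.linalg.svd(in_plane, full_matrices=False)
    proj = in_plane @ vt[:2].T                 # coordinates along principal axes
    d1 = proj[:, 0].max() - proj[:, 0].min()
    d2 = proj[:, 1].max() - proj[:, 1].min()
    return max(d1, d2), min(d1, d2), H
```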
- when the specific part 200 is observed from the direction perpendicular to the normal vector N, it is difficult for the user to grasp the height H information; by outputting the height H as the specific part dimension information, information on the height H of the specific part 200 can also be known.
- the endoscope system 1 of this embodiment can also grasp the actual dimension information of the distance LE from the predetermined opening to the specific site 200.
- the distance LE is the distance from the predetermined opening indicated by J to the center of the specific portion 200, as shown in FIG. 23, for example.
- the predetermined opening is, for example, the anus in the case of colon observation.
- the center here includes the approximate center. For example, it is assumed that the specific site 200 is a lesion and the operator is considering the optimum treatment among a plurality of treatments in order to remove the specific site 200 .
- the plurality of treatments includes, for example, endoscopic resection, surgical resection, and the like.
- the method of the present embodiment can grasp the actual dimension of the distance LE from the predetermined opening to the specific site 200, so that the operator can select an appropriate treatment from a plurality of treatments.
- the operator can predict the post-treatment progress with higher accuracy.
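If the distance LE is read off the reconstructed structure as a length measured along the lumen (one possible reading of the disclosure; a straight-line distance is equally conceivable), it reduces to summing polyline segment lengths:

```python
import math

def path_length(centerline):
    """Length of a polyline path (e.g. from the anus to the center of the
    specific site 200 along the reconstructed lumen), obtained by summing
    the Euclidean lengths of consecutive segments. Illustrative sketch."""
    return sum(math.dist(a, b) for a, b in zip(centerline, centerline[1:]))
```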
- Control unit 34 Storage unit 35 Focus control unit 51 Processor 52 Storage device 53 Interface 55 Position and orientation detection unit 56 Drive circuit 58 Bus 100 Lumen structure calculation system , 110... acquisition unit, 112... actual size determination information acquisition unit, 120... lumen structure calculation unit, 122... characteristic point extraction unit, 124... three-dimensional position calculation unit, 130... dimension estimation unit, 140... specific part setting unit , 150... monitor, 200... specific portion, 300... O-ring, D1... major axis, D2... minor axis, F, FA, FB... characteristic point, H... height, LE... distance, N... normal vector, MA, MB...Measurement point, Pa...Patient
Claims (20)
- 1. An endoscope system comprising: an insertion portion to be inserted into a lumen serving as a subject; a monocular imaging unit provided in the insertion portion and configured to acquire a captured image of the subject; an actual dimension determination information acquisition unit configured to acquire actual dimension determination information, which is information for determining an actual dimension of at least a part of the lumen; a lumen structure calculation unit configured to calculate, based on the captured image and the actual dimension determination information, a three-dimensional structure of the lumen and three-dimensional structure dimension information, which is information for determining an actual dimension of at least a part of the three-dimensional structure; and a dimension estimation unit configured to output, based on the three-dimensional structure dimension information, specific part dimension information, which is an actual dimension of a specific part in the three-dimensional structure.
- 2. The endoscope system according to claim 1, wherein the lumen structure calculation unit includes a feature point extraction unit configured to extract a plurality of feature points in each captured image and a three-dimensional position calculation unit configured to calculate a position of each feature point in three-dimensional space based on positions of the plurality of feature points on the captured image, and the lumen structure calculation unit calculates the three-dimensional structure of the lumen from the calculated positions of the feature points in three-dimensional space.
- 3. The endoscope system according to claim 2, further comprising a specific part setting unit configured to extract the specific part from the captured image and to extract the feature points from the captured image showing the specific part, wherein the specific part setting unit sets a position of the specific part in the three-dimensional structure by associating the extracted feature points with the feature points in the three-dimensional structure.
- 4. The endoscope system according to claim 2, wherein the imaging unit acquires the captured images at a plurality of time points, the feature point extraction unit extracts the feature points commonly shown in the captured images at the plurality of time points, and the three-dimensional position calculation unit acquires three-dimensional arrangement information of the insertion portion at the plurality of time points based on an output of a position sensor that extracts at least part of information on a position and an orientation of the imaging unit, and calculates the positions of the feature points in three-dimensional space based on the three-dimensional arrangement information of the insertion portion at the plurality of time points and the positions, on the captured images, of the feature points commonly shown in the captured images at the plurality of time points.
- 5. The endoscope system according to claim 1, further comprising a specific part setting unit configured to set the specific part based on the captured image.
- 6. The endoscope system according to claim 5, wherein the specific part setting unit presents the specific part on the captured image so as to be visible to a user.
- 7. The endoscope system according to claim 5, wherein the specific part setting unit has a classifier that automatically sets the specific part.
- 8. The endoscope system according to claim 5, wherein the captured image is separable into a plurality of pieces of color information, the specific part setting unit sets the specific part using first color information included in the plurality of pieces of color information, and the lumen structure calculation unit outputs the three-dimensional structure and the three-dimensional structure dimension information using second color information that is included in the plurality of pieces of color information and does not include the first color information.
- 9. The endoscope system according to claim 5, wherein the specific part setting unit is selectable from a plurality of setting modes.
- 10. The endoscope system according to claim 5, wherein the specific part setting unit is capable of redoing the setting of the specific part in a different setting mode on the same image or on another image.
- 11. The endoscope system according to claim 1, wherein the actual dimension determination information acquisition unit acquires, as the actual dimension determination information, information related to an output of a position sensor that extracts at least part of information on a position and an orientation of the imaging unit.
- 12. The endoscope system according to claim 1, wherein the actual dimension determination information acquisition unit acquires, as the actual dimension determination information, information related to an output of a distance measurement unit that extracts information on a distance from the imaging unit to the subject.
- 13. The endoscope system according to claim 1, wherein the actual dimension determination information acquisition unit acquires, as the actual dimension determination information, a size on the captured image of a comparison object that appears in the captured image and whose dimensions are known.
- 14. The endoscope system according to claim 1, wherein the lumen structure calculation unit outputs the three-dimensional structure and the three-dimensional structure dimension information based on the captured images taken within a predetermined time and the actual dimension determination information acquired within the predetermined time.
- 15. The endoscope system according to claim 1, wherein the dimension estimation unit outputs, as the specific part dimension information, an actual dimension of an object projected onto a plane parallel to the lumen around the specific part.
- 16. The endoscope system according to claim 1, wherein the dimension estimation unit outputs, as the specific part dimension information, an actual dimension of an object in a direction perpendicular to the lumen around the specific part.
- 17. The endoscope system according to claim 1, wherein the lumen structure calculation unit calculates, based on the actual dimension determination information, three-dimensional structure information in which the actual dimension is set.
- 18. The endoscope system according to claim 1, wherein the dimension estimation unit outputs the specific part dimension information in association with information on the three-dimensional structure of the specific part.
- 19. A lumen structure calculation system comprising: an acquisition unit configured to acquire a captured image of a subject acquired by a monocular imaging unit provided in an insertion portion inserted into a lumen serving as the subject, and actual dimension determination information, which is information for determining an actual dimension of at least a part of the lumen; a lumen structure calculation unit configured to calculate, based on the captured image and the actual dimension determination information, a three-dimensional structure of the lumen and three-dimensional structure dimension information, which is information for determining an actual dimension of at least a part of the three-dimensional structure; and a dimension estimation unit configured to output, based on the three-dimensional structure dimension information, specific part dimension information, which is an actual dimension of a specific part in the three-dimensional structure.
- 20. A method for creating lumen structure information, comprising: acquiring a captured image of a subject acquired by a monocular imaging unit provided in an insertion portion inserted into a lumen serving as the subject; acquiring actual dimension determination information, which is information for determining an actual dimension of at least a part of the lumen; calculating, based on the captured image and the actual dimension determination information, a three-dimensional structure of the lumen and three-dimensional structure dimension information, which is information for determining an actual dimension of at least a part of the three-dimensional structure; and outputting, based on the three-dimensional structure dimension information, specific part dimension information, which is an actual dimension of a specific part in the three-dimensional structure.