WO2014168128A1 - Endoscope System and Method for Operating an Endoscope System - Google Patents
- Publication number: WO2014168128A1 (application PCT/JP2014/060141)
- Authority: WIPO (PCT)
- Prior art keywords: image, predetermined, unit, subject, endoscope system
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/045—Control thereof
- A61B1/00002—Operational features of endoscopes
- A61B1/00006—Operational features of endoscopes characterised by electronic signal processing of control signals
- A61B1/000094—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
- A61B1/0002—Operational features of endoscopes provided with data storages
- A61B1/00042—Operational features of endoscopes provided with input arrangements for the user for mechanical operation
- A61B1/00055—Operational features of endoscopes provided with output arrangements for alerting the user
- A61B1/00096—Optical elements (insertion part of the endoscope body, distal tip features)
- A61B1/00147—Holding or positioning arrangements
- A61B1/307—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection for the urinary organs, e.g. urethroscopes, cystoscopes
Definitions
- the present invention relates to an endoscope system and an operation method of the endoscope system, and more particularly, to an endoscope system that displays an organ model image to which an endoscope image is pasted and an operation method of the endoscope system.
- endoscope systems have been widely used in the medical field and the industrial field.
- an operator inserts an insertion portion of an endoscope into a subject, and an endoscope image obtained through an observation window provided at a distal end portion of the insertion portion is displayed on a display device.
- the surgeon can perform endoscopy by viewing the displayed endoscopic image.
- the endoscope system can also record an endoscope image.
- a doctor can use a recorded endoscopic image of a lesion as a part of a medical record.
- a capsule endoscope system has also been put into practical use.
- the capsule endoscope captures and records images of the interior of the body while moving through it.
- when a lesion is found, endoscopy is performed again, or the lesion found in the previous endoscopy is treated using an endoscope.
- the doctor fills in the chart with the position of the lesion found in the examination in the organ to be examined.
- the position of the lesion is specified by marking a bladder development drawing (schema) drawn on the medical chart.
- an object of the present invention is to provide an endoscope system in which the position of an endoscopic image within the examination target organ is easily identified and in which a specific endoscopic image can easily be picked out from a plurality of endoscopic images, and a method for operating such an endoscope system.
- an endoscope system according to one aspect of the present invention includes: an insertion portion that is inserted into a subject; an objective optical window that is provided on the distal end side of the insertion portion and receives light from the subject; an imaging unit that images the inside of the subject from the light incident through the objective optical window; a position information acquisition unit that acquires position information of the objective optical window; a recording unit that records the in-subject image acquired by the imaging unit and the position information acquired by the position information acquisition unit in association with each other; an association unit that associates the position information recorded in the recording unit with a model image of a predetermined organ in the subject; an image generation unit that generates, as an in-subject-image pasted image, an image in which the in-subject image is pasted on the model image of the predetermined organ associated by the association unit; a determination unit that determines whether a predetermined trigger signal related to imaging in the imaging unit has occurred or whether a predetermined feature amount of the in-subject image acquired by the imaging unit satisfies a predetermined condition; and a display unit that displays the in-subject image, for which the determination unit determines that the predetermined trigger signal has occurred or that the predetermined feature amount satisfies the predetermined condition, so as to be distinguishable from the other in-subject images on the in-subject-image pasted image.
- an operation method of an endoscope system according to another aspect of the present invention is a method for operating an endoscope system that includes: an imaging unit that images the inside of a subject from light incident through an objective optical window provided on the distal end side of an insertion portion inserted into the subject; a position information acquisition unit that acquires position information of the objective optical window; a recording unit that records the in-subject image acquired by the imaging unit and the position information acquired by the position information acquisition unit in association with each other; an association unit; an image generation unit; a determination unit; and a display unit. In the method, the association unit associates the position information recorded in the recording unit with a model image of a predetermined organ in the subject; the image generation unit generates, as an in-subject-image pasted image, an image in which the in-subject image is pasted on the model image of the predetermined organ associated by the association unit; the determination unit determines whether a predetermined trigger signal related to imaging in the imaging unit has occurred or whether a predetermined feature amount of the in-subject image acquired by the imaging unit satisfies a predetermined condition; and the display unit displays the in-subject image, for which the determination unit determines that the predetermined trigger signal has occurred or that the predetermined feature amount satisfies the predetermined condition, so as to be distinguishable from the other in-subject images on the in-subject-image pasted image.
- 1 is a configuration diagram showing a configuration of an endoscope system according to a first embodiment of the present invention.
- 1 is a block diagram showing a configuration of an endoscope system 1 according to a first embodiment of the present invention.
- 3 is a flowchart showing an example of the flow of the process of pasting an endoscopic image onto a bladder model image during observation inside the bladder according to the first embodiment of the present invention.
- 4 is a diagram showing the position of the bladder of a typical patient for explaining the names of the parts of the bladder according to the first embodiment of the present invention.
- 5 is a diagram showing a schematic bladder for explaining the names of the parts of the bladder according to the first embodiment of the present invention.
- a diagram for explaining the position P2 and the direction V2 of the distal end portion 2d, obtained from the position and direction vector, in the second coordinate system (X2Y2Z2), according to the first embodiment of the present invention.
- a diagram for explaining the coordinate relationship in the two-dimensional coordinate system (U, V) according to the first embodiment of the present invention.
- a diagram for explaining the pasting of each pixel onto the inner surface of the sphere of the second coordinate system (X2Y2Z2) by scanning the entire endoscopic image according to the first embodiment of the present invention.
- a diagram showing another example of an image displayed on the screen of the monitor 6 according to the first embodiment of the present invention.
- a diagram showing an example of an image displayed on the screen of the monitor 6 when the 5-axis sensor is used according to the first embodiment of the present invention.
- a diagram showing an example of an image in which, when the 5-axis sensor according to the first embodiment of the present invention is used, only the endoscopic image captured when the release button 13 is pressed is pasted on the 2D model image 31a.
- a diagram showing an example of the display screen when images of two organ models corresponding to the two observation modes are displayed according to the first embodiment of the present invention.
- FIG. 1 is a configuration diagram showing a configuration of an endoscope system according to the present embodiment.
- FIG. 2 is a block diagram showing a configuration of the endoscope system 1.
- the endoscope system 1 includes an endoscope 2, a recording device 3, a light source device 4, a processor 5, a monitor 6, and a magnetic field generator 7.
- the endoscope system 1 has two observation modes: normal light observation and special light observation. A doctor serving as the examiner performs an endoscopic examination of the bladder B of the patient P lying on his or her back on the bed 8.
- the endoscope 2 includes an operation unit 2a, a flexible insertion unit 2b that is inserted into a subject, and a universal cable 2c.
- the endoscope 2 is an endoscope for bladder examination. Although not shown, a light guide is inserted through the universal cable 2c, and the endoscope 2 is configured to emit illumination light from the light source device 4 through the light guide from the distal end portion 2d of the insertion portion 2b.
- an imaging element 11 is provided at the distal end portion 2d of the insertion portion 2b, and the portion in the bladder B illuminated by the illumination light of the light source device 4 is imaged by the imaging element 11 through the objective optical window 11a.
- the objective optical window 11a is provided on the distal end side of the insertion portion 2b and receives light from the subject.
- the imaging element 11 constitutes an imaging unit that images the inside of the subject from light that is inserted into the subject and is incident from the objective optical window 11a.
- An imaging signal obtained by the imaging element 11 is supplied to the processor 5 via a signal line in the universal cable 2c, and the imaging signal is subjected to image processing in the processor 5.
- the processor 5 includes a changeover switch 5a for switching the observation mode, and the processor 5 generates an endoscopic image corresponding to the observation mode specified by the changeover switch 5a.
- the generated endoscopic image is output from the processor 5 to the monitor 6, and the live endoscopic image is displayed on the monitor 6.
- a doctor who conducts the examination (hereinafter referred to as an examiner) can insert the distal end portion 2d of the insertion portion 2b through the urethra of the patient P and observe the inside of the bladder B of the patient P (shown by a dotted line in FIG. 1).
- a magnetic sensor 12 is disposed at the distal end portion 2d of the insertion portion 2b. Specifically, the magnetic sensor 12 having two coils 2e is provided in the vicinity of the objective optical window 11a of the distal end portion 2d; the magnetic sensor 12 is therefore a 6-axis sensor. A signal line 2f of the magnetic sensor 12 extends from the endoscope 2 and is connected to the recording device 3.
- the magnetic sensor 12 may be a 5-axis sensor.
- the magnetic field generator 7 generates a predetermined magnetic field, and the magnetic sensor 12 detects the magnetic field generated by the magnetic field generator 7.
- the magnetic field detection signal is supplied from the endoscope 2 to the recording device 3 via the signal line 2f.
- a release button 13 is provided on the operation unit 2a of the endoscope 2.
- the release button 13 is a button to be pressed when the examiner records an endoscopic image.
- a release button operation signal is input to the processor 5, and the processor 5 generates a release signal and supplies it to the recording device 3.
- An endoscopic image when the release button 13 is pressed is recorded in a memory 22 (to be described later) of the recording device 3.
- the recording device 3 includes a central processing unit (hereinafter referred to as CPU) 21, a memory 22, a display interface (hereinafter abbreviated as display I/F) 23, an image capturing unit 24, a position/direction detection unit 25, and a drive circuit 26.
- the CPU 21, the memory 22, the display I/F 23, the image capturing unit 24, the position/direction detection unit 25, and the drive circuit 26 are connected to each other via a bus 27.
- the CPU 21 is a control unit that controls processing of each unit in the recording apparatus 3.
- the memory 22 is a storage unit including a ROM, a RAM, a flash memory, and the like.
- the memory 22 stores various processing programs executed by the CPU 21 and various data. As will be described later, endoscopic image information and position and direction information are also stored.
- the memory 22 also stores data of a model image of an organ (hereinafter referred to as an organ model image) described later, and an endoscopic image is pasted on the organ model image as described later.
- the CPU 21 performs a process of pasting the endoscope image on the model image stored in advance based on the position and direction information of the distal end portion 2d when the endoscope image is captured,
- the organ model image to which the endoscopic image is pasted is stored in the memory 22.
- the organ model image stored in the memory 22 is used as a part of a medical record.
- the organ model image stored in the memory 22 is output via the display I/F 23 and displayed on the screen of the monitor 6. The monitor 6 is also connected to the processor 5.
- the monitor 6 has a PinP (Picture In Picture) function and can display, together with the organ model image to which the endoscopic image is pasted by the CPU 21, a live endoscopic image obtained by imaging with the imaging element 11 of the endoscope 2.
- the image capturing unit 24 is a processing unit that captures the images obtained by the processor 5 at a constant period; for example, 30 endoscopic images per second, the same as the frame rate, are acquired from the processor 5. The image capturing unit 24 also receives the release signal from the processor 5. Although the image capturing unit 24 here captures 30 endoscopic images per second, endoscopic images may instead be acquired at a cycle longer than the frame rate, for example 3 per second.
- the position/direction detection unit 25 controls the drive circuit 26 that drives the magnetic field generator 7 so as to generate a predetermined magnetic field in the magnetic field generator 7; the magnetic sensor 12 detects the magnetic field, and from the detection signal of the detected magnetic field, the position/direction detection unit 25 generates in real time position coordinate (x, y, z) and orientation (i.e., Euler angle (ψ, θ, φ)) data of the objective optical window 11a, that is, position/direction information. In other words, the position/direction detection unit 25 constitutes a position information acquisition unit that acquires position information and direction information from the magnetic sensor 12 and thereby acquires the position information of the objective optical window 11a.
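As a concrete illustration, the six sensor values (position coordinates plus Euler angles) can be turned into a pose of the objective optical window. This is a minimal sketch: the Z-Y-X rotation order and the choice of the local Z axis as the optical axis are assumptions, since the patent does not specify these conventions.

```python
import numpy as np

def pose_from_sensor(x, y, z, psi, theta, phi):
    """Build position, rotation matrix, and viewing direction from the
    6-axis sensor output. Z-Y-X Euler order is an assumed convention."""
    cps, sps = np.cos(psi), np.sin(psi)
    cth, sth = np.cos(theta), np.sin(theta)
    cph, sph = np.cos(phi), np.sin(phi)
    Rz = np.array([[cps, -sps, 0], [sps, cps, 0], [0, 0, 1]])
    Ry = np.array([[cth, 0, sth], [0, 1, 0], [-sth, 0, cth]])
    Rx = np.array([[1, 0, 0], [0, cph, -sph], [0, sph, cph]])
    R = Rz @ Ry @ Rx                          # orientation of the tip 2d
    position = np.array([x, y, z])            # objective optical window 11a
    view_dir = R @ np.array([0.0, 0.0, 1.0])  # assumed optical axis
    return position, R, view_dir
```

With all angles zero the pose reduces to the identity orientation, which makes the convention easy to sanity-check.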
- the CPU 21 associates the image captured by the image capturing unit 24 with the position and direction information of the tip 2d calculated from the position / direction information detected by the position / direction detection unit 25 and stores the associated information in the memory 22.
- the CPU 21 further has a stereo measurement function, and has a function of measuring the distance from two frame images obtained by imaging to each part of the target part in the frame image. Specifically, the CPU 21 acquires the imaging position information of the objective optical window 11a based on the position / direction information from the position / direction detection unit 25 when the two frame images are captured, and captures the two frame images. The distance from the image sensor 11 to each part in the frame image can be calculated from the parallax at that time.
- a program for the stereo measurement function is stored in the memory 22, and the CPU 21 can perform stereo measurement by reading and executing the program.
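Under the usual rectified two-view assumption, the distance computation from the parallax of two frames reduces to classic triangulation. The following sketch is illustrative only; the focal length, baseline, and disparity names are not taken from the patent, which does not give the stereo geometry.

```python
def stereo_depth(focal_px, baseline_mm, disparity_px):
    """Two-view triangulation: with two frames captured a known baseline
    apart, the distance to a point is depth = f * B / d (rectified case)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px
```

For example, a 500-pixel focal length, a 4 mm baseline between the two imaging positions, and a 10-pixel disparity give a distance of 200 mm in this model.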
- the light source device 4 is a light source device capable of emitting normal light for the normal light observation mode and special light for the special light observation mode, and emits either the normal light or the special light as illumination light depending on the state of the changeover switch 5a, provided in the processor 5, for switching the observation mode.
- the special light observation mode is a narrow-band observation mode.
- the special light observation mode may be an infrared light observation mode or a fluorescence observation mode. The endoscope system 1 thus has two observation modes, a normal light observation mode and a special light observation mode; the light source device 4 emits normal illumination light when the changeover switch 5a is set to the normal light observation mode, and emits narrow-band illumination light having a predetermined wavelength when the changeover switch 5a is set to the special light observation mode. That is, the light source device 4 constitutes an illumination unit that irradiates the subject with white light or special light having a predetermined wavelength band in a switchable manner.
- the processor 5 generates a normal light observation image of the subject obtained by irradiating the subject with white light in the normal light observation mode, and generates a special light observation image of the subject obtained by irradiating the subject with special light (here, narrow-band light) in the special light observation mode.
- since the narrow-band observation image, which is a special light observation image, can also be obtained by performing spectral estimation processing on each of the RGB images obtained by normal light irradiation, the processor 5 may generate a narrow-band observation image by spectral estimation.
- FIG. 3 is a flowchart showing an example of the flow of the process of pasting an endoscopic image onto a bladder model image during observation inside the bladder. The processing in FIG. 3 is executed by the CPU 21 reading and executing a predetermined program stored in the memory 22, starting from when the examiner inserts the distal end portion 2d of the insertion portion 2b into the urethra.
- the CPU 21 determines whether or not the insertion of the distal end portion 2d into the bladder B is detected (S1).
- the distal end portion 2d of the insertion portion 2b is inserted into the urethra and enters the bladder B through the urethra.
- the insertion of the distal end portion 2d into the bladder B is detected based on the amount of change in the luminance of the endoscopic image acquired by the image capturing unit 24 (the average luminance of the entire endoscopic image or of a predetermined partial region of the endoscopic image). That is, the CPU 21 performs the determination of S1 using the fact that the luminance of the endoscopic image changes when the distal end portion 2d enters the bladder B from the urethra. When the luminance value of the endoscopic image changes from a high state to a low state, the CPU 21 determines that the distal end portion 2d has entered the bladder B.
- here, the detection of the insertion of the distal end portion 2d into the bladder B is performed based on the luminance of the endoscopic image, but it may instead be performed based on the amount of change in the color or in the texture of the endoscopic image. The color change is a change from a reddish color to another color system, and the texture change is a change from an image state in which a blood vessel pattern cannot be recognized to a state in which a blood vessel pattern can be recognized.
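The luminance-based entry detection of S1 can be sketched as a simple state machine that waits for the mean luminance to fall from a high state to a low state. The thresholds below are illustrative assumptions; the patent specifies only that a change amount of average luminance is used.

```python
def detect_bladder_entry(frames, high_thresh=150.0, low_thresh=80.0):
    """Return the index of the frame at which entry into the bladder B is
    detected: mean luminance must first be high (inside the urethra, close
    to the wall) and then drop low (open cavity). Thresholds are assumed."""
    seen_high = False
    for i, frame in enumerate(frames):
        mean_lum = sum(frame) / len(frame)  # or a predetermined sub-region
        if mean_lum >= high_thresh:
            seen_high = True
        elif seen_high and mean_lum <= low_thresh:
            return i
    return None
```

Requiring the high state first avoids triggering on an image that simply starts dark, mirroring the high-to-low transition the CPU 21 looks for.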
- when the insertion into the bladder B is detected, the position/direction information of the position/direction detection unit 25 at the time of the detection is recorded as reference information of the position and direction of the distal end portion 2d (specifically, of the objective optical window 11a) (S2).
- the CPU 21 performs reference determination using the position and direction of the tip 2d recorded in S2 as the reference position and reference direction of the three-dimensional bladder model (hereinafter referred to as a 3D bladder model) M1 (S3).
- with this reference, the CPU 21 can perform conversion from the first coordinate system (X0Y0Z0) based on the external magnetic field generator 7 into the coordinate system (X1Y1Z1) based on the entrance (neck) of the bladder B, and further from the coordinate system (X1Y1Z1) into the coordinate system (X2Y2Z2) based on the center of the bladder model M1.
- the coordinate system conversion will be described later.
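The chain of conversions (X0Y0Z0) to (X1Y1Z1) to (X2Y2Z2) can be expressed as a product of homogeneous transforms. The sketch below uses assumed identity rotations and illustrative translations; in the system the actual rotations and translations are derived from the reference pose recorded in S2.

```python
import numpy as np

def make_transform(R, t):
    """4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Illustrative chain: generator frame (X0Y0Z0) -> bladder-neck frame
# (X1Y1Z1) -> bladder-model-center frame (X2Y2Z2). Values are assumptions.
T_01 = make_transform(np.eye(3), np.array([-10.0, 0.0, 0.0]))
T_12 = make_transform(np.eye(3), np.array([0.0, 0.0, -20.0]))

def to_model_frame(p0):
    """Map a point from the first coordinate system into (X2Y2Z2)."""
    p = np.append(np.asarray(p0, dtype=float), 1.0)
    return (T_12 @ T_01 @ p)[:3]
```

Composing the two transforms once and reusing the product is the usual design choice when many points (one per captured frame) must be mapped.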
- the processing from S1 to S3 constitutes an alignment unit that, based on the amount of change in the in-subject image information of the patient P as the subject, matches the position information of the objective optical window 11a with a position in the coordinate system of the model image of the predetermined organ in the patient P.
- the examination of the bladder B is performed with the patient on his or her back and the inside of the bladder B filled with a predetermined liquid (for example, physiological saline). In adults, for example, even though bladders B differ in size, the differences are not large, and the bladder B can be modeled as a sphere of substantially the same size.
- FIG. 4 is a schematic diagram showing the position of the bladder of the patient for explaining the names of the parts of the bladder.
- FIG. 4 is a view as seen from the direction facing the front of the patient P.
- FIG. 5 is a diagram showing a schematic bladder for explaining names of respective parts of the bladder.
- FIG. 5 is a view of the bladder as seen from the left side of the patient P.
- the bladder B is divided into a plurality of regions: the neck RP, which is the opening of the urethra and the entrance to the bladder B; the apex opposite the neck RP; the anterior wall on the abdomen side; the posterior wall on the back side; the right wall on the right side as viewed from the patient P; and the left wall on the left side as viewed from the patient P. Since the examination of the bladder B is performed with the patient P lying on the back and the inside of the bladder B filled with the predetermined liquid, the examiner can easily grasp the overall position and orientation of the actual bladder B.
- when the insertion of the distal end portion 2d into the bladder B is not detected, the process of S1 is repeated.
- when the insertion is detected, the distal end portion 2d is at the position of the neck RP of the bladder B. Since the magnetic sensor 12 generates 6-axis position/direction information, that is, position coordinates (x, y, z) and orientation (Euler angles (ψ, θ, φ)), the recording device 3 records the position and direction of the distal end portion 2d at that time as the reference position and reference direction of the objective optical window 11a with respect to the 3D bladder model M1.
- the imaging device 11 provided at the distal end portion 2d of the insertion portion 2b captures an endoscopic image with a viewing angle ⁇ in the bladder B.
- FIG. 6 is a diagram showing a 3D bladder model M1.
- the 3D bladder model M1 has a substantially spherical shape and is formed in the three-dimensional coordinate system X2Y2Z2.
- the coordinate system X2Y2Z2 is a coordinate system converted from the coordinate system X1Y1Z1.
- in FIG. 6, the figure of the insertion portion 2b is also shown in order to indicate the neck RP, which is the entrance of the insertion portion 2b into the bladder B.
- the 3D bladder model M1 is formed such that the axis passing through the center O of the sphere from the right wall toward the left wall is the X2 axis, the axis passing through the center O of the sphere from the neck toward the apex is the Y2 axis, and the axis passing through the center O of the sphere from the posterior wall toward the anterior wall is the Z2 axis.
- FIG. 7 is a diagram showing a two-dimensional model (hereinafter referred to as a 2D bladder model) M2 of the bladder B.
- the 2D bladder model M2 has a shape including two circles, and is formed in a two-dimensional coordinate system UV.
- the 2D bladder model M2 has substantially the same shape as the bladder development view (schema) BE shown in FIG.
- FIG. 8 is a diagram showing a bladder development view BE.
- the bladder development view BE is a diagram showing the position of each part in the bladder B. As shown in FIG. 8, each part in the bladder B corresponds to each predetermined region on the bladder development view BE.
- the two ureteral openings of bladder B are in the positions indicated by uo in FIGS.
- the position of the lesion AA in FIG. 6 corresponds to the position indicated by the dotted line in FIG.
- as described above, information on the position and direction of the distal end portion 2d at the time when the insertion of the distal end portion 2d into the bladder B is detected is recorded as reference information in S2, and the reference of the 3D bladder model M1 and the reference of the 2D bladder model M2 are derived from the position and direction designated by that reference information.
- the release detection process is a process for detecting whether or not the release button 13 of the operation unit 2a of the endoscope 2 has been pressed.
- a release signal is input to the image capturing unit 24 via the processor 5.
- the release signal is a trigger signal for recording the in-vivo image acquired by the image sensor 11 that is an imaging unit.
- the CPU 21 can detect whether or not the release button 13 has been pressed by monitoring the rise (or fall) of the release signal input to the image capturing unit 24. Therefore, the process of S4 constitutes a determination unit that determines whether a release signal as a predetermined trigger signal related to imaging in the image sensor 11 has occurred.
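Monitoring the rise of the release signal amounts to edge detection on a sampled binary signal, which might look like the following sketch (the sampled-list representation is an illustration, not the patent's implementation):

```python
def rising_edges(signal):
    """Return the indices at which a sampled binary signal goes 0 -> 1,
    i.e. the points at which the release button 13 was pressed."""
    return [i for i in range(1, len(signal))
            if signal[i - 1] == 0 and signal[i] == 1]
```

Detecting the edge rather than the level ensures one held-down press records one endoscopic image.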
- the CPU 21 acquires an endoscopic image from the image capturing unit 24 (S5). As described above, the image capturing unit 24 acquires an endoscopic image from the processor 5 every 1/30 second, which is the same as the frame rate.
- the CPU 21 acquires information on the position and direction of the distal end portion 2d of the insertion portion 2b (S6). By reading the position/direction information from the position/direction detection unit 25, the CPU 21 can acquire information on the position and direction of the tip 2d.
- the CPU 21 converts the position/direction information in the coordinate system (X0Y0Z0) into position/direction information in the three-dimensional coordinate system (X2Y2Z2).
- after the position information of the objective optical window 11a and the coordinate system of the bladder model image, which is the predetermined organ model image, are matched, the position and direction of the distal end portion 2d (that is, the position and direction of the objective optical window 11a) acquired by the position/direction detection unit 25 in S6 are associated with a position and direction in the coordinate system of the bladder model image. That is, the process of S6 constitutes an associating unit that associates the position information recorded in the memory 22, which is the storage unit, with the model image of the predetermined organ in the subject.
- the CPU 21 performs an endoscopic image pasting process (S7).
- the endoscopic image pasting process pastes the endoscopic image onto the inner surface of the spherical 3D bladder model M1 based on the position and direction information acquired in S6 and converted into the three-dimensional coordinate system (X 2 Y 2 Z 2 ), and then onto the diagram of the 2D model M2 (hereinafter referred to as a 2D model image).
- the processing of S7 constitutes a part of an image generation unit that generates an image in which the in-subject image is pasted, as an in-subject pasted image, on the model image of the predetermined organ in which the position of the objective optical window 11a and the position in the coordinate system of the 3D model image have been associated by S1 to S3, which constitute the alignment unit.
- the pasting process of S7 is performed by converting the endoscopic image projected on the spherical inner surface of the 3D bladder model M1, defined by the three-dimensional coordinate system (X 2 Y 2 Z 2 ), into the two-dimensional coordinate system (U, V) and pasting it onto the image of the 2D bladder model M2.
- the position and direction of the endoscopic image to be pasted on the image of the 2D bladder model M2 are determined as described above, and the size of the endoscopic image to be pasted is changed according to, for example, the distance between the imaging portion of the distal end portion 2d and the bladder B.
- while the reference position and direction determined in S3 are a position and direction in the three-dimensional coordinate system (X 0 Y 0 Z 0 ) based on the extracorporeal magnetic field generator 7, in the pasting process of S7 the position and direction are a position and direction in the two-dimensional coordinate system (U, V) based on the neck RP of the 2D bladder model M2.
- the CPU 21 derives the position / direction information of the tip 2d in the two-dimensional coordinate system from the reference information obtained in S3, and, based on the derived position / direction information, calculates the position and inclination at which the endoscopic image is to be projected and pasted onto the 2D model image.
- when an endoscopic image has already been pasted at the position where a new endoscopic image is to be pasted, the endoscopic image pasting of S7 is performed so that the image acquired later is superimposed on the previously pasted endoscopic image.
- the CPU 21 records, in the memory 22, information on the position of the pasted endoscopic image, the 2D model image, and the presence / absence of a release signal (S8). That is, the processing of S8 constitutes a recording unit that records the endoscopic image, which is the in-subject image acquired by the image sensor 11, in association with the position information and direction information acquired by the position / direction detection unit 25.
- the CPU 21 executes an identifiable display process (S9).
- in the identifiable display process, when a plurality of endoscopic images are to be pasted on the 2D model image so that all or part of them overlap one another, the in-subject image captured when the release button 13 of the endoscope 2 was pressed is pasted in the foreground on the model image of the predetermined organ, in preference to the other in-subject images. Furthermore, for identifiable display, a predetermined frame 41 (see FIG. 10) is added to the endoscopic image captured when the release button 13 was pressed, so that the examiner can distinguish at a glance the endoscopic image at the time the release button was pressed from the other endoscopic images.
- the CPU 21 displays the 2D model image subjected to the identifiable display process on the monitor 6 through the display I / F 23 (S10). At this time, the CPU 21 also generates a 3D model image and displays it together with the 2D model image.
- the CPU 21 generates a 3D model image by generating an image of the insertion portion 2b based on the position direction information of the distal end portion 2d and superimposing it on the 3D model image.
- thus, a display unit is configured that displays, in an identifiable manner, the in-subject image acquired when the predetermined trigger signal is generated.
- the predetermined process of S9 is a process of adding the frame image 41 to the in-subject image when the release signal, which is the predetermined trigger signal, is generated. When the determination unit of S4 determines that the release signal has not been generated, the display unit does not perform the predetermined process of S9 for displaying the in-subject image in an identifiable manner.
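The foreground-pasting and frame-adding behaviour of S9 can be sketched with a simple draw-order model. The data model (dicts with a `release` flag) is an assumption for illustration, not the patent's data structure.

```python
# Minimal sketch of the identifiable display process of S9: images captured
# with the release signal are drawn after (i.e. in front of) ordinary images,
# and only they receive the frame 41.
def compose_order(images):
    """images: list of dicts with a 'release' flag.
    Returns the draw order: ordinary images first, release images last,
    each release image tagged so a frame is drawn around it."""
    ordinary = [dict(im, frame=False) for im in images if not im["release"]]
    released = [dict(im, frame=True) for im in images if im["release"]]
    return ordinary + released

imgs = [{"id": 1, "release": False},
        {"id": 2, "release": True},
        {"id": 3, "release": False}]
order = compose_order(imgs)
# draw order: [1, 3, 2]; only image 2 carries the frame flag
```

Drawing the release images last is what guarantees they appear in the foreground when images overlap.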
- the CPU 21 estimates the shape of the insertion portion based on the position and direction information of the tip 2d acquired in S6, and generates an image of the insertion portion 2b having the estimated shape. Therefore, the process of S10 includes a shape estimation unit that estimates the shape of the insertion portion 2b based on the position information and direction information of the distal end portion 2d acquired in S6 and the position information and orientation information of the urethral orifice RP; in S10, a process of superimposing an insertion portion image, which is the shape information estimated by the shape estimation unit, on the 3D model image of the predetermined organ is executed.
- the CPU 21 determines whether or not the distal end portion 2d has been removed from the bladder B (S11).
- the determination of S11 can be made by determining whether or not the position coordinates of the distal end portion 2d have moved from the neck of the bladder B into the urethra.
- the process returns to S4, and the CPU 21 repeats the processing from S4 to S11 until the distal end portion 2d is removed from the bladder B.
- FIG. 9 is a diagram showing an example of a display screen at the time of endoscopy displayed on the screen of the monitor 6.
- the screen G1 is a screen generated by the CPU 21 and includes a 2D model image display unit 31, a 3D model image display unit 32, and a live image display unit 33 for displaying a live endoscopic image (hereinafter referred to as a live image).
- the 2D model image display unit 31 is an area for displaying a 2D model image corresponding to the 2D model of FIG.
- the 2D model image display unit 31 displays a 2D model image 31a, which is a 2D bladder development view, and an endoscopic image 31b, which is an in-subject image pasted on the 2D model image 31a by the processes of S7 and S9.
- the 3D model image display unit 32 is an area for displaying a 3D model image corresponding to the 3D model shown in FIG.
- the 3D model image display unit 32 displays a 3D model image 32a and an insertion portion image 32b indicating the position and direction of the distal end portion 2d of the insertion portion 2b in the 3D model.
- the CPU 21 generates the insertion portion image 32b based on the current position / direction information of the distal end portion 2d.
- the 2D model image display unit 31 shown in FIG. 9 displays an image obtained when the endoscopic image first captured when the distal end portion 2d enters the bladder B and faces the apex is pasted on the 2D model image 31a.
- the live in-subject image acquired by the image sensor 11 is displayed together with the model image, and the insertion shape of the insertion portion 2b, which has the image sensor 11 for capturing the live in-subject image, is also displayed together with the model image.
- the live image display unit 33 is an area in which the endoscopic image acquired by the monitor 6 from the processor 5 is displayed as it is.
- the live image display unit 33 is included in the screen G1 by the PinP function of the monitor 6, for example.
- although the live endoscopic image is displayed on the monitor 6 here by using the PinP function of the monitor 6, the CPU 21 of the recording device 3 may instead synthesize the live image into the screen G1 and output it to the monitor 6.
- FIG. 10 is a diagram showing another example of the display screen displayed on the screen of the monitor 6.
- the 2D model image display unit 31 in FIG. 10 displays an image when a plurality of endoscopic images 31b picked up in various directions by moving the distal end portion 2d are pasted on the 2D model image 31a. ing.
- a predetermined frame image 41 is added, by the distinguishable display process of S9, to the endoscopic image captured when the release button 13 was pressed. Since the color tones of the endoscopic images pasted on the 2D model image 31a are similar, without the frame image 41 it would be difficult to identify the endoscopic image captured when the release button 13 was pressed. By adding the frame image 41, that endoscopic image stands out, and the examiner can distinguish at a glance the endoscopic image at the time the release button was pressed from the other endoscopic images, making it easy to find even a lesion that is small or shows little color change.
- the 2D model image display unit 31 includes a plurality of endoscopic images 31b.
- a region where a plurality of endoscopic images are pasted is a region observed by the examiner. Therefore, the examiner can easily discriminate the region observed by the endoscope only by looking at the image of FIG.
- the 3D model image display unit 32 displays an insertion portion image 32b indicating the current line-of-sight direction of the distal end portion 2d on the 3D model image 32a, so that the inspector is currently observing. Easy to understand.
- after the distal end portion 2d is removed from the bladder B, the 2D model image display unit 31 of the screen G1 displayed on the monitor 6 shows the image at the time the last endoscopic image was processed. Further, only the 3D model image 32a, on which the insertion portion image 32b of the insertion portion 2b is no longer displayed, is shown on the 3D model image display unit 32, and no live image of the inside of the bladder B is shown on the live image display unit 33.
- the examiner may record the image of the 2D model image display unit 31 in the nonvolatile memory unit of the memory 22 as patient chart data, or may print and paste the image on the chart.
- FIG. 11 is a diagram for explaining the relationship between the coordinate system of the magnetic field generation device 7 and the coordinate system of the bladder B of the patient P on the bed 8.
- the position / direction detection unit 25 generates position / direction information based on the first coordinate system (X 0 Y 0 Z 0 ) of the magnetic field generator 7 in real time.
- the CPU 21 determines the position and direction of the entrance of the bladder B as the reference position and reference direction, and converts the position / direction information from the position / direction detection unit 25 into position / direction information in the coordinate system (X 1 Y 1 Z 1 ) based on the entrance of the bladder B according to the following expressions (1) and (2):
- P 1 = R 01 P 0 + M 01 ... (1)
- V 1 = R 01 V 0 ... (2)
- P 0 and V 0 are the position and direction vectors in the first coordinate system (X 0 Y 0 Z 0 ), which is a coordinate system based on the magnetic field generator 7, respectively.
- R 01 is a rotation matrix expressed by the following equation (3)
- M 01 is a translation matrix expressed by the following equation (4).
- a point (x 0 , y 0 , z 0 ) on the first coordinate system (X 0 Y 0 Z 0 ) is converted into a point (x 1 , y 1 , z 1 ) on the intermediate coordinate system (X 1 Y 1 Z 1 ) as shown in the following equation (5).
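The rigid transformation of equations (1), (2), and (5) can be checked numerically as follows. This is an illustrative Python sketch; the concrete matrix values are made-up stand-ins for equations (3) and (4).

```python
# Sketch of equations (1) and (2): a point P0 and a direction V0 in the
# magnetic-field-generator coordinate system (X0 Y0 Z0) are mapped into the
# intermediate system (X1 Y1 Z1) by a rotation R01 and a translation M01.
def mat_vec(R, v):
    """3x3 matrix times 3-vector."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

def to_intermediate(R01, M01, P0, V0):
    P1 = [a + b for a, b in zip(mat_vec(R01, P0), M01)]  # eq. (1)
    V1 = mat_vec(R01, V0)                                # eq. (2): no translation
    return P1, V1

# A 90-degree rotation about Z0 plus a shift, standing in for eqs. (3)/(4):
R01 = [[0, -1, 0], [1, 0, 0], [0, 0, 1]]
M01 = [10.0, 0.0, 0.0]
P1, V1 = to_intermediate(R01, M01, [1.0, 0.0, 0.0], [1.0, 0.0, 0.0])
# P1 == [10.0, 1.0, 0.0]; V1 == [0.0, 1.0, 0.0]
```

Note that the direction vector is only rotated, never translated, which is exactly the difference between equations (1) and (2).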
- FIG. 12 is a diagram for explaining the direction vector projected onto the intermediate coordinate system (X 1 Y 1 Z 1 ).
- the conditions that the rotation matrix R 01 satisfies are that Z 1 is parallel to the direction of gravity, that the direction of the vector obtained by projecting V ′ 0 onto the X 1 Y 1 plane perpendicular to Z 1 is Y 1 , and that the vector perpendicular to the Y 1 Z 1 plane is X 1 .
- the position and direction vector in the intermediate coordinate system (X 1 Y 1 Z 1 ) are converted into a position and direction vector in the second coordinate system (X 2 Y 2 Z 2 ) based on the center of the 3D bladder model M1.
- FIG. 13 is a diagram for explaining the relationship between the intermediate coordinate system (X 1 Y 1 Z 1 ) and the second coordinate system (X 2 Y 2 Z 2 ).
- V 2 = R 12 V 1 ... (8)
- P 1 and V 1 are the position and direction vectors in the intermediate coordinate system (X 1 Y 1 Z 1 ), respectively, and P 2 and V 2 are the position and direction vectors in the second coordinate system (X 2 Y 2 Z 2 ), respectively.
- V 2 is the direction vector of the center of the pixel of the endoscopic image in the second coordinate system (X 2 Y 2 Z 2) .
- R 12 is a rotation matrix represented by the following equation (9), and M 02 is a translation matrix represented by the following equation (10).
- a point (x 1 , y 1 , z 1 ) on the intermediate coordinate system (X 1 Y 1 Z 1 ) is converted into a point (x 2 , y 2 , z 2 ) on the second coordinate system (X 2 Y 2 Z 2 ).
- the position P 0 in the first coordinate system (X 0 Y 0 Z 0 ) of the magnetic field generator 7 is converted into the position P 2 in the second coordinate system (X 2 Y 2 Z 2 ) based on the center of the 3D model according to equations (5) and (11), and the direction V 0 in the first coordinate system (X 0 Y 0 Z 0 ) is converted into the direction V 2 in the second coordinate system (X 2 Y 2 Z 2 ) according to the following equation (14).
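The full chain, first coordinate system to intermediate system to second system, can be sketched by composing two rigid transforms. The matrices R01, M01, R12, and M02 below are illustrative placeholders; the patent defines the actual values in equations (3)-(4) and (9)-(10).

```python
# Sketch of chaining the two conversions: (X0 Y0 Z0) -> (X1 Y1 Z1) by
# eqs. (1)-(2), then (X1 Y1 Z1) -> (X2 Y2 Z2), with eq. (8) for V2.
def rigid(R, M, p):
    """Apply rotation R then translation M to point p."""
    return [sum(R[i][j] * p[j] for j in range(3)) + M[i] for i in range(3)]

def rotate(R, v):
    """Rotate direction vector v (directions are never translated)."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

R01 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]; M01 = [0.0, -2.0, 0.0]  # placeholder
R12 = [[1, 0, 0], [0, 0, 1], [0, -1, 0]]; M02 = [0.0, 0.0, 1.0]  # placeholder

P0, V0 = [0.0, 2.0, 3.0], [0.0, 0.0, 1.0]
P1 = rigid(R01, M01, P0); V1 = rotate(R01, V0)
P2 = rigid(R12, M02, P1); V2 = rotate(R12, V1)   # V2 = R12 V1, eq. (8)
# P2 == [0.0, 3.0, 1.0]; V2 == [0.0, 1.0, 0.0]
```

Composing the two stages is equivalent to a single rigid transform directly from the magnetic-field-generator frame to the bladder-model frame.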
- the 3D model M1 assumes that the shape of the bladder B is a sphere with a radius R2.
- the endoscopic image is pasted on the inner surface of the sphere.
- FIG. 14 is a diagram for explaining coordinates on the inner surface of the sphere in the second coordinate system (X 2 Y 2 Z 2 ).
- FIG. 15 is a diagram for explaining the position P 2 and the direction V 2 in the second coordinate system (X 2 Y 2 Z 2 ) from the position and direction vector of the tip 2 d.
- FIG. 16 is a diagram for explaining a coordinate relationship in a two-dimensional coordinate system (U, V).
- as described above, the direction vector V 2 is the direction vector of the pixel at the center of the endoscopic image in the second coordinate system (X 2 Y 2 Z 2 ). Therefore, for the pixels other than the pixel at the center of the image, the direction vector of each pixel is obtained and the conversion operations of equations (15) to (20) described above are repeated, so that the entire endoscopic image can be pasted onto the inner surface of the sphere of the second coordinate system (X 2 Y 2 Z 2 ).
- FIG. 17 is a diagram for explaining the pasting of each pixel to the inner surface of the sphere of the second coordinate system (X 2 Y 2 Z 2 ) by scanning the entire endoscopic image.
- each pixel of the endoscopic image EI is pasted onto the inner surface of the sphere of the second coordinate system (X 2 Y 2 Z 2 ) while the image is scanned in a predetermined direction as indicated by the dotted line.
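The pasting of a single pixel can be sketched as a ray-sphere intersection: from the tip position P 2 inside the sphere of radius R2 (centered at the origin of the second coordinate system), a ray is cast along the pixel's direction vector and the point where it meets the inner surface is the pasting position. This is the standard geometric construction; whether the patent's equations (15)-(20) are organized exactly this way is an assumption.

```python
import math

def paste_point(P2, V2p, R2):
    """Find where the ray P2 + t*V2p (t > 0) meets the sphere |x| = R2.
    P2 must lie inside the sphere, so exactly one positive root exists."""
    a = sum(v * v for v in V2p)
    b = 2.0 * sum(p * v for p, v in zip(P2, V2p))
    c = sum(p * p for p in P2) - R2 * R2
    t = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)  # positive root
    return [p + t * v for p, v in zip(P2, V2p)]

# From the sphere centre, looking along +Z2 with R2 = 2, the pixel
# lands at the pole of the sphere's inner surface.
P21 = paste_point([0.0, 0.0, 0.0], [0.0, 0.0, 1.0], 2.0)
# P21 is approximately [0.0, 0.0, 2.0]
```

Scanning the whole image, as FIG. 17 illustrates, amounts to repeating this per-pixel intersection with each pixel's own direction vector.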
- V 2 ′ indicates the pasting vector of each pixel of the endoscopic image EI, and P 21 ′ indicates the pasting position on the inner surface of the sphere of the second coordinate system (X 2 Y 2 Z 2 ).
- as described above, the endoscopic images of the portions examined inside the bladder B are superimposed on the 2D model image 31a, and the endoscopic image captured when the release button 13 was pressed is superimposed and displayed on the 2D model image 31a, so that the examiner can easily confirm the region checked in the bladder B and can clearly see the image of a lesioned part or a part of interest. Note that, when endoscopic images are pasted on the 2D model image 31a, only the endoscopic image captured when the release button 13 was pressed may be pasted.
- FIG. 18 is a diagram showing another example of an image displayed on the screen of the monitor 6.
- on the 2D model image display unit 31, only the endoscopic image captured when the release button 13 was pressed is pasted on the 2D model image 31a.
- a predetermined frame image 41 is added to the endoscopic image when the release button 13 is pressed.
- the examiner may also record the image of the 2D model image display unit 31 in FIG. 18 in the nonvolatile memory unit of the memory 22 as patient chart data, or may print it and paste it on the chart.
- the display state of FIG. 10 and the display state of FIG. 18 may be switched.
- since the magnetic sensor 12 is a six-axis sensor, the plurality of endoscopic images pasted on the 2D model image are pasted so that their vertical and horizontal directions coincide with one another.
- the magnetic sensor 12 may be a 5-axis sensor.
- FIG. 19 is a diagram showing an example of an image displayed on the screen of the monitor 6 when a 5-axis sensor is used.
- FIG. 20 is a diagram illustrating an example of an image in which, when the 5-axis sensor is used, only the endoscopic image captured when the release button 13 was pressed is pasted on the 2D model image 31a. FIG. 19 corresponds to FIG. 10, and FIG. 20 corresponds to FIG. 18. In FIGS. 19 and 20 as well, a predetermined frame image 41 is added to the endoscopic image captured when the release button 13 was pressed.
- when the 5-axis sensor is used, each endoscopic image 31b is pasted on the 2D model image 31a at a predetermined angle unrelated to the rotation of the insertion portion 2b about its axis. Even with a 5-axis sensor, the same effects as in the above-described embodiment can be obtained.
- in the example described above, the endoscopic image in the normal light observation mode is pasted on the organ model image, but an endoscopic image in the special light observation mode may be pasted on the organ model image instead.
- the endoscope image 31b is not an endoscope image of normal light but an endoscope image of special light (here, narrowband light).
- two organ model images may be displayed, one of which is an ordinary light endoscope image and the other of which is a special light endoscope image.
- FIG. 21 is a diagram illustrating an example of a display screen when images of two organ models are displayed corresponding to two observation modes.
- the same components in FIGS. 10 and 18 to 20 are denoted by the same reference numerals, and description thereof is omitted.
- FIG. 21 shows an example in which a 6-axis sensor is used.
- a 2D model image display unit 34 for pasting the special light endoscopic image is added on the screen.
- the 2D model image display unit 34 displays a 2D model image 34a and an endoscope image 34b of special light pasted on the 2D model image 34a by the processes of S7 and S9.
- a predetermined frame image 41 is added to the endoscopic images of normal light and special light when the release button 13 is pressed.
- the examiner can perform the examination while comparing both images, and in a subsequent examination, if both images are attached to the chart, the examiner can know the state of the organ in the previous examination in more detail.
- that is, a plurality of model images are set, and endoscopic images are pasted, according to the type of illumination light, onto the model images set on the basis of the type of illumination light of the light source device 4, which is the illumination unit.
- since the endoscopic image of narrow-band light shows the fine texture of the mucosal surface in more detail than the endoscopic image of normal light, an image may instead be generated in which both the normal-light endoscopic images and the narrow-band-light endoscopic images are pasted on a single 2D model image display unit, with the narrow-band-light endoscopic image captured when the release button was pressed attached in the foreground of the 2D model image 31a of the 2D model image display unit 31.
- when the examiner, viewing the endoscopic image, recognizes from the change in the endoscopic image displayed on the monitor 6 that the distal end portion 2d has entered the bladder, a predetermined operation may be performed on the operation unit 2a or on the operation panel of the processor 5 to record the reference position and direction. That is, the position and direction of the objective optical window 11a may be aligned with the coordinate system of the organ model image based on a predetermined operation input by the examiner.
- alternatively, a plane perpendicular to the Y 1 direction of the coordinate system (X 1 Y 1 Z 1 ) based on the entrance of the bladder B, containing the position where the endoscope enters the bladder from the urethra, may be set; the endoscope may then be inserted into the urethra, and the position and direction when the distal end passes through that plane may be recorded as the reference position and direction. That is, the position and direction of the objective optical window 11a may be aligned with the coordinate system of the organ model image based on position information with respect to a preset reference plane.
- as described above, the examiner can easily know the position of each endoscopic image in the organ under examination, and an endoscope system can be realized in which it is easy to identify, among the plurality of endoscopic images, only the specific endoscopic image captured when the release button was pressed.
- in the embodiment described above, the frame image 41 is added so as to surround the endoscopic image captured when the release button was pressed, so that only that specific endoscopic image is displayed distinguishably from the other endoscopic images; however, a mark such as an arrow pointing to the endoscopic image captured when the release button was pressed may be displayed instead.
- FIG. 22 is a diagram illustrating an example of a screen G1 according to the first modification that displays only a specific endoscopic image so as to be distinguishable from other endoscopic images.
- an arrow 42 as a mark indicating the endoscopic image when the release button is pressed is displayed together with the endoscopic image.
- the examiner can easily determine the endoscopic image indicated by the arrow 42.
- the predetermined process of S9 is a process of adding a predetermined mark (for example, an arrow) to the in-vivo image when the release signal that is the predetermined trigger signal is generated. Therefore, the examiner can easily find the endoscopic image when the release button is pressed.
- Modification 2: In the first modification described above, a mark such as an arrow indicating the endoscopic image captured when the release button was pressed is displayed; however, the hue of the endoscopic image captured when the release button was pressed may instead be made different from that of the other endoscopic images.
- FIG. 23 is a diagram illustrating an example of a screen G1 according to Modification 2 that displays only a specific endoscopic image so as to be distinguishable from other endoscopic images.
- among the plurality of endoscopic images 31b, the endoscopic image 43 (shown by diagonal lines) captured when the release button was pressed is displayed in a hue different from that of the other endoscopic images 31b for which the release button was not pressed. That is, here, the predetermined process of S9 makes the color tone of the in-subject image at the time when the release signal, which is the predetermined trigger signal, is generated different from the color tone of the in-subject images for which the determination unit of S4 determines that the release signal has not been generated.
- for example, the endoscopic image 43 (shown by diagonal lines) captured when the release button was pressed is displayed in color while the other endoscopic images 31b, for which the release button was not pressed, are displayed in monochrome; or the endoscopic image 43 is highlighted so that at least one of its saturation and lightness is displayed higher than at least one of the saturation and lightness of the other endoscopic images 31b for which the release button was not pressed.
- FIG. 24 is a diagram illustrating an example of a screen G1 according to Modification 3, which displays only the specific endoscopic image captured when the release button was pressed so as to be distinguishable from the other endoscopic images.
- the endoscopic image 44 (indicated by hatching) captured when the release button was pressed among the plurality of captured endoscopic images 31b is displayed in the foreground, and the regions 44a of the other endoscopic images, for which the release button was not pressed, are filled with a predetermined color (shown by dotted diagonal lines) that differs from the color of the background bladder development view (schema).
- that is, the predetermined process of S9 makes the color tone of the in-subject images for which the release signal, which is the predetermined trigger signal, was not generated different from the color tone of the model image of the predetermined organ. Therefore, the examiner can easily recognize the observed area from the areas filled with the predetermined color, and can easily identify the endoscopic image captured when the release button was pressed.
- FIG. 25 is a diagram illustrating an example of a screen G1 when a magnetic sensor of a 5-axis sensor according to Modification 3 is used.
- as described above, the examiner can easily know the position of each endoscopic image in the organ under examination, and an endoscope system can be realized in which it is easy to identify, among the plurality of endoscopic images, only the specific endoscopic image captured when the release button was pressed.
- the first embodiment is an endoscope system that makes it possible to easily identify only the endoscopic image captured when the release button was pressed, whereas the present embodiment is an endoscope system that makes it possible to easily identify, without operation of the release button, only the endoscopic image for which the processing result of predetermined image processing, or the determination result of the presence or absence of a predetermined event, is a predetermined result.
- the endoscope system according to the present embodiment has the configuration shown in FIGS.
- the endoscopic image pasting process to the organ model image in the endoscopic system of the present embodiment is different from the pasting process of the endoscopic image to the organ model image of the first embodiment.
- FIG. 26 is a flowchart showing an example of a flow of an endoscopic image pasting process to a bladder model image at the time of observation inside the bladder according to the present embodiment.
- in FIG. 26, the same processes as those in the flowchart of the first embodiment are denoted by the same step numbers, and the description thereof is omitted.
- the CPU 21 acquires an endoscopic image (S5) after the reference determination (S3). Thereafter, the CPU 21 acquires information on the position and direction of the distal end portion 2d (S6) and performs an endoscope image pasting process (S7).
- the CPU 21 performs recording processing of the endoscope image information and the position and direction information (S21). Then, after executing the process of S21, the CPU 21 executes a determination process and an identifiable display process (S22).
- the determination process of S22 is a determination process for displaying an endoscopic image from which a predetermined determination result is obtained so as to be distinguishable from other endoscopic images.
- FIG. 27 is a flowchart illustrating an example of the flow of determination processing and distinguishable display processing.
- the CPU 21 acquires determination information (S31).
- the determination information acquired in S31 differs depending on the determination method.
- the determination targets are: a) the presence or absence of an uneven portion having a predetermined size (height) in the image; b) the presence or absence of a portion whose color tone differs from the surroundings; and c) the presence or absence of a portion whose texture differs from the surroundings.
- as event detection methods, there are d) the presence or absence of a predetermined operation and e) the presence or absence of a mode switching operation.
- the determination information acquired in S31 is information on the size (that is, height) of the unevenness in the image.
- the size of the unevenness in the image is calculated from position information of a plurality of predetermined points in two images, that is, from distance information from the image sensor 11 to each point obtained using a stereo measurement function. Specifically, for a plurality of predetermined points in the image, the size (that is, the height) of the unevenness of the site of the subject is calculated from the difference between the distance from the image sensor 11 to one of two adjacent points and the distance from the image sensor 11 to the other. For example, if a lesioned part in the bladder B is swollen and raised, the raised part is a convex portion higher than the surrounding mucosal surface. The height of such a convex portion on the subject surface in the in-subject image constitutes a predetermined feature amount.
- the CPU 21 acquires, as determination information, a plurality of pieces of information about the difference between two adjacent points obtained from a plurality of pieces of distance information from the image sensor 11 to a plurality of predetermined points in the endoscopic image.
- the CPU 21 determines whether or not a predetermined determination condition is satisfied from the acquired determination information (S32).
- the determination condition is that there are a plurality of pieces of difference information equal to or greater than a predetermined threshold (TH1); if such pieces of difference information exist, it is considered, for example, that a tumor is present.
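Determination a) can be sketched as follows. The distance values, the threshold TH1, and the minimum count are made-up illustrative numbers; the patent does not specify them.

```python
# Sketch of determination a): from stereo-measured distances to adjacent
# points, compute height differences and check whether enough of them
# reach the threshold TH1.
def has_convexity(distances, th1, min_count=2):
    """distances: distances from the image sensor to adjacent points
    along a scan line. Returns True when at least min_count adjacent-point
    differences are >= th1."""
    diffs = [abs(d2 - d1) for d1, d2 in zip(distances, distances[1:])]
    return sum(1 for d in diffs if d >= th1) >= min_count

# Distances (e.g. in mm) from the image sensor 11 along a scan line; the
# closer region in the middle suggests a raised (convex) part.
dists = [20.0, 19.8, 17.0, 16.9, 19.9, 20.1]
raised = has_convexity(dists, th1=2.0)   # True for this data
flat = has_convexity([20.0] * 6, th1=2.0)  # False: no height changes
```

A real implementation would work on a 2D grid of measured points rather than a single scan line, but the thresholding idea is the same.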
- if there are pieces of difference information equal to or greater than the predetermined threshold (TH1) among the plurality of pieces of difference information (S32: YES), the CPU 21 executes the identifiable display process (S9); if there is none (S32: NO), the CPU 21 does not execute the process of S9. Thereafter, the CPU 21 executes the display process (S10) and repeats the processing from S5 to S10 until the insertion portion 2b is removed from the bladder B.
- as described above, the frame image 41 is automatically added only to an endoscopic image determined to have a convex portion, and that image is displayed so as to be distinguishable from the other endoscopic images; therefore, the examiner can easily identify only the endoscopic images determined to have a convex portion.
- the above example is the case of a), but in the cases of b) to e), the determination information and the determination conditions are different.
- the determination information is a predetermined calculated value calculated from the color information of each pixel in the endoscopic image.
- the calculated value is, for example, an average value of pixel values calculated for each RGB channel, a variance, and a ratio (for example, G / R) of RGB pixel values calculated for each pixel.
- the color tone of the in-subject image constitutes a predetermined feature amount.
- the determination condition is whether or not the difference between a predetermined calculated value obtained from the color information and the corresponding calculated value in another image or a past image is equal to or greater than a predetermined value; when it is, it is determined that the determination condition is satisfied.
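Determination b) can be sketched with the G/R ratio mentioned above as the calculated value. The pixel data and the threshold `delta` are illustrative assumptions.

```python
# Sketch of determination b): compare a per-image colour statistic (here
# the mean G/R ratio) against the same statistic of another or past image;
# a gap of at least `delta` satisfies the condition.
def mean_gr_ratio(pixels):
    """pixels: list of (R, G, B) tuples with nonzero R."""
    return sum(g / r for r, g, _ in pixels) / len(pixels)

def color_condition(curr, other, delta):
    return abs(mean_gr_ratio(curr) - mean_gr_ratio(other)) >= delta

reddish = [(200, 80, 60), (190, 76, 58)]   # e.g. a reddened area (low G/R)
normal  = [(180, 140, 90), (176, 138, 88)]
differs = color_condition(reddish, normal, delta=0.2)  # True for this data
same = color_condition(reddish, reddish, delta=0.2)    # False: no gap
```

The per-channel averages and variances mentioned in the text would slot into `color_condition` in exactly the same way as the G/R ratio does here.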
- in case c), the determination information is texture information in the endoscopic image.
- the texture information is generated from feature amounts (for example, edge elements) extracted from the endoscopic image by image processing.
- in this case, the texture of the in-subject image constitutes the predetermined feature amount.
- the determination condition is whether the difference between the obtained texture information and the texture information in another image or a past image is equal to or greater than a predetermined value. For example, when the image contains many more edge elements than another image or a past image, the determination condition is judged to be satisfied.
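Case c) compares texture between images. Since the patent does not specify the edge-extraction method, the sketch below uses a crude horizontal-gradient count as a stand-in for "edge elements"; the threshold values and test images are illustrative:

```python
def edge_density(gray, threshold=30):
    """Crude texture measure: fraction of horizontally adjacent
    pixel pairs whose intensity difference exceeds `threshold`.
    This is only a stand-in for the patent's 'edge elements';
    the actual extraction method is not specified there."""
    edges = total = 0
    for row in gray:
        for a, b in zip(row, row[1:]):
            total += 1
            if abs(a - b) > threshold:
                edges += 1
    return edges / total if total else 0.0

def texture_condition(current, reference, min_diff):
    """Case c): satisfied when the texture difference from another
    or past image is a predetermined value or more."""
    return abs(current - reference) >= min_diff

flat = [[100, 101, 100, 102]] * 2   # low-texture image (illustrative)
busy = [[100, 200, 90, 210]] * 2    # edge-rich image (illustrative)
satisfied = texture_condition(edge_density(busy), edge_density(flat), 0.5)
```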
- in case d), the determination information is the amount of change in at least one of the position and the direction of the distal end portion 2d of the insertion portion 2b.
- the information on the position and direction of the distal end portion 2d is obtained from the position and direction information of the magnetic sensor 12, and the amount of change is the difference in at least one of the position and the direction.
- the CPU 21 can calculate the amount of change.
- in this case, an operation signal indicating a predetermined operation on the insertion portion 2b of the endoscope provided with the image pickup device 11 serving as the imaging unit constitutes the predetermined trigger signal.
- the determination condition is whether the amount of change is equal to or greater than (or less than) a predetermined threshold. When the amount of change is equal to or greater than (or less than) the predetermined value, the determination condition is judged to be satisfied.
- the endoscopic image to which the frame image 41 is automatically added when the determination condition is satisfied is an image stored in the memory 22 immediately before the condition is judged to be satisfied. That is, since a plurality of endoscopic images captured by the image capturing unit 24 are stored in the memory 22, an endoscopic image from a predetermined time or a predetermined number of frames before the moment the condition is judged to be satisfied is selected from the memory 22, and the frame image 41 is added to that image.
- the amount of change may also be one indicating a stationary state in which neither the position nor the direction changes for a predetermined time.
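Case d) works on the amount of change of the distal end position from the magnetic sensor 12. A sketch using Euclidean distance as an assumed change metric (the patent leaves the metric, the sample data, and the thresholds unspecified):

```python
import math

def change_amount(pos_a, pos_b):
    """Positional change of the distal end portion 2d between two
    samples of the magnetic sensor 12. Euclidean distance is an
    assumed metric; the patent leaves it unspecified."""
    return math.dist(pos_a, pos_b)

def is_stationary(positions, eps, min_samples):
    """Stationary state: the position stays within `eps` of the
    previous sample for `min_samples` consecutive samples
    (illustrative parameters)."""
    if len(positions) < min_samples:
        return False
    recent = positions[-min_samples:]
    return all(change_amount(a, b) < eps
               for a, b in zip(recent, recent[1:]))

track = [(0, 0, 0), (0.01, 0, 0), (0.01, 0.005, 0), (0.012, 0.005, 0)]
still = is_stationary(track, eps=0.05, min_samples=3)
```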
- in case e), the determination information is information on the operation signal of the changeover switch 5a.
- the operation signal of the changeover switch 5a is supplied from the processor 5 to the CPU 21 via the image capturing unit 24.
- in this case, the observation-mode switching signal used when imaging the inside of the subject constitutes the predetermined trigger signal.
- the determination condition is whether the observation mode has been changed, based on the operation signal. When the observation mode has been changed, the determination condition is judged to be satisfied.
- here, the endoscope system 1 has two modes, a normal-light observation mode and a special-light observation mode, but it may also have a PDD mode for photodynamic diagnosis (PDD).
- the processing of S7 and S22 is performed when the determination unit of S32 determines that a predetermined trigger signal has been generated or that a predetermined feature amount satisfies a predetermined condition.
- the predetermined process of S9, which displays in an identifiable manner the in-subject image for which the feature amount satisfies the predetermined condition, together with the generation and display of an image in which the in-subject image is pasted onto the model image of the predetermined organ associated with it, constitutes a display unit.
- the process of S32 constitutes a determination unit that determines whether a predetermined trigger signal related to imaging by the image pickup device 11 (a predetermined twisting-operation signal or a mode-switching signal) has been generated, or whether a predetermined feature amount (unevenness, color tone, or texture of the in-subject image) satisfies a predetermined condition.
- a combination of two or more of a) to e) may be used as the determination condition. Note that the three modifications described in the first embodiment can also be applied to the second embodiment.
- as in Modification 1, the predetermined process of S9 may add a predetermined mark to the in-subject image when the predetermined trigger signal is generated or when the predetermined feature amount satisfies the predetermined condition. As in Modification 2, the color tone of the in-subject image when the predetermined trigger signal is generated or when the predetermined feature amount satisfies the predetermined condition may be made different from the color tone of the in-subject image when, in the determination unit of S32, no trigger signal is generated or the feature amount does not satisfy the condition. As in Modification 3, the color tone of the in-subject image when no trigger signal is generated or when the predetermined feature amount does not satisfy the predetermined condition may be made different from the color tone of the model image of the predetermined organ.
- as a result, the examiner can easily know the position of each endoscopic image in the examination target organ, and endoscopic images are pasted onto the organ model image of the target organ so that only specific endoscopic images are easy to identify among the plurality of endoscopic images; an endoscope system that can shorten the examination time or treatment time with the endoscope can therefore be realized.
- in each of the embodiments and modifications described above, a specific endoscopic image is displayed in an identifiable manner by adding a frame image to the endoscopic image pasted on the organ model image, but an enlarged image of the specific endoscopic image may also be displayed together.
- FIGS. 28 to 30 are diagrams showing examples of display screens, shown on the screen of the monitor 6, that display an enlarged image together with the organ model image.
- in FIG. 28, a frame image 41 is added to the endoscopic image 31b captured when the release button was pressed, and an enlarged display section 51 that displays an enlarged image of the endoscopic image 31b to which the frame image 41 has been added is displayed on the screen G2.
- by viewing it, the examiner can easily check the endoscopic image captured when the release button was pressed, for example, to confirm whether the lesioned part has been properly captured.
- in FIG. 29, a frame image 41 is added to each endoscopic image 31b captured when the release button was pressed (twice in FIG. 29), and an enlarged display section 52 that displays enlarged images of the two endoscopic images 31b to which the frame image 41 has been added is displayed on the screen G2.
- in FIG. 30, a frame image 41 is added to each endoscopic image 31b captured when the release button was pressed (twice in FIG. 30), and an enlarged display section 52a that displays the enlarged images of the two endoscopic images 31b, each together with the corresponding number of the endoscopic image to which the frame image 41 has been added, is displayed on the screen G2.
- identification numbers surrounded by circles are added to the two endoscopic images, and the same circled identification numbers are also added to the enlarged images in the enlarged display section 52a.
- in FIG. 30, the 3D model image display section 32 is not displayed, so that the plurality of enlarged images can be shown. The examiner can therefore easily grasp which endoscopic image enlarged in the enlarged display section 52a corresponds to which endoscopic image with the added frame image shown in the 2D model image display section 31.
- in FIG. 30, the correspondence between the endoscopic images on the bladder development view (schema) and the enlarged images is indicated by numbers, but characters, symbols, or the like may be used instead. Further, the correspondence may be shown by lines connecting each endoscopic image to its enlarged image, as indicated by the dotted line 53 in FIG. 30.
- alternatively, only the endoscopic image indicated by the pointer when the examiner moves the pointer on the screen onto an endoscopic image may be displayed as an enlarged image.
- an enlarged display unit may be provided in each of the above-described embodiments and modifications.
- in each of the embodiments described above, the endoscopic image is pasted onto a two-dimensional organ model image, but it may instead be pasted onto a three-dimensional organ model image; that is, the model image may be a 3D image instead of a 2D image.
- in each of the embodiments described above, endoscopic images of the inside of the bladder are pasted onto a 2D model image of the bladder, but the endoscope system of each embodiment can also be applied to organs other than the bladder, for example the stomach and the uterus, as long as reference information can be determined for pasting onto the organ model image.
- in each of the embodiments described above, the endoscope 2 is a flexible endoscope having a flexible insertion portion, but the present invention is not limited to this and can also be applied to other types of endoscope, such as a rigid endoscope or a scanning endoscope, as well as to an endoscope whose insertion portion has a light guide member that guides the light incident on the objective optical window at the distal end portion to the proximal end portion.
- while the endoscope system described above is used to record or display the position of endoscopic images within an organ, it can also be used to record biopsy positions in a random biopsy.
Abstract
Description
Furthermore, Japanese Patent Application Laid-Open No. 2007-236700 proposes a technique in which an omnidirectional image obtained with a capsule endoscope is developed to create a developed image, blood-vessel patterns and structural patterns in the developed image are extracted, and, based on the extraction result, the developed images are stitched together and displayed superimposed on a contour drawing of the organ.
In the endoscope system disclosed in the above Japanese Patent Application Laid-Open No. 2010-240000, which uses a capsule endoscope, endoscopic images are pasted onto a 3D model of the target organ; however, because a capsule endoscope is used, the doctor cannot easily grasp the position of a lesioned part in the organ from the obtained endoscopic images. In the capsule endoscope system disclosed in the above Japanese Patent Application Laid-Open No. 2007-236700 as well, developed images are stitched onto the contour drawing of the organ, but it is not easy to grasp the position of a lesioned part in the organ from the mosaic image formed by stitching the developed images together.
Furthermore, when a lesioned part is small or shows little change in color tone, the doctor has difficulty recognizing it, so it may not be easy to identify, among the plurality of obtained endoscopic images, only the endoscopic images containing the lesioned part.
As a result, there is the problem that examination or treatment with the endoscope takes a long time.
(First Embodiment)
(Configuration)
FIG. 1 is a configuration diagram showing the configuration of the endoscope system according to the present embodiment. FIG. 2 is a block diagram showing the configuration of the endoscope system 1. The endoscope system 1 includes an endoscope 2, a recording device 3, a light source device 4, a processor 5, a monitor 6, and a magnetic field generating device 7. The endoscope system 1 has two observation modes, normal-light observation and special-light observation. A doctor acting as the examiner performs an endoscopic examination of the bladder B of a patient P lying on his or her back on a bed 8.
Further, although not shown, a light guide is inserted through the universal cable 2c, and the endoscope 2 is configured to emit illumination light from the light source device 4 through the light guide from the distal end portion 2d of the insertion portion 2b.
The generated endoscopic image is output from the processor 5 to the monitor 6, and the live endoscopic image is displayed on the monitor 6. The doctor performing the examination (hereinafter referred to as the examiner) inserts the distal end portion 2d of the insertion portion 2b through the urethra of the patient P and can observe the inside of the bladder B of the patient P (shown by the dotted line in FIG. 1).
Note that the magnetic sensor 12 may be a 5-axis sensor.
The memory 22 is a storage unit including a ROM, a RAM, a flash memory, and the like, in which the various processing programs executed by the CPU 21 and various data are stored; as described later, endoscopic image information and position and direction information are also stored in it.
Furthermore, the processor 5 is also connected to the monitor 6. The monitor 6 has a PinP (Picture In Picture) function and can display, together with the organ model image onto which endoscopic images have been pasted by the CPU 21, the live endoscopic image obtained by imaging with the image pickup device 11 of the endoscope 2.
FIG. 3 is a flowchart showing an example of the flow of the process of pasting endoscopic images onto the bladder model image during observation of the inside of the bladder. The process of FIG. 3 is executed by the CPU 21 reading and executing a predetermined program stored in the memory 22, starting from when the examiner inserts the distal end portion 2d of the insertion portion 2b into the urethra.
The CPU 21 performs a reference determination, setting the position and direction of the distal end portion 2d recorded in S2 as the reference position and reference direction of a three-dimensional bladder model (hereinafter referred to as the 3D bladder model) M1 (S3). Through the process of S3, the CPU 21 can perform a conversion from the first coordinate system (X0Y0Z0), based on the extracorporeal magnetic field generating device 7, to the coordinate system (X1Y1Z1) based on the entrance (neck) of the bladder B, and further a conversion from the coordinate system (X1Y1Z1) to the coordinate system (X2Y2Z2) based on the center of the bladder model M1. The coordinate system conversions will be described later.
As shown in FIG. 5, the image pickup device 11 provided at the distal end portion 2d of the insertion portion 2b captures an endoscopic image with a view angle θ inside the bladder B.
Therefore, the process of S4 constitutes a determination unit that determines whether a release signal, serving as a predetermined trigger signal related to imaging by the image pickup device 11, has been generated.
The pasting of the endoscopic image in S7 is performed such that, when an endoscopic image has already been pasted at the position where the image is to be pasted, the later-acquired image is superimposed on the previously acquired and pasted endoscopic image.
Therefore, the process of S9 is performed only for the pixel regions in which pixels of another already-pasted endoscopic image exist at the pixel positions of the endoscopic image pasted in S7.
When the distal end portion 2d has not been withdrawn from the bladder B (S11: NO), the process returns to S4, and the CPU 21 repeats the processing from S4 to S11 until the distal end portion 2d is withdrawn from the bladder B.
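The overwrite rule of S7, combined with the front-most treatment of release images described later in this embodiment, amounts to a simple draw-order policy. A minimal sketch (the tuple representation of an image is illustrative, not from the patent):

```python
def paste_order(images):
    """Draw order for pasting: images are drawn in acquisition
    order so later images cover earlier ones (S7), and images
    flagged at a release operation are drawn last so they stay
    front-most. Each image is an (acquisition_index, released)
    pair; the representation is illustrative."""
    return sorted(images, key=lambda im: (im[1], im[0]))

imgs = [(0, False), (1, True), (2, False)]
order = paste_order(imgs)  # normal images in order, released image last
```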
FIG. 11 is a diagram for explaining the relationship between the coordinate system of the magnetic field generating device 7 and the coordinate system of the bladder B of the patient P on the bed 8. The position/direction detection unit 25 generates, in real time, position and direction information based on the first coordinate system (X0Y0Z0) of the magnetic field generating device 7.
V1 = R01 V0 ... Equation (2)
Here, P0 and V0 are the position and the direction vector, respectively, in the first coordinate system (X0Y0Z0), the coordinate system based on the magnetic field generating device 7. R01 is the rotation matrix given by the following Equation (3), and M01 is the translation matrix given by the following Equation (4).
The rotation matrix R01 is determined so as to satisfy the following conditions. FIG. 12 is a diagram for explaining the direction vector projected onto the intermediate coordinate system (X1Y1Z1). The conditions to be satisfied by the rotation matrix R01 are that Z1 be parallel to the direction of gravity, that V′0 be projected onto the X1Y1 plane perpendicular to Z1 with the direction of the projected vector taken as Y1, and that the vector perpendicular to the Y1Z1 plane be taken as X1.
V2 = R12 V1 ... Equation (8)
Here, P1 and V1 are the position and the direction vector in the intermediate coordinate system (X1Y1Z1), and P2 and V2 are the position and the direction vector in the second coordinate system (X2Y2Z2). V2 is the direction vector of the center pixel of the endoscopic image in the second coordinate system (X2Y2Z2). R12 is the rotation matrix given by the following Equation (9), and M02 is the translation matrix given by the following Equation (10).
Next, for the endoscopic image pasting process in S7, the calculation of the coordinates at which the endoscopic image is pasted onto the inner surface of the 3D bladder model M1 in the second coordinate system (X2Y2Z2) will be described.
|P21| = R2 ... Equation (16)
The endoscopic image is projected onto and pasted at the position of the obtained coordinates P21.
v = y21 + R2 ... Equation (18)
In the case of the dorsal hemisphere of the bladder B (Z2 < 0), the two-dimensional bladder model is flipped left to right, so the value in the u direction is given by the following Equation (19), and the value in the v direction by the following Equation (20).
v = −y21 − R2 ... Equation (20)
FIG. 16 is a diagram for explaining the coordinate relationship in the two-dimensional coordinate system (U, V).
As described above, the direction vector V2 is the direction vector of the pixel at the image center of the endoscopic image in the second coordinate system (X2Y2Z2). For the pixels other than the image-center pixel of the endoscopic image, the direction vector of each pixel is obtained, and by repeating the conversion operations of Equations (15) to (20) described above, the entire endoscopic image can be pasted onto the inner surface of the sphere in the second coordinate system (X2Y2Z2).
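The pasting geometry above can be sketched in code. Only Equations (2), (8), (16), (18), and (20) are visible in this excerpt, so the ray-sphere solve and the u mapping below are assumptions for illustration; R2 denotes the radius of the spherical bladder model:

```python
import math

def sphere_intersection(p, d, r2):
    """Intersect a ray p + t*d (t > 0, d a unit vector) with the
    sphere |x| = r2 centered at the bladder-model origin; returns
    the point P21 on the inner surface (Equation (16): |P21| = R2).
    The quadratic solve itself is standard geometry, not quoted
    from the patent."""
    b = 2 * sum(pi * di for pi, di in zip(p, d))
    c = sum(pi * pi for pi in p) - r2 * r2
    t = (-b + math.sqrt(b * b - 4 * c)) / 2
    return tuple(pi + t * di for pi, di in zip(p, d))

def to_2d(p21, r2):
    """Map P21 = (x21, y21, z21) to the 2D development view (u, v).
    Only Equations (18)/(20) for v appear in this excerpt; the u
    mapping here (u = x21, mirrored for the dorsal hemisphere) is
    an assumption for illustration."""
    x21, y21, z21 = p21
    if z21 >= 0:
        return (x21, y21 + r2)    # Equation (18)
    return (-x21, -y21 - r2)      # Equation (20); left-right flip

R2 = 1.0
p21 = sphere_intersection((0.0, 0.0, 0.0), (0.0, 1.0, 0.0), R2)
u, v = to_2d(p21, R2)  # center ray lands at v = y21 + R2
```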
As described above, according to the present embodiment, the endoscopic images of the examined portions of the inside of the bladder B are superimposed on the 2D model image 31a, and the endoscopic image captured when the release button 13 was pressed is superimposed and displayed so as to come to the front on the 2D model image 31a; the examiner can therefore easily confirm the regions checked inside the bladder B and can clearly view the images of lesioned parts or regions of concern.
When pasting endoscopic images onto the 2D model image 31a, only the endoscopic image captured when the release button 13 was pressed may be pasted.
Even with a 5-axis sensor, the same effects as in the embodiment described above can be obtained.
Alternatively, two organ model images may be displayed, with normal-light endoscopic images pasted onto one and special-light endoscopic images pasted onto the other.
In FIG. 21, the same components as in FIG. 10 and FIGS. 18 to 20 are given the same reference numerals, and their description is omitted. Note that FIG. 21 shows an example using a 6-axis sensor.
The 2D model image display section 34 displays a 2D model image 34a and the special-light endoscopic images 34b pasted onto the 2D model image 34a by the processes of S7 and S9. In FIG. 21 as well, a predetermined frame image 41 is added to the normal-light and special-light endoscopic images captured when the release button 13 was pressed.
In the embodiment described above, by adding the frame image 41 to the endoscopic image captured when the release button was pressed so as to surround it with a frame, only that specific endoscopic image is displayed so as to be distinguishable from the other endoscopic images; however, a mark such as an arrow pointing at the endoscopic image captured when the release button was pressed may be displayed instead.
As shown in FIG. 22, an arrow 42, serving as a mark pointing at the endoscopic image captured when the release button was pressed, is displayed together with the endoscopic images. The examiner can easily identify the endoscopic image indicated by the arrow 42. Here, the predetermined process of S9 is a process of adding a predetermined mark (for example, an arrow) to the in-subject image captured when the release signal, the predetermined trigger signal, was generated.
The examiner can therefore easily find the endoscopic image captured when the release button was pressed.
In Modification 1 described above, a mark such as an arrow pointing at the endoscopic image captured when the release button was pressed is displayed; instead, the color tone of the endoscopic image captured when the release button was pressed may be made different from that of the other endoscopic images.
As shown in FIG. 23, the plurality of endoscopic images 31b are displayed with the color tone of the endoscopic image 43 captured when the release button was pressed (shown hatched) made different from the color tone of the other endoscopic images 31b for which the release button was not pressed. That is, here the predetermined process of S9 is a process of making the color tone of the in-subject image captured when the release signal, the predetermined trigger signal, was generated different from the color tone of the in-subject images for which no release signal was generated in the determination unit of S4.
In Modification 2 described above, the color tone of the endoscopic image captured when the release button was pressed is made different from that of the other endoscopic images so that it can be identified; instead, only the endoscopic image captured when the release button was pressed may be displayed, while the endoscopic images for which the release button was not pressed are displayed in a color different from the color of the background bladder development view (schema) so that it can be seen that those regions have already been observed.
As shown in FIG. 24, among the plurality of captured endoscopic images 31b, the endoscopic image 44 captured when the release button was pressed (shown hatched) is displayed in the foreground, while the regions 44a of the other endoscopic images for which the release button was not pressed are filled in with a predetermined color different from the color of the background bladder development view (schema) (shown by dotted hatching). That is, here the predetermined process of S9 is a process of making the color tone of the in-subject images for which the release signal, the predetermined trigger signal, was not generated different from the color tone of the model image of the predetermined organ.
The examiner can therefore recognize the observed regions at a glance from the areas filled in with the predetermined color, and can also easily identify the endoscopic image captured when the release button was pressed.
As described above, each of the modifications described above also realizes an endoscope system with which the examiner can easily know the position of each endoscopic image in the examination target organ and can easily identify, among the plurality of endoscopic images, only the specific endoscopic image captured when the release button was pressed.
The first embodiment is an endoscope system that makes it possible to easily identify only the endoscopic image captured when the release button was pressed, whereas the present second embodiment is an endoscope system that makes it possible to easily identify only the endoscopic image captured when the result of predetermined image processing, or the result of determining the presence or absence of a predetermined event, is a predetermined result, rather than relying on operation of the release button.
The determination information acquired in S31 differs depending on the determination method.
In case a), the determination information acquired in S31 is information on the size (that is, the height) of the unevenness in the image.
From the acquired determination information, the CPU 21 determines whether a predetermined determination condition is satisfied (S32).
Thereafter, the CPU 21 executes the display process (S10) and repeats the processing from S5 to S10 until the insertion portion 2b is withdrawn from the bladder B.
The above example is case a); in cases b) through e), the determination information and the determination conditions differ.
Note that the amount of change may also be one indicating a stationary state in which neither the position nor the direction changes for a predetermined time.
Here, the endoscope system 1 has two modes, the normal-light observation mode and the special-light observation mode, but it may also have a PDD mode for photodynamic diagnosis (PDD) or the like.
Note that the three modifications described in the first embodiment are also applicable to the present second embodiment.
In FIG. 28, for example, a frame image 41 is added to the endoscopic image 31b captured when the release button was pressed, and an enlarged display section 51 that displays an enlarged image of the endoscopic image 31b to which the frame image 41 has been added is displayed on the screen G2. By viewing the endoscopic image displayed in the enlarged display section 51, the examiner can easily check the endoscopic image captured when the release button was pressed, for example, to confirm whether the lesioned part has been properly captured.
The examiner can therefore easily grasp which of the endoscopic images with the added frame images shown in the 2D model image display section 31 each endoscopic image enlarged in the enlarged display section 52a corresponds to. In FIG. 30, the correspondence between the endoscopic images on the bladder development view (schema) and the enlarged images is indicated by numbers, but characters, symbols, or the like may be used instead of numerals; further, the correspondence may be shown by lines connecting each endoscopic image to its enlarged image, as indicated by the dotted line 53 in FIG. 30.
As described above, such an enlarged display section may be provided in each of the embodiments and modifications described above.
Furthermore, in each of the embodiments described above, endoscopic images of the inside of the bladder are pasted onto a 2D model image of the bladder, but the endoscope system of each embodiment is also applicable to organs other than the bladder, for example the stomach and the uterus.
Furthermore, although the endoscope system described above is used to record or display the position of endoscopic images within an organ, it can also be used to record biopsy positions in a random biopsy.
Various changes and modifications are possible within this scope.
This application is filed claiming priority based on Japanese Patent Application No. 2013-84301, filed in Japan on April 12, 2013, and the above disclosure is incorporated into the present specification and claims.
Claims (15)
- An insertion portion to be inserted into a subject;
an objective optical window provided on the distal end side of the insertion portion and receiving light from the subject;
an imaging unit that images the inside of the subject from the light incident through the objective optical window;
a position information acquisition unit that acquires position information of the objective optical window;
a recording unit that records the in-subject image acquired by the imaging unit in association with the position information acquired by the position information acquisition unit;
an associating unit that associates the position information recorded in the recording unit with a model image of a predetermined organ in the subject;
an image generation unit that generates, as an in-subject-image-pasted image, an image in which the in-subject image is pasted onto the model image of the predetermined organ associated by the associating unit;
a determination unit that determines whether a predetermined trigger signal related to imaging by the imaging unit has been generated, or whether a predetermined feature amount of the in-subject image acquired by the imaging unit satisfies a predetermined condition; and
a display unit that displays an in-subject image for which the determination unit has determined that the predetermined trigger signal was generated or that the predetermined feature amount satisfies the predetermined condition, so as to be distinguishable from the other in-subject images on the in-subject-image-pasted image,
an endoscope system comprising the above elements. - The endoscope system according to claim 1, wherein the predetermined trigger signal is a release signal for recording the in-subject image acquired by the imaging unit, an operation signal indicating a predetermined operation on the insertion portion of the endoscope in which the imaging unit is provided, or an observation-mode switching signal used when imaging the inside of the subject.
- The endoscope system according to claim 1, wherein the predetermined feature amount is the height of a convex portion on the subject surface in the in-subject image, the color tone of the in-subject image, or the texture of the in-subject image.
- The endoscope system according to claim 1, wherein the display unit does not display the in-subject image in an identifiable manner when the predetermined trigger signal is not generated in the determination unit or when the predetermined feature amount does not satisfy the predetermined condition.
- The endoscope system according to claim 1, wherein the display unit adds a frame image to the in-subject image captured when the predetermined trigger signal is generated or when the predetermined feature amount satisfies the predetermined condition.
- The endoscope system according to claim 1, wherein the display unit adds a predetermined mark to the in-subject image captured when the predetermined trigger signal is generated or when the predetermined feature amount satisfies the predetermined condition.
- The endoscope system according to claim 1, wherein the display unit makes the color tone of the in-subject image captured when the predetermined trigger signal is generated or when the predetermined feature amount satisfies the predetermined condition different from the color tone of the in-subject image captured when the predetermined trigger signal is not generated in the determination unit or when the predetermined feature amount does not satisfy the predetermined condition.
- The endoscope system according to claim 1, wherein the display unit fills, with a predetermined color, the in-subject image captured when the predetermined trigger signal is not generated in the determination unit or when the predetermined feature amount does not satisfy the predetermined condition.
- The endoscope system according to claim 1, further comprising a spectral estimation processing unit that performs spectral estimation processing on the in-subject image, wherein the predetermined processing is applied on a plurality of models set according to the operation mode of the spectral estimation processing unit, and the in-subject image is pasted thereon.
- The endoscope system according to claim 1, wherein the display unit generates an enlarged image of the in-subject image displayed in the identifiable manner.
- The endoscope system according to claim 1, wherein the display unit pastes the in-subject image captured when the predetermined trigger signal is generated or when the predetermined feature amount satisfies the predetermined condition onto the foremost plane of the model image of the predetermined organ, giving it priority over the in-subject images captured when the predetermined trigger signal is not generated in the determination unit or when the predetermined feature amount does not satisfy the predetermined condition.
- The in-subject image is an image of the inside of the bladder of the subject, and
the model image is a model image of the bladder, in the endoscope system according to claim 1. - The endoscope system according to claim 12, wherein the model image is a 2D bladder development view.
- A method of operating an endoscope system having: an imaging unit that images the inside of a subject from light incident through an objective optical window provided on the distal end side of an insertion portion inserted into the subject; a position information acquisition unit that acquires position information of the objective optical window; a recording unit that records the in-subject image acquired by the imaging unit in association with the position information acquired by the position information acquisition unit; an associating unit; an image generation unit; a determination unit; and a display unit, the method comprising:
the associating unit associating the position information recorded in the recording unit with a model image of a predetermined organ in the subject;
the image generation unit generating, as an in-subject-image-pasted image, an image in which the in-subject image is pasted onto the model image of the predetermined organ associated by the associating unit;
the determination unit determining whether a predetermined trigger signal related to imaging by the imaging unit has been generated, or whether a predetermined feature amount of the in-subject image acquired by the imaging unit satisfies a predetermined condition; and
the display unit displaying an in-subject image for which the determination unit has determined that the predetermined trigger signal was generated or that the predetermined feature amount satisfies the predetermined condition, so as to be distinguishable from the other in-subject images on the in-subject-image-pasted image.
- The method of operating an endoscope system according to claim 14, wherein the predetermined trigger signal is a release signal for recording the in-subject image acquired by the imaging unit, an operation signal indicating a predetermined operation on the insertion portion of the endoscope in which the imaging unit is provided, or an observation-mode switching signal used when imaging the inside of the subject.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014550583A JP5771757B2 (ja) | 2013-04-12 | 2014-04-08 | 内視鏡システム及び内視鏡システムの作動方法 |
EP14782663.0A EP2957216A4 (en) | 2013-04-12 | 2014-04-08 | ENDOSCOPE SYSTEM AND METHOD OF OPERATION FOR ENDOSCOPE SYSTEM |
CN201480016285.6A CN105050479B (zh) | 2013-04-12 | 2014-04-08 | 内窥镜系统 |
US14/858,074 US9538907B2 (en) | 2013-04-12 | 2015-09-18 | Endoscope system and actuation method for displaying an organ model image pasted with an endoscopic image |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013084301 | 2013-04-12 | ||
JP2013-084301 | 2013-04-12 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/858,074 Continuation US9538907B2 (en) | 2013-04-12 | 2015-09-18 | Endoscope system and actuation method for displaying an organ model image pasted with an endoscopic image |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014168128A1 true WO2014168128A1 (ja) | 2014-10-16 |
Family
ID=51689535
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/060141 WO2014168128A1 (ja) | 2013-04-12 | 2014-04-08 | 内視鏡システム及び内視鏡システムの作動方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9538907B2 (ja) |
EP (1) | EP2957216A4 (ja) |
JP (1) | JP5771757B2 (ja) |
CN (1) | CN105050479B (ja) |
WO (1) | WO2014168128A1 (ja) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016080331A1 (ja) * | 2014-11-17 | 2016-05-26 | オリンパス株式会社 | 医療装置 |
WO2016098665A1 (ja) * | 2014-12-15 | 2016-06-23 | オリンパス株式会社 | 医療機器システム、医療機器システムの作動方法 |
WO2017056762A1 (ja) * | 2015-10-02 | 2017-04-06 | ソニー株式会社 | 医療用制御装置、制御方法、プログラム、および医療用制御システム |
JP2018050890A (ja) * | 2016-09-28 | 2018-04-05 | 富士フイルム株式会社 | 画像表示装置及び画像表示方法並びにプログラム |
JP2018139846A (ja) * | 2017-02-28 | 2018-09-13 | 富士フイルム株式会社 | 内視鏡システム及びその作動方法 |
JP2018139847A (ja) * | 2017-02-28 | 2018-09-13 | 富士フイルム株式会社 | 内視鏡システム及びその作動方法 |
WO2021149137A1 (ja) * | 2020-01-21 | 2021-07-29 | オリンパス株式会社 | 画像処理装置、画像処理方法およびプログラム |
WO2023286196A1 (ja) * | 2021-07-14 | 2023-01-19 | オリンパスメディカルシステムズ株式会社 | 画像処理装置、内視鏡装置及び画像処理方法 |
JP7558982B2 (ja) | 2020-01-20 | 2024-10-01 | 富士フイルム株式会社 | 医療画像処理装置、内視鏡システム、及び医療画像処理装置の作動方法 |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5810248B2 (ja) * | 2013-10-02 | 2015-11-11 | オリンパス株式会社 | 内視鏡システム |
TWI533834B (zh) * | 2013-12-10 | 2016-05-21 | Ying-Jie Su | Magnetic manipulation of the surgical lighting device and a hand with a lighting function Assisted system |
JP6432770B2 (ja) * | 2014-11-12 | 2018-12-05 | ソニー株式会社 | 画像処理装置、画像処理方法、並びにプログラム |
JPWO2017130567A1 (ja) * | 2016-01-25 | 2018-11-22 | ソニー株式会社 | 医療用安全制御装置、医療用安全制御方法、及び医療用支援システム |
US20180263568A1 (en) * | 2017-03-09 | 2018-09-20 | The Board Of Trustees Of The Leland Stanford Junior University | Systems and Methods for Clinical Image Classification |
CN110769731B (zh) * | 2017-06-15 | 2022-02-25 | 奥林巴斯株式会社 | 内窥镜系统、内窥镜用处理系统、图像处理方法 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11104072A (ja) * | 1997-10-03 | 1999-04-20 | Mitsubishi Electric Corp | 医療支援システム |
JP2007236700A (ja) | 2006-03-09 | 2007-09-20 | Fujifilm Corp | カプセル内視鏡システム |
JP2010508047A (ja) * | 2006-05-22 | 2010-03-18 | ヴォルケイノウ・コーポレーション | 表示前方視画像データのためのレンダリングのための装置および方法 |
JP2010099171A (ja) * | 2008-10-22 | 2010-05-06 | Fujifilm Corp | 画像取得方法および内視鏡装置 |
JP2010240000A (ja) | 2009-04-01 | 2010-10-28 | Hoya Corp | 画像処理装置、画像処理方法、およびシステム |
JP2011177419A (ja) * | 2010-03-03 | 2011-09-15 | Olympus Corp | 蛍光観察装置 |
WO2012157338A1 (ja) * | 2011-05-17 | 2012-11-22 | オリンパスメディカルシステムズ株式会社 | 医療機器、医療画像におけるマーカ表示制御方法及び医療用プロセッサ |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6702736B2 (en) * | 1995-07-24 | 2004-03-09 | David T. Chen | Anatomical visualization system |
JP4550048B2 (ja) * | 2003-05-01 | 2010-09-22 | ギブン イメージング リミテッド | パノラマ視野の撮像装置 |
DE102004008164B3 (de) * | 2004-02-11 | 2005-10-13 | Karl Storz Gmbh & Co. Kg | Verfahren und Vorrichtung zum Erstellen zumindest eines Ausschnitts eines virtuellen 3D-Modells eines Körperinnenraums |
WO2005077253A1 (ja) * | 2004-02-18 | 2005-08-25 | Osaka University | 内視鏡システム |
GB0613576D0 (en) * | 2006-07-10 | 2006-08-16 | Leuven K U Res & Dev | Endoscopic vision system |
EP2335551B1 (en) | 2008-10-22 | 2014-05-14 | Fujifilm Corporation | Endoscope apparatus and control method therefor |
DE102009039251A1 (de) * | 2009-08-28 | 2011-03-17 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Verfahren und Vorrichtung zum Zusammenfügen von mehreren digitalen Einzelbildern zu einem Gesamtbild |
JP5580637B2 (ja) * | 2010-03-30 | 2014-08-27 | オリンパス株式会社 | 画像処理装置、内視鏡装置の作動方法及びプログラム |
US9265468B2 (en) * | 2011-05-11 | 2016-02-23 | Broncus Medical, Inc. | Fluoroscopy-based surgical device tracking method |
EP2868256A4 (en) * | 2013-03-06 | 2016-08-10 | Olympus Corp | ENDOSCOPY SYSTEM AND OPERATING PROCESS FOR THE ENDOSCOPE SYSTEM |
CN104918534B (zh) * | 2013-03-19 | 2017-03-15 | 奥林巴斯株式会社 | 内窥镜系统 |
US9107578B2 (en) * | 2013-03-31 | 2015-08-18 | Gyrus Acmi, Inc. | Panoramic organ imaging |
EP3005232A4 (en) * | 2013-05-29 | 2017-03-15 | Kang-Huai Wang | Reconstruction of images from an in vivo multi-camera capsule |
US9901246B2 (en) * | 2014-02-05 | 2018-02-27 | Verathon Inc. | Cystoscopy system including a catheter endoscope and method of use |
2014
- 2014-04-08 CN CN201480016285.6A patent/CN105050479B/zh active Active
- 2014-04-08 JP JP2014550583A patent/JP5771757B2/ja active Active
- 2014-04-08 EP EP14782663.0A patent/EP2957216A4/en not_active Withdrawn
- 2014-04-08 WO PCT/JP2014/060141 patent/WO2014168128A1/ja active Application Filing

2015
- 2015-09-18 US US14/858,074 patent/US9538907B2/en active Active
Non-Patent Citations (3)
Title |
---|
KOHEI ARAI: "3D image reconstruction acquired with arrayed optical fiber", 1 December 2007 (2007-12-01), XP055276957, Retrieved from the Internet <URL:http://portal.dl.saga-u.ac.jp/bitstream/123456789/55466/1/ZR00006124.pdf> [retrieved on 20140616] * |
NAINGGOLAN TUA NAMORA: "Sotaikan Yuketsu Shokogun ni Okeru Taiban Hyomen no Mapping System no Kaihatsu", 22 March 2007 (2007-03-22), TOKYO, XP008180467, Retrieved from the Internet <URL:http://repository.dl.itc.u-tokyo.ac.jp/dspace/bitstream/2261/20154/1/K-00928-a.pdf> [retrieved on 20140616] * |
See also references of EP2957216A4 |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10231601B2 (en) | 2014-11-17 | 2019-03-19 | Olympus Corporation | Medical apparatus for associating index indicating priority corresponding to lesion occurrence frequency with position of inner wall opposing in visual line direction of endoscope the position being one at which image is acquired |
JP6022133B2 (ja) * | 2014-11-17 | 2016-11-09 | オリンパス株式会社 | 医療装置 |
WO2016080331A1 (ja) * | 2014-11-17 | 2016-05-26 | オリンパス株式会社 | 医療装置 |
WO2016098665A1 (ja) * | 2014-12-15 | 2016-06-23 | オリンパス株式会社 | 医療機器システム、医療機器システムの作動方法 |
JP6017743B1 (ja) * | 2014-12-15 | 2016-11-02 | オリンパス株式会社 | 医療機器システム、医療機器システムの作動方法 |
US10694929B2 (en) | 2014-12-15 | 2020-06-30 | Olympus Corporation | Medical equipment system and operation method of medical equipment system |
WO2017056762A1 (ja) * | 2015-10-02 | 2017-04-06 | ソニー株式会社 | 医療用制御装置、制御方法、プログラム、および医療用制御システム |
US11197736B2 (en) | 2015-10-02 | 2021-12-14 | Sony Corporation | Medical control system and method that uses packetized data to convey medical video information |
US10433709B2 (en) | 2016-09-28 | 2019-10-08 | Fujifilm Corporation | Image display device, image display method, and program |
JP2018050890A (ja) * | 2016-09-28 | 2018-04-05 | 富士フイルム株式会社 | 画像表示装置及び画像表示方法並びにプログラム |
JP2018139847A (ja) * | 2017-02-28 | 2018-09-13 | 富士フイルム株式会社 | 内視鏡システム及びその作動方法 |
JP2018139846A (ja) * | 2017-02-28 | 2018-09-13 | 富士フイルム株式会社 | 内視鏡システム及びその作動方法 |
JP7558982B2 (ja) | 2020-01-20 | 2024-10-01 | 富士フイルム株式会社 | 医療画像処理装置、内視鏡システム、及び医療画像処理装置の作動方法 |
WO2021149137A1 (ja) * | 2020-01-21 | 2021-07-29 | オリンパス株式会社 | 画像処理装置、画像処理方法およびプログラム |
WO2023286196A1 (ja) * | 2021-07-14 | 2023-01-19 | オリンパスメディカルシステムズ株式会社 | 画像処理装置、内視鏡装置及び画像処理方法 |
JP7506264B2 (ja) | 2021-07-14 | 2024-06-25 | オリンパスメディカルシステムズ株式会社 | 画像処理装置、内視鏡装置及び画像処理装置の作動方法 |
Also Published As
Publication number | Publication date |
---|---|
EP2957216A4 (en) | 2016-11-09 |
EP2957216A1 (en) | 2015-12-23 |
CN105050479B (zh) | 2017-06-23 |
US9538907B2 (en) | 2017-01-10 |
JP5771757B2 (ja) | 2015-09-02 |
JPWO2014168128A1 (ja) | 2017-02-16 |
US20160000307A1 (en) | 2016-01-07 |
CN105050479A (zh) | 2015-11-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5771757B2 (ja) | 内視鏡システム及び内視鏡システムの作動方法 | |
JP5676058B1 (ja) | 内視鏡システム及び内視鏡システムの作動方法 | |
US9662042B2 (en) | Endoscope system for presenting three-dimensional model image with insertion form image and image pickup image | |
JP5750669B2 (ja) | 内視鏡システム | |
JP5580637B2 (ja) | 画像処理装置、内視鏡装置の作動方法及びプログラム | |
JP5993515B2 (ja) | 内視鏡システム | |
JP6141559B1 (ja) | 医療装置、医療画像生成方法及び医療画像生成プログラム | |
WO2017159335A1 (ja) | 医療用画像処理装置、医療用画像処理方法、プログラム | |
US20200098104A1 (en) | Medical image processing apparatus, medical image processing method, and program | |
WO2019130868A1 (ja) | 画像処理装置、プロセッサ装置、内視鏡システム、画像処理方法、及びプログラム | |
CN117481579A (zh) | 内窥镜系统及其工作方法 | |
JP7385731B2 (ja) | 内視鏡システム、画像処理装置の作動方法及び内視鏡 | |
WO2016076262A1 (ja) | 医療装置 | |
US20240013389A1 (en) | Medical information processing apparatus, endoscope system, medical information processing method, and medical information processing program | |
US20240000299A1 (en) | Image processing apparatus, image processing method, and program | |
JP2017205343A (ja) | 内視鏡装置、内視鏡装置の作動方法 | |
JP7441934B2 (ja) | 処理装置、内視鏡システム及び処理装置の作動方法 | |
WO2012165370A1 (ja) | 画像処理装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201480016285.6 Country of ref document: CN |
|
ENP | Entry into the national phase |
Ref document number: 2014550583 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14782663 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014782663 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |