WO2016039292A1 - Endoscope system - Google Patents

Endoscope system

Info

Publication number
WO2016039292A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
scale
subject
information
line
Prior art date
Application number
PCT/JP2015/075327
Other languages
English (en)
Japanese (ja)
Inventor
美穂 南里
Original Assignee
オリンパス株式会社 (Olympus Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by オリンパス株式会社 (Olympus Corporation)
Priority to JP2016530258A (JPWO2016039292A1)
Publication of WO2016039292A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor, combined with photographic or television appliances
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 23/00 Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B 23/24 Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present invention relates to an endoscope system that displays an image of a subject.
  • Japanese Patent Application Laid-Open No. 2005-118107, as a first conventional example, displays a wide-angle image and an enlarged image in which a part of the wide-angle image is enlarged, and displays a coordinate scale on the wide-angle image so that the magnification of the wide-angle image and the enlarged image can be easily grasped.
  • As the coordinate scale, numerical values based on the enlargement rate in each direction are given for two orthogonal directions. It is also disclosed that a lattice (grid) scale or a concentric circle scale may be used as the coordinate scale.
  • Japanese Patent Laid-Open No. 2007-90060, as a second conventional example, makes it possible to select a circumferential scale or a grid for an image of an object.
  • That document discloses displaying the scale image on the assumption that each position in the displayed image lies at substantially the same distance from the viewpoint position. However, when an observation target site such as a lesion is observed from an oblique direction, the distance from the viewpoint position to the part corresponding to each position on the image changes across the image.
  • In such a case, if the above-mentioned conventional example is applied and the size of the lesioned part is measured using the scale image, the scale on the image does not reflect the change in the distance, so the size cannot be measured accurately.
  • the present invention has been made in view of the above-described points, and an object thereof is to provide an endoscope system capable of accurately measuring the size of a site such as a lesioned part.
  • An endoscope system according to one aspect of the present invention includes: a subject image generation unit configured to generate a subject image obtained by viewing a subject from a predetermined viewpoint; a first scale image generation unit configured to generate a first scale image that is configured using the same scale for at least two directions different from each other on the subject image and reflects, on the subject image, lengths in the different directions on the subject; a second scale image generation unit configured to generate a second scale image that is configured using different scales for at least two different directions on the subject image and reflects, on the subject image, lengths in the different directions on the subject; and an image processing unit configured to generate a superimposed image in which the first scale image or the second scale image is selectively superimposed on the subject image.
  • FIG. 1 is a diagram showing an overall configuration of an endoscope system according to a first embodiment of the present invention.
  • FIG. 2 is a block diagram showing the configuration of the image processing apparatus and the like in FIG.
  • FIG. 3A is a diagram showing a superimposed image on which concentric first scale images having the same scale in different directions on the endoscopic image are superimposed.
  • FIG. 3B is a diagram showing a superimposed image in which a second scale image having a grid shape with different scales in two different directions on the endoscopic image is superimposed.
  • FIG. 4 is a diagram showing a second coordinate system or the like set to a 3D model in which the bladder is approximated by a sphere.
  • FIG. 5A is a diagram showing a monitor screen in which an endoscopic image in an endoscopic image display area and a superimposed image in a superimposed image display area are displayed adjacent to each other.
  • FIG. 5B is a diagram showing a monitor screen displaying a superimposed image in which a scale image is superimposed on an endoscopic image in an endoscopic image display area.
  • FIG. 6A is an explanatory diagram of a perspective relationship calculation circuit calculating a perspective relationship as a relationship in which the distance from the viewpoint to each point on the subject corresponding to each point on the endoscopic image changes.
  • FIG. 6B is a diagram illustrating a state in which the inner surface of the bladder is observed in an observation state in which the line-of-sight direction is nearly perpendicular to the inner surface of the bladder.
  • FIG. 6C is an explanatory diagram of a state in which the inner surface of the bladder is observed in a state where the line-of-sight direction is set in a direction perpendicular to the inner surface of the bladder.
  • FIG. 7 is a flowchart illustrating an operation including a processing procedure by the image processing apparatus according to the first embodiment.
  • FIG. 8A is a flowchart showing a first scale image generation process.
  • FIG. 8B is an explanatory diagram of the operation of FIG. 8A.
  • FIG. 9A is a flowchart showing a second scale image generation process.
  • FIG. 9B is an explanatory diagram illustrating the operation of FIG. 9A.
  • FIG. 10 is a diagram illustrating a superimposed image on which a grid-shaped first scale image is superimposed.
  • As shown in FIG. 1, an endoscope system 1 according to a first embodiment of the present invention includes an endoscope 2 that observes (or examines) the inside of a patient P as a subject, a light source device 3, a processor 4, an image processing device 5, a monitor 6, and a magnetic field generator 7.
  • the endoscope system 1 has a function of observing in two observation modes of normal light observation and special light observation.
  • An operator who is a user of the endoscope system 1 performs an endoscopic examination of the inside of the bladder B, as a predetermined luminal organ, of the patient P lying on his or her back on the bed 8 as a subject.
  • the endoscope 2 includes an operation unit 2a, a flexible insertion unit 2b, and a universal cable 2c.
  • the endoscope 2 is configured by, for example, an endoscope for bladder examination.
  • A light guide 9, shown in FIG. 2, is inserted through the universal cable 2c; the endoscope 2 transmits illumination light from the light source device 3 through the light guide 9, emits it from the distal end portion 2d of the insertion portion 2b, and thereby illuminates the subject (bladder B) into which the distal end portion 2d of the insertion portion 2b is inserted.
  • The distal end portion 2d of the insertion portion 2b is provided with an objective optical system 10 and, at its imaging position, an image sensor 11 having an imaging surface; the image sensor 11 photoelectrically converts the optical image formed on the imaging surface and outputs an imaging signal.
  • The part in the bladder B illuminated by the illumination light of the light source device 3 is imaged by the image sensor 11. In other words, the objective optical system 10 and the image sensor 11 form an imaging unit (or imaging device) 12 that is inserted into the subject and images the inside of the subject.
  • An imaging signal obtained by the imaging element 11 is input, via a signal line in the universal cable 2c, to the processor 4, which serves as a subject image generation device (subject image generation unit) that generates a subject image.
  • Image generation processing is performed in the image generation circuit 4a in the processor 4 to generate an endoscope image as a subject image.
  • In other words, the image generation circuit 4a in the processor 4 generates, from the electrical imaging signal obtained by the imaging device 11 of the imaging unit 12 mounted on the distal end portion 2d, a subject image corresponding to the optical image formed on the imaging surface, and the generated subject image is displayed on the monitor 6 as an endoscopic image.
  • the position of the imaging unit 12 for imaging the subject and the imaging direction for imaging are the viewpoint position and the line-of-sight direction when the subject image is generated by looking at the subject.
  • the processor 4 has a changeover switch 4b for switching the observation mode.
  • An observation mode signal designated by the changeover switch 4b is input to the image generation circuit 4a.
  • The image generation circuit 4a generates an endoscopic image corresponding to the observation mode designated by the changeover switch 4b.
  • When the normal light observation mode is designated, a normal light observation image captured under illumination with normal light (white light) is generated; when the special light observation mode is designated, a special light observation image (in a narrow sense, a narrow-band light observation image) is generated.
  • the observation mode signal from the changeover switch 4b is input to the LED control circuit 3a of the light source device 3, and the LED control circuit 3a controls to generate illumination light according to the observation mode.
  • When the normal light observation mode is designated by the changeover switch 4b, the LED control circuit 3a performs control so that the white LED 3b serving as the light source for the normal light observation mode emits light; when the special light observation mode is designated, the LED control circuit 3a performs control so that the narrow-band blue LED 3c serving as the light source for the special light observation mode emits light.
  • When the narrow-band blue LED 3c emits light, the narrow-band blue light is selectively reflected by the dichroic mirror 3d disposed at an angle of 45 degrees on its optical path, condensed by the condenser lens 3e, and made incident on the proximal end of the light guide 9.
  • The narrow-band blue illumination light incident on the proximal end of the light guide 9 is transmitted by the light guide 9 and emitted from the illumination window to which the distal end of the light guide 9 is attached, thereby performing illumination for the narrow-band light observation mode.
  • On the other hand, white light emitted from the white LED 3b (most of the white light, excluding the narrow-band blue component) is selectively transmitted by the dichroic mirror 3d disposed on the optical path, condensed by the condenser lens 3e, and made incident on the proximal end of the light guide 9.
  • The white illumination light incident on the proximal end of the light guide 9 is transmitted by the light guide 9 and emitted from the illumination window to which the distal end of the light guide 9 is attached, thereby performing illumination for the normal light observation mode.
  • the endoscopic image generated by the processor 4 (the image generation circuit 4 a thereof) is output from the processor 4 to the monitor 6, and the live endoscopic image is displayed on the monitor 6.
  • the operator who performs the examination can insert the distal end portion 2d of the insertion portion 2b from the urethra of the patient P and observe the inside of the bladder B (shown by a dotted line in FIG. 1) of the patient P.
  • a magnetic sensor 13 as a position sensor is disposed at the distal end portion 2d of the insertion portion 2b. Specifically, as shown in FIG. 2, a magnetic sensor 13 is provided in the vicinity of the objective optical system 10 and the imaging element 11 that constitute the imaging unit 12 of the distal end portion 2 d.
  • the magnetic sensor 13 is used to detect a three-dimensional position (simply a position) that is the viewpoint of the imaging unit 12 at the distal end portion 2d and the line-of-sight direction at that position.
  • the optical axis direction of the objective optical system 10 constituting the imaging unit 12 mounted on the distal end portion 2d is parallel to the axial direction of the distal end portion 2d.
  • the position and the line-of-sight direction can be approximated to the position of the tip 2d and its axial direction (simply the direction).
  • an optical image of the subject is formed on the imaging surface of the imaging device 11 arranged at the tip 2d by the objective optical system 10 arranged at the tip 2d.
  • The present invention can also be applied to an endoscope in which an optical image formed by the objective optical system 10 disposed at the distal end portion 2d is transmitted to the rear (proximal end) side of the insertion portion 2b by an optical image transmission system and is picked up by an image pickup element arranged on the proximal end side of the insertion portion 2b.
  • In such a case, it is more reasonable to express the viewpoint position (and line-of-sight direction) with respect to the objective optical system 10 arranged at the distal end portion 2d than with respect to the imaging unit. Therefore, in the following description, the viewpoint position and line-of-sight direction mainly refer to those of the objective optical system 10; as an approximate expression, the position and direction of the objective optical system 10 (or of the distal end portion 2d) are sometimes used instead.
  • the magnetic sensor 13 includes, for example, two coils 2e.
  • the magnetic sensor 13 is a sensor that detects the position and direction of the tip 2d.
  • a signal line 2 f of the magnetic sensor 13 extends from the endoscope 2 and is connected to the image processing device 5 (internal position / direction acquisition circuit 25).
  • the magnetic field generator 7 generates a magnetic field at a predetermined position that is known, and the magnetic sensor 13 detects the magnetic field generated by the magnetic field generator 7.
  • the magnetic field detection signal is input from the endoscope 2 to the image processing device 5 (internal position / direction acquisition circuit 25) via the signal line 2f.
  • The position/direction acquisition circuit 25 includes an information acquisition circuit 25a forming an information acquisition unit that, from the amplitude and phase of the input detection signal, acquires viewpoint position information representing the position of the objective optical system 10 disposed at the distal end portion 2d and line-of-sight direction information representing its line-of-sight direction.
  • the subject to be examined is the inner surface of the bladder B, and the bladder B can be approximated by a sphere as will be described later.
  • a release button 14 is provided on the operation unit 2 a of the endoscope 2.
  • the release button 14 is a button to be pressed when a user such as an operator records an endoscopic image.
  • a release button operation signal is input to the processor 4, and the processor 4 generates a release signal and outputs it to the image processing device 5.
  • An endoscopic image when the release button 14 is pressed is recorded in a memory 22 described later of the image processing device 5.
  • The endoscope 2 includes an ID generation unit 15 (simply abbreviated as "ID" in FIG. 2) configured by a ROM (read only memory) that generates identification information (abbreviated as "ID") unique to each endoscope 2.
  • The ID generated by the ID generation unit 15 is input to the image processing device 5 (to the position/direction acquisition circuit 25 therein) via the processor 4. Alternatively, as shown in FIG. 2, the ID may be input to the image processing device 5 (to the position/direction acquisition circuit 25) without going through the processor 4.
  • The position/direction acquisition circuit 25 also has the function of an imaging information acquisition unit (or imaging information acquisition circuit) 25b that acquires, from the ID, imaging information of the imaging unit 12 such as the focal length of the objective optical system 10, the number of pixels of the image sensor 11 that captures the optical image formed by the objective optical system 10, the pixel size, and the angle of view (viewing angle). The acquired imaging information is used when calculating the perspective relationship described later.
  • the image processing apparatus 5 includes a central processing unit (hereinafter referred to as a CPU) 21, a memory 22, a display interface (hereinafter abbreviated as display I / F) 23, an image capturing circuit 24, and a position / direction acquisition circuit. 25, a luminal organ extraction circuit 26, and a drive circuit 27.
  • the CPU 21, the memory 22, the display I / F 23, the image capture circuit 24, the position / direction acquisition circuit 25, and the luminal organ extraction circuit 26 are connected to each other via a bus 28.
  • The endoscope system 1 also includes a CT (Computed Tomography) apparatus 29 for acquiring three-dimensional information on the patient P as the subject, and an input device 30 with which a user inputs various types of information to the image processing device 5.
  • The CPU 21 constitutes a control unit that controls the processing of each unit in the image processing device 5. Using the viewpoint position information, the line-of-sight direction information, and the imaging information acquired by the position/direction acquisition circuit 25, the CPU 21 provides: a perspective relation calculation circuit 21a that calculates a perspective relation corresponding to the magnitude relation of the distances from the position serving as the viewpoint of the objective optical system 10 arranged at the distal end portion 2d to each point on the subject corresponding to each point in the subject image; a scale image generation circuit 21b that generates two types of scale images to be superimposed on the subject image; and a superimposed image generation circuit 21c that forms an image processing unit generating the superimposed image.
  • The scale image generation circuit 21b includes: a first scale image generation circuit 21d that generates a first scale image, which is configured using the same scale for at least two different directions on the subject image and has graduations reflecting, on the subject image, lengths in the different directions on the subject; and a second scale image generation circuit 21e that, in accordance with the perspective relationship calculated by the perspective relationship calculation circuit 21a, generates a second scale image, which is configured using different scales for at least two different directions on the subject image and has graduations reflecting, on the subject image, lengths in the different directions on the subject.
  • FIG. 3A shows a superimposed image Is obtained by superimposing the first scale image I1 on the endoscopic image Ien as the subject image, and FIG. 3B shows a superimposed image Is obtained by superimposing the second scale image I2 on the endoscopic image Ien.
  • The superimposed images of FIGS. 3A and 3B are generated, for example, on the image display memory in the display I/F 23, and the superimposed image on the image display memory is displayed in a size corresponding to the display size of the monitor 6.
  • Even in that case, the relative scale relationship between the endoscopic image Ien and the first scale image I1 or the second scale image I2 on the image display memory does not change.
  • The CPU 21 also has the function of a coordinate conversion processing circuit 21f that converts the position information and line-of-sight direction information acquired by the position/direction acquisition circuit 25 from the first coordinate system (X0Y0Z0) (see FIG. 1), which is the coordinate system based on the magnetic field generator 7, into an intermediate coordinate system (X1Y1Z1) (see FIG. 1) based on the entrance of the bladder B.
  • The coordinate conversion processing circuit 21f determines the position and direction of the entrance of the bladder B as the reference position and reference direction, and converts the position/direction information from the position/direction acquisition circuit 25 into position/direction information in the intermediate coordinate system (X1Y1Z1) based on the entrance of the bladder B in accordance with the following expressions (1) and (2).
  • P1 = R01 · P0 + M01  … (1)
  • V1 = R01 · V0  … (2)
  • Here, P0 and V0 are, respectively, a position (in vector notation) in the first coordinate system (X0Y0Z0), which is the coordinate system based on the magnetic field generator 7, and a direction vector of magnitude 1.
  • R01 is a rotation matrix given by equation (3), and M01 is a translation vector given by equation (4).
  • The rotation matrix R01 is obtained so as to satisfy the following conditions: the Z1 axis is parallel to the direction of gravity; the vector V′0 is projected onto the X1Y1 plane perpendicular to the Z1 axis, and the direction of the projected vector is taken as the Y1 axis; and a vector perpendicular to the Y1Z1 plane is taken as the X1 axis.
  • Further, the coordinate conversion processing circuit 21f converts the position vector and direction vector in the intermediate coordinate system (X1Y1Z1) into the second coordinate system (X2Y2Z2), based on the center O2 of the spherical 3D model image M1, according to the following equations (7) and (8).
  • FIG. 4 is a diagram for explaining the relationship between the intermediate coordinate system (X 1 Y 1 Z 1 ) and the second coordinate system (X 2 Y 2 Z 2 ).
  • P2 = R12 · P1 + M02  … (7)
  • V2 = R12 · V1  … (8)
  • Here, P1 and V1 are, respectively, a position (vector) and a direction vector at an arbitrary position in the intermediate coordinate system (X1Y1Z1), and P2 and V2 are the corresponding position vector and direction vector after conversion into the second coordinate system (X2Y2Z2).
  • When P0 and V0 are set to the position (vector) of the viewpoint of the objective optical system 10 and the direction vector of the line-of-sight direction obtained by the position/direction acquisition circuit 25, P0 and V0 are converted into P1 and V1, and P1 and V1 are further converted into P2 and V2. That is, when P0 and V0 are acquired as the viewpoint position and the direction vector of the line-of-sight direction, the corresponding P2 and V2 are the position vector of the viewpoint of the objective optical system 10 in the second coordinate system (X2Y2Z2) (also denoted Pv) and the direction vector of the line-of-sight direction (also denoted V).
  • R12 is a rotation matrix given by equation (9), and M02 is a translation vector given by equation (10). Accordingly, a point (x1, y1, z1) in the intermediate coordinate system (X1Y1Z1) is converted into a point (x2, y2, z2) in the second coordinate system (X2Y2Z2) as shown in equation (11).
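  • The two coordinate conversions can be composed into a single mapping from the magnetic-field-generator coordinate system to the bladder-centered coordinate system. The following sketch simply chains expressions (1)-(2) and (7)-(8); R12 and M02 are placeholders here because equations (9) and (10) are not reproduced in this text.

```python
import numpy as np

def to_second_coords(p0, v0, R01, m01, R12, m02):
    """Chain expressions (1)-(2) and (7)-(8): frame 0 -> frame 1 -> frame 2."""
    p1 = R01 @ p0 + m01            # expression (1)
    v1 = R01 @ v0                  # expression (2)
    p2 = R12 @ p1 + m02            # expression (7)
    v2 = R12 @ v1                  # expression (8)
    return p2, v2                  # viewpoint Pv and line-of-sight V in (X2 Y2 Z2)

# Placeholder matrices and vectors (assumed, not from the patent)
R01, m01 = np.eye(3), np.zeros(3)
R12, m02 = np.eye(3), np.array([0.0, 0.0, -20.0])    # e.g. shift the entrance to the sphere center

Pv, V = to_second_coords(np.array([12.0, 6.0, 3.0]),
                         np.array([0.0, 1.0, 0.0]),
                         R01, m01, R12, m02)
```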
  • Accordingly, positions on the inner surface of the bladder B can also be handled using the second coordinate system (X2Y2Z2).
  • When the viewpoint position Pv and the line-of-sight direction V of the objective optical system 10 at the distal end portion 2d in the second coordinate system (X2Y2Z2) are determined, the coordinate position (also referred to simply as position) Ps on the inner surface of the sphere (representing the bladder B) that is imaged at the center (central portion) of the imaging surface 11a of the imaging device 11 is obtained using the sphere radius R as follows.
  • That is, a coefficient k satisfying the following equations (15) and (16) is calculated, and the position (or reference position) Ps is obtained as a coordinate position in the second coordinate system (X2Y2Z2).
  • Ps = Pv + kV  … (15)
  • |Ps| = R  … (16)
  • the perspective relation calculating circuit 21a can calculate the distance between the position Pv of the viewpoint and the position Ps on the inner surface by calculating the position Ps by Expressions (15) and (16).
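  • A minimal sketch of how equations (15) and (16) can be solved together: substituting (15) into the sphere condition gives a quadratic in k, and the forward root selects the inner-surface point hit by the line of sight. Variable names and values are illustrative.

```python
import numpy as np

def surface_point(Pv, V, R):
    """Solve Ps = Pv + k*V with |Ps| = R (sphere of radius R centered at the origin O2)."""
    V = V / np.linalg.norm(V)
    b = 2.0 * np.dot(Pv, V)
    c = np.dot(Pv, Pv) - R * R
    disc = b * b - 4.0 * c
    if disc < 0.0:
        raise ValueError("line of sight does not reach the sphere")
    k = (-b + np.sqrt(disc)) / 2.0        # forward intersection (positive for a viewpoint inside)
    Ps = Pv + k * V                       # equation (15)
    Ls = np.linalg.norm(Ps - Pv)          # distance from the viewpoint to the surface
    return Ps, Ls

Ps, Ls = surface_point(np.array([5.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0]), R=20.0)
```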
  • the memory 22 shown in FIG. 2 is formed using a ROM, a RAM (Random Access Memory), a flash memory, or the like.
  • The memory 22 stores various processing programs executed by the CPU 21 and various data; it also stores the information of the endoscopic image generated by the image generation circuit 4a, the position information and line-of-sight direction information acquired by the position/direction acquisition circuit 25, and the like.
  • That is, the memory 22 includes an endoscope image information storage unit 22a that stores information of endoscopic images, a position/direction information storage unit 22b that stores viewpoint position information and line-of-sight direction information, and a patient image information storage unit (or subject image information storage unit) 22c that stores three-dimensional image information, acquired by the CT apparatus 29, including at least the bladder B of the patient P as the subject.
  • the endoscope image information storage unit 22a, the position / direction information storage unit 22b, and the patient image information storage unit 22c are formed by different storage areas in the memory 22, for example. Therefore, each of these storage units can also be called a storage area or a storage device.
  • the luminal organ extraction circuit 26 extracts a hollow three-dimensional image of the bladder B as a luminal organ from the three-dimensional image information of the patient P. That is, the luminal organ extraction circuit 26 has a function of the bladder extraction circuit 26a that extracts a three-dimensional image of the bladder B.
  • the extracted three-dimensional image information of the bladder B is stored in the patient image information storage unit 22c in the memory 22 as the 3D model image M1. Therefore, the patient image information storage unit 22c has a function of an organ model image storage unit.
  • the organ model image storage unit may be formed in a storage area in the memory 22 other than the patient image information storage unit 22c.
  • the 3D model image M1 as an organ model image stored in the memory 22 is read by, for example, the CPU 21 and displayed on the monitor 6 via the display I / F 23.
  • A composite image obtained by combining the endoscopic image generated by the image generation circuit 4a, the superimposed image generated by the image processing device 5, and the organ model image is input to the monitor 6; as shown in FIG. 5A, the monitor 6 displays the endoscopic image Ien in the endoscopic image display area A1, the superimposed image in the adjacent superimposed image display area, and the 3D model image M1 in the organ model display area A3.
  • Alternatively, the endoscopic image generated by the image generation circuit 4a may be output directly to the monitor 6, as indicated by the dotted line in FIG. 2, and displayed as the endoscopic image Ien in the endoscopic image display area A1.
  • The system also has a display mode change function for changing the first display form shown in FIG. 5A to a second display form; by an input from the input device 30, which is composed of a keyboard or the like, the user can change the display so that the superimposed image is displayed in the endoscopic image display area A1. For example, when a change input for changing the display form of FIG. 5A is performed, the superimposed image Is is displayed in the endoscopic image display area A1 as shown in FIG. 5B, and the 3D model image M1 is displayed in the organ model display area A3 on its right side.
  • the image capturing circuit 24 illustrated in FIG. 2 has a function of an image capturing unit that performs processing for capturing an endoscopic image generated by the processor 4 at a certain period.
  • the image capture circuit 24 acquires, for example, 30 endoscopic images per second, which are the same as the frame rate, from the processor 4.
  • the image capture circuit 24 also receives a release signal from the processor 4.
  • The image capturing circuit 24 also captures the three-dimensional image of the patient P as the subject acquired in advance by the CT apparatus 29 described above, and stores the captured three-dimensional image information of the patient P in the memory 22.
  • The position/direction acquisition circuit 25 controls the drive circuit 27 that drives the magnetic field generator 7 so that a predetermined magnetic field is generated by the magnetic field generator 7; the magnetic field is detected by the magnetic sensor 13, and from the detected magnetic field detection signal the circuit generates, in real time and in the first coordinate system (X0Y0Z0), the position coordinates (x, y, z) of the viewpoint of the objective optical system 10 and its orientation (a vector of magnitude 1) representing the line-of-sight direction, that is, the viewpoint position information and the line-of-sight direction information.
  • In other words, the position/direction acquisition circuit 25 acquires the position information and direction information from the magnetic sensor 13 as the viewpoint position information and line-of-sight direction information of the objective optical system 10 (also simply called position/line-of-sight direction information).
  • The CPU 21 stores in the memory 22 association information that associates the endoscopic image captured by the image capturing circuit 24 with the position/direction information calculated from the position/direction information detected by the position/direction acquisition circuit 25 (for example, after conversion by the coordinate conversion processing circuit 21f into the second coordinate system (X2Y2Z2) based on the sphere center O2).
  • The association information that associates the endoscopic image captured by the image capturing circuit 24 with the position/direction information at the time the endoscopic image was captured may be stored, for example, in an association information storage unit 22d in the memory 22 (the example shown in FIG. 2). Alternatively, the endoscope image information storage unit 22a or the position/direction information storage unit 22b in the memory 22 may be given the function of the association information storage unit 22d and store the association information.
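  • As a data-structure sketch, the association between each captured frame and the pose under which it was captured could be kept as simple records, as below; the class and field names are illustrative, not taken from the patent.

```python
from dataclasses import dataclass, field
from typing import List
import numpy as np

@dataclass
class FrameRecord:
    """One endoscopic frame with the pose (in the X2Y2Z2 system) at capture time."""
    image: np.ndarray
    viewpoint: np.ndarray      # Pv
    direction: np.ndarray      # V, unit vector

@dataclass
class AssociationStore:
    records: List[FrameRecord] = field(default_factory=list)

    def add(self, image, viewpoint, direction):
        self.records.append(FrameRecord(image, viewpoint, direction))

store = AssociationStore()
store.add(np.zeros((480, 640, 3), dtype=np.uint8),
          np.array([5.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0]))
```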
  • The perspective relation calculation circuit (or distance calculation circuit) 21a of the CPU 21 calculates, from the three-dimensional image information of the bladder B extracted by the bladder extraction circuit 26a (more specifically, the 3D model image information of a hollow three-dimensional sphere) and the association information, the perspective relationship, that is, the relationship of the distances from the viewpoint (position) of the objective optical system 10 to each point on the subject corresponding to each point on the endoscopic image as the subject image.
  • FIG. 6A shows an operation explanatory diagram for calculating the perspective relationship or the distance magnitude relationship by the perspective relationship calculation circuit 21a.
  • FIG. 6A shows a state in which the inner surface of the bladder B is observed by the objective optical system 10 and the image sensor 11 arranged at the distal end portion 2d of the insertion portion 2b inserted into the bladder B.
  • The inner surface of the bladder B is approximated by a sphere of radius R centered at O2, and each position on the inner surface is handled in the second coordinate system (X2Y2Z2) with O2 as the reference (origin), which makes the processing easier.
  • the information on the viewpoint position Pv and the line-of-sight direction (optical axis direction) V of the objective optical system 10 disposed at the distal end portion 2d is acquired using the position / direction acquisition circuit 25, and the second coordinates After being converted into system (X 2 Y 2 Z 2 ) information, it is stored in association with the endoscopic image.
  • the position / direction acquisition circuit 25 (the imaging information acquisition unit 25b thereof) acquires imaging information such as the focal length f of the objective optical system 10, the number of pixels of the imaging element 11, and the viewing angle.
  • An optical image of the inner surface within the range of the viewing angle θ, centered on the position in the line-of-sight direction V, is formed on the imaging surface 11a of the image sensor 11 arranged perpendicular to the line-of-sight direction V.
  • the optical image is converted into an endoscopic image by the image generation circuit 4a.
  • This endoscopic image is captured by the image capturing circuit 24 in the image processing device 5, temporarily stored in the image memory 23a provided in the memory 22 or in the display I/F 23, read out, and displayed on the monitor 6 as an endoscopic image (referred to as the endoscopic image on the monitor when it is necessary to distinguish it from the endoscopic image equivalent to the optical image on the imaging surface 11a described below).
  • The optical center of the objective optical system 10 is the viewpoint position Pv, and the position where the straight line along the line-of-sight (optical axis) direction V from the viewpoint position Pv intersects the inner surface is the reference position Ps.
  • The reference position Ps is imaged at, for example, the central reference position Qs on the imaging surface 11a, and the positions Pa and Pb at the boundaries of the viewing angle θ of the image sensor 11 are imaged at positions Qa and Qb on the imaging surface 11a, respectively.
  • When the distance from the viewpoint position Pv to the position Ps is Ls, a length on the subject (specifically, on the inner surface of the bladder B) near the position Ps is reduced on the imaging surface 11a by a scale of approximately f/Ls.
  • When the distances from the viewpoint to the respective positions on the inner surface are nearly equal, this scale is highly accurate even when the position on the inner surface differs.
  • From the optical image on the imaging surface 11a, the image generation circuit 4a forming the subject image generation unit generates an endoscopic image at, for example, the same scale as the optical image (it does not necessarily have to be the same scale). For this reason, what is described for the imaging surface 11a applies similarly to the endoscopic image generated by the image generation circuit 4a.
  • The size of the endoscopic image on the monitor (the endoscopic image on the display surface of the monitor 6) changes depending on the size of the area in which it is displayed on the monitor 6; if the length is magnified E times, the scale is likewise multiplied by E. Accordingly, what is described for the imaging surface 11a applies similarly to the endoscopic image on the monitor displayed on the monitor 6.
  • a first scale image and a second scale image are generated for an endoscopic image having the same scale as the optical image of the imaging surface 11a, and a superimposed image is generated.
  • When the observation state in which the subject side (the inner surface side of the bladder B) is observed using the objective optical system 10 from the viewpoint position Pv is close to a state in which the line of sight passes through the center O2, the first scale image I1 is generated.
  • On the other hand, when the line of sight is set so as not to pass through the vicinity of the center O2, the scale at which a length on the subject side (the inner surface side of the bladder B) appears on the endoscopic image changes as the position deviates from the reference position Ps.
  • The reference position Ps can be calculated from equation (15), and the value of the distance Ls can be calculated from the following equation (17).
  • Ls = |Ps − Pv|  … (17)
  • The positions Pa and Pb can likewise be obtained by applying expressions (15) and (16) to the direction vectors Va and Vb obtained by rotating the line-of-sight direction vector V from the viewpoint position Pv by θ/2 in opposite directions, and the distances La and Lb can then be calculated by applying a formula similar to equation (17).
  • A length in the vicinity of each position Ps, Pa, Pb on the inner surface of the sphere is reduced by a factor of f/Ls, f/La, f/Lb, respectively, in the vicinity of the corresponding position Qs, Qa, Qb on the imaging surface 11a.
  • In FIG. 6A, for example, when a position Pc at a distance (or length) d from the position Ps on the inner surface is set, the position Qc on the imaging surface 11a that lies on the straight line passing through the position Pc and the viewpoint position Pv can also be acquired. In this case, the distance d on the inner surface is reduced to the distance (length) QsQc (the distance between the positions Qs and Qc) on the imaging surface 11a.
  • A process of setting the distance d on the inner surface as a reference length and mapping distances on the inner surface to distances on the imaging surface in this way may be applied to generate the second scale image described later.
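  • As an illustration of this mapping, the sketch below projects a point near Ps onto an imaging plane placed at focal length f in front of the viewpoint, using a simple pinhole model; this geometric model is an assumption consistent with the f/Ls scaling above, not the patent's exact formulation.

```python
import numpy as np

def project_to_image(Pw, Pv, V, f):
    """Pinhole projection of a world point Pw onto the imaging plane that is
    perpendicular to the line of sight V at focal length f from the viewpoint Pv."""
    V = V / np.linalg.norm(V)
    up = np.array([0.0, 0.0, 1.0])
    if abs(np.dot(up, V)) > 0.99:             # avoid a degenerate 'up' vector
        up = np.array([0.0, 1.0, 0.0])
    x_axis = np.cross(up, V); x_axis /= np.linalg.norm(x_axis)
    y_axis = np.cross(V, x_axis)
    rel = Pw - Pv
    z = np.dot(rel, V)                         # depth along the line of sight
    return np.array([f * np.dot(rel, x_axis) / z,
                     f * np.dot(rel, y_axis) / z])

# Example: d = 2 on the inner surface maps to roughly d * f / Ls on the imaging surface
Pv, V, f = np.array([5.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0]), 3.0
Ps = np.array([20.0, 0.0, 0.0])                # reference point, Ls = 15
Pc = Ps + np.array([0.0, 2.0, 0.0])            # point at distance d = 2 from Ps
print(np.linalg.norm(project_to_image(Pc, Pv, V, f) - project_to_image(Ps, Pv, V, f)))  # ~0.4
```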
  • In accordance with the contents described with reference to FIG. 6A, the perspective relationship calculation circuit 21a calculates the perspective relation, including the distance from the viewpoint position Pv to each point on the subject corresponding to each point on the endoscopic image (which corresponds to the optical image on the imaging surface 11a), and calculates scale information indicating the scale at which a predetermined length on the subject appears on the endoscopic image.
  • For an endoscopic image obtained in an observation state in which the scale value changes across the endoscopic image, a second scale image I2 reflecting the change in length or scale on the endoscopic image Ien is generated as shown in FIG. 3B, and the superimposed image generation circuit 21c generates a superimposed image Is by superimposing the generated second scale image I2 on the endoscopic image Ien.
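  • One simple way such a grid-shaped second scale image could be built is sketched below: the grid spacing along each image axis is derived from a different scale factor, so that equal spacings on the image correspond to a fixed reference length on the subject in each direction. The per-direction constant spacing is an illustrative simplification; the patent only specifies the result (different scales in different directions reflecting the perspective relation), not this construction.

```python
import numpy as np

def grid_scale_image(shape, scale_x, scale_y, ref_len_mm):
    """Draw grid lines whose spacing corresponds to ref_len_mm on the subject,
    using different scale factors (pixels per mm) for the two image directions."""
    h, w = shape
    img = np.zeros(shape, dtype=np.uint8)
    step_x = max(1, int(round(ref_len_mm * scale_x)))   # pixels per ref_len_mm in x
    step_y = max(1, int(round(ref_len_mm * scale_y)))   # pixels per ref_len_mm in y
    cx, cy = w // 2, h // 2
    for x in range(cx % step_x, w, step_x):
        img[:, x] = 255                                  # vertical grid lines
    for y in range(cy % step_y, h, step_y):
        img[y, :] = 255                                  # horizontal grid lines
    return img

# Example: oblique view, so the two directions are foreshortened differently
I2 = grid_scale_image((480, 640), scale_x=8.0, scale_y=12.0, ref_len_mm=5.0)
```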
  • The CPU 21 also includes a determination circuit 21g forming a scale image determination unit that, in accordance with the information representing the perspective relationship calculated by the perspective relationship calculation circuit 21a, automatically determines (or sets) which of the first scale image and the second scale image is used as the scale image to be superimposed on the endoscopic image as the subject image.
  • the perspective relationship calculation circuit 21a is shown as having a function of the determination circuit 21g.
  • The determination circuit 21g includes a determination circuit 21h that uses information on a plurality of distances, calculated by the perspective relation calculation circuit 21a, from the viewpoint to a plurality of points on the inner surface of the bladder B forming the subject to determine whether or not the line-of-sight direction forms an angle close to perpendicular to the inner surface, that is, whether the angle is smaller than an angle threshold; based on the determination result of the determination circuit 21h, the determination circuit 21g automatically decides which of the first scale image and the second scale image to superimpose.
  • FIG. 6B shows an operation explanatory diagram of the determination circuit 21h.
  • FIG. 6B shows a state in which the inner surface of the bladder B is observed in a setting in which the line-of-sight direction V at the viewpoint position Pv of the objective optical system 10 at the distal end portion 2d is nearly perpendicular to the inner surface of the bladder B.
  • a case where the inner surface is observed in a state where the line-of-sight direction V is nearly perpendicular to the inner surface of the bladder B is referred to as front view.
  • The perspective relation calculation circuit 21a sets two positions Pc and Pd at a predetermined length d from the reference position Ps in, for example, two orthogonal directions (in FIG. 6B, Pc lies in the plane of the drawing and Pd in the direction perpendicular to it) and calculates the distances Lc and Ld from the viewpoint to these positions.
  • In the memory 22, for an observation state in which the line-of-sight direction V passes through the center O2, that is, an observation state in which the line-of-sight direction V is perpendicular to the inner surface, the value Lr(Ls) of the distance from the viewpoint to a position Pc (or Pd) at the predetermined length d is stored in advance for each value of the distance Ls from the viewpoint position Pv to the inner surface; the notation Lr(Ls) indicates that the value of Lr changes depending on the value of Ls.
  • FIG. 6C is an explanatory diagram of the distance Lr (Ls) stored in the memory 22.
  • FIG. 6C shows Lr (Ls1) and Lr (Ls2) as distance values Lr (Ls) in the case of two viewpoint positions Pv1 and Pv2.
  • As shown, the distance value Lr(Ls) varies depending on the viewpoint position Pv along the line-of-sight direction V, that is, on the value of the distance Ls from the viewpoint position Pv to the inner surface.
  • The distance values Lr(Ls1) and Lr(Ls2) indicate the distances from the viewpoint positions Pv1 and Pv2 to the position Pc; the same applies when the position Pd is used instead of the position Pc.
  • the memory 22 stores a distance value Lr (Ls) corresponding to the distance Ls.
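  • The stored values Lr(Ls) can be thought of as a small lookup table indexed by the distance Ls; a sketch with interpolation is shown below. The sample values are placeholders, since the actual table contents are not given in this text.

```python
import numpy as np

# Placeholder table: distance Ls (mm) -> stored front-view reference distance Lr(Ls) (mm)
LS_SAMPLES = np.array([5.0, 10.0, 15.0, 20.0])
LR_SAMPLES = np.array([5.4, 10.2, 15.1, 20.1])

def lr_of_ls(ls):
    """Interpolated reference distance Lr(Ls) for a front-view observation at distance Ls."""
    return float(np.interp(ls, LS_SAMPLES, LR_SAMPLES))

lr = lr_of_ls(12.0)   # reference distance used by the determination circuit 21h
```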
  • Thresholds Ta and Tb are set according to the value of the distance Ls, for example, to about several percent to 10% of Lr(Ls).
  • The determination circuit 21h determines whether the distances Lc and Ld satisfy the condition of equation (18), that is, whether they fall within the thresholds Ta and Tb of the stored value Lr(Ls). When the condition of equation (18) is satisfied, it is determined that the line-of-sight direction V is nearly perpendicular to the inner surface as the observation state, and the determination circuit 21g determines to use the first scale image.
  • When the condition of equation (18) is not satisfied, it is determined that the line-of-sight direction V is not nearly perpendicular to the inner surface as the observation state, and the determination circuit 21g determines to use the second scale image.
  • FIG. 6C also shows a state in which the inner surface is observed from a viewpoint position Pv3 that, unlike the viewpoint positions Pv1 and Pv2, deviates from the direction perpendicular to the inner surface.
  • The distance Ls3 from the viewpoint position Pv3 to the position Ps is indicated with the same value as the distance Ls1, but the distance Lr(Ls3) from the viewpoint position Pv3 to the position Pc is much larger than Lr(Ls1). In such a case, it is determined that the condition of equation (18) is not satisfied.
  • Instead of using equation (18), as shown in FIG. 6C, the angle θ formed by the line-of-sight direction V and the vector from the center O2 toward the reference position Ps may be calculated, and the determination circuit 21h may determine whether or not the line-of-sight direction V is nearly perpendicular to the inner surface as the observation state depending on whether θ is within an angle threshold Tan set within 0 to 10 degrees (equation (19)).
  • When the condition of equation (19) is satisfied, it may be determined that the line-of-sight direction V is nearly perpendicular to the inner surface as the observation state; when the condition of equation (19) is not satisfied, it may be determined that the line-of-sight direction V is not nearly perpendicular to the inner surface.
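  • The two alternative front-view tests can be summarized as below. Since equation (18) is not reproduced in this text, the distance test (each measured distance within its threshold of the stored Lr(Ls)) is an assumed reading; the angle test follows equation (19) with the threshold Tan of at most 10 degrees.

```python
import numpy as np

def is_front_view_by_distance(Lc, Ld, Lr_ls, Ta, Tb):
    """Assumed form of equation (18): both measured distances stay within the
    thresholds Ta and Tb of the stored front-view reference distance Lr(Ls)."""
    return abs(Lc - Lr_ls) <= Ta and abs(Ld - Lr_ls) <= Tb

def is_front_view_by_angle(V, Ps, tan_deg=10.0):
    """Equation (19): angle between the line of sight V and the radial vector O2 -> Ps."""
    V = V / np.linalg.norm(V)
    n = Ps / np.linalg.norm(Ps)               # O2 is the origin of (X2 Y2 Z2)
    theta = np.degrees(np.arccos(np.clip(np.dot(V, n), -1.0, 1.0)))
    return theta <= tan_deg

front = is_front_view_by_angle(np.array([0.9, 0.1, 0.0]), np.array([20.0, 0.0, 0.0]))
scale_image = "first (concentric)" if front else "second (grid)"
```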
  • The user can select whether or not to use the function of the determination circuit 21g. For example, when the scale automatic selection switch 30a provided in the input device 30 is turned ON, the scale image to be superimposed on the endoscopic image as the subject image is automatically determined by the determination operation of the determination circuit 21g.
  • the threshold value when the determination circuit 21g is used can be specified (set) or selected from the input device 30.
  • the input device 30 has a function of a threshold setting unit (or a threshold setting device) that sets a threshold.
  • The user can also select or designate the scale image to be superimposed on the endoscopic image as the subject image; in this case, the user selects the scale image to be superimposed on the endoscopic image from the input device 30.
  • When a user such as an operator observes the inner surface of the bladder B using the endoscope 2 and, for example, a lesion exists and the user wants to grasp its size, the size of the lesion can be easily grasped by inputting, from the input device 30, a selection instruction to superimpose the first scale image or the second scale image on the endoscopic image.
  • The superimposed image generation circuit 21c performs processing for generating the first scale image and the second scale image and stores the generated images, for example, in the memory 22 (a memory other than the memory 22 may also be used); the scale image whose selection is instructed by the user can then be displayed quickly.
  • Alternatively, if the superimposed image generation circuit 21c can generate the first scale image or the second scale image in a short time, the scale image whose selection is instructed may be generated after the selection instruction is given, or only one scale image may be generated in advance, with the other generated when its selection is instructed.
  • As described above, the endoscope system 1 of the present embodiment includes: an image generation circuit 4a forming a subject image generation unit configured to generate a subject image when a subject is viewed from a predetermined viewpoint; a position/direction acquisition circuit 25 forming an information acquisition unit configured to acquire position information of the viewpoint and line-of-sight direction information with respect to the subject; a perspective relationship calculation circuit 21a forming a perspective relationship calculation unit configured to calculate, using the position information and the line-of-sight direction information, the perspective relationship (information) from the viewpoint to each point on the subject corresponding to each position on the subject image; a first scale image generation circuit 21d forming a first scale image generation unit configured to generate a first scale image which is configured using the same scale for at least two different directions on the subject image and reflects, on the subject image, lengths in the different directions on the subject; a second scale image generation circuit 21e forming a second scale image generation unit configured to generate, in accordance with the perspective relationship (information) calculated by the perspective relationship calculation unit, a second scale image which is configured using different scales for at least two different directions on the subject image and reflects, on the subject image, lengths in the different directions on the subject; and a superimposed image generation circuit 21c forming an image processing unit configured to generate a superimposed image in which the first scale image or the second scale image is selectively superimposed on the subject image.
  • FIG. 7 shows an operation when the inner surface of the bladder B is inspected or observed using the endoscope 2.
  • In the first step S1, a user such as an operator performs initial settings. For example, the display mode of FIG. 5A or FIG. 5B is set for when a scale image (the first scale image or the second scale image) is displayed; thereafter, the scale image (superimposed image) described later is displayed according to this setting.
  • When the user wants the scale image to be superimposed on the endoscopic image to be determined automatically, based on the perspective relation calculation result of the perspective relation calculating circuit 21a of the image processing apparatus 5, according to whether the observation state is close to front view (in which the scale of the displayed endoscopic image hardly changes at different positions on the endoscopic image), the user turns the scale automatic selection switch 30a ON. When the user wishes to select (or designate) the scale image to be superimposed on the endoscopic image himself or herself, the user turns the scale automatic selection switch 30a OFF.
  • step S2 the operator inserts the insertion portion 2b of the endoscope 2 into the bladder B (to be examined or observed) through the urethra of the patient P.
  • an imaging signal obtained by imaging the inner surface of the bladder B by the objective optical system 10 in the distal end portion 2d and the imaging element 11 arranged at the imaging position thereof is input to the image generation circuit 4a, and the image generation circuit 4a generates an endoscopic image.
  • the generated endoscopic image is displayed on the monitor 6.
  • As shown in step S4, the position/direction acquisition circuit 25 acquires information on the viewpoint position Pv and the line-of-sight direction V of the objective optical system 10 at the distal end portion 2d, and the coordinate conversion processing circuit 21f converts it into the second coordinate system (X2Y2Z2).
  • Then, as shown in step S5, the memory 22 stores the endoscopic image and the information on the viewpoint position Pv and line-of-sight direction V of the objective optical system 10 acquired at the time the endoscopic image was acquired, converted into the second coordinate system (X2Y2Z2), in association with each other. This operation is performed by the CPU 21 forming the control unit.
  • In step S6, the CPU 21 (the perspective relation calculation circuit 21a) calculates the distances from the viewpoint position Pv to a plurality of points on the inner surface of the bladder B corresponding to a plurality of representative points on the imaging surface 11a (or in the endoscopic image) within the field of view or angle of view of the imaging unit 12 (in order to determine whether the observation state is close to front view). Further, the CPU 21 (the perspective relation calculation circuit 21a) calculates the scale in the endoscopic image (or on the imaging surface 11a) with respect to lengths on the inner surface in the vicinity of the plurality of representative points.
  • In step S7, the CPU 21 (the scale image generation circuit 21b) refers to the imaging information, such as the focal length f of the objective optical system, acquired by the position/direction acquisition circuit 25, and generates the first scale image and the second scale image by using (at least a part of) the distance and scale information calculated in step S6.
  • Alternatively, this processing may not be performed in step S7; instead, the first scale image may be generated immediately before step S10, described later, and the second scale image immediately before step S11, described later.
  • In the next step S8, the CPU 21 determines whether or not the scale automatic selection switch 30a is ON.
  • When the switch is ON, the CPU 21 determines, based on equation (18) or equation (19), whether the current observation state of the endoscopic image is front view, that is, whether the current line-of-sight direction is nearly perpendicular to the inner surface.
  • When it is determined to be front view, as shown in step S10, the CPU 21 (the superimposed image generation circuit 21c) superimposes the first scale image generated by the first scale image generation circuit 21d on the endoscopic image to generate a superimposed image.
  • the monitor 6 displays a superimposed image obtained by superimposing the first scale image on the endoscopic image.
  • On the other hand, when it is determined not to be front view, as shown in step S11, the CPU 21 (the superimposed image generation circuit 21c) superimposes the second scale image generated by the second scale image generation circuit 21e on the endoscopic image to generate a superimposed image.
  • the monitor 6 displays a superimposed image obtained by superimposing the second scale image on the endoscopic image.
  • In step S12, the CPU 21 determines whether or not the user has performed an input operation from the input device 30 to end the examination or observation of the bladder B. When the ending input operation has not been performed, the process returns to step S3; when it has been performed, the processing of FIG. 7 ends. If the scale automatic selection switch 30a is determined to be OFF in step S8, then, as shown in step S13, the CPU 21 determines whether a superimposed image using the first scale image has been selected (or designated). When the determination result is that the user has selected (or designated) a superimposed image using the first scale image, the process proceeds to step S10. On the other hand, when the determination result is that a superimposed image using the second scale image has been selected (or designated) instead, the process proceeds to step S11.
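  • The branching of steps S8, S9, and S13 can be summarized as in the sketch below; the step numbers follow the description above, while the function and argument names are illustrative.

```python
def choose_scale_image(auto_select_on, front_view, user_picked_first):
    """Steps S8/S9/S13: decide which scale image to superimpose.
    Returns "first" (concentric, same scale in all directions) or
    "second" (grid, direction-dependent scale)."""
    if auto_select_on:                                  # step S8: switch 30a is ON
        return "first" if front_view else "second"      # step S9: front-view test
    return "first" if user_picked_first else "second"   # step S13: user's selection

# Example: oblique observation with automatic selection enabled -> "second"
print(choose_scale_image(auto_select_on=True, front_view=False, user_picked_first=True))
```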
  • FIG. 8A shows the first scale image generation processing in FIG. 7.
  • FIG. 8B shows an explanatory diagram of the operation of FIG. 8A and shows how the inner surface of the bladder B is observed by the endoscope 2.
  • FIG. 8B shows a state in which the line-of-sight direction V is close to perpendicular to the inner surface as indicated by O 2 at the center of the sphere.
  • In step S21, the first scale image generation circuit 21d first sets, as the reference position Ps, the position on the inner surface of the bladder B corresponding to the viewpoint position Pv and the line-of-sight direction V of the objective optical system 10, and acquires the scale value on the endoscopic image at the reference position Ps based on the result of the perspective relation acquired by the perspective relation calculation circuit 21a.
  • the imaging device 11 images the inner surface of the bladder B as the subject in the viewpoint position Pv and the line-of-sight direction V from the viewpoint position Pv
  • the position of the inner surface on the extension line of the line-of-sight direction V is used as a reference.
  • the value G of the scale is obtained. As described with reference to FIG. 6A, the scale value G is f / Ls.
  • Next, the first scale image generation circuit 21d sets the reference lengths Lf and Lf on both sides of the reference position Ps on the inner surface of the bladder B to obtain the reference length 2×Lf, and generates a first circle Sp1 whose reference scale diameter Da is the length obtained by multiplying the reference length 2×Lf by the scale value G (representing 5 mm on the inner surface in FIG. 3A).
  • Next, the first scale image generation circuit 21d generates, outside the first circle Sp1, a second circle Sp2 that is concentric with the first circle Sp1 and has a reference scale diameter Db twice the reference scale diameter Da. The same result is obtained even if the reference length Lf is used as a reference scale radius.
  • In this way, a first scale image I1 including the first circle Sp1 and the second circle Sp2 is generated. The first scale image generation circuit 21d sends the generated first scale image I1 to the superimposed image generation circuit 21c, and the process of FIG. 8A ends; a simplified numerical sketch of this processing follows.
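As a minimal sketch of the concentric-circle generation of FIG. 8A, the following assumes a simple pinhole model in which the on-image scale value is G = f/Ls and converts physical lengths on the inner surface into pixel radii. The pixel-pitch parameter, the numeric values, and the use of OpenCV for drawing are assumptions for illustration only, not part of the embodiment.

```python
import numpy as np
import cv2  # drawing library assumed for illustration

def generate_first_scale_image(image_shape, center_px, f_mm, Ls_mm,
                               pixel_pitch_mm, Lf_mm=2.5):
    """Sketch of steps S21-S23: the first circle Sp1 has a reference scale diameter Da
    corresponding to 2*Lf on the inner surface (5 mm when Lf = 2.5 mm), and the
    concentric second circle Sp2 has a diameter Db = 2*Da."""
    G = f_mm / Ls_mm                               # scale value G = f / Ls
    Da_px = (2.0 * Lf_mm) * G / pixel_pitch_mm     # on-image diameter of Sp1
    img = np.zeros(image_shape, dtype=np.uint8)
    cv2.circle(img, center_px, int(round(Da_px / 2)), 255, 1)  # first circle Sp1
    cv2.circle(img, center_px, int(round(Da_px)), 255, 1)      # second circle Sp2 (Db = 2*Da)
    return img

# usage sketch with made-up values: f = 3 mm, Ls = 30 mm, 3 um pixel pitch
scale_img = generate_first_scale_image((480, 640), (320, 240), 3.0, 30.0, 0.003)
```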
  • The superimposed image generation circuit 21c to which the first scale image I1 is sent superimposes the first scale image I1 around the reference position Qs on the endoscopic image captured by the image sensor 11 to generate a superimposed image Is, as shown in FIG. 3A. In the superimposed image Is of FIG. 3A, for example, the first scale image I1, composed of the first circle Sp1 and the second circle Sp2 centered on the reference position Qs, is superimposed on an endoscopic image Ien including lesions Id1, Id2, and Id3.
  • The first scale image I1 is formed along an intersecting surface (formed by the imaging surface 11a of the image sensor 11) that intersects the line-of-sight direction V substantially perpendicularly, for an arbitrary direction including the case of two different directions. In other words, the first scale image I1 is configured using the same scale in at least two different directions on the endoscopic image, and is a concentric circle scale (image) for reflecting, on the endoscopic image, lengths in different directions on the inner surface of the bladder B (as the subject). That is, the concentric circle scale (image) is configured using the same scale for any direction, including two different directions, and is an isotropic scale without direction dependency that reflects the length in any direction on the inner surface of the bladder B. The scale of FIG. 10, described later, is configured using the same scale (that is, two identical scales) for two mutually orthogonal directions, and is a scale for reflecting the lengths in those two directions on the inner surface of the bladder B. In addition, as shown in FIG. 3A, the calculated scale value may be displayed in the vicinity of the reference position Qs, which is the center of the endoscopic image Ien, or at a peripheral or other position, and a distance (for example, the distance Ls to the reference position Ps) or the like may also be displayed.
  • the first scale image I1 has a circular scale for measurement along an arbitrary direction.
  • the generated superimposed image Is is output to the monitor 6.
  • the superimposed image Is is displayed in the superimposed image display area A2 of the monitor 6.
  • the surgeon can easily grasp the size of the lesioned part or the like from the first scale image displayed superimposed on the endoscopic image.
  • A display form in which the superimposed image Is is displayed in the endoscope image display area A1 can also be selected; in this case, the superimposed image can be displayed larger than in FIG. 5A.
  • FIG. 9A shows the second scale image generation processing in FIG. 7, and FIG. 9B is an explanatory diagram of the operation of FIG. 9A.
  • The second scale image generation circuit 21e first sets, as the reference position Ps (see FIG. 9B and FIG. 6A), the position at which the line-of-sight direction V from the viewpoint position Pv of the objective optical system 10 intersects the inner surface of the bladder B, and acquires the position Qs (see FIG. 9B and FIG. 6A) on the endoscopic image corresponding to the reference position Ps. In FIG. 9B, the reference position Ps on the inner surface corresponding to the position Qs is indicated by (Ps); other positions on the inner surface are indicated in the same manner.
  • Next, the second scale image generation circuit 21e sets a first straight line la (in the specific example, a straight line extending in the left-right direction) that passes through the reference position Ps on the inner surface, sets on the first straight line la, on both sides of the reference position Ps, the positions Pl1, Pr1 at the reference length L from the reference position Ps, the positions Pl2, Pr2 at a further reference length L, and so on, and acquires the positions Ql1, Qr1, Ql2, Qr2, ... on the endoscopic image corresponding to the positions Pl1, Pr1, Pl2, Pr2, ....
  • Next, the second scale image generation circuit 21e sets, for example, a second straight line lb (in the specific example, a straight line extending in the vertical direction) that passes through the reference position Ps and is orthogonal to the first straight line la, similarly sets on the second straight line lb, on both sides of the reference position Ps, the positions Pu1, Pd1, the positions Pu2, Pd2, and so on, at intervals of the reference length L, and acquires the corresponding positions Qu1, Qd1, Qu2, Qd2, ... on the endoscopic image.
  • In step S34, the second scale image generation circuit 21e sets a straight line lu1 (not shown) that passes through the position Pu1 and is parallel to the first straight line la, sets on the straight line lu1, on both sides of the position Pu1, the positions Pul1, Pur1 at the reference length L from the position Pu1, the positions Pul2, Pur2, and so on, and acquires the corresponding positions Qul1, Qur1, Qul2, Qur2, ... on the endoscopic image.
  • Next, the second scale image generation circuit 21e sets a straight line ld1 that passes through the position Pd1 and is parallel to the first straight line la, sets on the straight line ld1, on both sides of the position Pd1, the positions Pdl1, Pdr1 at the reference length L from the position Pd1, the positions Pdl2, Pdr2, and so on, and acquires the corresponding positions Qdl1, Qdr1, Qdl2, Qdr2, ... on the endoscopic image.
  • In step S36, the second scale image generation circuit 21e repeats the processing of step S34 for the position Pu2 adjacent to the upper side of the position Pu1, the position Pu3 adjacent to the upper side of the position Pu2, and so on, and similarly repeats the processing for the position Pd2 adjacent to the lower side of the position Pd1, the position Pd3, and so on.
  • The second scale image generation circuit 21e then draws a curve la' corresponding to the straight line la, a curve lu1' corresponding to the straight line lu1, a curve ld1' corresponding to the straight line ld1, and so on. The curve la' passes through the position Qs, the positions Ql1, Qr1 on both sides thereof, the positions Ql2, Qr2, ... on both sides of the positions Ql1, Qr1, and so on; that is, the position Qs and the positions existing on both sides of it in the horizontal direction are connected by the curve la'.
  • In step S38, the second scale image generation circuit 21e similarly draws a curve lb' extending in the vertical direction and passing through the position Qs, the positions Qu1, Qd1, the positions Qu2, Qd2, and so on.
  • The second scale image generation circuit 21e likewise draws, as in step S38, curves ll1', lr1', ll2', lr2', ... extending in the vertical direction and passing through the positions Ql1, Qr1, Ql2, Qr2, ... and the positions existing above and below them. In this way, the second scale image I2 is generated, and the process of FIG. 9A ends; a simplified geometric sketch of this processing follows.
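The grid generation of FIG. 9A can be sketched as below, under the simplifying assumptions that the inner surface of the bladder B is modelled as a sphere and that projection onto the endoscopic image follows a pinhole model. The stepping-by-arc-length approximation, the rotation matrix R, and all names are illustrative assumptions rather than the embodiment's actual computation.

```python
import numpy as np

def project(point, viewpoint, R, f):
    """Assumed pinhole projection of a 3D surface point onto the imaging surface;
    R rotates world coordinates into a camera frame whose z axis is the
    line-of-sight direction V."""
    p = R @ (point - viewpoint)
    return (f * p[0] / p[2], f * p[1] / p[2])

def second_scale_grid(sphere_center, radius, Ps, u_dir, v_dir,
                      viewpoint, R, f, L=5.0, n=3):
    """Sketch of steps S31-S36: starting from the reference position Ps, step by
    the reference length L along two orthogonal tangent directions (the straight
    lines la and lb and their parallels), re-project each stepped point back onto
    the sphere, and map it to the image. Connecting the resulting image points
    row-wise and column-wise with curves yields the second scale image I2."""
    grid = {}
    for i in range(-n, n + 1):          # vertical steps: positions Pu*, Pd*
        for j in range(-n, n + 1):      # horizontal steps: positions Pl*, Pr*
            p = Ps + L * (j * u_dir + i * v_dir)
            p = sphere_center + radius * (p - sphere_center) / np.linalg.norm(p - sphere_center)
            grid[(i, j)] = project(p, viewpoint, R, f)   # positions Q.. on the image
    return grid
```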
  • In this manner, while the distance between two positions adjacent to each other in the horizontal or vertical direction on the inner surface is the reference length L, the distance between two positions adjacent to each other in the horizontal or vertical direction on the endoscopic image is a value L' obtained by reducing the reference length L to the corresponding length on the endoscopic image. The second scale image I2 is therefore a scale whose graduation is changed in two different directions on the endoscopic image, and serves as an image having a scale that reflects, on the endoscopic image, the length on the inner surface of the bladder B as the subject.
  • In other words, the first scale image I1 is an isotropic scale image, whereas the second scale image I2 is an anisotropic scale image having direction dependency. The scale value L' shown in FIG. 9B may be displayed together with an indication that it corresponds to the length L on the inner surface of the bladder B.
  • The generation of the second scale image I2 described above can be summarized as follows: from the reference position Ps on the inner surface of the organ, where the line-of-sight direction of the objective optical system 10 intersects that surface, a plurality of positions Pl1, Pl2, and so on, separated by the reference length L, are set along two mutually orthogonal directions; the positions on the endoscopic image corresponding to these positions are acquired; and by connecting them with curves, the second scale image I2 is generated.
  • In the second scale image I2 shown in FIG. 3B and FIG. 9B, since the distance from the viewpoint is larger in the upper area than in the lower area, the interval between the scale lines is smaller in the upper area than in the lower area. Further, as shown in FIG. 3B, even when lesions Id1, Id2, and Id3 exist at a plurality of different positions on the endoscopic image Ien, their sizes can be measured or grasped by means of the second scale image I2.
  • As described above, according to the present embodiment, when the surface to be examined is observed in a state close to a front view, a superimposed image using the first scale image, which has the same scale graduation for any direction on the endoscopic image, is displayed; and when the distance to the surface to be examined changes according to the position on the endoscopic image, a superimposed image using the grid-shaped second scale image, which has different scales in at least two different directions on the endoscopic image, is displayed. Then, when there is a site of interest such as a lesion on the endoscopic image, a user such as the surgeon can easily and accurately measure its size using the first scale image or the second scale image displayed in superimposition.
  • In the embodiment described above, an example has been described in which, in an observation state close to a front view, the first scale image including the concentric first circle and second circle is generated and a superimposed image is generated using the generated first scale image. Instead, a grid-shaped first scale image may be generated, and a superimposed image may be generated using the generated grid-shaped first scale image.
  • The superimposed image Is as shown in FIG. 10 uses a grid-shaped first scale image I1' having a first scale formed along a first direction on the intersecting surface, formed by the imaging surface 11a of the image sensor 11, that intersects the line-of-sight direction substantially perpendicularly when viewing the subject, and a second scale of the same graduation as the first scale along a second direction intersecting the first direction on that intersecting surface (a scale that is substantially equally spaced in the horizontal direction in FIG. 10). This grid-shaped first scale image I1' can be generated by the processing procedure shown in FIG. 9A.
  • In addition to turning on the automatic scale selection so that the first scale image is selected automatically, it is also possible for the user to select either the circular first scale image I1 or the grid-shaped first scale image I1' by an input from the input device 30.
  • a viewpoint setting unit or a viewpoint setting device that virtually sets the viewpoint position Pv of the objective optical system 10 illustrated in FIG. 6A may be provided in the input device 30.
  • a viewpoint setting unit 30b provided in the input device 30 is indicated by a dotted line.
  • In this case, the CPU 21 has the function of a virtual endoscopic image generation unit that generates, from the viewpoint position Pv set by the viewpoint setting unit 30b and using the optical characteristics of the objective optical system 10, an optical image and a virtual endoscopic image corresponding to that optical image.
  • In the embodiment described above, the endoscope 2 uses the imaging unit 12 including the single objective optical system 10 and the imaging element 11 as shown in FIG. 2 and the like, but a 3D endoscope 42 may also be used. In the 3D endoscope 42, left and right objective optical systems 10a and 10b are arranged at a predetermined interval in the left-right direction at the distal end portion 2d of the insertion portion 2b, corresponding left and right imaging elements are arranged behind them, and left and right imaging units 12a and 12b are thereby configured.
  • the light guide 9 and the magnetic sensor 13 are provided as in the case of the endoscope 2 indicated by the solid line.
  • the processor 4 includes an image generation circuit that performs signal processing on the left and right imaging units 12a and 12b.
  • In this case, the perspective relation calculation circuit 21a calculates the distance to each point on the inner surface of the bladder B corresponding to each point in the left and right endoscopic images, using the viewpoint positions of the left and right objective optical systems 10a and 10b and the information on the line-of-sight direction corresponding to each viewpoint position. Further, in the above-described embodiment, the example in which the image processing apparatus 5 constructs the three-dimensional image information based on the three-dimensional image acquired from the CT apparatus has been described; however, the three-dimensional image information may also be constructed as follows.
  • For example, the 3D shape data constructing unit 42a may estimate a 3D shape corresponding to a single 2D image using the method described in Japanese Patent No. 5354494 or the well-known Shape from Shading method.
  • a stereo method using two or more images, a three-dimensional shape estimation method using monocular movement vision, a SLAM method, or a method of estimating a 3D shape in combination with a position sensor may be used.
  • the 3D shape data may be constructed with reference to 3D image data acquired from a tomographic image acquisition apparatus such as an external CT apparatus.
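As one hedged example of the stereo method mentioned above (a textbook relation, not the embodiment's own algorithm), the depth of a surface point can be recovered from the disparity between rectified left and right endoscopic images; the function name and the numeric values below are illustrative assumptions.

```python
def depth_from_disparity(f_px, baseline_mm, disparity_px):
    """Classical rectified-stereo relation Z = f * B / d, where f is the focal
    length in pixels, B the distance between the left and right objective optical
    systems, and d the horizontal shift of the same surface point between the two
    images."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return f_px * baseline_mm / disparity_px

# usage sketch with made-up values: f = 500 px, baseline = 4 mm, disparity = 3 px
print(depth_from_disparity(500.0, 4.0, 3.0))  # about 667 mm (illustrative only)
```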

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Astronomy & Astrophysics (AREA)
  • General Physics & Mathematics (AREA)
  • Endoscopes (AREA)
  • Instruments For Viewing The Inside Of Hollow Bodies (AREA)

Abstract

The invention relates to an endoscope system comprising: a subject image generation unit for generating a subject image in which a subject is viewed from a prescribed viewpoint; a scale image generation unit for generating first and second scale images for reflecting lengths in different directions on the subject, using the same scale and different scales, respectively, in two mutually different directions on the subject image; and an image processing unit for generating a superimposed image in which the first or second scale image is selectively superimposed on the subject image.
PCT/JP2015/075327 2014-09-11 2015-09-07 Système d'endoscope WO2016039292A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2016530258A JPWO2016039292A1 (ja) 2014-09-11 2015-09-07 内視鏡システム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-185459 2014-09-11
JP2014185459 2014-09-11

Publications (1)

Publication Number Publication Date
WO2016039292A1 true WO2016039292A1 (fr) 2016-03-17

Family

ID=55459040

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/075327 WO2016039292A1 (fr) 2014-09-11 2015-09-07 Système d'endoscope

Country Status (2)

Country Link
JP (1) JPWO2016039292A1 (fr)
WO (1) WO2016039292A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002156212A (ja) * 2000-11-21 2002-05-31 Olympus Optical Co Ltd 目盛表示装置及びその方法
JP2005118107A (ja) * 2003-10-14 2005-05-12 Pentax Corp 画像表示システム
JP2010102113A (ja) * 2008-10-23 2010-05-06 Olympus Corp 画像処理装置、内視鏡装置、内視鏡システム、およびプログラム

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002156212A (ja) * 2000-11-21 2002-05-31 Olympus Optical Co Ltd 目盛表示装置及びその方法
JP2005118107A (ja) * 2003-10-14 2005-05-12 Pentax Corp 画像表示システム
JP2010102113A (ja) * 2008-10-23 2010-05-06 Olympus Corp 画像処理装置、内視鏡装置、内視鏡システム、およびプログラム

Also Published As

Publication number Publication date
JPWO2016039292A1 (ja) 2017-04-27

Similar Documents

Publication Publication Date Title
US9516993B2 (en) Endoscope system
US9326660B2 (en) Endoscope system with insertion support apparatus
JP5676058B1 (ja) 内視鏡システム及び内視鏡システムの作動方法
US9357945B2 (en) Endoscope system having a position and posture calculating portion
US9662042B2 (en) Endoscope system for presenting three-dimensional model image with insertion form image and image pickup image
JP6242569B2 (ja) 医用画像表示装置及びx線診断装置
JP5977900B2 (ja) 画像処理装置
JP5750669B2 (ja) 内視鏡システム
JP6141559B1 (ja) 医療装置、医療画像生成方法及び医療画像生成プログラム
JP6022133B2 (ja) 医療装置
JPWO2014156378A1 (ja) 内視鏡システム
JP3707830B2 (ja) 術式支援用画像表示装置
WO2017212725A1 (fr) Système d'observation médicale
WO2016076262A1 (fr) Dispositif médical
US20240057847A1 (en) Endoscope system, lumen structure calculation system, and method for creating lumen structure information
JP2017205343A (ja) 内視鏡装置、内視鏡装置の作動方法
JP5613353B2 (ja) 医療装置
JP7441934B2 (ja) 処理装置、内視鏡システム及び処理装置の作動方法
US10694929B2 (en) Medical equipment system and operation method of medical equipment system
WO2016039292A1 (fr) Système d'endoscope
US20210052146A1 (en) Systems and methods for selectively varying resolutions
JP6335839B2 (ja) 医療装置、医療画像生成方法及び医療画像生成プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15840553

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2016530258

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15840553

Country of ref document: EP

Kind code of ref document: A1