WO2013141155A1 - Image complementation system for an occluded area in an image, image processing device, and program therefor - Google Patents
- Publication number
- WO2013141155A1 (PCT/JP2013/057392)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- complementary
- main
- imaging device
- screen
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
- A61B1/0005—Display arrangement combining images e.g. side-by-side, superimposed or tiled
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/012—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor characterised by internal passages or accessories therefor
- A61B1/018—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor characterised by internal passages or accessories therefor for receiving instruments
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
- G02B23/2407—Optical details
- G02B23/2415—Stereoscopic endoscopes
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
- G02B23/2476—Non-optical details, e.g. housings, mountings, supports
- G02B23/2484—Arrangements in relation to a camera or imaging device
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/313—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for introducing through surgical openings, e.g. laparoscopes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
- A61B2090/365—Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
- G06T2207/10021—Stereoscopic video; Stereoscopic image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10048—Infrared image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
- G06T2207/30208—Marker matrix
Definitions
- the present invention relates to an image complementation system, an image processing apparatus, and a program for an occluded area in an image. More specifically, it relates to an image complementation system, an image processing apparatus, and a program in which image information of a target space, captured in a state where part of it is occluded by a predetermined member, is complemented with other image information captured from a different direction.
- the field of view becomes narrower, making it more difficult to accurately grasp the space near the affected area. For this reason, if a dangerous part that the surgical instrument must not touch, such as a blood vessel or a nerve, lies within the shielded area, the surgical instrument may unexpectedly come into contact with it, which may cause an accident such as bleeding.
- Patent Literature 1 discloses a surgery support device that processes three-dimensional image data captured by an MRI (Magnetic Resonance Image Diagnosis Device) or the like and superimposes it on an endoscopic image.
- in this surgery support apparatus, segmentation image data is created by extracting a specific region from a three-dimensional image, a projection process is performed on the segmentation image data to create a surgery support image, and the surgery support image is superimposed on the endoscopic video.
- Patent Document 2 discloses an image processing apparatus that performs registration between a stereoscopic endoscope image captured during surgery and a three-dimensional image obtained from image data captured before surgery by MRI or the like, and that synthesizes and displays the images.
- in this image processing apparatus, when a part of the left or right image of the stereoscopic endoscope image is obstructed by the surgical instrument, feature points of the tissue on the back side of the surgical instrument are geometrically restored, and the three-dimensional position of that tissue is thereby grasped.
- JP 2007-7041 A Japanese Patent Laid-Open No. 11-309
- however, when the internal space captured in the endoscopic image changes, for example when an organ shown in the endoscopic image moves during the operation, the correspondence can no longer be established between the endoscopic image acquired in real time and the three-dimensional image, acquired before the operation by MRI or the like, that represents the past state of the body space, and the 3D image cannot be superimposed on the endoscopic image.
- the present invention has been devised in view of this problem, and its object is to provide an image complementation system, an image processing apparatus, and a program for an occluded area in an image that can, without complicated operation, complement the image information of the target space in a shielded area that is partially shielded by a predetermined member in an image of a predetermined target space.
- to this end, the present invention mainly comprises: a main imaging device that acquires a main image in which a target space to be monitored is imaged; a complementary imaging device that acquires a complementary image for complementing the main image by imaging the target space from a line-of-sight direction different from that of the main imaging device; a distance measuring device that measures the separation distance between each of at least three set points set in the target space and a predetermined reference point; a three-dimensional position measuring device that measures the three-dimensional positions of the main imaging device and the complementary imaging device; and an image processing device that complements a part of the main image with the complementary image on the basis of the measurement results of the distance measuring device and the three-dimensional position measuring device.
- when a member having a known shape is captured in the main image together with the target space, the image processing device acquires from the complementary image the image information of the occluded area of the target space hidden behind the member, and replaces or superimposes the acquired image information on the image information of the member in the main image, generating a composite image in which the occluded area is complemented with the complementary image.
- in this way, the portion hidden behind the member is supplemented in real time with the image information of the complementary image, and a composite image can be obtained in real time as if the member in the main image were transparent.
- since the occluded area caused by the member is eliminated by image processing, the loss of visual field in the main image due to the occluded area is remedied, and the visual field can be substantially expanded.
- FIG. 1 is a schematic system configuration diagram of an image complementation system according to the present embodiment.
- FIGS. 8A to 8F are diagrams illustrating images for explaining a procedure for acquiring a composite image of an operative field image V1 and a complementary image V2.
- FIG. (A) shows an image for explaining the in-screen movement of the set points P_i, and FIG. (B) shows an enlarged image of a portion of (A) for explaining the in-screen movement of a non-fixed point P_p.
- FIG. 1 shows a schematic system configuration diagram of an image complementation system for an occluded area in an image according to the present embodiment.
- an image complementation system 10 according to the present embodiment is a system that complements endoscopic images in endoscopic surgery, in which an operation is performed by manipulating, from outside the body, a treatment section S1 such as a scalpel or forceps attached to the distal end of a surgical instrument S.
- this image complementation system 10 comprises: an imaging device 11 that captures the target space, i.e. the body space to be monitored, containing the organ K that includes the affected area to be treated and its peripheral portion; a distance measuring device 12 that measures the separation distance between each set point virtually set on objects in the target space and a predetermined reference point; a three-dimensional position measuring device 13 that measures the three-dimensional position of the imaging device 11; and an image processing device 14 that processes the images obtained by the imaging device 11.
- the imaging device 11 consists of a monocular operative field endoscope 16 (main imaging device) that acquires the operative field image V1 (see FIG. 2A), the main image constituting the endoscopic image of the treatment region viewed by the operator during surgery, and a monocular complementary endoscope 17 (complementary imaging device) that acquires the complementary image V2 (see FIG. 2B) for complementing the operative field image.
- the surgical field endoscope 16 is configured to image a desired target space in accordance with an operator's instruction or operation.
- the complementary endoscope 17 can image the target space from a line-of-sight direction different from that of the operative field endoscope 16, and may be movable integrally with the operative field endoscope 16, following its movement.
- the complementary endoscope 17 is arranged so that it can image the areas of the target space on the depth side that are hidden in the operative field image V1 by the surgical instrument S present in the image captured by the operative field endoscope 16.
- as the distance measuring device 12, for example, the known structure disclosed in Japanese Patent Application Laid-Open No. 2010-220787 is adopted, comprising a stereo camera 19 capable of acquiring stereo images, and distance measuring means 20 that searches for corresponding points between the pair of stereo images captured by the stereo camera 19 and determines the distance of each corresponding point from the tip of the stereo camera 19 using a stereo matching method.
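The stereo matching step described above can be illustrated with the standard triangulation relation z = f·B/d, where d is the disparity of a matched corresponding point. This is only a minimal sketch of the principle, not the patent's implementation; the focal length, baseline, and disparity values below are hypothetical.

```python
def depth_from_disparity(f_px: float, baseline_mm: float, disparity_px: float) -> float:
    """Distance of a matched corresponding point from the stereo camera tip,
    by triangulation: z = f * B / d.

    f_px: focal length in pixels; baseline_mm: separation of the two lenses;
    disparity_px: horizontal shift of the corresponding point between the
    left and right stereo images.
    """
    if disparity_px <= 0:
        raise ValueError("corresponding point must have positive disparity")
    return f_px * baseline_mm / disparity_px

# Example: 800 px focal length, 5 mm baseline, 40 px disparity -> 100 mm.
z = depth_from_disparity(800.0, 5.0, 40.0)
```

Repeating this for every matched point yields, for each set point P, its distance from the camera tip and hence its three-dimensional coordinates in the stereo camera coordinate system.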
- the detailed structure and algorithm of the distance measuring device 12 employ well-known techniques and are not an essential part of the present invention.
- the stereo camera 19 is provided integrally with the complementary endoscope 17, and can acquire stereo images of most of the space shown by the complementary endoscope 17.
- for each set point P, the distance from the tip of the stereo camera 19 is obtained, and its three-dimensional coordinates (three-dimensional position) in the stereo camera coordinate system, whose origin is a predetermined point of the stereo camera 19, are specified (detected).
- the set points P are not particularly limited, as long as at least three of them exist; in practice, many are set at predetermined vertical and horizontal intervals on the screen over the objects shown by the complementary endoscope 17. Either camera of the stereo camera 19 may also serve as the complementary endoscope 17.
- the three-dimensional position measurement device 13 includes markers 22 that are attached to at least three positions on a member that is a position measurement target, and a main body 23 that includes a light receiving unit 23A that receives infrared light emitted from each marker 22, A known structure that can detect the three-dimensional position of each marker 22 by tracking the infrared rays in accordance with the movement of the marker 22 is employed. Since this structure is not an essential part of the present invention, detailed description thereof is omitted. In addition, as the three-dimensional position measuring device 13, as long as it can detect the three-dimensional position of a member to be position-measured, devices having various principles and structures can be used alternatively.
- the marker 22 is attached to each of the surgical instrument S, the surgical field endoscope 16 and the complementary endoscope 17 at the rear end portion located outside the body at the time of surgery.
- the main body 23 measures the three-dimensional coordinates (position) of the reference coordinate system having a predetermined point as the origin.
- since the constituent parts of each instrument do not move relative to its rear end portion, their three-dimensional coordinates are obtained by calculation in the main body 23 from the three-dimensional coordinates of the rear end portion. If the operative field endoscope 16 and the complementary endoscope 17 are integrated so as not to be relatively movable, the marker 22 may be provided on only one of them.
- since the complementary endoscope 17 and the stereo camera 19 of the distance measuring device 12 are provided integrally so as not to be relatively movable, once the three-dimensional position measuring device 13 measures the complementary endoscope 17, the position of each component of the stereo camera 19 is also automatically specified.
- thus, the three-dimensional coordinates of each set point P in the stereo camera coordinate system obtained by the distance measuring device 12 can be converted, from the measurement result of the three-dimensional position measuring device 13, into three-dimensional coordinates in the reference coordinate system.
- for this purpose, the marker 22 is also attached to the rear end portion of the stereo camera 19.
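The conversion from stereo camera coordinates to the reference coordinate system is a rigid transform. The sketch below assumes the three-dimensional position measuring device yields the camera's orientation R and origin t in the reference frame (these names and the numeric pose are hypothetical illustrations, not values from the patent):

```python
import numpy as np

def camera_to_reference(p_cam: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Map a set point from stereo-camera coordinates into the reference
    coordinate system, given the camera pose (rotation R, origin t)
    measured via the markers 22 by the position measuring device 13."""
    return R @ p_cam + t

# A camera sitting at (10, 0, 0) in the reference frame with aligned axes:
R = np.eye(3)
t = np.array([10.0, 0.0, 0.0])
p_ref = camera_to_reference(np.array([0.0, 0.0, 50.0]), R, t)  # -> [10, 0, 50]
```

Because the stereo camera and the complementary endoscope are rigidly joined, one measured pose serves for both devices.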
- the image processing device 14 is configured by a computer including an arithmetic processing device such as a CPU and a storage device such as a memory and a hard disk, and a program for causing the computer to function as the following units is installed.
- the image processing apparatus 14 acquires, from the complementary image V2, image information for the shielded area of the target space hidden on the depth side of the surgical instrument S shown in the operative field image V1, and replaces or superimposes the acquired image information on the image information of the shielded area of the operative field image V1, thereby generating a composite image in which the shielded area is complemented with the complementary image.
- the image processing device 14 comprises: set point position specifying means 25, which specifies, from the measurement results of the distance measuring device 12 and the three-dimensional position measuring device 13, the three-dimensional coordinates (three-dimensional position) of each set point P in the reference coordinate system, and determines its in-screen coordinates (two-dimensional coordinates) in the screen coordinate system of the operative field image V1 and in the screen coordinate system of the complementary image V2; complementary image deformation means 26, which, based on the in-screen coordinates obtained by the means 25, moves the image information of each point (pixel) of the complementary image within the screen so as to convert the complementary image V2 into the line-of-sight direction of the operative field endoscope 16, generating a deformed image V3 (see FIG. 2(D)); occlusion area specifying means 27, which identifies the shielding area occupied in the operative field image V1 by the rod-shaped main body portion S2 on the rear side of the treatment section S1 of the surgical instrument S; and composite image generating means 28, which specifies the corresponding area of the deformed image V3 that corresponds to the occlusion area, and generates a composite image of the operative field image V1 and the complementary image V2 by replacing or superimposing the image information of the corresponding area onto the image information of the occlusion area of the operative field image V1.
- the three-dimensional coordinates of each set point P in the stereo camera coordinate system obtained by the distance measuring device 12 are first converted, from the measurement result of the three-dimensional position measuring device 13, into three-dimensional coordinates in the reference coordinate system (see FIG. 3). Then, the in-screen coordinates (two-dimensional coordinates) of each set point P in the complementary image V2 are obtained by the following well-known mathematical expressions, stored in advance.
- the reference coordinate system which is a three-dimensional coordinate system, is set so that the z-axis direction matches the optical axis direction of the complementary endoscope 17.
- the in-screen coordinates are given by the well-known perspective projection formulas
- u_i = k_u · f · (x_i / z_i) + u_0 … (1)
- v_i = k_v · f · (y_i / z_i) + v_0 … (2)
- (u_i, v_i) are the in-screen coordinates of the set point P_i in the screen coordinate system of the complementary image V2, that is, its two-dimensional coordinates in the screen horizontal and vertical directions.
- f is the focal length of the complementary endoscope 17,
- k_u is the screen resolution in the horizontal direction of the screen of the complementary endoscope 17,
- k_v is the screen resolution in the vertical direction of the screen of the complementary endoscope 17,
- (u_0, v_0) are the coordinates, in the screen horizontal and vertical directions, of the point where the optical axis crosses the image plane of the complementary image V2.
- f, k_u, k_v, u_0, and v_0 are constants that are specified according to the specifications and the arrangement state of the complementary endoscope 17 and stored in advance.
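The projection of a set point into screen coordinates can be sketched directly from the definitions of f, k_u, k_v, u_0, and v_0. The constants below are hypothetical values for illustration only:

```python
def project(x: float, y: float, z: float,
            f: float, k_u: float, k_v: float,
            u0: float, v0: float) -> tuple[float, float]:
    """In-screen coordinates (u, v) of a set point whose 3D coordinates
    (x, y, z) are expressed with the z-axis along the optical axis."""
    u = k_u * f * x / z + u0   # formula (1)
    v = k_v * f * y / z + v0   # formula (2)
    return u, v

# A point 100 mm ahead of the lens and 10 mm right of the optical axis:
u, v = project(10.0, 0.0, 100.0,
               f=5.0, k_u=100.0, k_v=100.0, u0=320.0, v0=240.0)
# u = 100*5*10/100 + 320 = 370.0, v = 240.0
```

The same function, fed with coordinates expressed relative to the operative field endoscope, yields the coordinates (u'_i, v'_i) in the operative field image V1.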
- based on the measurement result of the three-dimensional position measuring device 13, the coordinates (x_i, y_i, z_i) of each set point P_i in the reference coordinate system are converted, from the relative positional relationship between the operative field endoscope 16 and the complementary endoscope 17, into three-dimensional coordinates (x'_i, y'_i, z'_i) based on a predetermined position of the operative field endoscope 16, and then converted, by the same calculation as formulas (1) and (2) above, into in-screen coordinates (u'_i, v'_i) of each set point P_i in the operative field endoscope 16.
- in the complementary image deformation means 26, from the in-screen coordinates (u'_i, v'_i) of each set point P_i in the operative field image V1 and the in-screen coordinates (u_i, v_i) of the same set point P_i in the complementary image V2, the image information of each point in the complementary image V2 is moved within the screen of V2 to the position corresponding to the in-screen coordinates at which the same part appears in the operative field image V1, and a deformed image V3 of the complementary image V2 is generated.
- that is, each set point P_i at in-screen coordinates (u_i, v_i) of the complementary image V2 is moved within V2 so that its coordinates become the same as the in-screen coordinates (u'_i, v'_i) of the corresponding set point P_i in the operative field image V1.
- next, the image information of each non-fixed point P_p (p = 1, 2, ...), i.e. each remaining point other than the set points P_i (white circle points in the figure) in the complementary image V2, shown as black circle points (only one is displayed in the figure), is moved by a weighted average as follows.
- each non-fixed point P_p is represented by a single black circle only to avoid cluttering the drawing; in reality, a non-fixed point exists at every pixel in the screen other than the set points P_i.
- the movement vector T(u_p, v_p) of each non-fixed point P_p is then obtained as the weighted average
- T(u_p, v_p) = Σ_{j=1..N} W_j · T(u_j, v_j) / Σ_{j=1..N} W_j
- where the N set points P_i present in the virtual region T around P_p are denoted P_Tj (j = 1, 2, ..., N), T(u_j, v_j) is the movement vector in the complementary image V2 specified for each set point P_Tj by the procedure described above, and W_j is the aforementioned weighting factor according to the separation distance from the non-fixed point P_p.
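The weighted average can be sketched as follows. The text states only that the weighting factor is inversely proportional to the separation distance, without giving the exact expression, so the choice W_j = 1/distance below is an assumption for illustration:

```python
import numpy as np

def move_vector(p: np.ndarray, set_pts: np.ndarray, set_vecs: np.ndarray) -> np.ndarray:
    """Movement vector T(u_p, v_p) of a non-fixed point p, as the weighted
    average of the movement vectors of nearby set points, with weights
    inversely proportional to the separation distance (assumed form)."""
    d = np.linalg.norm(set_pts - p, axis=1)
    if np.any(d == 0):                 # p coincides with a set point
        return set_vecs[np.argmin(d)]
    w = 1.0 / d                        # W_j: inversely proportional to distance
    return (w[:, None] * set_vecs).sum(axis=0) / w.sum()

# Two set points at equal distance from p, moving by (2, 0) and (0, 2):
pts = np.array([[0.0, 1.0], [0.0, -1.0]])
vecs = np.array([[2.0, 0.0], [0.0, 2.0]])
t = move_vector(np.array([0.0, 0.0]), pts, vecs)  # -> [1.0, 1.0]
```

Applying this at every non-fixed pixel, then moving each pixel by its vector, produces the deformed image V3.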
- each non-fixed point P_p of the complementary image V2 is then moved within the screen of V2 by the movement amount and in the movement direction given by its movement vector T(u_p, v_p).
- in this way, the image information of each set point P_i and each non-fixed point P_p in the complementary image V2 is moved within the same screen, and the deformed image V3, in which the complementary image V2 is converted into the line-of-sight direction of the operative field endoscope 16, is generated.
- although the movement vector T(u_p, v_p) of the image information at each non-fixed point P_p is determined here by a weighted average, it may instead be obtained by another procedure, such as B-spline interpolation from the position information of the set points P_i.
- the shielding area specifying means 27 specifies the position of the shielding area occupied by the main body portion S2 in the operative field image V1 as follows. That is, the three-dimensional position measuring device 13 obtains the three-dimensional coordinates in the reference coordinate system of each part of the surgical instrument S. These three-dimensional coordinates are converted into in-screen coordinates (two-dimensional coordinates) in the screen coordinate system of the operative field image V1, using the same arithmetic expressions as those described for the set point position specifying means 25, and the position of the shielding area in the operative field image V1 is thereby specified.
- the specification of the shielding area is not limited to the method described above; a method may also be employed in which a predetermined color is assigned to the main body portion S2, and the shielding area is specified by identifying the image information of the operative field image V1 based on that color.
- the composite image generating means 28 generates the composite image by performing the following mask processing. That is, first, as shown in FIG. 2E, a mask is generated by extracting the occlusion area specified in the operative field image V1. Then, using the generated mask, the range of in-screen coordinates in the deformed image V3 (FIG. 2(D)) that coincides with the range of in-screen coordinates of the occlusion area in the operative field image V1 is specified as the corresponding area, and the image information of the corresponding area is extracted. Thereafter, the image information of the corresponding area is superimposed on or substituted for the image information of the occlusion area in the operative field image V1, and the composite image is generated.
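A minimal sketch of this mask processing with NumPy; the mask is assumed to have been obtained already from the instrument's projected outline, and V1 and V3 to be aligned images of the same size (the toy pixel values are hypothetical):

```python
import numpy as np

def composite(v1: np.ndarray, v3: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Replace the occlusion area of the operative field image V1 with the
    corresponding area of the deformed image V3 (same shape, aligned
    in-screen coordinates). mask is True where the body portion S2 lies."""
    out = v1.copy()
    out[mask] = v3[mask]               # corresponding-area pixels taken from V3
    return out

# 2x2 toy images; the top-left pixel is occluded by the instrument:
v1 = np.array([[10, 20], [30, 40]])
v3 = np.array([[99, 0], [0, 0]])
mask = np.array([[True, False], [False, False]])
result = composite(v1, v3, mask)       # -> [[99, 20], [30, 40]]
```

Running this per frame yields the real-time composite in which the instrument body appears transparent.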
- the composite image is based on the operative field image V1, with the image information on the depth side of the main body portion S2 supplemented with the complementary image V2 from the complementary endoscope 17, as if the main body portion S2 of the surgical instrument S shown in the operative field image V1 were transparent. Therefore, in the composite image, only the treatment section S1 at the tip of the surgical instrument S shown in the operative field image V1 remains; the internal space can be shown to the operator with everything except the treatment section S1, which is needed during the operation, removed, and the operative field of the endoscopic image can be substantially expanded.
- in the above, the image complementation system 10 that performs image processing of endoscopic images in endoscopic surgery has been illustrated and described, but the present invention is not limited to this.
- it can also be applied, for example, to surgery support robots that assist endoscopic surgery, or to image processing for remote operation of a robot arm or the like in work spaces that humans cannot view directly, such as work inside a nuclear power plant reactor, where images are obtained from imaging devices such as cameras. In these cases, if the above-described surgical instrument S is replaced with another previously specified member such as a robot arm and an algorithm similar to that described above is applied, an image complementing system suited to the application can be obtained.
- the present invention can be industrially used as a system that complements the restriction of visual field by an imaging device that acquires an image in a space that cannot be directly viewed by humans.
Abstract
Description
Next, a weight coefficient Wi is obtained for each of the set points Pi present in the virtual region T. Specifically, for each set point Pi present in the virtual region T, the distance from the non-fixed point Pp is obtained, and the weight coefficient Wi is derived from that distance using a preset arithmetic expression. The weight coefficient Wi is set so as to be inversely proportional to the distance.
Then, the movement vector T(up, vp) for each non-fixed point Pp is obtained by the following expression. Here, the N set points Pi present in the virtual region T are denoted PTj (j = 1, 2, ..., N); for each set point PTj, the movement vector within the complementary image V2 identified by the procedure described above is denoted T(uj, vj); and the above weight coefficient according to the distance from the non-fixed point Pp is denoted Wj.
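The expression itself appeared as an image in the original publication and is not reproduced in this text, but the description corresponds to standard inverse-distance-weighted interpolation. A minimal sketch, assuming the movement vector at each non-fixed point Pp is the weight-normalized average of the set-point movement vectors T(uj, vj):

```python
import math

def idw_displacements(set_points, displacements, query_points, eps=1e-9):
    """Interpolate a movement vector for each non-fixed point as an
    inverse-distance-weighted average of the movement vectors known at the
    set points PTj.

    set_points:   list of (x, y) screen coordinates of the set points.
    displacements: list of (u, v) movement vectors at those set points.
    query_points: list of (x, y) non-fixed points Pp to interpolate at.
    """
    result = []
    for (px, py) in query_points:
        wsum = usum = vsum = 0.0
        for (sx, sy), (u, v) in zip(set_points, displacements):
            d = math.hypot(px - sx, py - sy)
            w = 1.0 / (d + eps)  # weight inversely proportional to distance
            wsum += w
            usum += w * u
            vsum += w * v
        result.append((usum / wsum, vsum / wsum))
    return result
```

The `eps` guard against a zero distance and the exact weighting formula are assumptions; the patent only states that the weight decreases in inverse proportion to the distance.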
11 Imaging device
12 Distance-measuring device
13 Three-dimensional position measuring device
14 Image processing device
16 Operative-field endoscope (main imaging device)
17 Complementary endoscope (complementary imaging device)
25 In-image position calculating means
26 Complementary-image deformation means
27 Occluded-region identifying means
28 Composite-image generating means
P Set point
S Surgical instrument (member)
V1 Operative field image (main image)
V2 Complementary image
Claims (7)
- An image completion system for an occluded region in an image, comprising:
a main imaging device that acquires a main image of a target space to be monitored;
a complementary imaging device that acquires a complementary image for complementing the main image by imaging the target space from a line-of-sight direction different from that of the main imaging device; and
an image processing device that complements part of the main image with the complementary image on the basis of the three-dimensional positions of at least three set points established in the target space and the respective three-dimensional positions of the main imaging device and the complementary imaging device,
wherein the main imaging device and the complementary imaging device are provided so as to be able to image the target space substantially simultaneously while each moves, and
wherein, when a member having a known shape is captured in the main image together with the target space, the image processing device acquires from the complementary image, on the basis of the sequentially detected information on each of the three-dimensional positions, image information of the occluded region of the target space hidden behind the member, and substitutes the acquired image information for, or superimposes it on, the image information of the member in the main image, thereby generating a composite image in which the occluded region is complemented with the complementary image.
- The image completion system for an occluded region in an image according to claim 1, wherein the image processing device comprises:
set-point position identifying means for identifying, from the detection results of the three-dimensional positions, the in-screen coordinates of each set point in the screen coordinate system of the main image and in the screen coordinate system of the complementary image;
complementary-image deformation means for generating, on the basis of each of the in-screen coordinates, a deformed image in which the image information of each point of the complementary image is moved within the screen so as to convert the complementary image to the line-of-sight direction of the main imaging device;
occluded-region identifying means for identifying the position of the occluded region within the main image; and
composite-image generating means for generating the composite image by substituting the image information of the occluded region with, or superimposing it on, the image information of the corresponding region in the deformed image.
- An image completion system for an occluded region in an image, comprising:
a main imaging device that acquires a main image of a target space to be monitored;
a complementary imaging device that acquires a complementary image for complementing the main image by imaging the target space from a line-of-sight direction different from that of the main imaging device;
a distance-measuring device that measures the distance between each of at least three set points established in the target space and a predetermined reference point;
a three-dimensional position measuring device that measures the three-dimensional positions of the main imaging device and the complementary imaging device; and
an image processing device that complements part of the main image with the complementary image on the basis of the measurement results from the distance-measuring device and the three-dimensional position measuring device,
wherein, when a member having a known shape is captured in the main image together with the target space, the image processing device acquires from the complementary image image information of the occluded region of the target space hidden behind the member, and substitutes the acquired image information for, or superimposes it on, the image information of the member in the main image, thereby generating a composite image in which the occluded region is complemented with the complementary image.
- The image completion system for an occluded region in an image according to claim 3, wherein the image processing device comprises:
set-point position identifying means for identifying, from the measurement results of the distance-measuring device and the three-dimensional position measuring device, the in-screen coordinates of each set point in the screen coordinate system of the main image and in the screen coordinate system of the complementary image;
complementary-image deformation means for generating, on the basis of each of the in-screen coordinates, a deformed image in which the image information of each point of the complementary image is moved within the screen so as to convert the complementary image to the line-of-sight direction of the main imaging device;
occluded-region identifying means for identifying the position of the occluded region within the main image; and
composite-image generating means for generating the composite image by substituting the image information of the occluded region with, or superimposing it on, the image information of the corresponding region in the deformed image.
- The image completion system for an occluded region in an image according to claim 2 or 4, wherein the complementary-image deformation means generates the deformed image by moving the image information of the complementary image so that each set point in the complementary image coincides with the in-screen coordinates of the same set point present in the main image.
- An image processing device that performs processing to complement, with image information of a complementary image, image information of an occluded region in a main image that is occluded by at least part of a member having a known shape when the member is captured in the main image together with a target space, by combining the main image of the target space to be monitored, acquired by a main imaging device, with the complementary image of the target space captured by a complementary imaging device substantially simultaneously from a line-of-sight direction different from that of the main image, the image processing device comprising:
set-point position identifying means for identifying, for each of at least three set points established in the target space, the in-screen coordinates in the screen coordinate system of the main image and in the screen coordinate system of the complementary image from the three-dimensional position of each set point and the respective three-dimensional positions of the main imaging device and the complementary imaging device;
complementary-image deformation means for generating, on the basis of each of the in-screen coordinates, a deformed image in which the image information of each point of the complementary image is moved within the screen so as to convert the complementary image to the line-of-sight direction of the main image;
occluded-region identifying means for identifying the position of the occluded region within the main image; and
composite-image generating means for generating a composite image in which the occluded region of the main image is complemented with the complementary image by substituting the image information of the occluded region with, or superimposing it on, the image information of the corresponding region in the deformed image.
- A program for an image processing device including a computer that performs processing to complement, with image information of a complementary image, image information of an occluded region in a main image that is occluded by at least part of a member having a known shape when the member is captured in the main image together with a target space, by combining the main image of the target space to be monitored, acquired by a main imaging device, with the complementary image of the target space captured by a complementary imaging device substantially simultaneously from a line-of-sight direction different from that of the main image, the program causing the computer to function as:
set-point position identifying means for calculating, for each of at least three set points established in the target space, the in-screen coordinates of each set point within the main image and within the complementary image from the three-dimensional position of each set point and the respective three-dimensional positions of the main imaging device and the complementary imaging device;
complementary-image deformation means for generating, on the basis of each of the in-screen coordinates, a deformed image in which the image information of each point of the complementary image is moved within the screen so as to convert the complementary image to the line-of-sight direction of the main image;
occluded-region identifying means for identifying the position of the occluded region within the main image; and
composite-image generating means for generating a composite image in which the occluded region of the main image is complemented with the complementary image by substituting the image information of the occluded region with, or superimposing it on, the image information of the corresponding region in the deformed image.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014506196A JP6083103B2 (ja) | 2012-03-17 | 2013-03-15 | 画像内遮蔽領域の画像補完システム、画像処理装置及びそのプログラム |
EP13764231.0A EP2829218B1 (en) | 2012-03-17 | 2013-03-15 | Image completion system for in-image cutoff region, image processing device, and program therefor |
US14/383,776 US20150145953A1 (en) | 2012-03-17 | 2013-03-15 | Image completion system for in-image cutoff region, image processing device, and program therefor |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-061285 | 2012-03-17 | ||
JP2012061285 | 2012-03-17 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013141155A1 true WO2013141155A1 (ja) | 2013-09-26 |
Family
ID=49222614
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/057392 WO2013141155A1 (ja) | 2012-03-17 | 2013-03-15 | 画像内遮蔽領域の画像補完システム、画像処理装置及びそのプログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US20150145953A1 (ja) |
EP (1) | EP2829218B1 (ja) |
JP (1) | JP6083103B2 (ja) |
WO (1) | WO2013141155A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6150968B1 (ja) * | 2016-02-10 | 2017-06-21 | オリンパス株式会社 | 内視鏡システム |
WO2017138208A1 (ja) * | 2016-02-10 | 2017-08-17 | オリンパス株式会社 | 内視鏡システム |
EP3005937B1 (de) * | 2014-09-25 | 2018-02-28 | Carl Zeiss Meditec AG | Verfahren zur korrektur eines oct-bildes und mikroskop |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10231791B2 (en) * | 2012-06-21 | 2019-03-19 | Globus Medical, Inc. | Infrared signal based position recognition system for use with a robot-assisted surgery |
JP2015159955A (ja) * | 2014-02-27 | 2015-09-07 | オリンパス株式会社 | 手術システムおよび医療器具の干渉回避方法 |
CN107529976B (zh) * | 2015-09-18 | 2019-10-29 | 奥林巴斯株式会社 | 第一信号处理装置和内窥镜系统 |
US10058393B2 (en) | 2015-10-21 | 2018-08-28 | P Tech, Llc | Systems and methods for navigation and visualization |
US10918445B2 (en) * | 2016-12-19 | 2021-02-16 | Ethicon Llc | Surgical system with augmented reality display |
JP6987513B2 (ja) * | 2017-03-07 | 2022-01-05 | ソニー・オリンパスメディカルソリューションズ株式会社 | 内視鏡装置 |
CN107622497B (zh) * | 2017-09-29 | 2020-03-27 | Oppo广东移动通信有限公司 | 图像裁剪方法、装置、计算机可读存储介质和计算机设备 |
JP7094742B2 (ja) * | 2018-03-20 | 2022-07-04 | ソニー・オリンパスメディカルソリューションズ株式会社 | 三次元内視鏡システム |
WO2020044523A1 (ja) * | 2018-08-30 | 2020-03-05 | オリンパス株式会社 | 記録装置、画像観察装置、観察システム、観察システムの制御方法、及び観察システムの作動プログラム |
US11832883B2 (en) | 2020-04-23 | 2023-12-05 | Johnson & Johnson Surgical Vision, Inc. | Using real-time images for augmented-reality visualization of an ophthalmology surgical tool |
US20210330394A1 (en) * | 2020-04-23 | 2021-10-28 | Johnson & Johnson Surgical Vision, Inc. | Augmented-reality visualization of an ophthalmic surgical tool |
US20230107857A1 (en) | 2021-09-29 | 2023-04-06 | Cilag Gmbh International | Surgical sealing devices for a natural body orifice |
US20230100698A1 (en) * | 2021-09-29 | 2023-03-30 | Cilag Gmbh International | Methods for Controlling Cooperative Surgical Instruments |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11309A (ja) | 1997-06-12 | 1999-01-06 | Hitachi Ltd | 画像処理装置 |
JP2005021353A (ja) * | 2003-07-01 | 2005-01-27 | Olympus Corp | 手術支援装置 |
JP2006198032A (ja) * | 2005-01-18 | 2006-08-03 | Olympus Corp | 手術支援システム |
JP2007007041A (ja) | 2005-06-29 | 2007-01-18 | Hitachi Medical Corp | 手術支援装置 |
JP2007152027A (ja) * | 2005-12-08 | 2007-06-21 | Univ Waseda | 内視鏡視野拡張システム、内視鏡視野拡張装置及び内視鏡視野拡張用プログラム |
JP2010220787A (ja) | 2009-03-24 | 2010-10-07 | Waseda Univ | 測距装置及びそのプログラム、並びに測距システム |
WO2011114731A1 (ja) * | 2010-03-17 | 2011-09-22 | 富士フイルム株式会社 | 内視鏡観察を支援するシステムおよび方法、並びに、装置およびプログラム |
WO2011118287A1 (ja) * | 2010-03-24 | 2011-09-29 | オリンパス株式会社 | 内視鏡装置 |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4061135A (en) * | 1976-09-27 | 1977-12-06 | Jerrold Widran | Binocular endoscope |
US4386602A (en) * | 1977-05-17 | 1983-06-07 | Sheldon Charles H | Intracranial surgical operative apparatus |
US4528587A (en) * | 1982-10-28 | 1985-07-09 | Cjm Associates | Three-dimensional video apparatus and methods using composite and mixed images |
US4834518A (en) * | 1983-05-13 | 1989-05-30 | Barber Forest C | Instrument for visual observation utilizing fiber optics |
US4651201A (en) * | 1984-06-01 | 1987-03-17 | Arnold Schoolman | Stereoscopic endoscope arrangement |
US6059718A (en) * | 1993-10-18 | 2000-05-09 | Olympus Optical Co., Ltd. | Endoscope form detecting apparatus in which coil is fixedly mounted by insulating member so that form is not deformed within endoscope |
US5647838A (en) * | 1994-05-10 | 1997-07-15 | Bloomer; William E. | Camera fixture for stereoscopic imagery and method of using same |
US6139490A (en) * | 1996-02-22 | 2000-10-31 | Precision Optics Corporation | Stereoscopic endoscope with virtual reality viewing |
US6167296A (en) * | 1996-06-28 | 2000-12-26 | The Board Of Trustees Of The Leland Stanford Junior University | Method for volumetric image navigation |
JP4170042B2 (ja) * | 2002-08-09 | 2008-10-22 | フジノン株式会社 | 立体電子内視鏡装置 |
US9037215B2 (en) * | 2007-01-31 | 2015-05-19 | The Penn State Research Foundation | Methods and apparatus for 3D route planning through hollow organs |
US7791009B2 (en) * | 2007-11-27 | 2010-09-07 | University Of Washington | Eliminating illumination crosstalk while using multiple imaging devices with plural scanning devices, each coupled to an optical fiber |
EP2425761B1 (en) * | 2010-05-10 | 2015-12-30 | Olympus Corporation | Medical device |
JP5701140B2 (ja) * | 2011-04-21 | 2015-04-15 | キヤノン株式会社 | 立体内視鏡装置 |
- 2013-03-15 EP EP13764231.0A patent/EP2829218B1/en not_active Not-in-force
- 2013-03-15 JP JP2014506196A patent/JP6083103B2/ja not_active Expired - Fee Related
- 2013-03-15 US US14/383,776 patent/US20150145953A1/en not_active Abandoned
- 2013-03-15 WO PCT/JP2013/057392 patent/WO2013141155A1/ja active Application Filing
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11309A (ja) | 1997-06-12 | 1999-01-06 | Hitachi Ltd | 画像処理装置 |
JP2005021353A (ja) * | 2003-07-01 | 2005-01-27 | Olympus Corp | 手術支援装置 |
JP2006198032A (ja) * | 2005-01-18 | 2006-08-03 | Olympus Corp | 手術支援システム |
JP2007007041A (ja) | 2005-06-29 | 2007-01-18 | Hitachi Medical Corp | 手術支援装置 |
JP2007152027A (ja) * | 2005-12-08 | 2007-06-21 | Univ Waseda | 内視鏡視野拡張システム、内視鏡視野拡張装置及び内視鏡視野拡張用プログラム |
JP2010220787A (ja) | 2009-03-24 | 2010-10-07 | Waseda Univ | 測距装置及びそのプログラム、並びに測距システム |
WO2011114731A1 (ja) * | 2010-03-17 | 2011-09-22 | 富士フイルム株式会社 | 内視鏡観察を支援するシステムおよび方法、並びに、装置およびプログラム |
WO2011118287A1 (ja) * | 2010-03-24 | 2011-09-29 | オリンパス株式会社 | 内視鏡装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2829218A4 |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3005937B1 (de) * | 2014-09-25 | 2018-02-28 | Carl Zeiss Meditec AG | Verfahren zur korrektur eines oct-bildes und mikroskop |
JP6150968B1 (ja) * | 2016-02-10 | 2017-06-21 | オリンパス株式会社 | 内視鏡システム |
WO2017138208A1 (ja) * | 2016-02-10 | 2017-08-17 | オリンパス株式会社 | 内視鏡システム |
CN108348134A (zh) * | 2016-02-10 | 2018-07-31 | 奥林巴斯株式会社 | 内窥镜系统 |
US10638915B2 (en) | 2016-02-10 | 2020-05-05 | Olympus Corporation | System for moving first insertable instrument and second insertable instrument, controller, and computer-readable storage device |
Also Published As
Publication number | Publication date |
---|---|
EP2829218A1 (en) | 2015-01-28 |
EP2829218A4 (en) | 2015-12-09 |
US20150145953A1 (en) | 2015-05-28 |
JP6083103B2 (ja) | 2017-02-22 |
EP2829218B1 (en) | 2017-05-03 |
JPWO2013141155A1 (ja) | 2015-08-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6083103B2 (ja) | 画像内遮蔽領域の画像補完システム、画像処理装置及びそのプログラム | |
CA3013128C (en) | Methods and systems for updating an existing landmark registration | |
US11116383B2 (en) | Articulated structured light based-laparoscope | |
JP5380348B2 (ja) | 内視鏡観察を支援するシステムおよび方法、並びに、装置およびプログラム | |
JP4152402B2 (ja) | 手術支援装置 | |
US8147503B2 (en) | Methods of locating and tracking robotic instruments in robotic surgical systems | |
US8108072B2 (en) | Methods and systems for robotic instrument tool tracking with adaptive fusion of kinematics information and image information | |
US20150287236A1 (en) | Imaging system, operating device with the imaging system and method for imaging | |
JPH11309A (ja) | 画像処理装置 | |
KR20140115575A (ko) | 수술 로봇 시스템 및 그 제어 방법 | |
JP2002102249A (ja) | 手術ナビゲーション装置および手術ナビゲーション方法 | |
JP2011212301A (ja) | 投影画像生成装置および方法、並びにプログラム | |
KR101993384B1 (ko) | 환자의 자세 변화에 따른 의료 영상을 보정하는 방법, 장치 및 시스템 | |
KR20180116090A (ko) | 의료용 네비게이션 시스템 및 그 방법 | |
EP3666166B1 (en) | System and method for generating a three-dimensional model of a surgical site | |
JP6476125B2 (ja) | 画像処理装置、及び手術顕微鏡システム | |
JP4187830B2 (ja) | 医用画像合成装置 | |
Field et al. | Stereo endoscopy as a 3-D measurement tool | |
US20220022964A1 (en) | System for displaying an augmented reality and method for generating an augmented reality | |
JP2014104328A (ja) | 手術支援システム、手術支援方法及び手術支援プログラム | |
JP5283015B2 (ja) | 測距装置及びそのプログラム、並びに測距システム | |
JP7407831B2 (ja) | 介入装置追跡 | |
Mikamo et al. | 3D endoscope system with AR display superimposing dense and wide-angle-of-view 3D points obtained by using micro pattern projector | |
CN111991080A (zh) | 一种手术入口的确定方法和系统 | |
Ciucci et al. | The NEAR project: Active endoscopes in the operating room |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13764231 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2014506196 Country of ref document: JP Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 14383776 Country of ref document: US |
REEP | Request for entry into the european phase |
Ref document number: 2013764231 Country of ref document: EP |
WWE | Wipo information: entry into national phase |
Ref document number: 2013764231 Country of ref document: EP |
NENP | Non-entry into the national phase |
Ref country code: DE |