WO2007139187A1 - Three-dimensional image construction apparatus, method, and program - Google Patents
Three-dimensional image construction apparatus, method, and program
- Publication number
- WO2007139187A1 (PCT/JP2007/061100)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- tubular structure
- luminance information
- dimensional image
- pixels
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/042—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances characterised by a proximal camera, e.g. a CCD camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/507—Depth or shape recovery from shading
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/107—Measuring physical dimensions, e.g. size of the entire body or parts thereof
- A61B5/1076—Measuring physical dimensions, e.g. size of the entire body or parts thereof for measuring dimensions inside body cavities, e.g. using catheters
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30028—Colon; Small intestine
Definitions
- The present invention relates to the continuous inner surface of a tubular structure, such as a tunnel, a sewer pipe, or a digestive tract or luminal organ of a patient, observed by a camera or endoscope placed in or inserted into the structure.
- More specifically, the present invention relates to a 3D image construction apparatus that constructs a continuous 3D image at high speed based on the relative distance, calculated from a target image or video image, between each point on the inner surface of the tubular structure and the objective lens.
- Conventionally, the region within the field of view of an endoscope inserted into a tubular structure is observed and displayed with the naked eye or via a video camera, and is recorded by photography or video within a recording range specified by the user.
- The observation and display range is thus limited to the observation field of the endoscope, and the whole is recorded by repeating such local recordings. Since the whole cannot be displayed as a single unbroken photograph, such records lack objectivity when locating a region of interest.
- Another method of continuously recording the whole is to record video images, but then the whole cannot be displayed simultaneously, and viewing takes time.
- In addition, the conventional method records only a flat image and therefore cannot capture the three-dimensional structure. Furthermore, it cannot objectively record the hardness and movement of the tissue that constitutes the luminal structure.
- Patent Document 1 discloses an image creation system for creating continuous and seamless expanded still image data of the inner surface of this type of tubular structure.
- This image creation system comprises digital image data capture means, pipe projection conversion means that creates a circumferential development view of the inner surface of the tubular structure for each frame of the captured digital image data, mosaicing processing means, image data compression means, and compressed image data storage means.
- The circumferential development view of each frame of the endoscopic video image is connected along the central axis direction of the tubular structure, and a whole image is constructed.
- Patent Document 2 discloses a method in which the inner surface of a tube is photographed while the posture and position of a camera moving through the tubular structure are monitored, and an entire image is created by connecting a plurality of images matched to the camera's field of view.
- Patent Document 3 discloses a method in which a directed scanning illumination device is attached to the distal end of an endoscope and the shape of the inner surface of the body is measured three-dimensionally by directed illumination.
- In Patent Document 4, an interference-fringe projection unit and a laser-spot projection unit for distance measurement are attached to the distal end of an endoscope, and a method of calculating three-dimensional information of the subject from the interference fringes formed on the subject and the distance measurement information is disclosed.
- Patent Document 5 discloses a method for detecting stereoscopic information in which images are captured with an arbitrary time difference while the amplitude of the illumination light is changed, and the distance to each point is measured based on the brightness of each point and the degree of change in imaging gain.
- In Patent Document 6, a strip-shaped image is created from the 360-degree view around the camera using a convex mirror or fisheye lens mounted in front of a camera moving through the tube, and a method of displaying the whole as a single image by connecting these strips while correcting them for the direction of camera movement is disclosed.
- Patent Document 7 discloses a method of calculating the length and area of an object after calculating the relative positions of the pipe and the endoscope from an endoscopic image of the pipe inner surface observed with an endoscope moving along a pipe inner surface having a uniform cross section.
- Patent Document 8 discloses image processing means for creating a three-dimensional shape model from a plurality of two-dimensional images captured while changing optical parameters.
- Patent Document 9 discloses a technique for creating a three-dimensional model from a plurality of images picked up by changing optical parameters.
- Patent Document 10 discloses a technique for decomposing an observation object into color components by an endoscope equipped with a color filter and extracting stereoscopic information by parallax.
- In Patent Document 11, a technique is disclosed for measuring the hardness of living tissue by calculating the reflection signal from tissue displaced by ultrasonic waves emitted from an ultrasonic transducer attached to the distal end of an endoscope.
- Patent Document 1: JP 2003-32674 A
- Patent Document 2: JP 1 66316 A
- Patent Document 3: JP 2003-535659 A
- Patent Document 4: JP 5-211988 A
- Patent Document 5: JP 2000-121339 A
- Patent Document 6: JP 2000-331168 A
- Patent Document 7: JP 5-340721 A
- Patent Document 8: JP 1-11337337 A
- Patent Document 9: JP 2002-191554 A
- Patent Document 10: JP 2000-19424 A
- Patent Document 11: JP 2001-224594 A
- The technique disclosed in Patent Document 5 calculates the distance to the camera based on the degree of change in luminance of the reflected light from an object imaged with an arbitrary time difference under illumination light of changing amplitude, and therefore requires an expensive device such as a sensor with excellent resolution. Moreover, measurement is impossible when the object is close to the camera or when the object undergoes rapid movement and deformation.
- The technique disclosed in Patent Document 8 likewise creates a three-dimensional model from a plurality of images captured while changing optical parameters; it cannot be applied when the subject moves quickly, and a new optical device must be provided.
- Patent Document 7 is premised on measuring an object inside a pipe having a uniform cross section and cannot be applied to an irregularly shaped tubular structure.
- The technique disclosed in Patent Document 10 requires a new stereoscopic endoscope and apparatus, and displaying a continuous stereoscopic image requires reconstructing the parallax-derived stereoscopic information and the color information and arranging them appropriately; because this calculation is complicated, fast movement cannot be handled.
- The technique disclosed in Patent Document 11 requires an ultrasonic device attached to the distal end of the endoscope, air must not be interposed between the ultrasonic device and the living tissue, and complicated calculation is required, so its fields of application in medicine are limited.
- The present invention has been made in view of the above technical problems, and its object is to provide a 3D image construction apparatus, method, and program that can easily construct a 3D image of a tubular structure even while the relative position between the central axis of the irregularly shaped, moving tubular structure and the optical axis of the imaging means varies.
- Means for Solving the Problem
- a three-dimensional image construction apparatus that constructs a three-dimensional image based on an image of an inner surface of a tubular structure that is an observation target.
- The three-dimensional image construction apparatus includes imaging means having an optical axis extending in the axial direction of the tubular structure, which acquires a plurality of frame images while moving through the tubular structure under a predetermined illumination condition.
- It further includes luminance information extraction means for extracting the luminance information of pixels corresponding to a predetermined range of each frame image of the inner surface of the tubular structure acquired by the imaging means.
- It also includes distance information calculation means for calculating, based on the extracted luminance information, the relative distance in the depth direction between each point on the inner surface of the tubular structure and the objective lens (imaging means).
- Finally, it includes 3D image construction means for continuously constructing a 3D image of the inner surface of the tubular structure by combining, over a plurality of frame images, the arrays of pixel data corresponding to the predetermined range of each frame image, reflecting the calculated distance.
- Preferably, the apparatus further includes a change amount detection device that measures the amount of change of the imaging means (for example, an endoscope) in the circumferential and axial directions. The three-dimensional image construction means then constructs the 3D image of the inner surface of the tubular structure by combining the data arrays for a plurality of frame images while reflecting, for the pixels corresponding to the predetermined range of each frame image, the circumferential and axial motion information (detection information) of the tip of the imaging means measured by the change amount detection device.
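The circumferential and axial correction described above can be sketched as follows. This is an illustrative reconstruction, not the patented implementation: the per-frame rotation values, the axial travel values, and the row pitch are hypothetical stand-ins for whatever the change amount detection device actually reports.

```python
import numpy as np

def correct_frame_array(line_pixels, roll_pixels):
    """Undo a measured circumferential rotation of the scope tip by
    rotating the extracted test-line array back (illustrative)."""
    return np.roll(line_pixels, -roll_pixels)

def place_frames_axially(arrays, axial_positions_mm, pitch_mm=1.0):
    """Place corrected test-line arrays at rows proportional to the
    measured axial travel, instead of one row per frame (illustrative)."""
    n_rows = int(round(max(axial_positions_mm) / pitch_mm)) + 1
    out = np.zeros((n_rows, len(arrays[0])))
    for arr, pos in zip(arrays, axial_positions_mm):
        out[int(round(pos / pitch_mm))] = arr
    return out

# Usage: a landmark at test-line position 10 appears rotated by 5 pixels
# per frame; once the measured rotation is undone, it stays aligned.
base = np.zeros(36)
base[10] = 1.0
frames = [np.roll(base, 5 * i) for i in range(4)]          # drifting frames
corrected = [correct_frame_array(f, 5 * i) for i, f in enumerate(frames)]
dev = place_frames_axially(corrected, [0.0, 2.0, 4.0, 6.0], pitch_mm=2.0)
print(np.argmax(dev, axis=1))  # landmark column is constant after correction
```

Without the correction, the landmark would appear as a diagonal streak in the developed image; reflecting the detection information straightens it.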
- In the three-dimensional image construction apparatus, the predetermined range of each frame image from which luminance information is extracted by the luminance information extraction means may be a test line centered on the point corresponding to the optical axis of the imaging unit on that frame image.
- the luminance information extracting means extracts luminance information of pixels existing on the test line for each frame image.
- The 3D image construction apparatus may further include color information extraction means for extracting the color information of the pixels constituting each frame image of the inner surface of the tubular structure imaged by the imaging means, and color information addition means for adding the extracted color information to each pixel constituting the constructed three-dimensional image.
- the luminance information may be luminance information relating to red, green, blue, or a mixed color of pixels constituting each frame image.
- the imaging means may be an endoscope.
- According to the present invention, a three-dimensional image of the tubular structure can be constructed easily and at high speed.
- By constructing the three-dimensional image at high speed, the movement of the lumen itself can be accurately diagnosed and recorded during endoscopic diagnosis, in addition to the conventional diagnosis of shape and color.
- Furthermore, by constructing a three-dimensional image at high speed while injecting a suitable gas or liquid into the lumen during endoscopic diagnosis, information on the hardness and tension of the living tissue composing the lumen can also be created and recorded.
- a three-dimensional image faithful to the actual observation target can be constructed by properly using the luminance information of red, green, blue, or a mixed color according to the color tone of the observation target.
- In particular, when the luminance information used is for a color tone whose wavelength is close to the complement of the observation object's color tone (for example, green for reddish tissue), a 3D image even more faithful to the actual observation object can be constructed.
- FIG. 1 is a block diagram showing an overall configuration of a 3D image construction apparatus according to an embodiment of the present invention.
- FIG. 2 is a block diagram showing a basic configuration of an information processing apparatus constituting the 3D image construction apparatus.
- FIG. 3 is a diagram showing a configuration of a distal end portion of an endoscope scope in an endoscope device constituting the 3D image construction apparatus.
- FIG. 4(a) is a diagram showing the observation region when the endoscope scope inserted into the lumen is inclined downward.
- (b) is a diagram showing the observation region when the endoscope scope inserted into the lumen is inclined upward.
- FIG. 5 is a diagram showing a mode in which a circular test line is set as a predetermined range to be extracted with luminance information on a frame image extracted from a video file.
- FIG. 6 is an explanatory diagram of a developed pixel array extracted from each frame image and subsequent combination processing.
- FIG. 7 is a diagram showing changes in luminance information of pixels extracted corresponding to test lines set on each frame image.
- FIG. 8 is a diagram showing, for comparison and contrast with FIG. 7, a developed image of the inner surface of a lumen constructed by a conventional method.
- FIG. 9(a) is a three-dimensional graph showing the distribution of red luminance information of each pixel constituting the three-dimensional image. (b) is a three-dimensional graph showing the same distribution for green. (c) is a three-dimensional graph showing the same distribution for blue.
- FIG. 10 is a view showing the developed image of the inner surface of the lumen seen from a viewpoint set on the positive side of the luminance axis in (a), (b), and (c) of FIG. 9, perpendicular to the plane formed by the axes corresponding to "time" and "position on the test line".
- FIG. 11 is a view showing a developed image of the inner surface of the lumen constructed by a conventional method for comparison with FIG.
- FIG. 12 is a graph showing a relative change between luminance and distance.
- FIG. 13 (a) is a diagram showing a three-dimensional image viewed from the viewpoint set on the positive side in the depth direction and perpendicular to the XY plane.
- FIG. 13(b) is a diagram showing a three-dimensional image in which the image shown in FIG. 13(a) is pulled up diagonally to the left and rotated about the Y axis.
- FIG. 13(c) is a diagram showing a three-dimensional image in which the image shown in FIG. 13(a) is pulled diagonally down to the right and rotated about the Y axis.
- FIG. 14 is a flowchart of the 3D image construction process in the first embodiment.
- FIG. 15 is a diagram showing a motion detection device of the three-dimensional image construction device in the second embodiment.
- FIG. 16 is a diagram for explaining correction of an image in the circumferential direction in the second embodiment.
- FIG. 17 is a diagram for describing correction of an image in the axial direction in the second embodiment.
- FIG. 18(a) is a view showing a developed image of the inner surface of the lumen when correction in the circumferential and axial directions is not performed. (b) is a view showing the developed image when the correction is performed.
- FIG. 19 is a flowchart of the 3D image construction process in the second embodiment.
- FIG. 1 is a block diagram schematically showing the overall configuration of the 3D image construction apparatus according to the first embodiment of the present invention.
- This 3D image construction apparatus 1 comprises an endoscope device 10 that captures a video file composed of a plurality of frame images by imaging, for example, the inner surface of a luminal organ, and an information processing apparatus 20, a general-purpose PC connected to the endoscope device 10, which displays and outputs the video file acquired by the endoscope device 10 and executes processing to construct a pseudo three-dimensional image based on frame images extracted from the video file.
- The endoscope device 10 and the information processing apparatus 20 are connected via a cable 11, such as a USB cable; the video file acquired by the endoscope device 10 is transferred to the information processing apparatus 20, and command signals can be transmitted from the information processing apparatus 20 to the endoscope device 10.
- a video capture board may be interposed between the endoscope device 10 and the information processing apparatus 20.
- The endoscope device 10 includes an endoscope scope 2 that is inserted into, for example, a luminal organ and images the inner surface of the lumen, and a control unit 3 that controls the endoscope scope 2 and creates a video file based on signals input through the endoscope scope 2.
- the configuration of the endoscope scope 2 will be described with reference to FIG.
- The control unit 3 includes a control unit 3a that controls each component in the unit 3, a signal processing unit 3b that creates a video file of the inner surface of the lumen based on signals input via the endoscope scope 2, and a light source unit 3c that serves as the source of the illumination light emitted from the distal end of the endoscope scope 2 toward the observation object.
- In accordance with user operations, the control unit 3a switches imaging and illumination by the endoscope scope 2 on and off, and adjusts the amount of light supplied from the light source unit 3c to the endoscope scope 2.
- The configuration of the three-dimensional image construction apparatus 1 is not limited to this.
- Instead of the endoscope device 10 that observes the inner surface of a luminal organ, an endoscope device or video camera system that observes, for example, the inner wall of a tunnel or a sewer pipe may be used.
- Likewise, a desktop PC may be used in place of a laptop PC.
- FIG. 2 is a block diagram showing a basic configuration of the information processing apparatus 20.
- This information processing apparatus 20 includes a CPU that controls each component in the apparatus based on programs such as the operating system (OS), which is the basic software, and the boot program executed when the apparatus is started.
- It also includes ROM 22 and a RAM used as the work area necessary for program execution, and a hard disk (HD) 24.
- It further includes a monitor 25 serving as a display output device capable of displaying various information such as the operation screen, input devices 26 such as a mouse 26a and a keyboard 26b, and an interface for the video file input from the endoscope device 10.
- The information processing apparatus 20 may also include, for example, an optical disk drive or a floppy (registered trademark) disk drive.
- A 3D image construction program is stored in the ROM 22 (or the hard disk 24). When this program is read and executed by the image processor 27 of the information processing apparatus 20, a plurality of frame images are extracted in sequence from the video image input from the endoscope device 10, a pixel array is obtained from each frame image, and the unfolded pixel arrays are combined over the plurality of frame images to construct a three-dimensional image.
- The video file input from the endoscope device 10 may be stored on the hard disk 24 of the information processing apparatus 20, or may be forwarded to a printer (not shown) for printing out a predetermined frame image.
- FIG. 3 is an enlarged view showing the distal end portion of the endoscope scope 2.
- At the distal end of the endoscope scope 2 are provided an objective lens 2a facing the observation target, a pair of illumination optical fibers 2b, and an operation channel 2c such as a forceps channel or a suction channel.
- The endoscope scope 2 is not limited to this configuration; for example, one having a single illumination optical fiber, or three or more, may be employed.
- The objective lens 2a has an optical axis F extending parallel to, or at a constant angle with respect to, the central axis (indicated by symbol B) of the endoscope scope 2, and has a viewing angle of ⁇ ° up, down, left, and right with respect to the optical axis F.
- When imaging the inner surface of the lumen, it is ideal for the endoscope scope 2 to move along the central axis of the lumen; in reality, however, the lumen is irregularly shaped, the movement of the endoscope scope 2 within the lumen changes, and the relative position between the central axis of the lumen and the optical axis F of the objective lens 2a constantly varies.
- In such a case, the observation region becomes an area such as that indicated by reference numeral S2: on the upper inner surface of the lumen 30, a region close to the objective lens 2a falls within the observation region S2, while on the lower inner surface of the lumen 30, a region farther from the objective lens 2a falls within it.
- The illumination fiber 2b of the endoscope scope 2 emits a constant amount of illumination light onto the lumen inner surface 30a, and the light reflected from the lumen inner surface 30a enters the objective lens 2a.
- The brightness of the reflected light is inversely related to the distance from the objective lens 2a to the lumen inner surface 30a (FIG. 12): the reflected light from portions of the lumen inner surface 30a farther from the objective lens 2a is weaker, while that from portions near the objective lens 2a is stronger.
- The intensity of the reflected light is converted into an electrical signal by a CCD (not shown) or the like incorporated in the endoscope scope 2, and is thereby reflected in the luminance of the pixels constituting each frame image of the lumen inner surface 30a.
- luminance information of pixels corresponding to a predetermined range is extracted from each frame image of a video file in which the intensity of reflected light from an observation target is reflected in this way.
- A circular test line centered on the point corresponding to the optical axis F of the objective lens 2a is set on each frame image. This establishes a conical depth axis connecting the lens and the test line.
- FIG. 5 shows a mode in which a test line (circular white line) as a predetermined range to be extracted with luminance information is set on the frame image acquired by the endoscope device 10.
- pixels located in the circumferential direction of the lumen inner surface 30a are extracted from each frame image constituting the video file.
- Each pixel on the test line is determined as a luminance information extraction target; an RGB value is acquired for each such pixel, luminance information is obtained from the RGB value, and the relative distance in the depth direction is calculated.
- The radius of this test line can be set arbitrarily, and the single test line set for acquiring a 3D image of an observation target is applied to all frame images extracted from the video file.
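The test-line sampling described above can be sketched as follows. This is a rough illustration under stated assumptions, not the patented implementation: the frame size, test-line center, radius, and the Rec. 601 luminance weighting are all choices made here for the example.

```python
import numpy as np

def sample_test_line(frame, center, radius, n_samples=360):
    """Sample RGB values along a circular test line centered on the
    point corresponding to the optical axis (hypothetical helper)."""
    h, w, _ = frame.shape
    angles = np.linspace(0.0, 2.0 * np.pi, n_samples, endpoint=False)
    ys = np.clip((center[1] + radius * np.sin(angles)).astype(int), 0, h - 1)
    xs = np.clip((center[0] + radius * np.cos(angles)).astype(int), 0, w - 1)
    rgb = frame[ys, xs].astype(float)          # (n_samples, 3) RGB values
    # Luminance from RGB; a standard Rec. 601 weighting is assumed here.
    luminance = rgb @ np.array([0.299, 0.587, 0.114])
    return rgb, luminance

# Usage with a synthetic frame whose brightness falls off radially,
# mimicking illumination falloff along the lumen.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
yy, xx = np.mgrid[0:480, 0:640]
r = np.hypot(xx - 320, yy - 240)
frame[..., 1] = np.clip(255 - r * 0.5, 0, 255).astype(np.uint8)
_, lum = sample_test_line(frame, center=(320, 240), radius=100)
print(lum.shape)  # one luminance value per sampled angle
```

The same `center` and `radius` would be reused for every frame extracted from the video file, as the text states.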
- FIG. 6 conceptually shows a mode in which pixels located in the circumferential direction of the lumen inner surface 30a are expanded and arranged and combined for a plurality of frame images.
- the left-right direction in the figure corresponds to the axial direction of the lumen 30, and the up-down direction corresponds to the inner diameter of the lumen 30.
- The pixel corresponding to the central axis of the objective lens 2a (here, the pixel located at 6 o'clock on the test line) is set as the reference pixel, and the unfolded arrays are combined while being positioned so that their reference pixels coincide.
- the width in the vertical direction is changed according to the luminance of each pixel on the test line. For example, each pixel is plotted so that the width is reduced for pixels with high brightness, while the width is increased for pixels with low brightness. Thereby, information on the inner diameter of the lumen 30 is added.
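The unfolding-and-combining step can be sketched as below: the per-frame test-line luminance arrays are stacked along the frame (axial) direction, and a relative depth is derived from each pixel's luminance using the inverse luminance-distance relation of FIG. 12. The inverse-square falloff model used here is an assumption for illustration; the source states only that brightness weakens with distance.

```python
import numpy as np

def build_developed_map(per_frame_luminance, eps=1e-3):
    """Stack per-frame test-line luminance arrays into a developed image
    and estimate a relative depth for every pixel (illustrative sketch).

    per_frame_luminance: list of 1-D arrays, one per frame, all the same
    length (position along the test line)."""
    developed = np.stack(per_frame_luminance, axis=0)  # (frames, positions)
    # Brighter pixels are closer to the objective lens; assuming an
    # inverse-square falloff, relative distance ~ 1 / sqrt(luminance).
    relative_depth = 1.0 / np.sqrt(developed + eps)
    return developed, relative_depth

# Usage: 10 frames of 360 test-line samples; one frame contains a
# bright band, which should read as a region closer to the lens.
frames = [np.full(360, 100.0) for _ in range(10)]
frames[5][90:120] = 400.0                    # a region four times brighter
dev, depth = build_developed_map(frames)
print(dev.shape)                     # (10, 360)
print(depth[5, 100] < depth[5, 0])   # brighter region reads as closer: True
```

In the document's rendering, this per-pixel depth is what modulates the plotted width, encoding the inner diameter of the lumen 30.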
- FIG. 7 is a diagram in which graphs representing changes in luminance information of pixels respectively extracted from a plurality of frame images are displayed side by side.
- Here, the changes in luminance information of the pixels extracted from the ten frame images f1 to f10 are shifted and arranged in parallel in the order in which the frame images were extracted.
- the left-right direction in the figure corresponds to the inner diameter of the lumen 30, and the up-down direction corresponds to the axial direction of the lumen 30.
- changes in luminance information that should actually be reflected in the direction perpendicular to the drawing are represented in the vertical direction in the figure.
- FIG. 8 shows a developed image of the luminal inner surface 30a constructed by using the conventional method based on the same file as the video file adopted in the present embodiment.
- the left-right direction in the figure corresponds to the inner diameter of the lumen 30, and the up-down direction corresponds to the axial direction of the lumen 30. From these FIGS. 7 and 8, it can be seen that the change in the luminance information obtained by the method according to the present embodiment roughly matches the change in the luminance information obtained by the conventional method.
- FIGS. 9(a), 9(b), and 9(c) show three-dimensional graphs representing the distributions of the red, green, and blue luminance information, respectively, of the pixels constituting the three-dimensional image of the lumen inner surface 30a.
- In each graph, in addition to the axis representing “luminance”, an axis representing “time”, corresponding to the axial direction of the lumen 30, and an axis representing “position on the test line”, corresponding to the inner diameter of the lumen 30, are set.
- FIG. 10 shows the developed images corresponding to (a), (b), and (c) of FIG. 9, as seen from a viewpoint set on the positive side in the luminance-axis direction, perpendicular to the plane composed of the “time” and “position on the test line” axes.
- FIG. 11 shows a developed image of the inner surface 30a of the lumen constructed by the conventional method based on the same file as the video file adopted in the present embodiment.
- FIG. 11 corresponds to the developed image shown in FIG. 8 rotated 90° counterclockwise, and the area enclosed by the white frame in FIG. 11 corresponds to the developed images shown in FIG. 10.
- The directions indicating “time” and “position on the lumen cross section” in FIG. 11 correspond to the directions of the axes representing “time” and “position on the test line” in FIGS. 9 and 10, respectively.
- The best detection sensitivity was obtained when the green luminance information was adopted in constructing the three-dimensional image, and a 3D image faithful to the shape of the actual observation object was acquired. Good detection sensitivity followed in the order of blue and red. Basically, a three-dimensional image faithful to the actual observation object could be constructed by selectively using the luminance information of green, blue, red, or a mixed color according to the color tone of the observation object.
- In particular, by using luminance information of a color tone similar to the complementary color of the color of the observation object, such as the green luminance information here, a 3D image that is more faithful to the object can be constructed.
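As a rough, hypothetical heuristic for this channel selection (the patent does not specify an automatic rule; the mapping below simply folds the complement of the dominant primary back onto one of R, G, B):

```python
import numpy as np

def pick_channel(frame):
    """Hypothetical heuristic: choose the luminance channel whose tone
    is close to the complement of the scene's dominant color.  For the
    reddish scenes typical here this selects green.  Returns 0=R, 1=G, 2=B."""
    means = frame.reshape(-1, 3).mean(axis=0)        # mean R, G, B values
    # fold the complement of the dominant primary back onto R/G/B:
    # dominant R -> G, dominant G -> R, dominant B -> G
    return {0: 1, 1: 0, 2: 1}[int(np.argmax(means))]
```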
- the position information in the depth direction calculated based on the luminance information of the pixels is reflected when the pixels are expanded and arranged.
- For example, as shown in FIG. 12, the position information in the depth direction is calculated from the luminance information of each pixel based on the relationship in which luminance and distance change exponentially. Specifically, the larger the luminance information, the shorter the distance from the objective lens 2a (see FIG. 4) to the observation object is judged to be, and the larger the position information in the depth direction is set. In this way, the unevenness of the observation target in the depth direction can be grasped.
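As an illustrative sketch of this relationship (the constants `i0` and `k` are hypothetical calibration values, not taken from the patent):

```python
import math

def depth_from_luminance(luminance, i0=255.0, k=1.0, eps=1e-6):
    """Invert an assumed exponential illumination falloff
    I = i0 * exp(-k * d): brighter pixels map to shorter relative
    distances d from the objective lens (cf. FIG. 12)."""
    return -math.log(max(luminance, eps) / i0) / k
```

A pixel at full luminance thus gets relative distance 0, and the distance grows as the luminance drops.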
- FIG. 13 is a diagram showing a three-dimensional image viewed from a viewpoint that is set on the positive side in the depth direction (Z-axis direction) and perpendicular to the XY plane.
- The Z axis extends along the depth direction, with the near side of the page being the positive direction.
- FIG. 13(b) is a diagram showing a three-dimensional image obtained by dragging the image shown in FIG. 13(a) obliquely to the upper left and rotating it around the Y axis.
- FIG. 13(c) is a diagram showing a three-dimensional image obtained by dragging the image shown in FIG. 13(a) obliquely to the lower right and rotating it around the Y axis.
- Further, the output RGB values may be added to the constructed three-dimensional image.
- Thereby, a three-dimensional image that is more faithful to the actual observation target can be constructed, and the characteristics of the lumen inner surface 30a to be observed can be grasped more easily.
- FIG. 14 is a flowchart of the three-dimensional image construction process executed on the information processing apparatus 20 side based on the three-dimensional image construction program, covering the processing from when the video file acquired on the endoscope device 10 side is read into the information processing apparatus 20 until the 3D image is constructed.
- a video file acquired on the endoscope device 10 side is read (# 11), and then a test line as shown in FIG. 5 is set (# 12). This test line is applied to all frame images extracted from the video file in the subsequent steps.
- a depth axis corresponding to the Z axis described with reference to FIG. 13 is set in order to represent the position information of each pixel in the depth direction in constructing the three-dimensional image.
- Next, the luminance of each pixel on the test line is calculated based on the RGB values obtained in step #14 (#16).
- Then, based on the relationship in which distance and luminance change exponentially, as shown in FIG. 12, information on the relative distance in the depth direction on the test line is acquired from this luminance (#17).
- That is, the relative distance from the objective lens of the endoscope 2 to each point on the inner surface of the lumen structure on the test line is calculated.
- A 3D image is then generated based on these relative distances from the objective lens of the endoscope to each point on the inner surface of the lumen structure.
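Under stated assumptions (BT.601 luminance weights and an exponential illumination falloff with hypothetical constants `i0` and `k`, none of which the patent specifies), steps #16 and #17 might be sketched as:

```python
import numpy as np

BT601 = np.array([0.299, 0.587, 0.114])  # assumed luminance weights

def frame_to_depth_row(pixel_column, i0=255.0, k=1.0, eps=1e-6):
    """Steps #16-#17 in sketch form: per-pixel luminance on the test
    line, then relative distance from the objective lens assuming an
    exponential falloff I = i0 * exp(-k * d)."""
    lum = pixel_column.astype(float) @ BT601
    dist = -np.log(np.clip(lum, eps, None) / i0) / k
    return lum, dist

def build_surface(columns):
    """Stack the per-frame distance rows into a relief map indexed by
    (frame/time, position on the test line), as combined in step #20."""
    return np.stack([frame_to_depth_row(c)[1] for c in columns])
```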
- In step #19, it is determined whether steps #14 to #18 have been completed for all frame images (#19). If it is determined that not all frame images have been processed, the process returns to step #13 and the same steps are repeated for the remaining frame images. On the other hand, if it is determined in step #19 that the steps have been completed for all frame images, the three-dimensional image is constructed by combining the developed and arranged pixels (#20). Finally, a 3D graph as shown in FIG. 9 is displayed (#21), and the process is terminated.
- The three-dimensional image and the three-dimensional graph generated in steps #20 and #21 may be stored in the hard disk 24 as necessary.
- The 3D image construction process described above may be executed in parallel with the acquisition of the video file on the endoscope device 10 side, or the video file acquired on the endoscope device 10 side may first be stored on the information processing apparatus 20 side and the process then executed as necessary. The 3D image construction process is executed by reading a 3D image construction program stored in the ROM 22 or the hard disk 24 of the information processing device 20. This program may be stored in the information processing device 20 in advance, or may be additionally stored in the hard disk 24 of the information processing device 20 from an external recording medium such as an optical disc 18 (a CD-ROM, DVD-ROM, or the like) or a floppy disc 19 (both shown in FIG. 1), or by downloading via a network.
- a 3D image of the luminal inner surface 30a that is irregularly shaped and moves can be easily constructed based on luminance information.
- Conventionally, a large number of captured images have been required to record endoscopic findings.
- With the present method, the user can easily recognize the position and shape of a lesion, and tissue hardness and movement information can also be recorded objectively. This improves the diagnostic accuracy of endoscopic examination.
- the memory capacity for storing images can be reduced, and the viewing time for images can be shortened.
- the three-dimensional image construction device of the present embodiment further includes a motion detection device that detects the motion of the endoscope scope 2.
- FIG. 15 shows the configuration of the motion detection device.
- the motion detection device 50 includes an axial direction sensor 51 that detects the amount of movement of the endoscope scope 2 in the axial direction, and a circumferential direction sensor 52 that detects the amount of change in the circumferential direction of the endoscope scope 2.
- the axial direction sensor 51 and the circumferential direction sensor 52 can be easily realized by using, for example, a mouse mechanism generally used as an instruction device for a personal computer.
- The motion detection device 50 detects the amount of movement of the endoscope scope 2 in the axial direction from an axial reference position and the amount of change in the circumferential direction from a circumferential reference position, and outputs these to the information processing device 20.
- When the information processing device 20 receives the detection data from the motion detection device 50, it stores the motion information of the endoscope scope in a predetermined storage device such as the hard disk 24, associating the time information indicating the reception time (corresponding to the detection time) with the amount of movement in the axial direction and the amount of change in the circumferential direction. The information processing apparatus 20 then corrects the circumferential direction and the axial direction based on this motion information when the images are developed and arranged.
- The time information indicating the reception time (corresponding to the detection time) may be any information that can be matched to the shooting time of each image frame of the video file and to the detection values from the motion detection device 50. For example, a standard time such as JST (Japan Standard Time) or GMT (Greenwich Mean Time), or information indicating the elapsed time from the start of video recording, can be used.
- The circumferential direction sensor 52 of the motion detection device 50 detects the amount of change in the circumferential direction of the endoscope scope 2, that is, the amount of rotation.
- When the amount of change detected by the circumferential sensor 52 is Δθ1, the pixel data on the test line (hereinafter referred to as “pixel column data”) are corrected in the circumferential direction by an amount corresponding to Δθ1 when they are developed and arranged.
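A minimal sketch of this circumferential correction, assuming the rotation amount is given in radians and the pixel column is sampled uniformly around the test line (function and parameter names are hypothetical):

```python
import numpy as np

def correct_rotation(pixel_column, dtheta):
    """Roll the pixel column opposite to the scope's rotation dtheta
    (radians) so that successive developed columns stay aligned at the
    circumferential reference position."""
    n = len(pixel_column)
    shift = int(round(dtheta / (2.0 * np.pi) * n))
    return np.roll(pixel_column, -shift, axis=0)
```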
- the axial direction sensor 51 of the motion detection device 50 detects the amount of movement of the endoscope scope 2 in the axial direction.
- In Embodiment 1, as shown in FIG. 17(a), a developed image is obtained by sequentially arranging the pixel column data obtained in time series from the video file.
- However, the traveling speed of the endoscope scope 2 is not always constant. That is, when the traveling speed of the endoscope scope 2 is almost zero, multiple images of the same position on the inner surface of the lumen are obtained; conversely, the faster the endoscope scope 2 travels, the farther apart the imaged positions on the inner surface of the lumen become.
- FIG. 17 (b) is a diagram showing an example in which pixel column data obtained in a time series from a video file is arranged in correspondence with an actual physical position.
- In this example, the axial positions (movement amounts) at the times of acquisition of pixel column data 2 to 4, pixel column data 5 to 6, and pixel column data 8 to 9 are almost the same, so they are drawn overlapping.
- On the other hand, pixel column data 0 and 1, and pixel column data 7 and 8, are arranged at distant positions because there is a large difference in the axial movement amounts at the times they were acquired.
- This means that pixel data are not acquired over a relatively long portion between the position on the inner surface of the lumen corresponding to pixel column data 0 (or 7) and the position corresponding to pixel column data 1 (or 8).
- When there are multiple pixel column data representing the image at the same position on the inner surface of the lumen, such as pixel column data 2 to 4, pixel column data 5 to 6, and pixel column data 8 to 9, only one of them is used. On the other hand, when there is a gap, as between pixel column data 0 and 1 and between pixel column data 7 and 8, the data between them are interpolated from the data at both ends. For example, as shown in FIG. 17(c), interpolation data A is generated from pixel column data 0 and pixel column data 1 and is arranged between them.
- The number of interpolation data arranged between pixel column data is determined appropriately according to the interval. For example, when the interval between pixel column data 7 and pixel column data 8 is large, multiple interpolation data B and C are generated by linear interpolation and arranged. More specifically, a resolution (the number of pixel column data arranged per unit time) may be set in the axial direction (time direction) of the developed image, and the pixel column data may be thinned out or interpolated based on that resolution.
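The thinning and interpolation described above can be sketched as a resampling of the pixel columns onto a uniform axial grid. This is an illustrative sketch, not the patent's implementation; it assumes the recorded axial positions are strictly increasing (duplicates removed beforehand) and uses simple linear interpolation:

```python
import numpy as np

def resample_axial(columns, positions, resolution):
    """Resample pixel columns onto a uniform axial grid at the given
    resolution (columns per unit of axial movement), filling gaps by
    linear interpolation like interpolation data A, B, C in FIG. 17."""
    positions = np.asarray(positions, dtype=float)   # strictly increasing
    columns = np.asarray(columns, dtype=float)       # (n_cols, n_pix)
    grid = np.arange(positions.min(), positions.max() + 1e-9, 1.0 / resolution)
    out = np.empty((len(grid), columns.shape[1]))
    for j in range(columns.shape[1]):                # each circumferential pixel
        out[:, j] = np.interp(grid, positions, columns[:, j])
    return out
```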
- FIG. 18(a) is a diagram showing an example of an image developed and arranged by the method of the first embodiment, without the axial and circumferential corrections.
- When these corrections are performed, a corrected image as shown in FIG. 18(b) is obtained.
- FIG. 19 is a flowchart of processing for executing a 3D image construction process while performing correction based on the movement of the endoscope scope 2 in the present embodiment.
- Compared with the flowchart of FIG. 14 in the first embodiment, step #15 is added and the processing content of step #18 differs.
- In step #15, the information processing device 20 reads the motion information that was acquired by the motion detection device 50 and stored in the information processing device 20, and obtains position information in the circumferential direction and the axial direction. Then, in the pixel development and arrangement processing of step #18, the pixel column data are developed and arranged while the circumferential and axial corrections described above are performed. The other steps are the same as in the first embodiment.
- In the present embodiment, the correction is performed in both the circumferential direction and the axial direction, but the correction may be performed in at least one of them. Even in that case, the developed image reproduces the actual state of the inner surface of the lumen more closely.
- Although the present invention has been described with reference to specific embodiments, the present invention is not limited to the illustrated embodiments, and it goes without saying that various modifications and design changes are possible without departing from the spirit of the present invention.
- The present invention is applicable to various luminal organs, specifically including the upper digestive tract such as the stomach, duodenum, and esophagus, the lower digestive tract such as the large intestine and small intestine, and the urethra, ureter, and the like.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Surgery (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Radiology & Medical Imaging (AREA)
- Geometry (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Computer Graphics (AREA)
- Pathology (AREA)
- Software Systems (AREA)
- Optics & Photonics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Biophysics (AREA)
- Endoscopes (AREA)
- Instruments For Viewing The Inside Of Hollow Bodies (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
- Closed-Circuit Television Systems (AREA)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/302,480 US9247865B2 (en) | 2006-05-31 | 2007-05-31 | Three-dimensional-image forming device, three dimensional-image forming method and program |
EP07744488.3A EP2030558B1 (en) | 2006-05-31 | 2007-05-31 | Three-dimensional image forming device, three-dimensional image forming method and program |
JP2008517981A JP4753104B2 (ja) | 2006-05-31 | 2007-05-31 | 3次元画像構築装置及び方法並びにプログラム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2006151936 | 2006-05-31 | ||
JP2006-151936 | 2006-05-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2007139187A1 true WO2007139187A1 (ja) | 2007-12-06 |
Family
ID=38778704
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2007/061100 WO2007139187A1 (ja) | 2006-05-31 | 2007-05-31 | 3次元画像構築装置及び方法並びにプログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US9247865B2 (ja) |
EP (1) | EP2030558B1 (ja) |
JP (1) | JP4753104B2 (ja) |
WO (1) | WO2007139187A1 (ja) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2009128055A1 (en) * | 2008-04-15 | 2009-10-22 | Provost Fellows And Scholars Of The College Of The Holy And Undivided Trinity Of Queen Elizabeth Near Dublin | Endoscopy system with motion sensors |
JP2010092283A (ja) * | 2008-10-08 | 2010-04-22 | Chiba Univ | 立体画像作成装置及び方法並びに内視鏡検査システム |
JP2010256988A (ja) * | 2009-04-21 | 2010-11-11 | Chiba Univ | 3次元画像生成装置、3次元画像生成方法、及びプログラム |
WO2012098792A1 (ja) * | 2011-01-20 | 2012-07-26 | オリンパスメディカルシステムズ株式会社 | 画像処理装置、画像処理方法、画像処理プログラムおよび内視鏡システム |
JP2014096020A (ja) * | 2012-11-09 | 2014-05-22 | Honda Motor Co Ltd | 画像処理方法、画像処理装置、表面検査システム及びプログラム |
JP2016043033A (ja) * | 2014-08-22 | 2016-04-04 | 富士フイルム株式会社 | 展開画像生成装置、方法、及びプログラム、並びに画像情報記憶装置 |
WO2017158896A1 (ja) * | 2016-03-15 | 2017-09-21 | オリンパス株式会社 | 画像処理装置、画像処理システム、画像処理装置の作動方法 |
WO2017212725A1 (ja) * | 2016-06-07 | 2017-12-14 | オリンパス株式会社 | 医療用観察システム |
JP2019150574A (ja) * | 2018-03-05 | 2019-09-12 | リオン株式会社 | 3次元形状データ作成方法、及び3次元形状データ作成システム |
JP2021067905A (ja) * | 2019-10-28 | 2021-04-30 | 一般財団法人電力中央研究所 | 中空電柱の内周面撮影装置および非破壊検査装置、並びに中空電柱の内周面撮影方法および非破壊検査方法 |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2424422B1 (en) | 2009-04-29 | 2019-08-14 | Koninklijke Philips N.V. | Real-time depth estimation from monocular endoscope images |
US20130051659A1 (en) * | 2010-04-28 | 2013-02-28 | Panasonic Corporation | Stereoscopic image processing device and stereoscopic image processing method |
US20150057952A1 (en) * | 2013-08-26 | 2015-02-26 | General Electric Company | Modular inspection system |
US20150054942A1 (en) * | 2013-08-26 | 2015-02-26 | General Electric Company | Modular inspection system inspection module |
JP6264834B2 (ja) * | 2013-10-24 | 2018-01-24 | 富士通株式会社 | ガイド方法、情報処理装置およびガイドプログラム |
US9545220B2 (en) * | 2014-03-02 | 2017-01-17 | V.T.M (Virtual Tape Measure) Technologies Ltd. | Endoscopic measurement system and method |
US10548459B2 (en) | 2014-03-17 | 2020-02-04 | Intuitive Surgical Operations, Inc. | Systems and methods for control of imaging instrument orientation |
JP6828465B2 (ja) * | 2017-01-30 | 2021-02-10 | セイコーエプソン株式会社 | 内視鏡操作支援システム |
GB201712486D0 (en) * | 2017-08-03 | 2017-09-20 | Ev Offshore Ltd | Quantative surface measurements by combining image and height profile data |
CN112740666A (zh) | 2018-07-19 | 2021-04-30 | 艾科缇弗外科公司 | 自动手术机器人视觉系统中多模态感测深度的系统和方法 |
WO2020044523A1 (ja) * | 2018-08-30 | 2020-03-05 | オリンパス株式会社 | 記録装置、画像観察装置、観察システム、観察システムの制御方法、及び観察システムの作動プログラム |
KR20220021920A (ko) | 2019-04-08 | 2022-02-22 | 액티브 서지컬, 인크. | 의료 이미징을 위한 시스템 및 방법 |
DE102019114817B4 (de) | 2019-06-03 | 2021-12-02 | Karl Storz Se & Co. Kg | Bildgebungssystem und Verfahren zur Beobachtung |
WO2021035094A1 (en) | 2019-08-21 | 2021-02-25 | Activ Surgical, Inc. | Systems and methods for medical imaging |
AU2022230435A1 (en) * | 2021-03-05 | 2023-08-24 | Hollister Incorporated | Stoma and peristomal imaging and quantification of size, shape, and color |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0521988A (ja) | 1991-07-12 | 1993-01-29 | Hitachi Ltd | チツプ電子部品供給装置 |
JPH05340721A (ja) | 1992-04-06 | 1993-12-21 | Olympus Optical Co Ltd | 3次元計測方法及び3次元計測装置 |
JPH1166316A (ja) | 1997-08-26 | 1999-03-09 | Tokyo Met Gov Gesuido Service Kk | 管渠内面展開図化装置 |
JPH11337845A (ja) | 1998-05-25 | 1999-12-10 | Mitsubishi Electric Corp | 内視鏡装置 |
JP2000019424A (ja) | 1998-06-29 | 2000-01-21 | Terumo Corp | 立体内視鏡装置 |
JP2000121339A (ja) | 1998-10-15 | 2000-04-28 | Hamamatsu Photonics Kk | 立体情報検出方法及び装置 |
JP2000331168A (ja) | 1999-03-12 | 2000-11-30 | Tokyoto Gesuido Service Kk | 管渠内面画像の処理装置及びその方法 |
JP2001224594A (ja) | 2000-02-15 | 2001-08-21 | Olympus Optical Co Ltd | 超音波内視鏡システム |
JP2002191554A (ja) | 2000-12-26 | 2002-07-09 | Asahi Optical Co Ltd | 3次元画像検出装置を備えた電子内視鏡 |
JP2003032674A (ja) | 2001-07-16 | 2003-01-31 | Emaki:Kk | 管状物内壁のビデオ画像から連続した展開静止画像を自動生成するシステム |
JP2003535659A (ja) | 2000-06-19 | 2003-12-02 | ユニヴァーシティ オブ ワシントン | 走査型単一光ファイバシステムを用いる医療用画像化、診断および治療 |
WO2004096008A2 (en) * | 2003-05-01 | 2004-11-11 | Given Imaging Ltd. | Panoramic field of view imaging device |
JP2006187551A (ja) * | 2005-01-07 | 2006-07-20 | Olympus Corp | 食道粘膜用画像処理装置 |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2523369B2 (ja) * | 1989-03-14 | 1996-08-07 | 国際電信電話株式会社 | 動画像の動き検出方法及びその装置 |
GB2248361B (en) * | 1990-09-28 | 1994-06-01 | Sony Broadcast & Communication | Motion dependent video signal processing |
US5469254A (en) * | 1992-04-06 | 1995-11-21 | Olympus Optical Co., Ltd. | Method and apparatus for measuring three-dimensional position of a pipe from image of the pipe in an endoscopic observation system |
JP2963284B2 (ja) * | 1992-07-24 | 1999-10-18 | 財団法人鉄道総合技術研究所 | トンネル検査装置 |
JP4623843B2 (ja) * | 2001-03-02 | 2011-02-02 | Hoya株式会社 | 3次元画像入力装置 |
WO2005093654A2 (en) * | 2004-03-25 | 2005-10-06 | Fatih Ozluturk | Method and apparatus to correct digital image blur due to motion of subject or imaging device |
JP2006065676A (ja) * | 2004-08-27 | 2006-03-09 | Canon Inc | 画像処理装置およびその方法 |
2007
- 2007-05-31 WO PCT/JP2007/061100 patent/WO2007139187A1/ja active Application Filing
- 2007-05-31 EP EP07744488.3A patent/EP2030558B1/en not_active Not-in-force
- 2007-05-31 US US12/302,480 patent/US9247865B2/en not_active Expired - Fee Related
- 2007-05-31 JP JP2008517981A patent/JP4753104B2/ja not_active Expired - Fee Related
Patent Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0521988A (ja) | 1991-07-12 | 1993-01-29 | Hitachi Ltd | チツプ電子部品供給装置 |
JPH05340721A (ja) | 1992-04-06 | 1993-12-21 | Olympus Optical Co Ltd | 3次元計測方法及び3次元計測装置 |
JPH1166316A (ja) | 1997-08-26 | 1999-03-09 | Tokyo Met Gov Gesuido Service Kk | 管渠内面展開図化装置 |
JPH11337845A (ja) | 1998-05-25 | 1999-12-10 | Mitsubishi Electric Corp | 内視鏡装置 |
JP2000019424A (ja) | 1998-06-29 | 2000-01-21 | Terumo Corp | 立体内視鏡装置 |
JP2000121339A (ja) | 1998-10-15 | 2000-04-28 | Hamamatsu Photonics Kk | 立体情報検出方法及び装置 |
JP2000331168A (ja) | 1999-03-12 | 2000-11-30 | Tokyoto Gesuido Service Kk | 管渠内面画像の処理装置及びその方法 |
JP2001224594A (ja) | 2000-02-15 | 2001-08-21 | Olympus Optical Co Ltd | 超音波内視鏡システム |
JP2003535659A (ja) | 2000-06-19 | 2003-12-02 | ユニヴァーシティ オブ ワシントン | 走査型単一光ファイバシステムを用いる医療用画像化、診断および治療 |
JP2002191554A (ja) | 2000-12-26 | 2002-07-09 | Asahi Optical Co Ltd | 3次元画像検出装置を備えた電子内視鏡 |
JP2003032674A (ja) | 2001-07-16 | 2003-01-31 | Emaki:Kk | 管状物内壁のビデオ画像から連続した展開静止画像を自動生成するシステム |
WO2004096008A2 (en) * | 2003-05-01 | 2004-11-11 | Given Imaging Ltd. | Panoramic field of view imaging device |
JP2006187551A (ja) * | 2005-01-07 | 2006-07-20 | Olympus Corp | 食道粘膜用画像処理装置 |
Non-Patent Citations (2)
Title |
---|
SAKAI T.: "Chushiten o Koryo shita Chokan Tenkai Gazo no Tekioteki Hyojiho", INFORMATION PROCESSING SOCIETY OF JAPAN KENKYU HOKOKU, vol. 2006, no. 51, 18 May 2006 (2006-05-18) - 19 May 2006 (2006-05-19), pages 167 - 172, XP003019622 * |
See also references of EP2030558A4 |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2009128055A1 (en) * | 2008-04-15 | 2009-10-22 | Provost Fellows And Scholars Of The College Of The Holy And Undivided Trinity Of Queen Elizabeth Near Dublin | Endoscopy system with motion sensors |
JP2010092283A (ja) * | 2008-10-08 | 2010-04-22 | Chiba Univ | 立体画像作成装置及び方法並びに内視鏡検査システム |
JP2010256988A (ja) * | 2009-04-21 | 2010-11-11 | Chiba Univ | 3次元画像生成装置、3次元画像生成方法、及びプログラム |
WO2012098792A1 (ja) * | 2011-01-20 | 2012-07-26 | オリンパスメディカルシステムズ株式会社 | 画像処理装置、画像処理方法、画像処理プログラムおよび内視鏡システム |
JP5341257B2 (ja) * | 2011-01-20 | 2013-11-13 | オリンパスメディカルシステムズ株式会社 | 画像処理装置、画像処理装置の作動方法、画像処理プログラムおよび内視鏡システム |
US8692869B2 (en) | 2011-01-20 | 2014-04-08 | Olympus Medical Systems Corp. | Image processing device, image processing method, machine readable recording medium, endoscope system |
JPWO2012098792A1 (ja) * | 2011-01-20 | 2014-06-09 | オリンパスメディカルシステムズ株式会社 | 画像処理装置、画像処理装置の作動方法、画像処理プログラムおよび内視鏡システム |
JP2014096020A (ja) * | 2012-11-09 | 2014-05-22 | Honda Motor Co Ltd | 画像処理方法、画像処理装置、表面検査システム及びプログラム |
JP2016043033A (ja) * | 2014-08-22 | 2016-04-04 | 富士フイルム株式会社 | 展開画像生成装置、方法、及びプログラム、並びに画像情報記憶装置 |
WO2017158896A1 (ja) * | 2016-03-15 | 2017-09-21 | オリンパス株式会社 | 画像処理装置、画像処理システム、画像処理装置の作動方法 |
JPWO2017158896A1 (ja) * | 2016-03-15 | 2018-03-29 | オリンパス株式会社 | 画像処理装置、画像処理システム、画像処理装置の作動方法 |
US10354436B2 (en) | 2016-03-15 | 2019-07-16 | Olympus Corporation | Image processing apparatus, image processing system and image processing apparatus operation method |
WO2017212725A1 (ja) * | 2016-06-07 | 2017-12-14 | オリンパス株式会社 | 医療用観察システム |
JPWO2017212725A1 (ja) * | 2016-06-07 | 2018-06-28 | オリンパス株式会社 | 医療用観察システム |
JP2019150574A (ja) * | 2018-03-05 | 2019-09-12 | リオン株式会社 | 3次元形状データ作成方法、及び3次元形状データ作成システム |
JP7236689B2 (ja) | 2018-03-05 | 2023-03-10 | リオン株式会社 | 3次元形状データ作成システムの作動方法、及び3次元形状データ作成システム |
JP2021067905A (ja) * | 2019-10-28 | 2021-04-30 | 一般財団法人電力中央研究所 | 中空電柱の内周面撮影装置および非破壊検査装置、並びに中空電柱の内周面撮影方法および非破壊検査方法 |
JP7254428B2 (ja) | 2019-10-28 | 2023-04-10 | 一般財団法人電力中央研究所 | 中空電柱の非破壊検査装置および中空電柱の非破壊検査方法 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2007139187A1 (ja) | 2009-10-15 |
JP4753104B2 (ja) | 2011-08-24 |
US20090207241A1 (en) | 2009-08-20 |
US9247865B2 (en) | 2016-02-02 |
EP2030558A4 (en) | 2010-08-04 |
EP2030558B1 (en) | 2017-05-03 |
EP2030558A1 (en) | 2009-03-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4753104B2 (ja) | 3次元画像構築装置及び方法並びにプログラム | |
US8353816B2 (en) | Endoscopy system and method therefor | |
CN102247114B (zh) | 图像处理装置及图像处理方法 | |
JP4468544B2 (ja) | 内視鏡装置 | |
Seibel et al. | Tethered capsule endoscopy, a low-cost and high-performance alternative technology for the screening of esophageal cancer and Barrett's esophagus | |
KR100890102B1 (ko) | 의료 화상 처리 장치 및 의료 화상 처리 방법 | |
US7775977B2 (en) | Ultrasonic tomographic diagnostic apparatus | |
JP5199594B2 (ja) | 画像計測装置および方法 | |
JP2017508529A (ja) | 内視鏡測定システム及び方法 | |
JP6254053B2 (ja) | 内視鏡画像診断支援装置、システムおよびプログラム、並びに内視鏡画像診断支援装置の作動方法 | |
JP2004358096A (ja) | 超音波内視鏡装置 | |
JPWO2005046462A1 (ja) | 内視鏡装置及びこれを用いた撮影方法 | |
JP4287646B2 (ja) | 画像読取装置 | |
JPH05108819A (ja) | 画像処理装置 | |
JPH03102202A (ja) | 撮像手段による対象部分の検査方法 | |
JP5113990B2 (ja) | 計測用内視鏡装置 | |
JP2010511432A (ja) | 多重集光器を利用した走査ビーム撮像装置および内視鏡 | |
KR101117026B1 (ko) | 서로 다른 영상간에 영상 정합을 수행하는 영상 정합 시스템 및 방법 | |
JPWO2017006708A1 (ja) | 医療装置、医療画像生成方法及び医療画像生成プログラム | |
WO2021171465A1 (ja) | 内視鏡システム及び内視鏡システムによる管腔走査方法 | |
KR101818726B1 (ko) | 내시경 장치 및 이것의 제어 방법 | |
JP7148534B2 (ja) | 画像処理装置、プログラム、及び内視鏡システム | |
JP5210991B2 (ja) | 校正方法および装置 | |
JPH11113912A (ja) | 超音波画像診断装置 | |
WO2022230563A1 (ja) | 内視鏡システム及びその作動方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
DPE2 | Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101) | ||
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 07744488 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 12302480 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2008517981 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REEP | Request for entry into the european phase |
Ref document number: 2007744488 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2007744488 Country of ref document: EP |