WO2016043063A1 - Image processing apparatus and image processing method - Google Patents
- Publication number
- WO2016043063A1 (application PCT/JP2015/075161, JP2015075161W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- view
- direct
- image processing
- processing apparatus
Classifications
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/00045—Operational features of endoscopes provided with output arrangements: display arrangement
- A61B1/00188—Optical arrangements with focusing or zooming features
- A61B1/04—Endoscopes combined with photographic or television appliances
- A61B1/045—Control thereof
- A61B1/051—Details of CCD assembly (image sensor in the distal end portion)
- H04N13/156—Mixing image signals (processing of stereoscopic or multi-view image signals)
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras
- H04N23/555—Constructional details for picking-up images in sites inaccessible due to their dimensions or hazardous conditions, e.g. endoscopes or borescopes
Definitions
- the present disclosure relates to an image processing apparatus and an image processing method, and more particularly, to an image processing apparatus and an image processing method that enable an image obtained by a direct-view camera and a side-view camera to be presented in an easy-to-understand manner.
- Endoscopic probes used for endoscopic surgery have been proposed that include, in addition to a direct-view camera that captures the direct-view direction (the tip direction of the probe), a side-view camera that captures the side-view direction (the side of the probe) (see, for example, Patent Documents 1 and 2).
- By providing the side-view camera, the field of view can be expanded beyond the operative field in the direct-view direction, but the images obtained by the direct-view camera and the side-view camera must be presented to the operator in an easy-to-understand manner.
- the present disclosure has been made in view of such a situation, and makes it possible to present images obtained by a direct-view camera and a side-view camera in an easy-to-understand manner.
- An image processing apparatus according to one aspect of the present disclosure includes an image combining unit that generates a composite image in which a direct-view image, obtained by imaging a subject in the direct-view direction (the tip direction of the probe), is arranged in a circular area, and a side-view image, obtained by imaging a subject in the side-view direction (the side of the probe), is arranged in a fan shape along the outer periphery of the circle.
- In an image processing method according to one aspect of the present disclosure, the image processing apparatus generates a composite image in which a direct-view image, obtained by imaging a subject in the direct-view direction (the tip direction of the probe), is arranged in a circular area, and a side-view image, obtained by imaging a subject in the side-view direction (the side of the probe), is arranged in a fan shape along the outer periphery of the circle.
- In one aspect of the present disclosure, a direct-view image obtained by imaging a subject in the direct-view direction (the tip direction of the probe) is arranged in a circular area, and a composite image is generated in which a side-view image obtained by imaging a subject in the side-view direction (the side of the probe) is arranged in a fan shape along the outer periphery of the circle.
- the image processing apparatus may be an independent apparatus or an internal block constituting one apparatus.
- According to one aspect of the present disclosure, images obtained by a direct-view camera and a side-view camera can be presented in an easy-to-understand manner.
- FIG. 1 is a block diagram showing a configuration example of an embodiment of an endoscope system according to the present disclosure. FIG. 2 is a view showing the distal end portion of the endoscope probe.
- FIG. 1 is a block diagram showing a configuration example of an embodiment of an endoscope system according to the present disclosure.
- the endoscope system 1 of FIG. 1 includes an endoscope probe 11, an image processing device 12, and a display 13.
- The endoscope system 1 is used in endoscopic surgery, in which a portion of the body to be operated on (the operative site) is imaged as the observation target and displayed on the display 13, and the operative site is treated while viewing the display 13.
- the endoscope probe 11 is inserted into the patient's body, emits light to the operation site, captures an image of the operation site, and supplies the image to the image processing apparatus 12.
- The image processing device 12 processes an image captured by the endoscope probe 11 so that it can be easily viewed by the operator.
- the display 13 displays the processed image supplied from the image processing device 12 and presents it to the operator.
- the endoscope probe 11 includes a direct-view camera 21, side-view cameras 22A and 22B, a zoom drive unit 23, and an illumination unit 24.
- the image processing apparatus 12 includes a direct-view camera data acquisition unit 31, side-view camera data acquisition units 32A and 32B, an image combining unit 33, a display control unit 34, an operation unit 35, and a setting unit 36.
- the direct-view camera 21 captures an object in the direct-view direction, which is the tip direction of the endoscope probe 11, and generates an image obtained as a result.
- the side-viewing cameras 22A and 22B capture an object in the side-viewing direction that is the side of the endoscope probe 11, and generate an image obtained as a result.
- the direct-view camera 21 and the side-view cameras 22A and 22B are configured by, for example, an image sensor such as a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) sensor.
- FIG. 2 is a view showing a distal end portion of the endoscope probe 11 provided with the direct-view camera 21 and the side-view cameras 22A and 22B.
- The direct-view camera 21 is configured as a stereo camera consisting of a first camera 21A and a second camera 21B, and is attached to the tip of the endoscope probe 11 so that the direct-view direction of the endoscope probe 11 is its imaging direction.
- The side-view cameras 22A and 22B are attached at positions symmetrical with respect to the axial center of the endoscope probe 11 so that their imaging ranges uniformly cover the cylindrical side surface within a predetermined distance from the tip of the endoscope probe 11.
- The direct-view camera 21 captures an image of the operative site, while the side-view cameras 22A and 22B capture the peripheral area of the operative site as auxiliary information. Therefore, for example, a high-resolution image sensor such as a so-called 4K camera, with a pixel count of about 4000 (horizontal) × about 2000 (vertical), is adopted as the direct-view camera 21, and a lower-resolution image sensor with a pixel count of about 2000 × about 1000 is adopted for each of the side-view cameras 22A and 22B. In other words, image sensors with different resolutions can be used.
- The direct-view camera 21 and the side-view cameras 22A and 22B are of the chip-on-tip type, in which the image sensor is attached to the tip of the endoscope probe 11. Alternatively, the image sensor may be disposed in a CCU (camera control unit) or the like at the root of the endoscope probe 11, with the light captured at the tip of the endoscope probe 11 transmitted to the image sensor through an optical fiber or the like.
- The zoom drive unit 23 drives the optical lens of the imaging optical system based on the control signal supplied from the operation unit 35 of the image processing apparatus 12. Thereby, the focus and zoom magnification of the image of the direct-view camera 21 displayed on the display 13 are changed.
- In the present embodiment, the zoom magnifications of the side-view cameras 22A and 22B are fixed, but they may be made changeable in the same way as that of the direct-view camera 21.
- the illumination unit 24 includes, for example, a halogen lamp, a xenon lamp, an LED (Light Emitting Diode) light source, and the like, and emits light for illuminating the surgical site.
- The illumination unit 24 may be configured by arranging LED light sources in the vicinity of the direct-view camera 21 and the side-view cameras 22A and 22B in FIG. 2. Alternatively, only a light-emitting part may be provided in the vicinity of each of the direct-view camera 21 and the side-view cameras 22A and 22B, with light from a light source unit such as a halogen lamp or xenon lamp transmitted to the light-emitting part through an optical fiber or the like.
- The on/off state and light amount of the illumination unit 24 are controlled by a control signal from the operation unit 35.
- the direct-view camera data acquisition unit 31 acquires imaging data obtained by the direct-view camera 21 and supplies the imaging data to the image combining unit 33.
- the side-viewing camera data acquisition unit 32A acquires the imaging data obtained by the side-viewing camera 22A, and supplies the imaging data to the image combining unit 33.
- The side-view camera data acquisition unit 32B acquires imaging data obtained by the side-view camera 22B and supplies the imaging data to the image combining unit 33.
- The image combining unit 33 generates a combined image in which the captured images obtained by the direct-view camera 21, the side-view camera 22A, and the side-view camera 22B are arranged at predetermined positions, and supplies the combined image to the display control unit 34. Parameters are supplied from the setting unit 36 to the image combining unit 33, and the image combining unit 33 generates the combined image according to those parameters. Details of the combined image generated by the image combining unit 33 will be described later.
- the display control unit 34 displays the synthesized image on the display 13 by converting the image data of the synthesized image supplied from the image synthesizing unit 33 into an image signal corresponding to the input format of the display 13 and outputting the image signal to the display 13.
- the display 13 is configured by, for example, an LCD (Liquid Crystal Display) or the like, and enables 2D display and 3D display corresponding to a stereo camera.
- the display 13 can be a display capable of only 2D display.
- the display 13 may be a head mounted display or the like.
- The operation unit 35 receives operations by the operator (user) on the endoscope probe 11, such as setting the zoom magnification and illumination light amount, and outputs control signals to the zoom drive unit 23 and the illumination unit 24 according to the received operation content.
- the operation unit 35 also receives input of parameters for generating a composite image, and outputs the input parameters to the setting unit 36.
- the setting unit 36 acquires various parameters supplied from the operation unit 35 and stores the parameters in an internal memory. Further, the setting unit 36 supplies various parameters stored in the memory to the image combining unit 33 as needed.
- the endoscope system 1 is configured as described above.
- FIG. 3 shows an example of a combined image generated by the image combining unit 33 and displayed on the display 13.
- the composite image 50 includes a first circle C1 and a second circle C2 having a larger diameter.
- the direct-view image display area 51 is provided inside the first circle C1.
- the direct-view image display area 51 is an area in which an image captured by the direct-view camera 21 is displayed.
- an image captured by the direct-view camera 21 is also referred to as a direct-view image.
- a side-view A camera display area 52A and a side-view B camera display area 52B are provided in an area between the first circle C1 and the second circle C2.
- The side-view A camera display area 52A is an area in which the image captured by the side-view camera 22A is displayed, and the side-view B camera display area 52B is an area in which the image captured by the side-view camera 22B is displayed.
- Each of the side view A camera display area 52A and the side view B camera display area 52B has a fan shape.
- the side view A camera display area 52A and the side view B camera display area 52B are disposed at symmetrical positions corresponding to the positions of the side view cameras 22A and 22B.
- an image captured by the side-viewing camera 22A or 22B is also referred to as a side-viewing image.
- the side-viewing cameras 22A and 22B are also simply referred to as side-viewing cameras 22.
- For the process of converting the rectangular side-view image obtained by the side-view camera 22 into a fan shape, the scan conversion processing used in the B-mode display algorithm of an ultrasonic diagnostic apparatus with a convex probe can be adopted.
- The scan conversion process converts image data represented along scanning lines into image data represented in an orthogonal coordinate system. Since the data become sparser toward the outside of the fan shape, they are appropriately interpolated using linear interpolation or the like.
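As a rough illustration of this scan conversion, the following sketch maps a rectangular side-view image onto an annular fan sector by inverse-mapping each output pixel to a source row and column. It uses nearest-neighbour sampling for brevity, whereas the text above suggests linear interpolation where the fan becomes sparse; all function and parameter names are hypothetical, not taken from the patent.

```python
import math

def rect_to_fan(src, r_in, r_out, theta0, span):
    """Map a rectangular side-view image (list of rows) onto a fan-shaped
    annular sector. Rows of src run along the fan radius, columns along
    the fan angle. Pixels outside the sector are left as None."""
    h, w = len(src), len(src[0])
    size = 2 * r_out + 1                    # square canvas holding the annulus
    out = [[None] * size for _ in range(size)]
    for y in range(size):
        for x in range(size):
            dx, dy = x - r_out, y - r_out   # offset from the canvas centre
            r = math.hypot(dx, dy)
            if not (r_in <= r <= r_out):
                continue                     # outside the annular band
            # angular offset into the sector, wrapped to [0, 2*pi)
            d = (math.atan2(dy, dx) - theta0) % (2 * math.pi)
            if d > span:
                continue                     # outside the sector's arc
            # inverse map: radius -> source row, angle -> source column
            row = min(h - 1, int((r - r_in) / (r_out - r_in) * h))
            col = min(w - 1, int(d / span * w))
            out[y][x] = src[row][col]
    return out
```

Here `r_in` and `r_out` play the role of the first and second circle radii, while `theta0` and `span` correspond to the side-view camera's mounting angle and view angle.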
- The non-display area 53, i.e., the portion of the annular area other than the side-view A camera display area 52A and the side-view B camera display area 52B, is an area in which no image captured by the side-view cameras 22 is displayed.
- The non-display area 53 is rendered, for example, in gray, so that the operator can clearly recognize that no image is displayed there.
- In the non-display area 53, parameter information such as the zoom magnification is displayed.
- As described above, the image processing apparatus 12 generates a composite image in which the direct-view image captured by the direct-view camera 21 is arranged in the circular area of the inner first circle C1 and the side-view images captured by the side-view cameras 22A and 22B are arranged in fan shapes along the outer periphery of the first circle C1, and displays it on the display 13.
- The display 13 used for surgery tends to have high definition and a large screen.
- the display area of the display 13 can be effectively used by presenting the side view image as additional information on the circumference of the direct view image as described above.
- FIG. 4A is a perspective view of the tip portion of the endoscope probe 11 as viewed obliquely from below
- FIG. 4B is a view of the tip portion of the endoscope probe 11 as viewed from directly below.
- When a predetermined position (direction) of the direct-view camera 21 is taken as a reference position (reference direction), the position (direction) at which the side-view camera 22A is disposed is uniquely determined at an angle α relative to the reference position of the direct-view camera 21, as shown in A of FIG. 4. The view angle β of the side-view camera 22A shown in B of FIG. 4 is likewise uniquely determined by the lens design.
- Accordingly, the side-view A camera display area 52A in the composite image is placed at a predetermined position in the area between the first circle C1 and the second circle C2, as shown in FIG. 5. That is, the center position of the side-view A camera display area 52A in the arc direction is at an angle α with respect to the reference position of the direct-view image, and the sectorial angle of the side-view A camera display area 52A is β. The same applies to the positional relationship between the side-view B camera display area 52B and the direct-view image display area 51.
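The placement rule above, a sector centred at the camera's mounting angle α with angular width equal to its view angle β, can be sketched as follows. This is a hypothetical helper, with angles in degrees, not code from the patent.

```python
def sector_bounds(alpha_deg, beta_deg):
    """Return the (start, end) angles in degrees of a side-view display
    sector whose centre sits at the camera's mounting angle alpha_deg
    and whose angular width equals the camera's view angle beta_deg."""
    half = beta_deg / 2.0
    # wrap into [0, 360) so the sector can straddle the 0-degree mark
    return ((alpha_deg - half) % 360.0, (alpha_deg + half) % 360.0)
```

For a camera mounted at α = 90° with a view angle β = 60°, the sector spans 60° to 120° around the direct-view circle.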
- For example, a notch indicating the reference position (reference direction) of the direct-view camera 21 may be provided at the tip portion of the endoscope probe 11 so that the notch appears in the direct-view image, and the angle α can be determined from the notch in the direct-view image.
- the space in the body into which the endoscope probe 11 is inserted is regarded as the space inside the cylindrical shape 61 as shown in A of FIG.
- The letter “F” is written on the bottom of the cylindrical shape 61, and the letters “A” and “B” are written on facing portions of the inner side surface (inner wall) of the cylindrical shape 61.
- In this case, the image combining unit 33 of the image processing device 12 generates a combined image 50 as shown in C of FIG. That is, the letter “F” appears in the direct-view image display area 51 of the composite image 50, the letter “A” appears in the side-view A camera display area 52A, and the letter “B” appears in the side-view B camera display area 52B.
- The display of the composite image 50 on the display 13 can also be rotated in accordance with the movement of the endoscope probe 11.
- the display of the composite image 50 may be fixed.
- the 3D image is a 2D image for performing 3D display, and is configured of a set of an image for the left eye and an image for the right eye.
- Parallax is set between the image for the left eye and the image for the right eye, and by displaying them alternately, the person viewing the displayed image can perceive a stereoscopic effect (a sense of depth).
- For the side-view A camera display area 52A and the side-view B camera display area 52B as well, the image combining unit 33 may generate a left-eye image and a right-eye image with parallax from the image obtained by the corresponding monocular side-view camera 22, so that a 3D image is displayed.
- For example, the parallax images can be generated such that positions (coordinate points) in the side-view A camera display area 52A and the side-view B camera display area 52B that are farther from the direct-view image display area 51 are given less sense of depth, in other words, appear more toward the front.
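A minimal sketch of such radius-dependent parallax, assuming a simple linear ramp; the text does not specify the exact function, and the names and scale conventions here are illustrative only.

```python
def disparity_at(r, r_inner, r_outer, d_max):
    """Toy disparity ramp for a fan-shaped side-view area: the maximum
    disparity d_max is assigned at the inner edge r_inner (adjacent to
    the direct-view circle) and falls linearly to zero at the outer
    edge r_outer, so points farther from the direct-view image carry
    less depth cue. The linear form is an assumption, not the patent's."""
    t = (r - r_inner) / (r_outer - r_inner)   # 0 at inner edge, 1 at outer
    t = min(1.0, max(0.0, t))                 # clamp outside the band
    return d_max * (1.0 - t)
```

The left-eye and right-eye fan images would then be built by shifting each pixel horizontally by ±disparity/2 at its radius.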
- With a display method in which rectangular direct-view and side-view images are arranged side by side, or one in which a monitor displaying the direct-view image and a monitor displaying the side-view images are placed side by side, the operator must move the viewpoint greatly to see each image.
- With the display method of the present disclosure, it is not necessary to move the viewpoint, and the positional relationship between the direct-view image and the side-view images can be grasped instantaneously.
- As described above, the operator can easily and instantaneously determine the positional relationship between the direct-view image and the side-view images, and can grasp not only the operative site in the direct-view direction but also the operative field around it. Therefore, the images obtained by the direct-view camera 21 and the side-view cameras 22 can be presented in an easy-to-understand manner.
- In addition, since the operative field in the periphery other than the direct-view direction can be easily grasped, it is possible to prevent, for example, the endoscope probe 11 or forceps from coming into contact with an organ in an area outside the direct-view direction.
- As parameters, the diameter P1 of the first circle C1 of the composite image 50 and the diameter P2 of the second circle C2 can be set, as shown in FIG. 8.
- the height of the fan-shaped side view image is determined by the relationship between the diameter P1 and the diameter P2.
- The diameter P1 of the first circle C1 and the diameter P2 of the second circle C2 set here may be fixed values, or they may be treated as default values in the initial state, with the image combining unit 33 performing control to change them according to a zoom operation on the direct-view image.
- FIG. 9 shows an example of control for changing the diameter P1 of the first circle C1 of the composite image according to the zoom operation.
- For example, when a zoom-in operation is performed, the image combining unit 33 performs control to change the diameter of the first circle C1, in which the direct-view image is displayed, to a value P1' larger than the default value P1.
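A hedged sketch of this zoom-driven control, assuming the inner diameter P1 simply scales with the zoom factor and is clamped so it never exceeds the outer diameter P2; the exact rule is not given in the text.

```python
def inner_diameter(p1_default, zoom, p2):
    """Return the adjusted diameter P1' of the direct-view circle.
    Assumption: P1 scales linearly with the zoom factor (zoom = 1.0
    gives the default), clamped to the outer circle diameter P2 so the
    fan-shaped side-view band is never fully consumed."""
    p1_adjusted = p1_default * zoom
    return min(p1_adjusted, p2)
```

For instance, zooming in by 2x would grow a default P1 of 100 to 200, as long as that stays within P2.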
- the position on the display 13 where the composite image 50 is displayed can also be set by parameters such as the right side, the center, and the left side.
- In addition, it is possible to set display modes such as one in which the side-view A camera display area 52A and the side-view B camera display area 52B, which display the side-view images, protrude from the display area of the display 13, and one in which, as shown in FIG. 14, the height of the side-view images is adjusted so that the side-view A camera display area 52A and the side-view B camera display area 52B do not protrude from the display area of the display 13.
- In step S1, the image processing device 12 acquires a direct-view image and side-view images.
- the direct-view camera data acquisition unit 31 acquires the direct-view image supplied from the direct-view camera 21 and supplies the direct-view image to the image combining unit 33.
- The side-view camera data acquisition unit 32A acquires the side-view image supplied from the side-view camera 22A, and the side-view camera data acquisition unit 32B acquires the side-view image supplied from the side-view camera 22B; each is supplied to the image combining unit 33.
- In step S2, using the acquired direct-view image and side-view images, the image combining unit 33 generates a composite image in which the direct-view image is arranged inside the first circle C1 and the side-view images captured by the side-view cameras 22A and 22B are arranged in fan shapes on the outer circumference of the first circle C1, and supplies it to the display control unit 34.
- In step S3, the display control unit 34 converts the image data of the combined image supplied from the image combining unit 33 into an image signal corresponding to the input format of the display 13 and outputs it to the display 13, which displays the composite image.
- steps S1 to S3 described above are continuously performed while the direct-view image and the side-view image are supplied from the endoscope probe 11.
- the operator can simultaneously observe not only the operative part in the direct view direction but also a wide range including the side by viewing the composite image displayed by the image composition display process.
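The loop of steps S1 to S3 above can be sketched as follows; the probe, combiner, and display interfaces here are hypothetical stand-ins, not APIs from the patent.

```python
def composite_loop(probe_frames, combine, show):
    """Sketch of the image composition display process (steps S1-S3).
    probe_frames yields (direct, side_a, side_b) image triples while the
    endoscope probe supplies data; combine builds the composite image;
    show converts it for the display's input format and outputs it."""
    for direct, side_a, side_b in probe_frames:      # S1: acquire images
        composite = combine(direct, side_a, side_b)  # S2: build composite
        show(composite)                              # S3: convert and display
```

The loop naturally terminates when the probe stops supplying frames, matching the "performed continuously while images are supplied" behaviour described above.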
- FIG. 16 shows an example in which four side-view cameras 22A to 22D are attached to the side surface of the endoscope probe 11 so as to divide the side periphery equally into four directions for imaging.
- the mounting position of the side-viewing camera 22D is a position not visible in FIG. 16, so the side-viewing camera 22D is not shown in FIG.
- the side-view A camera display area 52A is provided in the area between the first circle C1 and the second circle C2, as shown in FIG.
- the side-viewing B camera display area 52B, the side-viewing C camera display area 52C, and the side-viewing D camera display area 52D are arranged in accordance with the imaging direction.
- the imaging range may overlap in the side-viewing images captured by the four side-viewing cameras 22A to 22D. In the area where the imaging ranges overlap, a plurality of side view images including the overlapping ranges are connected seamlessly using any stitching technique.
- the hatched areas indicate areas where imaging ranges overlap between adjacent side-view images.
- As the side-view image of an area where the imaging ranges overlap, for example, the two overlapping side-view images are combined at a synthesis ratio α to generate the image of the overlapping portion, and adjacent side-view images are thereby connected.
- Alternatively, priorities may be set in advance for the side-view images captured by the four side-view cameras 22A to 22D, and adjacent side-view images may be connected by adopting one of the overlapping side-view images according to the set priorities.
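A minimal sketch of combining two overlapping scanlines at a synthesis ratio α, as described above; this is an illustrative pure-Python blend, not the patent's implementation.

```python
def blend_overlap(line_a, line_b, alpha):
    """Blend two overlapping side-view scanlines pixel by pixel at the
    synthesis ratio alpha: alpha = 1.0 keeps line_a entirely, 0.0 keeps
    line_b, and intermediate values cross-fade the seam between
    adjacent side-view images."""
    return [alpha * pa + (1.0 - alpha) * pb for pa, pb in zip(line_a, line_b)]
```

Ramping α from 1.0 to 0.0 across the width of the overlap region gives a seamless transition; the priority-based alternative above corresponds to fixing α at 1.0 or 0.0 for the whole region.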
- ⁇ Modification 2> In the embodiment described above, an example has been described in which the side-viewing cameras 22 are equally disposed on the side surface of the endoscope probe 11 so that the entire periphery of the side of the endoscope probe 11 is imaged as much as possible.
- However, the side-view cameras 22A and 22B may instead be disposed adjacent to each other on only one side of the side periphery of the endoscope probe 11, so that the side-view cameras 22A and 22B function as a stereo camera.
- Since the two side-view cameras 22A and 22B are disposed along the longitudinal direction of the endoscope probe 11, a certain baseline distance between them can be secured, so distance information from stereo vision can be obtained accurately.
- Alternatively, the endoscope probe 11 may include a single side-view camera 22 together with a rotation mechanism, and a side-view image covering the entire side periphery of the endoscope probe 11 may be acquired and displayed by rotating the side-view camera 22 through 360 degrees.
- In this case, the endoscope probe 11 is equipped with an angle sensor capable of detecting the rotation angle of the side-view camera 22, and the angle sensor detects the rotation angle of the side-view camera 22 at the moment each side-view image is acquired. The display position of the side-view image in the composite image can then be determined based on the angle information at the moment the side-view image was acquired.
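How the recorded rotation angle maps to a display position is not spelled out; a minimal sketch, assuming the angle simply indexes a point on the circular outer periphery of the composite image, could look like this:

```python
import math

def side_view_anchor(angle_deg: float, center: tuple, radius: float) -> tuple:
    """Map the rotation angle reported by the angle sensor to the point on
    the circular outer periphery where the side-view image is drawn."""
    theta = math.radians(angle_deg)
    cx, cy = center
    return (cx + radius * math.cos(theta), cy + radius * math.sin(theta))
```

For a composite image centered at (100, 100) with an outer radius of 50, a 0-degree capture anchors at (150, 100) and a 90-degree capture at (100, 150).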
- The series of processes described above can be executed by hardware or by software.
- When the series of processes is executed by software, the program constituting the software is installed on a computer.
- Here, the computer includes a computer incorporated in dedicated hardware and, for example, a general-purpose personal computer capable of executing various functions by installing various programs.
- FIG. 20 is a block diagram showing an example of a hardware configuration of a computer that executes the series of processes described above according to a program.
- In the computer, a central processing unit (CPU) 101, a read-only memory (ROM) 102, and a random-access memory (RAM) 103 are connected to one another by a bus 104.
- an input / output interface 105 is connected to the bus 104.
- An input unit 106, an output unit 107, a storage unit 108, a communication unit 109, and a drive 110 are connected to the input / output interface 105.
- the input unit 106 includes a keyboard, a mouse, a microphone, and the like.
- the output unit 107 includes a display, a speaker, and the like.
- the storage unit 108 includes a hard disk, a non-volatile memory, and the like.
- the communication unit 109 includes a network interface and the like.
- the drive 110 drives a removable recording medium 111 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
- The CPU 101 loads the program stored in the storage unit 108 into the RAM 103 via the input/output interface 105 and the bus 104 and executes it, whereby the above-described series of processing, including the image composite display processing, is performed.
- the program can be installed in the storage unit 108 via the input / output interface 105 by mounting the removable recording medium 111 in the drive 110.
- the program can be received by the communication unit 109 via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting, and can be installed in the storage unit 108.
- the program can be installed in advance in the ROM 102 or the storage unit 108.
- The program executed by the computer may be a program in which processing is performed in chronological order according to the order described in this specification, or a program in which processing is performed in parallel or at necessary timing, such as when a call is made.
- In this specification, a system means a set of a plurality of components (apparatuses, modules (parts), etc.), and it does not matter whether all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
- For example, the present disclosure can adopt a cloud computing configuration in which one function is shared and processed jointly by a plurality of devices via a network.
- Each step described in the above-described flowchart can be executed by one device or shared among a plurality of devices.
- Furthermore, when one step includes a plurality of processes, those processes can be executed by one device or shared among a plurality of devices.
- the present disclosure can also have the following configurations.
- An image processing apparatus including an image combining unit configured to generate a composite image in which a direct-view image, obtained by imaging a subject in the direct-view direction, which is the tip direction of a probe, is arranged in a circular area, and a side-view image, obtained by imaging a subject in a side-view direction, which is lateral to the probe, is arranged in a fan shape along the outer periphery of the circular area.
- The image processing apparatus according to (2), in which the plurality of side-view images are arranged at positions on the outer periphery of the circular area corresponding to the positions of the plurality of imaging elements of the probe.
- The image processing apparatus according to (2) or (3), in which the plurality of side-view images including the areas where the imaging ranges overlap are connected by stitching.
- the image processing apparatus according to any one of (1) to (4), in which continuity of video is secured at the boundary between the direct-view image and the side-view image.
- The image processing apparatus according to any one of (1) to (5), in which the image combining unit performs control to change the diameter of the circular area in which the direct-view image is arranged according to a zoom operation on the direct-view image.
- the image processing apparatus according to any one of (1) to (6), wherein the direct-view image is a 3D image.
- the image processing apparatus according to any one of (1) to (9), wherein the direct-view image is an image of higher resolution than the side-view image.
- the image processing apparatus according to any one of (1) to (10), further including: a setting unit configured to set a display method of the composite image.
- An image processing method in which an image processing apparatus generates a composite image in which a direct-view image, obtained by imaging a subject in the direct-view direction, which is the tip direction of a probe, is arranged in a circular area, and a side-view image, obtained by imaging a subject in a side-view direction, which is lateral to the probe, is arranged in a fan shape along the outer periphery of the circular area.
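The circular-plus-fan layout described above can be sketched as a toy polar composition. This is only an illustrative sketch, not the patent's implementation: the side views are reduced to flat grey values, each camera is given an equal angular sector, and the direct view is sampled by nearest neighbour.

```python
import numpy as np

def compose(direct: np.ndarray, sides: list, r_in: int, r_out: int) -> np.ndarray:
    """Place the direct-view image inside a circle of radius r_in and the
    side views as fan-shaped sectors between r_in and r_out.
    `sides` is a list of flat grey values, one per side-view camera."""
    size = 2 * r_out
    canvas = np.zeros((size, size), dtype=direct.dtype)
    yy, xx = np.mgrid[0:size, 0:size]
    dx, dy = xx - r_out, yy - r_out
    r = np.hypot(dx, dy)
    ang = (np.degrees(np.arctan2(dy, dx)) + 360.0) % 360.0
    n = len(sides)
    for i, side in enumerate(sides):
        # each camera fills an equal angular sector of the outer ring
        sector = (r >= r_in) & (r < r_out) & \
                 (ang >= i * 360.0 / n) & (ang < (i + 1) * 360.0 / n)
        canvas[sector] = side
    # nearest-neighbour sample the direct view into the inner circle
    inner = r < r_in
    h, w = direct.shape
    sy = np.clip((dy + r_in) * h // (2 * r_in), 0, h - 1).astype(int)
    sx = np.clip((dx + r_in) * w // (2 * r_in), 0, w - 1).astype(int)
    canvas[inner] = direct[sy[inner], sx[inner]]
    return canvas
```

With a uniform grey-50 direct view and four side patches, the canvas center carries the direct view while points on the ring pick up the sector of the corresponding camera.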
Description
FIG. 1 is a block diagram showing a configuration example of an embodiment of an endoscope system according to the present disclosure.
FIG. 3 shows an example of a composite image generated by the image combining unit 33 and displayed on the display 13.
The parameters of the composite image that can be set by the setting unit 36 will be described with reference to FIGS. 8 to 14.
The image composite display processing performed by the image processing apparatus 12 will be described with reference to the flowchart of FIG. 15.
In the example described above, the number of side-view cameras 22 attached to the side surface of the cylindrical endoscope probe 11 is two, but the number of side-view cameras 22 may be one, or three or more.
In the embodiment described above, an example has been described in which the side-view cameras 22 are arranged evenly on the side surface of the endoscope probe 11 so that as much as possible of the entire lateral periphery of the endoscope probe 11 is imaged.
As shown in FIG. 19, the endoscope probe 11 may include one side-view camera 22 together with a rotation mechanism, and a side-view image capturing the entire lateral periphery of the endoscope probe 11 may be acquired and displayed by rotating the side-view camera 22 through 360 degrees. In this case, the endoscope probe 11 is equipped with an angle sensor capable of detecting the rotation angle of the side-view camera 22, and the angle sensor detects the rotation angle of the side-view camera 22 at the moment each side-view image is acquired. The display position of the side-view image in the composite image can then be determined based on the angle information at the moment the side-view image was acquired.
The series of processes described above can be executed by hardware or by software. When the series of processes is executed by software, the program constituting the software is installed on a computer. Here, the computer includes a computer incorporated in dedicated hardware and, for example, a general-purpose personal computer capable of executing various functions by installing various programs.
(1)
An image processing apparatus including an image combining unit configured to generate a composite image in which a direct-view image obtained by imaging a subject in a direct-view direction, which is the tip direction of a probe, is arranged in a circular area, and a side-view image obtained by imaging a subject in a side-view direction, which is lateral to the probe, is arranged in a fan shape along the outer periphery of the circular area.
(2)
The image processing apparatus according to (1), in which a plurality of the side-view images obtained by a plurality of imaging elements are arranged in the composite image.
(3)
The image processing apparatus according to (2), in which the plurality of side-view images are arranged at positions on the outer periphery of the circular area corresponding to the positions of the plurality of imaging elements of the probe.
(4)
The image processing apparatus according to (2) or (3), in which the plurality of side-view images including areas where imaging ranges overlap are connected by stitching.
(5)
The image processing apparatus according to any one of (1) to (4), in which continuity of video is secured at the boundary between the direct-view image and the side-view image.
(6)
The image processing apparatus according to any one of (1) to (5), in which the image combining unit performs control to change the diameter of the circular area in which the direct-view image is arranged according to a zoom operation on the direct-view image.
(7)
The image processing apparatus according to any one of (1) to (6), in which the direct-view image is a 3D image.
(8)
The image processing apparatus according to any one of (1) to (7), in which the side-view image is a 3D image.
(9)
The image processing apparatus according to (8), in which the side-view image is an image that appears closer to the viewer toward the outer side of the fan shape.
(10)
The image processing apparatus according to any one of (1) to (9), in which the direct-view image has a higher resolution than the side-view image.
(11)
The image processing apparatus according to any one of (1) to (10), further including a setting unit configured to set a display method of the composite image.
(12)
The image processing apparatus according to (11), in which the setting unit sets the width of the fan-shaped side-view image.
(13)
The image processing apparatus according to (11) or (12), in which the setting unit sets the diameter of the circular area.
(14)
The image processing apparatus according to any one of (11) to (13), in which the setting unit sets the height of the fan-shaped side-view image.
(15)
The image processing apparatus according to any one of (11) to (14), in which the setting unit sets the display position of the composite image within the screen.
(16)
An image processing method in which an image processing apparatus generates a composite image in which a direct-view image obtained by imaging a subject in a direct-view direction, which is the tip direction of a probe, is arranged in a circular area, and a side-view image obtained by imaging a subject in a side-view direction, which is lateral to the probe, is arranged in a fan shape along the outer periphery of the circular area.
Claims (16)
- An image processing apparatus comprising an image combining unit configured to generate a composite image in which a direct-view image obtained by imaging a subject in a direct-view direction, which is the tip direction of a probe, is arranged in a circular area, and a side-view image obtained by imaging a subject in a side-view direction, which is lateral to the probe, is arranged in a fan shape along the outer periphery of the circular area.
- The image processing apparatus according to claim 1, wherein a plurality of the side-view images obtained by a plurality of imaging elements are arranged in the composite image.
- The image processing apparatus according to claim 2, wherein the plurality of side-view images are arranged at positions on the outer periphery of the circular area corresponding to the positions of the plurality of imaging elements of the probe.
- The image processing apparatus according to claim 2, wherein the plurality of side-view images including areas where imaging ranges overlap are connected by stitching.
- The image processing apparatus according to claim 1, wherein continuity of video is secured at the boundary between the direct-view image and the side-view image.
- The image processing apparatus according to claim 1, wherein the image combining unit performs control to change the diameter of the circular area in which the direct-view image is arranged according to a zoom operation on the direct-view image.
- The image processing apparatus according to claim 1, wherein the direct-view image is a 3D image.
- The image processing apparatus according to claim 1, wherein the side-view image is a 3D image.
- The image processing apparatus according to claim 8, wherein the side-view image is an image that appears closer to the viewer toward the outer side of the fan shape.
- The image processing apparatus according to claim 1, wherein the direct-view image has a higher resolution than the side-view image.
- The image processing apparatus according to claim 1, further comprising a setting unit configured to set a display method of the composite image.
- The image processing apparatus according to claim 11, wherein the setting unit sets the width of the fan-shaped side-view image.
- The image processing apparatus according to claim 11, wherein the setting unit sets the diameter of the circular area.
- The image processing apparatus according to claim 11, wherein the setting unit sets the height of the fan-shaped side-view image.
- The image processing apparatus according to claim 11, wherein the setting unit sets the display position of the composite image within the screen.
- An image processing method in which an image processing apparatus generates a composite image in which a direct-view image obtained by imaging a subject in a direct-view direction, which is the tip direction of a probe, is arranged in a circular area, and a side-view image obtained by imaging a subject in a side-view direction, which is lateral to the probe, is arranged in a fan shape along the outer periphery of the circular area.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/510,124 US10701339B2 (en) | 2014-09-18 | 2015-09-04 | Image processing device and image processing method |
JP2016548833A JPWO2016043063A1 (ja) | 2014-09-18 | 2015-09-04 | 画像処理装置および画像処理方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014190164 | 2014-09-18 | ||
JP2014-190164 | 2014-09-18 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016043063A1 true WO2016043063A1 (ja) | 2016-03-24 |
Family
ID=55533103
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/075161 WO2016043063A1 (ja) | 2014-09-18 | 2015-09-04 | 画像処理装置および画像処理方法 |
Country Status (3)
Country | Link |
---|---|
US (1) | US10701339B2 (ja) |
JP (1) | JPWO2016043063A1 (ja) |
WO (1) | WO2016043063A1 (ja) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170071456A1 (en) * | 2015-06-10 | 2017-03-16 | Nitesh Ratnakar | Novel 360-degree panoramic view formed for endoscope adapted thereto with multiple cameras, and applications thereof to reduce polyp miss rate and facilitate targeted polyp removal |
US10454897B1 (en) | 2016-01-21 | 2019-10-22 | Amazon Technologies, Inc. | Proxy captive portal traffic for input-limited devices |
US10292570B2 (en) * | 2016-03-14 | 2019-05-21 | Endochoice, Inc. | System and method for guiding and tracking a region of interest using an endoscope |
WO2019051019A1 (en) * | 2017-09-08 | 2019-03-14 | Covidien Lp | FUNCTIONAL IMAGING OF OPERATIVE FIELD WITH AN AUXILIARY CAMERA FOLLOWED |
US11310481B2 (en) * | 2017-10-26 | 2022-04-19 | Sony Corporation | Imaging device, system, method and program for converting a first image into a plurality of second images |
WO2019104329A1 (en) * | 2017-11-27 | 2019-05-31 | Optecks, Llc | Medical three-dimensional (3d) scanning and mapping system |
EP3737276A4 (en) * | 2018-01-10 | 2021-10-20 | ChemImage Corporation | MODULATION OF TIME-CORRELATED SOURCES FOR ENDOSCOPY |
US11224330B2 (en) * | 2018-01-28 | 2022-01-18 | Surgical Ltd 270. | Medical imaging device with camera magnification management system |
EP3686610A1 (en) | 2019-01-24 | 2020-07-29 | Rohde & Schwarz GmbH & Co. KG | Probe, measuring system and method for applying a probe |
US20210113765A1 (en) * | 2019-10-17 | 2021-04-22 | National Guard Health Affairs | Smart device and system for treatment of gastric reflux |
US20240127931A1 (en) * | 2020-02-21 | 2024-04-18 | Raytrx, Llc | Surgery visualization theatre |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010279539A (ja) * | 2009-06-04 | 2010-12-16 | Fujifilm Corp | 診断支援装置および方法並びにプログラム。 |
JP2013066648A (ja) * | 2011-09-26 | 2013-04-18 | Olympus Corp | 内視鏡用画像処理装置及び内視鏡装置 |
JP2013066646A (ja) * | 2011-09-26 | 2013-04-18 | Olympus Medical Systems Corp | 内視鏡用画像処理装置、内視鏡装置及び画像処理方法 |
WO2014088076A1 (ja) * | 2012-12-05 | 2014-06-12 | オリンパスメディカルシステムズ株式会社 | 内視鏡装置 |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH04341232A (ja) * | 1991-03-11 | 1992-11-27 | Olympus Optical Co Ltd | 電子内視鏡システム |
JPH06269403A (ja) * | 1993-03-19 | 1994-09-27 | Olympus Optical Co Ltd | 電子式内視鏡装置 |
JP2001078960A (ja) * | 1999-09-14 | 2001-03-27 | Olympus Optical Co Ltd | 内視鏡装置 |
US20050197533A1 (en) * | 2000-03-16 | 2005-09-08 | Medivision, Inc. | Endoscope and camera mount |
JP2005261557A (ja) * | 2004-03-17 | 2005-09-29 | Olympus Corp | 視野方向可変型内視鏡および内視鏡システム |
JP2009268657A (ja) * | 2008-05-07 | 2009-11-19 | I Systems:Kk | 立体内視鏡 |
CN102469930B (zh) * | 2009-11-06 | 2014-09-10 | 奥林巴斯医疗株式会社 | 内窥镜系统 |
JP5855358B2 (ja) * | 2011-05-27 | 2016-02-09 | オリンパス株式会社 | 内視鏡装置及び内視鏡装置の作動方法 |
JP5919533B2 (ja) * | 2011-12-15 | 2016-05-18 | パナソニックIpマネジメント株式会社 | 内視鏡およびこれを備えた内視鏡システム |
KR102043439B1 (ko) * | 2012-11-21 | 2019-11-12 | 삼성전자주식회사 | 내시경 장치 |
WO2015191784A1 (en) * | 2014-06-10 | 2015-12-17 | Nitesh Ratnakar | Endoscope with multiple views and novel configurations adapted thereto |
- 2015-09-04: US US15/510,124 filed (US10701339B2), status: not active, expired (fee related)
- 2015-09-04: WO PCT/JP2015/075161 filed (WO2016043063A1), status: active, application filing
- 2015-09-04: JP JP2016548833A filed (JPWO2016043063A1), status: active, pending
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019530508A (ja) * | 2016-09-29 | 2019-10-24 | 270 サージカル リミテッド | 剛性医療手術用照明装置 |
EP3518725A4 (en) * | 2016-09-29 | 2020-05-13 | 270 Surgical Ltd. | RIGID MEDICAL SURGERY LIGHTING DEVICE |
WO2020004259A1 (ja) * | 2018-06-29 | 2020-01-02 | 富士フイルム株式会社 | 内視鏡画像表示システムおよび内視鏡画像表示装置 |
JP2020144166A (ja) * | 2019-03-04 | 2020-09-10 | 株式会社タムロン | 観察撮像装置 |
JP7265376B2 (ja) | 2019-03-04 | 2023-04-26 | 株式会社タムロン | 観察撮像装置 |
JP2021045339A (ja) * | 2019-09-18 | 2021-03-25 | Hoya株式会社 | 内視鏡、内視鏡用プロセッサ及び内視鏡システム |
JP7353886B2 (ja) | 2019-09-18 | 2023-10-02 | Hoya株式会社 | 内視鏡及び内視鏡システム |
WO2022070782A1 (ja) * | 2020-10-02 | 2022-04-07 | Hoya株式会社 | プログラム、情報処理方法及び内視鏡システム |
JP2022059877A (ja) * | 2020-10-02 | 2022-04-14 | Hoya株式会社 | プログラム、情報処理方法及び内視鏡システム |
JP7440392B2 (ja) | 2020-10-02 | 2024-02-28 | Hoya株式会社 | プログラム、情報処理方法及び内視鏡システム |
Also Published As
Publication number | Publication date |
---|---|
JPWO2016043063A1 (ja) | 2017-07-06 |
US20170257619A1 (en) | 2017-09-07 |
US10701339B2 (en) | 2020-06-30 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15841688 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016548833 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15510124 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15841688 Country of ref document: EP Kind code of ref document: A1 |