CN219895684U - Endoscopic imaging system - Google Patents


Info

Publication number
CN219895684U
Authority
CN
China
Prior art keywords
image
lens
image detector
imaging system
detector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202321311959.8U
Other languages
Chinese (zh)
Inventor
刘正
方煜
Current Assignee
Wuhan United Imaging Zhirong Medical Technology Co Ltd
Original Assignee
Wuhan United Imaging Zhirong Medical Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Wuhan United Imaging Zhirong Medical Technology Co Ltd
Priority to CN202321311959.8U
Application granted
Publication of CN219895684U
Legal status: Active

Landscapes

  • Endoscopes (AREA)

Abstract

The utility model relates to an endoscopic imaging system in which an image detector switches between a first position and a second position distributed horizontally, so that the same observed object is projected onto different positions on the image detector and a 2D image is output for each of the two positions. An image processor synthesizes the two 2D images into a 3D image. Because both images are formed through the same lens and the same image detector, the difference in sharpness between the two images is small. The first and second positions are distributed along a direction perpendicular to the optical axis, so parallax exists between them in that direction while their relative positions in the vertical direction do not differ; this ensures the fusion effect and improves the 3D imaging quality. Moreover, because only one lens is provided in the endoscopic imaging system, the large insertion-portion size caused by two side-by-side optical paths is avoided, improving patient comfort.

Description

Endoscopic imaging system
Technical Field
The utility model relates to the technical field of 3D imaging and medical instruments, in particular to an endoscopic imaging system.
Background
With the development of endoscopic imaging technology, the 3D (three-dimensional) endoscope is becoming an indispensable instrument in minimally invasive surgery. Compared with a traditional 2D endoscope, a 3D endoscope can acquire the depth information of the scene and reflect its real situation, so that the doctor perceives the surgical site as if present there, controls the surgical workflow better, and achieves greatly improved accuracy and precision in minimally invasive surgery.
Existing three-dimensional endoscopes are mainly realized on the binocular stereoscopic vision principle: two identical camera modules photograph the observed object from different angles, and the three-dimensional information of the object is obtained by calculating the positional parallax between corresponding points in the two images. However, because such an endoscope has two lenses, tolerance differences in the machining and alignment of its optical and mechanical components cause differences in relative position and sharpness between the two images, and the displayed 3D image is then difficult for human eyes to fuse. In addition, the insertion portion of such an endoscope must accommodate two imaging optical paths arranged side by side, so the insertion portion is large, more than twice the size of that of an ordinary two-dimensional endoscope, which affects patient comfort.
Disclosure of Invention
Accordingly, it is necessary to provide an endoscopic imaging system that solves the problems of high cost and large size of conventional 3D endoscopes.
An endoscopic imaging system, comprising:
the lens is used for collecting monocular images;
an image detector for receiving the monocular image and converting the monocular image into a 2D image signal; the image detector has a first position and a second position distributed in a direction perpendicular to an optical axis; the image detector is configured to be operable to switch between the first position and the second position;
an image processor for receiving the 2D image signal of the image detector at the first position and the 2D image signal of the image detector at the second position, and synthesizing the 2D image signal of the first position and the 2D image signal of the second position into a 3D image signal;
and a 3D display for receiving the 3D image signal and displaying the 3D image signal as a 3D image.
In one embodiment, the endoscopic imaging system includes a jog controller, the image detector being connected to the jog controller;
the jog controller is used for driving the image detector to move along the direction perpendicular to the optical axis so as to switch between the first position and the second position.
In one embodiment, the imaging frequency of the image detector is the same as the frequency at which the image detector switches between the first position and the second position.
In one embodiment, the lens includes a lens barrel, a first lens and a second lens spaced from the first lens, and the first lens and the second lens are both connected to an inner wall of the lens barrel.
In one embodiment, the lens includes an adjustment member; the first lens is slidably connected to the inner wall of the lens barrel, and the adjustment member is threadedly connected to the inner wall of the lens barrel;
the adjustment member is configured to be operably movable in the axial direction of the lens to push the first lens to move in the axial direction relative to the second lens.
In one embodiment, the lens further comprises an illumination channel for receiving an illumination fiber.
In one embodiment, the illumination channel is located radially outside the barrel.
An endoscopic imaging system, comprising:
the lens is used for collecting monocular images;
an image detector for receiving the monocular image and converting the monocular image into a 2D image signal; the image detector has a first region and a second region distributed in a direction perpendicular to an optical axis;
an image processor for receiving the 2D image signal of the image detector in the first area and the 2D image signal of the image detector in the second area, and synthesizing the 2D image signal of the first area and the 2D image signal of the second area into a 3D image signal;
and a 3D display for receiving the 3D image signal and displaying the 3D image signal as a 3D image.
In one embodiment, the image detector is a CMOS detector.
In one embodiment, the imaging frequency of the image detector is the same as the frequency at which the image detector switches between the first region and the second region.
The endoscopic imaging system above comprises a lens, an image detector, an image processor and a 3D display. The image detector switches between a first position and a second position distributed horizontally, so the same observed object is projected onto different positions on the image detector, and a 2D image is output for each of the two positions. The image processor synthesizes the two 2D images into a 3D image, which the 3D display presents so that the observer sees the corresponding 3D image. Because the images at the first and second positions are formed through the same lens and the same image detector, the difference in sharpness between the two images is small. The first and second positions are distributed along a direction perpendicular to the optical axis, so parallax exists between them in that direction while their relative positions in the vertical direction do not differ; this ensures the fusion effect and improves the 3D imaging quality. Moreover, because only one lens is provided in the endoscopic imaging system, the large insertion-portion size caused by side-by-side dual optical paths is avoided, improving patient comfort.
Drawings
FIG. 1 is a schematic diagram of an endoscopic imaging system according to a first embodiment of the present utility model;
FIG. 2 is a schematic view of an image detector in the endoscopic imaging system shown in FIG. 1 in a first position and a second position;
FIG. 3 is a partial schematic view of a lens barrel of the endoscopic imaging system shown in FIG. 1;
FIG. 4 is a schematic diagram of an endoscopic imaging system according to a second embodiment of the present utility model;
FIG. 5 is a schematic view of a first region and a second region of an image detector in the endoscopic imaging system shown in FIG. 4;
FIG. 6 is a schematic view of the first region and the second region of the image detector in the endoscopic imaging system shown in FIG. 5, shown split apart.
Reference numerals: 10. an endoscopic imaging system; 100. a lens; 110. a lens barrel; 120. a first lens; 130. a second lens; 140. an illumination channel; 200. an image detector; 300. an image processor; 400. a jog controller; 500. a 3D display.
Detailed Description
In order that the above objects, features and advantages of the utility model may be readily understood, a more particular description of the utility model is given below with reference to the appended drawings. In the following description, numerous specific details are set forth to provide a thorough understanding of the utility model. The utility model may, however, be embodied in many forms other than those described herein, and those skilled in the art may make similar modifications without departing from its spirit; the utility model is therefore not limited to the specific embodiments disclosed below.
In the description of the utility model, it should be understood that terms such as "center", "longitudinal", "lateral", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", "counterclockwise", "axial", "radial" and "circumferential" indicate orientations or positional relationships based on those shown in the drawings. They are used merely for convenience and simplicity of description, and do not indicate or imply that the device or element referred to must have a specific orientation or be constructed and operated in a specific orientation; they should therefore not be construed as limiting the utility model.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include at least one such feature. In the description of the present utility model, the meaning of "plurality" means at least two, for example, two, three, etc., unless specifically defined otherwise.
In the utility model, unless explicitly specified and limited otherwise, the terms "mounted", "connected", "secured" and the like are to be construed broadly: for example, a connection may be fixed, detachable, or integral; it may be mechanical or electrical; it may be direct, or indirect through an intermediary, or an internal communication or interaction between two elements. The specific meaning of the above terms in the utility model can be understood by those of ordinary skill in the art according to the specific circumstances.
In the utility model, unless expressly stated or limited otherwise, a first feature being "on" or "under" a second feature may mean that the first and second features are in direct contact, or that they are in indirect contact through an intervening medium. Moreover, a first feature being "above", "over" or "on" a second feature may mean that the first feature is directly above or obliquely above the second feature, or simply that the first feature is at a higher level than the second feature. A first feature being "under", "below" or "beneath" a second feature may mean that the first feature is directly below or obliquely below the second feature, or simply that the first feature is at a lower level than the second feature.
It will be understood that when an element is referred to as being "fixed" or "disposed" on another element, it can be directly on the other element, or intervening elements may also be present. When an element is referred to as being "connected" to another element, it can be directly connected to the other element, or intervening elements may also be present. The terms "vertical", "horizontal", "upper", "lower", "left", "right" and the like are used herein for illustrative purposes only and do not represent the only possible embodiments.
Fig. 1 is a schematic view of an endoscopic imaging system 10 according to a first embodiment of the present utility model; fig. 2 is a schematic view of an image detector 200 in the endoscopic imaging system 10 shown in fig. 1 in a first position and a second position. Referring to fig. 1-2, an endoscopic imaging system 10 according to an embodiment of the present utility model includes a lens 100, an image detector 200, an image processor 300, and a 3D display 500. Wherein, the lens 100 is used for collecting monocular images; the image detector 200 is used for receiving the monocular image and converting the monocular image into a 2D image signal; the image detector 200 has a first position and a second position distributed in a direction perpendicular to the optical axis; the image detector 200 is configured to operatively switch between a first position and a second position; the image processor 300 is configured to receive a 2D image signal of the image detector 200 at a first location and a 2D image signal of the image detector 200 at a second location, and to synthesize the 2D image signal of the first location and the 2D image signal of the second location into a 3D image signal; the 3D display 500 is for receiving a 3D image signal and displaying the 3D image signal as a 3D image. As shown in fig. 1, the direction of the optical axis, which is the axial direction of the lens 100, is illustrated by an arrow X; the direction perpendicular to the optical axis is the direction perpendicular to the paper surface.
In the endoscopic imaging system 10, the image detector 200 switches between the first position and the second position, which are distributed horizontally; the same observed object is therefore projected onto different positions on the image detector 200, and a 2D image is output for each of the two positions. The image processor 300 combines the 2D images of the two positions into a 3D image using a 3D composition algorithm, for example a left-right composition algorithm, and the 3D image is displayed on the 3D display 500 so that the observer sees the corresponding 3D image. Since the images at the first and second positions are formed through the same lens 100 and image detector 200, the difference in sharpness between the two images is small. The first and second positions are distributed along a direction perpendicular to the optical axis, so parallax exists between them in that direction while their relative positions in the vertical direction do not differ; this ensures the fusion effect and improves the 3D imaging quality. Meanwhile, since only one lens 100 is provided in the endoscopic imaging system 10, the large insertion-portion size caused by side-by-side dual optical paths is avoided, improving patient comfort.
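The left-right composition mentioned above can be sketched as follows. This is a minimal Python illustration, not the patent's actual algorithm; it assumes each 2D frame is a row-major list of pixel rows:

```python
def compose_side_by_side(left, right):
    """Combine two 2D frames (row-major lists of pixel rows) into one
    side-by-side stereo frame, as a left-right composition might."""
    if len(left) != len(right):
        raise ValueError("left and right frames must have the same height")
    # Concatenate each left row with the corresponding right row.
    return [l_row + r_row for l_row, r_row in zip(left, right)]

# Two tiny 2x3 "frames" captured at the first and second positions.
left = [[1, 2, 3], [4, 5, 6]]
right = [[7, 8, 9], [10, 11, 12]]
stereo = compose_side_by_side(left, right)
# stereo is 2 rows x 6 columns: each row is the left row followed by the right row.
```

A real system would operate on full sensor frames and hand the composed buffer to the 3D display's driver; the toy arrays here only show the layout.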
As shown in fig. 1 and 2, in some embodiments, the endoscopic imaging system 10 further includes a jog controller 400, and the image detector 200 is connected to the jog controller 400. Specifically, the jog controller 400 drives the image detector 200 to move automatically in a direction perpendicular to the optical axis at a certain frequency, so that the image detector 200 switches between the first position and the second position and outputs a two-dimensional image for each position; the image processor 300 then combines the two-dimensional images of the two positions into a three-dimensional image. The jog controller 400 conveniently switches the position of the image detector 200 within a preset range, ensuring the reliability and stability of the output images.
In one embodiment, the imaging frequency of the image detector is the same as the frequency at which the image detector switches between the first position and the second position.
Specifically, the image output of the image detector is triggered by the displacement pulse of the jog controller: when the jog controller switches the image detector to the first position or the second position, it simultaneously sends a corresponding trigger signal to the image detector, and that signal triggers the sampling, scanning and output of one frame of image. In this way, the position-switching frequency of the image detector and its imaging frequency are kept matched.
The image detector's sampling, scanning and output actions respond very quickly, on the order of milliseconds, so the detector can be considered to output a two-dimensional image at the corresponding position as soon as it reaches the first or second position. Keeping the imaging frequency consistent with the switching frequency ensures the correspondence between the images and the two positions, and thus the quality of the subsequent fusion. For example, if the image detector outputs images at 60 frames per second, the jog controller drives the image detector to oscillate at 60 Hz. When the image detector outputs the first frame, it is at the first position; when it outputs the second frame, it is at the second position; by the third frame it has returned to the first position, and the cycle repeats. Odd frames always correspond to the first position and even frames to the second position, which guarantees this correspondence and thereby the quality of the subsequent fusion.
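The odd/even frame bookkeeping described above can be sketched as follows. This is a hypothetical illustration; the function names and the pairing scheme are assumptions, not the patent's implementation:

```python
def position_for_frame(frame_index):
    """Odd frames are captured at the first position, even frames at the
    second, when imaging frequency equals switching frequency (frames
    indexed from 1)."""
    return "first" if frame_index % 2 == 1 else "second"

def pair_into_stereo(frame_indices):
    """Group consecutive (odd, even) frames into stereo pairs for 3D synthesis."""
    pairs = []
    for i in range(0, len(frame_indices) - 1, 2):
        pairs.append((frame_indices[i], frame_indices[i + 1]))
    return pairs

# Six frames at 60 fps alternate first/second/first/... and yield 3 stereo pairs.
positions = [position_for_frame(i) for i in range(1, 7)]
pairs = pair_into_stereo(list(range(1, 7)))
```

At 60 fps this scheme produces 30 stereo pairs per second, i.e. a 30 Hz 3D stream, which is the natural consequence of interleaving the two viewpoints on one detector.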
Fig. 3 is a partial schematic view of lens 100 of endoscopic imaging system 10 shown in fig. 1. As shown in fig. 1 and 3, in a specific embodiment, the lens 100 includes a lens barrel 110, a first lens 120, and a second lens 130 spaced apart from the first lens 120, and the first lens 120 and the second lens 130 are connected to an inner wall of the lens barrel 110.
Specifically, the first lens 120 and the second lens 130 are disposed at an interval along the optical axis direction; one of them is a convex lens and the other a concave lens, so that light emitted from the observed object is collected and focused through the cooperation of the two lenses, forming a clear image that is incident on the image detector 200. For example, the first lens 120 may be a convex lens and the second lens 130 a concave lens.
In one embodiment, the lens includes an adjustment member; the first lens is slidably connected to the inner wall of the lens barrel, and the adjustment member is threadedly connected to the inner wall of the lens barrel. The adjustment member is configured to be operatively moved in the axial direction of the lens to push the first lens to move axially relative to the second lens. Pushing the first lens via the adjustment member changes the distance between the first and second lenses, thereby adjusting the focal length of the lens to meet practical use requirements.
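The effect of changing the lens separation can be illustrated with the standard thin-lens combination formula, 1/f = 1/f1 + 1/f2 - d/(f1*f2); the focal-length values below are hypothetical numbers chosen only for illustration, not taken from the patent:

```python
def combined_focal_length(f1, f2, d):
    """Effective focal length of two thin lenses with focal lengths f1 and f2
    separated by distance d, per the thin-lens combination formula."""
    power = 1.0 / f1 + 1.0 / f2 - d / (f1 * f2)
    return 1.0 / power

# Hypothetical example: convex lens f1 = 20 mm with concave lens f2 = -50 mm.
# Moving the first lens (changing d) changes the system focal length.
f_close = combined_focal_length(20.0, -50.0, 5.0)    # lenses 5 mm apart
f_far = combined_focal_length(20.0, -50.0, 10.0)     # lenses 10 mm apart
```

With these numbers, increasing the separation from 5 mm to 10 mm shortens the effective focal length from about 28.6 mm to 25 mm, which is the kind of focus adjustment the adjustment member provides.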
As shown in fig. 1 and 3, in some embodiments, the lens 100 further includes an illumination channel 140, the illumination channel 140 for receiving an illumination fiber. With this arrangement, the imaging is made clearer, facilitating the acquisition of a clearer image by the image detector 200 for compositing by the image processor 300.
As shown in fig. 1 and 3, in one embodiment, the illumination channel 140 is located radially outside the barrel 110. By disposing the illumination channel 140 outside the lens barrel 110, the imaging definition of the lens 100 is improved while saving the space occupied by the lens in the axial direction.
Fig. 4 is a schematic diagram of an endoscopic imaging system 10 according to a second embodiment of the present utility model; fig. 5 is a schematic view of a first region and a second region of an image detector 200 in the endoscopic imaging system 10 shown in fig. 4; fig. 6 is a schematic view of the first and second regions of the image detector 200 of the endoscopic imaging system 10 of fig. 5 in a split-up manner. As shown in fig. 4 to 6, an endoscopic imaging system 10 according to another embodiment of the present utility model is provided, and the endoscopic imaging system 10 includes a lens 100, an image detector 200, an image processor 300 and a 3D display 500. The lens 100, the image detector 200, the image processor 300 and the 3D display 500 are identical to the lens 100, the image detector 200, the image processor 300 and the 3D display 500 in the foregoing embodiments, and will not be described herein.
The present embodiment differs from the foregoing embodiment in that the position of the image detector 200 is fixed: it no longer moves in the direction perpendicular to the optical axis, and its imaging area is enlarged to include a first region and a second region whose combined area is larger than the imaging area of the image detector 200 in the foregoing embodiment. For example, the imaging region of the image detector 200 in the foregoing embodiment has a size of 1920×1080, while in the present embodiment the imaging region has a size of 2520×1080.
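Under the assumption that the two 1920×1080 regions sit flush against the left and right edges of the widened 2520×1080 sensor (the text does not give the exact column placement, so this layout is an assumption), the horizontal parallax baseline and the overlap between the regions work out as follows:

```python
# Sensor and region geometry for the second embodiment (sizes from the text;
# the flush-left / flush-right placement of the two regions is an assumption).
SENSOR_W, SENSOR_H = 2520, 1080   # enlarged imaging area
REGION_W, REGION_H = 1920, 1080   # each output region

first_cols = (0, REGION_W)                      # first region: columns [0, 1920)
second_cols = (SENSOR_W - REGION_W, SENSOR_W)   # second region: columns [600, 2520)

parallax_px = second_cols[0] - first_cols[0]    # horizontal baseline in pixels
overlap_cols = (second_cols[0], first_cols[1])  # intersection shared by both regions
overlap_w = overlap_cols[1] - overlap_cols[0]
```

With these assumed placements the two viewpoints are offset by 600 pixel columns and share a 1320-column intersection, which is the region that figure 5's solid and dashed frames would both enclose.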
As shown in fig. 5 and 6, the area enclosed by the solid-line frame can be regarded as the first region, and the area enclosed by the dashed-line frame as the second region. Specifically, at the first timing, the image detector 200 outputs only the two-dimensional image located in the first region (solid-line frame); at the second timing, it outputs only the two-dimensional image located in the second region (dashed-line frame); the three-dimensional image is obtained by stitching and synthesizing the images of the two regions. Since the images of the first and second regions are formed through the same lens 100 and image detector 200, the difference in sharpness between the two images is small. The first and second regions are distributed along a direction perpendicular to the optical axis, so parallax exists between them in that direction while their relative positions in the vertical direction do not differ; this ensures the fusion effect and improves the 3D imaging effect.
In one embodiment, the image detector is a CMOS detector, i.e. a CMOS image sensor. A CMOS detector can precisely control the exposure region, ensuring that the output image covers only the target region rather than the complete imaging area, which guarantees the subsequent fusion effect.
In one embodiment, the imaging frequency of the image detector is the same as the frequency at which the image detector switches between the first region and the second region. For example, when the image detector outputs the first frame, its exposure region is controlled to be the first region; when it outputs the second frame, its exposure region is the second region; by the third frame, the exposure region changes back to the first region, and so on. Odd frames always correspond to the first region and even frames to the second region, which ensures the correspondence between images and regions and thereby the quality of the subsequent fusion.
As shown in fig. 4 and 5, the area enclosed by the solid-line frame is the first region and the area enclosed by the dashed-line frame is the second region. Take the image detector 200 to be a stationary CMOS image sensor. The intersection of the first and second regions is always within the exposure region; in the first frame the sensor acquires the non-intersection part of the first region, and in the second frame the non-intersection part of the second region. Stitching the first frame's acquired strip with the intersection region yields the first-region image, and stitching the second frame's acquired strip with the intersection region yields the second-region image. The images of the two regions are then synthesized into a 3D image, which is displayed on the 3D display 500. Because the images of the first and second regions are formed by the same lens 100 and image detector 200, the difference in sharpness between them is small; and because they have parallax in the direction perpendicular to the optical axis while their relative positions in the vertical direction do not differ, the fusion effect is ensured and the 3D imaging effect improved.
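The stitching step described above can be sketched as follows. This is a toy illustration with one-row "images", assuming the intersection columns are available from every exposure while the non-intersection strips alternate between frames; the function name and data layout are assumptions:

```python
def stitch_region(strip, intersection, strip_on_left):
    """Rebuild a full region image, row by row, from a freshly acquired
    non-intersection strip and the always-exposed intersection region."""
    if strip_on_left:
        # First region: its unique columns sit to the left of the intersection.
        return [s + i for s, i in zip(strip, intersection)]
    # Second region: its unique columns sit to the right of the intersection.
    return [i + s for i, s in zip(intersection, strip)]

# Tiny 1-row example: two intersection columns shared by both regions.
intersection = [[10, 11]]
left_strip = [[1, 2]]    # frame 1: non-intersection part of the first region
right_strip = [[8, 9]]   # frame 2: non-intersection part of the second region

first_region = stitch_region(left_strip, intersection, strip_on_left=True)
second_region = stitch_region(right_strip, intersection, strip_on_left=False)
```

Reusing the cached intersection means each frame only needs to read out the narrow non-overlapping strip, which is one way a fixed sensor can emulate the two viewpoints without mechanical motion.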
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations are described; however, any combination of these technical features that involves no contradiction should be considered within the scope of this description.
The above examples illustrate only a few embodiments of the utility model; they are described in detail but are not to be construed as limiting its scope. It should be noted that those skilled in the art can make several variations and modifications without departing from the spirit of the utility model, all of which fall within its scope. Accordingly, the scope of protection of the utility model is determined by the appended claims.

Claims (10)

1. An endoscopic imaging system, comprising:
a lens (100) for acquiring a monocular image;
an image detector (200) for receiving the monocular image and converting the monocular image into a 2D image signal; the image detector (200) has a first position and a second position distributed in a direction perpendicular to an optical axis; the image detector (200) is configured to be operatively switched between the first position and the second position;
an image processor (300) for receiving a 2D image signal of the image detector (200) at the first position and a 2D image signal of the image detector (200) at the second position, and synthesizing the 2D image signal of the first position and the 2D image signal of the second position into a 3D image signal;
and a 3D display (500) for receiving the 3D image signal and displaying the 3D image signal as a 3D image.
2. The endoscopic imaging system according to claim 1, wherein the endoscopic imaging system comprises a jog controller (400), the image detector (200) being connected to the jog controller (400);
the jog controller (400) is used for driving the image detector (200) to move along the direction perpendicular to the optical axis so as to switch between the first position and the second position.
3. The endoscopic imaging system according to claim 2, wherein an imaging frequency of the image detector (200) is the same as a frequency at which the image detector (200) switches between the first position and the second position.
4. The endoscopic imaging system according to claim 1, wherein the lens (100) comprises a barrel (110), a first lens (120) and a second lens (130) disposed at a distance from the first lens (120), the first lens (120) and the second lens (130) being both connected to an inner wall of the barrel (110).
5. The endoscopic imaging system of claim 4, wherein the lens (100) comprises an adjustment member, the first lens (120) being slidably coupled to an inner wall of the barrel (110), the adjustment member being threadably coupled to the inner wall of the barrel (110);
the adjustment member is configured to be operatively movable in an axial direction of the lens (100) to urge the first lens (120) to move in the axial direction relative to the second lens (130).
6. The endoscopic imaging system according to claim 4, wherein the lens (100) further comprises an illumination channel (140), the illumination channel (140) being adapted to house an illumination fiber.
7. The endoscopic imaging system according to claim 6, wherein the illumination channel (140) is located radially outside the barrel (110).
8. An endoscopic imaging system, comprising:
a lens (100) for acquiring a monocular image;
an image detector (200) for receiving the monocular image and converting the monocular image into a 2D image signal; the image detector (200) has a first region and a second region distributed in a direction perpendicular to an optical axis;
an image processor (300) for receiving a 2D image signal of the image detector (200) in the first region and a 2D image signal of the image detector (200) in the second region, and synthesizing the 2D image signal of the first region and the 2D image signal of the second region into a 3D image signal;
and a 3D display (500) for receiving the 3D image signal and displaying the 3D image signal as a 3D image.
9. The endoscopic imaging system according to claim 8, wherein an imaging frequency of the image detector (200) is the same as a frequency at which the image detector (200) switches between the first region and the second region.
10. The endoscopic imaging system according to claim 8, wherein the image detector (200) is a CMOS detector.
CN202321311959.8U 2023-05-24 2023-05-24 Endoscopic imaging system Active CN219895684U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202321311959.8U CN219895684U (en) 2023-05-24 2023-05-24 Endoscopic imaging system


Publications (1)

Publication Number: CN219895684U
Publication Date: 2023-10-27

Family

ID=88464327

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202321311959.8U Active CN219895684U (en) 2023-05-24 2023-05-24 Endoscopic imaging system

Country Status (1)

Country Link
CN (1) CN219895684U (en)


Legal Events

Date Code Title Description
GR01 Patent grant