WO2017183371A1 - Système endoscopique - Google Patents

Système endoscopique

Info

Publication number
WO2017183371A1
WO2017183371A1 (PCT/JP2017/010694; JP2017010694W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
optical
refractive index
images
optical system
Prior art date
Application number
PCT/JP2017/010694
Other languages
English (en)
Japanese (ja)
Inventor
片倉正弘
Original Assignee
オリンパス株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by オリンパス株式会社 filed Critical オリンパス株式会社
Priority to JP2017544793A priority Critical patent/JPWO2017183371A1/ja
Publication of WO2017183371A1 publication Critical patent/WO2017183371A1/fr


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B13/00: Optical objectives specially designed for the purposes specified below
    • G02B13/04: Reversed telephoto objectives
    • G02B23/00: Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/24: Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • G02B23/26: Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes, using light guides
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules

Definitions

  • The present invention relates to an endoscope system, and more particularly to an endoscope system with an extended depth of field.
  • the depth of field becomes narrower as the number of pixels of the imaging device increases. That is, when the pixel pitch (the vertical and horizontal dimensions of one pixel) is reduced in order to increase the number of pixels in the imaging device, the permissible circle of confusion is also reduced accordingly, and the depth of field of the imaging device is reduced.
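  • As a rough, hypothetical illustration of this relationship (not part of the patent disclosure), the sketch below uses the standard thin-lens depth-of-field approximation and assumes that the permissible circle of confusion scales with the pixel pitch; all numerical values are invented for illustration.

```python
import math

def depth_of_field(focal_mm, f_number, object_dist_mm, coc_mm):
    """Approximate near/far limits of the depth of field for a thin lens.

    coc_mm: permissible circle of confusion, here assumed to scale with the
    pixel pitch of the image sensor (an assumption, not the patent's method).
    """
    hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = object_dist_mm * (hyperfocal - focal_mm) / (hyperfocal + object_dist_mm - 2 * focal_mm)
    if hyperfocal > object_dist_mm:
        far = object_dist_mm * (hyperfocal - focal_mm) / (hyperfocal - object_dist_mm)
    else:
        far = float("inf")
    return near, far

# Hypothetical endoscope-like values: f = 1 mm, F/8, object at 20 mm.
for pitch_um in (3.0, 1.5):          # halving the pixel pitch ...
    coc = 2 * pitch_um / 1000.0      # ... halves the assumed circle of confusion
    near, far = depth_of_field(1.0, 8.0, 20.0, coc)
    print(f"pitch {pitch_um} um -> depth of field {near:.1f}..{far:.1f} mm")
```

  • With these invented numbers, halving the pixel pitch shrinks the in-focus range noticeably, which is the trend the paragraph above describes.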
  • Patent Documents 1, 2, and 3 disclose configurations in which the subject image is split and imaged separately, and the acquired images are combined by image processing to extend the depth of field.
  • This approach is advantageous in that splitting the subject image and combining the acquired images by image processing extends the depth of field.
  • However, the configurations of Patent Documents 1, 2, and 3 require a plurality of image sensors, which increases cost and is therefore not preferable.
  • The present invention has been made in view of the above. Its object is to provide an endoscope system that, when using a depth-expansion technique based on splitting the subject image with prisms, can obtain good image quality from the center region to the peripheral region of the screen.
  • One aspect of the endoscope system according to the present invention is as follows.
  • an objective optical system;
  • an optical path splitting unit that uses two prisms to split the subject image obtained by the objective optical system into two optical images with different focus;
  • a single image sensor that acquires the two optical images; and
  • an image composition processing unit that, in each predetermined region, selects whichever of the two acquired optical images has the relatively higher contrast and generates a composite image.
  • In this endoscope system, the two prisms are bonded through a bonding agent so that their bonding surfaces are substantially parallel to each other,
  • and the bonding surface and the optical axis of the objective optical system form an angle of 45 degrees and satisfy the following conditional expression (1):
  • −10 < d × sin(θ′ − 45) / (cos θ′ × im_pitch) < 10 … (1)
  • where the unit of the angle is degrees, im_pitch is the pixel pitch of the image sensor, θ′ is the refraction angle of the light ray incident on the bonding surface, and d is the thickness of the bonding agent.
  • The present invention has the effect of providing an endoscope system that, when using a depth-expansion technique based on splitting the subject image with prisms, can obtain good image quality from the center region to the peripheral region of the screen.
  • FIG. 1 shows the cross-sectional structure of the objective optical system, the optical path splitting unit, and the image sensor of an endoscope system according to an embodiment of the present invention. FIG. 2 is a schematic configuration diagram of the optical path splitting unit and the image sensor of the endoscope system. FIG. 3 is a schematic configuration diagram of the image sensor of the endoscope system. FIG. 4 shows the structure in the vicinity of the bonding agent of the optical path splitting unit.
  • FIG. 5(A) is a spot diagram of an optical system in which the x-direction shift is corrected.
  • FIG. 5(B) is a spot diagram of an optical system in which an x-direction shift occurs.
  • A further figure shows the imaging state in the endoscope system according to the embodiment of the present invention when an image is formed on the image sensor after an odd number of reflections in the beam splitter.
  • In the aberration diagrams, SA denotes spherical aberration, AS astigmatism, DT distortion, and CC lateral chromatic aberration.
  • The endoscope system includes an objective optical system OBL, an optical path splitting unit that uses two prisms to split the subject image obtained by the objective optical system OBL into two optical images with different focus,
  • an image sensor that acquires the two optical images, and an image composition processing unit that, in each predetermined region, selects whichever of the two acquired optical images has the relatively higher contrast and generates a composite image.
  • The two prisms are bonded through a bonding agent so that their bonding surfaces are substantially parallel to each other, and the bonding surface and the optical axis of the objective optical system OBL
  • form an angle of 45 degrees and satisfy the following conditional expression (1):
  • −10 < d × sin(θ′ − 45) / (cos θ′ × im_pitch) < 10 … (1)
  • where the unit of the angle is degrees,
  • im_pitch is the pixel pitch of the image sensor 22,
  • θ′ is the refraction angle of the light ray incident on the bonding surface 21h, and
  • d is the thickness of the bonding agent 21g.
  • FIG. 2 is a diagram illustrating a schematic configuration of the optical path splitting unit 20.
  • the light emitted from the objective optical system OBL enters the optical path dividing unit 20.
  • the optical path splitting unit 20 includes a polarizing beam splitter 21 that splits a subject image into two optical images with different focus, and an image sensor 22 that captures two optical images and acquires two images.
  • The polarization beam splitter 21 includes a first prism 21b, a second prism 21e, a mirror 21c, and a λ/4 plate 21d. Both the first prism 21b (object-side prism) and the second prism 21e (image-side prism) have beam splitting surfaces inclined at 45 degrees with respect to the optical axis AX.
  • a polarization splitting film 21f is formed on the beam splitting surface of the first prism 21b.
  • the first prism 21b and the second prism 21e constitute the polarization beam splitter 21 by bringing the beam split surfaces into contact with each other via the polarization separation film 21f.
  • The mirror 21c is provided near the end face of the first prism 21b via a λ/4 plate 21d.
  • An image sensor 22 is attached to the end face of the second prism 21e via a cover glass CG.
  • The subject image from the objective optical system OBL is separated by the polarization separation film 21f provided on the beam splitting surface of the first prism 21b into a P-polarized component (transmitted light) and an S-polarized component (reflected light), so that it is divided into an optical image on the reflected-light side and an optical image on the transmitted-light side.
  • The optical image of the S-polarized component is reflected toward the image sensor 22 by the polarization separation film 21f, travels along optical path A, passes through the λ/4 plate 21d, is reflected by the mirror 21c, and is folded back toward the image sensor 22.
  • The folded optical image passes through the λ/4 plate 21d again, which rotates its polarization direction by 90°, then passes through the polarization separation film 21f and forms an image on the image sensor 22.
  • The optical image of the P-polarized component passes through the polarization separation film 21f, travels along optical path B, is reflected by a mirror surface provided on the side of the second prism 21e opposite the beam splitting surface so that it is folded perpendicularly toward the image sensor 22, and forms an image on the image sensor 22.
  • The prism glass paths are set so that a predetermined optical path difference of, for example, about several tens of μm is generated between optical path A and optical path B, so that two optical images with different focus are formed on the light receiving surface of the image sensor 22.
  • In the first prism 21b and the second prism 21e, the optical path length (glass path length) on the reflected-light side is made shorter (smaller) than the optical path length on the transmitted-light side to the image sensor 22, so that the subject image can be separated into two optical images with different focus positions.
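  • As a rough, hypothetical way to relate the glass path difference to the focus separation of the two images (not the patent's calculation): if the entire path difference is assumed to lie in glass of refractive index n, its air-equivalent (reduced) length is the physical length divided by n, which gives an order-of-magnitude estimate of the focus offset between the two sensor regions.

```python
def air_equivalent_separation_mm(glass_path_diff_mm, n_glass):
    """Reduced (air-equivalent) length of a glass path difference: t / n."""
    return glass_path_diff_mm / n_glass

# Hypothetical numbers: a few tens of micrometres of extra glass, n close to 1.64
# (near the Nd_Pr01 value of Numerical Example 1 later in this document).
for diff_um in (20, 40, 60):
    sep = air_equivalent_separation_mm(diff_um / 1000.0, 1.64)
    print(f"glass path difference {diff_um} um -> air-equivalent focus offset {sep * 1000:.1f} um")
```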
  • The image sensor 22 is provided with two light receiving regions (effective pixel regions) 22a and 22b within its entire pixel area in order to individually receive and capture the two optical images with different focus positions.
  • the light receiving regions 22a and 22b are arranged so as to coincide with the image planes of these optical images in order to capture two optical images.
  • The focus position for the light receiving region 22a is shifted relatively toward the near point side with respect to the light receiving region 22b, and the focus position for the light receiving region 22b
  • is shifted relatively toward the far point side with respect to the light receiving region 22a. Thereby, two optical images with different focus are formed on the light receiving surface of the image sensor 22.
  • Alternatively, the optical path lengths to the image sensor 22 may be changed so that the focus positions for the light receiving regions 22a and 22b are relatively shifted.
  • A correction pixel region 22c for correcting geometric shifts between the two divided optical images is provided around the light receiving regions 22a and 22b.
  • Manufacturing errors are thereby absorbed, and correction by image processing performed by an image correction processing unit 23b (FIG. 6), described later, eliminates the geometric deviation between the optical images described above.
  • the second lens group G2 of the present embodiment described above is a focusing lens and can be selectively moved to two positions in the direction of the optical axis.
  • The second lens group G2 is driven by an actuator (not shown) so as to move between the two positions, from one position to the other and back.
  • When the second lens group G2 is set to the front (object-side) position, it is set so as to focus on the subject in the observation region for far-point observation (normal observation). When the second lens group G2 is set to the rear position, it is set so as to focus on the subject in the observation region for close-up observation (magnified observation).
  • When the polarization beam splitter 21 is used for polarization separation, the brightness of the two separated images differs unless the light to be separated is circularly polarized. Uniform brightness differences are relatively easy to correct by image processing; however, brightness differences that occur locally and depend on the observation conditions cannot be corrected completely, and uneven brightness may appear in the composite image.
  • With subjects observed with an endoscope, such uneven brightness tends to appear in the relatively peripheral part of the visual field of the composite image. Brightness unevenness caused by a disturbed polarization state is particularly conspicuous when the subject has a relatively saturated brightness distribution.
  • In the peripheral part of the visual field, an endoscope often views the blood vessel pattern and mucosal structure of the subject at relatively close range, and such unevenness is highly likely to be troublesome for the user. Therefore, as shown in FIG. 2 for example, it is preferable to arrange a λ/4 plate 21a on the object side of the polarization separation film 21f of the optical path splitting unit 20 so as to return the polarization state to circular polarization.
  • a half mirror that splits the intensity of incident light can be used instead of the polarizing beam splitter as described above.
  • FIG. 4 shows the configuration in the vicinity of the bonding agent 21g of the optical path splitting unit 20.
  • the first prism 21b and the second prism 21e are bonded so that the bonding surfaces 21h and 21i are substantially parallel to each other with a bonding agent 21g interposed therebetween.
  • the bonding surface 21h and the optical axis AX of the objective optical system OBL form an angle of 45 degrees.
  • the angle formed by the adhesive surface 21h and the optical axis AX of the objective optical system OBL may be approximately 45 degrees.
  • the normal line of the bonding surface 21h is indicated by N.
  • The light beam AL is refracted in the direction of the angle θ′ at the bonding surface 21h.
  • the light beam is refracted by the difference between the refractive index of the prism and the refractive index of the bonding agent.
  • This refraction occurs in the plane containing the 45-degree slope of the prism (the x direction in FIG. 4), but does not occur in the perpendicular direction (the y direction in FIG. 4).
  • The distance between the light beam AL that would have traveled without being refracted and the light beam AL′ that travels after being refracted is referred to as the x-direction shift x-sht (see FIG. 4). Note that the x-direction shift occurs in both optical paths A and B in the prism.
  • Nd_Pr01 × sin θ = Nd_CE × sin θ′ … (A)
  • where Nd_Pr01 is the refractive index at the d-line of the first prism 21b on the object side, Nd_CE is the refractive index at the d-line of the bonding agent 21g, θ is the incident angle of the light beam AL on the bonding surface 21h, and θ′ is the refraction angle of the light beam AL at the bonding surface 21h.
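  • As a quick numerical check of expression (A) (a sketch, not part of the patent), the refraction angle θ′ at the bonding surface can be computed from the refractive indices of the prism and the bonding agent for a 45-degree incident ray; the values used below are those of Numerical Example 1 given later in this document.

```python
import math

def refraction_angle_deg(n_prism, n_cement, incidence_deg=45.0):
    """Solve expression (A), n_prism * sin(theta) = n_cement * sin(theta'), for theta'."""
    s = n_prism * math.sin(math.radians(incidence_deg)) / n_cement
    return math.degrees(math.asin(s))

# Numerical Example 1 values: Nd_Pr01 = 1.63854, Nd_CE = 1.49
theta_prime = refraction_angle_deg(1.63854, 1.49)
print(f"theta' = {theta_prime:.2f} deg")   # about 51.0 deg, matching the table below
```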
  • Conditional expression (1) relates to x-direction shift.
  • FIG. 5A is a diagram showing a spot diagram of the optical system in which the x-direction shift is corrected in the present embodiment.
  • FIG. 5B is a spot diagram of an optical system in which an x-direction shift occurs in the conventional configuration.
  • Even if the objective optical system OBL were an ideal, aberration-free lens, a circular object at the center of the screen would become elliptical on the image plane when the x-direction shift occurs. The resolution is therefore degraded, which is not preferable.
  • Satisfying conditional expression (1) makes it possible to reduce the x-direction shift, so that good optical performance can be obtained.
  • If the value exceeds the upper limit of conditional expression (1) or falls below its lower limit, the maximum x-direction shift becomes larger than the pixel pitch of the image sensor. The optical performance then deteriorates, which is not preferable.
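  • As a numerical illustration of conditional expression (1) (a sketch, not part of the patent; it uses the Numerical Example 1 values listed later, and assumes d is given in millimetres while im_pitch is given in micrometres):

```python
import math

def x_shift_mm(d_mm, theta_prime_deg):
    """x-direction shift: x-sht = d * sin(theta' - 45) / cos(theta')."""
    t = math.radians(theta_prime_deg)
    return d_mm * math.sin(t - math.radians(45.0)) / math.cos(t)

def conditional_1(d_mm, theta_prime_deg, pitch_um):
    """Left-hand side of conditional (1): the x-shift measured in pixel pitches."""
    return x_shift_mm(d_mm, theta_prime_deg) / (pitch_um / 1000.0)

value = conditional_1(d_mm=0.01, theta_prime_deg=51.04, pitch_um=1.5)  # Example 1
print(f"conditional (1) = {value:.2f}, satisfied: {-10 < value < 10}")  # about 1.12
```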
  • It is more preferable to satisfy conditional expression (1)′ instead of conditional expression (1).
  • It is still more preferable to satisfy conditional expression (1)″ instead of conditional expression (1).
  • Nd_Pr01 is the refractive index at the d-line of the first prism 21b on the object side
  • Nd_Pr02 is a refractive index at the d-line of the second prism 21e on the image side
  • Nd_CE is the refractive index of the bonding agent 21g at the d-line
  • the refractive index Nd_Pr01 of the first prism 21b and the refractive index Nd_Pr02 of the second prism 21e are desirably the same value.
  • Conditional expression (2) relates to an appropriate ratio between the refractive index of the first prism 21b and the refractive index of the bonding agent 21g. Satisfying the conditional expression (2) is preferable because the amount of refraction of the light beam AL at the interface between the first prism 21b and the bonding agent 21g becomes small.
  • If the value exceeds the upper limit of conditional expression (2) or falls below its lower limit, the amount of refraction of the light beam AL at the interface between the first prism 21b and the bonding agent 21g increases and the x-direction shift becomes too large, which is not preferable.
  • Conditional expression (3) relates to an appropriate ratio of the refractive indexes of the second prism 21e and the bonding agent 21g. Satisfying the conditional expression (3) is preferable because the amount of refraction of the light beam AL at the interface between the second prism 21e and the bonding agent 21g becomes small.
  • If the value exceeds the upper limit of conditional expression (3) or falls below its lower limit, the amount of refraction of the light at the interface between the second prism 21e and the bonding agent 21g increases and the x-direction shift becomes too large, which is not preferable.
  • conditional expressions (2) ′ and (3) ′ are preferably satisfied instead of conditional expressions (2) and (3).
  • 0.8 < Nd_Pr01 / Nd_CE < 1.2 … (2)′
  • 0.8 < Nd_Pr02 / Nd_CE < 1.2 … (3)′
  • 0.9 < Nd_Pr02 / Nd_CE < 1.11 … (3)″
  • d is the thickness of the bonding agent 21g
  • θ′ is the refraction angle of the light ray incident on the bonding surface 21h
  • fw is the focal length in the normal observation state of the objective optical system
  • ih is the image height
  • Conditional expression (4) relates to an appropriate ratio between the x-direction shift and the focal length of the objective optical system.
  • If the value exceeds the upper limit of conditional expression (4), the x-direction shift becomes too large and the resolution is degraded, which is not preferable.
  • If the value falls below the lower limit of conditional expression (4), there is almost no x-direction shift, so the problem addressed by the present application does not arise in the first place.
  • Conditional expression (5) relates to an appropriate ratio between x-direction shift and image height.
  • If the value exceeds the upper limit of conditional expression (5), the x-direction shift becomes too large and the resolution is degraded, which is not preferable.
  • If the value falls below the lower limit of conditional expression (5), there is almost no x-direction shift, so the problem addressed by the present application does not arise in the first place.
  • conditional expressions (4) ′ and (5) ′ are preferably satisfied instead of conditional expressions (4) and (5).
  • (4)′
  • 0.0015 < d × sin(θ′ − 45) / (cos θ′ × ih) < 0.5 … (5)′
  • It is preferable that a negative first lens be disposed closest to the object side of the objective optical system and that the following conditional expressions (6) and (7) be satisfied.
  • Nd_L01 is a refractive index at the d-line of the negative first lens L1
  • Nd_CE is the refractive index of the bonding agent 21g at the d-line
  • Nd_Pr01 is the refractive index at the d-line of the object-side prism 21b.
  • Conditional expression (6) relates to an appropriate ratio between the refractive index of the negative first lens L1 and the refractive index of the bonding agent 21g.
  • Conditional expression (7) relates to an appropriate ratio between the refractive index of the negative first lens L1 and the refractive index of the first prism 21b.
  • If the value exceeds the upper limit of conditional expression (7), the refractive index of the first prism 21b becomes too small and it becomes difficult to obtain a suitable material, which is not preferable.
  • It is more preferable to satisfy the following conditional expressions (6)′ and (7)′ instead of conditional expressions (6) and (7):
  • 0.8 < Nd_L01 / Nd_CE < 1.4 … (6)′
  • 0.8 < Nd_L01 / Nd_Pr01 < 1.3 … (7)′
  • It is still more preferable to satisfy the following conditional expressions (6)″ and (7)″ instead of conditional expressions (6) and (7):
  • 0.95 < Nd_L01 / Nd_CE < 1.3 … (6)″
  • 0.95 < Nd_L01 / Nd_Pr01 < 1.2 … (7)″
  • The image processor 23 includes an image reading unit 23a that reads the images of the two optical images with different focus positions captured by the image sensor 22, an image correction processing unit 23b that performs image correction on the two images read by the image reading unit 23a, and an image composition processing unit 23c that performs image composition processing for compositing the two corrected images.
  • The image correction processing unit 23b corrects the images of the two optical images formed on the light receiving regions 22a and 22b of the image sensor 22 so that they become substantially the same except for their difference in focus. That is, the two images are corrected so that the relative positions, angles, and magnifications of their optical images are substantially the same.
  • The optical images formed on the light receiving regions 22a and 22b of the image sensor 22 may have relative shifts in magnification, position, and angle, that is, shifts in the rotation direction, and the like.
  • In these corrections, it is desirable to use as a reference the lower of the two images, or the image having the lower luminance at relatively the same position in the two images.
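  • A minimal sketch of such a correction step is shown below, assuming OpenCV is available and that the shift, rotation, and magnification correction parameters have already been calibrated; the parameter values, function name, and array sizes are hypothetical and not taken from the patent.

```python
import cv2
import numpy as np

def correct_image(img, dx=0.0, dy=0.0, angle_deg=0.0, scale=1.0):
    """Apply pre-calibrated shift, rotation, and magnification so the two images
    match except for their focus difference (hypothetical parameters)."""
    h, w = img.shape[:2]
    m = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), angle_deg, scale)
    m[0, 2] += dx  # add the calibrated translation to the affine matrix
    m[1, 2] += dy
    return cv2.warpAffine(img, m, (w, h), flags=cv2.INTER_LINEAR)

# Hypothetical use: align the near-point image onto the far-point image's geometry.
near = np.zeros((480, 640), dtype=np.uint8)   # placeholder for the near-point image
near_corrected = correct_image(near, dx=1.2, dy=-0.8, angle_deg=0.15, scale=1.002)
```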
  • The image composition processing unit 23c selects, in each corresponding region, whichever of the two images corrected by the image correction processing unit 23b has the relatively higher contrast, and generates a composite image. That is, by comparing the contrast of each spatially identical pixel region in the two images and selecting the pixel region with the relatively higher contrast, a composite image is generated as one image synthesized from the two images.
  • When the contrasts are nearly the same, a composite image is generated by composition processing in which the pixel regions are added with predetermined weights.
  • the image processor 23 performs subsequent image processing such as color matrix processing, contour enhancement, and gamma correction on one image synthesized by the image synthesis processing unit 23c.
  • the image output unit 23d outputs an image that has been subjected to subsequent image processing.
  • the image output from the image output unit 23d is output to the image display unit 24.
  • The first prism 21b and the second prism 21e may be made of different glass materials, corresponding to the near-point optical path and the far-point optical path leading to the image sensor 22, so that their refractive indices differ and the focus positions are relatively shifted.
  • In step S101, the image correction processing unit 23b performs correction processing on the two images acquired by the image sensor 22, that is, the far-point image and the near-point image. According to preset correction parameters, the two images are corrected so that the relative positions, angles, and magnifications of their optical images are substantially the same,
  • and the corrected images are output to the image composition processing unit 23c.
  • In step S102, the two corrected images are combined by the image composition processing unit 23c. At this time, contrast values are calculated and compared for corresponding pixel regions of the far-point and near-point images.
  • In step S103, it is determined whether there is a difference between the compared contrast values. If there is a difference, the process proceeds to step S105, where the region having the higher contrast value is selected for the composite image.
  • If the compared contrast values are close or almost the same, choosing which of the two images to select becomes an unstable factor in the processing. For example, if there are signal fluctuations such as noise, discontinuous regions may appear in the composite image, or a subject image that was originally resolved may become blurred.
  • In step S104, if the contrast values of the two images are substantially the same in the pixel region being compared, weights are assigned, and in the next step S105 the weighted images are added; this resolves the instability of the image selection. A sketch of this flow is shown below.
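  • The following is a minimal NumPy sketch of this composition flow (steps S102 to S105), using local variance as a hypothetical contrast measure and a fixed threshold to decide when two contrast values count as substantially the same; these particular choices are assumptions, not the patent's.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_contrast(img, win=7):
    """Local variance in a win x win window as a simple contrast measure."""
    img = img.astype(np.float64)
    mean = uniform_filter(img, win)
    return uniform_filter(img * img, win) - mean * mean

def compose(far_img, near_img, tie_threshold=1.0):
    """Pick the higher-contrast image per pixel; blend where contrasts tie."""
    c_far = local_contrast(far_img)
    c_near = local_contrast(near_img)
    diff = c_far - c_near
    out = np.where(diff >= 0, far_img, near_img).astype(np.float64)
    # Where contrasts are nearly equal, add the images with weights instead of
    # picking one, to avoid noise-driven discontinuities (steps S104/S105).
    tie = np.abs(diff) < tie_threshold
    w = 0.5  # equal weighting in the tie region (hypothetical choice)
    out[tie] = w * far_img.astype(np.float64)[tie] + (1 - w) * near_img.astype(np.float64)[tie]
    return out.astype(far_img.dtype)
```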
  • In this way, an image with an increased depth of field can be acquired while preventing discontinuous regions from appearing in the composite image and preventing the optical image from being blurred due to noise or the like.
  • Compared with a device including a plurality of image sensors, the manufacturing cost is reduced, and an increased depth of field can be obtained without increasing the size of the device.
  • a desired depth of field can be obtained, and degradation of resolution can be prevented.
  • When the image is mirror-reversed by an odd number of reflections, it is preferable that the mirror-image correction be carried out by the image correction processing unit 23b.
  • When the image sensor 22 is elongated in the longitudinal direction of the endoscope, it is preferable to rotate the composite image appropriately in consideration of the aspect ratio of the image display unit 24.
  • FIGS. 9A and 9B are diagrams showing a cross-sectional configuration of the objective optical system.
  • FIG. 9A is a diagram showing a cross-sectional configuration of the objective optical system in a normal observation state (a long distance object point).
  • FIG. 9B is a diagram showing a cross-sectional configuration of the objective optical system in the close-up observation state (short-distance object point).
  • The objective optical system includes, in order from the object side, a first lens group G1 having a negative refractive power, a second lens group G2 having a positive refractive power, and a third lens group G3 having a positive refractive power.
  • the aperture stop S is disposed between the second lens group G2 and the third lens group G3.
  • the second lens group G2 moves on the optical axis AX to the image side, and corrects the change in the focal position accompanying the change from the normal observation state to the close observation state.
  • The first lens group G1 consists of, in order from the object side, a plano-concave negative lens L1 with its flat surface facing the object side, a parallel flat plate L2, a biconcave negative lens L3, and a plano-convex positive lens L4 with its flat surface facing the image side. The negative lens L3 and the positive lens L4 are cemented.
  • the second lens group G2 includes a positive meniscus lens L5 having a convex surface directed toward the object side.
  • The third lens group G3 consists of, in order from the object side, a biconvex positive lens L6, a negative meniscus lens L7 with its convex surface facing the image side, a plano-convex positive lens L8 with its flat surface facing the object side, a biconvex positive lens L9, and a negative meniscus lens L10 with its convex surface facing the image side.
  • the positive lens L6 and the negative meniscus lens L7 are cemented.
  • the positive lens L9 and the negative meniscus lens L10 are cemented.
  • The optical path splitting unit 20 described above is disposed on the image side of the third lens group G3, and the optical path is bent in the prisms of the optical system.
  • The parallel flat plate L2 is a filter with a coating that cuts specific wavelengths, for example the 1060 nm line of a YAG laser, the 810 nm line of a semiconductor laser, or the infrared region.
  • FIGS. 10A, 10B, 10C, and 10D show the spherical aberration (SA), astigmatism (AS), distortion (DT), and lateral chromatic aberration (CC) in the normal observation state of this example.
  • FIGS. 10E, 10F, 10G, and 10H show the spherical aberration (SA), astigmatism (AS), distortion (DT), and lateral chromatic aberration (CC) in the close-up observation state of this example.
  • These aberration diagrams are shown for wavelengths of 656.27 nm (C-line), 587.56 nm (d-line), and 435.84 nm (g-line). In each figure, "ω" indicates the half angle of view.
  • r is the radius of curvature of each lens surface,
  • d is the distance between lens surfaces,
  • nd is the refractive index of each lens at the d-line,
  • νd is the Abbe number of each lens,
  • FNO is the F-number, and
  • ω is the half angle of view.
  • the back focus fb represents the distance from the most image-side optical surface to the paraxial image surface in terms of air. The total length is obtained by adding back focus to the distance (not converted to air) from the lens surface closest to the object side to the optical surface closest to the image side.
  • The numerical values of conditional expressions (1) to (7) in Example 1, Example 2, and Example 3 are shown below.
  • the specification values of the objective optical system OBL (Numerical Example 1) are common to the three examples.
  • Conditional expressions:
    (1) d × sin(θ′ − 45) / (cos θ′ × im_pitch)
    (2) Nd_Pr01 / Nd_CE
    (3) Nd_Pr02 / Nd_CE
    (4) d × sin(θ′ − 45) / (cos θ′ × fw)
    (5) d × sin(θ′ − 45) / (cos θ′ × ih)
    (6) Nd_L01 / Nd_CE
    (7) Nd_L01 / Nd_Pr01
  • Values corresponding to the conditional expressions:
                    Example 1   Example 2   Example 3
    (1)             1.12        1.92        9.98
    (2)             1.10        1.14        1.29
    (3)             1.10        1.14        1.29
    (4)             0.0017      0.0038      0.013
    (5)             0.0017      0.004       0.0135
    (6)             1.26        1.26        1.29
  • Parameters:
                    Example 1   Example 2   Example 3
    Nd_Pr01         1.63854     1.69895     1.883
    Nd_Pr02         1.63854     1.69895     1.883
    Nd_CE           1.49        1.49        1.46
    d               0.01        0.015       0.015
    θ′              51.04       53.73       65.78
    im_pitch (µm)   1.5         2           1.3
    fw              1           1           1
    ih              0.959       0.959       0.959
    Nd_L01          1.883       1.883       1.883
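  • The tabulated values of conditional expressions (1) to (6) can be reproduced from the listed parameters. The sketch below does so for all three examples; the unit conventions (d and fw in millimetres, im_pitch converted from micrometres to millimetres, angles in degrees) are assumptions made here to match the tabulated figures.

```python
import math

examples = {  # Nd_Pr01, Nd_CE, d (mm), theta' (deg), im_pitch (um), fw, ih, Nd_L01
    1: dict(nd_pr=1.63854, nd_ce=1.49, d=0.01,  tp=51.04, pitch=1.5, fw=1, ih=0.959, nd_l1=1.883),
    2: dict(nd_pr=1.69895, nd_ce=1.49, d=0.015, tp=53.73, pitch=2.0, fw=1, ih=0.959, nd_l1=1.883),
    3: dict(nd_pr=1.883,   nd_ce=1.46, d=0.015, tp=65.78, pitch=1.3, fw=1, ih=0.959, nd_l1=1.883),
}

for k, e in examples.items():
    t = math.radians(e["tp"])
    shift = e["d"] * math.sin(t - math.radians(45)) / math.cos(t)  # x-direction shift
    # (3) equals (2) in all three examples because Nd_Pr02 = Nd_Pr01 here.
    print(f"Example {k}: "
          f"(1)={shift / (e['pitch'] / 1000):.2f}  "
          f"(2)={e['nd_pr'] / e['nd_ce']:.2f}  "
          f"(4)={shift / e['fw']:.4f}  "
          f"(5)={shift / e['ih']:.4f}  "
          f"(6)={e['nd_l1'] / e['nd_ce']:.2f}")
```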
  • the present invention is useful for an endoscope system that can acquire a high-quality image in which the depth of field is enlarged and aberrations are corrected favorably.

Abstract

The object of the present invention is to provide an endoscope system with which good image quality can be obtained from a central region to a peripheral region of a screen. This endoscope system comprises: an objective optical system OBL; an optical path splitting unit 20 for splitting, using two prisms, a subject image obtained by the objective optical system OBL into two optical images having different focus points; an image sensor 22 for acquiring the two optical images; and an image composition processing unit 23 for selecting, in a prescribed region, the image having relatively higher contrast from the two acquired optical images in order to generate a composite image. The endoscope system is characterized in that the two prisms are joined with a bonding agent interposed between them such that the bonding surfaces are substantially parallel to each other, and the bonding surfaces and the optical axis of the objective optical system OBL form an angle of 45 degrees and satisfy the following conditional expression (1). (1): −10 < d × sin(θ′ − 45) / (cos θ′ × im_pitch) < 10.
PCT/JP2017/010694 2016-04-21 2017-03-16 Système endoscopique WO2017183371A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2017544793A JPWO2017183371A1 (ja) 2016-04-21 2017-03-16 内視鏡システム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016085445 2016-04-21
JP2016-085445 2016-04-21

Publications (1)

Publication Number Publication Date
WO2017183371A1 true WO2017183371A1 (fr) 2017-10-26

Family

ID=60116831

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/010694 WO2017183371A1 (fr) 2016-04-21 2017-03-16 Système endoscopique

Country Status (2)

Country Link
JP (1) JPWO2017183371A1 (fr)
WO (1) WO2017183371A1 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019167310A1 (fr) * 2018-02-27 2019-09-06 オリンパス株式会社 Système optique d'objectif pour endoscope
WO2019220730A1 (fr) * 2018-05-14 2019-11-21 オリンパス株式会社 Système optique endoscopique
WO2019235006A1 (fr) * 2018-06-08 2019-12-12 オリンパス株式会社 Endoscope
WO2020003604A1 (fr) * 2018-06-27 2020-01-02 オリンパス株式会社 Appareil d'affichage d'images et procédé d'affichage d'images
JPWO2019187195A1 (ja) * 2018-03-27 2021-02-25 オリンパス株式会社 対物光学系、撮像装置、内視鏡及び内視鏡システム

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0588019A (ja) * 1991-09-30 1993-04-09 Fujitsu Ltd 偏光分離プリズムの製造方法
JP2004109490A (ja) * 2002-09-18 2004-04-08 Fuji Photo Optical Co Ltd 偏光ビームスプリッタおよびこれを用いた投写型画像表示装置
JP2004533019A (ja) * 2001-06-11 2004-10-28 スリーエム イノベイティブ プロパティズ カンパニー 低い非点収差を有する投影システム
WO2014002740A1 (fr) * 2012-06-28 2014-01-03 オリンパスメディカルシステムズ株式会社 Système d'endoscope

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0588019A (ja) * 1991-09-30 1993-04-09 Fujitsu Ltd 偏光分離プリズムの製造方法
JP2004533019A (ja) * 2001-06-11 2004-10-28 スリーエム イノベイティブ プロパティズ カンパニー 低い非点収差を有する投影システム
JP2004109490A (ja) * 2002-09-18 2004-04-08 Fuji Photo Optical Co Ltd 偏光ビームスプリッタおよびこれを用いた投写型画像表示装置
WO2014002740A1 (fr) * 2012-06-28 2014-01-03 オリンパスメディカルシステムズ株式会社 Système d'endoscope

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11460676B2 (en) 2018-02-27 2022-10-04 Olympus Corporation Objective optical system and endoscope
JPWO2019167310A1 (ja) * 2018-02-27 2020-12-03 オリンパス株式会社 対物光学系、及び内視鏡
WO2019167310A1 (fr) * 2018-02-27 2019-09-06 オリンパス株式会社 Système optique d'objectif pour endoscope
JP6995978B2 (ja) 2018-03-27 2022-01-17 オリンパス株式会社 内視鏡用対物光学系、撮像装置、内視鏡及び内視鏡システム
JPWO2019187195A1 (ja) * 2018-03-27 2021-02-25 オリンパス株式会社 対物光学系、撮像装置、内視鏡及び内視鏡システム
US11815739B2 (en) 2018-05-14 2023-11-14 Olympus Corporation Endoscope optical system, endoscope, image pickup unit and endoscope insertion device
WO2019220730A1 (fr) * 2018-05-14 2019-11-21 オリンパス株式会社 Système optique endoscopique
JPWO2019220730A1 (ja) * 2018-05-14 2021-05-20 オリンパス株式会社 内視鏡光学系、内視鏡、撮像ユニット及び内視鏡挿入部
JPWO2019235006A1 (ja) * 2018-06-08 2021-06-03 オリンパス株式会社 内視鏡
JP7105303B2 (ja) 2018-06-08 2022-07-22 オリンパス株式会社 内視鏡
WO2019235006A1 (fr) * 2018-06-08 2019-12-12 オリンパス株式会社 Endoscope
US11969151B2 (en) 2018-06-08 2024-04-30 Olympus Corporation Endoscope
JPWO2020003604A1 (ja) * 2018-06-27 2021-06-03 オリンパス株式会社 画像生成装置、画像表示装置、及び画像表示方法
JP7055202B2 (ja) 2018-06-27 2022-04-15 オリンパス株式会社 画像生成装置、画像表示装置、及び画像表示方法
WO2020003604A1 (fr) * 2018-06-27 2020-01-02 オリンパス株式会社 Appareil d'affichage d'images et procédé d'affichage d'images
US11470283B2 (en) 2018-06-27 2022-10-11 Olympus Corporation Image generation apparatus, image display apparatus, and image display method

Also Published As

Publication number Publication date
JPWO2017183371A1 (ja) 2018-04-26

Similar Documents

Publication Publication Date Title
JP6006464B1 (ja) 内視鏡システム
WO2017183371A1 (fr) Système endoscopique
US8456767B2 (en) Objective optical system
JP6498364B2 (ja) 内視鏡システム及び内視鏡システムの調整方法
WO2014129089A1 (fr) Système optique d&#39;objectif d&#39;endoscope et dispositif d&#39;imagerie
WO2017119188A1 (fr) Système optique d&#39;objectif
JP6513307B2 (ja) 内視鏡システム
CN103562771B (zh) 内窥镜物镜光学系统
CN107430260B (zh) 斜视物镜光学系统和具备该斜视物镜光学系统的斜视用内窥镜
CN106255912A (zh) 物镜光学系统
WO2019171642A1 (fr) Système optique endoscopique et dispositif endoscopique
WO2017073292A1 (fr) Unité d&#39;imagerie endoscopique
JP6463573B1 (ja) 内視鏡撮像システム
JP2017209154A (ja) 内視鏡システム
JP6836466B2 (ja) 内視鏡対物光学系
JP6363570B2 (ja) ファインダーおよび撮像装置
JP6363818B1 (ja) 内視鏡システム
WO2020174561A1 (fr) Système optique d&#39;objectif d&#39;endoscope
JP5725972B2 (ja) アダプタ光学系及びそれを有する撮像装置
CN112334811A (zh) 物镜光学系统及使用了该物镜光学系统的硬性镜用光学系统、硬性镜
JP2009282182A (ja) ビューファインダー及びそれを用いた撮像装置

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2017544793

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17785713

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17785713

Country of ref document: EP

Kind code of ref document: A1