WO2020249236A1 - Optical assembly, observation apparatus and method for optically overlaying input images - Google Patents


Info

Publication number
WO2020249236A1
Authority
WO
WIPO (PCT)
Prior art keywords
input image
image
optical assembly
optical
color
Prior art date
Application number
PCT/EP2019/065733
Other languages
French (fr)
Inventor
Georg THEMELIS
Original Assignee
Leica Instruments (Singapore) Pte. Ltd.
Priority date
Filing date
Publication date
Application filed by Leica Instruments (Singapore) Pte. Ltd. filed Critical Leica Instruments (Singapore) Pte. Ltd.
Priority to EP19731268.9A priority Critical patent/EP3983844A1/en
Priority to PCT/EP2019/065733 priority patent/WO2020249236A1/en
Publication of WO2020249236A1 publication Critical patent/WO2020249236A1/en

Classifications

    • G — PHYSICS
    • G02 — OPTICS
    • G02B — OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 26/00 — Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B 26/02 — Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the intensity of light
    • G02B 26/023 — Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the intensity of light comprising movable attenuating elements, e.g. neutral density filters
    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 — Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/00002 — Operational features of endoscopes
    • A61B 1/00004 — Operational features of endoscopes characterised by electronic signal processing
    • A61B 1/00009 — Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A61B 1/000094 — Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
    • A61B 1/00043 — Operational features of endoscopes provided with output arrangements
    • A61B 1/00045 — Display arrangement
    • A61B 1/0005 — Display arrangement combining images e.g. side-by-side, superimposed or tiled
    • A61B 1/00163 — Optical arrangements
    • A61B 1/00186 — Optical arrangements with imaging filters
    • A61B 1/00193 — Optical arrangements adapted for stereoscopic vision
    • G02B 21/00 — Microscopes
    • G02B 21/18 — Arrangements with more than one light path, e.g. for comparing two specimens
    • G02B 21/20 — Binocular arrangements
    • G02B 21/22 — Stereoscopic arrangements
    • G02B 21/36 — Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B 21/365 — Control or image processing arrangements for digital or video microscopes
    • G02B 21/367 — Control or image processing arrangements for digital or video microscopes providing an output produced by processing a plurality of individual source images, e.g. image tiling, montage, composite images, depth sectioning, image comparison
    • G02B 26/007 — Optical devices or arrangements for the control of light using movable or deformable optical elements, the movable or deformable optical element controlling the colour, i.e. a spectral characteristic, of the light
    • G02B 27/00 — Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/02 — Viewing or reading apparatus
    • G02B 27/022 — Viewing apparatus
    • G02B 27/024 — Viewing apparatus comprising a light source, e.g. for viewing photographic slides, X-ray transparencies
    • G02B 27/026 — Viewing apparatus comprising a light source, e.g. for viewing photographic slides, X-ray transparencies, and a display device, e.g. CRT, LCD, for adding markings or signs or to enhance the contrast of the viewed object
    • G02B 27/01 — Head-up displays
    • G02B 27/0101 — Head-up displays characterised by optical features
    • G02B 2027/0112 — Head-up displays characterised by optical features comprising device for generating colour display
    • G02B 2027/0141 — Head-up displays characterised by optical features characterised by the informative content of the display
    • G02B 2027/0145 — Head-up displays characterised by optical features creating an intermediate image
    • G02B 21/0004 — Microscopes specially adapted for specific applications
    • G02B 21/0012 — Surgical microscopes
    • G02B 23/00 — Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B 23/24 — Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • G02B 23/2407 — Optical details
    • G02B 23/2415 — Stereoscopic endoscopes


Abstract

The invention relates to an optical assembly (100), in particular a head-up display (102), for an observation apparatus (104) such as a microscope (106a) or endoscope (106b), the optical assembly being configured to generate an output image (112, 112a, 112b) for projection to a viewing system (114), the output image (112, 112a, 112b) comprising an optical overlay (113) of a first input image (108, 108a, 108b) and a second input image (110, 110a, 110b) onto one another, wherein the optical assembly (100) is further configured to create at least one of a predetermined color (116a) and predetermined brightness (116b) in at least one location (118) of the output image (112, 112a, 112b) by adapting at least one of a color (120a) and a brightness (120b) at a matching location (122) in the second input image depending on at least one of a color (124a) and a brightness (124b) at a matching location (126) in the first input image.

Description

Optical Assembly, Observation Apparatus and Method for Optically Overlaying Input Images
The invention relates to an optical assembly, in particular a head-up display for an observation apparatus, such as a microscope or endoscope, to an observation apparatus and to a method for generating an output image from a first and a second input image.
Optical assemblies such as head-up displays, in particular head-up displays for medical and biological applications, nowadays need to display more information than is visible to the eye, or to highlight some information over other information in order to direct the user's attention to specific features. Often, information from various sources is displayed together to provide better orientation to the user. For example, MRI, ultrasound and/or fluorescence images may be displayed together with the standard white-light images, or specific features in a white-light image need to be highlighted.
The invention aims to improve the output images that are presented to a user in such a situation.
This object is achieved by an optical assembly, in particular a head-up display, for an observation apparatus such as a microscope or endoscope, the optical assembly being configured to generate an output image for projection to a viewing system, the output image comprising an optical overlay of a first input image and a second input image onto one another, wherein the optical assembly is further configured to create at least one of a predetermined color and predetermined brightness in at least one location of the output image by adapting at least one of a color and a brightness at a matching location in one of the first and second input image depending on at least one of a color and a brightness at the matching location in the other one of the first and second input image.
Further, the above object is reached by an observation apparatus comprising at least one of a microscope and an endoscope, and an optical assembly as above. Finally, the object is achieved by a method for generating an output image from a first input image and a second input image, the method comprising the steps of optically overlaying the first input image and the second input image, and obtaining at least one of a predetermined brightness and a predetermined color at a location of the output image by changing, at overlying locations, at least one of the brightness and color of the overlaid second input image depending on at least one of the brightness and color of the first input image.
According to this solution, it is possible to obtain a desired color and/or brightness at a certain location in the output image despite the fact that the output image results from optically overlaying the first and second input image, and not by electronically merging image data by image processing.
The above solution may be further improved by adding one or more of the following optional features. Each of the following optional features is advantageous on its own and may be combined independently with any other optional feature. Further, each of the optional features may equally be used to improve any one of the optical assembly, the observation apparatus and the method.
For example, a location can be a pixel, voxel, a plurality of preferably contiguous pixels, a plurality of preferably contiguous voxels or any combination thereof. A location may also be a pattern which has been defined or recognized by a pattern recognition algorithm. Such a location may represent a physical structure of the object. The physical structure has preferably optical properties, such as a common texture, a common color and/or a common intensity.
The terms "first" and "second" input image are used for differentiation purposes only and do not imply any order, importance or other relationship. Matching locations are those locations in the first and second input image that are overlying one another in the optical overlay. Thus, if for example the first input image is optically overlaid onto the second input image (or vice versa), matching locations are located at the same position in the resulting overlay and thus in the output image. In other words, matching locations are located on top of one another as seen from a viewing position. Further, any one of the first and second input image, and the output image, may be part of a time series of images, e.g. be a video frame. The first and the second input image may be of the same size or of different sizes.
One of the first and second input images may be matched and registered to the other. In such matched and registered images, physical structures in the first input image and the second input image that correspond to each other have the same size, shape and orientation. To obtain such matching and registration, at least one of the input images, preferably the second, may be processed to match the other. The matching and registration may be done optically, by obtaining the first and the second input image from aligned optics, or electronically, by matching and registering the second input image to the first input image, or by a combination of both.
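As an illustration of the electronic matching step, the Python sketch below (a hypothetical helper, not part of the patent) resamples a second input image onto the pixel grid of the first, so that matching locations share the same coordinates; a real implementation would additionally estimate rotation and translation.

```python
import numpy as np

def register_to_reference(image, ref_shape):
    # Nearest-neighbour resampling of `image` onto the reference pixel grid,
    # so that corresponding (matching) locations share the same coordinates.
    h, w = ref_shape
    rows = (np.arange(h) * image.shape[0] / h).astype(int)
    cols = (np.arange(w) * image.shape[1] / w).astype(int)
    return image[np.ix_(rows, cols)]

# A 2x2 "second input image" resampled to a 4x4 reference grid.
second = np.array([[0, 1],
                   [2, 3]])
registered = register_to_reference(second, (4, 4))
```

After resampling, a location in the registered second image overlies the same physical structure as the matching location in the first image, which is the precondition for the color and brightness adaptation described above.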
According to one aspect, the optical assembly may comprise a first imaging sub-assembly. The first imaging sub-assembly may be configured to display and/or project the first input image. In particular, the first imaging sub-assembly may be configured to project an image of an object located in a field of view of a lens, e.g. in a probe volume, to the optical overlay as the first input image.
Preferably, the first input image is displayed in and/or projected onto a first image plane in the optical overlay.
The optical assembly may comprise a second imaging sub-assembly, the second imaging sub- assembly being configured to display and/or project the second input image in and/or to the optical overlay. In particular, the second imaging sub-assembly may be configured to generate a display of the second input image from electronic data representing the second input image. Preferably, the second input image is projected onto and/or displayed in a second image plane in the optical overlay.
The first and second image plane, or the first and the second input image in the optical overlay, are preferably parallel to and stacked above one another. The first image plane may be above or below the second image plane, in particular along an optical axis, thus creating the optical overlay. For an observer looking at the first and second image plane, preferably from a direction perpendicular to them, one of the first and the second input image is in front of the other. The one of the first and second input image that is closer to the viewer or observer is preferably translucent so that the other is visible behind it. The first and the second image plane may coincide and/or be movable relative to one another.
The optical assembly may comprise an optical image mixer which is configured to optically overlay the first and second input image onto one another. For example, the optical image mixer may comprise optical elements that are configured for overlaying the first and the second input image, such as a beam splitter, e.g. a dichroic beam splitter, mirrors, a screen, such as a milky screen, and/or a display, such as an LCD display, which may be translucent, as is explained further below.
The first and second input image may represent all kinds of images. For example, the first and/or the second input image may be a fluorescence image, which has been recorded in the fluorescence spectrum of at least one fluorophore. The first and/or the second input image may be monochrome images and/or color images. In such color images, any color space may be represented, such as in RGB images or in multispectral or hyperspectral images. As a monochrome image, the first and/or second input image may represent a black-and-white (greyscale) image or a single color channel of a color image. The first and/or second input image may be white-light images.
Alternatively or cumulatively, the second input image may be or comprise an image obtained by non-optical means such as ultrasound, PET, SPECT, EEG, MEG or ECG, as well as radiographic imaging, tomography, elastography, photoacoustic imaging or magnetic particle imaging. The second input image may be an entirely synthetic image, i.e. a computer-generated image comprising or consisting of e.g. at least one color field and/or a pattern at the at least one location.
According to another aspect of the invention, the optical assembly, in particular the optical mixer, may be configured to at least one of additively and subtractively mix the first input image and the second input image to generate the output image. As the overlay is generated optically, the output image is not computed, but generated by an optical combination or overlay. A subtractive mix may, for example, be achieved if one of the first and the second input image is used as a filter blocking, at the at least one location, predetermined wavelengths of the other one of the first and the second input image. In other words, the spectrum of the first or the second input image is subtracted from the spectrum of the other one of the first or second input image at the at least one location. An additive mix may be generated if one of the first and second input image is projected onto the other one of the first or second input image, such that the spectra of the first and the second input image are added.
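The two mixing modes can be modelled numerically. The sketch below is an illustrative simulation only — in the patent the mix happens optically, not in software — and assumes intensities normalized to [0, 1]:

```python
import numpy as np

def additive_mix(first, second):
    # Projection of one image onto the other: the spectra add at each location.
    return np.clip(first + second, 0.0, 1.0)

def subtractive_mix(first, transmission):
    # The second image acts as a filter: it attenuates (blocks part of the
    # spectrum of) the first image at each location.
    return first * np.clip(transmission, 0.0, 1.0)

first = np.array([[0.5, 0.8]])
overlay_add = additive_mix(first, np.array([[0.3, 0.3]]))
overlay_sub = subtractive_mix(first, np.array([[0.5, 0.0]]))
```

The clipping in the additive case mirrors the physical saturation of a display or sensor; in the subtractive case, a transmission of 0 blocks the first image entirely at that location.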
The optical assembly may comprise an image processor which is configured to compute the at least one color and/or brightness at the at least one location of the second input image which will result in the predetermined color and/or predetermined brightness at the matching location in the output image, if the first and the second input image are overlaid or, more particularly, the color and/or brightness at a location of one image is additively and/or subtractively mixed with the color and/or brightness at the matching location in the other image.
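A minimal sketch of what such an image processor might compute, assuming normalized intensities and a simple additive (output = first + second) or subtractive (output = first × transmission) mixing model; the function name and models are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def required_second_value(target, first, mode="additive"):
    # Value the second-image display must show at a location so that the
    # optical mix with the first image yields `target` there.
    if mode == "additive":        # target = first + second
        return np.clip(target - first, 0.0, 1.0)
    if mode == "subtractive":     # target = first * transmission
        return np.clip(np.divide(target, first,
                                 out=np.ones_like(first), where=first > 0),
                       0.0, 1.0)
    raise ValueError(mode)

first = np.array([0.4, 0.7])
needed = required_second_value(np.array([0.9, 0.9]), first)
trans = required_second_value(np.array([0.2, 0.7]), first, mode="subtractive")
```

In a color system the same computation would run per channel, with the mixing model matched to the actual optics of the mixer.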
The optical assembly, in particular the second imaging sub-assembly and/or the optical mixer, may comprise at least one electric and/or electronic display device. The at least one display device may be configured to display the second input image. The display device may comprise a screen such as a LCD, LED and/or OLED screen, and/or a projector which is configured to project the image generated by such a screen onto the second image plane. If no projector is used, the display device itself may represent the second image plane.
The at least one display device may comprise, according to a further embodiment, at least one translucent, electronic, display configured to display the second input image, and arranged for the first input image to be visible through, or behind, the translucent display. In this embodiment, the optical overlay is created in that the first input image is viewed through the second input image. Thus, the second input image may act as an optical filter for the first input image.
The translucent display for displaying the second input image may be arranged in an optical path of a viewing system. The viewing system may be configured for viewing the output image and/or optical overlay, preferably by a human, and comprise an eyepiece. Alternatively or cumulatively, the viewing system may be configured to project the output image towards an observer and/or an image sensor. An example of such a viewing system is a camera recording the output image. The optical path preferably extends between the viewing system on one side and the optical overlay on the other. The translucent display makes it possible to generate an exact overlay of the first and second input image. Further, any kind of image data may be displayed on the translucent display as the second input image, giving the optical assembly a large degree of versatility in generating the output image. Using high-resolution display devices allows the color and/or brightness of even minute details in the optical overlay, and thus in the output image, to be adjusted.

Alternatively or cumulatively, the optical assembly, in particular the optical mixer, may comprise a projection system which is configured to project one of the first and second input image onto the other one of the first and second input image to generate the optical overlay. The projection system may comprise optical elements such as lenses, apertures, beam splitters and/or reflectors. Further, the projection system may comprise a display device such as described above. In this configuration, the first or second input image displayed on the display device is projected onto the other one of the first and second input image.
To obtain a predetermined brightness at a matching location in the output image, the optical assembly may be configured to darken at least one predetermined location in one of the first input image and the second input image. In this configuration, it is possible to, for example, darken the area around a region of interest, so that the region of interest is displayed brighter than its surroundings. This facilitates focusing the observer’s attention onto the brighter area.
For darkening the at least one predetermined location, it is for example possible to use the translucent display as a selectively operable grey filter, which is activated only at the at least one predetermined location, e.g. by switching a pixel to display a grey color. The optical assembly and/or observation apparatus may comprise an adjustable illumination system which is configured to automatically increase the intensity of the illumination light if the at least one location is darkened, in order to maintain the overall brightness in the optical overlay. The illumination light is directed towards the field of view of the optical assembly. The illumination system may be controlled depending on the overall brightness in the optical overlay and/or the output image, respectively. The illumination system may further be configured to adjust the spectrum of the illumination light depending on at least one of the color and brightness of the first and/or the second input image and/or the desired color and/or brightness at the at least one location.
The transmissivity and/or transmission spectrum at the at least one location of the second input image displayed by the translucent display may be adjusted depending on the color and/or brightness of the matching or overlying location in the first input image. This may be used to equalize an uneven illumination with regard to brightness and/or color. For example, glare or differences in reflectance may be compensated. This aspect may further be used to maintain an even brightness in a highlighted, e.g. brightened and/or pseudocolored, area of interest by darkening parts of the location that would otherwise be too bright. Again, this aspect may be combined with an automatic illumination system to go beyond the capabilities of a mere filtering process.

The selective darkening and/or coloring of locations may also be used to overlay a pattern in the second input image onto the first input image so that the output image contains a patterned area. For example, the second input image may comprise a temporally and/or spatially varying pattern, such as a hatching, a blinking field, any other regular repetition of a geometric pattern in time and/or in space, or any combination thereof. In the optical overlay, and thus the output image, the pattern will be combined with the first input image.
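As a numerical illustration of this equalization (an assumption-laden sketch, not the patent's implementation), with intensities normalized to [0, 1] the per-pixel transmissivity of the translucent display can be chosen as the ratio of a target brightness to the first image's brightness, capped at full transmission. The cap also shows why regions darker than the target need increased illumination rather than filtering:

```python
import numpy as np

def equalizing_transmission(first, target_brightness):
    # Per-pixel transmissivity for the translucent display: attenuate bright
    # regions of the first image towards a common target brightness. A filter
    # cannot amplify, so the transmissivity is capped at 1.0 (fully clear);
    # regions darker than the target are left untouched and would need
    # increased illumination instead.
    t = np.divide(target_brightness, first,
                  out=np.ones_like(first), where=first > 0)
    return np.clip(t, 0.0, 1.0)

first = np.array([[0.9, 0.45, 0.3]])
t = equalizing_transmission(first, 0.45)
evened = first * t
```

Here the glare pixel (0.9) is halved to the target, the on-target pixel passes unchanged, and the dim pixel (0.3) stays at 0.3 because the filter cannot brighten it.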
It is to be noted, that the optical assembly may of course be used to combine more than two input images in an optical overlay, e.g. by using more than two display devices and by displaying a third or more input image in the optical overlay as described above for the second input image.
The first input image may be directly projected from a probe volume in the optical assembly’s field of view to the optical overlay. According to this aspect, the optical assembly may comprise a lens directed to the probe volume. The probe volume is preferably configured to receive an object to be inspected.
According to one embodiment, at least one of the first input image, the second input image and the output image is a stereoscopic image. It is most preferred that all of the first input image, the second input image and the output image are stereoscopic. In particular, the first and/or the second imaging sub-assembly may be configured to display stereoscopic images. The optical mixer may be adapted to overlay stereoscopic images. In its simplest implementation, stereoscopy may be achieved by using two optical stereoscopic channels, one for the left eye and one for the right eye. The optical stereoscopic channels may have identical structures and components. In a modification of this, a single display device, e.g. a single screen or projector, may be split in two halves, wherein each half is assigned to a different stereoscopic channel.
Another aspect of the invention, which is independent of the aspect of adjusting color and/or brightness, but may be used to improve the effects of the optical overlay as well, is concerned with adjusting the perceived depth level of the first and the second input image relative to one another using stereoscopy.
According to this aspect, the optical assembly may be configured to alter a binocular disparity of a stereoscopic second input image. Adjusting the binocular disparity makes it possible to adjust the perceived depth location of the second input image relative to the first input image. The second input image is perceived to move towards the observer if the binocular disparity is increased, and away from the observer if the binocular disparity is decreased. If the binocular disparity of the first input image remains unchanged, the second input image may appear to be located in front of or behind the first input image depending on the binocular disparity of the second input image. The binocular disparity may be altered in only one of the first input image and second input image, or in any of these images independently from the other image.
This solution makes it possible to display, to a stereoscopic observer, the second input image at the depth location at which it has been recorded with respect to the depth location of the first input image. For example, the second input image may represent a plane of a three-dimensional image, such as an MRI image, of a structure underneath a surface which is depicted in the first input image. Adjusting the binocular disparity in the second input image allows the structure to be displayed at the correct distance underneath the surface represented by the first input image.
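A simple way to alter binocular disparity is sketched below, under the assumption that the stereoscopic half-images are arrays and that disparity is changed by shifting them horizontally in opposite directions (a real optical assembly could achieve the same effect optically); vacated columns are zero-filled:

```python
import numpy as np

def shift_disparity(left, right, delta):
    # Change the binocular disparity of a stereoscopic pair by shifting the
    # two half-images horizontally in opposite directions; a larger disparity
    # makes the overlaid second input image appear closer to the observer.
    def hshift(img, d):
        out = np.zeros_like(img)
        if d > 0:
            out[:, d:] = img[:, :-d]
        elif d < 0:
            out[:, :d] = img[:, -d:]
        else:
            out[:] = img
        return out
    return hshift(left, delta), hshift(right, -delta)

left = np.arange(4).reshape(1, 4)
right = np.arange(4).reshape(1, 4)
left2, right2 = shift_disparity(left, right, 1)
```

Applying this only to the second input image, with the first input image's disparity unchanged, places the second image in front of or behind the first, as described above.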
Alternatively or cumulatively, the optical assembly may be configured to move at least one of the first and the second image plane along an optical axis of the optical assembly. For example, the screen which displays the second input image may be moved towards or away from the image plane of the first input image to display these two images at a distance from each other that corresponds to the distance between the planes in which the images have been recorded. Alternatively or additionally, a projection system may change its focal length or the position of the projected image plane to change the position of the second image plane.
For adjusting the binocular disparity or the location of the image planes correctly, it may be advantageous if at least one of the first input image and the second input image comprises image depth-location information and if the optical assembly is configured to alter the binocular disparity and/or the location of at least one of the first and the second image plane depending on the depth-location information. The image depth-location information is representative of the depth, e.g. the distance from a certain point along the optical axis of the optical assembly, at which the respective first or second input image has been recorded. For example, the image depth-location information may be representative of the focal distance used for recording the respective first or second input image. The image depth-location information may be considered part of the input image, even if it is present as separate data and not contained as a subset of the data representing the image. It is preferred, however, that the image depth-location information is coded right into the data that represents the image, e.g. as a data field. The optical assembly may comprise an image processor that is configured to compute the image depth-location information by pattern recognition. For example, the depth location of the first input image relative to the second input image may be computed by locating the patterns of the first input image in a three-dimensional image of which the second input image represents a plane.
According to another embodiment, both the first input image and the second input image may comprise image depth-location information. The optical assembly may be configured to alter the binocular disparity depending on a difference between the image depth-location information of the first input image and the image depth-location information of the second input image. The binocular disparity can e.g. be altered by an image processor operating on a digital representation of the first and/or second input image.
For adjusting the relative locations of the image planes of the first input image and the second input image in the optical overlay, the optical assembly may include a depth adjustment device. The depth adjustment device may be configured to alter a distance, in particular along an optical axis, between the second input image and the first input image, or between the image plane of the first input image and the image plane of the second input image, respectively. For altering the distance, the depth adjustment device may comprise a drive acting on an optical element such as a lens, a beam splitter, a screen and/or a display device, depending on the image depth-location information.
Independent of the aspect of adjusting the brightness and/or color in one of the first and the second input image, and of adjusting the perceived distance between the first and the second input image, there is a need to switch off the first input image projected from the field of view of the optical assembly during operation and to display only the second input image, or any additional input image, such as a third input image, that is generated from electronic data. This switch allows an observer to concentrate on only one input image. This can be achieved if the optical assembly comprises a first mode of operation and a second mode of operation, wherein, in the first mode of operation, the output image comprises the optical overlay of the first input image and the second input image, and, in the second mode of operation, the output image comprises only one of the first input image and the second input image. The optical assembly may comprise a third mode of operation in which only the first input image is displayed. This mode of operation may be attained by simply switching off the display device which displays the second input image.
For example, in the first mode of operation, a fluorescence image may be displayed in false colors overlaid onto the white-light color image projected from the probe volume. In the second mode of operation, only the fluorescence image is shown. In the third mode of operation, only the white-light image is shown. To achieve this, the optical assembly may comprise a light blocking device, the light blocking device being located in one of an optical path of the first input image and an optical path of the second input image. In one embodiment, the light blocking device may be located in an optical path between the probe volume and the optical overlay. The light blocking device may be configured to at least one of darken and close the respective optical path. The light blocking device may comprise at least one of an adjustable filter, e.g. a translucent LCD screen of which pixels can be set to be non-transmissive, an iris, a shutter, a diaphragm, movable polarizers and/or other optical members. For example, if an iris or shutter is used as a light blocking device, it may be closed in the second mode of operation. The optical assembly may further comprise a camera system comprising at least one preferably stereoscopic monochrome or color camera, such as an RGB camera, and/or a multi- and/or hyperspectral camera. The camera system may be configured to record an intermediate input image which may correspond to the second input image or from which the second input image may be computed using e.g. an image processor of the observation device. Preferably, the intermediate input image recorded by the camera system is matched and registered to the first input image. The use of a camera system, simultaneously with the projected image from the probe volume, makes it possible to extract and display non-visible information or information in selected wavebands. The second input image may be displayed in real time by e.g.
the display device and/or the projector, preferably after being processed by the image processor. Another camera system may be used for recording the first input image and/or the output image, e.g. for documentation and/or for determining the color and/or brightness at the location in the first input image and the output image, respectively. This color and/or brightness may be used to determine and adjust the color and/or brightness at the matching location in the second input image, e.g. using a forward control or a feedback-loop control. Finally, another aspect is concerned with a computer program with a program code for obtaining at least one of a predetermined brightness and a predetermined color at a location of the output image by changing, at overlying locations, at least one of the brightness and color of one of the overlaid first and second input images depending on at least one of the brightness and color of the other one of the overlaid first and second input images, when the computer program is run on a processor.
The microscope, of which the optical assembly may be a part, may be a surgical or a laboratory microscope. The endoscope, of which the optical assembly may be a part, may be a surgical endoscope.
In the following, exemplary embodiments of the invention are described with reference to the drawings. The shown and described embodiments serve explanatory purposes only. The combination of features shown in the embodiments may be changed according to the foregoing description. For example, a feature which is not shown in an embodiment but described above may be added, if the technical effect associated with this feature is beneficial for a particular application. Vice versa, a feature shown as part of an embodiment may be omitted as described above, if the technical effect associated with this feature is not needed in a particular application.
In the drawings, elements that correspond to each other with respect to function and/or structure have been provided with the same reference numeral.
In the drawings,
Fig. 1 shows an exemplary schematic rendition of an optical assembly and an observation apparatus comprising the optical assembly;
Fig. 2 shows a schematic rendition of a different view on the optical assembly and the observation apparatus of Fig. 1;
Fig. 3 shows a schematic representation of a detail of an optical assembly;
Fig. 4 shows a schematic rendition of an optical overlay;
Fig. 5 shows a schematic rendition of adjusting the binocular disparity; and
Fig. 6 shows a schematic flow chart of the process steps performed by the optical assembly.
First, the structure and function of an optical assembly 100 are explained with reference to the exemplary embodiment of Fig. 1.
The optical assembly 100 as shown in Fig. 1 comprises two separate optical channels, a left stereoscopic channel L and a right stereoscopic channel R. In the exemplary embodiment, the left stereoscopic channel L is designed differently from the right stereoscopic channel R. This is to explain different possible configurations. In practice, it may be preferred to have stereoscopic channels L, R which are structured identically. Of course, the optical assembly does not need to be stereoscopic, but may have only a single optical channel, which may be designed as any one of the two stereoscopic channels L, R.
The optical assembly 100 may be a head-up display 102 in an observation apparatus 104, such as a microscope 106a or an endoscope 106b. The microscope 106a may be a surgical microscope, such as for neurosurgery or for ophthalmology, or a laboratory or industrial microscope. The endoscope 106b may be an industrial and/or a surgical endoscope.
The optical assembly 100 is configured to generate an output image 112, which may consist of two stereoscopic (part) images 112a, 112b. If the optical assembly 100 is not stereoscopic, only one single optical channel and consequently one output image 112 without any stereoscopic (part) images 112a, 112b is generated.
The output image is generated to be viewed by a viewing system 114, which may comprise an eyepiece 144, in particular a binocular eyepiece, as shown in the right stereoscopic channel R, and/or a camera 156, as shown for the left stereoscopic channel L. The camera 156 may be a monochrome camera but is preferably a color camera, such as an RGB, multi- or hyperspectral camera. The camera may be a CCD camera. The camera 156 may itself be stereoscopic, i.e. simultaneously record the left and the right stereoscopic channel L, R. If an eyepiece 144 is used, the two stereoscopic channels L, R serve to provide a 3D-like impression to a human observer looking through the eyepiece 144 and/or to the camera 156.
The output image 112, 112a, 112b comprises an optical overlay 113 of a first input image 108, 108a, 108b, and a second input image 110 onto one another. At least one of the first and second input image may be a stereoscopic image, which comprises two stereoscopic part-images 108a, 108b in the case of the first input image 108 and two stereoscopic part-images 110a, 110b in the case of the second input image 110. The optical assembly 100 is configured to create at least one of a predetermined color 116a and a predetermined brightness 116b in at least one predetermined location of the output image 112, 112a, 112b.
A location may be a pixel, a voxel, a plurality of preferably contiguous pixels, a plurality of preferably contiguous voxels, or any combination thereof. A location can for example be a pattern which has been defined by a pattern algorithm. Further, a location may represent a physical structure of the object having common optical properties, such as a common texture, and/or a common color or color component, and/or a common intensity. Examples of such a physical structure may be a blood vessel, an organ, or a tumor. The optical assembly 100 creates the predetermined color 116a and/or the predetermined brightness 116b at the at least one location 118 of the output image by adapting at least one of a color 120a and a brightness 120b at a matching location 122 in one of the first input image 108, 108a, 108b and the second input image 110, 110a, 110b depending on at least one of a color 124a and a brightness 124b at a matching location 126 in the other one of the first input image 108, 108a, 108b and the second input image 110, 110a, 110b.
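Under a purely additive mix, the adaptation described above reduces to a per-channel subtraction: the second input image supplies whatever the first input image lacks to reach the predetermined color. The 8-bit RGB range and the function name are assumptions chosen for illustration.

```python
import numpy as np

def second_image_color(desired_rgb, first_rgb):
    # For an additive overlay, the color shown at the matching location of
    # the second input image is the per-channel remainder needed to reach
    # the desired output color, clipped to the displayable range.
    desired = np.asarray(desired_rgb, dtype=float)
    first = np.asarray(first_rgb, dtype=float)
    return np.clip(desired - first, 0.0, 255.0)
```

If the first input image is already brighter than the desired output in some channel, the clipping means the target cannot be reached additively at that location.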
It is preferred that the second input image 110, 110a, 110b is generated from electronic data that may have been processed by the observation device 104 or its image processor 182. The first input image 108, 108a, 108b is preferably projected directly from an object 138 to be observed, which is located in the field of view 139 of the optical assembly 100.
The first and the second input image are preferably matched and registered. For this, the observation apparatus 104 may be configured to match and register the second input image to the first input image. The observation apparatus may comprise an image processor 182 which is configured to match and register the second input image to the first input image. Alternatively or in addition, the first and second input image may be created by sharing the same optical components of the optical assembly and imaging the same field of view. In the registered and matched images, the structures in both images have the same shape, size and orientation, i.e. are congruent. The matching locations 122, 126 overlay one another in the optical overlay.
The optical assembly may comprise a first imaging sub-assembly 158 which is configured to display the first input image 108, 108a, 108b and/or to project the first input image 108, 108a, 108b to generate the optical overlay 113. For example, the first imaging sub-assembly may comprise a lens 134 which projects an image of the object 138, which is located in a probe volume 136. The probe volume 136 is a spatial region, in which the object 138 may be placed for observation by the optical assembly 100. The probe volume 136 is located in the field of view 139 of the optical assembly 100, or the first imaging sub-assembly 158, respectively.
The optical assembly may comprise a second imaging sub-assembly 160 which is configured to display the second input image 110, 110a, 110b and/or to project the second input image 110, 110a, 110b towards the optical overlay 113. In the right stereoscopic channel R, the second imaging sub-assembly 160 is shown to include a display device 128, such as a translucent display 130. The translucent display 130 may be a translucent LCD screen or any other screen capable of being translucent and displaying a color or black-and-white image.
As the display 130 is located in an optical path of the first input image 108, 108a, 108b and is translucent, an optical overlay 113 is generated, in which the first input image 108, 108a, 108b is visible through the second input image 110, 110a, 110b, resulting in the output image 112, 112a, 112b.
In the left stereoscopic channel L, the second imaging sub-assembly 160 is shown for explanatory purposes as a projection system or projector 132. The projection system 132 may comprise a display device 128 for displaying the second input image 110, 110a, 110b. The projection system 132 may comprise a projector lens 162 which is configured to project the second input image 110, 110a, 110b such that a (second) image plane of the projection system 132 is located at the optical overlay. For displaying the second input image, the projection system 132 may comprise a display 133, which need not be translucent.
The displays 130, 133 may, independently of one another, be an LCD display, an LED display, an OLED display, a DLP display, an LCoS display and/or a laser display, as well as any combination thereof.
The second imaging sub-assembly 160 may comprise a beam splitter 164 which is arranged in the optical path 152 of the corresponding stereoscopic channel, here the left stereoscopic channel L, and in an optical path 154 of the second input image 110, 110a, 110b. In the embodiment shown, the optical path 154 of the second input image 110, 110a, 110b corresponds to the optical path of the projection system 132.
For generating the optical overlay 113 of the first input image 108, 108a, 108b and the second input image 110, 110a, 110b, the optical assembly 100 may comprise an optical mixer 166. The optical mixer may comprise a beam splitter, such as the beam splitter 164, or a translucent display, such as the translucent display 130. In the beam splitter 164, the optical overlay 113 of the first input image 108, 108a, 108b and the second input image 110, 110a, 110b takes place, as both images are visibly superposed. Of course, other components may also be comprised by the optical mixer 166 alternatively or cumulatively. For example, the optical mixer 166 may, instead of the translucent display 130, comprise a semi-translucent screen onto which both the first and the second input image are projected.
If a translucent display 130 is used to display, for example, the second input image 110, 110a, 110b, and the first input image 108, 108a, 108b is viewed through the translucent display 130, the translucent display 130 may act as a filter. The filter properties of this filter are determined by the second input image that is displayed on the translucent display 130. Each pixel of the translucent display 130 attenuates part of the spectrum of the overlaid first input image 108, 108a, 108b depending on the color of this pixel of the translucent display 130. Thus, by changing the color and/or the brightness of a pixel of the translucent display 130, i.e. at a location in the second input image, the color and/or brightness at an underlying location matching this pixel may be changed. Thus, if a certain color and/or brightness is desired at a predetermined location in the output image 112, 112a, 112b, the color displayed at a matching location 122 of the second input image 110, 110a, 110b on the translucent display 130 needs to be matched to the color and/or brightness at the matching location in the first input image 108, 108a, 108b. In more general terms, the optical assembly 100, in particular the optical mixer 166, may be configured to at least one of additively and subtractively mix the first input image 108, 108a, 108b and the second input image 110, 110a, 110b to generate the optical overlay 113. A subtractive mix takes place if, for example, the translucent display 130 is used as a filter. An additive mix takes place if, for example, the second input image 110, 110a, 110b and the first input image 108, 108a, 108b are projected onto one another so that the two spectra and the two brightnesses of the two images add.
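The subtractive filtering behavior of the translucent display can be modeled, as a rough sketch, by treating each pixel of the second input image as a per-channel transmission factor applied to the first input image. The 0-255 scale and the assumption of linear attenuation are simplifications chosen for illustration.

```python
import numpy as np

def subtractive_overlay(first_rgb, filter_rgb):
    # Model each pixel of the translucent display as a per-channel filter:
    # a channel value of 255 in the second input image is fully transmissive,
    # 0 blocks that part of the spectrum of the first input image entirely.
    first = np.asarray(first_rgb, dtype=float)
    transmission = np.asarray(filter_rgb, dtype=float) / 255.0
    return first * transmission
```

For example, a white location in the first input image viewed through a red pixel of the display comes out red, because the green and blue components are attenuated to zero.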
A combination of an additive and a subtractive mix may take place if the optical overlay is generated by combining a projection of the second input image 110, 110a, 110b with the use of the second input image 110, 110a, 110b as a filter, as per the translucent display 130, together with the first input image 108, 108a, 108b. Depending on which locations are projected onto one another and which are displayed on the translucent display 130, an additive or a subtractive mix, or a combination of both, may be carried out.
The optical assembly 100 may be configured to darken at least one predetermined location in the output image relative to a matching location 122, 126 in at least one of the first input image 108, 108a, 108b and the second input image 110, 110a, 110b. This may be carried out for example by darkening a location in the translucent display 130, e.g. by displaying a grey color.
The optical assembly 100 may be configured to brighten at least one predetermined location 118 in the output image relative to a matching location 122, 126 in at least one of the first input image 108, 108a, 108b and the second input image 110, 110a, 110b. This may be accomplished by e.g. darkening the surroundings of this location as described above, e.g. by displaying a grey color, and by increasing an intensity of illumination light 168 illuminating the probe volume 136. For this, the optical assembly 100 or the observation apparatus 104 may comprise an illumination system 170. The optical assembly 100 may be configured to automatically adjust the illumination system 170 depending on at least one of the predetermined color 116a and the predetermined brightness 116b at the at least one location 118 in the output image 112, 112a, 112b. Thus, the relative darkening due to the filtering effect of the translucent display 130 may be automatically compensated by the increased intensity of the illumination light 168.
Alternatively or cumulatively to an automatic control of the illumination system 170 depending on the color 116a and/or the brightness 116b at the at least one location 118 of the output image 112, 112a, 112b, the projection system 132 may project, as a second input image 110, 110a, 110b, an identical section of the first input image 108, 108a, 108b having the same color, but higher brightness or intensity, onto the first input image 108, 108a, 108b. Due to the optical overlay, the output image 112, 112a, 112b will have the same color but higher brightness at that location.
As indicated by arrows 172, the optical assembly 100 may be configured to alter the relative position of the second input image 110, 110a, 110b and the first input image 108, 108a, 108b with respect to one another in the optical overlay. For example, the translucent display 130 may be moveable along the optical axis 152, or the second input image 110, 110a, 110b may be projected to a different location along the optical path 152 by e.g. changing the position of the beam splitter 164 along the optical axis 152 or adjusting the projector lens 162.
In these cases, the optical assembly 100 may be configured to alter the distance of a first image plane, in which the first input image 108, 108a, 108b is located in the optical overlay or the optical mixer 166, respectively, relative to the location of the second image plane, in which the second input image 110, 110a, 110b is located in the optical overlay 113 or optical mixer 166. This allows the first and the second input image to be displayed at different depth levels 174. For an observer, the second input image 110, 110a, 110b will be perceived to be at a different depth level 174 as compared to what is in focus of the probe volume 136 in the first input image 108, 108a, 108b.
The optical assembly 100 may comprise an image adjustment device 176 which is configured to alter the relative position of the second input image and the first input image in the optical overlay or optical mixer 166, as indicated by arrows 172.
The optical assembly may further comprise a first mode of operation 146 and a second mode of operation 148. In the first mode of operation 146, the output image 112, 112a, 112b comprises the first input image 108, 108a, 108b and the second input image 110, 110a, 110b as an optical overlay. In the second mode of operation 148, the output image may comprise only one or a part of the first input image 108, 108a, 108b and the second input image 110, 110a, 110b. For example, in the second mode of operation 148, the optical path 152 in any one of the stereoscopic optical channels L, R may be shut off. In the first mode of operation 146, the optical path 152 is unblocked. For switching between the first mode of operation 146 and the second mode of operation 148, the optical assembly 100 may comprise a light blocking device 150, which may be an iris, a diaphragm, a shutter, a moveable polarizer and/or an adjustable filter. The light blocking device 150 is located between the probe volume and the optical overlay 113 or optical mixer 166 in the optical path 152. Alternatively or cumulatively, independently operable light blocking devices 150 may be positioned in each of the stereoscopic optical channels L, R, so that each of these channels may be blocked independently of one another.
The optical assembly 100 may comprise a camera system 178 which is configured for recording an intermediate input image 180. The intermediate input image 180 may comprise two stereoscopic part-images 180a, 180b if the camera system 178 is stereoscopic. The intermediate input image 180 may be processed by an image processor 182 and converted to the second input image 110, 110a, 110b, which is displayed by the display device 128 for the optical overlay. In order to record an intermediate input image 180, 180a, 180b which is registered and matched with the first input image 108, 108a, 108b, the intermediate input image 180, 180a, 180b may be recorded using the same optical path 152 as the first input image 108, 108a, 108b by inserting one or more beam splitters 181 into the optical path 152.
The camera system 178 may comprise one or more monochrome and/or color cameras 186, for example an RGB camera, a multi- or hyperspectral camera. A single camera 186 may be used for recording two stereoscopic optical channels simultaneously, by e.g. assigning a different area of an image sensor 188 of the camera 186 to each stereoscopic channel. Alternatively, a separate camera 186 may be used for each stereoscopic channel.
The intermediate input image 180, 180a, 180b may be recorded together with an image depth-location information 142 indicating for example the focal length used for recording the intermediate input image 180, 180a, 180b. Similarly, image depth-location information 142 may also be obtained from the lens 134 of the optical assembly 100, which is directed onto the probe volume 136 and may be an endoscope or microscope objective. The optical assembly 100 may be configured to change the relative position of the first input image 108, 108a, 108b and the second input image 110, 110a, 110b relative to one another in the optical overlay 113 depending on the image depth-location information 142. The first input image 108, 108a, 108b may be an image having visible-light content, as it is projected directly to the viewing system or the eyepiece 144. For example, the first input image 108, 108a, 108b may be an image containing fluorescence in the visible spectrum or be a white-light image.
The intermediate input image 180, 180a, 180b may be recorded using a subset of the wavelengths in the visible-light spectrum. For example, the intermediate input image 180, 180a, 180b may be recorded in only a narrow part of the visible spectrum. Alternatively or cumulatively, the intermediate input image 180, 180a, 180b may comprise non-visible wavebands, such as infrared, near-infrared, or ultraviolet light. The optical assembly 100, in particular the image processor 182, may be configured to compute a pseudo-color image from the intermediate input image 180, 180a, 180b, which pseudo-color image is then displayed as the second input image 110, 110a, 110b. The image processor 182 may further be configured to perform any other kind of image processing on the intermediate input image 180, 180a, 180b.
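One simple way to compute a pseudo-color image from a monochrome intermediate input image can be sketched under the assumption of a single-hue mapping, e.g. green for a fluorescence band, with intensities normalized to the range 0..1. The function name and the color choice are illustrative, not part of the disclosure.

```python
import numpy as np

def pseudo_color(intensity: np.ndarray, color=(0.0, 1.0, 0.0)) -> np.ndarray:
    # Map a monochrome intermediate image (values 0..1) to a single-hue
    # pseudo-color RGB image by scaling the chosen hue with the intensity.
    intensity = np.clip(np.asarray(intensity, dtype=float), 0.0, 1.0)
    return intensity[..., None] * np.asarray(color, dtype=float)
```

Richer mappings (e.g. multi-hue colormaps for hyperspectral bands) would replace the single scaling step with a lookup table.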
Alternatively or in addition to the intermediate input image 180, 180a, 180b recorded by a camera system 178, the second input image 110, 110a, 110b may contain or consist of images that have been recorded outside the optical assembly 100. For example, the second input image 110, 110a, 110b may comprise tomographic, radiographic and/or ultrasound data. The optical assembly 100 may be configured to retrieve the second input image 110, 110a, 110b from a storage system 190, e.g. a computer memory or stationary or removable storage, such as a USB stick, a hard disk, or a memory card. The image processor 182 may be configured to render the second input image 110, 110a, 110b from such data.
The optical assembly 100, in particular the image processor 182, may be configured to generate the second input image 110, 110a, 110b synthetically. For example, the second input image 110, 110a, 110b may comprise temporally and/or spatially regularly varying patterns, such as a blinking color, a moving hatching, or a stationary geometrically repetitive pattern, such as a hatching.
If the second input image 110, 110a, 110b is a stereoscopic image, the optical assembly 100, in particular the image processor 182, may be configured to modify a binocular disparity 140 in the stereoscopic second input image 110, 110a, 110b, in particular relative to a binocular disparity 140 in the first input image 108, 108a, 108b. The modification of the binocular disparity translates into a different perceived depth level 174. Depending on the relative binocular disparities, the first input image 108, 108a, 108b is perceived to be in front of, behind or coincident with the second input image 110, 110a, 110b. The alteration of the binocular disparity 140 may be used as an alternative to or in combination with an alteration of the relative position of the first and second input image in the optical overlay. Altering the binocular disparity 140 makes it possible to alter the depth level 174 without needing to change the relative position of the first and second input image in the optical overlay.
The camera system 178 may be used to determine the color 120a or brightness 120b at a location 118 in the first input image 108, 108a, 108b. Alternatively, a separate camera 200, shown in Fig. 2, may be used. The camera 200 may record the first input image 108, 108a, 108b for documentary purposes or for display, e.g. on a monitor 202, for assistant staff or an audience.
In Fig. 3, a schematic rendition of an image adjustment device 176 is shown. A translucent display 130 is moved relative to a first image plane 300 of the first input image 108, 108a, 108b. The first input image 108, 108a, 108b is projected onto the first image plane 300 by the first imaging sub-assembly 158. The translucent display 130 represents the second image plane 302. In the first image plane 300, a screen 304 may be provided, onto which the first input image 108, 108a, 108b is projected. The image adjustment device 176 comprises a drive system 306 to move the second image plane 302 with respect to the first image plane 300. For this, the image adjustment device 176 may be provided with a linear drive system 308, such as a ball screw, and an actuator 310, such as an electric motor.
As is clear from Fig. 3, moving the second image plane 302 relative to the first image plane 300 will create the impression that the two images are separated from one another, i.e. lie at different depth levels 174 in the object 138. For example, an observer or a camera looking through the binocular eyepiece 144 may view the second input image displayed at the second image plane 302 at position A as lying in plane 312a and at position B in plane 312b, whereas the first input image is perceived to be located at plane 312c in the object 138 in both positions A and B.
With respect to Fig. 4, it is explained how the color at the location 126 of the second input image 110 is changed depending on the color at the matching location 126 in the first input image 108, 108a, 108b to obtain a predetermined color 116a in the location 118 in the output image 112. For simplicity, only one stereoscopic channel is shown in Fig. 4.
If, for example, it is desired to have yellow as the color 116a in the output image 112 at the location 118, and the color 120a at the location 126 in the first input image 108, 108a, 108b is red, the color 124a of the location 126 in the second input image will be set to green. If an additive mixing takes place in the optical overlay 113, or in the optical mixer 166, respectively, the two colors 124a and 120a add up to yellow. As described above, any one of the cameras 156, 178, 186 may be used to establish a forward or, as desired, a feedback control. The image processor 182 may be configured to compute the color 124a at the location 126 to arrive at the desired color 116a in the output image 112 depending on the color 120a in the matching location 126 of the first input image 108, 108a, 108b. In the same way, the brightness at the location 118 may be adjusted to a desired value.
Next, with reference to Fig. 5, adjusting the binocular disparity 140 differently in the first input image 108, 108a, 108b and the second input image 110, 110a, 110b is explained. For an observer looking, for example, through the eyepiece 144, the object is located at a certain distance, i.e. depth level 174. For the binocular observer, the left stereoscopic part-image 108a is therefore viewed at a different angle than the right stereoscopic part-image 108b, depending on the depth level. Thus, a structure 500 in the first input image 108, 108a, 108b will be shifted laterally in the right stereoscopic part-image 108b with respect to the left stereoscopic part-image 108a. This shift is the binocular disparity 140, which in the first input image is designated 140a. To visualize the binocular disparity 140, 140a, the structure 500a as seen in the left stereoscopic channel L is inserted into the right stereoscopic channel R in dotted lines in addition to the same structure 500b as seen in the right stereoscopic channel. The closer the object is to the observer, the larger the binocular disparity 140a is.
By changing the binocular disparity 140b in the stereoscopic second input image 110a, 110b, the distance is altered at which a stereoscopic observer perceives a structure 502, or 502a in the left and 502b in the right stereoscopic channel. Increasing the binocular disparity 140b makes the structure 502 appear closer to the observer. Lowering the binocular disparity 140b puts the structure 502 further away from the observer. It should be noted that the second input image 110a, 110b may contain a plurality of different structures 502, of which the binocular disparity 140b may be changed independently. Preferably, the image processor 182 of the observation apparatus 104 is configured to compute the binocular disparity 140b of a structure 502 which has been identified by a pattern recognition algorithm. If the binocular disparity 140b of the structure 502 is smaller than the binocular disparity 140a of the structure 500, the structure 502 will be viewed as being further away from the observer than the structure 500.
Each structure 502 in the second input image 110a, 110b may be assigned a different depth level. For example, the depth level may be part of the data defining the structure 502 in the second input image 110a, 110b. The binocular disparity 140b is then computed depending on the depth-level information for the structure 502, or, if the entire second input image 110a, 110b comprises only a single image depth-location information, depending on that image depth-location information.
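The relation between an assigned depth level and the rendered binocular disparity can be sketched with the pinhole stereo model, in which disparity is inversely proportional to depth: structures with a larger depth value receive a smaller lateral shift between the two stereoscopic part-images. The baseline and focal-length values below are illustrative placeholders, not values from this disclosure.

```python
def disparity_from_depth(depth_mm: float, baseline_mm: float = 24.0,
                         focal_px: float = 1200.0) -> float:
    # Pinhole stereo pair: disparity (in pixels) = baseline * focal / depth.
    # Halving the depth doubles the disparity, so nearer structures are
    # rendered with a larger shift between the left and right part-images.
    return baseline_mm * focal_px / depth_mm
```

A renderer would then shift each structure 502 horizontally by the computed number of pixels, in opposite directions in the two part-images.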
Next, an embodiment of a method is explained with reference to Fig. 6.
In a first step 600, which may not necessarily be carried out by the optical assembly 100 or the observation apparatus 104, but in a separate device, the second input image 110, 110a, 110b is recorded. Of course, the camera system 178 may be used to record an intermediate input image 180, 180a, 180b which may form the basis for the second input image 110, 110a, 110b.
In a step 602, the second input image 110, 110a, 110b is retrieved, e.g. from a storage device. If the second input image is recorded by, e.g., the camera system 178, which is part of the optical assembly 100, the second input image may be retrieved directly from the camera or, again, from an intermediate storage device. In step 604, the color and/or brightness at at least one location in the first input image 108, 108a, 108b is determined, e.g. by recording the first input image using the camera system 178.
According to step 606, the color and/or brightness at a matching location in the second input image 110, 110a, 110b is adjusted depending on the color and/or brightness at the location in the first input image 108, 108a, 108b, as determined in step 604.
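Steps 604 and 606 can be illustrated with a minimal sketch, assuming a purely additive optical mix of the two input images: to obtain a predetermined value at a location of the overlay, the second input image is set, at the matching location, to the remainder between the target value and the value measured in the first input image. Function and parameter names are placeholders for illustration:

```python
import numpy as np

def adjust_second_image(first: np.ndarray, second: np.ndarray,
                        target: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Adapt the second input image so that an additive optical overlay
    reaches a predetermined color/brightness at the masked locations.

    first, second, target: float RGB images in [0, 1];
    mask: boolean map of the locations where the predetermined
    color/brightness should be obtained.
    """
    adjusted = second.copy()
    # The overlay is first + second, so the second image must contribute
    # the (non-negative) remainder up to the target value.
    adjusted[mask] = np.clip(target[mask] - first[mask], 0.0, 1.0)
    return adjusted
```

Where the first input image is already brighter than the target, an additive mix alone cannot darken the overlay; that case corresponds to the subtractive mixing or light blocking described elsewhere in the application.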
In step 608, the binocular disparity of the second input image 110, 110a, 110b may be adjusted, in particular depending on depth-level information. The binocular disparity may either be the same for the two stereoscopic part-images of the second input image 110a, 110b, or different structures in the second input image 110, 110a, 110b may be displayed with different binocular disparities.
In step 610, the first input image 108, 108a, 108b is projected, e.g. from an object 138 to be inspected. In step 612, the first input image 108, 108a, 108b and the second input image 110, 110a, 110b are superimposed to form the optical overlay 113 and thus the output image 112, 112a, 112b. In step 614, the optical overlay 113 or the output image 112, 112a, 112b may be recorded, e.g. to provide feedback control for adjusting the color and/or brightness and/or binocular disparity in the second input image 110, 110a, 110b.
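The sequence of steps 602 to 614, including the optional feedback loop of step 614, can be summarized in a short control-loop sketch. All callables and their names are assumptions standing in for the storage system, camera system, image processor and optical mixer of the apparatus:

```python
def overlay_pipeline(retrieve_second, record_first, adjust, overlay,
                     record_output, iterations: int = 1):
    """One pass of the method of Fig. 6 (steps 602-614), with an optional
    feedback loop in which the recorded output refines the adjustment.

    All arguments are caller-supplied callables; `adjust` receives the
    previously recorded output (None on the first pass) as feedback.
    """
    second = retrieve_second()                  # step 602
    output = None
    for _ in range(iterations):
        first = record_first()                  # step 604: determine color/brightness
        second = adjust(first, second, output)  # steps 606/608: adapt second image
        output = overlay(first, second)         # steps 610/612: optical overlay
        record_output(output)                   # step 614: record for feedback
    return output
```

With `iterations > 1`, the recorded overlay is fed back into the adjustment, mirroring the feedback control mentioned for step 614.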
Instead of or in addition to adjusting the binocular disparity in step 608, the first image plane 300 and the second image plane 302 may be moved relative to each other depending on the depth-level information.
Although some aspects have been described in the context of an apparatus, it is clear that these aspects also represent a description of the corresponding method, where a block or device corresponds to a method step or a feature of a method step. Analogously, aspects described in the context of a method step also represent a description of a corresponding block or item or feature of a corresponding apparatus. Some or all of the method steps may be executed by (or using) a hardware apparatus, like for example, a processor, a microprocessor, a programmable computer or an electronic circuit. In some embodiments, one or more of the most important method steps may be executed by such an apparatus.
Depending on certain implementation requirements, embodiments of the invention can be implemented in hardware or in software. The implementation can be performed using a non-transitory storage medium such as a digital storage medium, for example a floppy disc, a DVD, a Blu-Ray, a CD, a ROM, a PROM, an EPROM, an EEPROM or a FLASH memory, having electronically readable control signals stored thereon, which cooperate (or are capable of cooperating) with a programmable computer system such that the respective method is performed. Therefore, the digital storage medium may be computer readable.
Some embodiments according to the invention comprise a data carrier having electronically readable control signals, which are capable of cooperating with a programmable computer system, such that one of the methods described herein is performed.
Generally, embodiments of the present invention can be implemented as a computer program product with a program code, the program code being operative for performing one of the methods when the computer program product runs on a computer. The program code may, for example, be stored on a machine readable carrier.
Other embodiments comprise the computer program for performing one of the methods described herein, stored on a machine readable carrier.
In other words, an embodiment of the present invention is, therefore, a computer program having a program code for performing one of the methods described herein, when the computer program runs on a computer.
A further embodiment of the present invention is, therefore, a storage medium (or a data carrier, or a computer-readable medium) comprising, stored thereon, the computer program for performing one of the methods described herein when it is performed by a processor. The data carrier, the digital storage medium or the recorded medium are typically tangible and/or non-transitory. A further embodiment of the present invention is an apparatus as described herein comprising a processor and the storage medium.
A further embodiment of the invention is, therefore, a data stream or a sequence of signals representing the computer program for performing one of the methods described herein. The data stream or the sequence of signals may, for example, be configured to be transferred via a data communication connection, for example, via the internet.
A further embodiment comprises a processing means, for example, a computer or a programmable logic device, configured to, or adapted to, perform one of the methods described herein.
A further embodiment comprises a computer having installed thereon the computer program for performing one of the methods described herein. A further embodiment according to the invention comprises an apparatus or a system configured to transfer (for example, electronically or optically) a computer program for performing one of the methods described herein to a receiver. The receiver may, for example, be a computer, a mobile device, a memory device or the like. The apparatus or system may, for example, comprise a file server for transferring the computer program to the receiver.
In some embodiments, a programmable logic device (for example, a field programmable gate array) may be used to perform some or all of the functionalities of the methods described herein. In some embodiments, a field programmable gate array may cooperate with a microprocessor in order to perform one of the methods described herein. Generally, the methods are preferably performed by any hardware apparatus.
The image processor 182 may comprise hardware, such as one or more CPUs, FPUs, vector processors or graphics processors, software, or any combination thereof. The image processor may be part of a computer system 192, which may further be configured to control the observation apparatus 104.
REFERENCE NUMERALS
100 optical assembly
102 head-up display
104 observation apparatus
106a microscope
106b endoscope
108 first input image
108a stereoscopic part-image of a first input image
108b stereoscopic part-image of a first input image
110 second input image
110a stereoscopic part-image of second input image
110b stereoscopic part-image of second input image
112 output image
112a stereoscopic part-image of output image
112b stereoscopic part-image of output image
113 optical overlay
114 viewing system
116a color at a location in output image
116b brightness at location in output image
118 location in output image
120a color at a matching location in first input image
120b brightness at a matching location in first input image
122 matching location in first input image
124a color at a matching location in second input image
124b brightness at a matching location in second input image
126 matching location in second input image
128 display device
130 translucent display
132 projection system on projector
133 display
134 lens
136 probe volume
138 object
139 field of view
140, 140a, 140b binocular disparity
142 image depth-location information
144 eyepiece
146 first mode of operation
148 second mode of operation
150 light blocking device
152 optical path of one of the first and second input image
154 optical path of the other one of the first and second input image
156 camera
158 first imaging sub-assembly
160 second imaging sub-assembly
162 projector lens
164 beam splitter
166 optical mixer
168 illumination light
170 illumination system
172 arrow
174 depth levels
176 image adjustment device
178 camera system
180 intermediate input image
180a stereoscopic part-image of intermediate input image
180b stereoscopic part-image of intermediate input image
181 beam splitter
182 image processor
184 beam splitter
186 camera
188 image sensor
190 storage system
192 computer system
200 camera
202 monitor
300 first image plane
302 second image plane
304 screen
306 drive system
308 linear drive system
310 actuator
312a, 312b, 312c planes in object
500, 500a, 500b structure in first input image
502, 502a, 502b structure in second input image
600 recording the second input image
602 retrieving the second input image
604 determining color and/or brightness
606 adjusting the color and/or brightness
608 adjustment of binocular disparity
610 projecting the first input image
612 generating the optical overlay
614 recording the optical overlay
B position
L left stereoscopic (optical) channel
R right stereoscopic (optical) channel

Claims

1. Optical assembly (100), in particular head-up display (102), for an observation apparatus (104) such as a microscope (106a) or endoscope (106b), the optical assembly being configured to generate an output image (112, 112a, 112b) for projection to a viewing system (114), the output image (112, 112a, 112b) comprising an optical overlay (113) of a first input image (108, 108a, 108b) and a second input image (110, 110a, 110b) onto one another, wherein the optical assembly (100) is further configured to create at least one of a predetermined color (116a) and predetermined brightness (116b) in at least one location (118) of the output image (112, 112a, 112b) by adapting at least one of a color (124a) and a brightness (124b) at a matching location (126) in the second input image depending on at least one of a color (120a) and a brightness (120b) at a matching location (122) in the first input image.
2. Optical assembly (100) according to claim 1, the optical assembly (100) being configured to at least one of additively and subtractively mix the first input image (108, 108a, 108b) and the second input image (110, 110a, 110b) to generate the optical overlay (113).
3. Optical assembly (100) according to claim 1 or 2, the optical assembly (100) comprising at least one display device (128), the at least one display device (128) being configured to display the second input image (110, 110a, 110b).
4. Optical assembly (100) according to claim 3, wherein the at least one display device (128) comprises at least one translucent display (130) for displaying the second input image (110, 110a, 110b), the first input image (108, 108a, 108b) being visible through the translucent display (130).
5. Optical assembly (100) according to any one of claims 1 to 4, the optical assembly (100) comprising a projection system (132), the projection system (132) configured to project the second input image (110, 110a, 110b) onto the first input image (108, 108a, 108b) to generate the optical overlay (113).
6. Optical assembly (100) according to any one of claims 1 to 5, wherein the optical assembly (100) is configured to at least one of darken and brighten at least one predetermined location (118) in the output image (112, 112a, 112b) relative to a matching location (122, 126) in at least one of the first input image (108, 108a, 108b) and the second input image (110, 110a, 110b).
7. Optical assembly (100) according to any one of claims 1 to 6, wherein the optical assembly (100) comprises a lens (134) directed to a probe volume (136), the probe volume being configured to receive an object (138) to be inspected, the first input image (108, 108a, 108b) being projected from the probe volume (136) by the lens (134).
8. Optical assembly (100) according to any one of claims 1 to 7, wherein at least one of the first input image (108, 108a, 108b), the second input image (110, 110a, 110b) and the output image (112, 112a, 112b) is a stereoscopic image.
9. Optical assembly (100) according to claim 8, the optical assembly (100) being configured to alter a binocular disparity (140) in the second input image (110, 110a, 110b).
10. Optical assembly (100) according to claim 9, wherein at least one of the first input image (108, 108a, 108b) and the second input image (110, 110a, 110b) comprises image depth-location information (142), the optical assembly (100) being configured to alter the binocular disparity (140) depending on the image depth-location information (142).
11. Optical assembly (100) according to claim 10, wherein both the first input image (108, 108a, 108b) and the second input image (110, 110a, 110b) comprise image depth-location information (142), the optical assembly (100) being configured to alter the binocular disparity (140) depending on a difference between the image depth-location information (142) of the first input image (108, 108a, 108b) and the image depth-location information (142) of the second input image (110, 110a, 110b).
12. Optical assembly (100) according to any one of claims 1 to 11, wherein the optical assembly (100) comprises at least two modes (146, 148) of operation, wherein, in one mode of operation, the output image (112, 112a, 112b) comprises the first input image (108, 108a, 108b) and the second input image (110, 110a, 110b), and, in another mode of operation, the output image (112, 112a, 112b) comprises only one of the first input image (108, 108a, 108b) and the second input image (110, 110a, 110b).
13. Optical assembly (100) according to any one of claims 1 to 12, wherein the optical assembly (100) comprises a light blocking device (150), the light blocking device (150) being located in one of an optical path (152) of the first input image (108, 108a, 108b) and an optical path (154) of the second input image (110, 110a, 110b), the light blocking device (150) being configured to one of darken and shut off the respective optical path (152, 154).
14. Observation apparatus (104) comprising at least one of a microscope (106a) and an endoscope (106b), the at least one of a microscope (106a) and endoscope (106b) comprising an optical assembly (100) according to any one of claims 1 to 13.
15. Method for generating an output image (112, 112a, 112b) from a first input image (108, 108a, 108b) and a second input image (110, 110a, 110b), the method comprising the steps of
- optically overlaying the first input image (108, 108a, 108b) and the second input image (110, 110a, 110b), and
- obtaining at least one of a predetermined color (116a) and a predetermined brightness (116b) at a location (118) of the output image (112, 112a, 112b) by changing, at overlying locations (118, 122), at least one of the brightness and color (120, 124) of the second input image (110, 110a, 110b) depending on at least one of the brightness and color (120, 124) of the first input image (108, 108a, 108b).
16. Computer program with a program code for obtaining at least one of a predetermined color (116a) and a predetermined brightness (116b) at a location (118) of the output image (112, 112a, 112b) by changing, at overlying locations (118, 122), at least one of the brightness and color (120, 124) of the second input image (110, 110a, 110b) depending on at least one of the brightness and color (120, 124) of the first input image (108, 108a, 108b) when the computer program is run on a processor.

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP19731268.9A EP3983844A1 (en) 2019-06-14 2019-06-14 Optical assembly, observation apparatus and method for optically overlaying input images
PCT/EP2019/065733 WO2020249236A1 (en) 2019-06-14 2019-06-14 Optical assembly, observation apparatus and method for optically overlaying input images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2019/065733 WO2020249236A1 (en) 2019-06-14 2019-06-14 Optical assembly, observation apparatus and method for optically overlaying input images

Publications (1)

Publication Number Publication Date
WO2020249236A1 true WO2020249236A1 (en) 2020-12-17

Family

ID=66912861

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2019/065733 WO2020249236A1 (en) 2019-06-14 2019-06-14 Optical assembly, observation apparatus and method for optically overlaying input images

Country Status (2)

Country Link
EP (1) EP3983844A1 (en)
WO (1) WO2020249236A1 (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5488492A (en) * 1993-06-04 1996-01-30 Asahi Kogaku Kogyo Kabushiki Kaisha Apparatus for adjusting color tone of image to be recorded
EP1008005A1 (en) * 1996-09-27 2000-06-14 Leica Inc. Optical in situ information system
EP3248531A1 (en) * 2016-05-23 2017-11-29 Leica Instruments (Singapore) Pte. Ltd. Medical observation device, such as a microscope or an endoscope, and method using a pseudo-color pattern having temporal and/or spatial modulation

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220343834A1 (en) * 2021-04-23 2022-10-27 Netflix, Inc. Adjustable light-emissive elements in a display wall
US11694604B2 (en) * 2021-04-23 2023-07-04 Netflix, Inc. Adjustable light-emissive elements in a display wall

Also Published As

Publication number Publication date
EP3983844A1 (en) 2022-04-20


Legal Events

121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 19731268; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
WWE WIPO information: entry into national phase (Ref document number: 2019731268; Country of ref document: EP)