EP0623209A4 - Multifocal optical apparatus. - Google Patents

Multifocal optical apparatus.

Info

Publication number
EP0623209A4
EP0623209A4 (Application EP93904495A)
Authority
EP
European Patent Office
Prior art keywords
image
ccd
imager
scene
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP19930904495
Other languages
English (en)
Other versions
EP0623209A1 (en)
Inventor
Shelemyahu Zacks
Ziv Soferman
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Publication of EP0623209A1
Publication of EP0623209A4


Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/36Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals

Definitions

  • the present invention relates to focusing optical systems.
  • a significant limitation in the operation of many optical devices is their limited depth of field, i.e. the depth over which an object in a field of view remains in acceptable focus.
  • the present invention seeks to provide optical apparatus capable of in-focus imaging of a significantly greater depth than that imaged by conventional optical apparatus.
  • optical apparatus including multifocal imaging apparatus defining a depth domain, sensor apparatus receiving light from a scene via the multifocal imaging apparatus and image processing apparatus for receiving an output of the sensor apparatus and providing an in-focus image of the portion of the scene falling within the depth domain.
  • the depth domain includes all planes falling between and parallel to first and second planes which are both orthogonal to the optical axis.
  • the depth domain includes a first subdomain including all planes falling between and parallel to first and second planes which are both orthogonal to the optical axis and a second subdomain, disjoint to the first subdomain, which includes all planes falling between and parallel to third and fourth planes which are both orthogonal to the optical axis and which are not included in the first subdomain.
  • the domain includes more than two disjoint subdomains such as those described above.
  • imaging apparatus defining a plurality of surfaces, each arranged to receive light from a scene, each of the plurality of surfaces having a different focal length, sensor apparatus receiving light from the scene from each of the plurality of surfaces, such that a different part of the light from the scene received from each of the plurality of surfaces is in focus, and image processing apparatus for receiving an output of the sensor apparatus and providing an image including in-focus parts received from each of the plurality of surfaces.
  • the image processing apparatus is operative to provide a restoration procedure which produces a sharp image from a blurred image provided by the optical apparatus.
  • the imaging apparatus includes lens apparatus. In accordance with another preferred embodiment of the invention, the imaging apparatus includes mirror apparatus.
  • an image processing method including the steps of providing a digital representation of a viewed scene including at least two scene locations whose distances from the point of view are nonequal, dividing the digital representation of the viewed scene into a multiplicity of digital representations of a corresponding plurality of portions of the viewed scene and separately sharpening each of the multiplicity of digital representations and assembling the multiplicity of sharpened digital representations into a single sharpened digital representation.
  • the step of dividing and sharpening includes the steps of operating a plurality of restoration filters on each of the multiplicity of digital representations, and selecting, for each individual one of the multiplicity of digital representations, an individual one of the plurality of restoration filters to be employed in providing the sharpened digital representation of the individual one of the multiplicity of digital representations.
  • optical apparatus including a multifocal imager defining a depth domain, a sensor receiving light from a scene via the multifocal imager, and an image processor for receiving an output of the sensor and ' for providing an in-focus image of the portion of the scene falling within the depth domain.
  • the image processor operates in real time.
  • the image processor operates off line with respect, to the sensor.
  • the sensor includes photographic film.
  • the image processor includes an image digitizer operative to provide a digital representation of the scene to the image processor.
  • the multifocal imager defines a plurality of surfaces, each arranged to receive light from a scene, each of the plurality of surfaces having a different focal length.
  • the sensor receives light from the scene via each of the plurality of surfaces, such that a different part of the light from the scene received via each of the plurality of surfaces is in focus.
  • the image processor is operative to provide a composite image built up using in-focus parts received from each of the plurality of surfaces.
  • the imager includes a lens and/or a mirror system.
  • the imager has an invertible transfer function and the image processor includes a computational unit for inverting the transfer function, thereby to restore sharp details of the scene.
  • the transfer function includes no zeros for predetermined domains of distance from the imager.
  • the transfer function includes no zeros.
  • the absolute value of the transfer function has a predetermined lower bound which is sufficiently large to permit image restoration.
  • the absolute value of the transfer function has a predetermined lower bound which is sufficiently large to permit image restoration for a predetermined domain of distances from the imager.
  • an image processing method including the steps of providing a digital representation of a viewed scene including at least two scene locations whose distances from the point of view are nonequal, dividing the digital representation of the viewed scene into a multiplicity of digital representations of a corresponding plurality of portions of the viewed scene and separately sharpening each of the multiplicity of digital representations, and assembling the multiplicity of sharpened digital representations into a single sharpened digital representation.
  • the step of dividing and sharpening includes the steps of operating a plurality of restoration filters on each of the multiplicity of digital representations, and selecting, for each individual one of the multiplicity of digital representations, an individual one of the plurality of restoration filters to be employed in providing the sharpened digital representation of the individual one of the multiplicity of digital representations.
  • video camera apparatus including an imager defining a depth domain and a sensor receiving light from a scene via the imager for generating video images, wherein the imager includes a multifocal imager, and wherein the video camera apparatus also includes an image processor for receiving an output of the sensor and for providing an in-focus image of the portion of the scene falling within the depth domain.
  • electronic still camera apparatus including an imager defining a depth domain and a sensor receiving light from a scene via the imager for generating electronic still images, wherein the imager includes a multifocal imager, and wherein the electronic still camera apparatus also includes an image processor for receiving an output of the sensor and for providing an in-focus image of the portion of the scene falling within the depth domain.
  • camcorder apparatus including an imager defining a depth domain and a sensor receiving light from a scene via the imager for generating video images, wherein the imager includes a multifocal imager, and wherein the camcorder apparatus also includes an image processor for receiving an output of the sensor and for providing an in-focus image of the portion of the scene falling within the depth domain.
  • film camera development apparatus including a multifocal-combining image processor operative to receive a multifocal representation of a scene and to provide an in-focus representation of a portion of the scene falling within a predetermined depth domain.
  • microscope apparatus including an objective lens including a multifocal imager defining a depth domain, a sensor receiving light from a scene, having more than two dimensions, via the multifocal imager, and an image processor for receiving an output of the sensor and for providing an in-focus image of a plurality of planes within the scene which fall within the depth domain.
  • apparatus for internal imaging of the human body including a multifocal imager defining a depth domain, which is configured for insertion into the human body, a sensor receiving light from an internal portion of the human body via the imager, and an image processor for receiving an output of the sensor and for providing an in-focus image of the internal portion of the human body falling within the depth domain.
  • the multifocal imager is operative to provide an image, including a superposition of in-focus contributions of a plurality of planes, to a single sensor.
  • the multifocal imager is operative to provide an image, including a superposition of in-focus contributions of an infinite number of planes.
  • the sensor includes a single sensor.
  • an imaging method including the steps of providing a multifocal imager defining a depth domain, sensing light from a scene via the multifocal imager, and receiving an output of the sensor and image processing the output, thereby to provide an in-focus image of the portion of the scene falling within the depth domain.
  • Fig. 1 is a generalized block diagram illustration of the optical apparatus of the invention
  • Fig. 2A is a conceptual illustration of imaging lens apparatus useful in the apparatus of Fig. 1;
  • Fig. 2B is a conceptual illustration of imaging mirror apparatus useful in the apparatus of Fig. 1;
  • Fig. 3 is a simplified illustration of the images produced at a sensor by the apparatus of Figs. 2A and 2B;
  • Fig. 4 is an optical diagram of a typical design of an imaging lens assembly useful in the apparatus of Fig. 1;
  • Fig. 5 is an optical diagram of a typical design of an imaging mirror useful in the apparatus of Fig. 1;
  • Fig. 6 is a simplified block diagram of image processing apparatus particularly suited for processing gray level images and useful in the apparatus in Fig. 1;
  • Fig. 7 is a simplified block diagram of another embodiment of image processing apparatus particularly suited for processing gray level images and useful in the apparatus of Fig. 1;
  • Fig. 8 is a conceptual illustration of a method for computing a restoration transfer function provided in accordance with one preferred embodiment of the present invention.
  • Fig. 9 is a conceptual illustration of an alternative method to the method of Fig. 8;
  • Fig. 10 is a conceptual illustration of a method for fusing a plurality of restored images into a single restored image;
  • Fig. 11 is a simplified block diagram of an alternative embodiment of image processing apparatus particularly suited for processing color images and useful in the apparatus of Fig. 1;
  • Fig. 12 is a simplified block diagram of video or electronic still camera apparatus or camcorder apparatus in which conventional functions of these devices are combined with the functions of the invention shown and described herein;
  • Fig. 13 is a simplified block diagram of film camera apparatus in which conventional functions of film cameras are combined with the functions of the invention shown and described herein;
  • Fig. 14 is a simplified block diagram
  • Fig. 15 is a simplified block diagram of a laparoscope or endoscope in which conventional laparoscopy/endoscopy functions are combined with the functions of the invention shown and described herein;
  • Fig. 16 is a simplified block diagram of laparoscopy or endoscopy apparatus which is a modification of the apparatus of Fig. 15;
  • Fig. 17 is a simplified block diagram of multifocal imaging apparatus constructed and operative in accordance with another embodiment of the present invention.
  • DETAILED DESCRIPTION OF A PREFERRED EMBODIMENT Reference is now made to Fig. 1, which illustrates optical apparatus constructed and operative in accordance with a preferred embodiment of the present invention and including imaging apparatus 10, such as multifocal lens apparatus or multifocal mirror apparatus, arranged to receive light from a scene 11 which includes objects located at various distances from the imaging apparatus.
  • imaging apparatus 10 such as multifocal lens apparatus or multifocal mirror apparatus
  • the imaging apparatus is operative to direct light from the scene to a sensor 12 such that focused light from various objects in the scene at different distances from the imaging apparatus simultaneously reaches the sensor.
  • Sensor 12 is typically a charge coupled device (CCD).
  • the imaging apparatus 10 has a transfer function which varies just slightly with the distance between the imaging apparatus and an object in the working domain (depth domain), as measured parallel to the optical axis of the optical apparatus.
  • the working domain is a continuous or discontinuous range along the optical axis.
  • the transfer function preferably has no zeros in the working domain.
  • the absolute value of the transfer function in the working domain has a lower bound V, which is large enough to allow image restoration, which can be provided computationally, even in the presence of a reasonable amount of noise.
  • the transfer function is invertible such that the sharp details of the scene within the depth domain can be restored by computational means.
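The invertibility condition above can be illustrated numerically: restoration by division is practical wherever |H(u,v,d)| stays above the lower bound V across the whole working domain. A minimal sketch, assuming a Gaussian low-pass model of H and illustrative sample values (neither is taken from the patent):

```python
import numpy as np

def restoration_feasible(H, V):
    """H: array of shape (n_depths, nu, nv) holding the imager's
    transfer function sampled at several depths d in the working
    domain. Restoration by division is treated as feasible when
    |H| never drops below the lower bound V at any depth or
    spatial frequency."""
    return bool(np.abs(H).min() >= V)

# Illustrative model: a Gaussian low-pass transfer function whose
# width varies only slightly with depth, as the text requires.
u = np.linspace(-1.0, 1.0, 64)
uu, vv = np.meshgrid(u, u)
depths = [0.9, 1.0, 1.1]
H = np.stack([np.exp(-(uu**2 + vv**2) / (2 * (0.5 * d) ** 2)) for d in depths])

print(restoration_feasible(H, V=1e-3))   # True: |H| is bounded away from zero
print(restoration_feasible(H, V=1e-2))   # False: this bound is too demanding
```

The second call shows why the bound matters: the same transfer function is invertible in principle, yet too small at high frequencies to restore reliably in the presence of noise if a larger V is required.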
  • the composite image seen by the sensor 12 may normally appear out of focus to a human eye.
  • electronic image processing apparatus 14 is provided for converting the information received by sensor 12 to an in-focus image of the scene, which is displayed on an output device 16, such as a display, image analyzer, video recorder or other storage means, or printer.
  • output device 16 such as a display, image analyzer, video recorder or other storage means, or printer.
  • record and playback functions can be provided downstream of sensor 12 and upstream of image processor 14. Record and playback functions can be analog with subsequent digitization or, alternatively, digitization can be performed upstream of the record and playback functions, in which case the record and playback functions are digital.
  • FIG. 2A conceptually illustrates multifocal lens means 20 useful as the imaging apparatus of the present invention.
  • Fig. 2A shows that the multifocal lens means 20 images two objects, indicated as A and B, such as point light sources, which are situated at two different distances from the lens assembly 20, onto an image plane 22.
  • Each of the images of objects A and B in the image plane 22 is built up from a superposition of in-focus contributions made by some portion of lens means 20 and of out-of- focus contributions made by other portions of the lens means 20.
  • for each object, the in-focus contribution comes from a different portion of the lens means 20.
  • the in-focus contribution comes from region 2 of the lens means and for object B, the in-focus contribution comes from region 1 of the lens means 20.
  • Therefore, their depth domains may overlap rather than being disjoint and, consequently, they are attenuated by less than the attenuation factor of the high resolution in-focus contributions, i.e., in the illustrated example, they are attenuated by a factor of less than 3.
  • resolution refers to the number of resolvable lines per unit length, e.g. resolvable lines/mm.
  • FIG. 2B conceptually illustrates a multifocal mirror assembly 30 useful as the imaging apparatus of the present invention.
  • Fig. 2B shows that the multifocal mirror assembly 30 focuses two objects A and B, such as point light sources, which are situated at two different distances from the mirror assembly 30, onto an image plane 22.
  • each of the images of objects A and B in the image plane 22 is built up from a superposition of in- focus contributions made by some portion of mirror assembly 30 and of out-of-focus contributions made by other portions of the mirror assembly 30.
  • Fig. 3 provides a qualitative indication of the appearance of the images of the two objects A and B as seen in the image plane 22.
  • the imaging means of the present invention may comprise a plurality of discrete regions of different focal lengths, as in the embodiment of Fig. 2A, for example.
  • the imaging means may comprise a continuum of locations each corresponding to a different focal length.
  • the "concentric ring" implementation of the multifocal imaging apparatus shown and described herein may be replaced by a "continuous" implementation in which the focal length of the lens varies continuously over the lens as a function of the distance from the center of the lens.
  • the final image may be based on a combination of high resolution (high frequency) components arriving from each plane in space which intersects the scene and is perpendicular to the optical axis.
  • high resolution components of the image are based on information arriving from only a thin slice, orthogonal to the optical axis, which contains in-focus information. High resolution (high frequency) components arriving from all other planes are strongly attenuated.
  • Fig. 4 illustrates an optical design of a preferred embodiment of lens means of the Double Gauss type, having a continuously varying focal length.
  • the lens means of Fig. 4 includes a plurality of lens portions, each having corresponding lens surfaces, labeled I - XIV.
  • T: the distance from an individual surface to the next surface
  • N: index of refraction
  • V: Abbe number, (Nd - 1)/(Nf - Nc)
  • Nd: index of refraction for the Sodium d line
  • Nf: index of refraction for the Hydrogen F line
  • Nc: index of refraction for the Hydrogen C line (0.6563 microns).
  • the continual variation in focus is provided by a Schmidt plate, whose surfaces are referenced V and VI, which is located in the plane of the aperture stop and which is configured and arranged to generate multifocality by providing spherical aberration.
  • surfaces V and VI are generally parallel except for an aspheric correction term, (+0.5 x 10^-3 x rho^4), for surface VI.
  • the exit pupil is, for computation purposes, similar to a union of an infinite number of infinitesimal concentric rings around the optical axis, all with the same area.
  • the focal length of the rings varies continuously as a function of ring radius, whereas the magnification of the respective image contribution remains constant.
  • the light energy which falls on the sensor may be computed as a sum or integral of the contributions from all of the infinitesimal concentric rings, because the image on the sensor is, for computation purposes, a superposition of the images contributed by the rings.
  • a particular feature of the superposition is that, preferably, for each plane of the object space perpendicular to the optical axis which falls within the depth domain, there exists a "contribution" from a particular ring which brings that plane into sharp focus.
  • the superpositioned image received by the sensor includes, inter alia, an in-focus image of each plane within the depth domain of the system.
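The ring-superposition argument can be sketched numerically in one dimension: each ring contributes a blur spot whose width grows with its focusing error, the sensor sees the sum of all contributions, and a point source therefore retains a sharp central peak from the one ring that focuses its depth. The Gaussian blur model and all constants below are illustrative assumptions, not the patent's optics:

```python
import numpy as np

def ring_psf(x, object_depth, ring_focus_depth, base_width=0.05, k=0.4):
    """One ring's contribution to the point spread function: a
    normalized Gaussian spot whose width grows with the focusing
    error between the object depth and this ring's in-focus depth."""
    sigma = base_width + k * abs(object_depth - ring_focus_depth)
    psf = np.exp(-x**2 / (2 * sigma**2))
    return psf / psf.sum()

x = np.linspace(-1.0, 1.0, 401)
ring_depths = np.linspace(0.5, 2.0, 50)   # ring in-focus depths (depth domain)
object_depth = 1.2                         # point source inside the domain

# The image on the sensor is the superposition (sum) of the
# contributions of all rings; one ring focuses the object sharply.
psf_total = sum(ring_psf(x, object_depth, d) for d in ring_depths)
psf_total /= psf_total.sum()

# A sharp peak from the in-focus ring rides on a broad defocus pedestal.
print(psf_total.max() > 4 * psf_total.mean())
```

The resulting composite point spread function is blurred to the eye (the pedestal) yet retains the high-frequency content (the peak) that the image processor later restores.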
  • FIG. 5 illustrates a preferred optical design of a multifocal mirror assembly 30 which is similar to that conceptually illustrated in Fig. 2B.
  • the mirror assembly of Fig. 2B includes a plurality of mirror surface portions.
  • whereas in Fig. 2B three surfaces, each with a different focal length, are provided, here a suitable number of rings is provided which together form a mirror, such that the respective focal lengths of the rings together cover a predetermined depth domain.
  • any suitable number of rings may be provided, since, theoretically, an infinite number of infinitesimal rings of different focal lengths may be employed so as to provide an entirely continuous dependence of focal length on ring radius.
  • multifocality is generated by providing an aspheric surface which applies a third order spherical aberration which is selected to be strong enough to generate multifocality in the desired depth domain.
  • FIG. 6 is a simplified block diagram of image processing unit 14 of Fig. 1 constructed and operative in accordance with an embodiment of the present invention which is particularly suited to processing gray level images.
  • the apparatus of Fig. 6 includes a Fourier transform unit 100 which is operative to compute a Fourier transform, such as an FFT (Fast Fourier Transform), of a digitized blurred image I received from sensor 12 of Fig. 1.
  • the output of unit 100 is termed I~.
  • Fourier transforms, including FFTs, are well known in the art and are described in Rosenfeld, A. and Kak, A. C., Digital Picture Processing, Academic Press, 1982, Vol. I, pp. 13-27, the disclosure of which is incorporated herein by reference.
  • a digital storage unit 102 is operative to store a restoration transfer function T~ for Fourier space restoration.
  • a computational unit 104 receives I~ from Fourier transform unit 100 and T~ from memory 102 and generates a restored image in the Fourier plane. For each pair of Fourier plane coordinates (u,v), the restored image is I~(u,v)/T~(u,v).
  • An inversion unit 106 inverts the restored image generated by unit 104 from the Fourier plane to the real plane, thereby to provide a sharpened output image.
  • inversion unit 106 performs an inverse FFT operation on the image supplied to it by unit 104.
  • the output I^ of inversion unit 106 is supplied to unit 16 of Fig. 1.
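Units 100-106 amount to: forward FFT, pointwise division by the stored T~, inverse FFT. A self-contained sketch in NumPy; the Gaussian transfer function below is an assumed, zero-free stand-in for the imager's actual T~:

```python
import numpy as np

def restore_fourier(blurred, T):
    """Sketch of units 100-106: Fourier-transform the blurred image
    (unit 100), divide by the restoration transfer function T~ held
    in storage unit 102 (unit 104), and invert back to the real
    plane (unit 106). T must have no zeros, as the text requires."""
    I_tilde = np.fft.fft2(blurred)                 # unit 100: forward FFT
    restored_tilde = I_tilde / T                   # unit 104: I~(u,v)/T~(u,v)
    return np.real(np.fft.ifft2(restored_tilde))   # unit 106: inverse FFT

# Illustrative round trip: blur a test image with a known transfer
# function, then restore it by division in the Fourier plane.
rng = np.random.default_rng(0)
sharp = rng.random((32, 32))
fx = np.fft.fftfreq(32)
T = np.exp(-(fx[:, None]**2 + fx[None, :]**2) / (2 * 0.15**2))  # zero-free
blurred = np.real(np.fft.ifft2(np.fft.fft2(sharp) * T))
restored = restore_fourier(blurred, T)
print(np.allclose(restored, sharp))  # True in this noise-free round trip
```

With sensor noise present, plain division would be regularized (e.g. Wiener-style), but the noise-free case shows the structure of the three units.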
  • h(x,y,d): the PSF (point spread function) of the lens or mirror of imaging apparatus 10 of Fig. 1, for a planar object which lies perpendicular to the lens axis and parallel to sensor 12, which may be a screen onto which the lens or mirror projects an image of the planar object.
  • h(x,y,d) describes the blurring imposed by the lens on the planar object.
  • x, y: Cartesian coordinates of axes orthogonal to the optical axis;
  • d: the distance between a planar object and the lens;
  • the restoration transfer function T~(u,v) may be defined, for each (u,v), as H(u,v,d0) such that (Equation 1):
  • H(u,v,d)/H(u,v,d0) remains close to 1, for every d in the work domain;
  • d: a member of a work domain [d1, d2] of the lens, or of a fragmented work domain which includes a plurality of spaced non-overlapping intervals such as [d1, d2], [d3, d4] and [d5, d6].
  • FIG. 7 is a simplified block diagram of another embodiment of image processing unit 14 of Fig. 1 which is particularly useful in applications in which the embodiment of Fig. 6 is difficult to implement due to the computational limitations of a relatively weak processor.
  • the apparatus of Fig. 7 includes a convolution unit 110 which convolves the output I of unit 12 of Fig. 1 with a convolution kernel t, which may be supplied by a kernel storage or kernel computation unit 112.
  • the kernel t(x,y) preferably is approximately equal to the inverse Fourier transform of 1/T~(u,v).
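The real-space route of Fig. 7 can be sketched as follows, under the stated approximation that t(x,y) is the inverse Fourier transform of 1/T~; circular convolution via the FFT is used here purely for brevity, and the unit numbers in the comments follow the figure:

```python
import numpy as np

def restoration_kernel(T):
    """Unit 112 (sketch): the restoration kernel t(x,y) as the
    inverse Fourier transform of 1/T~(u,v); T~ must be zero-free."""
    return np.real(np.fft.ifft2(1.0 / T))

def restore_real_space(blurred, kernel):
    """Unit 110 (sketch): convolve the blurred image I with t.
    Circular convolution is computed via the FFT for brevity; a
    truncated direct convolution suits a weaker processor."""
    return np.real(np.fft.ifft2(np.fft.fft2(blurred) * np.fft.fft2(kernel)))

# Round trip with a known, zero-free Gaussian transfer function.
fx = np.fft.fftfreq(32)
T = np.exp(-(fx[:, None]**2 + fx[None, :]**2) / (2 * 0.2**2))
rng = np.random.default_rng(1)
sharp = rng.random((32, 32))
blurred = np.real(np.fft.ifft2(np.fft.fft2(sharp) * T))
t = restoration_kernel(T)
print(np.allclose(restore_real_space(blurred, t), sharp))
```

In practice the kernel would be truncated to a small support so that the convolution is cheap enough for the limited processor this embodiment targets.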
  • the image restoration methods shown and described above may be replaced by an adaptive image restoration method in which the optical restoration transfer function is adapted to d, the depth of the object. Blurring of the object by the lens varies slightly as a function of d and a particular feature of the adaptive method is that this variation is taken into account.
  • the adaptive method preferably comprises the following steps: a.
  • the working domain, which may or may not be fragmented, is partitioned into a plurality of n-1 subdomains. For example, for a non-fragmented working domain [d1, d2], the partition includes a plurality of n-1 subdomains.
  • the restoration transfer function is computed in accordance with Equation 1, set forth above, thereby to define a plurality of n-1 restoration transfer functions T~1, ..., T~n-1, as illustrated conceptually in Fig. 8.
  • the kernel for real space restoration is computed, as explained hereinabove with reference to Fig. 7.
  • the image is restored either in the Fourier plane or in real space, as convenient, using the restoration transfer functions in the first instance and the convolution kernels in the second instance.
  • the result of this step is a plurality of n-1 restored images I^1, ..., I^n-1.
  • the n-1 restored images computed in step c are fused or merged into a single restored image I^ using a suitable method such as the following, illustrated conceptually in Fig. 10: i. For each of the images I^1, ..., I^n-1, define a plurality of m1 x m2 square blocks which cover the image.
  • ii. For each block position (j,k), select the sharpest square block from among the blocks (i,j,k), where i ranges over the n-1 restored images.
  • the criteria for the selection process may be conventional statistical or morphological criteria for image quality.
  • the selected block may be that which is most similar to a preassembled directory of known image features such as edges, isolated local extreme points, roof edges and smooth functions, where a correlational similarity criterion is employed. iii. Assemble the single restored image I^ by fusing the m1 x m2 sharpest square blocks selected in step ii.
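Steps i-iii of the fusion method can be sketched as follows; local variance stands in for the unspecified statistical sharpness criterion, and all names and sizes are illustrative:

```python
import numpy as np

def fuse_sharpest_blocks(restored_images, block=8):
    """Sketch of the Fig. 10 fusion: cover each restored image I^i
    with square blocks, pick at each block position the sharpest
    block across the n-1 restorations, and assemble the result.
    Local variance stands in for the sharpness criterion."""
    h, w = restored_images[0].shape
    out = np.empty((h, w))
    for j in range(0, h, block):
        for k in range(0, w, block):
            candidates = [im[j:j+block, k:k+block] for im in restored_images]
            sharpest = max(candidates, key=lambda b: b.var())
            out[j:j+block, k:k+block] = sharpest
    return out

# Demo: image 1 is sharp only on the left half, image 2 only on the
# right half; fusion recombines the two sharp halves.
rng = np.random.default_rng(2)
base = rng.random((16, 16))
flat = np.full((16, 16), base.mean())
i1 = np.where(np.arange(16)[None, :] < 8, base, flat)
i2 = np.where(np.arange(16)[None, :] >= 8, base, flat)
fused = fuse_sharpest_blocks([i1, i2], block=8)
print(np.allclose(fused, base))
```

As the surrounding text notes, naive block selection can leave seams, which is why the final kernels are interpolated between neighboring blocks.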
  • the final t(x,y) for restoration of block (j,k) is preferably interpolated from the kernels selected for block (j,k) and for its neighbors, blocks (j,k+1), (j+1,k) and (j+1,k+1).
  • a suitable interpolation procedure prevents formation of artificial edges at the boundaries between blocks.
  • Any suitable interpolation method may be employed, such as bilinear interpolation methods.
  • An interpolation method which is believed to provide high quality results is described in the following publication, the disclosure of which is incorporated herein by reference:
  • Fig. 11 is a simplified block diagram of an alternative embodiment of image processing unit 14 of Fig. 1 which is particularly suited for processing color image data such as RGB data.
  • the apparatus of Fig. 11 includes a color component channeling unit 202 which channels each of a plurality of color components of the input color image I, such as R, G and B components, to a corresponding one of a plurality of image processing subunits 204, 206 and 208, corresponding in number to the plurality of color components.
  • Each image processing subunit may be similar to the image processing apparatus of Fig. 6 or Fig. 7.
  • the outputs of image processing subunits 204, 206 and 208 are referenced I^R, I^G and I^B in Fig. 11. All three outputs are provided to a color combining unit 210 which combines them and provides an output color image I^.
  • many output devices, such as display monitors, video recorders, video printers and computers with video acquisition equipment, include a color combining function because they are capable of receiving a plurality of channels, such as 3 channels of R, G and B data respectively, and providing an output color image.
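The per-channel structure of Fig. 11 is simply: split the color image, restore each component independently, recombine. A sketch with an identity placeholder for the restoration subunits (any of the gray-level processors described above could be dropped in; names are illustrative):

```python
import numpy as np

def restore_color(rgb, restore_channel):
    """Fig. 11 sketch: unit 202 channels each color component to its
    own image processing subunit (204/206/208); unit 210 recombines
    the restored components into a single output color image."""
    channels = [rgb[..., c] for c in range(rgb.shape[-1])]    # unit 202
    restored = [restore_channel(ch) for ch in channels]       # units 204-208
    return np.stack(restored, axis=-1)                        # unit 210

rng = np.random.default_rng(3)
img = rng.random((8, 8, 3))                               # R, G, B planes
out = restore_color(img, restore_channel=lambda ch: ch)   # placeholder subunit
print(out.shape)  # (8, 8, 3)
```

As the text observes, the final recombination step can equally be delegated to an output device that already accepts three separate channels.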
  • blurred images may be recorded by a video recorder without restoration and restoration may be carried out later by playing the blurred image from a video player.
  • a particular feature of the invention shown and described herein is that, in contrast to conventional imaging systems, here the mapping from the three-dimensional object to the two-dimensional image is almost independent of the depth of the object within the field of view. This is because the point spread function h(x,y,z) is, up to first order, independent of z.
  • Appendix C is a Fortran language listing of a software implementation of image processing unit 14 of Fig. 1 according to one alternative embodiment of the present invention. Appendix C also includes results of operation of the software implementation on data derived from a software simulation of imaging apparatus 10 and sensor 12. It is appreciated that inclusion of the listing of Appendix C is intended merely to provide an extremely detailed disclosure of the present invention and is not intended to limit the scope of the present invention to the particular implementation of Appendix C.
  • distance of an object or image from an optical system refers to that component of the distance which is parallel to the optical axis of the system.
  • distance of an object point from an optical system refers to the distance between a first plane and a second plane, both of which are perpendicular to the optical axis of the system, wherein the first plane includes the object point and the second plane includes the operative portion of the optical system.
  • depth of an object relates to the distance of the object, along the optical axis.
  • FIG. 12 - 16 illustrate various applications of the imaging apparatus shown and described above, it being understood that the particular applications of Figs. 12 - 16 are merely exemplary of possible imaging applications and are not intended to be limiting.
  • the apparatus of Fig. 12 includes video or electronic still camera apparatus or camcorder apparatus 250 which is imaging a scene 252 in which various objects are located at different distances from the camera/camcorder 250.
  • the lens 254 of the camera/camcorder 250 is not a conventional lens but rather comprises an imaging device constructed and operative in accordance with the present invention such as any one of the imaging apparatus embodiments shown and described above with reference to Figs. 2A - 5.
  • a sensor 256 receives the raw image formed by the multifocal imager 254.
  • the sensed raw image is recorded and, optionally, played back by recording unit 258 and playback unit 260, respectively, for off-line reception by digitizer 264.
  • the sensed raw image is digitized by a digitizer 264 and the resulting digital representation of the image is provided to an image restoration computation unit 266 which has image processing capabilities as shown and
  • the restored digital image generated by computation unit 266 may be converted into analog representation by a D/A unit 270, and may then be displayed or printed by an output device 272 such as a TV monitor or printer.
  • the digital output of computation unit 266 may be stored on a conventional digital memory or digital tape 274.
  • the digital output of computation unit 266 may also be outputted directly by a digital output device 280 which is capable of handling digital input, such as but not limited to a printer, recorder or monitor with digital interface.
  • image restoration computation unit 266 may receive distance information from a rangefinder 282 such as an ultrasound rangefinder or laser rangefinder. Unit 266 may employ this information to improve the image of the features falling within the range found by the rangefinder.
  • a rangefinder allows a user to select an object of particular importance in the scene and to determine the distance to that object.
  • the image restoration computation unit 266 is preferably operative to receive the distance information and to select a transfer function on the basis of that distance such that optimal restoration results are achieved for the depth at which the selected object lies.
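As an illustration of this depth-dependent selection, the following sketch picks a point spread function (PSF) from a calibrated bank keyed by object distance and applies a Wiener-style restoration. Everything here — the Gaussian PSFs, the regularisation constant, and all function names — is an assumption for illustration, not the specific method disclosed in the patent:

```python
import numpy as np

def gaussian_psf(shape, sigma):
    """Centered 2-D Gaussian PSF, normalised to unit volume (a stand-in
    for a measured PSF of the multifocal imager)."""
    ny, nx = shape
    y = np.arange(ny) - ny // 2
    x = np.arange(nx) - nx // 2
    yy, xx = np.meshgrid(y, x, indexing="ij")
    psf = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return psf / psf.sum()

def restore(image, psf, snr=100.0):
    """Wiener-style restoration: divide by the OTF, regularised by 1/SNR."""
    otf = np.fft.fft2(np.fft.ifftshift(psf), s=image.shape)
    wiener = np.conj(otf) / (np.abs(otf) ** 2 + 1.0 / snr)
    return np.real(np.fft.ifft2(np.fft.fft2(image) * wiener))

def restore_at_distance(image, distance_m, psf_bank):
    """Pick the PSF whose calibrated depth is nearest the rangefinder
    reading, then restore with it."""
    depth = min(psf_bank, key=lambda d: abs(d - distance_m))
    return restore(image, psf_bank[depth])
```

A rangefinder reading of, say, 2 m would then select the bank entry calibrated nearest to 2 m, so that restoration is optimal at the depth of the selected object.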
  • an improved electronic still camera, video camera or camcorder may be constructed to include all components of Fig. 12.
  • the components of the apparatus of Fig. 12 which are other than conventional may be retrofitted to an existing conventional electronic still camera, video camera or camcorder.
  • the apparatus of Fig. 12 is useful in high-definition television (HDTV) applications where, due to the very high resolution, the depth domain of conventional apparatus is very small and even small deviations from focus result in perceptible defocussing.
  • Fig. 13 illustrates filming apparatus for filming a scene 310 including a film camera 312 fitted with a multifocal lens 314 which is constructed and operative in accordance with the present invention and which may comprise any of the imaging apparatus embodiments shown and described hereinabove with reference to Figs. 2A - 5.
  • the film 320 generated by film camera 312 is developed conventionally and the developed film is scanned by a scanner 324 which provides a digital output.
  • the digital output of scanner 324 is provided to an image restoration computation unit 326 which has image processing functions in accordance with the present invention such as those shown and described above with reference to Figs. 6 - 11.
  • the sharpened image generated by the restoration unit 326 is converted into a hard copy 330 by a suitable output device 332 such as a plotter.
  • an improved film camera system may be constructed to include all components of Fig. 13.
  • the components of the apparatus of Fig. 13 which are other than conventional, such as the multifocal lens and the image restorer, may be provided stand-alone, for use in conjunction with existing conventional equipment including film camera, development laboratory, digital scanner and plotter.
  • Fig. 14 is a simplified block diagram of microscopy apparatus in which conventional microscope functions are combined with the functions of the invention shown and described herein.
  • the apparatus of Fig. 14 includes a microscope 350 fitted with a multifocal objective lens 352 which is constructed and operative in accordance with the present invention and which may comprise any of the imaging devices shown and described above with reference to Figs. 2A - 5.
  • a camera 354 such as a video, electronic still or film camera captures the magnified image from its sensor 355 which may be a CCD, tube or film.
  • the digital output of the camera is provided to an image restoration unit 356. If the output of the camera is other than digital, the output is first digitized by an A/D unit 358.
  • Image restoration unit 356 is a computational unit with image processing capabilities such as those shown and described above with reference to Figs. 6 - 11.
  • the output of unit 356 is provided to an analog output device 360 via a D/A unit 362 or to a digital output device 364.
  • an improved microscopy system may be constructed to include all components of Fig. 14.
  • the components of the apparatus of Fig. 14 which are other than conventional, such as the multifocal lens and the image restorer, may be provided stand-alone, for use in conjunction with existing conventional equipment including microscope, camera, A/D and D/A converters, and output device.
  • the apparatus of Fig. 14 is suitable for inspecting a wide variety of objects and scenes such as but not limited to a drop of liquid, a three-dimensional transparent or semitransparent scene, a 2.5-dimensional surface such as the surface of a microchip, and an object or material whose surface is not flat.
  • Fig. 15 illustrates a laparoscope or endoscope for imaging and photography of interior portions of a body such as that of a human patient.
  • the apparatus of Fig. 15 includes an objective lens 400 which is inserted into the body using conventional medical techniques at a location determined by the region which it is desired to image.
  • the objective lens 400 comprises a multifocal imager such as any of those shown and described above.
  • a first bundle 402 of optical fibers is provided, one end 403 of which is disposed in the image plane of the multifocal imager.
  • the other end 404 of optical fiber bundle 402 is operatively associated with a relay lens 408 and a sensor 410 such that the image formed at end 404 of the bundle 402 is projected onto sensor 410 via relay lens 408.
  • the image captured by sensor 410 is provided to an image restoration computation unit 414 which may be similar to any of the image processing units shown and described above.
  • the image generated by the image processor has a much higher resolution over a large depth domain. This image is provided to a suitable output device 416.
  • Illumination of the body interior portion to be imaged may be provided by a second optical fiber bundle 420 which is operatively associated with a source of light located externally of the body of the patient.
  • Fig. 16 illustrates laparoscopy or endoscopy apparatus which is a modification of the apparatus of Fig. 15 in that the entire camera apparatus is inserted into the body.
  • the apparatus of Fig. 16 is generally similar to the apparatus of Fig. 15 except that relay lens 408 is eliminated and, instead, the sensor 410 is attached to the first optical fiber bundle 402 at end 404 thereof.
  • All elements of the apparatus apart from image restoration computation unit 414 and output device 416 may be inserted into the body.
  • the elements of the apparatus which operate interiorly of the human body may communicate with exteriorly operating elements 414 and 416 via a suitable electric cable 424.
  • image restoration computation unit 414 may also be inserted into the body, such that all elements of the apparatus apart from output device 416 operate interiorly of the body.
  • the interiorly operating elements may, in this case, communicate with the exteriorly operating output device via a suitable electric cable 428.
  • Fig. 17 is a simplified block diagram of multifocal imaging apparatus constructed and operative in accordance with another embodiment of the present invention.
  • the apparatus of Fig. 17 includes lens 450 which is operatively associated with a plurality of semitransparent mirrors or beam splitters such as three splitting elements 452, 454 and 456.
  • the focal planes of splitting element 454 for a point source 458 are indicated by dotted lines 460 and 462 respectively.
  • the focal planes of splitting element 456 for point source 458 are indicated by dotted line 464 and dotted line 466 respectively.
  • Four sensors are provided for respectively imaging the four images generated in the illustrated embodiment.
  • the locations of the four sensors are indicated by solid lines 470, 472, 474 and 476. All sensors are, as illustrated, parallel respectively to the corresponding focal planes.
  • the four sensors are located at different respective distances from their corresponding focal planes, thereby to provide a plurality of differently focussed images. These images are then preferably rescaled to a uniform magnification by one or more rescaling units 480.
  • the rescaled differently focussed images may be added, i.e. combined "one on top of the other" by addition, averaging or other suitable methods, into a single image.
  • This image is processed by an image restoration computation unit 484, similar to those described above, so as to generate a final image including contributions from all four differently focussed images.
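The rescale-and-combine step can be sketched as follows. Nearest-neighbour rescaling and plain averaging are assumed stand-ins for rescaling units 480 and for the combination step, which the text leaves open to "addition, averaging or other suitable methods":

```python
import numpy as np

def rescale_to(image, shape):
    """Bring an image to a common size via nearest-neighbour sampling
    (an illustrative stand-in for rescaling units 480; a real system
    would interpolate)."""
    ny, nx = shape
    rows = (np.arange(ny) * image.shape[0] / ny).astype(int)
    cols = (np.arange(nx) * image.shape[1] / nx).astype(int)
    return image[np.ix_(rows, cols)]

def fuse(images, shape):
    """Combine differently focussed images 'one on top of the other' by
    averaging, after rescaling each to a uniform magnification."""
    stack = np.stack([rescale_to(im, shape) for im in images])
    return stack.mean(axis=0)
```

The fused image would then be handed to the restoration unit 484 so that the final output contains contributions from all four differently focussed images.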
  • the present invention is not limited in its applicability to planar sensors such as planar CCD's or such as films lying in a plane. If a nonplanar sensor is employed, the present invention may be used, for example, by treating the digitized sensor output as though it had arrived from a planar sensor.
  • the term "transfer function" has been used generally to refer to the appropriate one of the following terms: contrast transfer function, optical transfer function (OTF), modulation transfer function (MTF) or phase transfer function.
  • the present invention is useful in applications in which the edge information is the most crucial portion of the total image information, such as in industrial robot vision applications.
  • a conventional edge detector function is provided upstream of or instead of the image processing unit 14 which is operative to enhance edges located in any of the planes falling within the depth domain of the apparatus.
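The patent does not name a particular edge detector; a Sobel gradient operator is one conventional choice that could fill this role. A minimal sketch:

```python
import numpy as np

def sobel_edges(image):
    """Gradient-magnitude edge map via 3x3 Sobel kernels — one
    conventional edge detector, chosen here for illustration only."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    pad = np.pad(image, 1, mode="edge")
    gx = np.zeros(image.shape, dtype=float)
    gy = np.zeros(image.shape, dtype=float)
    # Correlate with each kernel by summing shifted windows.
    for i in range(3):
        for j in range(3):
            win = pad[i:i + image.shape[0], j:j + image.shape[1]]
            gx += kx[i, j] * win
            gy += ky[i, j] * win
    return np.hypot(gx, gy)
```

Run upstream of (or instead of) image processing unit 14, such a detector would respond to edges lying in any plane within the depth domain of the apparatus.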
  • record and playback functions are provided between digitization unit 264 and image processor 266 instead of or in addition to playback and record functions 258 and 260.
  • record and playback functions are provided which are operative upstream of the image processing function. This allows image processing, such as image processing of a video movie, to be performed off-line as well as on-line.
  • the following procedure may be employed: a. Record the sensor output either before or after digitization; b. Use the playback function to play back frame by frame; c. Perform image processing, such as image restoration or edge detection, on each frame; and d. Record the output, frame by frame, in the output device.
  • image processing may thus be performed frame by frame, off-line, recording the output frame by frame for later viewing.
  • the present apparatus can be employed to generate an in- focus video movie.
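Steps a - d above reduce to a simple per-frame loop. In this sketch the frame store is a plain list standing in for the recorder/playback hardware, and the per-frame operation (an assumed intensity normalisation) stands in for restoration or edge detection:

```python
import numpy as np

def normalise(frame):
    """Illustrative per-frame operation: stretch intensities to [0, 1]."""
    lo, hi = frame.min(), frame.max()
    return (frame - lo) / (hi - lo) if hi > lo else np.zeros_like(frame)

def process_offline(recorded_frames, per_frame_op=normalise):
    """Steps a-d as a loop: play back the recorded frames one by one (b),
    process each (c), and record the outputs in order (d)."""
    processed = []
    for frame in recorded_frames:
        processed.append(per_frame_op(frame))
    return processed
```

Substituting a restoration or edge-detection function for `per_frame_op` yields the off-line processed movie described above.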
  • record and playback functions are optionally provided intermediate camera 354 and A/D unit 358 and/or intermediate camera 354 and image restoration computational unit 356.
  • either or both of the following sequences of functions may be added intermediate sensor 410 and image restoration computational unit 414: a. A/D, digital image recorder, playback; and/or b. analog image recorder, playback, A/D. Suitable record and playback functions, similar to those described above, may also be provided in the embodiment of Fig. 15.
  • [Fragmentary FORTRAN listing from the patent's program appendix; most of it was lost in extraction. The recoverable content comprises: a FORMAT statement printing the aperture diameter and the minimum/maximum object distances; the routine AVPSF, which produces the point spread function (PSF), as a simplified simulation, by superposition of a number of PSFs (see the description in the routine AVPSF); computation of the mid-range object distance DOB3 = 0.5*(DOB1+DOB2) and its conjugate image distance from the lens equation FOB3 = 1/(1/F - 1/DOB3); and FFT support routines that, for the inverse transform, take the complex conjugate and scale each array element by ONEVOL, with TWOPI = 6.283185.]
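The only fully recoverable computation in the listing is the thin-lens conjugate step (FOB3 = 1./F - 1./DOB3 followed by FOB3 = 1./FOB3), which maps an object distance to its in-focus image distance; function names here are illustrative:

```python
def image_distance(focal_length, object_distance):
    """Thin-lens conjugate: 1/f = 1/d_obj + 1/d_img, solved for d_img,
    mirroring the listing's FOB3 = 1./(1./F - 1./DOB3)."""
    return 1.0 / (1.0 / focal_length - 1.0 / object_distance)

def mid_range_image_distance(f, dob1, dob2):
    """The listing evaluates the conjugate at the mid-range object
    distance DOB3 = 0.5*(DOB1 + DOB2)."""
    return image_distance(f, 0.5 * (dob1 + dob2))
```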

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)
EP19930904495 1992-01-14 1993-01-13 Multifocal optical apparatus. Withdrawn EP0623209A4 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
IL100657 1992-01-14
IL100657A IL100657A0 (en) 1992-01-14 1992-01-14 Multifocal optical apparatus
PCT/US1993/000330 WO1993014384A1 (en) 1992-01-14 1993-01-13 Multifocal optical apparatus

Publications (2)

Publication Number Publication Date
EP0623209A1 EP0623209A1 (en) 1994-11-09
EP0623209A4 true EP0623209A4 (en) 1994-11-17

Family

ID=11063273

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19930904495 Withdrawn EP0623209A4 (en) 1992-01-14 1993-01-13 Multifocal optical apparatus.

Country Status (6)

Country Link
EP (1) EP0623209A4 (xx)
JP (1) JPH07505725A (xx)
AU (1) AU3583193A (xx)
GB (1) GB9413367D0 (xx)
IL (1) IL100657A0 (xx)
WO (1) WO1993014384A1 (xx)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6927922B2 (en) 2001-12-18 2005-08-09 The University Of Rochester Imaging using a multifocal aspheric lens to obtain extended depth of field
US7336430B2 (en) 2004-09-03 2008-02-26 Micron Technology, Inc. Extended depth of field using a multi-focal length lens with a controlled range of spherical aberration and a centrally obscured aperture
US20070279618A1 (en) * 2004-10-15 2007-12-06 Matsushita Electric Industrial Co., Ltd. Imaging Apparatus And Image Improving Method
US20100295973A1 (en) * 2007-11-06 2010-11-25 Tessera North America, Inc. Determinate and indeterminate optical systems
DK2399157T3 (da) * 2009-02-20 2013-07-29 Thales Canada Inc Optical imaging system with dual field of view and a dual-focus lens

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1988001069A1 (en) * 1986-07-30 1988-02-11 Drexler Technology Corporation Method and apparatus for reading data with ccd area arrays
EP0280588A1 (fr) * 1987-01-21 1988-08-31 Matra Method and device for image capture with a large depth of field
DE3905619A1 (de) * 1988-02-23 1989-08-31 Olympus Optical Co Image input/output device

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3767934A (en) * 1971-06-16 1973-10-23 Exxon Co Fault responsive switching system
US4548495A (en) * 1981-02-27 1985-10-22 Takeomi Suzuki Proper focusing state detecting device
JPS5821715A (ja) * 1981-07-31 1983-02-08 Asahi Optical Co Ltd Beam splitter
JPS6038613A (ja) * 1983-08-10 1985-02-28 Canon Inc Rangefinding optical system
DE3729334A1 (de) * 1987-09-02 1989-03-16 Sick Optik Elektronik Erwin Photoelectric sensor
US4993830A (en) * 1989-12-18 1991-02-19 Systronics, Incorporated Depth and distance measuring system

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1988001069A1 (en) * 1986-07-30 1988-02-11 Drexler Technology Corporation Method and apparatus for reading data with ccd area arrays
EP0280588A1 (fr) * 1987-01-21 1988-08-31 Matra Method and device for image capture with a large depth of field
DE3905619A1 (de) * 1988-02-23 1989-08-31 Olympus Optical Co Image input/output device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of WO9314384A1 *

Also Published As

Publication number Publication date
AU3583193A (en) 1993-08-03
EP0623209A1 (en) 1994-11-09
IL100657A0 (en) 1992-09-06
JPH07505725A (ja) 1995-06-22
WO1993014384A1 (en) 1993-07-22
GB9413367D0 (en) 1994-08-31

Similar Documents

Publication Publication Date Title
US7723662B2 (en) Microscopy arrangements and approaches
US8432479B2 (en) Range measurement using a zoom camera
JP5274623B2 (ja) Image processing apparatus, imaging apparatus, image processing program, and image processing method
Green et al. Multi-aperture photography
US11928794B2 (en) Image processing device, image processing program, image processing method, and imaging device
CN103533227A (zh) Image pickup apparatus and lens apparatus
CN102959945A (zh) Method and system for producing a virtual output image from data obtained by an array of image capturing devices
CA2577735A1 (en) Extended depth of field using a multi-focal length lens with a controlled range of spherical aberration and centrally obscured aperture
JP6576046B2 (ja) Compound-eye imaging device
JP2011118235A (ja) Imaging device
JP2012103741A (ja) Imaging device, image processing device, image processing method, and program
WO2007137112A2 (en) Method and system for correcting optical aberrations, including widefield imaging applications
JPH06194758A (ja) Depth image forming method and apparatus
JP5677366B2 (ja) Imaging device
EP2564234A1 (en) Range measurement using a coded aperture
JP3013721B2 (ja) Optical device having digital image forming means
JP2014107631A (ja) Image generation method and apparatus
Labussière et al. Leveraging blur information for plenoptic camera calibration
Chan et al. Super-resolution reconstruction in a computational compound-eye imaging system
WO2013124664A1 (en) A method and apparatus for imaging through a time-varying inhomogeneous medium
EP0623209A4 (en) Multifocal optical apparatus.
JPH0887600A (ja) Feature extraction device
JP6168220B2 (ja) Image generation device, image processing device, image generation method, and image processing program
JP2007156749A (ja) Image input device
JP2013236291A (ja) Stereoscopic imaging device

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 19940812

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FR GB GR IE IT LI NL PT SE

A4 Supplementary search report drawn up and despatched

Effective date: 19940927

AK Designated contracting states

Kind code of ref document: A4

Designated state(s): AT BE CH DE DK ES FR GB GR IE IT LI NL PT SE

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 19960801