EP0623209A1 - Appareil optique multifocal - Google Patents
Appareil optique multifocal (Multifocal optical apparatus)
Info
- Publication number
- EP0623209A1 (application EP93904495A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- image
- ccc
- imager
- scene
- sensor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
Classifications
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
- G02B7/36—Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
Definitions
- the present invention relates to focusing optical systems.
- a significant limitation in the operation of many optical devices is their limited depth of field, i.e. the depth over which an object in a field of view remains in acceptable focus.
- the present invention seeks to provide optical apparatus capable of in-focus imaging of a significantly greater depth than that imaged by conventional optical apparatus.
- optical apparatus including multifocal imaging apparatus defining a depth domain, sensor apparatus receiving light from a scene via the multifocal imaging apparatus and image processing apparatus for receiving an output of the sensor apparatus and providing an in-focus image of the portion of the scene falling within the depth domain.
- the depth domain includes all planes falling between and parallel to first and second planes which are both orthogonal to the optical axis.
- the depth domain includes a first subdomain including all planes falling between and parallel to first and second planes which are both orthogonal to the optical axis and a second subdomain, disjoint to the first subdomain, which includes all planes falling between and parallel to third and fourth planes which are both orthogonal to the optical axis and which are not included in the first subdomain.
- optical apparatus including imaging apparatus defining a plurality of surfaces each arranged to receive light from a scene, each of the plurality of surfaces having a different focal length, sensor apparatus receiving light from the scene via each of the plurality of surfaces, and image processing apparatus for receiving an output of the sensor apparatus and providing a composite in-focus image built up using in-focus parts received from each of the plurality of surfaces.
- the image processing apparatus is operative to provide a restoration procedure which produces a sharp image from a blurred image provided by the optical apparatus.
- the imaging apparatus includes mirror apparatus.
- an image processing method including the steps of providing a digital representation of a viewed scene including at least two scene locations whose distances from the point of view are nonequal, and dividing and sharpening the digital representation, thereby to provide an in-focus image of the viewed scene.
- the step of dividing and sharpening includes the steps of operating a plurality of restoration filters on each of the multiplicity of digital representations.
- multifocal imager defining a depth domain
- a sensor receiving light from a scene via the multifocal imager
- an image processor for receiving an output of the sensor and for providing an in-focus image of the portion of the scene falling within the depth domain.
- the sensor includes photographic film.
- the image processor includes an image digitizer operative to provide a digital representation of the scene to the image processor.
- the multifocal imager defines a plurality of surfaces each arranged to receive light from a scene, each of the plurality of surfaces having a different focal length.
- the sensor receives light from the scene via each of the plurality of surfaces, such that a different part of the light from the scene received via each of the plurality of surfaces is in focus.
- the image processor is operative to provide a composite image built up using in-focus parts received from each of the plurality of surfaces.
- the imager includes a lens and/or a mirror system.
- the transfer function includes no zeros.
- absolute value of the transfer function has a predetermined lower bound which is sufficiently large to permit image restoration.
- video camera apparatus including an imager defining a depth domain and a sensor receiving light from a scene via the imager for generating video images, wherein the imager includes a multifocal imager, and wherein the video camera apparatus also includes an image processor for receiving an output of the sensor and for providing an in-focus image of the portion of the scene falling within the depth domain.
- electronic still camera apparatus including an imager defining a depth domain and a sensor receiving light from a scene via the imager for generating electronic still images, wherein the imager includes a multifocal imager, and wherein the electronic still camera apparatus also includes an image processor for receiving an output of the sensor and for providing an in-focus image of the portion of the scene falling within the depth domain.
- camcorder apparatus including an imager defining a depth domain and a sensor receiving light from a scene via the imager for generating video images, wherein the imager includes a multifocal imager, and wherein the camcorder apparatus also includes an image processor for receiving an output of the sensor and for providing an in-focus image of the portion of the scene falling within the depth domain.
- film camera development apparatus including a multifocal-combining image processor operative to receive a multifocal representation of a scene and to provide an in-focus image of the portion of the scene falling within a depth domain.
- microscope apparatus, in accordance with a preferred embodiment of the present invention, including:
- an objective lens including a multifocal imager defining a depth domain
- a sensor receiving light from a scene, having more than two dimensions, via the multifocal imager, and an image processor for receiving an output of the sensor and for providing an in-focus image of a plurality of planes within the scene which fall within the depth domain.
- apparatus for internal imaging of the human body including a multifocal imager defining a depth domain, which is configured for insertion into the human body, a sensor receiving light from an internal portion of the human body via the multifocal imager, and an image processor for receiving an output of the sensor and for providing an in-focus image of the internal portion of the human body falling within the depth domain.
- multifocal imager is operative to provide an image, including a superposition of in-focus contributions of a plurality of planes, to a single sensor.
- the multifocal imager is operative to provide an image, including a superposition of in-focus contributions of an infinite number of planes.
- the sensor includes a single sensor.
- an imaging method including the steps of providing a multifocal imager defining a depth domain, sensing light from a scene via the multifocal imager, and receiving an output of the sensor and image processing the output, thereby to provide an in-focus image of the portion of the scene falling within the depth domain.
- depth domain and “working domain” are used substantially interchangeably.
- Fig. 1 is a generalized block diagram illustration of the optical apparatus of the invention
- Fig. 2A is a conceptual illustration of imaging lens apparatus useful in the apparatus of Fig. 1;
- Fig. 2B is a conceptual illustration of imaging mirror apparatus useful in the apparatus of Fig. 1;
- Fig. 3 is a simplified illustration of the images produced at a sensor by the apparatus of Figs. 2A and 2B;
- Fig. 4 is an optical diagram of a typical design of an imaging lens assembly useful in the apparatus of Fig. 1;
- Fig. 5 is an optical diagram of a typical design of an imaging mirror useful in the apparatus of Fig. 1;
- Fig. 6 is a simplified block diagram of image processing apparatus particularly suited for processing gray level images and useful in the apparatus in Fig. 1;
- Fig. 7 is a simplified block diagram of another embodiment of image processing apparatus particularly suited for processing gray level images and useful in the apparatus of Fig. 1;
- Fig. 8 is a conceptual illustration of a method for computing a restoration transfer function provided in accordance with one embodiment of the present invention;
- Fig. 9 is a conceptual illustration of an alternative method to the method of Fig. 8;
- Fig. 10 is a conceptual illustration of a method for fusing a plurality of restored images into a single restored image;
- Fig. 11 is a simplified block diagram of an alternative embodiment of image processing apparatus particularly suited for processing color images and useful in the apparatus of Fig. 1;
- Fig. 12 is a simplified block diagram of video or electronic still camera apparatus or camcorder apparatus in which conventional functions of these devices are combined with the functions of the invention shown and described herein;
- Fig. 13 is a simplified block diagram of film camera apparatus in which conventional functions of film cameras are combined with the functions of the invention shown and described herein;
- Fig. 14 is a simplified block diagram of microscopy apparatus in which conventional microscope functions are combined with the functions of the invention shown and described herein;
- Fig. 15 is a simplified block diagram of a laparoscope or endoscope in which conventional functions of such devices are combined with the functions of the invention shown and described herein;
- Fig. 16 is a simplified block diagram of laparoscopy or endoscopy apparatus which is a modification of the apparatus of Fig. 15;
- Fig. 17 is a simplified block diagram of multifocal imaging apparatus constructed and operative in accordance with another embodiment of the present invention.
- FIG. 1 illustrates optical apparatus constructed and operative in accordance with a preferred embodiment of the present invention.
- the apparatus of Fig. 1 includes imaging apparatus 10, such as multifocal lens apparatus or multifocal mirror apparatus, arranged to receive light from a scene 11 which includes objects located at various distances from the imaging apparatus.
- the imaging apparatus is operative to direct light from the scene to a sensor 12 such that focused light from various objects in the scene, at different distances from the imaging apparatus, reaches the sensor.
- Sensor 12 is typically a charge coupled device (CCD).
- CCD charge coupled device
- the apparatus 10 has a transfer function which varies just slightly with the distance between the imaging apparatus and an object in the working domain (depth domain), as measured parallel to the optical axis of the optical apparatus.
- the working domain is a continuous or discontinuous range along the optical axis.
- the transfer function preferably has no zeros in the working domain.
- the absolute value of the transfer function in the working domain has a lower bound V, which is large enough to allow image restoration, which can be provided computationally, even in the presence of a reasonable amount of noise.
- the transfer function is invertible such that the sharp details of the scene within the depth domain can be restored by computational means.
- the composite image seen by the sensor 12 may normally appear not in focus to a human eye.
- electronic image processing apparatus 14 is provided for converting the information received by sensor 12 to an in-focus image of the scene, which is displayed on an output device 16, such as a display, image analyzer, video recorder or other storage means, or printer.
- record and playback functions can be provided downstream of sensor 12 and upstream of image processor 14.
- Record and playback functions can be analog with subsequent digitization or, alternatively, digitization can be performed upstream of the record and playback functions, in which case the record and playback functions are digital.
- Fig. 2A conceptually illustrates multifocal lens means 20 useful as the imaging apparatus of the present invention.
- Fig. 2A shows that the multifocal lens means 20 images two objects, indicated as A and B, such as point light sources, which are situated at two different distances from the lens assembly 20, onto an image plane 22.
- Each of the images of objects A and B in the image plane 22 is built up from a superposition of in-focus contributions made by some portion of the lens means 20 and of out-of-focus contributions made by the remaining portions of the lens means 20.
- the in-focus contribution comes from a different portion of the lens means 20.
- for object A, the in-focus contribution comes from region 2 of the lens means 20 and, for object B, the in-focus contribution comes from region 1 of the lens means 20.
- resolution refers to the number of resolvable lines per unit length, e.g. resolvable lines per millimeter.
- FIG. 2B conceptually illustrates a multifocal mirror assembly 30 useful as the imaging apparatus of the present invention.
- Fig. 2B shows that the multifocal mirror assembly 30 focuses two objects A and B, such as point light sources, which are situated at two different distances from the mirror assembly 30, onto an image plane 22.
- each of the images of objects A and B in the image plane 22 is built up from a superposition of in-focus contributions made by some portion of mirror assembly 30 and of out-of-focus contributions made by the remaining portions of the mirror assembly 30.
- the in-focus contribution comes from a different portion of the mirror assembly 30.
- for object A, the in-focus contribution comes from region 2 of the mirror assembly 30 and, for object B, the in-focus contribution comes from region 1 of the mirror assembly 30.
- Fig. 3 provides a qualitative indication of the appearance of the images of the two objects A and B as seen in the image plane 22, for either or both of the embodiments of Figs. 2A and 2B. If it is assumed that in both Figs. 2A and 2B, the relative placement of objects A and B and the optical apparatus is the same, both the lens means 20 of Fig. 2A and the mirror assembly 30 of Fig. 2B can produce a substantially identical image.
- the imaging means of the present invention may comprise a plurality of discrete regions of different focal lengths, as in the embodiment of Fig. 2A, for example.
- the imaging means may comprise a continuum of locations each corresponding to a different focal length.
- the "concentric ring" implementation of the multifocal imaging apparatus shown and described herein may be replaced by a "continuous" implementation in which the focal length of the lens varies continuously over the lens as a function of the distance from the center of the lens.
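The thin-lens relation makes this concrete: a lens zone of focal length f imaging onto a sensor at distance d_img brings into focus the object distance d_obj satisfying 1/f = 1/d_obj + 1/d_img, so a radial focal-length profile maps to a range of in-focus object distances. The sketch below illustrates this; the linear profile in focal_length(), the 95-105 mm focal-length span and the 110 mm lens-to-sensor distance are illustrative assumptions, not values from the patent.

```python
# Illustrative sketch only: the linear focal-length profile, the 95-105 mm span
# and the 110 mm lens-to-sensor distance are assumptions, not patent values.
import numpy as np

def focal_length(rho, rho_max, f_min=95.0, f_max=105.0):
    """Assumed focal length (mm) of the lens zone at radius rho."""
    return f_min + (f_max - f_min) * (rho / rho_max)

def in_focus_object_distance(f, sensor_distance):
    """Object distance brought into focus by a zone of focal length f,
    from the thin-lens relation 1/f = 1/d_obj + 1/d_img."""
    return 1.0 / (1.0 / f - 1.0 / sensor_distance)

rho_max = 20.0           # semi-aperture (mm), assumed
sensor_distance = 110.0  # lens-to-sensor distance (mm), assumed
for rho in np.linspace(0.0, rho_max, 5):
    f = focal_length(rho, rho_max)
    print(f"rho = {rho:5.1f} mm   f = {f:6.1f} mm   "
          f"in-focus object distance = {in_focus_object_distance(f, sensor_distance):8.1f} mm")
```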
- the final image may be based on a combination of high resolution (high frequency) components arriving from each plane in space which falls within the depth domain.
- the high resolution components of the image are based on information arriving from only a thin slice, orthogonal to the optical axis, which contains in-focus information. High resolution (high frequency) components arriving from all other planes are strongly suppressed or attenuated.
- Fig. 4 illustrates an optical design of a preferred embodiment of lens means of the Double Gauss type, having a continuously varying focal length.
- the lens means of Fig. 4 includes a plurality of lens portions, each having a different focal length.
- R the radius of curvature
- T the distance from an individual surface to the next surface
- N index of refraction
- V Abbe number, (Nd-1) / (Nf-Nc);
- Nd index of refraction for the Sodium d line
- Nf index of refraction for the Hydrogen f line
- Nc index of refraction for the Hydrogen c line (0.6563 microns).
- a Schmidt plate, whose surfaces are referenced V and VI, is located in the plane of the aperture stop.
- surfaces V and VI are generally parallel except for an aspheric correction term, (+0.5 × 10⁻³ × ρ⁴), for surface VI.
- the surface sag, Z is defined as:
- X, Y system axes which are orthogonal to one another and to the axis of the optical system.
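Since surfaces V and VI are described as plane-parallel apart from the quartic correction term on surface VI, a plausible sag definition, offered as an assumption rather than a quotation of the patent, is the purely quartic form:

```latex
% Assumed sag of surface VI: flat apart from the quoted quartic correction.
Z(X,Y) \;=\; 0.5 \times 10^{-3}\,\rho^{4},
\qquad \rho^{2} \;=\; X^{2} + Y^{2},
```

with Z measured along the optical axis of the system.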
- the exit pupil is, for computation purposes, similar to a union of an infinite number of infinitesimal concentric rings about the optical axis, all with the same area.
- the focal length of the rings varies continuously as a function of ring radius, whereas the magnification of the rings remains substantially the same.
- the light energy which falls on the sensor may be computed as a sum or integral of the contributions from all of the infinitesimal concentric rings, because the image on the sensor is, for computation purposes, a superposition of those contributions.
- superpositioned image received by the sensor includes, inter alia, an in-focus image of each plane within the depth domain of the system.
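The "sum of contributions from all of the rings" can be mimicked numerically in the spirit of the simplified simulation mentioned in Appendix C (a superposition of PSFs, routine AVPSF): compute a geometric blur radius per ring and superpose simple PSFs. The Gaussian PSF shape, the equal ring weights and all numeric values below are assumptions.

```python
# Sketch in the spirit of the simplified simulation of Appendix C (routine
# AVPSF): superpose one simple PSF per concentric ring, with the blur radius
# RAD = |FF - FOB| * DLENS / (2 * FOB) echoing the listing fragments.
# The Gaussian PSF shape, equal ring weights and all numbers are assumptions.
import numpy as np

def geometric_blur_radius(sensor_dist, image_dist, aperture_diam):
    """Geometric blur-spot radius on the sensor for a ring imaging at image_dist."""
    return abs(sensor_dist - image_dist) * aperture_diam / (2.0 * image_dist)

def gaussian_psf(size, sigma):
    """Normalized 2-D Gaussian used as a simplified defocus PSF."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return psf / psf.sum()

size = 65
sensor_dist = 110.0                          # lens-to-sensor distance (mm), assumed
image_dists = np.linspace(104.0, 116.0, 9)   # per-ring image distances (mm), assumed
aperture_diam = 40.0                         # aperture diameter (mm), assumed

combined = np.zeros((size, size))
for d in image_dists:                        # equal-area rings -> equal weights (assumed)
    sigma = max(geometric_blur_radius(sensor_dist, d, aperture_diam), 0.5)
    combined += gaussian_psf(size, sigma)
combined /= combined.sum()
print("combined PSF peak value:", combined.max())
```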
- Fig. 5 illustrates a preferred optical design of a multifocal mirror assembly 30 which is similar to that conceptually illustrated in Fig. 2B.
- the mirror assembly of Fig. 2B includes a plurality of mirror surface portions.
- a suitable number of rings is provided which together form a mirror, such that the respective focal lengths of the rings together cover a predetermined depth domain. It is appreciated that any suitable number of rings may be provided, since, theoretically, an infinite number of infinitesimal rings of different focal lengths may be employed so as to provide an entirely continuous dependence of focal length on ring radius.
- multifocality is generated by providing an aspheric surface which applies a third order spherical aberration which is selected to be strong enough to generate multifocality in the desired depth domain.
- aspheric aberrations may alternatively be achieved using only spherical surfaces or by using a
- Fig. 6 is a simplified block diagram of image processing unit 14 of Fig. 1 constructed and operative in accordance with an embodiment of the present invention which is particularly suited to processing gray level images.
- the apparatus of Fig. 6 includes a Fourier transform unit 100 which is operative to compute a Fourier transform such as an FFT (Fast Fourier Transform) of a digitized blurred image I received from sensor 12 of Fig. 1.
- the output of unit 100 is termed Î.
- a digital storage unit 102 is operative to store a restoration transfer function T̂ for Fourier space restoration.
- a computational unit 104 receives Î from Fourier transform unit 100 and T̂ from memory 102 and generates a restored image in the Fourier plane. For each pair of Fourier plane coordinates (u,v), the restored image is Î(u,v)/T̂(u,v).
- An inversion unit 106 inverts the restored image generated by unit 104 from the Fourier plane to the real plane, thereby to provide a sharpened output image.
- inversion unit 106 performs an inverse FFT operation on the image supplied to it by unit 104.
- the output Î of inversion unit 106 is supplied to unit 16 of Fig. 1.
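A minimal sketch of the Fig. 6 pipeline follows, under the assumption of a synthetic, depth-invariant transfer function with a nonzero lower bound (as the surrounding text requires). restore_fourier, the Gaussian-plus-floor T_hat and the toy scene are illustrative; the forward FFT, the division Î(u,v)/T̂(u,v) and the inverse FFT are the operations named above.

```python
# Minimal sketch of the Fig. 6 pipeline, assuming a synthetic depth-invariant
# transfer function T_hat with a nonzero lower bound. restore_fourier and all
# numeric values are illustrative assumptions.
import numpy as np

def restore_fourier(blurred, T_hat):
    I_hat = np.fft.fft2(blurred)                 # unit 100: forward FFT
    restored_hat = I_hat / T_hat                 # unit 104: Fourier-plane division
    return np.real(np.fft.ifft2(restored_hat))   # unit 106: back to real space

# Toy round trip: blur a random scene with a known transfer function, restore it.
rng = np.random.default_rng(0)
scene = rng.random((128, 128))
fx = np.fft.fftfreq(128)
FX, FY = np.meshgrid(fx, fx)
# Assumed OTF: Gaussian roll-off plus a 0.3 floor, so it has no zeros and a
# lower bound, mirroring what the text requires of the multifocal imager.
T_hat = 0.3 + 0.7 * np.exp(-(FX**2 + FY**2) / (2 * 0.05**2))
blurred = np.real(np.fft.ifft2(np.fft.fft2(scene) * T_hat))
restored = restore_fourier(blurred, T_hat)
print("max restoration error:", np.abs(restored - scene).max())
```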
- Any suitable method may be employed to compute a restoration transfer function for storage in memory unit 102.
- the following method may be used.
- h(x,y,d) the PSF (point spread function) of the lens or mirror of imaging apparatus 10 of Fig. 1, for a planar object which lies perpendicular to the lens axis and parallel to sensor 12, which may be a screen onto which the lens or mirror projects an image of the planar object.
- h(x,y,d) describes the blurring imposed by the lens on the planar object.
- x, y cartesian coordinates of axes orthogonal to the optical axis
- d the distance between a planar object and the lens
- d 1 , d 2 boundaries of the working domain of the lens.
- d₁ and d₂ are the smallest and largest values of d within the working domain.
- the restoration transfer function T̂(u,v) may be defined, for each (u,v), as H(u,v,d₀) satisfying the relation referenced herein as Equation 1.
- a particular feature of the present invention is that, in the working domain, the absolute value of the transfer function has a lower bound.
- the lower bound can be made large enough to allow restoration even in the presence of a substantial level of noise.
- this lower bound, which is always between 0 and 1, is relatively large, and may for example have a value of 1/4 or 1/2.
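A plausible form of Equation 1 and of the lower bound V, offered as an assumption consistent with the surrounding bullets rather than as a quotation of the patent, is:

```latex
% Assumed reading: d_0 is the worst-case distance in the working domain,
% and V is the resulting lower bound on |H| over that domain.
\left| H(u,v,d_0) \right| \;\le\; \left| H(u,v,d) \right|
\quad \text{for all } d_1 \le d \le d_2 ,
\qquad
V(u,v) \;=\; \min_{d_1 \le d \le d_2} \left| H(u,v,d) \right| .
```

For a transfer function normalized to unity at zero spatial frequency, V lies between 0 and 1, which matches the 1/4 to 1/2 values quoted above.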
- the blurring process for a planar object at a distance d from a lens or mirror may be described as:
- Ô_b(u,v,d) the planar Fourier transform of the planar object
- n̂(u,v) Fourier transform of the noise.
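Combining the two symbol definitions above with the transfer function H(u,v,d) used throughout, the blurring relation is presumably the standard linear model; the following form is an assumption, not a quotation:

```latex
% Assumed blurring model for a planar object at distance d from the lens or mirror.
\hat{I}(u,v,d) \;=\; H(u,v,d)\,\hat{O}_b(u,v,d) \;+\; \hat{n}(u,v) ,
```

where Î(u,v,d) denotes the Fourier transform of the blurred image recorded at the sensor.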
- a single restoration transfer function T̂(u,v), which is independent of d, may be used to restore an image of an object whose distance d falls within the working domain.
- Fig. 7 is a simplified block diagram of another embodiment of image processing apparatus particularly suited for processing gray level images and useful in the apparatus of Fig. 1.
- the apparatus of Fig. 7 includes a convolution unit 110 which convolves the output I of unit 12 of Fig. 1 with a convolution kernel t.
- the convolution kernel t may be supplied by a kernel storage or kernel computation unit 112.
- the kernel t(x,y) preferably is approximately equal to the inverse Fourier transform of 1/T̂(u,v).
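A sketch of the Fig. 7 real-space path follows: the kernel t is obtained as the inverse Fourier transform of 1/T̂, truncated to a small support, and applied by ordinary 2-D convolution. The 31x31 support, the helper names and the use of scipy.signal.convolve2d are choices made here for illustration, not taken from the patent.

```python
# Sketch of the Fig. 7 real-space path: t(x,y) is taken as the inverse Fourier
# transform of 1/T_hat, as stated above, truncated to a small support and
# applied by plain 2-D convolution. Support size and helper names are assumptions.
import numpy as np
from scipy.signal import convolve2d

def real_space_kernel(T_hat, support=31):
    """Centered kernel t ~ IFFT of 1/T_hat, cropped to support x support pixels."""
    t = np.fft.fftshift(np.real(np.fft.ifft2(1.0 / T_hat)))
    cy, cx = t.shape[0] // 2, t.shape[1] // 2
    h = support // 2
    return t[cy - h:cy + h + 1, cx - h:cx + h + 1]

def restore_real_space(blurred, kernel):
    """Unit 110 of Fig. 7: convolve the blurred frame with the kernel t."""
    return convolve2d(blurred, kernel, mode="same", boundary="wrap")
```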
- the adaptive method preferably comprises the following steps.
- the working domain, which may or may not be fragmented, is partitioned into a plurality of subdomains.
- n may be an integer within a suitable range such as 3 - 10.
- the restoration transfer function is computed in accordance with Equation 1, set forth above, thereby to define a plurality of n-1 restoration transfer functions T̂₁, ..., T̂ₙ₋₁, as illustrated conceptually in Fig. 8.
- the kernel for real space restoration is computed, as explained hereinabove with reference to Fig. 7.
- the image is restored either in the Fourier plane or in real space, as convenient, using the restoration transfer functions in the first instance and the convolution kernels in the second instance.
- the result of this step is a plurality of n-1 restored images Î₁, ..., Îₙ₋₁.
- the n-1 restored images computed in step c are fused or merged into a single restored image.
- the criteria for the selection process may be conventional statistical or morphological criteria for image quality.
- the selected block may be that which is most similar to a preassembled directory of known image features such as edges, isolated local extreme points, roof edges and smooth functions, where a correlational similarity criterion is employed.
- iii. assemble the single restored image Î by fusing the m₁ x m₂ sharpest square blocks selected in step ii.
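A sketch of this fusion step: each restored candidate is tiled into square blocks, each block is scored with a sharpness criterion, and the sharpest candidate is copied into the output at every block position. The mean-squared-gradient criterion and the 16-pixel block size are assumptions; the text allows any suitable statistical or morphological image-quality criterion.

```python
# Sketch of the fusion step (items i-iii above): tile each restored candidate
# into square blocks, score blocks with a sharpness criterion, and copy the
# sharpest candidate into the output at every block position. The mean squared
# gradient criterion and the 16-pixel block size are assumptions.
import numpy as np

def sharpness(block):
    """Simple image-quality score: mean squared gradient magnitude."""
    gy, gx = np.gradient(block.astype(float))
    return float(np.mean(gx**2 + gy**2))

def fuse_restored_images(restored_list, block=16):
    """Assemble one image from the sharpest block among the n-1 candidates."""
    h, w = restored_list[0].shape
    fused = np.zeros((h, w))
    for y in range(0, h, block):
        for x in range(0, w, block):
            candidates = [im[y:y + block, x:x + block] for im in restored_list]
            fused[y:y + block, x:x + block] = max(candidates, key=sharpness)
    return fused
```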
- the final t(x,y) for restoration of block (j,k) is preferably interpolated from the kernels selected for block (j,k) and for its neighbors, blocks (j,k+1), (j+1,k) and (j+1,k+1).
- interpolation method Any suitable interpolation method may be employed, such as bilinear interpolation methods.
- An interpolation method which is believed to provide high quality results is described in the following publication, the disclosure of which is incorporated herein by reference:
- Fig. 11 is a simplified block diagram of an alternative embodiment of image processing unit 14 of Fig. 1 which is particularly suited for processing color image data such as RGB data.
- the apparatus of Fig. 11 includes a color component channeling unit 202 which channels each of a plurality of color components of the input color image I, such as R, G and B components, to a corresponding one of a plurality of image processing subunits 204, 206 and 208, respectively.
- Each image processing subunit may be similar to the image processing apparatus of Fig. 6 or of Fig. 7.
- the Î outputs of image processing subunits 204, 206 and 208 are referenced Î_R, Î_G and Î_B in Fig. 11. All three outputs are provided to a color combining unit 210 which combines them and provides an output color image Î.
- it is appreciated that many output devices, such as display monitors, video recorders, video printers and computers with video acquisition equipment, include a color combining function because they are capable of receiving a plurality of channels, such as 3 channels of R, G and B data respectively, and providing an output color image.
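A sketch of the Fig. 11 arrangement: each color plane is routed through the same gray-level restoration and the results are recombined. Passing a separate transfer function per channel is an assumption; a single shared T̂ would equally match the text.

```python
# Sketch of the Fig. 11 arrangement: each color plane of an RGB frame is routed
# through the same gray-level restoration and the planes are recombined.
# restore_fn stands for a gray-level routine such as the Fig. 6 sketch above;
# using one transfer function per channel is an assumption.
import numpy as np

def restore_color(rgb_image, T_hats, restore_fn):
    """rgb_image: H x W x 3 array; T_hats: one transfer function per channel."""
    channels = [restore_fn(rgb_image[..., c], T_hats[c]) for c in range(3)]
    return np.stack(channels, axis=-1)        # unit 210: recombine into a color image
```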
- according to one embodiment of the present invention, the blurred images may be any suitable blurred images.
- a particular feature of the invention shown and described herein is that, in contrast to conventional imaging systems, here, the mapping from the three-dimensional object to the two-dimensional image is almost independent of the depth of the object within the field of view. This is because the point spread function h(x,y,z) is, up to a first order, independent of the depth coordinate z within a predetermined depth domain, i.e. within a predetermined range of z's.
- imaging apparatus shown and described herein are merely exemplary of the wide variety of ways in which it is possible to implement the multifocal imaging apparatus of Fig. 1.
- Appendix C is a Fortran language listing of a software implementation of image processing unit 14 of Fig. 1 according to one alternative embodiment of the present invention. Appendix C also includes results of operation of the software implementation on data derived from a software simulation of imaging apparatus 10 and sensor 12. It is appreciated that inclusion of the listing of Appendix C is intended merely to provide an extremely detailed disclosure of the present invention and is not intended to limit the scope of the present invention to the particular implementation of Appendix C.
- distance of an object or image from an optical system refers to that component of the distance which is parallel to the optical axis of the system.
- distance of an object point from an optical system refers to the distance between a first plane and a second plane, both of which are perpendicular to the optical axis of the system, wherein the first plane includes the object point and the second plane includes the operative portion of the optical system.
- depth of an object relates to distance of the object, along the optical axis, from the imaging plane.
- FIG. 12 - 16 illustrate various applications of the imaging apparatus shown and described above, it being understood that the particular applications of Figs. 12 - 16 are merely exemplary of possible imaging applications and are not intended to be limiting.
- the apparatus of Fig. 12 includes video or electronic still camera apparatus or camcorder apparatus 250 which is imaging a scene 252 in which various objects are located at different distances from the camera/camcorder 250.
- the lens 254 of the camera/camcorder 250 is not a conventional lens but rather comprises an imaging device constructed and operative in accordance with the present invention such as any one of the imaging apparatus embodiments shown and described above with reference to Figs. 2A - 5.
- a sensor 256 receives the raw image formed by the multifocal imager 254.
- the sensed raw image is recorded and, optionally, played back by recording unit 258 and playback unit 260, respectively, for off-line reception by digitizer 264.
- the sensed raw image is digitized by a digitizer 264 and the resulting digital representation is provided to an image restoration computation unit 266 which has image processing capabilities as shown and described above with reference to Figs. 6 - 11.
- the digital output of computation unit 266 may be converted into analog representation by a D/A unit 270, and may then be displayed or printed by an output device 272 such as a TV monitor or printer.
- the digital output of computation unit 266 may be stored on a suitable digital storage medium.
- the digital output of computation unit 266 may also be outputted directly by a digital output device 280 which is capable of handling digital input, such as but not limited to a printer, recorder or monitor with digital interface.
- image restoration computation unit 266 may receive distance information from a rangefinder 282 such as an ultrasound rangefinder or laser rangefinder.
- Unit 266 may employ this information to improve the image of the features falling within the range found by the rangefinder.
- the rangefinder allows a user to select an object in a scene which is of relative importance, and determine the distance to that object using the rangefinder.
- the image restoration computation unit 266 is preferably operative to receive the distance information and to select a transfer function on the basis of that distance such that optimal restoration results are achieved for the depth at which the selected object lies.
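A sketch of this rangefinder option: the measured distance selects the restoration transfer function whose depth subdomain contains it. The subdomain/transfer-function tables and the helper name are illustrative assumptions.

```python
# Sketch of the rangefinder option: pick the restoration transfer function whose
# depth subdomain contains the measured distance. The subdomain table format and
# the helper name are illustrative assumptions.
def select_transfer_function(measured_distance, subdomains, transfer_functions):
    """subdomains: list of (d_min, d_max) pairs; transfer_functions: parallel list."""
    for (d_min, d_max), T_hat in zip(subdomains, transfer_functions):
        if d_min <= measured_distance <= d_max:
            return T_hat
    raise ValueError("measured distance lies outside the working domain")
```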
- an improved video camera, electronic still camera or camcorder may be constructed to include all components of Fig. 12.
- the components of the apparatus of Fig. 12 which are other than conventional may be retrofitted to an existing conventional electronic still camera, video camera or camcorder.
- Fig. 12 is useful in high-definition television (HDTV) applications where, due to very high resolution, the depth domain is, in conventional apparatus, very small and even small deviations from focus result in perceptible defocussing.
- HDTV high-definition television
- Fig. 13 illustrates filming apparatus for filming a scene 310 including a film camera 312 fitted with a multifocal lens 314 which is constructed and operative in accordance with the present invention and which may comprise any of the imaging apparatus embodiments shown and described hereinabove with reference to Figs. 2A - 5.
- the film 320 generated by film camera 312 is developed conventionally and the developed film is scanned by a scanner 324 which provides a digital output.
- the digital output of scanner 324 is provided to an image restoration computation unit 326 which has image processing functions in accordance with the present invention such as those shown and described above with reference to Figs. 6 - 11.
- the sharpened image generated by the restoration unit 326 is converted into a hard copy 330 by a suitable output device 332 such as a plotter.
- an improved film camera system may be constructed to include all components of Fig. 13.
- the components of the apparatus of Fig. 13 which are other than conventional, such as the multifocal lens and the image restorer, may be provided stand-alone, for use in conjunction with an existing conventional film camera.
- Fig. 14 is a simplified block diagram of microscopy apparatus in which conventional microscope functions are combined with the functions of the invention shown and described herein.
- the apparatus of Fig. 14 includes a microscope 350 fitted with a multifocal objective lens 352 which is constructed and operative in accordance with the present invention and which may comprise any of the imaging devices shown and described above with reference to Figs. 2A - 5.
- a camera 354 such as a video, electronic still or film camera captures the magnified image from its sensor 355 which may be a CCD, tube or film.
- the digital output of the camera is provided to an image restoration unit 356. If the output of the camera is other than digital, the output is first digitized by an A/D unit 358.
- Image restoration unit 356 is a computational unit with image processing functions such as those shown and described above with reference to Figs. 6 - 11.
- the output of unit 356 is provided to an analog output device 360 via a D/A unit 362 or to a digital output device 364.
- an improved microscopy system may be constructed to include all components of Fig. 14.
- the components of the apparatus of Fig. 14 which are other than conventional, such as the multifocal lens and the image restorer, may be provided stand-alone, for use in conjunction with an existing conventional microscope.
- the apparatus of Fig. 14 is useful in inspecting a wide variety of objects and scenes such as but not limited to a drop of liquid, a three-dimensional transparent or semitransparent scene, a 2.5-dimensional surface such as the surface of a microchip and an object or material whose surface is not flat.
- Fig. 15 illustrates a laparoscope or endoscope for imaging and photography of interior portions of a body such as that of a human patient.
- the apparatus of Fig. 15 includes an objective lens 400 which is inserted into the body using conventional medical techniques at a location depending on the location which it is desired to image.
- the objective lens 400 comprises a multifocal imager such as any of those shown and described above.
- a first bundle 402 of optical fibers is
- optical fiber bundle 402 is operatively associated with a relay lens 408 and a sensor 410 such that the image formed at end 404 of the bundle 402 is projected onto sensor 410 via relay lens 408.
- the image captured by sensor 410 is provided to an image restoration computation unit 414 which may be similar to any of the image processing units shown and described above.
- the image generated by the image processor has a much higher resolution over a large depth domain. This image is provided to a suitable output device 416.
- Illumination of the body interior portion to be imaged may be provided by a second optical fiber bundle 420 which is operatively associated with a source of light located externally of the body of the patient.
- Fig. 16 illustrates laparoscopy or endoscopy apparatus which is a modification of the apparatus of Fig. 15 in that the entire camera apparatus is inserted into the body.
- the apparatus of Fig. 16 is generally similar to the apparatus of Fig. 15 except that relay lens 408 is eliminated and, instead, the sensor 410 is attached to the first optical fiber bundle 402 at end 404 thereof. All elements of the apparatus other than image restoration computation unit 414 and output device 416 may be inserted into the body.
- the elements of the apparatus which operate interiorly of the human body may communicate with exteriorly operating elements 414 and 416 via a suitable electric cable 424.
- alternatively, image restoration computation unit 414 may also be inserted into the body, such that all elements of the apparatus other than output device 416 operate interiorly of the body. The interiorly and exteriorly operating elements may, in this case, communicate via a suitable electric cable.
- Fig. 17 is a simplified block diagram of multifocal imaging apparatus constructed and operative in accordance with another embodiment of the present invention.
- the apparatus of Fig. 17 includes lens 450 which is operatively associated with a plurality of semitransparent mirrors or beam splitters, such as three splitting elements 452, 454 and 456.
- the focal planes of splitting elements 452 and 454 for point source 458 are indicated by dotted lines 460 and 462 respectively.
- the focal planes of splitting element 456 for point source 458 are indicated by dotted line 464 and dotted line 466
- four sensors are provided for respectively sensing the four images generated in the illustrated embodiment.
- the locations of the four sensors are indicated by solid lines 470, 472, 474 and 476. All sensors are, as illustrated, parallel respectively to the corresponding focal planes.
- the four sensors are located at different respective distances from their corresponding focal planes, thereby to provide a plurality of differently focussed images. These images are then rescaled to a common scale.
- the rescaled differently focussed images may be added, i.e. combined "one on top of the other" by addition, averaging or other suitable methods, into a single image.
- This image is processed by an image restoration computation unit 484, similar to those described above, so as to generate a final image including contributions from all four differently focussed images.
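A sketch of this combination step: the rescaled, differently focussed sensor images are combined "one on top of the other" by averaging, which is one of the combination methods named above, and the result is handed to the restoration stage. Equal weighting and the restore_fn hook into a routine like the Fig. 6 sketch are assumptions.

```python
# Sketch of the Fig. 17 combination step: average the rescaled, differently
# focussed sensor images "one on top of the other" and hand the result to the
# restoration stage. Equal weights and the restore_fn hook are assumptions.
import numpy as np

def combine_and_restore(rescaled_images, T_hat, restore_fn):
    stacked = np.mean(np.stack(rescaled_images, axis=0), axis=0)  # combine by averaging
    return restore_fn(stacked, T_hat)                             # unit 484: restoration
```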
- the present invention is not limited in its applicability to planar sensors such as planar CCD's or such as films lying in a plane. If a nonplanar sensor is employed, the present invention may be used, for example, by treating the digitized sensor output as though it had arrived from a planar sensor.
- transfer function has been used generally to refer to the appropriate one of the following terms: contrast transfer function, optical transfer function (OTF), modulation transfer function (MTF) or phase transfer function.
- the present invention is useful in applications in which the edge information is the most crucial portion of the total image information, such as in industrial robot vision applications.
- a conventional edge detector function is provided upstream of or instead of the image processing unit 14 which is operative to enhance edges located in any of the planes falling within the depth domain of the apparatus.
- processor 266 instead of or in addition to playback and record functions 258 and 260.
- record and playback functions are provided which are operative upstream of the image processing function. This allows image processing, such as image restoration, to be performed off-line.
- the above optional feature has the advantage of allowing image restoration to be performed frame by frame, off-line, recording the output frame by frame for later viewing.
- the present apparatus can be employed to generate an in-focus video movie.
- record and playback functions are optionally provided intermediate camera 354 and A/D unit 358 and/or intermediate camera 354 and image restoration computational unit 356.
- A/D recorder of digital image, playback
- Suitable record and playback functions may also be provided in the embodiment of Fig. 15.
- FOB1 = 1./F-1./DOB1
- FOB2 = 1./F-1./DOB2
- CCC (AS A SIMPLIFIED SIMULATION) BY A SUPERPOSITION OF A NUMBER OF PSF'S; SEE DESCRIPTION IN THE ROUTINE AVPSF.
- FOB3 = 1./F-1./DOB3
- A(I) = A(I)/PSFAC(I)
- SUBROUTINE AVPSF (A,NX,NY,EZ,DLENS,F,FOB,DOB,FOB1,FOB2,NP)
- FOB = 1./(1./F-1./DOB) ! FOCAL DISTANCE
- DOB2 = 1./F-1./FOB2
- RAD = ABS(FF-FOB)*DLENS/2./FOB ! RAD = ABS(FF-FOB)*DLENS/(2*FOB)
- FOB = 1./(1./F-1./DOB) ! FOCAL DISTANCE
- DOB2 = 1./F-1./FOB2
- CCC RAD = ABS(FF-FOB)*DLENS/2./FOB
- CCC F = 1/(2*PI*SIG**2) * EXP(-(X**2+Y**2)/(2*SIG**2))
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Studio Devices (AREA)
- Image Processing (AREA)
Abstract
Optical apparatus comprising multifocal imaging apparatus defining a depth domain, detector apparatus receiving light from a scene via the multifocal imaging apparatus (10), and image processing apparatus (14) which receives an output of the detection apparatus (12) and provides an in-focus image of the portion of the scene (11) falling within the depth domain.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
IL100657 | 1992-01-14 | ||
IL100657A IL100657A0 (en) | 1992-01-14 | 1992-01-14 | Multifocal optical apparatus |
PCT/US1993/000330 WO1993014384A1 (fr) | 1992-01-14 | 1993-01-13 | Appareil optique multifocal |
Publications (2)
Publication Number | Publication Date |
---|---|
EP0623209A1 true EP0623209A1 (fr) | 1994-11-09 |
EP0623209A4 EP0623209A4 (en) | 1994-11-17 |
Family
ID=11063273
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP19930904495 Withdrawn EP0623209A4 (en) | 1992-01-14 | 1993-01-13 | Multifocal optical apparatus. |
Country Status (6)
Country | Link |
---|---|
EP (1) | EP0623209A4 (fr) |
JP (1) | JPH07505725A (fr) |
AU (1) | AU3583193A (fr) |
GB (1) | GB9413367D0 (fr) |
IL (1) | IL100657A0 (fr) |
WO (1) | WO1993014384A1 (fr) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7336430B2 (en) | 2004-09-03 | 2008-02-26 | Micron Technology, Inc. | Extended depth of field using a multi-focal length lens with a controlled range of spherical aberration and a centrally obscured aperture |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AU2002357321A1 (en) | 2001-12-18 | 2003-06-30 | University Of Rochester | Multifocal aspheric lens obtaining extended field depth |
JP2008516299A (ja) * | 2004-10-15 | 2008-05-15 | 松下電器産業株式会社 | 撮像装置及び画像改質処理方法 |
WO2009061439A2 (fr) * | 2007-11-06 | 2009-05-14 | Tessera North America, Inc. | Systèmes optiques déterminés et indéterminés |
ES2423604T3 (es) * | 2009-02-20 | 2013-09-23 | Thales Canada Inc. | Sistema de captación de imágenes óptico de doble campo de visión con una lente focal doble |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1988001069A1 (fr) * | 1986-07-30 | 1988-02-11 | Drexler Technology Corporation | Procede et appareil de lecture de donnees avec des reseaux de zones ccd |
EP0280588A1 (fr) * | 1987-01-21 | 1988-08-31 | Matra | Procédé et dispositif de prise de vue à grande profondeur de champ |
DE3905619A1 (de) * | 1988-02-23 | 1989-08-31 | Olympus Optical Co | Bildeingabe-/ausgabevorrichtung |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3767934A (en) * | 1971-06-16 | 1973-10-23 | Exxon Co | Fault responsive switching system |
US4548495A (en) * | 1981-02-27 | 1985-10-22 | Takeomi Suzuki | Proper focusing state detecting device |
JPS5821715A (ja) * | 1981-07-31 | 1983-02-08 | Asahi Optical Co Ltd | 光束分割器 |
JPS6038613A (ja) * | 1983-08-10 | 1985-02-28 | Canon Inc | 測距光学系 |
DE3729334A1 (de) * | 1987-09-02 | 1989-03-16 | Sick Optik Elektronik Erwin | Lichttaster |
US4993830A (en) * | 1989-12-18 | 1991-02-19 | Systronics, Incorporated | Depth and distance measuring system |
-
1992
- 1992-01-14 IL IL100657A patent/IL100657A0/xx unknown
-
1993
- 1993-01-13 AU AU35831/93A patent/AU3583193A/en not_active Abandoned
- 1993-01-13 EP EP19930904495 patent/EP0623209A4/en not_active Withdrawn
- 1993-01-13 WO PCT/US1993/000330 patent/WO1993014384A1/fr not_active Application Discontinuation
- 1993-01-13 JP JP5512658A patent/JPH07505725A/ja active Pending
-
1994
- 1994-07-01 GB GB9413367A patent/GB9413367D0/en active Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1988001069A1 (fr) * | 1986-07-30 | 1988-02-11 | Drexler Technology Corporation | Procede et appareil de lecture de donnees avec des reseaux de zones ccd |
EP0280588A1 (fr) * | 1987-01-21 | 1988-08-31 | Matra | Procédé et dispositif de prise de vue à grande profondeur de champ |
DE3905619A1 (de) * | 1988-02-23 | 1989-08-31 | Olympus Optical Co | Bildeingabe-/ausgabevorrichtung |
Non-Patent Citations (1)
Title |
---|
See also references of WO9314384A1 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7336430B2 (en) | 2004-09-03 | 2008-02-26 | Micron Technology, Inc. | Extended depth of field using a multi-focal length lens with a controlled range of spherical aberration and a centrally obscured aperture |
US8086058B2 (en) | 2004-09-03 | 2011-12-27 | Aptina Imaging Corporation | Extended depth of field using a multi-focal length lens with a controlled range of spherical aberration and a centrally obscured aperture |
Also Published As
Publication number | Publication date |
---|---|
AU3583193A (en) | 1993-08-03 |
WO1993014384A1 (fr) | 1993-07-22 |
EP0623209A4 (en) | 1994-11-17 |
IL100657A0 (en) | 1992-09-06 |
JPH07505725A (ja) | 1995-06-22 |
GB9413367D0 (en) | 1994-08-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7723662B2 (en) | Microscopy arrangements and approaches | |
JP5274623B2 (ja) | 画像処理装置、撮像装置、画像処理プログラム、および画像処理方法 | |
Green et al. | Multi-aperture photography | |
US8743245B2 (en) | Image processing method, image pickup apparatus, image processing apparatus, and non-transitory computer-readable storage medium | |
US11928794B2 (en) | Image processing device, image processing program, image processing method, and imaging device | |
CN103533227A (zh) | 图像拾取装置和透镜装置 | |
JP2012103741A (ja) | 撮像装置、画像処理装置、および画像処理方法、並びにプログラム | |
CN102959945A (zh) | 根据通过图像捕捉设备的阵列获得的数据产生虚拟输出图像的方法和系统 | |
JP5677366B2 (ja) | 撮像装置 | |
JP6576046B2 (ja) | 複眼撮像装置 | |
JP2011118235A (ja) | 撮像装置 | |
WO2007137112A2 (fr) | Procédé et système pour la correction d'aberrations optiques, notamment dans des applications d'imagerie à champ élargi | |
WO2011137140A1 (fr) | Mesure de distance à l'aide d'une ouverture codée | |
JP3013721B2 (ja) | デジタル画像形成手段を有した光学装置 | |
US20090201386A1 (en) | Image processing apparatus, image processing method, image capturing apparatus, and medium storing a program | |
Labussière et al. | Leveraging blur information for plenoptic camera calibration | |
CN103828362B (zh) | 成像设备以及视频记录与再现系统 | |
Chan et al. | Super-resolution reconstruction in a computational compound-eye imaging system | |
WO2013124664A1 (fr) | Procédé et appareil pour l'imagerie à travers milieu inhomogène à variation temporelle | |
EP0623209A1 (fr) | Appareil optique multifocal | |
JPH0887600A (ja) | 特徴抽出装置 | |
JP2019036115A (ja) | 画像処理方法、画像処理装置、撮像装置、および、プログラム | |
JP2013236291A (ja) | 立体撮像装置 | |
JP5820650B2 (ja) | 撮像装置 | |
JP6569769B2 (ja) | 任意視点画像合成方法及び画像処理装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 19940812 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AT BE CH DE DK ES FR GB GR IE IT LI NL PT SE |
|
A4 | Supplementary search report drawn up and despatched |
Effective date: 19940927 |
|
AK | Designated contracting states |
Kind code of ref document: A4 Designated state(s): AT BE CH DE DK ES FR GB GR IE IT LI NL PT SE |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 19960801 |