WO2010012697A1 - Method and device for enhancing the resolution of a camera - Google Patents


Info

Publication number
WO2010012697A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
image sensor
scene
resolution
camera
Prior art date
Application number
PCT/EP2009/059688
Other languages
French (fr)
Inventor
Guido Becker
Marc Schmiz
Original Assignee
Lion Systems S.A.
Priority date
Filing date
Publication date
Application filed by Lion Systems S.A. filed Critical Lion Systems S.A.
Priority to EP09781144A priority Critical patent/EP2318883A1/en
Publication of WO2010012697A1 publication Critical patent/WO2010012697A1/en

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00Stereoscopic photography
    • G03B35/08Stereoscopic photography by simultaneous recording
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/8943D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • G03B15/16Special procedures for taking photographs; Apparatus therefor for photographing the track of moving objects
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B39/00High-speed photography
    • G03B39/005High-speed photography using image converters or amplifiers

Definitions

  • the object to be recorded is first of all continuously illuminated by the modulated light source.
  • a first partial image, the resolution of which is established by the number of pixels of the image sensor, is recorded by the image sensor.
  • the image of the object is then displaced with respect to the image sensor at least once in at least one dimension by a fraction of a pixel expansion.
  • At least one further partial image is recorded by the image sensor.
  • the recorded partial images are merged to form a complete image of a higher resolution, as shown in Fig. 2.
  • two embodiments, described in detail in the following, are preferred for realizing the device for displacing the optical image with respect to the image sensor.
  • the arrangement of the means for displacing the optical image is such that the emitted light, the reflection of which is used to obtain a depth measurement of the scene or object, travels double the distance between the image sensor and the imaged object before the measurement can be made.
  • a conventional time-of-flight camera having an image sensor with restricted resolution
  • mechanically controllable attachment optics which are positioned in front of the optics integrated into the standard camera.
  • These attachment optics make it possible to offset the light path of the camera optics on the image sensor by half a pixel width in each case.
  • a pixel is an individual optically active surface, which corresponds to one point of the geometric resolution. In the case of the time-of-flight camera used by way of example, this corresponds to an offset of 25 µm.
  • the camera resolution is increased by a factor of four via four photographs which are displaced with respect to one another and which will subsequently be merged to form one image.
  • optical prisms 8, 9 or lenses are used in the light path in front of the imaging optics 7 and the image sensor 6 of the TOF camera 5.
  • This realization is a very compact way of deflecting the light path of the image.
  • two prisms or lenses 8, 9 are advantageously used which can be adjusted in at least one direction.
  • the lenses or prisms 8, 9 can each be pivoted about an axis 11 or 12, respectively, which extends perpendicularly to the light path. If two prisms or lenses 8, 9 are used, their pivot axes 11, 12 are advantageously offset by 90° with respect to one another.
  • the modulated light source is made up of several Light Emitting Diodes (LED) 13. They are arranged around the imaging sensor 6 and optics 7, so that the distance that the emitted light travels to the object that is to be imaged and back to the sensor can be considered to be twice as long as the distance between the sensor and the object. If the modulated light falls on the prisms 8 and 9 before reaching the object to be imaged, some reflections may occur and the emitted light is at least partially sent back to the sensor 6, leading to an erroneous depth measurement. In order to avoid this error, the prisms 8 and 9 need to be shielded from the emitted light by an attachment optics housing 10. The housing 10 allows the incoming, reflected modulated light to reach the sensor 6, while shielding the outgoing modulated light from the displacement optics 8, 9. This aspect of the invention allows for a correct depth measurement of each image that is taken using the displacement optics.
  • the lenses or prisms can be provided such that they deflect the light path when they are rotated about an axis, which is substantially parallel to the light path.
  • Each of the lenses or prisms per se slightly deflects the light path, such that with two prisms or lenses, the light path is deflected in two dimensions with respect to the image sensor.
  • the lenses or prisms need to be shielded by a housing 10.
  • the housing 10 allows the incoming, reflected modulated light to reach the sensor, while shielding the outgoing modulated light from the displacement optics. This aspect of the invention allows for a correct depth measurement of each image that is taken using the displacement optics.
  • the two prisms of the attachment optics are suspended in an axial manner, the pivot axes 11 and 12 of the lenses 8 and 9 being rotated by 90° with respect to one another.
  • the prisms or lenses are positioned as precisely as possible, which allows an absolute X or Y deviation of the light path of ±1.5° with a reproducibility of 0.015°, see Fig. 5.
  • This 1.5° displacement of the light path results from using camera optics with the following technical data:
  • the piezoelectric elements or magnet coils used as actuators are controlled by means of electronics.
  • the control signals S x and S y of the prisms or lenses are synchronized by the readout signal of the camera CAMERA_READ, as shown in Fig. 6.
  • One configuration of the device according to the invention is in the form of electronically adjustable attachment optics, which are controlled by the camera in a temporally synchronous manner, acting as a so-called "micro scanner".
  • At least one optical mirror 14 is used to deflect the light path.
  • the image sensor and the camera optics are not shown in Fig. 7.
  • the mirror 14 and the camera are preferably contained in a housing 16. It is particularly advantageous for the mirror 14 that is located in the light path to be mounted by means of a so-called tip/tilt platform 15, which can be tilted about two axes via an electric control that is applied to the contacts.
  • the mirror 14 is preferably tilted at an angle of substantially 45° with respect to the light path.
  • the control signal shown in Fig. 6 can be used accordingly for this piezoelectric platform 15 to displace the image with respect to the image sensor in two dimensions by adjusting the mirror angle in the light path.
  • the correct dimensioning of the mirror 14 is of high importance.
  • the modulated light emitted by the light source 13 needs to travel twice the distance between the sensor and the object or scene to be recorded 18, before falling back on the sensor. Therefore, all the emitted modulated light must be deflected by the mirror 14, as shown in Fig. 7.
  • the light beams 17 that are deflected by the mirror 14 illuminate the scene 18. They are reflected on the scene and travel the same distance back to the image sensor, where they are demodulated to retrieve the depth information corresponding to the scene 18.
  • If the distance d that separates the light source 13 and the image sensor (not shown) from the mirror 14 is too small, the emitted light beams will be reflected 20 without reaching the scene 18 that is to be illuminated. This leads to erroneous depth measurements. If, on the other hand, the distance d that separates the light source 13 and the image sensor (not shown) from the mirror 14 is too large, the emitted light beams will not be deflected 19 in the direction of the scene 18 and will not be useful for any depth measurement.
  • the distance d must be carefully chosen such as to maximize the amount of light that is correctly deflected into the direction of the scene to be illuminated.
  • the mirror 14 is provided with a coating that presents a high reflectivity in the emitted light spectrum.
  • two mirrors can also be used which are each adjustable in one dimension.
  • introducing attachment optics into the optical path requires this element to be specifically adapted to the method.
  • the mechanically controllable attachment optics must match the measuring principle in their optical and mechanical characteristics, namely in speed, stability and transmission.
  • the attachment optics must be brought into a stable and reproducible state within the short time of 3 ms corresponding to the control of the sensor. This can be achieved by the mentioned piezoelectric elements or magnet coils.
  • the size and the material of the mirror 14 must be chosen so that its mass is small enough to avoid a resonance behavior when the mirror is displaced. As noted above, the size of the mirror 14 also has an impact on the distance d that separates the mirror 14 from the light source 13 and the imaging sensor. The smaller the mirror 14, the closer it has to be positioned with respect to both the light source 13 and the imaging sensor.
  • Precise time control of the image detection in the device according to the invention excludes optical blurring effects during the multiplication of the resolution by means of the mechanically moved displacement means.
  • the measuring accuracy of the camera depends heavily on the optical power budget between the illumination and the individual pixels of the sensor.
  • power budget is understood as meaning the weakening of the light intensity between the light that is emitted by the illumination and reflected by the object and the light that is received on the optically active surface of the pixel.
  • the use of attachment optics results in a reduction of this power budget.
  • the objective is to keep this reduction as low as possible and to determine it precisely by the choice of material of the lenses.
  • the choice of material of the optical coating of the lenses also plays a very important part, because optical surfaces would influence the measured values on account of reflections. Such influences can be prevented by suitable optical coating methods.
  • the choice of the advantageously electrical actuators for controlling the individual elements in the form of prisms, lenses or mirrors has a crucial effect on the function of the attachment optics.
  • the key requirements imposed on the attachment optics are the speed and the precise reproducibility.
  • the adjusting speed of the attachment optics is provided by the physically predetermined readout speed of the camera.
  • the repeat accuracy of the attachment optics is provided computationally by the geometry of the individual pixel and also by the light path of the camera optics.
  • the combination of factors in the previously mentioned example results in a permissible reproducibility tolerance of the element adjustment of 2.5 µm, which corresponds to a beam deviation of only 0.015°. This, together with the requirement that the lenses are moved by at most the maximum reproducibility tolerance of 2.5 µm between the adjusting cycles, is the challenge for the mechanical adjustment unit of the attachment optics.
  • suitable piezoelectric elements are, for example, the piezoelectric tip/tilt platforms S-330.20L manufactured by Physik Instrumente (PI) GmbH & Co. KG, Düsseldorf, Germany.
  • one or two magnet coils can also be used as displacement elements.
  • To control the adjustment unit, electronics are provided which receive a start signal from the camera at the time of the readout of the sensor. These electronics control the actuators such that the adjustment time of 3 ms is observed.
  • the supply voltage of the adjustment unit is stabilized such that the accuracy of 2.5 µm is ensured between the individual adjustment times.
  • the control voltage at the actuator advantageously also determines the respective end positions, without necessitating mechanical boundaries.
  • the preparation of the high-resolution three-dimensional image is realized by a computer post-processing operation.
  • a computer-assisted method combines the four successive images, which have a defined optical offset of ±1.5° in the X and Y directions, to form a complete image with sub-pixel resolution, as shown in Fig. 2.
  • the sub-pixel resolution ensues from the merging of the pixels of the four individual photographs 1, 2, 3, 4 into a resulting image.
  • a high-resolution time-of-flight camera according to the invention can be used, for example, to dynamically record and analyze the sole of a human foot.
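The stepping of the control signals S_x and S_y through the four displacement positions, advancing one step on each CAMERA_READ pulse, can be sketched as follows. The 0/1 levels and the ordering of the four positions are illustrative assumptions, not values from the patent; the actual amplitudes depend on the chosen actuator.

```python
# Sketch of the two actuator control signals S_x and S_y: the optics step
# through four sub-pixel positions, one step per camera readout pulse.
# Level 0/1 stands for the two half-pixel positions of each axis
# (illustrative; real drive voltages depend on the actuator).

POSITIONS = [(0, 0), (1, 0), (1, 1), (0, 1)]  # one full (S_x, S_y) cycle

def control_levels(readout_pulse_count):
    """(S_x, S_y) level after a given number of CAMERA_READ pulses."""
    return POSITIONS[readout_pulse_count % 4]

# One high-resolution frame consumes four readout pulses:
cycle = [control_levels(n) for n in range(4)]
print(cycle)
```

After the fourth pulse the sequence wraps around, so the optics return to the reference position for the next high-resolution frame.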

Abstract

Device and method for enhancing the resolution of a time-of-flight camera for taking rapid image sequences, the time-of-flight camera comprising a modulated light source as well as an image sensor having a fixed number of pixels in two dimensions, each pixel having a fixed expansion in each of the two dimensions.

Description

METHOD AND DEVICE FOR ENHANCING THE RESOLUTION OF A CAMERA
TECHNICAL FIELD
The present invention relates to a method and device for enhancing the resolution of a camera. In particular, the invention is used for enhancing the resolution of a three-dimensional camera for rapid image sequences.
TECHNICAL BACKGROUND
Taking three-dimensional pictures in real time has become increasingly important in recent years. Thus, it is often necessary in technical and medical fields to obtain the distance and/or expansion information of objects in addition to the pure image thereof in a two-dimensional plane. Spatial vision in particular facilitates and simplifies many applications in this area.
The presently known methods for recording three-dimensional image information in an optical manner relate primarily to stereographic methods as well as to triangulation methods. To obtain three-dimensional data of the object to be imaged, these methods require a complex mathematical post-processing operation and are thus only suitable for photographing static objects. A rapid image sequence or even a film cannot be achieved by these methods.
In recent years, various developments have been proposed around the world in which compact so-called 3D TOF (time-of-flight) systems have been realized using time-of-flight methods. Such systems may also be referred to as "optical radar" systems. A description of the principles underlying a TOF camera can for example be found in the German patent DE 44 40 613 C1. For the first time, these TOF or time-of-flight cameras allow dynamic recording of three-dimensional sequences in real time. An image rate of more than 10 images per second already qualifies as sufficient to reproduce moving sequences to the observer's eye.
DISADVANTAGES OF THE PRIOR ART
There is, however, a restriction on the geometric image resolution of these time-of-flight cameras due to the very expensive light detection elements in the available sensors. A substantial restriction exists here in respect of the size of the feasible physical sensor structure (X * Y pixels). However, many applications which require three-dimensional dynamic image information demand a higher geometric resolution.
TECHNICAL OBJECT TO BE ACHIEVED
Therefore, the object of the present invention is to provide a method and a device for enhancing the resolution of a time-of-flight camera. In particular, the object of the present invention is to increase in real time the resolution of a three-dimensional time-of-flight camera for rapid image sequences.
SUMMARY OF THE INVENTION
According to a first aspect of the invention, a device is proposed for enhancing in real time the resolution of rapid image sequences representing three-dimensional scenes and taken by a time-of-flight camera. Such a camera comprises a modulated light source and an image sensor that has a fixed number of pixels in two dimensions, each pixel having a fixed expansion in each of the two dimensions. The proposed device comprises optical means such as for example lenses, prisms or mirrors for displacing the image of the scene on the image sensor in at least one dimension. The arrangement is such that any light passes through these optical means before falling on the image sensor.
The optical means are further arranged so that the modulated light beam that travels from the light source to a point in the scene, which reflects it back to the image sensor, covers a distance that is substantially twice as long as the distance covered by a light beam that would originate at said point in the scene before falling on the image sensor. In a first configuration of the above device, the optical means for displacing the image of the scene on the image sensor may comprise at least one mirror.
The mirror or mirrors used in the device may further present a high reflectivity with respect to the modulated light emitted by the light source.
The mirror may further be tilted by substantially 45° with respect to the direction of the emitted modulated light beam.
In another configuration of the device, the optical means for displacing the image of the scene on the image sensor may comprise at least one prism or lens.
In a further configuration of the device, the means for displacing the image on the image sensor may be configured such that the image is displaced on the image sensor by half a pixel expansion.
In yet a further configuration of the device at least one actuator may be provided, which may be adjusted in at least one dimension to control the means for displacing the image.
In a further aspect of that configuration, the considered actuators may be piezoelectric elements or magnet coils.
According to the invention, a method for enhancing the image resolution of a time-of-flight camera for taking rapid image sequences is proposed. The time-of-flight camera comprises a modulated light source, and an image sensor with a fixed number of pixels in two dimensions. Each of the pixels has a fixed expansion in each direction. The camera further comprises a device with optical means for displacing the image of the scene on the image sensor in at least one dimension. These optical means are arranged so that any light passes through them before falling on the image sensor, and so that the modulated light beam that travels from the light source to a point in the scene, which reflects it back to the image sensor, covers a distance that is substantially twice as long as the distance covered by a light beam that would originate at said point in the scene before falling on the image sensor. The method comprises illuminating the object to be recorded by means of the modulated light source. A first partial image is captured by the image sensor. The image of the scene or object is displaced in at least one direction. At least one further partial image is taken by the image sensor. The captured partial images are merged to produce a complete image of a higher resolution.
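The claimed sequence of steps (illuminate continuously, capture, shift, capture again, merge) can be sketched in Python. The `sensor`, `optics` and `merge` objects are hypothetical stand-ins for the camera hardware, not interfaces from the patent.

```python
# Sketch of the claimed capture sequence. The hardware interface is
# hypothetical: `optics.shift(dx, dy)` displaces the scene image on the
# sensor, `sensor.capture()` records one low-resolution partial image,
# and `merge()` combines the partials (illumination runs continuously).

def acquire_highres_frame(sensor, optics, merge, half_pixel):
    # Four sub-pixel positions: reference, shifted in X, in Y, and in both.
    offsets = [(0, 0), (half_pixel, 0), (0, half_pixel),
               (half_pixel, half_pixel)]
    partials = []
    for dx, dy in offsets:
        optics.shift(dx, dy)               # displace the image of the scene
        partials.append(sensor.capture())  # record one partial image
    optics.shift(0, 0)                     # return to the reference position
    return merge(partials)                 # complete image, higher resolution
```

The merge step itself is a pure post-processing operation; only the shifts and captures must be synchronized with the sensor readout.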
The object image may be displaced in at least one dimension by a fraction of a pixel expansion.
The fraction of a pixel expansion may correspond to one half.
The object image on the image sensor may advantageously be displaced in only one dimension during each displacement.
ADVANTAGES OF THE INVENTION
According to the present invention, a mechanical and/or optical method may enhance the geometric resolution of a camera. The present invention makes it possible to reach a compromise between the dynamics and the necessary resolution in accordance with the use.
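This compromise between dynamics and resolution can be made concrete with a rough timing budget: merging four shifted exposures divides the raw sensor rate, and each shift adds an actuator settling delay. Only the four sub-images and the 3 ms adjustment time come from the description; the 50 fps raw sensor rate below is an assumed illustrative figure.

```python
# Rough frame-rate budget for the sub-pixel method. Illustrative: a raw
# sensor rate of 50 fps is assumed; the 4 sub-images and the 3 ms actuator
# settling time are taken from the description.

def highres_frame_rate(raw_fps, n_subimages, settle_s):
    """Achievable high-resolution frame rate in frames per second."""
    exposure_s = 1.0 / raw_fps  # time budget per low-resolution sub-image
    frame_s = n_subimages * exposure_s + (n_subimages - 1) * settle_s
    return 1.0 / frame_s

rate = highres_frame_rate(raw_fps=50.0, n_subimages=4, settle_s=0.003)
print(round(rate, 1))  # roughly 11 fps with these assumed numbers
```

With these assumptions the merged sequence still exceeds the 10 images per second mentioned as sufficient for reproducing moving sequences, at four times the pixel count.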
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will now be described in depth with reference to the following non-limiting drawings.
Fig. 1 shows a basic overview of the time-of-flight method.
Fig. 2A shows the low-resolution individual images which are obtained according to the invention.
Fig. 2B shows the high-resolution image obtained according to the invention.
Fig. 3 shows a first view of the device according to a first embodiment of the invention.
Fig. 4 shows a second view of the device according to the first embodiment of the invention.
Fig. 5 shows the offset of the optical light path by means of attachment optics according to an embodiment of the invention.
Fig. 6 shows the evolution in time of the control signal of the adjusting means according to the present invention.
Fig. 7 shows a sectional view of the device according to the second embodiment of the invention.
DETAILED DESCRIPTION OF THE INVENTION
As shown in Fig. 1, the optical time-of-flight method on which the time-of-flight camera is based is founded on a measuring method for determining the phase displacement. The implementation of the measuring method is realized by means of an active light source, the light intensity of which is modulated. This modulated light is reflected at the observed object and then falls back onto the lens of the sensor. The sensor demodulates the reflected light in a precision-timed manner according to the modulation of the emitted light. By means of the phase evaluation of the emitted light modulation compared to the modulation of the reflected light, a phase displacement is computed, which corresponds to exactly twice the distance between the object and the camera. Since the time-of-flight sensor comprises a matrix of measuring cells, topographical information about the observed object is obtained.
In the time-of-flight method for taking three-dimensional images, shown in Fig. 1, the time-of-flight of the integrated illumination is used to determine the third dimension. The fundamental method, also termed continuous-wave (cw) measurement, is based on a light intensity modulation. In this respect, it is not the time-of-flight itself, but the phase displacement between the emitted and the detected light that is measured. The relation between the emitted modulation and the received modulation is evaluated and the phase displacement is determined. The phase displacement is proportional to the time-of-flight traveled by the light. Since the light travels to the object and back, the time-of-flight that is determined corresponds to twice the distance between the object and the camera.
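The phase-to-distance relation described above can be illustrated with a short numerical sketch. The 20 MHz modulation frequency and the example phase value below are assumptions chosen for illustration only; they are not taken from the description.

```python
# Sketch of the continuous-wave (cw) phase-to-distance relation:
# one full modulation period (2*pi of phase) corresponds to a round trip
# of one modulation wavelength, and the object distance is half the round trip.
import math

C = 299_792_458.0  # speed of light in m/s

def distance_from_phase(phase_rad: float, f_mod_hz: float) -> float:
    """Convert a measured phase displacement to an object distance.

    The light travels to the object and back, so the measured phase
    corresponds to twice the object distance.
    """
    round_trip_m = (phase_rad / (2.0 * math.pi)) * (C / f_mod_hz)
    return round_trip_m / 2.0  # object distance is half the round trip

# Example: a phase shift of pi/2 at an assumed 20 MHz modulation
d = distance_from_phase(math.pi / 2, 20e6)  # roughly 1.87 m
```

Note the unambiguous range limit implicit in this relation: phases repeat every 2*pi, so at 20 MHz distances beyond about 7.5 m alias back into the measurable interval.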
Accordingly, it is to be noted that the depth measurement obtained by demodulation of the received light will only be meaningful if the distance that the emitted modulated light travels before reaching the camera corresponds to twice the distance between the imaged object and the image sensor.
The device according to the invention for enhancing the image resolution of a time-of-flight camera relates to a time-of-flight camera, which comprises a modulated light source and an image sensor with a fixed number of pixels in two dimensions. This provides for time-of-flight recordings of a three-dimensional object, the third dimension being obtained from the phase displacement between emitted and detected modulated light. Each pixel in the camera has a fixed expansion in each of the two dimensions. In addition, a device for displacing the image of an object to be photographed with respect to the image sensor is provided.
The method for enhancing the geometric resolution is based on the fact that the optical light path of the image is preferably repeatedly displaced each time by half a pixel width in at least one direction in the plane of the optical sensor. This displacement and the later superimposition of the sequentially recorded individual images produce a higher resolution image with a so-called sub-pixel resolution.
As a result of merging four individually recorded images, a complete image with fourfold geometric resolution is produced, for example, as depicted in Fig. 2. According to the present invention, care is taken to ensure that the emitted light travels twice the distance between the object to be imaged and the imaging sensor when the displacement means for displacing the image are positioned in the optical path between the object and the sensor. If that is not the case, a correct depth measurement according to the time-of-flight method is not possible.
According to the present invention, the object to be recorded is first of all continuously illuminated by the modulated light source. A first partial image, the resolution of which is established by the number of pixels of the image sensor, is recorded by the image sensor. The image of the object is then displaced with respect to the image sensor at least once in at least one dimension by a fraction of a pixel expansion. At least one further partial image is recorded by the image sensor. The recorded partial images are merged to form a complete image of a higher resolution, as shown in Fig. 2. In this respect, the two embodiments described in detail below for realizing the device for displacing the optical image with respect to the image sensor are preferred according to the present invention. In these embodiments, the means for displacing the optical image are arranged such that the emitted light, the reflection of which is used to obtain a depth measurement of the scene or object, travels double the distance between the image sensor and the imaged object before the measurement can be made.
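The merging step can be sketched as interleaving four half-pixel-shifted captures into a grid of twice the width and height. This is a non-authoritative illustration: the assignment of the four shifted captures to sub-pixel grid positions, and the use of a plain interleave rather than any interpolation, are assumptions not specified in the description.

```python
import numpy as np

def merge_quad(img00, img10, img01, img11):
    """Interleave four sub-pixel-shifted captures into one image with
    twice the resolution in each dimension (fourfold pixel count).

    img00 is the reference capture; img10 is assumed shifted half a pixel
    horizontally, img01 half a pixel vertically, img11 in both directions.
    """
    h, w = img00.shape
    out = np.empty((2 * h, 2 * w), dtype=img00.dtype)
    out[0::2, 0::2] = img00  # even rows, even columns
    out[0::2, 1::2] = img10  # even rows, odd columns
    out[1::2, 0::2] = img01  # odd rows, even columns
    out[1::2, 1::2] = img11  # odd rows, odd columns
    return out

# Four QCIF-sized (176 x 144 pixel) depth maps yield one CIF-sized (352 x 288) map
low = [np.zeros((144, 176)) for _ in range(4)]
high = merge_quad(*low)
```

The same interleave applies to depth maps and amplitude images alike, since each pixel of a time-of-flight sensor delivers both values per capture.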
According to the present invention, a conventional time-of-flight camera, having an image sensor with restricted resolution, is provided with mechanically controllable attachment optics which are positioned in front of the optics integrated into the standard camera. These attachment optics make it possible to offset the light path of the camera optics on the image sensor by half a pixel width in each case. A pixel is an individual optically active surface, which corresponds to one point of the geometric resolution. In the case of the time-of-flight camera used by way of example, this corresponds to an offset of 25 μm. By means of this offset of the image on the image sensor, the camera resolution is increased by a factor of four via four photographs which are displaced with respect to one another and which are subsequently merged to form one image.
In a first embodiment according to the invention as shown in Figure 3, optical prisms 8, 9 or lenses are used in the light path in front of the imaging optics 7 and the image sensor 6 of the TOF camera 5. This realization offers a very compact way of deflecting the light path of the image. As shown in Fig. 4, two prisms or lenses 8, 9 are advantageously used which can be adjusted in at least one direction. In this respect, the lenses or prisms 8, 9 can each be pivoted about an axis 11 or 12, respectively, which extends perpendicularly to the light path. If two prisms or lenses 8, 9 are used, their pivot axes 11, 12 are advantageously offset by 90° with respect to one another.
In typical time-of-flight cameras, the modulated light source is made up of several light-emitting diodes (LEDs) 13. They are arranged around the imaging sensor 6 and optics 7, so that the distance that the emitted light travels to the object to be imaged and back to the sensor can be considered to be twice the distance between the sensor and the object. If the modulated light falls on the prisms 8 and 9 before reaching the object to be imaged, reflections may occur and the emitted light is at least partially sent back to the sensor 6, leading to an erroneous depth measurement. In order to avoid this error, the prisms 8 and 9 need to be shielded from the emitted light by an attachment optics housing 10. The housing 10 allows the incoming, reflected modulated light to reach the sensor 6, while shielding the outgoing modulated light from the displacement optics 8, 9. This aspect of the invention allows for a correct depth measurement of each image that is taken using the displacement optics.
Alternatively, as shown in Fig. 5, the lenses or prisms can be provided such that they deflect the light path when they are rotated about an axis, which is substantially parallel to the light path. Each of the lenses or prisms per se slightly deflects the light path, such that with two prisms or lenses, the light path is deflected in two dimensions with respect to the image sensor. In such an alternative embodiment, the lenses or prisms need to be shielded by a housing 10. The housing 10 allows the incoming, reflected modulated light to reach the sensor, while shielding the outgoing modulated light from the displacement optics. This aspect of the invention allows for a correct depth measurement of each image that is taken using the displacement optics.
The compact realization of the attachment optics with prisms or lenses makes it possible to provide rapid displacement units in a small space.
According to one embodiment, the two prisms of the attachment optics are suspended in an axial manner, the pivot axes 11 and 12 of the lenses 8 and 9 being rotated by 90° with respect to one another.
By controlling actuators which are configured as piezoelectric elements or magnet coils, the prisms or lenses are positioned as precisely as possible, which allows an absolute X or Y deviation of the light path of ±1.5° with a reproducibility of 0.015°, see Fig. 5. This 1.5° displacement of the light path results from using camera optics with the following technical data:
Diameter: 30 mm
Distance of the front lens surface from the image sensor: 47 mm
Stop number (f-number): 1.0
Focal length: 8.5 mm
Image diagonal: 9 mm
Optical offset on the sensor side: 25 μm
The piezoelectric elements or magnet coils used as actuators are controlled by means of electronics. The control signals Sx and Sy of the prisms or lenses are synchronized by the readout signal of the camera CAMERA_READ, as shown in Fig. 6.
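The synchronization of the control signals with the camera readout can be sketched as a simple state machine: each CAMERA_READ pulse advances the optics to the next of the four sub-pixel positions. The 25 μm half-pixel offsets and their ordering below are illustrative assumptions; the description does not fix a particular sequence.

```python
from itertools import cycle

# Assumed half-pixel target offsets (Sx, Sy) in micrometres on the sensor
# plane; the four positions and their order are illustrative only.
POSITIONS_UM = [(0.0, 0.0), (25.0, 0.0), (25.0, 25.0), (0.0, 25.0)]

def control_signals(camera_read_pulses):
    """Yield one (Sx, Sy) actuator target per CAMERA_READ pulse.

    Mirrors the synchronisation of Fig. 6: the attachment optics are
    stepped to the next sub-pixel position during each sensor readout,
    cycling through the four positions and then repeating.
    """
    positions = cycle(POSITIONS_UM)
    for _ in camera_read_pulses:
        yield next(positions)

sequence = list(control_signals(range(8)))  # two full four-image cycles
```

Stepping only between readouts keeps the optics stationary during exposure, which is what avoids motion blur in the individual captures.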
One configuration of the device according to the invention is in the form of electronically adjustable attachment optics, which are controlled by the camera in a temporally synchronous manner, forming a so-called "micro scanner".
As shown in Fig. 7, according to a second embodiment, at least one optical mirror 14 is used to deflect the light path. For the sake of clarity, the image sensor and the camera optics are not shown in Fig. 7. The mirror 14 and the camera are preferably contained in a housing 16. It is particularly advantageous for the mirror 14 that is located in the light path to be mounted by means of a so-called tip/tilt platform 15, which can be tilted in two axes via an electric control applied to its contacts. When the image is not displaced with respect to the imaging sensor, the mirror 14 is preferably tilted at an angle of substantially 45° with respect to the light path. The control signal shown in Fig. 6 can be used accordingly for this piezoelectric platform 15 to displace the image with respect to the image sensor in two dimensions by adjusting the mirror angle in the light path.
In this embodiment, the correct dimensioning of the mirror 14 is of high importance. In order to obtain a meaningful depth measurement of the imaged scene, the modulated light emitted by the light source 13 needs to travel twice the distance between the sensor and the object or scene 18 to be recorded before falling back on the sensor. Therefore, all the emitted modulated light must be deflected by the mirror 14, as shown in Figure 7. The light beams 17 that are deflected by the mirror illuminate the scene 18. They are reflected on the scene and travel the same distance back to the image sensor, where they are demodulated to retrieve the depth information corresponding to the scene 18. If the distance d that separates the light source 13 and the image sensor (not shown) from the mirror 14 is too small, the emitted light beams 20 will be reflected without reaching the scene 18 that is to be illuminated. This leads to erroneous depth measurements. If, on the other hand, the distance d that separates the light source 13 and the image sensor (not shown) from the mirror 14 is too large, the emitted light beams 19 will not be deflected in the direction of the scene 18 and will not be useful for any depth measurement.
Therefore the distance d must be chosen carefully so as to maximize the amount of light that is correctly deflected in the direction of the scene to be illuminated. Preferably, the mirror 14 is provided with a coating that exhibits a high reflectivity in the emitted light spectrum.
Instead of one mirror, which can be adjusted in two dimensions, alternatively two mirrors can also be used which are each adjustable in one dimension.
The introduction of attachment optics into the optical path requires this element to be specifically adapted to the method. The mechanically controllable attachment optics must match the measuring principle in their optical and mechanical characteristics, namely in speed, stability and transmission.
The requirements imposed on the system layout of a high-precision, dynamic three-dimensional near-field detection for movable objects with at least 10 high-resolution images per second (10 Hz) necessitate a high speed of image deflection. For example, 10 high-resolution images per second at CIF resolution (352 x 288 pixels) require 40 image recordings via the movable attachment optics at the physical camera resolution QCIF (176 x 144 pixels).
Since a displacement of the optical axis during the image recording would result in the image data becoming blurred, the light path is displaced during the time in which the image data is read out from the camera. Thus, the attachment optics are brought into a stable and reproducible state within the short time of 3 ms, corresponding to the control of the sensor. This can be achieved by the mentioned piezoelectric elements or magnet coils.
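The timing figures quoted above can be checked with a short budget calculation: 10 high-resolution images per second require 40 raw captures per second, leaving 25 ms per raw frame, of which 3 ms is spent settling the optics during readout.

```python
# Worked timing budget for the example in the description.
HIGH_RES_FPS = 10        # target high-resolution images per second
SHIFTS_PER_IMAGE = 4     # four sub-pixel positions per merged image
SETTLE_MS = 3.0          # adjustment time of the attachment optics

raw_fps = HIGH_RES_FPS * SHIFTS_PER_IMAGE          # raw captures per second
frame_budget_ms = 1000.0 / raw_fps                 # time available per raw frame
exposure_budget_ms = frame_budget_ms - SETTLE_MS   # time left for exposure/readout
```

So the optics must complete each half-pixel step well within the 25 ms frame slot, which is why the 3 ms settling requirement drives the actuator choice.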
Since a mechanical adjustment of optical elements in the form of prisms, lenses or mirrors is concerned here, due to the material mass of the elements, said elements must be prevented from becoming self-resonant. Should the optical elements become self-resonant, a reproducible repetition of the beam deflection would be impossible. The term "self-resonance" is understood as meaning the vibration of the elements due to their dead weight and the speed of adjustment. For this reason, the dynamics of the elements are determined very precisely and, if necessary, the elements are made of appropriate materials so that a resonance behavior between the mass and the speed does not occur.
Referring to Figure 7, the size and the material of the mirror 14 must be chosen so that its mass is small enough to avoid a resonance behavior when the mirror is displaced. As noted above, the size of the mirror 14 also has an impact on the distance d that separates the mirror 14 from the light source 13 and the imaging sensor. The smaller the mirror 14, the closer it has to be positioned with respect to both the light source 13 and the imaging sensor.
To avoid blurring effects in the reproduction, optical assemblies in image-producing systems do not usually have any moving elements. Precise time control of the image detection in the device according to the invention excludes optical blurring effects during the multiplication of the resolution by means of the mechanically moved displacement means.
Since the three-dimensional time-of-flight camera is based on the combination of an active illumination with a precise recording technique, the measuring accuracy of the camera depends heavily on the optical power budget between the illumination and the individual pixels of the sensor. The term "power budget" is understood as meaning the weakening of the light intensity between the light that is emitted by the illumination and reflected by the object and the light that is received on the optically active surface of the pixel. The use of attachment optics results in a reduction of this power budget. The objective is to keep this reduction as low as possible and to determine it precisely by the choice of material of the lenses. Furthermore, the choice of material of the optical coating of the lenses also plays a very important part, because optical surfaces would influence the measured values on account of reflections. Such influences can be prevented by suitable optical coating methods.
The choice of the advantageously electrical actuators for controlling the individual elements in the form of prisms, lenses or mirrors has a crucial effect on the function of the attachment optics. The key requirements imposed on the attachment optics are speed and precise reproducibility. The adjusting speed of the attachment optics is dictated by the physically predetermined readout speed of the camera. The repeat accuracy of the attachment optics follows computationally from the geometry of the individual pixel and from the light path of the camera optics. In the previously mentioned example, the combination of factors results in a maximum permissible positioning tolerance of the element adjustment of 2.5 μm, which corresponds to a beam deviation of only 0.015°. This, and the requirement that the lenses are moved by at most the maximum reproducibility tolerance of 2.5 μm between the adjusting cycles, is the challenge for the mechanical adjustment unit of the attachment optics.
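The relation between beam deviation and sensor-plane displacement can be sketched with the small-angle geometry offset = L * tan(angle). The lever arm L is not stated in the description; using the 8.5 mm focal length of the example optics as a stand-in is purely an assumption, so this is only an order-of-magnitude check against the 2.5 μm tolerance.

```python
import math

# Assumed lever arm between the deflecting element and the sensor plane.
# The patent does not state this value; the 8.5 mm focal length of the
# example camera optics is used here purely as an illustrative stand-in.
LEVER_ARM_MM = 8.5

def sensor_offset_um(angle_deg: float, lever_mm: float = LEVER_ARM_MM) -> float:
    """Sensor-plane image offset (micrometres) produced by deflecting the
    beam by angle_deg, using the small-angle relation offset = L * tan(angle)."""
    return lever_mm * 1000.0 * math.tan(math.radians(angle_deg))

# Under this assumption, a 0.015 degree beam deviation maps to roughly
# 2.2 um on the sensor, the same order as the quoted 2.5 um tolerance.
tiny = sensor_offset_um(0.015)
```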
Possible piezoelectric elements are, for example, the piezoelectric tip/tilt platforms S-330.20L manufactured by Physik Instrumente (PI) GmbH & Co. KG, Karlsruhe, Germany. Alternatively, one or two magnet coils can also be used as displacement elements.
To control the adjustment unit, electronics are provided which receive a start signal from the camera at the time of the readout of the sensor. These electronics control the actuators such that the adjustment time of 3 ms is observed. The supply voltage of the adjustment unit is stabilized such that the accuracy of 2.5 μm is ensured between the individual adjustment times. The control voltage at the actuator advantageously also determines the respective end positions, without necessitating mechanical boundaries.
The preparation of the high-resolution three-dimensional image is realized by a computer post-processing operation. Provided for this purpose is a computer-assisted method which combines the four successive images, which have a defined optical offset of ±1.5° in the X and Y directions, to form a complete image with sub-pixel resolution, as shown in Fig. 2. The sub-pixel resolution ensues from the merging of the pixels of the four individual photographs 1, 2, 3, 4 in a resulting image.
In an advantageous application of the present invention, a high-resolution time-of-flight camera according to the invention can be used, for example, to dynamically record and analyze the sole of a human foot.
Doubtlessly, many other effective configurations will occur to a person skilled in the art. Therefore, the invention is not restricted to the embodiments that have been described and includes obvious modifications for a person skilled in the art which are covered by the scope of protection of the following claims.

Claims

1. Device for enhancing in real time the resolution of rapid image sequences representing three-dimensional scenes and taken by a time-of-flight camera, said camera (5) comprising a modulated light source (13) and an image sensor (6) having a fixed number of pixels in two dimensions, each pixel having a fixed expansion in each of the two dimensions, the device comprising optical means (8,9,14) for displacing the image of the scene on the image sensor in at least one dimension, operatively arranged so that any light passes through said optical means before falling on the image sensor (6), and so that the modulated light beam that travels from the light source (13) to a point in the scene which reflects it back to the image sensor (6), covers a distance that is substantially twice as long as the distance covered by a light beam that would originate at said point in the scene before falling on the image sensor (6).
2. The device according to claim 1, wherein the optical means for displacing the image of the scene on the image sensor comprise at least one mirror.
3. The device according to claim 2, wherein the mirror has a high reflectivity with respect to the modulated light emitted by the light source.
4. The device according to claim 2 or 3, wherein the mirror is tilted by substantially 45° with respect to the direction of the emitted modulated light beam.
5. The device according to claim 1, wherein the optical means for displacing the image of the scene on the image sensor comprise at least one prism or lens.
6. The device according to any one of the preceding claims, wherein the means for displacing the image on the image sensor are configured such that the image is displaced on the image sensor by half a pixel expansion.
7. The device according to any one of the preceding claims, wherein at least one actuator is provided, which can be adjusted in at least one dimension to control the means for displacing the image.
8. The device according to claim 7, wherein the at least one actuator is a piezoelectric element or a magnet coil.
9. Method for enhancing the image resolution of a time-of-flight camera for taking rapid image sequences, wherein the time-of-flight camera (5) comprises a modulated light source (13), an image sensor (6) with a fixed number of pixels in two dimensions, each pixel having a fixed expansion in each direction, and a device comprising optical means (8,9,14) for displacing the image of the scene on the image sensor (6) in at least one dimension, operatively arranged so that any light passes through said optical means before falling on the image sensor (6), and so that the modulated light beam that travels from the light source (13) to a point in the scene which reflects it back to the image sensor (6), covers a distance that is substantially twice as long as the distance covered by a light beam that would originate at said point in the scene before falling on the image sensor (6), the method comprising:
- the illumination of the scene to be recorded by means of the modulated light source (13);
- the capture of a first partial image by the image sensor (6);
- the displacement of the image of the scene at least once in at least one direction;
- the capture of at least one further partial image by the image sensor (6);
- the merging of the captured partial images to produce a complete image of a higher resolution.
10. Method according to claim 9, wherein the image of the scene is displaced in at least one dimension by a fraction of a pixel expansion.
11. Method according to claim 10, wherein the fraction of a pixel expansion corresponds to half a pixel expansion.
12. Method according to any one of claims 9 to 11, wherein the image of the scene on the image sensor is displaced in each case only in one dimension during each displacement.
PCT/EP2009/059688 2008-07-28 2009-07-27 Method and device for enhancing the resolution of a camera WO2010012697A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP09781144A EP2318883A1 (en) 2008-07-28 2009-07-27 Method and device for enhancing the resolution of a camera

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
LU91464A LU91464B1 (en) 2008-07-28 2008-07-28 Method and device for increasing the resolution of a camera
LU91464 2008-07-28

Publications (1)

Publication Number Publication Date
WO2010012697A1 (en) 2010-02-04

Family

ID=40407706

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2009/059688 WO2010012697A1 (en) 2008-07-28 2009-07-27 Method and device for enhancing the resolution of a camera

Country Status (3)

Country Link
EP (1) EP2318883A1 (en)
LU (1) LU91464B1 (en)
WO (1) WO2010012697A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
LU92168B1 (en) * 2013-03-18 2014-09-19 Lion Systems Sa Stance and gait analysis
US10091492B2 (en) 2014-10-21 2018-10-02 Infineon Technologies Ag Imaging apparatuses and a time of flight imaging method

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001309229A (en) * 2000-04-24 2001-11-02 Victor Co Of Japan Ltd Solid-state imaging apparatus and resolution conversion apparatus
US6323942B1 (en) * 1999-04-30 2001-11-27 Canesta, Inc. CMOS-compatible three-dimensional image sensor IC
JP2002171446A (en) * 2000-11-30 2002-06-14 Victor Co Of Japan Ltd Image pickup device
US7003177B1 (en) * 1999-03-30 2006-02-21 Ramot At Tel-Aviv University Ltd. Method and system for super resolution
US20070098388A1 (en) * 2005-10-28 2007-05-03 Richard Turley Systems and methods of generating Z-buffers for an image capture device of a camera
US20070171284A1 (en) * 2006-01-23 2007-07-26 Intel Corporation Imager resolution enhancement based on mechanical pixel shifting

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08107517A (en) * 1994-10-04 1996-04-23 Sony Corp Solid-state image pickup device
DE69528915T2 (en) * 1994-02-28 2003-07-31 Canon Kk Imaging device
US6570613B1 (en) * 1999-02-26 2003-05-27 Paul Howell Resolution-enhancement method for digital imaging
US8055054B2 (en) * 2006-12-15 2011-11-08 General Electric Company Method and apparatus for thermographic nondestructive evaluation of an object


Also Published As

Publication number Publication date
EP2318883A1 (en) 2011-05-11
LU91464B1 (en) 2010-01-29


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09781144

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2009781144

Country of ref document: EP