WO2009078002A1 - 3d camera and methods of gating thereof - Google Patents


Info

Publication number
WO2009078002A1
Authority
WO
WIPO (PCT)
Prior art keywords
gate
light
gates
scene
pulse
Prior art date
Application number
PCT/IL2007/001571
Other languages
English (en)
French (fr)
Inventor
Giora Yahav
Gil Zigelman
Allan C. Entis
Original Assignee
Microsoft International Holdings B.V.
Priority date
Filing date
Publication date
Application filed by Microsoft International Holdings B.V. filed Critical Microsoft International Holdings B.V.
Priority to EP07849597A priority Critical patent/EP2235563A1/en
Priority to PCT/IL2007/001571 priority patent/WO2009078002A1/en
Priority to CN2007801023367A priority patent/CN102099703A/zh
Publication of WO2009078002A1 publication Critical patent/WO2009078002A1/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • G01S17/10Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
    • G01S17/18Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves wherein range gates are used

Definitions

  • the invention relates to methods and apparatus for acquiring 3D images of a scene.
  • 3D optical imaging systems, hereinafter referred to as "3D cameras", that are capable of providing distance measurements to objects and points on objects that they image, are used for many different applications.
  • these applications are profile inspections of manufactured goods, CAD verification, robot vision, geographic surveying and imaging objects selectively as a function of distance.
  • Some 3D cameras provide simultaneous measurements to substantially all points of objects in a scene they image.
  • these 3D cameras comprise a light source, such as a laser, which is pulsed or shuttered so that it provides pulses of light for illuminating a scene being imaged and a gated imaging system for imaging light from the light pulses that is reflected from objects in the scene.
  • the gated imaging system comprises a camera having a photosensitive surface, hereinafter referred to as a "photosurface", such as a CCD or CMOS photosurface, and a gating means for gating the camera open and closed, such as an optical shutter or a gated image intensifier.
  • the reflected light is registered on pixels of the photosurface of the camera only if it reaches the camera when the camera is gated open.
  • the scene is generally illuminated with a train of light pulses radiated from the light source. For each radiated light pulse in the train, following an accurately determined delay from the time that the light pulse is radiated, the camera is gated open for a period of time hereinafter referred to as a "gate".
  • Light from the light pulse that is reflected from an object in the scene is imaged on the photosurface of the camera if it reaches the camera during the gate.
  • Since the time elapsed between radiating a light pulse and the gate that follows it is known, the time it took imaged light to travel from the light source to the reflecting object in the scene and back to the camera is known. The time elapsed is used to determine the distance to the object.
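The elapsed-time-to-distance relation described above can be sketched as a small illustrative computation (not code from the patent; the function name is hypothetical):

```python
# Speed of light in m/s.
C = 299_792_458.0

def distance_from_round_trip(t_elapsed_s: float) -> float:
    """Distance to a reflecting object given the round-trip travel time.

    The light traverses the camera-to-object path twice, so the one-way
    distance is half the total path length c * t.
    """
    return C * t_elapsed_s / 2.0

# A 20 ns round trip corresponds to roughly 3 m.
print(distance_from_round_trip(20e-9))
```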
  • the cameras described in these patents use amounts of light registered by pixels in the camera during times at which the camera is gated open to determine distances to features in contiguous slices of a scene.
  • the slice locations and spatial widths are defined by length and timing of gates during which the cameras are gated open relative to pulse lengths and timing of light pulses that are transmitted to illuminate the scene.
  • At least two gates are used to acquire a 3D image of a slice of the scene. Relative to a time at which a light pulse is transmitted to illuminate the scene, a front gate starts at a same time that a long gate begins and a back gate ends at a same time as a long gate ends.
  • the short gates optionally have a gate width equal to a pulse width of pulses of light used to illuminate the scene and a long gate has a pulse width equal to twice the light pulse width.
  • Amounts of light registered during the at least one short gate by a pixel in the camera that images a feature of the scene are normalized to amounts of light registered by the pixel during the at least one long gate.
  • the normalized amounts of registered light are used to determine a distance to the feature.
  • a total acquisition time for acquiring data for a 3D image of a scene is substantially equal to a number of slices of the scene that are imaged, times a time, a "slice acquisition time", required to acquire 3D data for a single slice.
  • a slice acquisition time is a function of a number of light pulses and gates required to register quantities of light for the various gates sufficient to provide data for determining distances to features of the scene located in the slice.
  • a 3D camera using a pulsed source of illumination and a gated imaging system is described in "Design and Development of a Multi-detecting two Dimensional Ranging Sensor", Measurement Science and Technology 6 (September 1995), pages 1301-1308, by S. Christie, et al., and in "Range-gated Imaging for Near Field Target Identification", Yates et al., SPIE Vol. 2869, pp. 374-385, which are herein incorporated by reference.
  • Another 3D camera is described in U.S. patent 5,081,530 to Medina, which is incorporated herein by reference.
  • a 3D camera described in this patent registers energy in a pulse of light reflected from a target that reaches the camera's imaging system during each gate of a pair of gates. Distance to a target is determined from the ratio of the difference between the amounts of energy registered during each of the two gates to the sum of the amounts of energy registered during each of the two gates.
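The ratio that the Medina patent uses can be written as a one-line computation. This is a sketch of the stated ratio only; the function and argument names are illustrative:

```python
def medina_ratio(q1: float, q2: float) -> float:
    """Ratio of the difference to the sum of the energies registered during
    the two gates of the pair; distance to the target is determined from
    this ratio."""
    return (q1 - q2) / (q1 + q2)

# Equal energies in both gates give a ratio of 0.
print(medina_ratio(1.0, 1.0))
```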
  • An aspect of some embodiments of the invention relates to providing a gating procedure for gating a gated 3D camera that provides a relatively short acquisition time for a 3D image of a scene.
  • An aspect of some embodiments of the invention relates to providing a configuration of gates for providing a 3D image of a scene for which, for a given width of a light pulse that illuminates the scene the gates provide 3D data for slices of the scene that have relatively large spatial widths.
  • a method of acquiring a 3D image of a scene comprising: transmitting at least one light pulse at a transmission time to illuminate the scene; imaging light reflected by the scene from the at least one light pulse on a gateable photosurface during a plurality of sets of gates, each set comprising at least one first, second and third gate having start and stop times and for which, relative to a transmission time of a light pulse of the at least one light pulse, the start and stop times of the at least one first gate and the at least one second gate are between the start and stop times of the at least one third gate, wherein for at least one of the set of gates, the at least one third gate has a start time equal to about the stop time of the at least one third gate of another of the set of gates; and using amounts of reflected light imaged on the photosurface to determine distances to the scene.
  • the start time of the second gate is substantially equal to the stop time of the first gate.
  • the start time of the first gate is delayed relative to the start time of the third gate by a pulse width of the at least one light pulse.
  • the first and second gates have equal gate widths.
  • the gate width of the first and second gates is equal to half a pulse width of the at least one light pulse.
  • the gate width of the third gate is substantially equal to three pulse widths of the at least one light pulse.
  • the at least one third gate has a start time earlier than the stop time of the at least one third gate of another of the set of gates.
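Taken together, the optional timings in the claims above can be sketched numerically. The function and key names below are illustrative, not from the patent:

```python
def gate_set(third_start: float, pulse_width: float) -> dict:
    """Start/stop times of one set of first, second and third gates, relative
    to a light-pulse transmission time, per the optional timings above:
    - the first gate is delayed one pulse width after the third gate starts,
    - the first and second gates are each half a pulse width long, back to back,
    - the third (long) gate is three pulse widths long.
    """
    first_start = third_start + pulse_width
    first_stop = first_start + pulse_width / 2.0
    second_start = first_stop              # second gate starts when first stops
    second_stop = second_start + pulse_width / 2.0
    third_stop = third_start + 3.0 * pulse_width
    return {"first": (first_start, first_stop),
            "second": (second_start, second_stop),
            "third": (third_start, third_stop)}

print(gate_set(0.0, 10.0))
```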
  • a method of acquiring a 3D image of a scene comprising: transmitting at least one light pulse at a transmission time to illuminate the scene; imaging light reflected by the scene from the at least one light pulse on a gateable photosurface during a plurality of equal length gates having start and stop times and for which, relative to a transmission time of a light pulse of the at least one light pulse, the start time of at least one first gate is substantially a time half way between the start and stop times of at least one second gate; and using amounts of reflected light imaged on the photosurface during the at least one first gate and the at least one second gate to determine distance to a feature of the scene.
  • the gates have a gate width substantially equal to twice a pulse width of the at least one light pulse.
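The half-overlapped arrangement of equal-length gates described in this method can be sketched as follows (illustrative names; it assumes the "first" gate is the later-starting of the two):

```python
def staggered_gates(second_start: float, pulse_width: float):
    """Two equal-length gates of width 2 * pulse_width, where the first gate
    starts halfway between the start and stop times of the second gate."""
    width = 2.0 * pulse_width
    second = (second_start, second_start + width)
    first_start = second_start + width / 2.0   # midpoint of the second gate
    first = (first_start, first_start + width)
    return first, second

print(staggered_gates(0.0, 10.0))
```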
  • a method of acquiring a 3D image of a scene comprising: transmitting at least one light pulse having a pulse width at a transmission time to illuminate the scene; imaging light reflected by the scene from the at least one light pulse on a gateable photosurface during a set of gates comprising at least one first, second and third gates having start and stop times and for which, relative to a transmission time of a light pulse of the at least one light pulse, the start and stop times of each first gate and each second gate are between the start and stop times of a same third gate; and using amounts of reflected light imaged on the photosurface to determine distances to the scene.
  • the first gate has a start time delayed relative to a start time of the third gate by about the pulse width.
  • the second gate has a stop time that precedes a stop time of the at least one third gate by about a pulse width.
  • the first gate has a gate width equal to about half a pulse width.
  • the second gate has a gate width equal to about half a pulse width.
  • the third gate has a gate width equal to about three pulse widths.
  • a camera useable to acquire a 3D image of a scene comprising: a gateable photosurface; and a controller that gates the photosurface in accordance with an embodiment of the invention.
  • the camera comprises a light source controllable to illuminate the scene with a pulse of light.
  • Fig. 1 schematically illustrates a gated 3D camera being used to acquire a 3D image of a scene;
  • Fig. 2 shows a time-distance graph that illustrates a temporal configuration of gates of the camera and at least one light pulse that illuminates the scene shown in Fig. 1 used to acquire a 3D image of the scene, in accordance with prior art;
  • Fig. 3 shows a time-distance graph that illustrates an ambiguity in determining distance to a feature in a scene, in accordance with prior art;
  • Fig. 4 shows a time-distance graph that illustrates a temporal configuration of gates for removing the ambiguity illustrated in Fig. 3, in accordance with an embodiment of the invention;
  • Fig. 5 schematically shows a scene and slices of the scene for which a gated camera provides 3D data;
  • Fig. 6 shows a time-distance graph that graphically illustrates timing of gates used to acquire 3D data for the plurality of slices of the scene shown in Fig. 5 using the gating configuration illustrated in Fig. 4, in accordance with an embodiment of the invention;
  • Fig. 7 shows a time-distance graph that graphically illustrates timing of gates used to acquire 3D data for a plurality of slices of a scene, in accordance with another embodiment of the invention;
  • Fig. 8 shows a time-distance graph that graphically illustrates another timing configuration of gates used to acquire 3D data for a scene, in accordance with prior art;
  • Fig. 9 shows a time-distance graph that graphically illustrates another timing configuration of gates used to acquire 3D data for a scene, in accordance with prior art;
  • Fig. 10 shows a time-distance graph that graphically illustrates timing of gates configured as shown in Fig. 9 used to acquire 3D data for a plurality of slices of a scene, in accordance with prior art; and
  • Fig. 11 shows a time-distance graph that graphically illustrates timing of gates used to acquire 3D data for a plurality of slices of a scene, in accordance with another embodiment of the invention.
  • Fig. 1 schematically illustrates a gated 3D camera 20 being used to acquire a 3D image of a scene 30 having objects schematically represented by objects 31 and 32.
  • Camera 20, which is represented very schematically, comprises a lens system, represented by a lens 21, and a photosurface 22 having pixels 23 on which the lens system images the scene.
  • Photosurface 22 is "gateable" so that it may be selectively gated on or off to be made sensitive or insensitive respectively to light for desired periods.
  • pixels in the photosurface are independently gateable.
  • Photosurfaces that are gated by shutters are described in US Patents 6,057,909, 6,327,073, 6,331,911 and 6,794,628, the disclosures of which are incorporated herein by reference.
  • Photosurfaces having pixels that are independently gateable are described in PCT Publication WO 00/36372.
  • a “gate” refers to a period during which a photosurface or pixel in the photosurface is gated on and made sensitive to light. For convenience of presentation, it is assumed that all the pixels 23 in photosurface 22 are gated on or off simultaneously.
  • Camera 20 optionally comprises a suitable light source 26, such as for example, a laser or a LED or an array of lasers and/or LEDs, controllable to illuminate scene 30 with pulses of light.
  • a controller 24 controls pulsing of light source 26 and gating of photosurface 22.
  • controller 24 controls light source 26 to emit at least one light pulse, schematically represented by a wavy arrow 40, to illuminate scene 30.
  • Light from each at least one light pulse 40 is reflected by features in scene 30 and some of the reflected light is incident on camera 20 and collected by lens 21. Reflected light from at least one light pulse 40 that reaches camera 20 is schematically represented by wavy arrows 45.
  • controller 24 gates photosurface 22 on at a suitable time relative to a time at which the light pulse is emitted to receive and image reflected light 45 collected by lens 21 on photosurface 22. Amounts of light 45 imaged on pixels 23 of the photosurface are used to determine distances to features of scene 30 that are imaged on the pixels and provide thereby a 3D image of the scene.
  • Fig. 2 shows a time-distance graph 60 that illustrates relationships between timing of a light pulse 40 that illuminates scene 30, a gate of photosurface 22 and amounts of light 45 collected by lens 21 and registered by a pixel 23 in the photosurface that images a feature of scene 30 located at a distance D from camera 20.
  • distance D from camera 20 is indicated along an abscissa 61 and time is indicated along right and left ordinates 62 and 63 respectively.
  • the ordinates are scaled in units proportional to "ct" - time "t" multiplied by the speed of light "c".
  • camera 20 will receive light reflected from the light pulse by the feature for a period having duration Δτpw from a time at which a first photon reflected by the feature from the light pulse reaches the camera.
  • Block arrows 72 extending from time t1(Df) to time t2(Df) and having a length Δτpw shown along left and right hand time ordinates 62 and 63 schematically represent light reflected by feature 71 that reaches camera 20.
  • During times between t1(Df) and t2(Df), a pixel 23 (Fig. 1) imaging the feature will register light from the feature.
  • Q0 is an amount of light that would be registered by pixel 23 imaging feature 71, were all the light reflected from pulse 40 by the feature and collected by lens 21 registered by the pixel, irrespective of whether or not the reflected light reached the camera during short gate 80.
  • Q0 is a function of reflectance of feature 71 and its distance Df from the camera.
  • Q0 is determined by controlling light source 26 to emit at least one light pulse 40 and for each emitted light pulse 40, gating camera 20 with a long gate so that all light reflected by feature 71 from the light pulse that reaches the camera is imaged on photosurface 22.
  • Camera 20 receives and registers light reflected from light pulse 40 on its photosurface 22 during short gate 80 for any feature of scene 30 located at a distance Df between lower and upper bound distances, DSL and DSU respectively, from the camera.
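The dependence of registered light on a feature's distance can be illustrated by intersecting the reflected pulse's arrival interval with the gate. This is a simplified model with hypothetical function names, not code from the patent:

```python
C = 299_792_458.0  # speed of light, m/s

def interval_overlap(a0, a1, b0, b1):
    """Length of the intersection of intervals [a0, a1] and [b0, b1]."""
    return max(0.0, min(a1, b1) - max(a0, b0))

def registered_fraction(distance_m, pulse_width_s, gate_start_s, gate_stop_s):
    """Fraction of the reflected pulse energy that arrives while the gate is
    open. The reflected pulse reaches the camera during an interval of
    length pulse_width_s that begins at the round-trip time 2 * distance / c."""
    t1 = 2.0 * distance_m / C
    t2 = t1 + pulse_width_s
    return interval_overlap(t1, t2, gate_start_s, gate_stop_s) / pulse_width_s
```

A feature whose reflected pulse falls entirely inside the gate registers the full amount Q0; features near the slice bounds register only part of it.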
  • the upper and lower bound distances define a "slice", schematically indicated in graph 60 by a shaded rectangle 83, of scene 30.
  • a long gate corresponding to short gate 80 for determining Q0 for any feature located in slice 83 is schematically represented by a rectangle 90 along right hand time ordinate 63.
  • Light registered by pixel 23 that images feature 71 during long gate 90 is graphically represented by a shaded area 91 in gate 90.
  • long gate 90 optionally has a gate width Δτlg at least equal to a sum of short gate width Δτsg and twice the pulse width Δτpw of light pulse 40, i.e. Δτlg ≥ Δτsg + 2Δτpw.
  • Gate 90 optionally has a start time tlg1 relative to time t0 at which light pulse 40 is emitted that is equal to a time that a first photon from a feature in scene 30 having a distance DSL from camera 20 reaches the camera.
  • Gate 90 optionally has a stop time tlg2 equal to a time at which a last photon from pulse 40 reaches camera 20 from a feature of the scene having a distance DSU from camera 20.
  • While long gate 90 provides information sufficient to determine Q0 for any feature in slice 83, it is noted that neither a quantity Q of light 81 registered on pixel 23 that images feature 71 during short gate 80 nor a quantity Q0 of light 91 registered by the pixel during long gate 90 provides information as to which of equations (3) or (4) should be used to determine Df.
  • Df and Df* satisfy a relationship:
  • Fig. 3 schematically shows distance Df and complementary distance Df* for feature 71 in a time-distance graph 75.
  • Amounts of light registered by pixel 23 for distance Df are schematically shown by shaded areas 81 and 91 for short and long gates 80 and 90 respectively, both of which gates are schematically shown along left time ordinate 62.
  • Position of feature 71 for distance Df* is indicated along time-distance line 70 by a circle 71*.
  • Amounts of light registered by pixel 23 for Df* are schematically represented by shaded areas 81* and 91* for gates 80 and 90, which for distance Df* are shown along right hand time ordinate 63.
  • a block arrow labeled 72* schematically represents light from light pulse 40 reflected from feature 71 that reaches camera 20 were the feature to be located at Df*. It is noted that whereas light is registered for distances Df and Df* by pixel 23 at different times during gates 80 and 90, the amounts of light registered during gate 80 are the same in each case, as are the amounts registered during gate 90.
  • the ambiguity of whether to use equation (3) or equation (4) in determining Df is removed by gating photosurface 22 on following each of at least one light pulse 40 for two short gates, a first, "front” short gate and a second, “back” short gate, optionally having equal gate widths.
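One plausible reading of how the two short gates remove the ambiguity can be sketched as follows. This is an illustrative decision rule, not the patent's equations (13) and (14):

```python
def slice_half(q_front: float, q_back: float) -> str:
    """Decide which half of the slice a feature occupies from the amounts of
    light registered during the front and back short gates: a nearer
    feature's reflected pulse arrives earlier and so overlaps the front
    gate more; a farther feature's pulse overlaps the back gate more."""
    if q_front > q_back:
        return "near half"
    if q_back > q_front:
        return "far half"
    return "slice midpoint"

print(slice_half(0.8, 0.2))
```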
  • photosurface 22 is gated on for a front short gate having a gate width Δτfg at a front gate start time, tFg, following a time at which the light pulse is emitted.
  • a shaded area 91 represents light registered by pixel 23 during long gate 90 for distance Df.
  • Shaded areas 103 and 104 in front and back short gates 101 and 102 respectively graphically represent amounts of light registered by pixel 23 for feature 71, located at Df, during the short gates.
  • the same front and back short gates 101 and 102 and long gate 90 are shown along right hand time ordinate 63.
  • Shaded areas 105 and 106 in front and back short gates 101 and 102 shown along right hand time ordinate 63 represent amounts of light registered by pixel 23 during the short gates respectively, assuming that feature 71 were to be located at Df*.
  • a shaded area 91 represents light registered by pixel 23 during long gate 90 for Df*.
  • At least one light pulse 40 comprises a plurality of light pulses and each quantity of registered charge QF, QB and Q0 is determined from a train of light pulses 40.
  • controller 24 optionally controls light source 26 to emit a train of light pulses 40.
  • controller 24 gates photosurface 22 on for a short gate 101. A total amount of light registered by each pixel 23 for all short gates 101 is used to provide QF for the pixel.
  • To determine QB, another train of light pulses 40 is emitted and, following a delay of tBg (equation (11)) after an emission time of each light pulse, photosurface 22 is gated on for a back gate 102.
  • a total amount of light registered by each pixel for all the light pulses in the pulse train is used to determine QB.
  • Q0 is similarly determined from a pulse train of light pulses 40 and a long gate 90 for each light pulse in the light pulse train, which gate 90 follows the light pulse by a delay of tlg1 (equation (8)).
  • a "slice acquisition time", "Ts”, be a time during which a scene, such as scene 30, is illuminated by light source 26 with light pulses 40 and camera 20 gated to acquire values for Qp, QB and Q 0 for pixels 23 in photosurface 22 to provide distances to features in a slice of the scene imaged on the pixels.
  • slices imaged by a gated camera such as camera 20 are relatively narrow, and a scene for which distances are to be determined using the camera typically has features located in a range of distances substantially greater than a range defined by lower and upper bound distances DSL and DSU (Figs. 2-4) of the slice.
  • a plurality of substantially contiguous slices of the scene are imaged using camera 20.
  • the scene is illuminated by at least one light pulse 40 and camera 20 is gated on for front, back and long gates to acquire distances to features in the slice.
  • Let TT represent a total 3D scene acquisition time.
  • Fig. 5 schematically shows scene 30 shown in Fig. 1 and a plurality "N" of slices S1, S2 ... SN that are used by camera 20 to provide a 3D image of the scene, which extends from a range R1 to a range R2.
  • the slices shown in Fig. 5 are represented in graph 95 by shaded rectangles labeled S1, S2 ... SN along time-distance line 70.
  • slices represented by Sn having a larger subscript n are farther from camera 20 than slices represented by Sn having a smaller index, and slices whose indices differ by 1 are contiguous.
  • the short front and back gates and the long gate that define a given slice Sn are graphically represented by rectangles labeled FGn, BGn and LGn respectively. Gates for adjacent slices Sn are shown on opposite sides of time ordinate 62.
  • a first slice S1 in range R1-R2 defined by gates FG1, BG1 and LG1 is slice 83 shown in Fig. 4, and shaded regions 103, 104 and 91 (Fig. 4) graphically representing light registered by a pixel 23 for feature 71 are shown for gates FG1, BG1 and LG1.
  • a total 3D acquisition time TT (equation (16)) required to acquire a 3D image of a scene by 3D imaging contiguous slices as described above can be too long to provide satisfactory 3D imaging of the scene.
  • an acquisition time TT may be too long for scenes having moving features that displace by distances on the order of a slice width during time TT.
  • a total time TT for acquiring a 3D image of a scene can be reduced by modifying timing of gates used to acquire values for QF, QB and Q0 for slices of the scene, optionally, without changing the lengths of the gates.
  • a time delay equal to 2Δτpw temporally separates start times of front gates for contiguous slices, and long gates for the slices overlap (relative to t0) as shown in Fig. 4.
  • a 3D image of a scene can be acquired, in accordance with an embodiment of the invention, in a reduced total acquisition time TT if long gates of adjacent slices are timed so that they do not overlap and, when one long gate of adjacent slices ends, the other begins.
  • For the interstitial slices, the front and back short gates do not provide 3D information. In accordance with an embodiment of the invention, the non-overlapping long gates provide additional information for determining distance to features located in the interstitial slices. For convenience of presentation, slices of a scene for which the short front and back gates provide information are referred to as "regular slices".
  • each long gate and its associated front and back short gates provide 3D data for a regular slice and in addition 3D data for an interstitial slice. If the front, back and non-overlapping long gates have same gate widths "SW" as corresponding gates having overlapping long gates shown in Fig. 6 that define a slice of a scene, each set of gates that define a regular slice of the scene provides 3D data for features in a larger range of distances than a corresponding set of gates in Fig. 6. Therefore, a smaller number of sets of gates having non-overlapping long gates is required to acquire a 3D image of the scene.
  • a slice acquisition time for a regular slice is about the same as a slice acquisition time for a slice defined using overlapping gates.
  • a total 3D acquisition time TT for a scene using non-overlapping long gates in accordance with an embodiment of the invention is therefore generally shorter than a total acquisition time for the scene using overlapping long gates.
  • Fig. 7 schematically shows a timing and distance graph 115 that graphically illustrates timing of short front and back, and non-overlapping long gates, in accordance with an embodiment of the invention.
  • Regular slices of scene 30 that are imaged by camera 20 are graphically represented by shaded rectangles labeled RS1, RS2 ... RSN along time-distance line 70.
  • corresponding front, back and long gates of camera 20 that define the slice are graphically represented by rectangles labeled RFGn, RBGn and RLGn respectively.
  • the gates are graphically shown along left hand time ordinate 62, with gates for adjacent regular slices shown on opposite sides of the time ordinate. Relative to a start time for any given long gate RLGn, the corresponding short front and back gates RFGn, RBGn have a same timing as front and back gates FGn and BGn have relative to a start time of long gate LGn in Fig. 6.
  • First regular slice RS1 in graph 115 is, by way of example, identical to slice S1 shown in graph 95 of Fig. 6, and relative to an emission time t0 of a first light pulse 40, the start, stop and gate lengths for gates RFG1, RBG1 and RLG1 are the same respectively as for gates FG1, BG1 and LG1 in Fig. 6.
  • Distance to a feature located in a regular slice RSn is determined from amounts of light QF, QB and Q0 registered by a pixel 23 that images the feature during front, back and long gates RFGn, RBGn and RLGn that define the slice, using equations (13) and (14). Amounts of light registered by a pixel that images feature 71 during gates RFG1, RBG1 and RLG1 are graphically represented by shaded areas 103, 104 and 91 respectively in the gates.
  • a long gate RLGn begins when a preceding long gate RLGn-1 ends.
  • regular slices RS n are not contiguous but are separated by interstitial slices.
  • Interstitial slices are graphically represented in Fig. 7 by shaded rectangles labeled ISn,n+1, where the subscripts n, n+1 indicate the regular slices RSn and RSn+1 that bracket the interstitial slice.
  • For a feature located in an interstitial slice, a pixel 23 that images the feature does not register light reflected by the feature from a light pulse 40 during the front and back short gates of the regular slices that bracket the interstitial slice.
  • 3D information for the feature is not acquired by camera 20 during the front and back gates and distance to the feature cannot be determined using conventional equations such as equations (13) and (14).
  • While the pixel does not register any light reflected from a light pulse 40 by the feature during the short gates, the pixel does register light reflected from a light pulse 40 during long gates of the regular slices that bracket the interstitial slice.
  • a feature of scene 30 located in interstitial slice IS1,2 at a distance Df(121) is schematically represented by a circle labeled 121 along time-distance line 70.
  • Reflected light registered by a pixel 23 that images the feature during long gates RLGj and RLG2 is graphically indicated by shaded regions 122 and 124 respectively.
  • light registered by the long gates is used to determine distance to the feature.
  • For a camera such as camera 20, used to determine distances to features of a scene and gated in accordance with an embodiment of the invention, let amounts of light registered by a pixel that images a feature in the scene during gates RFGn, RBGn be represented by QF(n) and QB(n) respectively. Let times at which the gates RFGn and RBGn are turned on be represented by tFg(n) and tBg(n) respectively. Then distance Df to the feature may be determined, in accordance with an embodiment of the invention, using the following general set of conditions and equations: If QF(n) ≠ 0 or QB(n) ≠ 0 (20)
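The branching between regular and interstitial slices described above can be sketched as follows. The helper name is hypothetical and the patent's actual equations (20) onward are not reproduced here; only the stated decision structure is:

```python
def distance_data_source(q_front: float, q_back: float,
                         q_long_n: float, q_long_next: float) -> str:
    """Which registered quantities determine distance to an imaged feature:
    short-gate light for a feature in a regular slice; light from the two
    bracketing non-overlapping long gates for an interstitial slice."""
    if q_front != 0 or q_back != 0:          # the short gates saw the pulse
        return "regular slice: use QF(n), QB(n)"
    if q_long_n != 0 and q_long_next != 0:   # only the long gates saw it
        return "interstitial slice: use the two bracketing long gates"
    return "no reflected light registered"

print(distance_data_source(0.0, 0.0, 0.4, 0.6))
```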
  • adjacent long gates are timed, relative to time t 0 , to "overlap" slightly to remove the ambiguity.
  • a long gate may have a start time earlier than the stop time of an immediately previous long gate by a period equal to or less than about a tenth of a pulse width.
  • To provide the overlap, the long gates are lengthened by the period of the overlap.
  • camera 20 is gated on for N sets of gates RFGn, RBGn and RLGn where N = (R2 - R1)/[(3/2)SW]. (27)
  • a slice acquisition time Ts for a regular slice, in accordance with an embodiment of the invention, is the same as a slice acquisition time for a slice acquired in accordance with an embodiment of the invention illustrated in Fig. 6.
  • a total time, TT, to acquire data for a 3D image of scene 30 in accordance with an embodiment of the invention is therefore 2/3 the time required to 3D image the scene in accordance with the gating shown in Fig. 6.
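The 2/3 reduction follows directly from equation (27): each set of non-overlapping gates covers 3/2 of a slice width, so fewer sets are needed. A sketch in illustrative units (names are not from the patent):

```python
import math

def total_acquisition_time(range_span, slice_width, slice_time,
                           overlapping_long_gates=True):
    """Total 3D acquisition time = number of gate sets * per-set time.

    With overlapping long gates each set covers one slice width SW; with
    non-overlapping long gates each set also yields an interstitial slice,
    covering (3/2) * SW per equation (27)."""
    covered = slice_width if overlapping_long_gates else 1.5 * slice_width
    return math.ceil(range_span / covered) * slice_time

t_overlap = total_acquisition_time(30.0, 1.0, 1.0, True)   # 30 gate sets
t_non = total_acquisition_time(30.0, 1.0, 1.0, False)      # 20 gate sets
print(t_non / t_overlap)
```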
  • distances to features in a slice of a scene, such as scene 30, are determined by a short, "extended", front gate having a gate width equal to the pulse width Δτpw of light pulses 40 (Fig. 1) that illuminate the scene.
  • the extended front gate and corresponding long gate begin at a same time relative to an emission time t0 of respective light pulses 40 that illuminate the scene and provide light that is reflected from the scene and imaged by camera 20 during the gates.
  • Fig. 8 shows a time-distance graph 200 that illustrates timing of an extended front gate 202 and its corresponding long gate 204 relative to an emission time t0 of a light pulse 40 that illuminates scene 30.
  • Front gate 202 is shown along left hand time ordinate 62 and long gate 204 is shown along right hand time ordinate 63.
  • Gates 202 and 204 begin at a same time relative to time t0.
  • long gate 90 shown in Figs. 2-4 optionally has a gate width equal to 3Δτpw.
  • long gate 204 used with extended front gate 202 optionally has a shorter gate width 2Δτpw.
  • Extended front gate 202 and its associated long gate 204 define a slice of scene 30 having a spatial width cΔτpw/2.
  • the slice defined by the gates is graphically represented by a shaded rectangle 206 along time-distance line 70 in Fig. 8.
  • Let a time at which extended front gate 202 and its corresponding long gate begin following a light pulse emission time t0 be represented by tsg. Then light acquired by pixels 23 in photosurface 22 during the gates may be used to determine distances Df to features in a slice of the scene having lower and upper distance bounds DSL and DSU, where DSL, DSU and Df satisfy a relationship,
  • Light registered by a pixel 23 that images a feature 71 of scene 30 located in slice 206 during extended front gate 202 and long gate 204 is graphically represented by shaded areas 208 and 210 respectively in the gates. From Fig. 8 it may be seen that for any feature located in slice 206, an amount of light 210 registered by pixel 23 is substantially all the light reflected by feature 71 that reaches camera 20. However, for such a feature, an amount of light 208 registered by the pixel during extended front gate 202 is dependent on the location of the feature in slice 206. Let the amounts of registered light 208 and 210 be represented in symbols by "Q" and "Q0" respectively.
  • since there are relatively few features for which Q/Q0 = 1, ignoring such features does not substantially adversely affect providing a complete 3D image of the slice.
  • Distances to features in slice 206 may also be determined in accordance with prior art using an extended back gate instead of an extended front gate.
  • the extended back gate has a gate width equal to that of an extended front gate but ends at a same time relative to a light pulse emission time t0 as a corresponding long gate, instead of beginning at a same time as the long gate.
  • An extended back gate may be considered a mirror image of an extended front gate.
  • Fig. 9 shows a time-distance graph 220 that illustrates timing of an extended back gate 222 and its corresponding long gate 224 relative to an emission time t0 of a light pulse 40 that illuminates scene 30.
  • Extended back gate 222 is shown along left hand time ordinate 62 and long gate 224 is shown along right hand time ordinate 63.
  • Gates 222 and 224 end at a same time teg relative to time t0.
  • Light reflected from feature 71 registered by pixel 23 that images the feature during extended back and long gates 222 and 224 is represented by shaded areas 226 and 228 respectively.
  • Distance Df to a feature in slice 206 is given by
  • Df = c·teg/2 + (cΔτpw)(Q/Q0 − 2)/2, (33) where "Q" and "Q0" represent amounts of registered light 226 and 228 respectively.
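Equation (33) can be exercised numerically. The values below are illustrative only, chosen to make the arithmetic easy to follow; only the formula itself comes from the text.

```python
C = 3e8  # speed of light in m/s (approximate)

def df_extended_back(t_eg, pulse_width, q, q0):
    """Distance to a feature in the slice from equation (33):
    Df = c*teg/2 + (c*pulse_width)*(Q/Q0 - 2)/2,
    where Q is light registered during the extended back gate and
    Q0 is light registered during the long gate."""
    return C * t_eg / 2 + (C * pulse_width) * (q / q0 - 2) / 2
```

With teg = 100 ns, a 10 ns pulse width and Q = Q0, the formula gives Df = 15 m − 1.5 m = 13.5 m; larger ratios Q/Q0 move the feature toward the far edge of the slice.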
  • configurations of an extended front gate 202 (Fig. 8) or an extended back gate 222 (Fig. 9) together with a long gate, 204 and 224 respectively, may be repeated to acquire distances to features in a plurality of optionally contiguous slices of scene 30.
  • if a range over which distances to features in the scene extend runs from a distance R1 to a distance R2, and a slice has a slice width SW, a plurality of N slices, where N = (R2 − R1)/SW, are needed to acquire a 3D image of the scene.
  • a "gate acquisition time" ⁇ Tg be required for a pixel 23 to register an amount of light for an extended front gate, extended back gate or a long gate suitable for use in determining a distance Df to a feature of a scene imaged on the pixel.
  • a 3D acquisition time for a slice of the scene may then be written
  • Fig. 10 shows a time-distance graph 240 that illustrates temporal relationships of a plurality of extended back gates and associated long gates that are used to acquire 3D images of slices of a scene, such as scene 30.
  • a first extended back gate 242 and its associated long gate 244 are shown along left time ordinate 62 and define a slice 246. Amounts of light reflected from a light pulse 40 (Fig. 1) from a feature 248 in slice 246 that are registered by a pixel imaging the feature during extended back gate 242 and long gate 244 are shown as shaded areas 249 and 250 respectively in the gates.
  • a second extended back gate 252 and its associated long gate 254 are shown along right time ordinate 63 and define a slice 256.
  • slice 246 in scene 30 is assumed contiguous with slice 256 in the scene and as a result the slices touch at a corner in Fig. 10.
  • Amounts of light reflected from a light pulse 40 (Fig. 1) from a feature 258 in slice 256 that are registered by a pixel 23 imaging the feature during extended back gate 252 and long gate 254 are shown as shaded areas 259 and 260 respectively in the gates.
  • the inventors have realized that if only long gates are used to acquire a plurality of slices of a scene, a total acquisition time, TF, to acquire data for a 3D image of scene 30 may be reduced relative to a total acquisition time required using an extended front or back gate and an associated long gate.
  • Fig. 11 shows a time-distance graph 280 that illustrates temporal relationships of a plurality of long gates used to acquire a 3D image of a scene, in accordance with an embodiment of the invention.
  • a sequence of long gates LG1, LG2, LG3 ... used to acquire a 3D image of scene 30 is shown along left time ordinate 62.
  • gates, hereinafter “odd gates”, labeled with an odd subscript and gates, hereinafter “even gates”, labeled with an even number subscript are shown on opposite sides of left time ordinate 62.
  • Gates LG1 and LG2 are also repeated along right time ordinate 63. Each pair of sequential odd and even gates defines a spatial slice of scene 30.
  • gates LG1 and LG2 define a slice located along time-distance line 70 labeled SL1,2 in Fig. 11.
  • Light registered by pixels 23 of photosurface 22 during gates LG1 and LG2 may be used in accordance with an embodiment of the invention to determine distance to all features located in slice SL1,2.
  • gate pair (LG3, LG4) defines slice SL3,4 shown in Fig. 11. In an embodiment of the invention, even and odd gates have a same gate width,
  • ΔτLGW = 2Δτpw, (37) and each even and odd gate begins at a time (relative to t0) that is half a gate width, i.e. Δτpw, later than a time at which a preceding odd and even gate respectively begins.
  • distance Df to the feature imaged by the pixel is determined in accordance with the following equations,
  • a total acquisition time for the scene in accordance with an embodiment of the invention is therefore one half the prior art acquisition time given by equation (36).
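The long-gates-only timing of equation (37) can be sketched as a schedule: every gate has width 2Δτpw and opens one pulse width after its predecessor, so each sequential pair of gates defines one slice while each gate is shared between two slices, halving the total acquisition time. The function below is an illustrative sketch, not part of the patent.

```python
def staggered_long_gates(n_gates, pulse_width, t_first):
    """Start/stop times (relative to t0, arbitrary time units) for long
    gates LG1, LG2, ... per equation (37): gate width is 2*pulse_width
    and each gate begins half a gate width (one pulse width) after the
    preceding gate. Sequential pairs (LG1, LG2), (LG2, LG3), ... overlap,
    so each gate contributes to two slices."""
    width = 2 * pulse_width
    return [(t_first + k * pulse_width, t_first + k * pulse_width + width)
            for k in range(n_gates)]
```

With a 10 ns pulse width, four gates cover the same range that would otherwise need roughly twice as many extended-gate/long-gate acquisitions.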

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
PCT/IL2007/001571 2007-12-19 2007-12-19 3d camera and methods of gating thereof WO2009078002A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP07849597A EP2235563A1 (en) 2007-12-19 2007-12-19 3d camera and methods of gating thereof
PCT/IL2007/001571 WO2009078002A1 (en) 2007-12-19 2007-12-19 3d camera and methods of gating thereof
CN2007801023367A CN102099703A (zh) 2007-12-19 2007-12-19 3d照相机及其选通方法

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/IL2007/001571 WO2009078002A1 (en) 2007-12-19 2007-12-19 3d camera and methods of gating thereof

Publications (1)

Publication Number Publication Date
WO2009078002A1 true WO2009078002A1 (en) 2009-06-25

Family

ID=39797928

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2007/001571 WO2009078002A1 (en) 2007-12-19 2007-12-19 3d camera and methods of gating thereof

Country Status (3)

Country Link
EP (1) EP2235563A1 (zh)
CN (1) CN102099703A (zh)
WO (1) WO2009078002A1 (zh)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8264581B2 (en) 2008-07-17 2012-09-11 Microsoft International Holdings B.V. CMOS photogate 3D camera system having improved charge sensing cell and pixel geometry
US8681321B2 (en) 2009-01-04 2014-03-25 Microsoft International Holdings B.V. Gated 3D camera
US8890952B2 (en) 2008-07-29 2014-11-18 Microsoft Corporation Imaging system
US20150109414A1 (en) * 2013-10-17 2015-04-23 Amit Adam Probabilistic time of flight imaging

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9826214B2 (en) * 2014-09-08 2017-11-21 Microsoft Technology Licensing, Llc. Variable resolution pixel
US9874630B2 (en) * 2015-01-30 2018-01-23 Microsoft Technology Licensing, Llc Extended range gated time of flight camera
US10708577B2 (en) * 2015-12-16 2020-07-07 Facebook Technologies, Llc Range-gated depth camera assembly

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003054579A1 (de) * 2001-12-06 2003-07-03 Astrium Gmbh Verfahren und vorrichtung zum erzeugen von 3d-enfernungsbildern
US20070091175A1 (en) * 1998-09-28 2007-04-26 3Dv Systems Ltd. 3D Vision On A Chip

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11508359A (ja) * 1995-06-22 1999-07-21 3ディブイ・システムズ・リミテッド 改善された光学測距カメラ
US7224384B1 (en) * 1999-09-08 2007-05-29 3Dv Systems Ltd. 3D imaging system
US7236235B2 (en) * 2004-07-06 2007-06-26 Dimsdale Engineering, Llc System and method for determining range in 3D imaging systems
EP1659418A1 (en) * 2004-11-23 2006-05-24 IEE INTERNATIONAL ELECTRONICS & ENGINEERING S.A. Method for error compensation in a 3D camera
CN100337122C (zh) * 2005-03-25 2007-09-12 浙江大学 无扫描器脉冲调制式三维成像方法及系统
CN100462737C (zh) * 2006-06-29 2009-02-18 哈尔滨工业大学 距离选通式激光3d成像雷达系统

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070091175A1 (en) * 1998-09-28 2007-04-26 3Dv Systems Ltd. 3D Vision On A Chip
WO2003054579A1 (de) * 2001-12-06 2003-07-03 Astrium Gmbh Verfahren und vorrichtung zum erzeugen von 3d-enfernungsbildern

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8264581B2 (en) 2008-07-17 2012-09-11 Microsoft International Holdings B.V. CMOS photogate 3D camera system having improved charge sensing cell and pixel geometry
US8890952B2 (en) 2008-07-29 2014-11-18 Microsoft Corporation Imaging system
US8681321B2 (en) 2009-01-04 2014-03-25 Microsoft International Holdings B.V. Gated 3D camera
US9641825B2 (en) 2009-01-04 2017-05-02 Microsoft International Holdings B.V. Gated 3D camera
US20150109414A1 (en) * 2013-10-17 2015-04-23 Amit Adam Probabilistic time of flight imaging
KR20160071390A (ko) * 2013-10-17 2016-06-21 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 확률적 tof 이미징
CN105723238A (zh) * 2013-10-17 2016-06-29 微软技术许可有限责任公司 概率飞行时间成像
US10063844B2 (en) * 2013-10-17 2018-08-28 Microsoft Technology Licensing, Llc. Determining distances by probabilistic time of flight imaging
KR102233419B1 (ko) 2013-10-17 2021-03-26 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 확률적 tof 이미징

Also Published As

Publication number Publication date
CN102099703A (zh) 2011-06-15
EP2235563A1 (en) 2010-10-06

Similar Documents

Publication Publication Date Title
WO2009078002A1 (en) 3d camera and methods of gating thereof
US20220128663A1 (en) Methods and apparatus for array based lidar systems with reduced interference
JP6910010B2 (ja) 距離測定装置
CN102147553B (zh) 快速选通光敏面的方法、用于确定到场景中的特征的距离的方法及照相机
KR102656399B1 (ko) 구조화된 광 조명기가 있는 비행-시간 센서
KR101992511B1 (ko) 3차원 줌 이미저
JP5647118B2 (ja) 撮像システム
KR102233419B1 (ko) 확률적 tof 이미징
EP2311251B1 (en) Rolling shutter camera system and method
US8681321B2 (en) Gated 3D camera
KR102559910B1 (ko) 차량 주변 환경을 특성화하기 위한 시스템
JP2021532648A (ja) ハイブリッド飛行時間型イメージャモジュール
CN101446641B (zh) 距离测量系统和距离测量方法
JP6526178B2 (ja) 視野を監視するための撮像システムおよび視野を監視するための方法
JP2022551427A (ja) シーンまでの距離を決定するための方法および装置
WO2021065138A1 (ja) 測距装置および制御方法
CN112470035A (zh) 距离信息取得装置、距离信息取得方法及程序
CN104049258B (zh) 一种空间目标立体成像装置及方法
CN117940795A (zh) 用于操作选通摄像头的方法、用于执行这种方法的控制装置、具有这种控制装置的选通摄像头和具有这种选通摄像头的机动车
KR20220155362A (ko) 이미지 데이터 획득을 위한 장치 및 방법

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200780102336.7

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07849597

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 5196/DELNP/2010

Country of ref document: IN

Ref document number: 2007849597

Country of ref document: EP