WO2019182784A1 - Programmable light curtains - Google Patents

Programmable light curtains Download PDF

Info

Publication number
WO2019182784A1
WO2019182784A1 (PCT/US2019/021569)
Authority
WO
WIPO (PCT)
Prior art keywords
light
plane
line
sensor
sensing
Prior art date
Application number
PCT/US2019/021569
Other languages
French (fr)
Inventor
Srinivasa Narasimhan
Jian Wang
Aswin C. Sankaranarayanan
Joseph BARTELS
William Whittaker
Original Assignee
Carnegie Mellon University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Carnegie Mellon University filed Critical Carnegie Mellon University
Priority to US16/470,885 priority Critical patent/US11493634B2/en
Priority to CA3094199A priority patent/CA3094199A1/en
Priority to EP19772623.5A priority patent/EP3769037A4/en
Priority to JP2020550649A priority patent/JP7570099B2/en
Priority to US17/601,780 priority patent/US11972586B2/en
Publication of WO2019182784A1 publication Critical patent/WO2019182784A1/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/04Systems determining the presence of a target
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • G01S17/32Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4817Constructional features, e.g. arrangements of optical elements relating to scanning

Definitions

  • 3D sensors play an important role in the deployment of many autonomous systems, including field robots and self-driving cars.
  • a self-guided vehicle on a road or a robot in the field does not need a full-blown 3D depth sensor to detect potential collisions or monitor its blind spot.
  • the vehicle is able to detect if any object comes within a pre-defined perimeter of the vehicle to allow for collision avoidance. This is a much easier task than full depth scanning and object identification.
  • Various embodiments are generally directed to a device that monitors the presence of objects passing through or impinging on a virtual shell near the device, which is referred to herein as a "light curtain".
  • Light curtains offer a lightweight, resource-efficient and programmable approach to proximity awareness for obstacle avoidance and navigation. They also have additional benefits in terms of improving visibility in fog as well as flexibility in handling light fall-off.
  • the light curtains are created by rapidly rotating a line sensor and a line laser in synchrony.
  • the embodiment is capable of generating light curtains of various shapes with a range of 20-30m in sunlight (40m under cloudy skies and 50m indoors) and adapts dynamically to the demands of the task.
  • light curtains may be implemented by triangulating an illumination plane, created by fanning out a laser, with a sensing plane of a line sensor. In the absence of ambient illumination, the sensor senses light only from the intersection between these two planes, which, in the physical world, is a line. The light curtain is then created by sweeping the illumination and sensing planes in synchrony.
  • the light curtains may have a programmable shape to allow for the detection of objects along a particular perimeter, such as for detecting a vehicle impinging on a lane for a self-driving vehicle.
  • the capability of the light curtain can be enhanced by using correlation-based time-of-flight (ToF) sensors.
  • ToF time-of-flight
  • the light curtain is capable of detecting the presence of objects that intersect a virtual shell around the system. By detecting only the objects that intersect with the virtual shell, many tasks pertaining to collision avoidance and situational awareness can be solved with little or no computational overhead.
  • Light curtains provide a novel approach for proximity sensing and collision avoidance that has immense benefits for autonomous devices.
  • the shape of the light curtain can be changed on the fly and can be used to provide better detections, especially under strong ambient light (like sunlight) as well as global illumination (like fog).
  • FIG. 1 is a schematic view of the components of the light curtain device.
  • FIG. 2 is a diagram showing the intersecting planes of the light projector and the sensor, and the formation of the light curtain by synchronously scanning the projector and sensor so as to move the intersecting line of the planes in the desired shape.
  • FIG. 3 is a top view of the diagram of FIG. 2.
  • FIG. 4 is a diagram showing (a) the viewing and illumination geometry of a light curtain generated by rotating the laser light plane and sensor plane about parallel axes; (b) a view of the coordinate frame showing various parameters; and (c) a top view of the coordinate system.
  • FIG. 5 is a diagram showing the thickness of the light curtain.
  • FIG. 6 shows two applications of the light curtain.
  • FIG. 6(a) shows a light curtain being used for a tilted plane, a proximity arc, and a path check.
  • FIG. 6(b) shows the application of a light curtain in a self-driving vehicle sensing for upcoming traffic, adjacent traffic, and an adjacent-lane check.
  • FIG. 7(a-d) are illustrations of light curtains resulting when the sensor and laser rotate about an axis and about a point.
  • the device consists of a line scan laser (Illumination module 100) and a line scan sensor (Sensor Module 120) as shown in a top view in FIG. 1.
  • illumination module 100 uses a laser diode 102 as a light source.
  • the laser diode 102 may be, for example, a 638 nm laser diode with a peak power of 700 mW.
  • the light emitted from laser diode 102 is collimated using collimation lens 104.
  • the light is stretched into a line with line lens 106, which may, in some embodiments, be a 45° Powell lens.
  • a steerable galvo mirror 108 is used.
  • the galvo mirror has dimensions of 11 mm x 7 mm, has a 22.5° mechanical angle, and can give the sensor and laser a 45° field of view.
  • the galvo mirror 108 takes 500 µs to rotate through a 0.2° optical angle.
  • a micro-controller is used to synchronize the sensor, the laser and the galvo mirrors 108, 126.
  • the galvo mirror 108 used for the illumination module 100 and the galvo mirror 126 used for sensor module 120 will be identical.
  • a mechanical motor may be used to steer the light beam and sensor.
  • a 2D sensor with a rolling shutter or a region-of-interest mask may be used to effectively emulate a faster line sensor.
  • Sensor module 120 comprises a line sensor 122, lens 124 and steerable galvo mirror 126.
  • line sensor 122 is a line scan intensity sensor.
  • the line scan intensity sensor is fitted with a 6 mm f/2 S-mount lens having a diagonal field-of-view of 45° and an image circle 7 mm in diameter.
  • the line sensor may have 2048x2 total pixels, with a pixel size of approximately 7 µm x 7 µm. In preferred embodiments, only the central 1000 pixels of the sensor are used due to the limited circle of illumination of the lens.
  • the line scan sensor may be capable of scanning 95,000 lines per second and may be fitted with an optical bandpass filter having a 630 nm center wavelength and a 50 nm bandwidth to suppress ambient light.
  • the rotation axes are aligned to be parallel and fixed with a baseline of 300 mm.
  • the resulting field-of-view of the system is approximately 45° by 45°.
  • the Powell lens 106 fans the laser beam into a planar sheet of light and the line sensor 122 senses light from a single plane.
  • the two planes intersect at a line in 3D, as shown in FIG. 2, and, in the absence of ambient and indirect illuminations, the line sensor 122 measures light scattered by any object on the line.
  • the intersecting line can be swept to form any ruled surface.
  • This ruled surface, on which presence of objects can be detected, is the light curtain.
  • the resulting device is programmable, in terms of its light curtain shape, and flexible, in terms of being able to vary laser power and sensor exposure time to suit the demands of an application.
  • FIG. 4(a) shows the viewing and illumination geometry of a light curtain generated by rotating the laser light plane and sensor plane about parallel axes r.
  • the intersection line is also parallel to the two rotation axes, as shown in FIG. 4(b).
  • the coordinate frame of FIG. 4(b) and top view of FIG. 4(c) show various parameters of interest. Note that changing θc and θp synchronously generates light curtains with different shapes.
  • FIG. 4(c) and FIG. 5 show that the finite size of the sensor pixels and the finite thickness of the laser sheet lead to a thick light curtain upon triangulation.
  • when operated in strong ambient light, for example sunlight, the sensor 122 also measures the contribution from the ambient light illuminating the entire scene.
  • galvo mirrors 108, 126 may take time to stabilize after rotation.
  • the stabilization time may be as much as 500 µs before the mirrors are stable enough to capture a line. This limits the overall frame rate of the device. Adding two 100 µs exposures, laser on and laser off, to filter out ambient light allows a display of 1400 lines per second. If the light curtains are designed to contain 200 lines, the entire light curtain can be refreshed at a rate of 5.6 fps. Galvo mirrors that stabilize in a time shorter than 500 µs would allow curtain refresh rates to reach 20 to 30 fps.
  • the light curtain device can also be configured with line sensor 122 and laser 102 rotating over non-parallel axes or with each of them enjoying full rotational degrees of freedom. These configurations have their own unique advantages. When the devices have full rotational freedom, i.e., are capable of rotating around a point with no restrictions, then any ruled surface (including, for example, a Möbius strip) can be generated as a light curtain. Full rotational freedom, however, is hard to implement since multi-axis galvos or gimbals are needed and are often cost-prohibitive.
  • FIG. 7(a,b,c) are illustrations of light curtains resulting when each of the sensor 122 and laser 102 rotates about an axis.
  • FIG. 7(d) is an illustration of a light curtain resulting when the sensor 122 and laser 102 rotate about a point.
  • the lines in the light curtain should also be parallel to lc or lp; thus r(t) is constant and in the direction of lc.
  • the lines in light curtain should also go through A, thus:
  • r(t) can be derived as follows.
  • FIG. 7(d) shows a Möbius strip as an example. The proof is straightforward: any line in the ruled surface and the rotation center form a plane, which determines the rotation of line sensor 122 or line laser 102.
  • Optimizing Light Curtains - Parameters of interest in practical light curtains can be quantified, for example, their thickness and SNR of measured detections, and approaches to optimize them are presented herein.
  • Of particular interest is the minimizing of the thickness of the curtain as well as optimizing exposure time and laser power for improved detection accuracy when the curtain spans a large range of depths.
  • Thickness of light curtain - The light curtain produced by the device described herein has a finite thickness due to the finite size of the sensor pixels and the finite thickness of the laser illumination.
  • the laser spot has a thickness of ΔL meters and each pixel has an angular extent of δc radians.
  • the thickness of the light curtain is given as the area of the parallelogram shaded in FIGS. 4(c) and 5, where rc and rp are the distances between the intersected point and the sensor and laser, respectively.
  • thickness by triangulation can be formalized as follows: where rc(t) and rp(t) are the distances from r(t) to the sensor rotation center [C, 0, 0] and the laser rotation center [P, 0, 0], respectively, z(t) is the depth of r(t), and the rotation centers of the sensor and laser can be positioned within a given range. For simplicity, only a cross-section of the light curtain in the xz-plane is considered.
  • a key advantage of the light curtain device is that the power of the laser or the exposure time can be adapted for each intersecting line to compensate for light fall-off, which is inversely proportional to the square of the depth.
  • points close to the sensor get saturated easily.
  • system of the present invention has an additional degree of freedom wherein the power of the laser and/or the exposure time of the sensor can be adjusted according to depth such that light fall-off is compensated to the extent possible under the device constraints and with respect to eye safety.
  • the laser can send small amounts of light to just overcome the readout noise of the sensor or the photon noise of ambient light, and only a 1-bit sensor is required.
  • CW-ToF sensors measure phase to obtain depth.
  • a CW-ToF sensor works by illuminating the scene with an amplitude-modulated wave and measuring the phase difference between the emitted and received signals.
  • the phase difference φ and the depth d of the scene point are related as φ = mod(4π fm d / c, 2π), where fm is the modulation frequency and c is the speed of light.
  • the depth resolution of a TOF sensor is constant and independent of depth. Further, the depth resolution increases with the frequency of the amplitude wave.
  • TOF-based depth recovery has a phase wrapping problem due to the presence of the mod( ) operator, which implies that the depth estimate has an ambiguity problem and this problem gets worse at higher frequencies.
  • traditional triangulation- based depth estimation has no ambiguity problem, but at the cost of quadratic depth uncertainty.
  • Triangulation and phase are fused by measuring the phase (as with regular correlation-based ToF) in addition to the usual measurement of intensity.
  • phase-based depth gating using appropriate codes at illumination and sensing.
  • the use of triangulation automatically eliminates the depth ambiguity of phase-based gating provided the thickness of the triangulation is smaller than the wavelength of the amplitude wave. With this, it is possible to create thinner light curtains over a larger depth range.
  • in scattering media, the sensor receives first-bounce light reflected from the object as well as a large amount of single-scattered light. With light curtains, the line sensor 122 avoids single-scattered light and only receives multi-scattered light. The ratio between first-bounce light and global light is much higher, and thus contrast is better.
  • the light curtain method and device described herein has many benefits.
  • the shape of a light curtain is programmable and can be configured dynamically to suit the demands of the immediate task.
  • light curtains can be used to determine whether a vehicle is changing lanes in front, whether a pedestrian is in the crosswalk, or whether there are vehicles in neighboring lanes.
  • a robot might use a curtain that extrudes its planned (even curved) motion trajectory.
  • FIGS. 6(a,b) show various light curtains for use in robots and cars respectively.
  • the optical design of the light curtain shares similarities with confocal imaging in that small regions are selectively illuminated and sensed. When imaging in scattering media, such as fog and murky waters, this has the implicit advantage that many multi-bounce light paths are optically avoided, thereby providing images with increased contrast.
  • a key advantage of light curtains is that illumination and sensing can be concentrated to a thin region. Together with the power and exposure adaptability, this provides significantly better performance under strong ambient illumination, including direct sunlight, at large distances (i.e., 20-30m). The performance increases under cloudy skies and indoors to 40m and 50m respectively.
  • the sensor only captures a single line of the light curtain that often has small depth variations and hence little variation in intensity fall-off. Thus, the dynamic range of the measured brightness can be low. As such, even a one-bit sensor with a programmable threshold would be ample for the envisioned tasks.
  • the line sensor may be implemented with a variety of technologies, including CMOS or CCD intensity sensors, InGaAs sensors, correlation-based time-of-flight sensors, SPADs, and neuromorphic sensors.
  • the system may be run under control of one or more microprocessors in communication with memory containing software implementing the functions of the system.
  • the movement of the galvo mirrors 108, 126 is under the control of the software to define the shape of the light curtain.
  • the software may be configurable to allow the definition of light curtains of various shapes.
  • the software may control the cycling of light source 102 as well as the timing of the reading of the data from line sensor 122 and the application of any filtering to the data, for example, the filtering of ambient light.
  • Objects may be detected when they break the light curtain, causing a variance in the light in the line of pixels sensed by line sensor 122. Upon detection of an object that has breached the light curtain, an alert may be raised and communicated off-unit.
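The depth-adaptive laser power described in the bullets above can be sketched in a few lines. This is an illustrative sketch only, not code from the patent: the quadratic fall-off model, the function name, and the normalization to the farthest curtain line are assumptions (the 0.7 W cap mirrors the 700 mW diode mentioned above).

```python
# Sketch: depth-adaptive laser power for light fall-off compensation.
# Received power falls off as 1/z^2, so scaling emitted power with z^2
# keeps the per-line signal roughly constant.

def adapted_laser_power(depths_m, max_power_w=0.7):
    far = max(depths_m)
    powers = []
    for z in depths_m:
        p = max_power_w * (z / far) ** 2    # emitted power ~ z^2
        powers.append(min(p, max_power_w))  # clip at device peak power
    return powers

profile = adapted_laser_power([5.0, 10.0, 20.0])
```

With this profile, nearby curtain lines receive a small fraction of the peak power, which also helps avoid saturating points close to the sensor.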

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Input (AREA)

Abstract

Embodiments described herein are generally directed to a device that monitors for the presence of objects passing through or impinging on a virtual shell near the device, referred to herein as a light curtain, which is created by rapidly rotating a line sensor and a line laser in synchrony. The boundaries of the light curtain are defined by a sweeping line defined by the intersection of the sensing and illumination planes.

Description

PROGRAMMABLE LIGHT CURTAINS
Related Applications
[0001] This application claims the benefit of U.S. Provisional Patent Application Serial No. 62/761,479, filed March 23, 2018, which is incorporated herein by reference in its entirety.
Government Interest
[0002] This invention was made with government support under CNS 1446601 awarded by the NSF, N000141512358 awarded by the ONR, N000141612906 awarded by the ONR, DTRT13GUTC26 awarded by the DOT, and HR00111620021 awarded by DARPA. The government has certain rights in the invention.
Background of the Invention
[0003] 3D sensors play an important role in the deployment of many autonomous systems, including field robots and self-driving cars. However, there are many tasks for which it may not be necessary to use a full-blown 3D scanner. As an example, a self-guided vehicle on a road or a robot in the field does not need a full-blown 3D depth sensor to detect potential collisions or monitor its blind spot. Instead, what is necessary is for the vehicle to be able to detect if any object comes within a pre-defined perimeter of the vehicle to allow for collision avoidance. This is a much easier task than full depth scanning and object identification.
[0004] Consider a robot that is maneuvering over dynamic terrain. While full 3D perception is important for long-term path planning, it is less useful for time-critical tasks like obstacle detection and avoidance. Similarly, in autonomous driving, collision avoidance is a task that must be continually performed but does not require full 3D perception of the scene. For such tasks, a proximity sensor with a much reduced energy and computational footprint may be sufficient.
Summary of the Invention
[0005] This summary presents a simplified summary to provide a basic understanding of some novel embodiments described herein. This summary is not an extensive overview, and it is not intended to identify key/critical elements or to delineate the scope thereof. Some concepts are presented in a simplified form as a prelude to the more detailed description that is presented later.
[0006] Various embodiments are generally directed to a device that monitors the presence of objects passing through or impinging on a virtual shell near the device, which is referred to herein as a "light curtain". Light curtains offer a lightweight, resource-efficient and programmable approach to proximity awareness for obstacle avoidance and navigation. They also have additional benefits in terms of improving visibility in fog as well as flexibility in handling light fall-off.
[0007] In one embodiment, the light curtains are created by rapidly rotating a line sensor and a line laser in synchrony. The embodiment is capable of generating light curtains of various shapes with a range of 20-30m in sunlight (40m under cloudy skies and 50m indoors) and adapts dynamically to the demands of the task.
[0008] In one embodiment, light curtains may be implemented by triangulating an illumination plane, created by fanning out a laser, with a sensing plane of a line sensor. In the absence of ambient illumination, the sensor senses light only from the intersection between these two planes, which, in the physical world, is a line. The light curtain is then created by sweeping the illumination and sensing planes in synchrony.
[0009] In another embodiment, the light curtains may have a programmable shape to allow for the detection of objects along a particular perimeter, such as for detecting a vehicle impinging on a lane for a self-driving vehicle.
[0010] In yet another embodiment, the capability of the light curtain can be enhanced by using correlation-based time-of-flight (ToF) sensors.
[0011] The light curtain is capable of detecting the presence of objects that intersect a virtual shell around the system. By detecting only the objects that intersect with the virtual shell, many tasks pertaining to collision avoidance and situational awareness can be solved with little or no computational overhead.
[0012] Light curtains provide a novel approach for proximity sensing and collision avoidance that has immense benefits for autonomous devices. The shape of the light curtain can be changed on the fly and can be used to provide better detections, especially under strong ambient light (like sunlight) as well as global illumination (like fog).
Brief Description of the Drawings
[0001] The Detailed Description of the invention will be better understood when read in conjunction with the figures appended hereto. For the purpose of illustrating the invention, there is shown in the drawings a preferred embodiment. It is understood, however, that this invention is not limited to these embodiments or the precise arrangements shown.
[0013] FIG. 1 is a schematic view of the components of the light curtain device.
[0014] FIG. 2 is a diagram showing the intersecting planes of the light projector and the sensor, and the formation of the light curtain by synchronously scanning the projector and sensor so as to move the intersecting line of the planes in the desired shape.
[0015] FIG. 3 is a top view of the diagram of FIG. 2.
[0016] FIG. 4 is a diagram showing (a) the viewing and illumination geometry of a light curtain generated by rotating the laser light plane and sensor plane about parallel axes; (b) a view of the coordinate frame showing various parameters; and (c) a top view of the coordinate system.
[0017] FIG. 5 is a diagram showing the thickness of the light curtain.
[0018] FIG. 6 shows two applications of the light curtain. FIG. 6(a) shows a light curtain being used for a tilted plane, a proximity arc, and a path check. FIG. 6(b) shows the application of a light curtain in a self-driving vehicle sensing for upcoming traffic, adjacent traffic, and an adjacent-lane check.
[0019] FIG. 7(a-d) are illustrations of light curtains resulting when the sensor and laser rotate about an axis and about a point.
Detailed Description
[0020] The invention is described in detail below. Implementations used in the description of the invention, including implementations using various components or arrangements of components, should be considered exemplary only and are not meant to limit the invention in any way. As one of skill in the art would realize, many variations on implementations discussed herein which fall within the scope of the invention are possible. Accordingly, the exemplary methods and apparatuses disclosed herein are not to be taken as limitations on the invention, but as an illustration thereof.
[0021] In one embodiment, the device consists of a line scan laser (Illumination module 100) and a line scan sensor (Sensor Module 120), as shown in a top view in FIG. 1. In one embodiment, illumination module 100 uses a laser diode 102 as a light source. In one embodiment, the laser diode 102 may be, for example, a 638 nm laser diode with a peak power of 700 mW. The light emitted from laser diode 102 is collimated using collimation lens 104. The light is stretched into a line with line lens 106, which may, in some embodiments, be a 45° Powell lens.
[0022] To steer the light beam, a steerable galvo mirror 108 is used. In preferred embodiments, the galvo mirror has dimensions of 11 mm x 7 mm, has a 22.5° mechanical angle, and can give the sensor and laser a 45° field of view. The galvo mirror 108 takes 500 µs to rotate through a 0.2° optical angle. A micro-controller is used to synchronize the sensor, the laser and the galvo mirrors 108, 126. In preferred embodiments, the galvo mirror 108 used for the illumination module 100 and the galvo mirror 126 used for sensor module 120 will be identical. In alternate embodiments, a mechanical motor may be used to steer the light beam and sensor. In yet other embodiments, a 2D sensor with a rolling shutter or a region-of-interest mask may be used to effectively emulate a faster line sensor.
[0023] Sensor module 120 comprises a line sensor 122, lens 124 and steerable galvo mirror 126.
In one embodiment, line sensor 122 is a line scan intensity sensor. In one embodiment, the line scan intensity sensor is fitted with a 6 mm f/2 S-mount lens having a diagonal field-of-view of 45° and an image circle 7 mm in diameter. The line sensor may have 2048x2 total pixels, with a pixel size of approximately 7 µm x 7 µm. In preferred embodiments, only the central 1000 pixels of the sensor are used due to the limited circle of illumination of the lens. Preferably, the line scan sensor may be capable of scanning 95,000 lines per second and may be fitted with an optical bandpass filter having a 630 nm center wavelength and a 50 nm bandwidth to suppress ambient light.
[0024] In preferred embodiments, the rotation axes are aligned to be parallel and fixed with a baseline of 300 mm. The resulting field-of-view of the system is approximately 45° by 45°.
[0025] The Powell lens 106 fans the laser beam into a planar sheet of light and the line sensor 122 senses light from a single plane. In the general configuration, the two planes intersect at a line in 3D, as shown in FIG. 2, and, in the absence of ambient and indirect illuminations, the line sensor 122 measures light scattered by any object on the line. By rotating both the line sensor 122 and the laser 102 at a high speed, the intersecting line can be swept to form any ruled surface. This ruled surface, on which presence of objects can be detected, is the light curtain. The resulting device is programmable, in terms of its light curtain shape, and flexible, in terms of being able to vary laser power and sensor exposure time to suit the demands of an application.
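The intersection geometry of paragraph [0025] can be sketched numerically: each plane is represented by its unit normal, and the direction of the curtain line is the cross product of the two normals. The helper names and the example normals below are illustrative assumptions, not code from the patent.

```python
# Sketch: the curtain line is the intersection of the laser plane and the
# sensing plane; its direction is the cross product of the plane normals.

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def curtain_line_direction(n_laser, n_sensor):
    """Direction vector of the line where the two planes intersect."""
    return cross(n_laser, n_sensor)

# Planes rotated about parallel y-axes have normals in the xz-plane, so the
# intersection line comes out parallel to the y-axis, as described above.
d = curtain_line_direction((0.8, 0.0, 0.6), (-0.6, 0.0, 0.8))
```

Sweeping the two normals in synchrony sweeps this line through space, tracing out the ruled surface of the curtain.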
[0026] FIG. 4(a) shows the viewing and illumination geometry of a light curtain generated by rotating the laser light plane and sensor plane about parallel axes r. The intersection line is also parallel to the two rotation axes, as shown in FIG. 4(b). The coordinate frame of FIG. 4(b) and top view of FIG. 4(c) show various parameters of interest. Note that changing θc and θp synchronously generates light curtains with different shapes. FIG. 4(c) and FIG. 5 show that the finite size of the sensor pixels and the finite thickness of the laser sheet lead to a thick light curtain upon triangulation.
[0027] When operated in strong ambient light, for example sunlight, the sensor 122 also measures the contribution from the ambient light illuminating the entire scene. To suppress this, two image captures are performed at sensor 122 for each setting of the galvos, one with and one without illumination from laser 102, each with an exposure of 100 µs. The images may then be subtracted to filter out the ambient light.
[0028] In one embodiment, galvo mirrors 108, 126 may take time to stabilize after rotation. The stabilization time may be as much as 500 µs before the mirrors are stable enough to capture a line. This limits the overall frame rate of the device. Adding two 100 µs exposures for laser on and off to filter out ambient light allows a display of 1400 lines per second. If the light curtains are designed to contain 200 lines, the entire light curtain can be refreshed at a rate of 5.6 fps. Galvo mirrors that stabilize in a time shorter than 500 µs would allow curtain refresh rates to reach 20 to 30 fps.
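The line-rate arithmetic of paragraph [0028] can be sketched as below. This is a simplified best-case model assuming one galvo settle per line plus the two ambient-suppression exposures; any additional device overhead that brings a 200-line curtain down to the quoted 5.6 fps is deliberately not modeled.

```python
# Sketch: best-case line rate from the timing quoted above: one 500 us
# galvo settle per line plus two 100 us exposures (laser on / laser off)
# for ambient-light suppression.

def line_rate_hz(settle_s=500e-6, exposure_s=100e-6, exposures_per_line=2):
    return 1.0 / (settle_s + exposures_per_line * exposure_s)

rate = line_rate_hz()            # roughly the 1400 lines/s quoted above
fps_upper_bound = rate / 200.0   # upper bound for a 200-line curtain
```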
[0029] Light Curtains with Parallel Axes of Rotation - The case where the sensor and laser can each be rotated about a single fixed parallel axis, as shown in FIG. 4 can be implemented by placing two 1D galvo mirrors, one each in the front of the line sensor and the laser, respectively. Let the sensor and laser rotation axes be r. The intersecting line in the curtain will also be parallel and of the form:
p0 + u·r
[0030] where p0 is any 3D point on the line and u ∈ (−a, a) is the offset along the axis of rotation (see FIG. 4(a,b)). Then, the light curtain s(t, u) ⊂ ℝ³ is obtained by sweeping the intersection line such that:

s(t, u) = p(t) + u·r
[0031] where p(t) ∈ ℝ³ is a 3D path that describes the points scanned by the center pixel of the line sensor and t ∈ [0, 1] is the parameterization of the path.
[0032] Given a light curtain s(t, u), the rotation angles of the sensor and laser can be computed.
Without loss of generality, it is assumed that the origin of the coordinate system is at the midpoint between the centers of the line sensor and the laser. It is further assumed that the rotation axes are aligned along the y-axis and that the 3D path can be written as p(t) = [x(t), 0, z(t)]ᵀ. To achieve this light curtain, suppose that the laser rotates about its axis with an angular profile of θp(t), where the angle is measured counter-clockwise with respect to the x-axis. Similarly, the line sensor rotates with an angular profile of θc(t).
Let b be the baseline between the laser and the line sensor. θc(t) and θp(t) can be derived as:
θc(t) = arctan( z(t) / (x(t) + b/2) ),  θp(t) = arctan( z(t) / (x(t) − b/2) )

(taking the line sensor centered at [−b/2, 0, 0]ᵀ and the laser at [b/2, 0, 0]ᵀ, with the arctangent evaluated in its two-argument form so that angles beyond 90° are handled).
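Assuming, as a sketch, that the line sensor is centered at (−b/2, 0, 0) and the laser at (+b/2, 0, 0) (consistent with the origin at the midpoint of the baseline), the angular profiles can be computed as:

```python
import math

def galvo_angles(x, z, b):
    # Angles measured counter-clockwise from the x-axis; sensor assumed at
    # (-b/2, 0, 0), laser at (+b/2, 0, 0), curtain point at (x, 0, z).
    theta_c = math.atan2(z, x + b / 2)  # line sensor
    theta_p = math.atan2(z, x - b / 2)  # laser
    return theta_c, theta_p

# Sweep a flat (fronto-parallel) curtain 2 m ahead with a 0.3 m baseline:
profile = [galvo_angles(x, 2.0, 0.3) for x in (-0.5, 0.0, 0.5)]
```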
[0033] General Light Curtains - The light curtain device can also be configured with line sensor 122 and laser 102 rotating about non-parallel axes, or with each of them enjoying full rotational degrees of freedom. These configurations have their own unique advantages. When the devices have full rotational freedom, i.e., are capable of rotating around a point with no restrictions, then any ruled surface (including, for example, a Möbius strip) can be generated as a light curtain. Full rotational freedom, however, is hard to implement, since multi-axis galvos or gimbals are needed and are often cost-prohibitive.
[0034] Rotation Over Two Axes - When the line sensor and the line laser rotate about two axes, lc and lp, respectively, the ruled surface generated for a given 3D path p(t) can be determined. Each line in the surface must not only pass through p(t), but also be co-planar with lc and lp simultaneously. The parametric form of the generated light curtain s(t, u) ⊂ ℝ³ can be written as:

s(t, u) = p(t) + u·r(t)

where u is a scalar and r(t) ∈ ℝ³ is the direction of the line, which is analyzed below from the simplest to the most general configuration.
[0035] FIGS. 7(a,b,c) are illustrations of the light curtains resulting when each of the sensor 122 and laser 102 rotates about an axis. FIG. 7(d) is an illustration of a light curtain resulting when the sensor 122 and laser 102 rotate about a point. When lc ∥ lp, as shown in FIG. 7(a), the lines in the light curtain must also be parallel to lc and lp; thus r(t) is constant and points in the direction of lc. When lc and lp intersect at a point A, as shown in FIG. 7(b), the lines in the light curtain must also pass through A, thus:
r(t) = (A − p(t)) / ‖A − p(t)‖
[0036] When lc and lp are non-coplanar, as shown in FIG. 7(c), r(t) can be derived as follows. If lp intersects the plane defined by p(t) and lc at a point A(t), the line must also pass through A(t), thus:
r(t) = (A(t) − p(t)) / ‖A(t) − p(t)‖
[0037] If there is no intersection, meaning lp is parallel to this plane, r(t) is in the direction of lp.
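The three cases for the line direction r(t) can be combined in one sketch. Representing each axis as a point plus a unit direction, and the names of the helper and its arguments, are assumptions of this illustration:

```python
import numpy as np

def curtain_direction(p, c0, M, q0, N, eps=1e-9):
    # Sensor axis l_c passes through c0 with direction M; laser axis l_p
    # passes through q0 with direction N; p is the path point p(t).
    M = M / np.linalg.norm(M)
    N = N / np.linalg.norm(N)
    if np.linalg.norm(np.cross(M, N)) < eps:
        return M                          # parallel axes: r is along l_c
    n = np.cross(M, p - c0)               # normal of the plane through p and l_c
    denom = n @ N
    if abs(denom) < eps:
        return N                          # l_p parallel to that plane
    s = (n @ (p - q0)) / denom            # intersect l_p with the plane at A(t)
    A = q0 + s * N
    r = A - p
    return r / np.linalg.norm(r)
```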
[0038] Given a 3D path p(t), the rotation angles of the line sensor and line laser can be computed as follows. Without loss of generality, assume that the origin of the coordinate system is at the midpoint between the centers of the line sensor 122 and the line laser 102. The distance between the two centers is b, and the directions of lc and lp are M and N ∈ ℝ³, respectively. The given 3D path lies in the y = 0 plane and can be written as p(t) = [x(t), 0, z(t)]ᵀ. The rotation angle of sensor 122, which is measured counter-clockwise with respect to the xy-plane, is:
θc(t) = arctan( ẑᵀṽc(t) / ‖ṽc(t) − (ẑᵀṽc(t))ẑ‖ )

where vc(t) = p(t) + [b/2, 0, 0]ᵀ is the vector from the sensor center to p(t), ṽc(t) = vc(t) − (Mᵀvc(t))M is its component orthogonal to lc, and ẑ is the unit z-axis.
[0039] and the rotation angle of laser 102, measured the same way, is:
θp(t) = arctan( ẑᵀṽp(t) / ‖ṽp(t) − (ẑᵀṽp(t))ẑ‖ )

where vp(t) = p(t) − [b/2, 0, 0]ᵀ, ṽp(t) = vp(t) − (Nᵀvp(t))N, and ẑ is the unit z-axis.
[0040] When lc and lp are co-planar, the equations can be simplified. Without loss of generality, assume both lie in the xy-plane at angles ±γ to the y-axis. Then:
θc(t) = arctan( z(t) / ((x(t) + b/2) cos γ) ),  θp(t) = arctan( z(t) / ((x(t) − b/2) cos γ) )
and when lc ∥ lp, γ = 0, so this simplifies further to:
θc(t) = arctan( z(t) / (x(t) + b/2) ),  θp(t) = arctan( z(t) / (x(t) − b/2) )
[0041] Rotation Over Two Points - When line sensor 122 and line laser 102 can each rotate about a point (full rotational degrees of freedom), any ruled surface can be generated. FIG. 7(d) shows a Möbius strip as an example. The proof is straightforward: any line in the ruled surface together with the corresponding rotation center defines a plane, and that plane determines the orientation of line sensor 122 or line laser 102.
[0042] Optimizing Light Curtains - Parameters of interest in practical light curtains, for example their thickness and the SNR of measured detections, can be quantified, and approaches to optimizing them are presented herein. Of particular interest are minimizing the thickness of the curtain and optimizing the exposure time and laser power for improved detection accuracy when the curtain spans a large range of depths.
[0043] Thickness of light curtain - The light curtain produced by the device described herein has a finite thickness due to the finite size of the sensor pixels and the finite thickness of the laser illumination. Suppose that the laser sheet has a thickness of ΔL meters and each pixel has an angular extent of δc radians. Given a device with a baseline of length b meters, imaging a point at depth z(t) = z, the thickness of the light curtain is given as the area of the parallelogram shaded in FIGS. 4(c) and 5, which evaluates to:
A(t) = (δc ΔL rc² rp) / (b z)
where rc and rp are the distances from the intersected point to the sensor and the laser, respectively.
[0044] Given that different light curtain geometries can produce curtains of the same area, a more intuitive and meaningful metric for characterizing the thickness is the length:
L(t) = A(t) / ΔL = (δc rc² rp) / (b z)
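A sketch of the parallelogram model, under the assumptions that the pixel footprint width is δc·rc, the laser sheet thickness is ΔL, and the sine of the triangulation angle between the sensor ray and the laser ray is b·z/(rc·rp) (a reconstruction from the surrounding text, since the filed equations are reproduced only as images):

```python
def curtain_thickness(delta_c, Delta_L, r_c, r_p, b, z):
    # Sine of the angle between the sensor ray and the laser sheet, from the
    # triangle with baseline b and curtain point at depth z.
    sin_tri = b * z / (r_c * r_p)
    area = (delta_c * r_c) * Delta_L / sin_tri   # parallelogram area
    length = area / Delta_L                      # = delta_c * r_c**2 * r_p / (b*z)
    return area, length

# Far away, r_c ~ r_p ~ z, so length ~ delta_c * z**2 / b: the curtain
# thickens quadratically with depth.
a, L = curtain_thickness(delta_c=1e-3, Delta_L=0.01, r_c=5.0, r_p=5.0,
                         b=0.3, z=5.0)
```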
[0045] In any given system, changing the laser thickness ΔL requires changing the optics of the illumination module. Similarly, changing δc requires changing either the pixel pitch or the field-of-view of the sensor. In contrast, varying the baseline provides an easier way to change the thickness of the curtain, involving only a single translation. This is important because different applications often have differing needs regarding the thickness of the curtain. A larger baseline helps in achieving very thin curtains, which is important when there is a critical need to avoid false alarms. On the other hand, the thicker curtains achieved with a smaller baseline are important in scenarios where it is critical to avoid mis-detections. Further, a sufficiently thick curtain also helps in avoiding mis-detections caused by the inherent discreteness of the scanned lines.
[0046] Minimizing the thickness and energy for nearby light curtains - When a light curtain is far away, the largest possible baseline can be used to minimize uncertainty, and the consumed energy is nearly the same no matter how the device is configured. When the light curtain is nearby, however, the best configuration is nontrivial.
[0047] Given the curtain shape r(t) in the xz-plane, the optimization problem of minimizing the thickness obtained by triangulation can be formalized as follows:
min over C, P:  ∫₀¹ (δc rc(t)² rp(t)) / ((P − C) z(t)) dt

subject to the rotation centers [C, 0, 0]ᵀ and [P, 0, 0]ᵀ lying within the range over which they can be positioned.
where rc(t) and rp(t) are the distances from r(t) to the sensor rotation center [C, 0, 0]ᵀ and the laser rotation center [P, 0, 0]ᵀ, respectively, z(t) is the depth of r(t), and the rotation centers of the sensor and laser can each be positioned only within a fixed range. For simplicity, only a cross-section of the light curtain in the xz-plane is considered.
[0048] When the objective is to minimize energy, the problem becomes
Figure imgf000014_0002
[0049] When the objective is to minimize one quantity subject to the other being smaller than some value, such as
Figure imgf000015_0001
and
Figure imgf000015_0002
the optimization result is hard to predict.
[0050] Adapting laser power and exposure - A key advantage of the light curtain device is that the power of the laser or the exposure time can be adapted for each intersecting line to compensate for light fall-off, which is inversely proportional to the square of the depth. In a traditional projector-sensor system, it is commonplace to increase the brightness of the projection to compensate for light fall-off so that far-away scene points are well illuminated. However, this implies that points close to the sensor saturate easily, which in turn implies a need for a high-dynamic-range sensor, as well as reduced resource efficiency due to the need to send strong light to nearby scene points.
[0051] In contrast, the system of the present invention has an additional degree of freedom wherein the power of the laser and/or the exposure time of the sensor can be adjusted according to depth such that light fall-off is compensated to the extent possible under the device constraints and with respect to eye safety. Further, because the system only detects the presence or absence of objects, in an ideal scenario where the albedo is the same everywhere, the laser can send just enough light to overcome the readout noise of the sensor or the photon noise of the ambient light, and only a 1-bit sensor is required.
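A minimal sketch of depth-adaptive laser power, assuming a simple z² scaling clamped to a device/eye-safety limit (the function name, reference power, and limit are all hypothetical):

```python
def laser_power_for_depth(z, p_ref, z_ref, p_max):
    # Compensate inverse-square fall-off: required power grows with z**2,
    # clamped at p_max (device limit / eye safety).
    return min(p_max, p_ref * (z / z_ref) ** 2)
```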
[0052] Combining with time-of-flight sensors - The analysis above also indicates that the light curtain can be expected to get thicker, quadratically, with depth. Increasing the baseline and other parameters of the system can only partially alleviate this effect, due to physical constraints on the sensor size, the laser sheet thickness, and the baseline. Replacing the line intensity sensor with a 1D continuous-wave time-of-flight (CW-TOF) sensor alleviates the quadratic dependence of thickness on depth.
[0053] CW-TOF sensors measure phase to obtain depth. A CW-TOF sensor works by illuminating the scene with an amplitude-modulated wave (typically, a periodic signal of frequency fm Hz) and measuring the phase difference between the illumination and the light intensity received at each pixel. The phase difference φ and the depth d of the scene point are related as:
φ = mod( 4π fm d / c, 2π )

where c is the speed of light.
[0054] As a consequence, the depth resolution of a TOF sensor is constant and independent of depth. Further, the depth resolution increases with the frequency of the amplitude wave. However, TOF-based depth recovery has a phase-wrapping problem due to the presence of the mod(·) operator, which implies that the depth estimate is ambiguous, and this problem gets worse at higher frequencies. In contrast, traditional triangulation-based depth estimation has no ambiguity problem, but at the cost of quadratic depth uncertainty.
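The phase-depth relation and its wrapping ambiguity can be illustrated as follows (these are the standard CW-ToF relations; the function names are for illustration only):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_phase(depth_m, f_mod_hz):
    # Round-trip phase of the amplitude wave, wrapped into [0, 2*pi).
    return (4 * math.pi * f_mod_hz * depth_m / C) % (2 * math.pi)

def depth_from_phase(phi, f_mod_hz):
    # Inverse relation; ambiguous up to multiples of c / (2 * f_mod).
    return phi * C / (4 * math.pi * f_mod_hz)

# At f_mod = 50 MHz the unambiguous range is c/(2*f_mod) ~ 3 m: a point at
# depth d and one at d + c/(2*f_mod) yield the same wrapped phase.
```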
[0055] The complementary strengths of traditional triangulation and CW-TOF can be leveraged to enable light curtains with near-constant thickness over a large range. Triangulation and phase are fused by measuring the phase (as with regular correlation-based TOF) in addition to the usual measurement of intensity.
[0056] Knowing the depth of the curtain, the appropriate phase can be calculated, and pixels with phase values that differ significantly can be discarded. An alternative approach achieves this by performing phase-based depth gating using appropriate codes at illumination and sensing. The use of triangulation automatically eliminates the depth ambiguity of phase-based gating, provided the thickness of the triangulation is smaller than the wavelength of the amplitude wave. With this, it is possible to create thinner light curtains over a larger depth range.
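One way to sketch the fusion: given the depth of the programmed curtain line (known from the triangulation geometry), compute the expected ToF phase and zero out pixels whose measured phase disagrees. The tolerance value and the array interface are assumptions of this sketch:

```python
import numpy as np

def gate_by_phase(intensity, phase, curtain_depth_m, f_mod_hz,
                  tol_rad=0.3, c=299_792_458.0):
    # Expected wrapped phase at the programmed curtain depth.
    expected = (4 * np.pi * f_mod_hz * curtain_depth_m / c) % (2 * np.pi)
    diff = np.abs(phase - expected)
    diff = np.minimum(diff, 2 * np.pi - diff)  # wrap-around phase distance
    # Keep pixels consistent with the curtain depth; zero out the rest.
    return np.where(diff <= tol_rad, intensity, 0)
```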
[0057] Seeing Through Scattering Media - In traditional imaging, the sensor receives first-bounce light reflected from the object as well as a large amount of single-scattered light. With light curtains, the line sensor 122 avoids single-scattered light and only receives multi-scattered light. The ratio between first-bounce light and global light is therefore much higher, and contrast is better.
[0058] The light curtain method and device described herein have many benefits. The shape of a light curtain is programmable and can be configured dynamically to suit the demands of the immediate task. For example, light curtains can be used to determine whether a vehicle is changing lanes in front, whether a pedestrian is in a crosswalk, or whether there are vehicles in neighboring lanes. Similarly, a robot might use a curtain that extrudes its planned (even curved) motion trajectory. FIGS. 6(a,b) show various light curtains for use with robots and cars, respectively.
[0059] Given an energy budget, in terms of average laser power, exposure time, and refresh rate of the light curtain, higher power and exposure can be allocated to lines in the curtain that are further away to combat inverse-square light fall-off. This is a significant advantage over traditional depth sensors, which typically expend the same high power in all directions to capture a 3D point cloud over an entire volume.
[0060] The optical design of the light curtain shares similarities with confocal imaging in that small regions are selectively illuminated and sensed. When imaging in scattering media, such as fog and murky water, this has the implicit advantage that many multi-bounce light paths are optically avoided, thereby providing images with increased contrast.

[0061] A key advantage of light curtains is that illumination and sensing can be concentrated in a thin region. Together with the power and exposure adaptability, this provides significantly better performance under strong ambient illumination, including direct sunlight, at large distances (i.e., 20-30 m). The performance increases under cloudy skies and indoors to 40 m and 50 m, respectively.
[0062] At any instant, the sensor only captures a single line of the light curtain, which often has small depth variations and hence little variation in intensity fall-off. Thus, the dynamic range of the measured brightness can be low. As such, even a one-bit sensor with a programmable threshold would be ample for the envisioned tasks.
[0063] Many sensor types are available for use with the device. Any line sensor could be used with the described device, including intensity sensors (CMOS, CCD, InGaAs), time-of-flight (ToF) sensors (correlation, SPAD), and neuromorphic sensors (DVS).
[0064] The system may be run under the control of one or more microprocessors in communication with memory containing software implementing the functions of the system. The movement of the galvo mirrors 108, 126 is under the control of the software to define the shape of the light curtain. The software may be configurable to allow the definition of light curtains of various shapes. In addition, the software may control the cycling of light source 102, as well as the timing of the reading of the data from line sensor 122 and the application of any filtering to the data, for example, the filtering of ambient light. Objects may be detected by breaking the light curtain, causing a variance in the light in the line of pixels sensed by line sensor 122. Upon detection of an object that has breached the light curtain, an alert may be raised and communicated off-unit.
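The control loop described in this paragraph can be sketched as follows; the `galvos`, `laser`, and `sensor` driver objects and their methods are hypothetical stand-ins for the device interfaces:

```python
import numpy as np

def scan_curtain(path_angles, galvos, laser, sensor,
                 threshold=10, settle_s=500e-6, exposure_s=100e-6):
    # One refresh of a programmed curtain: steer, strobe, subtract, detect.
    # Returns the indices of lines where an object breaches the curtain.
    breaches = []
    for i, (theta_c, theta_p) in enumerate(path_angles):
        galvos.set_angles(theta_c, theta_p)   # steer both planes
        galvos.wait(settle_s)                 # let the mirrors stabilize
        laser.on()
        lit = sensor.read(exposure_s)
        laser.off()
        ambient = sensor.read(exposure_s)
        line = lit.astype(np.int64) - ambient.astype(np.int64)
        if np.any(line > threshold):          # variance => curtain breached
            breaches.append(i)
    return breaches
```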
[0065] To those skilled in the art to which the invention relates, many modifications and
adaptations of the invention will suggest themselves. The exemplary methods and apparatuses disclosed herein are not to be taken as limitations on the invention, but as an illustration thereof. The intended scope of the invention is defined by the claims which follow.

Claims

We claim:
1. A system comprising: a light source for creating a line of light defining an illumination plane; a line sensor for sensing a line of light in a sensing plane, the sensed line of light being defined by an intersection of the illumination plane and the sensing plane; a processor; software executed by the processor for performing the functions of: steering the illumination plane and the sensing plane synchronously to create the intersection between the illumination plane and the sensing plane such that the intersection of the illumination plane and the sensing plane follows a pre-defined path defining a light curtain; cycling the light source on and off; and reading the line sensor as the light source is cycled on and off.
2. The system of claim 1 wherein the light source comprises:
a laser diode;
a collimation lens; and
a Powell-type lens;
wherein light emitted by the laser diode is formed into a line of light by the Powell-type lens before striking the first steerable mirror.
3. The system of claim 1 wherein the line sensor comprises a lens and a sensor capable of sensing a single line of pixels.
4. The system of claim 3 wherein the software performs the further functions of:
detecting a variance in the light detected by the line sensor, indicative of the presence of an object intersecting the light curtain.
5. The system of claim 4 wherein the software performs the further functions of:
communicating an alert off-unit upon detection of the variance.
6. The system of claim 1 wherein the illumination plane and the sensing plane are rotated about parallel axes.
7. The system of claim 1 wherein the illumination plane and the sensing plane are rotated over two axes.
8. The system of claim 1 wherein the illumination plane and the sensing plane are rotated over two points.
9. The system of claim 1 wherein the line sensor is a continuous wave time-of-flight sensor.
10. The system of claim 1 wherein the illumination plane and the sensing plane are steered by steerable galvo mirrors.
11. The system of claim 1 wherein the illumination plane and the sensing plane are steered by mechanical motors.
12. The system of claim 1 wherein the line sensor is a 2D sensor having a rolling shutter or a region of interest mask.
13. A method comprising: creating an illumination plane by continuously sweeping a line light source back and forth over a first pre-defined path; creating a sensing plane by sweeping a line sensor back and forth over a second pre-defined path; and synchronizing the first and second pre-defined paths such that a line defining an intersection of the illumination plane and the sensing plane creates a light curtain of a desired shape.
14. The method of claim 13 further comprising: cycling the light source on and off; and reading the line sensor as the light source is cycled on and off.
15. The method of claim 14 further comprising: detecting variances in the light detected by the line sensor, indicative of the presence of an object intersecting the light curtain.
16. The method of claim 13 wherein the illumination plane and the sensing plane are rotated about parallel axes.
17. The method of claim 13 wherein the illumination plane and the sensing plane are rotated over two axes.
18. The method of claim 13 wherein the illumination plane and the sensing plane are rotated over two points.
19. A non-transitory computer-readable storage media containing instructions that, when executed by a processor, perform the steps of: creating an illumination plane by continuously sweeping a line light source back and forth over a first pre-defined path; creating a sensing plane by sweeping a line sensor back and forth over a second pre-defined path; and synchronizing the first and second pre-defined paths such that a line defining an intersection of the illumination plane and the sensing plane creates a light curtain of a desired shape.
20. The computer-readable storage media of claim 19 containing further instructions that, when executed by a processor, perform the steps of: cycling the light source on and off; and reading the line sensor as the light source is cycled on and off.
21. The computer-readable storage media of claim 19 containing further instructions that, when executed by a processor, perform the steps of: detecting variances in the light detected by the line sensor, indicative of the presence of an object intersecting the light curtain.
22. The computer-readable storage media of claim 19 containing further instructions that, when executed by a processor, perform the steps of: communicating an alert off-unit upon detection of the variance.
23. The computer-readable storage media of claim 20 containing further instructions that, when executed by a processor, perform the steps of: filtering ambient light from the light sensed by the line sensor by reading the same line in the light curtain twice, once with the light source illuminated and once without, and subtracting the readings.