WO2021126082A1 - Eye-safe operation of lidar scanner - Google Patents
- Publication number: WO2021126082A1
- Application: PCT/SG2020/050751
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- distance
- illumination source
- basis
- intensity
- objects
- Prior art date
Classifications
- G—PHYSICS
  - G01—MEASURING; TESTING
    - G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
      - G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
        - G01S7/48—Details of systems according to group G01S17/00
          - G01S7/481—Constructional features, e.g. arrangements of optical elements
            - G01S7/4814—Constructional features of transmitters alone
              - G01S7/4815—Constructional features of transmitters alone using multiple transmitters
            - G01S7/4816—Constructional features of receivers alone
          - G01S7/483—Details of pulse systems
            - G01S7/484—Transmitters
          - G01S7/491—Details of non-pulse systems
            - G01S7/4911—Transmitters
          - G01S7/497—Means for monitoring or calibrating
      - G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
        - G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
          - G01S17/06—Systems determining position data of a target
            - G01S17/42—Simultaneous measurement of distance and other co-ordinates
        - G01S17/88—Lidar systems specially adapted for specific applications
          - G01S17/89—Lidar systems specially adapted for mapping or imaging
            - G01S17/894—3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
          - G01S17/93—Lidar systems specially adapted for anti-collision purposes
            - G01S17/931—Lidar systems specially adapted for anti-collision purposes of land vehicles
Definitions
- The present disclosure may be used with both flash LIDAR and scanning LIDAR systems.
- In flash LIDAR, the illumination sources emit a high-energy pulse or flash of light at periodic intervals. The repetition frequency of the flashes is typically determined by the desired frame rate or refresh rate for a given use case of the LIDAR system. A high frame rate or refresh rate is typically required, for example, in autonomous vehicles, where near-real-time visualisation of objects near the vehicle may be needed.
- Light from the illumination sources propagates to objects in a scene, where it is reflected and then detected by an array of sensors positioned in a focal plane of a lens of the LIDAR system. The time for the light to propagate from the illumination sources to objects in the scene and back to the sensors is used to determine the distances from the objects to the LIDAR system.
- Each sensor in the array acts as a receiving element from which a data point may be obtained. In flash LIDAR, a single flash thus provides as many data points as there are sensors in the system, so a large volume of information about the illuminated scene may be obtained from each flash.
- In scanning LIDAR, the illumination sources emit a continuous pulsed beam of light that scans across the scene to be illuminated. Mechanical actuators that move mirrors, lenses and/or other optical components may be used to steer the beam during scanning, or a phased array may be used instead. A phased array is typically advantageous as it has fewer moving parts and accordingly a lower risk of mechanical failure.
- As with flash LIDAR, time-of-flight measurements are used to determine the distance from the LIDAR system to the objects of a scene.
- The power emitted by the illumination sources per flash of a flash LIDAR system is high relative to the power of the continuous scanning beam of a scanning LIDAR system. The power emitted by a scanning system is typically lower, but may still need to be increased to less safe levels to achieve ranges of 30-40 metres as described above. Accordingly, by determining the distance to an object prior to emitting light from the LIDAR illumination sources, the power of both flash and scanning LIDAR systems can be decreased at shorter ranges, thereby improving the safety of both types of system.
Abstract
A light detection and ranging, LIDAR, system. The system comprises a plurality of illumination sources, at least one detector, and a controller. Each illumination source is configured to illuminate a respective spatial volume. The or each detector is configured to receive light emitted by the plurality of illumination sources and reflected by one or more objects within one or more of the respective spatial volumes. The controller is configured to determine the distance of the one or more objects within each spatial volume from the LIDAR system on the basis of readings from the at least one detector; and to adjust an intensity of each illumination source on the basis of the distance of an object within the respective spatial volume.
Description
Eye-safe operation of LIDAR scanner
Technical Field of the Disclosure
The disclosure relates to a LIDAR (light detection and ranging) system.
Background of the Disclosure
The present disclosure relates to LIDAR systems.
An example of a known LIDAR system 100 is illustrated in Figure 1. The system comprises a plurality of channels, each of which has an illumination source 101. Each illumination source illuminates a respective spatial volume 102, and the reflected light is picked up by one or more detectors (not shown). The properties of the reflected light (e.g. the time delay between illumination and reflection, the wavelength and/or the brightness) are used to determine the distance of objects within each spatial volume.
The extent of the spatial volume 102 will depend on the solid angle over which the illumination source casts light (i.e. the frame of the illumination), and the maximum range at which an object illuminated by the illumination source can be detected by the detector(s).
There may be a single detector, which detects reflected light from each illumination source (e.g. with the illumination sources being activated sequentially), or there may be a detector for each illumination source, configured to detect light reflected from each object in the respective spatial volume.
Some problems associated with such known LIDAR systems are the compromises necessary between range and eye safety. Obtaining a longer detection range for a LIDAR system requires a higher intensity of illumination. However, where the system may be used around people or animals (e.g. on autonomous vehicles), a high intensity could cause damage to the eyes of anyone within the illuminated region. As such, the intensity of LIDAR systems must be limited for safety purposes - but this reduces their effective range, and hence their usefulness. Additionally, greater intensity of illumination requires greater power input, so there is also a compromise between energy usage and range.
It is therefore an aim of the present disclosure to provide a LIDAR system and/or method of operating the same that addresses one or more of the problems above or at least provides a useful alternative.
Summary
According to a first aspect of the invention, there is provided a light detection and ranging, LIDAR, system. The LIDAR system comprises a plurality of illumination sources, at least one detector, and a controller. Each illumination source is configured to illuminate a respective spatial volume. The or each detector is configured to receive light emitted by the plurality of illumination sources and reflected by one or more objects within one or more of the respective spatial volumes. The controller is configured to determine the distance of the one or more objects within each spatial volume from the LIDAR system on the basis of readings from the at least one detector; and adjust an intensity of each illumination source on the basis of the distance of an object within the respective spatial volume.
Each illumination source may comprise a VCSEL (vertical-cavity surface-emitting laser) and a lens. The lenses may form a multi-lens array, MLA, and the VCSELs may be arranged in a corresponding array. The VCSEL array may be disposed on a single chip, and the MLA may be on a single substrate.
The intensity of each illumination source may be adjusted on the basis of the distance of the closest object within each respective spatial volume.
The LIDAR system may further comprise a camera and an image recognition unit. The image recognition unit may be configured to assign a category to each object within each spatial volume, and the controller may be configured to adjust the intensity of each illumination source on the basis of the distance of the closest object within a particular category or set of categories.
The controller may be configured to further adjust the intensity of each illumination source on the basis of the distance of one of the objects in adjacent spatial volumes.
The LIDAR system may comprise a secondary distance measuring system. The controller may be configured to further adjust the intensity of the illumination source on the basis of the distance of one of the objects within the respective spatial volume as measured by the secondary distance measuring system. The secondary distance measuring system may be RADAR and/or a further LIDAR system.
According to a second aspect, there is provided a method of operating a LIDAR system. The LIDAR system comprises a plurality of illumination sources and at least one detector. Each illumination source is configured to illuminate a respective spatial volume. The or each detector is configured to receive light emitted by the plurality of illumination sources and reflected by one or more objects within one or more of the respective spatial volumes. Values are determined for the distance of the one or more objects within one or more of the respective spatial volumes on the basis of measurements by the at least one detector. The output power of each illumination source is adjusted on the basis of the distance of an object within the respective spatial volume.
The intensity of each illumination source may be adjusted on the basis of the distance of the closest object within each respective spatial volume.
An image recognition unit may be used to assign a category to each object within each spatial volume, and the intensity of each illumination source may be adjusted on the basis of the distance of the closest object within a particular category or set of categories.
The intensity of each illumination source may be further adjusted on the basis of the distance of one of the objects in adjacent spatial volumes.
The distance of objects within each spatial volume may also be measured using a secondary distance measuring system, and the intensity of the illumination source may be further adjusted on the basis of the distance of one of the objects within the respective spatial volume as measured by the secondary distance measuring system. The secondary distance measurement system may be RADAR or a further LIDAR system.
Brief Description of the Drawings
Figure 1 is an illustration of a known LIDAR system;
Figure 2 is an illustration of an exemplary LIDAR system;
Figure 3 is a schematic illustration of an exemplary LIDAR system; and
Figure 4 is a flowchart showing a method of operating a LIDAR system.
Detailed Description of the Preferred Embodiments
Generally speaking, the disclosure provides a LIDAR system in which the intensity of illumination in each channel is variable. In particular, the intensity in one or more of the channels is reduced based at least on the distance of an object in the channel (e.g. the closest object, or the closest object which might be a person).
LIDAR systems used for autonomous driving, for example, require a larger working range (e.g. 30 to 40 m) than LIDAR systems used for conventional purposes. The larger working range requires a higher intensity, and the higher intensity may pose an eye-safety risk to people or animals. The inventors have appreciated that the risk can be mitigated by operating a variable-intensity light source, whereby the intensity is reduced to a safe level when an object is detected within the field of view of the LIDAR system, depending on one or more criteria such as the type of object, movement of the object or of the LIDAR system, the inferred location of the LIDAR system, or other criteria.
Some examples of the solution are given in the accompanying figures.
Figure 2 shows an exemplary LIDAR system. The system comprises a plurality of illumination sources 201, each of which is configured to illuminate a respective spatial volume 202. The illumination sources are arranged in a 2-dimensional array, but other arrangements are possible. The system also comprises one or more detectors (not shown) for receiving reflected light from objects in the spatial volumes. A controller 203 is configured to take the measurements from the detector(s) and use them to determine the distance of objects 204a, 204b within each spatial volume (by techniques known in the art). The objects are shown at the furthest extent of the spatial volume they occupy, for ease of interpretation of the drawing, but may be at any location in the spatial volume. The controller is also configured to adjust the intensity of each of the illumination sources, based on one or more factors.
There may be a single detector, which detects reflected light from each illumination source (e.g. with the illumination sources being activated sequentially), or there may be a detector for each illumination source, configured to detect light reflected from each object in the respective spatial volume. Other arrangements are possible as known in the art for LIDAR systems.
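The single-detector arrangement described above can be sketched in code. The sketch below is a hypothetical illustration, not the patent's implementation: `fire` and `read_detector` are assumed stand-ins for hardware access.

```python
# Hypothetical sketch of the single-detector arrangement: illumination
# sources are activated sequentially so that each echo can be
# attributed to exactly one channel.

def scan_sequentially(sources, fire, read_detector):
    """Fire each source in turn and record the echo for its channel."""
    readings = {}
    for source in sources:
        fire(source)                        # pulse one illumination source
        readings[source] = read_detector()  # this echo belongs to this channel
    return readings
```

The alternative arrangement (one detector per source) would instead read all detectors after a single simultaneous flash.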
The adjustment of the intensity of an illumination source 201a may be based on one or more of:
• the closest object 204a within the spatial volume (i.e. the closer the object, the lower the intensity)
• the closest object 204b in nearby spatial volumes (i.e. reducing the intensity if there is an object detected by another channel which might move into this channel)
• the presence of objects in nearby channels (some or all channels adjacent to the spatial volume may be attenuated)
• distance measurements from secondary sources, e.g. RADAR systems or neighbouring LIDAR systems
• image recognition on objects nearby, where the objects are assigned to categories and the intensity of the illumination is reduced if objects in certain categories are detected (e.g. reducing the intensity if image recognition performed on a camera feed detects an object which is likely to be important for eye safety, such as a person or vehicle, but not reducing the intensity if the image recognition only detects unimportant objects such as signs or trees)
• information about the location of the system, such as from a GPS or from sign recognition (e.g. reducing intensity in urban areas).
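The factors above can be combined into a single intensity decision. The sketch below is illustrative only: the category names, the 40 m full-power range and the linear ramp are assumptions, not taken from the disclosure.

```python
# Hypothetical sketch combining the adjustment factors listed above.
SAFE_CATEGORIES = {"sign", "tree"}  # objects unimportant for eye safety

def channel_intensity(max_power, own_distance, neighbour_distances,
                      secondary_distance=None, category=None):
    """Return a power level for one channel, attenuated according to
    the closest relevant object."""
    # Image recognition: objects in "safe" categories need no attenuation.
    if category is not None and category in SAFE_CATEGORIES:
        return max_power
    # Closest object seen by this channel, by adjacent channels (an
    # object there may move into this channel), or by a secondary
    # RADAR/LIDAR measurement.
    candidates = [own_distance, *neighbour_distances]
    if secondary_distance is not None:
        candidates.append(secondary_distance)
    closest = min(candidates)
    # The closer the object, the lower the intensity (linear ramp up to
    # an assumed 40 m full-power working range).
    scale = min(closest / 40.0, 1.0)
    return max_power * scale
```

Any monotonic mapping from distance to power would fit the description equally well; the linear ramp is merely the simplest choice.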
The intensity may be reduced or the light source may be switched off altogether. The intensity may be reduced to under a predetermined amount considered to be safe. Maximum permissible exposure (MPE) is the highest power considered to be safe and depends on wavelength and exposure time. Acceptable intensities are set by standards known to the skilled person (such as European standard EN 207).
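The clamp to a predetermined safe amount can be sketched as follows. The numeric limits in the table are invented placeholders: real MPE values depend on wavelength and exposure time and come from the applicable standard, not from this code.

```python
# Illustrative clamp of the commanded power to a precomputed eye-safe
# limit per distance bin.  The (distance, limit) pairs are made-up
# placeholders standing in for values derived from a safety standard.
MPE_LIMIT_BY_DISTANCE_M = [(1.0, 0.05), (5.0, 0.2), (20.0, 1.0)]

def eye_safe_power(requested, distance_m):
    """Never exceed the safe limit for the given object distance."""
    for max_distance, limit in MPE_LIMIT_BY_DISTANCE_M:
        if distance_m <= max_distance:
            return min(requested, limit)
    return requested  # beyond the last bin, no attenuation is needed
```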
Figure 3 is a schematic illustration of an exemplary LIDAR system. Optional components are shown with dotted lines. As with the system of Figure 2, the LIDAR system comprises a plurality of illumination sources 301, one or more detectors 302, and a controller 303. The illumination sources are each configured to illuminate a respective spatial volume. The or each detector is configured to receive reflected light originating from the illumination sources, i.e. light reflected from objects in the respective spatial volumes.
The controller is configured to determine the distance of objects within each spatial volume from the LIDAR system on the basis of readings from the detector, and to adjust the intensity of each illumination source at least on the basis of the distance of an object within the spatial volume. The intensity of a laser beam decreases over distance, and the system can take the beam spread and attenuation into account when adjusting.
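The distance determination itself rests on the standard time-of-flight relation: light travels out to the object and back, so the one-way distance is half the round-trip time multiplied by the speed of light. A minimal sketch:

```python
# Standard LIDAR time-of-flight relation (variable names illustrative).
C = 299_792_458.0  # speed of light in vacuum, m/s

def distance_from_round_trip(t_seconds):
    """Light travels out and back, so the one-way distance is c*t/2."""
    return C * t_seconds / 2.0
```

A 40 m working range therefore corresponds to round-trip times on the order of a few hundred nanoseconds.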
The controller may be configured to adjust the intensity on the basis of any of the factors listed previously, or any combination of those factors.
To enable adjustment based on secondary distance information, the controller may receive additional input from a secondary distance measuring unit 304, e.g. a RADAR system or another LIDAR system.
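Combining the LIDAR's own reading with a secondary source such as RADAR can be sketched as below; the function name is illustrative. Taking the smaller of the two distances is the conservative choice for eye safety, since the power cap should reflect the closest object reported by any sensor.

```python
# Sketch of fusing the primary LIDAR distance with an optional secondary
# reading (e.g. RADAR). The smaller distance is the conservative choice.
def effective_distance(lidar_m, radar_m=None):
    readings = [d for d in (lidar_m, radar_m) if d is not None]
    return min(readings) if readings else None
```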
To enable adjustment based on image recognition and categorisation of objects, the controller may receive additional input from a camera 305, possibly via an image recognition processor 306 (though the controller may instead be configured to perform the image recognition).
Each illumination source 301 may comprise a VCSEL (vertical-cavity surface-emitting laser) and a lens, and the VCSELs and lenses may be arranged into a VCSEL array (e.g. on a single chip) and a corresponding multi-lens array.
Figure 4 is a flowchart of a method of operating a LIDAR system, such as the one of Figure 3. In step 401, values are determined for the distance of objects in each of the spatial volumes of the LIDAR system, on the basis of measurements by the one or more detectors. In step 402, the power of each illumination source is adjusted on the basis of the distance of an object within the respective spatial volume.
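The two steps of Figure 4 amount to a per-source control cycle, sketched below. `measure_distance`, `set_power` and `safe_power` stand in for the detector readout, the source driver and the safety policy respectively; all three names are hypothetical.

```python
# Minimal sketch of steps 401 and 402 of Figure 4. The three callables
# are hypothetical stand-ins for hardware interfaces and safety policy.
def control_cycle(sources, measure_distance, set_power, safe_power):
    for volume_id, source in enumerate(sources):
        d = measure_distance(volume_id)      # step 401: distance per volume
        set_power(source, safe_power(d))     # step 402: adjust source power
```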
Embodiments of the present disclosure can be employed in many different applications including for autonomous vehicles, robotics, or laser scanning, for example.
List of reference numerals:
100 LIDAR system
101 illumination source
102 spatial volume (illuminated by an illumination source)
201 illumination source
201a a particular illumination source
202 spatial volume (illuminated by an illumination source)
203 controller
204a, b object within one of the spatial volumes
301 illumination source
302 detector
303 controller
304 secondary distance measuring unit (optional)
305 camera (optional)
306 image recognition processor (optional)
401 first method step
402 second method step
The skilled person will understand that in the preceding description and appended claims, positional terms such as ‘above’, ‘along’, ‘side’, etc. are made with reference to conceptual illustrations, such as those shown in the appended drawings. These terms are used for ease of reference but are not intended to be of limiting nature. These terms are therefore to be understood as referring to an object when in an orientation as shown in the accompanying drawings.
Although the disclosure has been described in terms of preferred embodiments as set forth above, it should be understood that these embodiments are illustrative only and that the claims are not limited to those embodiments. Those skilled in the art will be able to make modifications and alternatives in view of the disclosure which are contemplated as falling within the scope of the appended claims. Each feature disclosed or illustrated in the present specification may be incorporated in any embodiments, whether alone or in any appropriate combination with any other feature disclosed or illustrated herein.
For example, it is envisaged that the present disclosure may be used with both flash LIDAR and scanning LIDAR systems. In flash LIDAR systems, the illumination sources emit a high-energy pulse or flash of light at periodic intervals. The frequency at which the flashes repeat may typically be determined by the desired frame rate or refresh rate for a given use case of the LIDAR system. An example use case where a high frame rate or refresh rate is typically required is in the field of autonomous vehicles where near-real time visualisation of objects near the vehicle may be required. Light from the illumination sources propagates to objects in a scene where it is reflected and detected by an array of sensors positioned in a focal plane of a lens of the LIDAR system. The time for the light to propagate from the illumination sources of the LIDAR system to objects in the scene and back to the sensors of the LIDAR system is used to determine the distances from the objects to the LIDAR system. Each sensor in the array acts as a receiving element from which a data point may be obtained. Typically, there will be a one-to-one correspondence of illumination sources to sensors. For example, if there are 10,000 illumination sources in an array, the sensor array may comprise 10,000 corresponding sensors. In flash LIDAR, a single flash thus provides the same number of data points as the number of sensors in the system. Accordingly, a large volume of information about a scene being illuminated may be obtained from each flash.
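The time-of-flight principle described above is standard physics rather than anything specific to this disclosure: the light travels to the object and back, so the one-way distance is half the round trip times the speed of light.

```python
# Time-of-flight distance: one-way distance is half the round trip.
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance_m(round_trip_s):
    return C * round_trip_s / 2
```

A round trip of 200 ns, for instance, corresponds to an object roughly 30 m away, matching the ranges discussed below.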
In contrast, in scanning LIDAR systems, the illumination sources emit a continuous pulsed beam of light that scans across a scene to be illuminated. Mechanical actuators that move mirrors, lenses and/or other optical components may be used to move the beam around during scanning. Alternatively, a phased array may be used to scan the beam over the scene. A phased array is typically advantageous as there are fewer moving parts and accordingly a lower risk of mechanical failure of the system. In scanning LIDAR systems, time-of-flight measurements are also used to determine distance from the LIDAR system to the objects of a scene.
Typically, the power emitted by the illumination sources per flash of a flash LIDAR system is high relative to the power of the continuous scanning beam of a scanning LIDAR system. In scanning LIDAR systems, the power of the emitted light is typically lower than in flash LIDAR but may still need to be increased to less safe levels to achieve ranges of 30-40 meters as described above. Accordingly, by determining the distance to an object prior to emitting light from the LIDAR illumination sources, the power of both flash and scanning LIDAR systems can be decreased for lower ranges, thereby improving the safety of both types of systems.
Claims
1. A light detection and ranging, LIDAR, system comprising: a plurality of illumination sources, each illumination source being configured to illuminate a respective spatial volume; at least one detector configured to receive light emitted by the plurality of illumination sources and reflected by one or more objects within one or more of the respective spatial volumes; and a controller configured to: determine the distance of the one or more objects within each spatial volume from the LIDAR system on the basis of readings from the at least one detector; and adjust an intensity of each illumination source on the basis of the distance of an object within the respective spatial volume.
2. A LIDAR system according to claim 1, wherein each illumination source comprises a VCSEL and a lens.
3. A LIDAR system according to claim 2, wherein the lenses form a multi-lens array, MLA, and the VCSELs are arranged in a corresponding array.
4. A LIDAR system according to any preceding claim, wherein the intensity of each illumination source is adjusted on the basis of the distance of the closest object within each respective spatial volume.
5. A LIDAR system according to any preceding claim, wherein the LIDAR system further comprises a camera and an image recognition unit, wherein the image recognition unit is configured to assign a category to each object within each spatial volume, and the controller is configured to adjust the intensity of each illumination source on the basis of the distance of the closest object within a particular category or set of categories.
6. A LIDAR system according to any preceding claim, wherein the controller is configured to further adjust the intensity of each illumination source on the basis of the distance of one of the objects in adjacent spatial volumes.
7. A LIDAR system according to any preceding claim, further comprising a secondary distance measuring system, wherein the controller is configured to further adjust the intensity of each illumination source on the basis of the distance of one of the objects within the respective spatial volume as measured by the secondary distance measuring system.
8. A LIDAR system according to claim 7, wherein the secondary distance measuring system is a RADAR system.
9. A method of operating a LIDAR system, the LIDAR system comprising a plurality of illumination sources, each illumination source being configured to illuminate a respective spatial volume, and at least one detector configured to receive light emitted by the plurality of illumination sources and reflected by one or more objects within one or more of the respective spatial volumes, the method comprising: determining values of the distance of the one or more objects within one or more of the respective spatial volumes on the basis of measurements by the at least one detector; adjusting the output power of each illumination source on the basis of the distance of an object within the respective spatial volume.
10. A method according to claim 9, wherein the intensity of each illumination source is adjusted on the basis of the distance of the closest object within each respective spatial volume.
11. A method according to claim 9 or 10, and comprising using an image recognition unit to assign a category to each object within each spatial volume, and adjusting the intensity of each illumination source on the basis of the distance of the closest object within a particular category or set of categories.
12. A method according to any of claims 9 to 11 , and comprising further adjusting the intensity of each illumination source on the basis of the distance of one of the objects in adjacent spatial volumes.
13. A method according to any of claims 9 to 12, and comprising:
further measuring the distance of objects within each spatial volume using a secondary distance measuring system; and further adjusting the intensity of each illumination source on the basis of the distance of one of the objects within the respective spatial volume as measured by the secondary distance measuring system.
14. A method according to any of claims 9 to 13, wherein the secondary distance measuring system is a RADAR system.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201962951219P | 2019-12-20 | 2019-12-20 | |
US62/951,219 | 2019-12-20 | | |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021126082A1 true WO2021126082A1 (en) | 2021-06-24 |
Family
ID=73857240
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/SG2020/050751 WO2021126082A1 (en) | 2019-12-20 | 2020-12-16 | Eye-safe operation of lidar scanner |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2021126082A1 (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170307759A1 (en) * | 2016-04-26 | 2017-10-26 | Cepton Technologies, Inc. | Multi-Range Three-Dimensional Imaging Systems |
WO2018055449A2 (en) * | 2016-09-20 | 2018-03-29 | Innoviz Technologies Ltd. | Lidar systems and methods |
US20190162823A1 (en) * | 2017-11-27 | 2019-05-30 | Atieva, Inc. | Flash Lidar with Adaptive Illumination |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 20828694; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | EP: PCT application non-entry in European phase | Ref document number: 20828694; Country of ref document: EP; Kind code of ref document: A1 |