WO2023115104A1 - Spatial estimation system with controlled outgoing light - Google Patents

Spatial estimation system with controlled outgoing light

Info

Publication number
WO2023115104A1
Authority
WO
WIPO (PCT)
Prior art keywords
estimation system
power density
spatial estimation
outgoing light
light
Application number
PCT/AU2022/051489
Other languages
French (fr)
Inventor
Federico COLLARTE BONDY
Anton Lohr
Matthew TRUMAN
Original Assignee
Baraja Pty Ltd
Priority claimed from AU2021904278A0
Application filed by Baraja Pty Ltd
Publication of WO2023115104A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G01S17/89: Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/93: Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931: Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48: Details of systems according to group G01S17/00
    • G01S7/481: Constructional features, e.g. arrangements of optical elements
    • G01S7/4817: Constructional features, e.g. arrangements of optical elements, relating to scanning
    • G01S7/483: Details of pulse systems
    • G01S7/484: Transmitters
    • G01S7/486: Receivers
    • G01S7/4868: Controlling received signal intensity or exposure of sensor

Definitions

  • the present disclosure generally relates to a system and method for controlling outgoing light in a spatial estimation system.
  • light detection and ranging (LiDAR) systems have application in vehicle driver assistance and vehicle automation, including autonomous or semi-autonomous vehicles.
  • a LiDAR system may detect characteristics of an environment (whether it be land, sea or air) of a vehicle, enabling an automated response to the detected environment.
  • the detecting range, resolution and field of view (FOV) of the system depend on several variables, including the power of outgoing light.
  • a longer range and/or higher resolution and/or larger FOV is often desirable, but the maximum power of the outgoing light may be constrained, for example, for maintaining eye/skin safety, which limits the detecting range and/or resolution and/or FOV of the LiDAR system.
  • Embodiments of a spatial estimation system include at least one light source, configured to provide outgoing light of the spatial estimation system and optical components configured to direct the outgoing light into an environment across a field of view and receive incoming light from the environment, wherein the incoming light from the environment includes reflected outgoing light.
  • the spatial estimation system also includes a system to control the operation of the spatial estimation system (“control system”).
  • the control system is configured to cause the spatial estimation system to provide outgoing light according to a first scan pattern from a plurality of selectable scan patterns, to operate the spatial estimation system in a first power density mode, to receive first incoming signals representative of incoming light received by the optical components, including reflected outgoing light in accordance with the first scan pattern, and in response to an eye-safety relaxation trigger derived from the first incoming signals, cause the spatial estimation system to provide outgoing light according to a second scan pattern from the plurality of selectable scan patterns to operate the spatial estimation system in a second power density mode.
  • a power density of the outgoing light for the first power density mode may be lower than a power density of the outgoing light for the second power density mode.
  • control system is configured to cause the spatial estimation system to provide outgoing light according to a first scan pattern from a plurality of selectable scan patterns, to operate the spatial estimation system in a first power density mode, to receive first incoming signals representative of incoming light received by the optical components, including reflected outgoing light in accordance with the first scan pattern, and in response to an eye-safety restriction trigger derived from the first incoming signals, cause the spatial estimation system to provide outgoing light according to a second scan pattern from the plurality of selectable scan patterns to operate the spatial estimation system in a second power density mode.
  • a power density of the outgoing light for the second power density mode may be lower than a power density of the outgoing light for the first power density mode.
  • control system is configured to cause the spatial estimation system to operate according to both embodiments in the preceding paragraphs, for example acting in response to the eye-safety relaxation trigger and subsequently acting in response to the eye-safety restriction trigger.
  • the second power density mode of the first set of embodiments is the same as the first power density mode of the second set of embodiments and the first power density mode of the first set of embodiments is either the same as or different to the second power density mode of the second set of embodiments.
  • At least one of the eye-safety relaxation trigger and the eye-safety restriction trigger is variable, wherein the variability is based on at least one of a speed of travel (e.g. of a vehicle carrying the spatial estimation system) and a determination of object exposure to laser light.
  • a determination of the object exposure to laser light may be based on predetermined power density parameters associated with each scan pattern.
  • the methods utilise one or more triggers to change power density modes, for example one or both of an eye-safety relaxation trigger and an eye-safety restriction trigger.
  • Nontransient computer storage containing instructions for a control system of a spatial estimation system.
  • the instructions cause one or more processors of the control system to control the operation of the spatial estimation system.
  • the control may utilise one or more triggers to change power density modes, for example one or both of an eye-safety relaxation trigger and an eye-safety restriction trigger.
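As an illustration of the mode-switching behaviour summarised above, the following is a minimal sketch of trigger-driven control, including the non-transition period described later in this disclosure. All names (PowerDensityMode, ControlState, update_mode), the scan pattern identifiers and the one-second non-transition value are hypothetical, not taken from the disclosure.

```python
import time
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional

class PowerDensityMode(Enum):
    LOW = auto()   # scan patterns deemed eye/skin-safe anywhere within the FOV
    HIGH = auto()  # scan patterns potentially non-eye/skin-safe in some regions

@dataclass
class ControlState:
    mode: PowerDensityMode = PowerDensityMode.LOW
    scan_pattern_id: int = 0
    last_restriction_time: float = float("-inf")
    non_transition_period_s: float = 1.0  # e.g. a predetermined time of 0.1-10 s

def update_mode(state: ControlState, relaxation_trigger: bool,
                restriction_trigger: bool, now: Optional[float] = None) -> ControlState:
    """Switch power density modes in response to eye-safety triggers."""
    now = time.monotonic() if now is None else now
    if restriction_trigger and state.mode is PowerDensityMode.HIGH:
        state.mode = PowerDensityMode.LOW
        state.scan_pattern_id = 0  # e.g. a defoveated, substantially uniform pattern
        state.last_restriction_time = now
    elif (relaxation_trigger and state.mode is PowerDensityMode.LOW
          and now - state.last_restriction_time > state.non_transition_period_s):
        state.mode = PowerDensityMode.HIGH
        state.scan_pattern_id = 1  # e.g. a foveated pattern for a region of interest
    return state
```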
  • Figure 1 illustrates an example arrangement of a spatial estimation system.
  • Figure 1 A illustrates an example arrangement of a control system in a spatial estimation system according to some embodiments of the present disclosure.
  • Figure 2 illustrates a process for applying a scan pattern to a scan of a spatial estimation system according to some embodiments of the present disclosure.
  • Figure 3A illustrates some exemplary scan patterns each with substantially uniform point density across a FOV.
  • Figure 3B illustrates some exemplary scan patterns each with non-uniform point density across a FOV.
  • Figure 4 illustrates a process for determining exposure of an object to outgoing light from a spatial estimation system according to some embodiments of the present disclosure.
  • Light hereinafter includes electromagnetic radiation having optical frequencies, including far-infrared radiation, infrared radiation, visible radiation and ultraviolet radiation.
  • a light-based spatial estimation system may be referred to as a LiDAR system.
  • LiDAR involves transmitting light into the environment and subsequently detecting the light returned by the environment. By determining the time it takes for the light to make a round trip to and from, and hence the distance of, reflecting surfaces within a FOV, a spatial estimation of the environment may be formed. Alternatively or additionally, the distance of reflecting surfaces within the FOV may be determined using frequency-modulated continuous wave (FMCW) techniques. Examples of LiDAR range detection, including examples using FMCW techniques, are discussed in international patent application no.
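For reference, the pulsed time-of-flight range computation mentioned here reduces to a standard relation (not specific to this disclosure): with round-trip time $\Delta t$ and speed of light $c$,

$$ d = \frac{c\,\Delta t}{2} $$

where $d$ is the distance to the reflecting surface.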
  • one of the dimensions relates to the range of a point from the origin of the optical beam, whereas the other two dimensions relate to the two dimensional space (e.g. in Cartesian (x, y) or polar (theta, phi) coordinates) the optical beam is steered across.
  • the range of the point in the environment represents a primary variable of the environment for measurement.
  • the other two dimensions extend across a FOV of the three-dimensional mapping system.
  • Some LiDAR systems scan one or more optical beams across an environment.
  • Two significant performance variables of these LiDAR systems include (a) the frame rate or time within which a scan of a portion of the FOV is completed (i.e. temporal resolution) and (b) the resolution or number of pixels across or within a portion of the FOV (i.e. spatial resolution).
  • one or both of the spatial resolution and the temporal resolution can be affected by changing or adjusting the FOV.
  • one scan may be completed across a first FOV for the system and a subsequent scan may be completed across a second FOV, different to the first FOV.
  • the second FOV is a part of the first FOV.
  • the second FOV is larger than the first FOV.
  • the first and the second FOVs may overlap.
  • the LiDAR system may, in a further subsequent scan, be configured to return to scanning across the first FOV or to scan across a third FOV that is different from the first and second FOVs.
  • one or both of the spatial resolution within a portion of the FOV and the temporal resolution can be affected by changing the point density, or in other words changing the number of locations within a portion of the FOV at which a range measurement is performed by the LiDAR system.
  • some LiDAR systems incorporate beam directors that direct outgoing light in different directions based on the wavelength of the outgoing light.
  • a LiDAR system may include one or more wavelength tunable lasers, which scan through one or more ranges of wavelength channels to effect a corresponding scan of the outgoing light within a FOV. Examples of wavelength-steerable LiDAR systems are described in international patent application no. PCT/AU2016/050899 (published as WO 2017/054036 A1).
  • the point density can be changed by changing the number of optical pulses or other optical ranging signals per scan and/or by configuring the wavelength channel of the optical pulses or other optical ranging signals so that more (or fewer) pulses or ranging signals are within a first set of one or more wavelength ranges and fewer (or more) pulses or ranging signals are within a second set of one or more wavelength ranges; the wavelength range(s) in the second set being different to the wavelength range(s) in the first set.
  • the temporal resolution may be affected by the tuning speed of the wavelength-tunable light source.
  • the FOV and/or point density can be changed by changing the number of optical pulses or other optical ranging signals per scan and/or by adjusting the steering rate and/or steering angle of one or more of the mechanical steering components. For instance, if the mechanical steering components rotate in order to direct light in different directions, a change in the rotation rate can effect a corresponding change in the temporal resolution and may also effect a corresponding change in the spatial resolution.
  • a change in the rotation angle of the one or more mechanical steering components can effect a corresponding change in the spatial resolution and may also effect a corresponding change in the temporal resolution.
  • either or both the mechanical components and components affecting the wavelength-based steering may be controlled for changing the FOV and/or point density.
  • Examples of a LiDAR system with both wavelength and mechanical based steering are described in the applicant’s international patent application nos. PCT/AU2017/051395 (published as WO 2018/107237 A1) and PCT/AU2019/050437 (published as WO 2019/241825 A1). The content of both of these international patent applications is incorporated into this disclosure by reference.
  • Foveation in the context of a LiDAR system refers, at least herein, to the ability to be controlled to exhibit differential temporal resolution and/or to exhibit differential spatial resolution in different regions of the FOV. While foveation may be used to increase spatial resolution at one or more regions of interest (increased foveation), potentially at the expense of other region(s) (decreased foveation), it also increases the spatial power density in the one or more regions of interest (other relevant variables, for example the intensity of the outgoing light, remaining constant). The degree of foveation may affect how densely the points are distributed within a particular area of the FOV. The degree of foveation may therefore be limited by the maximum spatial power density for maintaining eye/skin safety, for example according to relevant Laser Safety Standards.
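To make the link between foveation and spatial power density concrete, the sketch below estimates the average power density delivered to a region of the FOV and the largest point-density multiplier that keeps that region under a cap. This is an illustrative model only: the function names and the single per-pulse energy are assumptions, and a real system would apply the applicable Laser Safety Standard rather than this simplification.

```python
def regional_power_density(points_per_frame: int, pulse_energy_j: float,
                           region_area_m2: float, frame_time_s: float) -> float:
    """Average optical power density (W/m^2) over a region of the FOV,
    assuming each scan point in the region delivers one pulse of equal energy."""
    return points_per_frame * pulse_energy_j / (region_area_m2 * frame_time_s)

def max_foveation_factor(baseline_points: int, pulse_energy_j: float,
                         region_area_m2: float, frame_time_s: float,
                         cap_w_per_m2: float = 1000.0) -> float:
    """Largest point-density multiplier keeping a region at or below the cap
    (1000 W/m^2 echoes the 10-second MPE example quoted later)."""
    baseline = regional_power_density(baseline_points, pulse_energy_j,
                                      region_area_m2, frame_time_s)
    return cap_w_per_m2 / baseline
```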
  • the LiDAR system is configured with a plurality of selectable scan patterns.
  • the plurality of selectable scan patterns are utilised by the LiDAR system to control the optical power and/or number of points and/or distribution of the points.
  • the plurality of selectable scan patterns may be formed based on eye-safety requirements.
  • the optical power and/or the point density across one or two dimensions of a FOV may be specified by, or controlled according to, entries in the plurality of selectable scan patterns.
  • Figure 1 illustrates an example arrangement of a spatial estimation system 100.
  • connections between the components may be electrical connections (e.g. analogue or digital signals) or optical connections (e.g. guided or free space optical transmission).
  • the blocks represent functional components of the spatial estimation system 100.
  • a light source may include an integrated amplifier and/or an integrated modulator.
  • the system 100 includes a light transmitter 101 for generating outgoing light.
  • the light transmitter 101 may include a light source 102.
  • the light source 102 includes a wavelength-tunable light source, such as a wavelength-tunable laser diode, providing light of a tunable wavelength.
  • the wavelength may be based on one or more electrical currents (e.g. the injection current into the one or more wavelength tuning elements in the laser cavity) applied to the laser diode.
  • the light source 102 accordingly is configured to provide outgoing light at a selected one or more of the multiple wavelength channels (each represented by its respective centre wavelength λ1, λ2, ... λn).
  • the light source 102 may include a single tunable laser or more than one tunable laser (or other types of lasers). The light source 102 may select one wavelength channel at a time or may simultaneously provide two or more different selected wavelength channels (i.e. channels with different centre wavelengths). In another example, the light source 102 may include a broadband light source and one or more tunable spectral filters to provide substantially continuous-wave (CW) light intensity at the selected wavelength(s). In another example, the light source 102 includes multiple laser diodes, each being wavelength-tunable over a respective range and whose respective outputs are combined to form a single output. The respective outputs may be combined using a wavelength combiner, such as an optical splitter or an arrayed waveguide grating (AWG).
  • the light source 102 is configured to provide the outgoing light to include at least one time-varying profile at the selected one or more of the multiple wavelength channels.
  • the time-varying profile may be used in determining the round trip time of the light.
  • the light source 102 includes a modulator (not shown) for imparting a time-varying profile on the outgoing light. This modulation may be in addition to any wavelength tuning as herein before described. In other words, the modulation would be of light at the tuned wavelength. It will be appreciated that the tuned wavelength may refer to a centre frequency or other measure of a wavelength channel that is generated.
  • the time-varying profile may, for example, be one or more of a variation in intensity, frequency, or phase imparted to the outgoing light.
  • the light source 102 emits pulses of light, which pulses may include the time-varying profile.
  • the difference between the presence of a pulse and the absence of a pulse is a time-varying profile for use in determining the round trip time of light.
  • the outgoing light from the light source 102 has a different form, for example individual pulses for detecting the round trip time of the individual pulses instead of detecting the round trip time of a series of modulated pulses.
  • the light transmitter 101 also includes an optical amplifier 104 to amplify (providing gain to) light from the light source 102.
  • the optical amplifier 104 is an Erbium-doped fibre amplifier (EDFA) of one or more stages.
  • other examples of the optical amplifier 104 include a semiconductor optical amplifier (SOA), a booster optical amplifier (BOA), and a solid state amplifier (e.g. a Nd:YAG amplifier).
  • the modulator may be located either before or after the optical amplifier 104 in the outgoing light path.
  • the optical amplifier 104 may be omitted.
  • the light transmitter 101 includes a sampled-grating distributed Bragg reflector (SG-DBR) laser.
  • the SG-DBR laser may be controllable to provide 10 Gbps modulation, may operate across a 35 nm wavelength range and change from one wavelength to another in less than 100 nanoseconds.
  • the wavelengths may have centre frequencies about 20 MHz or more apart.
  • the outgoing light from the light transmitter 101 is received by transmission (Tx) optics for directing light to a beam director 105.
  • the transmission optics may also condition the light, for example, by including one or more collimators to form one or more beams of light.
  • the transmission optics form a light transceiver 103, configured to both provide outgoing light to the beam director 105 and receive collected incoming light from the beam director 105.
  • the outgoing and incoming light paths may be separate, in whole or in part.
  • the receiver aperture and associated components (i.e. reception (Rx) optics) for receiving the incoming light may not form part of the transmission optics for outgoing light (not shown).
  • the beam director 105 functions to direct light over one or two dimensions into the environment to be estimated by the spatial estimation system 100.
  • An example mechanical based beam director includes one or more rotating mirrors.
  • the beam director 105 may be wavelength-based and may include dispersive or refractive elements to direct light of different wavelengths in different directions.
  • the beam director 105 includes a combination of light directing components selected from diffractive, refractive and reflective components.
  • the light source 102 is a tunable light source to provide light at different wavelengths at different times and the beam director is wavelength dependent, for example configured with diffractive and/or refractive components to direct the different wavelengths in different directions.
  • the diffractive and/or refractive components are used in combination with a mechanical-based beam director, whereby the combined steering by the components directs the light over the FOV of spatial estimation system 100.
  • the direction is solely by mechanical components or solely by wavelength dependent components, to provide light across one or two dimensions.
  • the scanning may be linear (e.g. like a raster scan) or non-linear.
  • the beam director 105 includes bidirectional components, whereby both the outgoing light to the environment and incoming light from the environment traverse substantially the same path through the beam director 105, in opposite directions.
  • Figure 1 illustrates a bidirectional example for the optical signal traversing the beam director 105 and represents the directionality with a bidirectional arrow. Examples of beam directors are described in the incorporated international patent application nos. PCT/AU2017/051395 (published as WO 2018/107237 A1) and PCT/AU2019/050437 (published as WO 2019/241825 A1).
  • the beam director 105 directs the incoming light to the reception optics (e.g. the light transceiver), which collects the light and passes it to a light receiver 109.
  • the light receiver 109 may include a light detector circuitry 106.
  • the light detector circuitry 106 includes one or more photodetectors.
  • An example photodetector is an avalanche photodiode (APD).
  • the light detector circuitry 106 generates incoming electrical signals that are representative of the detected incoming light.
  • the light detector circuitry 106 may include a trans-impedance amplifier following the APD.
  • the light receiver 109 may also include an analog-to-digital converter (ADC) 108 following the light detector circuitry 106.
  • the analog-to-digital converter 108 may convert analog incoming electrical signals to digital incoming electrical signals.
  • the incoming digital signals are received and processed by a control system 110.
  • the light source 102 and the amplifier 104 may be also controlled by the control system 110.
  • one or both of the wavelength tunable laser (e.g. its wavelength) and the modulator (e.g. the modulating waveform) of the light source 102 may be controlled by the control system 110.
  • the control system 110 may determine a round trip time for the light based on its control or knowledge of the outgoing light and based on the incoming light signals.
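One conventional way to recover the round trip time from the digitised incoming signals is to cross-correlate them against the known time-varying profile of the outgoing light. The sketch below assumes an intensity-modulated profile and real-valued samples; an FMCW system would instead extract a beat frequency. The function name and sampling rate are illustrative.

```python
import numpy as np

def estimate_round_trip_time(outgoing: np.ndarray, incoming: np.ndarray,
                             sample_rate_hz: float) -> float:
    """Matched-filter estimate: the lag at which the incoming signal best
    matches the known time-varying profile of the outgoing light."""
    corr = np.correlate(incoming, outgoing, mode="full")
    lag_samples = int(np.argmax(corr)) - (len(outgoing) - 1)
    return max(lag_samples, 0) / sample_rate_hz

# A reflector at 300 m sampled at 1 GS/s appears as a lag of about
# 2000 samples, i.e. a 2 microsecond round trip.
```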
  • the control system 110 may control beam steering directions of the beam director 105, through e.g. wavelength-based steering and/or mechanical steering, for example for effecting a scan pattern.
  • control system 110 may directly control the beam director 105 via an electrical signal.
  • an electrical signal to the beam director 105 is not required.
  • the control system 110 may include an application specific device configured to perform the operations described herein, such as a configured programmable logic device, or a general purpose computing device with computer readable memory storing instructions to cause the computing device to perform the operations.
  • the instructions and/or data for controlling operation of the processing unit may be in whole or in part implemented by firmware or hardware elements, including configured logic gates in one or more integrated circuit devices.
  • the control system 110 may include, for example, a single computer processing device (e.g. a central processing unit, graphics processing unit, or other computational device), or may include a plurality of computer processing devices.
  • the control system 110 may also include a communications bus in data communication with one or more machine readable storage (memory) devices which store instructions and/or data for controlling aspects of the operation of the processing unit.
  • the memory devices may include system memory (e.g. a BIOS), volatile memory (e.g. random access memory) and non-volatile memory.
  • control system 110 includes one or more interfaces.
  • the interfaces may include a control interface with the light transmitter 101 and a communication interface with the light receiver 109.
  • the control system 110 may also receive and process external information that is not derived from the spatial estimation system 100.
  • light from the light source 102 is also provided to the light detector circuitry 106 to provide a reference signal via a light path (not shown) from the light source 102 to the light detector circuitry 106.
  • the light from the light source 102 may enter a sampler (e.g. a 90/10 fibre-optic coupler), where a majority portion (e.g. 90%) of the light is provided to the transceiver 103 and the remaining sample portion (e.g. 10%) of the light is provided instead to the light detector circuitry 106.
  • the light detector circuitry 106 may then be configured to inhibit detection of non-reflected light based on a difference in wavelength or modulation between the outgoing light and the non-reflected light.
  • the light detector circuitry 106 includes one or more balanced detectors to coherently detect the reflected light mixed with reference light at the one or more balanced detectors.
  • the light receiver 109 may perform, for example, homodyne or heterodyne detection of the incoming light.
  • the spatial estimation system 100 separates the functional components into two main physical units, i.e. an engine 111 and a sensor head 107.
  • the engine 111 and the sensor head 107 are substantially collocated. The collocation allows these components to be compactly packed within a single unit or in a single housing.
  • the sensor head 107 is remote from the engine 111.
  • the engine 111 is optically coupled to the remote sensor head 107 via one or more waveguides, such as optical fibres.
  • a spatial estimation system 100 may include a single engine 111 and multiple sensor heads. Each of the multiple sensor heads may be optically coupled to the engine 111 via respective waveguides. The multiple sensor heads may be placed at different locations and/or orientated with different FOVs.
  • LiDAR systems can be controlled to adjust the FOV, for example, to provide for effective control over temporal and/or spatial resolution.
  • This control may provide a more effective and/or safer spatial estimation system at least in certain applications.
  • spatial estimation systems are used for autonomous vehicles with an ability to increase temporal and/or spatial resolution in a particular region of the FOV (i.e. degree of foveation) and/or to increase the detection range while still maintaining practical eye/skin safety.
  • Figure 1 A generally illustrates an example arrangement of a control system for a spatial estimation system, for controlling and applying a scan pattern to a scan of the spatial estimation system.
  • the control system may be the control system 110 of Figure 1 and the following description is given with reference to that example.
  • the light receiver 109 provides incoming electrical signals 1008, representative of the incoming optical pulses 1006, to the control system 110 for processing.
  • the incoming electrical signals 1008 include at least object distance information.
  • the control system 110 includes an object distance determination module 1003 for determining the object distance information from the incoming electrical signals 1008.
  • control system 110 also includes a speed determination module 1005 for determining the speed information of the entity (e.g. vehicle) where the spatial estimation system 100 is located.
  • the speed determination may be based on the incoming electrical signals 1008, either directly, or based on distance determinations over time by the object distance determination module 1003. Additionally or alternatively, the speed information is obtained from an externally received signal 1014.
  • At least one of the processed signals indicative of object distance information and vehicle speed (1010 and 1012, respectively) is fed into a processing unit 1001 to generate one or more control signals 1002 corresponding to one of multiple scan patterns.
  • the one or more control signals 1002 are then applied to the light transmitter 101 for generating outgoing optical pulses 1004 according to a selected scan pattern.
  • the one or more control signals may also be applied to the beam director 105 for controlling wavelength-based steering and/or mechanical steering of the beam director 105 in accordance with the selected scan pattern.
  • FIG. 2 illustrates a process 200 for applying a scan pattern to a scan of a spatial estimation system, according to some embodiments of the present disclosure.
  • the process 200 may be implemented for or by a control system, which may include one or more application specific devices configured to perform the operations of the process 200, or one or more general purpose computing devices with associated memory containing instructions to cause the computing device(s) to perform the operations of the process 200.
  • step 201 may be implemented for the control system 110, as part of a manufacturing or configuration process and steps 203 to 205 may be implemented by the control system 110.
  • the following description is with reference to that example.
  • a plurality of selectable scan patterns of different spatial and/or temporal distribution, for use in operating a spatial estimation system, are stored in or otherwise made accessible to the control system 110.
  • the plurality of selectable scan patterns may be stored in the memory devices of or in data communication with the control system 110, for example, in non-volatile memory and/or in volatile memory.
  • the plurality of selectable scan patterns may include one or more regions or lines of substantially uniform point density across the FOV.
  • Figure 3A illustrates some examples of the scan patterns 301, 303 and 305 with substantially uniform point density across a FOV.
  • the scan pattern 301 has 32 pixels (or points) vertically and 32 pixels horizontally over the FOV with a uniform (or even) density distribution.
  • the scan pattern 303 has 64 pixels along the vertical axis and 16 pixels along the horizontal axis with a uniform density distribution.
  • the scan pattern 305 has 128 pixels along the vertical axis and 8 pixels along the horizontal axis with a uniform density distribution.
  • the plurality of selectable scan patterns may include one or more non-uniform point density distributions across the same or different FOV, for example with a relatively increased point density within one or more sub-regions of the FOV, optionally with a reduced point density outside of the sub-region(s).
  • Figure 3B illustrates some examples of the scan patterns 302, 304, 306, 308, 310, 312, and 314 with non-uniform point density.
  • the four scan patterns 302, 304, 306 and 308 represent scan patterns with constant vertical FOV at different levels of compression along the vertical axis; the level of compression increases left to right between scan patterns 302 and 308, i.e. the scan pattern 302 illustrates the least compressed scan pattern while the scan pattern 308 illustrates the most compressed scan pattern.
  • the three scan patterns 310, 312 and 314 represent scan patterns with reduced vertical FOV, the FOV reducing left to right in Figure 3B.
  • while Figure 3B only illustrates examples of scan patterns with non-uniform point density distribution along the vertical axis, it will be appreciated that the non-uniform point density distribution of a scan pattern may also be applied to the horizontal axis or to both the vertical and horizontal axes.
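The patterns of Figures 3A and 3B can be described compactly as point grids over a normalised FOV. The sketch below generates a uniform grid and a vertically compressed grid with constant vertical FOV; the compression exponent is a hypothetical parameter chosen only to reproduce the crowding of points toward a line of interest seen in patterns 302-308.

```python
import numpy as np

def uniform_pattern(n_v: int = 32, n_h: int = 32) -> np.ndarray:
    """Substantially uniform point density across a normalised FOV
    (cf. scan patterns 301, 303 and 305)."""
    v = np.linspace(-1.0, 1.0, n_v)
    h = np.linspace(-1.0, 1.0, n_h)
    return np.stack(np.meshgrid(h, v), axis=-1)  # shape (n_v, n_h, 2)

def compressed_pattern(n_v: int = 64, n_h: int = 16,
                       compression: float = 2.0) -> np.ndarray:
    """Constant vertical FOV with points compressed toward the centre line
    (cf. scan patterns 302-308); larger exponents compress more."""
    u = np.linspace(-1.0, 1.0, n_v)
    v = np.sign(u) * np.abs(u) ** compression  # endpoints remain at +/-1
    h = np.linspace(-1.0, 1.0, n_h)
    return np.stack(np.meshgrid(h, v), axis=-1)
```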
  • one of the stored scan patterns (i.e. a first scan pattern) is selected to operate the spatial estimation system in a first power density mode for object distance measurement.
  • the first scan pattern is utilised by the spatial estimation system for vehicle speed estimation.
  • a vehicle speed estimation by the spatial estimation system may be of particular utility when the spatial estimation system is used for vehicles, including but not limited to autonomous or semi-autonomous vehicles.
  • the first scan pattern may be used solely for speed estimation or may also be used for spatial estimation of the environment.
  • the first scan pattern may be used alone, or together with another source of speed estimation, for example an indication of speed received in the signal 1014.
  • speed may be estimated or determined or received without reference to the first scan pattern, for example based solely on signal 1014.
  • the spatial estimation may be an initial estimation, which in some embodiments is used, in combination with the speed estimation or determination, as an input to select a second scan pattern (e.g. in step 205 described herein).
  • Adjustment of power density may be achieved by adjusting one or more parameters that affect the power density.
  • Example parameters include the output power of the light transmitter 101, the spatial resolution of the scan pattern, and the temporal resolution of the scan pattern.
  • the combination of the power density parameters may be predetermined for each of the stored scan patterns. It will be appreciated that the predetermined combination of power density parameters facilitates predetermination of (a) exposure power at any given region of interest within the FOV and/or (b) exposure energy over any given time period.
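A look-up of predetermined power density parameters might resemble the following sketch. The table entries and field names are hypothetical; the point is that exposure power and exposure energy for a region can be read from stored parameters rather than measured at run time.

```python
# Hypothetical predetermined power density parameters per stored scan pattern.
SCAN_PATTERN_PARAMS = {
    "uniform_32x32":  {"tx_power_w": 0.5, "frame_time_s": 0.10},
    "foveated_64x16": {"tx_power_w": 1.0, "frame_time_s": 0.10},
}

def exposure_power_w(pattern_id: str, region_fraction: float) -> float:
    """Predetermined exposure power into a region receiving the given
    fraction of the scan pattern's output (look-up, not measurement)."""
    return SCAN_PATTERN_PARAMS[pattern_id]["tx_power_w"] * region_fraction

def exposure_energy_j(pattern_id: str, region_fraction: float,
                      duration_s: float) -> float:
    """Exposure energy accumulated over a time period, for comparison
    against a threshold energy derived from an MPE."""
    return exposure_power_w(pattern_id, region_fraction) * duration_s
```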
  • the first power density mode at step 203 is a low power density mode, which refers to an operating mode of the spatial estimation system with a selected scan pattern that is deemed eye/skin-safe anywhere within the FOV.
  • a high power density mode refers to any other operating mode of the spatial estimation system, including in particular any operating mode with a selected scan pattern that is deemed potentially non-eye/skin-safe in at least some region within the FOV.
  • Some of the scan patterns stored at step 201 may be designated for use only when operating the spatial estimation system in a low power density mode.
  • some of the scan patterns stored at step 201 may be designated for use only when operating the spatial estimation system in a high power density mode.
  • scanning of the environment is initiated with the first scan pattern, operating in a low power density mode. Scanning with the first scan pattern is performed for object distance measurement, and optionally for object identification and/or speed determination.
  • a second scan pattern is selected from the stored scan patterns to operate the spatial estimation system in a second power density mode.
  • the second power density mode at this step may be a high power density mode.
  • the selection may be responsive to an eye-safety-relaxation trigger.
  • the second scan pattern may be selected to increase foveation in at least part of the FOV.
  • the eye-safety-relaxation trigger is a determination that there is a lack of objects detected at or within a threshold distance.
  • the detected object(s) may be identified and linked to a class label, such as a person, fauna, car, flora, etc., using one or more advanced perception algorithms.
  • the eye-safety-relaxation trigger may be a determination that there is no presence of particular object classes e.g. humans or animals at or within a threshold distance.
  • the eye-safety relaxation trigger is dependent on a speed, for example the speed estimated, determined or received in step 203.
  • the threshold distance for detection of objects is dependent on the speed of the entity on or in which the spatial estimation system is located (e.g. vehicle).
  • the threshold distance is continuously variable based on the vehicle speed.
  • the threshold distance is discretely variable based on the vehicle speed. For example, the threshold distance may vary based on whether the vehicle speed is above or below a threshold speed.
  • the threshold speed may be somewhere in the range of 10-100 km/h, such as 10, 20, 30, 40, 50, 60, 70, 80, 90 or 100 km/h.
  • when the vehicle speed is below the threshold speed, the threshold distance is a first, shorter distance (e.g. 10 cm), whereas when the vehicle speed is at or above the threshold speed, the threshold distance is a second, larger distance (e.g. 2 m).
  • the eye-safety-relaxation trigger may be a determination that the vehicle speed is above a threshold speed.
  • the threshold speed may be the sole variable for the eye-safety-relaxation trigger. For example, when the vehicle speed is above the threshold speed, it is considered unlikely that an eye iris will be at a 10 cm distance from the scanning vertex of the spatial estimation system located on or in the vehicle.
  • there may be more than one eye-safety-relaxation trigger with one trigger operating in some conditions and one or more other triggers operating in other conditions.
  • for example, below a threshold speed the eye-safety-relaxation trigger may be based on whether a detected object, or a detected object within one or more particular classes, is within a threshold distance, and when the speed determination is above that threshold speed the eye-safety-relaxation trigger is based on speed alone.
  • the vehicle speed may be determined or received from a vehicle information system (e.g. a Controller Area Network (CAN) bus). Additionally or alternatively, the vehicle speed is estimated or determined, by the speed determination module 1005 of the control system 110, based on information from the spatial estimation system, for example by calculating the speed at which the spatial estimation system approaches a static object detected in the environment, such as a tree, a bridge, a building, road markers, or a road sign.
  • alternatively or additionally, the eye-safety-relaxation trigger may be based on a determination of location or on one or more environmental conditions. In applications where spatial estimation systems are used for autonomous vehicles, an example environmental condition is a road condition.
  • the eye-safety-relaxation trigger is based on a determination that the vehicle is driving on a highway or in a non-pedestrian/bicyclist area. It will be appreciated that this approach may be considered eye safe because it is unlikely that people (or any other creatures susceptible to laser eye/skin-safety) will be present within that environment.
  • the spatial estimation system may determine its location based on geolocation technologies, such as the global positioning system (GPS) and/or based on object recognition (e.g. road sign recognition).
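Pulling the above conditions together, a speed-dependent eye-safety-relaxation trigger could be sketched as follows. The 10 cm and 2 m distances are the examples given above, but the 40 km/h threshold speed (one point in the stated 10-100 km/h range), the function names and the discrete two-level scheme are assumptions; the eye-safety-restriction trigger described below would mirror this logic with its own threshold distance.

```python
from typing import Optional

def threshold_distance_m(speed_kmh: float, threshold_speed_kmh: float = 40.0) -> float:
    """Discretely variable threshold distance: shorter below the threshold
    speed (e.g. 10 cm), larger at or above it (e.g. 2 m)."""
    return 0.10 if speed_kmh < threshold_speed_kmh else 2.0

def eye_safety_relaxation_trigger(nearest_object_m: Optional[float],
                                  speed_kmh: float,
                                  threshold_speed_kmh: float = 40.0) -> bool:
    """True when no object (of a relevant class) is detected at or within the
    speed-dependent threshold distance; above the threshold speed the trigger
    may be based on speed alone (an eye at 10 cm is considered unlikely)."""
    if speed_kmh >= threshold_speed_kmh:
        return True
    return (nearest_object_m is None
            or nearest_object_m > threshold_distance_m(speed_kmh, threshold_speed_kmh))
```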
  • the spatial estimation system transitions from a low power density mode to a high power density mode in response to an eye-safety relaxation trigger, and in the high power density mode the system uses a scan pattern (the second scan pattern of step 205) to increase foveation in at least part of the FOV, such as along or near a detected horizon.
  • a third scan pattern is selected from the stored scan patterns to operate the spatial estimation system in a third power density mode.
  • the third power density mode at this step may be a low power density mode.
  • the selection may be responsive to a determination of an eye-safety-restriction trigger.
  • the third scan pattern may be selected to decrease foveation in at least part of the FOV.
  • decreasing foveation means foveating at a reduced level.
  • decreasing foveation means completely defoveating (e.g. using a scan with substantially uniform point density across a FOV).
  • decreasing foveation means lowering the optical power of the outgoing light, completely turning off the light transmitter 101, or a combination of both.
  • lowering the optical power of the outgoing light may be achieved by reducing the intensity of the outgoing light and/or by reducing the duration of the pulses of outgoing light. It will be appreciated that the third scan pattern may be the same as or different to the first scan pattern and that the third power density mode may be the same as or different to the first power density mode.
  • the eye-safety-restriction trigger may be a determination that there are one or more objects detected within a threshold distance.
  • This threshold distance may be the same as or different to (e.g. less than) the threshold distance for the eye-safety-relaxation trigger.
  • the detected object(s) may be identified and linked to a class label using one or more advanced perception algorithms, as described herein.
  • the eye-safety-restriction trigger may be a determination that there is one or more objects of one or more particular classes within a threshold distance.
  • the threshold distance may be dependent on the speed of the entity on or in which the spatial estimation system is located.
  • the threshold distance may be continuously or discretely variable based on the vehicle speed.
  • the eye-safety-restriction trigger may be a determination that the vehicle speed is below a threshold speed, at least because it is then considered likely that an eye iris may be at a 10 cm distance from the scanning vertex of the spatial estimation system located on or in the vehicle.
  • the object distance or threshold distance may be irrelevant.
  • the vehicle speed may be obtained from the vehicle information system and/or from the spatial estimation system.
  • the eye-safety-restriction trigger may be based on a determination of the environment condition, as described with reference to the eye-safety-relaxation trigger.
  • the environment condition may be a road condition.
  • the spatial estimation system transitions from a high power density mode to a low power density mode in response to an eye-safety restriction trigger by using a third scan pattern that decreases foveation in at least part of the FOV.
  • MPE: maximum permissible exposure
  • the MPE is related to the operating wavelength and the optical power.
  • the MPEs are set by the International Commission on Non-Ionizing Radiation Protection (ICNIRP); they are internationally accepted and are also adopted by standardisation committees such as IEC TC 76 and ANSI for the respective laser safety standards IEC 60825-1 and ANSI Z136.1.
  • the MPE over a 10-second duration exposure is 1000 W/m².
  • the Accessible Emission Limit (expressed in W or J) for 1550 nm Class 1 laser products is based on the measurement of average power over 10 s in an eye iris with a diameter of 3.5 mm at a distance of 10 cm according to IEC 60825-1.
  • the Accessible Emission Limit for a spatial estimation system operating at a 1550 nm wavelength is 10 mW over a 10-second duration exposure.
  • the spatial estimation system may be configured with a response time for the transition from the high power density mode to the low power density mode.
  • the response time may be based on an MPE or Accessible Emission Limit for a given laser class.
  • the response time after the eye- safety- restriction trigger occurs may be set such that the MPE of the corresponding spatial estimation system for maintaining eye/skin safety is not exceeded.
  • the maximum response time after the eye-safety-restriction trigger occurs may be set such that the resulting exposure is not greater than a fraction of the MPE, e.g. 50%, 25% or 10% of the MPE of the corresponding spatial estimation system.
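As a worked example of sizing the response time: the Class 1 figures quoted above correspond to an energy budget of 10 mW × 10 s = 100 mJ through the measurement aperture. A sketch under that assumption (the names and the 50% safety fraction are illustrative):

```python
def max_response_time_s(power_into_aperture_w: float,
                        limit_power_w: float = 0.010,  # 10 mW averaged over 10 s
                        window_s: float = 10.0,
                        safety_fraction: float = 0.5) -> float:
    """Longest tolerable delay between the eye-safety-restriction trigger and
    the transition to the low power density mode such that the delivered
    energy stays at or below safety_fraction of the averaged-power limit."""
    energy_budget_j = limit_power_w * window_s * safety_fraction
    return energy_budget_j / power_into_aperture_w

# e.g. 1 W into the aperture with a 50% budget allows at most 50 ms.
```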
  • the spatial estimation system may be configured with a non-transition period for preventing a further transition from the low power density mode to another higher power density mode.
  • the non-transition period prevents oscillatory switching between modes (i.e. it provides hysteresis).
  • the non-transition period may be a predetermined time between 0.1 and 10 seconds.
  • detection of an object may include or be followed by determining object exposure to the outgoing light from the spatial estimation system.
  • a region of interest may include a region of the FOV occupied by a detected object (i.e. an object region).
  • the exposure of the object to the outgoing light may be determined based on predetermined power density parameters stored for each of the scan patterns.
  • the object exposure may be inferred from a particular set of predetermined power density parameters, such as via a look-up process, rather than from run-time measurement of power density parameters.
  • detection of an object may include determining that the power density, based on the predetermined power density parameters of the selected scan pattern, integrated over the object region exceeds a threshold power.
  • detection of an object may include determining that the power density, based on the predetermined power density parameters of the selected scan pattern, integrated over the object region and accumulated over a predetermined time period, exceeds a threshold energy.
  • the predetermined time period may be based on a resulting exposure not greater than a fraction of the MPE, e.g. 75%, 50% or 25% of the MPE.
  • the eye-safety-restriction trigger is not activated if the determined object exposure does not exceed the threshold power and/or threshold energy.
  • the above determinations based on threshold power and/or threshold energy can be realised by counting the number of returned pulses per unit measure detected at or below a threshold distance according to a process 400 as illustrated in Figure 4.
  • a new distance measurement of a point is received at step 401 by the spatial estimation system scanning an environment. If the outgoing light pulses of a selected scan pattern are directed at an object in the environment, at least part of the outgoing light pulses may be reflected, e.g. scattered, by the object back to the spatial estimation system as incoming light pulses.
  • the object distance to the spatial estimation system is determined.
  • the process 400 involves counting the number of returned pulses over a detection frame.
  • a detection frame is the FOV of the spatial estimation system, or a region of the FOV, for process 400 to determine whether an eye-safety restriction/relaxation trigger is activated.
  • the detection frame is instead or further based on a fixed time window. For example, counting for a new detection frame may commence every 10 seconds or a smaller time interval.
  • the reckoning of time may be based on a clock signal or based on another parameter, for example a number of scans of a FOV.
  • the detection frame is instead or further based on a variable time window which is variable in commencement and/or in duration.
  • counting for a detection frame may commence when an object is detected within the FOV. Counting for a detection frame may continue while the object continues to be detected within the FOV, or may end after a predetermined frame end time. If the object is still in the FOV at the predetermined frame end time, another count for the detection frame may be immediately triggered. In some embodiments only objects detected within a predetermined threshold distance (which may be the same as or different to the distance described herein with reference to step 404 of process 400) may commence a count for a detection frame. In another example a count for a detection frame commences and continues while the spatial estimation system is foveating and ends when the spatial estimation system stops foveating. Some scan patterns with foveation may trigger a count for a detection frame, whereas others may not. Alternatively all scan patterns with foveation may trigger a count for a detection frame.
  • the spatial estimation system determines whether or not the point received at step 401 represents the commencement of a new detection frame count. Determining whether a new detection frame count has commenced may avoid double or over counting a received point as contributing to object exposure towards the threshold power/energy.
  • if the point does not represent the commencement of a new detection frame count, the process proceeds directly to step 404.
  • if the point does represent a new detection frame count, a counter for counting the number of received points within the detection frame that provide a measured distance below the threshold distance is reset at step 403. After the counter is reset, the process proceeds to step 404.
  • the control system determines whether or not the distance to an object is less than a threshold distance (e.g. 300 mm).
  • the object may be an object determined to be of a particular class (e.g. a person or animal), as described herein. Distances to other classes of object may not be used for the purpose of step 404.
  • if the distance is less than the threshold distance, the process proceeds to step 405. Otherwise the process proceeds to step 409, in which the control system determines that the eye-safety-restriction trigger is not activated.
  • the threshold distance in some embodiments is based on speed of the entity (e.g. vehicle) on or in which the spatial estimation system is located.
  • the control system determines whether the counter number is less than a threshold value.
  • the threshold value may be associated with a threshold power or a threshold energy. It will be appreciated that, by determining the number of optical pulses reflected by the object per unit measure, the object exposure can be determined according to the set of predetermined power density parameters associated with the selected scan pattern. Similarly, the threshold power or threshold energy can be adjusted to a threshold value that is comparable with the number of returned pulses according to the set of predetermined power density parameters associated with the selected scan pattern.
  • the unit measure may be a predetermined size within the FOV, for example, the approximate size of a human head.
  • the unit measure may be a predetermined period of time, such as the maximum response time.
  • if the counter number is not less than the threshold value, the eye-safety-restriction trigger is activated (or the eye-safety-relaxation trigger is not activated) at step 407. Otherwise, the control system determines at step 409 that the eye-safety-restriction trigger is not activated (or the eye-safety-relaxation trigger is activated).
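Steps 401 to 409 condense into a per-point update, sketched below. The count threshold is hypothetical; in practice it would be derived from the predetermined power density parameters of the selected scan pattern as described above.

```python
def process_point(state: dict, distance_m: float, new_frame: bool,
                  threshold_distance_m: float = 0.3,
                  count_threshold: int = 100) -> bool:
    """One iteration of process 400; returns True when the
    eye-safety-restriction trigger is activated (step 407)."""
    if new_frame:                             # steps 402-403: reset the counter
        state["count"] = 0
    if distance_m >= threshold_distance_m:    # step 404 -> step 409
        return False
    state["count"] += 1                       # returned pulse below the threshold
    return state["count"] >= count_threshold  # step 405 -> step 407 or 409

# Usage: state = {"count": 0}; call process_point for each received point.
```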
  • step 404 may involve counting returned pulses within a threshold distance within a region of interest.
  • the region of interest may be a region within the FOV with an increased point density (i.e. a foveated region). All returned pulses within the region of interest may be counted.
  • the detection of objects as described in step 404 above may be applied only to the region of interest.
  • a foveated region may be a region where the point density exceeds the point density that would apply if the points were uniformly distributed across the FOV, such as 110%, 150%, 200%, 250% or 300% per unit area relative to a uniform point density.
  • detection of an object outside the foveated region alone does not activate the eye-safety-restriction trigger.
  • detection of an object depends on the (individual or total) size of any detected object(s).
  • the eye-safety-restriction trigger may not be triggered if any individual or the total object region is determined to be smaller than a threshold area, such as the size of a human head.
  • the same stored scan pattern may cause the spatial estimation system to operate in either high or low power density mode. That is, the power density mode in which the spatial estimation system is operated depends not only on the selected scan pattern but also on other run-time variables, such as the measured distance of a detected object, the location of a detected object within the FOV, and/or the speed of the vehicle (or any other entity where the spatial estimation system is located). For example, according to the present disclosure, lack of an object detected within the threshold distance in a region of interest may cause the spatial estimation system to switch operation from a low power density mode to a high power density mode (e.g. to switch operation from step 203 to step 205) or to continue operation in a high power density mode (e.g. to maintain operation at step 205).
  • an object detected within the threshold distance may cause the spatial estimation system to switch operation from a high power density mode to a low power density mode (e.g. to switch operation from step 205 to 207) or to continue operation in a lower power density mode (e.g. to maintain operation at step 203).
  • an object detected within the threshold distance but outside the foveated region within the FOV may cause the spatial estimation system to switch operation from a low power density mode to a high power density mode (e.g. to switch operation from step 203 to 205) or to continue operation in a high power density mode (e.g. to maintain operation at 205) and does not activate the eye-safety-restriction trigger.
  • changing the resolution of the spatial estimation system 100 may be achieved by changing the maximum allowable round trip time (i.e. flight time). For example, changing the maximum allowable round trip time from 2 μs to 1 μs may result in half the optical power of the outgoing light and therefore half the detecting range. On the other hand, it allows the light transmitter 101 to recover more quickly between scans and therefore to provide the outgoing light according to a scan pattern twice within the same amount of time, so as to increase the resolution.
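The range arithmetic behind this example follows from the time-of-flight relation given earlier: with $R_{\max} = c\,t_{\max}/2$, a maximum round trip time of 2 μs corresponds to roughly 300 m and 1 μs to roughly 150 m, while the number of scan repetitions possible per unit time doubles.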

Abstract

Spatial estimation systems and methods of operation are described. A light source is configured to provide outgoing light. The system includes optical components configured to direct the outgoing light into an environment across a field of view and receive incoming reflected light from the environment. The spatial estimation system also includes a control system to control the operation of the spatial estimation system to control the power density distribution within the field of view, responsive to trigger events, in particular trigger events related to eye-safety operation.

Description

Spatial estimation system with controlled outgoing light
Field
[0001] The present disclosure generally relates to a system and method for controlling outgoing light in a spatial estimation system.
Background
[0002] Light-based spatial estimation systems, in particular light detection and ranging (LiDAR) systems, are of significant interest across several industries. One industry in which light-based spatial estimation systems have application is vehicle driver assistance and vehicle automation, including autonomous or semi-autonomous vehicles. For example, a LiDAR system may detect characteristics of an environment (whether it be land, sea or air) of a vehicle, enabling an automated response to the detected environment.
[0003] In LiDAR systems, the detecting range, resolution and field of view (FOV) of the system depends on several variables, including the power of outgoing light. A longer range and/or higher resolution and/or larger FOV is often desirable, but the maximum power of the outgoing light may be constrained, for example, for maintaining eye/skin safety, which limits the detecting range and/or resolution and/or FOV of the LiDAR system.
[0004] Reference to any prior art in the specification is not an acknowledgement or suggestion that this prior art forms part of the common general knowledge in any jurisdiction or that this prior art could reasonably be expected to be combined with any other piece of prior art by a skilled person in the art.
Summary of the disclosure
[0005] Embodiments of a spatial estimation system include at least one light source, configured to provide outgoing light of the spatial estimation system, and optical components configured to direct the outgoing light into an environment across a field of view and receive incoming light from the environment, wherein the incoming light from the environment includes reflected outgoing light. The spatial estimation system also includes a system to control the operation of the spatial estimation system (“control system”). [0006] In a first set of embodiments, the control system is configured to cause the spatial estimation system to provide outgoing light according to a first scan pattern from a plurality of selectable scan patterns, to operate the spatial estimation system in a first power density mode, to receive first incoming signals representative of incoming light received by the optical components, including reflected outgoing light in accordance with the first scan pattern, and in response to an eye-safety relaxation trigger derived from the first incoming signals, cause the spatial estimation system to provide outgoing light according to a second scan pattern from the plurality of selectable scan patterns to operate the spatial estimation system in a second power density mode. Within at least a region of a field of view of the spatial estimation system encompassed by the first and second scan patterns, a power density of the outgoing light for the first power density mode may be lower than a power density of the outgoing light for the second power density mode.
[0007] In a second set of embodiments, the control system is configured to cause the spatial estimation system to provide outgoing light according to a first scan pattern from a plurality of selectable scan patterns, to operate the spatial estimation system in a first power density mode, to receive first incoming signals representative of incoming light received by the optical components, including reflected outgoing light in accordance with the first scan pattern, and in response to an eye-safety restriction trigger derived from the first incoming signals, cause the spatial estimation system to provide outgoing light according to a second scan pattern from the plurality of selectable scan patterns to operate the spatial estimation system in a second power density mode. Within at least a region of a field of view of the spatial estimation system encompassed by the first and second scan patterns, a power density of the outgoing light for the second power density mode may be lower than a power density of the outgoing light for the first power density mode.
[0008] In some embodiments the control system is configured to cause the spatial estimation system to operate according to both embodiments in the preceding paragraphs, for example acting in response to the eye-safety relaxation trigger and subsequently acting in response to the eye-safety restriction trigger. It will be appreciated that in these embodiments there may be two or three distinct power density modes, in which the second power density mode of the first set of embodiments is the same as the first power density mode of the second set of embodiments and the first power density mode of the first set of embodiments is either the same as or different to the second power density mode of the second set of embodiments. There may be one or more further power density modes, responsive to one or more further triggers.
[0009] In some embodiments at least one of the eye-safety relaxation trigger and the eye-safety restriction trigger is variable, wherein the variability is based on at least one of a speed of travel (e.g. of a vehicle carrying the spatial estimation system) and a determination of object exposure to laser light. A determination of the object exposure to laser light may be based on predetermined power density parameters associated with each scan pattern.
[0010] Also described are embodiments of methods of spatial estimation and methods of operating a spatial estimation system. The methods utilise one or more triggers to change power density modes, for example one or both of an eye-safety relaxation trigger and an eye-safety restriction trigger.
[0011] Also described are embodiments of computer storage, in particular non-transient computer storage, containing instructions for a control system of a spatial estimation system. The instructions cause one or more processors of the control system to control the operation of the spatial estimation system. The control may utilise one or more triggers to change power density modes, for example one or both of an eye-safety relaxation trigger and an eye-safety restriction trigger.
[0012] Further embodiments will become apparent from the following description, given by way of example and with reference to the accompanying drawings.
Brief description of the drawings
[0013] Figure 1 illustrates an example arrangement of a spatial estimation system.
[0014] Figure 1 A illustrates an example arrangement of a control system in a spatial estimation system according to some embodiments of the present disclosure.
[0015] Figure 2 illustrates a process for applying a scan pattern to a scan of a spatial estimation system according to some embodiments of the present disclosure.
[0016] Figure 3A illustrates some exemplary scan patterns each with substantially uniform point density across a FOV. [0017] Figure 3B illustrates some exemplary scan patterns each with non-uniform point density across a FOV.
[0018] Figure 4 illustrates a process for determining exposure of an object to outgoing light from a spatial estimation system according to some embodiments of the present disclosure.
Detailed description of the embodiments
[0019] “Light” hereinafter includes electromagnetic radiation having optical frequencies, including far-infrared radiation, infrared radiation, visible radiation and ultraviolet radiation. A light-based spatial estimation system may be referred to as a LiDAR system. LiDAR involves transmitting light into the environment and subsequently detecting the light returned by the environment. By determining the time it takes for the light to make a round trip to and from, and hence the distance of, reflecting surfaces within a FOV, a spatial estimation of the environment may be formed. Alternatively or additionally, the distance of reflecting surfaces within the FOV may be determined using frequency-modulated continuous wave (FMCW) techniques. Examples of LiDAR range detection, including examples using FMCW techniques, are discussed in international patent application no. PCT/AU2016/050899 (published as WO 2017/054036 A1), the entire content of which is incorporated herein by reference. In three-dimensional mapping, one of the dimensions relates to the range of a point from the origin of the optical beam, whereas the other two dimensions relate to the two dimensional space (e.g. in Cartesian (x, y) or polar (theta, phi) coordinates) the optical beam is steered across. The range of the point in the environment represents a primary variable of the environment for measurement. The other two dimensions extend across a FOV of the three-dimensional mapping system.
[0020] Some LiDAR systems scan one or more optical beams across an environment. Two significant performance variables of these LiDAR systems include (a) the frame rate or time within which a scan of a portion of the FOV is completed (i.e. temporal resolution) and (b) the resolution or number of pixels across or within a portion of the FOV (i.e. spatial resolution).
[0021] Within a LiDAR system, one or both of the spatial resolution and the temporal resolution can be affected by changing or adjusting the FOV. For example, in some embodiments of a LiDAR system one scan may be completed across a first FOV for the system and a subsequent scan may be completed across a second FOV, different to the first FOV. In some embodiments, the second FOV is a part of the first FOV. In other embodiments, the second FOV is larger than the first FOV. The first and the second FOVs may overlap. In any of these embodiments, the LiDAR system may, in a further subsequent scan, be configured to return to scanning across the first FOV or to scan across a third FOV that is different from the first and second FOVs.
[0022] Additionally or instead (e.g. when the first and second FOVs are the same size), one or both of the spatial resolution within a portion of the FOV and the temporal resolution can be affected by changing the point density, or in other words changing the number of locations within a portion of the FOV at which a range measurement is performed by the LiDAR system.
[0023] Some embodiments of LiDAR systems incorporate beam directors that direct outgoing light in different directions based on the wavelength of the outgoing light. For example, a LiDAR system may include one or more wavelength tunable lasers, which scan through one or more ranges of wavelength channels to effect a corresponding scan of the outgoing light within a FOV. Examples of wavelength-steerable LiDAR systems are described in international patent application no. PCT/AU2016/050899 (published as WO 2017/054036 A1).
[0024] In wavelength-steerable LiDAR systems, the point density can be changed by changing the number of optical pulses or other optical ranging signals per scan and/or by configuring the wavelength channel of the optical pulses or other optical ranging signals so that more (or fewer) pulses or ranging signals are within a first set of one or more wavelength ranges and fewer (or more) pulses or ranging signals are within a second set of one or more wavelength ranges; the wavelength range(s) in the second set being different to the wavelength range(s) in the first set. Additionally or alternatively, the temporal resolution may be affected by the tuning speed of the wavelength-tunable light source.
[0025] In LiDAR systems with one or more mechanical steering components, the FOV and/or point density can be changed by changing the number of optical pulses or other optical ranging signals per scan and/or by adjusting the steering rate and/or steering angle of one or more of the mechanical steering components. For instance, if the mechanical steering components rotate in order to direct light in different directions, a change in the rotation rate can effect a corresponding change in the temporal resolution and may also effect a corresponding change in the spatial resolution.
Similarly, a change in the rotation angle of the one or more mechanical steering components can effect a corresponding change in the spatial resolution and may also effect a corresponding change in the temporal resolution.
[0026] In LiDAR systems with one or more mechanical steering components and which are also configured for wavelength-based steering, either or both of the mechanical components and the components affecting the wavelength-based steering may be controlled for changing the FOV and/or point density. Examples of a LiDAR system with both wavelength and mechanical based steering are described in the applicant’s international patent application nos. PCT/AU2017/051395 (published as WO 2018/107237 A1) and PCT/AU2019/050437 (published as WO 2019/241825 A1). The content of both of these international patent applications is incorporated into this disclosure by reference.
[0027] Foveation in the context of a LiDAR system refers, at least herein, to the ability to be controlled to exhibit differential temporal resolution and/or differential spatial resolution in different regions of the FOV. While foveation may be used to increase spatial resolution at one or more regions of interest (increased foveation), potentially at the expense of other region(s) (decreased foveation), it also increases the spatial power density in the one or more regions of interest (other relevant variables, for example the intensity of the outgoing light, remaining constant). The degree of foveation may affect how densely the points are distributed within a particular area of the FOV. The degree of foveation may therefore be limited by the maximum spatial power density for maintaining eye/skin safety, for example according to relevant Laser Safety Standards.
[0028] In some embodiments, the LiDAR system is configured with a plurality of selectable scan patterns. The plurality of selectable scan patterns are utilised by the LiDAR system to control the optical power and/or number of points and/or distribution of the points. The plurality of selectable scan patterns may be formed based on eye-safety requirements. For example, the optical power and/or the point density across one or two dimensions of a FOV may be specified by, or controlled according to, entries in the plurality of selectable scan patterns. [0029] Figure 1 illustrates an example arrangement of a spatial estimation system 100. As shown in the figure key, in the diagram electrical connections (e.g. analogue or digital signals) are represented by solid lines and optical connections (e.g. guided or free space optical transmission) are represented by dashed lines. The blocks represent functional components of the spatial estimation system 100. It will be appreciated that functionality may be provided by distinct or integrated physical components. For example, a light source may include an integrated amplifier and/or an integrated modulator.
[0030] The system 100 includes a light transmitter 101 for generating outgoing light. The light transmitter 101 may include a light source 102. In one example, the light source 102 includes a wavelength-tunable light source, such as a wavelength-tunable laser diode, providing light of a tunable wavelength. The wavelength may be based on one or more electrical currents (e.g. the injection current into the one or more wavelength tuning elements in the laser cavity) applied to the laser diode. The light source 102 accordingly is configured to provide outgoing light at a selected one or more of the multiple wavelength channels (each represented by its respective centre wavelength λ1, λ2, ... λn). The light source 102 may include a single tunable laser or more than one tunable laser (or other types of lasers). The light source 102 may select one wavelength channel at a time or may simultaneously provide two or more different selected wavelength channels (i.e. channels with different centre wavelengths). In another example, the light source 102 may include a broadband light source and one or more tunable spectral filters to provide substantially continuous-wave (CW) light intensity at the selected wavelength(s). In another example, the light source 102 includes multiple laser diodes, each being wavelength-tunable over a respective range and whose respective outputs are combined to form a single output. The respective outputs may be combined using a wavelength combiner, such as an optical splitter or an arrayed waveguide grating (AWG).
[0031] In one arrangement, the light source 102 is configured to provide the outgoing light to include at least one time-varying profile at the selected one or more of the multiple wavelength channels. The time-varying profile may be used in determining the round trip time of the light. In one example, the light source 102 includes a modulator (not shown) for imparting a time-varying profile on the outgoing light. This modulation may be in addition to any wavelength tuning as hereinbefore described. In other words, the modulation would be of light at the tuned wavelength. It will be appreciated that the tuned wavelength may refer to a centre frequency or other measure of a wavelength channel that is generated. The time-varying profile may, for example, be one or more of a variation in intensity, frequency, or phase imparted to the outgoing light. In some embodiments, the light source 102 emits pulses of light, which pulses may include the time-varying profile. In other embodiments, the difference between the presence of a pulse and the absence of a pulse is a time-varying profile for use in determining the round trip time of light. In other embodiments, the outgoing light from the light source 102 has a different form, for example individual pulses for detecting the round trip time of the individual pulses instead of detecting the round trip time of a series of modulated pulses. Techniques for determining the spatial profile of an environment are described in the incorporated international application no. PCT/AU2016/050899 (WO 2017/054036 A1).
[0032] In some embodiments, the light transmitter 101 also includes an optical amplifier 104 to amplify (providing gain to) light from the light source 102. In some embodiments the optical amplifier 104 is an Erbium-doped fibre amplifier (EDFA) of one or more stages. In other embodiments one or more stages of a semiconductor optical amplifier (SOA), a booster optical amplifier (BOA), or a solid state amplifier (e.g. a Nd:YAG amplifier) may be used. It will be appreciated that the modulator may be located either before or after the optical amplifier 104 in the outgoing light path. In some embodiments, the optical amplifier 104 may be omitted.
[0033] In some embodiments the light transmitter 101 includes a sampled-grating distributed Bragg reflector (SG-DBR) laser. By way of example, the SG-DBR laser may be controllable to provide 10 Gbps modulation, may operate across a 35 nm wavelength range and change from one wavelength to another in less than 100 nanoseconds. The wavelengths may have centre frequencies about 20 MHz or more apart.
[0034] The outgoing light from the light transmitter 101 is received by transmission (Tx) optics for directing light to a beam director 105. The transmission optics may also condition the light, for example, by including one or more collimators to form one or more beams of light. In some embodiments, the transmission optics form a light transceiver 103, configured to both provide outgoing light to the beam director 105 and receive collected incoming light from the beam director 105. In other embodiments, the outgoing and incoming light paths may be separate, in whole or in part. For example, in other embodiments, the receiver aperture and associated components (i.e. reception (Rx) optics) for receiving the incoming light may not form part of the transmission optics for outgoing light (not shown).
[0035] The beam director 105 functions to direct light over one or two dimensions into the environment to be estimated by the spatial estimation system 100. An example mechanical based beam director includes one or more rotating mirrors. The beam director 105 may be wavelength-based and may include dispersive or refractive elements to direct light of different wavelengths in different directions. In some embodiments, the beam director 105 includes a combination of light directing components selected from diffractive, refractive and reflective components. In some embodiments, the light source 102 is a tunable light source to provide light at different wavelengths at different times and the beam director is wavelength dependent, for example configured with diffractive and/or refractive components to direct the different wavelengths in different directions. In some embodiments, the diffractive and/or refractive components are used in combination with a mechanical-based beam director, whereby the combined steering by the components directs the light over the FOV of spatial estimation system 100. In other embodiments the direction is solely by mechanical components or solely by wavelength dependent components, to provide light across one or two dimensions. The scanning may be linear (e.g. like a raster scan) or non-linear.
[0036] In some embodiments, the beam director 105 includes bidirectional components, whereby both the outgoing light to the environment and incoming light from the environment traverse substantially the same path through the beam director 105, in opposite directions. Figure 1 illustrates a bidirectional example for the optical signal traversing the beam director 105 and represents the directionality with a bidirectional arrow. Examples of beam directors are described in the incorporated international patent application nos. PCT/AU2017/051395 (published as WO 2018/107237 A1) and PCT/AU2019/050437 (published as WO 2019/241825 A1). The beam director 105 directs the incoming light to the reception optics (e.g. the light transceiver), which collects the light and passes it to a light receiver 109.
[0037] The light receiver 109 may include light detector circuitry 106. The light detector circuitry 106 includes one or more photodetectors. An example photodetector is an avalanche photodiode (APD). The light detector circuitry 106 generates incoming electrical signals that are representative of the detected incoming light. The light detector circuitry 106 may include a trans-impedance amplifier following the APD. The light receiver 109 may also include an analog-to-digital converter (ADC) 108 following the light detector circuitry 106. The analog-to-digital converter 108 may convert analog incoming electrical signals to digital incoming electrical signals.
[0038] The incoming digital signals are received and processed by a control system 110. The light source 102 and the amplifier 104 may also be controlled by the control system 110. For example, one or both of the wavelength tunable laser (e.g. its wavelength) and the modulator (e.g. the modulating waveform) of the light source 102 may be controlled by the control system 110. The control system 110 may determine a round trip time for the light based on its control or knowledge of the outgoing light and based on the incoming light signals. The control system 110 may control beam steering directions of the beam director 105, through e.g. wavelength-based steering and/or mechanical steering, for example for effecting a scan pattern. In some embodiments, for example where the beam director includes one or more microelectromechanical systems, the control system 110 may directly control the beam director 105 via an electrical signal. In other embodiments, for example implementing wavelength based steering, an electrical signal to the beam director 105 is not required.
[0039] The control system 110 may include an application specific device configured to perform the operations described herein, such as a configured programmable logic device, or a general purpose computing device with computer readable memory storing instructions to cause the computing device to perform the operations.
[0040] In the instance of an application specific device, the instructions and/or data for controlling operation of the processing unit may be in whole or in part implemented by firmware or hardware elements, including configured logic gates in one or more integrated circuit devices. In the instance of a general purpose computing device, the control system 110 may include, for example, a single computer processing device (e.g. a central processing unit, graphics processing unit, or other computational device), or may include a plurality of computer processing devices. The control system 110 may also include a communications bus in data communication with one or more machine readable storage (memory) devices which store instructions and/or data for controlling aspects of the operation of the processing unit. The memory devices may include system memory (e.g. a BIOS), volatile memory (e.g. random access memory), and non-volatile memory (e.g. one or more hard disk or solid state drives to provide non-transient storage). The operations for spatial estimation are generally controlled by instructions in the non-volatile memory and/or the volatile memory. In addition, the control system 110 includes one or more interfaces. The interfaces may include a control interface with the light transmitter 101 and a communication interface with the light receiver 109. The control system 110 may also receive and process external information that is not derived from the spatial estimation system 100.
[0041] In some embodiments, light from the light source 102 is also provided to the light detector circuitry 106 to provide a reference signal via a light path (not shown) from the light source 102 to the light detector circuitry 106. For example, the light from the light source 102 may enter a sampler (e.g. a 90/10 fibre-optic coupler), where a majority portion (e.g. 90%) of the light is provided to the transceiver 103 and the remaining sample portion (e.g. 10%) of the light is provided instead to the light detector circuitry 106. The light detector circuitry 106 may then be configured to inhibit detection of non-reflected light based on a difference in wavelength or modulation between the outgoing light and the non-reflected light. For example, the light detector circuitry 106 includes one or more balanced detectors to coherently detect the reflected light mixed with reference light at the one or more balanced detectors. The light receiver 109 may perform, for example, homodyne or heterodyne detection of the incoming light.
[0042] In the embodiment of Figure 1, the spatial estimation system 100 separates the functional components into two main physical units, i.e. an engine 111 and a sensor head 107. In one example, the engine 111 and the sensor head 107 are substantially collocated. The collocation allows these components to be compactly packed within a single unit or in a single housing. In another example, the sensor head 107 is remote from the engine 111. In this example, the engine 111 is optically coupled to the remote sensor head 107 via one or more waveguides, such as optical fibres. In yet another example, a spatial estimation system 100 may include a single engine 111 and multiple sensor heads. Each of the multiple sensor heads may be optically coupled to the engine 111 via respective waveguides. The multiple sensor heads may be placed at different locations and/or orientated with different FOVs.
[0043] As discussed herein, LiDAR systems can be controlled to adjust the FOV, for example, to provide for effective control over temporal and/or spatial resolution. This control may provide a more effective and/or safer spatial estimation system, at least in certain applications. For example, in applications where spatial estimation systems are used for autonomous vehicles, this control provides an ability to increase temporal and/or spatial resolution in a particular region of the FOV (i.e. the degree of foveation) and/or to increase the detection range while still maintaining practical eye/skin safety.
[0044] Figure 1 A generally illustrates an example arrangement of a control system for a spatial estimation system, for controlling and applying a scan pattern to a scan of the spatial estimation system. The control system may be the control system 110 of Figure 1 and the following description is given with reference to that example.
[0045] As discussed above, the light receiver 109 produces incoming electrical signals 1008 that are representative of the incoming optical pulses 1006 and provides them to the control system 110 for processing. The incoming electrical signals 1008 include at least object distance information. The control system 110 includes an object distance determination module 1003 for determining the object distance information from the incoming electrical signals 1008.
[0046] In some embodiments, the control system 110 also includes a speed determination module 1005 for determining the speed information of the entity (e.g. vehicle) where the spatial estimation system 100 is located. The speed determination may be based on the incoming electrical signals 1008, either directly, or based on distance determinations over time by the object distance determination module 1003. Additionally or alternatively, the speed information is obtained from an externally received signal 1014.
[0047] At least one of the processed signals indicative of object distance information and vehicle speed (1010 and 1012, respectively) are fed into a processing unit 1001 to generate one or more control signals 1002 corresponding to one of multiple scan patterns. The one or more control signals 1002 are then applied to the light transmitter 101 for generating outgoing optical pulses 1004 according to a selected scan pattern. In some embodiments, the one or more control signals may also be applied to the beam director 105 for controlling wavelength-based steering and/or mechanical steering of the beam director 105 in accordance with the selected scan pattern.
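By way of editorial illustration only, the following Python sketch mirrors the signal flow just described: object-distance (1010) and speed (1012) inputs feed a processing unit that emits a control signal (1002) naming a scan pattern. The function name, thresholds and decision rule are assumptions made for this sketch, not taken from the disclosure.

```python
from typing import Optional

# Illustrative sketch only; names, thresholds and decision rule are assumptions.
def processing_unit(object_distance_m: Optional[float],
                    vehicle_speed_kmh: float,
                    threshold_distance_m: float = 2.0) -> str:
    """Map object-distance (1010) and speed (1012) inputs to a control
    signal (1002) identifying the scan pattern to apply."""
    if object_distance_m is not None and object_distance_m <= threshold_distance_m:
        return "low_power_density_pattern"   # restrict while an object is close
    return "high_power_density_pattern"      # relax otherwise
```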
[0048] Figure 2 illustrates a process 200 for applying a scan pattern to a scan of a spatial estimation system, according to some embodiments of the present disclosure. The process 200 may be implemented for or by a control system, which may include one or more application specific devices configured to perform the operations of the process 200, or one or more general purpose computing devices with associated memory containing instructions to cause the computer to perform the operations of the process 200. For example, step 201 may be implemented for the control system 110 as part of a manufacturing or configuration process, and steps 203 to 207 may be implemented by the control system 110. The following description is with reference to that example.
[0049] At step 201, a plurality of selectable scan patterns of different spatial and/or temporal distribution, for use in operating a spatial estimation system such as the spatial estimation system 100, are stored in or otherwise made accessible to the control system 110. In particular, the plurality of selectable scan patterns may be stored in the memory devices of, or in data communication with, the control system 110, for example in non-volatile memory and/or in volatile memory.
[0050] The plurality of selectable scan patterns may include one or more regions or lines of substantially uniform point density across the FOV. Figure 3A illustrates some examples of the scan patterns 301, 303 and 305 with substantially uniform point density across a FOV. As illustrated, the scan pattern 301 has 32 pixels (or points) vertically and 32 pixels horizontally over the FOV with a uniform (or even) density distribution. The scan pattern 303 has 64 pixels along the vertical axis and 16 pixels along the horizontal axis with a uniform density distribution. The scan pattern 305 has 128 pixels along the vertical axis and 8 pixels along the horizontal axis with a uniform density distribution. Although these three examples have the same total number of pixels across the FOV, it will be appreciated that the total number of pixels for each scan pattern of the plurality of selectable scan patterns may be different.
[0051] Alternatively or additionally, the plurality of selectable scan patterns may include one or more non-uniform point density distributions across the same or different FOV, for example with a relatively increased point density within one or more sub-regions of the FOV, optionally with a reduced point density outside of the sub-region(s). Figure 3B illustrates some examples of the scan patterns 302, 304, 306, 308, 310, 312, and 314 with non-uniform point density. As illustrated, the four scan patterns 302, 304, 306 and 308 represent scan patterns with constant vertical FOV at different levels of compression along the vertical axis. The level of compression increases left to right between scan patterns 302 and 308, i.e. the scan pattern 302 illustrates the least compressed scan pattern while the scan pattern 308 illustrates the most compressed scan pattern. The three scan patterns 310, 312 and 314 represent scan patterns with reduced vertical FOV, the FOV reducing left to right in Figure 3B. Although Figure 3B only illustrates examples of scan patterns with non-uniform point density distribution along the vertical axis, it will be appreciated that the non-uniform point density distribution of a scan pattern may also be applied to the horizontal axis or to both the vertical and horizontal axes.
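By way of editorial illustration only, the sketch below generates vertical beam-angle distributions in the spirit of Figures 3A and 3B: an evenly spaced pattern, and a vertically compressed pattern that concentrates points around a chosen elevation (e.g. a horizon). The odd-power warping function is an assumption for this sketch, not a method stated in the disclosure.

```python
import numpy as np

def uniform_vertical_angles(n_lines: int = 64, fov_deg: float = 20.0) -> np.ndarray:
    # Evenly spaced scan lines across the vertical FOV (Figure 3A style)
    return np.linspace(-fov_deg / 2, fov_deg / 2, n_lines)

def compressed_vertical_angles(n_lines: int = 64, fov_deg: float = 20.0,
                               exponent: float = 3.0) -> np.ndarray:
    # Odd-power warping keeps the same vertical FOV but packs scan lines near
    # 0 degrees; a larger exponent gives stronger compression (cf. 302 -> 308)
    u = np.linspace(-1.0, 1.0, n_lines)
    return (fov_deg / 2) * np.sign(u) * np.abs(u) ** exponent
```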
[0052] Returning to Figure 2, at step 203, one of the stored scan patterns (i.e. a first scan pattern) is selected to operate the spatial estimation system in a first power density mode for object distance measurement. In one example use case, the first scan pattern is utilised by the spatial estimation system for vehicle speed estimation. A vehicle speed estimation by the spatial estimation system may be of particular utility when the spatial estimation system is used for vehicles, including but not limited to autonomous or semi-autonomous vehicles. The first scan pattern may be used solely for speed estimation or may also be used for spatial estimation of the environment. The first scan pattern may be used alone, or together with another source of speed estimation, for example an indication of speed received in the signal 1014. Alternatively speed may be estimated or determined or received without reference to the first scan pattern, for example based solely on signal 1014. The spatial estimation may be an initial estimation, which in some embodiments is used, in combination with the speed estimation or determination, as an input to select a second scan pattern (e.g. in step 205 described herein).
[0053] Adjustment of power density may be achieved by adjusting one or more parameters that affect the power density. Example parameters include the output power of the light transmitter 101, the spatial resolution of the scan pattern, and the temporal resolution of the scan pattern. The combination of the power density parameters may be predetermined for each of the stored scan patterns. It will be appreciated that the predetermined combination of power density parameters facilitates predetermination of (a) exposure power at any given region of interest within the FOV and/or (b) exposure energy over any given time period.
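As an editorial illustration of how such predetermined power density parameters might be stored and used, the Python sketch below models a scan pattern record and a simple exposure-power estimate. The field names and the equal-power-per-point model are assumptions for this sketch.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ScanPattern:
    name: str
    output_power_w: float   # output power of the light transmitter
    points_per_frame: int   # spatial resolution of the pattern
    frame_rate_hz: float    # temporal resolution of the pattern

def exposure_power_w(pattern: ScanPattern, points_in_region: int) -> float:
    """Average optical power delivered into a region of interest receiving
    points_in_region of the pattern's points, assuming the output power is
    shared equally between points (an assumption for this sketch)."""
    return pattern.output_power_w * points_in_region / pattern.points_per_frame
```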
[0054] In some embodiments, the first power density mode at step 203 is a low power density mode, which refers to an operating mode of the spatial estimation system with a selected scan pattern that is deemed eye/skin-safe anywhere within the FOV. In contrast, a high power density mode refers to any other operating mode of the spatial estimation system, including in particular any operating mode with a selected scan pattern that is deemed potentially non-eye/skin-safe in at least some region within the FOV. Some of the scan patterns stored at step 201 may be designated for use only when operating the spatial estimation system in a low power density mode. Similarly, some of the scan patterns stored at step 201 may be designated for use only when operating the spatial estimation system in a high power density mode.
[0055] In some embodiments, scanning of the environment is initiated with the first scan pattern, operating in a low power density mode. Scanning with the first scan pattern is performed for object distance measurement, and optionally for object identification and/or speed determination.
[0056] At step 205, a second scan pattern is selected from the stored scan patterns to operate the spatial estimation system in a second power density mode. The second power density mode at this step may be a high power density mode. The selection may be responsive to an eye-safety-relaxation trigger. The second scan pattern may be selected to increase foveation in at least part of the FOV.
[0057] In some embodiments, the eye-safety-relaxation trigger is a determination that no objects are detected at or within a threshold distance. Alternatively or additionally, the detected object(s) may be identified and linked to a class label, such as a person, fauna, car, flora, etc., using one or more advanced perception algorithms. Once the objects are identified, the eye-safety-relaxation trigger may be a determination that there is no presence of particular object classes, e.g. humans or animals, at or within a threshold distance.
[0058] In some embodiments, the eye-safety relaxation trigger is dependent on a speed, for example the speed estimated, determined or received in step 203. For example in some embodiments the threshold distance for detection of objects is dependent on the speed of the entity on or in which the spatial estimation system is located (e.g. vehicle). For eye-safety applications, the higher the determined speed, the higher the threshold distance and similarly the lower the determined speed, the lower the threshold distance. [0059] In one example, the threshold distance is continuously variable based on the vehicle speed. In another example, the threshold distance is discretely variable based on the vehicle speed. For example, the threshold distance may vary based on whether the vehicle speed is above or below a threshold speed. In some examples, the threshold speed may be somewhere in the range of 10-100 km/h, such as 10, 20, 30, 40, 50, 60, 70, 80, 90 or 100 km/h. When the vehicle speed is below the threshold speed, the threshold distance is a first distance. When the vehicle speed is at or above the threshold speed, the threshold distance becomes a second distance, different to the first distance. For example, the threshold distance when the vehicle speed is below the threshold speed may be a shorter distance (e.g. 10 cm), whereas the threshold distance when the vehicle speed is at or above the threshold speed may be a larger distance (e.g. 2 m). There may be two or more threshold speeds, above and below which the threshold distance varies to larger and smaller threshold distances, respectively.
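As an editorial illustration of the discretely variable threshold distance just described, the sketch below uses a single threshold speed; the 50 km/h value and the 0.1 m and 2 m distances are example values drawn from the ranges mentioned above.

```python
def threshold_distance_m(vehicle_speed_kmh: float,
                         threshold_speed_kmh: float = 50.0) -> float:
    # Per the example above: a shorter threshold distance (10 cm) applies below
    # the threshold speed, and a larger one (2 m) at or above it
    return 0.1 if vehicle_speed_kmh < threshold_speed_kmh else 2.0
```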
[0060] Alternatively, since eye-safety is assessed with an eye iris at 10 cm from the scanning vertex of the spatial estimation system, the eye-safety-relaxation trigger may be a determination that the vehicle speed is above a threshold speed. The threshold speed may be the sole variable for the eye-safety-relaxation trigger. For example, when the vehicle speed is above the threshold speed, it is considered unlikely that an eye iris will be at a 10 cm distance from the scanning vertex of the spatial estimation system that is located on or in the vehicle. In still further alternatives, there may be more than one eye-safety-relaxation trigger, with one trigger operating in some conditions and one or more other triggers operating in other conditions. For example, when the speed is below a certain threshold, the eye-safety-relaxation trigger may be based on whether a detected object, or a detected object within one or more particular classes, is within a threshold distance, and when the speed determination is above that threshold the eye-safety-relaxation trigger is based on speed alone.
[0061] The vehicle speed may be determined or received from a vehicle information system (e.g. a Controller Area Network (CAN) bus). Additionally or alternatively, the vehicle speed is estimated or determined, by the speed determination module 1005 of the control system 110, based on information from the spatial estimation system. For example, the vehicle speed is estimated or determined by calculating the speed at which the spatial estimation system approaches a static object detected in the environment, such as a tree, a bridge, a building, road markers, or a road sign. [0062] Alternatively or additionally, the eye-safety-relaxation trigger may be based on a determination of location or based on one or more environmental conditions. In applications where spatial estimation systems are used for autonomous vehicles, an example environment condition is a road condition. For example, in some embodiments the eye-safety-relaxation trigger is based on a determination that the vehicle is driving on a highway or in a non-pedestrian/bicyclist area. It will be appreciated that this approach may be considered eye safe because it is unlikely that people (or any other creatures susceptible to laser eye/skin-safety) will be present within that environment. The spatial estimation system may determine its location based on geolocation technologies, such as the global positioning system (GPS), and/or based on object recognition (e.g. road sign recognition).
[0063] In some embodiments, the spatial estimation system transitions to operate in a high power density mode from a low power density mode in response to an eye-safety relaxation trigger, and in the high power density mode the system uses a scan pattern (the second scan pattern of step 205) to increase foveation in at least part of the FOV, such as along or near a detected horizon.
[0064] At step 207, a third scan pattern is selected from the stored scan patterns to operate the spatial estimation system in a third power density mode. The third power density mode at this step may be a low power density mode. The selection may be responsive to a determination of an eye-safety-restriction trigger. The third scan pattern may be selected to decrease foveation in at least part of the FOV. In one example, decreasing foveation means foveating at a reduced level. In another example, decreasing foveation means completely defoveating (e.g. using a scan with substantially uniform point density across a FOV). In yet another example, decreasing foveation means lowering the optical power of the outgoing light, completely turning off the light transmitter 101, or a combination of both. Lowering the optical power of the outgoing light may be achieved by reducing the intensity of the outgoing light and/or by reducing the duration of the pulses of outgoing light. It will be appreciated that the third scan pattern may be the same as or different to the first scan pattern and that the third power density mode may be the same as or different to the first power density mode.
[0065] In some embodiments, the eye-safety-restriction trigger may be a determination that there are one or more objects detected within a threshold distance. This threshold distance may be the same as or different to (e.g. less than) the threshold distance for the eye-safety-relaxation trigger. Alternatively or additionally, the detected object(s) may be identified and linked to a class label using one or more advanced perception algorithms, as described herein. Once the objects are identified and classified, the eye-safety-restriction trigger may be a determination that there are one or more objects of one or more particular classes within a threshold distance. As discussed in the previous paragraphs, the threshold distance may be dependent on the speed of the entity where the spatial estimation system is located. The threshold distance may be continuously or discretely variable based on the vehicle speed.
[0066] Alternatively or additionally, since eye-safety is assessed with an eye iris at 10 cm from the scanning vertex of the spatial estimation system, the eye-safety-restriction trigger may be a determination that the vehicle speed is below a threshold speed, at least because it is considered likely that an eye iris may be at a 10 cm distance from the scanning vertex of the spatial estimation system that is located on or in the vehicle. The object distance or threshold distance may be irrelevant in this case. As discussed above, the vehicle speed may be obtained from the vehicle information system and/or from the spatial estimation system.
[0067] Alternatively or additionally, the eye-safety-restriction trigger may be based on a determination of the environment condition, as described with reference to the eye-safety-relaxation trigger. In applications where spatial estimation systems are used for autonomous vehicles, the environment condition may be a road condition.
[0068] In some embodiments, the spatial estimation system transitions to operate in a low power density mode from a high power density mode in response to an eye-safety restriction trigger by using a third scan pattern that decreases foveation in at least part of the FOV.
[0069] Prior to a transition from a high power density mode to a low power density mode, there may be some exposure to the high power density laser beam. However, in the case of an exposure to the laser beam, eye safety depends on the level of exposure. A level of exposure that is at the border between safe and potentially harmful exposure is called the “maximum permissible exposure”, or MPE (expressed in W/m2 or J/m2). The MPE is related to the operating wavelength and the optical power. The MPEs are set by the International Commission on Non-Ionizing Radiation Protection (ICNIRP), are internationally accepted, and are also adopted by standardisation committees such as IEC TC 76 and ANSI for the laser safety standards IEC 60825-1 and ANSI Z136.1, respectively. For example, for a spatial estimation system operating at 1550 nm wavelength, the MPE over a 10-second duration exposure is 1000 W/m2. As the Accessible Emission Limit (expressed in W or J) for 1550 nm Class 1 laser products is based on the measurement of average power over 10 s in an eye iris with a diameter of 3.5 mm at a distance of 10 cm according to IEC 60825-1, the Accessible Emission Limit for a spatial estimation system operating at 1550 nm wavelength is 10 mW over a 10-second duration exposure.
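As an editorial cross-check of the figures quoted above (an illustration, not text from the standard), the 10 mW Accessible Emission Limit is consistent with the 1000 W/m2 MPE applied over the 3.5 mm measurement iris:

$$A_{\text{iris}} = \pi\left(\frac{3.5\,\text{mm}}{2}\right)^{2} \approx 9.6\,\text{mm}^{2}, \qquad P_{\text{limit}} \approx 1000\,\tfrac{\text{W}}{\text{m}^{2}} \times 9.6\times10^{-6}\,\text{m}^{2} \approx 9.6\,\text{mW} \approx 10\,\text{mW}.$$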
[0070] Therefore, the spatial estimation system may be configured with a response time for the transition from the high power density mode to the low power density mode. The response time may be based on an MPE or Accessible Emission Limit for a given laser class. In particular, the response time after the eye-safety-restriction trigger occurs may be set such that the MPE of the corresponding spatial estimation system for maintaining eye/skin safety is not exceeded. For example, the maximum response time after the eye-safety-restriction trigger occurs may be set such that the resulting exposure is not greater than a fraction of the MPE, e.g. 50%, 25% or 10% of the MPE of the corresponding spatial estimation system.
[0071] In some embodiments, following the transition from the high power density mode to the low power density mode, the spatial estimation system may be configured with a non-transition period for preventing a further transition from the low power density mode to another higher power density mode. The non-transition period provides hysteresis, preventing rapid oscillation between the modes. For example, the non-transition period may be a predetermined time between 0.1 and 10 seconds.
[0072] In some embodiments, detection of an object may include or be followed by determining object exposure to the outgoing light from the spatial estimation system. In particular, a region of interest may include a region of the FOV occupied by a detected object (i.e. an object region). The exposure of the object to the outgoing light may be determined based on predetermined power density parameters stored for each of the scan patterns. In this example, the object exposure may be inferred from a particular set of predetermined power density parameters, such as via a look-up process, rather than from run-time measurement of power density parameters. [0073] For example, detection of an object may include determining that the power density, based on the predetermined power density parameters of the selected scan pattern, integrated over the object region exceeds a threshold power. Alternatively or additionally, detection of an object may include determining that the power density, based on the predetermined power density parameters of the selected scan pattern, integrated over the object region and accumulated over a predetermined time period, exceeds a threshold energy. For example, the predetermined time period may be based on a resulting exposure not greater than the MPE, e.g. of 75%, 50% or 25% of the MPE. In some examples, the eye-safety-restriction trigger is not activated if the determined object exposure does not exceed the threshold power and/or threshold energy.
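Continuing the editorial ScanPattern sketch above, the following illustrates the look-up style exposure check described in paragraph [0073]: exposure power over the object region and exposure energy accumulated over a time period, each compared against a threshold. All names and the exposure model are assumptions for this sketch.

```python
def exposure_exceeds_limits(pattern: ScanPattern,
                            points_on_object: int,
                            exposure_time_s: float,
                            threshold_power_w: float,
                            threshold_energy_j: float) -> bool:
    # Exposure inferred from stored pattern parameters, not run-time measurement
    power_w = exposure_power_w(pattern, points_on_object)  # over the object region
    energy_j = power_w * exposure_time_s                   # accumulated over the period
    return power_w > threshold_power_w or energy_j > threshold_energy_j
```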
[0074] In one example, the above determinations based on threshold power and/or threshold energy can be realised by counting the number of returned pulses per unit measure detected at or below a threshold distance, according to a process 400 as illustrated in Figure 4. A new distance measurement of a point is received at step 401 by the spatial estimation system scanning an environment. If the outgoing light pulses of a selected scan pattern are directed at an object in the environment, at least part of the outgoing light pulses may be reflected, e.g. scattered, by the object back to the spatial estimation system as incoming light pulses. By, for example, determining at the object distance determination module 1003 of the control system 110 the time it takes for an outgoing light pulse to make a round trip to and from the object, the distance of the object from the spatial estimation system is determined.
[0075] The process 400 involves counting the number of returned pulses over a detection frame. A detection frame is the FOV of the spatial estimation system, or a region of the FOV, over which process 400 determines whether an eye-safety restriction/relaxation trigger is activated. In some embodiments the detection frame is instead or further based on a fixed time window. For example, counting for a new detection frame may commence every 10 seconds or a smaller time interval. The reckoning of time may be based on a clock signal or based on another parameter, for example a number of scans of a FOV. In other embodiments the detection frame is instead or further based on a variable time window which is variable in commencement and/or in duration. For example, counting for a detection frame may commence when an object is detected within the FOV. Counting for a detection frame may continue while the object continues to be detected within the FOV, or may end after a predetermined frame end time. If the object is still in the FOV at the predetermined frame end time, another count for the detection frame may be immediately triggered. In some embodiments only objects detected within a predetermined threshold distance (which may be the same as or different to the distance described herein with reference to step 404 of process 400) may commence a count for a detection frame. In another example a count for a detection frame commences and continues while the spatial estimation system is foveating and ends when the spatial estimation system stops foveating. Some scan patterns with foveation may trigger a count for a detection frame, whereas others may not. Alternatively all scan patterns with foveation may trigger a count for a detection frame.
[0076] At step 402, the spatial estimation system, in particular the control system 110, determines whether or not the point received at step 401 represents the commencement of a new detection frame count. Determining whether a new detection frame count has commenced may avoid double or over counting a received point as contributing to object exposure towards the threshold power/energy. In response to determining that the received point does not represent the commencement of a new detection frame count, the process proceeds directly to step 404. In response to determining that a new detection frame has commenced, a counter for counting the number of received points within the detection frame that provide a measured distance below the threshold distance is reset at step 403. After the counter is reset, the process proceeds to step 404.
[0077] At step 404, the control system determines whether or not the distance to an object is less than a threshold distance (e.g. 300 mm). The object may be an object determined to be of a particular class (e.g. a person or animal), as described herein. Distances to other classes of object may not be used for the purpose of step 404. In response to a determination that the determined object distance is less than the threshold distance, the process proceeds to step 405. Otherwise the process proceeds to step 409, in which the control system determines that the eye-safety-restriction trigger is not activated. As mentioned above, the threshold distance in some embodiments is based on the speed of the entity (e.g. vehicle) on or in which the spatial estimation system is located.
[0078] At step 405 the counter is incremented. Then at step 406 the control system determines whether the counter number is less than a threshold value. The threshold value may be associated with a threshold power or a threshold energy. It will be appreciated that, by determining the number of optical pulses reflected by the object per unit measure, the object exposure can be determined according to the set of predetermined power density parameters associated with the selected scan pattern. Similarly, the threshold power or threshold energy can be adjusted to a threshold value that is comparable with the number of returned pulses according to the set of predetermined power density parameters associated with the selected scan pattern. For determination based on a threshold power, the unit measure may be a predetermined size within the FOV, for example, the approximate size of a human head. For determination based on a threshold energy, the unit measure may be a predetermined period of time, such as the maximum response time. In response to determining that the counter is not less than the threshold value, the eye-safety-restriction trigger is activated (or the eye-safety-relaxation trigger is not activated) at step 407. Otherwise, the control system determines at step 409 that the eye-safety-restriction trigger is not activated (or the eye-safety-relaxation trigger is activated).
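By way of editorial illustration, the following sketch implements the counting logic of steps 401 to 409 as just described; the class and method names are assumptions.

```python
class RestrictionTriggerCounter:
    """Per detection frame, count returned points measured inside the
    threshold distance (process 400, illustrative only)."""

    def __init__(self, threshold_distance_m: float, threshold_count: int):
        self.threshold_distance_m = threshold_distance_m
        self.threshold_count = threshold_count
        self.count = 0

    def on_point(self, distance_m: float, new_frame: bool) -> bool:
        """Process one distance measurement (step 401); return True when the
        eye-safety-restriction trigger is activated (step 407)."""
        if new_frame:
            self.count = 0                            # steps 402-403: reset the counter
        if distance_m >= self.threshold_distance_m:
            return False                              # step 404 -> 409: not activated
        self.count += 1                               # step 405
        return self.count >= self.threshold_count     # step 406 -> 407 or 409
```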
[0079] Alternatively or additionally to counting the pulses returned from an object as described in step 404 above, step 404 may involve counting returned pulses within a threshold distance within a region of interest. The region of interest may be a region within the FOV with an increased point density (i.e. a foveated region). All returned pulses within the region of interest may be counted. Alternatively, the detection of objects as described in step 404 above may be applied only to the region of interest. A foveated region may be a region where the point density exceeds the point density that would apply if the points were uniform across the FOV, such as 110%, 150%, 200%, 250% or 300% per unit area relative to a uniform point density. In some examples, detection of an object outside the foveated region alone does not activate the eye-safety-restriction trigger.
[0080] Alternatively or additionally, detection of an object depends on the (individual or total) size of any detected object(s). For example, the eye-safety-restriction trigger may not be triggered if any individual or the total object region is determined to be smaller than a threshold area, such as the size of a human head.
[0081] Accordingly, it will be appreciated that the same stored scan pattern may cause the spatial estimation system to operate in either high or low power density mode. That is, the power density mode in which the spatial estimation system is operated depends not only on the selected scan pattern but also on other run-time variables, such as the measured distance of a detected object, the location of a detected object within the FOV, and/or the speed of the vehicle (or any other entity where the spatial estimation system is located). For example, according to the present disclosure, lack of an object detected within the threshold distance in a region of interest may cause the spatial estimation system to switch operation from a low power density mode to a high power density mode (e.g. to switch operation from step 203 to step 205) or to continue operation in a high power density mode (e.g. to maintain operation at step 205). However, for the same selected pattern, an object detected within the threshold distance may cause the spatial estimation system to switch operation from a high power density mode to a low power density mode (e.g. to switch operation from step 205 to step 207) or to continue operation in a lower power density mode (e.g. to maintain operation at step 203). In another example, for the same selected pattern, an object detected within the threshold distance but outside the foveated region within the FOV may cause the spatial estimation system to switch operation from a low power density mode to a high power density mode (e.g. to switch operation from step 203 to step 205) or to continue operation in a high power density mode (e.g. to maintain operation at step 205) and does not activate the eye-safety-restriction trigger.
[0082] It will also be appreciated that, instead of or in addition to adjusting scan patterns, changing the resolution of the spatial estimation system 100 may be achieved by changing the maximum allowable round trip time (i.e. the flight time). For example, changing the maximum allowable round trip time from 2 µs to 1 µs may result in half the optical power of the outgoing light and therefore half the detecting range. On the other hand, it allows the light transmitter 101 to recover more quickly between scans and therefore to provide the outgoing light according to a scan pattern twice within the same amount of time, so as to increase the resolution.
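As a worked check of this trade-off, the maximum detection range implied by a maximum allowable round trip time t follows from the time-of-flight relation range = c·t/2. The short sketch below (the helper max_range_m is hypothetical) confirms that halving the round trip time from 2 µs to 1 µs halves the range from approximately 300 m to approximately 150 m.

```python
C = 299_792_458  # speed of light in vacuum, m/s

def max_range_m(round_trip_time_s: float) -> float:
    """Maximum one-way detection range for a given round trip time."""
    return C * round_trip_time_s / 2

print(max_range_m(2e-6))  # 2 µs round trip -> ~300 m detecting range
print(max_range_m(1e-6))  # 1 µs round trip -> ~150 m, i.e. half the range,
                          # while the scan pattern can repeat twice in the
                          # same time to increase the resolution
```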
[0083] It will be understood that the invention disclosed and defined in this specification extends to all alternative combinations of two or more of the individual features mentioned or evident from the text or drawings. All of these different combinations constitute various alternative aspects of the invention.

Claims

1. A spatial estimation system, including:
at least one light source, configured to provide outgoing light of the spatial estimation system;
optical components configured to direct the outgoing light into an environment across a field of view and receive incoming light from the environment, wherein the incoming light from the environment includes reflected outgoing light;
a control system configured to:
cause the spatial estimation system to provide outgoing light according to a first scan pattern from a plurality of selectable scan patterns, to operate the spatial estimation system in a first power density mode;
receive first incoming signals representative of incoming light received by the optical components, including reflected outgoing light in accordance with the first scan pattern;
in response to an eye-safety-relaxation trigger derived from the first incoming signals, cause the spatial estimation system to provide outgoing light according to a second scan pattern from the plurality of selectable scan patterns, to operate the spatial estimation system in a second power density mode;
receive second incoming signals representative of incoming light received by the optical components, including reflected outgoing light in accordance with the second scan pattern; and
in response to an eye-safety-restriction trigger derived from the second incoming signals, cause the spatial estimation system to provide outgoing light according to a third scan pattern from the plurality of selectable scan patterns, to operate the spatial estimation system in a third power density mode;
wherein, within at least a region of a field of view of the spatial estimation system encompassed by the first, second and third scan patterns, a power density of the outgoing light for the first power density mode is lower than a power density of the outgoing light for the second power density mode; and a power density of the outgoing light for the third power density mode is lower than the power density for the second power density mode.

2. The spatial estimation system of claim 1, wherein a set of predetermined power density parameters are associated with each of the plurality of selectable scan patterns.

3. The spatial estimation system of claim 2, wherein the set of predetermined power density parameters includes one or more of optical power of the light source, spatial resolution and temporal resolution of the associated scan pattern within at least the region of the field of view.

4. The spatial estimation system of claim 2 or claim 3, wherein at least one of the eye-safety-relaxation trigger and the eye-safety-restriction trigger is based on detection of one or more objects with respect to at least one threshold distance.

5. The spatial estimation system of claim 4, wherein the detection of the one or more objects includes identification of at least one of the one or more objects.

6. The spatial estimation system of claim 4 or claim 5, wherein the detection of the one or more objects includes determination of exposure of the one or more objects to the outgoing light.

7. The spatial estimation system of claim 6, wherein the determination of exposure of the one or more objects to the outgoing light is based on the one or more predetermined power density parameters.

8. The spatial estimation system of claim 6 or claim 7, wherein the determination of exposure of the one or more objects includes determination of power density integrated over a region of the field of view occupied by the one or more objects.
9. The spatial estimation system of claim 6 or claim 7, wherein the determination of exposure of the one or more objects includes determination of power density integrated over a region of the field of view occupied by the one or more objects and accumulated over a predetermined time period.

10. The spatial estimation system of claim 9, wherein the predetermined time period is based on a resulting exposure not greater than a maximum permissible exposure of the spatial estimation system.

11. The spatial estimation system of any one of claims 4 to 10, wherein the detection of the one or more objects includes detection of the one or more objects within a region within the field of view with an increased point density.

12. The spatial estimation system of any one of claims 4 to 11, wherein the detection of the one or more objects includes detection of individual or total size of the one or more objects.

13. The spatial estimation system of any one of claims 1 to 12, wherein at least one of the eye-safety-relaxation trigger and the eye-safety-restriction trigger is dependent on a speed of an entity on or in which the spatial estimation system is located, or the spatial estimation system of any one of claims 4 to 12, wherein the at least one threshold distance is dependent on speed of an entity on or in which the spatial estimation system is located.

14. The spatial estimation system of claim 13, wherein the at least one threshold distance is continuously variable based on the entity speed.

15. The spatial estimation system of claim 13, wherein the at least one threshold distance is discretely variable based on the entity speed.

16. The spatial estimation system of claim 15, wherein the at least one threshold distance varies based on whether the entity speed is above or below at least one threshold speed, wherein the at least one threshold distance is a first distance when the entity speed is below the at least one threshold speed and the at least one threshold distance is a second distance, different to the first distance, when the entity speed is at or above the at least one threshold speed.

17. The spatial estimation system of any one of the preceding claims, wherein at least one of the eye-safety-relaxation trigger and the eye-safety-restriction trigger is based on a determination of speed of an entity where the spatial estimation system is located with respect to at least one threshold speed.

18. The spatial estimation system of any one of the preceding claims, wherein at least one of the eye-safety-relaxation trigger and the eye-safety-restriction trigger is based on a determination of condition of the environment.

19. The spatial estimation system of claim 18, wherein the determination of environment condition includes determining one or more road conditions.

20. The spatial estimation system of any one of claims 13 to 19, wherein the entity speed is determined by a speed determination module of the control system.

21. The spatial estimation system of claim 20, wherein the entity speed is determined by calculating speed at which the spatial estimation system approaches a static object detected in the environment.

22. The spatial estimation system of any one of the preceding claims, wherein a response time for the spatial estimation system to switch operation from the second power density mode to the third power density mode is based on a resulting exposure not greater than a maximum permissible exposure of the spatial estimation system.
23. A method of operating a spatial estimation system including, by the spatial estimation system:
generating first outgoing light and directing the first outgoing light into the environment across a field of view according to a first scan pattern from a plurality of selectable scan patterns, to operate the spatial estimation system in a first power density mode;
receiving incoming light from the environment, wherein the incoming light from the environment includes reflected first outgoing light;
in response to an eye-safety-relaxation trigger derived from the first incoming light: generating second outgoing light and directing the second outgoing light into the environment across a field of view according to a second scan pattern from the plurality of selectable scan patterns, to operate the spatial estimation system in a second power density mode, different to the first power density mode;
receiving incoming light from the environment, wherein the incoming light from the environment includes reflected second outgoing light;
in response to an eye-safety-restriction trigger derived from the second incoming light: generating third outgoing light and directing the third outgoing light into the environment across a field of view according to a third scan pattern from the plurality of selectable scan patterns, to operate the spatial estimation system in a third power density mode, different to the second power density mode;
wherein, within at least a region of a field of view of the spatial estimation system encompassed by the first, second and third scan patterns, a power density of the outgoing light for the first power density mode is lower than a power density of the outgoing light for the second power density mode; and a power density of the outgoing light for the third power density mode is lower than the power density for the second power density mode.

24. The method of claim 23, wherein the spatial estimation system includes at least one light source configured to provide light at a plurality of different power levels and wherein the power density of the outgoing light for the first power density mode is lower than the power density of the outgoing light for the second power density mode due to the outgoing light having a lower power level within at least the region of the field of view.

25. The method of claim 23, wherein the spatial estimation system includes at least one light source configured to provide light at a plurality of different power levels and wherein the power density of the outgoing light for the third power density mode is lower than the power density of the outgoing light for the second power density mode due to the outgoing light having a lower power level within at least the region of the field of view.

26. The method of any one of claims 23 to 25, wherein the spatial estimation system includes at least one light source configured to provide light at a plurality of different wavelength channels and a beam director configured to direct the different wavelength channels into the environment in different directions, and wherein the power density of the outgoing light for the first power density mode is lower than the power density of the outgoing light for the second power density mode based on the outgoing light having a different distribution of wavelengths between the first scan pattern and the second scan pattern.
27. The method of any one of claims 23 to 25, wherein the spatial estimation system includes at least one light source configured to provide light at a plurality of different wavelength channels and a beam director configured to direct the different wavelength channels into the environment in different directions, and wherein the power density of the outgoing light for the third power density mode is lower than the power density of the outgoing light for the second power density mode based on the outgoing light having a different distribution of wavelengths between the third scan pattern and the second scan pattern.

28. The method of any one of claims 23 to 27, wherein at least one of the eye-safety-relaxation trigger and the eye-safety-restriction trigger is based on detection of one or more objects with respect to at least one threshold distance.

29. The method of claim 28, wherein the detection of the one or more objects includes identification of at least one of the one or more objects.

30. The method of claim 28 or claim 29, wherein the detection of the one or more objects includes determination of exposure of the one or more objects to the outgoing light.

31. The method of any one of claims 28 to 30, wherein the detection of the one or more objects includes detection of the one or more objects within a region within the field of view with an increased point density.

32. The method of any one of claims 28 to 31, wherein the detection of the one or more objects includes detection of individual or total size of the one or more objects.

33. The method of any one of claims 23 to 32, wherein at least one of the eye-safety-relaxation trigger and the eye-safety-restriction trigger is dependent on speed of an entity on or in which the spatial estimation system is located, or the method of any one of claims 28 to 32, wherein the at least one threshold distance is dependent on speed of an entity on or in which the spatial estimation system is located.

34. The method of claim 33, wherein the at least one threshold distance is continuously variable based on the entity speed.

35. The method of claim 33, wherein the at least one threshold distance is discretely variable based on the entity speed.

36. The method of claim 35, wherein the at least one threshold distance varies based on whether the entity speed is above or below at least one threshold speed, wherein the at least one threshold distance is a first distance when the entity speed is below the at least one threshold speed and the at least one threshold distance is a second distance, different to the first distance, when the entity speed is at or above the at least one threshold speed.

37. The method of any one of claims 23 to 36, wherein at least one of the eye-safety-relaxation trigger and the eye-safety-restriction trigger is based on a determination of speed of an entity where the spatial estimation system is located with respect to at least one threshold speed.

38. The method of any one of claims 23 to 37, wherein at least one of the eye-safety-relaxation trigger and the eye-safety-restriction trigger is based on a determination of condition of the environment.

39. The method of any one of claims 23 to 38, wherein a response time for the spatial estimation system to switch operation from the second power density mode to the third power density mode is based on a resulting exposure less than a maximum permissible exposure of the spatial estimation system.
PCT/AU2022/051489 2021-12-24 2022-12-12 Spatial estimation system with controlled outgoing light WO2023115104A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2021904278A AU2021904278A0 (en) 2021-12-24 Spatial estimation system with controlled outgoing light
AU2021904278 2021-12-24

Publications (1)

Publication Number Publication Date
WO2023115104A1 true WO2023115104A1 (en) 2023-06-29

Family

ID=86900752

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2022/051489 WO2023115104A1 (en) 2021-12-24 2022-12-12 Spatial estimation system with controlled outgoing light

Country Status (1)

Country Link
WO (1) WO2023115104A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20180130480A (en) * 2016-10-25 2018-12-07 전자부품연구원 Optical transmitting and receiving device and method
KR20190114343A (en) * 2018-03-29 2019-10-10 주식회사 에스오에스랩 Lidar scanning device
US20200300977A1 (en) * 2019-03-22 2020-09-24 Viavi Solutions Inc. Time of flight-based three-dimensional sensing system
WO2020242834A1 (en) * 2019-05-30 2020-12-03 OPSYS Tech Ltd. Eye-safe long-range lidar system using actuator
KR20210027004A (en) * 2019-08-28 2021-03-10 주식회사 에스오에스랩 Distance measuring device

Similar Documents

Publication Publication Date Title
US11940538B2 (en) Spatial profiling system and method
US11686826B2 (en) Measuring time-of-flight using a plurality of detector subsystems and histogram storage
US11226403B2 (en) Chip-scale coherent lidar with integrated high power laser diode
US11789132B2 (en) Compensation circuitry for lidar receiver systems and method of use thereof
US9335414B2 (en) Frequency agile LADAR
US20140241731A1 (en) System and method for free space optical communication beam acquisition
CN106443707B (en) Hyperspectral laser radar system and control method thereof
WO2022007727A1 (en) Preamble pulse based lidar systems and methods
US20220113411A1 (en) Lidar system and method with coherent detection
WO2023115104A1 (en) Spatial estimation system with controlled outgoing light
WO2024020526A1 (en) Nanosecond pulsed wavelength agile laser
WO2022150129A1 (en) Systems and methods for controlling laser power in light detection and ranging (lidar) systems
WO2023044524A1 (en) Light-based spatial estimation systems
US11940570B2 (en) Virtual windows for LiDAR safety systems and methods
US20240069330A1 (en) Optical beam director
US20220128689A1 (en) Optical systems and methods for controlling thereof
EP3982153A1 (en) Lidar system and method with coherent detection
EP4202486A1 (en) Lidar system and a method of calibrating the lidar system
CN116500630A (en) Ranging method for laser radar, laser radar and computer readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22908866

Country of ref document: EP

Kind code of ref document: A1