WO2018055513A2 - Methods circuits devices assemblies systems and functionally associated machine executable code for light detection and ranging based scanning - Google Patents

Methods circuits devices assemblies systems and functionally associated machine executable code for light detection and ranging based scanning

Info

Publication number
WO2018055513A2
WO2018055513A2 (PCT/IB2017/055665)
Authority
WO
WIPO (PCT)
Prior art keywords
pulse
scene
photonic
scanning
signal
Prior art date
Application number
PCT/IB2017/055665
Other languages
French (fr)
Other versions
WO2018055513A3 (en)
Inventor
Omer KEILAF
Oren BUSKILA
Amit Steinberg
David Elooz
Nir OSIROFF
Ronen ESHEL
Yair ANTMAN
Guy ZOHAR
Yair Alpern
Moshe Medina
Smadar David
Pavel BERMAN
Oded Yeruhami
Julian Vlaiko
Hanoch KREMER
Original Assignee
Innoviz Technologies Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US15/391,916 external-priority patent/US20180100928A1/en
Priority claimed from US15/393,749 external-priority patent/US20180113216A1/en
Priority claimed from US15/393,593 external-priority patent/US20180081038A1/en
Priority claimed from US15/393,285 external-priority patent/US20180081037A1/en
Application filed by Innoviz Technologies Ltd. filed Critical Innoviz Technologies Ltd.
Publication of WO2018055513A2 publication Critical patent/WO2018055513A2/en
Publication of WO2018055513A3 publication Critical patent/WO2018055513A3/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G01S7/4817 Constructional features relating to scanning
    • G01S7/497 Means for monitoring or calibrating
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/08 Systems determining position data of a target for measuring distance only
    • G01S17/10 Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
    • G01S17/42 Simultaneous measurement of distance and other co-ordinates
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for mapping or imaging

Definitions

  • the present invention relates generally to the field of light detection and ranging. More specifically, the present invention relates to a Lidar with a photonic steering assembly.
  • Lidar, which may also be called "LADAR", is a surveying method that measures the distance to a target by illuminating that target with laser light. Lidar is sometimes considered an acronym of "Light Detection and Ranging", or a portmanteau of "light" and "radar". Lidar may be used in terrestrial, airborne, and mobile applications.
  • Autonomous Vehicle Systems are directed to vehicle level autonomous systems involving a Lidar system.
  • An autonomous vehicle system refers to any vehicle integrating partial or full autonomous capabilities.
  • Autonomous or semi-autonomous vehicles are vehicles (such as motorcycles, cars, buses, trucks and more) that control at least part of the driving task without human input.
  • Such autonomous vehicles sense their environment and navigate to a destination input by a user/driver.
  • Unmanned aerial vehicles, which may be referred to as drones, are aircraft without a human pilot on board; they may also utilize Lidar systems.
  • The drones may be controlled autonomously or by a remote human operator.
  • Autonomous vehicles and drones may use Lidar technology to aid in detecting and scanning the scene, i.e. the area in which the vehicle and/or drone is operating.
  • A host system refers to any computing environment that interfaces with the Lidar, be it a vehicle system or a testing/qualification environment.
  • Such a computing environment may include any device, PC, server, cloud, or a combination of one or more of these.
  • This category also covers, as a further example, interfaces to external devices such as cameras and sources of car ego-motion data (acceleration, steering-wheel deflection, reverse drive, etc.). It also covers the multitude of interfaces over which a Lidar may communicate with the host system, such as a CAN bus.
  • A light detection and ranging (Lidar) device may include: a photonic pulse emitter assembly including one or more photonic emitters to generate and focus a photonic inspection pulse towards a photonic transmission (TX) path of the Lidar device; a photonic detection assembly including one or more photo sensors to receive and sense photons of reflected photonic inspection pulses received through a receive (RX) path of the device; and a photonic steering assembly located along both the TX and RX paths and including a Complex Reflector (CR) made of an array of steerable reflectors, where a first set of steerable reflectors is part of the TX path and a second set of steerable reflectors is part of the RX path.
  • the first set of steerable reflectors may direct a photonic inspection pulse from the photonic pulse emitter assembly towards a given segment of a scene to be inspected.
  • the second set of steerable reflectors may direct a photonic inspection pulse reflection, reflected off of a surface of an element present in the given segment of the scene, towards the photonic detection assembly.
  • the array of steerable reflectors may be dynamic steerable reflectors.
  • the dynamic steerable reflectors may have a controllable state, such as a transmission state, a reception state and/or an idle state.
  • the separate set of reflectors allocated to the RX path and to the TX path may achieve optical isolation between the TX and RX paths by spatial diversity and multiplexing.
  • the first set of steerable reflectors may be mechanically coupled to each other and the second set of steerable reflectors may be mechanically coupled to each other.
  • the dynamic steerable reflectors are individually steerable.
  • the first set of steerable reflectors may have a first phase and may be substantially synchronized and the second set of steerable reflectors may have a second phase and may be substantially synchronized.
  • the first phase and the second phase may have a substantially fixed difference between them.
  • the first set of steerable reflectors may oscillate together at a first frequency and the second set of steerable reflectors may oscillate together at a second frequency.
  • the first and second frequencies may have a substantially fixed phase shift between them.
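The oscillation relationship above can be sketched numerically. This is a minimal illustrative model, not the patented mechanism: the function name, the sinusoidal motion and the default amplitude are assumptions.

```python
import math

def mirror_angles(t, f_tx, f_rx, phase_shift_rad, amplitude_rad=0.1):
    """Instantaneous deflection angles (radians) for the TX and RX
    reflector sets, each oscillating at its own frequency with a
    fixed phase shift between the two sets (illustrative model)."""
    tx = amplitude_rad * math.sin(2 * math.pi * f_tx * t)
    rx = amplitude_rad * math.sin(2 * math.pi * f_rx * t + phase_shift_rad)
    return tx, rx
```

With equal frequencies, the phase shift alone fixes the angular offset between the two reflector sets at every instant.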
  • Increasing the number of dynamic steerable reflectors in a transmission state may increase the transmission beam spread, and/or decreasing the number of dynamic steerable reflectors in a reception state may decrease the reception field of view (FOV) and may compensate for ambient light conditions.
  • dynamic steerable reflectors in an idle state may provide isolation between dynamic steerable reflectors in a transmission state and a reception state.
  • a first set of steerable reflectors may be surrounded by a second set of steerable reflectors.
  • a second set of steerable reflectors may be surrounded by a first set of steerable reflectors.
  • A method of scanning a scene may include: emitting a photonic pulse towards a photonic transmission (TX) path; receiving reflected photonic pulses through a receive (RX) path; detecting, with a detector, a scene signal based on the reflected photonic inspection pulses; and complexly steering the photonic pulse towards a scene and the reflected photonic pulses from the scene to the detector, by reflecting the photonic pulse at a first phase and receiving the reflected pulse at a second phase, where the difference between the first and second phases may depend on the time it takes the photonic pulse to be reflected and return.
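The time-of-flight-dependent phase difference described above can be sketched as a function of the round trip to the target. A minimal sketch, assuming a periodic scan; the name `rx_phase_lag` and its parameters are hypothetical.

```python
import math

def rx_phase_lag(target_range_m, scan_freq_hz, c=299_792_458.0):
    """Phase (radians) by which the RX reflector set may trail the TX
    set so it points at the segment illuminated one round trip
    earlier. Illustrative: assumes a periodic scan at scan_freq_hz."""
    tof = 2.0 * target_range_m / c          # round-trip time of flight
    return 2.0 * math.pi * scan_freq_hz * tof
```

At automotive ranges the round trip lasts about a microsecond, so the resulting phase lag is small but nonzero for fast scan frequencies.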
  • A vehicle may include a scanning device to produce a detected scene signal, the scanning device including: a photonic pulse emitter assembly including one or more photonic emitters to generate and focus a photonic inspection pulse towards a photonic transmission (TX) path of the device; a photonic detection assembly including one or more photo sensors to receive and sense photons of reflected photonic inspection pulses received through a receive (RX) path of the device; a photonic steering assembly located along both the TX and RX paths and including a Complex Reflector (CR) made of an array of steerable reflectors, where a first set of steerable reflectors is part of the TX path and a second set of steerable reflectors is part of the RX path; and a host controller to receive the detected scene signal and control the host device at least partially based on the detected scene signal.
  • a host controller may be configured to relay a host signal to the scanning device.
  • Other aspects of the present invention include methods, circuits, assemblies, devices, systems and functionally associated machine executable code for controllably steering an optical beam.
  • A light steering device may include: a mirror connected to one or more electromechanical actuators through a flexible interconnect element, the one or more electromechanical actuators mechanically interconnected to a frame, and a controllable electric source to, during operation of the device, provide a sensing signal at a source voltage to an electric source contact on at least one of the one or more actuators.
  • The light steering device may include an electrical sensing circuit connected to an electric sensing contact on at least one of the one or more actuators and, during operation of the device, may measure parameters of the sensing circuit.
  • the electric source and the electrical sensing circuit may be connected to the same actuator and facilitate sensing of a mechanical deflection of the actuator to which the electric source and the electrical sensing circuit are connected.
  • the device may include a sensor to relay a signal indicating an actual deflection determined based on the mechanical deflection.
  • the device may include a controller to control the controllable electric source and the electrical sensing circuit. The controller may also control deflection of the actuator and may correct a steering signal based on the sensed mechanical deflection.
  • the electric source and the electrical sensing circuit may be each connected to a contact on two separate actuators and they may facilitate sensing of a mechanical failure of one or more elements supported by the two separate actuators.
  • Sensing of a mechanical failure may be determined based on an amplitude of a sensed current and/or based on a difference between an expected current and a sensed current.
  • Throughout, references to current are understood to also cover: (a) voltage, (b) a current frequency, (c) a voltage frequency, (d) electrical charge, and more.
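The current-based failure test above might be sketched as follows; the relative threshold and function name are assumptions for illustration, not values taken from the specification.

```python
def check_mechanical_failure(sensed_current_a, expected_current_a,
                             rel_tolerance=0.2):
    """Flag a suspected mechanical failure when the sensed current
    through the actuator deviates from the expected current by more
    than rel_tolerance (hypothetical threshold)."""
    deviation = abs(sensed_current_a - expected_current_a)
    return deviation > rel_tolerance * abs(expected_current_a)
```

A broken interconnect typically shows up as an open circuit (near-zero current), which this amplitude comparison would catch immediately.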
  • a scanning device may include: a photonic emitter assembly (PTX) to produce pulses of inspection photons which pulses are characterized by at least one pulse parameter, a photonic reception and detection assembly (PRX) to receive reflected photons reflected back from an object, the PRX including a detector to detect the reflected photons and produce a detected scene signal, a photonic steering assembly (PSY) functionally associated with both the PTX and the PRX to direct the pulses of inspection photons in a direction of an inspected scene segment based on at least one PSY parameter and to produce a sensing signal, and a closed loop controller to: (a) control the PSY, (b) receive the sensing signal and (c) update the at least one PSY parameter at least partially based on the detected scene signal.
  • the sensing signal may be indicative of an actual deflection of the PSY and/or a mechanical failure.
  • A method of scanning utilizing a mirror assembly including a mirror and a conductive actuator may include: setting a mirror having a conductive actuator to a predetermined deflection, detecting a current through the actuator indicative of a mechanical deflection of the mirror, and determining if the predetermined deflection is substantially similar to the actual deflection. The method may further include correcting the actual deflection if the predetermined deflection and the actual deflection are substantially different. The method may also include detecting an actual current through the actuator and the mirror indicative of an electro-mechanical state of the mirror assembly, comparing the actual current to an expected current, and determining if a mechanical failure has occurred.
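The deflection-correction steps above can be sketched as a simple closed-loop fix; the proportional correction and the tolerance value are illustrative assumptions.

```python
def correct_deflection(commanded_rad, sensed_rad, tolerance_rad=1e-3):
    """Return a corrected steering command: if the sensed (actual)
    deflection is substantially different from the commanded one,
    add the error back onto the command (simple proportional fix)."""
    error = commanded_rad - sensed_rad
    if abs(error) <= tolerance_rad:
        return commanded_rad            # close enough: no correction
    return commanded_rad + error        # nudge toward the target
```

A real controller would filter the sensed signal and limit the correction rate; this sketch only shows the compare-then-correct decision named in the method.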
  • A scanning device may include: a photonic emitter assembly (PTX) to emit at least one pulse of inspection photons in accordance with at least one adjustable pulse (generation) parameter; a photonic reception and detection assembly (PRX) to receive reflected photons reflected back from an object, the PRX including a dynamic detector to detect the reflected photons based on one or more adjustable detector parameters, the detector further configured to produce a detected scene signal; and a closed loop controller to control the PTX and PRX and to receive a PTX feedback and a PRX feedback, the controller further comprising a situational assessment unit to receive the detected scene signal from the detector, produce a scanning plan, and update the at least one adjustable pulse parameter and at least one detector parameter at least partially based on the scanning plan.
  • the scanning device may include a photonic steering assembly (PSY) and the situational assessment unit may be configured to determine the scanning plan based on a global cost function where the PSY feedback, PRX feedback, PTX feedback, memory information, host feedback and the detected scene signal are used in producing the scanning plan and the host feedback includes an override flag to indicate that the host feedback is to override the other signals and feedbacks.
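The global cost function with a host override flag might look like the following toy sketch; every field name, candidate plan and weight here is a hypothetical placeholder, since the text does not fix a concrete data model.

```python
def produce_scan_plan(scene_signal, psy_fb, prx_fb, ptx_fb,
                      memory_info, host_fb):
    """Toy scan-plan selection by a global cost function combining
    PSY/PRX/PTX feedback, memory information, the detected scene
    signal and host feedback with an override flag."""
    if host_fb.get("override"):
        return host_fb["plan"]          # host feedback wins outright
    candidates = memory_info["candidate_plans"]
    # Cost = weighted penalty/reward terms from each feedback source.
    def cost(plan):
        return (plan["power"] * ptx_fb["power_weight"]
                + plan["frame_time"] * psy_fb["time_weight"]
                - plan["coverage"] * scene_signal["roi_weight"]
                - plan["snr"] * prx_fb["snr_weight"])
    return min(candidates, key=cost)
```

The override branch models the bullet's requirement that host feedback can supersede all other signals and feedbacks.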
  • a scanning device including a photonic emitter assembly (PTX), a photonic reception and detection assembly (PRX), a photonic steering assembly (PSY) and a controller adapted to synchronize operation of the PTX, PRX and PSY.
  • The controller may be a situationally aware controller which dynamically adjusts the operational mode and operational/scanning parameters of the PTX, PRX and/or PSY based on one or more detected situational characteristics.
  • a scanning device may include a photonic emitter assembly (PTX) to produce pulses of inspection photons wherein the pulses are characterized by at least one pulse parameter, a photonic reception and detection assembly (PRX) to receive reflected photons reflected back from an object, the PRX including a detector to detect the reflected photons and produce a detected scene signal, a photonic steering assembly (PSY) functionally associated with both the PTX and the PRX to direct the pulses of inspection photons in a direction of an inspected scene segment and to steer the reflection photons back to the PRX, and a closed loop controller to: (a) control the PTX, PRX and PSY, (b) receive the detected scene signal from the detector and (c) update the at least one pulse parameter at least partially based on the detected scene signal.
  • At least one pulse parameter may be selected from the following group: pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase and/or polarization.
  • the controller may include a situational assessment unit to receive the detected scene signal and produce a scanning/work plan.
  • the situational assessment unit may receive a PSY feedback from the PSY.
  • the situational assessment unit may receive information stored on a memory.
  • the information may be selected from the following list: laser power budget, electrical operational characteristics and/or calibration data.
  • the situational assessment unit may use the PSY feedback to produce the scanning/work plan.
  • Laser power budget may be derived from constraints such as: eye safety limitations, thermal budget, laser aging over time and more.
  • The work plan may be produced based on (a) a real-time detected scene signal, (b) an intra-frame level scene signal and (c) an inter-frame level scene signal accumulated and analyzed over two or more frames.
  • the detector may be a dynamic detector having one or more detector parameters and the closed loop controller may update the detector parameters based on the work plan.
  • the detector parameters may be selected from the following group: scanning direction, frame rate, ambient light effects, mechanical static and dynamic impairments, dynamic gating for reducing parasitic light, dynamic sensitivity and/or thermal effects.
  • The PSY may have one or more steering parameters and the closed loop controller may update the steering parameters based on the work plan.
  • the steering parameters may be selected from the following group: scanning method, power modulation, single or multiple axis methods, synchronization components.
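The adjustable pulse parameters enumerated above can be collected into a simple structure that a closed-loop controller might update. The types, units and default values below are illustrative assumptions only; the text names the parameters but not their representation.

```python
from dataclasses import dataclass

@dataclass
class PulseParameters:
    """Adjustable pulse parameters named in the text; numeric types,
    units and defaults are hypothetical."""
    power_intensity_w: float = 1.0
    width_ns: float = 10.0
    repetition_rate_hz: float = 100_000.0
    duty_cycle: float = 0.001
    wavelength_nm: float = 905.0
    phase_rad: float = 0.0
    polarization: str = "linear"
```

A controller could then update individual fields per scene segment, e.g. raising `power_intensity_w` for distant regions of interest while respecting the laser power budget.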
  • the situational assessment unit may receive a host feedback from a host device and use the host feedback to produce or contribute to the work plan.
  • a method of scanning a scene may include: producing pulses of inspection photons wherein the pulses may be characterized by at least one pulse parameter, receiving reflected photons reflected back from an object; detecting the reflected photons and producing a detected scene signal; and updating at least one pulse parameter based on the detected scene signal.
  • the method may include producing a work plan based on the detected scene signal.
  • producing a work plan is also based on a PSY feedback, and may also be based on information stored on a memory such as a look up table or otherwise.
  • the method may include updating one or more detector parameters based on the work plan, and updating steering of the PSY based on the work plan.
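The closed feedback loop described above (produce pulses, detect, update parameters from the detected scene signal) can be sketched as follows; the callables are hypothetical stand-ins for the PTX, PRX and controller blocks.

```python
def scan_loop(emit, detect, update_params, params, n_frames):
    """Minimal closed-loop scan skeleton: emit with the current pulse
    parameters, detect the reflected scene signal, then update the
    parameters from that signal before the next frame."""
    signals = []
    for _ in range(n_frames):
        emit(params)                        # PTX: fire with current params
        scene_signal = detect()             # PRX: detected scene signal
        params = update_params(params, scene_signal)  # controller step
        signals.append(scene_signal)
    return params, signals
```

The work-plan production step would live inside `update_params`, which is where PSY feedback and memory information would also be folded in.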
  • a vehicle may include a scanning device and a host device to receive a detected scene signal and control the vehicle at least partially based on the detected scene signal and to relay a host feedback to the scanning device.
  • The situational assessment unit of the scanning device may receive a host feedback from the host device and use the host feedback to produce the work plan.
  • A scanning device may include: a photonic emitter assembly (PTX) to emit at least one pulse of inspection photons in accordance with at least one adjustable pulse parameter; a photonic reception and detection assembly (PRX) to receive reflected photons reflected back from an object, the PRX including a detector to detect the reflected photons in accordance with at least one adjustable detection parameter and produce a detected scene signal; a photonic steering assembly (PSY) functionally associated with both the PTX and the PRX to direct the pulses of inspection photons in a direction of an inspected scene segment and to steer the reflection photons back to the PRX; and a closed loop controller to: (a) control the PTX, PRX and PSY, (b) receive the detected scene signal from the detector and (c) update the at least one pulse parameter and at least one detection parameter at least partially based on the detected scene signal.
  • the steering assembly may be configured to direct and to steer in accordance with at least one adjustable steering parameter, determined by a work plan.
  • the steering parameters may be selected from: transmission pattern, sample size of the scene, power modulation that defines the range accuracy of the scene, correction of axis impairments, dynamic FOV determination, scanning method, single or multiple deflection axis methods, synchronization components and more.
  • The pulse parameter may be selected from: pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase and polarization and more.
  • the detection parameter may be selected from: scanning direction, frame rate, ambient light effects, mechanical static and dynamic impairments, thermal effects and more.
  • The work plan may be derived from a background model, a region of interest (ROI) model, a region of non-interest (RONI) model and/or a host signal or otherwise.
  • the steering parameter may be a field of view (FOV) determination.
  • The detected scene signal may be characterized by an adjustable quality of service.
  • An autonomous vehicle may include a scanning device as discussed above and a host controller to receive the detected scene signal and to relay a host feedback, including host ego-motion information, to the scanning device.
  • Ego-motion information may include: wheels steering position, vehicle speed, vehicle acceleration, vehicle braking, headlights status, turning lights status, GPS location information and more.
  • the work plan may be derived from a background model at least partially stored in the host controller and may be relayed to the scanning device via the host feedback.
  • the detected scene signal may be emitted in accordance with an adjustable quality of service.
  • A method of scanning a scene may include: emitting at least one pulse of inspection photons in accordance with at least one adjustable pulse parameter; detecting reflected photons in accordance with at least one adjustable detection parameter and producing a detected scene signal; estimating a scene composition of scene elements present within a scene segment and deriving a scanning plan at least partially from the detected scene signal; and updating at least one pulse parameter and at least one detection parameter at least partially based on the scanning plan.
  • Fig. A1A shows an active scanning device which may include or be otherwise functionally associated with one or more photonic steering assemblies in accordance with some embodiments;
  • Fig. A1B is an example embodiment of the scanning device of Fig. A1A;
  • Figs. A2A & A2B show a side view of a plurality of steerable reflector units in accordance with some embodiments;
  • Fig. A2C shows a block level diagram of a steerable reflector unit in accordance with some embodiments;
  • Fig. A3 shows an example complex reflector in accordance with some embodiments
  • FIGs. A4A - A4D show example steering devices in accordance with some embodiments
  • Figs. A5A - A5C show example scanning device schematics in accordance with some embodiments;
  • Fig. A6 shows an example scanning system in accordance with some embodiments
  • Fig. A7 is a flowchart associated with a method of scanning a scene in accordance with some embodiments.
  • Fig. B1 shows a steering device which may be associated with or part of a scanning device in accordance with some embodiments
  • Fig. B2 shows an example embodiment of a steering device and a central processing unit (CPU) in accordance with some embodiments
  • Fig. B3 shows an example actuator-mirror depiction in accordance with some embodiments
  • Figs. B4A-B4C show a dual axis MEMS mirror, a single axis MEMS mirror and a round MEMS mirror (respectively) in accordance with some embodiments;
  • Figs. B5A-B5C show example scanning device schematics
  • Fig. B6 shows an example scanning system in accordance with some embodiments
  • Fig. B7 shows a flowchart of a method for scanning in accordance with some embodiments
  • Figs. C1 A - C1 C show examples of scanning device schematics in accordance with some embodiments
  • Fig. C2 shows a scanning system in accordance with some embodiments
  • Figs. C3A & C3B show example inspection photonic pulse control signals, including example laser signals, in accordance with some embodiments;
  • Fig. C4 shows an example scanning system in accordance with some embodiments;
  • FIGs. C5A & C5B show example host systems in accordance with some embodiments
  • Fig. C6 shows a flowchart for a method of scanning a scene in accordance with some embodiments
  • Figs. D1A - D1C depict example monostatic and bistatic (respectively) scanning device schematics in accordance with some embodiments;
  • Fig. D2 depicts an example scanning system in accordance with some embodiments
  • Fig. D3 shows example inspection photon pulses control signals including example laser signals A, B and C;
  • Figs. D4A - D4F show schematics of different scanning plans which may be utilized to control pulse parameters and/or detector parameters and/or steering parameters, and an identical key 402 for all of these figures;
  • Fig. D5A shows a schematic of a scene to be scanned by a scanning device in accordance with some embodiments
  • Fig. D5B shows a chart of the power or resource allocation for the scene of Fig. D5A and a chart depicting interleaving of ROIs in power allocation over time in accordance with some embodiments;
  • Fig. D6 shows a flowchart of a method for avoiding exceeding a maximal reflected signal value by controlling the transmitted signal in accordance with some embodiments
  • Fig. D7A shows an example scene which may include one or more background element in accordance with some embodiments
  • Fig. D7B shows a flowchart associated with a system learning method for utilizing and updating a background model in accordance with some embodiments
  • Fig. D8 shows two identical scenes in accordance with some embodiments.
  • Fig. D9A shows a FOV ratio including a maximal FOV and an active FOV
  • Figs. D9B & D9C include example maximal and active FOVs in accordance with some embodiments.
  • Fig. D10 shows a flowchart for scanning a scene in accordance with some embodiments.
  • Embodiments of the present invention may include apparatuses for performing the operations herein.
  • This apparatus may be specially constructed for the desired purposes, or it may comprise a general purpose computer selectively activated or reconfigured by a computer program stored in the computer.
  • A computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), electrically programmable read-only memories (EPROMs), electrically erasable and programmable read only memories (EEPROMs), magnetic or optical cards, or any other type of media suitable for storing electronic instructions, and capable of being coupled to a computer system bus.
  • the present invention may include methods, circuits, devices, assemblies, systems and functionally associated machine executable code for Lidar based scanning.
  • According to some embodiments, a scene scanning device may be adapted to inspect regions or segments of a scene using photonic pulses.
  • The photonic pulses used to inspect the scene, also referred to as inspection pulses, may be generated and transmitted with characteristics which are dynamically selected as a function of various parameters relating to the scene to be scanned and/or relating to a state, location and/or trajectory of the device.
  • Sensing and/or measuring of characteristics of inspection pulse reflections from scene elements illuminated with one or more inspection pulses may also be dynamic and may include modulating optical elements on an optical receive path of the device.
  • inspection of a scene segment may include illumination of the scene segment or region with a pulse of photons, which pulse may have known parameters such as pulse duration, pulse angular dispersion, photon wavelength, instantaneous power, photon density at different distances from the emitter and/or average power. Inspection may also include detecting and characterizing various parameters of reflected inspection photons, which reflected inspection photons are inspection pulse photons reflected back towards the scanning device from an illuminated element present within the inspected scene segment (i.e. scene segment element).
  • Parameters of reflected inspection photons may include photon time of flight (time from emission till detection), instantaneous power at and during return pulse detection, average power across the entire return pulse, and photon distribution/signal over the return-pulse period.
  • Based on these parameters, a distance, and possibly a physical characteristic, of one or more elements present in the inspected scene segment may be estimated.
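The distance estimate from photon time of flight follows the standard Lidar relation d = c·t/2 (the pulse travels to the target and back). A minimal sketch; the function name is illustrative:

```python
def range_from_tof(tof_s, c=299_792_458.0):
    """Estimate target distance from photon time of flight (time from
    emission to detection): the one-way range is half the total
    flight distance."""
    return 0.5 * c * tof_s
```

For example, a 1 microsecond round trip corresponds to roughly 150 m of range.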
  • a scene may vary from embodiment to embodiment, depending on the specific intended application of the invention.
  • the term scene may be defined as the physical space, up to a certain distance, surrounding the vehicle (in front of, to the sides of, above, below and behind the vehicle).
  • a scene segment or scene region according to embodiments may be defined by a set of angles in a polar coordinate system, for example, corresponding to a diverging pulse or beam of light in a given direction.
  • the light beam/pulse may have a center radial vector in the given direction and may also be characterized by a broader area defined by angular divergence values (polar coordinate ranges) of the light beam/pulse.
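The polar-coordinate segment definition above can be sketched as a containment test; this is a non-authoritative illustration assuming a segment is given by a center direction (azimuth, elevation) with half-angle divergences, and all names are hypothetical:

```python
def in_segment(center_az: float, center_el: float,
               div_az: float, div_el: float,
               az: float, el: float) -> bool:
    """True if the direction (az, el), in radians, falls inside the
    scene segment centered on (center_az, center_el) whose angular
    extent is div_az by div_el (full divergence angles)."""
    return (abs(az - center_az) <= div_az / 2.0 and
            abs(el - center_el) <= div_el / 2.0)
```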
  • a monostatic scanning Lidar system utilizes the same optical path for transmission (Tx) and reception (Rx) of the laser beam.
  • the laser in the transmission path, and accordingly the inspection photons emitted from the laser, may be well collimated and can be focused into a narrow spot, while the reflected photons in the reception path become a larger patch due to dispersion. Accordingly, a steering device is required that is efficient for a large reflection photon patch in the reception path, along with a beam splitter that redirects the received beam (the reflection photons) to the detector.
  • the large patch of reflection photons requires a large microelectromechanical systems (MEMS) mirror that may have a negative impact on the FOV and frame rate performance. Accordingly, an array of reflective surfaces having a phase between the transmission and reception surfaces is shown. An array contains small mirrors that can perform at a high scan rate with larger angles of deflection.
  • the mirror array may essentially act as a large mirror in terms of effective area. This method decouples the mirror design from the Tx and Rx paths and also obviates the requirement for a beam splitter.
  • Using the same photonic steering assembly may provide for tight synchronization between a direction in which a photonic pulse/beam is steered and emitted by the photonic emitter assembly and a direction of a concurrent FOV of one or more optical sensors of the photonic detection assembly.
  • This shared photonic steering assembly configuration may allow for a photonic detector assembly of a given device to focus upon and almost exclusively collect/receive reflected photons from substantially the same scene segment being concurrently illuminated by the given device's photonic emitter assembly.
  • an active scanning device 100 which may include or be otherwise functionally associated with one or more photonic steering assemblies such as complex reflector (CR) 102.
  • CR 102 may be adapted to adjustably steer photons and/or a photonic pulse towards a selected direction, such as the direction of a center vector of a scene segment to be inspected.
  • CR 102 may be part of a Photonic Transmission (PTX) path 104 of inspection photons emitted by the photonic emitter assembly and may direct, reflectively or using refraction, a pulse of inspection photons towards a scene segment to be inspected 106 (since the inspected scene segment is changing, it is external to the scanning device and therefore has a dashed line in the figure).
  • PTX Photonic Transmission
  • CR 102 may also be part of a photonic reception (PRX) path 108 for reflected inspection photons reflected from a surface of a scene element (object) present within an inspected/illuminated scene segment 106, where CR 102 may direct reflected inspection photons, reflectively or using refraction, towards a photon detection aperture/opening of a photonic detector assembly such as detector 110.
  • PRX photonic reception
  • CR 102 may include one or more sub-groups of steerable reflectors (SR) such as SR1 112 and SR2 114.
  • SR steerable reflectors
  • Each sub-group of electrically controllable/steerable reflectors may include one or more steerable reflector units such as unit 116.
  • Unit 116 may include a microelectromechanical systems mirror or reflective surface assembly and/or an electromechanical actuator and more.
  • SR1 112 and/or SR2 114 may each include one or more units arranged next to one another in either a one- or two-dimensional matrix to form Complex Reflector 102.
  • Unit 116 may be individually controllable, for example by a device controller and/or a steering assembly controller, such that each reflector may be made to tilt towards a specific angle along each of one or two separate axes.
  • a set of array reflectors, optionally reflectors adjacent to one another, may be grouped into a Common Control Reflector (CCR) set of array reflectors which are synchronously controlled with one another so as to concurrently tilt or point in approximately the same direction.
  • CCR Common Control Reflector
  • SR1 112 and SR2 114 may each be comprised of one or more CCRs. Accordingly, CR 102 may be parsed into two or more CCR sets. SR1 112 and SR2 114 may each be in-line with, or part of, a separate optical path. As shown in this example, SR1 112 may be part of PTX 104 while SR2 114 is part of PRX 108.
  • CR 102 may be configured to electrically steer one or more reflectors such as unit 116 to overcome mechanical impairments and drifts due to thermal and gain effects or otherwise.
  • one or more units 116 may move differently than intended (frequency, rate, speed etc.) and their movement may be compensated for by electrically controlling the reflectors appropriately.
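The electrical compensation described above can be illustrated, purely as an assumed proportional-correction sketch (the function name, gain value and signal units are hypothetical, not the claimed control scheme):

```python
def correct_drive(expected_pos: float, measured_pos: float,
                  drive_signal: float, gain: float = 0.5) -> float:
    """Nudge the actuator drive signal toward the expected mirror
    position based on measured position feedback (proportional term
    only; a real controller would likely add integral/derivative terms)."""
    error = expected_pos - measured_pos
    return drive_signal + gain * error
```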
  • PTX 104 may be configured to produce pulses of inspection photons.
  • PTX 104 may include a laser or alternative light source such as laser 118.
  • Laser 118 may be a laser such as a solid state laser, a high power laser or otherwise, or an alternative light source such as an LED based light source or otherwise.
  • PTX 104 may include additional elements, shown by TX elements 120, which may include a collimator, controller, feedback controller/signals and more.
  • PRX 108 may be configured to receive photons reflected back from an object or scene element of scene 106 and produce a detected scene signal.
  • PRX 108 may include a detector such as detector 110.
  • Detector 110 may be configured to detect the reflected photons reflected back from an object or scene element and produce a detected scene signal.
  • PRX 108 may include additional elements, shown by RX elements 122, which may include a module to position a singular scanned pixel window onto/in the direction of detector 110.
  • the detected scene signal may include information such as: time of flight which is indicative of the difference in time between the time a photon was emitted and detected after reflection from an object, reflected intensity, polarization values and more.
  • scanning device 100 may be a monostatic scanning system where PTX 104 and PRX 108 have a joint optical path; for example, scene 106 may be common to both paths, as may CR 102 which, as described above, may be configured to direct pulses of inspection photons from PTX 104 in the direction of an inspected scene and to steer reflection photons from the scene back to PRX 108.
  • Referring to Fig. A2A, shown is a side view of a plurality of steerable reflector units 200; each unit may be substantially similar to unit 116 of Fig. A1A.
  • Each unit may include a reflective surface such as mirror 202, mirror 204, mirror 206, mirror 208 and mirror 210 associated with an actuator such as actuator 212, actuator 214, actuator 216, actuator 218, and actuator 220 (appropriately).
  • Actuators 212-220 may alternatively be termed cantilevers or benders.
  • Mirrors 202-210 may be any reflective surface, for example, made from polished gold, aluminum, silicon, silver, or otherwise.
  • Mirrors 202-210 may be identical or different reflective surfaces varying in size and/or material.
  • Actuators 212-220 may each be an electrically controllable electromechanical actuator such as a stepper motor, direct current motor, galvanometric actuator, electrostatic, magnetic or piezo element, or thermal based actuator, or otherwise. Each actuator 212-220 may cause movement in a mirror support or spring such as segments 222-230.
  • each actuator 212-220 may be a separate actuator or may be a joined actuator for two or more mirrors; for example, if actuator 212 and actuator 216 are a single actuator, then mirrors 202 and 206 may move together.
  • two or more actuators may be controlled to operate substantially in conjunction with each other. It is understood that a sub-group of mirrors and actuators operating in unison or with a shared actuator may form a steerable reflector such as SR1 of Fig. A1A, or that a single unit may be a steerable reflector in and of itself.
  • mirrors operating in a transmission path may have a first angle or a phase shift compared to mirrors operating in a reception path.
  • the phase shift may remain constant across the entire scanning pattern or may exhibit a variation according to the angular position of both the mirrors in the PRX and PTX paths.
  • mirrors 202, 204 and 206 are configured to be reception mirrors and are at a first angle, while mirror 210 is at a second angle and is configured to be a transmission mirror.
  • the mirrors may be disabled and/or in an idle mode: (1) by being electrically or mechanically disabled, for example by being denied the dynamic electrical signal that provides power to the actuator, so that the mirror remains static in a certain position that does not contribute any signal to the scanned scene; it stays in a static location either by applying a certain voltage level or by blocking the mirror by mechanical means; and/or (2) by pointing at or scanning an orthogonal direction with respect to the scene, out of the active FOV; or (3) otherwise.
  • Reflectors in an idle mode may serve as isolation between transmission and reception reflectors, which may improve signal to noise ratio and overall signal detection/quality of signal (QoS).
  • mirrors which are in a transmission state, reception state and/or idle state may be dynamically controllable/selectable.
  • Referring to Fig. A2B, shown is a side view of a plurality of steerable reflector units 250, having the same mirrors 202-210 and actuators 212-220, except that in this instance mirrors 202 and 204 are configured to be transmission mirrors, mirrors 208 and 210 are configured to be reception mirrors and mirror 206 is an idle mirror.
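The per-mirror transmission/reception/idle designation can be sketched as a simple state table; the Python enum below merely mirrors the five-mirror example of Fig. A2B as described above and is not an actual implementation:

```python
from enum import Enum

class MirrorState(Enum):
    TX = "transmission"
    RX = "reception"
    IDLE = "idle"

# Hypothetical assignment matching the Fig. A2B configuration:
# mirrors 202 and 204 transmit, 208 and 210 receive, 206 is idle.
states = {
    202: MirrorState.TX,
    204: MirrorState.TX,
    206: MirrorState.IDLE,
    208: MirrorState.RX,
    210: MirrorState.RX,
}
```

A dynamically controllable array, as described below, would simply rewrite entries of such a table at runtime.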
  • Referring to Fig. A2C, shown is a block level diagram of steerable reflector unit 270 which is substantially similar to unit 116 of Fig. A1A.
  • Unit 270 may include a reflective surface such as mirror 272 and an actuator such as actuator 274. It is understood that mirror 272 is substantially similar to any of mirrors 202-210 and that actuator 274 is substantially similar to actuators 212-220 of Figs. A2A&A2B.
  • Actuator 274 may be part of or attached at one end to a support frame such as frame 278. Actuator 274 may cause movement or power to be relayed to mirror 272.
  • Actuator 274 may include a piezo-electric layer and a semiconductor layer and optionally, a support or base layer.
  • a flexible interconnect element or connector, such as spring 276, may be utilized to adjoin and relay movement from actuator 274 to mirror 272.
  • It is understood that CR 302 may serve as an example for CR 102 and that the discussion above is applicable here as well.
  • CR 302 may include a plurality of steerable reflectors such as SRs 304-334. While a 4X4 matrix of SRs is shown, it is understood that matrices of many dimensions are applicable, such as 1X2, 1X1, 1X4, 2X8, 3X7 or otherwise.
  • SRs 304-334 may be dynamic SRs so that at each point of operating a scanning device which includes CR 302, each SR 304-334 may be controllably designated as either: (a) a complex reception reflector (CRXR) (in a reception state) included in the reception path, which accordingly may steer reflected photons to a detector; (b) a complex transmission reflector (CTXR) (in a transmission state) included in the transmission path, which may steer inspection photons in the direction of a scene; or (c) an idle reflector (in an idle state). Accordingly, the same SR may be at times a CTXR and at other times a CRXR or an idle reflector.
  • CRXR complex reception reflector
  • CTXR complex transmission reflector
  • SRs 304-334 may be static SRs so that they are each either a CRXR or a CTXR. A sub-set out of SRs 304-334 may operate in unison as a steerable reflector in a reception path and a different sub-set out of SRs 304-334 may operate in unison as an SR in a transmission path.
  • the sub-sets may be mechanically, electrically and/or electro-mechanically coupled to each other.
  • a first sub-set of SRs may operate in conjunction as CTXRs in a first phase, and a second sub-set may operate as CRXRs in a second phase.
  • the difference or delta between the first phase and second phase may be determined based on (or to compensate for) an expected difference between transmitted and received photons.
  • the difference between the two phases may be fixed and/or synchronized. If CR 302 is moving in a predetermined controllable path to scan a scene, then the location of CR 302 when the inspection photons are transmitted is different than its location when the reflection photons are received, so the difference in location can be compensated for by planning the second phase accordingly.
  • the difference in phase is primarily utilized to separate the TX path from the RX path.
  • the first and second subsets may oscillate at substantially the same frequency with differences due to mechanical inaccuracies or due to compensation for mechanical inaccuracies, mechanical impairments and/or drifts.
  • Operation of the first and second sets of reflectors may be synchronized.
  • the reflectors of the first set and reflectors of the second set may be made to oscillate at substantially the same frequency.
  • a phase shift between reflectors of the first set and reflectors of the second set may be substantially fixed and/or otherwise synchronized.
  • the phase shift may vary in amplitude dynamically in order to compensate for the time delay between the transmitted photonic pulses and the received reflection. The purpose is to minimize the detector sensitive area by locating the reflected laser spot in the same place on the detector for the entire period of time of flight.
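Assuming a linear (constant-rate) sweep, the dynamically varying phase shift that keeps the reflected spot on the same detector location can be approximated as scan rate times time of flight; this is an illustrative simplification under that assumption, not the claimed control law, and the function name is hypothetical:

```python
import math

def rx_phase_offset(scan_freq_hz: float, tof_seconds: float) -> float:
    """Phase (radians) by which the reception mirrors should lag the
    transmission mirrors so that both point at the same scene direction
    after the round-trip delay, assuming a constant-rate angular sweep."""
    return 2.0 * math.pi * scan_freq_hz * tof_seconds
```

For a 1 kHz scan and a 1 µs round trip the required lag is on the order of milliradians, which suggests why the shift may often be treated as nearly fixed and only varied in amplitude for longer ranges.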
  • SRs 304-334 are static SRs
  • a subset of SRs may be mechanically coupled so that they inherently operate in unison.
  • the sub-set may include some or all of the SRs of the same path.
  • each SR may be controlled separately, and a sub-set of SRs may be controlled substantially in unison so that they operate substantially in unison; in such an example the different SRs may be electrically controlled/coupled together.
  • a combination where part of a sub-group is mechanically coupled to each other is understood (for example, having a shared frame or cantilever).
  • Referring to Figs. A4A-A4D, shown are example steering devices.
  • Fig. A4A depicts a non-symmetric steering device 410, with a plurality of static reception steerable reflectors and a couple of off-center static transmission steerable reflectors; in this example the steerable reflectors are all of a uniform size.
  • Fig. A4B depicts a symmetric steering device 420, with a single centered static reception steerable reflector of a first size and a plurality of static transmission steerable reflectors each of a second size.
  • Fig. A4C depicts a non-symmetric steering device 430, with a plurality of static transmission steerable reflectors and a couple of off-center static reception steerable reflectors; in this example the steerable reflectors are all of a uniform size.
  • Fig. A4D depicts a non-symmetric steering device 440, with a plurality of static reception steerable reflectors of varying sizes, a plurality of static transmission steerable reflectors of varying sizes and a plurality of complex transmission reflectors which may each function as a transmission, reception or idle reflector.
  • increasing the number of transmission (Tx) reflectors may increase the transmitted photon beam spread. Decreasing the number of reception (Rx) reflectors may narrow the reception field and compensate for ambient light conditions (such as clouds, rain, fog, extreme heat and more) and improve signal to noise ratio.
  • Referring to Fig. A1B, shown is an active scanning device 150 which may include or be otherwise functionally associated with one or more photonic steering assemblies such as complex reflector (CR) 152, which is meant to depict an example reflector embodiment. In this example, Tx reflector 162 is an example embodiment of SR1 112 of Fig. A1A having a single unit. Rx reflector sub-unit 164 is an example of SR2 114 of Fig. A1A, having 8 units.
  • CR complex reflector
  • a scene scanning device such as scanning device 512 which may be adapted to inspect regions or segments of a scene (shown here is a specific FOV being scanned) using photonic pulses (transmitted light) whose characteristics may be dynamically selected as a function of: (a) optical characteristics of the scene segment being inspected; (b) optical characteristics of scene segments other than the one being inspected; (c) scene elements present or within proximity of the scene segment being inspected; (d) scene elements present or within proximity of scene segments other than the one being inspected; (e) an operational mode of the scanning device; and/or (f) a situational feature/characteristic of a host platform with which the scanning device is operating.
  • the scene scanning device may be adapted to inspect regions or segments of a scene using a set of one or more photonic transmitters 522 (including a light source such as pulse laser 514), receptors including sensors (such as detecting element 516) and/or steering assembly 524, whose configuration and/or arrangement may be dynamically selected as a function of: (a) optical characteristics of the scene segment being inspected; (b) optical characteristics of scene segments other than the one being inspected; (c) scene elements present or within proximity of the scene segment being inspected; (d) scene elements present or within proximity of scene segments other than the one being inspected; (e) an operational mode of the scanning device; and/or (f) a situational characteristic of a host platform with which the scanning device is operating.
  • Active scanning device 512 may include: (a) a photonic emitter assembly 522 which produces pulses of inspection photons; (b) a photonic steering assembly 524 that directs the pulses of inspection photons to/from the inspected scene segment; (c) a photonic detector assembly 516 to detect inspection photons reflected back from an object within an inspected scene segment; and (d) a controller to regulate operation of the photonic emitter assembly, the photonic steering assembly and the operation of the photonic detection assembly in a coordinated manner and in accordance with scene segment inspection characteristics of the present invention at least partially received from internal feedback of the scanning device so that the scanning device is a closed loop dynamic scanning device.
  • a closed loop scanning device is characterized by having feedback from at least one of the elements and updating one or more parameter based on the received feedback.
  • a closed loop system may receive feedback and update the systems own operation at least partially based on that feedback.
  • a dynamic system or element is one that may be updated during operation.
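One hypothetical closed-loop iteration, in the sense defined above (feedback in, updated parameter out), might look like the following; the specific power-adjustment rule, field names and thresholds are invented purely for illustration:

```python
def closed_loop_step(params: dict, feedback: dict) -> dict:
    """One illustrative closed-loop update: reduce pulse power when the
    detected return saturates the detector, raise it when the return is
    below the noise floor. Returns a new parameter dict."""
    power = params["pulse_power"]
    if feedback["return_level"] > feedback["saturation_threshold"]:
        power *= 0.9  # back off to avoid detector saturation
    elif feedback["return_level"] < feedback["noise_floor"]:
        power *= 1.1  # boost to lift a weak return above the noise
    return {**params, "pulse_power": power}
```

Run repeatedly, such a step converges the transmitted power into the usable dynamic range of the detector, which is the essence of the closed-loop dynamic behavior described above.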
  • inspection of a scene segment may include illumination of the scene segment or region with a pulse of photons (transmitted light), which pulse may have known parameters such as pulse duration, pulse angular dispersion, photon wavelength, instantaneous power, photon density at different distances from the emitter, average power, pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase, polarization and more. Inspection may also include detecting and characterizing various aspects of reflected inspection photons, which reflected inspection photons are inspection pulse photons (reflected light) reflected back towards the scanning device (or laser reflection) from an illuminated element present within the inspected scene segment (i.e. scene segment element).
  • Characteristics of reflected inspection photons may include photon time of flight (time from emission till detection), instantaneous power (or power signature) at and during return pulse detection, average power across the entire return pulse and photon distribution/signal over the return pulse period. The reflected inspection photons are a function of the inspection photons and the scene elements they are reflected from, and so the received reflected signal is analyzed accordingly.
  • a distance and possibly a physical characteristic such as reflected intensity of one or more scene elements present in the inspected scene segment may be estimated.
  • Referring to Fig. A5B, depicted is an example bistatic scanning device schematic 550. It is understood that scanning device 562 is substantially similar to scanning device 512. However, scanning device 512 is a monostatic scanning device while scanning device 562 is a bistatic scanning device. Accordingly, steering element 574 is comprised of two steering elements: steering element for PTX 571 and steering element for PRX 573. The rest of the discussion relating to scanning device 512 of Fig. A5A is applicable to scanning device 562 of Fig. A5B.
  • Referring to Fig. A5C, depicted is an example scanning device with a plurality of photonic transmitters 522 and a plurality of detectors 516, all having a joint steering element 520. It is understood that scanning device 587 is substantially similar to scanning device 512. However, scanning device 587 is a monostatic scanning device with a plurality of transmitting and receiving elements. The rest of the discussion relating to scanning device 512 of Fig. A5A is applicable to scanning device 587 of Fig. A5C.
  • Scanning system 600 may be configured to operate in conjunction with a host device 628 which may be a part of system 600 or may be associated with system 600.
  • Scanning system 600 may include a scene scanning device such as scanning device 604 adapted to inspect regions or segments of a scene using photonic pulses whose characteristics may be dynamically selected.
  • Scanning device 604 may include a photonic emitter assembly (PTX) such as PTX 606 to produce pulses of inspection photons.
  • PTX 606 may include a laser or alternative light source.
  • the light source may be a laser such as a solid state laser, a high power laser or otherwise or an alternative light source such as, a LED based light source or otherwise.
  • Scanning device 604 may be an example embodiment for scanning device 512 of Fig. A5A and/or scanning device 562 of Fig. A5B and/or scanning device 587 of Fig. A5C and the discussion of those scanning devices is applicable to scanning device 604.
  • the photon pulses may be characterized by one or more controllable pulse parameters such as: pulse duration, pulse angular dispersion, photon wavelength, instantaneous power, photon density at different distances from the emitter, average power, pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase, polarization and more.
  • the inspection photons may be controlled so that they vary in pulse duration, pulse angular dispersion, photon wavelength, instantaneous power, photon density at different distances from the emitter, average power, pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase, polarization and more.
  • the photon pulses may vary between each other and the parameters may change during the same signal.
  • the inspection photon pulses may be pseudo-random, chirp sequences and/or may be periodic or fixed and/or a combination of these.
  • the inspection photon pulses may be characterized as: sinusoidal, chirp sequences, step functions, pseudo random signals, or linear signals or otherwise.
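Two of the listed pulse characterizations, a linear chirp and a pseudo-random on/off sequence, can be sketched as follows; the sample counts, seed handling and function names are assumptions made for illustration, not the claimed signal design:

```python
import math
import random

def chirp_samples(f0: float, f1: float, duration: float, n: int) -> list:
    """Linear chirp: instantaneous frequency sweeps from f0 to f1 (Hz)
    over the pulse duration (s), sampled at n evenly spaced points."""
    out = []
    for i in range(n):
        t = duration * i / n
        # Phase of a linear chirp: 2*pi*(f0*t + (f1-f0)*t^2 / (2*duration))
        phase = 2.0 * math.pi * (f0 * t + (f1 - f0) * t * t / (2.0 * duration))
        out.append(math.sin(phase))
    return out

def pseudo_random_pulses(n: int, seed: int = 0) -> list:
    """Pseudo-random on/off pulse sequence, deterministic for a given seed
    so the receiver can correlate against the known transmitted pattern."""
    rng = random.Random(seed)
    return [rng.randint(0, 1) for _ in range(n)]
```

Determinism of the pseudo-random sequence matters: a receiver that knows the seed can correlate returns against the exact transmitted pattern.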
  • scanning device 604 may include a photonic reception and detection assembly (PRX) such as PRX 608 to receive reflected photons reflected back from an object or scene element and produce detected scene signal 610.
  • PRX 608 may include a detector such as detector 612.
  • Detector 612 may be configured to detect the reflected photons reflected back from an object or scene element and produce detected scene signal 610.
  • detected scene signal 610 may include information such as: time of flight, which is indicative of the difference in time between the time a photon was emitted and detected after reflection from an object, reflected intensity, polarization values and more.
  • scanning device 604 may be a monostatic scanning system where PTX 606 and PRX 608 have a joint optical path.
  • Scanning device 604 may include a photonic steering assembly (PSY), such as PSY 616, to direct pulses of inspection photons from PTX 606 in a direction of an inspected scene and to steer reflection photons from the scene back to PRX 608.
  • PSY photonic steering assembly
  • PSY 616 may also be in charge of positioning the singular scanned pixel window onto/in the direction of detector 612.
  • PSY 616 may be a dynamic steering assembly and may be controllable by steering parameters control 618.
  • Example steering parameters may include: scanning method, which defines the acquisition pattern and sample size of the scene; power modulation, which defines the range accuracy of the acquired scene; correction of axis impairments based on collected feedback and reliability confirmation; and definition of which sub-sections are CRXRs and which are CTXRs.
  • PSY 616 may include: (a) a single dual-axis MEMS mirror; (b) a dual single-axis MEMS mirror; (c) a mirror array where multiple mirrors are synchronized in unison and act as a single large mirror; (d) a split mirror array with separate transmission and reception parts; and/or (e) a combination of these and more.
  • part of the array may be used for the transmission path and the second part of the array may be used for the reception path.
  • the transmission mirrors may be synchronized and the reception mirrors may be synchronized separately from the transmission mirrors.
  • the transmission mirrors and the reception mirrors sub arrays may maintain an angular shift between themselves in order to steer the beam into separate ports, essentially integrating a circulator module.
  • PSY 616 may include one or more PSY state sensors to produce a signal indicating an operational state of PSY 616, for example power information or temperature information, reflector state, reflector actual axis positioning, reflector mechanical state and reflector operative state (transmission state, reception state or idle state) and more.
  • PSY 616 may include one or more steerable reflectors, each of which may include a reflective surface associated with an electrically controllable actuator.
  • PSY 616 may include or be otherwise associated with one or more microelectromechanical systems (MEMS) mirror assemblies or an array with a plurality of steerable reflectors.
  • MEMS microeiectromechanicai systems
  • a photonic steering assembly according to refractive embodiments may include one or more refractive materials whose index of refraction may be electrically modulated, either by inducing an electric field around the material or by applying electromechanical vibrations to the material.
  • the PSY 616 complex reflector may include two or more CCRs steerable separately or dependent on each other.
  • the complex reflector may include one or more dynamic CCRs, each of which may be controllable to switch between a transmission, reception and/or idle mode.
  • control signals to PSY 616 may also control: a transmission phase and/or a reception phase; a single phase from which the transmission and/or reception phases are both derived; specific phases for each complex reflector or for each CCR; mode selection for dynamic reflectors (transmission, reception and/or idle); and/or frequency parameters.
  • scanning device 604 may include a controller, such as controller 620.
  • Controller 620 may receive scene signal 610 from detector 612 and may control PTX 606, PSY 616 and PRX 608 including detector 612, based on information stored in the controller memory 622 as well as received scene signal 610, including accumulated information from a plurality of scene signals 610 received over time.
  • controller 620 may process scene signal 610, optionally with additional information and signals, and produce a vision output such as vision signal 624 which may be relayed/transmitted to an associated host device. Controller 620 may receive detected scene signal 610 from detector 612; optionally, scene signal 610 may include time of flight values and intensity values of the received photons. Controller 620 may build up a point cloud or 3D or 2D representation for the FOV by utilizing digital signal processing, image processing and computer vision techniques.
  • controller 620 may include situational assessment logic or circuitry such as situational assessment logic (SAL) 626.
  • SAL 626 may receive detected scene signal 610 from detector 612 as well as information from additional blocks/elements either internal or external to scanning device 604.
  • scene signal 610 can be assessed and calculated, with or without additional feedback signals such as PSY feedback, PTX feedback, PRX feedback and host feedback and information stored in memory 622, according to a weighted means of local and global cost functions that determine a work plan such as work plan signal 634 for scanning device 604 (such as: which pixels in the FOV are scanned, at which laser parameters budget, and at which detector parameters budget).
  • controller 620 may be a closed loop dynamic controller that receives system feedback and updates the system's operation based on that feedback.
  • SAL 626 may receive one or more feedback signals from PSY 616 via PSY feedback 630.
  • PSY feedback 630 may include the instantaneous position of PSY 616, where PSY 616 may include one or more reflecting elements and each reflecting element may contain one or more axes of motion. It is understood that the instantaneous position may be defined or measured in one or more dimensions.
  • PSYs have an expected position; however, PSY 616 may produce an internal signal measuring the instantaneous position (meaning, the actual position). Such feedback may be utilized by situational assessment logic 626 for calculating drift and offset parameters in the PRX and/or for correcting steering parameters control 618 of PSY 616 to correct an offset.
  • PSY feedback 630 may indicate a mechanical failure, which may be relayed to host 628, which may either compensate for the mechanical failure or be controlled to avoid an accident due to the mechanical failure.
  • SAL 626 may select and operate array reflectors. SAL 626 may dynamically select a first set of array reflectors to use as part of the PTX, and may select a second set of reflectors to use as part of the PRX.
  • SAL 626 may increase the number of reflectors in the first set in order to increase inspection pulse (TX beam) spread. SAL 626 may also decrease the number of reflectors in the second set in order to narrow the RX FOV and/or to compensate for background noise or ambient light conditions. Sub-control circuits may be included in PSY 616.
  • PSY feedback 630 may include instantaneous scanning speed of PSY 616.
  • PSY 616 may produce an internal signal measuring the instantaneous speed (meaning, the actual speed and not the estimated or anticipated speed). Providing such feedback may be utilized by situational assessment logic 626 for calculating drift and offset parameters in the PRX and/or for correcting steering parameters control 618 of PSY 616 to correct an offset.
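The drift/offset calculation from expected-versus-measured position feedback can be pictured as a simple smoothing loop. The exponential smoothing and its gain are assumptions of this sketch, not the disclosed method: the measured instantaneous position is compared against the expected one and the error is folded back into the steering command.

```python
def correct_steering(expected_pos, measured_pos, offset_estimate, smoothing=0.1):
    """Update a running offset estimate from PSY position feedback and
    return a corrected steering command.

    offset_estimate: previously accumulated offset (e.g. from drift).
    smoothing: how quickly new measurements update the estimate (0..1).
    """
    error = measured_pos - expected_pos
    # exponentially smoothed drift estimate
    new_offset = (1 - smoothing) * offset_estimate + smoothing * error
    corrected_command = expected_pos - new_offset
    return new_offset, corrected_command
```

The same shape applies to speed and frequency feedback: replace position with the quantity being tracked.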
  • the frequency may be for a single CR or for a CCR and more.
  • PSY feedback 630 may include the instantaneous scanning frequency of PSY 616. PSY 616 may produce an internal signal measuring the instantaneous frequency (meaning, the actual frequency and not the estimated or anticipated frequency). Providing such feedback may be utilized by situational assessment logic 626 for calculating drift and offset parameters in the PRX and/or for correcting steering parameters control 618 of PSY 616 to correct an offset.
  • the instantaneous frequency may be relative to one or more axes.
  • PSY feedback 630 may include mechanical overshoot of PSY 616, which represents a mechanical de-calibration error from the expected position of the PSY in one or more axes.
  • PSY 616 may produce an internal signal measuring the mechanical overshoot. Providing such feedback may be utilized by situational assessment logic 626 for calculating drift and offset parameters in the PRX and/or for correcting steering parameters control 618 of PSY 616 to correct an offset.
  • PSY feedback may also be utilized in order to correct steering parameters in case of vibrations induced by the Lidar system or by external factors such as vehicle engine vibrations or road-induced shocks.
  • PSY feedback 630 may be utilized to correct steering parameters 618 to correct the scanning trajectory and linearize it.
  • the raw scanning pattern may typically be non-linear to begin with and may contain artifacts resulting from fabrication variations and the physics of the MEMS mirror or reflective elements.
  • Mechanical impairments may be static, for example a variation in the curvature of the mirror, or dynamic, for example mirror warp/twist at the scanning edge of motion. Correction of the steering parameters to compensate for these non-linearizing elements may be utilized to linearize the PSY scanning trajectory.
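Linearizing a non-linear raw scanning pattern can be sketched as a predistortion table. This is a hedged illustration — the calibration pairs and the linear interpolation scheme are invented for the example: the measured command-to-deflection curve is inverted so that commands produce the desired, evenly spaced deflections.

```python
def build_predistortion(commands, deflections):
    """Given measured (command, deflection) calibration pairs, return a
    function mapping a *desired* deflection to the command that produces it,
    using linear interpolation between calibration points."""
    pairs = sorted(zip(deflections, commands))

    def command_for(target):
        for (d0, c0), (d1, c1) in zip(pairs, pairs[1:]):
            if d0 <= target <= d1:
                t = (target - d0) / (d1 - d0)
                return c0 + t * (c1 - c0)
        raise ValueError("target deflection outside calibrated range")

    return command_for
```

Dynamic impairments would additionally require the table (or a model) to depend on scan phase, which this static sketch omits.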
  • SAL 626 may receive one or more signals from memory 622.
  • Information received from the memory may include a laser power budget (defined by eye-safety limitations, thermal limitations, reliability limitations or otherwise); electrical operational parameters such as current and peak voltages; and calibration data such as expected PSY scanning speed, expected PSY scanning frequency, expected PSY scanning position and more.
  • steering parameters of PSY 616, detector parameters of detector 612 and/or pulse parameters of PTX 606 may be updated based on the calculated/determined work plan 634. Work plan 634 may be tracked and determined at specific time intervals and with increasing level of accuracy and refinement of feedback signals (such as 630 and 632).
  • Referring to FIG. A7, shown is a flowchart associated with a method of scanning a scene (700).
  • a photonic pulse may be emitted (702) and a reflected pulse may be received (704) and detected (706).
  • the pulses may be steered in a joint path: the photonic pulse, characterized by a first phase, steered toward the scene, and the reflected pulse, characterized by a second phase, steered from the scene toward the detector (708).
  • the steering parameters may be updated including correcting the first or second phase, oscillating frequency and more (712).
  • a steering state may be programmable/adjustable, in which case the initial state is determined and may be updated based on the feedback signals (714).
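The flow of Fig. A7 — emit (702), receive (704), detect (706), steer (708), update steering (712/714) — can be sketched as a closed loop. The function names, the state dictionary and the feedback hook are placeholders for illustration, not the patented method:

```python
def scan_scene(emit, receive, detect, steer, update_steering, frames):
    """One possible shape for the scanning loop: emit a pulse, steer it
    along the joint path, detect the reflection, then update the steering
    state from feedback before the next frame."""
    results = []
    state = {"phase": 0.0, "frequency": 1.0}      # initial steering state (714)
    for _ in range(frames):
        pulse = emit()                            # (702)
        steered = steer(pulse, state)             # (708) joint TX/RX path
        reflection = receive(steered)             # (704)
        sample, feedback = detect(reflection)     # (706)
        state = update_steering(state, feedback)  # (712)
        results.append(sample)
    return results
```

Passing the component behaviours in as callables mirrors the way the controller coordinates otherwise independent PTX/PSY/PRX blocks.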
  • a scanning device may analyze a changing scene to determine/detect scene elements.
  • the scanning device may provide a detected scene output.
  • the host device may utilize a detected scene output or signal from the scanning device to automatically steer or operate or control the host device.
  • the scanning device may receive information from the host device and update the scanning parameters accordingly.
  • Scanning parameters may include: pulse parameters, detector parameters, steering parameters and /or otherwise.
  • a scanning device may detect an obstruction ahead and may cause the host to steer away from the obstruction.
  • the scanning device may also utilize a turning of a steering wheel and update itself to analyze the area in front of the upcoming turn; or, if the host device is a drone, a signal indicating that the drone is intended to land may cause the scanning device to analyze the scene for landing requirements instead of flight requirements.
  • a light source throughout this application has been termed a "laser"; however, it is understood that alternative light sources that do not fall under the technical definition of a laser may replace a laser wherever one is discussed, for example a light emitting diode (LED) based light source or otherwise. Accordingly, a Lidar may actually include a light source which is not necessarily a laser.
  • a sensing signal or an electrical sensing signal may be: (a) a current, (b) a voltage, (c) a current frequency, (d) a voltage frequency, or (e) an electrical charge; any other electrical characteristic (such as capacitance, resistivity and more) is applicable and understood. Accordingly, any embodiment detailing a current may include any of the other options detailed herein.
  • steering device 100 which may be associated with or part of a scanning device.
  • steering device 100 may include one or more reflective surfaces such as mirror 102, or a mirror base structure that can be attached to an external mirror assembly.
  • Mirror 102 may be any reflective surface, for example, made from polished gold, aluminum, silicon, silver, or otherwise.
  • Each of the reflective surfaces may be associated with an electrically controllable electromechanical actuator/cantilever/bender such as actuator 104.
  • Actuator 104 may be a stepper motor, direct current motor, galvanometric actuator, electrostatic, magnetic or piezo elements or thermal based actuator or otherwise.
  • actuator 104 may include a piezo-electric layer and a semiconductor layer and, optionally, a support or base layer.
  • Actuator 104 may be connected to a support frame 108 and may further cause movement or power to be relayed to a flexible interconnect element or connector, such as spring 106.
  • Spring 106 may be utilized to adjoin actuator 104 to mirror 102.
  • Actuator 104 may include two or more electrical contacts such as contacts 110 and 112.
  • steering device 100 may include a single dual-axis mirror or dual single-axis mirrors or otherwise.
  • actuator 104 may be a partially conductive element or may include embedded conductive circuitry.
  • actuator 104 may include a semi conductive layer which may be made of a semi-conductive material which may be doped to have controllable conductive characteristics as can be achieved with silicon and similar materials.
  • actuator 104 may be designed to be conductive in some sections and isolated (or function as isolation) in others. Conductivity may be achieved by doping a silicon based actuator, for example.
  • actuator 104 may include a conductive element which may be adhered or otherwise mechanically or chemically connected to a non-conducting (or isolated, or functioning as isolation) base layer of the actuator.
  • one of the contacts may be coupled to an electrical source 114 and may be utilized to provide electrical current, voltage and/or power to actuator 104.
  • Contact 112 may be connected to a sensor 116 and may be used as an electrical sensing contact, used to measure one or more parameters of a sensing circuit.
  • a parameter of a sensing circuit may include: current, voltage, current frequency, voltage frequency, capacitance, resistivity/resistance and/or charge and more.
  • Sensor 116 may comprise electrical elements, logic circuitry and more. Electrical source 114 and/or sensor 116 may be external to, or included in, steering device 100 and/or an associated scanning device.
  • steering device 100 may include contacts/inputs to connect to an external power source 114 and/or an external sensor 116.
  • contacts 110 and 112 are interchangeable, so that contact 110 may be connected to a sensor 116 and contact 112 may be connected to a power source 114.
  • actuator 104 may cause mirror 102 to move in a first direction
  • actuator 104 may be configured to cause mirror 102 to move in two directions (forward and backwards for example).
  • one or more actuators may be utilized so that mirror 102 may move in a first range of directions represented by θ, and one or more additional actuators may be utilized to cause mirror 102 to move in a second range of directions represented by φ.
  • the first and second range/directions are orthogonal to each other.
  • mirror 102 may include a mirror base structure support, and the reflective elements may be adhered or otherwise mechanically or chemically connected to the mirror base structure support.
  • sensor 116 may detect a mechanical breakdown or failure or may sense a mechanical deflection to indicate an actual position of mirror 102.
  • steering device 100 may be associated with a controller and a scanning device.
  • the associated controller may utilize a detector feedback to determine if steering device 100 has a mechanical breakdown or failure and/or to compare an actual position of steering device 100 with an expected position.
  • scanning device may correct steering device 100 positioning based on the feedback or relay to a host device that a mechanical breakdown has occurred.
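Classifying feedback into "correctable offset" versus "mechanical breakdown" can be illustrated with a threshold check. The thresholds and the three-way classification are assumptions of this sketch, not the disclosed criteria:

```python
def check_mirror_health(expected_pos, actual_pos, tolerance=0.05,
                        breakdown_factor=10.0):
    """Classify steering-device position feedback:
    'ok'      - within normal tolerance,
    'offset'  - correctable by adjusting steering parameters,
    'failure' - error too large to be mere drift; relay to the host."""
    error = abs(actual_pos - expected_pos)
    if error <= tolerance:
        return "ok"
    if error <= tolerance * breakdown_factor:
        return "offset"
    return "failure"
```

A controller might correct the position on "offset" and notify the host device only on "failure", matching the behaviour described above.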
  • Referring to Fig. B2, shown is an example embodiment of steering device 202 and a central processing unit (CPU), such as controller 204, which may be local and included within steering device 202 or may be a general controller of an associated scanning device.
  • Mirror 206 may be associated with an electrically controllable electromechanical driver such as actuation driver 208.
  • Actuation driver 208 may cause movement or power to be relayed to an actuator/cantilever/bender such as actuator 210.
  • Actuator 210 may be part of a support frame such as frame 211, or they may be interconnected. Additional actuators such as actuators 212, 214 and 216 may each be controlled/driven by additional actuation drivers as shown, and may each have a support frame 213, 215 and 217 (appropriately). It is understood that frames 211, 213, 215 and/or 217 may comprise a single frame supporting all of the actuators or may be a plurality of interconnected frames. Furthermore, the frames may be electrically separated by isolation (isn) elements or sections (as shown). Optionally, a flexible interconnect element or connector, such as spring 218, may be utilized to adjoin actuator 210 to mirror 206, to relay power or movement from actuation driver 208 to mirror 206.
  • Actuator 210 may include two or more electrical contacts such as contacts 210A, 210B, 210C and 210D.
  • one or more of contacts 210A, 210B, 210C and/or 210D may be situated on frame 211 or actuator 210, provided that frame 211 and actuator 210 are electrically connected.
  • actuator 210 may be a semi-conductor which may be doped so that actuator 210 is generally conductive between contacts 210A-210D and isolative in isolations 220 and 222, to electrically isolate actuator 210 from actuators 212 and 216 (respectively).
  • actuator 210 may include a conductive element which may be adhered or otherwise mechanically or chemically connected to actuator 210, in which case isolation elements may be inherent in the areas of actuator 210 that do not have a conductive element adhered to them.
  • Actuator 210 may include a piezo-electric layer, so that current flowing through actuator 210 may cause a reaction in the piezo-electric section which may cause actuator 210 to controllably bend.
  • CPU 204 may output/relay to mirror driver 224 a desired angular position described by θ, φ parameters.
  • Mirror driver 224 may be configured to control movement of mirror 206 and may cause actuation driver 208 to push a certain voltage amplitude to contacts 210C and 210D in order to attempt to achieve specific requested θ, φ deflection values of mirror 206 based on bending of actuators 210, 212, 214 and 216 (appropriate operation of the actuation drivers shown for the additional actuators is understood and discussed below).
  • position feedback control circuitry may be configured to supply an electrical source (such as voltage or current) to a contact such as contact 210A (or 210B), and the other contact such as 210B (or 210A, appropriately) may be connected to a sensor within position feedback 226, which may be utilized to measure one or more electrical parameters of actuator 210 to determine a bending of actuator 210 and, appropriately, an actual deflection of mirror 206.
  • additional position feedback similar to position feedback 226 and an additional actuation driver similar to actuation driver 208 may be replicated for each of actuators 212-216, and mirror driver 224 and CPU 204 may control those elements as well so that the mirror deflection is controlled for all directions.
  • the actuation drivers, including actuation driver 208, may push forward a signal that causes an electro-mechanical reaction in actuators 210-216, each of which, in turn, is sampled for feedback.
  • the feedback on the actuators' (210-216) positions serves as a signal to mirror driver 224, enabling it to converge efficiently towards the desired position θ, φ set by CPU 204, correcting a requested value based on a detected actual deflection.
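The convergence of the mirror driver toward the CPU's requested angle can be illustrated with a plain proportional controller. The gain value and the discrete iteration model are assumptions for the sketch, not the driver's actual control law:

```python
def converge(target, actual, gain=0.5, steps=20):
    """Iteratively drive the measured deflection toward the target using
    the position-feedback error, as a stand-in for the mirror driver loop."""
    history = [actual]
    for _ in range(steps):
        error = target - actual       # requested minus sensed deflection
        actual += gain * error        # corrected drive command
        history.append(actual)
    return history

# starting at rest, converge toward a 10-degree deflection
angles = converge(target=10.0, actual=0.0)
```

With a gain of 0.5 the error halves every step, so after 20 steps the residual error is about 10 × 0.5²⁰, i.e. well under a thousandth of a degree.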
  • a scanning device or LiDAR may utilize piezoelectric-actuator micro-electro-mechanical systems (MEMS) mirror devices for deflecting a laser beam scanning a field of view (FOV).
  • Mirror 206 deflection is a result of voltage potential/current applied to the piezoelectric element that is built up on actuator 210.
  • Mirror 206 deflection is translated into an angular scanning pattern that may not behave in a linear fashion; a given voltage level applied to actuator 210 does not translate to a constant displacement value.
  • a scanning LiDAR system where the FOV dimensions are deterministic and repeatable across different devices is optimally realized using a closed-loop method that provides angular deflection feedback from position feedback and sensor 226 to mirror driver 224 and/or CPU 204.
  • Referring to Fig. B3, shown is an example actuator-mirror depiction 300 in accordance with some embodiments. It is understood that mirror 306 may be an example embodiment of mirror 206 of Fig. B2 and that actuator 310 may be an example embodiment of actuator 210, also of Fig. B2.
  • Actuator 310 is made of silicon and includes a PZT piezo-electric layer 311, a semi-conductive layer 313 and a base layer 315. Contacts 310A and 310B are substantially similar to contacts 210A and 210B of Fig. B2.
  • the resistivity of actuator 310 may be measured in an active state (Ractive), when the mirror is deflected at a certain angular position, and compared to the resistivity at a resting state (Rrest).
  • a feedback including Ractive may provide information to measure/determine the actual mirror deflection angle compared to an expected angle and accordingly, mirror 306 deflection may be corrected.
  • the physical property of the silicon (or semiconductor) based actuator 310 is an observable modulation of its electrical conductivity according to the mechanical stresses that actuator 310 experiences. When actuator 310 is at rest, the electrical resistance exhibited at the two contacts 310A and 310B would be Rrest.
  • the PZT material of layer 311, if activated (by applying electrical voltage/current), would exert force on actuator 310 and cause it to bend. While bending, actuator 310 experiences a mechanical force that would modify the electrical resistance Ractive exhibited at the two contacts 310A and 310B.
  • the difference between Rrest and Ractive is correlated by a mirror driver (such as mirror driver 224 of Fig. B2) into an angular deflection value that serves to close the loop.
  • This method is used for dynamic tracking of the actual mirror position and may optimize response, amplitude, deflection efficiency and frequency for both linear-mode and resonant-mode MEMS mirror schemes. Controlling the supply current/voltage may enable an expected Ractive to be achieved and, appropriately, an intended deflection.
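The Rrest/Ractive correlation amounts to mapping a stress-induced resistance change to an angle. The linear gauge model below is an assumed simplification of the real piezoresistive behaviour, used only to make the loop-closing step concrete:

```python
def deflection_from_resistance(r_active, r_rest, sensitivity_deg_per_ohm):
    """Estimate mirror deflection (degrees) from the stress-induced change
    in the actuator's resistance relative to its at-rest value."""
    return (r_active - r_rest) * sensitivity_deg_per_ohm

def close_loop_angle(r_active, r_rest, sensitivity_deg_per_ohm, expected_deg):
    """Return the angular error a mirror driver would correct: the
    difference between the expected deflection and the one measured
    from the resistance change."""
    measured = deflection_from_resistance(r_active, r_rest,
                                          sensitivity_deg_per_ohm)
    return expected_deg - measured
```

In practice the resistance-to-angle curve would come from calibration data rather than a single sensitivity constant.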
  • position feedback and sensor 226 may also be utilized as a reliability feedback module.
  • a plurality of elements may include semiconductors or conducting elements, or a layer; accordingly, actuators 210-216 could at least partially include a semi-conducting element, springs 218, 226, 228 and 230 may each include a semiconductor, and so may mirror 206.
  • Electrical power (current and/or voltage) may be supplied at a first actuator contact via position feedback 226, and position feedback 226 may sense an appropriate signal at actuator 212, 214 and/or 216 via contacts 212A or 212B, 214A or 214B and/or 216A or 216B.
  • Fig. B4A depicting a dual-axis MEMS mirror (400), Fig. B4B depicting a single-axis MEMS mirror (450) and Fig. B4C depicting a round MEMS mirror (475).
  • current may be able to flow from contact 410A to contact 412B (through actuator 410, then through spring 418, mirror 406, spring 426 and to actuator 412).
  • Isolation gaps in the semiconducting frame such as isolation 420 may cause actuator 410 and 412 to be two separate islands connected electrically through the springs and mirror or mirror base structure as described.
  • the current flow or associated electrical parameter may be monitored by an associated position feedback.
  • a scene scanning device such as scanning device 512 which may be adapted to inspect regions or segments of a scene (shown here is a specific field of view (FOV) being scanned) using photonic pulses (transmitted light) whose characteristics may be dynamically selected as a function of: (a) optical characteristics of the scene segment being inspected; (b) optical characteristics of scene segments other than the one being inspected; (c) scene elements present or within proximity of the scene segment being inspected; (d) scene elements present or within proximity of scene segments other than the one being inspected; (e) an operational mode of the scanning device; and/or (f) a situational feature/ characteristic of a host platform with which the scanning device is operating.
  • the scene scanning device may be adapted to inspect regions or segments of a scene using a set of one or more photonic transmitters 522 (including a light source such as pulse laser 514), receptors including sensors (such as detecting element 516) and/or steering assemblies 524 (which may include steering element 520); whose configuration and/or arrangement may be dynamically selected as a function of: (a) optical characteristics of the scene segment being inspected; (b) optical characteristics of scene segments other than the one being inspected; (c) scene elements present or within proximity of the scene segment being inspected; (d) scene elements present or within proximity of scene segments other than the one being inspected; (e) an operational mode of the scanning device; and/or (f) a situational characteristic of a host platform with which the scanning device is operating.
  • steering assembly 524 may be substantially similar to steering device 202 of Fig. B2.
  • Active scanning device 512 may include: (a) a photonic emitter assembly 522 which produces pulses of inspection photons; (b) a photonic steering assembly 524 that directs the pulses of inspection photons to/from the inspected scene segment; (c) a photonic detector assembly 516 to detect inspection photons reflected back from an object within an inspected scene segment; and (d) a controller to regulate operation of the photonic emitter assembly, the photonic steering assembly and the operation of the photonic detection assembly in a coordinated manner and in accordance with scene segment inspection characteristics of the present invention at least partially received from internal feedback of the scanning device so that the scanning device is a closed loop dynamic scanning device.
  • a closed loop scanning device is characterized by having feedback from at least one of the elements and updating one or more parameter based on the received feedback.
  • a closed loop system may receive feedback and update the system's own operation at least partially based on that feedback.
  • a dynamic system or element is one that may be updated during operation.
  • inspection of a scene segment may include illumination of the scene segment or region with a pulse of photons (transmitted light), which pulse may have known parameters such as pulse duration, pulse angular dispersion, photon wavelength, instantaneous power, photon density at different distances from the emitter, average power, pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase, polarity and more. Inspection may also include detecting and characterizing various aspects of reflected inspection photons, which reflected inspection photons are inspection pulse photons (reflected light) reflected back towards the scanning device (or laser reflection) from an illuminated element present within the inspected scene segment (i.e. a scene segment element).
  • Characteristics of reflected inspection photons may include photon time of flight (time from emission till detection), instantaneous power (or power signature) at and during return-pulse detection, average power across the entire return pulse, and photon distribution/signal over the return-pulse period. The reflected inspection photons are a function of the inspection photons and the scene elements they are reflected from, and so the received reflected signal is analyzed accordingly.
  • a distance, and possibly a physical characteristic such as reflected intensity, of one or more scene elements present in the inspected scene segment may be estimated.
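The distance estimate follows directly from time of flight via the standard LiDAR relation d = c·t/2; the 1/r² range-compensation used below for a crude relative-reflectivity figure is an assumed simplification, not the disclosed estimation method:

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def estimate_element(tof_s, received_power_w, emitted_power_w):
    """Estimate distance and a crude relative-reflectivity figure for a
    scene element from one return pulse.

    tof_s: round-trip time of flight in seconds.
    The 1/r^2 fall-off model lets elements at different ranges be compared.
    """
    distance = SPEED_OF_LIGHT * tof_s / 2.0
    relative_reflectivity = (received_power_w / emitted_power_w) * distance ** 2
    return distance, relative_reflectivity
```

A full model would also account for atmospheric attenuation, optics and detector response, which this sketch ignores.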
  • the definition of a scene according to embodiments of the present invention may vary from embodiment to embodiment, depending on the specific intended application of the invention.
  • the term scene may be defined as the physical space, up to a certain distance, in-front, behind, below and/or on the sides of the vehicle and/or generally in the vicinity of the vehicle or drone in all directions.
  • the term scene may also include the space behind the vehicle or drone in certain embodiments.
  • a scene segment or scene region according to embodiments may be defined by a set of angles in a polar coordinate system, for example, corresponding to a pulse or beam of light in a given direction.
  • the light beam/pulse having a center radial vector in the given direction may also be characterized by angular divergence values, polar coordinate ranges of the light beam/pulse and more.
  • Referring to Fig. B5B, depicted is an example bi-static scanning device schematic 550. It is understood that scanning device 562 is substantially similar to scanning device 512; however, scanning device 512 is a monostatic scanning device while scanning device 562 is a bi-static scanning device. Accordingly, steering element 574 is comprised of two steering elements: steering element for PTX 571 and steering element for PRX 573. The rest of the discussion relating to scanning device 512 of Fig. B5A is applicable to scanning device 562 of Fig. B5B.
  • Referring to Fig. B5C, depicted is an example scanning device schematic 575 with a plurality of photonic transmitters 522 and a plurality of detectors 516. All of the transmitters 522 and detectors 516 may have a joint steering element 520. It is understood that scanning device 587 is substantially similar to scanning device 512; however, scanning device 587 is a monostatic scanning device with a plurality of transmitting and receiving elements. The rest of the discussion relating to scanning device 512 of Fig. B5A is applicable to scanning device 587 of Fig. B5C.
  • Scanning system 600 may be configured to operate in conjunction with a host device 628, which may be a part of system 600 or may be associated with system 600. Scanning system 600 may include a scene scanning device such as scanning device 604 adapted to inspect regions or segments of a scene using photonic pulses whose characteristics may be dynamically selected. Scanning device 604 may include a photonic emitter assembly (PTX) such as PTX 606 to produce pulses of inspection photons. PTX 606 may include a laser or alternative light source.
  • the light source may be a laser such as a solid state laser, a high power laser or otherwise or an alternative light source such as, a LED based light source or otherwise.
  • Scanning device 604 may be an example embodiment for scanning device 512 of Fig. B5A and/or scanning device 562 of Fig. B5B and/or scanning device 587 of Fig. B5C and the discussion of those scanning devices is applicable to scanning device 604.
  • the photon pulses may be characterized by one or more controllable pulse parameters such as: pulse duration, pulse angular dispersion, photon wavelength, instantaneous power, photon density at different distances from the emitter, average power, pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase, polarity and more.
  • the inspection photons may be controlled so that they vary in pulse duration, pulse angular dispersion, photon wavelength, instantaneous power, photon density at different distances from the emitter, average power, pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase, polarity and more.
  • the photon pulses may vary from one another, and the parameters may change within the same signal.
  • the inspection photon pulses may be pseudo-random, chirp sequences, and/or may be periodical or fixed, and/or a combination of these.
  • the inspection photon pulses may be characterized as: sinusoidal, chirp sequences, step functions, pseudo-random signals, linear signals or otherwise.
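The pulse families listed above can be illustrated by generating sample envelopes; the parameterization below (sample count, frequency sweep, seeded on/off train) is an arbitrary example, not the disclosed modulation scheme:

```python
import math
import random

def chirp(n, f0, f1):
    """Linear frequency sweep from f0 to f1, sampled at n points
    over one unit of time."""
    return [math.sin(2 * math.pi * (f0 + (f1 - f0) * t / n) * t / n)
            for t in range(n)]

def pseudo_random_pulse_train(n, seed=0):
    """Deterministic pseudo-random on/off pulse sequence; the seed makes
    the sequence reproducible, so the receiver can correlate against it."""
    rng = random.Random(seed)
    return [rng.choice((0, 1)) for _ in range(n)]
```

Reproducibility of the pseudo-random train matters because range extraction correlates the detected return against the emitted sequence.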
  • scanning device 604 may include a photonic reception and detection assembly (PRX) such as PRX 608 to receive reflected photons reflected back from an object or scene element and produce detected scene signal 610.
  • PRX 608 may include a detector such as detector 612.
  • Detector 612 may be configured to detect the reflected photons reflected back from an object or scene element and produce detected scene signal 610.
  • detected scene signal 610 may include information such as: time of flight, which is indicative of the difference between the time a photon was emitted and the time it was detected after reflection from an object; reflected intensity; polarization values; and more.
  • scanning device 604 may be a bi-static scanning device, where PTX 606 and PRX 608 have separate optical paths, or scanning device 604 may be a monostatic scanning system, where PTX 606 and PRX 608 have a joint optical path.
  • scanning device 604 may include a photonic steering assembly (PSY), such as PSY 616, to direct pulses of inspection photons from PTX 606 in a direction of an inspected scene and to steer reflection photons from the scene back to PRX 608.
  • PSY 616 may also be in charge of positioning the singular scanned pixel window onto/in the direction of detector 612.
  • PSY 616 may be a joint PSY and, accordingly, may be joint between PTX 606 and PRX 608, which may be a preferred embodiment for a monostatic scanning system.
  • PSY 616 may include a plurality of steering assemblies, or may have several parts, one associated with PTX 606 and another associated with PRX 608.
  • PSY 616 may be a dynamic steering assembly and may be controllable by steering parameters control 618.
  • Example steering parameters may include: scanning method, which defines the acquisition pattern and sample size of the scene; power modulation, which defines the range accuracy of the acquired scene; correction of axis impairments based on collected feedback; and reliability confirmation and controlling deflection as described above.
  • PSY 616 may include: (a) a single dual-axis MEMS mirror; (b) dual single-axis MEMS mirrors; (c) a mirror array where multiple mirrors are synchronized in unison, acting as a single large mirror; (d) a splitted mirror array with separate transmission and reception; and/or (e) a combination of these and more.
  • PSY 616 may include a MEMS splitted array.
  • the beam splitter may be integrated with the laser beam steering.
  • part of the array may be used for the transmission path and the second part of the array may be used for the reception path.
  • the transmission mirrors may be synchronized and the reception mirrors may be synchronized separately from the transmission mirrors.
  • the transmission mirrors and the reception mirrors sub arrays maintain an angular shift between themselves in order to steer the beam into separate ports, essentially integrating a circulator module.
  • PSY 616 may include one or more PSY state sensors to produce a signal indicating an operational state of PSY 616 for example power information or temperature information, reflector state, reflector actual axis positioning, reflector mechanical state and more, as discussed in the embodiments above.
  • PSY 616 may include one or more reflective surfaces, each of which may be associated with an electrically controllable electromechanical actuator.
  • the reflective surface(s) may be made from polished gold, aluminum, silicon, silver, or otherwise.
  • the electromechanical actuator(s) may be selected from actuators such as stepper motors, direct current motors, galvanometric actuators, electrostatic, magnetic or piezo elements, or thermal-based actuators.
  • PSY 616 may include or be otherwise associated with one or more microelectromechanical systems (MEMS) mirror assemblies.
  • a photonic steering assembly according to refractive embodiments may include one or more reflective materials whose index of refraction may be electrically modulated, either by inducing an electric field around the material or by applying electromechanical vibrations to the material.
  • scanning device 604 may include a controller, such as controller 620.
  • Controller 620 may receive scene signal 610 from detector 612 and may control PTX 606, PSY 616 and PRX 608, including detector 612, based on information stored in the controller memory 622 as well as received scene signal 610, including accumulated information from a plurality of scene signals 610 received over time.
  • controller 620 may process scene signal 610, optionally with additional information and signals, and produce a vision output such as vision signal 624 which may be relayed/transmitted to an associated host device.
  • Controller 620 may receive detected scene signal 610 from detector 612; optionally, scene signal 610 may include time of flight values and intensity values of the received photons. Controller 620 may build up a point cloud or 3D or 2D representation for the FOV by utilizing digital signal processing, image processing and computer vision techniques.
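The point cloud construction mentioned above can be sketched as follows. This is a minimal illustration under assumed names (nothing here is the disclosed implementation): each detection's steering angles and time-of-flight value are converted into a Cartesian point carrying its reflected intensity.

```python
import math

C_M_PER_S = 299_792_458.0  # speed of light

def tof_to_range_m(tof_s):
    """Round-trip time of flight -> one-way range to the reflecting element."""
    return C_M_PER_S * tof_s / 2.0

def detection_to_point(azimuth_rad, elevation_rad, tof_s, intensity):
    """One detection (mirror angles + ToF + intensity) -> point cloud element."""
    r = tof_to_range_m(tof_s)
    x = r * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = r * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = r * math.sin(elevation_rad)
    return (x, y, z, intensity)

def build_point_cloud(detections):
    """Accumulate a frame's detections into a point cloud list."""
    return [detection_to_point(*d) for d in detections]
```

A real controller would add the signal-processing and computer-vision stages mentioned in the text on top of this raw geometric conversion.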
  • controller 620 may include situational assessment logic or circuitry such as situational assessment logic (SAL) 626.
  • SAL 626 may receive detected scene signal 610 from detector 612 as well as information from additional blocks/elements either internal or external to scanning device 604.
  • scene signal 610 may be assessed and calculated, with or without additional feedback signals such as PSY feedback, PTX feedback, PRX feedback and host feedback and information stored in memory 622, according to a weighted mean of local and global cost functions that determine a work plan such as work plan signal 634 for scanning device 604 (such as: which pixels in the FOV are scanned, at which laser parameters budget, at which detector parameters budget).
  • controller 620 may be a closed loop dynamic controller that receives system feedback and updates the system's operation based on that feedback.
  • SAL 626 may receive one or more feedback signals from PSY 616 via PSY feedback 630.
  • PSY feedback 630 may include the instantaneous position of PSY 616, where PSY 616 may include one or more reflecting elements and each reflecting element may have one or more axes of motion. It is understood that the instantaneous position may be defined or measured in one or more dimensions.
  • A PSY has an expected position; however, PSY 616 may produce an internal signal measuring the instantaneous position (meaning, the actual position). Such feedback may be utilized by situational assessment logic 626 for calculating drift and offset parameters in the PRX and/or for correcting steering parameters control 618 of PSY 616 to correct an offset.
  • PSY feedback 630 may indicate a mechanical failure, which may be relayed to host 628, which may either compensate for the mechanical failure or act to avoid an accident due to the mechanical failure.
  • PSY feedback 630 may include instantaneous scanning speed of PSY 616.
  • PSY 616 may produce an internal signal measuring the instantaneous speed (meaning, the actual speed and not the estimated or anticipated speed). Such feedback may be utilized by situational assessment logic 626 for calculating drift and offset parameters in the PRX and/or for correcting steering parameters control 618 of PSY 616 to correct an offset.
  • PSY feedback 630 may include instantaneous scanning frequency of PSY 616.
  • PSY 616 may produce an internal signal measuring the instantaneous frequency (meaning, the actual frequency and not the estimated or anticipated frequency). Such feedback may be utilized by situational assessment logic 626 for calculating drift and offset parameters in the PRX and/or for correcting steering parameters control 618 of PSY 616 to correct an offset.
  • the instantaneous frequency may be relative to one or more axes.
  • PSY feedback 630 may include mechanical overshoot of PSY 616, which represents a mechanical de-calibration error from the expected position of the PSY in one or more axes. PSY 616 may produce an internal signal measuring the mechanical overshoot. Such feedback may be utilized by situational assessment logic 626 for calculating drift and offset parameters in the PRX and/or for correcting steering parameters control 618 of PSY 616 to correct an offset.
  • PSY feedback may also be utilized in order to correct steering parameters in case of vibrations induced by the LiDAR system or by external factors such as vehicle engine vibrations or road-induced shocks.
  • PSY feedback 630 may be utilized to correct steering parameters 618 to correct the scanning trajectory and linearize it.
  • the raw scanning pattern may typically be non-linear to begin with and may contain artifacts resulting from fabrication variations and the physics of the MEMS mirror or reflective elements.
  • Mechanical impairments may be static, for example a variation in the curvature of the mirror, or dynamic, for example mirror warp/twist at the scanning edge of motion. Correction of the steering parameters to compensate for these non-linearizing elements may be utilized to linearize the PSY scanning trajectory.
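The feedback-driven offset correction described in the bullets above can be illustrated with a simple proportional corrector. This is a sketch only — the function name and the gain value are assumptions, not the disclosed control law — nudging the commanded deflection against the drift measured from PSY position feedback:

```python
def corrected_command_deg(expected_deg, measured_deg, command_deg, gain=0.5):
    """Estimate the drift from PSY feedback (measured vs. expected position)
    and move the steering command against it."""
    offset_deg = measured_deg - expected_deg  # actual position drift
    return command_deg - gain * offset_deg
```

With a measured position of 10.4 degrees against an expected 10.0, a 12.0 degree command would be corrected to 11.8. A real system would iterate this each scan cycle and might also filter the feedback against vibration-induced noise.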
  • SAL 626 may receive one or more signals from memory 622.
  • Information received from the memory may include the laser power budget (defined by eye safety limitations, thermal limitations, reliability limitations or otherwise); electrical operational parameters such as current and peak voltages; and calibration data such as expected PSY scanning speed, expected PSY scanning frequency, expected PSY scanning position and more.
  • steering parameters of PSY 616, detector parameters of detector 612 and/or pulse parameters of PTX 606 may be updated based on the calculated/determined work plan 634.
  • Work plan 634 may be tracked and determined at specific time intervals and with increasing level of accuracy and refinement of feedback signals (such as 630 and 632).
  • Turning to FIG. B7, there is shown a flowchart 700 of a method for scanning in accordance with some embodiments.
  • a mirror may be set to a predetermined controllable deflection (712).
  • An electrical signal indicative of an actual mechanical deflection may be detected (714) and used to determine if the predetermined deflection is substantially similar to the actual deflection (within an allowed range surrounding the predetermined deflection) (716), and if the actual deflection is substantially different from the predetermined deflection, the mirror's deflection may be corrected (718).
  • an electrical signal indicative of an electro-mechanical state of the mirror assembly may be detected (720) and compared to an expected electrical signal (722) to determine if a mechanical failure has occurred.
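Steps 712-718 of flowchart 700 can be sketched as a short closed loop. The `MirrorModel` stand-in and tolerance value are hypothetical, used only to make the loop runnable; a real assembly would read deflection from the electrical sense signal described above.

```python
TOL_DEG = 0.1  # allowed range around the predetermined deflection (716)

class MirrorModel:
    """Hypothetical mirror that responds with a fixed mechanical offset."""
    def __init__(self, offset_deg):
        self.offset_deg = offset_deg
        self.command_deg = 0.0
    def set_deflection(self, deg):   # step 712: command a deflection
        self.command_deg = deg
    def read_deflection(self):       # step 714: sense the actual deflection
        return self.command_deg + self.offset_deg

def scan_step(mirror, target_deg):
    mirror.set_deflection(target_deg)
    actual = mirror.read_deflection()
    if abs(actual - target_deg) > TOL_DEG:                         # step 716
        mirror.set_deflection(target_deg - (actual - target_deg))  # step 718
    return mirror.read_deflection()
```

The failure check of steps 720-722 would sit alongside this loop, comparing a sensed electro-mechanical state signal to its expected value.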
  • the present invention may include methods, circuits, devices, assemblies, systems and functionally associated machine executable code for active scene scanning.
  • a scanning device may analyze a changing scene to determine/detect scene elements.
  • the scanning device may provide a detected scene output.
  • the host device may utilize a detected scene output or signal from the scanning device to automatically steer or operate or control the host device.
  • the scanning device may receive information from the host device and update the scanning parameters accordingly.
  • Scanning parameters may include: adjustable pulse parameters, adjustable detector parameters, adjustable steering parameters and/or otherwise. For example, a scanning device may detect an obstruction ahead and steer the host away from the obstruction.
  • a scanning device may also utilize a turning of a steering wheel and update the scanning device to analyze the area in front of the upcoming turn; or, if a host device is a drone, a signal indicating that the drone is intended to land may cause the scanning device to analyze the scene for landing requirements instead of flight requirements.
  • a scanning device may have hierarchical field of view (FOV) perception capabilities that can be shifted in space and time. These capabilities may enable high performance LiDAR across a very large FOV area by adaptive partitioning into segments of FOVs that are allocated a certain level of quality of service (QoS). It is typically impossible to assign the highest QoS to all segments; therefore, the need for an adaptive allocation method will be henceforth described.
  • QoS depends on the signal-to-noise ratio between the transmitted laser pulse and the laser reflection detected from the target. Different levels of laser power may be applied in different regions of the LiDAR FOV. The levels of power may range from zero up to the maximum power that the laser device is capable of transmitting and/or receiving. QoS has limitations stemming from physical design, eye safety, thermal constraints, cost and form factor and more. Accordingly, a scanning device may be limited by one or more of the following system and/or scene features: horizontal and vertical FOV range; data acquisition rate (e.g. frame rate); resolution (e.g. number of pixels in a frame); accuracy (spatial and temporal); range (effective detection distance) and more.
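The adaptive QoS allocation described above can be illustrated by splitting a fixed laser power budget across FOV segments in proportion to importance weights. This is a sketch under assumed names, not the disclosed allocation method; note that clipping to the per-segment cap may leave budget unused, which a real allocator would redistribute.

```python
def allocate_power(segment_weights, total_budget_w, per_segment_max_w):
    """Proportional split of the laser power budget over FOV segments,
    clipped to a per-segment cap (e.g. eye safety / thermal limits)."""
    total_w = sum(segment_weights)
    return [min(per_segment_max_w, total_budget_w * w / total_w)
            for w in segment_weights]
```

For example, with weights [1, 3] and an 8 W budget, the second (more important) segment receives three times the power of the first, unless the per-segment cap intervenes.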
  • a light source throughout this application has been termed a "laser"; however, it is understood that alternative light sources that do not fall under the technical definition of a laser may replace a laser wherever one is discussed, for example a light emitting diode (LED) based light source or otherwise. Accordingly, a LiDAR may actually include a light source which is not necessarily a laser.
  • Turning to Fig. C1A, depicted is an example scanning device schematic 10.
  • a scene scanning device such as scanning device 12 which may be adapted to inspect regions or segments of a scene (shown here is a specific FOV being scanned) using photonic pulses (transmitted light) whose characteristics may be dynamically selected as a function of: (a) optical characteristics of the scene segment being inspected; (b) optical characteristics of scene segments other than the one being inspected; (c) scene elements present or within proximity of the scene segment being inspected; (d) scene elements present or within proximity of scene segments other than the one being inspected; (e) an operational mode of the scanning device; and/or (f) a situational feature/ characteristic of a host platform with which the scanning device is operating.
  • the scene scanning device may be adapted to inspect regions or segments of a scene using a set of one or more photonic transmitters 22 (including a light source such as pulse laser 14), receptors including sensors (such as detecting element 16) and/or steering assemblies 24 (which may include splitter element 18 and steering element 20); whose configuration and/or arrangement may be dynamically selected as a function of: (a) optical characteristics of the scene segment being inspected; (b) optical characteristics of scene segments other than the one being inspected; (c) scene elements present or within proximity of the scene segment being inspected; (d) scene elements present or within proximity of scene segments other than the one being inspected; (e) an operational mode of the scanning device; and/or (f) a situational characteristic of a host platform with which the scanning device is operating.
  • Active scanning device 12 may include: (a) a photonic emitter assembly 22 which produces pulses of inspection photons; (b) a photonic steering assembly 24 that directs the pulses of inspection photons to/from the inspected scene segment; (c) a photonic detector assembly 16 to detect inspection photons reflected back from an object within an inspected scene segment; and (d) a controller to regulate operation of the photonic emitter assembly, the photonic steering assembly and the operation of the photonic detection assembly in a coordinated manner and in accordance with scene segment inspection characteristics of the present invention at least partially received from internal feedback of the scanning device so that the scanning device is a closed loop dynamic scanning device.
  • a closed loop scanning device is characterized by having feedback from at least one of the elements and updating one or more parameters based on the received feedback.
  • a closed loop system may receive feedback and update the system's own operation at least partially based on that feedback.
  • a dynamic system or element is one that may be updated during operation.
  • scanning device 12 may be characterized in that accumulative feedback from a plurality of elements may be used to update/control parameters of those and other elements.
  • inspection of a scene segment may include illumination of the scene segment or region with a pulse of photons (transmitted light), which pulse may have known parameters such as pulse duration, pulse angular dispersion, photon wavelength, instantaneous power, photon density at different distances from the emitter, average power, pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase, polarization and more. Inspection may also include detecting and characterizing various aspects of reflected inspection photons, which reflected inspection photons are inspection pulse photons (reflected light) reflected back towards the scanning device (or laser reflection) from an illuminated element present within the inspected scene segment (i.e. scene segment element).
  • Characteristics of reflected inspection photons may include photon time of flight (time from emission till detection), instantaneous power (or power signature) at and during return pulse detection, average power across the entire return pulse and photon distribution/signal over the return pulse period. The reflected inspection photons are a function of the inspection photons and the scene elements they are reflected from, and so the received reflected signal is analyzed accordingly.
  • By comparing characteristics of a photonic inspection pulse with characteristics of a corresponding reflected and detected photonic pulse, a distance and possibly a physical characteristic such as reflected intensity of one or more scene elements present in the inspected scene segment may be estimated.
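As a toy illustration of comparing transmitted and reflected pulse characteristics: the range follows from the round-trip time of flight, and a relative reflectivity can be backed out from the power ratio. The 1/R² falloff (for an extended target) and the lumped optics constant `k` are simplifying assumptions for illustration, not the disclosed estimator.

```python
C_M_PER_S = 299_792_458.0  # speed of light

def range_from_tof_m(tof_s):
    """Photon time of flight (round trip) -> distance to the scene element."""
    return C_M_PER_S * tof_s / 2.0

def relative_reflectivity(tx_peak_w, rx_peak_w, range_m, k=1.0):
    """Back out a relative reflectivity from the TX/RX power ratio,
    assuming received power ~ k * reflectivity * P_tx / range^2."""
    return rx_peak_w * range_m ** 2 / (k * tx_peak_w)
```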
  • an entire scene may be scanned in order to produce a map of the scene.
  • the definition of a scene according to embodiments of the present invention may vary from embodiment to embodiment, depending on the specific intended application of the invention.
  • the term scene may be defined as the physical space, up to a certain distance, in front of, behind, below and/or on the sides of the vehicle and/or generally in the vicinity of the vehicle or drone in all directions.
  • the term scene may also include the space behind the vehicle or drone in certain embodiments.
  • a scene segment or scene region may be defined by a set of angles in a polar coordinate system, for example, corresponding to a pulse or beam of light in a given direction.
  • the light beam/pulse having a center radial vector in the given direction may also be characterized by angular divergence values, polar coordinate ranges of the light beam/pulse and more.
  • Turning to Fig. C1B, depicted is an example bistatic scanning device schematic 50. It is understood that scanning device 62 is substantially similar to scanning device 12. However, scanning device 12 is a monostatic scanning device while scanning device 62 is a bistatic scanning device. Accordingly, steering element 74 is comprised of two steering elements: steering element for PTX 71 and steering element for PRX 73. The rest of the discussion relating to scanning device 12 of Fig. C1A is applicable to scanning device 62 of Fig. C1B.
  • Turning to Fig. C1C, depicted is an example scanning device with a plurality of photonic transmitters 22, a plurality of splitter elements 18 and a plurality of detectors 16. All of the transmitters 22, detectors 16 and splitters 18 may have a joint steering element 20. It is understood that scanning device 87 is substantially similar to scanning device 12. However, scanning device 87 is a monostatic scanning device with a plurality of transmitting and receiving elements. The rest of the discussion relating to scanning device 12 of Fig. C1A is applicable to scanning device 87 of Fig. C1C.
  • Scanning system 100 may be configured to operate in conjunction with a host device.
  • Scanning system 100 may include a scene scanning device such as scanning device 104 adapted to inspect regions or segments of a scene using photonic pulses whose characteristics may be dynamically selected.
  • Scanning device 104 may include a photonic emitter assembly (PTX) such as PTX 106 to produce pulses of inspection photons.
  • PTX 106 may include a laser or alternative light source.
  • the light source may be a laser such as a solid-state laser, a high power laser or otherwise, or an alternative light source such as a LED based light source or otherwise.
  • Scanning device 104 may be an example embodiment for scanning device 12 of Fig. C1A and/or scanning device 62 of Fig. C1B and/or scanning device 87 of Fig. C1C, and the discussion of those scanning devices is applicable to scanning device 104.
  • the photonic pulses may be characterized by one or more controllable pulse parameters such as: pulse duration, pulse angular dispersion, photon wavelength, instantaneous power, photon density at different distances from the emitter, average power, pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase, polarization, pulse calibration and more.
  • Pulse calibration may include correcting or compensating for a pulse intensity or direction so that the actual pulse is aligned with an expected/intended pulse, to compensate for either differences resulting from production or for changes that may occur (such as degradation) over time.
  • the inspection photons may be controlled so that they vary in pulse duration, pulse angular dispersion, photon wavelength, instantaneous power, photon density at different distances from the emitter, average power, pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase, polarization and more.
  • the photonic pulses may vary between each other and the parameters may change during the same signal.
  • the inspection photonic pulses may be characterized as: sinusoidal, chirp sequences, step functions, pseudo random signals, or linear signals; they may be periodical or fixed or otherwise and/or a combination of these. Examples are shown in Figs. C3A and C3B, which depict example inspection photonic pulse control signals 200 and 250, including example laser signals A through H (202-256, respectively), depicting the control signal enabling a photonic pulse and determining the intensity, width and repetition rate of the pulse as well as pulse repetition rate and/or pulse sequence.
  • PTX 106 laser may operate in different laser modes such as modulated continuous wave (CW), pulsed quasi CW (Q-CW), mode locked, and may include a plurality of laser emitters.
  • Fig. C3B depicts example inspection photonic pulse control signals 250 including example laser signal F (252), laser signal G (254) and laser signal H (256), depicting the control signal enabling a photonic pulse and determining the intensity, width and repetition rate of the pulse as well as pulse repetition rate and/or pulse sequence.
  • Laser signal F (252), for example, is characterized by increased-power pulses; this type of sequence may be applicable to cover targets at increased ranges.
  • Laser signal G (254), for example, is characterized by chirp pulse position modulation and may be applicable for increased SNR.
  • Laser signal H (256) may be characterized by a combination of chirp pulse position modulation and increased power, applicable for increased range and increased SNR.
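The pulse sequences of laser signals F-H can be sketched as a schedule generator. This is a minimal illustration with assumed parameter names: it combines a peak-power ramp (as in signal F) with chirped pulse spacing (as in signal G), yielding a signal-H-style combination.

```python
def chirped_pulse_train(n_pulses, start_s, gap_s, chirp, power_w, power_step_w):
    """Return (times, powers): the inter-pulse gap shrinks by `chirp` each
    pulse (chirp pulse position modulation) while peak power ramps up."""
    times, powers = [], []
    t, gap = start_s, gap_s
    for i in range(n_pulses):
        times.append(t)                      # pulse emission time
        powers.append(power_w + i * power_step_w)  # ramped peak power
        t += gap
        gap *= chirp                         # compress the spacing
    return times, powers
```

Setting `chirp=1.0` and `power_step_w=0.0` degenerates to a fixed-rate, fixed-power train; other choices reproduce the modulated sequences described above.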
  • PTX 106 may include additional elements such as a collimator to compensate for divergence effects of the laser emitter and render the beam into an optimal shape suitable for steering, transmission and detection.
  • PTX 106 may also include a thermoelectric cooler to optimize temperature stabilization as solid-state lasers, for example, may experience degradation in performance with temperature increase, so cooling the laser may enable a higher power yield.
  • PTX 106 may also include an optical outlet.
  • PTX 106 may include one or more PTX state sensors to produce a signal indicating an operational state of PTX 106.
  • An operational state of PTX 106 may include information such as power information or temperature information, laser state, laser degradation (in order to compensate for it), laser calibration information and more.
  • scanning device 104 may include a photonic reception and detection assembly (PRX) such as PRX 108 to receive reflected photons reflected back from an object or scene element and produce detected scene signal 110.
  • PRX 108 may include a detector such as detector 112.
  • Detector 112 may be configured to detect the reflected photons reflected back from an object or scene element and produce detected scene signal 110.
  • detected scene signal 110 may include information such as: time of flight, which is indicative of the difference between the time a photon was emitted and the time it was detected after reflection from an object, reflected intensity, polarization values and more.
  • detected scene signal 110 may be represented using a point cloud, 3D signal or vector, 4D signal or vector (adding time to the other three dimensions) and more.
  • detector 112 may have one or more updatable detector parameters controlled by detector parameters control 114 such as: scanning direction, frame rate, ambient light effects, mechanical static and dynamic impairments, thermal effects, wear and tear, area of interest, resolution, sensitivity, detector calibration and more.
  • Calibration of detector 112 may include correcting or compensating a detection sensitivity or otherwise so that the actual detection sensitivity is aligned with an expected/intended detection sensitivity, to compensate for either differences resulting from production or for changes that may occur (such as degradation) over time.
  • detector parameters control 114 may be utilized for dynamic operation of detector 112 for controlling the updatable detector parameters.
  • scanning direction may be utilized for dynamic allocation of detector power/resolution/sensitivity/resources. Scanning direction may be the expected direction of the associated inspection photons; frame rate may be the laser or PRX's frame rate; ambient light effects may include detected noise photons or expected inspection photons (before they are reflected); mechanical impairments may also be correlated to issues relating to deviation of other elements of the system that need to be compensated for; knowledge of thermal effects may be utilized to reduce the signal to noise ratio; wear and tear refers to wear and tear of detector 112 and/or other blocks of the system that detector 112 can compensate for; area of interest may be an area of the scanned scene that is more important; and more.
  • Ambient conditions such as fog/rain/smoke impact the signal to noise ratio (lifting the noise floor) and can be used as a parameter that defines the operating conditions of the detector and also the laser.
  • Another critical element is the gating of the detector in a monostatic design, with the purpose of avoiding the blinding of the detector by the initial transmission of the laser pulse - TX/RX co-channel interference.
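The detector gating mentioned above can be sketched as a time-window check per pulse. The blanking duration and the range-derived cutoff are assumed illustrative parameters, not disclosed values.

```python
C_M_PER_S = 299_792_458.0  # speed of light

def detector_gate_open(t_since_pulse_s, blanking_s, max_range_m):
    """Keep the detector off during the TX blanking window (so the outgoing
    pulse cannot blind it) and after the farthest expected return."""
    t_max_s = 2.0 * max_range_m / C_M_PER_S  # round-trip time at max range
    return blanking_s <= t_since_pulse_s <= t_max_s
```

For a 300 m maximum range the gate stays open for roughly 2 microseconds after the (brief) blanking window, then closes until the next pulse.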
  • detector 112 may include an array of detectors such as an array of avalanche photo diodes (APD), single photon detection avalanche diodes (SPADs) or a single detecting element that measures the time of flight from a laser pulse transmission event to the reception event and the intensity of the received photons.
  • the reception event may be the result of the laser pulse being reflected from a target in the FOV present at the scanned angular position of the laser of PTX 106.
  • the time of flight is a timestamp value that represents the distance of the reflecting target, object or scene element to scanning device 104. Time of flight values may be realized by photon detection and counting methods such as: TCSPC (time correlated single photon counters), analog methods for photon detection such as signal integration and qualification (via analog to digital converters or plain comparators) or otherwise.
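A TCSPC-style time-of-flight estimate can be sketched by histogramming single-photon arrival timestamps over many pulse repetitions and taking the peak bin; a minimal illustration, not the disclosed counter design.

```python
from collections import Counter

def tcspc_histogram(arrival_times_s, bin_s):
    """Bin single-photon arrival times (relative to pulse emission)."""
    return Counter(int(t / bin_s) for t in arrival_times_s)

def tof_estimate_s(hist, bin_s):
    """The peak bin center approximates the return's time of flight;
    sparsely populated bins are typically ambient/noise photons."""
    peak_bin = max(hist, key=hist.get)
    return (peak_bin + 0.5) * bin_s
```

Repeated pulses make the true return accumulate in one bin while ambient photons spread across all bins, which is what gives TCSPC its noise rejection.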
  • detector 112 may include a full array of single photon detection avalanche diodes which may be partitioned into one or more pixels that capture a fragment of the FOV.
  • a pixel may represent the basic data element that builds up the captured FOV in the 3 dimensional space (e.g. the basic element of a point cloud representation), including a spatial position and the reflected intensity value.
  • some optional embodiments of detector 112 may include: (a) a two-dimensional array sized to capture one or more pixels out of the FOV, where a pixel window may contain a fraction of a pixel, one or more pixels or otherwise; (b) a two dimensional array that captures multiple rows or columns in a FOV up to an entire FOV; (c) a single dimensional array and/or (d) a single SPAD element or otherwise.
  • PRX 108 may also include an optical inlet which may be a single physical path with a single lens or no lens at all.
  • PRX 108 may include one or more PRX state sensors to produce a signal indicating an operational state of PRX 108, for example power information or temperature information, detector state and more.
  • scanning device 104 may be a bistatic scanning device where PTX 106 and PRX 108 have separate optical paths, or scanning device 104 may be a monostatic scanning system where PTX 106 and PRX 108 have a joint optical path.
  • scanning device 104 may include a photonic steering assembly (PSY), such as PSY 116, to direct pulses of inspection photons from PTX 106 in a direction of an inspected scene and to steer reflection photons from the scene back to PRX 108.
  • PSY 116 may also be in charge of positioning the singular scanned pixel window onto/in the direction of detector 112.
  • PSY 116 may be a joint PSY and, accordingly, may be joint between PTX 106 and PRX 108, which may be a preferred embodiment for a monostatic scanning system.
  • PSY 116 may include a plurality of steering assemblies or may have several parts, one associated with PTX 106 and another associated with PRX 108 (see also Figs. C1A-C1C).
  • PSY 116 may be a dynamic steering assembly and may be controllable by steering parameters control 118.
  • Example steering parameters may include: scanning method, which defines the acquisition pattern and sample size of the scene; power modulation, which defines the range accuracy of the acquired scene; correction of axis impairments based on collected feedback; and calibration of steering to expected characteristics.
  • Calibration may include correcting or compensating a steering axis so that the actual direction is aligned with an expected/intended direction, to compensate for either differences resulting from production or for changes that may occur (such as degradation) over time.
  • PSY 116 may include: (a) a single dual-axis MEMS mirror; (b) a dual single-axis MEMS mirror; (c) a mirror array where multiple mirrors are synchronized in unison and act as a single large mirror; (d) a mirror splitter array with separate transmission and reception and/or (e) a combination of these and more.
  • PSY 116 includes a MEMS split array
  • the beam splitter may be integrated with the laser beam steering.
  • part of the array may be used for the transmission path and the second part of the array may be used for the reception path.
  • the transmission mirrors and the reception mirrors may be synchronized separately from one another.
  • the transmission mirrors and the reception mirrors sub arrays maintain an angular shift between themselves in order to steer the beam into separate ports, essentially integrating a circulator module.
  • PSY 116 may include one or more PSY state sensors to produce a signal indicating an operational state of PSY 116, for example power information or temperature information, reflector state, reflector actual axis positioning, reflector mechanical state and more.
  • PSY 116 may also include a circulator module/beam splitter, although it is understood that the splitter may also be part of PRX 108 instead.
  • the beam splitter may be configured to separate the transmission path of PTX 106 from the reception path of PRX 108.
  • the beam splitter may either be integrated in the steering assembly (for example if a splitter array is utilized) or may be redundant or not needed, and accordingly the scanning device may not include a beam splitter.
  • the beam splitter of PSY 116 may be a polarized beam splitter (PBS), a PBS integrating a slit, a circulator beam splitter and/or a slit based reflector or otherwise.
  • PSY 116 may include one or more reflective surfaces, each of which may be associated with an electrically controllable electromechanical actuator.
  • the reflective surface(s) may be made from polished gold, aluminum, silicon, silver, or otherwise.
  • the electromechanical actuator(s) may be selected from actuators such as stepper motors, direct current motors, galvanometric actuators, electrostatic, magnetic or piezo elements or thermal based actuators.
  • PSY 116 may include or be otherwise associated with one or more microelectromechanical systems (MEMS) mirror assemblies.
  • PSY 116 may include one or more refractive materials whose index of refraction may be electrically modulated, either by inducing an electric field around the material or by applying electromechanical vibrations to the material.
  • PSY 116 may include a beam splitter to help separate the transmission path from the reception path. Using the same photonic steering assembly may provide for tight synchronization between a direction in which a photonic pulse/beam is steered and emitted by the photonic emitter assembly and a direction of a concurrent FOV of one or more optical sensors of the photonic detection assembly.
  • A shared photonic steering assembly configuration may allow a photonic detector assembly of a given device to focus upon, and almost exclusively collect/receive reflected photons from, substantially the same scene segment being concurrently illuminated by the given device's photonic emitter assembly. Accordingly, as PSY 116 moves, so may the photonic pulse illumination angle along with the FOV angle.
  • scanning device 104 may include a controller to control scanning device 104, such as controller 120.
  • Controller 120 may receive scene signal 110 from detector 112 and may control PTX 106, PSY 116 and PRX 108, including detector 112, based on: (i) information stored in the controller memory 122, (ii) the received scene signal 110 and (iii) accumulated information from a plurality of scene signals 110 received over time.
  • controller 120 may process scene signal 110, optionally with additional information and signals, and produce a vision output such as vision signal 124 which may be relayed/transmitted to an associated host device. Controller 120 may receive detected scene signal 110 from detector 112; optionally, scene signal 110 may include time of flight values and intensity values of the received photons. Controller 120 may build up a point cloud or a 3D or 2D representation of the FOV by utilizing digital signal processing, image processing and computer vision techniques.
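As a rough illustration of how a controller might fold time-of-flight and intensity values into a point cloud, the following sketch (hypothetical function names; it assumes the steering angles at emission time are known for each sample) converts each sample into a 3D point:

```python
import math

C_M_PER_S = 299_792_458.0  # speed of light in m/s

def tof_to_point(tof_s, azimuth_rad, elevation_rad, intensity):
    """Convert one time-of-flight sample, plus the steering angles at
    emission time, into a 3D point (x, y, z) carrying its intensity."""
    r = C_M_PER_S * tof_s / 2.0  # round-trip time -> one-way range
    x = r * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = r * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = r * math.sin(elevation_rad)
    return (x, y, z, intensity)

def build_point_cloud(samples):
    """samples: iterable of (tof_s, azimuth_rad, elevation_rad, intensity)."""
    return [tof_to_point(*s) for s in samples]
```

A 1 microsecond round trip at boresight (both angles zero) maps to a point roughly 150 m straight ahead.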
  • controller 120 may include situational assessment logic or circuitry such as situational assessment logic (SAL) 126.
  • SAL 126 may receive detected scene signal 110 from detector 112 as well as information from additional blocks/elements either internal or external to scanning device 104.
  • scene signal 110 may be assessed and calculated, with or without additional feedback signals such as PSY feedback, PTX feedback, PRX feedback and host feedback, together with information stored in memory 122, using a weighted mean of local and global cost functions that determines a scanning/work plan such as work plan signal 134 for scanning device 104 (such as: which pixels in the FOV are scanned, at which laser parameters budget, and at which detector parameters budget).
  • controller 120 may be a closed loop dynamic controller that receives system feedback and updates the system's operation based on that feedback.
  • Turning to Fig. C4, depicted is an example scanning system 300 in accordance with some embodiments. It is understood that elements 304-326 and 334 are substantially similar to elements 104-126 and 134 of Fig. C2 (respectively) and that the description of those elements is applicable to elements 304-326 and 334.
  • Scanning system 300 may include host 328 in conjunction with which scanning device 304 may operate. It is understood that host 328 may be a part of scanning system 300 or may be associated with scanning device 304 and that the following description is applicable to either embodiment.
  • SAL 326 may receive detected scene signal 310 from detector 312 as well as information from additional blocks/elements either internal or external to scanning device 304; these signals and information will now be discussed in more detail.
  • SAL 326 may receive a PTX feedback 329 indicating PTX associated information such as an operational state, power consumption, temperature and more.
  • SAL 326 may receive a PRX feedback 331 indicating PRX associated information such as power consumption, temperature, detector state feedback and more.
  • SAL 326 may receive one or more feedback signals from PSY 316 via PSY feedback 330.
  • PSY feedback 330 may include: the PSY operational state and the instantaneous position of PSY 316, where PSY 316 may include one or more reflecting elements and each reflecting element may have one or more axes of motion. It is understood that the instantaneous position may be defined or measured in one or more dimensions.
  • a PSY has an expected position; however, PSY 316 may produce an internal signal measuring the instantaneous position (meaning, the actual position). Providing such feedback may be utilized by situational assessment logic 326 for calculating drift and offset parameters in the PRX and/or for correcting steering parameters control 318 of PSY 316 to correct an offset.
  • PSY feedback 330 may include instantaneous scanning speed of PSY 316.
  • PSY 316 may produce an internal signal measuring the instantaneous speed (meaning, the actual speed and not the estimated or anticipated speed). Providing such feedback may be utilized by situational assessment logic 326 for calculating drift and offset parameters in the PRX and/or for correcting steering parameters control 318 of PSY 316 to correct an offset.
  • PSY feedback 330 may include instantaneous scanning frequency of PSY 316.
  • PSY 316 may produce an internal signal measuring the instantaneous frequency (meaning, the actual frequency and not the estimated or anticipated frequency). Providing such feedback may be utilized by situational assessment logic 326 for calculating drift and offset parameters in the PRX and/or for correcting steering parameters control 318 of PSY 316 to correct an offset.
  • the instantaneous frequency may be relative to one or more axes.
  • PSY feedback 330 may include the mechanical overshoot of PSY 316, which represents a mechanical decalibration error from the expected position of the PSY in one or more axes.
  • PSY 316 may produce an internal signal measuring the mechanical overshoot. Providing such feedback may be utilized by situational assessment logic 326 for calculating drift and offset parameters in the PRX and/or for correcting steering parameters control 318 of PSY 316 to correct an offset.
  • PSY feedback may also be utilized in order to correct steering parameters in case of vibrations induced by the LiDAR system or by external factors such as vehicle engine vibrations or road-induced shocks.
  • PSY feedback 330 may be utilized to correct steering parameters 318 to correct the scanning trajectory and linearize it.
  • the raw scanning pattern may typically be non-linear and may contain artifacts resulting from fabrication variations and the physics of the MEMS mirror or reflective elements. Mechanical impairments may be static (for example, a variation in the curvature of the mirror) and/or dynamic (for example, mirror warp/twist at the scanning edge of motion). Correction of the steering parameters to compensate for these non-linearizing elements may be utilized to linearize the PSY scanning trajectory.
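One possible way to use measured-position feedback to linearize the trajectory is a simple proportional correction of the next steering command. This is a toy sketch, not the device's actual control law; all names and the mirror model are illustrative:

```python
def linearize_command(desired_angle, measured_angle, commanded_angle, gain=0.5):
    """One step of closed-loop steering correction: nudge the next
    command by a fraction of the measured position error so the actual
    trajectory converges toward the desired (linear) one."""
    error = desired_angle - measured_angle
    return commanded_angle + gain * error

# Toy mirror with a static 0.1 rad offset impairment: iterating the
# correction drives the measured position onto the desired angle.
cmd, desired = 0.0, 1.0
for _ in range(50):
    measured = cmd + 0.1
    cmd = linearize_command(desired, measured, cmd)
```

After the loop, `cmd + 0.1` (the mirror's actual position) sits within a fraction of a milliradian of the desired angle.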
  • SAL 326 may receive one or more host signals from host 328 via host feedback and information 332. Information received from the host may be additional information from other sensors in the system, such as other LiDARs, a camera, an RF radar, an acoustic proximity system and more, or feedback following processing of vision signal 324 at the host 328 processing unit.
  • host information may be configured to override other SAL 326 inputs so that if a host indicates that a turn is expected, for example, scanning device 304 may analyze the upcoming turn.
  • the host feedback may include an override command structure including a flag indicating that the host input is to override the internal feedbacks and signals.
  • the override structure may contain direct designation to scan certain portion(s) of the scene at a certain power that translates into the LiDAR range and more.
  • SAL 326 may receive one or more signals from memory 322.
  • Information received from the memory may include: the laser power budget (defined by eye safety limitations, thermal limitations, reliability limitations or otherwise); electrical operational parameters such as current and peak voltages; and calibration data such as expected PSY scanning speed, expected PSY scanning frequency, expected PSY scanning position and more.
  • SAL 326 may be configured to produce a feedback parameter and/or a vision signal 324 utilizing digital signal processing, image processing and computer vision techniques.
  • SAL 326 may analyze information and take into consideration a plurality of different types of information such as: (a) the thermal envelope, which may constrain the working regime and performance of the LiDAR such as pixel rate, frame rate, detection range and FOV depth resolution (4D resolution), FOV and angular range; (b) identified road delimiters or other constant elements in the FOV of the scanning device; (c) object of interest tracking; (d) optical flow that determines, tracks and predicts global motion of the scene and individual elements' motion in the scene; (e) localization data associated with the location of the scanning device which may be received from host 328; (f) volumetric effects such as rain, fog, smoke, or otherwise; (g) interference such as ambient light, sun, other LiDARs on other hosts and more; (h) ego-motion parameters from the host 328 associated with the host's steering wheel, blinkers or otherwise; (i) fusion with a camera or other sensor associated with host 328 and more.
  • SAL 326 may output vision signal 324 to a host device.
  • the controller and/or SAL may analyze, process and refine detected scene signal 310 by utilizing digital signal processing, image processing and computer vision techniques.
  • Vision signal 324 may be a qualified point data structure (e.g., a cloud map and/or point cloud or otherwise) and may contain parameters including, but not restricted to, a 3D positioning of the pixels in the FOV, reflectivity intensity, a confidence level according to a quality of service metric, and a metadata layer of identified objects for a host system.
  • a quality of service metric may be an indication of the system's expected QoS and may be applicable, for example, when the scanning device is operating at a low QoS to compensate for high surrounding temperatures or otherwise.
  • scene signal 310 may be assessed and calculated accordingly, with or without additional feedback signals such as PSY feedback 330, PTX feedback 329, PRX feedback 331 and/or host feedback and information 332, according to a weighted mean of local and global cost functions that determines a scanning/work plan such as work plan signal 334 for scanning device 304 (such as: which pixels in the FOV are scanned, at which laser parameters budget, and at which detector parameters budget).
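The weighted cost-function idea above can be sketched as follows, assuming hypothetical normalized feedback terms per FOV segment and a global laser power budget; the actual cost functions, weights and feedback terms are implementation specific:

```python
def score_segment(seg, weights):
    """Weighted sum of local cost terms for one FOV segment.
    seg: dict of normalized feedback terms (nominally 0..1)."""
    return sum(weights[k] * seg.get(k, 0.0) for k in weights)

def build_work_plan(segments, weights, power_budget):
    """Allocate the global laser power budget across FOV segments in
    proportion to their weighted priority scores."""
    scores = [score_segment(s, weights) for s in segments]
    total = sum(scores) or 1.0  # avoid division by zero
    return [power_budget * s / total for s in scores]
```

A segment flagged both by object-of-interest tracking and by host feedback would receive a proportionally larger share of the laser budget than a quiet segment.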
  • controller 320 may be a closed loop dynamic controller that receives system feedback and updates the system's operation based on that feedback.
  • steering parameters of PSY 316, detector parameters of detector 312 and/or pulse parameters of PTX 306 may be updated based on the calculated/determined work plan 334.
  • Work plan 334 may be tracked and determined at specific time intervals and with increasing level of accuracy and refinement of feedback signals.
  • the updating of the parameters (steering, detector and/or laser pulse) based on work plan 334 may occur at predetermined times or intervals, may be synchronous or asynchronous and may be dependent on work plan 334 itself, meaning that if a high priority update is received the update may be asynchronous and, if not, the parameters may be updated at a predetermined time.
  • work plan 334 may be updated based on real time detected scene information which may also be termed as pixel information.
  • Real time information may include analysis of detected fast signals during time of flight that contain one or more reflections for a given photonic inspection pulse. For example, an unexpected detected target in a low priority field may cause controller 320 to update the pulse frequency of the laser of PTX 306 via updating of the pulse parameters.
  • Work plan 334 may also be updated at a frame or sub-frame level, based on information received, accumulated and/or analyzed within a single frame. Furthermore, work plan 334 may be updated on an inter-frame level, based on information accumulated and analyzed over two or more frames.
  • Increased levels of real time accuracy, meaning that work plan 334 is updated at a pixel or sub-frame resolution, may be achieved when higher levels of computation produce increasingly usable results.
  • Increased levels of non-real time accuracy may be achieved within a specific time period as slower converging data becomes available, e.g., computer vision generated optical flow estimation of objects over several frames.
  • work plan 334 may be updated as new information becomes evident based on an inter-frame analysis.
  • controller 320 may adjust operation of PTX 306, such as: (a) inspection pulse intensity; (b) inspection pulse duration; (c) inspection pulsing patterns; and (d) more, for a given scene segment based on: (a) ambient light conditions; (b) a pre-pulse reading on the PRX; (c) the energy level of a prior reflected inspection pulse from the same or a nearby scene segment; and (d) a relevance value of the given scene segment.
  • Controller 320 may adjust operation of PRX 308, such as: (a) photonic sensor selection; (b) photonic sensor biasing; (c) photonic sensor operation timing with respect to the PTX operation timing and with respect to the scene segment; (d) photonic sensor output processing; and more, for a given scene segment based on: (a) the corresponding photonic inspection pulse intensity; (b) a pre-pulse reading on the PRX photonic sensor(s); (c) the energy level of a prior reflected inspection pulse from the same or a nearby scene segment; (d) the current scanning direction of the photonic steering assembly being used; and more.
  • Controller 320 may adjust operation of PSY 316, such as: scanning method that defines the acquisition pattern and sample size of the scene, power modulation that defines the range accuracy of the acquired scene, correction of axis impairments based on collected feedback, calibration of steering to expected characteristics and more.
  • Turning to Figs. C5A & C5B, depicted are host systems 400 and 450 including hosts 428 and 478, respectively. It is understood that elements 404-432 of Fig. C5A and elements 454-482 of Fig. C5B are substantially similar to elements 304-332 of Fig. C4 and that the discussion of those elements is applicable to elements 404-432 and 454-482, respectively. Furthermore, hosts 428 and 478 include host controllers 448 and 498 (respectively). It is understood that scanning devices 404 and 454 each include all of the sub-elements depicted in Fig. C4 with regard to scanning device 304; for clarity, only the blocks currently being discussed are detailed here. Differences from Fig. C4 will be discussed below.
  • hosts 428 and/or 478 may each be a vehicle or a drone.
  • host 428 may receive vision signal 424 and relay to scanning device host feedback and information 432, which may include information from additional host modules such as additional scanning devices, sensors, cameras, host steering system, host controller 448 and more.
  • At least part of the situational assessment logic functionality may be executed in and/or implemented by host controller 498 instead of scanning device 454. Accordingly, scene signal 460 (or a derivative of scene signal 460, hence the dashed line) may be relayed to host controller 498 and the rest of the analysis carried out at the host controller's SAL 476.
  • Turning to Fig. C6, shown is a flowchart 600 for a method of scanning a scene according to some embodiments.
  • a scanning device may be operated based on default values or an initial signal(s) for scanning parameters (602).
  • a detected scene signal is received/detected from a detector associated with the scanning device (604).
  • one or more elements of a scanning device may be configured to provide feedback regarding operation of the elements of a scanning device (606) and a host device associated with the scanning device may provide additional information (608) such as host information and feedback regarding additional elements of the host (additional scanning devices, sensors and more).
  • a visual situation may be assessed (610) either by the scanning device, by the host or a combination of the two. Based on the visual situation the scanning parameters may be updated (612) causing the scanning device to scan a scene based on the visual situation. The visual situation or a signal associated with the visual situation may be relayed to a host.
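The steps of flowchart 600 can be sketched as a closed loop, with stub callables standing in for the real detector, element feedback and host; all names and the toy assess/update rules here are illustrative, not the claimed method:

```python
def assess(scene, internal_fb, host_fb):
    """Toy situational assessment (610): merge all inputs into one view."""
    return {"scene": scene, "internal": internal_fb, "host": host_fb}

def update_params(params, situation):
    """Toy parameter update (612): keep existing params, count frames."""
    new = dict(params)
    new["frame"] = new.get("frame", 0) + 1
    return new

def scan_loop(detect, element_feedback, host_feedback, relay, frames=3):
    """Closed-loop scan skeleton following flowchart 600."""
    params = {"power": 1.0}                        # default values      (602)
    for _ in range(frames):
        scene = detect(params)                     # detected scene signal (604)
        situation = assess(scene,
                           element_feedback(),     # element feedback    (606)
                           host_feedback())        # host information    (608)
        params = update_params(params, situation)  # update parameters   (612)
        relay(situation)                           # relay to host
    return params
```

Running the loop with trivial lambdas for the four callables increments the frame counter once per iteration while leaving the other parameters intact.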
  • a scene scanning device adapted to inspect regions or segments of a scene using photonic pulses, which device may be a LiDAR device.
  • the photonic pulses used to inspect the scene, also referred to as inspection pulses, may be generated and transmitted with characteristics which are dynamically selected as a function of various parameters relating to the scene to be scanned and/or relating to a state, location and/or trajectory of the device.
  • Sensing and/or measuring of characteristics of inspection pulse reflections from scene elements illuminated with one or more inspection pulses may also be dynamic and may include modulating optical elements on an optical receive path of the device.
  • inspection of a scene segment may include illumination of the scene segment or region with a modulated pulse of photons, which pulse may have known parameters such as: pulse duration, pulse angular dispersion, photon wavelength, instantaneous power, photon density at different distances from the emitter and/or average power. Inspection may also include detecting and characterizing various parameters of reflected inspection photons, which reflected inspection photons are inspection pulse photons reflected back towards the scanning device from an illuminated element present within the inspected scene segment (i.e. a scene segment element).
  • the definition of a scene according to embodiments of the present invention may vary from embodiment to embodiment, depending on the specific intended application of the invention.
  • the term scene may be defined as the physical space, up to a certain distance, surrounding the vehicle (in-front, on the sides, behind, below and/or above).
  • a scene segment or scene region may be defined by a set of angles in a polar coordinate system, for example, corresponding to a diverging pulse or beam of light in a given direction.
  • the light beam/pulse having a center radial vector in the given direction may also be characterized by broader defined angular divergence values, polar coordinate ranges of the light beam/pulse.
  • a scene segment or region being inspected at any given time, with any given photonic pulse, may be of varying and expanding dimensions. Accordingly, the inspection resolution of a scene segment may be reduced the further away the illuminated scene segment elements are from the active scene scanning device.
  • One of the critical tasks at hand for a scanning system is to observe the scene, understand its semantics, such as drivable areas, obstacles and traffic signs, and take vehicle control actions based upon them.
  • a scene scanning device such as scanning device 112 which may be adapted to inspect regions or segments of a scene (shown here is a specific FOV being scanned) using photonic pulses (transmitted light) whose characteristics may be dynamically selected as a function of: (a) optical characteristics of the scene segment being inspected; (b) optical characteristics of scene segments other than the one being inspected; (c) scene elements present in or within proximity of the scene segment being inspected; (d) scene elements present in or within proximity of scene segments other than the one being inspected; (e) an operational mode of the scanning device; (f) expected scene elements; (g) importance/priority of an expected scene element and/or (h) a situational feature/characteristic of a host platform with which the scanning device is operating.
  • a set of one or more photonic transmitters 122 including a light source such as pulse laser 114
  • receptors including sensors (such as detector assembly 116) and/or steering assembly 124, whose configuration and/or arrangement may be dynamically selected as a function of: (a) optical characteristics of the scene segment being inspected; (b) optical characteristics of scene segments other than the one being inspected; (c) scene elements present in or within proximity of the scene segment being inspected; (d) scene elements present in or within proximity of scene segments other than the one being inspected; (e) an operational mode of the scanning device; (f) expected scene elements; (g) importance/priority of an expected scene element and/or (h) a situational feature/characteristic of a host platform with which the scanning device is operating.
  • Active scanning device 112 may include: (a) a photonic transmitter 122 which produces pulses of inspection photons; (b) a photonic steering assembly 124 that directs the pulses of inspection photons to/from the inspected scene segment; (c) a photonic detector assembly 116 to detect inspection photons reflected back from an object within an inspected scene segment; and (d) a controller to regulate operation of the photonic emitter assembly, the photonic steering assembly and the photonic detection assembly in a coordinated manner and in accordance with scene segment inspection characteristics of the present invention, at least partially received from internal feedback of the scanning device, so that the scanning device is a closed loop dynamic scanning device.
  • a closed loop scanning device is characterized by having feedback from at least one of the elements and updating one or more parameters of two or more scanning device blocks (photonic transmitters 122, steering assembly 124 and/or detector assembly 116) based on the received feedback.
  • a closed loop system may receive feedback and update the system's own operation at least partially based on that feedback.
  • a dynamic system or element is one that may be updated during operation.
  • inspection of a scene segment may include illumination of the scene segment or region with a pulse of photons (transmitted light), which pulse may have known parameters such as: pulse duration, pulse angular dispersion, photon wavelength, instantaneous power, photon density at different distances from the emitter, average power, pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase, polarization and more. Inspection may also include detecting and characterizing various aspects of reflected inspection photons, which reflected inspection photons are inspection pulse photons (reflected light) reflected back towards the scanning device (or laser reflection) from an illuminated element present within the inspected scene segment (i.e. a scene segment element).
  • Characteristics of reflected inspection photons may include: photon time of flight (time from emission till detection), instantaneous power (or power signature) at and during return pulse detection, average power across the entire return pulse and photon distribution/signal over the return pulse period. The reflected inspection photons are a function of the inspection photons and the scene elements they are reflected from, and so the received reflected signal is analyzed accordingly.
  • a distance and possibly a physical characteristic such as reflected intensity of one or more scene elements present in the inspected scene segment may be estimated.
  • an entire scene may be scanned in order to produce a map of the scene.
  • Scanning device 112 may have hierarchical FOV perception capabilities that can be shifted in space and time. These capabilities may enable high performance LiDAR across a very large FOV area by adaptive partitioning into segments of FOVs that are allocated a certain level of quality of service (QoS). It is typically impossible to assign the highest QoS to all segments, hence the need for the adaptive allocation method described henceforth. QoS depends on the signal to noise ratio between the laser pulse transmitted 126 and the laser reflection detected 128 from the target reflection. Different levels of laser power may be applied in different regions of the LiDAR FOV. The levels of power may range from zero up to the maximum power that the laser device is capable of transmitting and/or receiving.
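As a minimal illustration of mapping per-segment QoS priorities to laser power levels between zero and the device maximum (hypothetical normalized priorities; not the patented allocation method):

```python
def allocate_qos(priorities, max_power):
    """Map per-segment priorities (nominally 0..1) to laser power
    levels, clamped between zero and the device's maximum transmit
    power."""
    return [min(max(p, 0.0), 1.0) * max_power for p in priorities]
```

A priority of 0 yields zero power for that FOV segment, 1 yields full power, and out-of-range requests are clamped rather than exceeding the device limit.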
  • scanning device 112 may be limited by one or more of the following system and/or scene features: horizontal and vertical FOV range; data acquisition rate (e.g. frame rate); resolution (e.g. number of pixels in a frame); accuracy (spatial and temporal); range (effective detection distance) and more.
  • scanning device 112 may be assembled and fixed on a vehicle in constrained locations which may cause a fixed boresight. For this and additional reasons, scanning device 112 may be "observing" the FOV of the driving scene in a sub-optimal manner. Scanning device 112 may experience obstructing elements in the vehicle assembly as well as a sub-optimal location in relation to the vehicle dimensions and aspect ratio and more.
  • laser power allocation affects data frame quality which is represented by the following parameters: range of target, frame rate and/or FOV and spatial resolution.
  • range of target: the farther the target within the FOV, the longer the path the laser pulse has to travel and the larger the laser signal loss.
  • the required laser energy may be achieved by modulating the laser pulse transmitted 126, for example: by appropriately controlling the laser light pulse width and the laser light pulse repetition rate.
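The relation between pulse width, repetition rate and delivered laser energy can be written down directly; this sketch assumes an idealized rectangular pulse:

```python
def average_optical_power(peak_power_w, pulse_width_s, repetition_rate_hz):
    """Average power of a pulsed laser = peak power x duty cycle,
    where duty cycle = pulse width x repetition rate."""
    duty_cycle = pulse_width_s * repetition_rate_hz
    return peak_power_w * duty_cycle

def pulse_energy(peak_power_w, pulse_width_s):
    """Energy per pulse for an idealized rectangular pulse."""
    return peak_power_w * pulse_width_s
```

For example, 100 W peak pulses of 10 ns width at a 100 kHz repetition rate give a 0.1% duty cycle, i.e. 0.1 W average power and 1 microjoule per pulse; widening the pulse or raising the repetition rate raises the delivered energy accordingly.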
  • FOV and spatial resolution: the number of data elements (e.g. 3D or 4D pixels) in a frame combined with the FOV define the size of the frame.
  • Turning to Fig. D1B, depicted is an example bistatic scanning device schematic 150. It is understood that scanning device 162 is substantially similar to scanning device 112. However, scanning device 112 is a monostatic scanning device while scanning device 162 is a bistatic scanning device. Accordingly, steering element 174 is comprised of two steering elements: steering element for PTX 171 and steering element for PRX 173. The rest of the discussion relating to scanning device 112 of Fig. D1A is applicable to scanning device 162 of Fig. D1B.
  • Turning to Fig. D1C, depicted is an example scanning device schematic 175 with a plurality of photonic transmitters 122, a plurality of splitter elements 118 and a plurality of detector assemblies 116. All of the transmitters 122, detectors 116 and splitters 118 may have a joint steering element 120. It is understood that scanning device 187 is substantially similar to scanning device 112. However, scanning device 187 is a monostatic scanning device with a plurality of transmitting and receiving elements. The rest of the discussion relating to scanning device 112 of Fig. D1A is applicable to scanning device 187 of Fig. D1C.
  • Scanning system 200 may include a scene scanning device such as scanning device 204 adapted to inspect regions or segments of a scene using photonic pulses which may be emitted in accordance with dynamically selected parameters.
  • Scanning device 204 may be configured to operate in conjunction with a host device, such as host 228 which may be part of the system 200 or associated with the system 200.
  • Scanning device 204 may be an example embodiment of scanning device 112 of Fig. D1A, scanning device 162 of Fig. D1B and/or scanning device 187 of Fig. D1C, and the discussion of those scanning devices is applicable to scanning device 204.
  • scanning device 204 may include a photonic emitter assembly (PTX) such as PTX 206 to produce pulses of inspection photons.
  • PTX 206 may include a laser or alternative light source.
  • the light source may be a laser such as a solid-state laser, a high power laser or otherwise, or an alternative light source such as a LED based light source or otherwise.
  • the photon pulses may be characterized by one or more controllable pulse parameters such as: pulse duration, pulse angular dispersion, photon wavelength, instantaneous power, photon density at different distances from the emitter, average power, pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase, polarization and more.
  • the inspection photons may be controlled so that they vary in pulse duration, pulse angular dispersion, photon wavelength, instantaneous power, photon density at different distances from the emitter, average power, pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase, polarization and more.
  • the photon pulses may vary between each other and the parameters may change during the same signal.
  • the inspection photon pulses may be pseudo random, chirp sequence and/or may be periodical or fixed and/or a combination of these.
  • the inspection photon pulses may be characterized as: sinusoidal, chirp sequences, step functions, pseudo random signals, linear signals or otherwise. Examples are shown in Fig. 3, which depicts example inspection photon pulse control signals 300 including example laser signal A (302), laser signal B (304) and laser signal C (306), depicting the control signal enabling a photon pulse and determining the intensity, width and repetition rate of the pulse as well as the pulse repetition rate and/or pulse sequence.
  • Laser signal A 302, for example, is characterized by increased power pulses; this type of sequence may be applicable to cover targets at increased ranges.
  • Laser signal B 304 is characterized by a chirp pulse position modulation and may be applicable for increased SNR.
  • Laser signal C 308 may be characterized by a combination of chirp pulse position modulation and increased power, applicable for both increased range and increased SNR.
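Patterns like laser signals A and B above could be generated along these lines (a toy sketch with illustrative names; real pulse sequencing is hardware-driven):

```python
def power_ramp_pattern(n, base_power, step):
    """Signal-A-like pattern: n pulses of linearly increasing power,
    useful for covering targets at increasing ranges."""
    return [base_power + i * step for i in range(n)]

def chirp_ppm_times(n, t0, dt0, chirp):
    """Signal-B-like pattern: pulse positions with a linearly growing
    (chirped) inter-pulse interval, for improved SNR via matched
    decoding of the known sequence."""
    times, t, dt = [], t0, dt0
    for _ in range(n):
        times.append(t)
        t += dt
        dt += chirp
    return times
```

A signal-C-like sequence would simply pair the chirped pulse times with the ramped power levels.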
  • PTX 206 laser may operate in different laser modes such as modulated continuous wave (CW), pulsed quasi CW (Q-CW), mode locked, and may include a plurality of laser emitters.
  • PTX 206 may include additional elements such as a collimator to compensate for divergence effects of the laser emitter and render the beam into an optimal shape suitable for steering, transmission and detection.
  • PTX 206 may also include a thermoelectric cooler for temperature stabilization, as solid-state lasers, for example, may experience degradation in performance as temperature increases, so cooling the laser may enable a higher power yield.
  • PTX 206 may also include an optical outlet.
  • PTX 206 may include one or more PTX state sensors to produce a signal indicating an operational state of PTX 206 which may include information such as PTX power consumption, temperature, laser condition and more.
  • scanning device 204 may include a photonic reception and detection assembly (PRX) such as PRX 208 to receive reflected photons reflected back from an object or scene element and produce detected scene signal 210.
  • PRX 208 may include a detector such as detector 212.
  • Detector 212 may be configured to detect the reflected photons reflected back from an object or scene element and produce detected scene signal 210.
  • detected scene signal 210 may include information such as: time of flight, which is indicative of the difference between the time a photon was emitted and the time it was detected after reflection from an object, reflected intensity, polarization values and more.
  • detected scene signal 210 may be represented using point cloud, 3D signal or vector, 4D signal or vector (adding time to the other three dimensions) and more.
  • detector 212 may have one or more updatable detector parameters controlled by detector parameters control 214 such as: scanning direction, frame rate, ambient light effects, mechanical static and dynamic impairments, thermal effects, wear and tear, area of interest, resolution, sensitivity and more.
  • detector parameters control 214 may be utilized for dynamic operation of detector 212, for example, scanning direction may be utilized for dynamic allocation of detector power/resolution/sensitivity/resources.
  • Scanning direction may be the expected direction of the associated inspection photons
  • frame rate may be the laser or PRX's frame rate
  • ambient light effect may include detected noise photons or expected inspection photons (before they are reflected)
  • mechanical impairments may also be correlated to deviation of other elements of the system that need to be compensated for
  • knowledge of thermal effects may be utilized to improve the signal to noise ratio
  • wear and tear refers to wear and tear of detector 212 and/or other blocks of the system that detector 212 can compensate for
  • area of interest may be an area of the scanned scene that is of higher importance, and more.
  • Ambient conditions such as fog/rain/smoke, which impact signal to noise (lifting the noise floor), can be used as a parameter that defines the operating conditions of detector 212 and also the laser of PTX 206.
  • detector 212 may include an array of detectors such as an array of avalanche photo diodes (APD), single photon detection avalanche diodes (SPADs) or single detecting elements that measure the time of flight from a laser pulse transmission event to the reception event and the intensity of the received photons.
  • the reception event is the result of the laser pulse being reflected from a target in the FOV present at the scanned angular position of the laser of PTX 206.
  • the time of flight is a timestamp value that represents the distance of the reflecting target, object or scene element to scanning device 204.
  • Time of flight values may be realized by photon detection and counting methods such as: TCSPC (time correlated single photon counters), analog methods for photon detection such as signal integration and qualification (via analog to digital converters or plain comparators) or otherwise.
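The relationship between a time-of-flight timestamp and target distance described in the bullets above can be sketched as follows. This is a minimal illustration under our own naming conventions; the application does not specify an implementation.

```python
# Convert a round-trip time-of-flight timestamp to a target distance.
# Hypothetical sketch: names and structure are ours, not from the application.

C = 299_792_458.0  # speed of light, m/s


def tof_to_distance(tof_seconds: float) -> float:
    """Distance to the reflecting target, given round-trip time of flight."""
    # The pulse travels to the target and back, so halve the total path.
    return C * tof_seconds / 2.0


# A 1 microsecond round trip corresponds to roughly 150 m.
print(round(tof_to_distance(1e-6), 1))  # 149.9
```

The halving step is the essential point: the timestamp covers the out-and-back path, while the point cloud needs the one-way range.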
  • detector 212 may include a full array of single photon detection avalanche diodes which may be partitioned into one or more pixels that capture a fragment of the FOV.
  • a pixel may represent the basic data element that builds up the captured FOV in the 3-dimensional space (e.g. the basic element of a point cloud representation) including a spatial position and the reflected intensity value.
  • detector 212 may include: (a) a two dimensional array sized to capture one or more pixels out of the FOV, a pixel window may contain a fraction of a pixel, one or more pixels or otherwise; (b) a two dimensional array that captures multiple rows or columns in a FOV up to an entire FOV; (c) a single dimensional array and/or (d) a single SPAD element or otherwise.
  • PRX 208 may also include an optical inlet which may be a single physical path with a single lens or no lens at all.
  • PRX 208 may include one or more PRX state sensors to produce a signal indicating an operational state of PRX 208, for example power information or temperature information, detector state and more.
  • scanning device 204 may be a bistatic scanning device where PTX 206 and PRX 208 have separate optical paths, or scanning device 204 may be a monostatic scanning system where PTX 206 and PRX 208 have a joint optical path.
  • scanning device 204 may include a photonic steering assembly (PSY), such as PSY 216, to direct pulses of inspection photons from PTX 206 in a direction of an inspected scene and to steer reflection photons from the scene back to PRX 208.
  • PSY 216 may also be in charge of positioning the singular scanned pixel window onto/in the direction of detector 212.
  • PSY 216 may be a joint PSY, and accordingly, may be joint between PTX 206 and PRX 208 which may be a preferred embodiment for a monostatic scanning system.
  • PSY 216 may include a plurality of steering assemblies or may have several parts, one associated with PTX 206 and another associated with PRX 208.
  • PSY 216 may be a dynamic steering assembly and may be controllable by steering parameters control 218.
  • Example steering parameters may include: scanning method that defines the acquisition pattern and sample size of the scene, power modulation that defines the range accuracy of the acquired scene, correction of axis impairments based on collected feedback.
  • PSY 216 may include: (a) a single dual-axis MEMS mirror; (b) a dual single-axis MEMS mirror; (c) a mirror array where multiple mirrors are synchronized in unison and acting as a single large mirror; (d) a split mirror array with separate transmission and reception and/or (e) a combination of these and more.
  • in some embodiments, PSY 216 includes a MEMS split array
  • the beam splitter may be integrated with the laser beam steering.
  • part of the array may be used for the transmission path and the second part of the array may be used for the reception path.
  • the transmission mirrors may be synchronized and the reception mirrors may be synchronized separately from the transmission mirrors.
  • the transmission mirror and reception mirror subarrays maintain an angular shift between themselves in order to steer the beam into separate ports, essentially integrating a circulator module.
  • PSY 216 may include one or more PSY state sensors which may at least partially be used for producing a signal indicating an operational state of PSY 216 such as PSY feedback 230, which may include power information or temperature information, reflector state, reflector actual axis positioning, reflector mechanical state, operational health state and more.
  • PSY 216 may also include a circulator module/beam splitter, although it is understood that the splitter may also be part of PRX 208 instead.
  • the beam splitter may be configured to separate the transmission path of PTX 206 from the reception path of PRX 208.
  • the beam splitter may either be integrated in the steering assembly (for example if a splitter array is utilized) or may be redundant or not needed and accordingly the scanning device may not include a beam splitter.
  • the beam splitter of PSY 216 may be a polarized beam splitter (PBS), a splitter PBS (polarizing beam splitter) integrating a mirror and a quarter wave plate, a circulator beam splitter and/or a slit-based reflector or the like.
  • PSY 216 may include one or more reflective surfaces, each of which may be associated with an electrically controllable electromechanical actuator.
  • the reflective surface(s) may be made from polished gold, aluminum, silicon, silver, or otherwise.
  • the electromechanical actuator(s) may be selected from actuators such as stepper motors, direct current motors, galvanometric actuators, electrostatic, magnetic or piezo elements, or thermal based actuators.
  • PSY 216 may include or be otherwise associated with one or more microelectromechanical systems (MEMS) mirror assemblies.
  • a photonic steering assembly according to refractive embodiments may include one or more refractive materials whose index of refraction may be electrically modulated, either by inducing an electric field around the material or by applying electromechanical vibrations to the material.
  • the PSY 216 may include a beam splitter to help separate the transmission path from the reception path.
  • Using the same photonic steering assembly may provide for tight synchronization between a direction in which a photonic pulse/beam is steered and emitted by the photonic emitter assembly and a direction of a concurrent FOV of one or more optical sensors of the photonic detection assembly.
  • Shared photonic steering assembly configuration may allow for a photonic detector assembly of a given device to focus upon and almost exclusively to collect/receive reflected photons from substantially the same scene segment being concurrently illuminated by the given device's photonic emitter assembly. Accordingly, as the photonic steering assembly moves, so does the photonic pulse illumination angle along with the FOV angle.
  • scanning device 204 may include a controller to control scanning device 204, such as controller 220.
  • Controller 220 may receive scene signal 210 from detector 212 and may control PTX 206, PSY 216 and PRX 208 including detector 212, based on information stored in the controller memory 222 as well as received scene signal 210, including accumulated information from a plurality of scene signals 210 received over time.
  • SAL 226 may receive a PTX feedback 229 indicating PTX associated information such as power consumption, temperature, laser operational status, actual emitted signal and more.
  • SAL 226 may receive a PRX feedback 231 indicating PRX associated information such as power consumption, temperature, detector state feedback, detector actual state, PRX operational status and more.
  • SAL 226 may receive a PSY feedback 230 indicating PSY associated information such as power consumption, temperature, instantaneous position of PSY 216, instantaneous scanning speed of PSY 216, instantaneous scanning frequency of PSY 216, mechanical overshoot of PSY 216, PSY operational status and more.
  • SAL 226 may receive a host information and feedback signal such as host feedback 232 which may include information received from the host.
  • Host feedback may include information from other sensors in the system such as other LiDARs, camera, RF radar, acoustic proximity system and more.
  • controller 220 may process scene signal 210, optionally with additional information and signals, and produce a vision output such as vision signal 234 which may be relayed/transmitted to an associated host device. Controller 220 may receive detected scene signal 210 from detector 212; optionally, scene signal 210 may include time of flight values and intensity values of the received photons. Controller 220 may build up a point cloud or 3D or 2D representation of the FOV by utilizing digital signal processing, image processing and computer vision techniques.
  • controller 220 may include situational assessment logic or circuitry such as situational assessment logic (SAL) 226.
  • SAL 226 may receive detected scene signal 210 from detector 212 as well as information from additional blocks/elements either internal or external to scanning device 204 such as PTX feedback 229, PSY feedback 230, PRX feedback 231, host feedback 232 and more.
  • scene signal 210 can be assessed, with or without additional feedback signals such as PSY feedback 230, PTX feedback 229, PRX feedback 231 and host feedback 232 and information stored in memory 222, applying weighted local and global cost functions that determine a scanning plan, such as work plan signal 234, for scanning device 204 (such as: which pixels in the FOV are scanned, at which laser parameters budget, at which detector parameters budget).
  • Controls such as PTX control signal 251, steering parameters control 218, PRX control 252 and/or detector parameters control 214 may be determined/updated based on work plan 234.
  • controller 220 may be a closed loop dynamic controller that receives system feedback and updates the system's operation based on that feedback.
  • a scanning device for scanning one or more segments of a scene, also referred to as scene segments.
  • the device may include one or more photonic emitter assemblies (PTX), one or more photonic reception and detection assemblies (PRX), a photonic steering assembly (PSY) and a situationally aware controller adapted to synchronize operation of the PTX, PRX and PSY, such that the device may dynamically perform active scanning of one or more scene segments, or regions, of a scene during a scanning frame.
  • Active scanning may include transmission of one or more photonic inspection pulses towards and across a scene segment, and when a scene element present within the scene segment is hit by an inspection pulse, measuring a roundtrip time-of-flight for the pulse to hit the element and its reflection to return, in order to estimate a distance and a (relative) 3-dimensional coordinate of the point hit by the inspection pulse on the scene element.
  • a 3-dimensional point cloud may be generated and used to detect, register and possibly identify the scene element.
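Combining the measured range with the steering angles at emission time yields the 3-dimensional coordinate of each point. A minimal sketch follows; the axis convention (x forward, y left, z up) and function names are our assumptions, not specified in the application.

```python
import math


def point_from_scan(azimuth_rad: float, elevation_rad: float, range_m: float):
    """Convert steering angles and measured range to a Cartesian point
    relative to the scanning device (x forward, y left, z up)."""
    x = range_m * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = range_m * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = range_m * math.sin(elevation_rad)
    return (x, y, z)


# A target 100 m straight ahead at zero elevation:
print(point_from_scan(0.0, 0.0, 100.0))  # (100.0, 0.0, 0.0)
```

Accumulating such points over a scanning frame produces the point cloud representation referenced above.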
  • the controller may be a situationally aware controller and may dynamically adjust the operational mode and operational parameters of the PTX, PRX and/or PSY based on one or more detected and/or otherwise known scene related situational parameters.
  • the controller may generate and/or adjust a work plan such as scanning plan 234 for scanning portions of a scene, as part of a scanning frame intended to scan/cover one or more segments of the scene, based on an understanding of situational parameters such as scene elements present within the one or more scene segment.
  • Other situational parameters which may be factored in generating the scanning plan may include a location and/or a trajectory of a host platform carrying a device according to embodiments.
  • Yet further situational parameters which may be factored in generating the scanning plan may include a topography, including road slope, pitch and curvature, surrounding a host platform carrying a device according to embodiments.
  • Scanning plan 234 may include: (a) a designation of scene segments within the scene to be actively scanned as part of a scanning frame, (b) an inspection pulse set scheme (PSS) which may define a pulse distribution pattern and/or individual pulse characteristics of a set of inspection pulses used to scan at least one of the scene segments, (c) a detection scheme which may define a detector sensitivity or responsivity pattern, (d) a steering scheme which may define a steering direction, frequency, designate idle elements within a steering array and more. In other words, scanning plan 234 may at least partially affect/determine PTX control signal 251, steering parameters control 218, PRX control 252 and/or detector parameters control 214 so that a scanning frame is actively scanned based on scene analysis.
  • scene related situational parameters factored in formulating work plan 234 may come from: (a) localized output of a shared/pre-stored background model (background, topography, road, landmarks, etc.); (b) localization using GPS, terrestrial radio beacons, INS, visual landmark detection; (c) accelerometer, gravity meter, etc.
  • factors in formulating/generating/adjusting work plan 234 may include: (a) host location and/or trajectory; (b) terrain (such as road features and delimiters, static features such as trees, buildings, bridges, signs and landmarks and more); (c) background elements (assumed and detected); and (d) foreground elements' (detected) location and trajectory, and more.
  • work plan 234 may determine or cause the FOV to be modified/determined.
  • Scanning device 204 can change its reference or nominal FOV observation by modifying, for example, the boresight reference point of sight.
  • a solid-state Lidar, if incorporated in scanning device 204, may control the boresight reference point in space while maintaining the same FOV, a feature not feasible with fixed FOV Lidar devices.
  • SAL 226 may determine scanning plan 234 at least partially by determining/detecting/receiving regions of interest within the FOV and regions of non-interest within the FOV.
  • Regions of interest may be sections/pixels/elements within the FOV that are important to monitor/detect; for example, areas which may be marked as regions of interest may include crosswalks, moving elements, people, nearby vehicles and more.
  • Regions of non-interest may be static (non-moving) far-away buildings, skyline and more.
  • scanning plan 234 may control one or more control signals including: PTX control 251 , PSY control 218, PRX control 252 and/or detector control 214.
  • the control signals may be utilized for (a) laser power scheduling to allocate laser power for each element or tri-dimensional pixel of a frame that is in the process of acquisition or scheduled for acquisition; (b) laser pulse modulation characteristics such as duration, rate, peak and average power, spot shape and more; (c) detector resources allocation, for example to activate detector elements where a ROI is expected and disable detector elements where regions of non-interest are expected in order to reduce noise; detector sensitivity, such as high sensitivity for long range detection where the reflected power is low; detector resolution, for example long range detection with a weak reflected signal may result in averaging of multiple detector elements otherwise serving as separate higher resolution pixels; and (d) updating steering parameters to scan an active FOV.
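The per-pixel laser power scheduling and detector resource allocation described above can be sketched as a simple mapping from region designations to pulse power and detector enablement. The labels, budget values and data layout below are our own illustrative assumptions; the application describes the concept, not this code.

```python
# Hypothetical sketch: derive per-pixel (laser power, detector enabled)
# from ROI designations of a frame. Labels and budgets are illustrative.

POWER_BUDGET = {              # relative laser power per designation
    "ROI_HIGH": 1.0,
    "ROI_MEDIUM": 0.6,
    "DEFAULT": 0.3,
    "RONI": 0.0,              # no power steered toward regions of non-interest
}


def build_frame_plan(designations):
    """designations: 2D list of region labels, one per pixel.
    Returns per-pixel (laser_power, detector_enabled) tuples; the detector
    is disabled wherever no power is steered, to reduce noise."""
    plan = []
    for row in designations:
        plan.append([(POWER_BUDGET[label], POWER_BUDGET[label] > 0.0)
                     for label in row])
    return plan


frame = [["RONI", "ROI_HIGH"], ["DEFAULT", "ROI_MEDIUM"]]
plan = build_frame_plan(frame)
print(plan[0][0])  # (0.0, False): detector off where no power is transmitted
```

Disabling the detector for unpowered pixels mirrors the noise-reduction rationale in item (c) above.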
  • In Figs. D4A-D4F, shown are schematics depicting scanning plans which may be utilized to control pulse parameters and/or detector parameters and/or steering parameters, using an identical key 402 for all of these figures.
  • Fig. D4A depicts a first frame 404 wherein all of the pixels are of the same importance/priority, having a default power allocated to them; this may, for example, be utilized in a start-up phase or periodically interleaved in a scanning plan to monitor the whole frame for unexpected/new elements.
  • the pulse parameters may be configured to have a constant amplitude.
  • In Fig. D4B, depicted is a second frame 406 which may be a partial pixel frame; a section of frame 406 is configured to have a high power while the rest of the frame may be configured to have no power.
  • the pixels having maximal power may be a ROI.
  • the resulting frame may have a low number of pixels, enabling a high range in the ROI due to concentration of laser power.
  • the pulse parameters may, for example, be configured to have a high amplitude only in the ROI, with no power steered in the RONI.
  • a steering device may be utilized to deflect the signal only in the ROI and/or a detector may be configured to receive a signal only where the ROI is expected to be received, to avoid any noise for the pixels that have no power.
  • In Fig. D4C, depicted is a third frame 408 which may be characterized in that all the pixels have a power allocation according to the ROI designation. Thus, the most interesting/important regions may have the highest power and so on.
  • In Fig. D4D, depicted is a fourth frame 410 which is characterized by a range of differently powered pixels.
  • the ROI in the center is allocated with maximal power while the lower interest region has a default power in a lower spatial resolution, which is a different way of receiving information for a RONI or region of lower interest.
  • the pulse parameters may be configured to have a high amplitude in the ROI and a lower amplitude with a lower frequency may be utilized for the other pixels.
  • the detector may be turned off in the turned-off pixels and steering parameters may be modified, for example, for rows that do not have a ROI in them.
  • In Fig. D4E, shown is a fifth frame 412 which is characterized as having variable resolution and variable power/range.
  • the ROI, in this example, has high resolution and high power; additional pixels are at default power, low power, or lower spatial resolution.
  • In Fig. D4F, shown is a sixth frame 414 which includes a compact vehicle and a bus (see silhouettes); the edges of the vehicle and bus may be tracked with high power while the central mass of the vehicle and bus may be allocated lesser power (or no power). Such power allocation enables concentrating more power on the edges and less on the center, which has less importance.
  • scanning plan 234 may dynamically allocate laser, detector and steering resources towards regions of interest/non-interest based on several strategies.
  • the pixel may be skipped (by not allocating laser power, by disabling reflection toward the scene and/or by disabling the detector or otherwise).
  • This example may be utilized for a center pixel in a tracked vehicle that would be considered much less interesting than the edge pixels of the same vehicle (see also discussion of Fig. D4F).
  • power may be scheduled (by allocating laser power, by enabling reflection towards and from the pixel and by determining an efficient detector accuracy) for predicted locations of vertical edges of a building, the predicted location of a vehicle in motion that quickly changes lanes, or the edges of the FOV that coincide with the host vehicle turning in a certain direction.
  • laser power may be scheduled periodically over one or more time related sequences (full frames, partial frames) in order to acquire non-deterministic data. Periodicity may be determined by prediction estimation quality factors. For example, a region may be considered noisy, having a lot of movement, and accordingly may be checked (i.e., may be scanned or may be scanned with more accuracy) more frequently than an area designated as static background.
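The periodic re-scan strategy above can be sketched as assigning each region a revisit interval driven by its prediction estimation quality, so noisy regions are re-scanned every frame and static background only occasionally. The quality thresholds and interval values below are illustrative assumptions, not from the application.

```python
# Illustrative sketch: choose which regions to re-scan on a given frame,
# revisiting "noisy" regions more often than static background.

def revisit_interval(prediction_quality: float) -> int:
    """High prediction quality (static background) -> long revisit interval;
    low quality (lots of movement) -> re-scan every frame.
    Thresholds are illustrative."""
    if prediction_quality < 0.3:
        return 1          # re-scan every frame
    if prediction_quality < 0.7:
        return 4
    return 16             # static background: periodically confirm only


def regions_to_scan(regions, frame_index: int):
    """regions: dict of name -> prediction quality in [0, 1].
    Returns the regions scheduled for scanning on this frame."""
    return [name for name, q in regions.items()
            if frame_index % revisit_interval(q) == 0]


regions = {"pedestrian": 0.1, "parked_cars": 0.5, "skyline": 0.9}
print(regions_to_scan(regions, 4))  # ['pedestrian', 'parked_cars']
```

On frame 4 the static skyline is skipped; it would still be revisited on frame 16, matching the idea of periodically confirming a region of non-interest.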
  • In Fig. D5A, shown is a schematic of a scene 500 to be scanned by a scanning device traveling in the direction of arrow 502.
  • the regions of interest of the scene are designated as either being a RONI or a ROI having a level of interest between low and high (see key 504).
  • the road delimiters and the buildings' vertical planes, in the example, would be designated as regions of high interest (R2)
  • the pedestrian and a moving car a bit farther ahead are designated as regions of medium interest (R1) and the rest of the scene is generally considered a region of low interest (R0)
  • the skyline is designated as a RONI (R3).
  • In Fig. D5B, chart 550 depicts the power or resource allocation for scene 500 as determined by an associated controller which includes an SAL.
  • Chart 575 depicts interleaving of ROIs in power allocation over time so that a signal intermittently allocates the most power to the region of highest interest R2, then to the region of medium interest R1, and the lowest allocation to the low interest R0. Some power is also allocated to RONI R3 in order to periodically confirm that it is still a RONI.
  • SAL 226 may receive information from in- band and/or out-of-band sources.
  • In-band sources are internal sources of scanning device 204 and may include vision signal 234, detected scene signal 210, PTX feedback 229, PSY feedback 230, and/or memory 222 and more. Analysis of these in-band sources may yield yet further in-band information.
  • In-band information may include a road plane and road delimiters, curbs, pedestrians, vehicles, a skyline, vertical planes such as building facets, tree canopies and more, and intersections such as road intersections which may be considered a virtual plane.
  • Additional in- band information may include laser power budget such as eye safety limitations, thermal limitation, reliability limitations and more which may be stored in memory 222.
  • Additional in-band information may include electrical operational parameters such as peak currents and peak voltages, calibration data such as a detected and stored correction so that scanning device 204 is calibrated.
  • Calibration data may be static, meaning tested and stored in an initiation or production process or may be dynamic to compensate for ongoing degradation or changes in the system such as operating temperature, operating voltage, etc.
  • In-band information may also include an acquired background model, acquired ROI model and/or acquired RONI model, each of which may be acquired over time by scanning device 204; for example, if scanning device 204 operates repeatedly in a certain location/area, the system may accumulate scene information history via system learning models, ROI and RONI models and background models, and store them locally.
  • out-of-band sources are external sources to scanning device 204.
  • the out-of-band information may be received via host feedback 232.
  • the out-of-band sources may be directly from host 228 or may be received by host 228 and relayed to scanning device 204.
  • Out-of-band type information may include Inertial Measurement Unit (IMU) data, ego-motion, brake or acceleration of the associated host, host wheel or wing position, GPS information, directional audio information (police siren, ambulance siren, car crash, people shouting, horns, tires screeching, etc.), a background shared model and more.
  • a background shared model may be a source of background local information such as a web map and more.
  • out-of-band sources, which are sources in host 228, associated with host 228 or detected by host 228, may include: a shared or pre-stored background model, accelerometer, gravity meter and additional sensors, an acquired background model, cameras and/or camera based feature/element detection, landmark lists related to global or local positioning (such as GPS, wireless, Wi-Fi, Bluetooth, vehicle-to-vehicle infrastructure and more) which may be accessed via a crowd sharing model and may be downloaded from a shared storage such as a cloud server.
  • laser power may be controlled so that maximal signal power is not exceeded and maximal detection sensitivity is also not exceeded.
  • With regard to maximal signal power not being exceeded, the power for a transmitted laser signal is distributed according to prioritization, taking into consideration an expected model, as shown with regard to chart 575 for example.
  • With regard to return signals, it is understood that a reflected signal is scene dependent; depending on the reflectivity of the scene elements, noise and ambient conditions as well as the distance of elements, a maximal threshold of a reflected signal may unintentionally be exceeded.
  • Flow chart 600 of Fig. D6 shows an initiation stage (602) initiating a scanning sequence in which the laser power is set to the minimal power setting (above zero) and the reflected signal is expected to be received at a default value (604).
  • the signal is then transmitted with the predetermined signal power (606) which at this point is still the minimal power.
  • the power is tested/checked (608); if the received signal has not reached its maximal power threshold (610) and the transmitted signal has not reached its maximal power threshold (614), then the transmitted power level is increased (616).
  • Once the maximal received signal threshold is reached, the scene may be detected and/or regular operation of the scanning device may proceed (620). It is understood that the monitoring of the received signal as described in flow chart 600 may be carried out in parallel to the regular operation of the scanning device and/or intermittently or periodically.
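The ramp-up of flow chart 600 can be sketched as a simple closed loop: start at the minimal laser power and increase it until either the received-signal ceiling or the transmit-power ceiling is reached. The step size, units (integer percent, to keep the sketch exact) and the `measure_received` stand-in for the real detector path are our own illustrative assumptions.

```python
# Sketch of the flow-chart-600 power ramp. Power is in integer percent
# (illustrative); measure_received stands in for the real detector readout.

def ramp_power(measure_received, p_min=5, p_max=100, rx_max=100, step=5):
    power = p_min                         # (602)/(604): minimal setting
    while True:
        rx = measure_received(power)      # (606)/(608): transmit and check
        if rx >= rx_max:                  # (610): received ceiling reached
            break
        if power >= p_max:                # (614): transmit ceiling reached
            break
        power += step                     # (616): increase and repeat
    return power


# A toy scene where the received signal is proportional to transmitted power:
print(ramp_power(lambda p: 2 * p))  # 50
```

With a strongly reflecting scene the loop stops early at the received-signal ceiling; with no return (e.g. `lambda p: 0`) it ramps all the way to the transmit ceiling, matching the two exit branches (610) and (614).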
  • SAL 226 may also take into account accumulative temperature information and reduce QOS (by limiting, for example, the transmitted signal, detector power and more). Accordingly, a work plan may be derived in accordance with an adjustable QOS. While peak current and/or voltage limitations may be more lenient, since typically even if a peak current/voltage event occurs it may immediately be relieved/stopped, with regard to exceeding a peak temperature the problem is harder to solve. Scanning device 204's temperature may be monitored in each block and/or by one or more dedicated sensors. It is understood that typically, once a maximal threshold is exceeded, it may be very difficult to cause scanning device 204 to cool down.
  • SAL 226 may be configured to prioritize temperature and weather conditions accordingly.
  • SAL 226 may also prioritize information based on whether it is in-band or out-of-band information. For example, if a host signals to SAL 226 that a turn is expected, that may cause work plan signal 234 to be updated regardless of the scanning process, since a new FOV is expected. Accordingly, an out-of-band signal/information may selectively interrupt a SAL 226 process for calculating/analyzing work plan signal 234.
  • the host feedback may include an override command structure including a flag indicating that the host input is to override the internal feedbacks and signals.
  • the override structure may contain direct designation to scan certain portion(s) of the scene at a certain power that translates into the LiDAR range and more.
  • In Fig. D7A, shown is an example scene according to some embodiments, such as scene 700, which may include one or more background elements.
  • Background elements may be regions of interest or regions of non- interest.
  • a background model may be utilized so that SAL 226 may at least partially utilize a background model in order to analyze a scene based on a-priori information and produce a work plan signal 234.
  • a scanning device may be traveling in the direction as shown by arrow 702.
  • Buildings 704 and 706 and traffic light 708 may be part of a background model stored in an associated memory or received from a host.
  • An associated SAL may utilize this background information so that the scanning device does not need to receive a signal to detect building 704, but rather only needs to confirm existence of the expected building.
  • traffic light 708 may also be part of a background model, and so does not need to be detected but rather confirmed. However, since it may be considered very important for a scanning device to detect the status (red, green, etc.) and precise location of the traffic light based on the background model, the traffic light 708 may be designated as a region of high interest.
  • a traffic light might also be a region of high interest for sensor information fusion, for example complementing an accurate position of a LiDAR with color information detection from a RGB camera.
  • elements of the background, such as building 712, may not yet be part of the background model.
  • a scanning system may utilize system learning to update a background model.
  • In Fig. D7B, shown is a flow chart 750 of a system learning method for utilizing and updating a background model, in accordance with some embodiments.
  • a localization or background model is retrieved from storage (752); the storage may be local, a shared remote storage, or a local copy from a shared remote storage.
  • the background model is verified, confirming that the background is relevant to the expected upcoming frame at t+1 (754). If the background model is inaccurate/irrelevant then a new background model may be estimated (756).
  • step 756 in the context of Fig. D7A may include verifying that buildings 704 and 706 exist. As discussed with regard to Fig. D7A, building 712 did not exist in the background model, in which case the additional background information may be added to the background model (758).
  • the next step (based on the updated model or a correct model) is utilizing the background model for scanning the frame at t+1 (762). If the model is confirmed by the captured scene elements as correct at t+1, it may be relayed to a shared background model (764 and 766), after which a scanning device may continue to a next frame (768) (such as t+2).
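Flow chart 750 can be sketched as the following loop over one frame. The set-based data structures and the Fig. D7A element names are our own illustrative choices; the application specifies the steps, not an implementation.

```python
# Illustrative sketch of flow chart 750: verify the stored background model
# against the captured frame, extend it with new elements, and decide
# whether to relay it to the shared background model.

def process_frame(stored_model: set, captured_elements: set):
    """Returns the (possibly updated) model and whether it should be
    relayed to the shared background model (steps 752-766)."""
    model = set(stored_model)                 # (752): retrieve local copy
    missing = captured_elements - model       # (754): verify relevance
    if missing:                               # (756)/(758): extend the model
        model |= missing
    confirmed = captured_elements <= model    # (762)/(764): confirm at t+1
    return model, confirmed                   # (766): relay if confirmed


stored = {"building_704", "building_706", "traffic_light_708"}
captured = stored | {"building_712"}          # new building, as in Fig. D7A
model, share = process_frame(stored, captured)
print("building_712" in model, share)  # True True
```

After processing, building 712 is part of the model and the confirmed model is eligible for relay to shared storage, after which scanning continues to the next frame (768).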
  • Scene 810 includes a vehicle 812 with a scanning device 814.
  • the vehicle is traveling downhill in the direction of a truck 816 and a second vehicle 818.
  • the FOV of scanning device 814, shown as FOV 815, has minimal and maximal elevation points within which neither truck 816 nor vehicle 818 falls. Accordingly, scanning device 814 cannot detect truck 816 or vehicle 818 and is only expected to do so when it gets substantially closer to them.
  • Scene 820 is substantially similar; however, in scene 820 scanning device 814 has a dynamic FOV and has updated FOV 819 with minimal and maximal FOV elevations based on the detected hill slope/incline vehicle 812 is driving on (acquired/detected/designated by a work plan signal). Accordingly, both vehicle 818 and truck 816 are detected by scanning device 814 in scene 820. Accordingly, a SAL work plan may update a dynamic FOV. More examples are discussed in the following figures.
  • Turning to Fig. D9A, shown is a FOV ratio 900 including a maximal FOV 902 and an active FOV 904 within the maximal FOV 902, the active FOV selected by a SAL based on a work plan signal.
  • Fig. D9B includes an example FOV 910 depicting a default FOV having a center boresight 914, an example FOV 920 having a default FOV with a shifted boresight 924 and an example FOV 930 having a shifted boresight and a shifted aspect ratio 934.
  • Turning to Fig. D9C, shown are examples of FOVs and a transition in the active FOV within maximal FOV 902.
  • yaw relates to movement about the vertical axis
  • pitch relates to movement about the lateral axis
  • roll relates to movement about the longitudinal axis.
  • FOV 940 shows a transition from a first active FOV 944 to a second active FOV 946 when a host intends to turn left.
  • FOV 950 shows a plurality of active FOVs (954-958), all acquired in parallel in accordance with a multiple boresight targets embodiment.
  • FOV 960 shows a transition from a first active FOV 964 to a second active FOV 966 when a host having 4 wheels drives its two left wheels on the sidewalk, causing movement in the roll axis.
  • Rolling examples include: a bend detected by LiDAR background estimation that causes the vehicle to roll sideways across the bend's berm.
  • a host vehicle may drive/park partially on a sidewalk or other element that changes the vehicle's parallelism with respect to the road and the FOV.
  • a static roll may be caused by uneven weight distribution in the vehicle or a malfunction of the damping system.
  • FOV 970 shows a transition from a first active FOV 974 to a second active FOV 976 when a host is intending to or is moving downhill or into an underground garage causing movement in the pitch axis.
  • a correction along the pitch axis may address situations where a vehicle is no longer parallel to the road and the vertical FOV is not optimal; speed bumps, a special case in which both the altitude and the tilt angle of the LiDAR effective FOV change; or a vehicle nose dive or elevation when the vehicle brakes or accelerates, or when wind pressure at high speed causes the vehicle to change its level position.
  • Yet another example is a vehicle transitioning through short pathways that exhibit a large elevation difference, for example an underground parking garage: when exiting, the vehicle's front hood obstructs the driver's FOV from perceiving obstacles at the end of the climb. Updating the active FOV enables overcoming these difficulties.
  • Additional yaw correction examples include when a bend is detected by a background estimation and the active FOV is gradually shifted according to the speed and the bend features, in order to optimize the target FOV and ultimately detect obstacles in the bend's path.
  • Another example is when a change in wheel steering in a certain direction causes the FOV to shift towards that direction.
  • FOV 990 shows a transition from a first active FOV 994 to a second active FOV 996 when a host drives over a berm curb on the left, causing a transition in the roll, pitch and yaw axes.
  • SAL 226 may determine work plan 234 which in turn may update any of scanning device 204 updateable parameters (discussed above) based on a plurality of situational parameters.
  • Scene elements may be determined to be regions of interest by suppressing background features detected in a previous or current frame.
  • Computer vision processing may be utilized to detect scene elements and objects. Such processing may include: motion tracking methods, geometrical correction, and model matching (confirming that a detected element is the same as an expected background element, or that it matches a standard element, which may be used to detect curbs, stoplights, signals and more).
  • element and object prediction methods may be utilized based on current and previous frames.
  • SAL 226 may determine objects to be background or may confirm that expected background objects are present in the scene. Background features may be predicted as described above; accordingly, they need only be verified and confirmed, and therefore less power needs to be allocated to detecting these elements, allowing more power/resources to be allocated toward ROIs. SAL 226 may receive background models from a local memory or a shared storage, and may also detect background elements independently. Furthermore, SAL 226 may update work plan 234 based on the location and/or trajectory of a host platform 228, detected topography, and more. Furthermore, a FOV determined by SAL 226 may cause an update in work plan 234, which in turn may update a dynamic FOV so that the required/appropriate FOV is scanned.
  • work plan 234 may be produced based on (a) real-time detected scene signal (b) intra-frame level scene signal and (c) inter-frame level scene signal accumulated and analyzed over two or more frames.
  • work plan 234 may be updated based on real-time detected scene information, which may also be termed pixel information.
  • Real-time analysis may examine detected fast signals, received during time of flight, that contain one or more reflections for a given photonic inspection pulse. For example, an unexpectedly detected target in a low priority field may cause controller 218 to update the pulse frequency of the laser of PTX 206 via updating of the pulse parameters.
  • Work plan 234 may also be updated at a frame or sub-frame level, i.e., based on information received, accumulated and/or analyzed within a single frame. Furthermore, work plan 234 may be updated on an inter-frame level, based on information accumulated and analyzed over two or more frames. Increased levels of real-time accuracy, meaning that work plan 234 is updated at a pixel or sub-frame resolution, are achieved when higher levels of computation produce increasingly usable results. Increased levels of non-real-time accuracy are achieved within a specific time period as slower-converging data becomes available (e.g. computer-vision-generated optical flow estimation of objects over several frames), meaning that work plan 234 may be updated as new information becomes evident based on an inter-frame analysis.

[00324] According to some embodiments, Host 228 may include steering modules, GPS, a crowd-sharing background source, and additional scanning devices, cameras and more.
  • Turning to Fig. D10, shown is a flow chart 1000 for scanning a scene in accordance with some embodiments.
  • a scanning device may be operated (1002) to scan a scene.
  • a scene signal may be received alongside internal control signals of the scanning device (1004), as well as a background model (1006) and a signal from an associated host (1008).
  • the scanning device may assess a visual situation based on at least one of these signals (1100) and may update a scanning plan (1102), a background model (1104), and/or a RONI.
  • the scanning plan may cause an update in the PTX, PRX and/or PSY, including updating pulse parameters, scanning parameters and/or detection parameters, and a change in the dynamic FOV.
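The closed-loop flow of flow chart 1000 (operate, receive signals, assess the situation, update the plan) can be sketched as follows. This is a minimal illustrative sketch; the function names and the per-region signal model are hypothetical assumptions, not part of the disclosure.

```python
# Minimal sketch of the closed-loop scanning flow of flow chart 1000
# (steps 1002-1102); all names and the per-region signal model are
# illustrative assumptions.

def assess_visual_situation(scene_signal, background_model):
    """Score each region by its deviation from the expected background."""
    return [abs(s - b) for s, b in zip(scene_signal, background_model)]

def update_scanning_plan(situation):
    """Allocate scanning resources in proportion to situational interest."""
    total = sum(situation)
    if total == 0:
        # Nothing deviates from the background model: scan evenly.
        return [1.0 / len(situation)] * len(situation)
    return [v / total for v in situation]

# One loop iteration: a region that deviates from the background model
# (a possible new object) receives the bulk of the scanning resources.
scene_signal = [0.9, 0.1, 0.5]      # detected per-region return strength
background_model = [0.9, 0.1, 0.1]  # expected background per region
situation = assess_visual_situation(scene_signal, background_model)
plan = update_scanning_plan(situation)
```

In this toy run only the third region deviates from the background model, so the plan concentrates the scanning resources there while the confirmed background regions receive none.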

Abstract

Disclosed are methods, circuits, devices, systems and functionally associated machine executable code for light detection and ranging (Lidar). Lidars according to embodiments disclosed may include a photonic pulse emitter assembly including one or more photonic emitters to generate and focus a photonic inspection pulse towards a photonic transmission (TX) path of the Lidar device. There may be included a photonic detection assembly which may include one or more photo sensors to receive and sense photons of reflected photonic inspection pulses received through a receive (RX) path of the device. A photonic steering assembly may be located along the TX and/or the RX paths and may include a Complex Reflector (CR) made of an array of steerable reflectors.

Description

Methods Circuits Devices Assemblies Systems and Functionally Associated Machine Executable Code for Light Detection and Ranging Based Scanning
FIELD OF THE INVENTION
[001] The present invention relates generally to the field of light detection and ranging. More specifically, the present invention relates to a Lidar with a photonic steering assembly.
BACKGROUND
[002] Lidar, which may also be called "LADAR", is a surveying method that measures a distance to a target by illuminating that target with laser light. Lidar is sometimes considered an acronym of "Light Detection and Ranging", or a portmanteau of light and radar. Lidar may be used in terrestrial, airborne, and mobile applications.
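The ranging principle just stated reduces to halving the round-trip travel time of the laser pulse. A minimal illustrative sketch (not part of the disclosure):

```python
# Time-of-flight ranging: the distance to a target is half the round-trip
# time of the laser pulse multiplied by the speed of light.
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_distance_m(round_trip_time_s: float) -> float:
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

# A pulse echo arriving 1 microsecond after emission puts the target
# at roughly 150 m.
d = tof_distance_m(1e-6)
```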
[003] Autonomous Vehicle Systems are directed to vehicle level autonomous systems involving a Lidar system. An autonomous vehicle system stands for any vehicle integrating partial or full autonomous capabilities.
[004] Autonomous or semi-autonomous vehicles are vehicles (such as motorcycles, cars, buses, trucks and more) that at least partially control the vehicle without human input. Autonomous vehicles sense their environment and navigate to a destination input by a user/driver.
[005] Unmanned aerial vehicles, which may be referred to as drones, are aircraft without a human on board and may also utilize Lidar systems. Optionally, the drones may be operated autonomously or by a remote human operator.
[006] Autonomous vehicles and drones may use Lidar technology in their systems to aid in detecting and scanning a scene/the area in which the vehicle and/or drone is operating.
[007] Lidar systems, drones and autonomous (or semi-autonomous) vehicles are currently expensive and unreliable, and thus unsuitable for a mass market where reliability and dependability are a concern, such as the automotive market.

[008] Host Systems are directed to generic host-level and system-level configurations and operations involving a Lidar system. A host system stands for any computing environment that interfaces with the Lidar, be it a vehicle system or a testing/qualification environment. Such a computing environment includes any device, PC, server, cloud or a combination of one or more of these. This category also covers, as a further example, interfaces to external devices such as a camera and car ego-motion data (acceleration, steering wheel deflection, reverse drive, etc.). It also covers the multitude of interfaces through which a Lidar may interface with the host system, such as a CAN bus.
SUMMARY OF THE INVENTION
[009] Aspects of the present invention may include methods, circuits, devices, assemblies, systems and functionally associated machine executable code for Lidar based scanning. According to some embodiments of the present invention, a light detection and ranging (Lidar) device may include a photonic pulse emitter assembly including one or more photonic emitters to generate and focus a photonic inspection pulse towards a photonic transmission (TX) path of the Lidar device, a photonic detection assembly including one or more photo sensors to receive and sense photons of reflected photonic inspection pulses received through a receive (RX) path of the device, and a photonic steering assembly located along both the TX and the RX paths and including a Complex Reflector (CR) made of an array of steerable reflectors, where a first set of steerable reflectors are part of the TX path and a second set of steerable reflectors are part of the RX path.
[0010] According to some embodiments, the first set of steerable reflectors may direct a photonic inspection pulse from the photonic pulse emitter assembly towards a given segment of a scene to be inspected. Optionally, the second set of steerable reflectors may direct a photonic inspection pulse reflection, reflected off of a surface of an element present in the given segment of the scene, towards the photonic detection assembly. The array of steerable reflectors may be dynamic steerable reflectors. The dynamic steerable reflectors may have a controllable state, such as a transmission state, a reception state and/or an idle state. Also, the separate set of reflectors allocated to the RX path and to the TX path may achieve optical isolation between the TX and RX paths by spatial diversity and multiplexing.
[0011] According to some embodiments, the first set of steerable reflectors may be mechanically coupled to each other and the second set of steerable reflectors may be mechanically coupled to each other. The dynamic steerable reflectors are individually steerable. The first set of steerable reflectors may have a first phase and may be substantially synchronized, and the second set of steerable reflectors may have a second phase and may be substantially synchronized. The first phase and the second phase may have a substantially fixed difference between them. The first set of steerable reflectors may oscillate together at a first frequency and the second set of steerable reflectors may oscillate together at a second frequency. The first and second frequencies may have a substantially fixed phase shift between them. Optionally, increasing a number of dynamic steerable reflectors in a transmission state increases a transmission beam spread, and/or decreasing a number of dynamic steerable reflectors in a reception state may decrease the reception field of view (FOV) and may compensate for ambient light conditions.
[0012] According to some embodiments, dynamic steerable reflectors in an idle state may provide isolation between dynamic steerable reflectors in a transmission state and a reception state. A first set of steerable reflectors may be surrounded by a second set of steerable reflectors. A second set of steerable reflectors may be surrounded by a first set of steerable reflectors.
[0013] According to some embodiments, a method of scanning a scene may include: emitting a photonic pulse towards a photonic transmission (TX) path, receiving reflected photonic pulses received through a receive (RX) path, detecting with a detector a scene signal based on the reflected photonic inspection pulses, and complexly steering the photonic pulse towards a scene and the reflected photonic pulses from a scene to the detector, by reflecting at a first phase the photonic pulse and receiving at a second phase the reflected pulse, where the difference between the first and second phase may be dependent on the time it takes the photonic pulse to be reflected and return.
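The fixed TX/RX phase relation of the method above can be illustrated numerically: if the TX and RX reflector sets oscillate at the same frequency, the RX set should lag by the fraction of the oscillation period consumed by the pulse's round trip. This is a hedged sketch; the function name and example numbers are assumptions, not part of the disclosure.

```python
import math

# Illustrative sketch of the TX/RX phase difference as a function of the
# pulse's time of flight; all names and numbers are assumptions.
SPEED_OF_LIGHT_M_S = 299_792_458.0

def rx_phase_lag_rad(target_range_m: float, mirror_freq_hz: float) -> float:
    """Phase by which the RX reflectors should trail the TX reflectors so
    the returning pulse is steered onto the detector."""
    round_trip_s = 2.0 * target_range_m / SPEED_OF_LIGHT_M_S
    return (2.0 * math.pi * mirror_freq_hz * round_trip_s) % (2.0 * math.pi)

# For a 150 m target and a 1 kHz mirror oscillation, the required lag is
# about 6.3 milliradians, i.e. roughly 0.1% of a full oscillation cycle.
lag = rx_phase_lag_rad(150.0, 1_000.0)
```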
[0014] According to some embodiments, a vehicle may include: a scanning device to produce a detected scene signal, the scanning device including: a photonic pulse emitter assembly including one or more photonic emitters to generate and focus a photonic inspection pulse towards a photonic transmission (TX) path of the device, a photonic detection assembly including one or more photo sensors to receive and sense photons of reflected photonic inspection pulses received through a receive (RX) path of the device, and a photonic steering assembly located along both the TX and the RX paths and including a Complex Reflector (CR) made of an array of steerable reflectors, where a first set of steerable reflectors are part of the TX path and a second set of steerable reflectors are part of the RX path; and a host controller to receive the detected scene signal and control the host device at least partially based on the detected scene signal. Optionally, the host controller may be configured to relay a host signal to the scanning device.

[0015] Other aspects of the present invention include methods, circuits, assemblies, devices, systems and functionally associated machine executable code for controllably steering an optical beam. According to some embodiments, a light steering device may include: a mirror connected to one or more electromechanical actuators through a flexible interconnect element, the one or more electromechanical actuators mechanically interconnected to a frame, and a controllable electric source to, during operation of the device, provide a sensing signal at a source voltage to an electric source contact on at least one of the one or more actuators.
[0016] According to some embodiments, the light steering device may include an electrical sensing circuit connected to an electric sensing contact on at least one of the one or more actuators, to, during operation of the device, measure parameters of the sensing circuit. The electric source and the electrical sensing circuit may be connected to the same actuator and facilitate sensing of a mechanical deflection of the actuator to which the electric source and the electrical sensing circuit are connected. The device may include a sensor to relay a signal indicating an actual deflection determined based on the mechanical deflection. The device may include a controller to control the controllable electric source and the electrical sensing circuit. The controller may also control deflection of the actuator and may correct a steering signal based on the sensed mechanical deflection.
[0017] According to some embodiments, the electric source and the electrical sensing circuit may each be connected to a contact on two separate actuators, and they may facilitate sensing of a mechanical failure of one or more elements supported by the two separate actuators. Optionally, sensing of a mechanical failure is determined based on an amplitude of a sensed current, and/or sensing of a mechanical failure is determined based on a difference between an expected current and a sensed current. Alternative embodiments substituting current with (a) voltage, (b) a current frequency, (c) a voltage frequency, or (d) electrical charge, and more, are understood.
[0018] According to some embodiments, a scanning device may include: a photonic emitter assembly (PTX) to produce pulses of inspection photons which pulses are characterized by at least one pulse parameter, a photonic reception and detection assembly (PRX) to receive reflected photons reflected back from an object, the PRX including a detector to detect the reflected photons and produce a detected scene signal, a photonic steering assembly (PSY) functionally associated with both the PTX and the PRX to direct the pulses of inspection photons in a direction of an inspected scene segment based on at least one PSY parameter and to produce a sensing signal, and a closed loop controller to: (a) control the PSY, (b) receive the sensing signal and (c) update the at least one PSY parameter at least partially based on the detected scene signal. According to some embodiments, the sensing signal may be indicative of an actual deflection of the PSY and/or a mechanical failure.
[0019] According to some embodiments, a method of scanning utilizing a mirror assembly including a mirror and a conductive actuator may include: setting a mirror having a conductive actuator to a predetermined deflection, detecting a current through the actuator indicative of a mechanical deflection of the mirror, and determining if the predetermined deflection is substantially similar to the actual deflection. The method may further include correcting the actual deflection if the predetermined deflection and the actual deflection are substantially different. The method may also include detecting an actual current through the actuator and the mirror indicative of an electro-mechanical state of the mirror assembly, comparing the actual current to an expected current, and determining if a mechanical failure has occurred.
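The two checks of the method above (deflection drift and current-based failure detection) can be sketched as follows. The tolerance values and helper names are illustrative assumptions, not part of the disclosure.

```python
# Illustrative sketch of the closed-loop mirror checks; tolerances are
# assumed values for the sketch only.
DEFLECTION_TOL_RAD = 0.001   # allowed gap between commanded and sensed deflection
CURRENT_TOL_A = 0.05         # allowed gap between expected and sensed current

def needs_correction(predetermined_rad: float, actual_rad: float) -> bool:
    """True when the mirror's actual deflection has drifted from the
    predetermined deflection by more than the tolerance."""
    return abs(predetermined_rad - actual_rad) > DEFLECTION_TOL_RAD

def mechanical_failure(expected_current_a: float, actual_current_a: float) -> bool:
    """A large gap between expected and sensed actuator current suggests a
    broken interconnect or actuator (the difference-based criterion)."""
    return abs(expected_current_a - actual_current_a) > CURRENT_TOL_A
```

For example, a 4 mrad drift would trigger a steering correction, while a 10 mA current deviation would still be treated as healthy under these assumed tolerances.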
[0020] Further aspects of the present invention may include methods, circuits, assemblies, devices, systems and functionally associated machine executable code for closed loop dynamic scene scanning. According to some embodiments, a scanning device may include a photonic emitter assembly (PTX) to emit at least one pulse of inspection photons in accordance with at least one adjustable pulse (generation) parameter, a photonic reception and detection assembly (PRX) to receive reflected photons reflected back from an object, the PRX including a dynamic detector to detect the reflected photons based on one or more adjustable detector parameter, the detector further configured to produce a detected scene signal, and a closed loop controller to control the PTX and PRX and to receive a PTX feedback and a PRX feedback, the controller further comprising a situational assessment unit to receive the detected scene signal from the detector and produce a scanning plan and update the at least one adjustable pulse parameter and at least one detector parameter at least partially based on the scanning plan. The scanning device may include a photonic steering assembly (PSY) and the situational assessment unit may be configured to determine the scanning plan based on a global cost function where the PSY feedback, PRX feedback, PTX feedback, memory information, host feedback and the detected scene signal are used in producing the scanning plan and the host feedback includes an override flag to indicate that the host feedback is to override the other signals and feedbacks.
[0021] According to some embodiments of the present invention, there may be provided a scanning device including a photonic emitter assembly (PTX), a photonic reception and detection assembly (PRX), a photonic steering assembly (PSY) and a controller adapted to synchronize operation of the PTX, PRX and PSY. The controller may be a situationally aware controller which dynamically adjusts the operational mode and operational/scanning parameters of the PTX, PRX and/or PSY based on one or more detected situational characteristics.
[0022] According to some embodiments, a scanning device may include a photonic emitter assembly (PTX) to produce pulses of inspection photons wherein the pulses are characterized by at least one pulse parameter, a photonic reception and detection assembly (PRX) to receive reflected photons reflected back from an object, the PRX including a detector to detect the reflected photons and produce a detected scene signal, a photonic steering assembly (PSY) functionally associated with both the PTX and the PRX to direct the pulses of inspection photons in a direction of an inspected scene segment and to steer the reflection photons back to the PRX, and a closed loop controller to: (a) control the PTX, PRX and PSY, (b) receive the detected scene signal from the detector and (c) update the at least one pulse parameter at least partially based on the detected scene signal.
[0023] According to some embodiments, the at least one pulse parameter may be selected from the following group: pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase and/or polarization.
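The adjustable pulse parameters listed above can be gathered into a single container that a closed-loop controller updates. The field names, units and example values below are assumptions made for this sketch, not part of the disclosure.

```python
from dataclasses import dataclass, replace

# Illustrative container for the adjustable pulse parameters of [0023];
# field names, units and values are assumptions.
@dataclass(frozen=True)
class PulseParams:
    power_w: float             # pulse power intensity
    width_ns: float            # pulse width
    repetition_rate_hz: float  # pulse repetition rate
    sequence: tuple            # pulse sequence pattern
    duty_cycle: float
    wavelength_nm: float
    phase_rad: float
    polarization: str

# A closed-loop controller might raise only the repetition rate when a
# region of interest needs denser sampling, leaving other fields untouched:
base = PulseParams(10.0, 5.0, 100_000.0, (1, 0, 1), 0.01, 905.0, 0.0, "linear")
roi = replace(base, repetition_rate_hz=200_000.0)
```

Keeping the parameters immutable (`frozen=True`) and deriving updated sets with `replace` makes each per-region parameter update an explicit, traceable object.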
[0024] According to some embodiments, the controller may include a situational assessment unit to receive the detected scene signal and produce a scanning/work plan. The situational assessment unit may receive a PSY feedback from the PSY. The situational assessment unit may receive information stored on a memory. Optionally, the information may be selected from the following list: laser power budget, electrical operational characteristics and/or calibration data. The situational assessment unit may use the PSY feedback to produce the scanning/work plan. Laser power budget may be derived from constraints such as: eye safety limitations, thermal budget, laser aging over time and more.
[0025] According to some embodiments, the work plan may be produced based on (a) a real-time detected scene signal, (b) an intra-frame level scene signal and (c) an inter-frame level scene signal accumulated and analyzed over two or more frames.
[0026] According to some embodiments, the detector may be a dynamic detector having one or more detector parameters and the closed loop controller may update the detector parameters based on the work plan. The detector parameters may be selected from the following group: scanning direction, frame rate, ambient light effects, mechanical static and dynamic impairments, dynamic gating for reducing parasitic light, dynamic sensitivity and/or thermal effects. The PSY may have one or more steering parameters and the closed loop controller may update the steering based on the work plan. The steering parameters may be selected from the following group: scanning method, power modulation, single or multiple axis methods, synchronization components. Optionally, the situational assessment unit may receive a host feedback from a host device and use the host feedback to produce or contribute to the work plan.
[0027] According to some embodiments, a method of scanning a scene may include: producing pulses of inspection photons wherein the pulses may be characterized by at least one pulse parameter, receiving reflected photons reflected back from an object; detecting the reflected photons and producing a detected scene signal; and updating at least one pulse parameter based on the detected scene signal.
[0028] According to some embodiments, the method may include producing a work plan based on the detected scene signal. Optionally, producing a work plan is also based on a PSY feedback, and may also be based on information stored on a memory, such as a look-up table or otherwise.

[0029] According to some embodiments, the method may include updating one or more detector parameters based on the work plan, and updating steering of the PSY based on the work plan.
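Producing a work plan from the detected scene signal, the PSY feedback and stored information, as described above, can be sketched as follows. The dictionary keys, threshold and example values are illustrative assumptions, not part of the disclosure.

```python
# Minimal sketch of work plan production per [0028]; all keys, the ROI
# threshold and the example values are assumptions.
def produce_work_plan(scene_signal, psy_feedback, memory):
    return {
        # regions whose return strength exceeds a threshold become ROIs
        "regions_of_interest": [i for i, s in enumerate(scene_signal) if s > 0.5],
        # cap emission at the stored laser power budget
        "max_power_w": memory["laser_power_budget_w"],
        # fold the steering feedback into the plan as a correction term
        "steering_correction_rad": psy_feedback.get("deflection_error_rad", 0.0),
    }

plan = produce_work_plan(
    scene_signal=[0.9, 0.2, 0.7],
    psy_feedback={"deflection_error_rad": 0.002},
    memory={"laser_power_budget_w": 12.0},
)
```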
[0030] According to some embodiments, a vehicle may include a scanning device and a host device to receive a detected scene signal, control the vehicle at least partially based on the detected scene signal, and relay a host feedback to the scanning device. The situational assessment unit of the scanning device may receive the host feedback from the host device and use the host feedback to produce the work plan.
[0031] According to yet further aspects of the present invention, there may be included methods, circuits, assemblies, devices, systems and functionally associated machine executable code for active scene scanning. A scanning device may include: a photonic emitter assembly (PTX) to emit at least one pulse of inspection photons in accordance with at least one adjustable pulse parameter; a photonic reception and detection assembly (PRX) to receive reflected photons reflected back from an object, the PRX including a detector to detect, in accordance with at least one adjustable detection parameter, the reflected photons and produce a detected scene signal; a photonic steering assembly (PSY) functionally associated with both the PTX and the PRX to direct the pulses of inspection photons in a direction of an inspected scene segment and to steer the reflection photons back to the PRX; and a closed loop controller to: (a) control the PTX, PRX and PSY, (b) receive the detected scene signal from the detector and (c) update the at least one pulse parameter and at least one detection parameter at least partially based on a scanning/work plan indicative of an estimated composition of scene elements present within the scene segment covered by the given set of inspection pulses, the work plan derived at least partially from the detected scene signal.
[0032] According to some embodiments, the steering assembly may be configured to direct and to steer in accordance with at least one adjustable steering parameter, determined by a work plan. The steering parameters may be selected from: transmission pattern, sample size of the scene, power modulation that defines the range accuracy of the scene, correction of axis impairments, dynamic FOV determination, scanning method, single or multiple deflection axis methods, synchronization components and more. The pulse parameter may be selected from: pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase and polarization and more. The detection parameter may be selected from: scanning direction, frame rate, ambient light effects, mechanical static and dynamic impairments, thermal effects and more. The work plan may be derived from a background model, a region of interest (ROI) model, a region of non-interest (RONI) model and/or a host signal or otherwise. The steering parameter may be a field of view (FOV) determination. The detected scene signal may be characterized by an adjustable quality of service.
[0033] According to some embodiments, an autonomous vehicle may include a scanning device as discussed above and a host controller to receive the detected scene signal and to relay a host feedback to the scanning device including host ego-motion information. Ego-motion information may include: wheel steering position, vehicle speed, vehicle acceleration, vehicle braking, headlights status, turning lights status, GPS location information and more.
[0034] The work plan may be derived from a background model at least partially stored in the host controller and may be relayed to the scanning device via the host feedback. Optionally, the detected scene signal may be emitted in accordance with an adjustable quality of service.
[0035] According to some embodiments, a method of scanning a scene may include: emitting at least one pulse of inspection photons in accordance with at least one adjustable pulse parameter; detecting, in accordance with at least one adjustable detection parameter, reflected photons and producing a detected scene signal; estimating a scene composition of scene elements present within a scene segment and deriving a scanning plan at least partially from the detected scene signal; and updating at least one pulse parameter and at least one detection parameter at least partially based on the scanning plan.
BRIEF DESCRIPTION OF THE DRAWINGS
[0036] The subject matter regarded as the invention is particularly pointed out and distinctly claimed in the concluding portion of the specification. The invention, however, both as to organization and method of operation, together with objects, features, and advantages thereof, may best be understood by reference to the following detailed description when read with the accompanying drawings in which:
[0037] Fig. A1 A shows an active scanning device which may include or be otherwise functionally associated with one or more photonic steering assemblies in accordance with some embodiments;
[0038] Fig. A1 B is an example embodiment of the scanning device of Fig. A1 A;
[0039] Figs. A2A & A2B show a side view of a plurality of steerable reflector units in accordance with some embodiments;
[0040] Fig. A2C shows a block level diagram of a steerable reflector unit in accordance with some embodiments;
[0041] Fig. A3 shows an example complex reflector in accordance with some embodiments;
[0042] Figs. A4A - A4D show example steering devices in accordance with some embodiments;
[0043] Figs. A5A - A5C show example scanning device schematics in accordance with some embodiments;
[0044] Fig. A6 shows an example scanning system in accordance with some embodiments;
[0045] Fig. A7 is a flowchart associated with a method of scanning a scene in accordance with some embodiments.
[0046] Fig. B1 shows a steering device which may be associated with or part of a scanning device in accordance with some embodiments;
[0047] Fig. B2 shows an example embodiment of a steering device and a central processing unit (CPU) in accordance with some embodiments;
[0048] Fig. B3 shows an example actuator-mirror depiction in accordance with some embodiments;
[0049] Figs. B4A-B4C show a dual axis MEMS mirror, a single axis MEMS mirror and a round MEMS mirror (respectively) in accordance with some embodiments;
[0050] Figs. B5A-B5C show example scanning device schematics;
[0051] Fig. B6 shows an example scanning system in accordance with some embodiments;
[0052] Fig. B7 shows a flowchart of a method for scanning in accordance with some embodiments;
[0053] Figs. C1A - C1C show examples of scanning device schematics in accordance with some embodiments;
[0054] Fig. C2 shows a scanning system in accordance with some embodiments;
[0055] Figs. C3A & C3B show example inspection photonic pulse control signals including example laser signals in accordance with some embodiments;
[0056] Fig. C4 shows an example scanning system in accordance with some embodiments;
[0057] Figs. C5A & C5B show example host systems in accordance with some embodiments;
[0058] Fig. C6 shows a flowchart for a method of scanning a scene in accordance with some embodiments;
[0059] Figs. D1A - D1C depict example monostatic and bistatic (respectively) scanning device schematics in accordance with some embodiments;
[0060] Fig. D2 depicts an example scanning system in accordance with some embodiments;
[0061] Fig. D3 shows example inspection photon pulses control signals including example laser signals A, B and C;
[0062] Figs. D4A - D4F show schematics of different scanning plans which may be utilized to control pulse parameters and/or detector parameters and/or steering parameters, and an identical key 402 for all of these figures;
[0063] Fig. D5A shows a schematic of a scene to be scanned by a scanning device in accordance with some embodiments;
[0064] Fig. D5B shows a chart of the power or resource allocation for the scene of Fig. D5A and a chart depicting interleaving of ROIs in power allocation over time in accordance with some embodiments;
[0065] Fig. D6 shows a flowchart of a method for avoiding exceeding a maximal reflected signal value by controlling the transmitted signal in accordance with some embodiments;
[0066] Fig. D7A shows an example scene which may include one or more background elements in accordance with some embodiments;
[0067] Fig. D7B shows a flowchart associated with a system learning method for utilizing and updating a background model in accordance with some embodiments;
[0068] Fig. D8 shows two identical scenes in accordance with some embodiments;
[0069] Fig. D9A shows a FOV ratio including a maximal FOV and an active FOV;
[0070] Figs. D9B & D9C include example maximal and active FOVs in accordance with some embodiments; and
[0071] Fig. D10 shows a flowchart for scanning a scene in accordance with some embodiments.
[0072] It will be appreciated that for simplicity and clarity of illustration, elements shown in the figures have not necessarily been drawn to scale. For example, the dimensions of some of the elements may be exaggerated relative to other elements for clarity. Further, where considered appropriate, reference numerals may be repeated among the figures to indicate corresponding or analogous elements.
[0073] In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the invention. However, it will be understood by those skilled in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components and circuits have not been described in detail so as not to obscure the present invention.
[0074] Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as "processing", "computing", "calculating", "determining", or the like, refer to the action and/or processes of a computer or computing system, or similar electronic computing device, that manipulate and/or transform data represented as physical, such as electronic, quantities within the computing system's registers and/or memories into other data similarly represented as physical quantities within the computing system's memories, registers or other such information storage, transmission or display devices.
[0075] Embodiments of the present invention may include apparatuses for performing the operations herein. This apparatus may be specially constructed for the desired purposes, or it may comprise a general purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), electrically programmable read-only memories (EPROMs), electrically erasable and programmable read only memories (EEPROMs), magnetic or optical cards, or any other type of media suitable for storing electronic instructions, and capable of being coupled to a computer system bus.
[0076] The processes and displays presented herein are not inherently related to any particular computer or other apparatus. Various general purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct a more specialized apparatus to perform the desired method. The desired structure for a variety of these systems will appear from the description below. In addition, embodiments of the present invention are not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the inventions as described herein.
[0077] The present invention may include methods, circuits, devices, assemblies, systems and functionally associated machine executable code for Lidar based scanning. According to embodiments, there may be provided a scene scanning device adapted to inspect regions or segments of a scene using photonic pulses. The photonic pulses used to inspect the scene, also referred to as inspection pulses, may be generated and transmitted with characteristics which are dynamically selected as a function of various parameters relating to the scene to be scanned and/or relating to a state, location and/or trajectory of the device. Sensing and/or measuring of characteristics of inspection pulse reflections from scene elements illuminated with one or more inspection pulses, according to embodiments, may also be dynamic and may include modulating optical elements on an optical receive path of the device.
[0078] According to some embodiments, inspection of a scene segment may include illumination of the scene segment or region with a pulse of photons, which pulse may have known parameters such as pulse duration, pulse angular dispersion, photon wavelength, instantaneous power, photon density at different distances from the emitter and/or average power. Inspection may also include detecting and characterizing various parameters of reflected inspection photons, which reflected inspection photons are inspection pulse photons reflected back towards the scanning device from an illuminated element present within the inspected scene segment (i.e. scene segment element). Parameters of reflected inspection photons may include photon time of flight (time from emission until detection), instantaneous power at and during return pulse detection, average power across the entire return pulse and photon distribution/signal over the return pulse period. According to further embodiments, by comparing parameters of a photonic inspection pulse with parameters of a corresponding reflected and detected photonic pulse, a distance and possibly a physical characteristic of one or more elements present in the inspected scene segment may be estimated. By repeating this process across multiple adjacent scene segments, optionally in some pattern such as a raster, Lissajous or snake bidirectional pattern, an entire scene may be scanned to produce a depth map of the scene.
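The distance estimation step described above, comparing the emission time of a pulse with the detection time of its reflection, can be sketched as follows. This is an illustrative sketch only; the function name and the 400 ns example are assumptions, not values given in the specification.

```python
# Hypothetical sketch of time-of-flight ranging: the round-trip time of a
# detected reflection, multiplied by the speed of light and halved, gives
# the one-way distance to the reflecting scene element.
SPEED_OF_LIGHT_M_S = 299_792_458.0

def range_from_tof(t_emit_s: float, t_detect_s: float) -> float:
    """Estimate the distance (meters) to a scene element from the time a
    photonic inspection pulse was emitted and the time its reflection
    was detected."""
    tof = t_detect_s - t_emit_s  # round-trip time of flight, seconds
    return SPEED_OF_LIGHT_M_S * tof / 2.0

# A reflection detected 400 ns after emission corresponds to roughly 60 m.
distance_m = range_from_tof(0.0, 400e-9)
```

Repeating such a computation per scene segment, in a raster or Lissajous order, yields the depth map described above.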
[0079] The definition of a scene according to embodiments of the present invention may vary from embodiment to embodiment, depending on the specific intended application of the invention. For Lidar applications, optionally used with a motor vehicle platform, the term scene may be defined as the physical space, up to a certain distance, surrounding the vehicle (in front of, to the sides of, above, below and behind the vehicle). A scene segment or scene region according to embodiments may be defined by a set of angles in a polar coordinate system, for example, corresponding to a diverging pulse or beam of light in a given direction. The light beam/pulse may have a center radial vector in the given direction and may also be characterized by a broader envelope defined by the angular divergence values (polar coordinate ranges) of the light beam/pulse. Since the light beam/pulse produces an illumination area, or spot, of expanding size the further out from the light source the spot hits a target, a scene segment or region being inspected at any given time, with any given photonic pulse, may vary and expand in size the farther away the illuminated scene segment elements are from the active scene scanning device. Accordingly, an inspection resolution of a scene segment may be reduced the farther away the illuminated scene segment elements are from the active scene scanning device.
[0080] A monostatic scanning Lidar system utilizes the same optical path for transmission (Tx) and reception (Rx) of the laser beam. The laser in the transmission path, and accordingly the inspection photons emitted from the laser, may be well collimated and can be focused into a narrow spot, while the reflected photons in the reception path become a larger patch due to dispersion. Accordingly, a steering device is required that is efficient for a large reflection photon patch in the reception path, along with a beam splitter that redirects the received beam (the reflection photons) to the detector.
The large patch of reflection photons requires a large microelectromechanical systems (MEMS) mirror that may have a negative impact on the FOV and frame rate performance. Accordingly, an array of reflective surfaces having a phase between the transmission and reception surfaces is shown. The array contains small mirrors that can perform at a high scan rate with larger angles of deflection. The mirror array may essentially act as a large mirror in terms of effective area. This method decouples the mirror design from the Tx and Rx paths and also obsoletes the requirement for a beam splitter. Using the same photonic steering assembly may provide for tight synchronization between a direction in which a photonic pulse/beam is steered and emitted by the photonic emitter assembly and a direction of a concurrent FOV of one or more optical sensors of the photonic detection assembly. This shared photonic steering assembly configuration may allow a photonic detector assembly of a given device to focus upon, and almost exclusively collect/receive reflected photons from, substantially the same scene segment being concurrently illuminated by the given device's photonic emitter assembly. Accordingly, as the photonic steering assembly moves, so does the photonic pulse illumination angle along with the FOV angle.
[0081] Turning to Fig. A1A, shown is an active scanning device 100 which may include or be otherwise functionally associated with one or more photonic steering assemblies such as complex reflector (CR) 102. CR 102 may be adapted to adjustably steer photons and/or a photonic pulse towards a selected direction, such as the direction of a center vector of a scene segment to be inspected.
CR 102 may be part of a Photonic Transmission (PTX) path 104 of inspection photons emitted by the photonic emitter assembly and may direct, reflectively or using refraction, a pulse of inspection photons towards a scene segment to be inspected 106 (since the inspected scene segment is changing, it is external to the scanning device and therefore has a dashed line in the figure). CR 102, according to embodiments, may also be part of a photonic reception (PRX) path 108 for reflected inspection photons reflected from a surface of a scene element (object) present within an inspected/illuminated scene segment 106, where CR 102 may direct reflected inspection photons, reflectively or using refraction, towards a photon detection aperture/opening of a photonic detector assembly such as detector 110.
[0082] According to embodiments, CR 102 may include one or more sub-groups of steerable reflectors (SR) such as SR1 112 and SR2 114. Each sub-group of electrically controllable/steerable reflectors may include one or more steerable reflector units such as unit 116. Unit 116 may include a microelectromechanical systems mirror or reflective surface assembly and/or an electromechanical actuator and more.
[0083] According to some embodiments, SR1 112 and/or SR2 114 may each include one or more units arranged next to one another in either a one or two dimensional matrix to form Complex Reflector 102. Unit 116 may be individually controllable, for example by a device controller and/or a steering assembly controller, such that each reflector may be made to tilt towards a specific angle along each of one or two separate axes. A set of array reflectors, optionally reflectors adjacent to one another, may be grouped into a Common Control Reflector (CCR) set of array reflectors which are synchronously controlled with one another so as to concurrently tilt or point in approximately the same direction. According to some embodiments, SR1 112 and SR2 114 may each be comprised of one or more CCRs. Accordingly, CR 102 may be parsed into two or more CCR sets. SR1 112 and SR2 114 may each be in-line with, or part of, a separate optical path. As shown in this example, SR1 112 may be part of PTX 104 while SR2 114 is part of PRX 108.
[0084] According to some embodiments, CR 102 may be configured to electrically steer one or more reflectors such as unit 116 to overcome mechanical impairments and drifts due to thermal and gain effects or otherwise. For example, one or more units 116 may move differently than intended (frequency, rate, speed etc.) and their movement may be compensated for by electrically controlling the reflectors appropriately.
[0085] According to some embodiments, PTX 104 may be configured to produce pulses of inspection photons. PTX 104 may include a laser or alternative light source such as laser 118. Laser 118 may be a laser such as a solid state laser, a high power laser or otherwise, or an alternative light source such as a LED based light source or otherwise.
[0086] According to some embodiments, PTX 104 may include additional elements, shown as TX elements 120, which may include a collimator, controller, feedback controller/signals and more.
[0087] According to some embodiments, PRX 108 may be configured to receive photons reflected back from an object or scene element of scene 106 and produce a detected scene signal. PRX 108 may include a detector such as detector 110. Detector 110 may be configured to detect the reflected photons reflected back from an object or scene element and produce a detected scene signal. PRX 108 may include additional elements shown in RX elements 122 which may include a module to position a singular scanned pixel window onto/in the direction of detector 110.
[0088] According to some embodiments, the detected scene signal may include information such as: time of flight, which is indicative of the difference between the time a photon was emitted and the time it was detected after reflection from an object; reflected intensity; polarization values and more.
[0089] According to some embodiments, scanning device 100 may be a monostatic scanning system where PTX 104 and PRX 108 have a joint optical path; for example, scene 106 may be a common path, as well as CR 102 which, as described above, may be configured to direct pulses of inspection photons from PTX 104 in the direction of an inspected scene and to steer reflection photons from the scene back to PRX 108.
[0090] Turning to Fig. A2A, shown is a side view of a plurality of steerable reflector units 200; each unit may be substantially similar to unit 116 of Fig. A1A. Each unit may include a reflective surface such as mirror 202, mirror 204, mirror 206, mirror 208 and mirror 210 associated with an actuator such as actuator 212, actuator 214, actuator 216, actuator 218, and actuator 220 (respectively). Actuators 212-220 may alternatively be termed cantilevers or benders. Mirrors 202-210 may be any reflective surface, for example, made from polished gold, aluminum, silicon, silver, or otherwise. Mirrors 202-210 may be identical or different reflective surfaces varying in size and/or material. Actuators 212-220 may be electrically controllable electromechanical actuators such as stepper motors, direct current motors, galvanometric actuators, electrostatic, magnetic or piezo elements, thermal based actuators or otherwise. Each actuator 212-220 may cause movement in a mirror support or spring such as segments 222-230.
[0091] According to some embodiments, each actuator 212-220 may be a separate actuator or may be a joined actuator for two or more mirrors; for example, if actuator 214 and actuator 216 are a single actuator, then mirrors 204 and 206 may move together. Alternatively, two or more actuators may be controlled to operate substantially in conjunction with each other. It is understood that a sub group of mirrors and actuators operating in unison or with a shared actuator may form a steerable reflector such as SR1 of Fig. A1A, or that a single unit may be a steerable reflector in and of itself.
[0092] According to some embodiments, mirrors operating in a transmission path may have a first angle or a phase shift compared to mirrors operating in a reception path. The phase shift may remain constant across the entire scanning pattern or may exhibit a variation according to the angular position of both the mirrors in the PRX and PTX paths. Accordingly, in the example of Fig. A2A, mirrors 202, 204 and 206 are configured to be reception mirrors and are at a first angle, while mirror 210 is at a second angle and configured to be a transmission mirror. It is understood that some of the mirrors may be disabled and/or in an idle mode: (1) by being electrically or mechanically disabled, for example by being denied the dynamic electrical signal that provides power to the actuator, the mirror may remain static in a certain position that does not contribute any signal to the scanned scene; it stays in a static location either by applying a certain voltage level or by blocking the mirror by mechanical means; and/or (2) the mirror may point at or scan an orthogonal direction with respect to the scene, out of the active FOV; or (3) otherwise. Reflectors in an idle mode may serve as isolation between transmission and reception reflectors, which may improve signal to noise ratio and overall signal detection/quality of signal (QoS).
[0093] According to some embodiments, mirrors which are in a transmission state, reception state and/or idle state may be dynamically controllable/selectable. Turning to Fig. A2B, shown is a side view of a plurality of steerable reflector units 250, having the same mirrors 202-210 and actuators 212-220, except that in this instance mirrors 202 and 204 are configured to be transmission mirrors, mirrors 208 and 210 are configured to be reception mirrors and mirror 206 is an idle mirror.
[0094] Turning to Fig. A2C, shown is a block level diagram of steerable reflector unit 270 which is substantially similar to unit 116 of Fig. A1A. Unit 270 may include a reflective surface such as mirror 272 and an actuator such as actuator 274. It is understood that mirror 272 is substantially similar to any of mirrors 202-210 and that actuator 274 is substantially similar to actuators 212-220 of Figs. A2A & A2B. Actuator 274 may be part of or attached at one end to a support frame such as frame 278. Actuator 274 may cause movement or power to be relayed to mirror 272. Actuator 274 may include a piezo-electric layer and a semiconductor layer and, optionally, a support or base layer. Optionally, a flexible interconnect element or connector, such as spring 276, may be utilized to adjoin and relay movement from actuator 274 to mirror 272.
[0095] Turning to Fig. A3, shown is an example complex reflector such as CR 302. It is understood that CR 302 may serve as an example for CR 102 and that the discussion is applicable here as well. CR 302 may include a plurality of steerable reflectors such as SRs 304-334. While a 4X4 matrix of SRs is shown, it is understood that many matrix dimensions are applicable, such as 1X2, 1X1, 1X4, 2X8, 3X7 or otherwise.
[0096] According to some embodiments, SRs 304-334 may be dynamic SRs so that, at each point of operating a scanning device which includes CR 302, each SR 304-334 may be controllably designated as either: (a) a complex reception reflector (CRXR) (in a reception state) included in the reception path, which accordingly may steer reflected photons to a detector; (b) a complex transmission reflector (CTXR) (in a transmission state) included in the transmission path, which may steer inspection photons in the direction of a scene; or (c) an idle reflector (in an idle state). Accordingly, the same SR may be at times a CTXR and at other times a CRXR or an idle reflector.
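The dynamic designation described above, in which each SR can be placed in a reception, transmission or idle state at any point in the scan, can be sketched as follows. The class and field names are hypothetical illustrations, not identifiers from the specification.

```python
from enum import Enum

class SRState(Enum):
    TX = "transmission"  # CTXR: steers inspection photons toward the scene
    RX = "reception"     # CRXR: steers reflected photons toward the detector
    IDLE = "idle"        # disabled; may serve as isolation between Tx and Rx

class SteerableReflector:
    """One unit of a complex reflector; its state may change over time."""
    def __init__(self, sr_id: int) -> None:
        self.sr_id = sr_id
        self.state = SRState.IDLE

# A 4x4 complex reflector, dynamically partitioned: the first row transmits,
# the second row idles as isolation, and the remaining rows receive.
cr = [SteerableReflector(i) for i in range(16)]
for sr in cr[0:4]:
    sr.state = SRState.TX
for sr in cr[8:16]:
    sr.state = SRState.RX

rx_count = sum(sr.state is SRState.RX for sr in cr)  # 8 reception units
```

Reassigning the states between pulses gives the same array a different Tx/Rx split on each inspection, as the paragraph above describes.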
[0097] According to some embodiments, SRs 304-334 may be static SRs so that they are each either a CRXR or a CTXR. A sub-set out of SRs 304-334 may operate in unison as a steerable reflector in a reception path and a different sub-set out of SRs 304-334 may operate in unison as an SR in a transmission path. The sub-sets may be mechanically, electrically and/or electro-mechanically coupled to each other.
[0098] According to some embodiments, a first sub-set of CTXRs may operate in conjunction in a first phase and a second sub-set of CRXRs may operate in a second phase. The difference or delta between the first phase and the second phase may be determined based on (or to compensate for) an expected difference between transmitted and received photons. The difference between the two phases may be fixed and/or synchronized. If CR 302 is moving in a predetermined controllable path to scan a scene, then the location of CR 302 when the inspection photons are transmitted is different from its location when the reflection photons are received, so the difference in location can be compensated for by planning the second phase accordingly. Furthermore, the difference in phase is primarily utilized to separate the TX path from the RX path. The first and second subsets may oscillate at substantially the same frequency, with differences due to mechanical inaccuracies or due to compensation for mechanical inaccuracies, mechanical impairments and/or drifts. Operation of the first and second sets of reflectors may be synchronized. For example, the reflectors of the first set and reflectors of the second set may be made to oscillate at substantially the same frequency. A phase shift between reflectors of the first set and reflectors of the second set may be substantially fixed and/or otherwise synchronized. The phase shift may vary in amplitude dynamically in order to compensate for the time delay between the transmitted photonic pulses and the received reflection. The purpose is to minimize the detector sensitive area by locating the reflected laser spot in the same place on the detector for the entire period of time of flight.
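One way to read the dynamic phase shift described above is as a fixed Tx/Rx separation term plus a correction proportional to the mirror oscillation frequency and the round-trip time of flight. This formula and the numbers below are an assumption made for illustration, not a relationship stated in the specification.

```python
import math

def rx_phase_shift(mirror_freq_hz: float, tof_s: float,
                   static_shift_rad: float = 0.0) -> float:
    """Phase (radians) by which the reception mirror set lags the
    transmission set: a fixed separation between the Tx and Rx paths plus
    a dynamic term compensating for the angle the oscillating mirrors
    sweep during the round-trip time of flight."""
    dynamic = 2.0 * math.pi * mirror_freq_hz * tof_s
    return static_shift_rad + dynamic

# Mirrors oscillating at 1 kHz; a reflection returning after 400 ns needs
# only a tiny dynamic correction on top of the fixed separation phase.
shift_rad = rx_phase_shift(1_000.0, 400e-9, static_shift_rad=math.pi / 8)
```

Updating the dynamic term per expected target range keeps the returning spot on the same detector area, matching the stated purpose of minimizing the detector sensitive area.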
[0099] According to some embodiments, where SRs 304-334 are static SRs, a sub-set of SRs may be mechanically coupled so that they inherently operate in unison. The sub-set may include some or all of the SRs of the same path. Alternatively, each SR may be controlled separately and a sub-set of SRs may be controlled substantially in unison so that they operate substantially in unison; in such an example the different SRs may be electrically controlled/coupled together. Furthermore, a combination where part of a sub-group is mechanically coupled to each other is also contemplated (for example, having a shared frame or cantilever).
[00100] Turning to Figs. A4A-A4D, shown are example steering devices. Fig. A4A depicts a non-symmetric steering device 410, with a plurality of static reception steerable reflectors and a couple of off-center static transmission steerable reflectors; in this example the steerable reflectors are all of a uniform size. Fig. A4B depicts a symmetric steering device 420, with a single centered static reception steerable reflector of a first size and a plurality of static transmission steerable reflectors each of a second size. Fig. A4C depicts a non-symmetric steering device 430, with a plurality of static transmission steerable reflectors and a couple of off-center static reception steerable reflectors; in this example the steerable reflectors are all of a uniform size. Fig. A4D depicts a non-symmetric steering device 440, with a plurality of static reception steerable reflectors of varying sizes, a plurality of static transmission steerable reflectors of varying sizes and a plurality of complex transmission reflectors which may each function as a transmission, reception or idle reflector.
[00101] According to some embodiments, increasing the number of transmission (Tx) reflectors may increase the reflected photon beam spread. Decreasing the number of reception (Rx) reflectors may narrow the reception field, compensate for ambient light conditions (such as clouds, rain, fog, extreme heat and more) and improve signal to noise ratio.
[00102] In Figs. A4A - A4D, example reflectors which may be mechanically coupled are circled together. The four different examples are intended to show that any combination is applicable.
[00103] Turning to Fig. A1B, shown is an active scanning device 150 which may include or be otherwise functionally associated with one or more photonic steering assemblies such as complex reflector (CR) 152, which is meant to depict an example reflector embodiment. In this example, Tx reflector 162 is an example embodiment of SR1 112 of Fig. A1A, having a single unit. Rx reflector sub-unit 164 is an example of SR2 114 of Fig. A1A, having 8 units.
[00104] Turning to Fig. A5A, depicted is an example scanning device schematic 510. According to some embodiments, there may be provided a scene scanning device such as scanning device 512 which may be adapted to inspect regions or segments of a scene (shown here as a specific FOV being scanned) using photonic pulses (transmitted light) whose characteristics may be dynamically selected as a function of: (a) optical characteristics of the scene segment being inspected; (b) optical characteristics of scene segments other than the one being inspected; (c) scene elements present in, or within proximity of, the scene segment being inspected; (d) scene elements present in, or within proximity of, scene segments other than the one being inspected; (e) an operational mode of the scanning device; and/or (f) a situational feature/characteristic of a host platform with which the scanning device is operating. The scene scanning device may be adapted to inspect regions or segments of a scene using a set of one or more photonic transmitters 522 (including a light source such as pulse laser 514), receptors including sensors (such as detecting element 516) and/or steering assembly 524, whose configuration and/or arrangement may be dynamically selected as a function of: (a) optical characteristics of the scene segment being inspected; (b) optical characteristics of scene segments other than the one being inspected; (c) scene elements present in, or within proximity of, the scene segment being inspected; (d) scene elements present in, or within proximity of, scene segments other than the one being inspected; (e) an operational mode of the scanning device; and/or (f) a situational characteristic of a host platform with which the scanning device is operating. It is understood that steering assembly 524 may be substantially similar to CR 102 of Fig. A1A.
Active scanning device 512 may include: (a) a photonic emitter assembly 522 which produces pulses of inspection photons; (b) a photonic steering assembly 524 that directs the pulses of inspection photons to/from the inspected scene segment; (c) a photonic detector assembly 516 to detect inspection photons reflected back from an object within an inspected scene segment; and (d) a controller to regulate operation of the photonic emitter assembly, the photonic steering assembly and the photonic detection assembly in a coordinated manner, and in accordance with scene segment inspection characteristics of the present invention at least partially received from internal feedback of the scanning device, so that the scanning device is a closed loop dynamic scanning device. A closed loop scanning device is characterized by having feedback from at least one of the elements and updating one or more parameters based on the received feedback. A closed loop system may receive feedback and update the system's own operation at least partially based on that feedback. A dynamic system or element is one that may be updated during operation.
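The closed-loop behavior described above can be sketched as a controller that receives feedback from the detection path and updates an emission parameter before the next pulse. The SNR-based rule and all numbers here are illustrative assumptions, not a control law given in the specification.

```python
def update_pulse_power(current_power: float, detected_snr: float,
                       target_snr: float = 10.0, max_power: float = 1.0,
                       min_power: float = 0.05) -> float:
    """One closed-loop step: raise the emitted pulse power (normalized
    units) when the detected signal-to-noise ratio falls short of the
    target, and back it off when there is margin to spare."""
    if detected_snr < target_snr:
        return min(current_power * 1.25, max_power)
    return max(current_power * 0.9, min_power)

# Feedback from three successive detected scene signals updates the power.
power = 0.5
for snr in (4.0, 6.0, 12.0):
    power = update_pulse_power(power, snr)
```

The same feedback pattern could drive any of the pulse, detection or steering parameters regulated by the controller.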
[00105] According to some embodiments, inspection of a scene segment may include illumination of the scene segment or region with a pulse of photons (transmitted light), which pulse may have known parameters such as pulse duration, pulse angular dispersion, photon wavelength, instantaneous power, photon density at different distances from the emitter, average power, pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase, polarization and more. Inspection may also include detecting and characterizing various aspects of reflected inspection photons, which reflected inspection photons are inspection pulse photons (reflected light) reflected back towards the scanning device (or laser reflection) from an illuminated element present within the inspected scene segment (i.e. scene segment element). Characteristics of reflected inspection photons may include photon time of flight (time from emission until detection), instantaneous power (or power signature) at and during return pulse detection, average power across the entire return pulse and photon distribution/signal over the return pulse period. The reflected inspection photons are a function of the inspection photons and the scene elements they are reflected from, and so the received reflected signal is analyzed accordingly. In other words, by comparing characteristics of a photonic inspection pulse with characteristics of a corresponding reflected and detected photonic pulse, a distance and possibly a physical characteristic, such as reflected intensity, of one or more scene elements present in the inspected scene segment may be estimated. By repeating this process across multiple adjacent scene segments, optionally in some pattern such as raster, Lissajous or other patterns, an entire scene may be scanned in order to produce a map of the scene.
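The Lissajous scan pattern mentioned above can be sketched as two orthogonal sinusoidal deflections at different frequencies; the frequencies and deflection amplitudes below are hypothetical, chosen only to illustrate the shape of such a pattern.

```python
import math

def lissajous_angles(t_s: float, fx_hz: float = 80.0, fy_hz: float = 7.0,
                     amp_x_deg: float = 20.0, amp_y_deg: float = 10.0):
    """Steering angles (degrees) of a Lissajous scan at time t: a fast
    sinusoidal oscillation on one axis and a slow one on the other."""
    ax = amp_x_deg * math.sin(2.0 * math.pi * fx_hz * t_s)
    ay = amp_y_deg * math.sin(2.0 * math.pi * fy_hz * t_s)
    return ax, ay

# Sampling the pattern over time traces out directions covering the FOV.
trace = [lissajous_angles(i / 10_000.0) for i in range(1_000)]
```

A raster pattern would replace the slow-axis sinusoid with a stepped sweep; either way, each sampled direction corresponds to one inspected scene segment.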
[00106] Turning to Fig. A5B, depicted is an example bistatic scanning device schematic 550. It is understood that scanning device 562 is substantially similar to scanning device 512. However, scanning device 512 is a monostatic scanning device while scanning device 562 is a bistatic scanning device. Accordingly, steering element 574 is comprised of two steering elements: steering element 571 for PTX and steering element 573 for PRX. The rest of the discussion relating to scanning device 512 of Fig. A5A is applicable to scanning device 562 of Fig. A5B.
[00107] Turning to Fig. A5C, depicted is an example scanning device with a plurality of photonic transmitters 522 and a plurality of detectors 516, all having a joint steering element 520. It is understood that scanning device 587 is substantially similar to scanning device 512. However, scanning device 587 is a monostatic scanning device with a plurality of transmitting and receiving elements. The rest of the discussion relating to scanning device 512 of Fig. A5A is applicable to scanning device 587 of Fig. A5C.
[00108] Turning to Fig. A6, depicted is an example scanning system 600 in accordance with some embodiments. Scanning system 600 may be configured to operate in conjunction with a host device 628 which may be a part of system 600 or may be associated with system 600. Scanning system 600 may include a scene scanning device such as scanning device 604 adapted to inspect regions or segments of a scene using photonic pulses whose characteristics may be dynamically selected. Scanning device 604 may include a photonic emitter assembly (PTX) such as PTX 606 to produce pulses of inspection photons. PTX 606 may include a laser or alternative light source. The light source may be a laser, such as a solid state laser or a high power laser, or an alternative light source such as an LED based light source or otherwise. Scanning device 604 may be an example embodiment for scanning device 512 of Fig. A5A and/or scanning device 562 of Fig. A5B and/or scanning device 587 of Fig. A5C, and the discussion of those scanning devices is applicable to scanning device 604.
[00109] According to some embodiments, the photon pulses may be characterized by one or more controllable pulse parameters such as: pulse duration, pulse angular dispersion, photon wavelength, instantaneous power, photon density at different distances from the emitter, average power, pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase, polarization and more. The inspection photons may be controlled so that they vary in pulse duration, pulse angular dispersion, photon wavelength, instantaneous power, photon density at different distances from the emitter, average power, pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase, polarization and more. The photon pulses may vary between each other and the parameters may change during the same signal. The inspection photon pulses may be pseudo random, chirp sequences and/or may be periodical or fixed and/or a combination of these. The inspection photon pulses may be characterized as: sinusoidal, chirp sequences, step functions, pseudo random signals, linear signals or otherwise.
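The controllable pulse parameters listed above may be represented, for illustration only, as a simple parameter record; the parameter names, units and values below are assumptions and are not taken from the disclosure:

```python
from dataclasses import dataclass, replace

@dataclass
class PulseParameters:
    """Illustrative subset of controllable inspection pulse parameters."""
    duration_ns: float
    repetition_rate_hz: float
    peak_power_w: float
    wavelength_nm: float

default_pulse = PulseParameters(
    duration_ns=5.0,
    repetition_rate_hz=100_000.0,
    peak_power_w=20.0,
    wavelength_nm=905.0,
)

# Parameters may vary from pulse to pulse; for example, peak power may
# be raised for one pulse while all other parameters stay fixed.
boosted_pulse = replace(default_pulse, peak_power_w=40.0)
print(boosted_pulse.peak_power_w)  # 40.0
```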
[00110] According to some embodiments, scanning device 604 may include a photonic reception and detection assembly (PRX) such as PRX 608 to receive reflected photons reflected back from an object or scene element and produce detected scene signal 610. PRX 608 may include a detector such as detector 612. Detector 612 may be configured to detect the reflected photons reflected back from an object or scene element and produce detected scene signal 610.
[00111] According to some embodiments, detected scene signal 610 may include information such as: time of flight, which is indicative of the difference between the time a photon was emitted and the time it was detected after reflection from an object, reflected intensity, polarization values and more.
[00112] According to some embodiments, scanning device 604 may be a monostatic scanning system where PTX 606 and PRX 608 have a joint optical path. Scanning device 604 may include a photonic steering assembly (PSY), such as PSY 616, to direct pulses of inspection photons from PTX 606 in a direction of an inspected scene and to steer reflection photons from the scene back to PRX 608. PSY 616 may also be in charge of positioning the singular scanned pixel window onto/in the direction of detector 612.
[00113] According to some embodiments, PSY 616 may be a dynamic steering assembly and may be controllable by steering parameters control 618. Example steering parameters may include: a scanning method that defines the acquisition pattern and sample size of the scene, power modulation that defines the range accuracy of the acquired scene, correction of axis impairments based on collected feedback and reliability confirmation, and definition of which sub-sections are CRXR and which are CTXR.
[00114] According to some embodiments, PSY 616 may include: (a) a single dual-axis MEMS mirror; (b) dual single-axis MEMS mirrors; (c) a mirror array where multiple mirrors are synchronized in unison and act as a single large mirror; (d) a split mirror array with separate transmission and reception; and/or (e) a combination of these and more.
[00115] According to some embodiments, part of the array may be used for the transmission path and the second part of the array may be used for the reception path. The transmission mirrors may be synchronized and the reception mirrors may be synchronized separately from the transmission mirrors. The transmission mirror and reception mirror sub-arrays may maintain an angular shift between themselves in order to steer the beam into separate ports, essentially integrating a circulator module.
[00116] According to some embodiments, PSY 616 may include one or more PSY state sensors to produce a signal indicating an operational state of PSY 616, for example power information or temperature information, reflector state, reflector actual axis positioning, reflector mechanical state, reflector operative state (transmission state, reception state or idle state) and more.
[00117] According to some embodiments, PSY 616 may include one or more steerable reflectors, each of which may include a reflective surface associated with an electrically controllable actuator. PSY 616 may include or be otherwise associated with one or more microelectromechanical systems (MEMS) mirror assemblies or an array with a plurality of steerable reflectors. A photonic steering assembly according to refractive embodiments may include one or more refractive materials whose index of refraction may be electrically modulated, either by inducing an electric field around the material or by applying electromechanical vibrations to the material. A PSY 616 complex reflector may include two or more CCRs steerable separately or dependent on each other. Furthermore, the complex reflector may include one or more dynamic CCRs, which same complex reflector may be controllable to switch between a transmission, reception and/or idle mode. Accordingly, control signals to PSY 616 may also control a transmission phase and/or a reception phase; a single phase from which the transmission and/or reception phases are both derived; specific phases for each complex reflector, or for each CCR; and mode selection for dynamic reflectors (transmission, reception and/or idle) and/or frequency parameters.
[00118] According to some embodiments, scanning device 604 may include a controller, such as controller 620. Controller 620 may receive scene signal 610 from detector 612 and may control PTX 606, PSY 616 and PRX 608, including detector 612, based on information stored in the controller memory 622 as well as received scene signal 610, including accumulated information from a plurality of scene signals 610 received over time.
[00119] According to some embodiments, controller 620 may process scene signal 610, optionally with additional information and signals, and produce a vision output such as vision signal 624 which may be relayed/transmitted to an associated host device. Controller 620 may receive detected scene signal 610 from detector 612; optionally, scene signal 610 may include time of flight values and intensity values of the received photons. Controller 620 may build up a point cloud or a 3D or 2D representation for the FOV by utilizing digital signal processing, image processing and computer vision techniques.
[00120] According to some embodiments, controller 620 may include situational assessment logic or circuitry such as situational assessment logic (SAL) 626. SAL 626 may receive detected scene signal 610 from detector 612 as well as information from additional blocks/elements either internal or external to scanning device 604.
[00121] According to some embodiments, scene signal 610 may be assessed and calculated, with or without additional feedback signals such as PSY feedback, PTX feedback, PRX feedback, host feedback and information stored in memory 622, according to a weighted means of local and global cost functions that determine a work plan such as work plan signal 634 for scanning device 604 (such as: which pixels in the FOV are scanned, at which laser parameters budget, and at which detector parameters budget). Accordingly, controller 620 may be a closed loop dynamic controller that receives system feedback and updates the system's operation based on that feedback.
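A weighted combination of local and global cost functions determining a work plan may be sketched as follows; the cost terms, weights and candidate regions are illustrative assumptions only:

```python
def work_plan_score(local_costs, global_costs, w_local=0.5, w_global=0.5):
    """Combine local and global cost terms into a single weighted score;
    a lower score marks a more attractive scanning option."""
    local = sum(local_costs) / len(local_costs)
    glob = sum(global_costs) / len(global_costs)
    return w_local * local + w_global * glob

# Decide which FOV region to scan next by comparing weighted scores.
candidates = {
    "region_a": work_plan_score([0.2, 0.4], [0.1]),
    "region_b": work_plan_score([0.8, 0.9], [0.3]),
}
print(min(candidates, key=candidates.get))  # region_a
```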
[00122] According to some embodiments, SAL 626 may receive one or more feedback signals from PSY 616 via PSY feedback 630. PSY feedback 630 may include the instantaneous position of PSY 616, where PSY 616 may include one or more reflecting elements and each reflecting element may contain one or more axes of motion. It is understood that the instantaneous position may be defined or measured in one or more dimensions. Typically, PSYs have an expected position; however, PSY 616 may produce an internal signal measuring the instantaneous position (meaning, the actual position). Providing such feedback may be utilized by situational assessment logic 626 for calculating drift and offset parameters in the PRX and/or for correcting steering parameters control 618 of PSY 616 to correct an offset. Furthermore, PSY feedback 630 may indicate a mechanical failure, which may be relayed to host 628, which may either compensate for the mechanical failure or be controlled to avoid an accident due to the mechanical failure.
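The use of instantaneous (actual) position feedback to correct a steering offset may be sketched as follows; the proportional gain and sign convention are illustrative assumptions:

```python
def steering_offset_correction(expected_deg, measured_deg, gain=0.5):
    """Return a correction term for the steering parameters.

    `measured_deg` is the instantaneous (actual) mirror position from
    PSY feedback; only a fraction of the observed drift is fed back on
    each update to avoid over-correcting.
    """
    drift = measured_deg - expected_deg
    return -gain * drift

# The mirror trails its expected position by 0.2 degrees, so a positive
# correction nudges it back toward the expected position.
print(round(steering_offset_correction(10.0, 9.8), 3))  # 0.1
```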
[00123] According to some embodiments SAL 626 may select and operate array reflectors. SAL 626 may dynamically select a first set of array reflectors to use as part of the PTX, and may select a second set of reflectors to use as part of the PRX.
[00124] According to further embodiments, SAL 626 may increase a number of reflectors in the first set in order to increase inspection pulse (TX beam) spread. SAL 626 may also decrease a number of reflectors in the second set in order to narrow the RX FOV and/or to compensate for background noise or ambient light conditions. Sub control circuits may be included in PSY 616.
[00125] According to some embodiments, PSY feedback 630 may include instantaneous scanning speed of PSY 616. PSY 616 may produce an internal signal measuring the instantaneous speed (meaning, the actual speed and not the estimated or anticipated speed). Providing such feedback may be utilized by situational assessment logic 626 for calculating drift and offset parameters in the PRX and/or for correcting steering parameters control 618 of PSY 616 to correct an offset. The frequency may be for a single CR or for a CCR and more.

[00126] According to some embodiments, PSY feedback 630 may include instantaneous scanning frequency of PSY 616. PSY 616 may produce an internal signal measuring the instantaneous frequency (meaning, the actual frequency and not the estimated or anticipated frequency). Providing such feedback may be utilized by situational assessment logic 626 for calculating drift and offset parameters in the PRX and/or for correcting steering parameters control 618 of PSY 616 to correct an offset. The instantaneous frequency may be relative to one or more axes.
[00127] According to some embodiments, PSY feedback 630 may include mechanical overshoot of PSY 616, which represents a mechanical de-calibration error from the expected position of the PSY in one or more axes. PSY 616 may produce an internal signal measuring the mechanical overshoot. Providing such feedback may be utilized by situational assessment logic 626 for calculating drift and offset parameters in the PRX and/or for correcting steering parameters control 618 of PSY 616 to correct an offset. PSY feedback may also be utilized in order to correct steering parameters in case of vibrations induced by the Lidar system or by external factors such as vehicle engine vibrations or road induced shocks.
[00128] According to some embodiments, PSY feedback 630 may be utilized to correct steering parameters 618 so as to correct the scanning trajectory and linearize it. The raw scanning pattern may typically be non-linear to begin with and may contain artifacts resulting from fabrication variations and the physics of the MEMS mirror or reflective elements. Mechanical impairments may be static, for example a variation in the curvature of the mirror, or dynamic, for example mirror warp/twist at the scanning edge of motion. Correction of the steering parameters to compensate for these non-linearizing elements may be utilized to linearize the PSY scanning trajectory.
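Linearizing a non-linear scanning trajectory against per-device calibration data may be sketched as follows; the calibration table values and the piecewise-linear inversion are illustrative assumptions:

```python
# Per-device calibration table: commanded drive voltage (V) against the
# measured mirror deflection (degrees). Values are invented for the sketch.
CALIBRATION = [(0.0, 0.0), (10.0, 3.5), (20.0, 8.0), (30.0, 14.0)]

def voltage_for_deflection(target_deg: float) -> float:
    """Invert the non-linear calibration curve by piecewise-linear
    interpolation, so that a linear sequence of target angles yields
    a linearized scanning trajectory."""
    for (v0, d0), (v1, d1) in zip(CALIBRATION, CALIBRATION[1:]):
        if d0 <= target_deg <= d1:
            frac = (target_deg - d0) / (d1 - d0)
            return v0 + frac * (v1 - v0)
    raise ValueError("target deflection outside calibrated range")

print(voltage_for_deflection(11.0))  # 25.0
```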
[00129] According to some embodiments, SAL 626 may receive one or more signals from memory 622. Information received from the memory may include a laser power budget (defined by eye safety limitations, thermal limitations, reliability limitations or otherwise); electrical operational parameters such as current and peak voltages; and calibration data such as expected PSY scanning speed, expected PSY scanning frequency, expected PSY scanning position and more.

[00130] According to some embodiments, steering parameters of PSY 616, detector parameters of detector 612 and/or pulse parameters of PTX 606 may be updated based on the calculated/determined work plan 634. Work plan 634 may be tracked and determined at specific time intervals and with an increasing level of accuracy and refinement of feedback signals (such as 630 and 632).
[00131] Turning to Fig. A7, shown is a flowchart associated with a method of scanning a scene 700. A photonic pulse may be emitted (702) and a reflected pulse may be received (704) and detected (706). The pulses may be steered in a joint path: the photonic pulse steered toward the scene, the pulse characterized by a first phase, and the reflected pulse steered from the scene toward the detector, the reflected pulse characterized by a second phase (708). Based on system feedback such as the detected signal, host information, steering feedback and more (710), the steering parameters may be updated, including correcting the first or second phase, oscillating frequency and more (712).
[00132] According to some embodiments, a steering state may be programmable/adjustable in which case the initial state is determined and may be updated based on the feedbacks (714).
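The closed-loop flow of steps (702)-(714) may be sketched as follows; the feedback keys, the correction gain and the state variables are illustrative assumptions:

```python
class SteeringLoop:
    """Minimal closed-loop sketch: steering parameters (here a phase and
    an oscillating frequency) are updated from steering feedback."""

    def __init__(self, phase=0.0, freq_hz=1000.0):
        self.phase = phase      # pulse phase parameter
        self.freq_hz = freq_hz  # oscillating frequency

    def update(self, steering_feedback):
        # Apply a small fraction of the observed errors per iteration.
        self.phase += 0.1 * steering_feedback.get("phase_error", 0.0)
        self.freq_hz += 0.1 * steering_feedback.get("freq_error", 0.0)

loop = SteeringLoop()
loop.update({"phase_error": -0.5, "freq_error": 2.0})
print(round(loop.phase, 3), round(loop.freq_hz, 1))  # -0.05 1000.2
```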
[00133] Aspects of the present invention may include methods, circuits, devices, assemblies, systems and functionally associated machine executable code for active scene scanning, including devices for controllably steering an optical beam. According to some embodiments, a scanning device may analyze a changing scene to determine/detect scene elements. When used in conjunction with a host such as a vehicle platform and/or a drone platform, the scanning device may provide a detected scene output. The host device may utilize a detected scene output or signal from the scanning device to automatically steer or operate or control the host device. Furthermore, the scanning device may receive information from the host device and update the scanning parameters accordingly. Scanning parameters may include: pulse parameters, detector parameters, steering parameters and/or otherwise. For example, a scanning device may detect an obstruction ahead and may cause the host to steer away from the obstruction. In another example, the scanning device may utilize a turning of a steering wheel and update the scanning device to analyze the area in front of the upcoming turn. Alternatively, if a host device is a drone, a signal indicating that the drone is intended to land may cause the scanning device to analyze the scene for landing requirements instead of flight requirements.
[00134] For clarity, a light source throughout this application has been termed a "laser"; however, it is understood that alternative light sources that do not fall under the technical definition of a laser may replace a laser wherever one is discussed, for example a light emitting diode (LED) based light source or otherwise. Accordingly, a Lidar may actually include a light source which is not necessarily a laser.
[00135] For clarity, a sensing signal or an electrical sensing signal may be: (a) a current, (b) a voltage, (c) a current frequency, (d) a voltage frequency, or (e) an electrical charge; any other electrical characteristic (such as capacitance, resistivity and more) is also applicable and understood. Accordingly, any embodiments detailing a current may include any of the other options detailed herein.
[00136] Turning to Fig. B1, shown is a steering device 100 which may be associated with or part of a scanning device. According to some embodiments, steering device 100 may include one or more reflective surfaces such as mirror 102, or a mirror base structure that can be attached to an external mirror assembly. Mirror 102 may be any reflective surface, for example, made from polished gold, aluminum, silicon, silver, or otherwise. Each such reflective surface may be associated with an electrically controllable electromechanical actuator/cantilever/bender such as actuator 104. Actuator 104 may be a stepper motor, a direct current motor, a galvanometric actuator, electrostatic, magnetic or piezo elements, a thermal based actuator or otherwise. Optionally, actuator 104 may include a piezo-electric layer and a semiconductor layer and, optionally, a support or base layer. Actuator 104 may be connected to a support frame 108 and may further cause movement or power to be relayed to a flexible interconnect element or connector, such as spring 106. Spring 106 may be utilized to adjoin actuator 104 to mirror 102. Actuator 104 may include two or more electrical contacts such as contacts 110 and 112.
[00137] According to some embodiments, steering device 100 may include a single dual-axis mirror or dual single-axis mirrors or otherwise. According to some embodiments, actuator 104 may be a partially conductive element or may include embedded conductive circuitry. According to preferred embodiments, actuator 104 may include a semi-conductive layer which may be doped to have controllable conductive characteristics, as can be achieved with silicon and similar materials. Accordingly, actuator 104 may be designed to be conductive in some sections and isolated (or function as isolation) in others. Conductivity may be achieved by doping a silicon based actuator, for example. Optionally, instead of doping actuator 104, actuator 104 may include a conductive element which may be adhered or otherwise mechanically or chemically connected to a non-conducting (or isolated, or functioning as isolation) base layer of the actuator.
[00138] According to some embodiments, one of the contacts, such as contact 110, may be coupled to an electrical source 114 and may be utilized to provide electrical current, voltage and/or power to actuator 104. Contact 112 may be connected to a sensor 116 and may be used as an electrical sensing contact to measure one or more parameters of a sensing circuit. A parameter of a sensing circuit may include: current, voltage, current frequency, voltage frequency, capacitance, resistivity/resistance and/or charge and more. Sensor 116 may include electrical elements or logic circuitry and more. Electrical source 114 and/or sensor 116 may be external or included in steering device 100 and/or an associated scanning device. Optionally, steering device 100 may include contacts/inputs to connect to an external power source 114 and/or an external sensor 116. Furthermore, it is understood that contacts 110 and 112 are interchangeable, so that contact 110 may be connected to a sensor 116 and contact 112 may be connected to a power source 114.
[00139] According to some embodiments, actuator 104 may cause mirror 102 to move in a first direction; optionally, actuator 104 may be configured to cause mirror 102 to move in two directions (forward and backward, for example). Optionally, one or more actuators may be utilized so that mirror 102 may move in a first range of directions represented by Θ and one or more additional actuators may be utilized to cause mirror 102 to move in a second range of directions represented by Φ. Optionally, the first and second ranges/directions are orthogonal to each other.
[00140] According to some embodiments, mirror 102 may include a mirror base structure support and the reflective elements may be adhered or otherwise mechanically or chemically connected to the mirror base structure support.

[00141] According to some embodiments, sensor 116 may detect a mechanical breakdown or failure or may sense a mechanical deflection to indicate an actual position of mirror 102.
[00142] According to some embodiments, steering device 100 may be associated with a controller and a scanning device. The associated controller may utilize detector feedback to determine if steering device 100 has a mechanical breakdown or failure and/or to compare an actual position of steering device 100 with an expected position. Optionally, the scanning device may correct steering device 100 positioning based on the feedback or relay to a host device that a mechanical breakdown has occurred.
[00143] Turning to Fig. B2, shown is an example embodiment of steering device 202 and a central processing unit (CPU) such as controller 204, which may be local and included within steering device 202 or may be a general controller of an associated scanning device. Shown is an example mirror configuration including mirror 206, which can be moved in two or more axes (θ, φ). Understood from this figure in combination with Figs. B4B and B4C is also a single axis embodiment or a round embodiment. Mirror 206 may be associated with an electrically controllable electromechanical driver such as actuation driver 208. Actuation driver 208 may cause movement or power to be relayed to an actuator/cantilever/bender such as actuator 210. Actuator 210 may be part of a support frame such as frame 211 or they may be interconnected. Additional actuators such as actuators 212, 214 and 216 may each be controlled/driven by additional actuation drivers as shown, and may each have a support frame 213, 215 and 217 (appropriately). It is understood that frames 211, 213, 215 and/or 217 may comprise a single frame supporting all of the actuators or may be a plurality of interconnected frames. Furthermore, the frames may be electrically separated by isolation (isn) elements or sections (as shown). Optionally, a flexible interconnect element or connector, such as spring 218, may be utilized to adjoin actuator 210 to mirror 206, to relay power or movement from actuation driver 208 to mirror 206. Actuator 210 may include two or more electrical contacts such as contacts 210A, 210B, 210C and 210D. Optionally, one or more of contacts 210A, 210B, 210C and/or 210D may be situated on frame 211 or actuator 210, provided that frame 211 and actuator 210 are electronically connected.
According to some embodiments, actuator 210 may be a semi-conductor which may be doped so that actuator 210 is generally conductive between contacts 210A-210D and isolative in isolations 220 and 222, to electronically isolate actuator 210 from actuators 212 and 216 (respectively). Optionally, instead of doping the actuator, actuator 210 may include a conductive element which may be adhered or otherwise mechanically or chemically connected to actuator 210, in which case isolation elements may be inherent in the areas of actuator 210 that do not have a conductive element adhered to them. Actuator 210 may include a piezo-electric layer so that current flowing through actuator 210 may cause a reaction in the piezo-electric section, which may cause actuator 210 to controllably bend.
[00144] According to some embodiments, CPU 204 may output/relay to mirror driver 224 a desired angular position described by θ, φ parameters. Mirror driver 224 may be configured to control movement of mirror 206 and may cause actuation driver 208 to push a certain voltage amplitude to contacts 210C and 210D in order to attempt to achieve specific requested θ, φ deflection values of mirror 206 based on bending of actuators 210, 212, 214 and 216 (appropriate operation of the actuation drivers shown for the additional actuators is understood and discussed below).
[00145] According to some embodiments, position feedback control circuitry may be configured to supply an electrical source (such as voltage or current) to a contact such as contact 210A (or 210B), and the other contact such as 210B (or 210A, appropriately) may be connected to a sensor within position feedback 226, which may be utilized to measure one or more electrical parameters of actuator 210 to determine a bending of actuator 210 and, appropriately, an actual deflection of mirror 206.
[00146] According to some embodiments, as shown, additional position feedback similar to position feedback 226 and an additional actuation driver similar to actuation driver 208 may be replicated for each of actuators 212-216, and mirror driver 224 and CPU 204 may control those elements as well so that a mirror deflection is controlled for all directions. The actuation drivers, including actuation driver 208, may push forward a signal that causes an electro-mechanical reaction in actuators 210-216, each of which, in turn, is sampled for feedback. The feedback on the actuators' (210-216) positions serves as a signal to mirror driver 224, enabling it to converge efficiently towards the desired position θ, φ set by CPU 204, correcting a requested value based on a detected actual deflection.
[00147] According to some embodiments, a scanning device or LiDAR may utilize piezoelectric actuator microelectromechanical systems (MEMS) mirror devices for deflecting a laser beam scanning a field of view (FOV). Mirror 206 deflection is a result of voltage potential/current applied to the piezoelectric element that is built up on actuator 210. Mirror 206 deflection is translated into an angular scanning pattern that may not behave in a linear fashion; a certain voltage level applied to actuator 210 does not translate to a constant displacement value. A scanning LiDAR system where the FOV dimensions are deterministic and repeatable across different devices is optimally realized using a closed loop method that provides angular deflection feedback from position feedback and sensor 226 to mirror driver 224 and/or CPU 204.
[00148] Turning to Fig. B3, shown is an example actuator-mirror depiction 300 in accordance with some embodiments. It is understood that mirror 306 may be an example embodiment of mirror 206 of Fig. B2 and that actuator 310 may be an example embodiment of actuator 210, also of Fig. B2. Actuator 310 is made of silicon and includes a PZT piezo-electric layer 311, a semi-conductive layer 313 and a base layer 315. Contacts 310A and 310B are substantially similar to contacts 210A and 210B of Fig. B2. It is depicted that the resistivity of actuator 310 may be measured in an active state (Ractive), when the mirror is deflected at a certain angular position, and compared to the resistivity at a resting state (Rrest). A feedback including Ractive may provide information to measure/determine the actual mirror deflection angle compared to an expected angle and, accordingly, mirror 306 deflection may be corrected. The physical property of the silicon (or semiconductor) based actuator 310 is based on an observable modulation of its electrical conductivity according to mechanical stresses that actuator 310 experiences. When actuator 310 is at rest, the electrical conductivity exhibited at the two contacts 310A and 310B would be Rrest. The PZT material of layer 311, if activated (by applying electrical voltage/current), would exert force on actuator 310 and cause it to bend. A bending actuator 310 experiences a mechanical force that would modify the electrical conductivity Ractive exhibited at the two contacts 310A and 310B. The difference between Rrest and Ractive is correlated by a mirror driver (such as mirror driver 224 of Fig. B2) into an angular deflection value that serves to close the loop. This method is used for dynamic tracking of the actual mirror position and may optimize response, amplitude, deflection efficiency and frequency for both linear mode and resonant mode MEMS mirror schemes.
Controlling the supply current/voltage may enable an expected Ractive, and appropriately an intended deflection, to be achieved.
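The correlation of a measured Ractive/Rrest difference into an angular deflection estimate may be sketched as follows; the linear model, gauge factor and scale are illustrative assumptions, whereas a real device would use a calibrated correlation curve:

```python
def deflection_from_resistance(r_active_ohm, r_rest_ohm,
                               gauge_factor=2.0, scale_deg=100.0):
    """Estimate mirror deflection from the actuator's resistance change.

    Bending stress modulates the semiconductor's resistivity, so the
    relative change (Ractive - Rrest) / Rrest tracks the deflection.
    """
    relative_change = (r_active_ohm - r_rest_ohm) / r_rest_ohm
    return relative_change / gauge_factor * scale_deg

# A 1% resistance rise maps to a 0.5 degree deflection in this model.
print(round(deflection_from_resistance(1010.0, 1000.0), 3))  # 0.5
```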
[00149] Returning to Fig. B2, position feedback and sensor 226 may also be utilized as a reliability feedback module. According to some embodiments, a plurality of elements may include semiconductors or conducting elements, or a layer; accordingly, actuators 210-216 could at least partially include a semi-conducting element, springs 218, 226, 228 and 230 may each include a semiconductor and so may mirror 206. Electrical power (current and/or voltage) may be supplied at a first actuator contact via position feedback 226, and position feedback 226 may sense an appropriate signal at actuator 212, 214 and/or 216 via contacts 212A or 212B, 214A or 214B and/or 216A or 216B.
[00150] Turning to Fig. B4A depicting a dual axis MEMS mirror (400), Fig. B4B depicting a single axis MEMS mirror (450) and Fig. B4C depicting a round MEMS mirror (475). It is understood that current may be able to flow from contact 410A to contact 412B (through actuator 410, then through spring 418, mirror 406, spring 426 and to actuator 412). Isolation gaps in the semiconducting frame, such as isolation 420, may cause actuators 410 and 412 to be two separate islands connected electrically through the springs and mirror or mirror base structure as described. The current flow or an associated electrical parameter (voltage, current frequency etc.) may be monitored by an associated position feedback. In case of a mechanical failure where spring 418, spring 426, actuator 410, actuator 412 and/or mirror or mirror base structure 406 is damaged, the current flow (or associated electrical parameter) through the structure would alter and change from its functional/calibrated values. In an extreme situation (for example, if a spring is broken), the current would stop completely as there is a circuit break in the electrical chain by means of a faulty element. It is understood that a plurality of contacts may be utilized to check relevant elements of modules 400 and 450, such as current flowing in additional contacts, for example in Fig. B4A current from actuator 410 to actuator 414 via contact 414B. As is well known in electronics, a plurality of elements defining circuits may be controlled so that the circuits can be checked simultaneously or serially. Furthermore, monitoring for a breakdown may be carried out periodically or continuously.
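The electrical continuity check for detecting a mechanical failure may be sketched as follows; the tolerance and status labels are illustrative assumptions:

```python
def check_chain(measured_current_a, calibrated_current_a, tolerance=0.2):
    """Classify the state of a spring/actuator/mirror electrical chain.

    A broken element opens the circuit entirely (no current), while a
    damaged element shifts the current away from its calibrated value.
    """
    if measured_current_a == 0.0:
        return "circuit break"  # e.g. a snapped spring
    drift = abs(measured_current_a - calibrated_current_a)
    if drift > tolerance * calibrated_current_a:
        return "possible mechanical damage"
    return "ok"

print(check_chain(0.0, 1.0e-3))      # circuit break
print(check_chain(0.95e-3, 1.0e-3))  # ok
print(check_chain(0.5e-3, 1.0e-3))   # possible mechanical damage
```

Such a check may be run serially or simultaneously over several contact pairs, matching the periodic or continuous monitoring described above.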
[00151] Turning to Fig. B5A, depicted is an example monostatic scanning device schematic 510. According to some embodiments, there may be provided a scene scanning device such as scanning device 512 which may be adapted to inspect regions or segments of a scene (shown here is a specific field of view (FOV) being scanned) using photonic pulses (transmitted light) whose characteristics may be dynamically selected as a function of: (a) optical characteristics of the scene segment being inspected; (b) optical characteristics of scene segments other than the one being inspected; (c) scene elements present or within proximity of the scene segment being inspected; (d) scene elements present or within proximity of scene segments other than the one being inspected; (e) an operational mode of the scanning device; and/or (f) a situational feature/ characteristic of a host platform with which the scanning device is operating. The scene scanning device may be adapted to inspect regions or segments of a scene using a set of one or more photonic transmitters 522 (including a light source such as pulse laser 514), receptors including sensors (such as detecting element 516) and/or steering assemblies 524 (which may include steering element 520); whose configuration and/or arrangement may be dynamically selected as a function of: (a) optical characteristics of the scene segment being inspected; (b) optical characteristics of scene segments other than the one being inspected; (c) scene elements present or within proximity of the scene segment being inspected; (d) scene elements present or within proximity of scene segments other than the one being inspected; (e) an operational mode of the scanning device; and/or (f) a situational characteristic of a host platform with which the scanning device is operating. It is understood that steering assembly 524 may be substantially similar to steering device 202 of Fig. B2. 
Active scanning device 512 may include: (a) a photonic emitter assembly 522 which produces pulses of inspection photons; (b) a photonic steering assembly 524 that directs the pulses of inspection photons to/from the inspected scene segment; (c) a photonic detector assembly 516 to detect inspection photons reflected back from an object within an inspected scene segment; and (d) a controller to regulate operation of the photonic emitter assembly, the photonic steering assembly and the operation of the photonic detection assembly in a coordinated manner and in accordance with scene segment inspection characteristics of the present invention at least partially received from internal feedback of the scanning device, so that the scanning device is a closed loop dynamic scanning device. A closed loop scanning device is characterized by having feedback from at least one of the elements and updating one or more parameters based on the received feedback. A closed loop system may receive feedback and update the system's own operation at least partially based on that feedback. A dynamic system or element is one that may be updated during operation.
[00152] According to some embodiments, inspection of a scene segment may include illumination of the scene segment or region with a pulse of photons (transmitted light), which pulse may have known parameters such as pulse duration, pulse angular dispersion, photon wavelength, instantaneous power, photon density at different distances from the emitter, average power, pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase, polarity and more. Inspection may also include detecting and characterizing various aspects of reflected inspection photons, which reflected inspection photons are inspection pulse photons (reflected light) reflected back towards the scanning device (or laser reflection) from an illuminated element present within the inspected scene segment (i.e. scene segment element). Characteristics of reflected inspection photons may include photon time of flight (time from emission till detection), instantaneous power (or power signature) at and during return pulse detection, average power across the entire return pulse and photon distribution/signal over the return pulse period. The reflected inspection photons are a function of the inspection photons and the scene elements they are reflected from, and so the received reflected signal is analyzed accordingly. In other words, by comparing characteristics of a photonic inspection pulse with characteristics of a corresponding reflected and detected photonic pulse, a distance and possibly a physical characteristic such as reflected intensity of one or more scene elements present in the inspected scene segment may be estimated. By repeating this process across multiple adjacent scene segments, optionally in some pattern such as raster, lissajous or other patterns, an entire scene may be scanned in order to produce a map of the scene.
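The distance estimate from photon time of flight mentioned above follows directly from the speed of light and the round trip of the pulse. A minimal sketch (the function name is an assumption):

```python
# Round-trip time-of-flight to distance; the only physical input is c.

C_M_PER_S = 299_792_458.0  # speed of light in vacuum

def tof_to_distance_m(time_of_flight_s):
    """Distance to the reflecting scene element (half the round trip)."""
    return C_M_PER_S * time_of_flight_s / 2.0

# A reflection detected ~667 ns after emission lies at roughly 100 m.
distance = tof_to_distance_m(667e-9)
```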
[00153] The definition of the term scene according to embodiments of the present invention may vary from embodiment to embodiment, depending on the specific intended application of the invention. For Lidar applications, optionally used with a motor vehicle platform/host and/or drone platform/host, the term scene may be defined as the physical space, up to a certain distance, in front of, behind, below and/or on the sides of the vehicle and/or generally in the vicinity of the vehicle or drone in all directions. The term scene may also include the space behind the vehicle or drone in certain embodiments. A scene segment or scene region according to embodiments may be defined by a set of angles in a polar coordinate system, for example, corresponding to a pulse or beam of light in a given direction. The light beam/pulse having a center radial vector in the given direction may also be characterized by angular divergence values, polar coordinate ranges of the light beam/pulse and more.
[00154] Turning to Fig. B5B, depicted is an example bi-static scanning device schematic 550. !t is understood that scanning device 562 is substantially similar to scanning device 512. However, scanning device 512 is a monostatic scanning device while scanning device 562 is a bi static scanning device. Accordingly, steering element 574 is comprised of two steering elements: steering element for PTX 571 and steering element for PRX 573. The rest of the discussion relating to scanning device 512 of Fig. B5A is applicable to scanning device 562 of Fig. B5B.
[00155] Turning to Fig. B5C, depicted is an example scanning device schematic 575 with a plurality of photonic transmitters 522 and a plurality of detectors 516. All of the transmitters 522 and detectors 516 may have a joint steering element 520. It is understood that scanning device 587 is substantially similar to scanning device 512. However, scanning device 587 is a monostatic scanning device with a plurality of transmitting and receiving elements. The rest of the discussion relating to scanning device 512 of Fig. B5A is applicable to scanning device 587 of Fig. B5C.
[00156] Turning to Fig. B6, depicted is an example scanning system 600 in accordance with some embodiments. Scanning system 600 may be configured to operate in conjunction with a host device 628, which may be a part of system 600 or may be associated with system 600. Scanning system 600 may include a scene scanning device such as scanning device 604 adapted to inspect regions or segments of a scene using photonic pulses whose characteristics may be dynamically selected. Scanning device 604 may include a photonic emitter assembly (PTX) such as PTX 606 to produce pulses of inspection photons. PTX 606 may include a laser or alternative light source. The light source may be a laser such as a solid state laser, a high power laser or otherwise, or an alternative light source such as a LED based light source or otherwise. Scanning device 604 may be an example embodiment for scanning device 512 of Fig. B5A and/or scanning device 562 of Fig. B5B and/or scanning device 587 of Fig. B5C and the discussion of those scanning devices is applicable to scanning device 604.
[00157] According to some embodiments, the photon pulses may be characterized by one or more controllable pulse parameters such as: pulse duration, pulse angular dispersion, photon wavelength, instantaneous power, photon density at different distances from the emitter, average power, pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase, polarity and more. The inspection photons may be controlled so that they vary in pulse duration, pulse angular dispersion, photon wavelength, instantaneous power, photon density at different distances from the emitter, average power, pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase, polarity and more. The photon pulses may vary between each other and the parameters may change during the same signal. The inspection photon pulses may be pseudo random, chirp sequences and/or may be periodical or fixed and/or a combination of these. The inspection photon pulses may be characterized as: sinusoidal, chirp sequences, step functions, pseudo random signals, or linear signals or otherwise.
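As one illustration of a non-fixed pulse sequence such as the chirp sequences mentioned above, a pulse-to-pulse interval schedule can be generated; the linear-chirp form, the function name and the numbers are assumptions for illustration only:

```python
# A linearly shrinking pulse-to-pulse interval, one simple chirp form.

def chirp_intervals(n_pulses, start_us, step_us):
    """Intervals (microseconds) between consecutive pulses in the sequence."""
    return [start_us - i * step_us for i in range(n_pulses)]

print(chirp_intervals(5, 10.0, 1.0))  # -> [10.0, 9.0, 8.0, 7.0, 6.0]
```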
[00158] According to some embodiments, scanning device 604 may include a photonic reception and detection assembly (PRX) such as PRX 608 to receive reflected photons reflected back from an object or scene element and produce detected scene signal 610. PRX 608 may include a detector such as detector 612. Detector 612 may be configured to detect the reflected photons reflected back from an object or scene element and produce detected scene signal 610.
[00159] According to some embodiments, detected scene signal 610 may include information such as: time of flight which is indicative of the difference in time between the time a photon was emitted and detected after reflection from an object, reflected intensity, polarization values and more.
[00160] According to some embodiments, scanning device 604 may be a bi-static scanning device where PTX 606 and PRX 608 have separate optical paths, or scanning device 604 may be a monostatic scanning system where PTX 606 and PRX 608 have a joint optical path.
[00161] According to some embodiments, scanning device 604 may include a photonic steering assembly (PSY), such as PSY 616, to direct pulses of inspection photons from PTX 606 in a direction of an inspected scene and to steer reflection photons from the scene back to PRX 608. PSY 616 may also be in charge of positioning the singular scanned pixel window onto/in the direction of detector 612.
[00162] According to some embodiments, PSY 616 may be a joint PSY, and accordingly may be joint between PTX 606 and PRX 608, which may be a preferred embodiment for a monostatic scanning system.
[00163] According to some embodiments, PSY 616 may include a plurality of steering assemblies or may have several parts, one associated with PTX 606 and another associated with PRX 608.
[00164] According to some embodiments PSY 616 may be a dynamic steering assembly and may be controllable by steering parameters control 618. Example steering parameters may include: scanning method that defines the acquisition pattern and sample size of the scene, power modulation that defines the range accuracy of the acquired scene, correction of axis impairments based on collected feedback and reliability confirmation, and controlling deflection as described above.
[00165] According to some embodiments PSY 616 may include: (a) a single dual-axis MEMS mirror; (b) a dual single-axis MEMS mirror; (c) a mirror array where multiple mirrors are synchronized in unison and act as a single large mirror; (d) a split mirror array with separate transmission and reception and/or (e) a combination of these and more.
[00166] According to some embodiments, if PSY 616 includes a split MEMS array, the beam splitter may be integrated with the laser beam steering. According to further embodiments, part of the array may be used for the transmission path and the second part of the array may be used for the reception path. The transmission mirrors may be synchronized and the reception mirrors may be synchronized separately from the transmission mirrors. The transmission mirror and reception mirror sub-arrays maintain an angular shift between themselves in order to steer the beam into separate ports, essentially integrating a circulator module.
[00167] According to some embodiments, as described with regard to Figs. B1-B4C, PSY 616 may include one or more PSY state sensors to produce a signal indicating an operational state of PSY 616, for example power information or temperature information, reflector state, reflector actual axis positioning, reflector mechanical state and more, as discussed in the embodiments above.
[00168] According to some embodiments, PSY 616 may include one or more reflective surfaces, each of which may be associated with an electrically controllable electromechanical actuator. The reflective surface(s) may be made from polished gold, aluminum, silicon, silver, or otherwise. The electromechanical actuator(s) may be selected from actuators such as stepper motors, direct current motors, galvanometric actuators, electrostatic, magnetic or piezo elements or thermal based actuators. PSY 616 may include or be otherwise associated with one or more microelectromechanical systems (MEMS) mirror assemblies. A photonic steering assembly according to refractive embodiments may include one or more refractive materials whose index of refraction may be electrically modulated, either by inducing an electric field around the material or by applying electromechanical vibrations to the material.
[00169] According to some embodiments, scanning device 604 may include a controller, such as controller 620. Controller 620 may receive scene signal 610 from detector 612 and may control PTX 606, PSY 616 and PRX 608 including detector 612, based on information stored in the controller memory 622 as well as received scene signal 610, including accumulated information from a plurality of scene signals 610 received over time.
[00170] According to some embodiments, controller 620 may process scene signal 610, optionally with additional information and signals, and produce a vision output such as vision signal 624 which may be relayed/transmitted to an associated host device. Controller 620 may receive detected scene signal 610 from detector 612; optionally, scene signal 610 may include time of flight values and intensity values of the received photons. Controller 620 may build up a point cloud or 3D or 2D representation for the FOV by utilizing digital signal processing, image processing and computer vision techniques.
[00171] According to some embodiments, controller 620 may include situational assessment logic or circuitry such as situational assessment logic (SAL) 626. SAL 626 may receive detected scene signal 610 from detector 612 as well as information from additional blocks/elements either internal or external to scanning device 604.
[00172] According to some embodiments, scene signal 610 may be assessed and calculated, with or without additional feedback signals such as PSY feedback, PTX feedback, PRX feedback and host feedback and information stored in memory 622, according to a weighted mean of local and global cost functions that determine a work plan such as work plan signal 634 for scanning device 604 (such as: which pixels in the FOV are scanned, at which laser parameters budget, at which detector parameters budget). Accordingly, controller 620 may be a closed loop dynamic controller that receives system feedback and updates the system's operation based on that feedback.
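The weighted cost-function work plan described above might be sketched as follows. The split into per-pixel local costs and a frame-level global cost, the weight values and the proportional power split are all illustrative assumptions, not the claimed method:

```python
# Combine a per-pixel (local) cost with a frame-level (global) cost into a
# score, then split a finite laser power budget in proportion to the scores.

def pixel_score(local_cost, global_cost, w_local=0.7, w_global=0.3):
    """Weighted mean of local and global cost terms for one FOV pixel."""
    return w_local * local_cost + w_global * global_cost

def allocate_power(pixels, power_budget):
    """Map {pixel: (local_cost, global_cost)} to {pixel: allocated power}."""
    scores = {p: pixel_score(lc, gc) for p, (lc, gc) in pixels.items()}
    total = sum(scores.values())
    return {p: power_budget * s / total for p, s in scores.items()}

# A pixel near a suspected obstacle gets more of the budget than open road.
plan = allocate_power({"near_obstacle": (1.0, 0.5), "open_road": (0.2, 0.5)},
                      power_budget=100.0)
```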
[00173] According to some embodiments, SAL 626 may receive one or more feedback signals from PSY 616 via PSY feedback 630. PSY feedback 630 may include the instantaneous position of PSY 616, where PSY 616 may include one or more reflecting elements and each reflecting element may contain one or more axes of motion. It is understood that the instantaneous position may be defined or measured in one or more dimensions. Typically, a PSY has an expected position; however, PSY 616 may produce an internal signal measuring the instantaneous position (meaning, the actual position). Such feedback may be utilized by situational assessment logic 626 for calculating drift and offset parameters in the PRX and/or for correcting steering parameters control 618 of PSY 616 to correct an offset. Furthermore, PSY feedback 630 may indicate a mechanical failure, which may be relayed to host 628, which may either compensate for the mechanical failure or be controlled to avoid an accident due to the mechanical failure.
[00174] According to some embodiments, PSY feedback 630 may include the instantaneous scanning speed of PSY 616. PSY 616 may produce an internal signal measuring the instantaneous speed (meaning, the actual speed and not the estimated or anticipated speed). Such feedback may be utilized by situational assessment logic 626 for calculating drift and offset parameters in the PRX and/or for correcting steering parameters control 618 of PSY 616 to correct an offset.
[00175] According to some embodiments, PSY feedback 630 may include the instantaneous scanning frequency of PSY 616. PSY 616 may produce an internal signal measuring the instantaneous frequency (meaning, the actual frequency and not the estimated or anticipated frequency). Such feedback may be utilized by situational assessment logic 626 for calculating drift and offset parameters in the PRX and/or for correcting steering parameters control 618 of PSY 616 to correct an offset. The instantaneous frequency may be relative to one or more axes.
[00176] According to some embodiments, PSY feedback 630 may include the mechanical overshoot of PSY 616, which represents a mechanical de-calibration error from the expected position of the PSY in one or more axes. PSY 616 may produce an internal signal measuring the mechanical overshoot. Such feedback may be utilized by situational assessment logic 626 for calculating drift and offset parameters in the PRX and/or for correcting steering parameters control 618 of PSY 616 to correct an offset. PSY feedback may also be utilized in order to correct steering parameters in case of vibrations induced by the LiDAR system or by external factors such as vehicle engine vibrations or road-induced shocks.
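The offset correction that the feedback paragraphs above describe can be illustrated as a simple proportional adjustment of the steering command; the gain, function name and numbers are hypothetical:

```python
# Proportional correction of the steering command using measured position.
# A real controller would be tuned per device; gain=0.5 is a placeholder.

def corrected_command(commanded_deg, measured_deg, gain=0.5):
    """Adjust the next steering command against the observed position error."""
    error = measured_deg - commanded_deg
    return commanded_deg - gain * error

# Mirror measured at 9.8 degrees against a 10.0 degree command:
next_command = corrected_command(10.0, 9.8)  # nudged above 10.0
```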
[00177] According to some embodiments, PSY feedback 630 may be utilized to correct steering parameters 618 to correct the scanning trajectory and linearize it. The raw scanning pattern may typically be non-linear to begin with and may contain artifacts resulting from fabrication variations and the physics of the MEMS mirror or reflective elements. Mechanical impairments may be static, for example a variation in the curvature of the mirror, or dynamic, for example mirror warp/twist at the scanning edge of motion. Correction of the steering parameters to compensate for these non-linearizing elements may be utilized to linearize the PSY scanning trajectory.
[00178] According to some embodiments, SAL 626 may receive one or more signals from memory 622. Information received from the memory may include laser power budget (defined by eye safety limitations, thermal limitations, reliability limitations or otherwise); electrical operational parameters such as current and peak voltages; calibration data such as expected PSY scanning speed, expected PSY scanning frequency, expected PSY scanning position and more.
[00179] According to some embodiments, steering parameters of PSY 616, detector parameters of detector 612 and/or pulse parameters of PTX 606 may be updated based on the calculated/determined work plan 634. Work plan 634 may be tracked and determined at specific time intervals and with increasing level of accuracy and refinement of feedback signals (such as 630 and 632).
[00180] Turning to Fig. B7, there is shown a flowchart 700 of a method for scanning in accordance with some embodiments. A mirror may be set to a predetermined controllable deflection (712). An electrical signal indicative of an actual mechanical deflection may be detected (714) and used to determine if the predetermined deflection is substantially similar to the actual deflection (within an allowed range surrounding the predetermined deflection) (716), and if the actual deflection is substantially different from the predetermined deflection, the mirror's deflection may be corrected (718).
[00181 ] According to some embodiments, an electrical signal indicative of an electro-mechanical state of the mirror assembly may be detected (720) and compared to an expected electrical signal (722) to determine if a mechanical failure has occurred.
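One iteration of the loop in flowchart 700 might look like the following sketch, with the step numbers from the text shown as comments; the tolerance value and the callable interfaces are assumptions:

```python
# One pass through flowchart 700: set a deflection (712), read back the
# actual deflection (714), compare against an allowed range (716) and
# correct if needed (718).

DEFLECTION_TOLERANCE_DEG = 0.05  # allowed range around the set point (assumed)

def control_step(set_deflection, read_actual, apply_correction):
    target = set_deflection()                            # (712)
    actual = read_actual()                               # (714)
    if abs(actual - target) > DEFLECTION_TOLERANCE_DEG:  # (716)
        apply_correction(target - actual)                # (718)

# Mirror commanded to 5.0 degrees but actually deflected to 5.2 degrees:
control_step(lambda: 5.0, lambda: 5.2,
             lambda delta: print("correcting by", delta, "degrees"))
```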
[00182] The present invention may include methods, circuits, devices, assemblies, systems and functionally associated machine executable code for active scene scanning. According to some embodiments, a scanning device may analyze a changing scene to determine/detect scene elements. When used in conjunction with a host such as a vehicle platform and/or a drone platform, the scanning device may provide a detected scene output. The host device may utilize a detected scene output or signal from the scanning device to automatically steer or operate or control the host device. Furthermore, the scanning device may receive information from the host device and update the scanning parameters accordingly. Scanning parameters may include: adjustable pulse parameters, adjustable detector parameters, adjustable steering parameters and /or otherwise. For example, a scanning device may detect an obstruction ahead and steer the host away from the obstruction. In another example the scanning device may also utilize a turning of a steering wheel and update the scanning device to analyze the area in front of the upcoming turn or if a host device is a drone, a signal indicating that the drone is intended to land may cause the scanning device to analyze the scene for landing requirements instead of flight requirements. According to some embodiments, a scanning device may have hierarchical field of view (FOV) perception capabilities that can be shifted in space and time. These capabilities may enable high performance LiDAR across a very large FOV area by adaptive partitioning into segments of FOVs that are allocated a certain level of quality of service (QoS). It is typically impossible to assign the highest QoS for all segments, therefore the need for an adaptive allocation method will be henceforth described. QoS depends on the signal to noise ratio between the laser pulse transmitted and the laser reflection detected from the target reflection. 
Different levels of laser power may be applied in different regions in the LiDAR FOV. The levels of power may range from zero up to the maximum power that the laser device is capable of transmitting and/or receiving. QoS has limitations stemming from physical design, eye safety, thermal constraints, cost and form factor and more. Accordingly, a scanning device may be limited by one or more of the following system and/or scene features: horizontal and vertical FOV range; data acquisition rate (e.g. frame rate); resolution (e.g. number of pixels in a frame); accuracy (spatial and temporal); range (effective detection distance) and more.
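The power limits listed above suggest a simple clamp on the per-segment power request; the limit values below are placeholders, not real device specifications:

```python
# Clamp a per-segment laser power request to the tightest applicable limit.

MAX_DEVICE_POWER_W = 2.0   # what the laser device can transmit (assumed)
EYE_SAFETY_LIMIT_W = 1.5   # eye-safety constraint (assumed)
THERMAL_LIMIT_W = 1.8      # thermal constraint (assumed)

def clamp_segment_power(requested_w):
    """Power actually applied to a FOV segment, from zero up to the limits."""
    ceiling = min(MAX_DEVICE_POWER_W, EYE_SAFETY_LIMIT_W, THERMAL_LIMIT_W)
    return max(0.0, min(requested_w, ceiling))
```

Under these placeholder limits, any request above 1.5 W would be capped by the eye-safety ceiling, matching the text's point that the highest QoS cannot be assigned to all segments.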
[00183] For clarity, a light source throughout this application has been termed a "laser"; however, it is understood that alternative light sources that do not fall under the technical definition of a laser may replace a laser wherever one is discussed, for example a light emitting diode (LED) based light source or otherwise. Accordingly, a Lidar may actually include a light source which is not necessarily a laser.
[00184] Turning to Fig. C1A, depicted is an example scanning device schematic 10. According to some embodiments, there may be provided a scene scanning device such as scanning device 12 which may be adapted to inspect regions or segments of a scene (shown here is a specific FOV being scanned) using photonic pulses (transmitted light) whose characteristics may be dynamically selected as a function of: (a) optical characteristics of the scene segment being inspected; (b) optical characteristics of scene segments other than the one being inspected; (c) scene elements present or within proximity of the scene segment being inspected; (d) scene elements present or within proximity of scene segments other than the one being inspected; (e) an operational mode of the scanning device; and/or (f) a situational feature/characteristic of a host platform with which the scanning device is operating.
The scene scanning device may be adapted to inspect regions or segments of a scene using a set of one or more photonic transmitters 22 (including a light source such as pulse laser 14) , receptors including sensors (such as detecting element 16) and/or steering assemblies 24 (which may include splitter element 18 and steering element 20); whose configuration and/or arrangement may be dynamically selected as a function of: (a) optical characteristics of the scene segment being inspected; (b) optical characteristics of scene segments other than the one being inspected; (c) scene elements present or within proximity of the scene segment being inspected; (d) scene elements present or within proximity of scene segments other than the one being inspected; (e) an operational mode of the scanning device; and/or (f) a situational characteristic of a host platform with which the scanning device is operating. Active scanning device 12 may include: (a) a photonic emitter assembly 22 which produces pulses of inspection photons; (b) a photonic steering assembly 24 that directs the pulses of inspection photons to/from the inspected scene segment; (c) a photonic detector assembly 16 to detect inspection photons reflected back from an object within an inspected scene segment; and (d) a controller to regulate operation of the photonic emitter assembly, the photonic steering assembly and the operation of the photonic detection assembly in a coordinated manner and in accordance with scene segment inspection characteristics of the present invention at least partially received from internal feedback of the scanning device so that the scanning device is a closed loop dynamic scanning device. A closed loop scanning device is characterized by having feedback from at least one of the elements and updating one or more parameters based on the received feedback. A closed loop system may receive feedback and update the system's own operation at least partially based on that feedback. 
A dynamic system or element is one that may be updated during operation. Furthermore, scanning device 12 may be characterized in that accumulative feedback from a plurality of elements may be used to update/control parameters of those and other elements.
[00185] According to some embodiments, inspection of a scene segment may include illumination of the scene segment or region with a pulse of photons (transmitted light), which pulse may have known parameters such as pulse duration, pulse angular dispersion, photon wavelength, instantaneous power, photon density at different distances from the emitter, average power, pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase, polarization and more. Inspection may also include detecting and characterizing various aspects of reflected inspection photons, which reflected inspection photons are inspection pulse photons (reflected light) reflected back towards the scanning device (or laser reflection) from an illuminated element present within the inspected scene segment (i.e. scene segment element). Characteristics of reflected inspection photons may include photon time of flight (time from emission till detection), instantaneous power (or power signature) at and during return pulse detection, average power across the entire return pulse and photon distribution/signal over the return pulse period. The reflected inspection photons are a function of the inspection photons and the scene elements they are reflected from, and so the received reflected signal is analyzed accordingly. In other words, by comparing characteristics of a photonic inspection pulse with characteristics of a corresponding reflected and detected photonic pulse, a distance and possibly a physical characteristic such as reflected intensity of one or more scene elements present in the inspected scene segment may be estimated. By repeating this process across multiple adjacent scene segments, optionally in some pattern such as raster, lissajous or other patterns, an entire scene may be scanned in order to produce a map of the scene.
[00186] The definition of the term scene according to embodiments of the present invention may vary from embodiment to embodiment, depending on the specific intended application of the invention. For Lidar applications, optionally used with a motor vehicle platform/host and/or drone platform/host, the term scene may be defined as the physical space, up to a certain distance, in front of, behind, below and/or on the sides of the vehicle and/or generally in the vicinity of the vehicle or drone in all directions. The term scene may also include the space behind the vehicle or drone in certain embodiments. A scene segment or scene region according to embodiments may be defined by a set of angles in a polar coordinate system, for example, corresponding to a pulse or beam of light in a given direction. The light beam/pulse having a center radial vector in the given direction may also be characterized by angular divergence values, polar coordinate ranges of the light beam/pulse and more.
[00187] Turning to Fig. C1B, depicted is an example bistatic scanning device schematic 50. It is understood that scanning device 62 is substantially similar to scanning device 12. However, scanning device 12 is a monostatic scanning device while scanning device 62 is a bistatic scanning device. Accordingly, steering element 74 is comprised of two steering elements: steering element for PTX 71 and steering element for PRX 73. The rest of the discussion relating to scanning device 12 of Fig. C1A is applicable to scanning device 62 of Fig. C1B.
[00188] Turning to Fig. C1C, depicted is an example scanning device with a plurality of photonic transmitters 22, a plurality of splitter elements 18 and a plurality of detectors 16. All of the transmitters 22, detectors 16 and splitters 18 may have a joint steering element 20. It is understood that scanning device 87 is substantially similar to scanning device 12. However, scanning device 87 is a monostatic scanning device with a plurality of transmitting and receiving elements. The rest of the discussion relating to scanning device 12 of Fig. C1A is applicable to scanning device 87 of Fig. C1C.
[00189] Turning to Fig. C2, depicted is an example scanning system 100 in accordance with some embodiments. Scanning system 100 may be configured to operate in conjunction with a host device. Scanning system 100 may include a scene scanning device such as scanning device 104 adapted to inspect regions or segments of a scene using photonic pulses whose characteristics may be dynamically selected. Scanning device 104 may include a photonic emitter assembly (PTX) such as PTX 106 to produce pulses of inspection photons. PTX 106 may include a laser or alternative light source. The light source may be a laser such as a solid-state laser, a high power laser or otherwise, or an alternative light source such as a LED based light source or otherwise. Scanning device 104 may be an example embodiment for scanning device 12 of Fig. C1A and/or scanning device 62 of Fig. C1B and/or scanning device 87 of Fig. C1C and the discussion of those scanning devices is applicable to scanning device 104.
[00190] According to some embodiments, the photonic pulses may be characterized by one or more controllable pulse parameters such as: pulse duration, pulse angular dispersion, photon wavelength, instantaneous power, photon density at different distances from the emitter, average power, pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase, polarization, pulse calibration and more. Pulse calibration may include correcting or compensating for a pulse intensity or direction so that the actual pulse is aligned with an expected/intended pulse, to compensate either for differences resulting from production or for changes that may occur (such as degradation) over time.
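Pulse intensity calibration as described can be illustrated as a gain correction; the linear-gain model, function name and numbers are assumptions:

```python
# Linear gain model: measure actual output for a known command, then scale
# the next command so the actual pulse matches the intended intensity.

def calibrated_command(intended_w, measured_w, commanded_w):
    """Command needed so the actual output tracks the intended output."""
    gain = measured_w / commanded_w   # actual watts emitted per commanded watt
    return intended_w / gain
```

For example, a degraded emitter producing 0.8 W when commanded 1.0 W would be commanded 1.25 W to reach an actual 1.0 W.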
[00191] According to some embodiments, the inspection photons may be controlled so that they vary in pulse duration, pulse angular dispersion, photon wavelength, instantaneous power, photon density at different distances from the emitter, average power, pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase, polarization and more. The photonic pulses may vary between each other and the parameters may change during the same signal. The inspection photonic pulses may be characterized as: sinusoidal, chirp sequences, step functions, pseudo random signals, or linear signals; they may be periodical or fixed or otherwise and/or a combination of these. Examples are shown in Figs. C3A&C3B, which depict example inspection photonic pulse control signals 200 and 250 including example laser signals A-H (202-256, respectively), depicting the control signal enabling a photonic pulse and determining the intensity, width and repetition rate of the pulse as well as pulse repetition rate and/or pulse sequence.
[00192] According to some embodiments, the PTX 106 laser may operate in different laser modes such as modulated continuous wave (CW), pulsed quasi CW (Q-CW), mode locked, and may include a plurality of laser emitters. Additional examples are shown in Fig. C3B, which depicts example inspection photonic pulse control signals 250 including example laser signal F (252), laser signal G (254) and laser signal H (256), depicting the control signal enabling a photonic pulse and determining the intensity, width and repetition rate of the pulse as well as pulse repetition rate and/or pulse sequence. Laser signal F 252, for example, is characterized by increased power pulses; this type of sequence may be applicable to cover targets at increased ranges.
Laser signal G 254, for example, is characterized by chirp pulse position modulation and may be applicable for increased SNR. Laser signal H 256 may be characterized by a combination of chirp pulse position modulation and increased power range, applicable for increased range and increased SNR.
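The two pulse-train patterns just described can be sketched in code. The sketch below is purely illustrative and is not part of the disclosure; all function names, units and numeric defaults are our assumptions.

```python
def increasing_power_train(n_pulses, base_power_w, step_w, period_us=10.0):
    """(time_us, power_w) pairs with linearly increasing pulse power, in the
    spirit of laser signal F (increased-power pulses for increased range)."""
    return [(i * period_us, base_power_w + i * step_w) for i in range(n_pulses)]

def chirped_position_train(n_pulses, start_us, first_gap_us, chirp_factor):
    """Pulse emission times whose inter-pulse gap grows by a chirp factor,
    in the spirit of laser signal G (chirp pulse position modulation)."""
    times, t, gap = [], start_us, first_gap_us
    for _ in range(n_pulses):
        times.append(t)
        t += gap
        gap *= chirp_factor  # widen each successive gap
    return times
```

A combined sequence in the spirit of laser signal H would pair the chirped timing with the increasing power values.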
[00193] Turning back to Fig. C2, according to some embodiments, PTX 106 may include additional elements such as a collimator to compensate for divergence effects of the laser emitter and render the beam into an optimal shape suitable for steering, transmission and detection. PTX 106 may also include a thermoelectric cooler for temperature stabilization, as solid-state lasers, for example, may experience degradation in performance as temperature increases, so cooling the laser may enable a higher power yield. PTX 106 may also include an optical outlet.
[00194] According to some embodiments, PTX 106 may include one or more PTX state sensors to produce a signal indicating an operational state of PTX 106. An operational state of PTX 106 may include information such as power information or temperature information, laser state, laser degradation (in order to compensate for it), laser calibration information and more.
[00195] According to some embodiments, scanning device 104 may include a photonic reception and detection assembly (PRX) such as PRX 108 to receive reflected photons reflected back from an object or scene element and produce detected scene signal 110. PRX 108 may include a detector such as detector 112. Detector 112 may be configured to detect the reflected photons reflected back from an object or scene element and produce detected scene signal 110. [00196] According to some embodiments, detected scene signal 110 may include information such as: time of flight, which is indicative of the difference in time between the time a photon was emitted and the time it was detected after reflection from an object, reflected intensity, polarization values and more.
[00197] According to some embodiments, detected scene signal 110 may be represented using a point cloud, a 3D signal or vector, a 4D signal or vector (adding time to the other three dimensions) and more.
[00198] According to some embodiments, detector 112 may have one or more updatable detector parameters controlled by detector parameters control 114, such as: scanning direction, frame rate, ambient light effects, mechanical static and dynamic impairments, thermal effects, wear and tear, area of interest, resolution, sensitivity, detector calibration and more. Calibration of detector 112 may include correcting or compensating for a detection sensitivity or otherwise, so that the actual detection sensitivity is aligned with an expected/intended detection sensitivity, to compensate either for differences resulting from production or for changes (such as degradation) that may occur over time.
[00199] According to some embodiments, detector parameters control 114 may be utilized for dynamic operation of detector 112 by controlling the updatable detector parameters. For example, scanning direction may be utilized for dynamic allocation of detector power/resolution/sensitivity/resources. Scanning direction may be the expected direction of the associated inspection photons; frame rate may be the laser's or PRX's frame rate; ambient light effects may include detected noise photons or expected inspection photons (before they are reflected); mechanical impairments may also be correlated to issues relating to deviation of other elements of the system that need to be compensated for; knowledge of thermal effects may be utilized to improve the signal-to-noise ratio; wear and tear refers to wear and tear of detector 112 and/or other blocks of the system that detector 112 can compensate for; area of interest may be an area of the scanned scene that is more important; and more. Ambient conditions such as fog, rain or smoke, which impact the signal-to-noise ratio (lifting the noise floor), can be used as a parameter that defines the operating conditions of the detector and also of the laser. Another critical element is the gating of the detector in a monostatic design, with the purpose of avoiding blinding of the detector by the initial transmission of the laser pulse - TX/RX co-channel interference.
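The gating of the detector can be pictured as a simple time window: detection events that arrive too soon after pulse emission are discarded as TX/RX co-channel interference. The sketch below is one possible illustration under our own naming assumptions, not the patented mechanism.

```python
def gated_detections(event_times_ns, pulse_emit_ns, gate_ns):
    """Drop detection events falling inside the gating window that follows
    pulse emission, so the initial laser transmission does not blind the
    detector in a monostatic design."""
    return [t for t in event_times_ns if t - pulse_emit_ns > gate_ns]
```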
[00200] According to some embodiments, detector 112 may include an array of detectors, such as an array of avalanche photodiodes (APDs) or single photon detection avalanche diodes (SPADs), or a single detecting element that measures the time of flight from a laser pulse transmission event to the reception event and the intensity of the received photons. The reception event may be the result of the laser pulse being reflected from a target in the FOV present at the scanned angular position of the laser of PTX 106. The time of flight is a timestamp value that represents the distance of the reflecting target, object or scene element from scanning device 104. Time of flight values may be realized by photon detection and counting methods such as TCSPC (time correlated single photon counting), analog methods for photon detection such as signal integration and qualification (via analog-to-digital converters or plain comparators), or otherwise.
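The time-of-flight to distance relation implied here is the standard round-trip formula d = c·t/2; a minimal sketch (the helper name is ours):

```python
C_M_PER_S = 299_792_458.0  # speed of light in vacuum, m/s

def tof_to_distance_m(tof_s):
    """Convert a round-trip time of flight (seconds) to a one-way
    target distance in meters: d = c * t / 2."""
    return C_M_PER_S * tof_s / 2.0
```

For example, a round trip of 1 microsecond corresponds to a target roughly 150 m away.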
[00201] According to some embodiments, detector 112 may include a full array of single photon detection avalanche diodes which may be partitioned into one or more pixels that capture a fragment of the FOV. A pixel may represent the basic data element that builds up the captured FOV in the three-dimensional space (e.g. the basic element of a point cloud representation), including a spatial position and the reflected intensity value.
[00202] According to some embodiments, optional configurations of detector 112 may include: (a) a two-dimensional array sized to capture one or more pixels out of the FOV, where a pixel window may contain a fraction of a pixel, one or more pixels or otherwise; (b) a two-dimensional array that captures multiple rows or columns in a FOV, up to an entire FOV; (c) a single-dimensional array; and/or (d) a single SPAD element or otherwise.
[00203] According to some embodiments, PRX 108 may also include an optical inlet, which may be a single physical path with a single lens or no lens at all.
[00204] According to some embodiments, PRX 108 may include one or more PRX state sensors to produce a signal indicating an operational state of PRX 108, for example power information or temperature information, detector state and more. [00205] According to some embodiments, scanning device 104 may be a bistatic scanning device, where PTX 106 and PRX 108 have separate optical paths, or scanning device 104 may be a monostatic scanning system, where PTX 106 and PRX 108 have a joint optical path.
[00206] According to some embodiments, scanning device 104 may include a photonic steering assembly (PSY), such as PSY 116, to direct pulses of inspection photons from PTX 106 in a direction of an inspected scene and to steer reflection photons from the scene back to PRX 108. PSY 116 may also be in charge of positioning the singular scanned pixel window onto/in the direction of detector 112.
[00207] According to some embodiments, PSY 116 may be a joint PSY, and accordingly may be joint between PTX 106 and PRX 108, which may be a preferred embodiment for a monostatic scanning system.
[00208] According to some embodiments, PSY 116 may include a plurality of steering assemblies, or may have several parts, one associated with PTX 106 and another associated with PRX 108 (see also Figs. 1A-1C).
[00209] According to some embodiments, PSY 116 may be a dynamic steering assembly and may be controllable by steering parameters control 118. Example steering parameters may include: a scanning method that defines the acquisition pattern and sample size of the scene, power modulation that defines the range accuracy of the acquired scene, correction of axis impairments based on collected feedback, and calibration of steering to expected characteristics. Calibration may include correcting or compensating for a steering axis so that the actual direction is aligned with an expected/intended direction, to compensate either for differences resulting from production or for changes (such as degradation) that may occur over time.
[00210] According to some embodiments, PSY 116 may include: (a) a single dual-axis MEMS mirror; (b) a dual single-axis MEMS mirror; (c) a mirror array where multiple mirrors are synchronized in unison and act as a single large mirror; (d) a mirror splitter array with separate transmission and reception; and/or (e) a combination of these and more.
[00211] According to some embodiments, if PSY 116 includes a MEMS splitter array, the beam splitter may be integrated with the laser beam steering. According to further embodiments, part of the array may be used for the transmission path and the second part of the array may be used for the reception path. The transmission mirrors may be synchronized, and the reception mirrors may be synchronized separately from the transmission mirrors. The transmission mirror and reception mirror sub-arrays maintain an angular shift between themselves in order to steer the beam into separate ports, essentially integrating a circulator module.
[00212] According to some embodiments, PSY 116 may include one or more PSY state sensors to produce a signal indicating an operational state of PSY 116, for example power information or temperature information, reflector state, reflector actual axis positioning, reflector mechanical state and more.
[00213] According to some embodiments, PSY 116 may also include a circulator module/beam splitter, although it is understood that the splitter may also be part of PRX 108 instead. The beam splitter may be configured to separate the transmission path of PTX 106 from the reception path of PRX 108. In some embodiments the beam splitter may either be integrated in the steering assembly (for example, if a splitter array is utilized) or may be redundant or not needed, and accordingly the scanning device may not include a beam splitter.
[00214] According to some embodiments, the beam splitter of PSY 116 may be a polarized beam splitter (PBS), a PBS integrating a slit, a circulator beam splitter and/or a slit-based reflector or otherwise.
[00215] According to some embodiments, PSY 116 may include one or more reflective surfaces, each of which may be associated with an electrically controllable electromechanical actuator. The reflective surface(s) may be made from polished gold, aluminum, silicon, silver or otherwise. The electromechanical actuator(s) may be selected from actuators such as stepper motors, direct current motors, galvanometric actuators, electrostatic, magnetic or piezo elements, or thermally based actuators. PSY 116 may include or be otherwise associated with one or more microelectromechanical systems (MEMS) mirror assemblies. PSY 116 according to refractive embodiments may include one or more refractive materials whose index of refraction may be electrically modulated, either by inducing an electric field around the material or by applying electromechanical vibrations to the material. [00216] According to yet further embodiments, PSY 116 may include a beam splitter to help separate the transmission path from the reception path. Using the same photonic steering assembly may provide for tight synchronization between a direction in which a photonic pulse/beam is steered and emitted by the photonic emitter assembly and a direction of a concurrent FOV of one or more optical sensors of the photonic detection assembly. A shared photonic steering assembly configuration may allow a photonic detector assembly of a given device to focus upon, and almost exclusively collect/receive, reflected photons from substantially the same scene segment being concurrently illuminated by the given device's photonic emitter assembly. Accordingly, as PSY 116 moves, so may a photonic pulse illumination angle along with the FOV angle.
[00217] According to some embodiments, scanning device 104 may include a controller to control scanning device 104, such as controller 120. Controller 120 may receive scene signal 110 from detector 112 and may control PTX 106, PSY 116 and PRX 108, including detector 112, based on: (i) information stored in the controller memory 122, (ii) received scene signal 110 and (iii) accumulated information from a plurality of scene signals 110 received over time.
[00218] According to some embodiments, controller 120 may process scene signal 110, optionally with additional information and signals, and produce a vision output such as vision signal 124, which may be relayed/transmitted to an associated host device. Controller 120 may receive detected scene signal 110 from detector 112; optionally, scene signal 110 may include time of flight values and intensity values of the received photons. Controller 120 may build up a point cloud or a 3D or 2D representation of the FOV by utilizing digital signal processing, image processing and computer vision techniques.
[00219] According to some embodiments, controller 120 may include situational assessment logic or circuitry such as situational assessment logic (SAL) 126. SAL 126 may receive detected scene signal 110 from detector 112 as well as information from additional blocks/elements either internal or external to scanning device 104.
[00220] According to some embodiments, scene signal 110 may be assessed and calculated, with or without additional feedback signals such as PSY feedback, PTX feedback, PRX feedback and host feedback, and information stored in memory 122, using a weighted means of local and global cost functions that determine a scanning/work plan such as work plan signal 134 for scanning device 104 (such as: which pixels in the FOV are scanned, at which laser parameters budget, and at which detector parameters budget). Accordingly, controller 120 may be a closed loop dynamic controller that receives system feedback and updates the system's operation based on that feedback.
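One way to picture a "weighted means of local and global cost functions" is as a weighted priority score per FOV region that divides a fixed laser budget. The sketch below is a simplified illustration of ours, with invented term names, and is not the claimed method.

```python
def build_work_plan(region_costs, weights, laser_budget):
    """Combine per-region cost terms (e.g. 'interest', 'ambient_noise',
    'host_priority') into a weighted priority, then split the laser
    parameter budget proportionally among FOV regions."""
    priority = {
        region: sum(weights.get(term, 0.0) * value for term, value in costs.items())
        for region, costs in region_costs.items()
    }
    total = sum(priority.values()) or 1.0  # guard against division by zero
    return {region: laser_budget * p / total for region, p in priority.items()}
```

A region flagged as three times more interesting than another would then receive three times the laser budget.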
[00221] Turning to Fig. C4, depicted is an example scanning system 300 in accordance with some embodiments. It is understood that elements 304-326 and 334 are substantially similar to elements 104-126 and 134 of Fig. C2 (respectively) and that the description of those elements is applicable to elements 304-326 and 334. Scanning system 300 may include host 328, in conjunction with which scanning device 304 may operate. It is understood that host 328 may be a part of scanning system 300 or may be associated with scanning device 304, and that the following description is applicable to either embodiment.
[00222] According to some embodiments, SAL 326 may receive detected scene signal 310 from detector 312 as well as information from additional blocks/elements either internal or external to scanning device 304; these signals and information will now be discussed in more detail.
[00223] According to some embodiments, SAL 326 may receive a PTX feedback 329 indicating PTX-associated information such as an operational state, power consumption, temperature and more.
[00224] According to some embodiments, SAL 326 may receive a PRX feedback 331 indicating PRX-associated information such as power consumption, temperature, detector state feedback and more.
[00225] According to some embodiments, SAL 326 may receive one or more feedback signals from PSY 316 via PSY feedback 330. PSY feedback 330 may include: the PSY operational state and the instantaneous position of PSY 316, where PSY 316 may include one or more reflecting elements and each reflecting element may contain one or more axes of motion. It is understood that the instantaneous position may be defined or measured in one or more dimensions. Typically, a PSY has an expected position; however, PSY 316 may produce an internal signal measuring the instantaneous position (meaning, the actual position), and such feedback may be utilized by situational assessment logic 326 for calculating drift and offset parameters in the PRX and/or for correcting steering parameters control 318 of PSY 316 to correct an offset.
[00226] According to some embodiments, PSY feedback 330 may include the instantaneous scanning speed of PSY 316. PSY 316 may produce an internal signal measuring the instantaneous speed (meaning, the actual speed and not the estimated or anticipated speed), and such feedback may be utilized by situational assessment logic 326 for calculating drift and offset parameters in the PRX and/or for correcting steering parameters control 318 of PSY 316 to correct an offset.
[00227] According to some embodiments, PSY feedback 330 may include the instantaneous scanning frequency of PSY 316. PSY 316 may produce an internal signal measuring the instantaneous frequency (meaning, the actual frequency and not the estimated or anticipated frequency), and such feedback may be utilized by situational assessment logic 326 for calculating drift and offset parameters in the PRX and/or for correcting steering parameters control 318 of PSY 316 to correct an offset. The instantaneous frequency may be relative to one or more axes.
[00228] According to some embodiments, PSY feedback 330 may include the mechanical overshoot of PSY 316, which represents a mechanical decalibration error from the expected position of the PSY in one or more axes. PSY 316 may produce an internal signal measuring the mechanical overshoot, and such feedback may be utilized by situational assessment logic 326 for calculating drift and offset parameters in the PRX and/or for correcting steering parameters control 318 of PSY 316 to correct an offset. PSY feedback may also be utilized in order to correct steering parameters in case of vibrations induced by the LiDAR system or by external factors such as vehicle engine vibrations or road-induced shocks.
[00229] According to some embodiments, PSY feedback 330 may be utilized to correct steering parameters 318 so as to correct the scanning trajectory and linearize it. The raw scanning pattern may typically be non-linear and may contain artifacts resulting from fabrication variations and the physics of the MEMS mirror or reflective elements. Mechanical impairments may be static (for example, a variation in the curvature of the mirror) and/or dynamic (for example, mirror warp/twist at the scanning edge of motion). Correction of the steering parameters to compensate for these non-linear elements may be utilized to linearize the PSY scanning trajectory.
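Static trajectory correction of this kind is often done with a calibration table mapping commanded to measured angles; the interpolation sketch below is our illustrative assumption (first-order offset cancellation with invented names), not the disclosed correction scheme.

```python
def corrected_command_deg(target_deg, calibration):
    """Given (commanded, measured) calibration pairs for one mirror axis,
    estimate the measured angle at a commanded target by piecewise-linear
    interpolation and pre-compensate the command by the observed offset."""
    pts = sorted(calibration)
    for (c0, m0), (c1, m1) in zip(pts, pts[1:]):
        if c0 <= target_deg <= c1:
            frac = (target_deg - c0) / (c1 - c0)
            measured = m0 + frac * (m1 - m0)
            return target_deg - (measured - target_deg)  # cancel the offset
    return target_deg  # outside the calibrated range: leave unchanged
```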
[00230] According to some embodiments, SAL 326 may receive one or more host signals from host 328 via host feedback and information 332. Information received from the host may be additional information from other sensors in the system, such as other LiDARs, cameras, RF radar, an acoustic proximity system and more, or feedback following processing of vision signal 324 at the host 328 processing unit. Optionally, host information may be configured to override other SAL 326 inputs, so that if a host indicates that a turn is expected, for example, scanning device 304 may analyze the upcoming turn. Optionally, the host feedback may include an override command structure including a flag indicating that the host input is to override the internal feedbacks and signals. The override structure may contain a direct designation to scan certain portion(s) of the scene at a certain power that translates into the LiDAR range, and more.
[00231] According to some embodiments, SAL 326 may receive one or more signals from memory 322. Information received from the memory may include the laser power budget (defined by eye safety limitations, thermal limitations, reliability limitations or otherwise); electrical operational parameters such as current and peak voltages; and calibration data such as expected PSY scanning speed, expected PSY scanning frequency, expected PSY scanning position and more.
[00232] According to some embodiments, SAL 326 may be configured to produce a feedback parameter and/or a vision signal 324 utilizing digital signal processing, image processing and computer vision techniques.
[00233] According to some embodiments, SAL 326 may analyze information and take into consideration a plurality of different types of information, such as: (a) the thermal envelope, which may constrain the working regime and performance of the LiDAR, such as pixel rate, frame rate, detection range and FOV depth resolution (4D resolution), FOV and angular range; (b) identified road delimiters or other constant elements in the FOV of the scanning device; (c) object of interest tracking; (d) optical flow that determines, tracks and predicts global motion of the scene and individual elements' motion in the scene; (e) localization data associated with the location of the scanning device, which may be received from host 328; (f) volumetric effects such as rain, fog, smoke or otherwise; (g) interference such as ambient light, sun, other LiDARs on other hosts and more; (h) ego-motion parameters from host 328 associated with a host's steering wheel, blinkers or otherwise; (i) fusion with a camera or other sensor associated with host 328; and more.
[00234] According to some embodiments, SAL 326 may output vision signal 324 to a host device. The controller and/or SAL may analyze, process and refine detected scene signal 310 by utilizing digital signal processing, image processing and computer vision techniques. Vision signal 324 may be a qualified point data structure (e.g., a cloud map and/or point cloud or otherwise) and may contain parameters not restricted to a 3D positioning of the pixels in the FOV, reflectivity intensity, a confidence level according to a quality of service metric, and a metadata layer of identified objects for a host system. A quality of service metric may be an indication of the system's expected QOS and may be applicable, for example, when the scanning device is operating at a low QOS to compensate for high surrounding temperatures or otherwise. According to some embodiments, scene signal 310 may be assessed and calculated accordingly, with or without additional feedback signals such as PSY feedback 330, PTX feedback 329, PRX feedback 331 and/or host feedback and information 332, according to a weighted means of local and global cost functions that determine a scanning/work plan such as work plan signal 334 for scanning device 304 (such as: which pixels in the FOV are scanned, at which laser parameters budget, and at which detector parameters budget). Accordingly, controller 320 may be a closed loop dynamic controller that receives system feedback and updates the system's operation based on that feedback.
[00235] Accordingly, steering parameters of PSY 316, detector parameters of detector 312 and/or pulse parameters of PTX 306 may be updated based on the calculated/determined work plan 334. Work plan 334 may be tracked and determined at specific time intervals and with increasing level of accuracy and refinement of feedback signals.
[00236] According to some embodiments, the parameters (steering, detector and/or laser pulse) may be updated based on work plan 334 at predetermined times or intervals; updates may be synchronous or asynchronous and may be dependent on work plan 334 itself, meaning that if a high priority update is received the update may be asynchronous and, if not, it may be applied at a predetermined time.
[00237] According to some embodiments, work plan 334 may be updated based on real-time detected scene information, which may also be termed pixel information. Real-time analysis may consider detected fast signals during time of flight that contain one or more reflections for a given photonic inspection pulse. For example, an unexpected detected target in a low priority field may cause controller 320 to update the pulse frequency of the laser of PTX 306 via updating of the pulse parameters. Work plan 334 may also be updated at a frame or sub-frame level, based on information received, accumulated and/or analyzed within a single frame. Furthermore, work plan 334 may be updated on an inter-frame level, based on information accumulated and analyzed over two or more frames. Increased levels of real-time accuracy, meaning that work plan 334 is updated at a pixel or sub-frame resolution, may be achieved when higher levels of computation produce increasingly usable results. Increased levels of non-real-time accuracy may be achieved within a specific time period as slower-converging data becomes available (e.g. computer-vision-generated optical flow estimation of objects over several frames), meaning that work plan 334 may be updated as new information becomes evident based on an inter-frame analysis.
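The synchronous/asynchronous update policy can be sketched as a small scheduler: high-priority work plan updates apply immediately, while routine updates wait for the next predetermined interval. The class and method names below are illustrative assumptions of ours, not the disclosed design.

```python
import time

class WorkPlanScheduler:
    """Apply high-priority work plan updates asynchronously (at once) and
    defer routine updates to the next synchronous update interval."""

    def __init__(self, interval_s, now=time.monotonic):
        self.interval_s = interval_s
        self.now = now                 # injectable clock, eases testing
        self.active = None             # currently applied work plan
        self.pending = None            # routine plan awaiting the interval
        self.last_sync = now()

    def submit(self, plan, high_priority=False):
        if high_priority:
            self.active = plan         # asynchronous: takes effect immediately
        else:
            self.pending = plan        # synchronous: waits for the interval

    def tick(self):
        """Call periodically; promotes a pending plan once the interval elapses."""
        if self.pending is not None and self.now() - self.last_sync >= self.interval_s:
            self.active, self.pending = self.pending, None
            self.last_sync = self.now()
        return self.active
```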
[00238] According to some embodiments, controller 320 may adjust operation of PTX 306, such as: (a) inspection pulse intensity; (b) inspection pulse duration; (c) inspection pulsing patterns; and (d) more, for a given scene segment based on: (a) ambient light conditions; (b) a pre-pulse reading on the PRX; (c) the energy level of a prior reflected inspection pulse from the same or a nearby scene segment; and (d) the relevance value of the given scene segment. Controller 320 may adjust operation of PRX 308, such as: (a) photonic sensor selection, (b) photonic sensor biasing, (c) photonic sensor operation timing with respect to the PTX operation timing and with respect to the scene segment, (d) photonic sensor output processing, and more, for a given scene segment based on: (a) the corresponding photonic inspection pulse intensity; (b) a pre-pulse reading on the PRX photonic sensor(s); (c) the energy level of a prior reflected inspection pulse from the same or a nearby scene segment; (d) the current scanning direction of the photonic steering assembly being used, and more. Controller 320 may adjust operation of PSY 316, such as: the scanning method that defines the acquisition pattern and sample size of the scene, power modulation that defines the range accuracy of the acquired scene, correction of axis impairments based on collected feedback, calibration of steering to expected characteristics, and more.
[00239] Turning to Figs. C5A & C5B, depicted are host systems 400 and 450 including hosts 428 and 478, respectively. It is understood that elements 404-432 of Fig. C5A and elements 454-482 of Fig. C5B are substantially similar to elements 304-332 of Fig. C4 and that the discussion of those elements is applicable to elements 404-432 and 454-482, respectively. Furthermore, hosts 428 and 478 include host controllers 448 and 498 (respectively). It is understood that scanning devices 404 and 454 each include all of the sub-elements depicted in Fig. C4 with regard to scanning device 304; they are not detailed here for clarity, and only the blocks currently being discussed are detailed. Differences from Fig. C4 will be discussed below.
[00240] According to some embodiments, hosts 428 and/or 478 may each be a vehicle or a drone.
[00241] With regard to Fig. C5A, according to some embodiments, host 428 may receive vision signal 424 and relay to the scanning device host feedback and information 432, which may include information from additional host modules such as additional scanning devices, sensors, cameras, the host steering system, host controller 448 and more.
[00242] With regard to Fig. C5B, according to some embodiments, at least part of the situational assessment logic functionality, discussed with regard to Fig. C4, for example, may be executed in or implemented by host controller 498 instead of scanning device 454. Accordingly, scene signal 460 (or a derivative of scene signal 460, hence the dashed line) may be relayed to host controller 498 and the rest of the analysis carried out at the host controller's SAL 476.
[00243] Turning to Fig. C6, shown is a flowchart 600 for a method of scanning a scene according to some embodiments. A scanning device may be operated based on default values or an initial signal(s) for scanning parameters (602). As a result of scanning a scene a detected scene signal is received/detected from a detector associated with the scanning device (604). Furthermore, one or more elements of a scanning device may be configured to provide feedback regarding operation of the elements of a scanning device (606) and a host device associated with the scanning device may provide additional information (608) such as host information and feedback regarding additional elements of the host (additional scanning devices, sensors and more). Based on the received signals and information a visual situation may be assessed (610) either by the scanning device, by the host or a combination of the two. Based on the visual situation the scanning parameters may be updated (612) causing the scanning device to scan a scene based on the visual situation. The visual situation or a signal associated with the visual situation may be relayed to a host.
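The steps of flowchart 600 can be sketched as a simple control loop. Every method name below is a hypothetical placeholder for the corresponding flowchart step, not an API from the disclosure.

```python
def scan_loop(device, host, iterations):
    """Sketch of flowchart 600: scan with current parameters, gather the
    detected scene signal plus element and host feedback, assess the
    visual situation, then update the scanning parameters."""
    params = device.default_parameters()                       # (602)
    for _ in range(iterations):
        scene_signal = device.scan(params)                     # (604)
        feedback = device.element_feedback()                   # (606)
        host_info = host.information()                         # (608)
        situation = device.assess(scene_signal, feedback, host_info)  # (610)
        params = device.update_parameters(situation)           # (612)
        host.relay(situation)  # relay the assessed situation to the host
    return params
```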
[00244] According to embodiments of the present invention, there may be provided a scene scanning device adapted to inspect regions or segments of a scene using photonic pulses, which device may be a LiDAR device. The photonic pulses used to inspect the scene, also referred to as inspection pulses, may be generated and transmitted with characteristics which are dynamically selected as a function of various parameters relating to the scene to be scanned and/or relating to a state, location and/or trajectory of the device. Sensing and/or measuring of characteristics of inspection pulse reflections from scene elements illuminated with one or more inspection pulses may also be dynamic and may include modulating optical elements on an optical receive path of the device.
[00245] According to some embodiments, inspection of a scene segment may include illumination of the scene segment or region with a modulated pulse of photons, which pulse may have known parameters such as: pulse duration, pulse angular dispersion, photon wavelength, instantaneous power, photon density at different distances from the emitter and/or average power. Inspection may also include detecting and characterizing various parameters of reflected inspection photons, which reflected inspection photons are inspection pulse photons reflected back towards the scanning device from an illuminated element present within the inspected scene segment (i.e. a scene segment element). [00246] The definition of a scene according to embodiments of the present invention may vary from embodiment to embodiment, depending on the specific intended application of the invention. For LiDAR applications, optionally used with a motor vehicle platform, the term scene may be defined as the physical space, up to a certain distance, surrounding the vehicle (in front, on the sides, behind, below and/or above). A scene segment or scene region according to embodiments may be defined by a set of angles in a polar coordinate system, for example, corresponding to a diverging pulse or beam of light in a given direction. The light beam/pulse having a center radial vector in the given direction may also be characterized by broader defined angular divergence values and polar coordinate ranges of the light beam/pulse. Since the light beam/pulse produces an illumination area, or spot, of expanding size the farther from the light source the spot hits a target, a scene segment or region being inspected at any given time, with any given photonic pulse, may be of varying and expanding dimensions. Accordingly, the inspection resolution of a scene segment may be reduced the further away illuminated scene segment elements are from the active scene scanning device.
One of the critical tasks at hand for a scanning system is to observe the scene, understand its semantics, such as drivable areas, obstacles and traffic signs, and take vehicle control action upon them.
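The expanding-spot geometry described above follows directly from beam divergence: a beam with full divergence angle theta illuminates a spot of diameter roughly 2·R·tan(theta/2) at range R, which is why inspection resolution degrades with distance. A minimal sketch (the helper name is ours):

```python
import math

def spot_diameter_m(range_m, divergence_rad):
    """Diameter of the illumination spot of a diverging beam at a given
    range: d = 2 * R * tan(theta / 2)."""
    return 2.0 * range_m * math.tan(divergence_rad / 2.0)
```

For instance, a beam with 1 mrad divergence paints a spot about 0.1 m across at 100 m, so a single pulse averages over everything inside that spot.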
[00247] Turning to Fig. D1A, depicted is an example monostatic scanning device schematic 100. According to some embodiments, there may be provided a scene scanning device such as scanning device 112, which may be adapted to inspect regions or segments of a scene (shown here is a specific FOV being scanned) using photonic pulses (transmitted light) whose characteristics may be dynamically selected as a function of: (a) optical characteristics of the scene segment being inspected; (b) optical characteristics of scene segments other than the one being inspected; (c) scene elements present in or within proximity of the scene segment being inspected; (d) scene elements present in or within proximity of scene segments other than the one being inspected; (e) an operational mode of the scanning device; (f) expected scene elements; (g) importance/priority of an expected scene element; and/or (h) a situational feature/characteristic of a host platform with which the scanning device is operating. The scene scanning device may be adapted
to inspect regions or segments of a scene using a set of one or more photonic transmitters 122 (including a light source such as pulse laser 114), receptors including sensors (such as detector assembly 116) and/or steering assembly 124, whose configuration and/or arrangement may be dynamically selected as a function of: (a) optical characteristics of the scene segment being inspected; (b) optical characteristics of scene segments other than the one being inspected; (c) scene elements present or within proximity of the scene segment being inspected; (d) scene elements present or within proximity of scene segments other than the one being inspected; (e) an operational mode of the scanning device; (f) expected scene elements; (g) importance/priority of an expected scene element and/or (h) a situational feature/characteristic of a host platform with which the scanning device is operating. Active scanning device 112 may include: (a) a photonic transmitter 122 which produces pulses of inspection photons; (b) a photonic steering assembly 124 that directs the pulses of inspection photons to/from the inspected scene segment; (c) a photonic detector assembly 116 to detect inspection photons reflected back from an object within an inspected scene segment; and (d) a controller to regulate operation of the photonic emitter assembly, the photonic steering assembly and the photonic detection assembly in a coordinated manner and in accordance with scene segment inspection characteristics of the present invention, at least partially received from internal feedback of the scanning device, so that the scanning device is a closed loop dynamic scanning device. A closed loop scanning device is characterized by having feedback from at least one of the elements and updating one or more parameters of two or more scanning device blocks (photonic transmitters 122, steering assembly 124 and/or detector assembly 116) based on the received feedback.
A closed loop system may receive feedback and update the system's own operation at least partially based on that feedback. A dynamic system or element is one that may be updated during operation.
[00248] According to some embodiments, inspection of a scene segment may include illumination of the scene segment or region with a pulse of photons (transmitted light), which pulse may have known parameters such as: pulse duration, pulse angular dispersion, photon wavelength, instantaneous power, photon density at different distances from the emitter, average power, pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase, polarization and more. Inspection may also include detecting and characterizing various aspects of reflected inspection photons, which reflected inspection photons are inspection pulse photons (reflected light) reflected back towards the scanning device (or laser reflection) from an illuminated element present within the inspected scene segment (i.e. scene segment element). Characteristics of reflected inspection photons may include photon time of flight (time from emission till detection), instantaneous power (or power signature) at and during return pulse detection, average power across the entire return pulse and photon distribution/signal over the return pulse period. The reflected inspection photons are a function of the inspection photons and the scene elements they are reflected from, and so the received reflected signal is analyzed accordingly. In other words, by comparing characteristics of a photonic inspection pulse with characteristics of a corresponding reflected and detected photonic pulse, a distance and possibly a physical characteristic such as reflected intensity of one or more scene elements present in the inspected scene segment may be estimated. By repeating this process across multiple adjacent scene segments, optionally in some pattern such as raster, Lissajous or other patterns, an entire scene may be scanned in order to produce a map of the scene.
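The raster and Lissajous acquisition patterns mentioned above can be sketched as simple steering-angle generators. This is a minimal illustration only: the function names, the boustrophedon row ordering and the normalized [-1, 1] angle range are assumptions of this sketch, not part of the disclosed embodiments.

```python
import math

def raster_pattern(h_steps, v_steps):
    """Yield (azimuth, elevation) steering angles, normalized to [-1, 1],
    covering the FOV row by row in a boustrophedon (serpentine) order
    so the mirror avoids a flyback between rows."""
    for row in range(v_steps):
        cols = range(h_steps) if row % 2 == 0 else reversed(range(h_steps))
        for col in cols:
            yield (2 * col / (h_steps - 1) - 1, 2 * row / (v_steps - 1) - 1)

def lissajous_pattern(n_samples, fx=3, fy=2):
    """Yield (azimuth, elevation) angles along a Lissajous curve, the
    natural trajectory of two resonant single-axis steering mirrors."""
    for i in range(n_samples):
        t = 2 * math.pi * i / n_samples
        yield (math.sin(fx * t), math.sin(fy * t))
```

Either generator could feed the steering assembly one angular position per inspection pulse; the choice of pattern trades uniform coverage (raster) against mechanically simpler resonant motion (Lissajous).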
[00249] Scanning device 112 may have hierarchical FOV perception capabilities that can be shifted in space and time. These capabilities may enable high performance LiDAR across a very large FOV area by adaptive partitioning into segments of FOVs that are allocated a certain level of quality of service (QoS). It is typically impossible to assign the highest QoS to all segments, therefore an adaptive allocation method will be henceforth described. QoS depends on the signal to noise ratio between the laser pulse transmitted 126 and the laser reflection 128 detected from the target. Different levels of laser power may be applied in different regions of the LiDAR FOV. The levels of power may range from zero up to the maximum power that the laser device is capable of transmitting and/or receiving. QoS has limitations stemming from physical design, eye safety, thermal constraints, cost, form factor and more. Accordingly, scanning device 112 may be limited by one or more of the following system and/or scene features: horizontal and vertical FOV range; data acquisition rate (e.g. frame rate); resolution (e.g. number of pixels in a frame); accuracy (spatial and temporal); range (effective detection distance) and more.
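Since the QoS granted to a FOV segment scales with the signal to noise ratio between the transmitted pulse 126 and the detected reflection 128, that figure of merit can be sketched as follows; the decibel formulation and the names here are assumptions of this illustration, not quantities defined in the disclosure.

```python
import math

def snr_db(signal_power_w, noise_power_w):
    """Signal-to-noise ratio, in dB, between the detected laser
    reflection power and the noise floor power; a segment with a
    higher SNR can be granted a higher quality of service."""
    return 10.0 * math.log10(signal_power_w / noise_power_w)
```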
[00250] According to some embodiments, scanning device 112 may be assembled and fixed on a vehicle in constrained locations which may cause a fixed boresight. For this and additional reasons, scanning device 112 may be "observing" the FOV of the driving scene in a sub-optimal manner. Scanning device 112 may experience obstructing elements in the vehicle assembly as well as sub-optimal location in relation to the vehicle dimensions and aspect ratio and more.
[00251] Typically, laser power allocation affects data frame quality, which is represented by the following parameters: range of target, frame rate and/or FOV and spatial resolution. With regard to range of target: the farther the target within the FOV, the longer the path the laser pulse has to travel and the larger the laser signal loss. A far target will require a higher energy laser pulse than a close target in order to maintain a certain signal to noise ratio (SNR) that is required for optimal detection of the target. The required laser energy may be achieved by modulating the laser pulse transmitted 126, for example: by appropriately controlling the laser light pulse width and the laser light pulse repetition rate. With regard to FOV and spatial resolution: the number of data elements (e.g. 3D or 4D pixels) in a frame combined with the FOV defines the size of the frame. The more data elements in a frame, the more laser energy has to be spent in order to acquire more data about the surroundings of scanning device 112. Doubling the resolution and the FOV, for example, would result in doubling the laser energy spent in order to acquire double the size of the data set. With regard to frame rate: a higher frame rate implies that the laser may be illuminating a certain target within the FOV at a higher rate, and therefore more energy is also spent in this case.
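The energy relationships described above, one pulse per pixel per frame and a range-dependent pulse energy, can be sketched as below. The one-pulse-per-pixel simplification and the 1/R-squared return-loss model for an extended diffuse target are illustrative assumptions, not statements from the disclosure.

```python
def frame_energy_budget(pixels, frame_rate_hz, pulse_energy_j):
    """Laser energy spent per second on a frame, assuming one pulse per
    pixel per frame (a simplification; a real system may fire several
    pulses per pixel). Doubling resolution, FOV coverage (pixel count)
    or frame rate doubles the energy spent."""
    return pixels * frame_rate_hz * pulse_energy_j

def required_pulse_energy(e_ref_j, r_ref_m, r_m):
    """Pulse energy needed to hold, at range r_m, the SNR achieved with
    energy e_ref_j at reference range r_ref_m, assuming 1/R^2 return
    loss from an extended diffuse reflector (an assumed model)."""
    return e_ref_j * (r_m / r_ref_m) ** 2
```

Under these assumptions a target twice as far requires four times the pulse energy, which is why far-range ROIs dominate the laser power budget.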
[00252] Turning to Fig. D1B, depicted is an example bistatic scanning device schematic 150. It is understood that scanning device 162 is substantially similar to scanning device 112. However, scanning device 112 is a monostatic scanning device while scanning device 162 is a bistatic scanning device. Accordingly, steering element 174 is comprised of two steering elements: steering element for PTX 171 and steering element for PRX 173. The rest of the discussion relating to scanning device 112 of Fig. D1A is applicable to scanning device 162 of Fig. D1B.
[00253] Turning to Fig. D1C, depicted is an example scanning device schematic 175 with a plurality of photonic transmitters 122, a plurality of splitter elements 118 and a plurality of detector assemblies 116. All of the transmitters 122, detectors 116 and splitters 118 may have a joint steering element 120. It is understood that scanning device 187 is substantially similar to scanning device 112. However, scanning device 187 is a monostatic scanning device with a plurality of transmitting and receiving elements. The rest of the discussion relating to scanning device 112 of Fig. D1A is applicable to scanning device 187 of Fig. D1C.
[00254] Turning to Fig. D2, depicted is an example scanning system 200 in accordance with some embodiments. Scanning system 200 may include a scene scanning device such as scanning device 204 adapted to inspect regions or segments of a scene using photonic pulses which may be emitted in accordance with dynamically selected parameters. Scanning device 204 may be configured to operate in conjunction with a host device, such as host 228 which may be part of the system 200 or associated with the system 200. Scanning device 204 may be an example embodiment for scanning device 112 of Fig. D1A, scanning device 162 of Fig. D1B and/or scanning device 187 of Fig. D1C and the discussion of those scanning devices is applicable to scanning device 204.
[00255] According to some embodiments, scanning device 204 may include a photonic emitter assembly (PTX) such as PTX 206 to produce pulses of inspection photons. PTX 206 may include a laser or an alternative light source. The light source may be a laser, such as a solid-state laser, a high power laser or otherwise, or an alternative light source such as an LED based light source or otherwise.
[00256] According to some embodiments, the photon pulses may be characterized by one or more controllable pulse parameters such as: pulse duration, pulse angular dispersion, photon wavelength, instantaneous power, photon density at different distances from the emitter, average power, pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase, polarization and more. The inspection photons may be controlled so that they vary in any of these parameters. The photon pulses may vary between each other, and the parameters may change during the same signal. The inspection photon pulses may be pseudo random, chirp sequences and/or may be periodical or fixed and/or a combination of these. The inspection photon pulses may be characterized as: sinusoidal, chirp sequences, step functions, pseudo random signals, linear signals or otherwise. Examples are shown in Fig. D3, which depicts example inspection photon pulse control signals 300 including example laser signal A (302), laser signal B (304) and laser signal C (306), each depicting the control signal enabling a photon pulse and determining the intensity, width and repetition rate of the pulses as well as the pulse sequence. Laser signal A 302, for example, is characterized by increased power pulses; this type of sequence may be applicable to cover targets at increased ranges. Laser signal B 304, for example, is characterized by a chirp pulse position modulation and may be applicable for increased SNR. Laser signal C 306 may be characterized by a combination of chirp pulse position modulation and increased power, applicable for both increased range and increased SNR.
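Trains in the spirit of laser signals A and B might be generated along the following lines. All names are illustrative, and the geometric chirp of the repetition interval is an assumed modulation law, not the disclosed scheme.

```python
def increasing_power_train(n_pulses, base_power, step):
    """Signal-A-style train: constant pulse spacing with linearly
    ramping amplitude, e.g. to reach targets at increased range.
    Returns (time slot, power) pairs."""
    return [(i, base_power + i * step) for i in range(n_pulses)]

def chirped_position_train(n_pulses, t0, dt0, chirp):
    """Signal-B-style train: constant amplitude with a pulse-to-pulse
    interval that shrinks (chirp < 1) or grows (chirp > 1) geometrically,
    i.e. chirp pulse position modulation, which can help disambiguate
    returns and raise SNR. Returns the pulse emission times."""
    times, t, dt = [], t0, dt0
    for _ in range(n_pulses):
        times.append(t)
        t += dt
        dt *= chirp  # geometric chirp of the repetition interval (assumed)
    return times
```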
[00257] Returning to Fig. D2, according to some embodiments the laser of PTX 206 may operate in different laser modes such as modulated continuous wave (CW), pulsed quasi CW (Q-CW) and mode locked, and PTX 206 may include a plurality of laser emitters.
[00258] According to some embodiments, PTX 206 may include additional elements such as a collimator to compensate for divergence effects of the laser emitter and render the beam into an optimal shape suitable for steering, transmission and detection. PTX 206 may also include a thermoelectric cooler for temperature stabilization, as solid-state lasers, for example, may experience performance degradation as temperature increases, so cooling the laser may enable a higher power yield. PTX 206 may also include an optical outlet.
[00259] According to some embodiments, PTX 206 may include one or more PTX state sensors to produce a signal indicating an operational state of PTX 206 which may include information such as PTX power consumption, temperature, laser condition and more.
[00260] According to some embodiments, scanning device 204 may include a photonic reception and detection assembly (PRX) such as PRX 208 to receive reflected photons reflected back from an object or scene element and produce detected scene signal 210. PRX 208 may include a detector such as detector 212. Detector 212 may be configured to detect the reflected photons reflected back from an object or scene element and produce detected scene signal 210.
[00261] According to some embodiments, detected scene signal 210 may include information such as: time of flight, which is indicative of the difference in time between the time a photon was emitted and the time it was detected after reflection from an object, reflected intensity, polarization values and more.
[00262] According to some embodiments, detected scene signal 210 may be represented using point cloud, 3D signal or vector, 4D signal or vector (adding time to the other three dimensions) and more.
[00263] According to some embodiments, detector 212 may have one or more updatable detector parameters controlled by detector parameters control 214 such as: scanning direction, frame rate, ambient light effects, mechanical static and dynamic impairments, thermal effects, wear and tear, area of interest, resolution, sensitivity and more. Detector parameters control 214 may be utilized for dynamic operation of detector 212; for example, scanning direction may be utilized for dynamic allocation of detector power/resolution/sensitivity/resources. Scanning direction may be the expected direction of the associated inspection photons; frame rate may be the laser's or PRX's frame rate; ambient light effects may include detected noise photons or expected inspection photons (before they are reflected); mechanical impairments may be correlated to deviations of other elements of the system that need to be compensated for; knowledge of thermal effects may be utilized to improve the signal to noise ratio; wear and tear refers to wear and tear of detector 212 and/or of other blocks of the system that detector 212 can compensate for; and area of interest may be an area of the scanned scene that is more important, and more. Ambient conditions such as fog/rain/smoke impact the signal to noise ratio (lifting the noise floor) and can be used as a parameter that defines the operating conditions of detector 212 and also of the laser of PTX 206. Another critical element is the gating of detector 212 in a monostatic design example embodiment, thus avoiding blinding of detector 212 by the initial transmission of the laser pulse, or by any other example TX/RX co-channel interference.
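The detector gating mentioned for the monostatic case can be sketched as a blanking window around the transmission event; the margin value and all names are illustrative assumptions of this sketch.

```python
def detector_gate_open(t_since_fire_s, pulse_width_s,
                       blanking_margin_s=10e-9):
    """Return True once the detector may safely observe the scene.
    The detector is blanked while the outgoing laser pulse (plus a
    short, assumed safety margin against TX/RX co-channel interference)
    could blind it through the shared monostatic optical path."""
    return t_since_fire_s > pulse_width_s + blanking_margin_s
```

A side effect of such gating is a minimum detectable range: returns arriving before the gate opens, i.e. from very close targets, are discarded along with the transmit leakage.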
[00264] According to some embodiments, detector 212 may include an array of detectors such as an array of avalanche photo diodes (APD), single photon detection avalanche diodes (SPADs), or a single detecting element that measures the time of flight from a laser pulse transmission event to the reception event and the intensity of the received photons. The reception event is the result of the laser pulse being reflected from a target in the FOV present at the scanned angular position of the laser of PTX 206. The time of flight is a timestamp value that represents the distance from the reflecting target, object or scene element to scanning device 204. Time of flight values may be realized by photon detection and counting methods such as: TCSPC (time correlated single photon counting), analog methods for photon detection such as signal integration and qualification (via analog to digital converters or plain comparators) or otherwise.
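The conversion from a time of flight timestamp to target distance follows directly from the round-trip geometry; a minimal sketch:

```python
# Speed of light in vacuum, m/s.
C = 299_792_458.0

def tof_to_distance(tof_s):
    """Convert a round-trip time of flight (seconds) into the one-way
    distance (meters) of the reflecting target, object or scene element:
    the pulse travels out and back, hence the division by two."""
    return C * tof_s / 2.0
```

For example, a timestamp of one microsecond corresponds to a target roughly 150 m away, which is why nanosecond-scale timing (e.g. TCSPC) is needed for centimeter-scale range accuracy.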
[00265] According to some embodiments, detector 212 may include a full array of single photon detection avalanche diodes which may be partitioned into one or more pixels that capture a fragment of the FOV. A pixel may represent the basic data element that builds up the captured FOV in the 3-dimensional space (e.g. the basic element of a point cloud representation), including a spatial position and the reflected intensity value.
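The basic data element described above, a spatial position plus a reflected intensity value, optionally extended with time for a 4D signal, might be modeled as follows; the field names are illustrative assumptions, not identifiers from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class CloudPoint:
    """Basic data element building up the captured FOV: a position in
    3-dimensional space plus the reflected intensity value; the
    optional timestamp supplies the 4th dimension of a 4D signal."""
    x: float
    y: float
    z: float
    intensity: float
    timestamp: float = 0.0
```

A frame is then simply a list of such points, and a point cloud representation of the scene is the accumulation of frames over a scanning pattern.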
[00266] According to some optional embodiments, detector 212 may include: (a) a two dimensional array sized to capture one or more pixels out of the FOV, where a pixel window may contain a fraction of a pixel, one or more pixels or otherwise; (b) a two dimensional array that captures multiple rows or columns of a FOV, up to an entire FOV; (c) a single dimensional array and/or (d) a single SPAD element or otherwise.
[00267] According to some embodiments, PRX 208 may also include an optical inlet which may be a single physical path with a single lens or no lens at all.
[00268] According to some embodiments, PRX 208 may include one or more PRX state sensors to produce a signal indicating an operational state of PRX 208, for example power information or temperature information, detector state and more.
[00269] According to some embodiments, scanning device 204 may be a bistatic scanning device where PTX 206 and PRX 208 have separate optical paths, or scanning device 204 may be a monostatic scanning device where PTX 206 and PRX 208 have a joint optical path.
[00270] According to some embodiments, scanning device 204 may include a photonic steering assembly (PSY), such as PSY 216, to direct pulses of inspection photons from PTX 206 in a direction of an inspected scene and to steer reflection photons from the scene back to PRX 208. PSY 216 may also be in charge of positioning the singular scanned pixel window onto/in the direction of detector 212.
[00271] According to some embodiments, PSY 216 may be a joint PSY, and accordingly, may be joint between PTX 206 and PRX 208 which may be a preferred embodiment for a monostatic scanning system.
[00272] According to some embodiments, PSY 216 may include a plurality of steering assemblies or may have several parts, one associated with PTX 206 and another associated with PRX 208.
[00273] According to some embodiments PSY 216 may be a dynamic steering assembly and may be controllable by steering parameters control 218. Example steering parameters may include: scanning method that defines the acquisition pattern and sample size of the scene, power modulation that defines the range accuracy of the acquired scene, correction of axis impairments based on collected feedback.
[00274] According to some embodiments, PSY 216 may include: (a) a single dual-axis MEMS mirror; (b) a dual single-axis MEMS mirror; (c) a mirror array where multiple mirrors are synchronized in unison, acting as a single large mirror; (d) a split mirror array with separate transmission and reception sections and/or (e) a combination of these and more.
[00275] According to some embodiments, if PSY 216 includes a MEMS split array, the beam splitter may be integrated with the laser beam steering. According to further embodiments, part of the array may be used for the transmission path and the second part of the array may be used for the reception path. The transmission mirrors may be synchronized, and the reception mirrors may be synchronized separately from the transmission mirrors. The transmission mirror and reception mirror sub arrays maintain an angular shift between themselves in order to steer the beam into separate ports, essentially integrating a circulator module.
[00276] According to some embodiments, PSY 216 may include one or more PSY state sensors which may at least partially be used for producing a signal indicating an operational state of PSY 216, such as PSY feedback 230, which may include power information or temperature information, reflector state, reflector actual axis positioning, reflector mechanical state, operational health state and more.
[00277] According to some embodiments, PSY 216 may also include a circulator module/beam splitter, although it is understood that the splitter may also be part of PRX 208 instead. The beam splitter may be configured to separate the transmission path of PTX 206 from the reception path of PRX 208. According to some embodiments, the beam splitter may either be integrated in the steering assembly (for example if a splitter array is utilized) or may be redundant or not needed, and accordingly the scanning device may not include a beam splitter.
[00278] According to some embodiments, the beam splitter of PSY 216 may be a polarized beam splitter (PBS), a splitter PBS (polarizing beam splitter) integrating a mirror and a quarter wave plate, a circulator beam splitter and/or a slit based reflector or the like.
[00279] According to some embodiments, PSY 216 may include one or more reflective surfaces, each of which may be associated with an electrically controllable electromechanical actuator. The reflective surface(s) may be made from polished gold, aluminum, silicon, silver or otherwise. The electromechanical actuator(s) may be selected from actuators such as stepper motors, direct current motors, galvanometric actuators, electrostatic, magnetic or piezo elements or thermal based actuators. PSY 216 may include or be otherwise associated with one or more microelectromechanical systems (MEMS) mirror assemblies. A photonic steering assembly according to refractive embodiments may include one or more refractive materials whose index of refraction may be electrically modulated, either by inducing an electric field around the material or by applying electromechanical vibrations to the material.
[00280] According to yet further embodiments, PSY 216 may include a beam splitter to help separate the transmission path from the reception path. Using the same photonic steering assembly may provide tight synchronization between the direction in which a photonic pulse/beam is steered and emitted by the photonic emitter assembly and the direction of a concurrent FOV of one or more optical sensors of the photonic detection assembly. A shared photonic steering assembly configuration may allow a photonic detector assembly of a given device to focus upon, and almost exclusively collect/receive reflected photons from, substantially the same scene segment being concurrently illuminated by the given device's photonic emitter assembly. Accordingly, as the photonic steering assembly moves, so does the photonic pulse illumination angle along with the FOV angle.
[00281] According to some embodiments, scanning device 204 may include a controller to control scanning device 204, such as controller 220. Controller 220 may receive scene signal 210 from detector 212 and may control PTX 206, PSY 216 and PRX 208 including detector 212, based on information stored in the controller memory 222 as well as the received scene signal 210, including accumulated information from a plurality of scene signals 210 received over time.
[00282] According to some embodiments, SAL 226 may receive a PTX feedback 229 indicating PTX associated information such as power consumption, temperature, laser operational status, actual emitted signal and more.
[00283] According to some embodiments, SAL 226 may receive a PRX feedback 231 indicating PRX associated information such as power consumption, temperature, detector state feedback, detector actual state, PRX operational status and more.
[00284] According to some embodiments, SAL 226 may receive a PSY feedback 230 indicating PSY associated information such as power consumption, temperature, instantaneous position of PSY 216, instantaneous scanning speed of PSY 216, instantaneous scanning frequency of PSY 216, mechanical overshoot of PSY 216, PSY operational status and more.
[00285] According to some embodiments, SAL 226 may receive host information and a feedback signal such as host feedback 232 which may include information received from the host. Host feedback may include information from other sensors in the system such as other LiDARs, cameras, RF radar, an acoustic proximity system and more.
[00286] According to some embodiments, controller 220 may process scene signal 210, optionally with additional information and signals, and produce a vision output such as vision signal 234 which may be relayed/transmitted to an associated host device. Controller 220 may receive detected scene signal 210 from detector 212; optionally, scene signal 210 may include time of flight values and intensity values of the received photons. Controller 220 may build up a point cloud or 3D or 2D representation of the FOV by utilizing digital signal processing, image processing and computer vision techniques.
[00287] According to some embodiments, controller 220 may include situational assessment logic or circuitry such as situational assessment logic (SAL) 226. SAL 226 may receive detected scene signal 210 from detector 212 as well as information from additional blocks/elements either internal or external to scanning device 204 such as PTX feedback 229, PSY feedback 230, PRX feedback 231, host feedback 232 and more.
[00288] According to some embodiments, scene signal 210 may be assessed and applied, with or without additional feedback signals such as PSY feedback 230, PTX feedback 229, PRX feedback 231 and host feedback 232 and information stored in memory 222, to a weighted mean of local and global cost functions that determine a scanning plan such as work plan signal 234 for scanning device 204 (such as: which pixels in the FOV are scanned, at which laser parameters budget and at which detector parameters budget). Controls such as PTX control signal 251, steering parameters control 218, PRX control 252 and/or detector parameters control 214 may be determined/updated based on work plan 234. Accordingly, controller 220 may be a closed loop dynamic controller that receives system feedback and updates the system's operation based on that feedback.
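The weighted combination of local and global cost functions that yields a work plan could be sketched as follows; the equal-weight mean and all names are illustrative assumptions, not the disclosed scoring rule.

```python
def work_plan_score(local_costs, global_cost, weights=(0.5, 0.5)):
    """Weighted mean of per-segment (local) cost values and a global
    cost value. Higher-level controller logic would compare candidate
    scanning plans by this score and pick the one that optimizes it;
    the averaging rule here is an assumption of this sketch."""
    w_local, w_global = weights
    mean_local = sum(local_costs) / len(local_costs)
    return w_local * mean_local + w_global * global_cost
```

Feedback signals (e.g. PSY overshoot or PTX temperature) would enter by reshaping the cost terms between frames, closing the loop described above.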
[00289] According to some embodiments of the present invention, there may be provided a scanning device for scanning one or more segments of a scene, also referred to as scene segments. The device may include one or more photonic emitter assemblies (PTX), one or more photonic reception and detection assemblies (PRX), a photonic steering assembly (PSY) and a situationally aware controller adapted to synchronize operation of the PTX, PRX and PSY, such that the device may dynamically perform active scanning of one or more scene segments, or regions, of a scene during a scanning frame. Active scanning, according to embodiments, may include transmission of one or more photonic inspection pulses towards and across a scene segment, and when a scene element present within the scene segment is hit by an inspection pulse, measuring a roundtrip time-of-flight for the pulse to hit the element and its reflections to return, in order to estimate a distance and a (relative) 3-dimensional coordinate of the point hit by the inspection pulse on the scene element. By collecting coordinates for a set of points on an element, using a set of inspection pulses, a 3-dimensional point cloud may be generated and used to detect, register and possibly identify the scene element.
[00290] The controller may be a situationally aware controller and may dynamically adjust the operational mode and operational parameters of the PTX, PRX and/or PSY based on one or more detected and/or otherwise known scene related situational parameters. According to some embodiments, the controller may generate and/or adjust a work plan such as scanning plan 234 for scanning portions of a scene, as part of a scanning frame intended to scan/cover one or more segments of the scene, based on an understanding of situational parameters such as scene elements present within the one or more scene segments. Other situational parameters which may be factored in generating the scanning plan may include a location and/or a trajectory of a host platform carrying a device according to embodiments. Yet further situational parameters which may be factored in generating the scanning plan may include the topography, including road slope, pitch and curvature, surrounding a host platform carrying a device according to embodiments.
[00291] Scanning plan 234 according to embodiments may include: (a) a designation of scene segments within the scene to be actively scanned as part of a scanning frame, (b) an inspection pulse set scheme (PSS) which may define a pulse distribution pattern and/or individual pulse characteristics of a set of inspection pulses used to scan at least one of the scene segments, (c) a detection scheme which may define a detector sensitivity or responsivity pattern and (d) a steering scheme which may define a steering direction, frequency, designate idle elements within a steering array and more. In other words, scanning plan 234 may at least partially affect/determine PTX control signal 251, steering parameters control 218, PRX control 252 and/or detector parameters control 214 so that a scanning frame is actively scanned based on scene analysis.
[00292] According to some embodiments, scene related situational parameters factored in formulating work plan 234 may come from: (a) localized output of a shared/pre-stored background model (background, topography, road, landmarks, etc.); (b) localization using GPS, terrestrial radio beacons, INS or visual landmark detection; (c) accelerometer, gravity meter, etc.; (d) an acquired background model (background/topology detection using camera and/or active (Lidar) scanning); (e) active (Lidar) foreground scanning; (f) camera based feature/element detection/registration; (g) host platform sensors such as camera and radar outputs; (h) host ego-motion information such as wheel steering position, speed, acceleration, braking, headlights, turning lights, GPS and more; (i) other LIDAR components in the system and (j) ROI and/or RONI models.
[00293] According to some embodiments, factors in formulating/generating/adjusting work plan 234 may include: (a) Host location and/or trajectory; (b) Terrain (such as road features and delimiters, static features such as trees, buildings, bridges signs and landmarks and more), (c) Background Elements (assumed and detected), and (d) Foreground Elements' (Detected) Location and Trajectory and more.
[00294] According to some embodiments, work plan 234 may determine or cause the FOV to be modified/determined. Scanning device 204 can change its reference or nominal FOV observation by modifying, for example, the boresight reference point. A solid-state Lidar, if incorporated in scanning device 204, may control the boresight reference point in space while maintaining the same FOV, a feature not feasible with fixed FOV Lidar devices.
[00295] According to some embodiments, SAL 226 may determine scanning plan 234 at least partially by determining/detecting/receiving regions of interest within the FOV and regions of non-interest within the FOV. Regions of interest may be sections/pixels/elements within the FOV that are important to monitor/detect; for example, areas which may be marked as regions of interest may include crosswalks, moving elements, people, nearby vehicles and more. Regions of non-interest may be static (non-moving) far-away buildings, a skyline and more.
[00296] According to some embodiments, scanning plan 234 may control one or more control signals including: PTX control 251, PSY control 218, PRX control 252 and/or detector control 214. The control signals may be utilized for: (a) laser power scheduling, to allocate laser power for each element or tri-dimensional pixel of a frame that is in the process of acquisition or scheduled for acquisition; (b) laser pulse modulation characteristics such as duration, rate, peak and average power, spot shape and more; (c) detector resource allocation, for example to activate detector elements where a ROI is expected and disable detector elements where regions of non-interest are expected in order to reduce noise, detector sensitivity such as high sensitivity for long range detection where the reflected power is low, and detector resolution, as long range detection with a weak reflected signal may result in averaging of multiple detector elements otherwise serving as separate higher resolution pixels; and (d) updating steering parameters to scan an active FOV.
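Laser power scheduling per pixel, item (a) above, might be sketched as a proportional split of a frame's power budget across ROI weights; the proportional rule, the weight scale and all names are illustrative assumptions of this sketch.

```python
def allocate_power(roi_map, total_budget):
    """Split a frame's laser power budget across pixels in proportion
    to their region-of-interest weight. Pixels in regions of
    non-interest (weight 0) receive no power, and their detector
    elements could likewise be disabled to reduce noise."""
    total_weight = sum(roi_map.values())
    return {px: total_budget * w / total_weight
            for px, w in roi_map.items()}
```

For instance, a frame like Fig. D4B, a single high-priority ROI amid zero-weight pixels, would concentrate the whole budget into the ROI, yielding the increased range described there.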
[00297] Turning to Figs. D4A-D4F, shown are schematics depicting scanning plans which may be utilized to control pulse parameters and/or detector parameters and/or steering parameters, using an identical key 402 for all of these figures. Fig. D4A depicts a first frame 404 wherein all of the pixels are of the same importance/priority, having a default power allocated to them. This may, for example, be utilized in a start-up phase or periodically interleaved in a scanning plan to monitor the whole frame for unexpected/new elements. According to the scanning plan depicted in 404, the pulse parameters may be configured to have a constant amplitude.
[00298] Turning to Fig. D4B, depicted is a second frame 406 which may be a partial pixel frame: a section of the frame is configured to have a high power while the rest of the frame may be configured to have no power. The pixels having maximal power may be a ROI. The resulting frame may have a low number of pixels, enabling a high range in the ROI due to concentration of laser power. According to the scanning plan depicted in 406, the pulse parameters may, for example, be configured to have a high amplitude only in the ROI and no power steered in the RONI. A steering device may be utilized to deflect the signal only in the ROI and/or a detector may be configured to receive a signal only where the ROI is expected to be received, to avoid any noise for the pixels that have no power.
[00299] Turning to Fig. D4C, depicted is a third frame 408 which may be characterized in that all of the pixels have a power allocation according to their ROI designation. Thus, the most interesting/important regions may have the highest power, and so on.
[00300] Turning to Fig. D4D, depicted is a fourth frame 410 which is characterized by a range of differently powered pixels. The ROI in the center is allocated maximal power while the lower-interest region has a default power at a lower spatial resolution, which is a different way of receiving information for a RONI or region of lower interest. According to the scanning plan depicted in 410, the pulse parameters may be configured to have a high amplitude in the ROI, while a lower amplitude with a lower frequency may be utilized for the other pixels. Furthermore, the detector may be turned off in the pixels that are turned off, and steering parameters may be modified, for example, for rows that do not have a ROI in them.
[00301] Turning to Fig. D4E, shown is a fifth frame 412 which is characterized as having variable resolution and variable power/range. The ROI, in this example, has high resolution and high power, with additional pixels at default power and low-power pixels at a lower spatial resolution.
[00302] Turning to Fig. D4F, shown is a sixth frame 414 which includes a compact vehicle and a bus (see silhouettes). The edges of the vehicle and bus may be tracked with high power while the central mass of the vehicle and bus may be allocated lesser power (or no power). Such power allocation enables concentrating more power on the edges and less on the center, which has less importance.
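The edge-versus-center allocation of Fig. D4F can be sketched as a boundary test over a tracked-object mask: object pixels touching non-object neighbours receive high power, interior pixels receive little or none. The mask representation, power values and helper name below are illustrative assumptions, not the disclosed method.

```python
# Illustrative sketch of the Fig. D4F allocation: boundary pixels of a
# tracked object (vehicle/bus silhouette) get high power, interior pixels
# get none. All values and names here are assumptions for illustration.

def allocate_edge_power(mask, edge_power=1.0, interior_power=0.0):
    """Return a power map giving edge_power to object pixels that touch a
    non-object neighbour (4-connectivity, frame borders count as outside)
    and interior_power to pixels inside the object; background gets 0."""
    rows, cols = len(mask), len(mask[0])
    power = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            if not mask[r][c]:
                continue
            neighbours = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
            on_edge = any(
                not (0 <= nr < rows and 0 <= nc < cols and mask[nr][nc])
                for nr, nc in neighbours
            )
            power[r][c] = edge_power if on_edge else interior_power
    return power

# A 4x4 silhouette filling the grid: only its border pixels are edges.
mask = [[1] * 4 for _ in range(4)]
pm = allocate_edge_power(mask)
```

With this sketch, the central mass of the silhouette draws no laser power, freeing budget for the edges as the paragraph above describes.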
[00303] Turning back to Fig. D3, and as shown in the previous frame examples, scanning plan 234 may dynamically allocate laser, detector and steering resources towards regions of interest/non-interest based on several strategies.
[00304] In a first example, if no power is scheduled for one or more frames the pixel may be skipped (by not allocating laser power, by disabling reflection toward the scene and/or by disabling the detector or otherwise). This example may be utilized for a center pixel in a tracked vehicle that would be considered much less interesting than the edge pixels of the same vehicle (see also discussion of Fig. D4F).
[00305] In a second example, power may be scheduled (by allocating laser power, by enabling reflection towards and from the pixel and by determining an efficient detector accuracy) for predicted locations of vertical edges of a building, for the predicted location of a vehicle in motion that quickly changes lanes, or for the edges of the FOV that coincide with the host vehicle turning in a certain direction.
[00306] According to some embodiments, laser power may be scheduled periodically over one or more time-related sequences (full frames, partial frames) in order to acquire non-deterministic data. Periodicity may be determined by prediction estimation quality factors. For example, a region may be considered noisy, having a lot of movement, and accordingly may be checked (i.e., may be scanned, or may be scanned with more accuracy) more frequently than an area designated as static background.
[00307] Turning to Fig. D5A, shown is a schematic of a scene 500 to be scanned by a scanning device traveling in the direction of arrow 502. The regions of interest of the scene are designated as either being a RONI or a ROI having a level of interest between low and high (see key 504). As is shown, the road delimiters and the buildings' vertical planes, in the example, would be designated as regions of high interest (R2), the pedestrian and a moving car a bit farther ahead are designated as regions of medium interest (R1) and the rest of the scene is generally considered a region of low interest (R0); the skyline is designated as a RONI (R3). As shown in Fig. D5B, chart 550 depicts the power or resource allocation for scene 500 as determined by an associated controller which includes an SAL. Chart 575 depicts interleaving of ROIs in power allocation over time, so that a signal intermittently allocates the most power to the region of highest interest R2, then to the region of medium interest R1, and the lowest allocation to the low-interest region R0. Some power is also allocated to RONI R3 in order to periodically confirm that it is still a RONI.
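The time-interleaved allocation of chart 575 can be sketched as a weighted revisit schedule: most scan slots revisit the highest-interest region, fewer go to lower-interest regions, and an occasional slot rechecks the RONI. The weights below are illustrative assumptions; the disclosure does not specify them.

```python
# Sketch of the chart 575 interleaving: expand per-region revisit weights
# into a repeating scan order. Weights are illustrative assumptions.

from itertools import cycle

SCHEDULE_WEIGHTS = {"R2": 4, "R1": 2, "R0": 1, "R3": 1}

def build_interleaved_schedule(weights):
    """Expand revisit weights into one repeating cycle of region scans."""
    order = []
    for region, weight in weights.items():
        order.extend([region] * weight)
    return order

schedule = build_interleaved_schedule(SCHEDULE_WEIGHTS)
scanner = cycle(schedule)                     # repeats indefinitely
first_cycle = [next(scanner) for _ in range(len(schedule))]
```

One cycle of this sketch scans R2 four times for every single check of the RONI R3, matching the intermittent confirmation described above.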
[00308] Turning back to Figure D2, SAL 226 may receive information from in-band and/or out-of-band sources. In-band sources are internal sources of scanning device 204 and may include vision signal 234, detected scene signal 210, PTX feedback 229, PSY feedback 230, and/or memory 222 and more. Analysis of these in-band sources may yield yet further in-band information. In-band information may include a road plane and road delimiters, curbs, pedestrians, vehicles, a skyline, vertical planes such as building facets, tree canopies and more, and intersections such as road intersections which may be considered a virtual plane. Additional in-band information may include a laser power budget such as eye safety limitations, thermal limitations, reliability limitations and more, which may be stored in memory 222. Additional in-band information may include electrical operational parameters such as peak currents and peak voltages, and calibration data such as a detected and stored correction so that scanning device 204 is calibrated. Calibration data may be static, meaning tested and stored in an initiation or production process, or may be dynamic to compensate for ongoing degradation or changes in the system such as operating temperature, operating voltage, etc. In-band information may also include an acquired background model, acquired ROI model and/or acquired RONI model, each of which may be acquired over time by scanning device 204. For example, if the scanning device operates repeatedly in a certain location/area, the system may accumulate scene information history via system learning models, ROI and RONI models and background models, and store them locally.
[00309] According to some embodiments, out-of-band sources are sources external to scanning device 204. The out-of-band information may be received via host feedback 232. The out-of-band sources, however, may be directly from host 228 or may be received by host 228 and relayed to scanning device 204. Out-of-band type information may include Inertial Measurement Unit (IMU) data, ego-motion, braking or acceleration of the associated host, host wheel or wing position, GPS information, directional audio information (police siren, ambulance siren, car crash, people shouting, horns, tires screeching, etc.), a background shared model and more. A background shared model may be a source of background local information such as a web map and more.
[00310] According to some embodiments, out-of-band sources which are sources in host 228, or associated with host 228, or detected by host 228, may include: a shared or pre-stored background model, an accelerometer, a gravity meter and additional sensors, an acquired background model, cameras and/or camera-based feature/element detection, and landmark lists related to global or local positioning (such as GPS, Wireless, Wi-Fi, Bluetooth, vehicle-to-vehicle infrastructure and more) which may be accessed via a crowd sharing model and may be downloaded from a shared storage such as a cloud server.
[00311] According to some embodiments, laser power may be controlled so that a maximal signal power is not exceeded and a maximal detection sensitivity is also not exceeded. With regard to the maximal signal power not being exceeded, the power for a transmitted laser signal is distributed according to prioritization, taking into consideration an expected model as shown with regard to chart 575, for example. However, when considering return signals it is understood that a reflected signal is scene dependent; depending on the reflectivity of the scene elements, noise and ambient conditions, as well as the distance of elements, a maximal threshold for a reflected signal may unintentionally be exceeded. To elaborate, if a series of signals is emitted and subsequently reflected signals are reflected back to the scanning device and ultimately to the detector, then the reflected signal may exceed a maximal threshold, since noise from external light sources may be added to the signal and a plurality of reflected signals may accumulate due to the differences in time until a return signal is returned, based on the distance of the reflecting element. A method for avoiding exceeding a maximal reflected signal value by controlling the transmitted signal is shown in Fig. D6, in accordance with some embodiments. Flow chart 600 of Fig. D6 shows an initiation stage (602) initiating a scanning sequence in which the laser power is set to the minimal power setting (above zero) and the reflected signal is expected to be received at a default value (604). The signal is then transmitted with the predetermined signal power (606), which at this point is still the minimal power. Once a reflected signal is received, the power is tested/checked (608); if the received signal has not reached its maximal power threshold (610) and the transmitted signal has not reached its maximal power threshold (614), then the transmitted power level is increased (616).
Once the maximal received signal threshold is reached, the scene may be detected and/or regular operation of the scanning device may proceed (620). It is understood that the monitoring of the received signal as described in flow chart 600 may be carried out in parallel to the regular operation of the scanning device and/or intermittently or periodically.
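The ramp-up loop of flow chart 600 can be sketched as follows: start at minimal transmit power and raise it until either the measured return or the transmit power reaches its maximal threshold. The thresholds, step size and the toy channel model (linear reflectivity plus ambient noise) are illustrative assumptions, not values from the disclosure.

```python
# Sketch of flow chart 600: ramp transmit power from the minimum, testing
# the return each step, stopping at either threshold. All numeric values
# and the channel model are illustrative assumptions.

def ramp_transmit_power(measure_return, tx_min=1.0, tx_max=100.0,
                        rx_max=50.0, step=1.0):
    """Return the highest safe transmit power without letting the measured
    return exceed rx_max or the transmit power exceed tx_max."""
    tx = tx_min                      # 602/604: start at minimal power
    while True:
        rx = measure_return(tx)      # 606/608: transmit and test return
        if rx >= rx_max:             # 610: received threshold reached
            return tx
        if tx >= tx_max:             # 614: transmit threshold reached
            return tx
        tx += step                   # 616: safe to increase power

# A toy channel: return power is 40% of transmit power plus ambient noise.
channel = lambda tx: 0.4 * tx + 2.0
safe_tx = ramp_transmit_power(channel)
```

With this weakly reflecting toy channel the loop stops at the transmit-power ceiling; with a strongly reflecting channel it would stop earlier, once the return hits its own ceiling, as the flow chart intends.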
[00312] According to some embodiments, SAL 232 may also take into account accumulative temperature information and reduce QOS (by limiting, for example, the transmitted signal, detector power and more). Accordingly, a work plan may be derived in accordance with an adjustable QOS. While peak current and/or voltage limitations may be more lenient, since typically even if a peak current/voltage event occurs it may immediately be relieved/stopped, exceeding a peak temperature is a harder problem to solve. The temperature of scanning device 204 may be monitored in each block and/or by one or more dedicated sensors. It is understood that typically, once a maximal threshold is exceeded, it may be very difficult to cause scanning device 204 to cool down. Similarly, when extreme weather conditions occur (extreme heat and/or extreme cold, for example) it may be preferable to reduce QOS but maintain some level of detected scene output, rather than having no output at all or causing scanning device 204 irreparable temperature harm. SAL 232 may be configured to prioritize temperature and weather conditions accordingly.
[00313] According to some embodiments, SAL 232 may also prioritize information based on whether it is in-band or out-of-band information. For example, if a host signals to SAL 232 that a turn is expected, that may cause work plan signal 234 to be updated regardless of the scanning process, since a new FOV is expected. Accordingly, an out-of-band signal/information may selectively interrupt a SAL 232 process for calculating/analyzing work plan signal 234. Optionally, the host feedback may include an override command structure including a flag indicating that the host input is to override the internal feedbacks and signals. The override structure may contain a direct designation to scan certain portion(s) of the scene at a certain power that translates into the LiDAR range, and more.
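The override behavior just described can be sketched as a small arbitration step: host feedback carrying an override flag preempts the internally derived work plan. The dictionary structure and the "override"/"plan" field names are hypothetical, introduced only for this illustration.

```python
# Hypothetical sketch of the out-of-band override: host feedback with an
# override flag replaces the SAL's internally computed work plan. The dict
# shape and field names are assumptions, not from the disclosure.

def resolve_work_plan(internal_plan, host_feedback):
    """Return the host-designated plan when the override flag is set,
    otherwise keep the internally computed work plan."""
    if host_feedback.get("override"):
        return host_feedback["plan"]
    return internal_plan

internal = {"fov": "current", "power": "roi-weighted"}
host = {"override": True,
        "plan": {"fov": "shift-left", "power": "max-in-turn-direction"}}
chosen = resolve_work_plan(internal, host)
```

Absent the flag, the sketch leaves the internal plan untouched, matching the selective-interrupt behavior described above.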
[00314] Turning to Fig. D7A, shown is an example scene according to some embodiments, such as scene 700, which may include one or more background elements. Background elements may be regions of interest or regions of non-interest. A background model may be utilized so that SAL 226 may at least partially rely on the background model in order to analyze a scene based on a-priori information and produce a work plan signal 234. In example scene 700, a scanning device may be traveling in the direction shown by arrow 702. Buildings 704 and 706 and traffic light 708 may be part of a background model stored in an associated memory or received from a host. An associated SAL may utilize this background information so that the scanning device does not need to receive a signal to detect building 704, but rather only needs to confirm the existence of the expected building. Accordingly, fewer resources need to be allocated to building 704 and it may be scanned periodically as a region of low interest. Similarly, traffic light 708 may also be part of a background model, and so does not need to be detected but rather confirmed. However, since it may be considered very important for a scanning device to detect the status (red, green, etc.) and precise location of the traffic light based on the background model, traffic light 708 may be designated as a region of high interest. A traffic light might also be a region of high interest for sensor information fusion, for example complementing an accurate position from a LiDAR with color information detection from an RGB camera.
[00315] According to some embodiments, elements of background such as building 712, may not be included in a background model and a scanning system may utilize system learning to update a background model.
[00316] Turning to Fig. D7B, shown is a flow chart 750 in accordance with a system learning method for utilizing and updating a background model in accordance with some embodiments. While a frame is being detected at time t, a localization or background model is retrieved from storage (752); the storage may be local, a shared remote storage, or a local copy of a shared remote storage. The background model is verified, confirming that the background is relevant to the expected upcoming frame at t+1 (754). If the background model is inaccurate/irrelevant, then a new background model may be estimated (756). For example, step 756 in the context of Fig. D7A may include verifying that buildings 704 and 706 exist. As discussed with regard to Fig. D7A, building 712 did not exist in the background model, in which case the additional background information may be added to the background model (758). The next step (based on the updated model or a correct model) is utilizing the background model for scanning the frame at t+1 (762). If the model is confirmed by the captured scene elements as correct at t+1, it may be relayed to a shared background model (764 and 766), after which a scanning device may continue to a next frame (768) (such as t+2). Some redundancy or rechecking is described since a background model may require confirmation and validation before actually updating the model.
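The verify-then-learn loop of flow chart 750 can be sketched with a set-based model: verify that every expected element is observed, merge in newly observed elements such as building 712, and report whether the stored model was confirmed before relaying it. The set representation and names are assumptions for illustration.

```python
# Sketch of the flow chart 750 learning step: verify the stored background
# model against an observed frame (754), add newly observed elements (758),
# and report confirmation (764). Set-based model is an assumption.

def update_background_model(stored_model, observed_elements):
    """Return (updated_model, confirmed): confirmed is True when every
    stored background element was observed in the captured frame."""
    confirmed = stored_model <= observed_elements   # all expected seen?
    updated = stored_model | observed_elements      # learn new elements
    return updated, confirmed

stored = {"building_704", "building_706", "traffic_light_708"}
observed = stored | {"building_712"}                # 712 is newly observed
model, ok = update_background_model(stored, observed)
```

Only a confirmed model would be relayed to the shared background storage in the flow chart; an unconfirmed one would instead trigger re-estimation (756).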
[00317] Turning to Fig. D8, shown are two identical scenes 810 and 820. Scene 810 includes a vehicle 812 with a scanning device 814. The vehicle is traveling downhill in the direction of a truck 816 and a second vehicle 818. The FOV of scanning device 814, shown as FOV 815, has minimal and maximal elevation points that neither truck 816 nor vehicle 818 fall within. Accordingly, scanning device 814 cannot detect truck 816 or vehicle 818 and is only expected to do so when it gets substantially closer to them. Scene 820 is substantially similar; however, in scene 820 scanning device 814 has a dynamic FOV, and has updated FOV 819 with minimal and maximal FOV elevations based on the detected hill slope/incline vehicle 812 is driving on (acquired/detected/designated by a work plan signal). Accordingly, both vehicle 818 and truck 816 are detected by scanning device 814 in scene 820. Accordingly, an SAL work plan may update a dynamic FOV. More examples are discussed in the following figures.
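The dynamic-FOV adjustment of Fig. D8 can be sketched as tilting the minimal and maximal elevation bounds of the active FOV by the detected road slope, so that downhill targets fall inside the adjusted bounds. All angles and numeric values below are illustrative assumptions expressed in degrees.

```python
# Sketch of the Fig. D8 dynamic FOV: tilt the elevation bounds of the
# active FOV by the detected slope (negative = downhill) so targets like
# truck 816 become visible. All numbers are illustrative assumptions.

def adjust_fov_elevation(fov_min, fov_max, road_slope_deg):
    """Shift both elevation bounds of the active FOV by the road slope."""
    return fov_min + road_slope_deg, fov_max + road_slope_deg

def target_in_fov(target_elevation_deg, fov):
    fov_min, fov_max = fov
    return fov_min <= target_elevation_deg <= fov_max

nominal_fov = (-10.0, 10.0)    # scene 810: the fixed FOV misses the truck
truck_elevation = -15.0        # the truck sits below the nominal FOV
tilted_fov = adjust_fov_elevation(*nominal_fov, road_slope_deg=-12.0)
```

In the sketch the nominal FOV excludes the truck while the slope-tilted FOV includes it, mirroring scenes 810 and 820.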
[00318] Turning to Fig. D9A shown is a FOV ratio 900 including maximal FOV 902 and an active FOV 904 within the maximal FOV 902, the active FOV selected by a SAL based on a work plan signal. Fig. D9B includes an example FOV 910 depicting a default FOV having a center boresight 914, an example FOV 920 having a default FOV with a shifted boresight 924 and an example FOV 930 having a shifted boresight and a shifted aspect ratio 934.
[00319] Turning to Fig. D9C, shown are examples of FOV and a transition in the active FOV within maximal FOV 902. In this example, yaw relates to movement about the vertical axis, pitch relates to movement about the lateral axis and roll to movement about the longitudinal axis. FOV 940 shows a transition from a first active FOV 944 to a second active FOV 946 when a host intends to turn left. FOV 950 shows a plurality of active FOVs (954-958) all acquired in parallel in accordance with a multiple boresight targets embodiment. FOV 960 shows a transition from a first active FOV 964 to a second active FOV 966 when a host having 4 wheels causes two left wheels to drive on the sidewalk, causing movement in the roll axis. Rolling examples include: a bend detected by LiDAR background estimation that causes the vehicle to roll sideways across the bend's berm; a host vehicle driving/parking partially on a sidewalk or another element that changes the vehicle's parallelism with respect to the road and the FOV. Furthermore, a static roll may be caused by uneven weight distributed in the vehicle or a malfunction of the damping system.
[00320] According to some embodiments, FOV 970 shows a transition from a first active FOV 974 to a second active FOV 976 when a host is intending to move, or is moving, downhill or into an underground garage, causing movement in the pitch axis. Additional examples where a correction along the pitch axis may be needed include situations where a vehicle is no longer parallel to the road and the vertical FOV is not optimal; speed bumps, a special case where both the altitude and the tilt angle of the LiDAR effective FOV change; or a vehicle nose dive or elevation when the vehicle brakes or accelerates, or when wind pressure at high speed causes the vehicle to change its level position. Yet another example is a vehicle transitioning through short pathways that exhibit a large elevation difference, for example an underground parking garage: when exiting from an underground parking garage, the vehicle's front hood obstructs the driver's FOV from perceiving obstacles at the end of the climb. Updating the active FOV enables overcoming these difficulties. Additional yaw correction examples include when a bend is detected by a background estimation and the active FOV is gradually shifted according to the speed and the bend features, in order to optimize the target FOV and ultimately detect obstacles in the bend's path. Another example is when a change in wheel steering in a certain direction causes the FOV to shift towards that direction. Another example is when turn indicators (such as blinkers) provide a hint that the vehicle is expected to perform a turn in/to a specific direction. A special case is when the vehicle is stopped at an intersection, a crossing road is detected as a background model and the turn indicator is active; the FOV would shift radically towards the turn direction in order to detect fast-moving elements that may pose a threat.
FOV 990 shows a transition from a first active FOV 994 to a second active FOV 996 when a host drives over a berm curb on the left, causing a transition in the roll, pitch and yaw axes.
[00321] Turning back to Fig. D2, SAL 226 may determine work plan 234, which in turn may update any of scanning device 204's updateable parameters (discussed above) based on a plurality of situational parameters. Scene elements may be determined to be regions of interest by suppressing background features detected in a previous or current frame. Computer and vision processing may be utilized to detect scene elements and objects; such computer and vision processing may include: motion tracking methods, geometrical correction, and model matching (confirming that a detected element is the same as an expected background element or matches a standard element model, which may be used to detect curbs, stoplights, signals and more). Furthermore, element and object prediction methods may be utilized based on current and previous frames.

[00322] According to some embodiments, SAL 226 may determine objects to be background or may confirm that expected background objects are present in the scene. Background features may be predicted and, as described above, accordingly need only be verified and confirmed; therefore less power needs to be allocated to detecting these elements, allowing more power/resources to be allocated toward ROIs. SAL 226 may receive background models from a local memory or a shared storage, and may also detect background elements independently. Furthermore, SAL 226 may update work plan 234 based on the location and/or trajectory of a host platform 228, detected topography, and more. Furthermore, an FOV determined by SAL 226 may cause an update in work plan 234, which may update a dynamic FOV so that the required/appropriate FOV is scanned.
[00323] According to some embodiments, work plan 234 may be produced based on (a) a real-time detected scene signal, (b) an intra-frame level scene signal and (c) an inter-frame level scene signal accumulated and analyzed over two or more frames. According to some embodiments, work plan 234 may be updated based on real-time detected scene information, which may also be termed pixel information. Real-time analysis may consider detected fast signals during time of flight that contain one or more reflections for a given photonic inspection pulse. For example, an unexpectedly detected target in a low-priority field may cause controller 218 to update the pulse frequency of the laser of PTX 206 via updating of the pulse parameters. Work plan 234 may also be updated at a frame or sub-frame level, based on information received, accumulated and/or analyzed within a single frame. Furthermore, work plan 234 may be updated on an inter-frame level, based on information accumulated and analyzed over two or more frames. Increased levels of real-time accuracy, meaning that work plan 234 is updated at a pixel or sub-frame resolution, are achieved when higher levels of computation produce increasingly usable results. Increased levels of non-real-time accuracy are achieved within a specific time period as slower-converging data becomes available (e.g. computer-vision-generated optical flow estimation of objects over several frames), meaning that work plan 234 may be updated as new information becomes evident based on an inter-frame analysis.

[00324] According to some embodiments, host 228 may include steering modules, GPS, a crowd sharing background source, and additional scanning devices, cameras and more.
[00325] Turning to Fig. D10, shown is a flow chart 1000 for scanning a scene in accordance with some embodiments. A scanning device may be operated (1002) to scan a scene. A scene signal may be received alongside internal control signals of the scanning device (1004), as well as a background model (1006) and a signal from an associated host (1008). The scanning device may assess a visual situation based on at least one of these signals (1100) and may update a scanning plan (1102), a background model (1104), and/or a RONI model (1106), as well as outputting a vision output to a host device (1108). The scanning plan may cause an update in the PTX, PRX and/or PSY, including updating pulse parameters, scanning parameters and/or detection parameters, and a change in the dynamic FOV.
[00326] While certain features of the invention have been illustrated and described herein, many modifications, substitutions, changes, and equivalents will now occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the invention.

Claims

What is claimed:
1. A scanning device comprising:
a photonic emitter assembly (PTX) to emit at least one pulse of inspection photons in accordance with at least one adjustable pulse (generation) parameter; a photonic reception and detection assembly (PRX) to receive reflected photons reflected back from an object, said PRX including a dynamic detector to detect the reflected photons based on one or more adjustable detector parameters, said detector further configured to produce a detected scene signal;
a photonic steering assembly (PSY) functionally associated with both said PTX and said PRX to direct said pulses of inspection photons in a direction of an inspected scene segment and to steer said reflection photons back to said PRX; and
a closed loop controller to control said PTX, PRX and PSY and to receive a PTX feedback, a PRX feedback and a PSY feedback, said controller further comprising a situational assessment unit to receive said detected scene signal from said detector and produce a scanning plan and update said at least one pulse parameter and at least one detector parameter at least partially based on said scanning plan.
2. The scanning device of claim 1, wherein said at least one pulse parameter is selected from the group consisting of: pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase and polarization.
3. The scanning device of claim 1, wherein said situational assessment unit is further configured to produce said scanning plan based on said PSY feedback from said PSY.
4. The scanning device of claim 3, wherein said situational assessment unit is further configured to receive information stored on a memory, said information selected from the list consisting of: laser power budget, electrical operational characteristics and calibration data.
5. The scanning device of claim 1, wherein said scanning plan is produced based on (a) a real-time detected scene signal, (b) an intra-frame-level scene signal and (c) an inter-frame level scene signal accumulated and analyzed over two or more frames.
6. The scanning device of claim 1, wherein said detector parameters are selected from the group consisting of: scanning direction, frame rate, ambient light effects, mechanical static and dynamic impairments and thermal effects.
7. The scanning device of claim 1, wherein said PSY has one or more steering parameters and said closed loop controller is further configured to update said steering parameters based on said scanning plan.
8. The scanning device of claim 7, wherein said steering parameters are selected from the group consisting of: scanning method, power modulation, single or multiple deflection axis methods, and synchronization components.
9. The scanning device of claim 1, wherein said situational assessment unit is configured to receive a host feedback from a host device and to use said host feedback to produce said scanning plan.
10. The scanning device according to claim 1, wherein said situational assessment unit is configured to determine said scanning plan based on a global cost function, wherein PSY feedback, PRX feedback, PTX feedback, memory information, host feedback and said detected scene signal are used in producing said scanning plan, and wherein said host feedback includes an override flag.
11. A method of scanning a scene comprising:
producing pulses of inspection photons wherein said pulses are characterized by at least one pulse parameter;
receiving reflected photons reflected back from an object;
detecting the reflected photons and producing a detected scene signal; and updating at least one pulse parameter based on said detected scene signal.
12. The method of claim 11, wherein said at least one pulse parameter is selected from the group consisting of: pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase and polarization.
13. The method of claim 11, further comprising producing a work plan based on said detected scene signal.
14. The method according to claim 13, wherein said producing a work plan is also based on a PSY feedback.
15. The method according to claim 14, wherein said producing a work plan is also based on information stored on a memory, wherein said information is selected from the list consisting of: laser power budget, electrical operational characteristics and calibration data.
16. The method according to claim 15, wherein said work plan is produced based on (a) a real-time detected scene signal, (b) an intra-frame-level scene signal and (c) an inter-frame level scene signal accumulated and analyzed over two or more frames.
17. The method according to claim 15, further comprising updating one or more detector parameters based on said work plan.
18. The method according to claim 15, further comprising updating steering of the PSY based on said work plan.
19. A vehicle comprising:
a scanning device including:
a photonic emitter assembly (PTX) to produce pulses of inspection photons wherein said pulses are characterized by at least one pulse parameter;
a photonic reception and detection assembly (PRX) to receive reflected photons reflected back from an object, said PRX including a detector to detect the reflected photons and produce a detected scene signal;
a photonic steering assembly (PSY) functionally associated with both said PTX and said PRX to direct said pulses of inspection photons in a direction of an inspected scene segment and to steer said reflection photons back to said PRX;
a closed loop controller to: (a) control said PTX, PRX and PSY, (b) receive said detected scene signal from said detector, and (c) update said at least one pulse parameter at least partially based on said detected scene signal; and
a host device to receive said detected scene signal and control said vehicle at least partially based on said detected scene signal, and to relay a host feedback to said scanning device,
wherein said situational assessment unit is configured to receive a host feedback from said host device and to use said host feedback to produce said work plan.
20. A scanning device comprising:
a photonic emitter assembly (PTX) to emit at least one pulse of inspection photons in accordance with at least one adjustable pulse parameter;
a photonic reception and detection assembly (PRX) to receive reflected photons reflected back from an object, said PRX including a detector to detect in accordance with at least one adjustable detection parameter the reflected photons and produce a detected scene signal;
a photonic steering assembly (PSY) functionally associated with both said PTX and said PRX to direct said pulses of inspection photons in a direction of an inspected scene segment and to steer said reflection photons back to said PRX; and
a closed loop controller to: (a) control said PTX, PRX and PSY, (b) receive said detected scene signal from said detector and (c) update said at least one pulse parameter and at least one detection parameter at least partially based on a work plan indicative of an estimated composition of scene elements present within the scene segment covered by the given set of inspection pulses, said work plan derived at least partially from said detected scene signal.
21. The device according to claim 20, wherein said steering assembly is configured to direct and to steer in accordance with at least one adjustable steering parameter, determined by said work plan.
22. The device according to claim 21, wherein said steering parameters are selected from the group consisting of: transmission pattern, sample size of the scene, power modulation that defines the range accuracy of the scene, correction of axis impairments and field of view determination, scanning method, single or multiple deflection axis methods, and synchronization components.
23. The device according to claim 20, wherein said pulse parameter is selected from the group consisting of: pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase and polarization.
24. The device according to claim 23, wherein said detection parameter is selected from the group consisting of: scanning direction, frame rate, ambient light effects, mechanical static and dynamic impairments and thermal effects.
25. The device according to claim 20, wherein said work plan is further derived from a background model.
26. The device according to claim 25, wherein said work plan is further derived from a region of interest model.
27. The device according to claim 26, wherein said work plan is further derived from a region of non-interest model.
28. The device according to claim 26, wherein said work plan is further derived from a host signal.
29. The device according to claim 22, wherein said steering parameter is a field of view determination and said work plan is derived at least partially from a host signal.
30. The device according to claim 22, wherein said detected scene signal is emitted in accordance with an adjustable quality of service.
31. An autonomous vehicle comprising:
a scanning device including: (a) a photonic emitter assembly (PTX) to emit at least one pulse of inspection photons in accordance with at least one adjustable pulse parameter; (b) a photonic reception and detection assembly (PRX) to receive reflected photons reflected back from an object, said PRX including a detector to detect in accordance with at least one adjustable detection parameter the reflected photons and produce a detected scene signal; and (c) a closed loop controller to: (i) control said PTX and PRX, (ii) receive said detected scene signal from said detector and (iii) update said at least one pulse parameter and at least one detection parameter at least partially based on a work plan indicative of an estimated composition of scene elements present within the scene segment covered by the given set of inspection pulses, said work plan derived at least partially from said detected scene signal; and
a host controller to receive said detected scene signal and to relay a host feedback to said scanning device including host ego-motion information.
32. The autonomous vehicle of claim 31, wherein said ego-motion information is selected from the list consisting of: wheels steering position, vehicle speed, vehicle acceleration, vehicle braking, headlights status, turning lights status and GPS location information.
33. The autonomous vehicle of claim 31, wherein said pulse parameter is selected from the group consisting of: pulse power intensity, pulse width, pulse repetition rate, pulse sequence, pulse duty cycle, wavelength, phase and polarization.
34. The autonomous vehicle of claim 31, wherein said detection parameter is selected from the group consisting of: scanning direction, frame rate, ambient light effects, mechanical static and dynamic impairments and thermal effects.
35. The autonomous vehicle of claim 31, wherein said work plan is further derived from a background model at least partially stored in said host controller and relayed to said scanning device via said host feedback.
36. The autonomous vehicle of claim 31, wherein said detected scene signal is emitted in accordance with an adjustable quality of service.
37. A method of scanning a scene comprising:
emitting at least one pulse of inspection photons in accordance with at least one adjustable pulse parameter;
detecting in accordance with at least one adjustable detection parameter reflected photons and producing a detected scene signal;
estimating a scene composition of scene elements present within a scene segment and deriving a scanning plan at least partially from said detected scene signal; and
updating at least one pulse parameter and at least one detection parameter at least partially based on said scanning plan.
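The closed-loop method of claim 37 (emit, detect, derive a scanning plan, update parameters) can be illustrated with a minimal sketch. This is purely illustrative: the class names, the "mean return" heuristic, and all numeric factors are hypothetical and are not taken from the application.

```python
from dataclasses import dataclass

@dataclass
class PulseParams:
    power_w: float    # pulse power intensity
    width_ns: float   # pulse width
    rate_hz: float    # pulse repetition rate

@dataclass
class DetectionParams:
    frame_rate_hz: float
    gain: float

def derive_scanning_plan(scene_signal):
    """Estimate scene composition and derive a scanning plan.
    Here a strong mean return is treated as a near/reflective scene."""
    mean_return = sum(scene_signal) / len(scene_signal)
    return {"near_scene": mean_return > 0.5}

def update_params(pulse, detect, plan):
    """Update at least one pulse and one detection parameter from the plan."""
    if plan["near_scene"]:
        pulse.power_w *= 0.8   # lower power for near, reflective targets
        detect.gain *= 0.9     # reduce detector gain to avoid saturation
    else:
        pulse.power_w *= 1.25
        detect.gain *= 1.1
    return pulse, detect

# One closed-loop iteration over a simulated detected scene signal
pulse = PulseParams(power_w=1.0, width_ns=5.0, rate_hz=100_000.0)
detect = DetectionParams(frame_rate_hz=25.0, gain=1.0)
scene_signal = [0.9, 0.8, 0.7]   # stand-in for detector output
plan = derive_scanning_plan(scene_signal)
pulse, detect = update_params(pulse, detect, plan)
print(pulse.power_w, detect.gain)   # parameters adapted to the scene
```

In a real device the plan derivation would also incorporate the background, region-of-interest and host-feedback inputs recited in claims 25-28.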
38. A light steering device comprising:
a mirror connected to one or more electromechanical actuators through a flexible interconnect element;
one or more actuators interconnected to a frame; and
a controllable electric source to, during operation of said device, provide a sensing signal at a source voltage to an electric source contact on at least one of said one or more actuators.
39. The light steering device according to claim 38, further comprising an electrical parameter sensing circuit connected to an electric sensing contact on at least one of said one or more actuators, to measure, during operation of said device, parameters of the sensing circuit.
40. The light steering device according to claim 39, wherein said electric source and said electrical parameter sensing circuit are connected to the same actuator and facilitate sensing of a mechanical deflection of the actuator to which said electric source and said electrical parameter sensing circuit are connected.
41 . The light steering device of claim 40, further comprising a sensor to relay a signal indicating an actual deflection determined based on said mechanical deflection.
42. The light steering device of claim 41, wherein said device further comprises a controller to control said controllable electric source and said electrical parameter sensing circuit.
43. The light steering device of claim 42, wherein said controller is further configured to control deflection of said actuator.
44. The light steering device of claim 43, wherein said controller is configured to correct a steering signal based on said sensed mechanical deflection.
45. The light steering device according to claim 39, wherein said electric source and said electrical parameter sensing circuit are each connected to a contact on two separate actuators and facilitate sensing of a mechanical failure of one or more elements supported by the two separate actuators.
46. The light steering device according to claim 45, wherein said sensing of a mechanical failure is determined based on an amplitude of a sensed current.
47. The light steering device according to claim 46, wherein said sensing of a mechanical failure is determined based on a difference between an expected current and a sensed current.
48. A scanning device comprising:
a photonic emitter assembly (PTX) to produce pulses of inspection photons wherein said pulses are characterized by at least one pulse parameter;
a photonic reception and detection assembly (PRX) to receive reflected photons reflected back from an object, said PRX including a detector to detect the reflected photons and produce a detected scene signal;
a photonic steering assembly (PSY) functionally associated with both said PTX and said PRX to direct said pulses of inspection photons in a direction of an inspected scene segment based on at least one PSY parameter and to produce a sensing signal; and
a closed loop controller to: (a) control said PSY, (b) receive said sensing signal and (c) update said at least one PSY parameter at least partially based on said detected scene signal.
49. The scanning device according to claim 48, wherein said sensing signal is indicative of an actual deflection of said PSY.
50. The scanning device according to claim 48, wherein said sensing signal is indicative of a mechanical failure.
51. A method of scanning utilizing a mirror assembly including a mirror and a conductive actuator, the method comprising:
setting a mirror having a conductive actuator to a predetermined deflection;
detecting a current through said actuator indicative of an actual mechanical deflection of said mirror; and
determining if said predetermined deflection is substantially similar to said actual deflection.
52. The method according to claim 51, further comprising correcting said actual deflection if said predetermined deflection and said actual deflection are substantially different.
53. The method according to claim 51, further comprising detecting an actual current through said actuator and said mirror indicative of an electromechanical state of said mirror assembly.
54. The method according to claim 53, further comprising comparing said actual current to an expected current and determining if a mechanical failure has occurred.
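Claims 51-54 describe inferring mirror deflection from actuator current, correcting it, and flagging a mechanical failure from an expected-vs-sensed current gap. A minimal sketch of that logic follows; the linear current-to-angle calibration, the tolerances, and all names are hypothetical assumptions for illustration only.

```python
def estimate_deflection(current_ma, ma_per_degree=2.0):
    """Convert a sensed actuator current to an actual deflection angle
    (hypothetical linear calibration)."""
    return current_ma / ma_per_degree

def scan_step(predetermined_deg, sensed_ma, expected_ma,
              deflection_tol_deg=0.1, failure_tol_ma=1.0):
    """One check per claims 51-54: compare actual vs. predetermined
    deflection, compute a correction if needed, and flag failure."""
    actual_deg = estimate_deflection(sensed_ma)
    correction_deg = 0.0
    if abs(actual_deg - predetermined_deg) > deflection_tol_deg:
        # Claim 52: correct the actual deflection toward the target
        correction_deg = predetermined_deg - actual_deg
    # Claim 54: a large expected-vs-sensed current gap indicates
    # a mechanical failure of the mirror assembly
    failure = abs(expected_ma - sensed_ma) > failure_tol_ma
    return actual_deg, correction_deg, failure

actual, corr, failed = scan_step(predetermined_deg=10.0,
                                 sensed_ma=19.0, expected_ma=20.0)
print(actual, corr, failed)   # prints: 9.5 0.5 False
```

Here the 1 mA current gap stays within the failure tolerance, so only a deflection correction is issued, matching the claim 52 / claim 54 split between steering correction and failure detection.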
55. A light detection and ranging (Lidar) device comprising:
a photonic pulse emitter assembly comprising one or more photonic emitters to generate and focus a photonic inspection pulse towards a photonic transmission (TX) path of said device;
a photonic detection assembly comprising one or more photo sensors to receive and sense photons of reflected photonic inspection pulses received through a receive (RX) path of said device;
a photonic steering assembly located along both the TX and the RX paths and comprising a Complex Reflector (CR) made of an array of steerable reflectors, wherein a first set of steerable reflectors are part of the TX path and a second set of steerable reflectors are part of the RX path.
56. The Lidar according to claim 55, wherein said first set of steerable reflectors direct a photonic inspection pulse from said photonic pulse emitter assembly towards a given segment of a scene to be inspected.
57. The Lidar according to claim 56, wherein said second set of steerable reflectors direct a photonic inspection pulse reflection, reflected off of a surface of an element present in the given segment of the scene, towards said photonic detection assembly.
58. The Lidar device according to claim 55, wherein said array of steerable reflectors are dynamic steerable reflectors.
59. The Lidar device according to claim 58, wherein said reflectors are dynamically steered to compensate for mechanical impairments and drifts.
60. The Lidar device according to claim 58, wherein said dynamic steerable reflectors have a controllable state, wherein said state is selected from the list consisting of: a transmission state, a reception state and an idle state.
61 . The Lidar device according to claim 55, wherein said first set of steerable reflectors are mechanically coupled to each other and said second set of steerable reflectors are mechanically coupled to each other.
62. The Lidar device according to claim 55, wherein said first set of steerable reflectors are electronically coupled to each other and said second set of steerable reflectors are electronically coupled to each other.
63. The Lidar device according to claim 55, wherein the dynamic steerable reflectors are individually steerable.
64. The Lidar device according to claim 55, wherein said first set of steerable reflectors have a first phase and are substantially synchronized and said second set of steerable reflectors have a second phase and are substantially synchronized.
65. The Lidar device according to claim 64, wherein said first phase and said second phase have a substantially fixed difference between them.
66. The Lidar device according to claim 64, wherein said first set of steerable reflectors oscillate together at a first frequency and said second set of steerable reflectors oscillate together at a second frequency wherein said first and second frequency have a substantially fixed phase shift between them.
67. The Lidar device of claim 60, wherein increasing a number of dynamic steerable reflectors in a transmission state increases a transmission beam spread.
68. The Lidar device of claim 67, wherein decreasing a number of dynamic steerable reflectors in a reception state decreases reception field of view and is configured to compensate for ambient light conditions.
69. The Lidar device of claim 60, wherein dynamic steerable reflectors in an idle state provide isolation between dynamic steerable reflectors in a transmission state and a reception state.
70. The Lidar device of claim 55, wherein said first set of steerable reflectors are surrounded by said second set of steerable reflectors.
71 . The Lidar device of claim 55, wherein said second set of steerable reflectors are surrounded by said first set of steerable reflectors.
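Claims 60 and 67-69 describe allocating the complex reflector's mirrors among transmission, reception and idle states, with the TX count setting beam spread, the RX count setting reception field of view, and idle mirrors isolating the two groups. A toy allocation sketch follows; the per-mirror angular constants and layout are hypothetical, not taken from the application.

```python
from enum import Enum

class MirrorState(Enum):
    TX = 1    # transmission state
    RX = 2    # reception state
    IDLE = 3  # idle state

def allocate_states(n_reflectors, n_tx, n_rx):
    """Assign states across the reflector array. Idle mirrors are placed
    between the TX and RX groups to isolate them (claim 69)."""
    assert n_tx + n_rx <= n_reflectors
    states = [MirrorState.TX] * n_tx
    states += [MirrorState.IDLE] * (n_reflectors - n_tx - n_rx)
    states += [MirrorState.RX] * n_rx
    return states

def beam_spread_deg(n_tx, spread_per_mirror_deg=0.05):
    # Claim 67: more TX mirrors -> wider transmission beam spread
    return n_tx * spread_per_mirror_deg

def rx_fov_deg(n_rx, fov_per_mirror_deg=0.05):
    # Claim 68: fewer RX mirrors -> narrower reception field of view,
    # rejecting more ambient light
    return n_rx * fov_per_mirror_deg

states = allocate_states(16, n_tx=4, n_rx=8)
print(states.count(MirrorState.IDLE))          # idle mirrors isolating TX/RX
print(beam_spread_deg(4), rx_fov_deg(8))       # spread and FOV in degrees
```

Shrinking `n_rx` under strong ambient light narrows the reception field of view, which is the compensation mechanism claim 68 recites.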
72. A method of scanning a scene comprising:
emitting a photonic pulse towards a photonic transmission (TX) path;
receiving reflected photonic pulses received through a receive (RX) path;
detecting with a detector a scene signal based on said reflected photonic inspection pulses; and
complexly steering the photonic pulse towards a scene and the reflected photonic pulses from the scene to the detector, by reflecting said photonic pulse at a first phase and receiving said reflected pulse at a second phase, wherein the difference between said first and second phase is dependent on the time it takes the photonic pulse to be reflected and return.
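Claim 72 ties the TX/RX phase difference to the pulse's round-trip time of flight. For oscillating reflectors, that relation can be sketched as the mirror rotation accumulated during the flight time; the function names and the single-frequency model here are illustrative assumptions, not the claimed implementation.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def rx_phase_offset(target_range_m, mirror_freq_hz):
    """Phase by which the RX reflectors lag the TX reflectors so the
    returning pulse lands on the detector: the mirror phase accumulated
    during the round-trip time of flight."""
    t_flight = 2.0 * target_range_m / C               # out and back
    return 2.0 * math.pi * mirror_freq_hz * t_flight  # radians

def range_from_phase(phase_rad, mirror_freq_hz):
    """Invert the relation: recover range from a measured TX/RX phase gap."""
    t_flight = phase_rad / (2.0 * math.pi * mirror_freq_hz)
    return C * t_flight / 2.0

phase = rx_phase_offset(target_range_m=150.0, mirror_freq_hz=1000.0)
print(range_from_phase(phase, 1000.0))   # ≈ 150.0 m
```

This also illustrates why claims 65-66 want a substantially fixed phase shift between the two reflector sets: for a given operating range, the required TX/RX lag is a constant of the geometry.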
73. A vehicle comprising:
a scanning device to produce a detected scene signal, said scanning device including: a photonic pulse emitter assembly comprising one or more photonic emitters to generate and focus a photonic inspection pulse towards a photonic transmission (TX) path of said device;
a photonic detection assembly comprising one or more photo sensors to receive and sense photons of reflected photonic inspection pulses received through a receive (RX) path of said device;
a photonic steering assembly located along both the TX and the RX paths and comprising a Complex Reflector (CR) made of an array of steerable reflectors, wherein a first set of steerable reflectors are part of the TX path and a second set of steerable reflectors are part of the RX path; and a host controller to receive said detected scene signal and control said host device at least partially based on said detected scene signal.
PCT/IB2017/055665 2016-09-20 2017-09-19 Methods circuits devices assemblies systems and functionally associated machine executable code for light detection and ranging based scanning WO2018055513A2 (en)

Applications Claiming Priority (20)

Application Number Priority Date Filing Date Title
US201662396858P 2016-09-20 2016-09-20
US201662396864P 2016-09-20 2016-09-20
US62/396,858 2016-09-20
US62/396,864 2016-09-20
US201662397379P 2016-09-21 2016-09-21
US62/397,379 2016-09-21
US201662405928P 2016-10-09 2016-10-09
US62/405,928 2016-10-09
US201662412294P 2016-10-25 2016-10-25
US62/412,294 2016-10-25
US201662414740P 2016-10-30 2016-10-30
US62/414,740 2016-10-30
US15/391,916 2016-12-28
US15/391,916 US20180100928A1 (en) 2016-10-09 2016-12-28 Methods circuits devices assemblies systems and functionally associated machine executable code for active scene scanning
US15/393,749 US20180113216A1 (en) 2016-10-25 2016-12-29 Methods Circuits Devices Assemblies Systems and Functionally Associated Machine Executable Code for Active Optical Scanning of a Scene
US15/393,749 2016-12-29
US15/393,593 US20180081038A1 (en) 2016-09-21 2016-12-29 Methods Circuits Devices Assemblies Systems and Functionally Associated Machine Executable Code for Light Detection and Ranging Based Scanning
US15/393,285 2016-12-29
US15/393,285 US20180081037A1 (en) 2016-09-20 2016-12-29 Methods Circuits Assemblies Devices Systems and Functionally Associated Machine Executable Code for Controllably Steering an Optical Beam
US15/393,593 2016-12-29

Publications (2)

Publication Number Publication Date
WO2018055513A2 true WO2018055513A2 (en) 2018-03-29
WO2018055513A3 WO2018055513A3 (en) 2018-05-11

Family

ID=61690362

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2017/055665 WO2018055513A2 (en) 2016-09-20 2017-09-19 Methods circuits devices assemblies systems and functionally associated machine executable code for light detection and ranging based scanning

Country Status (1)

Country Link
WO (1) WO2018055513A2 (en)

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102018112809A1 (en) 2018-05-29 2019-12-05 Blickfeld GmbH Actuation of a scanning mirror with an elastic coupling
WO2020020876A1 (en) 2018-07-26 2020-01-30 Blickfeld GmbH Reduced nonlinearities for resonant deflection of a scanning mirror
CN110770600A (en) * 2018-11-29 2020-02-07 深圳市大疆创新科技有限公司 Distributed light detection and ranging (LIDAR) management system
CN110764070A (en) * 2019-10-29 2020-02-07 北科天绘(合肥)激光技术有限公司 Data real-time fusion processing method and device based on three-dimensional data and image data
EP3633403A1 (en) * 2018-10-04 2020-04-08 Aptiv Technologies Limited Object sensor including pitch compensation
DE102018128164A1 (en) * 2018-11-12 2020-05-14 Infineon Technologies Ag LIDAR SENSORS AND METHOD FOR LIDAR SENSORS
CN111175768A (en) * 2020-02-14 2020-05-19 深圳奥锐达科技有限公司 Off-axis scanning distance measuring system and method
CN111587381A (en) * 2018-12-17 2020-08-25 深圳市大疆创新科技有限公司 Method for adjusting motion speed of scanning element, distance measuring device and mobile platform
CN111670337A (en) * 2019-01-09 2020-09-15 深圳市大疆创新科技有限公司 Distance measuring device and mobile platform
CN111766589A (en) * 2019-03-12 2020-10-13 江苏美的清洁电器股份有限公司 Detection assembly, floor sweeping robot and method and system for detecting walking road conditions of floor sweeping robot
US20200327696A1 (en) * 2019-02-17 2020-10-15 Purdue Research Foundation Calibration of cameras and scanners on uav and mobile platforms
WO2021067165A1 (en) * 2019-10-02 2021-04-08 Cepton Technologies, Inc. Techniques for detecting cross-talk interferences in lidar imaging sensors
CN112859101A (en) * 2021-01-11 2021-05-28 武汉大学 Single photon ranging method based on polarization modulation
CN112969937A (en) * 2018-10-19 2021-06-15 创新科技有限公司 LIDAR system and method
CN113287031A (en) * 2018-11-02 2021-08-20 伟摩有限责任公司 Synchronization of multiple rotation sensors of a vehicle
CN113574411A (en) * 2019-03-14 2021-10-29 伟摩有限责任公司 Method and system for detecting an obstacle on a sensor housing
CN113865509A (en) * 2021-09-29 2021-12-31 苏州华兴源创科技股份有限公司 Automatic following detection device
US11561281B2 (en) 2020-06-29 2023-01-24 Waymo Llc Selective deactivation of light emitters for interference mitigation in light detection and ranging (lidar) devices
CN115694792A (en) * 2021-10-09 2023-02-03 科大国盾量子技术股份有限公司 Method and device capable of detecting blind attack caused by intense pulse light and receiving end
US11609477B2 (en) 2019-10-18 2023-03-21 Hyundai Motor Company Liquid crystal based optical deflector and optical scanner using the same
EP4175073A1 (en) * 2021-10-28 2023-05-03 Honeywell International Inc. Systems and methods for strobe-light-based navigation
US11740333B2 (en) 2019-12-04 2023-08-29 Waymo Llc Pulse energy plan for light detection and ranging (lidar) devices based on areas of interest and thermal budgets
US11960029B2 (en) 2022-12-20 2024-04-16 Waymo Llc Selective deactivation of light emitters for interference mitigation in light detection and ranging (lidar) devices

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2469294A1 (en) * 2010-12-23 2012-06-27 André Borowski 2D/3D real-time imager and corresponding imaging methods
US9625580B2 (en) * 2014-01-03 2017-04-18 Princeton Lightwave, Inc. LiDAR system comprising a single-photon detector

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102018112809A1 (en) 2018-05-29 2019-12-05 Blickfeld GmbH Actuation of a scanning mirror with an elastic coupling
WO2019229163A1 (en) 2018-05-29 2019-12-05 Blickfeld GmbH Actuation of a scanning mirror using an elastic coupling
WO2020020876A1 (en) 2018-07-26 2020-01-30 Blickfeld GmbH Reduced nonlinearities for resonant deflection of a scanning mirror
US11264706B2 (en) 2018-10-04 2022-03-01 Aptiv Technologies Limited Object sensor including pitch compensation
CN111007513B (en) * 2018-10-04 2024-03-15 安波福技术有限公司 Object sensor including pitch compensation
EP3633403A1 (en) * 2018-10-04 2020-04-08 Aptiv Technologies Limited Object sensor including pitch compensation
CN111007513A (en) * 2018-10-04 2020-04-14 安波福技术有限公司 Object sensor including pitch compensation
CN112969937A (en) * 2018-10-19 2021-06-15 创新科技有限公司 LIDAR system and method
CN113287031A (en) * 2018-11-02 2021-08-20 伟摩有限责任公司 Synchronization of multiple rotation sensors of a vehicle
US11879996B2 (en) 2018-11-12 2024-01-23 Infineon Technologies Ag LIDAR sensors and methods for LIDAR sensors
DE102018128164A1 (en) * 2018-11-12 2020-05-14 Infineon Technologies Ag LIDAR SENSORS AND METHOD FOR LIDAR SENSORS
CN110770600B (en) * 2018-11-29 2023-04-14 深圳市大疆创新科技有限公司 Distributed light detection and ranging (LIDAR) management system
CN110770600A (en) * 2018-11-29 2020-02-07 深圳市大疆创新科技有限公司 Distributed light detection and ranging (LIDAR) management system
CN111587381A (en) * 2018-12-17 2020-08-25 深圳市大疆创新科技有限公司 Method for adjusting motion speed of scanning element, distance measuring device and mobile platform
CN111670337A (en) * 2019-01-09 2020-09-15 深圳市大疆创新科技有限公司 Distance measuring device and mobile platform
US11610337B2 (en) * 2019-02-17 2023-03-21 Purdue Research Foundation Calibration of cameras and scanners on UAV and mobile platforms
US20200327696A1 (en) * 2019-02-17 2020-10-15 Purdue Research Foundation Calibration of cameras and scanners on uav and mobile platforms
CN111766589A (en) * 2019-03-12 2020-10-13 江苏美的清洁电器股份有限公司 Detection assembly, floor sweeping robot and method and system for detecting walking road conditions of floor sweeping robot
CN113574411A (en) * 2019-03-14 2021-10-29 伟摩有限责任公司 Method and system for detecting an obstacle on a sensor housing
US11150348B2 (en) 2019-10-02 2021-10-19 Cepton Technologies, Inc. Techniques for detecting cross-talk interferences in lidar imaging sensors
WO2021067165A1 (en) * 2019-10-02 2021-04-08 Cepton Technologies, Inc. Techniques for detecting cross-talk interferences in lidar imaging sensors
US11899110B2 (en) 2019-10-02 2024-02-13 Cepton Technologies, Inc. Techniques for detecting cross-talk interferences in LiDAR imaging sensors
US11609477B2 (en) 2019-10-18 2023-03-21 Hyundai Motor Company Liquid crystal based optical deflector and optical scanner using the same
CN110764070A (en) * 2019-10-29 2020-02-07 北科天绘(合肥)激光技术有限公司 Data real-time fusion processing method and device based on three-dimensional data and image data
US11740333B2 (en) 2019-12-04 2023-08-29 Waymo Llc Pulse energy plan for light detection and ranging (lidar) devices based on areas of interest and thermal budgets
EP4042190A4 (en) * 2019-12-04 2023-11-08 Waymo LLC Pulse energy plan for light detection and ranging (lidar) devices based on areas of interest and thermal budgets
CN111175768A (en) * 2020-02-14 2020-05-19 深圳奥锐达科技有限公司 Off-axis scanning distance measuring system and method
US11561281B2 (en) 2020-06-29 2023-01-24 Waymo Llc Selective deactivation of light emitters for interference mitigation in light detection and ranging (lidar) devices
CN112859101B (en) * 2021-01-11 2022-07-05 武汉大学 Single photon ranging method based on polarization modulation
CN112859101A (en) * 2021-01-11 2021-05-28 武汉大学 Single photon ranging method based on polarization modulation
CN113865509A (en) * 2021-09-29 2021-12-31 苏州华兴源创科技股份有限公司 Automatic following detection device
CN115694792A (en) * 2021-10-09 2023-02-03 科大国盾量子技术股份有限公司 Method and device capable of detecting blind attack caused by intense pulse light and receiving end
EP4175073A1 (en) * 2021-10-28 2023-05-03 Honeywell International Inc. Systems and methods for strobe-light-based navigation
US11960029B2 (en) 2022-12-20 2024-04-16 Waymo Llc Selective deactivation of light emitters for interference mitigation in light detection and ranging (lidar) devices

Also Published As

Publication number Publication date
WO2018055513A3 (en) 2018-05-11

Similar Documents

Publication Publication Date Title
WO2018055513A2 (en) Methods circuits devices assemblies systems and functionally associated machine executable code for light detection and ranging based scanning
JP7336014B2 (en) LIDAR system and method
JP7256920B2 (en) LIDAR system and method
US20180113216A1 (en) Methods Circuits Devices Assemblies Systems and Functionally Associated Machine Executable Code for Active Optical Scanning of a Scene
JP7046420B2 (en) LIDAR systems and methods with adjustable resolution and fail-safe operation
US20180100928A1 (en) Methods circuits devices assemblies systems and functionally associated machine executable code for active scene scanning
US20180081037A1 (en) Methods Circuits Assemblies Devices Systems and Functionally Associated Machine Executable Code for Controllably Steering an Optical Beam
US20180081038A1 (en) Methods Circuits Devices Assemblies Systems and Functionally Associated Machine Executable Code for Light Detection and Ranging Based Scanning
KR102656212B1 (en) Lidar systems and methods
CN117813530A (en) Controlling LiDAR resolution configuration based on sensors
KR20240051293A (en) Lidar systems and methods

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17791142

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 29.07.2019)

122 Ep: pct application non-entry in european phase

Ref document number: 17791142

Country of ref document: EP

Kind code of ref document: A2