WO2022128194A1 - Light emission and detection in a lidar system

Info

Publication number
WO2022128194A1
Authority
WO
WIPO (PCT)
Prior art keywords
segment
optical
light
optical component
component
Application number
PCT/EP2021/077347
Other languages
French (fr)
Inventor
Michael Boenigk
Original Assignee
Osram GmbH
Application filed by Osram GmbH
Publication of WO2022128194A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4817Constructional features, e.g. arrangements of optical elements relating to scanning

Definitions

  • LIDAR Light Detection and Ranging
  • Light detection and ranging is a sensing technique that is used, for example, in the field of autonomous driving for providing detailed information about the surroundings of an automated or partially automated vehicle.
  • LIDAR light, e.g., laser light
  • the emission of light for scanning the scene may be controlled, for example, by means of microelectromechanical system (MEMS) mirrors, which may provide deflection of the emitted light along one direction or two directions (e.g., along the horizontal and/or vertical direction).
  • MEMS microelectromechanical system
  • a LIDAR system may include a liquid crystal polarization grating (LCPG) that may provide deflection of the emitted light (and also of the light received from the field of view) into two directions.
  • With the LCPG solution, the imaged area may be reduced to increase the signal-to-noise ratio of the measurement.
  • Beam steering solutions based on MEMS components or liquid crystal components may present challenges in terms of cost, complexity, and maintenance.
  • Various aspects may be based on providing a simple and efficient beam steering solution for a LIDAR system without having to rely on complex and expensive components such as a MEMS mirror or an LCPG.
  • Various aspects may be related to an optical component configured to provide beam steering capabilities for one-dimensional and two-dimensional scanning of a field of view that does not require fine control over an oscillation angle (unlike MEMS mirrors) or over the synchronized switching of liquid crystal cells (unlike LCPGs).
  • the optical component may be further configured to prevent light received at the LIDAR system from portions of the field of view other than a portion into which light was emitted from impinging onto a detector of the LIDAR system. This may provide an increased signal-to-noise ratio of the detection.
  • an optical component including a plurality of segments, each segment being configured to provide a respective deflection angle for light impinging on the segment, each deflection angle being associated with a respective portion of a field of view; the optical component being configured to allow for a control of a continuous movement of the optical component, such that the plurality of segments may be sequentially exposed to the incoming light to sequentially deflect the light at each of the plurality of deflection angles provided by the plurality of segments; the optical component further including a plurality of receive optical elements, each associated with a respective segment of the plurality of segments, wherein a segment and the associated receive optical element are disposed relative to one another such that the receive optical element receives light coming from the portion of the field of view associated with the deflection angle provided by the segment.
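As an illustration of the arrangement just described, the following minimal sketch models the segmented optical component as plain data: each segment carries a fixed deflection angle and is paired with a receive optical element, and a rotation angle selects which segment is currently exposed. It is a sketch under assumptions not stated in the application (Python, equal-width segments, hypothetical names such as `Segment` and `exposed_segment`), not the claimed implementation.

```python
# Illustrative sketch only: the segmented optical component modeled as plain data.
from dataclasses import dataclass

@dataclass
class Segment:
    vertical_angle_deg: float   # fixed deflection component associated with this segment
    receive_element_id: int     # receive optical element paired with this segment

def exposed_segment(rotation_angle_deg: float, segments):
    """Return the segment currently exposed to the incoming light, assuming the
    continuous movement is a rotation and the segments have equal angular extent."""
    extent_deg = 360.0 / len(segments)
    return segments[int(rotation_angle_deg % 360.0 // extent_deg)]

# Example: eight segments whose fixed deflection angles are spaced 3 degrees apart.
segments = [Segment(-10.5 + 3.0 * i, receive_element_id=i) for i in range(8)]
print(exposed_segment(95.0, segments))   # the third segment (index 2) is exposed at 95 degrees
```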
  • Various aspects may be based on providing one-dimensional and/or two-dimensional scanning of a field of view (e.g., of a scene) via the combination of the continuous movement of the optical component together with the configuration of the segments to provide the respective deflection angle.
  • a LIDAR system may include an optical system, the optical system including: a light source; an optical component including a first segment configured to deflect the light emitted by the light source towards a field of view of the LIDAR system in a first emission direction, and a second segment configured to deflect the light emitted by the light source towards the field of view of the LIDAR system in a second emission direction, and a controller configured to control a continuous movement of the optical component, wherein the optical component is configured in such a way that the light emitted by the light source impinges onto the first segment during a first portion of the continuous movement of the optical component, and that the light emitted by the light source impinges onto the second segment during a second portion of the continuous movement of the optical component.
  • the first emission direction may be associated with a first portion of the field of view of the LIDAR system
  • the second emission direction may be associated with a second portion of the field of view of the LIDAR system
  • the optical component may further include a first receive optical element associated with the first segment and a second receive optical element associated with the second segment, wherein the first receive optical element and the first segment are disposed relative to one another such that the first receive optical element receives light coming from the first portion of the field of view, and wherein the second receive optical element and the second segment are disposed relative to one another such that the second receive optical element receives light coming from the second portion of the field of view.
  • an optical component may include: a first segment configured such that light impinging onto the first segment is deflected towards a first portion of a field of view of the optical component; and a second segment configured such that light impinging onto the second segment is deflected towards a second portion of a field of view of the optical component; and a first receive optical element associated with the first segment and a second receive optical element associated with the second segment, wherein the first receive optical element and the first segment are disposed relative to one another such that the first receive optical element receives light coming from the first portion of the field of view of the optical component, and wherein the second receive optical element and the second segment are disposed relative to one another such that the second receive optical element receives light coming from the second portion of the field of view of the optical component.
  • a receive optical element being associated with a segment may be understood as the receive optical element and the segment being in a predefined (e.g., fixed) angular relationship with one another.
  • the first receive optical element may be in a first predefined angular relationship with the first segment
  • the second receive optical element may be in a second predefined angular relationship with the second segment.
  • a LIDAR system may include an optical system, the optical system including: a light source; a disk including a first side surface configured to deflect the light emitted by the light source towards a field of view of the LIDAR system in a first emission direction, and a second side surface configured to deflect the light emitted by the light source towards a field of view of the LIDAR system in a second emission direction; and a controller configured to control a continuous rotation of the disk, in such a way that the light emitted by the light source impinges onto the first side surface during a first portion of the continuous rotation of the disk, and that the light emitted by the light source impinges onto the second side surface during a second portion of the continuous rotation of the disk.
  • the first emission direction may be associated with a first portion of the field of view of the LIDAR system
  • the second emission direction may be associated with a second portion of the field of view of the LIDAR system
  • the disk may further include a first receive optical element disposed on a main surface of the disk and being associated with the first side surface, and a second receive optical element disposed on the main surface of the disk and being associated with the second side surface, wherein the first receive optical element and the first side surface are disposed relative to one another such that the first receive optical element receives light coming from the first portion of the field of view, and wherein the second receive optical element and the second side surface are disposed relative to one another such that the second receive optical element receives light coming from the second portion of the field of view.
  • the optical system described herein may provide a simple and cost-effective solution for providing beam steering in the LIDAR system.
  • the LIDAR system may be part, for example, of a vehicle, of a smart farming system, or of an indoor monitoring system.
  • a LIDAR system is an example of a possible application of the optical system and of the optical component described herein for providing control over the emission direction of light.
  • the optical system and the optical component described herein may also be for use in other types of applications or systems in which a simple and cost-effective beam steering solution may be advantageous, for example in an optical transmission system (e.g., wireless or including optical fibers), e.g. in a light-based communication system in which data and information may be transmitted by means of light.
  • segment may be used herein to describe a part (in other words, a portion) of an optical component.
  • a segment may be understood as a part of the optical component having a certain extension, and a certain surface area.
  • a segment may have a lateral extension in the horizontal direction (e.g., a width) in the range from 5 mm to 100 mm, for example in the range from 20 mm to 80 mm, for example in the range from 10 mm to 60 mm.
  • a segment may have a lateral extension in the vertical direction (e.g., a height) in the range from 1 mm to 15 mm, for example in the range from 2 mm to 10 mm, for example in the range from 1 mm to 6 mm, for example 3 mm.
  • a segment may have a surface area in the range from 5 mm² to 1000 mm², for example from 20 mm² to 500 mm², for example a surface area greater than 5 mm² or greater than 50 mm².
  • a segment may be understood as a surface or part of a surface of the optical component, e.g. as a side surface or part of a side surface of the optical component.
  • a side surface may also be referred to herein as lateral surface or edge surface.
  • a segment may also be referred to herein as transmission element or emission element.
  • continuous movement may be used herein to describe a type of purposeful movement that does not present interruptions, i.e. a purposeful (and controlled) movement that continues until it is arbitrarily stopped.
  • a continuous movement may be opposite to a movement that occurs in a series of discrete steps.
  • processor as used herein may be understood as any kind of technological entity that allows handling of data. The data may be handled according to one or more specific functions executed by the processor. Further, a processor as used herein may be understood as any kind of circuit, e.g., any kind of analog or digital circuit. A processor may thus be or include an analog circuit, digital circuit, mixed-signal circuit, logic circuit, processor, microprocessor, Central Processing Unit (CPU), Graphics Processing Unit (GPU), Digital Signal Processor (DSP), Field Programmable Gate Array (FPGA), integrated circuit, Application Specific Integrated Circuit (ASIC), etc., or any combination thereof.
  • CPU Central Processing Unit
  • GPU Graphics Processing Unit
  • DSP Digital Signal Processor
  • FPGA Field Programmable Gate Array
  • ASIC Application Specific Integrated Circuit
  • any other kind of implementation of the respective functions may also be understood as a processor or logic circuit. It is understood that any two (or more) of the processors or logic circuits detailed herein may be realized as a single entity with equivalent functionality or the like, and conversely that any single processor or logic circuit detailed herein may be realized as two (or more) separate entities with equivalent functionality or the like.
  • FIG. 1A shows a LIDAR system in a schematic view according to various aspects
  • FIG. 1B shows an optical system in a schematic top view according to various aspects
  • FIG. 1C and FIG. 1D each shows schematically deflection of light associated with a segment of an optical component according to various aspects
  • FIG. 1E and FIG. 1F each shows schematically deflection of light associated with a segment of an optical component according to various aspects
  • FIG. 1G shows schematically a scanning of a field of view according to various aspects
  • FIG. 1H shows an optical system in a schematic top view according to various aspects
  • FIG. 1I shows schematically a scanning of a field of view according to various aspects
  • FIG. 2A and FIG. 2B each shows an optical component in a schematic view according to various aspects
  • FIG. 2C, FIG. 2D, FIG. 2E, FIG. 2F, FIG. 2G, and FIG. 2H each shows a graph associated with the deflection of light by an optical component according to various aspects
  • FIG. 2I and FIG. 2J each shows an optical component in a schematic view according to various aspects
  • FIG. 2K and FIG. 2L each shows an exemplary realization of an optical component in a schematic view according to various aspects;
  • FIG. 3A, FIG. 3B, FIG. 3C, and FIG. 3D each shows an optical system in a schematic view according to various aspects
  • FIG. 3E illustrates a synchronization between the emission of light and the continuous movement of an optical component in a schematic view according to various aspects;
  • FIG. 4A shows a receiver side of an optical system in a schematic view according to various aspects;
  • FIG. 4B shows an imaged field of view in a schematic view according to various aspects;
  • FIG. 4C shows a detector in a schematic view according to various aspects;
  • FIG. 4D shows an optical component in a schematic view according to various aspects;
  • FIG. 5A to FIG. 5L illustrate an imaging process including an optical component in a schematic view according to various aspects;
  • FIG. 6 shows an optical component and a detector in a schematic view according to various aspects
  • FIG. 7 shows an optical component and a detector in a schematic view according to various aspects.
  • FIG. 8 shows a graph providing a comparison between beam steering according to the strategy described herein and beam steering not implementing the strategy described herein.
  • FIG. 1A shows a LIDAR system 100 in a schematic top view according to various aspects.
  • the representation of the LIDAR system 100 in FIG. 1A may be simplified for the purpose of explanation .
  • the LIDAR system 100 may include an optical system 101 (in some aspects, a plurality of optical systems 101) .
  • the optical system 101 may also be referred to as light emission and detection system 101.
  • the optical system 101 may be configured to control an emission of light into a field of view 108 of the LIDAR system 100. In some aspects, the optical system 101 may be further configured to detect light from the field of view 108 of the LIDAR system 100. In other aspects, the detection of light from the field of view 108 may be carried out by a separate detection system.
  • the field of view 108 of the LIDAR system 100 may be understood, in some aspects, as a field of view of the optical system 101.
  • the emission direction into the field of view 108 may be varied along one or more directions, e.g. at least along a first direction (e.g., the direction 152 in FIG. 1A) and a second direction (e.g., the direction 154 in FIG. 1A) .
  • the first direction 152 and the second direction 154 may be understood as the directions along which the field of view 108 extends.
  • the first direction 152 may be a first field of view direction, along which the field of view 108 has a first lateral extension (e.g., a height, or a first angular extension)
  • the second direction 154 may be a second field of view direction, along which the field of view 108 has a second lateral extension (e.g., a width, or a second angular extension)
  • the first direction 152 and the second direction 154 may be aligned at a defined angle (different from 0° or 180°) with one another, e.g. the first direction 152 and the second direction 154 may be perpendicular to one another.
  • first direction 152 may be a vertical direction
  • second direction 154 may be a horizontal direction
  • first direction 152 and second direction 154 may be arbitrary
  • the first direction 152 and the second direction 154 may be perpendicular to a direction along which the optical axis 110 of the optical system 101 is aligned (e.g., the optical axis 110 may be aligned along a third direction 156 in FIG. 1A).
  • the two directions along which the field of view 108 may extend are illustratively represented by the first field of view line 108-1 and the second field of view line 108-2 in FIG. 1A.
  • the optical system 101 may be configured according to an adapted beam steering solution that does not require MEMS components or liquid crystal components.
  • the optical system 101 does not include any MEMS mirror and does not include any liquid crystal polarization grating.
  • the optical system 101 may provide beam steering functionalities via an adapted optical component, as described in further detail below.
  • the LIDAR system 100 may include other systems and/or components in addition to the optical system 101, e.g. one or more communication devices, a sensor fusion circuit, sensors for detection other than optical detection, etc.
  • FIG. 1B shows a schematic top view of the optical system 101 according to various aspects.
  • the optical system 101 may include a light source 102 .
  • the light source 102 may be configured to emit light, e.g. light having a predefined wavelength, for example in the infra-red and/or near infra-red range, such as in the range from about 700 nm to about 5000 nm, for example in the range from about 860 nm to about 1600 nm, or for example at 905 nm or 1550 nm.
  • the light source 102 may be configured to emit light in a continuous manner or in a pulsed manner, for example the light source 102 may be configured to emit one or more light pulses (e.g., a sequence of light pulses).
  • the light source 102 may include a laser source.
  • the light source may include one or more laser diodes, e.g. one or more edge-emitting laser diodes or one or more vertical cavity surface emitting laser diodes.
  • the light source 102 may be configured to emit laser light, e.g. one or more laser pulses, for example a sequence of laser pulses.
  • the light source 102 may include a plurality of laser sources (e.g., a plurality of laser diodes) disposed along one direction to form a one-dimensional array, or disposed along two directions to form a two-dimensional array.
  • the light source 102 may include a laser bar (e.g., including a plurality of laser diodes, such as 4 laser diodes or 8 laser diodes, as examples) .
  • the optical system 101 may include an optical component 104 configured to direct the light emitted by the light source 102 towards the field of view 108 of the LIDAR system 100.
  • the optical component 104 may include a plurality of segments (in other words, a plurality of adapted parts or adapted portions) , each configured to provide a respective emission direction towards the field of view 108 for the light impinging onto the segment, illustratively each segment may be configured to provide a respective redirection (a respective deflection angle) to the light impinging onto the segment.
  • the optical component 104 may include a first segment 106-1, a second segment 106-2, a third segment 106-3, a fourth segment 106-4, a fifth segment 106-5, a sixth segment 106-6, a seventh segment 106-7, and an eighth segment 106-8. It is however understood that the number of segments illustrated in FIG. 1B is only an example, and an optical component 104 may include a desired number of segments (e.g., five, six, eight, ten, or more than ten), adapted in accordance with a desired range to be provided for directing the light onto the field of view 108 and with a desired resolution, as described in further detail below. It is also understood that the representation of the optical component 104 shown in FIG. 1B (e.g., the shape, the arrangement of the segments, etc.) is only for the purpose of illustrating the principles of its operation, and other configurations may be provided for the optical component 104, as described in further detail below.
  • the first segment 106-1 may be configured to deflect the light emitted by the light source 102 towards the field of view 108 of the optical system 101 in a first emission direction 114
  • the second segment 106-2 may be configured to deflect the light emitted by the light source 102 towards the field of view 108 of the optical system 101 in a second emission direction
  • the third segment 106-3 may be configured to deflect the light emitted by the light source 102 towards the field of view 108 of the optical system 101 in a third emission direction, etc.
  • each segment may be configured to deflect the light emitted by the light source 102 at a respective emission direction towards the field of view 108 in case (or when) the light emitted by the light source 102 impinges onto that segment.
  • the emission directions provided by the segments may differ from one another in at least one angular component, e.g. a different angular component with respect to an optical axis 110 of the optical system 101, as described in further detail below.
  • Each emission direction may be associated with a respective portion of the field of view 108.
  • Each segment may be configured to direct the light impinging onto the segment towards a respective portion of the field of view 108, as described in further detail below.
  • the first emission direction may be associated with a first portion of the field of view 108
  • the second emission direction may be associated with a second portion of the field of view 108
  • the third emission direction may be associated with a third portion of the field of view 108, etc.
  • a portion of the field of view 108 associated with a segment (illustratively, a portion illuminated by the light deflected by that segment) may differ from a portion of the field of view 108 associated with another segment in at least one coordinate along one field of view direction, as described in further detail below.
  • the optical system 101 may include a controller 112 configured to control a continuous movement of the optical component 104.
  • the controller 112 may be configured to control one or more properties associated with the continuous movement of the optical component 104, e.g. a starting time, an end time, a speed, an acceleration, etc.
  • the optical system 101 may include a motor (not shown) configured to drive the continuous movement of the optical component 104.
  • the motor may be part of the controller 112, or the controller 112 may be coupled with the motor, and the controller 112 may be configured to control an operation of the motor.
  • the motor may include an electrical motor, such as an AC motor or a DC motor.
  • the motor may include a spindle motor, or a servo motor.
  • the motor may be a motor used in hard disk drive technology (e.g., capable of achieving 250 rps) .
  • the controller may be configured to control the light source 102 to emit light in accordance (e.g., in synchronization) with the continuous movement of the optical component 104, as described in further detail below (see for example FIG. 3A to FIG. 3D) .
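The synchronization idea can be pictured with the following sketch: the controller derives the rotation phase from time, maps it to the currently exposed segment, and fires pulses with that mapping in hand. All names (`rotation_phase_deg`, `fire_pulse`) and all numbers are illustrative assumptions, not taken from the application.

```python
# Illustrative sketch of emitting light in synchronization with a continuous rotation.
import time

NUM_SEGMENTS = 8       # example value
ROTATION_HZ = 75.0     # example frequency of rotation

def rotation_phase_deg(t_s: float) -> float:
    """Rotation angle in degrees at time t_s for a constant-speed rotation."""
    return (t_s * ROTATION_HZ * 360.0) % 360.0

def exposed_segment_index(t_s: float) -> int:
    """Index of the segment exposed to the emitted light at time t_s."""
    return int(rotation_phase_deg(t_s) // (360.0 / NUM_SEGMENTS))

def fire_pulse(segment_index: int, phase_deg: float) -> None:
    print(f"pulse while segment {segment_index} is exposed (phase {phase_deg:.1f} deg)")

t0 = time.monotonic()
for _ in range(5):
    t = time.monotonic() - t0
    fire_pulse(exposed_segment_index(t), rotation_phase_deg(t))
    time.sleep(0.0005)   # several pulses per segment dwell at these example values
```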
  • the continuous movement of the optical component 104 may provide that the segment of the optical component 104 that is illuminated by the light emitted by the light source 102 varies over time.
  • the continuous movement of the optical component 104 may vary a relative arrangement of the optical component 104 (e.g., of the segments) with respect to the light source 102, such that the light emitted by the light source 102 impinges onto the optical component 104 at different locations over time.
  • the type and the properties of the continuous movement may be selected in accordance with the configuration of the optical component 104, e.g. with its shape, with the disposition of the segments, etc.
  • the properties of the continuous movement may also be selected in accordance with desired properties of a scanning of the field of view 108, e.g. with a scanning speed, with an acquisition rate of detected light, etc.
  • the continuous movement of the optical component 104 may include a continuous circular movement, e.g. a continuous rotation around an axis of the optical component 104, for example an axis aligned along a first direction 152 (e.g., a vertical direction with respect to a main surface of the optical component 104) and passing through the center of the optical component 104 in the exemplary configuration in FIG. 1B.
  • the continuous circular movement may include a frequency of rotation in the range from 1 Hz to 400 Hz, for example in the range from 1 Hz to 300 Hz, for example in the range from 10 Hz to 250 Hz, for example a frequency of rotation equal to or greater than 75 Hz.
  • the continuous movement of the optical component 104 may include a continuous linear movement, e.g. a rectilinear movement (for example along a rail) .
  • the continuous linear movement may include a speed of movement in the range from 10 cm/s to 100 cm/s, for example in the range from 20 cm/s to 50 cm/s.
  • a speed or a frequency of rotation of a continuous movement of the optical component 104 may be adapted in accordance with a frame rate of the scanning of the field of view 108, as described in further detail below.
  • the optical component 104 may be configured such that the light emitted by the light source 102 impinges onto a different segment during different portions of the continuous movement (e.g., during different time intervals) .
  • the configuration of the optical component 104 may be understood as an arrangement of the segments relative to the light source 102 (and of the segments relative to one another) that provides that the continuous movement of the optical component 104 allows different segments to be illuminated by the light emitted by the light source 102 during the continuous movement.
  • the optical component 104 may be configured in such a way that the light emitted by the light source 102 impinges onto the first segment 106-1 during a first portion of the continuous movement of the optical component 104, that the light emitted by the light source 102 impinges onto the second segment 106-2 during a second portion of the continuous movement of the optical component 104, that the light emitted by the light source 102 impinges onto the third segment 106-3 during a third portion of the continuous movement of the optical component 104, etc.
  • the duration of a portion of the continuous movement during which light emitted by the light source 102 impinges onto a segment may be adapted by controlling the properties of the continuous movement and the properties of the segment, e.g. by adapting a frequency of rotation or a speed of the continuous movement, and/or by adapting a lateral extension (e.g., a width) of the segments.
  • a portion of the continuous movement during which light emitted by the light source 102 impinges onto a segment may also be referred to herein as a portion of the continuous movement associated with that segment.
  • a portion of the continuous movement associated with a segment may have a duration in the range from 100 µs to 300 ms, for example in the range from 5 ms to 10 ms, for example in the range from 20 µs to 500 µs, for example about 2.2 ms.
  • the optical component 104 may be configured such that the portions of the continuous movement associated with different segments have a same duration.
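Under the assumption (not stated explicitly in the text) that one full revolution exposes each of n equally wide segments exactly once, the duration of the portion associated with one segment follows directly from the rotation frequency, as in this small sketch:

```python
# Back-of-envelope sketch: dwell time per segment for a continuous rotation,
# assuming n equally wide segments, each exposed once per revolution.
def segment_dwell_s(rotation_hz: float, num_segments: int) -> float:
    return 1.0 / (rotation_hz * num_segments)

print(segment_dwell_s(75.0, 8))   # ~0.00167 s, i.e. about 1.7 ms per segment
```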
  • control over the emission direction of the light into the field of view 108 may be provided also in a configuration in which the optical component 104 is stationary and the light source 102 is continuously moved to emit light onto different segments of the optical component 104.
  • control over the emission direction of the light into the field of view 108 may also be provided in a configuration in which both the optical component 104 and the light source 102 are stationary, and an optical arrangement is used to continuously vary an impinging location of the light onto the optical component 104.
  • the continuous movement of the optical component 104 may be understood, in some aspects, as a continuous variation of the relative arrangement between the optical component 104 and the light source 102, or as a continuous variation of the impinging location of the light emitted by the light source 102 onto the optical component 104.
  • An impinging location of the light may also be referred to herein as an impingement location of the light.
  • the configuration of the optical component 104 in combination with the continuous variation of the relative arrangement between the optical component 104 and the light source 102 provide a continuous control over the emission direction of the light into the field of view 108.
  • the light may be directed towards the first portion of the field of view 108
  • the light may be directed towards the second portion of the field of view 108
  • the light may be directed towards the third portion of the field of view 108, etc .
  • the emission direction into the field of view 108 may be varied along one or more directions, e.g. at least along a first direction (e.g., the direction 152 in FIG. 1A and FIG. IB) and a second direction (e.g., the direction 154 in FIG. 1A and FIG. IB) .
  • The redirection of the light emitted by the light source 102 operated by the optical component 104 and by the continuous variation of the impinging location of the light onto the optical component 104 is further illustrated in FIG. 1C to FIG. 1G.
  • FIG. 1C and FIG. 1D illustrate schematically the deflection of light operated by one of the segments of the optical component 104, e.g. by the first segment 106-1.
  • FIG. 1E and FIG. 1F illustrate schematically the deflection of light operated by another one of the segments of the optical component 104, e.g. by the second segment 106-2.
  • the emission direction provided by a segment of the optical component 104 may form a respective angle with the optical axis 110 of the optical system 101 (for the sake of representation illustrated as passing through the central part of a segment in FIG. 1C to FIG. 1F).
  • the angle formed by the emission direction with the optical axis 110 may have one component in the first direction 152 and another component in the second direction 154.
  • an emission direction may be at an angle with the optical axis 110 both in the first direction 152 and in the second direction 154.
  • a segment of the optical component 104 may be configured to provide a respective deflection angle for the light.
  • each segment may be configured such that light impinging onto the segment is deflected along a direction forming a deflection angle with the optical axis 110 that is associated (only) with that segment.
  • the deflection angle may also be referred to herein as emission angle.
  • the first emission direction 114 (provided by the first segment 106-1) may be at a first emission angle with respect to the optical axis 110 of the optical system 101.
  • the first emission angle may include a first component 116-1 along the first field of view direction 152 (see FIG. 1C) and a second component 116-2 along the second field of view direction 154 (see FIG. 1D).
  • a second emission direction 118 (provided by the second segment 106-2) may be at a second emission angle with respect to the optical axis 110 of the optical system 101.
  • the second emission angle may include a respective first component 120-1 (also referred to herein as third component 120-1) along the first field of view direction 152 (see FIG. 1E) and a respective second component 120-2 (also referred to herein as fourth component 120-2) along the second field of view direction 154 (see FIG. 1F). It is understood that the same may apply to the other segments not illustrated in FIG. 1C to FIG. 1F, e.g.
  • a third emission direction (provided by the third segment 106-3) may be at a third emission angle with respect to the optical axis 110 of the optical system 101, and the third emission angle may include a respective first component (also referred to herein as fifth component) along the first field of view direction 152 and a respective second component (also referred to herein as sixth component) along the second field of view direction 154, etc.
  • the segments of the optical component 104 may be configured such that at least one of the respective first component or second component formed by the respective emission direction with the optical axis 110 is associated only with that segment.
  • the segments of the optical component 104 may be configured to provide a respective deflection angle associated (only) with that segment into one of the first direction 152 or the second direction 154.
  • the deflection into the other one of the first direction 152 or the second direction 154 may be provided by the continuous variation of the relative arrangement of the optical component 104 with respect to the light source 102.
  • each segment may be configured to provide a respective deflection associated (only) with that segment into the first direction 152 (e.g., in the vertical direction), and the deflection in the second direction 154 (e.g., in the horizontal direction) may be provided by the continuous movement of the optical component 104 with respect to the light source 102.
  • the same may apply in case each segment was configured to provide a respective deflection associated with that segment into the second direction 154, and in case the deflection in the first direction 152 was provided by the continuous movement of the optical component 104 with respect to the light source 102.
  • a portion of the field of view 108 illuminated by the light deflected by a segment may have a coordinate along the first direction 152 (a vertical coordinate in the field of view 108) associated (only) with that segment.
  • the first portion associated with the first segment 106-1 may have a first vertical coordinate
  • the second portion associated with the second segment 106-2 may have a second vertical coordinate
  • the third portion associated with the third segment 106-3 may have a third vertical coordinate, etc.
  • a segment may be configured such that the component of the angle formed by the respective emission direction with the optical axis 110 that is associated with that segment (e.g., the respective first component along the first direction 152, in the exemplary configuration described herein) may remain constant during the portion of the continuous movement of the optical component 104 during which the light impinges onto that segment.
  • light is deflected by a segment at a same angle along the first direction 152 during the period in which the light impinges onto that segment.
  • the first segment 106-1 may be configured such that the first component 116-1 of the first emission angle remains constant during the first portion of the continuous movement of the optical component 104.
  • the second segment 106-2 may be configured such that the respective first component 120-1 of the second emission angle remains constant during the second portion of the continuous movement of the optical component 104.
  • the third segment 106-3 may be configured such that the respective first component of the third emission angle remains constant during the third portion of the continuous movement of the optical component 104, etc.
  • the first component 116-1 of the first emission angle may be different from the first component 120-1 of the second emission angle (and from the first component of the third emission angle, from the first component of the fourth emission angle, etc.).
  • the first vertical coordinate of the first portion of the field of view 108 may be different from the second vertical coordinate of the second portion of the field of view 108 (and from the third vertical coordinate of the third portion of the field of view 108, etc.)
  • a difference between the first component of emission angles associated with adjacent segments of the optical component 104 may be selected in accordance with a desired resolution in the first direction 152 (a desired vertical resolution), and in accordance with a total number of segments.
  • a desired resolution may be dependent on the properties of a detector used for detecting light from the field of view 108 (see for example FIG. 1H) , e.g. on a number of pixels of the detector.
  • the field of view 108 may be figuratively divided in a first number of pixels along the first direction 152 and in a second number of pixels along the second direction 154.
  • the total number of pixels may define the resolution with which the field of view 108 may be illuminated, and with which light may be detected from the field of view 108.
  • the difference between the first component of emission angles associated with adjacent segments may be selected based on how many pixels of the field of view 108 may be simultaneously illuminated along the first direction 152 by the light emitted by the light source 102.
  • the field of view 108 may include 504 pixels in the second direction 154 and 128 pixels in the first direction 152 (e.g., in case a detector including 128 photo diodes is used, as discussed in further detail below) .
  • the light source 102 may be configured to simultaneously illuminate 16 pixels in the first direction 152 (e.g., in case the light source 102 includes 4 laser diodes) .
  • the 128 pixels correspond to an angular range of 24°
  • the 504 pixels correspond to an angular range of 120°.
  • a difference of 3° between adjacent segments may be provided for a resolution of 24°/128 ≈ 0.19°.
  • a difference of 0.19° may be provided to have the same resolution.
  • a difference (e.g., an absolute value of a difference) between the first component of emission angles associated with adjacent segments (e.g., a difference between the first component 116-1 of the first emission angle and the first component 120-1 of the second emission angle) may be in the range from 0.1° to 10°, for example in the range from 0.25° to 5°, for example 1.5°, for example 3°.
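The figures quoted above can be tied together with a short worked example. It assumes, as the text suggests, that the 16 simultaneously illuminated pixels span the vertical step between adjacent segments; this is an illustration of the arithmetic, not a definitive design.

```python
# Worked example of the resolution arithmetic using the numbers quoted above.
V_PIXELS, V_RANGE_DEG = 128, 24.0    # pixels and angular range in the first direction 152
H_PIXELS, H_RANGE_DEG = 504, 120.0   # pixels and angular range in the second direction 154
PIXELS_PER_SHOT = 16                 # pixels illuminated simultaneously in the first direction

v_resolution = V_RANGE_DEG / V_PIXELS            # 24/128 = 0.1875 deg, i.e. ~0.19 deg
h_resolution = H_RANGE_DEG / H_PIXELS            # ~0.238 deg per pixel
segment_step = PIXELS_PER_SHOT * v_resolution    # 16 * 0.1875 = 3.0 deg between adjacent segments
num_segments = V_PIXELS // PIXELS_PER_SHOT       # 128 / 16 = 8 segments cover the 24 deg range

print(v_resolution, h_resolution, segment_step, num_segments)
```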
  • a segment may be configured such that the respective second component of the emission angle in the second direction 154 varies in a predefined range during the portion of the continuous movement of the optical component 104 associated with that segment.
  • the variation of the relative arrangement between the optical component 104 and the light source 102 during the continuous movement provides that the light is deflected at a varying angle (e.g., at an angle having a varying second component) in the second direction 154.
  • the variation of the relative arrangement between the optical component 104 and the light source 102 during the continuous movement provides that the light impinges onto a segment at varying impinging locations, each associated with a respective second component of the emission angle (e.g., each impinging location providing a respective deflection into the field of view 108 along the second direction 154) .
  • the continuous movement of the optical component 104 may provide that a relative orientation of a segment with respect to the light source 102 varies during the portion of the continuous movement associated with that segment, in such a way that the light is deflected at an angle having a varying component in the second direction 154.
  • the first segment 106-1 may be configured such that the second component 116-2 of the first emission angle varies during the first portion of the continuous movement of the optical component 104.
  • the optical component 104 may be configured in such a way that the light emitted by the light source 102 impinges onto the first segment 106-1 at a plurality of first impinging locations during the first portion of the continuous movement of the optical component 104, each first impinging location being associated with a respective second component 116-2 of the first emission angle .
  • the optical component 104 may be configured in such a way that an orientation between the first segment 106-1 and the light source 102 varies during the first portion of the continuous movement of the optical component 104, such that the second component 116-2 of the first emission angle varies accordingly.
  • the second segment 106-2 may be configured such that the respective second component 120-2 of the second emission angle varies during the second portion of the continuous movement of the optical component 104.
  • the optical component 104 may be configured in such a way that the light emitted by the light source 102 impinges onto the second segment 106-2 at a plurality of second impinging locations during the second portion of the continuous movement of the optical component 104, each second impinging location being associated with a respective second component 120-2 of the second emission angle.
  • the optical component 104 may be configured in such a way that an orientation between the second segment 106-2 and the light source 102 varies during the second portion of the continuous movement of the optical component 104, such that the second component 120-2 of the second emission angle varies accordingly.
  • the third segment 106-3 may be configured such that the respective second component of the third emission angle varies during the third portion of the continuous movement of the optical component 104, etc.
  • the optical component 104 may be configured in such a way that the impinging location of the light emitted by the light source 102 onto a segment moves along that segment in a direction parallel to the second direction 154 during the portion of the continuous movement of the optical component 104 associated with that segment.
  • the optical component 104 may be configured in such a way that the continuous variation of the relative arrangement between the optical component 104 and the light source 102 provides that the impinging location of the light onto a segment travels along the extension (e.g., the width) of the segment in the second direction 154 during the associated portion of the continuous movement.
  • the translation of the impinging location in the second direction 154 provides the variation of the second component of the emission angle associated with that segment.
  • the optical component 104 may be configured in such a way that the impinging location of the light emitted by the light source 102 onto the first segment 106-1 moves along the first segment 106-1 in a direction parallel to the second direction 154 during the first portion of the continuous movement of the optical component 104.
  • the optical component 104 may be configured in such a way that the impinging location of the light emitted by the light source 102 onto the second segment 106-2 moves along the second segment 106-2 in a direction parallel to the second direction 154 during the second portion of the continuous movement of the optical component 104. It is understood that the same may apply for the other segments not illustrated in FIG. 1C to FIG. 1F, e.g.
  • the optical component 104 may be configured in such a way that the impinging location of the light emitted by the light source 102 onto the third segment 106-3 moves along the third segment 106-3 in a direction parallel to the second direction 154 during the third portion of the continuous movement of the optical component 104, etc.
  • the second component of the emission angle associated with different segments may vary within a same angular range.
  • the movement of the impinging location of the light onto a segment may provide that the second component of the emission angle varies from an initial value (as soon as the light starts impinging onto that segment) to a final value (when the light reaches the end of the segment, before moving onto the next segment).
  • the angular range may be dependent on the lateral extension of the segment in the second direction 154, and on the relative orientation between the segment and the light source 102.
  • the angular range for the second component may be between -60° and +60° with respect to the optical axis 110 of the optical system 101 along the second direction 154, for example between -45° and +45°, for example between -30° and +30°.
  • a total angular range for the field of view 108 in the second direction 154 may be for example 120°, for example 90°, for example 60°.
  • the configuration of the optical system 101 may be adapted to provide a desired angular range, e.g. to provide a desired dimension of the field of view 108 to be scanned with the emitted light.
  • each segment may be configured such that the respective second component of the emission angle varies within a same angular range as the second component of the emission angle associated with the other segments.
  • the second component 116-2 of the first emission angle and the second component 120-2 of the second emission angle may vary within a same angular range during the respective (first and second) portion of the continuous movement of the optical component 104.
  • the second component of the third emission angle may also vary within the same angular range as the second component 116-2 of the first emission angle and the second component 120-2 of the second emission angle during the respective (third) portion of the continuous movement of the optical component 104, etc.
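One way to picture this horizontal sweep is as a mapping from the progress through a segment's portion of the continuous movement to the second component of the emission angle; the linear mapping and the ±60° range in the sketch below are illustrative assumptions only, not values fixed by the application.

```python
# Illustrative sketch: the second (horizontal) component of the emission angle is assumed
# to vary linearly with the progress through the portion of the continuous movement
# associated with a segment, over the same angular range for every segment.
H_MIN_DEG, H_MAX_DEG = -60.0, 60.0   # example angular range along the second direction 154

def horizontal_component_deg(progress: float) -> float:
    """Second component of the emission angle for progress in [0, 1] through the
    portion of the continuous movement associated with the current segment."""
    return H_MIN_DEG + progress * (H_MAX_DEG - H_MIN_DEG)

for p in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"progress {p:.2f} -> {horizontal_component_deg(p):+.1f} deg")
```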
  • An illustrative representation of the scanning of the field of view 108 provided by the optical system 101 is shown in FIG. 1G.
  • the light emitted into the field of view 108 is represented by the circle 122.
  • the light moves in the field of view 108 along the second direction 154 at a respective (vertical) coordinate in the first direction 152 (provided by the respective first component of the emission angle associated with the illuminated segment) .
  • the light 122 may move along a first trajectory 124-1 at a first vertical coordinate during a first portion of the continuous movement, then along a second trajectory 124-2 at a second vertical coordinate during a second portion of the continuous movement, then along a third trajectory 124-3 at a third vertical coordinate during a third portion of the continuous movement, and so on, up to an n-th trajectory 124-n at an n-th vertical coordinate during a n-th portion of the continuous movement.
  • the first trajectory 124-1 may be provided by light impinging onto the first segment 106-1
  • the second trajectory 124-2 may be provided by light impinging onto the second segment 106-2
  • the third trajectory 124-3 may be provided by light impinging onto the third segment 106-3, etc.
  • the emission of light in the field of view 108 may be understood as a line-by-line scan of the field of view 108 (or column-by-column in case a reversed configuration was implemented) , e.g. analogous to a cathode-ray tube for a television.
  • the emission of light may be understood as a line-by-line horizontal deflection of the light (e.g., of a laser beam) over the entire angular range (e.g., over an angular range of 120°) .
  • An extent of a trajectory in the second direction 154 may be defined by the angular range provided for the second component of the emission angle.
  • a distance between the respective coordinate in the first direction 152 between different trajectories may be defined by the difference between the respective first component of the emission angle associated with the respective segments.
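The resulting line-by-line pattern can be sketched by combining the per-segment vertical coordinate with the horizontal sweep; the generator below is only an illustration of the scan geometry (the vertical angles, range, and sample count are made-up example values).

```python
# Illustrative sketch of the line-by-line scan: one horizontal line per segment,
# each at the vertical coordinate associated with that segment.
def scan_pattern(vertical_angles_deg, h_range_deg=(-60.0, 60.0), samples_per_line=5):
    h_min, h_max = h_range_deg
    for v in vertical_angles_deg:                 # one trajectory per segment
        for k in range(samples_per_line):         # sweep along the second direction 154
            h = h_min + (h_max - h_min) * k / (samples_per_line - 1)
            yield (h, v)                          # (horizontal, vertical) emission angle

vertical_angles = [-10.5 + 3.0 * i for i in range(8)]   # example: 8 segments, 3 deg apart
for point in scan_pattern(vertical_angles):
    print(point)
```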
  • a resolution in the first direction 152 may be associated with the properties of the light source 102.
  • each light source may be configured (e.g., arranged) to illuminate a respective location of a segment of the optical component 104 during the associated period of the continuous movement.
  • each light source may be configured to illuminate the segment at a respective coordinate along the lateral extension of the segment parallel to the first direction 152, e.g. at a respective height in the segment.
  • This configuration may provide that the light redirected by a segment into the field of view 108 includes a plurality of components (each associated with the respective light source) , each at a respective vertical coordinate (within the vertical coordinate of the trajectory associated with that segment) . This may provide a finer resolution for the illumination of the scene compared to using a single light source, e.g. with a broader laser spot.
  • the plurality of light sources may be configured such that a segment is fully illuminated along the lateral extension of the segment parallel to the first direction 152 by the light emitted by the plurality of light sources.
  • the plurality of light sources may be configured such that a segment is fully illuminated along its height by the light provided by the different light sources at different coordinates.
  • a resolution in the first direction 152 may increase with an increasing number of light sources.
  • a number of illuminated pixels of the field of view 108 in the vertical direction may increase with an increasing number of light sources.
  • four laser diodes may illuminate 16 pixels in the vertical direction 152 of the field of view 108 (and 1 pixel in the horizontal direction 154 ) .
  • the optical component 104 may be configured to provide an improved detection of light from the field of view 108, e.g. a light detection with reduced noise, e.g. a light detection with increased signal-to-noise ratio.
  • FIG. 1H shows the optical system 101 in a schematic top view according to various aspects. In the representation in FIG. 1H, the optical system 101 is illustrated with components configured for detecting light from the field of view 108. The configuration in FIG. 1H may be optionally implemented in case detection of light should be performed via the optical system 101, otherwise the optical system 101 may be configured as described in relation to FIG. 1B in case detection of light was assigned to a different detection system or was based on a different strategy.
  • the optical system 101 may include a detector 126 configured to detect light, e.g. the detector 126 may be configured to provide an analog signal (e.g., a current or a voltage) in accordance with the light received at the detector 126.
  • the detector 126 may include one or more photo diodes, each configured to provide an analog signal (e.g., a photo current) in response to light impinging onto the photo diode.
  • the one or more photo diodes may include at least one of a pin photo diode, an avalanche photo diode, a single photon avalanche photo diode, or a silicon photomultiplier, as examples.
  • the detector 126 may include a plurality of photo diodes (e.g., of the same type or of different types) , illustratively, the detector 126 may include a plurality of pixels each including or associated with a respective photo diode.
  • the plurality of photo diodes may form an array, e.g. a one-dimensional or two-dimensional array.
  • the photo diodes may be disposed along one direction (e.g., a first direction, such as a vertical direction or a horizontal direction), or may be disposed along two directions, e.g. a first (e.g., horizontal) direction and a second (e.g., vertical) direction.
  • the photo diodes may form a column array or a line array.
  • the detector 126 may include 32 photo diodes, or 64 photo diodes, or 128 photo diodes. The number of photo diodes may be selected in accordance with a desired resolution, as described above.
  • the plurality of photo diodes may form an array along a direction parallel to the direction in which the segments of the optical component 104 provide a respective different deflection angle, e.g. the plurality of photo diodes may be aligned along a direction parallel to the first field of view direction 152 (forming a column array).
  • the detector 126 may have an optical aperture in the range from 50 mm² to 1000 mm², for example in the range from 150 mm² to 600 mm², for example 400 mm².
  • the optical component 104 may include a plurality of receive optical elements (also referred to herein as tracking filters), each associated with a respective segment of the plurality of segments.
  • the optical component 104 may include a first receive optical element 128-1 associated with the first segment 106-1, a second receive optical element 128-2 associated with the second segment 106-2, a third receive optical element 128-3 associated with the third segment 106-3, a fourth receive optical element 128-4 associated with the fourth segment 106-4, a fifth receive optical element 128-5 associated with the fifth segment 106-5, a sixth receive optical element 128-6 associated with the sixth segment 106-6, a seventh receive optical element 128-7 associated with the seventh segment 106-7, and an eighth receive optical element 128-8 associated with the eighth segment 106-8.
  • the number of receive optical elements illustrated in FIG. 1H is only an example, and an optical component 104 may include a desired number of receive optical elements (e.g., one for each segment).
  • a receive optical element may be disposed relative to the associated segment in such a way that a direct reflection of the emitted light deflected by the segment impinges onto that receive optical element.
  • a receive optical element may be disposed in the optical component 104 such that, when the continuous movement of the optical component 104 brings the associated segment to be illuminated by the light emitted by the light source 102, the receive optical element is in a position to receive light from the portion of the field of view 108 associated with that segment.
  • a rigid linear relationship (a rigid linear link) may be provided between a receive optical element and the associated segment.
  • a receive optical element may be oriented with respect to the associated segment (and with respect to the field of view) in such a way that the light deflected by the segment into the field of view is received at the receive optical element.
  • the first segment 106-1 and the first receive optical element 128-1 are disposed relative to one another such that the first receive optical element 128-1 receives light associated with a direct reflection 114r of the light deflected in the first emission direction 114.
  • the second segment 106-2 and the second receive optical element 128-2 may be disposed relative to one another such that the second receive optical element 128-2 receives light associated with a direct reflection of the light deflected in the second emission direction.
  • the third segment 106-3 and the third receive optical element 128-3 may be disposed relative to one another such that the third receive optical element 128-3 receives light associated with a direct reflection of the light deflected in the third emission direction, etc.
  • a direct reflection is not the only mechanism that causes light to travel back towards the optical system 101 from the field of view 108; the light may, for example, also be scattered back.
  • the first receive optical element 128-1 and the first segment 106-1 may be disposed relative to one another (e.g., oriented with respect to one another) such that the first receive optical element 128-1 receives light coming from the first portion of the field of view 108,
  • the second receive optical element 128-2 and the second segment 106-2 may be disposed relative to one another such that the second receive optical element 128-2 receives light coming from the second portion of the field of view 108,
  • the third receive optical element 128-3 and the third segment 106-3 may be disposed relative to one another such that the third receive optical element 128-3 receives light coming from the third portion of the field of view 108, etc.
  • a receive optical element may be configured to allow the light coming from the (illuminated) portion of the field of view 108 associated with the respective segment to reach the detector 126.
  • a receive optical element may be configured such that only the light associated with the direct reflection of the light deflected by the associated segment is delivered to the detector 126.
  • the plurality of receive optical elements may ensure that the direct reflection of the light directed towards the field of view 108 during a respective period of the continuous movement of the optical component is received at the detector 126, and at the same time may ensure that other light (e.g., noise light, such as sun light, or light from other sources in the field of view 108) may not arrive at the detector, as described in further detail below (see also FIG. 1I).
  • the plurality of receive optical elements may ensure that the light coming from the portion of the field of view 108 into which light was emitted (or is being emitted) is received at the detector 126, while light from other portions of the field of view is prevented from reaching the detector 126.
  • illustratively, a receive optical element may be configured such that only a subset of pixels of the field of view in the horizontal direction (the illuminated subset of pixels) is imaged onto the detector 126, for example 5 pixels, for example 10 pixels, for example 20 pixels (see also the sketch below).
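The tracking-filter idea in the preceding points can be made concrete with a small back-of-the-envelope calculation. The sketch below (Python, purely illustrative) estimates what fraction of a uniform background would still reach the detector 126 when only the illuminated horizontal subset of pixels is imaged; the total of 504 horizontal pixels is taken from the imaging example given later in the text, while the assumption of a uniform background and the helper itself are mine.

```python
# Illustrative sketch only: rough background-light suppression obtained when a
# receive optical element images only the illuminated horizontal pixel subset.
# Assumes a uniform background over the horizontal field of view.

def background_suppression_factor(total_horizontal_pixels: int,
                                  imaged_horizontal_pixels: int) -> float:
    """Fraction of uniform background light that still reaches the detector."""
    return imaged_horizontal_pixels / total_horizontal_pixels

# 504 horizontal pixels as in the imaging example given later in the text;
# 5, 10 and 20 imaged pixels as quoted above.
for imaged in (5, 10, 20):
    factor = background_suppression_factor(504, imaged)
    print(f"{imaged:2d} imaged pixels -> {factor:.1%} of the background passes")
```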
  • a receive optical element may be configured to transmit or reflect the received light.
  • a receive optical element may be configured to be optically transparent (for the wavelength of the emitted light) such that the light received at the receive optical element may travel further towards the detector (while the other light gets blocked by the optical component 104).
  • a receive optical element may have a transmission rate greater than 70%, for example greater than 90%, for example substantially 100%.
  • a receive optical element may be configured to reflect the received light towards the detector 126 (while the other light may be blocked by or may pass through the optical component 104).
  • a receive optical element may have a reflectivity (at the wavelength of the emitted light) greater than 70%, for example greater than 90%, for example substantially 100%.
  • the first receive optical element 128-1, the second receive optical element 128-2, the third receive optical element 128-3, etc. may be configured to transmit or reflect the received light (the light associated with a direct reflection of the light deflected by the respective segment, i.e. the light coming from the portion of the field of view 108 associated with the respective segment).
  • the optical component 104 may have light transmission or light absorbing properties in accordance with the configuration of the receive optical elements.
  • the optical component 104 may be configured to absorb light (e.g., in case the receive optical elements are transparent) , e.g. may be configured such that light arriving onto the optical component 104 (at a location other than a segment or a receive optical element) is absorbed.
  • the light arriving onto the optical component 104 (e.g., from the field of view 108) may be blocked by the optical component 104, without traveling further to the detector 126.
  • the optical component 104 (at a location other than a segment or a receive optical element) may be configured to have a transmission rate less than 20%, for example less than 10%, for example less than 1%.
  • the optical component 104 may be configured to let light pass through the optical component 104 (e.g., in case the receive optical elements are reflective) .
  • the optical component 104 may be transparent, so that light that is not reflected by a receive optical element towards the detector 126 may travel away.
  • the properties of the receive optical elements may be adapted in accordance with the configuration of the optical component 104 and of the optical system 101, e.g. the size, the shape, the arrangement, etc., of the receive optical elements may be adapted based on the configuration of the segments, based on a disposition of the detector 126 in the optical system 101, etc. Possible configurations and arrangements of the receive optical elements will be described in further detail below.
  • a receive optical element may include a width in the range from 1 mm to 15 mm, for example in the range from 1.5 mm to 10 mm, for example in the range from 2 mm to 6 mm, and a height in the range from 1 mm to 15 mm, for example in the range from 1.5 mm to 10 mm, for example in the range from 2 mm to 6 mm.
  • a receive optical element may have an elongated shape, for example a rectangular shape.
  • An illustrative representation of the detection of light from the field of view 108 is shown in FIG. 1I.
  • the light associated with the portion of the field of view 108 that is currently illuminated is represented by the circle 130.
  • the first receive optical element 128-1 allows following the illuminated portion 130 along the first trajectory 124-1, such that this light may be provided to the detector 126, while a light absorbing portion 132 of the optical component 104 absorbs or blocks other light that may otherwise arrive at the detector 126.
  • a frame rate for the detection of light from the field of view 108 may be associated with a frequency of rotation (or a speed) of the continuous movement of the optical component 104.
  • a frame may be understood as a complete scan of the field of view 108 (in both the horizontal direction and the vertical direction) .
  • a frame rate may be understood as a number of frames that are acquired in a defined period of time, e.g. in 1 s. Averaging over several scans of the field of view (over several frames) may be carried out, e.g. a frame may include a plurality of accumulated scans of the field of view.
  • the rotation frequency may be a multiple of the frame rate.
  • a rotation frequency may be 250 Hz.
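As a small numerical illustration of the relation between rotation frequency and frame rate stated in the preceding points, the following sketch (a minimal example, not part of the patent text) divides the rotation frequency by an assumed number of accumulated scans per frame; the 250 Hz value is quoted above, while the choice of 10 accumulated scans is only an assumption that happens to yield the 25 Hz frame rate mentioned later in the text.

```python
# Minimal sketch: frame rate when each frame accumulates several full scans
# (rotations) of the field of view. Numbers are illustrative assumptions.

def frame_rate(rotation_frequency_hz: float, scans_per_frame: int) -> float:
    """Frame rate obtained when `scans_per_frame` rotations are accumulated per frame."""
    return rotation_frequency_hz / scans_per_frame

print(frame_rate(250.0, 10))  # -> 25.0 frames per second (with the assumed values)
```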
  • the light detection described above may be illustratively understood, in some aspects, as a tracking of the scanned horizontal and vertical angle range through an optical tracking filter to increase the signal-to-noise ratio.
  • Other aspects associated with the optical component 104 will be described in further detail below.
  • In the following, exemplary realizations of the optical component 104 are shown. It is understood that the properties described in relation to the illustrated exemplary realizations may apply also to other configurations of the optical component 104 that are not explicitly illustrated. As an example, the properties described in relation to an optical component having a disk shape (see for example FIG. 2A) may apply also to an optical component having a different shape, and the like.
  • FIG. 2A and FIG. 2B show an optical component 200 in a schematic view according to various aspects.
  • the optical component 200 may be an exemplary realization of the optical component 104 described in relation to FIG. 1A to FIG. II.
  • the optical component 200 may have a disk shape (it may be described as a segment disc). It is understood that the disk shape is an example, and the optical component 200 may have other shapes, such as a polygonal shape having a number of sides associated with the number of segments (e.g., a hexagonal shape, a heptagonal shape, an octagonal shape, a decagonal shape, etc.), or a band-like shape as described in further detail below.
  • An optical component (e.g., the optical component 200) may have a symmetric shape, e.g. a shape that provides a smooth transition from one deflection angle to the next (from one segment to the next) .
  • the optical component 200 may have a symmetrical mass distribution (concentricity) .
  • the optical component 200 may have a rigid structure.
  • the size of an optical component may be adapted according to the configuration of an optical system (e.g., of the optical system 101) .
  • the disk shaped optical component may have a radius in the range from 10 mm to 100 mm, for example in the range from 25 mm to 75 mm, for example in the range from 30 mm to 60 mm, for example in the range from 25 mm to 50 mm, for example a radius of 50 mm.
  • the optical component 200 may include a main surface 202s (e.g., including a main top surface and a main bottom surface) and a side surface 204s (e.g., including a plurality of side surfaces 204s-1, 204s-2, 204s-3, also referred to as plurality of partial side surfaces).
  • the plurality of segments may be disposed along the side surface 204s of the optical component 200.
  • the side surface 204s of the optical component 200 may be adapted to provide the plurality of segments, e.g. each partial side surface may correspond to a respective segment.
  • a first segment may include the first side surface 204s-1 (e.g., a first segment may correspond to the first side surface 204s-1),
  • a second segment may include the second side surface 204s-2 (e.g., a second segment may correspond to the second side surface 204s-2),
  • a third segment may include the third side surface 204s-3 (e.g., a third segment may correspond to the third side surface 204s-3), etc.
  • the segments may be disposed along the side surface of the disk shaped optical component 200.
  • the optical component 200 may include more than three adapted segments, e.g. as described above in relation to the optical component 104.
  • a segment may be understood, in some aspects, as an edge segment of the optical component 200.
  • the deflection angle associated with a segment may be provided by adapting the geometrical properties of the segment.
  • each segment (each partial side surface) may be tilted at a respective tilting angle with respect to the main surface 202s of the optical component 200.
  • the tilting angle associated with a segment may be understood as an angle formed between the surface of the segment (e.g., the associated side surface) and the main surface 202s of the optical component.
  • the tilting angle associated with a segment may be also understood as an angle formed between the surface of the segment and the direction from which light impinges on that segment (see FIG. 2B) .
  • In the exemplary configuration in FIG. 2B:
  • the first side surface 204s-l may be tilted at a first tilting angle with respect to the main surface 202s
  • the second side surface 204s-2 may be tilted at a second tilting angle with respect to the main surface 202s
  • the third side surface 204s-3 may be tilted at a third tilting angle with respect to the main surface 202s, etc.
  • a difference between the tilting angle of adjacent segments (e.g., between the first tilting angle and the second tilting angle) may correspond to a difference between the respective deflection angles.
  • a tilting angle of a segment may be described relative to an angle of incidence of the light onto the segment (e.g., light as emitted by a light source 206, for example configured as the light source 102) .
  • the deflection angle provided by a segment may be expressed in terms of a variation of such incidence angle, e.g. in terms of a difference between the first component of the incidence angle in the first direction 152 and the first component of the emission angle in the first direction 152.
  • the variation may be, for example, in the range from -10° to +10°, for example in the range from -5°to +5°, for example may be -1.5° or +1.5°.
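To make the per-segment deflection angles tangible, the sketch below distributes a vertical field of view evenly over the segments; the 8-segment and 24° figures match the example given towards the end of the text, but the even spacing and the helper itself are assumptions for illustration (the mapping from deflection angle to the mechanical tilting angle of a side surface is not modeled here).

```python
# Illustrative sketch: evenly spaced vertical deflection angles, one per segment.

def segment_deflection_angles(num_segments: int, vertical_fov_deg: float) -> list:
    """Centered vertical deflection angle assigned to each segment."""
    step = vertical_fov_deg / num_segments
    start = -vertical_fov_deg / 2 + step / 2
    return [start + i * step for i in range(num_segments)]

# Assumed example: 8 segments covering a 24 degree vertical field of view.
print(segment_deflection_angles(8, 24.0))
# -> [-10.5, -7.5, -4.5, -1.5, 1.5, 4.5, 7.5, 10.5]
```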
  • a segment may include a concave surface with respect to the main surface 202s of the optical component 200.
  • a concave surface of a segment may be understood, in some aspects, as the surface of the segment having concave character in at least one direction, e.g. in at least the horizontal direction. In the other direction (e.g., in the vertical direction) , the surface of a segment may be substantially planar.
  • the first segment (the first side surface 204s-l) may have a first concave surface
  • the second segment (the second side surface 204s-2) may have a second concave surface
  • the third segment (the third side surface 204s-3) may have a third concave surface, etc.
  • a concave surface may have a radius of curvature in the range from 5 mm to 80 mm, for example in the range from 15 mm to 60 mm, for example in the range from 20 mm to 40 mm.
  • the concave shape of a side surface may be adapted to provide the desired deflection angle associated with the respective segment. In relation to the shape of a segment see also FIG. 2D and FIG. 2E.
  • the radius of curvature may not be uniform, e.g. may vary along the width of the surface, for providing the desired (linear) relationship between the light deflected by a segment and the light received at the associated receive optical element.
  • the continuous movement of an optical component may include a continuous circular movement around an axis of the optical component.
  • the axis around which the continuous rotation occurs may be an axis perpendicular to the main surface of the optical component.
  • the continuous rotation of the optical component 200 may be a rotation around the axis 208 perpendicular to the main surface 202s of the optical component 200.
  • an optical component may include a band-like structure, and the plurality of segments may be disposed as a plurality of stripes on the band-like structure.
  • the band-like structure may be mounted on a frame including one or more rollers that enable moving continuously the band-like structure (e.g., back and forth or revolving around the rollers) .
  • the optical component may be configured as a conveyor belt, continuously moving around the rollers, providing a linear translation of the segments.
  • FIG. 2C to FIG. 2H show a series of graphs 210-1, 210-2, 210-3, 210-4, 210-5, 210-6 illustrating a deflection of light provided by two different segments of the optical component 200.
  • the first graph 210-1, the second graph 210-2, and the third graph 210-3 illustrate the deflection of light provided by one segment (e.g., the first segment 204s-1) of the optical component 200.
  • a coordinate of the deflected light in a first (e.g., vertical) direction may remain constant during the period in which the light impinges onto the first segment (see the graphs 222-1, 222-2, 222-3)
  • a coordinate in a second (e.g., horizontal) direction may vary along the extension of the field of view following the movement of the impinging location of the light on the first segment (see the graphs 220-1, 220-2, 220-3, showing the light emitted towards a left side, towards the center, and towards a right side of the field of view, respectively) .
  • the fourth graph 210-4, the fifth graph 210-5, and the sixth graph 210-6 illustrate the deflection of light provided by another segment (e.g., the second segment 204s-2) of the optical component 200.
  • a coordinate of the deflected light in a first (e.g., vertical) direction may remain constant during the period in which the light impinges onto the second segment (see the graphs 222-4, 222-5, 222-6) , and may be less than the coordinate associated with the first segment 204s-l in this example, while a coordinate in a second (e.g., horizontal) direction may vary along the extension of the field of view following the movement of the impinging location of the light on the second segment (see the graphs 220-4, 220-5, 220-6, showing the light emitted towards a left side, towards the center, and towards a right side of the field of view, respectively) .
  • FIG. 2I and FIG. 2J illustrate the optical component 200 in a schematic view in accordance with various aspects.
  • the representation in FIG. 2I and FIG. 2J may illustrate possible considerations for the dimensioning and the shaping of the segments of the optical component 200.
  • the representations in FIG. 2I and FIG. 2J illustrate the impinging of light onto the optical component 200 from different points of view (e.g., from the top, in FIG. 2I, and from the side, in FIG. 2J).
  • the cutout shape of the segments may be adapted to the linear relationship between the angular rotation of the segment disk and the light deflection angle.
  • the light source 206 includes a laser bar (e.g., with four laser diodes, e.g. a 4-channel laser, having a certain height H_L and a certain width W_L), providing an angle 214 at the output of the laser bar of 25° (e.g., an angle in the horizontal direction, θ_H).
  • the laser light may be collimated onto the optical component 200 by means of a fast axis collimator 216 (FAC) and a slow axis collimator 218 (SAC) , as commonly known in the art.
  • the laser light may be collimated to provide a resolution in the horizontal direction of 0.2° (H_Res), only as a numerical example.
  • the angle in the vertical direction, θ_V, may be left unaltered.
  • a segment of the optical component may be dimensioned such that the collimated light may fall within the extension of the segment in the first (e.g., vertical) direction, e.g. within the height of the segment.
  • the deflection provided by the segment may add or subtract the corresponding deflection angle from the angle at which the light impinges onto the segment.
  • the segment may provide a deflection angle of 0°, leaving the output angle at the 6° provided by the orientation of the light source 206 relative to the segment.
  • FIG. 2K and FIG. 2L each shows a practical realization 250a, 250b of an optical component in a schematic view according to various aspects.
  • the practical realizations 250a, 250b may be practical implementations of the optical component 200 described in FIG. 2A to FIG. 2J.
  • an optical component may be implemented, in some aspects, as a hard disk drive including a rotating (in other words, spinning) disk onto which light emitted by a light source (e.g., by the light source 206) may be directed.
  • an optical component may be implemented as an automotive and industrial 2.5 inch hard drive or 3.5 inch hard drive.
  • Other aspects associated with an optical system (e.g., with the optical system 101) will be described in further detail below, in relation to FIG. 3A to FIG. 3D.
  • FIG. 3A, FIG. 3B, FIG. 3C, and FIG. 3D each shows a respective optical system 300a, 300b, 300c, 300d in a schematic view according to various aspects.
  • These optical systems 300a, 300b, 300c, 300d may be an exemplary implementation of the optical system 101 described in relation to FIG. 1A to FIG. II (e.g., the LIDAR system 100 may include an optical system 300a, 300b, 300c, 300d) .
  • the aspects described above in relation to the optical system 101 may apply to the optical systems 300a, 300b, 300c, 300d shown in FIG. 3A to FIG. 3D, and vice versa .
  • the optical system 300b in FIG. 3B may correspond to the optical system 300a in FIG. 3A, represented for illustrating in more detail possible aspects associated with an optical system 300a, 300b, 300c, 300d.
  • the optical systems 300a, 300b, 300c, 300d may include a light source 302a, 302b, 302c, 302d, e.g. a light source 302a, 302b, 302c, 302d configured as the light source 102 described in relation to FIG. 1B.
  • the optical systems 300a, 300b, 300c, 300d may include an optical component 304a, 304b, 304c, 304d, e.g. an optical component 304a, 304b, 304c, 304d configured as the optical component 104, 200 described in relation to FIG. 1B to FIG. 2L.
  • the optical component 304a, 304b, 304c, 304d may include a plurality of segments 306a, 306b, 306c, 306d configured to deflect the light emitted by the light source 302a, 302b, 302c, 302d towards a field of view 308a, 308b, 308c, 308d of the optical systems 300a, 300b, 300c, 300d, as described above in relation to FIG. 1A to FIG. 2L (e.g., towards a field of view 308a, 308b, 308c, 308d of the LIDAR system).
  • the plurality of segments 306a, 306b, 306c, 306d may be disposed along a side surface of the optical component 304a, 304b, 304c, 304d.
  • the continuous movement of the optical component 304a, 304b, 304c, 304d may be controlled by a controller (not shown), e.g. configured as the controller 112 described in relation to FIG. 1B, and may be driven by a motor (not shown) as described in relation to FIG. 1B (e.g., a spindle motor, or a servo motor).
  • the continuous movement of the optical component 304a, 304b, 304c, 304d may be a continuous circular movement around an axis 310a, 310b, 310c, 310d of the optical component 304a, 304b, 304c, 304d.
  • the light source 302a, 302b, 302c, 302d may be understood as including transmission optics.
  • the optical systems 300a, 300b, 300c, 300d may include a transmission optics arrangement configured to direct (e.g., to steer) the light emitted by the light source 302a, 302b, 302c, 302d towards the optical component 304a, 304b, 304c, 304d (illustratively, towards the segments 306a, 306b, 306c, 306d) .
  • the transmission optics arrangement may be configured to collimate the light emitted by the light source 302a, 302b, 302c, 302d onto the optical component 304a, 304b, 304c, 304d. In some aspects, the transmission optics arrangement may be configured to mix light coming from a plurality of light sources onto the optical component 304a, 304b, 304c, 304d. In some aspects, the transmission optics arrangement may be configured to operate a geometric shape transformation of the light.
  • the transmission optics arrangement may include one or more optical elements, e.g. one or more lenses, such as one or more cylinder lenses, one or more mirrors, and the like.
  • the transmission optics arrangement may include a fast-axis collimator lens and a slow-axis collimator lens (see also FIG. 2I and FIG. 2J) configured to collimate the light emitted by the laser bar onto the optical component 304a, 304b, 304c, 304d.
  • the transmission optics arrangement may include a mirror 305a, 305b, 305c (a deflection mirror) for directing the light emitted by the light source 302a, 302b, 302c, 302d towards the optical component 304a, 304b, 304c.
  • the optical system 300a, 300b, 300c, 300d may also be configured for detecting light from the field of view 308a, 308b, 308c, 308d, e.g. for detecting the direct reflection of emitted light originating from one or more objects 312a, 312b, 312c in the field of view 308a, 308b, 308c, 308d (represented as a person for illustrative purposes in FIG. 3A to FIG. 3C) , e.g. for detecting light from the portions of the field of view 308a, 308b, 308c, 308d into which light was (or is being) emitted.
  • the optical system 300a, 300b, 300c, 300d may include a detector 314a, 314b, 314c, 314d, e.g. configured as the detector 126 described in relation to FIG. 1H, for example a detector including an avalanche photo diode or a plurality of avalanche photo diodes.
  • the use of an optical component 304a, 304b, 304c, 304d as described herein may provide a rapid scanning of the field of view 308a, 308b, 308c, 308d.
  • a frame rate of 25 Hz may be provided (illustratively, during acquisition of a frame the field of view may be scanned three times by means of the light directed thereto by the optical component 304a, 304b, 304c, 304d).
  • the optical component 304a, 304b, 304c, 304d may be configured to improve the light detection, as described above in relation to FIG. 1H and FIG. II.
  • the optical component 304a, 304b, 304c, 304d may include a plurality of receive optical elements 316a, 316b, 316c, 316d, each associated with a respective segment 306a, 306b, 306c, 306d and configured such that the direct reflection of the light deflected by the associated segment may be received at the detector 314a, 314b, 314c, 314d, for example as follows.
  • the receive optical elements 316a, 316b may be configured to reflect the light received from the field of view 308a, 308b towards the detector 314a, 314b.
  • the receive optical elements 316c, 316d may be configured to allow a transmission of the light received from the field of view 308c, 308d through the receive optical element 316c, 316d towards the detector 314c, 314d.
  • the configuration of the receive optical elements 316a, 316b, 316c, 316d may be in accordance with a configuration of the optical system 300a, 300b, 300c, 300d, e.g. with an arrangement of the detector 314a, 314b, 314c, 314d in the optical system 300a, 300b, 300c, 300d .
  • the receive optical elements 316a, 316b, 316c, 316d may be disposed on a main surface of the optical component 304a, 304b, 304c, 304d .
  • the arrangement of the receive optical elements 316a, 316b, 316c, 316d may be in accordance with the arrangement of the associated segment , and with the overall configuration of the optical system 300a, 300b, 300c, 300d .
  • a receive optical element 316a, 316b, 316c may be disposed at the opposite side of the optical component 304a, 304b, 304c with respect to the associated segment 306a, 306b, 306c .
  • a receive optical element 316d may be disposed at a same side of the optical component 304d as the associated segment 306d . Additional examples will be described in further detail below .
  • the optical system 300a, 300b, 300c, 300d may include a (first ) receive optics arrangement 318a, 318b, 318c, 318d configured to receive light from the field of view 308a, 308b, 308c, 308d of the optical system 300a, 300b, 300c, 300d and to direct the received light towards the optical component 304a, 304b, 304c, 304d .
  • the receive optics arrangement 318a, 318b, 318c, 318d may include one or more optical elements (e.g., one or more lenses, such as one or more cylinder lenses) configured to collect light from the field of view 308a, 308b, 308c, 308d.
  • the receive optics arrangement 318a, 318b, 318c, 318d may include at least one (first) lens having an optical aperture in the range from 100 mm² to 3000 mm², for example in the range from 200 mm² to 400 mm².
  • the first receive optics arrangement 318a, 318b, 318c, 318d may be configured to direct the received light towards the main surface of the optical component 304a, 304b, 304c, 304d, illustratively, towards the receive optical element 316a, 316b, 316c, 316d associated with the currently illuminated segment 306a, 306b, 306c, 306d deflecting the light towards the field of view 308a, 308b, 308c, 308d .
  • the first receive optics arrangement 318a, 318b, 318c, 318d may be configured to image the field of view 308a, 308b, 308c, 308d onto the optical component 304a, 304b, 304c, 304d (onto its main surface) .
  • the image may be provided with respect to a predefined focal point.
  • the first receive optics arrangement 318a, 318b, 318c, 318d may be configured to operate a geometrical transformation of the image of the field of view 308a, 308b, 308c, 308d provided onto the optical component 304a, 304b, 304c, 304d, e.g. to provide a shape of the field of view adapted to a shape of the optical component (e.g., to provide a trapezoidal representation of the field of view in case of a disc-shaped optical component, as an example).
  • the configuration of the first receive optics arrangement 318a, 318b, 318c, 318d may be adapted depending on the arrangement of the components of the optical system 300a, 300b, 300c, 300d.
  • the first receive optics arrangement 318a, 318b, 318c may include a mirror 320a, 320b, 320c (a deflection mirror) configured to deflect the light received from the field of view 308a, 308b, 308c towards the optical component 304a, 304b, 304c (e.g., towards its main surface) .
  • the optical system 300a, 300b, 300c, 300d may include a (second) receive optics arrangement 322a, 322b, 322c, 322d configured to direct the light from the optical component 304a, 304b, 304c, 304d to the detector 314a, 314b, 314c, 314d.
  • the (second) receive optics arrangement 322a, 322b, 322c, 322d may be configured to image onto the detector 314a, 314b, 314c, 314d the light transmitted or reflected by the receive optical element associated with the currently illuminated segment.
  • the second receive optics arrangement 322a, 322b, 322c, 322d may include one or more optical elements (e.g., one or more lenses, such as one or more cylinder lenses, one or more mirrors, etc.) to direct (in some aspects, to focus) the light transmitted or reflected by the receive optical elements 316a, 316b, 316c, 316d onto the detector 314a, 314b, 314c, 314d.
  • the optical system 300a, 300b, 300c, 300d may include a position sensor 324a, 324b, 324c, 324d configured to provide position information, e.g. information on a position (e.g., an angular position) of the optical component 304a, 304b, 304c, 304d during the continuous movement.
  • the position sensor 324a, 324b, 324c, 324d may be configured to determine a position of the optical component 304a, 304b, 304c, 304d during the continuous movement of the optical component 304a, 304b, 304c, 304d to identify the segment onto which the light emitted by the light source 302a, 302b, 302c, 302d is impinging.
  • the position of the optical component 304a, 304b, 304c, 304d during the continuous movement may be understood, in some aspects, as an angular position of the optical component 304a, 304b, 304c, 304d with respect to a reference point (e.g., as an angular displacement with respect to a reference point, e.g. with respect to a starting position).
  • the position sensor 324a, 324b, 324c, 324d may be configured to determine an angular position of the optical component 304a, 304b, 304c, 304d during a continuous circular movement of the optical component 304a, 304b, 304c, 304d.
  • the position sensor 324a, 324b, 324c, 324d may be a passive device .
  • the position sensor 324a, 324b, 324c, 324d may include a light source (e.g., a light emitting diode) configured to illuminate the optical component 304a, 304b, 304c, 304d for determining the position (e.g., the angular position) of the optical component 304a, 304b, 304c, 304d.
  • the information provided by the position sensor 324a, 324b, 324c, 324d may be used to assign a location in the field of view 308a, 308b, 308c, 308d (coordinates in the field of view) to the light received at the detector 314a, 314b, 314c, 314d.
  • the information provided by the position sensor 324a, 324b, 324c, 324d may be used to determine the position towards which the emitted light was directed (to determine the illuminated portion of the field of view 308a, 308b, 308c, 308d), and thus to determine the position from which the associated direct reflection originated.
  • the optical system 300a, 300b, 300c, 300d may include one or more processors 326d (shown in FIG. 3D; it is understood that the optical systems 300a, 300b, 300c may also include respective one or more processors).
  • the one or more processors 326d may be configured to process position information provided by the position sensor 324a, 324b, 324c, 324d.
  • the one or more processors 326d may be configured to assign a location in the field of view 308a, 308b, 308c, 308d of the optical system 300a, 300b, 300c, 300d to light received at the optical system 300a, 300b, 300c, 300d in accordance with the position information provided by the position sensor 324a, 324b, 324c, 324d.
  • the position information provided by the position sensor 324a, 324b, 324c, 324d may be used to control an emission of light by the light source 302a, 302b, 302c, 302d.
  • the one or more processors 326d may be configured to control an emission of light from the light source 302a, 302b, 302c, 302d in accordance (e.g., in synchronization) with the position of the optical component as determined by the position information.
  • the one or more processors 326d may be configured to control the light source 302a, 302b, 302c, 302d to start emitting light at a defined angular position of the optical component 304a, 304b, 304c, 304d, in accordance with the position information (see also the sketch below).
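A minimal sketch of how the position information could be turned into a segment index and a horizontal emission angle is given below; the class name, the assumption of equally sized segments and the linear sweep per segment are mine, as the text only states that such an assignment and synchronization are performed.

```python
# Minimal sketch (assumptions only): mapping the angular position reported by a
# position sensor to the currently illuminated segment and to the horizontal
# angle of the deflected light, e.g. for assigning field-of-view coordinates.

from dataclasses import dataclass

@dataclass
class ScanGeometry:
    num_segments: int          # e.g. 8 segments on the optical component (assumed)
    horizontal_fov_deg: float  # e.g. 120 degrees swept per segment (assumed)

    def segment_index(self, angular_position_deg: float) -> int:
        """Index of the segment onto which the emitted light currently impinges."""
        per_segment = 360.0 / self.num_segments
        return int((angular_position_deg % 360.0) // per_segment)

    def horizontal_angle(self, angular_position_deg: float) -> float:
        """Horizontal emission angle, assuming a linear sweep within each segment."""
        per_segment = 360.0 / self.num_segments
        fraction = (angular_position_deg % per_segment) / per_segment
        return -self.horizontal_fov_deg / 2 + fraction * self.horizontal_fov_deg

geometry = ScanGeometry(num_segments=8, horizontal_fov_deg=120.0)
print(geometry.segment_index(100.0), geometry.horizontal_angle(100.0))
```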
  • the one or more processors 326d may include, for example, a central processing unit 328d (CPU) and a system-on-chip 330d (SOC).
  • the system-on-chip 330d may include a LIDAR engine (e.g., an application-specific integrated circuit, a microcontroller, and the like), for example a 2D/3D LIDAR data acquisition and processing system-on-chip, such as a LCA3 LeddarCore, for example in case the optical system 300a, 300b, 300c, 300d is part of a LIDAR system.
  • FIG. 3E illustrates an operation of a LIDAR engine 334 in relation to an oscillation of a MEMS mirror in a usual LIDAR system, as shown in the graph 336.
  • Parameters associated with the use of a LIDAR engine 334 in a usual LIDAR system may illustrate the advantages in terms of speed of acquisition provided by the solution described herein.
  • the acquisition sequence of photo diodes is always automatic due to the synchronization required with the mirror's oscillation (see graph 336, with the oscillation of the MEMS mirror represented in relation to the oscillation angle θ_OS).
  • the LIDAR engine 334 supports a MEMS mirror with an oscillation frequency of 1 kHz to 6 kHz.
  • a deflection of light by implementing the strategy described herein may be about 7 times more effective (7 times faster) than deflection with a 2.1 kHz MEMS mirror and LCPG.
  • the position information may also be used for increasing the tolerance of the optical system 300a, 300b, 300c, 300d to oscillations and deflections of the optical component 304a, 304b, 304c, 304d during operation, i.e. to uncontrolled variations in the position of the optical component 304a, 304b, 304c, 304d (e.g., an unwanted tilt) , for example due to vibrations, impacts, etc. Variations in the vertical direction may have no influence as long as the light emitted by the light source 302a, 302b, 302c, 302d hits the corresponding segment.
  • Variations in the plane of the main surface of the optical component 304a, 304b, 304c, 304d may change the deflection angle in the horizontal and vertical direction.
  • a correction of the point assignment in the point cloud may be performed by the one or more processors 326d (e.g., by the CPU 328d) .
  • the use of the hard disk drive concept described herein may not result in a strong deflection of the segment disk in the x, y, z directions.
  • the position information provided by a position sensor 324a, 324b, 324c, 324d may be used to adjust a tilt of the optical component 304a, 304b, 304c, 304d.
  • the optical component 304a, 304b, 304c, 304d may be mounted on a support configured to allow for an adjustment of the position and the rotation of the optical component 304a, 304b, 304c, 304d.
  • an automatic correction of uncontrolled variations in the position of the optical component 304b may be provided by using the imaging principle (e.g., considering the opposite movement of the segments 306b with respect to the associated receive optical elements 316b) .
  • Other aspects associated with the light detection at an optical system (e.g., at the optical system 101, 300a, 300b, 300c, 300d) including an optical component (e.g., the optical component 104, 200, 304a, 304b, 304c, 304d) will be described in further detail below.
  • FIG. 4A shows a receiver side 400 of an optical system in a schematic view according to various aspects.
  • the receiver side 400 may be the receiver side of an optical system 101, 300a, 300b, 300c, 300d as described above.
  • the receiver side 400 may be configured to image the field of view 402 of the optical system.
  • the receiver side 400 of an optical system may include a first receive optics arrangement 404 (e.g., configured as the first receive optics arrangement 318a, 318b, 318c, 318d described in relation to FIG. 3A to FIG. 3D), an optical component 406 (e.g., configured as the optical component 104, 200, 304a, 304b, 304c, 304d described in relation to FIG. 1B to FIG. 1I, FIG. 2A to FIG. 2G, and FIG. 3A to FIG. 3D), a second receive optics arrangement 408 (e.g., configured as the second receive optics arrangement 322a, 322b, 322c, 322d described in relation to FIG. 3A to FIG. 3D), and a detector 410 (e.g., configured as the detector 126, 314a, 314b, 314c, 314d described in relation to FIG. 1H and FIG. 3A to FIG. 3D).
  • the first receive optics arrangement 404 may be configured to image the field of view 402 onto the optical component 406 (e.g., on a disc-shaped component) .
  • FIG. 4B illustrates the image 412 of the field of view 402 provided on the optical component 406.
  • the first receive optics arrangement 404 may be configured to operate a geometric transformation to provide a trapezoidal image of the field of view 402 onto the optical component 406, illustratively a tapered image having a first width at a first side 414-1 (e.g., a width in the range from 20 mm to 30 mm, for example 26 mm) , and a second width at a second side 414-2 (e.g., a width in the range from 30 mm to 40 mm, for example 37 mm) .
  • the trapezoidal image 412 may have a height in the range from 10 mm to 20 mm, for example 13 mm.
  • the geometric transformation may ensure that the entire field of view 402 may be imaged onto the optical component 406.
  • FIG. 4D illustrates a top view of the optical component 406 on which the image 412 of the field of view 402 is formed .
  • the image 412 may have, illustratively, a number of pixels defined by the properties of the optical element 406 and of the light source of the optical system, as described in relation to FIG. 1H.
  • the number of pixels in the vertical direction may be defined by the angular range covered by the individual segments of the optical element 406 and by the number of light sources.
  • the image 412 may include 128 pixels in the vertical direction.
  • the number of pixels in the horizontal direction may be defined by the angular range covered by the continuous movement of the optical element 406.
  • the image 412 may include 504 pixels in the horizontal direction.
  • the second receive optics arrangement 408 may be configured to image onto the detector 410 the light downstream of the optical component 406 (e.g., to image onto the detector 410 the light not blocked by the optical component 406).
  • the detector 410 is shown as visible from this top view, even though the detector 410 is located underneath the optical element 406.
  • the receive optical elements serve the purpose that only a portion of the detector 410 is illuminated by the image 412 (illustratively, the portion underneath the receive optical elements 420, e.g. the portion underneath the slits 420 in FIG. 4D) .
  • FIG. 4C illustrates an exemplary configuration for the detector 410, which may include a plurality of photo diodes 416 (e.g., 128 photo diodes, illustratively 128 pixels each associated with a respective photo diode) , e.g. 128 avalanche photo diodes.
  • the photo diodes may be arranged to form a column array, as an example. Pairs of individual photo diodes may be connected in parallel (e.g., 1 with 17, 2 with 18, etc.)
  • a total height of the detector 410 may be in the range from 5000 µm to 25000 µm, for example 20450 µm, and a width of the detector 410 may be in the range from 5000 µm to 50000 µm, for example 37000 µm.
  • the height of a photo diode 416 may be in the range from 100 µm to 200 µm, for example 120 µm.
  • a spacing between adjacent photo diodes 416 may be in the range from 10 µm to 60 µm, for example 40 µm.
  • the optical aperture of the first receive optics arrangement 404 may be adapted in accordance with an optical aperture of the detector 410.
  • the first receive optics arrangement 404 may have an optical aperture of about 294 mm² (smaller compared to the optical aperture of the optics arrangement used in a usual LIDAR system).
  • the detector may have an optical aperture (width x height) of about 700 mm².
  • the receive optical elements 420 of the optical component 406 may have a shape adapted to the image of the field of view 402 provided by the first receive optics arrangement 404, e.g. a tapered shape (for example, with a greater width towards the edge of a disc-shaped optical element, and a narrower width towards the center of the disc-shaped optical element).
  • the number of illuminated pixels may be selected in accordance with the processing capabilities of the optical system.
  • four laser diodes may illuminate 16 vertical pixels simultaneously, as described in relation to FIG. IB to FIG. II.
  • a system-on-chip of the optical system may only multiplex 4 x 16 photo diodes.
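The pixel bookkeeping in the preceding points can be cross-checked with a few lines; the numbers below are the example figures quoted in the text (16 vertical pixels per laser shot, 8 segments, 504 horizontal pixels), and the multiplication itself is merely an illustration of how they fit together.

```python
# Illustrative cross-check of the example pixel counts quoted in the text.

segments = 8                     # example number of segments (one vertical slice each)
vertical_pixels_per_shot = 16    # illuminated by four laser diodes per shot
horizontal_pixels = 504          # resolved by the continuous rotation

vertical_pixels_total = segments * vertical_pixels_per_shot
print(vertical_pixels_total)                      # 128 vertical pixels
print(vertical_pixels_total * horizontal_pixels)  # 64512 pixels per complete scan
```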
  • FIG. 5A to FIG. 5L illustrate an imaging process including an optical component 500, in which an image 502 of the field of view is formed on the optical component (including 504 x 16 pixels) .
  • a detector 510 is shown as visible from this top view, even though the detector 510 is located underneath the optical element 500.
  • the receive optical elements (e.g., the slits in FIG. 5A to FIG. 5L) provide that only a portion of the detector 510 is illuminated by the image 502.
  • FIG. 5A to FIG. 5F illustrate that during a first portion of the continuous rotation of the optical component 500, light 504 (e.g., a laser beam) impinges onto a first segment 506-1, and the associated receive optical element 508-1 provides that only the relevant portion of the image 502 of the field of view is provided to the detector 510.
  • FIG. 5A to FIG. 5F are associated with subsequent time points within the first portion of the continuous rotation of the optical component 500.
  • FIG. 5G to FIG. 5L illustrate that during a second portion of the continuous rotation of the optical component 500, the light 504 impinges onto a second segment 506-2, and the associated receive optical element 508-2 provides that only the relevant portion of the image 502 of the field of view is provided to the detector 510.
  • FIG. 5G to FIG. 5L are associated with subsequent time points within the second portion of the continuous rotation of the optical component 500.
  • FIG. 6 shows a detector 600 and an optical component 602 in a schematic view according to various aspects.
  • the optical component 602 may be a further example of the optical component 104, 200, 304a, 304b, 304c, 304d described in relation to FIG. IB to FIG. II, FIG. 2A to FIG. 2G, and FIG. 3A to FIG. 3D.
  • a receive optical element may be configured to cover the unused photo diodes of a detector.
  • pairs of photo diodes 604 of a detector 600 may be connected in parallel with one another.
  • the receive optical elements of the optical component 602, shown in the inset 606, may each be assigned to a respective subset of photo diodes 604.
  • a first receive optical element 608-1 may be assigned to a first subset of photo diodes (e.g., 0 to 17),
  • a second receive optical element 608-2 may be assigned to a second subset of photo diodes (e.g., 16 to 33).
  • each receive optical element may transmit or reflect light towards the assigned photo diodes, while leaving the remaining photo diodes covered by the body of the optical component 602.
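The assignment of receive optical elements to photo diode subsets can be sketched as below; the helper and its parameters are hypothetical, chosen so that they reproduce the example subsets quoted above (0 to 17, 16 to 33), with adjacent subsets overlapping because pairs of photo diodes are connected in parallel.

```python
# Hypothetical helper: photo diode index ranges assigned to each receive
# optical element (adjacent subsets may overlap, as in the quoted example).

def diode_subsets(num_elements: int, diodes_per_subset: int, stride: int):
    """Ranges of photo diode indices, one range per receive optical element."""
    return [range(i * stride, i * stride + diodes_per_subset)
            for i in range(num_elements)]

# Assumed parameters reproducing the quoted example subsets (0-17, 16-33, ...).
for index, subset in enumerate(diode_subsets(num_elements=8,
                                             diodes_per_subset=18,
                                             stride=16)):
    print(f"element {index}: photo diodes {subset.start}..{subset.stop - 1}")
```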
  • FIG. 7 shows an optical component 700 in a schematic view according to various aspects.
  • the optical component 700 may be a further example of the optical component 104, 200, 304a, 304b, 304c, 304d, 602 described in relation to FIG. IB to FIG. II, FIG. 2A to FIG. 2G, FIG. 3A to FIG. 3D, and FIG. 6.
  • the receive optical elements 702-1, 702-2, 702-3, 702-4 may be disposed tilted with respect to one another.
  • An image 704 of the field of view may be formed on the optical component 700, for example compressed in the horizontal direction and elongated in the vertical direction by the first receive optics arrangement.
  • the image 704 of the field of view may include 504 x 128 pixels.
  • the orientation of the receive optical elements 702-1, 702-2, 702-3, 702-4 relative to one another may be adapted in accordance with the formed image, and with the configuration of a detector.
  • FIG. 8 shows a graph 800 providing a comparison of the beam steering strategy described herein (A1) with a beam steering strategy implemented in a usual long range LIDAR system.
  • a higher range may be provided with the beam steering and tracking strategy described above, even though a smaller receiver aperture may be used.
  • a higher range may be provided by means of the optical component described herein compared to beam steering with a LCPG (and a MEMS mirror) as used in a usual LIDAR system.
  • the range of detection of a LIDAR system configured according to the strategy described herein may be greater than 100 m, for example greater than 130 m, for example may be about 180 m.
  • the range provided by the strategy described herein may be indicated by the crossing point of the line 802 (denoted with A1) in the graph 800 with the dotted line 804, and the range provided by a usual long range LIDAR system may be indicated by the crossing point of the line 806 in the graph 800 with the dotted line 808.
  • the graph 800 may be provided using formulas known in the art for calculating the power per pixel versus object distance, the shot noise of an avalanche photo diode, a total noise, noise accumulation, and the minimum power on a detector pixel for detection (see also the illustrative sketch below).
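Purely as an illustration of the kind of calculation behind such a graph, the toy link budget below uses a Lambertian-target 1/R² model; the 294 mm² aperture is the figure quoted earlier, whereas the 75 W pulse power, the 10% target reflectivity and the 22 nW detection threshold are assumptions chosen only so that the toy model lands in the vicinity of the ~180 m range mentioned above. It is not the calculation used for graph 800.

```python
import math

# Toy Lambertian-target link budget: P_rx ~ P_tx * rho * A / (pi * R^2).
def received_power(emitted_power_w: float, distance_m: float,
                   aperture_m2: float, reflectivity: float = 0.1) -> float:
    """Received optical power at distance `distance_m` (toy model)."""
    return emitted_power_w * reflectivity * aperture_m2 / (math.pi * distance_m ** 2)

def detection_range(emitted_power_w: float, aperture_m2: float,
                    min_detectable_power_w: float, reflectivity: float = 0.1) -> float:
    """Distance at which the toy received power equals the detection threshold."""
    return math.sqrt(emitted_power_w * reflectivity * aperture_m2 /
                     (math.pi * min_detectable_power_w))

# Assumed values: 294 mm^2 aperture (quoted above), 75 W peak pulse power and a
# 22 nW detection threshold (both assumptions). Result is roughly 179 m.
print(detection_range(75.0, 294e-6, 2.2e-8))
```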
  • Various aspects relate to an optical system which may provide advantages with respect to beam-steering solutions implemented in a usual LIDAR system.
  • a single (horizontal) rotating disk may be used to scan both the horizontal and the vertical axis.
  • the horizontal and vertical deflection in the transmission path and also the limitation (tracking) of the illuminated angular range in the receiver path may be achieved by a single horizontally rotating disk providing an effective increase in the signal-to-noise ratio.
  • Complex, expensive and optically lossy MEMS mirrors and LCPG may be replaced by a single rotating disk using computer hard disk technology (with corresponding disk size and drive components) .
  • a laser beam guidance similar to the line method of a television picture tube may be realized, with simultaneous tracking of the irradiated angular section to increase the signal-to-noise ratio.
  • the components of the optical system may operate together with Lidar engines (like LCA3) and most of the system components of these technologies.
  • a smaller system geometry/volume may be provided, for example by eliminating the LCPG a smaller aperture of the receiver optics may be provided (approximately 7 times smaller aperture) .
  • the bill of material cost may be reduced.
  • An unchanged or even extended range compared to a usual LIDAR system may be provided with extended field of view (120° x 24°) and constant frame rate of 25 Hz. Scanning of the entire field of view (120° x 24°) at 25 Hz may be provided with the same (or even greater) range over the entire area.
  • An overall power consumption may be reduced.
  • (only) one single rotating disk may be used for horizontal and vertical laser beam steering in transmission path and also for limiting the imaged area around the laser angle in the receiver path. This may provide a reduction of back light by limiting the area around the irradiated angle point.
  • the disk may be divided in slices. Every slice area may include one transmission (TX) and one receive (RX) element. TX elements may be placed on the side (edge) and RX elements may be placed on a plane of the disc.
  • a laser beam steering equivalent to a TV picture tube may be realized by the edge elements of the rotating disc. Every edge element may realize a beam steering over the entire horizontal field of view via the rotating motion.
  • the vertical beam steering may be effected by different tilt angles of the edge segments.
  • Every TX element may be assigned exactly to one RX element .
  • the RX element may perform a tracking of the irradiated area by the rotation of the disk (tracking of the laser beam). For example, in the horizontal direction the areas left and right of the irradiated area are hidden, so the back light of these areas is not imaged onto a detector (e.g., onto a photo element). Downstream of the RX tracking segment, the remaining light (the light reflected by an object) may be imaged onto a vertical photo element row.
  • the number of photo elements may be determined by the vertical resolution.
  • the number of slices of the disk may be determined by the angle of the vertical field of view irradiated by one laser shot. For example, 8 slices may be provided for an irradiated angle of 3° (1 shot) at a whole vertical angle of 24°.
  • a cost reduction for a LIDAR system may be provided by replacing expensive components with inexpensive components that may provide a same performance.
  • a MEMS mirror may be replaced by a rotating disk with transmission elements.
  • a LCPG may be replaced by the rotating disk providing different vertical deflections, in combination with associated tracking filters that track the scanned angular range to increase signal-to-noise ratio.
  • Example 1 is a LIDAR system including an optical system, the optical system including: a light source; an optical component including a first segment configured to deflect the light emitted by the light source towards a field of view of the LIDAR system in a first emission direction, and a second segment configured to deflect the light emitted by the light source towards the field of view of the LIDAR system in a second emission direction; and a controller configured to control a continuous movement of the optical component, wherein the optical component is configured in such a way that the light emitted by the light source impinges onto the first segment during a first portion of the continuous movement of the optical component, and that the light emitted by the light source impinges onto the second segment during a second portion of the continuous movement of the optical component.
  • For example, the optical component may include six segments, or eight segments, or ten segments.
  • the subj ect-matter of example 1 may optionally further include that the first emission direction is at a first emission angle with respect to an optical axis of the optical system, the first emission angle including a first component along a first field of view direction and a second component along a second field of view direction, and that the first segment is configured such that the first component of the first emission angle remains constant during the first portion of the continuous movement of the optical component , and the first segment is configured such that the second component of the first emission angle varies during the first portion of the continuous movement of the optical component .
  • the sub ect-matter of example 2 may optionally further include that the optical component is configured in such a way that the light emitted by the light source impinges onto the first segment at a plurality of first impinging locations during the first portion of the continuous movement of the optical component , and that each first impinging location is associated with a respective second component of the first emission angle .
  • the subj ect-matter of any one of examples 1 to 3 may optionally further include that the second emission direction is at a second emission angle with respect to an optical axis of the optical system, the second emission angle including a third component along the first field of view direction and a fourth component along the second field of view direction, and that the second segment is configured such that the third component of the second emission angle remains constant during the second portion of the continuous movement of the optical component , and the second segment is configured such that the fourth component of the second emission angle varies during the second portion of the continuous movement of the optical component .
  • the subj ect-matter of example 4 may optionally further include that the optical component is configured in such a way that the light emitted by the light source impinges onto the second segment at a plurality of second impinging locations during the second portion of the continuous movement of the optical component , and that each second impinging location is associated with a respective fourth component of the second emission angle .
  • the sub ect-matter of any one of examples 1 to 5 may optionally further include that the optical component is configured in such a way that the impinging location of the light emitted by the light source onto the first segment moves along the first segment in a direction parallel to a second field of view direction during the first portion of the continuous movement of the optical component , and that the optical component is configured in such a way that the impinging location of the light emitted by the light source onto the second segment moves along the second segment in the direction parallel to the second field of view direction during the second portion of the continuous movement of the optical component .
  • the subj ect-matter of examples 2 and 4 may optionally further include that the first component of the first emission angle is di f ferent from the third component of the second emission angle , and that the second component of the first emission angle and the fourth component of the second emission angle vary in a same angular range during the respective portion of the continuous movement of the optical component .
• the subject-matter of example 7 may optionally further include that the angular range is between -60° and +60° with respect to the optical axis of the optical system along the second field of view direction.
• the subject-matter of example 7 or 8 may optionally further include that an absolute value of a difference between the first component of the first emission angle and the third component of the second emission angle is in the range from 0.1° to 10°, for example in the range from 0.25° to 5°, for example 1.5°, for example 3°.
• Example 10 the subject-matter of any one of examples 2 to 9 may optionally further include that the first field of view direction and the second field of view direction are aligned at a defined angle with one another (e.g., an angle different from 0° and different from 180°).
  • Example 11 the subject-matter of example 10 may optionally further include that the first field of view direction and the second field of view direction are perpendicular to one another.
  • the first field of view direction is the vertical direction and the second field of view direction is the horizontal direction.
  • Example 12 the subject-matter of any one of examples 1 to 11 may optionally further include that the continuous movement of the optical component includes a continuous circular movement.
  • Example 13 the subject-matter of example 12 may optionally further include that the continuous circular movement includes a frequency of rotation in the range from 1 Hz to 400 Hz, for example in the range from 1 Hz to 300 Hz, for example in the range from 10 Hz to 250 Hz, for example a frequency of rotation equal to or greater than 75 Hz.
  • Example 14 the subject-matter of any one of examples 1 to 13 may optionally further include that the optical component further includes a third segment configured to deflect the light emitted by the light source towards the field of view of the LIDAR system in a third emission direction, and that the optical component is configured in such a way that the light emitted by the light source impinges onto the third segment during a third portion of the continuous movement of the optical component .
• the subject-matter of example 14 may optionally further include that the third emission direction is at a third emission angle with respect to the optical axis of the optical system, the third emission angle including a fifth component along the first field of view direction and a sixth component along the second field of view direction, and that the third segment is configured such that the fifth component of the third emission angle remains constant during the third portion of the continuous movement of the optical component, and the third segment is configured such that the sixth component of the third emission angle varies during the third portion of the continuous movement of the optical component.
• Example 16 the subject-matter of any one of examples 1 to 15 may optionally further include that the optical component includes a main surface and a plurality of side surfaces.
• Example 17 the subject-matter of example 16 may optionally further include that the first segment includes a first side surface of the plurality of side surfaces and the second segment includes a second side surface of the plurality of side surfaces.
• Example 18 the subject-matter of example 17 may optionally further include that the first side surface is tilted at a first tilting angle with respect to the main surface, and wherein the second side surface is tilted at a second tilting angle with respect to the main surface.
• the subject-matter of any one of examples 16 to 18 may optionally further include that the continuous movement of the optical component includes a continuous circular movement around an axis perpendicular to the main surface of the optical component.
• the subject-matter of any one of examples 1 to 19 may optionally further include that the optical component has a disk shape.
• the disk shaped optical component has a radius in the range from 10 mm to 100 mm, for example in the range from 25 mm to 75 mm, for example in the range from 30 mm to 60 mm, for example in the range from 25 mm to 50 mm, for example a radius of 50 mm.
• Example 21 the subject-matter of example 20 may optionally further include that the first segment and the second segment are disposed along a side surface of the disk shaped optical component.
• the subject-matter of any one of examples 1 to 21 may optionally further include that the first segment has a first concave surface and the second segment has a second concave surface.
• the first concave surface has a first radius of curvature in the range from 5 mm to 80 mm, for example in the range from 15 mm to 60 mm, for example in the range from 20 mm to 40 mm.
• the second concave surface has a second radius of curvature in the range from 5 mm to 80 mm, for example in the range from 15 mm to 60 mm, for example in the range from 20 mm to 40 mm.
• Example 23 the subject-matter of any one of examples 1 to 22 may optionally further include that the first segment has a first extension along a first direction parallel to the first field of view direction in the range from 10 mm to 60 mm, and that the second segment has a second extension along the first direction parallel to the first field of view direction in the range from 10 mm to 60 mm.
• the subject-matter of any one of examples 1 to 23 may optionally further include that the optical component includes a first receive optical element associated with the first segment, and a second receive optical element associated with the second segment.
• the subject-matter of example 24 may optionally further include that the first segment and the first receive optical element are disposed relative to one another such that the first receive optical element receives light associated with a direct reflection of the light deflected in the first emission direction, and that the second segment and the second receive optical element are disposed relative to one another such that the second receive optical element receives light associated with a direct reflection of the light deflected in the second emission direction.
  • the first emission direction may be associated with a first portion of the field of view of the LIDAR system
  • the second emission direction may be associated with a second portion of the field of view of the LIDAR system
  • the first receive optical element and the first segment may be disposed relative to one another such that the first receive optical element receives light coming from the first portion of the field of view
  • the second receive optical element and the second segment may be disposed relative to one another such that the second receive optical element receives light coming from the second portion of the field of view .
• Example 26 the subject-matter of example 24 or 25 may optionally further include that the first receive optical element and the second receive optical element are disposed on a main surface of the optical component.
• the subject-matter of example 26 may optionally further include that the main surface of the optical component is a light absorbing surface, and that the first receive optical element and the second receive optical element are configured to transmit or reflect the received light.
• the subject-matter of any one of examples 1 to 27 may optionally further include a first receive optics arrangement configured to collect light from the field of view of the LIDAR system and to image the field of view onto the optical component.
• the subject-matter of example 28 may optionally further include that the first receive optics arrangement is configured to image the field of view onto a main surface of the optical component.
• the first receive optics arrangement includes a first lens having an optical aperture in the range from 200 mm² to 400 mm², for example in the range from 100 mm² to 3000 mm².
• the subject-matter of any one of examples 1 to 29 may optionally further include a detector configured to detect light.
• the detector has an optical aperture in the range from 50 mm² to 1000 mm², for example in the range from 150 mm² to 600 mm², for example 400 mm².
• the subject-matter of example 30 may optionally further include that the detector includes one or more photo diodes.
• the one or more photo diodes include at least one avalanche photo diode.
• the subject-matter of example 30 or 31 may optionally further include that the one or more photo diodes include a plurality of photodiodes disposed along a first direction to form a one-dimensional array.
• the plurality of photodiodes are further disposed along a second direction to form a two-dimensional array.
• Example 33 the subject-matter of any one of examples 30 to 32 may optionally further include that the first direction is parallel to the first field of view direction.
• Example 34 the subject-matter of any one of examples 30 to 33 may optionally further include a second receive optics arrangement configured to image the light reflected or transmitted by a receive optical element onto the detector.
• Example 35 the subject-matter of example 24 and any one of examples 30 to 33 may optionally further include that the first receive optical element is configured to transmit or reflect the light associated with a first portion of the field of view towards the detector, and that the second receive optical element is configured to transmit or reflect the light associated with a second portion of the field of view towards the detector.
• Example 36 the subject-matter of any one of examples 1 to 35 may optionally further include a transmission optics arrangement configured to collimate the light emitted by the light source towards the optical component.
• Example 37 the subject-matter of any one of examples 1 to 36 may optionally further include a position sensor configured to determine a position of the optical component during the continuous movement of the optical component to identify the segment onto which the light emitted by the light source is impinging.
• the one or more processors may be configured to control an emission of light from the light source in accordance (e.g., in synchronization) with the position of the optical component as determined by the position information.
• Example 38 the subject-matter of example 37 may optionally further include that the position sensor is configured to determine an angular position of the optical component during a continuous circular movement of the optical component.
• the subject-matter of example 37 or 38 may optionally further include one or more processors configured to process position information provided by the position sensor, wherein the one or more processors are configured to assign a location in the field of view of the LIDAR system to light received at the optical system in accordance with the position information provided by the position sensor.
• Example 40 the subject-matter of any one of examples 1 to 39 may optionally further include that the light source includes a plurality of light sources.
• the subject-matter of example 40 may optionally further include that each light source of the plurality of light sources is configured to illuminate a respective location of a segment of the optical component.
• Example 42 the subject-matter of example 41 may optionally further include that the plurality of light sources are configured such that a segment is fully illuminated along the lateral extension of the segment parallel to the first field of view direction by the light emitted by the plurality of light sources.
• the subject-matter of any one of examples 1 to 42 may optionally further include that the light source includes a laser source.
  • the laser source includes one or more laser diodes .
• the subject-matter of any one of examples 1 to 43 may optionally further include a motor configured to drive the continuous movement of the optical component.
• the motor may be one of a servo motor or a spindle motor.
• Example 45 is an optical component including: a first segment configured such that light impinging onto the first segment is deflected towards a first portion of a field of view of the optical component; and a second segment configured such that light impinging onto the second segment is deflected towards a second portion of a field of view of the optical component; and a first receive optical element associated with the first segment and a second receive optical element associated with the second segment, wherein the first receive optical element and the first segment are disposed relative to one another such that the first receive optical element receives light coming from the first portion of the field of view of the optical component, and wherein the second receive optical element and the second segment are disposed relative to one another such that the second receive optical element receives light coming from the second portion of the field of view of the optical component.
• Example 46 is a LIDAR system including an optical system, the optical system including: a light source; a disk including a first side surface configured to deflect the light emitted by the light source towards a field of view of the LIDAR system in a first emission direction, and a second side surface configured to deflect the light emitted by the light source towards a field of view of the LIDAR system in a second emission direction; and a controller configured to control a continuous rotation of the disk, in such a way that the light emitted by the light source impinges onto the first side surface during a first portion of the continuous rotation of the disk, and that the light emitted by the light source impinges onto the second side surface during a second portion of the continuous rotation of the optical disk.
• the subject-matter of example 46 may optionally further include that the first emission direction is associated with a first portion of the field of view of the LIDAR system, and the second emission direction is associated with a second portion of the field of view of the LIDAR system; and the disk may further include a first receive optical element disposed on a main surface of the disk and being associated with the first side surface, and a second receive optical element disposed on the main surface of the disk and being associated with the second side surface, wherein the first receive optical element and the first side surface are disposed relative to one another such that the first receive optical element receives light coming from the first portion of the field of view, and wherein the second receive optical element and the second side surface are disposed relative to one another such that the second receive optical element receives light coming from the second portion of the field of view.
• Example 48 the subject-matter of example 46 or 47 may optionally further include one, more than one, or each of the features of any one of examples 1 to 44.

Abstract

A LIDAR system including an optical system (101) is provided, the optical system (101) including: a light source (102); an optical component (104) comprising a plurality of segments (106-1... 106-8) to deflect the light emitted by the light source (102) towards a field of view (108) of the LIDAR system in a plurality of respective emission directions (114); and a controller (112) to control a continuous movement of the optical component (104), in such a way that the light emitted by the light source (102) impinges onto a different segment during different portions of the continuous movement of the optical component (104). The optical system (101) may include a detector (126) to detect light. The optical component (104) may include a plurality of receive optical elements (128-1... 128-8), each associated with a respective segment. A receive optical element may be disposed relative to the associated segment in such a way that a direct reflection (114r) of the emitted light deflected by the segment impinges onto that receive optical element. The light detection may be understood as a tracking of a scanned horizontal and vertical angle range through an optical tracking filter to increase the signal-to-noise ratio. The optical component may be implemented as a hard disk drive including a rotating disk onto which light emitted by a light source may be directed. The LIDAR system may be part of a vehicle or of a smart farming or of an indoor monitoring system.

Description

LIGHT EMISSION AND DETECTION IN A LIDAR SYSTEM
Various aspects are related to a LIDAR ("Light Detection and Ranging" ) system including an optical system for light emission and detection .
Light detection and ranging is a sensing technique that is used, for example , in the field of autonomous driving for providing detailed information about the surrounding of an automated or partially automated vehicle . LIDAR light ( e . g . , laser light ) is used to scan a scene and determine the properties of the obj ects present therein, such as the location, the speed, the direction of motion, and the like . In a LIDAR system, the emission of light for scanning the scene may be controlled, for example , by means of microelectromechanical system (MEMS ) mirrors , which may provide deflection of the emitted light along one direction or two directions ( e . g . , along the hori zontal and/or vertical direction) . As another example , a LIDAR system may include a liquid crystal polari zation grating ( LCPG) that may provide deflection of the emitted light ( and also of the light received from the field of view) into two directions . In a LCPG-solution, the imaged area may be reduced to increase the signal-to-noise ratio of the measurement . Beam steering solutions based on MEMS components or liquid crystal components may present challenges in terms of cost , complexity, and maintenance .
Various aspects may be based on providing a simple and ef ficient beam steering solution for a LIDAR system without having to rely on complex and expensive components such as a MEMS mirror or a LCPG . Various aspects may be related to an optical component configured to provide beam steering capabilities for one-dimensional and two-dimensional scanning of a field of view that does not require fine control over an oscillation angle (unlike MEMs mirrors ) or over the synchroni zed switching of liquid crystal cells (unlike LCPGs ) . In some aspects , the optical component may be further configured to prevent light received at the LIDAR system from portions of the field of view other than a portion into which light was emitted from impinging onto a detector of the LIDAR system . This may provide an increased signal-to-noise ratio of the detection .
Various aspects may be related to an optical component including a plurality of segments , each segment being configured to provide a respective deflection angle for light impinging on the segment , each deflection angle being associated with a respective portion of a field of view; the optical component being configured to allow for a control of a continuous movement of the optical component , such that the plurality of segments may be sequentially exposed to the incoming light to sequentially deflect the light at each of the plurality of deflection angles provided by the plurality of segments ; the optical component further including a plurality of receive optical elements , each associated with a respective segment of the plurality of segments , wherein a segment and the associated receive optical element are disposed relative to one another such that the receive optical element receives light coming from the portion of the field of view associated with the deflection angle provided by the segment .
Various aspects may be based on providing one-dimensional and/or two-dimensional scanning of a field of view ( e . g . , of a scene ) via the combination of the continuous movement of the optical component together with the configuration of the segments to provide the respective deflection angle .
In various aspects , a LIDAR system may include an optical system, the optical system including : a light source ; an optical component including a first segment configured to deflect the light emitted by the light source towards a field of view of the LIDAR system in a first emission direction, and a second segment configured to deflect the light emitted by the light source towards the field of view of the LIDAR system in a second emission direction, and a controller configured to control a continuous movement of the optical component , wherein the optical component is configured in such a way that the light emitted by the light source impinges onto the first segment during a first portion of the continuous movement of the optical component , and that the light emitted by the light source impinges onto the second segment during a second portion of the continuous movement of the optical component .
In various aspects , the first emission direction may be associated with a first portion of the field of view of the LIDAR system, and the second emission direction may be associated with a second portion of the field of view of the LIDAR system; and the optical component may further include a first receive optical element associated with the first segment and a second receive optical element associated with the second segment , wherein the first receive optical element and the first segment are disposed relative to one another such that the first receive optical element receives light coming from the first portion of the field of view, and wherein the second receive optical element and the second segment are disposed relative to one another such that the second receive optical element receives light coming from the second portion of the field of view .
In various aspects , an optical component may include : a first segment configured such that light impinging onto the first segment is deflected towards a first portion of a field of view of the optical component ; and a second segment configured such that light impinging onto the second segment is deflected towards a second portion of a field of view of the optical component ; and a first receive optical element associated with the first segment and a second receive optical element associated with the second segment , wherein the first receive optical element and the first segment are disposed relative to one another such that the first receive optical element receives light coming from the first portion of the field of view of the optical component , and wherein the second receive optical element and the second segment are disposed relative to one another such that the second receive optical element receives light coming from the second portion of the field of view of the optical component . In some aspects , a receive optical element being associated with a segment may be understood as the receive optical element and the segment being in a predefined ( e . g . , fixed) angular relationship with one another . The first receive optical element may be in a first predefined angular relationship with the first segment , and the second receive optical element may be in a second predefined angular relationship with the second segment .
In various aspects , a LIDAR system may include an optical system, the optical system including : a light source ; a disk including a first side surface configured to deflect the light emitted by the light source towards a field of view of the LIDAR system in a first emission direction, and a second side surface configured to deflect the light emitted by the light source towards a field of view of the LIDAR system in a second emission direction; and a controller configured to control a continuous rotation of the disk, in such a way that the light emitted by the light source impinges onto the first side surface during a first portion of the continuous rotation of the disk, and that the light emitted by the light source impinges onto the second side surface during a second portion of the continuous rotation of the optical disk .
In some aspects , the first emission direction may be associated with a first portion of the field of view of the LIDAR system, and the second emission direction may be associated with a second portion of the field of view of the LIDAR system; and the disk may further include a first receive optical element disposed on a main surface of the disk and being associated with the first side surface , and a second receive optical element disposed on the main surface of the disk and being associated with the second side surface , wherein the first receive optical element and the first side surface are disposed relative to one another such that the first receive optical element receives light coming from the first portion of the field of view, and wherein the second receive optical element and the second side surface are disposed relative to one another such that the second receive optical element receives light coming from the second portion of the field of view . The optical system described herein may provide a simple and cost-effective solution for providing beam steering in the LIDAR system. The LIDAR system may be part, for example, of a vehicle or of a smart farming or of an indoor monitoring system.
In the context of the present disclosure, reference may be made to a LIDAR system. It is however understood that a LIDAR system is an example of a possible application of the optical system and of the optical component described herein for providing control over the emission direction of light. The optical system and the optical component described herein may also be for use in other types of application or systems in which a simple and cost-effective beam steering solution may be advantageous, for example in an optical transmission system (e.g., wireless or including optical fibers) , e.g. in a light-based communication system in which data and information may be transmitted by means of light.
The term "segment" may be used herein to describe a part (in other words, a portion) of an optical component. A segment may be understood as a part of the optical component having a certain extension, and a certain surface area. As a numerical example, a segment may have a lateral extension in the horizontal direction (e.g., a width) in the range from 5 mm to 100 mm, for example in the range from 20 mm to 80 mm, for example in the range from 10 mm to 60 mm. As another numerical example, a segment may have a lateral extension in the vertical direction (e.g., a height) in the range from 1 mm to 15 mm, for example in the range from 2 mm to 10 mm, for example in the range from 1 mm to 6 mm, for example 3 mm. As another numerical example, a segment may have a surface area in the range from 5 mm2 to 1000 mm2, for example from 20 mm2 to 500 mm2, for example a surface area greater than 5 mm2 or greater than 50 mm2. In some aspects, a segment may be understood as a surface or part of a surface of the optical component, e.g. as a side surface or part of a side surface of the optical component. A side surface may also be referred to herein as lateral surface or edge surface. A segment may also be referred to herein as transmission element or emission element. The term "continuous movement" may be used herein to describe a type of purposeful movement that does not present interruptions, i.e. a purposeful (and controlled) movement that continues until it is arbitrarily stopped. Illustratively, a continuous movement may be opposite to a movement that occurs in a series of discrete steps.
The term "processor" as used herein may be understood as any kind of technological entity that allows handling of data. The data may be handled according to one or more specific functions executed by the processor. Further, a processor as used herein may be understood as any kind of circuit, e.g., any kind of analog or digital circuit. A processor may thus be or include an analog circuit, digital circuit, mixed-signal circuit, logic circuit, processor, microprocessor, Central Processing Unit (CPU) , Graphics Processing Unit (GPU) , Digital Signal Processor (DSP) , Field Programmable Gate Array (FPGA) , integrated circuit, Application Specific Integrated Circuit (ASIC) , etc., or any combination thereof. Any other kind of implementation of the respective functions, which will be described below in further detail, may also be understood as a processor or logic circuit. It is understood that any two (or more) of the processors or logic circuits detailed herein may be realized as a single entity with equivalent functionality or the like, and conversely that any single processor or logic circuit detailed herein may be realized as two (or more) separate entities with equivalent functionality or the like.
In the drawings, like reference characters generally refer to the same parts throughout the different views. The drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating the principles disclosed herein. In the following description, various aspects disclosed herein are described with reference to the following drawings, in which:
FIG. 1A shows a LIDAR system in a schematic view according to various aspects; FIG. 1B shows an optical system in a schematic top view according to various aspects;
FIG. 1C and FIG. 1D each shows schematically deflection of light associated with a segment of an optical component according to various aspects;
FIG. 1E and FIG. 1F each shows schematically deflection of light associated with a segment of an optical component according to various aspects;
FIG. 1G shows schematically a scanning of a field of view according to various aspects;
FIG. 1H shows an optical system in a schematic top view according to various aspects;
FIG. 1I shows schematically a scanning of a field of view according to various aspects;
FIG. 2A and FIG. 2B each shows an optical component in a schematic view according to various aspects;
FIG. 2C, FIG. 2D, FIG. 2E, FIG. 2F, FIG. 2G, and FIG. 2H each shows a graph associated with the deflection of light by an optical component according to various aspects;
FIG. 2I and FIG. 2J each shows an optical component in a schematic view according to various aspects;
FIG. 2K and FIG. 2L each shows an exemplary realization of an optical component in a schematic view according to various aspects ;
FIG. 3A, FIG. 3B, FIG. 3C, and FIG. 3D each shows an optical system in a schematic view according to various aspects; FIG. 3E illustrates a synchronization between the emission of light and the continuous movement of an optical component in a schematic view according to various aspects;
FIG. 4A shows a receiver side of an optical system in a schematic view according to various aspects;
FIG. 4B shows an imaged field of view in a schematic view according to various aspects;
FIG. 4C shows a detector in a schematic view according to various aspects;
FIG. 4D shows an optical component in a schematic view according to various aspects;
FIG. 5A to FIG. 5L illustrate an imaging process including an optical component in a schematic view according to various aspects;
FIG. 6 shows an optical component and a detector in a schematic view according to various aspects;
FIG. 7 shows an optical component and a detector in a schematic view according to various aspects; and
FIG. 8 shows a graph providing a comparison between beam steering according to the strategy described herein and beam steering not implementing the strategy described herein.
The following detailed description refers to the accompanying drawings that show, by way of illustration, specific details and aspects in which the disclosure may be practiced. These aspects are described in sufficient detail to enable those skilled in the art to practice the disclosed implementations. Other aspects may be utilized and structural, logical, and electrical changes may be made without departing from the scope of the disclosed implementations. The various aspects are not necessarily mutually exclusive, as some aspects may be combined with one or more other aspects to form new aspects.
FIG. 1A shows a LIDAR system 100 in a schematic top view according to various aspects. The representation of the LIDAR system 100 in FIG. 1A may be simplified for the purpose of explanation .
The LIDAR system 100 may include an optical system 101 (in some aspects, a plurality of optical systems 101) . The optical system 101 may also be referred to as light emission and detection system 101.
In some aspects, the optical system 101 may be configured to control an emission of light into a field of view 108 of the LIDAR system 100. In some aspects, the optical system 101 may be further configured to detect light from the field of view 108 of the LIDAR system 100. In other aspects, the detection of light from the field of view 108 may be carried out by a separate detection system. The field of view 108 of the LIDAR system 100 may be understood, in some aspects, as a field of view of the optical system 101.
In some aspects, the emission direction into the field of view 108 may be varied along one or more directions, e.g. at least along a first direction (e.g., the direction 152 in FIG. 1A) and a second direction (e.g., the direction 154 in FIG. 1A). The first direction 152 and the second direction 154 may be understood as the directions along which the field of view 108 extends. Illustratively, the first direction 152 may be a first field of view direction, along which the field of view 108 has a first lateral extension (e.g., a height, or a first angular extension), and the second direction 154 may be a second field of view direction, along which the field of view 108 has a second lateral extension (e.g., a width, or a second angular extension). The first direction 152 and the second direction 154 may be aligned at a defined angle (different from 0° or 180°) with one another, e.g. the first direction 152 and the second direction 154 may be perpendicular to one another. In some aspects, the first direction 152 may be a vertical direction, and the second direction 154 may be a horizontal direction. It is however understood that the definition of first direction 152 and second direction 154 may be arbitrary. The first direction 152 and the second direction 154 may be perpendicular to a direction along which the optical axis 110 of the optical system 101 is aligned (e.g., the optical axis 110 may be aligned along a third direction 156 in FIG. 1A). The two directions along which the field of view 108 may extend are illustratively represented by the first field of view line 108-1 and the second field of view line 108-2 in FIG. 1A.
The optical system 101 may be configured according to an adapted beam steering solution that does not require MEMS components or liquid crystal components. In some aspects, the optical system 101 does not include any MEMS mirror and does not include any liquid crystal polarization grating. The optical system 101 may provide beam steering functionalities via an adapted optical component, as described in further detail below.
It is understood that the LIDAR system 100 may include other systems and/or components in addition to the optical system 101, e.g. one or more communication devices, a sensor fusion circuit, sensors for detection other than optical detection, etc.
FIG. 1B shows a schematic top view of the optical system 101 according to various aspects.
The optical system 101 may include a light source 102. The light source 102 may be configured to emit light, e.g. light having a predefined wavelength, for example in the infra-red and/or near infra-red range, such as in the range from about 700 nm to about 5000 nm, for example in the range from about 860 nm to about 1600 nm, or for example at 905 nm or 1550 nm. The light source 102 may be configured to emit light in a continuous manner or in a pulsed manner, for example the light source 102 may be configured to emit one or more light pulses (e.g., a sequence of light pulses). In some aspects, the light source 102 may include a laser source. By way of example the light source may include one or more laser diodes, e.g. one or more edge-emitting laser diodes or one or more vertical cavity surface emitting laser diodes. Illustratively, the light source 102 may be configured to emit laser light, e.g. one or more laser pulses, for example a sequence of laser pulses. In some aspects, the light source 102 may include a plurality of laser sources (e.g., a plurality of laser diodes) disposed along one direction to form a one-dimensional array, or disposed along two directions to form a two-dimensional array. As an example, the light source 102 may include a laser bar (e.g., including a plurality of laser diodes, such as 4 laser diodes or 8 laser diodes, as examples).
The optical system 101 may include an optical component 104 configured to direct the light emitted by the light source 102 towards the field of view 108 of the LIDAR system 100. The optical component 104 may include a plurality of segments (in other words, a plurality of adapted parts or adapted portions) , each configured to provide a respective emission direction towards the field of view 108 for the light impinging onto the segment, illustratively each segment may be configured to provide a respective redirection (a respective deflection angle) to the light impinging onto the segment.
In the exemplary configuration shown in FIG. 1B, the optical component 104 may include a first segment 106-1, a second segment 106-2, a third segment 106-3, a fourth segment 106-4, a fifth segment 106-5, a sixth segment 106-6, a seventh segment 106-7, and an eighth segment 106-8. It is however understood that the number of segments illustrated in FIG. 1B is only an example, and an optical component 104 may include a desired number of segments (e.g., five, six, eight, ten, or more than ten), adapted in accordance with a desired range to be provided for directing the light onto the field of view 108 and with a desired resolution, as described in further detail below. It is also understood that the representation of the optical component 104 shown in FIG. 1B (e.g., the shape, the arrangement of the segments, etc.) is only for the purpose of illustrating the principles of its operation, and other configurations may be provided for the optical component 104, as described in further detail below.
The first segment 106-1 may be configured to deflect the light emitted by the light source 102 towards the field of view 108 of the optical system 101 in a first emission direction 114, the second segment 106-2 may be configured to deflect the light emitted by the light source 102 towards the field of view 108 of the optical system 101 in a second emission direction, the third segment 106-3 may be configured to deflect the light emitted by the light source 102 towards the field of view 108 of the optical system 101 in a third emission direction, etc. Illustratively, each segment may be configured to deflect the light emitted by the light source 102 at a respective emission direction towards the field of view 108 in case (or when) the light emitted by the light source 102 impinges onto that segment. The emission directions provided by the segments may differ from one another in at least one angular component, e.g. a different angular component with respect to an optical axis 110 of the optical system 101, as described in further detail below.
Each emission direction may be associated with a respective portion of the field of view 108 . Each segment may be configured to direct the light impinging onto the segment towards a respective portion of the field of view 108 , as described in further detail below .
The first emission direction may be associated with a first portion of the field of view 108, the second emission direction may be associated with a second portion of the field of view 108, the third emission direction may be associated with a third portion of the field of view 108, etc. A portion of the field of view 108 associated with a segment (illustratively, a portion illuminated by the light deflected by that segment) may differ from a portion of the field of view 108 associated with another segment in at least one coordinate along one field of view direction, as described in further detail below.

In some aspects, the optical system 101 may include a controller 112 configured to control a continuous movement of the optical component 104. The controller 112 may be configured to control one or more properties associated with the continuous movement of the optical component 104, e.g. a starting time, an end time, a speed, an acceleration, etc. In various aspects, the optical system 101 may include a motor (not shown) configured to drive the continuous movement of the optical component 104. The motor may be part of the controller 112, or the controller 112 may be coupled with the motor, and the controller 112 may be configured to control an operation of the motor. The motor may include an electrical motor, such as an AC motor or a DC motor. As an example, the motor may include a spindle motor, or a servo motor. As an example, the motor may be a motor used in hard disk drive technology (e.g., capable of achieving 250 rps). In some aspects, the controller may be configured to control the light source 102 to emit light in accordance (e.g., in synchronization) with the continuous movement of the optical component 104, as described in further detail below (see for example FIG. 3A to FIG. 3D).
The continuous movement of the optical component 104 may provide that the segment of the optical component 104 that is illuminated by the light emitted by the light source 102 varies over time. Illustratively, the continuous movement of the optical component 104 may vary a relative arrangement of the optical component 104 (e.g., of the segments) with respect to the light source 102, such that the light emitted by the light source 102 impinges onto the optical component 104 at different locations over time.
The type and the properties of the continuous movement may be selected in accordance with the configuration of the optical component 104, e.g. with its shape, with the disposition of the segments, etc. The properties of the continuous movement may also be selected in accordance with desired properties of a scanning of the field of view 108, e.g. with a scanning speed, with an acquisition rate of detected light, etc. In some aspects, the continuous movement of the optical component 104 may include a continuous circular movement, e.g. a continuous rotation around an axis of the optical component 104, for example an axis aligned along a first direction 152 (e.g., a vertical direction with respect to a main surface of the optical component 104) and passing through the center of the optical component 104 in the exemplary configuration in FIG. 1A (see also FIG. 2A and FIG. 2B). The continuous circular movement may include a frequency of rotation in the range from 1 Hz to 400 Hz, for example in the range from 1 Hz to 300 Hz, for example in the range from 10 Hz to 250 Hz, for example a frequency of rotation equal to or greater than 75 Hz. In other aspects, the continuous movement of the optical component 104 may include a continuous linear movement, e.g. a rectilinear movement (for example along a rail). The continuous linear movement may include a speed of movement in the range from 10 cm/s to 100 cm/s, for example in the range from 20 cm/s to 50 cm/s. A speed or a frequency of rotation of a continuous movement of the optical component 104 may be adapted in accordance with a frame rate of the scanning of the field of view 108, as described in further detail below.
In some aspects, the optical component 104 may be configured such that the light emitted by the light source 102 impinges onto a different segment during different portions of the continuous movement (e.g., during different time intervals) . The configuration of the optical component 104 may be understood as an arrangement of the segments relative to the light source 102 (and of the segments relative to one another) that provides that the continuous movement of the optical component 104 allows different segments to be illuminated by the light emitted by the light source 102 during the continuous movement.
The optical component 104 may be configured in such a way that the light emitted by the light source 102 impinges onto the first segment 106-1 during a first portion of the continuous movement of the optical component 104, that the light emitted by the light source 102 impinges onto the second segment 106-2 during a second portion of the continuous movement of the optical component 104 , that the light emitted by the light source 102 impinges onto the third segment 106-3 during a third portion of the continuous movement of the optical component 104 , etc .
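Only as an illustrative sketch, and not as part of the disclosed subject-matter, the way the continuous movement sequentially exposes the segments may be pictured as a simple mapping from the momentary angular position of the optical component to the segment onto which the emitted light impinges. The function name, the assumption of a circular movement, and the assumption of equally wide segments covering the full circumference are choices made for this example only.

```python
def illuminated_segment(angular_position_deg: float, num_segments: int = 8) -> int:
    """Index (0-based) of the segment onto which the emitted light impinges,
    assuming a continuous circular movement and equally wide segments
    distributed over the full 360 degrees."""
    segment_width_deg = 360.0 / num_segments
    return int((angular_position_deg % 360.0) // segment_width_deg)

# With eight segments, e.g., the first segment 106-1 would be illuminated during
# the first 45 degrees of a revolution, the second segment 106-2 during the next
# 45 degrees, and so on.
print(illuminated_segment(10.0))  # 0
print(illuminated_segment(50.0))  # 1
```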
The duration of a portion of the continuous movement during which light emitted by the light source 102 impinges onto a segment may be adapted by controlling the properties of the continuous movement and the properties of the segment, e.g. by adapting a frequency of rotation or a speed of the continuous movement, and/or by adapting a lateral extension (e.g., a width) of the segments. A portion of the continuous movement during which light emitted by the light source 102 impinges onto a segment may also be referred to herein as a portion of the continuous movement associated with that segment (e.g., the first portion is associated with the first segment 106-1, the second portion is associated with the second segment 106-2, etc.). As a numerical example, a portion of the continuous movement associated with a segment may have a duration in the range from 100 µs to 300 ms, for example in the range from 5 ms to 10 ms, for example in the range from 20 µs to 500 µs, for example about 2.2 ms. In some aspects, the optical component 104 may be configured such that the portions of the continuous movement associated with different segments have a same duration.
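As a minimal numerical sketch of the relationship described above, the duration of the portion of the continuous movement associated with one segment may be estimated from the frequency of rotation and the number of segments; the assumption of segments with equal angular extent, the function name, and the chosen values are illustrative only and are not prescribed by the present description.

```python
def dwell_time_per_segment_s(rotation_frequency_hz: float, num_segments: int) -> float:
    """Duration during which the emitted light impinges onto one segment,
    assuming equally wide segments sharing one full revolution."""
    return (1.0 / rotation_frequency_hz) / num_segments

# Eight segments rotating at 75 Hz give roughly 1.7 ms per segment, i.e. in the
# same order of magnitude as the exemplary duration of about 2.2 ms mentioned above.
print(f"{dwell_time_per_segment_s(75.0, 8) * 1e3:.2f} ms")
```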
It is understood that the control over the emission direction of the light into the field of view 108 may be provided also in a configuration in which the optical component 104 is stationary and the light source 102 is continuously moved to emit light onto different segments of the optical component 104. As a further example, the control over the emission direction of the light into the field of view 108 may also be provided in a configuration in which both the optical component 104 and the light source 102 are stationary, and an optical arrangement is used to continuously vary an impinging location of the light onto the optical component 104. Illustratively, the continuous movement of the optical component 104 may be understood, in some aspects, as a continuous variation of the relative arrangement between the optical component 104 and the light source 102, or as a continuous variation of the impinging location of the light emitted by the light source 102 onto the optical component 104. An impinging location of the light may also be referred to herein as an impingement location of the light.
The configuration of the optical component 104 in combination with the continuous variation of the relative arrangement between the optical component 104 and the light source 102 provides a continuous control over the emission direction of the light into the field of view 108. During the first portion of the continuous movement the light may be directed towards the first portion of the field of view 108, during the second portion of the continuous movement the light may be directed towards the second portion of the field of view 108, during the third portion of the continuous movement the light may be directed towards the third portion of the field of view 108, etc.
As described in relation to FIG. 1A, the emission direction into the field of view 108 may be varied along one or more directions, e.g. at least along a first direction (e.g., the direction 152 in FIG. 1A and FIG. 1B) and a second direction (e.g., the direction 154 in FIG. 1A and FIG. 1B).
The redirection of the light emitted by the light source 102 operated by the optical component 104 and by the continuous variation of the impinging location of the light onto the optical component 104 is further illustrated in FIG. 1C to FIG. 1G.
FIG. 1C and FIG. 1D illustrate schematically the deflection of light operated by one of the segments of the optical component 104, e.g. by the first segment 106-1. FIG. 1E and FIG. 1F illustrate schematically the deflection of light operated by another one of the segments of the optical component 104, e.g. by the second segment 106-2. The emission direction provided by a segment of the optical component 104 may form a respective angle with the optical axis 110 of the optical arrangement 100 (for the sake of representation illustrated as passing through the central part of a segment in FIG. 1C to FIG. 1F). The angle formed by the emission direction with the optical axis 110 may have one component in the first direction 152 and another component in the second direction 154. Illustratively, an emission direction may be at an angle with the optical axis 110 both in the first direction 152 and in the second direction 154.
A segment of the optical component 104 may be configured to provide a respective deflection angle for the light. Illustratively, each segment may be configured such that light impinging onto the segment is deflected along a direction forming a deflection angle with the optical axis 110 that is associated (only) with that segment. The deflection angle may also be referred to herein as emission angle.
In the exemplary configuration shown in FIG. 1C and FIG. 1D, the first emission direction 114 (provided by the first segment 106-1) may be at a first emission angle with respect to the optical axis 110 of the optical system 101. The first emission angle may include a first component 116-1 along the first field of view direction 152 (see FIG. 1C) and a second component 116-2 along the second field of view direction 154 (see FIG. 1D). In the exemplary configuration shown in FIG. 1E and FIG. 1F, a second emission direction 118 (provided by the second segment 106-2) may be at a second emission angle with respect to the optical axis 110 of the optical system 101. The second emission angle may include a respective first component 120-1 (also referred to herein as third component 120-1) along the first field of view direction 152 (see FIG. 1E) and a respective second component 120-2 (also referred to herein as fourth component 120-2) along the second field of view direction 154 (see FIG. 1F). It is understood that the same may apply to the other segments not illustrated in FIG. 1C to FIG. 1F, e.g. a third emission direction (provided by the third segment 106-3) may be at a third emission angle with respect to the optical axis 110 of the optical system 101, and the third emission angle may include a respective first component (also referred to herein as fifth component) along the first field of view direction 152 and a respective second component (also referred to herein as sixth component) along the second field of view direction 154, etc.
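Purely as an illustration of the angle decomposition used above, an emission direction may be split into one angular component along the first field of view direction and one along the second field of view direction. The coordinate convention (optical axis 110 along z, first field of view direction 152 along y, second field of view direction 154 along x) and the function name are assumptions made for this sketch, not a definition taken from the figures.

```python
import math

def emission_angle_components_deg(x: float, y: float, z: float) -> tuple:
    """Angular components of an emission direction with respect to an optical
    axis assumed along z: first component in the y-z plane (e.g., vertical),
    second component in the x-z plane (e.g., horizontal)."""
    first_component = math.degrees(math.atan2(y, z))
    second_component = math.degrees(math.atan2(x, z))
    return first_component, second_component

# A direction built from a 3 degree first component and a 30 degree second component:
d = (math.tan(math.radians(30.0)), math.tan(math.radians(3.0)), 1.0)
print(emission_angle_components_deg(*d))  # approximately (3.0, 30.0)
```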
The segments of the optical component 104 may be configured such that at least one of the respective first component or second component formed by the respective emission direction with the optical axis 110 is associated only with that segment .
Illustratively, the segments of the optical component 104 may be configured to provide a respective deflection angle associated (only) with that segment into one of the first direction 152 or the second direction 154. The deflection into the other one of the first direction 152 or the second direction 154 may be provided by the continuous variation of the relative arrangement of the optical component 104 with respect to the light source 102. In the following description it may be assumed that each segment is configured to provide a respective deflection associated (only) with that segment into the first direction 152 (e.g., in the vertical direction), and that the deflection in the second direction 154 (e.g., in the horizontal direction) may be provided by the continuous movement of the optical component 104 with respect to the light source 102. It is however understood that the description may apply in a similar manner in case each segment was configured to provide a respective deflection associated with that segment into the second direction 154, and in case the deflection in the first direction 152 was provided by the continuous movement of the optical component 104 with respect to the light source 102.
A portion of the field of view 108 illuminated by the light deflected by a segment may have a coordinate along the first direction 152 (a vertical coordinate in the field of view 108) associated (only) with that segment. The first portion associated with the first segment 106-1 may have a first vertical coordinate, the second portion associated with the second segment 106-2 may have a second vertical coordinate, the third portion associated with the third segment 106-3 may have a third vertical coordinate, etc.
A segment may be configured such that the component of the angle formed by the respective emission direction with the optical axis 110 that is associated with that segment (e.g., the respective first component along the first direction 152, in the exemplary configuration described herein) may remain constant during the portion of the continuous movement of the optical component 104 during which the light impinges onto that segment. Illustratively, light is deflected by a segment at a same angle along the first direction 152 during the period in which the light impinges onto that segment.
In the exemplary configuration described herein, the first segment 106-1 may be configured such that the first component 116-1 of the first emission angle remains constant during the first portion of the continuous movement of the optical component 104. The second segment 106-2 may be configured such that the respective first component 120-1 of the second emission angle remains constant during the second portion of the continuous movement of the optical component 104. The third segment 106-3 may be configured such that the respective first component of the third emission angle remains constant during the third portion of the continuous movement of the optical component 104, etc. The first component 116-1 of the first emission angle may be different from the first component 120-1 of the second emission angle (and from the first component of the third emission angle, from the first component of the fourth emission angle, etc.). The first vertical coordinate of the first portion of the field of view 108 may be different from the second vertical coordinate of the second portion of the field of view 108 (and from the third vertical coordinate of the third portion of the field of view 108, etc.).
In some aspects, a difference between the first component of emission angles associated with adjacent segments of the optical component 104, illustratively associated with segments that are illuminated one after the other during the continuous movement of the optical component 104, may be selected in accordance with a desired resolution in the first direction 152 (a desired vertical resolution), and in accordance with a total number of segments.
A desired resolution may be dependent on the properties of a detector used for detecting light from the field of view 108 (see for example FIG. 1H) , e.g. on a number of pixels of the detector. The field of view 108 may be figuratively divided in a first number of pixels along the first direction 152 and in a second number of pixels along the second direction 154. The total number of pixels may define the resolution with which the field of view 108 may be illuminated, and with which light may be detected from the field of view 108. The difference between the first component of emission angles associated with adjacent segments may be selected based on how many pixels of the field of view 108 may be simultaneously illuminated along the first direction 152 by the light emitted by the light source 102.
Only as an exemplary calculation for determining a desired difference between the first component of the emission angles associated with adjacent segments, it may be assumed that the field of view 108 includes 504 pixels in the second direction 154 and 128 pixels in the first direction 152 (e.g., in case a detector including 128 photo diodes is used, as discussed in further detail below). It may be further assumed that the light source 102 is configured to simultaneously illuminate 16 pixels in the first direction 152 (e.g., in case the light source 102 includes 4 laser diodes). It may be further assumed that the 128 pixels correspond to an angular range of 24°, and that the 504 pixels correspond to an angular range of 120°. In this configuration, in case the optical component 104 includes 8 segments, a difference of 3° may be provided for a resolution of 24°/128 ≈ 0.19°. In case only 1 pixel were illuminated by the light emitted by the light source 102, a difference of 0.19° may be provided to obtain the same resolution.
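Only as a non-limiting illustration, the short Python sketch below restates the exemplary calculation above; all values (128 vertical pixels over 24°, 16 simultaneously illuminated pixels, 8 segments) are the example numbers from the text and not part of the described configuration.

```python
# Minimal sketch reproducing the exemplary resolution calculation above.
# All numbers are the illustrative values from the text, not limiting values.

vertical_pixels = 128       # pixels of the field of view along the first direction
vertical_range_deg = 24.0   # angular range covered along the first direction
pixels_per_shot = 16        # pixels illuminated simultaneously (e.g., 4 laser diodes)
num_segments = 8            # segments of the optical component

# Desired vertical resolution: angular size of one pixel.
resolution_deg = vertical_range_deg / vertical_pixels      # 24/128 = 0.1875 ≈ 0.19°

# Difference between the first component of the emission angles of adjacent
# segments: one segment step covers the simultaneously illuminated pixels.
segment_step_deg = pixels_per_shot * resolution_deg        # 16 * 0.1875 = 3°

# Consistency check: 8 segments x 16 pixels per segment cover all 128 pixels.
assert num_segments * pixels_per_shot == vertical_pixels

print(f"resolution ≈ {resolution_deg:.2f}°, step between adjacent segments = {segment_step_deg:.1f}°")
```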
As a numerical example, a difference (e.g., an absolute value of a difference) between the first component of emission angles associated with adjacent segments (e.g., a difference between the first component 116-1 of the first emission angle and the first component 120-1 of the second emission angle) may be in the range from 0.1° to 10°, for example in the range from 0.25° to 5°, for example 1.5°, for example 3°.
In some aspects, a segment may be configured such that the respective second component of the emission angle in the second direction 154 varies in a predefined range during the portion of the continuous movement of the optical component 104 associated with that segment. Illustratively, the variation of the relative arrangement between the optical component 104 and the light source 102 during the continuous movement provides that the light is deflected at a varying angle (e.g., at an angle having a varying second component) in the second direction 154. Stated in a different fashion, the variation of the relative arrangement between the optical component 104 and the light source 102 during the continuous movement provides that the light impinges onto a segment at varying impinging locations, each associated with a respective second component of the emission angle (e.g., each impinging location providing a respective deflection into the field of view 108 along the second direction 154). The continuous movement of the optical component 104 may provide that a relative orientation of a segment with respect to the light source 102 varies during the portion of the continuous movement associated with that segment, in such a way that the light is deflected at an angle having a varying component in the second direction 154.
In the exemplary configuration shown in FIG. 1D, the first segment 106-1 may be configured such that the second component 116-2 of the first emission angle varies during the first portion of the continuous movement of the optical component 104. The optical component 104 may be configured in such a way that the light emitted by the light source 102 impinges onto the first segment 106-1 at a plurality of first impinging locations during the first portion of the continuous movement of the optical component 104, each first impinging location being associated with a respective second component 116-2 of the first emission angle. In some aspects, the optical component 104 may be configured in such a way that an orientation between the first segment 106-1 and the light source 102 varies during the first portion of the continuous movement of the optical component 104, such that the second component 116-2 of the first emission angle varies accordingly.
In the exemplary configuration shown in FIG. 1F, the second segment 106-2 may be configured such that the respective second component 120-2 of the second emission angle varies during the second portion of the continuous movement of the optical component 104. The optical component 104 may be configured in such a way that the light emitted by the light source 102 impinges onto the second segment 106-2 at a plurality of second impinging locations during the second portion of the continuous movement of the optical component 104, each second impinging location being associated with a respective second component 120-2 of the second emission angle. In some aspects, the optical component 104 may be configured in such a way that an orientation between the second segment 106-2 and the light source 102 varies during the second portion of the continuous movement of the optical component 104, such that the second component 120-2 of the second emission angle varies accordingly.
It is understood that the same may apply for the other segments not illustrated in FIG. 1C to FIG. 1F, e.g. the third segment 106-3 may be configured such that the respective second component of the third emission angle varies during the third portion of the continuous movement of the optical component 104, etc.
As a further description, as exemplarily indicated by the arrows parallel to the second direction 154 in FIG. 1D and FIG. 1F, the optical component 104 may be configured in such a way that the impinging location of the light emitted by the light source 102 onto a segment moves along that segment in a direction parallel to the second direction 154 during the portion of the continuous movement of the optical component 104 associated with that segment. Illustratively, the optical component 104 may be configured in such a way that the continuous variation of the relative arrangement between the optical component 104 and the light source 102 provides that the impinging location of the light onto a segment travels along the extension (e.g., the width) of the segment in the second direction 154 during the associated portion of the continuous movement. The translation of the impinging location in the second direction 154 provides the variation of the second component of the emission angle associated with that segment.
As shown in the exemplary configuration in FIG. 1D, the optical component 104 may be configured in such a way that the impinging location of the light emitted by the light source 102 onto the first segment 106-1 moves along the first segment 106-1 in a direction parallel to the second direction 154 during the first portion of the continuous movement of the optical component 104. As shown in the exemplary configuration in FIG. 1F, the optical component 104 may be configured in such a way that the impinging location of the light emitted by the light source 102 onto the second segment 106-2 moves along the second segment 106-2 in a direction parallel to the second direction 154 during the second portion of the continuous movement of the optical component 104. It is understood that the same may apply for the other segments not illustrated in FIG. 1C to FIG. 1F, e.g. the optical component 104 may be configured in such a way that the impinging location of the light emitted by the light source 102 onto the third segment 106-3 moves along the third segment 106-3 in a direction parallel to the second direction 154 during the third portion of the continuous movement of the optical component 104, etc.
In some aspects, the second component of the emission angle associated with different segments may vary within a same angular range. The movement of the impinging location of the light onto a segment may provide that the second component of the emission angle varies from an initial value (as soon as the light starts impinging onto that segment) to a final value (when the light reaches the end of the segment, before moving onto the next segment). The angular range may be dependent on the lateral extension of the segment in the second direction 154, and on the relative orientation between the segment and the light source 102. As a numerical example, the angular range for the second component may be between -60° and +60° with respect to the optical axis 110 of the optical system 101 along the second direction 154, for example between -45° and +45°, for example between -30° and +30°. Illustratively, a total angular range for the field of view 108 in the second direction 154 may be for example 120°, for example 90°, for example 60°. The configuration of the optical system 101 may be adapted to provide a desired angular range, e.g. to provide a desired dimension of the field of view 108 to be scanned with the emitted light.
In some aspects, each segment may be configured such that the respective second component of the emission angle varies within a same angular range as the second component of the emission angle associated with the other segments. In the exemplary configuration shown in FIG. 1D and FIG. 1F, the second component 116-2 of the first emission angle and the second component 120-2 of the second emission angle may vary within a same angular range during the respective (first and second) portion of the continuous movement of the optical component 104. The same may apply to the other segments not illustrated in FIG. 1C to FIG. 1F, e.g. the second component of the third emission angle may also vary within the same angular range as the second component 116-2 of the first emission angle and the second component 120-2 of the second emission angle during the respective (third) portion of the continuous movement of the optical component 104, etc.
An illustrative representation of the scanning of the field of view 108 provided by the optical system 101 is shown in FIG. 1G. The light emitted into the field of view 108 is represented by the circle 122. During the various portions of the continuous movement of the optical component 104 associated with the different segments, the light moves in the field of view 108 along the second direction 154 at a respective (vertical) coordinate in the first direction 152 (provided by the respective first component of the emission angle associated with the illuminated segment). In the exemplary representation in FIG. 1G, the light 122 may move along a first trajectory 124-1 at a first vertical coordinate during a first portion of the continuous movement, then along a second trajectory 124-2 at a second vertical coordinate during a second portion of the continuous movement, then along a third trajectory 124-3 at a third vertical coordinate during a third portion of the continuous movement, and so on, up to an n-th trajectory 124-n at an n-th vertical coordinate during an n-th portion of the continuous movement. For example, the first trajectory 124-1 may be provided by light impinging onto the first segment 106-1, the second trajectory 124-2 may be provided by light impinging onto the second segment 106-2, the third trajectory 124-3 may be provided by light impinging onto the third segment 106-3, etc. After the n-th trajectory 124-n, the light 122 may return to the first trajectory 124-1 once the continuous movement of the optical component 104 again brings the light to impinge onto the first segment 106-1. By way of illustration, the emission of light into the field of view 108 may be understood as a line-by-line scan of the field of view 108 (or column-by-column in case a reversed configuration were implemented), e.g. analogous to a cathode-ray tube for a television. In some aspects, the emission of light may be understood as a line-by-line horizontal deflection of the light (e.g., of a laser beam) over the entire angular range (e.g., over an angular range of 120°).
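A hedged sketch of this line-by-line scan is given below; it maps an angular position of a rotating, disk-shaped optical component to the two components of the emission angle. The segment count (8), the 3° step between adjacent segments, and the 120° horizontal range are the example values used elsewhere in this description, while the assumption of a linear sweep within a segment is made purely for illustration.

```python
# Hedged sketch: map the rotation angle of the optical component to the
# (first, second) components of the emission angle of the deflected light.
# Segment count, step, and horizontal range are example values from the text;
# the linear sweep within a segment is an illustrative assumption.

NUM_SEGMENTS = 8
SEGMENT_ARC_DEG = 360.0 / NUM_SEGMENTS   # rotation angle covered by one segment
VERTICAL_STEP_DEG = 3.0                  # difference in the first component between adjacent segments
HORIZONTAL_RANGE_DEG = 120.0             # range of the second component (e.g., -60° to +60°)

def emission_angle(rotation_deg: float) -> tuple:
    """Return (segment index, first component, second component) for a given
    angular position of the optical component during its continuous rotation."""
    rotation_deg %= 360.0
    segment = int(rotation_deg // SEGMENT_ARC_DEG)                 # illuminated segment
    fraction = (rotation_deg % SEGMENT_ARC_DEG) / SEGMENT_ARC_DEG  # position along the segment, 0..1

    first_component = segment * VERTICAL_STEP_DEG                  # constant while one segment is lit
    second_component = (fraction - 0.5) * HORIZONTAL_RANGE_DEG     # sweeps e.g. -60°..+60°
    return segment, first_component, second_component

# Example: trajectory 124-1 (segment 0) swept from left to right.
for rot in (0.0, 22.5, 44.9):
    print(emission_angle(rot))
```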
An extent of a trajectory in the second direction 154 may be defined by the angular range provided for the second component of the emission angle. A distance in the first direction 152 between different trajectories may be defined by the difference between the respective first components of the emission angle associated with the respective segments.
In some aspects, a resolution in the first direction 152 may be associated with the properties of the light source 102. In case the light source 102 includes a plurality of light sources (e.g., in case the light source 102 is configured as a laser bar including a plurality of laser diodes) , each light source may be configured (e.g., arranged) to illuminate a respective location of a segment of the optical component 104 during the associated period of the continuous movement. Illustratively, each light source may be configured to illuminate the segment at a respective coordinate along the lateral extension of the segment parallel to the first direction 152, e.g. at a respective height in the segment. This configuration may provide that the light redirected by a segment into the field of view 108 includes a plurality of components (each associated with the respective light source) , each at a respective vertical coordinate (within the vertical coordinate of the trajectory associated with that segment) . This may provide a finer resolution for the illumination of the scene compared to using a single light source, e.g. with a broader laser spot.
The plurality of light sources may be configured such that a segment is fully illuminated along the lateral extension of the segment parallel to the first direction 152 by the light emitted by the plurality of light sources. The plurality of light sources may be configured such that a segment is fully illuminated along its height by the light provided by the different light sources at different coordinates. A resolution in the first direction 152 may increase with an increasing number of light sources. Illustratively, the number of illuminated pixels of the field of view 108 in the vertical direction may increase with an increasing number of light sources. As a numerical example, four laser diodes may illuminate 16 pixels in the vertical direction 152 of the field of view 108 (and 1 pixel in the horizontal direction 154).
In some aspects, as will now be described in relation to FIG. 1H and FIG. 1I, the optical component 104 may be configured to provide an improved detection of light from the field of view 108, e.g. a light detection with reduced noise, e.g. a light detection with increased signal-to-noise ratio. FIG. 1H shows the optical system 101 in a schematic top view according to various aspects. In the representation in FIG. 1H, the optical system 101 is illustrated with components configured for detecting light from the field of view 108. The configuration in FIG. 1H may optionally be implemented in case detection of light is to be performed via the optical system 101; otherwise, the optical system 101 may be configured as described in relation to FIG. 1B, in case detection of light is assigned to a different detection system or is based on a different strategy.
The optical system 101 may include a detector 126 configured to detect light, e.g. the detector 126 may be configured to provide an analog signal (e.g., a current or a voltage) in accordance with the light received at the detector 126. The detector 126 may include one or more photo diodes, each configured to provide an analog signal (e.g., a photo current) in response to light impinging onto the photo diode. The one or more photo diodes may include at least one of a pin photo diode, an avalanche photo diode, a single photon avalanche photo diode, or a silicon photomultiplier, as examples.
In some aspects, the detector 126 may include a plurality of photo diodes (e.g., of the same type or of different types); illustratively, the detector 126 may include a plurality of pixels each including or associated with a respective photo diode. In this configuration, the plurality of photo diodes may form an array, e.g. a one-dimensional or two-dimensional array. Illustratively, the photo diodes may be disposed along one direction (e.g., a first direction, such as a vertical direction or a horizontal direction), or may be disposed along two directions, e.g. a first (e.g., horizontal) direction and a second (e.g., vertical) direction. Further illustratively, the photo diodes may form a column array or a line array. As a numerical example, the detector 126 may include 32 photo diodes, or 64 photo diodes, or 128 photo diodes. The number of photo diodes may be selected in accordance with a desired resolution, as described above. In some aspects, the plurality of photo diodes may form an array along a direction parallel to the direction in which the segments of the optical component 104 provide a respective different deflection angle, e.g. the plurality of photo diodes may be aligned along a direction parallel to the first field of view direction 152 (forming a column array). In some aspects, the detector 126 may have an optical aperture in the range from 50 mm2 to 1000 mm2, for example in the range from 150 mm2 to 600 mm2, for example 400 mm2.
In some aspects, the optical component 104 may include a plurality of receive optical elements (also referred to herein as tracking filters), each associated with a respective segment of the plurality of segments. In the exemplary configuration shown in FIG. 1H, the optical component 104 may include a first receive optical element 128-1 associated with the first segment 106-1, a second receive optical element 128-2 associated with the second segment 106-2, a third receive optical element 128-3 associated with the third segment 106-3, a fourth receive optical element 128-4 associated with the fourth segment 106-4, a fifth receive optical element 128-5 associated with the fifth segment 106-5, a sixth receive optical element 128-6 associated with the sixth segment 106-6, a seventh receive optical element 128-7 associated with the seventh segment 106-7, and an eighth receive optical element 128-8 associated with the eighth segment 106-8. It is however understood that the number of receive optical elements illustrated in FIG. 1H is only an example, and an optical component 104 may include a desired number of receive optical elements in accordance with the number of segments.
A receive optical element may be disposed relative to the associated segment in such a way that a direct reflection of the emitted light deflected by the segment impinges onto that receive optical element. Illustratively, a receive optical element may be disposed in the optical component 104 such that, when the continuous movement of the optical component 104 brings the associated segment to be illuminated by the light emitted by the light source 102, the receive optical element is in a position to receive light from the portion of the field of view 108 associated with that segment. In some aspects, a rigid linear relationship (a rigid linear link) may be provided between a receive optical element and the associated segment. In some aspects, a predefined (e.g., fixed) angular relationship may be provided between a receive optical element and the associated segment. Illustratively, a receive optical element may be oriented with respect to the associated segment (and with respect to the field of view) in such a way that the light deflected by the segment into the field of view is received at the receive optical element.
In the exemplary configuration shown in FIG. 1H, the first segment 106-1 and the first receive optical element 128-1 are disposed relative to one another such that the first receive optical element 128-1 receives light associated with a direct reflection 114r of the light deflected in the first emission direction 114. The second segment 106-2 and the second receive optical element 128-2 may be disposed relative to one another such that the second receive optical element 128-2 receives light associated with a direct reflection of the light deflected in the second emission direction. The third segment 106-3 and the third receive optical element 128-3 may be disposed relative to one another such that the third receive optical element 128-3 receives light associated with a direct reflection of the light deflected in the third emission direction, etc.
A direct reflection is not the only mechanism that causes light to travel back towards the optical system 101 from the field of view 108; the light may, for example, also be scattered back. Described in a different fashion, the first receive optical element 128-1 and the first segment 106-1 may be disposed relative to one another (e.g., with the predefined angular relationship) such that the first receive optical element 128-1 receives light coming from the first portion of the field of view 108, the second receive optical element 128-2 and the second segment 106-2 may be disposed relative to one another such that the second receive optical element 128-2 receives light coming from the second portion of the field of view 108, the third receive optical element 128-3 and the third segment 106-3 may be disposed relative to one another such that the third receive optical element 128-3 receives light coming from the third portion of the field of view 108, etc.
In some aspects, a receive optical element may be configured to allow the light coming from the (illuminated) portion of the field of view 108 associated with the respective segment to reach the detector 126. For example, a receive optical element may be configured such that only the light associated with the direct reflection of the light deflected by the associated segment is delivered to the detector 126.
The plurality of receive optical elements may ensure that the direct reflection of the light directed towards the field of view 108 during a respective period of the continuous movement of the optical component is received at the detector 126, and at the same time may ensure that other light (e.g., noise light, such as sun light, or light from other sources in the field of view 108) may not arrive at the detector, as described in further detail below (see also FIG. 1I). Illustratively, the plurality of receive optical elements may ensure that the light coming from the portion of the field of view 108 into which light was emitted (or is being emitted) is received at the detector 126, while light from other portions of the field of view is prevented from reaching the detector 126. A receive optical element may be figuratively configured such that only a subset of pixels of the field of view in the horizontal direction are imaged onto the detector 126 (the subset of pixels being illuminated), for example 5 pixels, for example 10 pixels, for example 20 pixels.
A receive optical element may be configured to transmit or reflect the received light. In some aspects, a receive optical element may be configured to be optically transparent (for the wavelength of the emitted light) such that the light received at the receive optical element may travel further towards the detector (while the other light gets blocked by the optical component 104). As a numerical example, a receive optical element may have a transmission rate greater than 70%, for example greater than 90%, for example substantially 100%. In other aspects, a receive optical element may be configured to reflect the received light towards the detector 126 (while the other light may be blocked by or may pass through the optical component 104). As a numerical example, a receive optical element may have a reflectivity (at the wavelength of the emitted light) greater than 70%, for example greater than 90%, for example substantially 100%. In the exemplary configuration shown in FIG. 1H, the first receive optical element 128-1, the second optical element 128-2, the third optical element 128-3, etc. may be configured to transmit or reflect the received light (the light associated with a direct reflection of the light deflected by the respective segment, the light coming from the portion of the field of view 108 associated with the respective element).
In some aspects, the optical component 104 may have light transmission or light absorbing properties in accordance with the configuration of the receive optical elements. As an example, the optical component 104 may be configured to absorb light (e.g., in case the receive optical elements are transparent) , e.g. may be configured such that light arriving onto the optical component 104 (at a location other than a segment or a receive optical element) is absorbed. Illustratively, the light arriving onto the optical component 104 (e.g., from the field of view 108) may be blocked by the optical component 104, without traveling further to the detector 126. As a numerical example, the optical component 104 (at a location other than a segment or a receive optical element) may be configured to have a transmission rate less than 20%, for example less than 10%, for example less than 1%. As another example, the optical component 104 may be configured to let light pass through the optical component 104 (e.g., in case the receive optical elements are reflective) . Illustratively, the optical component 104 may be transparent, so that light that is not reflected by a receive optical element towards the detector 126 may travel away.
The properties of the receive optical elements may be adapted in accordance with the configuration of the optical component 104 and of the optical system 101, e.g. the size, the shape, the arrangement, etc., of the receive optical elements may be adapted based on the configuration of the segments, based on a disposition of the detector 126 in the optical system 101, etc. Possible configurations and arrangements of the receive optical elements will be described in further detail below. Only as a numerical example, a receive optical element may have a width in the range from 1 mm to 15 mm, for example in the range from 1.5 mm to 10 mm, for example in the range from 2 mm to 6 mm, and a height in the range from 1 mm to 15 mm, for example in the range from 1.5 mm to 10 mm, for example in the range from 2 mm to 6 mm. As an example, a receive optical element may have an elongated shape, for example a rectangular shape.
An illustrative representation of the detection of light from the field of view 108 is shown in FIG. 1I. The light associated with the portion of the field of view 108 that is currently illuminated is represented by the circle 130. As exemplarily shown in FIG. 1I, the first receive optical element 128-1 allows following the illuminated portion 130 along the first trajectory 124-1, such that this light may be provided to the detector 126, while a light absorbing portion 132 of the optical component 104 absorbs or blocks other light that may otherwise arrive at the detector 126.
In some aspects, a frame rate for the detection of light from the field of view 108 may be associated with a frequency of rotation (or a speed) of the continuous movement of the optical component 104. A frame may be understood as a complete scan of the field of view 108 (in both the horizontal direction and the vertical direction). A frame rate may be understood as a number of frames that are acquired in a defined period of time, e.g. in 1 s. Averaging over several scans of the field of view may be carried out, e.g. a frame may include a plurality of accumulated scans of the field of view. The rotation frequency may be a multiple of the frame rate. Assuming, as a numerical example, a frame rate of 25 Hz and a 10-fold averaging, a rotation frequency may be 250 Hz. The light detection described above may be illustratively understood, in some aspects, as a tracking of the scanned horizontal and vertical angle range through an optical tracking filter to increase the signal-to-noise ratio.
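A minimal sketch of the relation between frame rate and rotation frequency, using the example values above and assuming (as described for the line-by-line scan) that one rotation corresponds to one complete scan of the field of view:

```python
# Minimal sketch, using the example values from the text: with one full scan
# of the field of view per rotation, the rotation frequency is the frame
# rate multiplied by the number of accumulated scans per frame.

frame_rate_hz = 25     # frames (complete, possibly averaged scans) per second
accumulations = 10     # scans accumulated (averaged) per frame (10-fold averaging)

rotation_frequency_hz = frame_rate_hz * accumulations
print(rotation_frequency_hz)  # 250 Hz, i.e. 15000 rpm
```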
Aspects associated with the optical component 104 will be described in further detail below. In the figures, exemplary realizations of the optical component 104 may be shown. It is understood that the properties described in relation to the illustrated exemplary realizations may apply also to other configurations of the optical component 104 that are not explicitly illustrated. As an example, the properties described in relation to an optical component having a disk shape (see for example FIG. 2A) may apply also to an optical component having a different shape, and the like.
FIG. 2A and FIG. 2B show an optical component 200 in a schematic view according to various aspects. The optical component 200 may be an exemplary realization of the optical component 104 described in relation to FIG. 1A to FIG. 1I. In FIG. 2A and FIG. 2B the optical component 200 may have a disk shape (it may be described as a segment disc). It is understood that the disk shape is an example, and the optical component 200 may have other shapes, such as a polygonal shape having a number of sides associated with the number of segments (e.g., a hexagonal shape, a heptagonal shape, an octagonal shape, a decagonal shape, etc.), or a band-like shape as described in further detail below. An optical component (e.g., the optical component 200) may have a symmetric shape, e.g. a shape that provides a smooth transition from one deflection angle to the next (from one segment to the next). The optical component 200 may have a symmetrical mass distribution (concentricity). The optical component 200 may have a rigid structure.
The size of an optical component (e.g., of the optical component 200) may be adapted according to the configuration of an optical system (e.g., of the optical system 101). As a numerical example, in case of a disk shaped optical component, the disk shaped optical component may have a radius in the range from 10 mm to 100 mm, for example in the range from 25 mm to 75 mm, for example in the range from 30 mm to 60 mm, for example in the range from 25 mm to 50 mm, for example a radius of 50 mm.
The optical component 200 may include a main surface 202s (e.g., including a main top surface and a main bottom surface) and a side surface 204s (e.g., including a plurality of side surfaces 204s-1, 204s-2, 204s-3, also referred to as a plurality of partial side surfaces).
In some aspects, the plurality of segments may be disposed along the side surface 204s of the optical component 200. The side surface 204s of the optical component 200 may be adapted to provide the plurality of segments, e.g. each partial side surface may correspond to a respective segment. In the exemplary illustration in FIG. 2A, a first segment may include the first side surface 204s-1 (e.g., a first segment may correspond to the first side surface 204s-1), a second segment may include the second side surface 204s-2 (e.g., a second segment may correspond to the second side surface 204s-2), a third segment may include the third side surface 204s-3 (e.g., a third segment may correspond to the third side surface 204s-3), etc. Illustratively, in the exemplary configuration in FIG. 2A and FIG. 2B, the segments (e.g., the first segment, the second segment, the third segment, etc.) may be disposed along the side surface of the disk shaped optical component 200. It is understood that the optical component 200 may include more than three adapted segments, e.g. as described above in relation to the optical component 104. A segment may be understood, in some aspects, as an edge segment of the optical component 200.
The deflection angle associated with a segment may be provided by adapting the geometrical properties of the segment. In some aspects, each segment (each partial side surface) may be tilted at a respective tilting angle with respect to the main surface 202s of the optical component 200. The tilting angle associated with a segment may be understood as an angle formed between the surface of the segment (e.g., the associated side surface) and the main surface 202s of the optical component. The tilting angle associated with a segment may also be understood as an angle formed between the surface of the segment and the direction from which light impinges on that segment (see FIG. 2B). In the exemplary configuration in FIG. 2A, the first side surface 204s-1 may be tilted at a first tilting angle with respect to the main surface 202s, the second side surface 204s-2 may be tilted at a second tilting angle with respect to the main surface 202s, the third side surface 204s-3 may be tilted at a third tilting angle with respect to the main surface 202s, etc. A difference between the tilting angles of adjacent segments (e.g., between the first tilting angle and the second tilting angle) may be selected in accordance with a desired difference in the deflection angles, e.g. may be in the range from 0.1° to 10°, for example in the range from 0.25° to 5°, for example 1.5°, for example 3°.
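Purely as an illustration of such a layout, the sketch below lists per-segment tilting angles with a constant step between adjacent segments; only the 3° step is taken from the example values above, and the 45° base tilt is a hypothetical placeholder rather than a value from this description.

```python
# Hedged sketch: per-segment tilting angles with a constant step between
# adjacent segments. Only the step (e.g., 3°) is an example value from the
# text; the base tilt of 45° is a hypothetical placeholder for illustration.

NUM_SEGMENTS = 8
STEP_DEG = 3.0          # difference between tilting angles of adjacent segments
BASE_TILT_DEG = 45.0    # hypothetical tilt of the first side surface

tilting_angles_deg = [BASE_TILT_DEG + i * STEP_DEG for i in range(NUM_SEGMENTS)]
print(tilting_angles_deg)  # [45.0, 48.0, 51.0, 54.0, 57.0, 60.0, 63.0, 66.0]
```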
In some aspects, a tilting angle of a segment (and the associated deflection angle) may be described relative to an angle of incidence of the light onto the segment (e.g., light as emitted by a light source 206, for example configured as the light source 102). In case light impinges onto a segment at a certain incidence angle (e.g., 6°), the deflection angle provided by a segment may be expressed in terms of a variation of such incidence angle, e.g. in terms of a difference between the first component of the incidence angle in the first direction 152 and the first component of the emission angle in the first direction 152. The variation may be, for example, in the range from -10° to +10°, for example in the range from -5° to +5°, for example may be -1.5° or +1.5°.
In some aspects, a segment may include a concave surface with respect to the main surface 202s of the optical component 200. A concave surface of a segment may be understood, in some aspects, as the surface of the segment having concave character in at least one direction, e.g. in at least the horizontal direction. In the other direction (e.g., in the vertical direction), the surface of a segment may be substantially planar. The first segment (the first side surface 204s-1) may have a first concave surface, the second segment (the second side surface 204s-2) may have a second concave surface, the third segment (the third side surface 204s-3) may have a third concave surface, etc. A concave surface may have a radius of curvature in the range from 5 mm to 80 mm, for example in the range from 15 mm to 60 mm, for example in the range from 20 mm to 40 mm. The concave shape of a side surface may be adapted to provide the desired deflection angle associated with the respective segment. In relation to the shape of a segment see also FIG. 2D and FIG. 2E. In some aspects, the radius of curvature may not be uniform, e.g. may vary along the width of the surface, for providing the desired (linear) relationship between the light deflected by a segment and the light received at the associated receive optical element.
As described in relation to FIG. IB, the continuous movement of an optical component may include a continuous circular movement around an axis of the optical component. The axis around which the continuous rotation occurs may be an axis perpendicular to the main surface of the optical component. In the exemplary configuration shown in FIG. 2B, the continuous rotation of the optical component 200 may be a rotation around the axis 208 perpendicular to the main surface 202s of the optical component 200.
The configuration of an optical component, and the associated continuous movement, may also differ from the disc-shaped configuration and the continuous rotation described in FIG. 2A and FIG. 2B. As another example, an optical component may include a band-like structure, and the plurality of segments may be disposed as a plurality of stripes on the band-like structure. The band-like structure may be mounted on a frame including one or more rollers that enable the band-like structure to be moved continuously (e.g., back and forth or revolving around the rollers). Illustratively, the optical component may be configured as a conveyor belt, continuously moving around the rollers and providing a linear translation of the segments.

An operation of the optical component 200 is illustrated in FIG. 2C to FIG. 2H, which show a series of graphs 210-1, 210-2, 210-3, 210-4, 210-5, 210-6 illustrating a deflection of light provided by two different segments of the optical component 200.
The first graph 210-1, the second graph 210-2, and the third graph 210-3 illustrate the deflection of light provided by one segment (e.g., the first segment 204s-1) of the optical component 200. As shown in the associated graphs 220-1, 220-2, 220-3, 222-1, 222-2, 222-3, a coordinate of the deflected light in a first (e.g., vertical) direction may remain constant during the period in which the light impinges onto the first segment (see the graphs 222-1, 222-2, 222-3), while a coordinate in a second (e.g., horizontal) direction may vary along the extension of the field of view following the movement of the impinging location of the light on the first segment (see the graphs 220-1, 220-2, 220-3, showing the light emitted towards a left side, towards the center, and towards a right side of the field of view, respectively).
The fourth graph 210-4, the fifth graph 210-5, and the sixth graph 210-6 illustrate the deflection of light provided by another segment (e.g., the second segment 204s-2) of the optical component 200. As shown in the associated graphs 220-4, 220-5, 220-6, 222-4, 222-5, 222-6, a coordinate of the deflected light in a first (e.g., vertical) direction may remain constant during the period in which the light impinges onto the second segment (see the graphs 222-4, 222-5, 222-6), and may be less than the coordinate associated with the first segment 204s-1 in this example, while a coordinate in a second (e.g., horizontal) direction may vary along the extension of the field of view following the movement of the impinging location of the light on the second segment (see the graphs 220-4, 220-5, 220-6, showing the light emitted towards a left side, towards the center, and towards a right side of the field of view, respectively). The coordinate in the first direction provided by the second segment differs from the coordinate in the first direction provided by the first segment, while the respective coordinate in the second direction varies in a same manner for the first segment and the second segment.
FIG. 2I and FIG. 2J illustrate the optical component 200 in a schematic view in accordance with various aspects. The representation in FIG. 2I and FIG. 2J may illustrate possible considerations for the dimensioning and the shaping of the segments of the optical component 200. The representations in FIG. 2I and FIG. 2J illustrate the impinging of light onto the optical component 200 from different points of view (e.g., from the top, in FIG. 2I, and from the side, in FIG. 2J).
In case of a disc-shaped optical component 200, the cutout shape of the segments may be adapted to the linear relationship between the angular rotation of the segment disk and the light deflection angle.
In the representation in FIG. 2I and FIG. 2J it is assumed that the light source 206 includes a laser bar (e.g., with four laser diodes, e.g. a 4-channel laser, having a certain height HL and a certain width WL), providing an angle 214 at the output of the laser bar of 25° (e.g., an angle in the horizontal direction, θH). The laser light may be collimated onto the optical component 200 by means of a fast axis collimator 216 (FAC) and a slow axis collimator 218 (SAC), as commonly known in the art. The laser light may be collimated to provide a resolution in the horizontal direction of 0.2° (HRes), only as a numerical example. The angle in the vertical direction, θV, may be left unaltered.
A segment of the optical component may be dimensioned such that the collimated light may fall within the extension of the segment in the first (e.g., vertical) direction, e.g. within the height of the segment. The deflection provided by the segment may add the corresponding deflection angle to, or subtract it from, the angle at which the light impinges onto the segment. In the configuration in FIG. 2D and FIG. 2E, as an example, the segment may provide a deflection angle of 0°, leaving the output angle at the 6° provided by the orientation of the light source 206 relative to the segment.

FIG. 2K and FIG. 2L each shows a practical realization 250a, 250b of an optical component in a schematic view according to various aspects. The practical realizations 250a, 250b may be practical implementations of the optical component 200 described in FIG. 2A to FIG. 2J.
As shown by the practical realizations 250a, 250b in FIG. 2K and FIG. 2L, an optical component may be implemented, in some aspects, as a hard disk drive including a rotating (in other words, spinning) disk onto which light emitted by a light source (e.g., by the light source 206) may be directed. As an example, an optical component may be implemented as an automotive and industrial 2.5 inch hard drive or 3.5 inch hard drive.
Several advantages may be associated with this practical implementation: it is a sophisticated technology with high precision in mass production; it is a technology available for automotive applications; the same motors (used for driving the optical component) are already used in LIDAR systems; a hard disk drive-like component may have vibration resistance up to approximately 5 G and shock resistance up to 300 G; this technology may have a high mean time between failures (MTBF), e.g. 750000 h or greater; and with this technology fast rotation may be provided, e.g. a maximum rotation speed of around 15000 rpm.
Other aspects associated with an optical system (e.g., with the optical system 101) will be described in further detail below, in relation to FIG. 3A to FIG. 3D.
FIG. 3A, FIG. 3B, FIG. 3C, and FIG. 3D each shows a respective optical system 300a, 300b, 300c, 300d in a schematic view according to various aspects. These optical systems 300a, 300b, 300c, 300d may each be an exemplary implementation of the optical system 101 described in relation to FIG. 1A to FIG. 1I (e.g., the LIDAR system 100 may include an optical system 300a, 300b, 300c, 300d). It is understood that the aspects described above in relation to the optical system 101 may apply to the optical systems 300a, 300b, 300c, 300d shown in FIG. 3A to FIG. 3D, and vice versa. The optical system 300b in FIG. 3B may correspond to the optical system 300a in FIG. 3A, represented to illustrate in more detail possible aspects associated with an optical system 300a, 300b, 300c, 300d.
The optical systems 300a, 300b, 300c, 300d may include a light source 302a, 302b, 302c, 302d, e.g. a light source 302a, 302b, 302c, 302d configured as the light source 102 described in relation to FIG. 1B. The optical systems 300a, 300b, 300c, 300d may include an optical component 304a, 304b, 304c, 304d, e.g. an optical component 304a, 304b, 304c, 304d configured as the optical component 104, 200 described in relation to FIG. 1B to FIG. 2L. The optical component 304a, 304b, 304c, 304d may include a plurality of segments 306a, 306b, 306c, 306d configured to deflect the light emitted by the light source 302a, 302b, 302c, 302d towards a field of view 308a, 308b, 308c, 308d of the optical systems 300a, 300b, 300c, 300d, as described above in relation to FIG. 1A to FIG. 2L (e.g., towards a field of view 308a, 308b, 308c, 308d of the LIDAR system). In the exemplary configuration shown in FIG. 3A to FIG. 3D the plurality of segments 306a, 306b, 306c, 306d may be disposed along a side surface of the optical component 304a, 304b, 304c, 304d. The continuous movement of the optical component 304a, 304b, 304c, 304d may be controlled by a controller (not shown), e.g. configured as the controller 112 described in relation to FIG. 1B, and may be driven by a motor (not shown) as described in relation to FIG. 1B (e.g., a spindle motor, or a servo motor).
In the exemplary configurations of the optical systems 300a, 300b, 300c, 300d shown in FIG. 3A to FIG. 3D, the continuous movement of the optical component 304a, 304b, 304c, 304d may be a continuous circular movement around an axis 310a, 310b, 310c, 310d of the optical component 304a, 304b, 304c, 304d.
In some aspects, the light source 302a, 302b, 302c, 302d may be understood as including transmission optics. Illustratively, in some aspects, the optical systems 300a, 300b, 300c, 300d may include a transmission optics arrangement configured to direct (e.g., to steer) the light emitted by the light source 302a, 302b, 302c, 302d towards the optical component 304a, 304b, 304c, 304d (illustratively, towards the segments 306a, 306b, 306c, 306d). The transmission optics arrangement may be configured to collimate the light emitted by the light source 302a, 302b, 302c, 302d onto the optical component 304a, 304b, 304c, 304d. In some aspects, the transmission optics arrangement may be configured to mix light coming from a plurality of light sources onto the optical component 304a, 304b, 304c, 304d. In some aspects, the transmission optics arrangement may be configured to perform a geometric shape transformation of the light. The transmission optics arrangement may include one or more optical elements, e.g. one or more lenses, such as one or more cylinder lenses, one or more mirrors, and the like. As an example, in case the light source 302a, 302b, 302c, 302d includes a laser bar, the transmission optics arrangement may include a fast-axis collimator lens and a slow-axis collimator lens (see also FIG. 2I and FIG. 2J) configured to collimate the light emitted by the laser bar onto the optical component 304a, 304b, 304c, 304d. As a further example, as shown in FIG. 3A to FIG. 3C, the transmission optics arrangement may include a mirror 305a, 305b, 305c (a deflection mirror) for directing the light emitted by the light source 302a, 302b, 302c, 302d towards the optical component 304a, 304b, 304c.
In some aspects, the optical system 300a, 300b, 300c, 300d may also be configured for detecting light from the field of view 308a, 308b, 308c, 308d, e.g. for detecting the direct reflection of emitted light originating from one or more objects 312a, 312b, 312c in the field of view 308a, 308b, 308c, 308d (represented as a person for illustrative purposes in FIG. 3A to FIG. 3C) , e.g. for detecting light from the portions of the field of view 308a, 308b, 308c, 308d into which light was (or is being) emitted. The optical system 300a, 300b, 300c, 300d may include a detector 314a, 314b, 314c, 314d, e.g. configured as the detector 126 described in relation to FIG. 1H, for example a detector including an avalanche photo diode or a plurality of avalanche photo diodes. The use of an optical component 304a, 304b, 304c, 304d as described herein may provide a rapid scanning of the field of view 308a, 308b, 308c, 308d. Considering, as a numerical example, a field of view 308a, 308b, 308c, 308d of 120° x 24° (e.g., 504 pixel x 128 pixel) , a rotation at 75 rps (4500 rpm) , and a repetition rate of the light source 302a, 302b, 302c, 302d (e.g., a laser repetition rate) of 302 kHz (at 10 ns pulse, fmax=310 kHz, duty cycle = 0.05%, four laser diodes) , it may be possible to obtain three accumulations over the entire field of view with a frame rate of 25 Hz (illustratively, during acquisition of a frame the field of view may be scanned three times by means of the light directed thereto by the optical component 304a, 304b, 304c, 304d) .
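The consistency of these example numbers can be checked with a short sketch; it assumes, as described above for the line-by-line scan, that one rotation of the optical component corresponds to one complete scan of the field of view. All values are the examples quoted in the text.

```python
# Sketch verifying the numerical example above: a 120° x 24° field of view
# (504 x 128 pixels), 16 vertically stacked pixels illuminated per laser
# shot (hence 8 segments), a rotation at 75 rps and a 25 Hz frame rate.

h_pixels = 504                           # horizontal pixels (120°)
v_pixels = 128                           # vertical pixels (24°)
pixels_per_shot = 16                     # illuminated simultaneously per laser shot
segments = v_pixels // pixels_per_shot   # 8 segments of the optical component

rotation_rps = 75                        # 4500 rpm
frame_rate_hz = 25

shots_per_scan = h_pixels * segments                     # 4032 shots for one full scan
required_rep_rate_hz = shots_per_scan * rotation_rps     # 302400 ≈ 302 kHz
accumulations_per_frame = rotation_rps / frame_rate_hz   # 3 scans accumulated per frame

print(required_rep_rate_hz, accumulations_per_frame)     # 302400 3.0
```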
The optical component 304a, 304b, 304c, 304d may be configured to improve the light detection, as described above in relation to FIG. 1H and FIG. II. The optical component 304a, 304b, 304c, 304d may include a plurality of receive optical elements 316a, 316b, 316c, 316d, each associated with a respective segment 306a, 306b, 306c, 306d and configured such that the direct reflection of the light deflected by the associated segment may be received at the detector 314a, 314b, 314c, 314d, e.g. each configured such that the light coming from the portion of the field of view 308a, 308b, 308c, 308d associated with the respective segment may be received at the detector 314a, 314b, 314c, 314d (and light coming from other portions may not) .
In the configuration shown in FIG. 3A and FIG. 3B, the receive optical elements 316a, 316b may be configured to reflect the light received from the field of view 308a, 308b towards the detector 314a, 314b. In the configuration shown in FIG. 3C and FIG. 3D, the receive optical elements 316c, 316d may be configured to allow a transmission of the light received from the field of view 308c, 308d through the receive optical element 316c, 316d towards the detector 314c, 314d. Illustratively, the configuration of the receive optical elements 316a, 316b, 316c, 316d (e.g., the positional or angular relationship with the associated segment) may be in accordance with a configuration of the optical system 300a, 300b, 300c, 300d, e.g. with an arrangement of the detector 314a, 314b, 314c, 314d in the optical system 300a, 300b, 300c, 300d.
The receive optical elements 316a, 316b, 316c, 316d may be disposed on a main surface of the optical component 304a, 304b, 304c, 304d. As described in relation to FIG. 1H, the arrangement of the receive optical elements 316a, 316b, 316c, 316d may be in accordance with the arrangement of the associated segment, and with the overall configuration of the optical system 300a, 300b, 300c, 300d. As an example, as shown in FIG. 3A to FIG. 3C, a receive optical element 316a, 316b, 316c may be disposed at the opposite side of the optical component 304a, 304b, 304c with respect to the associated segment 306a, 306b, 306c. As another example, as shown in FIG. 3D, a receive optical element 316d may be disposed at a same side of the optical component 304d as the associated segment 306d. Additional examples will be described in further detail below.
In some aspects, the optical system 300a, 300b, 300c, 300d may include a (first) receive optics arrangement 318a, 318b, 318c, 318d configured to receive light from the field of view 308a, 308b, 308c, 308d of the optical system 300a, 300b, 300c, 300d and to direct the received light towards the optical component 304a, 304b, 304c, 304d. The receive optics arrangement 318a, 318b, 318c, 318d may include one or more optical elements (e.g., one or more lenses, such as one or more cylinder lenses) configured to collect light from the field of view 308a, 308b, 308c, 308d. As an example, the receive optics arrangement 318a, 318b, 318c, 318d may include at least one (first) lens having an optical aperture in the range from 100 mm2 to 3000 mm2, for example in the range from 200 mm2 to 400 mm2.
In some aspects, the first receive optics arrangement 318a, 318b, 318c, 318d may be configured to direct the received light towards the main surface of the optical component 304a, 304b, 304c, 304d, illustratively, towards the receive optical element 316a, 316b, 316c, 316d associated with the currently illuminated segment 306a, 306b, 306c, 306d deflecting the light towards the field of view 308a, 308b, 308c, 308d. In some aspects, the first receive optics arrangement 318a, 318b, 318c, 318d may be configured to image the field of view 308a, 308b, 308c, 308d onto the optical component 304a, 304b, 304c, 304d (onto its main surface). The image may be provided with respect to a predefined focal point. In some aspects, the first receive optics arrangement 318a, 318b, 318c, 318d may be configured to perform a geometrical transformation of the image of the field of view 308a, 308b, 308c, 308d provided onto the optical component 304a, 304b, 304c, 304d, e.g. to provide a shape of the field of view adapted to a shape of the optical component (e.g., to provide a trapezoidal representation of the field of view in case of a disc-shaped optical component, as an example).
The configuration of the first receive optics arrangement 318a, 318b, 318c, 318d may be adapted depending on the arrangement of the components of the optical system 300a, 300b, 300c, 300d. As shown, for example, in FIG. 3A to FIG. 3C, in some aspects, the first receive optics arrangement 318a, 318b, 318c may include a mirror 320a, 320b, 320c (a deflection mirror) configured to deflect the light received from the field of view 308a, 308b, 308c towards the optical component 304a, 304b, 304c (e.g., towards its main surface) .
In some aspects, the optical system 300a, 300b, 300c, 300d may include a (second) receive optics arrangement 322a, 322b, 322c, 322d configured to direct the light from the optical component 304a, 304b, 304c, 304d to the detector 314a, 314b, 314c, 314d. The (second) receive optics arrangement 322a, 322b, 322c, 322d may be configured to image onto the detector 314a, 314b, 314c, 314d the light transmitted or reflected by the receive optical element associated with the currently illuminated segment. The second receive optics arrangement 322a, 322b, 322c, 322d may include one or more optical elements (e.g., one or more lenses, such as one or more cylinder lenses, one or more mirrors, etc.) to direct (in some aspects, to focus) the light transmitted or reflected by the receive optical elements 316a, 316b, 316c, 316d onto the detector 314a, 314b, 314c, 314d.

In some aspects, the optical system 300a, 300b, 300c, 300d may include a position sensor 324a, 324b, 324c, 324d configured to provide position information (e.g., angle information, or angular information) associated with the optical component 304a, 304b, 304c, 304d. The position sensor 324a, 324b, 324c, 324d may be configured to determine a position of the optical component 304a, 304b, 304c, 304d during the continuous movement of the optical component 304a, 304b, 304c, 304d to identify the segment onto which the light emitted by the light source 302a, 302b, 302c, 302d is impinging. The position of the optical component 304a, 304b, 304c, 304d during the continuous movement may be understood, in some aspects, as an angular position of the optical component 304a, 304b, 304c, 304d with respect to a reference point (e.g., as an angular displacement with respect to a reference point, e.g. with respect to a starting position). Illustratively, in some aspects, the position sensor 324a, 324b, 324c, 324d may be configured to determine an angular position of the optical component 304a, 304b, 304c, 304d during a continuous circular movement of the optical component 304a, 304b, 304c, 304d. In some aspects, the position sensor 324a, 324b, 324c, 324d may be a passive device.
In some aspects, the position sensor 324a, 324b, 324c, 324d may include a light source (e.g., a light emitting diode) configured to illuminate the optical component 304a, 304b, 304c, 304d for determining the position (e.g., the angular position) of the optical component 304a, 304b, 304c, 304d.
The information provided by the position sensor 324a, 324b, 324c, 324d may be used to assign a location in the field of view 308a, 308b, 308c, 308d (coordinates in the field of view) to the light received at the detector 314a, 314b, 314c, 314d.
Illustratively, the information provided by the position sensor 324a, 324b, 324c, 324d may be used to determine the position towards which the emitted light was directed (to determine the illuminated portion of the field of view 308a, 308b, 308c, 308d), and thus to determine the position from which the direct reflection associated therewith originated.

In some aspects, the optical system 300a, 300b, 300c, 300d may include one or more processors 326d (shown in FIG. 3D; it is understood that the optical systems 300a, 300b, 300c may also include respective one or more processors). The one or more processors 326d may be configured to process position information provided by the position sensor 324a, 324b, 324c, 324d. The one or more processors 326d may be configured to assign a location in the field of view 308a, 308b, 308c, 308d of the optical system 300a, 300b, 300c, 300d to light received at the optical system 300a, 300b, 300c, 300d in accordance with the position information provided by the position sensor 324a, 324b, 324c, 324d.
In some aspects, the position information provided by the position sensor 324a, 324b, 324c, 324d may be used to control an emission of light by the light source 302a, 302b, 302c, 302d.
The one or more processors 326d may be configured to control an emission of light from the light source 302a, 302b, 302c, 302d in accordance (e.g., in synchronization) with the position of the optical component as determined by the position information. The one or more processors 326d may be configured to control the light source 302a, 302b, 302c, 302d to start emitting light at a defined angular position of the optical component 304a, 304b, 304c, 304d, in accordance with the position information.
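Only as a hedged sketch of such a synchronization, the snippet below triggers emission whenever the position information indicates that a new segment moves into the beam; the segment count, the trigger criterion, and the helper names are illustrative assumptions, not elements of the described system.

```python
# Hedged sketch: start light emission when the position information from the
# position sensor indicates that a defined angular position (here, the start
# of a segment) has been reached. Segment count and trigger criterion are
# illustrative assumptions; the function names are hypothetical placeholders.

NUM_SEGMENTS = 8
SEGMENT_ARC_DEG = 360.0 / NUM_SEGMENTS

def segment_at(angular_position_deg: float) -> int:
    """Identify the segment onto which the emitted light currently impinges."""
    return int((angular_position_deg % 360.0) // SEGMENT_ARC_DEG)

def on_position_update(angular_position_deg: float, last_segment: int, start_emission) -> int:
    """Start emission whenever the rotation brings a new segment into the beam.

    `start_emission` stands in for whatever driver call actually triggers the
    light source; it is a placeholder, not an API from this description."""
    current = segment_at(angular_position_deg)
    if current != last_segment:
        start_emission(current)
    return current

# Illustrative usage with a dummy trigger:
last = -1
for pos in (0.0, 10.0, 46.0, 95.0):
    last = on_position_update(pos, last, lambda s: print(f"start emission, segment {s}"))
```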
The one or more processors 326d may include, for example, a central processing unit 328d (CPU) and a system-on-chip 330d (SOC). The system-on-chip 330d may include a LIDAR engine (e.g., an application-specific integrated circuit, a microcontroller, and the like), for example a 2D/3D LIDAR data acquisition and processing system-on-chip, such as a LCA3 LeddarCore, for example in case the optical system 300a, 300b, 300c, 300d is part of a LIDAR system.
FIG. 3E illustrates an operation of a LIDAR engine 334 in relation to an oscillation of a MEMS mirror in a usual LIDAR system, as shown in the graph 336. Parameters associated with the use of a LIDAR engine 334 in a usual LIDAR system may illustrate the advantages in terms of speed of acquisition provided by the solution described herein. In case a MEMS mirror in combination with a LCPG is used in a LIDAR system for controlling the emission direction of the light, the acquisition sequence of photo diodes is always automatic due to the synchronization required with the mirror's oscillation (see graph 336, with the oscillation of the MEMS mirror represented in relation to the oscillation angle θ_os). The LIDAR engine 334 supports a MEMS mirror with an oscillation frequency of 1 kHz to 6 kHz. Assuming that from a 60° mirror tilt 40° are used (~66%), which are transformed to a horizontal tile angle of 7.5°, one mirror period (f_MEMS = 2.1 kHz) is needed to pick up one tile of the LCPG (2 x T_halfscan = 1/f_MEMS ≈ 0.48 ms). For a field of view of 120° x 24°, 56 tiles of a LCPG are required, leading to a time of t_120°x24° ≈ 26.9 ms (even without taking into account the additional loss of time for switching the tiles).
By using the strategy described herein, assuming for example 8 segments and 250 rps for a continuous rotation of the optical component 304a, 304b, 304c, 304d, the entire field of view (assuming 120° x 24°) may be covered in one rotation, leading to a time of t_120°x24° ≈ 4 ms. A rotation at 250 rps may correspond to a MEMS mirror with oscillation at 1 kHz. Thus, a deflection of light by implementing the strategy described herein may be about 7 times more effective (7 times faster) than deflection with a 2.1 kHz MEMS mirror and LCPG.
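The timing figures above may be reproduced with a short calculation (a sketch only; the parameter values are the ones quoted in the comparison above):

```python
# Reproduce the acquisition-time comparison above (values quoted in the text).
f_mems_hz = 2100                       # MEMS mirror oscillation frequency
t_tile_s = 1.0 / f_mems_hz             # one mirror period per LCPG tile (~0.48 ms)
n_tiles = 56                           # LCPG tiles for a 120° x 24° field of view
t_mems_lcpg_s = n_tiles * t_tile_s     # ~26.7 ms (the text rounds the period to
                                       # 0.48 ms, which gives ~26.9 ms)

rotation_rps = 250                     # continuous rotation of the optical component
t_disk_s = 1.0 / rotation_rps          # 4 ms for one full rotation (whole field of view)

print(f"MEMS+LCPG: {t_mems_lcpg_s * 1e3:.1f} ms, "
      f"rotating disk: {t_disk_s * 1e3:.1f} ms, "
      f"speed-up: {t_mems_lcpg_s / t_disk_s:.1f}x")
```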
The position information may also be used for increasing the tolerance of the optical system 300a, 300b, 300c, 300d to oscillations and deflections of the optical component 304a, 304b, 304c, 304d during operation, i.e. to uncontrolled variations in the position of the optical component 304a, 304b, 304c, 304d (e.g., an unwanted tilt) , for example due to vibrations, impacts, etc. Variations in the vertical direction may have no influence as long as the light emitted by the light source 302a, 302b, 302c, 302d hits the corresponding segment. Variations in the plane of the main surface of the optical component 304a, 304b, 304c, 304d may change the deflection angle in the horizontal and vertical direction. By using the information provided by the position sensor 324a, 324b, 324c, 324d a correction of the point assignment in the point cloud may be performed by the one or more processors 326d (e.g., by the CPU 328d) . Even with strong vibrations or shocks, the use of the hard disk drive concept described herein may not result in a strong deflection of the segment disk in the x, y, z directions.
As shown in the inset 332b in FIG. 3B, the position information provided by a position sensor 324a, 324b, 324c, 324d may be used to adjust a tilt of the optical component 304a, 304b, 304c, 304d. Illustratively, the optical component 304a, 304b, 304c, 304d may be mounted on a support configured to allow for an adjustment of the position and the rotation of the optical component 304a, 304b, 304c, 304d.
As shown in FIG. 3B, an automatic correction of uncontrolled variations in the position of the optical component 304b may be provided by using the imaging principle (e.g., considering the opposite movement of the segments 306b with respect to the associated receive optical elements 316b) .
The light detection at an optical system (e.g., at the optical system 101, 300a, 300b, 300c, 300d) , and the influence of an optical component (104, 200, 304a, 304b, 304c, 304d) on the detection will be described in further detail below.
FIG. 4A shows a receiver side 400 of an optical system in a schematic view according to various aspects. The receiver side 400 may be the receiver side of an optical system 101, 300a, 300b, 300c, 300d as described above.
The receiver side 400 may be configured to image the field of view 402 of the optical system. As described in relation to FIG. 3A to FIG. 3D, the receiver side 400 of an optical system may include a first receive optics arrangement 404 (e.g., configured as the first receive optics arrangement 318a, 318b, 318c, 318d described in relation to FIG. 3A to FIG. 3D), an optical component 406 (e.g., configured as the optical component 104, 200, 304a, 304b, 304c, 304d described in relation to FIG. 1B to FIG. 1I, FIG. 2A to FIG. 2G, and FIG. 3A to FIG. 3D), a second receive optics arrangement 408 (e.g., configured as the second receive optics arrangement 322a, 322b, 322c, 322d described in relation to FIG. 3A to FIG. 3D), and a detector 410 (e.g., configured as the detector 126, 314a, 314b, 314c, 314d described in relation to FIG. 1H, and FIG. 3A to FIG. 3D).
The first receive optics arrangement 404 may be configured to image the field of view 402 onto the optical component 406 (e.g., onto a disc-shaped component). FIG. 4B illustrates the image 412 of the field of view 402 provided on the optical component 406. In some aspects, the first receive optics arrangement 404 may be configured to apply a geometric transformation to provide a trapezoidal image of the field of view 402 onto the optical component 406, illustratively a tapered image having a first width at a first side 414-1 (e.g., a width in the range from 20 mm to 30 mm, for example 26 mm), and a second width at a second side 414-2 (e.g., a width in the range from 30 mm to 40 mm, for example 37 mm). The trapezoidal image 412 may have a height in the range from 10 mm to 20 mm, for example 13 mm. The geometric transformation may ensure that the entire field of view 402 is imaged onto the optical component 406. It is understood that these values are only an example, and may be adapted depending on a desired configuration of the optical component 406 and of the receive optical elements. FIG. 4D illustrates a top view of the optical component 406 on which the image 412 of the field of view 402 is formed.
The image 412 may have, illustratively, a number of pixels defined by the properties of the optical element 406 and of the light source of the optical system, as described in relation to FIG. 1H. The number of pixels in the vertical direction may be defined by the angular range covered by the individual segments of the optical element 406 and by the number of light sources. For example, the image 412 may include 128 pixels in the vertical direction. The number of pixels in the horizontal direction may be defined by the angular range covered by the continuous movement of the optical element 406. For example, the image 412 may include 504 pixels in the horizontal direction.
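A short sketch of the pixel bookkeeping behind these example figures is given below; the relationships used are assumptions for illustration, as the text itself only states the resulting numbers.

```python
# Sketch of the pixel bookkeeping for the image 412 (assumed relationships; the
# text only gives the resulting pixel counts as examples).
n_segments = 8                  # assumed number of segments of the optical component
v_pixels_per_segment = 16       # assumed vertical pixels illuminated per laser shot
v_pixels = n_segments * v_pixels_per_segment      # -> 128 vertical pixels

h_fov_deg = 120.0               # assumed horizontal field of view
h_pixels = 504                  # horizontal pixels from the example above
h_resolution_deg = h_fov_deg / h_pixels           # -> ~0.24° per horizontal pixel

print(v_pixels, round(h_resolution_deg, 3))
```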
The second receive optics arrangement 408 may be configured to image onto the detector 410 the light downstream of the optical component 406 (e.g., to image onto the detector 410 the light not blocked by the optical component 406). For the purpose of illustration in FIG. 4D (and similarly in FIG. 5A to FIG. 5L described below), the detector 410 is shown as visible from this top view, even though the detector 410 is located underneath the optical element 406. The receive optical elements serve the purpose that only a portion of the detector 410 is illuminated by the image 412 (illustratively, the portion underneath the receive optical elements 420, e.g. the portion underneath the slits 420 in FIG. 4D).
FIG. 4C illustrates an exemplary configuration for the detector 410, which may include a plurality of photo diodes 416 (e.g., 128 photo diodes, illustratively 128 pixels each associated with a respective photo diode), e.g. 128 avalanche photo diodes. The photo diodes may be arranged to form a column array, as an example. Pairs of individual photo diodes may be connected in parallel (e.g., 1 with 17, 2 with 18, etc.).
As a numerical example, a total height of the detector 410 may be in the range from 5000 µm to 25000 µm, for example 20450 µm, and a width of the detector 410 may be in the range from 5000 µm to 50000 µm, for example 37000 µm. As another numerical example, the height of a photo diode 416 may be in the range from 100 µm to 200 µm, for example 120 µm. A spacing between adjacent photo diodes 416 may be in the range from 10 µm to 60 µm, for example 40 µm.
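As an illustration of the pairing and of the example dimensions above, the following sketch lists the stated parallel pairs and adds up the detector height from the example pitch values; the extension of the pairing scheme to all 128 photo diodes and the constants below are assumptions.

```python
# Illustrative sketch: parallel pairing of photo diodes ("1 with 17, 2 with 18, ...")
# and a rough height estimate from the example dimensions above. The extension of
# the pairing to all diodes is an assumption for illustration only.
N_DIODES = 128
PAIR_OFFSET = 16

pairs = [(i, i + PAIR_OFFSET) for i in range(1, PAIR_OFFSET + 1)]
# -> [(1, 17), (2, 18), ..., (16, 32)]

diode_height_um = 120     # example photo-diode height
spacing_um = 40           # example spacing between adjacent photo diodes
total_height_um = N_DIODES * diode_height_um + (N_DIODES - 1) * spacing_um
print(pairs[:2], total_height_um)   # -> 20440 µm, close to the 20450 µm example
```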
The optical aperture of the first receive optics arrangement 404 may be adapted in accordance with an optical aperture of the detector 410. As an example, the first receive optics arrangement 404 may have an optical aperture of about 294 mm2 (smaller compared to the optical aperture of the optics arrangement used in a usual LIDAR system), and the detector may have an optical aperture (width x height) of about 700 mm2.
In some aspects, as shown in the inset 418 in FIG. 4A, the receive optical elements 420 of the optical component 406 may have a shape adapted to the image of the field of view 402 provided by the first receive optics arrangement 404, e.g. a tapered shape (for example, with a greater width towards the edge of a disc-shaped optical element, and a narrower width towards the center of the disc-shaped optical element).
In some aspects, the number of illuminated pixels may be selected in accordance with the processing capabilities of the optical system. As an example, four laser diodes may illuminate 16 vertical pixels simultaneously, as described in relation to FIG. 1B to FIG. 1I. In some aspects, a system-on-chip of the optical system may only multiplex 4 x 16 photo diodes.
FIG. 5A to FIG. 5L illustrate an imaging process including an optical component 500, in which an image 502 of the field of view is formed on the optical component (including 504 x 16 pixels) . For the purpose of illustration in FIG. 5A to FIG. 5L, a detector 510 is shown as visible from this top view, even though the detector 510 is located underneath the optical element 500. The receive optical elements (e.g., the slits in FIG. 5A to FIG. 5L) serve the purpose that only a portion of the detector 510 is illuminated by the image 502 (illustratively, the portion underneath the receive optical element onto which the field of view is imaged) .
FIG. 5A to FIG. 5F illustrate that during a first portion of the continuous rotation of the optical component 500, light 504 (e.g., a laser beam) impinges onto a first segment 506-1, and the associated receive optical element 508-1 provides that only the relevant portion of the image 502 of the field of view is provided to the detector 510. Illustratively, FIG. 5A to FIG. 5F are associated with successive time points within the first portion of the continuous rotation of the optical component 500. FIG. 5G to FIG. 5L illustrate that during a second portion of the continuous rotation of the optical component 500, the light 504 impinges onto a second segment 506-2, and the associated receive optical element 508-2 provides that only the relevant portion of the image 502 of the field of view is provided to the detector 510. Illustratively, FIG. 5G to FIG. 5L are associated with successive time points within the second portion of the continuous rotation of the optical component 500.
FIG. 6 shows a detector 600 and an optical component 602 in a schematic view according to various aspects. The optical component 602 may be a further example of the optical component 104, 200, 304a, 304b, 304c, 304d described in relation to FIG. 1B to FIG. 1I, FIG. 2A to FIG. 2G, and FIG. 3A to FIG. 3D.
In some aspects, a receive optical element may be configured to cover the unused photo diodes of a detector. Illustratively, based on the processing capabilities of an optical system, pairs of photo diodes 604 of a detector 600 may be connected in parallel with one another.
The receive optical elements of the optical component 602, shown in the inset 606, may each be assigned to a respective subset of photo diodes 604. For example, a first receive optical element 608-1 may be assigned to a first subset of photo diodes (e.g., 0 to 17), a second receive optical element 608-2 may be assigned to a second subset of photo diodes (e.g., 16 to 33), etc. Illustratively, each receive optical element may transmit or reflect light towards the assigned photo diodes, while leaving the remaining photo diodes covered by the body of the optical component 602.
FIG. 7 shows an optical component 700 in a schematic view according to various aspects. The optical component 700 may be a further example of the optical component 104, 200, 304a, 304b, 304c, 304d, 602 described in relation to FIG. 1B to FIG. 1I, FIG. 2A to FIG. 2G, FIG. 3A to FIG. 3D, and FIG. 6. In some aspects, the receive optical elements 702-1, 702-2, 702-3, 702-4 may be disposed tilted with respect to one another. An image 704 of the field of view may be formed on the optical component 700, for example compressed in the horizontal direction and elongated in the vertical direction by the first receive optics arrangement. As an example, the image 704 of the field of view may include 504 x 128 pixels.
The orientation of the receive optical elements 702-1, 702-2, 702-3, 702-4 relative to one another may be adapted in accordance with the formed image, and with the configuration of a detector.
FIG. 8 shows a graph 800 providing a comparison of the beam steering strategy described herein (A1) with a beam steering strategy implemented in a usual long range LIDAR system.
As shown in the graph 800, a higher range may be provided even with the beam steering and tracking strategy described above. Illustratively, as shown in the graph 800, a higher range may be provided by means of the optical component described herein compared to beam steering with a LCPG (and a MEMS mirror) as used in a usual LIDAR system. As a numerical example, the range of detection of a LIDAR system configured according to the strategy described herein may be greater than 100 m, for example greater than 130 m, for example may be about 180 m. The range provided by the strategy described herein may be indicated by the crossing point of the line 802 (denoted with A1) in the graph 800 with the dotted line 804, and the range provided by a usual long range LIDAR system may be indicated by the crossing point of the line 806 in the graph 800 with the dotted line 808. The graph 800 may be provided using formulas known in the art for calculating the power per pixel versus object distance, the shot noise of an avalanche photo diode, a total noise, noise accumulation, and the minimum power on a detector pixel for detection.
In various aspects, an optical system is provided which may offer advantages with respect to beam-steering solutions implemented in a usual LIDAR system. A single (horizontal) rotating disk may be used to scan both the horizontal and the vertical axis. The horizontal and vertical deflection in the transmission path, and also the limitation (tracking) of the illuminated angular range in the receiver path, may be achieved by a single horizontally rotating disk, providing an effective increase in the signal-to-noise ratio. Complex, expensive and optically lossy MEMS mirrors and LCPG may be replaced by a single rotating disk using computer hard disk technology (with corresponding disk size and drive components). A laser beam guidance similar to the line method of a television picture tube may be realized, with simultaneous tracking of the irradiated angular section to increase the signal-to-noise ratio. The components of the optical system may operate together with LIDAR engines (like LCA3) and most of the system components of these technologies. A smaller system geometry/volume may be provided; for example, by eliminating the LCPG a smaller aperture of the receiver optics may be provided (approximately 7 times smaller aperture). The bill of material cost may be reduced. An unchanged or even extended range compared to a usual LIDAR system may be provided with an extended field of view (120° x 24°) and a constant frame rate of 25 Hz. Scanning of the entire field of view (120° x 24°) at 25 Hz may be provided with the same (or even greater) range over the entire area. An overall power consumption may be reduced.
In various aspects, (only) one single rotating disk may be used for horizontal and vertical laser beam steering in the transmission path and also for limiting the imaged area around the laser angle in the receiver path. This may provide a reduction of back light by limiting the area around the irradiated angle point. The disk may be divided in slices. Every slice area may include one transmission (TX) and one receive (RX) element. TX elements may be placed on the side (edge) and RX elements may be placed on a plane of the disc. A laser beam steering equivalent to a TV picture tube may be realized by the edge elements of the rotating disc. Every edge element may realize a beam steering over the entire horizontal field of view via the rotating motion. The vertical beam steering may be effected by different tilt angles of the edge segments. This may provide that every horizontal angle (within the field of view) may be possible. Different vertical angles may be provided in discrete steps. Every TX element may be assigned exactly to one RX element. The RX element may perform a tracking of the irradiated area by the rotation of the disk (tracking of the laser beam). For example, in the horizontal direction the areas left and right from the irradiated area are hidden, so the back light of these areas is not imaged on a detector (e.g., on a photo element). Downstream of the RX tracking segment, the remaining light (light reflected by an object) may be imaged onto a vertical photo element row. The number of photo elements may be determined by the vertical resolution. The number of slices of the disk may be determined by the angle of the vertical field of view irradiated by one laser shot, as illustrated in the sketch below. For example, 8 slices may be provided for an irradiated angle of 3° (1 shot) at a whole vertical angle of 24°.
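Purely as an illustrative sketch of this slice count (values taken from the example above):

```python
# Sketch: number of disk slices from the vertical field of view and the vertical
# angle irradiated by one laser shot (example values from the text above).
vertical_fov_deg = 24.0      # whole vertical angle of the field of view
angle_per_shot_deg = 3.0     # vertical angle irradiated by one laser shot
n_slices = int(vertical_fov_deg / angle_per_shot_deg)   # -> 8 slices
print(n_slices)
```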
In various aspects, a cost reduction for a LIDAR system may be provided by replacing expensive components with inexpensive components that may provide a same performance. A MEMS mirror may be replaced by a rotating disk with transmission elements. A LCPG may be replaced by the rotating disk providing different vertical deflections, in combination with associated tracking filters that track the scanned angular range to increase the signal-to-noise ratio.
In the following, various aspects of this disclosure will be illustrated.
Example 1 is a LIDAR system including an optical system, the optical system including : a light source ; an optical component including a first segment configured to deflect the light emitted by the light source towards a field of view of the LIDAR system in a first emission direction, and a second segment configured to deflect the light emitted by the light source towards the field of view of the LIDAR system in a second emission direction, and a controller configured to control a continuous movement of the optical component , wherein the optical component is configured in such a way that the light emitted by the light source impinges onto the first segment during a first portion of the continuous movement of the optical component , and that the light emitted by the light source impinges onto the second segment during a second portion of the continuous movement of the optical component .
As a numerical example , the optical component may include six segments , or eight segments , or ten segments .
In Example 2 , the subj ect-matter of example 1 may optionally further include that the first emission direction is at a first emission angle with respect to an optical axis of the optical system, the first emission angle including a first component along a first field of view direction and a second component along a second field of view direction, and that the first segment is configured such that the first component of the first emission angle remains constant during the first portion of the continuous movement of the optical component , and the first segment is configured such that the second component of the first emission angle varies during the first portion of the continuous movement of the optical component .
In Example 3, the subject-matter of example 2 may optionally further include that the optical component is configured in such a way that the light emitted by the light source impinges onto the first segment at a plurality of first impinging locations during the first portion of the continuous movement of the optical component, and that each first impinging location is associated with a respective second component of the first emission angle.
In Example 4, the subject-matter of any one of examples 1 to 3 may optionally further include that the second emission direction is at a second emission angle with respect to an optical axis of the optical system, the second emission angle including a third component along the first field of view direction and a fourth component along the second field of view direction, and that the second segment is configured such that the third component of the second emission angle remains constant during the second portion of the continuous movement of the optical component, and the second segment is configured such that the fourth component of the second emission angle varies during the second portion of the continuous movement of the optical component.
In Example 5, the subject-matter of example 4 may optionally further include that the optical component is configured in such a way that the light emitted by the light source impinges onto the second segment at a plurality of second impinging locations during the second portion of the continuous movement of the optical component, and that each second impinging location is associated with a respective fourth component of the second emission angle.
In Example 6, the subject-matter of any one of examples 1 to 5 may optionally further include that the optical component is configured in such a way that the impinging location of the light emitted by the light source onto the first segment moves along the first segment in a direction parallel to a second field of view direction during the first portion of the continuous movement of the optical component, and that the optical component is configured in such a way that the impinging location of the light emitted by the light source onto the second segment moves along the second segment in the direction parallel to the second field of view direction during the second portion of the continuous movement of the optical component.
In Example 7, the subject-matter of examples 2 and 4 may optionally further include that the first component of the first emission angle is different from the third component of the second emission angle, and that the second component of the first emission angle and the fourth component of the second emission angle vary in a same angular range during the respective portion of the continuous movement of the optical component.
In Example 8, the subject-matter of example 7 may optionally further include that the angular range is between -60° and +60° with respect to the optical axis of the optical system along the second field of view direction. In Example 9, the subject-matter of example 7 or 8 may optionally further include that an absolute value of a difference between the first component of the first emission angle and the third component of the second emission angle is in the range from 0.1° to 10°, for example in the range from 0.25° to 5°, for example 1.5°, for example 3°.
In Example 10, the subject-matter of any one of examples 2 to 9 may optionally further include that the first field of view direction and the second field of view direction are aligned at a defined angle with one another (e.g., an angle different from 0° and different from 180°).
In Example 11, the subject-matter of example 10 may optionally further include that the first field of view direction and the second field of view direction are perpendicular to one another. In some aspects, the first field of view direction is the vertical direction and the second field of view direction is the horizontal direction.
In Example 12, the subject-matter of any one of examples 1 to 11 may optionally further include that the continuous movement of the optical component includes a continuous circular movement.
In Example 13, the subject-matter of example 12 may optionally further include that the continuous circular movement includes a frequency of rotation in the range from 1 Hz to 400 Hz, for example in the range from 1 Hz to 300 Hz, for example in the range from 10 Hz to 250 Hz, for example a frequency of rotation equal to or greater than 75 Hz.
In Example 14, the subject-matter of any one of examples 1 to 13 may optionally further include that the optical component further includes a third segment configured to deflect the light emitted by the light source towards the field of view of the LIDAR system in a third emission direction, and that the optical component is configured in such a way that the light emitted by the light source impinges onto the third segment during a third portion of the continuous movement of the optical component .
In Example 15, the subject-matter of example 14 may optionally further include that the third emission direction is at a third emission angle with respect to the optical axis of the optical system, the third emission angle including a fifth component along the first field of view direction and a sixth component along the second field of view direction, and that the third segment is configured such that the fifth component of the third emission angle remains constant during the third portion of the continuous movement of the optical component, and the third segment is configured such that the sixth component of the third emission angle varies during the third portion of the continuous movement of the optical component.
In Example 16, the subject-matter of any one of examples 1 to 15 may optionally further include that the optical component includes a main surface and a plurality of side surfaces.
In Example 17, the subject-matter of example 16 may optionally further include that the first segment includes a first side surface of the plurality of side surfaces and the second segment includes a second side surface of the plurality of side surfaces.
In Example 18, the subject-matter of example 17 may optionally further include that the first side surface is tilted at a first tilting angle with respect to the main surface, and wherein the second side surface is tilted at a second tilting angle with respect to the main surface.
In Example 19, the subject-matter of any one of examples 16 to 18 may optionally further include that the continuous movement of the optical component includes a continuous circular movement around an axis perpendicular to the main surface of the optical component. In Example 20, the subject-matter of any one of examples 1 to 19 may optionally further include that the optical component has a disk shape. In some aspects, the disk shaped optical component has a radius in the range from 10 mm to 100 mm, for example in the range from 25 mm to 75 mm, for example in the range from 30 mm to 60 mm, for example in the range from 25 mm to 50 mm, for example a radius of 50 mm.
In Example 21, the subject-matter of example 20 may optionally further include that the first segment and the second segment are disposed along a side surface of the disk shaped optical component.
In Example 22, the subject-matter of any one of examples 1 to 21 may optionally further include that the first segment has a first concave surface and the second segment has a second concave surface. In some aspects, the first concave surface has a first radius of curvature in the range from 5 mm to 80 mm, for example in the range from 15 mm to 60 mm, for example in the range from 20 mm to 40 mm, and the second concave surface has a second radius of curvature in the range from 5 mm to 80 mm, for example in the range from 15 mm to 60 mm, for example in the range from 20 mm to 40 mm.
In Example 23, the subject-matter of any one of examples 1 to 22 may optionally further include that the first segment has a first extension along a first direction parallel to the first field of view direction in the range from 10 mm to 60 mm, and that the second segment has a second extension along a first direction parallel to the first field of view direction in the range from 10 mm to 60 mm.
In Example 24, the subject-matter of any one of examples 1 to 23 may optionally further include that the optical component includes a first receive optical element associated with the first segment, and a second receive optical element associated with the second segment. In Example 25, the subject-matter of example 24 may optionally further include that the first segment and the first receive optical element are disposed relative to one another such that the first receive optical element receives light associated with a direct reflection of the light deflected in the first emission direction, and that the second segment and the second receive optical element are disposed relative to one another such that the second receive optical element receives light associated with a direct reflection of the light deflected in the second emission direction.
In some aspects , the first emission direction may be associated with a first portion of the field of view of the LIDAR system, and the second emission direction may be associated with a second portion of the field of view of the LIDAR system . The first receive optical element and the first segment may be disposed relative to one another such that the first receive optical element receives light coming from the first portion of the field of view . The second receive optical element and the second segment may be disposed relative to one another such that the second receive optical element receives light coming from the second portion of the field of view .
In Example 26, the subject-matter of example 24 or 25 may optionally further include that the first receive optical element and the second receive optical element are disposed on a main surface of the optical component.
In Example 27, the subject-matter of example 26 may optionally further include that the main surface of the optical component is a light absorbing surface, and that the first receive optical element and the second receive optical element are configured to transmit or reflect the received light.
In Example 28, the subject-matter of any one of examples 1 to 27 may optionally further include a first receive optics arrangement configured to collect light from the field of view of the LIDAR system and to image the field of view onto the optical component. In Example 29, the subject-matter of example 28 may optionally further include that the first receive optics arrangement is configured to image the field of view onto a main surface of the optical component. In some aspects, the first receive optics arrangement includes a first lens having an optical aperture in the range from 200 mm2 to 400 mm2, for example in the range from 100 mm2 to 3000 mm2.
In Example 30, the subject-matter of any one of examples 1 to 29 may optionally further include a detector configured to detect light. In some aspects, the detector has an optical aperture in the range from 50 mm2 to 1000 mm2, for example in the range from 150 mm2 to 600 mm2, for example 400 mm2.
In Example 31, the subject-matter of example 30 may optionally further include that the detector includes one or more photo diodes. In some aspects, the one or more photo diodes include at least one avalanche photo diode.
In Example 32, the subject-matter of example 30 or 31 may optionally further include that the one or more photo diodes include a plurality of photodiodes disposed along a first direction to form a one dimensional array. In some aspects, the plurality of photodiodes are further disposed along a second direction to form a two dimensional array.
In Example 33, the subject-matter of any one of examples 30 to 32 may optionally further include that the first direction is parallel to the first field of view direction.
In Example 34, the subject-matter of any one of examples 30 to 33 may optionally further include a second receive optics arrangement configured to image the light reflected or transmitted by a receive optical element onto the detector.
In Example 35, the subject-matter of example 24 and any one of examples 30 to 33 may optionally further include that the first receive optical element is configured to transmit or reflect the light associated with a first portion of the field of view towards the detector, and that the second receive optical element is configured to transmit or reflect the light associated with a second portion of the field of view towards the detector.
In Example 36, the subject-matter of any one of examples 1 to 35 may optionally further include a transmission optics arrangement configured to collimate the light emitted by the light source towards the optical component.
In Example 37, the subject-matter of any one of examples 1 to 36 may optionally further include a position sensor configured to determine a position of the optical component during the continuous movement of the optical component to identify the segment onto which the light emitted by the light source is impinging.
In some aspects, the one or more processors may be configured to control an emission of light from the light source in accordance (e.g., in synchronization) with the position of the optical component as determined by the position information.
In Example 38, the subject-matter of example 37 may optionally further include that the position sensor is configured to determine an angular position of the optical component during a continuous circular movement of the optical component.
In Example 39, the subject-matter of example 37 or 38 may optionally further include one or more processors configured to process position information provided by the position sensor, wherein the one or more processors are configured to assign a location in the field of view of the LIDAR system to light received at the optical system in accordance with the position information provided by the position sensor.
In Example 40, the subject-matter of any one of examples 1 to 39 may optionally further include that the light source includes a plurality of light sources. In Example 41, the subject-matter of example 40 may optionally further include that each light source of the plurality of light sources is configured to illuminate a respective location of a segment of the optical component.
In Example 42, the subject-matter of example 41 may optionally further include that the plurality of light sources are configured such that a segment is fully illuminated along the lateral extension of the segment parallel to the first field of view direction by the light emitted by the plurality of light sources.
In Example 43, the subject-matter of any one of examples 1 to 42 may optionally further include that the light source includes a laser source. In some aspects, the laser source includes one or more laser diodes.
In Example 44, the subject-matter of any one of examples 1 to 43 may optionally further include a motor configured to drive the continuous movement of the optical component. In some aspects, the motor may be a servo motor or a spindle motor.
Example 45 is an optical component including: a first segment configured such that light impinging onto the first segment is deflected towards a first portion of a field of view of the optical component; and a second segment configured such that light impinging onto the second segment is deflected towards a second portion of a field of view of the optical component; and a first receive optical element associated with the first segment and a second receive optical element associated with the second segment, wherein the first receive optical element and the first segment are disposed relative to one another such that the first receive optical element receives light coming from the first portion of the field of view of the optical component, and wherein the second receive optical element and the second segment are disposed relative to one another such that the second receive optical element receives light coming from the second portion of the field of view of the optical component. Example 46 is a LIDAR system including an optical system, the optical system including: a light source; a disk including a first side surface configured to deflect the light emitted by the light source towards a field of view of the LIDAR system in a first emission direction, and a second side surface configured to deflect the light emitted by the light source towards a field of view of the LIDAR system in a second emission direction; and a controller configured to control a continuous rotation of the disk, in such a way that the light emitted by the light source impinges onto the first side surface during a first portion of the continuous rotation of the disk, and that the light emitted by the light source impinges onto the second side surface during a second portion of the continuous rotation of the disk.
In Example 47, the subject-matter of example 46 may optionally further include that the first emission direction is associated with a first portion of the field of view of the LIDAR system, and the second emission direction is associated with a second portion of the field of view of the LIDAR system; and the disk may further include a first receive optical element disposed on a main surface of the disk and being associated with the first side surface, and a second receive optical element disposed on the main surface of the disk and being associated with the second side surface, wherein the first receive optical element and the first side surface are disposed relative to one another such that the first receive optical element receives light coming from the first portion of the field of view, and wherein the second receive optical element and the second side surface are disposed relative to one another such that the second receive optical element receives light coming from the second portion of the field of view.
In Example 48, the subject-matter of example 46 or 47 may optionally further include one, more than one, or each of the features of any one of the examples 1 to 44.
While various implementations have been particularly shown and described with reference to specific aspects, it should be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope as defined by the appended claims. The scope is thus indicated by the appended claims and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced.
LIST OF REFERENCE SIGNS
100 LIDAR system
101 Optical system
102 Light source
104 Optical component
106-1 First segment
106-2 Second segment
106-3 Third segment
106-4 Fourth segment
106-5 Fifth segment
106- 6 Sixth segment
106-7 Seventh segment
106-8 Eighth segment
108 Field of view
108- 1 First field of view line
108-2 Second field of view line
110 Optical axis
112 Controller
114 First emission direction
114r First direct reflection
116-1 First component of the first deflection angle
116-2 Second component of the first deflection angle
118 Second emission direction
120-1 First component of the second deflection angle
120-2 Second component of the second deflection angle
122 Emitted light
124-1 First trajectory
124-2 Second trajectory
124-3 Third trajectory
124-n N-th trajectory
126 Detector
128-1 First receive optical element
128-2 Second receive optical element
128-3 Third receive optical element
128-4 Fourth receive optical element
128-5 Fifth receive optical element
128-6 Sixth receive optical element
128-7 Seventh receive optical element - 8 Eighth receive optical element Direct reflection Light absorbing portion First direction Second direction Third direction Optical component s Main surface s Side surface s- l First side surface s-2 Second side surface s-3 Third side surface Light source Axis - 1 First graph -2 Second graph -3 Third graph -4 Fourth graph -5 Fi fth graph - 6 Sixth graph Angle Fast axis collimator Slow axis collimator - 1 Graph -2 Graph -3 Graph -4 Graph -5 Graph - 6 Graph - 1 Graph -2 Graph -3 Graph -4 Graph -5 Graph - 6 Graph a Optical component b Optical component a Optical system b Optical system c Optical system d Optical system a Light source b Light source c Light source d Light source a Optical component b Optical component c Optical component d Optical component a Mirror b Mirror c Mirror a Segments b Segments c Segments d Segments a Field of view b Field of view c Field of view d Field of view a Ax i s b Axis c Ax i s d Ax i s a Obj ect b Obj ect c Obj ect a Detector b Detector c Detector d Detector a Receive optical element b Receive optical element c Receive optical element d Receive optical element a First receive optics arrangementb First receive optics arrangementc First receive optics arrangement d First receive optics arrangementa Mirror b Mirror c Mirror a Second receive optics arrangementb Second receive optics arrangementc Second receive optics arrangementd Second receive optics arrangementa Position sensor b Position sensor c Position sensor d Position sensor d One or more processors d Central processing unit d System-on-chip b Inset LIDAR engine Graph Receiver side Field of view First receive optics arrangement Optical component Second receive optics arrangement Detector Image of field of view - 1 First side -2 Second side Photo diodes Inset Receive optical element Optical component Image of the field of view Light - 1 First segment -2 Second segment - 1 First receive optical element -2 Second receive optical element Detector Detector 602 Optical component
604 Photo diodes
606 Inset
608-1 First receive optical element
608-2 Second receive optical element
700 Optical component
702-1 First receive optical element
702-2 Second receive optical element
702-3 Third receive optical element
702-4 Fourth receive optical element
704 Image of the field of view
800 Graph
802 Line
804 Dotted line
806 Line
808 Dotted line

Claims

CLAIMS
1. A LIDAR system (100) comprising an optical system (101, 300a, 300b, 300c, 300d) , the optical system (101, 300a, 300b, 300c, 300d) comprising: a light source (102, 302a, 302b, 302c, 302d) ; an optical component (104, 200, 304a, 304b, 304c, 304d) comprising a first segment (106-1, 306a, 306b, 306c, 306d) configured to deflect the light emitted by the light source (102, 302a, 302b, 302c, 302d) towards a field of view (108, 308a, 308b, 308c, 308d) of the LIDAR system (100) in a first emission direction (114) , and a second segment (106-2, 306a, 306b, 306c, 306d) configured to deflect the light emitted by the light source (102, 302a, 302b, 302c, 302d) towards the field of view (108, 308a, 308b, 308c, 308d) of the LIDAR system (100) in a second emission direction, and a controller configured to control a continuous movement of the optical component (104, 200, 304a, 304b, 304c, 304d) , wherein the optical component (104, 200, 304a, 304b, 304c, 304d) is configured in such a way that the light emitted by the light source (102, 302a, 302b, 302c, 302d) impinges onto the first segment (106-1, 306a, 306b, 306c, 306d) during a first portion of the continuous movement of the optical component (104, 200, 304a, 304b, 304c, 304d) , and that the light emitted by the light source (102, 302a, 302b, 302c, 302d) impinges onto the second segment (106-2, 306a, 306b, 306c, 306d) during a second portion of the continuous movement of the optical component (104, 200, 304a, 304b,
304c, 304d) .
2. The LIDAR system (100) according to claim 1, wherein the first emission direction (114) is at a first emission angle with respect to an optical axis (110) of the optical system (101, 300a, 300b, 300c, 300d) , the first emission angle comprising a first component (116-1) along a first field of view direction (152) and a second component (116-2) along a second field of view direction (154) , wherein the first segment (106-1, 306a, 306b, 306c, 306d) is configured such that the first component (116-1) of the 73 first emission angle remains constant during the first portion of the continuous movement of the optical component (104, 200, 304a, 304b, 304c, 304d) , and wherein the first segment (106-1, 306a, 306b, 306c, 306d) is configured such that the second component (116-2) of the first emission angle varies during the first portion of the continuous movement of the optical component (104, 200, 304a, 304b, 304c, 304d) . The LIDAR system (100) according to claim 2, wherein the optical component (104, 200, 304a, 304b, 304c, 304d) is configured in such a way that the light emitted by the light source (102, 302a, 302b, 302c, 302d) impinges onto the first segment (106-1, 306a, 306b, 306c, 306d) at a plurality of first impinging locations during the first portion of the continuous movement of the optical component (104, 200, 304a, 304b, 304c, 304d) , and wherein each first impinging location is associated with a respective second component (116-2) of the first emission angle . The LIDAR system (100) according to any one of claims 1 to 3, wherein the second emission direction is at a second emission angle with respect to an optical axis (100) of the optical system (101, 300a, 300b, 300c, 300d) , the second emission angle comprising a third component (120-1) along the first field of view direction (152) and a fourth component (120-2) along the second field of view direction (154) , wherein the second segment (106-2, 306a, 306b, 306c, 306d) is configured such that the third component (120-1) of the second emission angle remains constant during the second portion of the continuous movement of the optical component (104, 200, 304a, 304b, 304c, 304d) , and wherein the second segment (106-2, 306a, 306b, 306c, 306d) is configured such that the fourth component (120-2) of the second emission angle varies during the second portion of the continuous movement of the optical component (104, 200, 304a, 304b, 304c, 304d) . The LIDAR system (100) according to claims 2 and 4, wherein the first component (116-1) of the first emission angle is different from the third component (120-1) of the second emission angle, and wherein the second component (116-2) of the first emission angle and the fourth component (120-2) of the second emission angle vary in a same angular range during the respective portion of the continuous movement of the optical component (104, 200, 304a, 304b, 304c, 304d) . The LIDAR system (100) according to claim 5, wherein the angular range is between -60° and +60° with respect to the optical axis (110) of the optical system (101, 300a, 300b, 300c, 300d) along the second field of view direction ( 154 ) . The LIDAR system (100) according to claims 2 and 4, wherein an absolute value of a difference between the first component (116-1) of the first emission angle and the third component (120-1) of the second emission angle is in the range from 0.1° to 10°. 
The LIDAR system (100) according to any one of claims 1 to 7, wherein the first field of view direction (152) is the vertical direction and the second field of view direction (154) is the horizontal direction. The LIDAR system (100) according to any one of claims 1 to 8, wherein the optical component (104, 200, 304a, 304b, 304c, 304d) comprises a main surface (202s) and a plurality of side surfaces, and wherein the first segment (106-1, 306a, 306b, 306c, 306d) comprises a first side surface (204s-l) of the plurality of side surfaces and the second segment (106-2, 306a, 306b, 306c, 306d) comprises a second side surface (204s-2) of the plurality of side surfaces. The LIDAR system (100) according to any one of claims 1 to 9, 75 wherein the optical component (104, 200, 304a, 304b, 304c, 304d) has a disk shape. The LIDAR system (100) according to any one of claims 1 to
10, wherein the optical component (104, 200, 304a, 304b, 304c, 304d) comprises a first receive optical element (128-1, 318a, 318b, 318c, 318d) associated with the first segment (106-1, 306a, 306b, 306c, 306d) , and a second receive optical element (128-2, 318a, 318b, 318c, 318d) associated with the second segment (106-2, 306a, 306b, 306c, 306d) , wherein the first segment (106-1, 306a, 306b, 306c, 306d) and the first receive optical element (128-1, 318a, 318b, 318c, 318d) are disposed relative to one another such that the first receive optical element (128-1, 318a, 318b, 318c, 318d) receives light associated with a direct reflection of the light deflected in the first emission direction, and wherein the second segment (106-2, 306a, 306b, 306c, 306d) and the second receive optical element (128-2, 318a, 318b, 318c, 318d) are disposed relative to one another such that the second receive optical element (128-2, 318a, 318b, 318c, 318d) receives light associated with a direct reflection of the light deflected in the second emission direction. The LIDAR system (100) according to claim 11, wherein the first receive optical element (128-1, 318a, 318b, 318c, 318d) and the second receive optical element (128-1, 318a, 318b, 318c, 318d) are configured to transmit or reflect the received light. The LIDAR system (100) according to any one of claims 1 to 12, further comprising a position sensor (324a, 324b, 324c, 324d) configured to determine a position of the optical component (104, 200, 304a, 304b, 304c, 304d) during the continuous movement of the optical component (104, 200, 304a, 304b, 304c, 304d) to identify the segment onto which the light emitted by the light source (102, 302a, 302b, 302c, 302d) is impinging. 76 A LIDAR system (100) comprising an optical system (101, 300a, 300b, 300c, 300d) , the optical system (101, 300a, 300b, 300c, 300d) comprising: a light source (102, 302a, 302b, 302c, 302d) ; a disk (104, 200, 304a, 304b, 304c, 304d) comprising a first side surface (204s-l, 306a, 306b, 306c, 306d) configured to deflect the light emitted by the light source (102, 302a, 302b, 302c, 302d) towards a field of view (108, 308a, 308b, 308c, 308d) of the LIDAR system (100) in a first emission direction, and a second side surface (204s-2, 306a, 306b, 306c, 306d) configured to deflect the light emitted by the light source (102, 302a, 302b, 302c, 302d) towards the field of view (108, 308a, 308b, 308c, 308d) of the LIDAR system (100) in a second emission direction, and a controller configured to control a continuous rotation of the disk (104, 200, 304a, 304b, 304c, 304d) , in such a way that the light emitted by the light source (102, 302a, 302b, 302c, 302d) impinges onto the first side surface (204s-l, 306a, 306b, 306c, 306d) during a first portion of the continuous rotation of the disk (104, 200, 304a, 304b, 304c, 304d) , and that the light emitted by the light source (102, 302a, 302b, 302c, 302d) impinges onto the second side surface (204s-2, 306a, 306b, 306c, 306d) during a second portion of the continuous rotation of the disk (104, 200, 304a, 304b, 304c, 304d) . 
A LIDAR system (100) comprising an optical system (101, 300a, 300b, 300c, 300d) , the optical system (101, 300a, 300b, 300c, 300d) comprising: a light source (102, 302a, 302b, 302c, 302d) ; an optical component (104, 200, 304a, 304b, 304c, 304d) comprising a first segment (106-1, 306a, 306b, 306c, 306d) configured to deflect the light emitted by the light source (102, 302a, 302b, 302c, 302d) towards a field of view (108, 308a, 308b, 308c, 308d) of the LIDAR system (100) in a first emission direction (114) , the first emission direction (114) being associated with a first portion of the field of view (108) , and a second segment (106-2, 306a, 306b, 306c, 306d) configured to deflect the light emitted by the light source 77
(102, 302a, 302b, 302c, 302d) towards the field of view (108, 308a, 308b, 308c, 308d) of the LIDAR system (100) in a second emission direction, the second emission direction (114) being associated with a second portion of the field of view (108) , and a controller configured to control a continuous movement of the optical component (104, 200, 304a, 304b, 304c, 304d) , wherein the optical component (104, 200, 304a, 304b, 304c, 304d) is configured in such a way that the light emitted by the light source (102, 302a, 302b, 302c, 302d) impinges onto the first segment (106-1, 306a, 306b, 306c, 306d) during a first portion of the continuous movement of the optical component (104, 200, 304a, 304b, 304c, 304d) , and that the light emitted by the light source (102, 302a, 302b, 302c, 302d) impinges onto the second segment (106-2, 306a, 306b, 306c, 306d) during a second portion of the continuous movement of the optical component (104, 200, 304a, 304b, 304c, 304d) , and wherein the optical component (104, 200, 304a, 304b, 304c, 304d) comprises a first receive optical element (128-1, 318a, 318b, 318c, 318d) in a first angular relationship with the first segment (106-1, 306a, 306b, 306c, 306d) , such that the first receive optical element (128-1, 318a, 318b, 318c, 318d) receives light from the first portion of the field of view (108) , and a second receive optical element (128-2, 318a, 318b, 318c, 318d) in a second angular relationship with the second segment (106-2, 306a, 306b, 306c, 306d) , such that the second receive optical element (128-2, 318a, 318b, 318c, 318d) receives light from the second portion of the field of view ( 108 ) .
PCT/EP2021/077347 2020-12-15 2021-10-05 Light emission and detection in a lidar system WO2022128194A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102020133539 2020-12-15
DE102020133539.3 2020-12-15

Publications (1)

Publication Number Publication Date
WO2022128194A1 true WO2022128194A1 (en) 2022-06-23

Family

ID=78085676

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2021/077347 WO2022128194A1 (en) 2020-12-15 2021-10-05 Light emission and detection in a lidar system

Country Status (1)

Country Link
WO (1) WO2022128194A1 (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04115810A (en) * 1990-09-04 1992-04-16 Topcon Corp Cutting device for optical member and method of cutting thereof
US5864391A (en) * 1996-04-04 1999-01-26 Denso Corporation Radar apparatus and a vehicle safe distance control system using this radar apparatus
DE102019101967A1 (en) * 2019-01-28 2020-07-30 Valeo Schalter Und Sensoren Gmbh Receiving device for an optical measuring device for detecting objects, light signal deflection device, measuring device and method for operating a receiving device
EP3709052A1 (en) * 2019-03-15 2020-09-16 Ricoh Company, Ltd. Object detector

Similar Documents

Publication Publication Date Title
US10261578B2 (en) Scanning depth engine
KR20200093603A (en) Optical designs and detector designs for improved resolution of lidar systems
WO2012013536A1 (en) Active illumination scanning imager
JP2019191149A (en) Photoelectric sensor and method for detecting object in monitoring area
JP7355171B2 (en) Optical device, distance measuring device using the same, and moving object
US10436935B2 (en) Optoelectronic sensor and method of detecting objects in a monitored zone
JP2020076718A (en) Distance measuring device and mobile body
WO2022128194A1 (en) Light emission and detection in a lidar system
JP4851737B2 (en) Distance measuring device
US20210199779A1 (en) Rotatable Mirror Device
CN111487603A (en) Laser emission unit and manufacturing method thereof
KR102122329B1 (en) Scanning headlight for a vehicle
US20210382177A1 (en) System for monitoring surroundings of vehicle
KR20230042439A (en) Lidar system with coarse angle control
EP4283330A1 (en) Lidar device with spatial light modulators
US11762066B2 (en) Multi-beam scanning system
CN110869801B (en) Laser scanner for laser radar system and method for operating laser scanner
US20230168349A1 (en) Laser scanner using macro scanning structure and a mems scanning mirror
WO2024017477A1 (en) A lidar system
AU2015203089B2 (en) Scanning depth engine
CN117630875A (en) Scanning type flash detection and ranging instrument and operation method thereof
JP2024051741A (en) OBJECT DETECTION SYSTEM, OBJECT DETECTION METHOD, AND PROGRAM
CN111487639A (en) Laser ranging device and method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21787378

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21787378

Country of ref document: EP

Kind code of ref document: A1