US20220276351A1 - Component and sensor for a lidar sensor system, lidar sensor system and method for a lidar sensor system - Google Patents

Component and sensor for a lidar sensor system, lidar sensor system and method for a lidar sensor system

Info

Publication number
US20220276351A1
US20220276351A1 US17/742,426 US202217742426A
Authority
US
United States
Prior art keywords
photo diode
sensor
lidar
lidar sensor
wavelength region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US17/742,426
Inventor
Ricardo Ferreira
Stefan Hadrath
Peter Hoehmann
Herbert Kaestle
Florian Kolb
Norbert Magg
Jiye Park
Tobias Schmidt
Martin Schnarrenberger
Norbert Haas
Helmut Horn
Bernhard Siessegger
Guido Angenendt
Charles Braquet
Gerhard Maierbacher
Oliver Neitzke
Sergey Khrushchev
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Osram GmbH
Original Assignee
Osram GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Osram GmbH filed Critical Osram GmbH
Priority to US17/742,426 priority Critical patent/US20220276351A1/en
Assigned to OSRAM GMBH reassignment OSRAM GMBH ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SCHNARRENBERGER, MARTIN, BRAQUET, Charles, PARK, Jiye, SCHMIDT, TOBIAS, HAAS, NORBERT, HADRATH, STEFAN, MAGG, NORBERT, NEITZKE, Oliver, SIESSEGGER, BERNHARD, FERREIRA, RICARDO, KHRUSHCHEV, SERGEY, ANGENENDT, GUIDO, HOEHMANN, PETER, KAESTLE, HERBERT, Kolb, Florian, MAIERBACHER, Gerhard, HORN, HELMUT
Publication of US20220276351A1 publication Critical patent/US20220276351A1/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4817Constructional features, e.g. arrangements of optical elements relating to scanning
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/8943D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • G01S17/10Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • G01S17/32Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated
    • G01S17/36Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated with phase comparison between the received signal and the contemporaneously transmitted signal
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4811Constructional features, e.g. arrangements of optical elements common to transmitter and receiver
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4814Constructional features, e.g. arrangements of optical elements of transmitters alone
    • G01S7/4815Constructional features, e.g. arrangements of optical elements of transmitters alone using multiple transmitters
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4816Constructional features, e.g. arrangements of optical elements of receivers alone
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/483Details of pulse systems
    • G01S7/484Transmitters
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/483Details of pulse systems
    • G01S7/486Receivers
    • G01S7/4861Circuits for detection, sampling, integration or read-out
    • G01S7/4863Detector arrays, e.g. charge-transfer gates
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/483Details of pulse systems
    • G01S7/486Receivers
    • G01S7/4865Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/491Details of non-pulse systems
    • G01S7/4912Receivers
    • G01S7/4913Circuits for detection, sampling, integration or read-out
    • G01S7/4914Circuits for detection, sampling, integration or read-out of detector arrays, e.g. charge-transfer gates
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144Devices controlled by radiation
    • H01L27/146Imager structures
    • H01L27/14643Photodiode arrays; MOS imagers
    • H01L27/14645Colour imagers
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144Devices controlled by radiation
    • H01L27/146Imager structures
    • H01L27/14643Photodiode arrays; MOS imagers
    • H01L27/14649Infrared imagers
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144Devices controlled by radiation
    • H01L27/146Imager structures
    • H01L27/14665Imagers using a photoconductor layer

Definitions

  • the technical field of the present disclosure relates generally to light detection and ranging (LIDAR) systems and methods that use light detection and ranging technology.
  • This disclosure focuses on components for LIDAR Sensor Systems, LIDAR Sensor Systems, LIDAR Sensor Devices and on methods for LIDAR Sensor Systems or LIDAR Sensor Devices.
  • a human operator may actively switch, for example, between different SAE levels, depending on the vehicle's capabilities, or the vehicle's operating system may request or initiate such a switch, typically with timely information and an acceptance period for possible human operators of the vehicle.
  • switching between SAE levels may be based on internal factors such as individual preference, level of driving experience or the biological state of a human driver, and on external factors such as a change of environmental conditions like weather, traffic density or unexpected traffic complexities.
  • ADAS Advanced Driver Assistance Systems
  • Current ADAS systems may be configured, for example, to alert a human operator in dangerous situations (e.g. lane departure warning), but in specific driving situations some ADAS systems are able to take over control and perform vehicle steering operations without active selection or intervention by a human operator. Examples may include convenience-driven situations such as adaptive cruise control, but also hazardous situations as in the case of lane keep assistants and emergency brake assistants.
  • Since modern traffic can be extremely complex due to a large number of heterogeneous traffic participants, changing, insufficiently mapped or even unmapped environments, and rapid, interrelated dynamics, such sensing systems will have to be able to cover a broad range of different tasks, which have to be performed with a high level of accuracy and reliability. It turns out that there is no single “one fits all” sensing system that can meet all the required features relevant for semi-autonomous or fully autonomous vehicles. Instead, future mobility requires different sensing technologies and concepts with different advantages and disadvantages. Differences between sensing systems may be related to perception range, vertical and horizontal field of view (FOV), spatial and temporal resolution, speed of data acquisition, etc.
  • FOV field of view
  • sensor fusion and data interpretation, possibly assisted by Deep Neuronal Learning (DNL) methods and other Neural Processor Unit (NFU) methods for more complex tasks, like judgment of a traffic situation and generation of derived vehicle control functions, may be necessary to cope with such complexities.
  • DNL Deep Neuronal Learning
  • NFU Neural Processor Units
  • driving and steering of autonomous vehicles may require a set of ethical rules and commonly accepted traffic regulations.
  • LIDAR sensing systems are expected to play a vital role, as well as camera-based systems, possibly supported by radar and ultrasonic systems. With respect to a specific perception task, these systems may operate more or less independently of each other. However, in order to increase the level of perception (e.g. in terms of accuracy and range), signals and data acquired by different sensing systems may be brought together in so-called sensor fusion systems. Merging of sensor data is not only necessary to refine and consolidate the measured results but also to increase the confidence in sensor results by resolving possible inconsistencies and contradictions and by providing a certain level of redundancy. Unintended spurious signals and intentional adversarial attacks may play a role in this context as well.
  • vehicle-external sources may include sensing systems connected to other traffic participants, such as preceding and oncoming vehicles, pedestrians and cyclists, but also sensing systems mounted on road infrastructure elements like traffic lights, traffic signals, bridges, elements of road construction sites and central traffic surveillance structures.
  • data and information may come from far-away sources such as traffic teleoperators and satellites of global positioning systems (e.g. GPS).
  • Communication may be unilateral or bilateral and may include various wireless transmission technologies, such as WLAN, Bluetooth and communication based on radio frequencies and visual or non-visual light signals. It is to be noted that some sensing systems, for example LIDAR sensing systems, may be utilized for both sensing and communication tasks, which makes them particularly interesting for future mobility concepts. Data safety and security and unambiguous identification of communication partners are examples where light-based technologies have intrinsic advantages over other wireless communication technologies. Communication may need to be encrypted and tamper-proof.
  • future mobility will involve sensing systems, communication units, data storage devices, data computing and signal processing electronics as well as advanced algorithms and software solutions that may include and offer various ethical settings.
  • the combination of all these elements constitutes a cyber-physical world, usually denoted as the Internet of Things (IoT).
  • future vehicles represent some kind of IoT device as well and may be called “Mobile IoT devices”.
  • Such “Mobile IoT devices” may be suited to transport people and cargo and to gain or provide information. It may be noted that future vehicles are sometimes also called “smartphones on wheels”, a term which surely reflects some of the capabilities of future vehicles. However, the term implies a certain focus towards consumer-related new features and gimmicks. Although these aspects may certainly play a role, the term does not necessarily reflect the huge range of future business models, in particular data-driven business models, that can only be envisioned at the present moment of time but which are likely to center not only on personal, convenience-driven features but also to include commercial, industrial or legal aspects.
  • New data-driven business models will focus on smart, location-based services, utilizing for example self-learning and prediction aspects, as well as gesture and language processing, with Artificial Intelligence as one of the key drivers. All this is fueled by data, which will be generated in vast amounts in the automotive industry by a large fleet of future vehicles acting as mobile digital platforms and by connectivity networks linking together mobile and stationary IoT devices.
  • Energy consumption may impose a limiting factor for autonomously driving electrical vehicles.
  • such vehicles use many energy consuming devices like sensors, for example RADAR, LIDAR, camera, ultrasound and Global Navigation Satellite System (GNSS/GPS), as well as sensor fusion equipment, processing power, mobile entertainment equipment, heaters, fans, Heating, Ventilation and Air Conditioning (HVAC), Car-to-Car (C2C) and Car-to-Environment (C2X) communication, data encryption and decryption, and many more, all adding up to a high power consumption.
  • GNSS/GPS Global Navigation Satellite System
  • HVAC Heating, Ventilation and Air Conditioning
  • C2C Car-to-Car
  • C2X Car-to-Environment
  • safety in this context focuses on passive adversaries, for example due to malfunctioning systems or system components, while security focuses on active adversaries, for example due to intentional attacks by third parties.
  • Safety assessment: to meet the targeted safety goals, methods of verification and validation have to be implemented and executed for all relevant systems and components.
  • Safety assessment may include safety by design principles, quality audits of the development and production processes, the use of redundant sensing and analysis components and many other concepts and methods.
  • Safe operation: any sensor system or otherwise safety-related system might be prone to degradation, i.e. system performance may decrease over time or a system may even fail completely (e.g. becoming unavailable). To ensure safe operation, the system has to be able to compensate for such performance losses, for example via redundant sensor systems. In any case, the system has to be configured to transfer the vehicle into a safe condition with acceptable risk.
  • One possibility may include a safe transition of the vehicle control to a human vehicle operator.
  • Operational design domain: every safety-relevant system has an operational domain (e.g. with respect to environmental conditions such as temperature or weather conditions including rain, snow and fog) inside which proper operation of the system has been specified and validated. As soon as the system gets outside of this domain, the system has to be able to compensate for such a situation or has to execute a safe transition of the vehicle control to a human vehicle operator.
  • Safe layer: the automated driving system needs to recognize system limits in order to ensure that it operates only within these specified and verified limits. This also includes recognizing limitations with respect to a safe transition of control to the vehicle operator.
  • User responsibility: it must be clear at all times which driving tasks remain under the user's responsibility. In addition, the system has to be able to determine factors which represent the biological state of the user (e.g. state of alertness) and keep the user informed about their responsibility with respect to the user's remaining driving tasks.
  • Human operator-initiated handover: there have to be clear rules and explicit instructions in case a human operator requests an engaging or disengaging of the automated driving system.
  • Vehicle-initiated handover: requests for such handover operations have to be clear and manageable by the human operator, including a sufficiently long time period for the operator to adapt to the current traffic situation. If the human operator is not available or not capable of a safe takeover, the automated driving system must be able to perform a minimal-risk maneuver.
  • Behavior in traffic: automated driving systems have to act and react in an easy-to-understand way so that their behavior is predictable for other road users. This may include that automated driving systems have to observe and follow traffic rules and that they inform other road users about their intended behavior, for example via dedicated indicator signals (optical, acoustic).
  • the automated driving system has to be protected against security threats (e.g. cyber-attacks), including for example unauthorized access to the system by third party attackers. Furthermore, the system has to be able to secure data integrity and to detect data corruption, as well as data forging. Identification of trustworthy data sources and communication partners is another important aspect. Therefore, security aspects are, in general, strongly linked to cryptographic concepts and methods.
  • Tagging may comprise, for example, correlating data with location information, e.g. GPS information.
  • the LIDAR Sensor System may be combined with a LIDAR Sensor Device for illumination of an environmental space connected to a light control unit.
  • the LIDAR Sensor System may comprise at least one light module. Said one light module has a light source and a driver connected to the light source.
  • the LIDAR Sensor System further has an interface unit, in particular a hardware interface, configured to receive, emit, and/or store data signals.
  • the interface unit may connect to the driver and/or to the light source for controlling the operation state of the driver and/or the operation of the light source.
  • the light source may be configured to emit radiation in the visible and/or the non-visible spectral range, as for example in the far-red range of the electromagnetic spectrum. It may be configured to emit monochromatic laser light.
  • the light source may be an integral part of the LIDAR Sensor System as well as a remote yet connected element. It may be placed in various geometrical patterns and distance pitches, and may be configured for alternating color or wavelength emission, intensity or beam angle.
  • the LIDAR Sensor System and/or light sources may be mounted such that they are moveable or can be inclined, rotated, tilted etc.
  • the LIDAR Sensor System and/or light source may be configured to be installed inside a LIDAR Sensor Device (e.g. vehicle) or exterior to a LIDAR Sensor Device (e.g. vehicle). In particular, it is possible that the LIDAR light source or selected LIDAR light sources are mounted such or adapted to being automatically controllable, in some implementations remotely, in their orientation, movement, light emission, light spectrum, sensor etc.
  • the light source may be selected from the following group or a combination thereof: light emitting diode (LED), super-luminescent laser diode (LD), VCSEL laser diode array.
  • LED light emitting diode
  • LD super-luminescent laser diode
  • VCSEL laser diode array
  • the LIDAR Sensor System may comprise a sensor, such as a resistive, a capacitive, an inductive, a magnetic, an optical and/or a chemical sensor. It may comprise a voltage or current sensor. The sensor may connect to the interface unit and/or the driver of the LIDAR light source.
  • the LIDAR Sensor System and/or LIDAR Sensor Device may comprise a brightness sensor, for example for sensing environmental light conditions in the proximity of vehicle objects, such as houses, bridges, sign posts, and the like. It may be used for sensing daylight conditions, and the sensed brightness signal may e.g. be used to improve surveillance efficiency and accuracy. That way, the system may be enabled to provide the environment with a required amount of light of a predefined wavelength.
  • the LIDAR Sensor System and/or LIDAR Sensor Device may comprise a sensor for vehicle movement, position and orientation. Such sensor data may allow a better prediction as to whether the vehicle steering conditions and methods are sufficient.
  • the LIDAR Sensor System and/or LIDAR Sensor Device may also comprise a presence sensor. This may make it possible to adapt the emitted light to the presence of other traffic participants, including pedestrians, in order to provide sufficient illumination and to prohibit or minimize eye damage or skin irritation due to illumination in harmful or invisible wavelength regions, such as UV or IR. It may also be possible to provide light of a wavelength that warns or frightens away unwanted presences, e.g. animals such as pets or insects.
  • the LIDAR Sensor System and/or LIDAR Sensor Device may comprise a sensor or multi-sensor arrangement for predictive maintenance and/or for predicting failure of the LIDAR Sensor System and/or LIDAR Sensor Device.
  • the LIDAR Sensor System and/or LIDAR Sensor Device comprises an operating hour meter.
  • the operating hour meter may connect to the driver.
  • the LIDAR Sensor System may comprise one or more actuators for adjusting the environmental surveillance conditions for the LIDAR Sensor Device (e.g. vehicle). For instance, it may comprise actuators that allow adjusting laser pulse shape, temporal length, rise and fall times, polarization, laser power, laser type (IR-diode, VCSEL), Field of View (FOV), laser wavelength, beam changing device (MEMS, DMD, DLP, LCD, Fiber), beam and/or sensor aperture, and sensor type (PN-diode, APD, SPAD).
  • any sensor or actuator may be an individual element or may form part of a different element of the LIDAR Sensor System.
  • an additional sensor or actuator may be configured to perform any of the described activities, either as an individual element or as part of an additional element of the LIDAR Sensor System.
  • the LIDAR Sensor System and/or LIDAR Light Device further comprises a light control unit that connects to the interface unit.
  • the light control unit may be configured to control the at least one light module for operating in at least one of the following operation modes: dimming, pulsed, PWM, boost, irradiation patterns, including illuminating and non-illuminating periods, light communication (including C2C and C2X), synchronization with other elements of the LIDAR Sensor System, such as a second LIDAR Sensor Device.
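For illustration only, these operation modes could be represented as a selectable enumeration; the sketch below uses assumed names that are not part of the disclosure.

```python
# Illustrative only: the operation modes listed above as a selectable enum.
from enum import Enum, auto

class LightModuleMode(Enum):
    DIMMING = auto()
    PULSED = auto()
    PWM = auto()                  # pulse-width modulation
    BOOST = auto()
    IRRADIATION_PATTERN = auto()  # illuminating and non-illuminating periods
    LIGHT_COMMUNICATION = auto()  # including C2C and C2X
    SYNCHRONIZED = auto()         # e.g. with a second LIDAR Sensor Device

def apply_mode(mode: LightModuleMode) -> None:
    """Placeholder hook where a light control unit would drive the light module."""
    print(f"light module operating in {mode.name} mode")
```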
  • the interface unit of the LIDAR Sensor System and/or LIDAR Sensor Device may comprise a gateway, such as a wireless gateway, that may connect to the light control unit. It may comprise a beacon, such as a Bluetooth™ beacon.
  • the interface unit may be configured to connect to other elements of the LIDAR Sensor System, e.g. one or more other LIDAR Sensor Systems and/or LIDAR Sensor Devices and/or to one or more sensors and/or one or more actuators of the LIDAR Sensor System.
  • the interface unit may be configured to be connected by any wireless or wireline connectivity, including radio and/or optical connectivity.
  • the LIDAR Sensor System and/or LIDAR Sensor Device may be configured to enable customer-specific and/or vehicle-specific light spectra.
  • the LIDAR Sensor Device may be configured to change the form and/or position and/or orientation of the at least one LIDAR Sensor System.
  • the LIDAR Sensor System and/or LIDAR Sensor Device may be configured to change the light specifications of the light emitted by the light source, such as direction of emission, angle of emission, beam divergence, color, wavelength, and intensity, as well as other characteristics like laser pulse shape, temporal length, rise and fall times, polarization, pulse synchronization, laser power, laser type (IR-diode, VCSEL), Field of View (FOV), laser wavelength, beam changing device (MEMS, DMD, DLP, LCD, Fiber), beam and/or sensor aperture, sensor type (PN-diode, APD, SPAD).
  • laser type IR-diode, VCSEL
  • FOV Field of View
  • beam changing device MEMS, DMD, DLP, LCD, Fiber
  • sensor type PN-diode, APD, SPAD
  • the LIDAR Sensor System and/or LIDAR Sensor Device may comprise a data processing unit.
  • the data processing unit may connect to the LIDAR light driver and/or to the interface unit. It may be configured for data processing, for data and/or signal conversion and/or data storage.
  • the data processing unit may advantageously be provided for communication with local, network-based or web-based platforms, data sources or providers, in order to transmit, store or collect relevant information on the light module, the road to be travelled, or other aspects connected with the LIDAR Sensor System and/or LIDAR Sensor Device.
  • the LIDAR Sensor Device can encompass one or many LIDAR Sensor Systems that themselves can comprise infrared or visible light emitting modules, photoelectric sensors, optical components, interfaces for data communication, actuators, like MEMS mirror systems, computing and data storage devices, software and software databanks, and communication systems for communication with IoT, edge or cloud systems.
  • the LIDAR Sensor System and/or LIDAR Sensor Device can further include light emitting and light sensing elements that can be used for illumination purposes, like road lighting, or for data communication purposes, for example car-to-car or car-to-environment (for example drones, pedestrians, traffic signs, traffic posts etc.).
  • the LIDAR Sensor Device can further comprise one or more LIDAR Sensor Systems as well as other sensor systems, like optical camera sensor systems (CCD; CMOS), RADAR sensing system, and ultrasonic sensing systems.
  • the LIDAR Sensor Device can be functionally designed as vehicle headlight, rear light, side light, daytime running light (DRL), corner light etc. and comprise LIDAR sensing functions as well as visible illuminating and signaling functions.
  • the LIDAR Sensor System may further comprise a control unit (Controlled LIDAR Sensor System).
  • the control unit may be configured for operating a management system. It is configured to connect to one or more LIDAR Sensor Systems and/or LIDAR Sensor Devices. It may connect to a data bus.
  • the data bus may be configured to connect to an interface unit of a LIDAR Sensor Device.
  • the control unit may be configured for controlling an operating state of the LIDAR Sensor System and/or LIDAR Sensor Device.
  • the LIDAR Sensor Management System may comprise a light control system which may comprise any of the following elements: monitoring and/or controlling the status of the at least one LIDAR Sensor System and/or LIDAR Sensor Device, monitoring and/or controlling the use of the at least one LIDAR Sensor System and/or LIDAR Sensor Device, scheduling the lighting of the at least one LIDAR Sensor System and/or LIDAR Sensor Device, adjusting the light spectrum of the at least one LIDAR Sensor System and/or LIDAR Sensor Device, defining the light spectrum of the at least one LIDAR Sensor System and/or LIDAR Sensor Device, monitoring and/or controlling the use of at least one sensor of the at least one LIDAR Sensor System and/or LIDAR Sensor Device.
  • the method for the LIDAR Sensor System can be configured and designed to select, operate and control, based on internal or external data input, laser power, pulse shapes, pulse length, measurement time windows, wavelength, single or multiple wavelength approaches, day and night settings, sensor type, sensor fusion, as well as laser safety functions according to relevant safety regulations; one possible grouping of such parameters is sketched below.
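For illustration only, the following sketch shows one way such parameters could be grouped into a configuration structure and selected from external data input. All names, types and threshold values here are assumptions made for this example, not part of the disclosure.

```python
# Illustrative sketch only: one possible grouping of the controllable
# LIDAR parameters named above. All names and values are assumptions.
from dataclasses import dataclass
from enum import Enum

class SensorType(Enum):
    PN_DIODE = "pn-diode"
    APD = "apd"    # avalanche photo diode
    SPAD = "spad"  # single photon avalanche diode

@dataclass
class LidarEmissionConfig:
    laser_power_w: float          # optical power, capped by laser safety rules
    pulse_length_ns: float        # temporal pulse length
    measurement_window_us: float  # echo listening window per pulse
    wavelength_nm: float          # single wavelength, or one of several
    night_mode: bool              # day/night setting
    sensor_type: SensorType

def select_config(ambient_lux: float) -> LidarEmissionConfig:
    """Pick a configuration from external data input (here: ambient brightness)."""
    night = ambient_lux < 10.0  # assumed threshold, for illustration
    return LidarEmissionConfig(
        laser_power_w=0.5 if night else 1.0,
        pulse_length_ns=5.0,
        measurement_window_us=2.0,  # suffices for roughly 300 m range
        wavelength_nm=905.0,
        night_mode=night,
        sensor_type=SensorType.APD,
    )
```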
  • the method for LIDAR Sensor Management System can be configured to initiate data encryption, data decryption and data communication protocols.
  • the computing device may be locally based, network based, and/or cloud-based. That means the computing may be performed in the Controlled LIDAR Sensor System or on any directly or indirectly connected entities. In the latter case, the Controlled LIDAR Sensor System is provided with some connecting means which allow establishment of at least a data connection with such connected entities.
  • the Controlled LIDAR Sensor System comprises a LIDAR Sensor Management System connected to the at least one hardware interface.
  • the LIDAR Sensor Management System may comprise one or more actuators for adjusting the surveillance conditions for the environment.
  • Surveillance conditions may, for instance, be vehicle speed, vehicle road density, vehicle distance to other objects, object type, object classification, emergency situations, weather conditions, day or night conditions, day or night time, vehicle and environmental temperatures, and driver biofeedback signals.
  • the present disclosure further comprises a LIDAR Sensor Management Software.
  • the present disclosure further comprises a data storage device with the LIDAR Sensor Management Software, wherein the data storage device is enabled to run the LIDAR Sensor Management Software.
  • the data storage device may be a hard disk, a RAM, or another common data storage utility such as a USB storage device, a CD, a DVD or similar.
  • the LIDAR Sensor System, in particular the LIDAR Sensor Management Software, may be configured to control the steering of Automatically Guided Vehicles (AGV).
  • AGV Automatically Guided Vehicles
  • the computing device is configured to perform the LIDAR Sensor Management Software.
  • the LIDAR Sensor Management Software may comprise any member selected from the following group or a combination thereof: software rules for adjusting light to outside conditions, adjusting the light intensity of the at least one LIDAR Sensor System and/or LIDAR Sensor Device to environmental conditions, adjusting the light spectrum of the at least one LIDAR Sensor System and/or LIDAR Sensor Device to environmental conditions, adjusting the light spectrum of the at least one LIDAR Sensor System and/or LIDAR Sensor Device to traffic density conditions, adjusting the light spectrum of the at least one LIDAR Sensor System and/or LIDAR Sensor Device according to customer specification or legal requirements.
  • the Controlled LIDAR Sensor System further comprises a feedback system connected to the at least one hardware interface.
  • the feedback system may comprise one or more sensors for monitoring the state of surveillance for which the Controlled LIDAR Sensor System is provided.
  • the state of surveillance may for example, be assessed by at least one of the following: road accidents, required driver interaction, Signal-to-Noise ratios, driver biofeedback signals, close encounters, fuel consumption, and battery status.
  • the Controlled LIDAR Sensor System may further comprise a feedback software.
  • the feedback software may in some embodiments comprise algorithms for vehicle (LIDAR Sensor Device) steering assessment on the basis of the data of the sensors.
  • the feedback software of the Controlled LIDAR Sensor System may in some embodiments comprise algorithms for deriving surveillance strategies and/or lighting strategies on the basis of the data of the sensors.
  • the feedback software of the Controlled LIDAR Sensor System may in some embodiments of the present disclosure comprise LIDAR lighting schedules and characteristics depending on any member selected from the following group or a combination thereof: road accidents, required driver interaction, Signal-to-Noise ratios, driver biofeedback signals, close encounters, road warnings, fuel consumption, battery status, other autonomously driving vehicles.
  • the feedback software may be configured to provide instructions to the LIDAR Sensor Management Software for adapting the surveillance conditions of the environment autonomously.
  • the feedback software may comprise algorithms for interpreting sensor data and suggesting corrective actions to the LIDAR Sensor Management Software.
  • the instructions to the LIDAR Sensor Management Software are based on measured values and/or data of any member selected from the following group or a combination thereof: vehicle (LIDAR Sensor Device) speed, distance, density, vehicle specification and class.
  • the LIDAR Sensor System therefore may have a data interface to receive the measured values and/or data.
  • the data interface may be provided for wire-bound transmission or wireless transmission.
  • the measured values or the data are received from an intermediate storage, such as a cloud-based, web-based, network-based or local type storage unit.
  • sensors for sensing environmental conditions may be connected with or interconnected by means of cloud-based services, often also referred to as Internet of Things.
  • the Controlled LIDAR Sensor System comprises a software user interface (UI), particularly a graphical user interface (GUI).
  • UI software user interface
  • GUI graphical user interface
  • the software user interface may be provided for the light control software and/or the LIDAR Sensor Management Software and/or the feedback software.
  • the software user interface may further comprise means for data communication with an output device, such as an augmented and/or virtual reality display.
  • the user interface may be implemented as an application for a mobile device, such as a smartphone, a tablet, a mobile computer or similar devices.
  • the Controlled LIDAR Sensor System may further comprise an application programming interface (API) for controlling the LIDAR Sensing System by third parties and/or for third party data integration, for example road or traffic conditions, street fares, energy prices, weather data, GPS.
  • API application programming interface
  • the Controlled LIDAR Sensor System comprises a software platform for providing at least one of surveillance data, vehicle (LIDAR Sensor Device) status, driving strategies, and emitted sensing light.
  • the LIDAR Sensor System and/or the Controlled LIDAR Sensor System can include infrared or visible light emitting modules, photoelectric sensors, optical components, interfaces for data communication, and actuators, like MEMS mirror systems, a computing and data storage device, a software and software databank, a communication system for communication with IoT, edge or cloud systems.
  • the LIDAR Sensor System and/or the Controlled LIDAR Sensor System can include light emitting and light sensing elements that can be used for illumination or signaling purposes, like road lighting, or for data communication purposes, for example car-to-car, car-to-environment.
  • the LIDAR Sensor System and/or the Controlled LIDAR Sensor System may be installed inside the driver cabin in order to perform driver monitoring functionalities (such as occupancy detection, eye tracking, face recognition, drowsiness detection, access authorization, gesture control, etc.) and/or to communicate with a head-up display (HUD).
  • the software platform may accumulate data from one's own or other vehicles (LIDAR Sensor Devices) to train machine learning algorithms for improving surveillance and car steering strategies.
  • the Controlled LIDAR Sensor System may also comprise a plurality of LIDAR Sensor Systems arranged in adjustable groups.
  • the present disclosure further refers to a vehicle (LIDAR Sensor Device) with at least one LIDAR Sensor System.
  • the vehicle may be planned and built particularly for integration of the LIDAR Sensor System. Alternatively, the Controlled LIDAR Sensor System may be integrated into a pre-existing vehicle. The present disclosure refers to both cases as well as to a combination of these cases.
  • the present disclosure further provides a method for a LIDAR Sensor System which comprises at least one LIDAR Sensor System.
  • the method may comprise the steps of controlling the light emitted by the at least one LIDAR Sensor System by providing light control data to the hardware interface of the Controlled LIDAR Sensor System and/or sensing the sensors and/or controlling the actuators of the Controlled LIDAR Sensor System via the LIDAR Sensor Management System.
  • the method for the LIDAR Sensor System can be configured and designed to select, operate and control, based on internal or external data input, laser power, pulse shapes, pulse length, measurement time windows, wavelength, single or multiple wavelength approaches, day and night settings, sensor type, sensor fusion, as well as laser safety functions according to relevant safety regulations.
  • the method according to the present disclosure may further comprise the step of generating light control data for adjusting the light of the at least one LIDAR Sensor System to environmental conditions.
  • the light control data is generated by using data provided by the daylight or night vision sensor.
  • the light control data is generated by using data provided by a weather or traffic control station.
  • the light control data may also be generated by using data provided by a utility company in some embodiments.
  • the data may be gained from one data source, which may in turn be connected, e.g. by means of Internet of Things devices, to the various sensing devices. That way, data may be pre-analyzed before being released to the LIDAR Sensor System, missing data can be identified, and in further advantageous developments, specific pre-defined data can also be supported or replaced by “best-guess” values of a machine learning software.
  • the method further comprises the step of using the light of the at least one LIDAR Sensor Device, for example, during the time of day or night when traffic conditions are best.
  • other conditions for the application of the light may also be considered.
  • the method may comprise a step of switching off the light of the at least one LIDAR Sensor System depending on a predetermined condition.
  • such a predetermined condition may, for instance, occur if the vehicle (LIDAR Sensor Device) speed or a distance to another traffic object falls below a pre-defined or required safety distance or safety condition; a minimal sketch of such a check follows below.
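A minimal sketch of such a switch-off check, assuming illustrative function names and threshold values that are not taken from the disclosure:

```python
# Illustrative only: a predetermined switch-off condition as described above.
def must_switch_off(speed_mps: float, distance_to_object_m: float,
                    min_speed_mps: float = 0.5,
                    safety_distance_m: float = 5.0) -> bool:
    """Return True if the LIDAR light should be switched off, e.g. because the
    vehicle speed or the distance to another traffic object is below the
    pre-defined safety condition. All thresholds are example values."""
    return speed_mps < min_speed_mps or distance_to_object_m < safety_distance_m

# Example: a traffic object 2 m away triggers the switch-off condition.
assert must_switch_off(speed_mps=8.0, distance_to_object_m=2.0)
```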
  • the method may also comprise the step of pushing notifications to the user interface in case of risks, malfunctions or changes of the vehicle health status.
  • the method comprises analyzing sensor data for deducing traffic density and vehicle movement.
  • the LIDAR Sensor System features may be adjusted or triggered by way of a user interface or other user feedback data.
  • the adjustment may further be triggered by way of a machine learning process, as far as the characteristics which are to be improved or optimized are accessible by sensors. It is also possible that individual users adjust the surveillance conditions and/or further surveillance parameters to individual needs or desires.
  • the method may also comprise the step of uploading LIDAR sensing conditions to a software platform and/or downloading sensing conditions from a software platform.
  • the method comprises a step of logging performance data to a LIDAR sensing notebook.
  • the data accumulated in the Controlled LIDAR Sensor System may, in a step of the method, be analyzed in order to directly or indirectly determine maintenance periods of the LIDAR Sensor System, expected failure of system components or such.
  • the present disclosure comprises a computer program product comprising a plurality of program instructions, which when executed by a computer system of a LIDAR Sensor System, cause the Controlled LIDAR Sensor System to execute the method according to the present disclosure.
  • the disclosure further comprises a data storage device.
  • Yet another aspect of the present disclosure refers to a data storage device with a computer program adapted to execute at least one of a method for a LIDAR Sensor System or a LIDAR Sensor Device.
  • FIG. 1 shows a portion of a sensor in accordance with various embodiments.
  • FIG. 2 shows a portion of a sensor in accordance with various embodiments in more detail.
  • FIG. 3 shows a portion of a sensor in accordance with various embodiments in more detail.
  • FIG. 4 shows a portion of a sensor in accordance with various embodiments in more detail.
  • FIG. 5 shows a recorded scene and the sensor pixels used to detect the scene in accordance with various embodiments in more detail.
  • FIG. 6 shows a recorded scene and the sensor pixels used to detect the scene in accordance with various embodiments in more detail.
  • FIG. 7 shows a flow diagram illustrating a method for a LIDAR Sensor System in accordance with various embodiments in more detail.
  • FIG. 8 shows a flow diagram illustrating another method for a LIDAR Sensor System in accordance with various embodiments in more detail.
  • FIG. 9 shows a cross sectional view of an optical component for a LIDAR Sensor System in accordance with various embodiments.
  • FIGS. 10A and 10B show a cross sectional view of an optical component for a LIDAR Sensor System (FIG. 10A) and a corresponding wavelength/transmission diagram (FIG. 10B) in accordance with various embodiments.
  • FIGS. 11A and 11B show a cross sectional view of an optical component for a LIDAR Sensor System (FIG. 11A) and a corresponding wavelength/transmission diagram (FIG. 11B) in accordance with various embodiments.
  • FIG. 12 shows a cross sectional view of a sensor for a LIDAR Sensor System in accordance with various embodiments.
  • FIG. 13 shows a top view of a sensor for a LIDAR Sensor System in accordance with various embodiments.
  • FIG. 14 shows a top view of a sensor for a LIDAR Sensor System in accordance with various embodiments.
  • FIG. 15 shows a top view of a sensor for a LIDAR Sensor System in accordance with various embodiments.
  • FIG. 16 shows a cross sectional view of an optical component for a LIDAR Sensor System in accordance with various embodiments.
  • FIG. 17A shows a side view of an optical package in a schematic representation in accordance with various embodiments.
  • FIG. 17B shows a circuit equivalent in a schematic representation in accordance with various embodiments.
  • FIG. 17C shows a circuit equivalent in a schematic representation in accordance with various embodiments.
  • FIG. 18 shows a top view of an optical package in a schematic representation in accordance with various embodiments.
  • FIG. 19A shows a side view of an optical package in a schematic representation in accordance with various embodiments.
  • FIG. 19B shows a top view of an optical package in a schematic representation in accordance with various embodiments.
  • FIG. 20 shows schematically an embodiment of the proposed LIDAR Sensor System, Controlled LIDAR Sensor System and LIDAR Sensor Device.
  • LIDAR Light detection and ranging
  • LADAR Laser Detection and Ranging
  • TOF Time of Flight measurement device
  • Laser Scanners, Laser Radar
  • the technology works by illuminating a target with an optical pulse and measuring the characteristics of the reflected return signal.
  • the width of the optical pulse can range from a few nanoseconds to several microseconds.
  • For distance and speed measurement, light-detection-and-ranging (LIDAR) Sensor Systems are known from the prior art. With LIDAR Sensor Systems, it is possible to quickly scan the environment and detect the speed and direction of movement of individual objects (vehicles, pedestrians, static objects). LIDAR Sensor Systems are used, for example, in partially autonomous vehicles or fully autonomously driving prototypes, as well as in aircraft and drones. A high-resolution LIDAR Sensor System emits a (mostly infrared) laser beam and further uses lenses, mirrors or micro-mirror systems, as well as suited sensor devices.
  • the disclosure relates to a LIDAR Sensor System for environment detection, wherein the LIDAR Sensor System is designed to carry out repeated measurements for detecting the environment, wherein the LIDAR Sensor System has an emitting unit (First LIDAR Sensing System) which is designed to perform a measurement with at least one laser pulse and wherein the LIDAR system has a detection unit (Second LIDAR Sensing Unit), which is designed to detect an object-reflected laser pulse during a measurement time window.
  • the LIDAR system has a control device (LIDAR Data Processing System/Control and Communication System/LIDAR Sensor Management System), which is designed, in the event that at least one reflected beam component is detected, to associate the detected beam component on the basis of a predetermined assignment with a solid angle range from which the beam component originates.
  • the disclosure also includes a method for operating a LIDAR Sensor System.
  • the distance measurement in question is based on a transit time measurement of emitted electromagnetic pulses.
  • since these are electromagnetic pulses, c is the value of the speed of light.
  • the word electromagnetic comprises the entire electromagnetic spectrum, thus including the ultraviolet, visible and infrared spectrum range.
  • each light pulse is typically associated with a measurement time window, which begins with the emission of the measurement light pulse. If objects that are very far away are to be detectable by a measurement, such as, for example, objects at a distance of 300 meters and farther, this measurement time window, within which it is checked whether at least one reflected beam component has been received, must last at least two microseconds; the sketch below illustrates this round-trip arithmetic.
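For illustration, the round-trip time-of-flight arithmetic behind these numbers can be written out as follows (the helper names are chosen for this sketch; the physics is simply distance = c · delay / 2):

```python
# Minimal sketch: round-trip time of flight for an emitted pulse.
C = 299_792_458.0  # speed of light in m/s

def round_trip_time_s(distance_m: float) -> float:
    """Time for a pulse to travel to a target and back."""
    return 2.0 * distance_m / C

def distance_m(delay_s: float) -> float:
    """Target distance recovered from a measured echo delay."""
    return C * delay_s / 2.0

# A target at 300 m means a 600 m signal path, hence a measurement time
# window of at least about two microseconds, as stated above.
print(round_trip_time_s(300.0))   # ~2.0e-6 s
print(distance_m(2.0e-6))         # ~300 m
```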
  • such measuring time windows typically have a temporal distance from each other.
  • LIDAR sensors are now increasingly used in the automotive sector and increasingly installed in motor vehicles.
  • the disclosure also relates to a method for operating a LIDAR Sensor System arrangement comprising a First LIDAR Sensor System with a first LIDAR sensor and at least one Second LIDAR Sensor System with a second LIDAR sensor, wherein the first LIDAR sensor and the second LIDAR sensor repeatedly perform respective measurements, wherein the measurements of the first LIDAR Sensor are performed in respective first measurement time windows, at the beginning of which a first measurement beam is emitted by the first LIDAR sensor and it is checked whether at least one reflected beam component of the first measurement beam is detected within the respective first measurement time window.
  • the measurements of the at least one second LIDAR sensor are performed in respective second measurement time windows, at the beginning of which a second measurement beam is emitted by the at least one second LIDAR sensor, and it is checked whether at least one reflected beam component of the second measurement beam is detected within the respective second measurement time window.
  • the disclosure also includes a LIDAR Sensor System arrangement with a first LIDAR sensor and at least one second LIDAR sensor.
  • a LIDAR (light detection and ranging) Sensor System is to be understood in particular as meaning a system which, in addition to one or more emitters for emitting light beams, for example in pulsed form, and a detector for detecting any reflected beam components, may have further devices, for example optical elements such as lenses and/or a MEMS mirror.
  • the oscillating mirrors or micro-mirrors of the MEMS (Micro-Electro-Mechanical System) system in some embodiments in cooperation with a remotely located optical system, allow a field of view to be scanned in a horizontal angular range of e.g. 60° or 120° and in a vertical angular range of e.g. 30°.
  • the receiver unit or the sensor can measure the incident radiation without spatial resolution.
  • the receiver unit can also be a measurement device with spatial angle resolution.
  • the receiver unit or sensor may comprise a photodiode, e.g. an avalanche photo diode (APD) or a single photon avalanche diode (SPAD), a PIN diode or a photomultiplier.
  • Objects can be detected, for example, at a distance of up to 60 m, up to 300 m or up to 600 m using the LIDAR system.
  • a range of 300 m corresponds to a signal path of 600 m, from which, for example, a measuring time window or a measuring duration of 2 μs can result.
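  • as a plain-language cross-check of the figures above, a minimal sketch of the transit-time relation d = c·t/2 (function names are illustrative, not taken from the disclosure):

```python
# Transit-time (time-of-flight) relation: a pulse travelling to an object at
# distance d and back covers 2*d, so t = 2*d / c and d = c*t / 2.
C = 299_792_458.0  # speed of light in m/s

def measurement_window_s(max_range_m: float) -> float:
    """Minimum measurement time window for a given maximum detection range."""
    return 2.0 * max_range_m / C

def range_from_transit_time_m(t_s: float) -> float:
    """Object distance derived from the measured pulse transit time."""
    return C * t_s / 2.0

if __name__ == "__main__":
    # A 300 m range implies a 600 m signal path, i.e. a window of about 2 us.
    print(f"{measurement_window_s(300.0) * 1e6:.2f} us")  # ~2.00 us
    print(f"{range_from_transit_time_m(2e-6):.1f} m")     # ~299.8 m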
  • optical reflection elements in a LIDAR Sensor System may include micro-electro-mechanical mirror systems (MEMS) and/or digital mirror devices (DMD) and/or digital light processing elements (DLP) and/or a galvo-scanner for control of the emitted laser beam pulses and/or for reflection of object-back-scattered laser pulses onto a sensor surface.
  • a plurality of mirrors is provided. These may, in some implementations, be arranged in the manner of a matrix. The mirrors may be individually and separately rotatable or movable, independently of each other.
  • the individual mirrors can each be part of a so-called micro mirror unit or “Digital Micro-Mirror Device” (DMD).
  • a DMD can have a multiplicity of mirrors, in particular micro-mirrors, which can be rotated at high frequency between at least two positions.
  • each mirror can be individually adjustable in its angle and can have at least two stable positions or, in other words, in particular stable, final states, between which it can alternate.
  • the number of mirrors can correspond to the resolution of a projected image, wherein a respective mirror can represent a light pixel on the area to be irradiated.
  • a “Digital Micro-Mirror Device” is a micro-electro-mechanical component for the dynamic modulation of light.
  • the DMD can, for example, provide suitable illumination for a vehicle low beam and/or high beam.
  • the DMD may also serve as a projection light source for projecting images, logos, and information onto a surface, such as a street or a surrounding object.
  • the mirrors or the DMD can be designed as a micro-electromechanical system (MEMS). A movement of the respective mirror can be caused, for example, by energizing the MEMS.
  • Such micro-mirror arrays are available, for example, from Texas Instruments.
  • the micro-mirrors are in particular arranged like a matrix, for example in an array of 854 × 480 micro-mirrors, as in the DLP3030-Q1 0.3-inch DMD mirror system by Texas Instruments optimized for automotive applications, or in a 1920 × 1080 micro-mirror system designed for home projection applications, or a 4096 × 2160 micro-mirror system designed for 4K cinema projection applications but also usable in a vehicle application.
  • the position of the micro-mirrors is, in particular, individually adjustable, for example with a clock rate of up to 32 kHz, so that predetermined light patterns can be coupled out of the headlamp by corresponding adjustment of the micro-mirrors.
  • the used MEMS arrangement may be provided as a 1D or 2D MEMS arrangement.
  • in a 1D MEMS, the movement of an individual mirror takes place in a translatory or rotational manner about one axis.
  • in a 2D MEMS, the individual mirror is gimballed and oscillates about two axes, whereby the two axes can be driven individually so that the amplitude of each oscillation can be adjusted and controlled independently of the other.
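  • the oscillation of such a gimballed 2D MEMS mirror can be pictured with a small illustrative sketch (all names, frequencies and amplitudes are assumed for the example; the factor of two between mechanical and optical angle is standard mirror geometry):

```python
import math

def mems_deflection(t_s, fx_hz, fy_hz, ax_deg, ay_deg):
    """Instantaneous deflection of a gimballed 2D MEMS mirror oscillating
    about two axes; each axis has its own frequency and an independently
    adjustable amplitude, as described above."""
    theta_x = ax_deg * math.sin(2.0 * math.pi * fx_hz * t_s)
    theta_y = ay_deg * math.sin(2.0 * math.pi * fy_hz * t_s)
    return theta_x, theta_y

# Example: fast horizontal axis, slow vertical axis. The optical scan angle is
# twice the mechanical mirror angle, so a +/-15 degree mechanical amplitude
# yields e.g. a 60 degree horizontal field of view.
tx, ty = mems_deflection(t_s=1e-4, fx_hz=2000.0, fy_hz=60.0, ax_deg=15.0, ay_deg=7.5)
```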
  • radiation emitted by the light source can be deflected by a structure with at least one liquid crystal element, wherein the molecular orientation of the at least one liquid crystal element is adjustable by means of an electric field.
  • the structure through which the radiation to be aligned is guided can comprise at least two sheet-like elements coated with electrically conductive and transparent coating material.
  • the plate elements are in some embodiments transparent and spaced apart from each other in parallel. The transparency of the plate elements and the electrically conductive coating material allows transmission of the radiation.
  • the electrically conductive and transparent coating material can be at least partially or completely made of a material with a high electrical conductivity or a small electrical resistance, such as indium tin oxide (ITO), and/or of a material with a low electrical conductivity or a large electrical resistance, such as poly-3,4-ethylenedioxythiophene (PEDOT).
  • the generated electric field can be adjustable in its strength.
  • the electric field can be adjustable in particular by applying an electrical voltage to the coating material or the coatings of the plate elements. Depending on the magnitude of the electrical voltages applied to the coating materials or coatings of the plate elements formed as described above, differently sized potential differences and thus a different electric field are formed between the coating materials or coatings.
  • the molecules of the liquid crystal elements may align with the field lines of the electric field.
  • the radiation passing through the structure moves at different speeds through the liquid crystal elements located between the plate elements.
  • the liquid crystal elements located between the plate elements have the function of a prism, which can deflect or direct incident radiation.
  • the radiation passing through the structure can be oriented or deflected, whereby the deflection angle can be controlled and varied by the level of the applied voltage.
  • a combination of white or colored light sources and infrared laser light sources is possible, in which the light source is followed by an adaptive mirror arrangement, via which radiation emitted by both light sources can be steered or modulated, a sensor system being used for the infrared light source intended for environmental detection.
  • the advantage of such an arrangement is that the two light systems and the sensor system use a common adaptive mirror arrangement. It is therefore not necessary to provide the light system and the sensor system each with their own mirror arrangement. Due to the high degree of integration, space, weight and in particular costs can be reduced.
  • in LIDAR systems, differently designed transmitter and receiver concepts are also known in order to be able to record the distance information in different spatial directions. Based on this, a two-dimensional image of the environment is then generated, which contains the complete three-dimensional coordinates for each resolved spatial point.
  • the different LIDAR topologies can be abstractly distinguished based on how the image resolution is displayed. Namely, the resolution can be represented either exclusively by an angle-sensitive detector, an angle-sensitive emitter, or a combination of both.
  • a LIDAR system which generates its resolution exclusively by means of the detector is called a Flash LIDAR. It consists of an emitter which illuminates the entire field of view as homogeneously as possible.
  • the detector in this case consists of a plurality of individually readable segments or pixels arranged in a matrix. Each of these pixels is correspondingly assigned a solid angle range. If light is received in a certain pixel, then the light correspondingly originates from the solid angle region assigned to this pixel.
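  • a minimal sketch of how each detector pixel of such a Flash LIDAR could be assigned the center of "its" solid angle range, assuming a simple rectilinear mapping over the field of view (illustrative only; the real assignment is determined by the detection optics):

```python
def pixel_to_angles(row, col, n_rows, n_cols, fov_h_deg=60.0, fov_v_deg=30.0):
    """Map a flash-LIDAR detector pixel to the center of the solid-angle
    range assigned to it, assuming a rectilinear mapping over the field
    of view (field-of-view values are illustrative)."""
    az = (col + 0.5) / n_cols * fov_h_deg - fov_h_deg / 2.0
    el = (row + 0.5) / n_rows * fov_v_deg - fov_v_deg / 2.0
    return az, el

# Pixel (0, 0) of a 32 x 128 array looks toward one corner of the field of view.
print(pixel_to_angles(0, 0, n_rows=32, n_cols=128))  # approx (-29.77, -14.53)
```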
  • a raster or scanning LIDAR has an emitter which emits the measuring pulses selectively and in particular temporally sequentially in different spatial directions.
  • a single sensor segment is sufficient as a detector. If, in this case, light is received by the detector in a specific measuring time window, then this light comes from a solid angle range into which the light was emitted by the emitter in the same measuring time window.
  • a plurality of the above-described measurements or single-pulse measurements can be linked or combined with each other in a LIDAR Sensor System, for example to improve the signal-to-noise ratio by averaging the determined measured values.
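  • a short illustrative sketch of such averaging over single-pulse measurements; the distance value and noise level are invented for the example, and the roughly sqrt(N) improvement holds for uncorrelated noise:

```python
import random

def averaged_measurement(single_shot, n):
    """Combine n single-pulse measurements by averaging; for uncorrelated
    noise the signal-to-noise ratio improves by roughly sqrt(n)."""
    return sum(single_shot() for _ in range(n)) / n

# Illustrative single-shot model: true distance 100 m with 0.5 m Gaussian noise.
shot = lambda: random.gauss(100.0, 0.5)
print(averaged_measurement(shot, 16))  # noise std reduced to about 0.5 / 4 = 0.125 m
```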
  • the radiation emitted by the light source is in some embodiments infrared (IR) radiation emitted by a laser diode in a wavelength range of 600 nm to 850 nm.
  • the radiation of the laser diode can be emitted in a pulse-like manner with a frequency between 1 kHz and 1 MHz, in some implementations with a frequency between 10 kHz and 100 kHz.
  • the laser pulse duration may be between 0.1 ns and 100 ns, in some implementations between 1 ns and 2 ns.
  • a VCSEL (Vertical Cavity Surface Emitting Laser) or a VECSEL (Vertical External Cavity Surface Emitting Laser) may be used as the light source.
  • Both the VCSEL and the VECSEL may be in the form of an array, e.g. 15 × 20 or 20 × 20 laser diodes may be arranged so that the summed radiation power can be several hundred watts. If the lasers pulse simultaneously in an array arrangement, the largest summed radiation powers can be achieved.
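  • for a rough sense of scale (illustrative numbers, not from the disclosure): assuming on the order of 1 W of peak power per emitter, a 20 × 20 array pulsing simultaneously would sum to roughly 400 W, i.e. the "several hundred watts" mentioned above.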
  • the emitter units may differ, for example, in their wavelengths of the respective emitted radiation. If the receiver unit is then also configured to be wavelength-sensitive, the pulses can also be differentiated according to their wavelength.
  • FIG. 20 shows schematically an embodiment of the proposed LIDAR Sensor System, Controlled LIDAR Sensor System and LIDAR Sensor Device.
  • the LIDAR Sensor System 10 comprises a First LIDAR Sensing System 40 that may comprise a Light Source 42 configured to emit electro-magnetic or other radiation 120, in particular continuous-wave or pulsed laser radiation in the blue and/or infrared wavelength range; a Light Source Controller 43 and related software; Beam Steering and Modulation Devices 41, in particular light steering and reflection devices, for example Micro-Mechanical Mirror Systems (MEMS), with a related control unit 150; Optical components 80, for example lenses and/or holographic elements; and a LIDAR Sensor Management System 90 configured to manage input and output data that are required for the proper operation of the First LIDAR Sensing System 40.
  • the First LIDAR Sensing System 40 may be connected to other LIDAR Sensor System devices, for example to a Control and Communication System 70 that is configured to manage input and output data that are required for the proper operation of the First LIDAR Sensing System 40.
  • the LIDAR Sensor System 10 may include a Second LIDAR Sensing System 50 that is configured to receive and measure electromagnetic or other radiation, using a variety of Sensors 52 and Sensor Controller 53 .
  • the Second LIDAR Sensing System may comprise Detection Optics 82 , as well as Actuators for Beam Steering and Control 51 .
  • the LIDAR Sensor System 10 may further comprise a LIDAR Data Processing System 60 that performs Signal Processing 61, Data Analysis and Computing 62, Sensor Fusion and other sensing Functions 63.
  • the LIDAR Sensor System 10 may further comprise a Control and Communication System 70 that receives and outputs a variety of signal and control data 160 and serves as a Gateway between various functions and devices of the LIDAR Sensor System 10 .
  • the LIDAR Sensor System 10 may further comprise one or many Camera Systems 81, either stand-alone or combined with another LIDAR Sensor System 10 component or embedded into another LIDAR Sensor System 10 component, and data-connected to various other devices, like components of the Second LIDAR Sensing System 50, components of the LIDAR Data Processing System 60, or the Control and Communication System 70.
  • the LIDAR Sensor System 10 may be integrated or embedded into a LIDAR Sensor Device 30, for example a housing, a vehicle, a vehicle headlight.
  • the Controlled LIDAR Sensor System 20 is configured to control the LIDAR Sensor System 10 and its various components and devices, and performs or at least assists in the navigation of the LIDAR Sensor Device 30 .
  • the Controlled LIDAR Sensor System 20 may be further configured to communicate, for example, with another vehicle or a communication network and thus to assist in navigating the LIDAR Sensor Device 30.
  • the LIDAR Sensor System 10 is configured to emit electro-magnetic or other radiation in order to probe the environment 100 for other objects, like cars, pedestrians, road signs, and road obstacles.
  • the LIDAR Sensor System 10 is further configured to receive and measure electromagnetic or other types of object-reflected or object-emitted radiation 130, but also other wanted or unwanted electromagnetic radiation 140, in order to generate signals 110 that can be used for the environmental mapping process, usually generating a point cloud that is representative of the detected objects.
  • the Controlled LIDAR Sensor System 20 uses Other Components or Software 150 to accomplish signal recognition and processing as well as signal analysis. This process may include the use of signal information that comes from other sensor devices.
  • the LIDAR Sensor System may be combined with a LIDAR Sensor Device connected to a light control unit for illumination of an environmental space.
  • photo diodes may be used for the detection of light or light pulses in a respective sensor pixel, e.g. one or more of the following types of photo diodes:
  • photo diodes are understood to be of different photo diode types even though the structure of the photo diodes is the same (e.g. the photo diodes are all pin photo diodes), but the photo diodes are of different size or shape or orientation and/or may have different sensitivities (e.g. due to the application of different reverse-bias voltages to the photo diodes).
  • a photo diode type in the context of this disclosure is thus not only defined by the type of construction of the photo diode, but also by its size, shape, orientation and/or way of operation, and the like.
  • a two-dimensional array of sensor pixels may be provided for an imaging of two-dimensional images.
  • an optical signal converted into an electronic signal may be read-out individually per sensor pixel, comparable with a CCD or CMOS image sensor.
  • it may be provided to interconnect a plurality of sensor pixels in order to achieve a higher sensitivity by achieving a higher signal strength.
  • This principle may be applied to, but is not limited to, the principle of the “silicon photomultiplier” (SiPM).
  • a plurality (in the order of 10 to 1000 or even more) of individual SPADs may be connected in parallel. Although each single SPAD reacts to the first incoming photon (taking into consideration the detection probability), the sum of many SPAD signals results in a quasi-analog signal, which may be used to derive the incoming optical signal.
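  • the quasi-analog summing behavior of such a SiPM can be sketched as follows (illustrative Python; the photon detection efficiency value and the one-fire-per-gate simplification are assumptions for the example):

```python
import math
import random

def sipm_response(n_spads: int, mean_photons_per_spad: float, pde: float = 0.3) -> int:
    """Quasi-analog SiPM output in 'fired cells'. Each SPAD fires on the first
    detected photon, so its firing probability is 1 - exp(-pde * mu) for a
    Poisson-distributed photon number with mean mu (pde value is illustrative)."""
    p_fire = 1.0 - math.exp(-pde * mean_photons_per_spad)
    return sum(1 for _ in range(n_spads) if random.random() < p_fire)

# With e.g. 400 parallel SPADs, the summed count rises smoothly with the
# incident intensity (the quasi-analog signal) until the array saturates.
for mu in (0.1, 1.0, 10.0):
    print(mu, sipm_response(400, mu))
```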
  • unwanted signals, e.g. background light or signals coming from the non-illuminated and therefore not read-out pixels, can thereby be reduced.
  • the de-focusing process may be adjusted adaptively, for example, depending on the illuminated scene and signal response of back-scattered light.
  • the most suitable size of the illumination spot on the surface of the sensor 52 does not necessarily need to coincide with the geometric layout of the pixels on the sensor array. By way of example, if the spot is positioned between two (or four) pixels, then two (or four) pixels will only be partially illuminated. This may also result in a bad signal-to-noise ratio due to the non-illuminated pixel regions.
  • control lines (e.g. column select lines carrying the column select signals and row select lines carrying the row select signals) may be provided to selectively interconnect a plurality of photo diodes to define a “virtual pixel”, which may be optimally adapted to the respective application scenario and the size of the laser spot on the sensor array.
  • This may be implemented by row selection lines and column selection lines, similar to the access and control of memory cells of a DRAM memory.
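  • a minimal sketch of this DRAM-like selection idea: a “virtual pixel” is the set of pixels at the crossings of the active row and column select lines, and its signals are accumulated into one value (names and array sizes are illustrative, not from the disclosure):

```python
def virtual_pixel_sum(frame, rows, cols):
    """Accumulate the signals of all pixels addressed by the selected row and
    column lines, DRAM-style: a pixel contributes when both its row select
    and its column select are active. 'frame' is a 2D list of per-pixel signals."""
    return sum(frame[r][c] for r in rows for c in cols)

# Example: a 3 x 3 'virtual pixel' placed over an illuminated laser spot.
frame = [[0.0] * 16 for _ in range(16)]
frame[7][7] = frame[7][8] = frame[8][7] = frame[8][8] = 1.0  # illuminated pixels
signal = virtual_pixel_sum(frame, rows=(6, 7, 8), cols=(6, 7, 8))  # -> 4.0
```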
  • the sensor may include several pixels including various types of photo diodes (in other words, various photo diode types).
  • various photo diode types may be monolithically integrated on the sensor 52 and may be accessed, controlled, or driven separately or the sensor pixel signals from pixels having the same or different photo diode types may be combined and analysed as one common signal.
  • photo diode types may be provided and individually controlled and read out, for example:
  • a photo diode of a pixel may be provided with an additional optical bandpass filter and/or polarization filter on pixel level connected upstream.
  • a plurality of pixels of the sensor 52 may be interconnected.
  • pixels having the same or different photo diode types may be interconnected, such as:
  • the interconnecting of pixels, and thus the interconnecting of photo diodes, may be provided based on the illumination conditions (in other words, lighting conditions) of both the camera and/or the LIDAR.
  • in good lighting conditions, a smaller number of sensor pixels of the plurality of sensor pixels may be selected and combined: fewer pixels are interconnected, which results in a lower light sensitivity but may achieve a higher resolution.
  • in bad lighting conditions, e.g. when driving at night, more pixels may be interconnected, which results in a higher light sensitivity but may suffer from a lower resolution.
  • the sensor controller may be configured to control the selection network (see below for further explanation) based on the level of illuminance of the LIDAR Sensor System such that the better the lighting conditions (visible and/or infrared spectral range) are, the fewer selected sensor pixels of the plurality of sensor pixels will be combined.
  • the interconnecting of the individual pixels, and thus of the individual photo diodes, to a “virtual sensor pixel” allows an accurate adaptation of the size of the sensor pixel to the demands of the entire system, such as the entire LIDAR Sensing System. This may occur e.g. in a scenario in which it is to be expected that the non-illuminated regions of the photo diodes provide a significant noise contribution to the wanted signal.
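  • an illustrative sketch of such an adaptation of the “virtual pixel” size to the lighting conditions; the lux thresholds and binning factors below are invented for the example and are not taken from the disclosure:

```python
def binning_factor(illuminance_lux: float) -> int:
    """Choose how many pixels to interconnect per 'virtual pixel' from the
    ambient light level: the better the lighting, the fewer pixels are
    combined (thresholds are illustrative, not from the disclosure)."""
    if illuminance_lux > 10_000.0:  # bright daylight
        return 1                    # full resolution, lowest sensitivity
    if illuminance_lux > 100.0:     # dusk / dawn
        return 4                    # 2 x 2 binning
    return 16                       # night driving: 4 x 4 binning, highest sensitivity
```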
  • a variable definition (selection) of the size of a “pixel” (“virtual pixel”) may be provided e.g. with avalanche photo diodes and/or silicon photomultipliers (SiPM), where the sensor 52 includes a large number of individual pixels including SPADs.
  • the laser beam has a beam profile of decreasing intensity with increasing distance from the center of the laser beam.
  • laser beam profiles can have different shapes, for example a Gaussian or a flat-top shape. It is also to be noted that for a LIDAR measurement function, infrared as well as visible laser diodes and correspondingly suitable sensor elements may be used.
  • the center may, as a result, be saturated.
  • the sensor pixels located in one or more rings further outside the sensor array may operate in the linear (non-saturated) mode due to the decreasing intensity and the signal intensity may be estimated.
  • the pixels of a ring may be interconnected to provide a plurality of pixel rings or pixel ring segments.
  • the pixel rings may further be interconnected in a temporally successive manner, e.g. in case only one sum signal output is available for the interconnected sensor pixels.
  • a plurality of sum signal outputs may be provided or implemented in the sensor array which may be coupled to different groups of sensor pixels.
  • the pixels may be grouped in an arbitrary manner dependent on the respective requirements.
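  • one way to picture the use of the unsaturated outer rings, as a hedged sketch: if the beam profile is assumed known, each outer ring is expected to carry a known fraction of the spot power, from which the total (and thus the saturated center) can be estimated; the fractions below are illustrative:

```python
def estimate_total_signal(ring_sums, ring_fractions):
    """Estimate the total (unsaturated) spot intensity from the outer,
    non-saturated pixel rings: under an assumed beam profile each ring
    receives a known fraction of the total power, so each unsaturated ring
    sum yields an independent estimate total = sum / fraction."""
    estimates = [s / f for s, f in zip(ring_sums, ring_fractions) if f > 0]
    return sum(estimates) / len(estimates)

# Example: two outer rings of a Gaussian-like spot assumed to carry 20 % and
# 10 % of the power; the saturated center ring is ignored.
total = estimate_total_signal(ring_sums=[180.0, 95.0], ring_fractions=[0.20, 0.10])
```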
  • the combination of different types of sensor pixels within one sensor 52 e.g. allows combining the functionality of a LIDAR sensor with the functionality of a camera in one common optics arrangement without the risk that a deviation will occur with respect to adjustment and calibration between the LIDAR and camera. This may reduce costs for a combined LIDAR/camera sensor and may further improve the data fusion of LIDAR data and camera data.
  • camera sensors may be sensitive in the visible and/or infrared spectral range (thermographic camera).
  • the sensor controller 53 may control the sensor pixels taking into consideration the integration time (read out time) required by the respective photo diode of a pixel.
  • the integration time may be dependent on the size of the photo diode.
  • the clocking to control the read out process e.g. provided by the sensor controller 53 , may be different for the different types of pixels and may change depending on the configuration of the pixel selection network.
  • FIG. 1 shows a portion 3800 of the sensor 52 in accordance with various embodiments. It is to be noted that the sensor 52 does not need to be a SiPM detector array.
  • the sensor 52 includes a plurality of pixels 3802 . Each pixel 3802 includes a photo diode.
  • a light (laser) spot 3804 impinging on the surface of the portion 3800 of the sensor 52 is symbolized in FIG. 1 by a circle 3806 .
  • the light (laser) spot 3804 covers a plurality of sensor pixels 3802 .
  • a selection network may be provided which may be configured to selectively combine some pixels 3802 of the plurality of pixels 3802 to form an enlarged sensor pixel.
  • the electrical signals provided by the photo diodes of the combined sensor pixels are accumulated.
  • a read-out circuit may be provided which may be configured to read-out the accumulated electrical signals from the combined sensor pixels as one common signal.
  • the selection network may be configured to apply a plurality of row select signals 3808, 3810, 3812 (the number of row select signals may be equal to the number of rows of the sensor 52) to select the sensor pixels 3802 of the respectively selected row.
  • the selection network may include a row multiplexer (not shown in FIG. 1 ).
  • the selection network may be configured to apply a plurality of column select signals 3814, 3816, 3818 (the number of column select signals may be equal to the number of columns of the sensor 52) to select the pixels of the respectively selected column.
  • the selection network may include a column multiplexer (not shown in FIG. 1 ).
  • FIG. 1 illustrates nine selected sensor pixels 3820 selected by the plurality of row select signals 3808, 3810, 3812 and the plurality of column select signals 3814, 3816, 3818.
  • the light (laser) spot 3804 fully covers the nine selected sensor pixels 3820 .
  • the sensor controller 53 may provide a supply voltage 3822 to the sensor 52 .
  • the sensor signals 3824 provided by the selected sensor pixels 3820 are read out from the sensor 52 and supplied to one or more amplifiers via the selection network. It is to be noted that a light (laser) spot 3804 does not need to fully cover a selected sensor pixel 3820.
  • addressing each sensor pixel 3802 of the sensor 52 in a manner comparable with the selection mechanism of memory cells in a Dynamic Random Access Memory (DRAM) allows a simple and thus cost-efficient sensor circuit architecture to quickly and reliably select one or more sensor pixels 3802 and to evaluate a plurality of sensor pixels at the same time. This may improve the reliability of the sensor signal evaluation of the second LIDAR sensor system 50.
  • FIG. 2 shows a portion 3900 of the sensor 52 in accordance with various embodiments in more detail.
  • the sensor 52 may include a plurality of row selection lines 3902 , each row selection line 3902 being coupled to an input of the selection network, e.g. to an input of a row multiplexer of the selection network.
  • the sensor 52 may further include a plurality of column selection lines 3904 , each column selection line 3904 being coupled to another input of the selection network, e.g. to an input of a column multiplexer of the selection network.
  • a respective column switch 3906 is coupled respectively to one of the column selection lines 3904 and is connected to couple the electrical supply voltage 3908 present on a supply voltage line 3910 to the sensor pixels 3802 coupled to the respective column selection line 3904 or to decouple the electrical supply voltage 3908 therefrom.
  • Each sensor pixel 3802 may be coupled to a column read out line 3912 , which is in turn coupled to a collection read out line 3914 via a respective column read out switch 3916 .
  • the column read out switches 3916 may be part of the column multiplexer.
  • the sum of the currents of the selected sensor pixels 3802, in other words the sensor signals 3824, may be provided on the collection read out line 3914.
  • Each sensor pixel 3802 may further be coupled downstream of an associated column selection line 3904 via a respective column pixel switch 3918 (in other words, a respective column pixel switch 3918 is connected between a respective associated column selection line 3904 and an associated sensor pixel 3802 ).
  • each sensor pixel 3802 may further be coupled upstream of an associated column read out line 3912 via a respective column pixel read out switch 3920 (in other words, a respective column pixel read out switch 3920 is connected between a respective associated column read out line 3912 and an associated sensor pixel 3802 ).
  • Each switch in the sensor 52 may be implemented by a transistor such as e.g. a field effect transistor (FET), e.g. a MOSFET.
  • a control input (e.g. the gate terminal of a MOSFET) of each column pixel switch 3918 and of each column pixel read out switch 3920 may be electrically conductively coupled to an associated one of the plurality of row selection lines 3902 .
  • the row multiplexer may “activate” the column pixel switches 3918 and the pixel read out switches 3920 via an associated row selection line 3902 .
  • the associated column switch 3906 finally activates the respective sensor pixel 3802 by applying the supply voltage 3908 e.g. to the source of the MOSFET and (since e.g. the associated column pixel switch 3918 is closed) the supply voltage 3908 is also applied to the respective sensor pixel 3802 .
  • a sensor signal detected by the “activated” selected sensor pixel 3802 can be forwarded to the associated column read out line 3912 (since e.g. the associated column pixel read out switch 3920 is also closed), and, if the associated column read out switch 3916 is also closed, the respective sensor signal is transmitted to the collection read out line 3914 and finally to an associated amplifier (such as an associated TIA).
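  • the activation logic described above reduces to a row-AND-column condition per pixel, sketched here for illustration only (the real circuit uses the MOSFET switches described above; names are assumptions):

```python
def pixel_active(row_selected: bool, column_supply_on: bool) -> bool:
    """A sensor pixel contributes to the read-out when its row selection line
    closes the column pixel (and read-out) switches AND its column switch
    applies the supply voltage -- i.e. row select AND column select, as in
    a DRAM access."""
    return row_selected and column_supply_on

# Only the pixels at the crossings of active rows and active columns
# contribute to the summed current on the collection read-out line.
active = [(r, c) for r in range(4) for c in range(4)
          if pixel_active(r in {1, 2}, c in {1, 2})]  # the four center pixels
```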
  • FIG. 4 shows a portion 4100 of the sensor 52 in accordance with various embodiments in more detail.
  • the column pixel read out switch 3920 may be dispensed with in a respective sensor pixel 3802 .
  • the embodiments shown in FIG. 4 may e.g. be applied to a SiPM as a sensor 52.
  • the pixels 3802 may in this case be implemented as SPADs 3802 .
  • the sensor 52 further includes a first summation output 4102 for fast sensor signals.
  • the first summation output 4102 may be coupled to the anode of each SPAD via a respective coupling capacitor 4104 .
  • the sensor 52 in this example further includes a second summation output 4106 for slow sensor signals.
  • the second summation output 4106 may be coupled to the anode of each SPAD via a respective coupling resistor (which in the case of an SPAD as the photo diode of the pixel may also be referred to as quenching resistor) 4108 .
  • FIG. 5 shows a recorded scene 4200 and the sensor pixels used to detect the scene in accordance with various embodiments in more detail.
  • the sensor 52 may have sensor pixels 3802 with photo diodes having different sensitivities.
  • an edge region 4204 may at least partially surround a center region 4202 .
  • the center region 4202 may be provided for a larger operating range of the LIDAR Sensor System and the edge region 4204 may be provided for a shorter operating range.
  • the center region 4202 may represent the main moving (driving, flying or swimming) direction of a vehicle and thus usually needs a far view to recognize an object at a far distance.
  • the edge region 4204 may represent the edge region of the scene and usually, in a scenario where a vehicle (e.g.
  • sensor pixels 3802 with photo diodes having a higher sensitivity may be provided in the center region 4202 .
  • the shorter operating range means that the target object 100 return signal has a rather high (strong) signal intensity.
  • sensor pixels 3802 with photo diodes having a lower sensitivity may be provided in the edge region 4204 .
  • the patterning of the sensor pixels may be configured for specific driving scenarios and vehicle types (bus, car, truck, construction vehicles, drones, and the like).
  • the sensor pixels 3802 of the edge regions 4204 may have a high sensitivity. It should also be stated that, if a vehicle uses a variety of LIDAR/Camera sensor systems, these may be configured differently, even when illuminating and detecting the same Field-of-View.
  • FIG. 6 shows a recorded scene 4300 and the sensor pixels 3802 used to detect the scene 4300 in accordance with various embodiments in more detail.
  • a row-wise arrangement of the sensor pixels of the same photo diode type may be provided.
  • a first row 4302 may include pixels having APDs for a Flash LIDAR Sensor System and a second row 4304 may include pixels having pin photo diodes for a camera.
  • the two respectively adjacent pixel rows may be provided repeatedly so that the rows of different pixels are provided, for example, in an alternating manner.
  • the sequence and number of pixel rows of the same photo diode type could vary, and likewise the grouping into specific selection networks.
  • a row or column of pixels may also employ different photo diode types.
  • a row or column need not be completely filled up with photo diodes. The vehicle's own motion may compensate for the reduced resolution of the sensor array (“push-broom scanning” principle).
  • the different rows may include various photo diode types, such as for example:
  • the sensor controller 53 may be configured to select the respective pixels 3802 in accordance with the desired photo diode type in a current application.
  • FIG. 7 shows a flow diagram illustrating a method 4400 for a LIDAR Sensor System in accordance with various embodiments in more detail.
  • the LIDAR Sensor System may include a plurality of sensor pixels. Each sensor pixel includes at least one photo diode.
  • the LIDAR Sensor System may further include a selection network, and a read-out circuit.
  • the method 4400 may include, in 4402 , the selection network selectively combining some sensor pixels of the plurality of sensor pixels to form an enlarged sensor pixel.
  • the electrical signals provided by the photo diodes of the combined sensor pixels are accumulated.
  • the method 4400 may further include, in 4404 , the read-out circuit reading-out the accumulated electrical signals from the combined sensor pixels as one common signal.
  • FIG. 8 shows a flow diagram illustrating another method 4500 for a LIDAR Sensor System in accordance with various embodiments in more detail.
  • the LIDAR Sensor System may include a plurality of pixels.
  • a first pixel of the plurality of pixels includes a photo diode of a first photo diode type
  • a second pixel of the plurality of pixels includes a photo diode of a second photo diode type.
  • the second photo diode type is different from the first photo diode type.
  • the LIDAR Sensor System may further include a pixel sensor selector and a sensor controller.
  • the method 4500 may include, in 4502 , the pixel sensor selector selecting at least one of the first pixel including a photo diode of the first photo diode type and/or at least one of the second pixel including a photo diode of the second photo diode type, and, in 4504 , the sensor controller controlling the pixel selector to select at least one first pixel and/or at least one second pixel.
  • the light (laser) emission (e.g. provided by a plurality of light (laser) sources, which may be operated in a group-wise manner) may be adapted in its light intensity pattern to the pixel distribution or arrangement of the sensor 52, e.g. it may be adapted such that larger pixels are exposed to light having a higher intensity than smaller pixels. This may be provided in an analogous manner with respect to photo diodes having a higher and lower sensitivity, respectively.
  • a first sensor pixel may include a photo diode of a first photo diode type and a second pixel of the plurality of pixels may include a photo diode of a second photo diode type.
  • the second photo diode type is different from the first photo diode type.
  • both photo diodes may be stacked one above the other in the way generally described in the embodiments described with reference to FIG. 9 to FIG. 16.
  • Example 1d is a LIDAR Sensor System.
  • the LIDAR Sensor System includes a plurality of sensor pixels, each sensor pixel including at least one photo diode.
  • the LIDAR Sensor System further includes a selection network configured to selectively combine some sensor pixels of the plurality of sensor pixels to form an enlarged sensor pixel.
  • the electrical signals provided by the photo diodes of the combined sensor pixels are accumulated.
  • the LIDAR Sensor System further includes a read-out circuit configured to read-out the accumulated electrical signals from the combined sensor pixels as one common signal.
  • In Example 2d, the subject matter of Example 1d can optionally include that the at least one photo diode includes at least one pin diode.
  • In Example 3d, the subject matter of Example 1d can optionally include that the at least one photo diode includes at least one avalanche photo diode.
  • In Example 4d, the subject matter of Example 3d can optionally include that the at least one avalanche photo diode includes at least one single-photon avalanche photo diode.
  • In Example 5d, the subject matter of any one of Examples 1d to 4d can optionally include that the plurality of sensor pixels are arranged in a sensor matrix in rows and columns.
  • Example 6d the subject matter of any one of Examples 1d to 5d can optionally include that the selection network includes a plurality of row selection lines, each row selection line being electrically conductively coupled to at least some sensor pixels of the same row, a plurality of column selection lines, each column selection line being electrically conductively coupled to at least some sensor pixels of the same column, and a plurality of read-out lines, each read-out line being electrically conductively coupled to at least some sensor pixels of the same column or the same row to accumulate the electrical signals provided by the combined sensor pixels.
  • In Example 7d, the subject matter of any one of Examples 1d to 6d can optionally include that each sensor pixel of at least some of the sensor pixels includes a first switch connected between the selection network and a first terminal of the sensor pixel, and/or a second switch connected between a second terminal of the sensor pixel and the selection network.
  • In Example 8d, the subject matter of Examples 6d and 7d can optionally include that the first switch is connected between a column selection line of the plurality of column selection lines and the first terminal of the sensor pixel, wherein a control terminal of the first switch is coupled to a row selection line of the plurality of row selection lines.
  • the second switch is connected between the second terminal of the sensor pixel and a read-out line of the plurality of read-out lines.
  • a control terminal of the second switch is coupled to a row selection line of the plurality of row selection lines.
  • In Example 9d, the subject matter of any one of Examples 7d or 8d can optionally include that at least one first switch and/or at least one second switch includes a field effect transistor.
  • In Example 10d, the subject matter of any one of Examples 1d to 9d can optionally include that the LIDAR Sensor System further includes a sensor controller configured to control the selection network to selectively combine some sensor pixels of the plurality of sensor pixels to form the enlarged sensor pixel.
  • In Example 11d, the subject matter of Example 10d can optionally include that the sensor controller is configured to control the selection network based on the level of illuminance of the LIDAR Sensor System such that with improving lighting conditions a smaller number of sensor pixels of the plurality of sensor pixels will be selected and combined.
  • In Example 12d, the subject matter of any one of Examples 1d to 11d can optionally include that the LIDAR Sensor System further includes a plurality of read-out amplifiers, each read-out amplifier coupled to an associated read-out line of the plurality of read-out lines.
  • In Example 13d, the subject matter of Example 12d can optionally include that the common signal is an electrical current.
  • the plurality of read-out amplifiers includes a plurality of transimpedance amplifiers, each transimpedance amplifier configured to convert the associated electrical current into an electrical voltage.
  • Example 14d is a LIDAR Sensor System.
  • the LIDAR Sensor System may include a plurality of pixels.
  • a first pixel of the plurality of pixels includes a photo diode of a first photo diode type
  • a second pixel of the plurality of pixels includes a photo diode of a second photo diode type.
  • the second photo diode type is different from the first photo diode type.
  • the LIDAR Sensor System may further include a pixel selector configured to select at least one of the first pixel including a photo diode of the first photo diode type and/or at least one of the second pixel including the photo diode of the second photo diode type, and a sensor controller configured to control the pixel selector to select at least one first pixel and/or at least one second pixel.
  • In Example 15d, the subject matter of Example 14d can optionally include that the sensor controller and the pixels are configured to individually read out the photo diode of the first photo diode type and the photo diode of the second photo diode type.
  • In Example 16d, the subject matter of any one of Examples 14d or 15d can optionally include that the sensor controller and the pixels are configured to read out the photo diode of the first photo diode type and the photo diode of the second photo diode type as one combined signal.
  • In Example 17d, the subject matter of any one of Examples 14d to 16d can optionally include that the photo diode of a first photo diode type and/or the photo diode of a second photo diode type are/is selected from a group consisting of: a pin photo diode; an avalanche photo diode; or a single-photon photo diode.
  • In Example 18d, the subject matter of any one of Examples 14d to 17d can optionally include that the LIDAR Sensor System further includes a selection network configured to selectively combine some pixels of the plurality of pixels to form an enlarged pixel, wherein the electrical signals provided by the photo diodes of the combined pixels are accumulated, and a read-out circuit configured to read out the accumulated electrical signals from the combined pixels as one common signal.
  • In Example 19d, the subject matter of any one of Examples 14d to 18d can optionally include that the plurality of pixels are arranged in a sensor matrix in rows and columns.
  • In Example 20d, the subject matter of any one of Examples 14d to 19d can optionally include that the selection network includes a plurality of row selection lines, each row selection line being electrically conductively coupled to at least some pixels of the same row, a plurality of column selection lines, each column selection line being electrically conductively coupled to at least some pixels of the same column, and a plurality of read-out lines, each read-out line being electrically conductively coupled to at least some pixels of the same column or the same row to accumulate the electrical signals provided by the combined pixels.
  • In Example 21d, the subject matter of any one of Examples 14d to 20d can optionally include that each pixel of at least some of the pixels includes a first switch connected between the selection network and a first terminal of the pixel, and/or a second switch connected between a second terminal of the pixel and the selection network.
  • In Example 22d, the subject matter of Examples 20d and 21d can optionally include that the first switch is connected between a column selection line of the plurality of column selection lines and the first terminal of the pixel. A control terminal of the first switch is coupled to a row selection line of the plurality of row selection lines. The second switch is connected between the second terminal of the pixel and a read-out line of the plurality of read-out lines. A control terminal of the second switch is coupled to a row selection line of the plurality of row selection lines.
  • In Example 23d, the subject matter of any one of Examples 21d or 22d can optionally include that at least one first switch and/or at least one second switch comprises a field effect transistor.
  • In Example 24d, the subject matter of any one of Examples 14d to 23d can optionally include that the sensor controller is further configured to control the selection network to selectively combine some pixels of the plurality of pixels to form the enlarged pixel.
  • In Example 25d, the subject matter of Example 22d can optionally include that the sensor controller is configured to control the selection network based on the level of illuminance of the LIDAR Sensor System such that with improving lighting conditions a smaller number of sensor pixels of the plurality of sensor pixels will be selected and combined.
  • In Example 26d, the subject matter of any one of Examples 14d to 25d can optionally include that the LIDAR Sensor System further includes a plurality of read-out amplifiers, each read-out amplifier coupled to an associated read-out line of the plurality of read-out lines.
  • In Example 27d, the subject matter of Example 26d can optionally include that the common signal is an electrical current.
  • the plurality of read-out amplifiers includes a plurality of transimpedance amplifiers, each transimpedance amplifier configured to convert the associated electrical current into an electrical voltage.
  • Example 28d is a method for a LIDAR Sensor System.
  • the LIDAR Sensor System may include a plurality of sensor pixels. Each sensor pixel includes at least one photo diode.
  • the LIDAR Sensor System may further include a selection network, and a read-out circuit.
  • the method may include the selection network selectively combining some sensor pixels of the plurality of sensor pixels to form an enlarged sensor pixel, wherein the electrical signals provided by the photo diodes of the combined sensor pixels are accumulated, and the read-out circuit reading-out the accumulated electrical signals from the combined sensor pixels as one common signal.
  • In Example 29d, the subject matter of Example 28d can optionally include that the at least one photo diode includes at least one pin diode.
  • In Example 30d, the subject matter of Example 28d can optionally include that the at least one photo diode includes at least one avalanche photo diode.
  • In Example 31d, the subject matter of Example 30d can optionally include that the at least one avalanche photo diode includes at least one single-photon avalanche photo diode.
  • In Example 32d, the subject matter of any one of Examples 28d to 31d can optionally include that the plurality of sensor pixels are arranged in a sensor matrix in rows and columns.
  • In Example 33d, the subject matter of any one of Examples 28d to 32d can optionally include that the selection network includes a plurality of row selection lines, each row selection line being electrically conductively coupled to at least some sensor pixels of the same row, a plurality of column selection lines, each column selection line being electrically conductively coupled to at least some sensor pixels of the same column, and a plurality of read-out lines, each read-out line being electrically conductively coupled to at least some sensor pixels of the same column or the same row to accumulate the electrical signals provided by the combined sensor pixels.
  • In Example 34d, the subject matter of any one of Examples 28d to 33d can optionally include that each sensor pixel of at least some of the sensor pixels includes a first switch connected between the selection network and a first terminal of the sensor pixel, and/or a second switch connected between a second terminal of the sensor pixel and the selection network.
  • In Example 35d, the subject matter of Example 33d and Example 34d can optionally include that the first switch is connected between a column selection line of the plurality of column selection lines and the first terminal of the sensor pixel. A control terminal of the first switch is controlled via a row selection line of the plurality of row selection lines.
  • the second switch is connected between the second terminal of the sensor pixel and a read-out line of the plurality of read-out lines. A control terminal of the second switch is controlled via a row selection line of the plurality of row selection lines.
  • In Example 36d, the subject matter of any one of Examples 34d or 35d can optionally include that at least one first switch and/or at least one second switch comprises a field effect transistor.
  • In Example 37d, the subject matter of any one of Examples 28d to 36d can optionally include that the method further includes a sensor controller controlling the selection network to selectively combine some sensor pixels of the plurality of sensor pixels to form the enlarged sensor pixel.
  • In Example 38d, the subject matter of Example 37d can optionally include that the sensor controller controls the selection network based on the level of illuminance of the LIDAR Sensor System such that with improving lighting conditions a smaller number of sensor pixels of the plurality of sensor pixels will be selected and combined.
  • In Example 39d, the subject matter of any one of Examples 28d to 38d can optionally include that the LIDAR Sensor System further includes a plurality of read-out amplifiers, each read-out amplifier coupled to an associated read-out line of the plurality of read-out lines.
  • In Example 40d, the subject matter of Example 39d can optionally include that the common signal is an electrical current.
  • the plurality of read-out amplifiers includes a plurality of transimpedance amplifiers. Each transimpedance amplifier converts the associated electrical current into an electrical voltage.
  • Example 41d is a method for a LIDAR Sensor System.
  • the LIDAR Sensor System may include a plurality of pixels. A first pixel of the plurality of pixels includes a photo diode of a first photo diode type, and a second pixel of the plurality of pixels includes a photo diode of a second photo diode type. The second photo diode type is different from the first photo diode type.
  • the LIDAR Sensor System may further include a pixel sensor selector and a sensor controller.
  • the method may include the pixel sensor selector selecting at least one of the first pixel including a photo diode of the first photo diode type and/or at least one of the second pixel including the photo diode of the second photo diode type, and the sensor controller controlling the pixel selector to select at least one first pixel and/or at least one second pixel.
  • In Example 42d, the subject matter of Example 41d can optionally include that the photo diode of a first photo diode type and/or the photo diode of a second photo diode type are/is selected from a group consisting of: a pin photo diode, an avalanche photo diode, and/or a single-photon photo diode.
  • In Example 43d, the subject matter of any one of Examples 41d or 42d can optionally include that the method further includes a selection network selectively combining some pixels of the plurality of pixels to form an enlarged pixel, wherein the electrical signals provided by the photo diodes of the combined pixels are accumulated, and a read-out circuit reading out the accumulated electrical signals from the combined pixels as one common signal.
  • In Example 44d, the subject matter of any one of Examples 41d to 43d can optionally include that the plurality of pixels are arranged in a sensor matrix in rows and columns.
  • In Example 45d, the subject matter of any one of Examples 41d to 44d can optionally include that the selection network includes a plurality of row selection lines, each row selection line being electrically conductively coupled to at least some pixels of the same row, a plurality of column selection lines, each column selection line being electrically conductively coupled to at least some pixels of the same column, and a plurality of read-out lines, each read-out line being electrically conductively coupled to at least some pixels of the same column or the same row to accumulate the electrical signals provided by the combined pixels.
  • In Example 46d, the subject matter of any one of Examples 41d to 45d can optionally include that each pixel of at least some of the pixels includes a first switch connected between the selection network and a first terminal of the pixel, and/or a second switch connected between a second terminal of the pixel and the selection network.
  • In Example 47d, the subject matter of Example 45d and Example 46d can optionally include that the first switch is connected between a column selection line of the plurality of column selection lines and the first terminal of the pixel.
  • a control terminal of the first switch is controlled via a row selection line of the plurality of row selection lines
  • the second switch is connected between the second terminal of the pixel and a read-out line of the plurality of read-out lines.
  • a control terminal of the second switch is controlled via a row selection line of the plurality of row selection lines.
  • In Example 48d, the subject matter of any one of Examples 46d or 47d can optionally include that at least one first switch and/or at least one second switch includes a field effect transistor.
  • In Example 49d, the subject matter of any one of Examples 41d to 48d can optionally include that the sensor controller controls the selection network to selectively combine some pixels of the plurality of pixels to form the enlarged pixel.
  • In Example 50d, the subject matter of Example 49d can optionally include that the sensor controller controls the selection network based on the level of illuminance of the LIDAR Sensor System such that with improving lighting conditions a smaller number of sensor pixels of the plurality of sensor pixels will be selected and combined.
  • In Example 51d, the subject matter of any one of Examples 41d to 50d can optionally include that the LIDAR Sensor System further includes a plurality of read-out amplifiers, each read-out amplifier coupled to an associated read-out line of the plurality of read-out lines.
  • In Example 52d, the subject matter of Example 51d can optionally include that the common signal is an electrical current.
  • the plurality of read-out amplifiers includes a plurality of transimpedance amplifiers, each transimpedance amplifier converting the associated electrical current into an electrical voltage.
  • Example 53d is a computer program product.
  • the computer program product may include a plurality of program instructions that may be embodied in a non-transitory computer-readable medium, which, when executed by a computer program device of a LIDAR Sensor System according to any one of Examples 1d to 27d, cause the LIDAR Sensor System to execute the method according to any one of Examples 28d to 52d.
  • Example 54d is a data storage device with a computer program that may be embodied in a non-transitory computer-readable medium, adapted to execute at least one of a method for a LIDAR Sensor System according to any one of the above method examples, or a LIDAR Sensor System according to any one of the above LIDAR Sensor System examples.
  • the LIDAR Sensor System may be combined with a LIDAR Sensor Device for illumination of an environmental space connected to a light control unit.
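  • As an illustration of the adaptive pixel binning of Examples 49d and 50d, the following minimal sketch chooses how many sensor pixels to combine as a function of illuminance. It is not the claimed circuit; the thresholds, the group sizes, and the helper names are hypothetical values chosen for demonstration.

```python
# Minimal sketch: choose how many sensor pixels to combine into one
# enlarged pixel as a function of ambient illuminance (Example 50d).
# Thresholds and groupings are illustrative assumptions.

def binning_factor(illuminance_lux: float) -> int:
    """Return the edge length of the square pixel group to combine.

    Better lighting -> smaller groups (higher resolution);
    poorer lighting -> larger groups (more collected signal).
    """
    if illuminance_lux >= 10_000:   # bright daylight
        return 1                    # no binning, full resolution
    if illuminance_lux >= 100:      # dusk / street lighting
        return 2                    # combine 2x2 pixels
    return 4                        # night: combine 4x4 pixels


def combined_signal(frame, row, col, k):
    """Accumulate the electrical signals of a k x k pixel group,
    mimicking the accumulation on the read-out lines."""
    return sum(frame[r][c]
               for r in range(row, row + k)
               for c in range(col, col + k))


# Usage: a 4x4 frame of per-pixel photo currents (arbitrary units).
frame = [[1, 2, 1, 0],
         [0, 3, 2, 1],
         [1, 1, 0, 2],
         [2, 0, 1, 1]]
k = binning_factor(illuminance_lux=50.0)    # poor light -> k == 4
print(combined_signal(frame, 0, 0, k))      # accumulated group signal: 18
```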
  • a combination of a LIDAR sensor and a camera sensor may be desired e.g. in order to identify an object or characteristics of an object by means of data fusion. Furthermore, depending on the situation, either a three dimensional measurement by means of a LIDAR sensor or a two dimensional mapping by means of a camera sensor may be desired.
  • a LIDAR sensor alone usually cannot determine whether taillights of a vehicle are switched on or switched off.
  • deviations of the relative orientation of the optical axes of the sensors should also be taken into consideration, since they have an effect on the calibration state. This also reflects the fact that the fields of view of the sensors do not necessarily coincide with each other and that regions may exist in close proximity to the sensors in which an object cannot be detected by all of the sensors simultaneously.
  • Various aspects of this disclosure may provide a LIDAR functionality at two different wavelengths or the combination of a LIDAR function and a camera function in a visible wavelength region or the combination of a LIDAR function and a camera function in a wavelength region of the thermal infrared as will be described in more detail below.
  • a combination of a LIDAR function with a camera function is usually implemented by means of two separate sensor systems and the relative position of the sensor systems to each other is taken into consideration in the image processing.
  • for a (movie or video) camera, one approach is to use three individual image sensors instead of a CCD/CMOS image sensor array with color filters (Bayer pattern).
  • the incoming light may be distributed over the three image sensors by means of an optics arrangement having full faced color filters (e.g. a trichroic beam splitter prism).
  • the physical principle of the wavelength-dependent depth of penetration of light into a carrier such as a semiconductor (e.g. silicon) substrate is used for the integration of a LIDAR sensor and a camera sensor in accordance with various embodiments.
  • two or more different types of photo diodes may be stacked above one another, i.e. one type of photo diode is placed over another type of photo diode.
  • This may be implemented e.g. by a monolithic integration of the different types of photo diodes in one common process of manufacturing (or other types of integration processes such as wafer bonding or other three-dimensional processes).
  • a pin photo diode for the detection of visible light (e.g. in the red spectral region for the detection of car taillights) may be arranged near the surface of the carrier (e.g. substrate), while in a deeper region of the carrier an avalanche photo diode may be implemented, which may be configured to detect light emitted by a laser emitter and having a wavelength in the near infrared region (NIR).
  • the red light may in this case be detected near the surface of the pin photo diode due to its smaller depth of penetration.
  • Substantially fewer portions of the light of the visible spectrum (VIS) may penetrate into the deeper region (e.g. deeper layers) in this case, so that the avalanche photo diode which is implemented there is primarily sensitive to NIR light.
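  • The wavelength-dependent depth of penetration can be made concrete with the Beer–Lambert law, I(z) = I0·exp(−α(λ)·z). The following minimal sketch uses rough textbook absorption values for crystalline silicon; these figures are assumptions for illustration, not part of this disclosure.

```python
import math

# Rough order-of-magnitude absorption coefficients for crystalline
# silicon (assumed textbook values, not figures from this disclosure).
ALPHA_PER_UM = {
    "red_650nm": 1.0 / 3.0,    # penetration depth ~3 um
    "nir_905nm": 1.0 / 30.0,   # penetration depth ~30 um
}

def remaining_fraction(band: str, depth_um: float) -> float:
    """Fraction of the light still present at a given depth (Beer-Lambert)."""
    return math.exp(-ALPHA_PER_UM[band] * depth_um)

# At ~3 um depth, roughly where a thin surface-near pin photo diode ends:
print(round(remaining_fraction("red_650nm", 3.0), 2))   # ~0.37: red largely absorbed above
print(round(remaining_fraction("nir_905nm", 3.0), 2))   # ~0.9: NIR mostly passes deeper
```

  This is why the pin photo diode near the surface sees mostly visible light, while the buried avalanche photo diode remains primarily sensitive to NIR light.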
  • the stacking of the photo diodes one above the other may be useful in various respects, as illustrated by the embodiments described below.
  • FIG. 9 shows schematically in a cross sectional view an optical component 5100 for a LIDAR Sensor System in accordance with various embodiments.
  • the optical component 5100 may include a carrier, which may include a substrate, e.g. including a semiconductor material and/or a semiconductor compound material.
  • materials that may be used for the carrier and/or the semiconductor structure include one or more of the following materials: GaAs, AlGaInP, GaP, AlP, AlGaAs, GaAsP, GaInN, GaN, Si, SiGe, Ge, HgCdTe, InSb, InAs, GaInSb, GaSb, CdSe, HgSe, AlSb, CdS, ZnS, ZnSb, ZnTe.
  • the substrate may optionally include a device layer 5102 .
  • One or more electronic devices 5104 such as (field effect) transistors 5104 or other electronic devices (resistors, capacitors, inductors, and the like) 5104 may be completely or partially formed in the device layer 5102 .
  • the one or more electronic devices 5104 may be configured to process signals generated by the first photo diode 5110 and the second photo diode 5120 , which will be described in more detail below.
  • the substrate may optionally include a bottom interconnect layer 5106 .
  • the interconnect layer 5106 may be configured as a separate layer, e.g. as a separate layer arranged above the device layer 5102 (as shown in FIG. 9 ).
  • the carrier may have a thickness in the range from about 100 μm to about 3000 μm.
  • One or more electronic contacts 5108 configured to contact the electronic devices 5104 or an anode or a cathode of a first photo diode 5110 , in other words a first portion of the first photo diode 5110 (which will be described in more detail below), may be connected to an electronic contact 5108 of the bottom interconnect layer 5106 .
  • one or more contact vias 5112 may be formed in the bottom interconnect layer 5106 .
  • the one or more contact vias 5112 extend through the entire layer structure implementing the first photo diode 5110 into an intermediate interconnect/device layer 5114 .
  • the one or more electronic contacts 5108 as well as the one or more contact vias 5112 may be made of electrically conductive material such as a metal (e.g. Cu or Al) or any other suitable electrically conductive material.
  • the one or more electronic contacts 5108 and the one or more contact vias 5112 may form an electrically conductive connection network in the bottom interconnect layer 5106 .
  • the first photo diode 5110 may be an avalanche type photo diode such as an avalanche photo diode (APD) or a single-photon photo diode (SPAD).
  • the first photo diode 5110 may be operated in the linear mode/in the Geiger mode.
  • the first photo diode 5110 implements a LIDAR sensor pixel in a first semiconductor structure over the carrier.
  • the first photo diode 5110 is configured to absorb received light in a first wavelength region.
  • the first photo diode 5110 and thus the first semiconductor structure may have a layer thickness in the range from about 500 nm to about 50 μm.
  • One or more further electronic devices 5116 such as (field effect) transistors 5116 or other further electronic devices (resistors, capacitors, inductors, and the like) 5116 may be completely or partially formed in the intermediate interconnect/device layer 5114 .
  • One or more further electronic contacts 5118 configured to contact the further electronic devices 5116 or an anode or a cathode of the first photo diode 5110 , in other words a second portion of the first photo diode 5110 , may be connected to a further electronic contact 5118 of the intermediate interconnect/device layer 5114 .
  • the one or more further electronic contacts 5118 and the one or more contact vias 5112 may form an electrically conductive connection network (electrically conductive structure configured to electrically contact the first photo diode 5110 and the second photo diode 5120 ) in the intermediate interconnect/device layer 5114 .
  • the intermediate interconnect/device layer 5114 (which may also be referred to as interconnect layer 5114 ) is arranged between the first semiconductor structure and the second semiconductor structure.
  • One or more further electronic contacts 5118 and/or one or more contact vias 5112 may be configured to contact the further electronic devices 5116 or an anode or a cathode of a second photo diode 5120 , in other words a first portion of the second photo diode 5120 (which will be described in more detail below) may be connected to a further electronic contact 5118 of the intermediate interconnect/device layer 5114 .
  • the second photo diode 5120 may be arranged over (e.g. in direct physical contact with) the intermediate interconnect/device layer 5114 .
  • the second photo diode 5120 might be a pin photo diode (e.g. configured to receive light of the visible spectrum).
  • the second photo diode 5120 implements a camera sensor pixel in a second semiconductor structure over the intermediate interconnect/device layer 5114 and thus also over the first semiconductor structure.
  • the second photo diode 5120 is vertically stacked over the first photo diode.
  • the second photo diode 5120 is configured to absorb received light in a second wavelength region.
  • the received light of the second wavelength region has a shorter wavelength than the predominantly received light of the first wavelength region.
  • FIGS. 10A and 10B show schematically in a cross sectional view an optical component 5200 for a LIDAR Sensor System ( FIG. 10A ) and a corresponding wavelength/transmission diagram 5250 ( FIG. 10B ) in accordance with various embodiments.
  • the optical component 5200 of FIG. 10A is substantially similar to the optical component 5100 of FIG. 9 as described above. Therefore, only the main differences of the optical component 5200 of FIG. 10A with respect to the optical component 5100 of FIG. 9 will be described in more detail below.
  • the optical component 5200 of FIG. 10A may further optionally include one or more microlenses 5202 , which may be arranged over the second photo diode 5120 (e.g. directly above, in other words in physical contact with the second photo diode 5120 ).
  • the one or more microlenses 5202 may be embedded in or at least partially surrounded by a suitable filler material 5204 such as silicone.
  • the one or more microlenses 5202 together with the filler material 5204 may form a layer structure having a layer thickness in the range from about 1 μm to about 500 μm.
  • a filter layer 5206 which may be configured to implement a bandpass filter, may be arranged over the optional one or more microlenses 5202 or the second photo diode 5120 (e.g. directly above, in other words in physical contact with the optional filler material 5204 or with the second photo diode 5120 ).
  • the filter layer 5206 may have a layer thickness in the range from about 1 μm to about 500 μm.
  • the light may include various wavelengths, such as e.g. a first wavelength range λ1 (e.g. in the ultra-violet spectral region), a second wavelength range λ2 (e.g. in the visible spectral region), and a third wavelength range λ3 (e.g. in the near-infrared spectral region).
  • the wavelength/transmission diagram 5250 as shown in FIG. 10B illustrates the wavelength-dependent transmission characteristic of the filter layer 5206 .
  • the filter layer 5206 has a bandpass filter characteristic.
  • the filter layer 5206 has a low, ideally negligible transmission for light having the first wavelength range λ1 .
  • the filter layer 5206 may completely block the light portions having the first wavelength range λ1 impinging on the upper (exposed) surface 5208 of the filter layer 5206 .
  • the transmission characteristic 5252 shows that the filter layer 5206 is substantially fully transparent (transmission factor close to “1”) for light having the second wavelength range λ2 and for light having the third wavelength range λ3 .
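  • The transmission characteristic 5252 can be summarized by a tiny model: negligible transmission for the first wavelength range, nearly full transmission for the second and third ranges. The band edges and the transmission factor in the following sketch are hypothetical placeholders, not values from this disclosure.

```python
# Idealized model of the bandpass characteristic 5252: block the
# ultra-violet range (lambda_1), transmit the visible range (lambda_2)
# and the near-infrared range (lambda_3). Band edges and transmission
# factor are assumed placeholder values.
PASS_BANDS_NM = [(380.0, 780.0), (880.0, 930.0)]   # VIS plus a NIR laser line

def transmission(wavelength_nm: float) -> float:
    """Transmission factor: close to 1 inside a pass band, ~0 outside."""
    inside = any(lo <= wavelength_nm <= hi for lo, hi in PASS_BANDS_NM)
    return 0.98 if inside else 0.0

print(transmission(350.0))   # lambda_1 (UV): blocked -> 0.0
print(transmission(650.0))   # lambda_2 (VIS): transmitted -> 0.98
print(transmission(905.0))   # lambda_3 (NIR): transmitted -> 0.98
```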
  • the second photo diode 5120 may include or be a pin photo diode (configured to detect light of the visible spectrum) and the first photo diode 5110 may include or be an avalanche photo diode (in the linear mode/in the Geiger mode) (configured to detect light of the near infrared (NIR) spectrum or in the infrared (IR) spectrum).
  • FIGS. 11A and 11B show schematically in a cross sectional view an optical component 5300 for a LIDAR Sensor System ( FIG. 11A ) and a corresponding wavelength/transmission diagram 5250 ( FIG. 11B ) in accordance with various embodiments.
  • the optical component 5300 of FIG. 11A is substantially similar to the optical component 5200 of FIG. 10A as described above. Therefore, only the main differences of the optical component 5300 of FIG. 11A from the optical component 5200 of FIG. 10A will be described in more detail below.
  • the optical component 5300 of FIG. 11A may further optionally include a mirror structure (e.g. a Bragg mirror structure).
  • the second photo diode 5120 may be arranged (in other words sandwiched) between the two mirrors (e.g. two Bragg mirrors) 5302 , 5304 of the mirror structure.
  • the optical component 5300 of FIG. 11A may further optionally include a bottom mirror (e.g. a bottom Bragg mirror) 5302 .
  • the bottom mirror (e.g. the bottom Bragg mirror) 5302 may be arranged over (e.g. in direct physical contact with) the intermediate interconnect/device layer 5114 .
  • the second photo diode 5120 may be arranged over (e.g. in direct physical contact with) the bottom mirror 5302 , and a top mirror (e.g. a top Bragg mirror) 5304 may be arranged over (e.g. in direct physical contact with) the second photo diode 5120 .
  • the optional one or more microlenses 5202 or the filter layer 5206 may be arranged over (e.g. in direct physical contact with) the top mirror 5304 .
  • the second photo diode 5120 may include or be a pin photo diode (configured to detect light of the visible spectrum) and the first photo diode 5110 may include or be an avalanche photo diode (in the linear mode/in the Geiger mode) (configured to detect light of the near infrared (NIR) spectrum or in the infrared (IR) spectrum).
  • FIG. 12 shows schematically a cross sectional view 5400 of a sensor 52 for a LIDAR Sensor System in accordance with various embodiments.
  • the sensor 52 may include a plurality of optical components (e.g. a plurality of optical components 5100 as shown in FIG. 9 ) in accordance with any one of the embodiments as described above or as will be described further below.
  • the optical components may be arranged in an array, e.g. in a matrix arrangement, e.g. in rows and columns. In various embodiments, more than 10, more than 100, more than 1000, more than 10000, or even more optical components may be provided.
  • FIG. 13 shows a top view 5500 of the sensor 52 of FIG. 12 for a LIDAR Sensor System in accordance with various embodiments.
  • the top view 5500 illustrates a plurality of color filter portions (each color filter may be implemented as a filter layer 5206 ).
  • the different color filter portions may be configured to transmit (transfer) light of different wavelengths in the visible spectrum (to be detected by the second photo diode 5120 ) and light of one or more wavelengths to be absorbed or detected by the first photo diode 5110 for LIDAR detection.
  • a red pixel filter portion 5502 may be configured to transmit light having a wavelength to represent red color (to be detected by the second photo diode 5120 ) and light of one or more wavelengths to be absorbed or detected by the first photo diode 5110 for LIDAR detection and to block light outside these wavelength regions.
  • a green pixel filter portion 5504 may be configured to transmit light having a wavelength to represent green color (to be detected by the second photo diode 5120 ) and light of one or more wavelengths to be absorbed or detected by the first photo diode 5110 for LIDAR detection and to block light outside these wavelength regions.
  • a blue pixel filter portion 5506 may be configured to transmit light having a wavelength to represent blue color (to be detected by the second photo diode 5120 ) and light of one or more wavelengths to be absorbed or detected by the first photo diode 5110 for LIDAR detection and to block light outside these wavelength regions.
  • the color filter portions 5502 , 5504 , 5506 may each have a lateral size corresponding to a sensor pixel, in this case a size similar to the lateral sizes of the second photo diodes 5120 .
  • the first photo diodes 5110 may have the same lateral size as the second photo diodes 5120 .
  • the color filter portions 5502 , 5504 , 5506 may be arranged in accordance with a Bayer pattern.
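  • For illustration, a minimal sketch of how the color filter portions 5502 , 5504 , 5506 might be laid out; the RGGB tiling below is the most common Bayer variant and is an assumption here, since the exact tiling is not specified.

```python
# Map a sensor pixel position to its color filter portion, assuming the
# standard RGGB Bayer tiling (an illustrative assumption).
BAYER_2X2 = [["green", "red"],
             ["blue", "green"]]

def filter_color(row: int, col: int) -> str:
    return BAYER_2X2[row % 2][col % 2]

# A 4x4 corner of the sensor: each 2x2 cell contains one red, one blue
# and two green portions, while the stacked first photo diode below
# still receives the NIR LIDAR wavelength through every portion.
for r in range(4):
    print([filter_color(r, c) for c in range(4)])
```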
  • FIG. 14 shows a top view 5600 of a sensor 52 for a LIDAR Sensor System in accordance with various embodiments.
  • the sensor of FIG. 14 is substantially similar to the sensor of FIG. 13 as described above. Therefore, only the main difference of the sensor of FIG. 14 from the sensor of FIG. 13 will be described in more detail below.
  • the color filter portions 5502 , 5504 , 5506 may each have a lateral size corresponding to a sensor pixel, in this case a size similar to the lateral size of the second photo diodes 5120 .
  • the first photo diodes 5110 may have a larger lateral size than the second photo diodes 5120 .
  • the surface area of the first photo diodes 5110 may be larger than the surface area of the second photo diodes 5120 .
  • the surface area of the first photo diodes 5110 may be larger than the surface area of the second photo diodes 5120 by a factor of two, or by a factor of four, or by a factor of eight, or by a factor of sixteen.
  • the larger size of the first photo diodes 5110 is symbolized by rectangles 5602 in FIG. 14 .
  • the color filter portions 5502 , 5504 , 5506 may also be arranged in accordance with a Bayer pattern. In these examples, the resolution of the first photo diodes 5110 may not be of high importance, but the sensitivity of the first photo diodes 5110 may be important.
  • FIG. 15 shows a top view 5700 of a sensor 52 for a LIDAR Sensor System in accordance with various embodiments.
  • the sensor of FIG. 15 is substantially similar to the sensor of FIG. 13 as described above. Therefore, only the main difference of the sensor of FIG. 15 from the sensor of FIG. 13 will be described in more detail below.
  • the top view 5700 illustrates a plurality of color filter portions (each color filter may be implemented as a filter layer 5206 ) different from the color filter portions of the sensor as shown in FIG. 13 or FIG. 14 .
  • a red pixel filter portion 5702 may be configured to transmit light having a wavelength to represent red color (to be detected by the second photo diode 5120 in order to detect a taillight of a vehicle) and light of one or more wavelengths to be absorbed or detected by the first photo diode 5110 for LIDAR detection and to block light outside these wavelength regions.
  • a yellow (or orange) pixel filter portion 5704 may be configured to transmit light having a wavelength to represent yellow (or orange) color (to be detected by the second photo diode 5120 in order to detect a warning light or a blinking light of a vehicle) and light of one or more wavelengths to be absorbed or detected by the first photo diode 5110 for LIDAR detection and to block light outside these wavelength regions.
  • the first photo diodes 5110 may have a larger lateral size than the second photo diodes 5120 .
  • the surface area of the first photo diodes 5110 may be larger than the surface area of the second photo diodes 5120 .
  • the surface area of the first photo diodes 5110 may be larger than the surface area of the second photo diodes 5120 by a factor of two, or by a factor of four, or by a factor of eight, or by a factor of sixteen.
  • the larger size of the first photo diodes 5110 is symbolized by rectangles 5602 in FIG. 15 .
  • the color filter portions 5702 and 5704 may be arranged in accordance with a checkerboard pattern. In these examples, the resolution of the first photo diodes 5110 may not be of high importance, but the sensitivity of the first photo diodes 5110 may be important.
  • the structure and the transmission characteristics of the color filter portions may vary as a function of the desired color space.
  • an RGB color space was considered.
  • Other possible color spaces that may be provided are CYMG (cyan, yellow, magenta and green), RGBE (red, green, blue, and emerald), CMYW (cyan, magenta, yellow, and white), and the like.
  • the color filter portions would be adapted accordingly.
  • Optional further color filter types may mimic the scotopic sensitivity curve of the human eye.
  • FIG. 16 shows an optical component 5800 for a LIDAR Sensor System in accordance with various embodiments.
  • the optical component 5800 of FIG. 16 is substantially similar to the optical component 5200 of FIG. 10A as described above. Therefore, the main differences of the optical component 5800 of FIG. 16 from the optical component 5200 of FIG. 10A will be described in more detail below.
  • the optical component 5800 may or may not include the optional one or more microlenses 5202 and the filler material 5204 .
  • a reflector layer 5802 may be arranged over (e.g. in direct physical contact with) the filter layer 5206 .
  • the reflector layer 5802 may be configured to reflect light in a wavelength region of a fourth wavelength λ4 .
  • the fourth wavelength range λ4 may have larger wavelengths than the first wavelength range λ1 , the second wavelength range λ2 , and the third wavelength range λ3 .
  • a light portion of the fourth wavelength λ4 is symbolized in FIG. 16 by a fourth arrow 5804 . This light impinges on the reflector layer 5802 and is reflected by the same.
  • the light portion that is reflected by the reflector layer 5802 is symbolized in FIG. 16 by a fifth arrow 5806 .
  • the reflector layer 5802 may be configured to reflect light in the wavelength region of thermal infrared light or infrared light.
  • the reflector layer 5802 may include a Bragg stack of layers configured to reflect light of a desired wavelength or wavelength region.
  • the optical component 5800 may further include a micromechanically defined IR absorber structure 5808 arranged over the reflector layer 5802 .
  • the IR absorber structure 5808 may be provided for a temperature-dependent resistivity measurement (based on the so-called microbolometer principle).
  • one or more conductor lines may be provided, e.g. in the intermediate interconnect/device layer 5114 .
  • the reflector layer 5802 may be configured to reflect thermal infrared radiation having a wavelength greater than approximately 2 μm.
  • Various embodiments, such as e.g. the embodiments illustrated above, may include a stack of different types of photo diodes.
  • the above mentioned embodiments may be complemented by a filter, e.g. a bandpass filter, which is configured to transmit portions of the light which should be detected by the photo diode near to the surface of the carrier (e.g. of the visible spectrum) such as e.g. red light for vehicle taillights as well as portions of the light having the wavelength of the used LIDAR source (e.g. laser source).
  • the above mentioned embodiments may further be complemented by one or more microlenses per pixel to increase the fill factor (a reduced fill factor may occur due to circuit regions of an image sensor pixel required by the manufacturing process).
  • the fill factor is to be understood as the area ratio between the optically active area and the total area of the pixel.
  • the optically active area may be reduced e.g. by electronic components.
  • a microlens may extend over the entire area of the pixel and may guide the light to the optically active area. This would increase the fill factor (see the short calculation below).
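  • The fill factor definition above lends itself to a one-line calculation. The following sketch uses invented example dimensions and an assumed microlens efficiency.

```python
# Fill factor = optically active area / total pixel area.
# The pixel dimensions below are invented example values.

def fill_factor(active_area_um2: float, pixel_area_um2: float) -> float:
    return active_area_um2 / pixel_area_um2

pixel_area = 10.0 * 10.0   # a 10 um x 10 um pixel -> 100 um^2
active_area = 60.0         # 60 um^2 photosensitive, the rest circuitry

print(fill_factor(active_area, pixel_area))        # 0.6 without a microlens
# A microlens spanning the whole pixel funnels most of the incident
# light onto the active area, raising the effective fill factor:
lens_efficiency = 0.9                              # assumed value
print(fill_factor(lens_efficiency * pixel_area, pixel_area))  # ~0.9
```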
  • a front-side illuminated image sensor or a back-side illuminated image sensor may be provided.
  • in a front-side illuminated image sensor, the device layer is positioned in a layer facing the light impinging on the sensor 52 .
  • in a back-side illuminated image sensor, the device layer is positioned in a layer facing away from the light impinging on the sensor 52 .
  • two APD photo diodes may be provided which are configured to detect light at different NIR wavelengths and which may be stacked over each other, e.g. to use the wavelength-dependent absorption characteristics of water (vapor) and to obtain information about the amount of water present in the atmosphere and/or on surfaces such as a roadway by comparing the intensities of the light detected at the different wavelengths (a sketch of this comparison follows below).
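  • A minimal sketch of the two-wavelength comparison idea: water absorbs some NIR wavelengths (e.g. near 1450 nm) much more strongly than others (e.g. near 905 nm), so the ratio of the detected intensities hints at the surface state. The wavelengths, calibration, and threshold below are assumptions for illustration, not values from this disclosure.

```python
# Two stacked APDs detect echoes at two NIR wavelengths. Water absorbs
# light near 1450 nm much more strongly than light near 905 nm, so a
# drop in the intensity ratio suggests a wet surface. Wavelengths,
# calibration and threshold are illustrative assumptions.

def surface_state(intensity_905nm: float, intensity_1450nm: float,
                  dry_ratio: float = 1.0, wet_threshold: float = 0.5) -> str:
    """Classify a surface from the intensity ratio of the two echoes."""
    ratio = (intensity_1450nm / intensity_905nm) / dry_ratio
    return "wet" if ratio < wet_threshold else "dry"

print(surface_state(intensity_905nm=1.00, intensity_1450nm=0.85))   # dry
print(surface_state(intensity_905nm=1.00, intensity_1450nm=0.20))   # wet
```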
  • the detector may be implemented in a semiconductor material such as silicon or in semiconductor compound material such as silicon germanium, III-V semiconductor compound material, or II-VI semiconductor compound material, individually or in combination with each other.
  • Various embodiments may allow the manufacturing of a miniaturized and/or cost-efficient sensor system which may combine a camera sensor and a LIDAR sensor with each other in one common carrier (e.g. substrate).
  • a sensor system may be provided for pattern recognition, or object recognition, or face recognition.
  • the sensor system may be implemented in a mobile device such as a mobile phone or smartphone.
  • various embodiments may allow the manufacturing of a compact and/or cost-efficient sensor system for a vehicle.
  • a sensor system may be configured to detect active taillights of one or more other vehicles and at the same time to perform a three-dimensional measurement of objects by means of the LIDAR sensor portion of the sensor system.
  • various embodiments allow the combination of two LIDAR wavelengths in one common detector e.g. to obtain information about the surface characteristic of a reflecting target object by means of a comparison of the respectively reflected light.
  • Various embodiments may allow the combination of a LIDAR sensor, a camera sensor (configured to detect light of the visible spectrum (VIS)) and a camera sensor (configured to detect light of the thermal infrared spectrum), in one common sensor (e.g. monolithically integrated on one common carrier, e.g. one common substrate, e.g. one common wafer).
  • Various embodiments may reduce adjustment variations between different image sensors for camera and LIDAR.
  • even more than two photo diodes may be stacked one above the other.
  • the lateral size (and/or shape) of the one, two or even more photo diodes and the color filter portions of the filter layer may be the same.
  • the lateral size (and/or shape) of the one, two, or even more photo diodes may be the same, and the lateral size (and/or shape) of the color filter portions of the filter layer (e.g. filter layer 5206 ) may be different from each other and/or from the lateral size (and/or shape) of the one, two or even more photo diodes.
  • the lateral size (and/or shape) of the one, two, or even more photo diodes may be different from each other and/or from the lateral size (and/or shape) of the color filter portions, and the lateral size (and/or shape) of the color filter portions of the filter layer (e.g. filter layer 5206 ) may be the same.
  • the lateral size (and/or shape) of the one, two, or even more photo diodes may be different from each other and the lateral size (and/or shape) of the color filter portions of the filter layer (e.g. filter layer 5206 ) may be different from each other and/or from the lateral size (and/or shape) of the one, two or even more photo diodes.
  • color filter combinations like CYMG (cyan, yellow, green and magenta), RGBE (red, green, blue, and emerald), CMYW (cyan, magenta, yellow, and white) may be used as well.
  • the color filters may have a bandwidth (FWHM) in the range from about 50 nm to about 200 nm.
  • monochrome (black/white) filters may be provided.
  • the transmission curves of the used sensor pixel color filters should comply with the respective color-related traffic regulations.
  • Sensor elements having sensor pixels with color filters need not be arranged only in a Bayer pattern; other pattern configurations may be used as well, for example an X-Trans matrix pixel-filter configuration.
  • a sensor as described with respect to FIGS. 10 to 16 may e.g. be implemented in a photon mixing device (e.g. for an indirect measurement or in a consumer electronic device in which a front camera of a smartphone may, e.g. at the same time, generate a three-dimensional image).
  • a sensor as described with respect to FIGS. 10 to 16 may e.g. also be implemented in a sensor to detect the characteristic of a surface, for example whether a street is dry or wet, since the surface usually has different light reflection characteristics depending on its state (e.g. dry state or wet state), and the like.
  • a stacked photo diode in accordance with various embodiments as described with reference to FIG. 9 to FIG. 16 may implement a first sensor pixel including a photo diode of a first photo diode type and a second pixel of the plurality of pixels including a photo diode of a second photo diode type.
  • such a stacked optical component may include a plurality of photo diodes of different photo diode types (e.g. two, three, four or more photo diodes stacked above one another).
  • the stacked optical component may be substantially similar to the optical component 5100 of FIG. 9 as described above. Therefore, only the main differences of the stacked optical component with respect to the optical component 5100 of FIG. 9 will be described in more detail below.
  • the stacked optical component may optionally include one or more microlenses, which may be arranged over the second photo diode (e.g. directly above, in other words in physical contact with the second photo diode).
  • the one or more microlenses may be embedded in or at least partially surrounded by a suitable filler material such as silicone.
  • the one or more microlenses together with the filler material may form a layer structure having a layer thickness in the range from about 1 μm to about 500 μm.
  • a filter layer which may be configured to implement a bandpass filter, may be arranged over the optional one or more microlenses or the second photo diode (e.g. directly above, in other words in physical contact with the optional filler material or with the second photo diode).
  • the filter layer may have a layer thickness in the range from about 1 μm to about 500 μm.
  • the filter layer may have a filter characteristic in accordance with the respective application.
  • the second photo diode may include or be a pin photo diode (configured to detect light of the visible spectrum) and the first photo diode may include or be an avalanche photo diode (in the linear mode/in the Geiger mode) (configured to detect light of the near infrared (NIR) spectrum or in the infrared (IR) spectrum).
  • a multiplexer may be provided to individually select the sensor signals provided e.g. by the pin photo diode or by the avalanche photo diode.
  • the multiplexer may select e.g. either the pin photo diode (and thus provides only the sensor signals provided by the pin photo diode) or the avalanche photo diode (and thus provides only the sensor signals provided by the avalanche photo diode).
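  • The read-out multiplexer described above simply forwards the signal of one of the two stacked photo diodes at a time, as in the following minimal sketch; the class and signal names are invented for illustration.

```python
# Forward either the pin photo diode (camera) signal or the avalanche
# photo diode (LIDAR) signal to the read-out. Names are illustrative,
# not from the disclosure.

class PixelMultiplexer:
    def __init__(self) -> None:
        self.selected = "pin"            # default: camera read-out

    def select(self, diode: str) -> None:
        if diode not in ("pin", "apd"):
            raise ValueError("unknown photo diode type")
        self.selected = diode

    def read(self, pin_signal: float, apd_signal: float) -> float:
        return pin_signal if self.selected == "pin" else apd_signal

mux = PixelMultiplexer()
print(mux.read(pin_signal=0.7, apd_signal=0.1))   # camera sample: 0.7
mux.select("apd")
print(mux.read(pin_signal=0.7, apd_signal=0.1))   # LIDAR sample: 0.1
```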
  • Example 1f is an optical component for a LIDAR Sensor System.
  • the optical component includes a first photo diode implementing a LIDAR sensor pixel in a first semiconductor structure and configured to absorb received light in a first wavelength region, a second photo diode implementing a camera sensor pixel in a second semiconductor structure over the first semiconductor structure and configured to absorb received light in a second wavelength region, and an interconnect layer (e.g. between the first semiconductor structure and the second semiconductor structure) including an electrically conductive structure configured to electrically contact the second photo diode.
  • the received light of the second wavelength region has a shorter wavelength than the received light of the first wavelength region.
  • In Example 2f, the subject matter of Example 1f can optionally include that the second photo diode is vertically stacked over the first photo diode.
  • In Example 3f, the subject matter of any one of Examples 1f or 2f can optionally include that the first photo diode is a first vertical photo diode, and/or that the second photo diode is a second vertical photo diode.
  • In Example 4f, the subject matter of any one of Examples 1f to 3f can optionally include that the optical component further includes a further interconnect layer (e.g. between the carrier and the first semiconductor structure) including an electrically conductive structure configured to electrically contact the second vertical photo diode and/or the first vertical photo diode.
  • In Example 5f, the subject matter of any one of Examples 1f to 4f can optionally include that the optical component further includes a microlens over the second semiconductor structure that laterally substantially covers the first vertical photo diode and/or the second vertical photo diode.
  • In Example 6f, the subject matter of any one of Examples 1f to 5f can optionally include that the optical component further includes a filter layer over the second semiconductor structure that laterally substantially covers the first vertical photo diode and/or the second vertical photo diode and is configured to transmit received light having a wavelength within the first wavelength region and within the second wavelength region, and to block light that is outside of the first wavelength region and the second wavelength region.
  • In Example 7f, the subject matter of any one of Examples 1f to 6f can optionally include that the received light of the first wavelength region has a wavelength in the range from about 800 nm to about 1800 nm, and/or that the received light of the second wavelength region has a wavelength in the range from about 380 nm to about 780 nm.
  • In Example 8f, the subject matter of any one of Examples 1f to 6f can optionally include that the received light of the first wavelength region has a wavelength in the range from about 800 nm to about 1800 nm, and/or that the received light of the second wavelength region has a wavelength in the range from about 800 nm to about 1750 nm.
  • In Example 9f, the subject matter of any one of Examples 1f to 8f can optionally include that the received light of the second wavelength region has a shorter wavelength than any received light of the first wavelength region by at least 50 nm, for example by at least 100 nm.
  • In Example 10f, the subject matter of any one of Examples 1f to 7f or 9f can optionally include that the received light of the first wavelength region has a wavelength in an infrared spectrum wavelength region, and/or that the received light of the second wavelength region has a wavelength in the visible spectrum wavelength region.
  • In Example 11f, the subject matter of any one of Examples 1f to 10f can optionally include that the optical component further includes a mirror structure including a bottom mirror and a top mirror.
  • the second semiconductor structure is arranged between the bottom mirror and the top mirror.
  • the bottom mirror is arranged between the interconnect layer and the second semiconductor structure.
  • In Example 12f, the subject matter of Example 11f can optionally include that the mirror structure includes a Bragg mirror structure.
  • In Example 13f, the subject matter of any one of Examples 11f or 12f can optionally include that the mirror structure and the second vertical photo diode are configured so that the second vertical photo diode forms a resonant cavity photo diode.
  • In Example 14f, the subject matter of any one of Examples 1f to 13f can optionally include that the optical component further includes a reflector layer over the second semiconductor structure.
  • In Example 15f, the subject matter of Example 14f can optionally include that the reflector layer is configured as a thermal reflector layer configured to reflect radiation having a wavelength equal to or greater than approximately 2 μm, and/or that the reflector layer is configured as an infrared reflector layer.
  • In Example 16f, the subject matter of any one of Examples 1f to 15f can optionally include that the first photo diode is a pin photo diode, and that the second photo diode is a pin photo diode.
  • In Example 17f, the subject matter of any one of Examples 1f to 15f can optionally include that the first photo diode is an avalanche photo diode, and that the second photo diode is a pin photo diode.
  • In Example 18f, the subject matter of any one of Examples 1f to 15f can optionally include that the first photo diode is an avalanche photo diode, and that the second photo diode is a resonant cavity photo diode.
  • In Example 19f, the subject matter of any one of Examples 1f to 15f can optionally include that the first photo diode is a single-photon avalanche photo diode, and that the second photo diode is a resonant cavity photo diode.
  • In Example 20f, the subject matter of any one of Examples 1f to 15f can optionally include that the first photo diode is an avalanche photo diode, and that the second photo diode is an avalanche photo diode.
  • In Example 21f, the subject matter of any one of Examples 2f to 20f can optionally include that the optical component further includes an array of a plurality of photo diode stacks, each photo diode stack comprising a second photo diode vertically stacked over a first photo diode.
  • In Example 22f, the subject matter of any one of Examples 1f to 21f can optionally include that at least one photo diode stack of the plurality of photo diode stacks comprises at least one further second photo diode in the second semiconductor structure adjacent to the second photo diode, and that the first photo diode of the at least one photo diode stack of the plurality of photo diode stacks has a larger lateral extension than the second photo diode and the at least one further second photo diode of the at least one photo diode stack so that the second photo diode and the at least one further second photo diode are arranged laterally within the lateral extension of the first vertical photo diode.
  • In Example 23f, the subject matter of any one of Examples 1f to 22f can optionally include that the carrier is a semiconductor substrate.
  • Example 24f is a sensor for a LIDAR Sensor System.
  • the sensor may include a plurality of optical components according to any one of Examples 1f to 23f.
  • the plurality of optical components are monolithically integrated on the carrier as a common carrier.
  • In Example 25f, the subject matter of Example 24f can optionally include that the sensor is configured as a front-side illuminated sensor.
  • In Example 26f, the subject matter of Example 24f can optionally include that the sensor is configured as a back-side illuminated sensor.
  • In Example 27f, the subject matter of any one of Examples 24f to 26f can optionally include that the sensor further includes a color filter layer covering at least some optical components of the plurality of optical components.
  • In Example 28f, the subject matter of Example 27f can optionally include that the color filter layer includes a first color filter sublayer and a second color filter sublayer.
  • the first color filter sublayer is configured to transmit received light having a wavelength within the first wavelength region and within the second wavelength region, and to block light outside the first wavelength region and outside the second wavelength region.
  • the second color filter sublayer is configured to block received light having a wavelength outside the second wavelength region.
  • In Example 29f, the subject matter of Example 28f can optionally include that the first color filter sublayer and/or the second color filter sublayer includes a plurality of second sublayer pixels.
  • In Example 30f, the subject matter of Example 29f can optionally include that the first color filter sublayer and/or the second color filter sublayer includes a plurality of second sublayer pixels in accordance with a Bayer pattern.
  • In Example 31f, the subject matter of any one of Examples 27f to 30f can optionally include that the first color filter sublayer includes a plurality of first sublayer pixels having the same size as the second sublayer pixels. The first sublayer pixels and the second sublayer pixels coincide with each other.
  • In Example 32f, the subject matter of any one of Examples 27f to 30f can optionally include that the first color filter sublayer comprises a plurality of first sublayer pixels having a size larger than the size of the second sublayer pixels. One first sublayer pixel laterally substantially overlaps with a plurality of the second sublayer pixels.
  • Example 33f is a LIDAR Sensor System, including a sensor according to any one of Examples 24f to 32f, and a sensor controller configured to control the sensor.
  • Example 34f is a method for a LIDAR Sensor System according to Example 33f, wherein the LIDAR Sensor System is integrated into a LIDAR Sensor Device and communicates with a second Sensor System, and uses the object classification and/or the Probability Factors and/or Traffic Relevance factors measured by the second Sensor System for the evaluation of current and future measurements and for deriving LIDAR Sensor Device control parameters as a function of these factors.
  • Pulsed laser sources may have various applications.
  • An important field of application for pulsed laser sources may be time-of-flight LIDAR sensors or LIDAR systems.
  • a laser pulse may be emitted, the laser pulse may be reflected by a target object, and the reflected pulse may be received again by the LIDAR system.
  • a distance to the object may be calculated by measuring the time that has elapsed between sending out the laser pulse and receiving the reflected pulse.
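  • The elapsed time translates into distance as d = c·t/2, the factor of two accounting for the round trip. A minimal worked example:

```python
# Time-of-flight ranging: the pulse travels to the target and back, so
# the one-way distance is half the round-trip time multiplied by the
# speed of light.
C_M_PER_S = 299_792_458.0

def distance_m(round_trip_time_s: float) -> float:
    return C_M_PER_S * round_trip_time_s / 2.0

# A pulse returning after about 667 ns corresponds to a target ~100 m away.
print(distance_m(667e-9))   # ~99.98 m
```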
  • Various types of lasers or laser sources may be used for a LIDAR application (e.g., in a LIDAR system).
  • a LIDAR system may include an edge-emitting diode laser, a vertical cavity surface-emitting laser (VCSEL), a fiber laser, or a solid state laser (e.g., a Nd:YAG diode pumped crystal laser, a disc laser, and the like).
  • An edge-emitting diode laser or a VCSEL may be provided, for example, for low-cost applications.
  • a special driver circuit may be provided for a laser diode to operate in pulsed mode.
  • a relatively high electrical current pulse may be sent through the laser diode within a short period of time (usually on the order of a few picoseconds up to a few microseconds) to achieve a short and intense optical laser pulse.
  • the driver circuit may include a storage capacitor for supplying the electrical charge for the current pulse.
  • the driver circuit may include a switching device (e.g., one or more transistors) for generating the current pulse.
  • a direct connection between the laser source and a current source may provide an excessive current (illustratively, a much too large current).
  • Silicon-based capacitors may be integrated into a hybrid or system-in-package for providing higher integration of laser drivers.
  • the switching device for activating the current pulse through the laser diode may be a separate element from the capacitor.
  • the storage capacitor and the switching device may be located at a certain distance away from the laser diode. This may be related to the dimensions of the various electrical components included in the capacitor and in the switching device. Illustratively, with discrete components a minimum distance on the order of millimeters may be present. The soldering of the discrete components on a printed circuit board (PCB) and the circuit lanes connecting the components on the PCB may prevent said minimum distance from being reduced further. This may increase the parasitic capacitances and inductances in the system.
  • Various embodiments may be based on integrating in a common substrate one or more charge storage capacitors, one or more switching devices (also referred to as switches), and one or more laser light emitters (e.g., one or more laser diodes).
  • a system including a plurality of capacitors, a plurality of switching devices (e.g., a switching device for each capacitor), and one or more laser diodes integrated in or on a common substrate may be provided.
  • the arrangement of the capacitors and the switching devices in close proximity to the one or more laser diodes may provide reduced parasitic inductances and capacitances (e.g., of an electrical path for a drive current flow). This may provide improved pulse characteristics (e.g., a reduced minimum pulse width, an increased maximum current at a certain pulse width, a higher degree of influence on the actual pulse shape, or a more uniform shape of the pulse).
  • an optical package may be provided (also referred to as laser diode system).
  • the optical package may include a substrate (e.g., a semiconductor substrate, such as a compound semiconductor material substrate).
  • the substrate may include an array of a plurality of capacitors formed in the substrate.
  • the substrate may include a plurality of switches. Each switch may be connected between at least one capacitor and at least one laser diode.
  • the optical package may include the at least one laser diode mounted on the substrate.
  • the optical package may include a processor (e.g., a laser driver control circuit or part of a laser driver control circuit) configured to control the plurality of switches to control a first current flow to charge the plurality of capacitors.
  • the processor may be configured to control the plurality of switches to control a second current flow to drive the at least one laser diode with a current discharged from at least one capacitor (e.g., a current pulse through the laser diode).
  • the processor may be configured to control the plurality of switches to control the second current flow to discharge the plurality of capacitors.
  • the optical package may be provided, for example, for LIDAR applications.
  • the optical package may be based on an array-distributed approach for the capacitors and the switches.
  • the first current flow may be the same as the second current flow.
  • the current used for charging the capacitors may be the same as the current discharged from the capacitors.
  • the first current flow may be different from the second current flow (for example, in case part of the charge stored in the capacitors has dissipated, as described in further detail below).
  • the arrangement of the components of the optical package may be similar to the arrangement of the components of a dynamic random-access memory (DRAM).
  • each switch may be assigned to exactly one respective capacitor.
  • a switch-capacitor pair (e.g., in combination with the associated laser diode) may be similar to a memory cell of a DRAM array (e.g., a memory cell may include, for example a storage capacitor, a transistor, and electrical connections).
  • the plurality of capacitors and the plurality of switches may be understood as a driver circuit (illustratively, as part of a driver circuit, for example of a DRAM-like driver circuit) of the at least one laser diode.
  • the laser diode may partially cover the driver circuit (e.g., at least a portion of the array of capacitors).
  • the driver circuit may be arranged underneath the laser diode.
  • the driver circuit may be electrically connected with the laser diode (e.g., by means of a method of 3D-integration of integrated circuits, such as bump bonding).
  • the capacitors may have sufficient capacity to provide enough current to the laser diode for high-power laser emission, illustratively for emission in time-of-flight LIDAR applications.
  • about 500000 capacitors (for example, each having a capacitance of about 100 fF) may be assigned to the laser diode (e.g., to a VCSEL, for example having a diameter of about 100 μm).
  • the arrangement of the capacitors directly underneath the laser diode may provide small parasitic inductances and capacitances. This may simplify the generation of a short and powerful laser pulse (e.g., based on a current pulse of about 40 A in the exemplary arrangement).
  • a connection (e.g., an electrical path) between a capacitor (and/or a switch) and the laser diode may have an inductance lower than 100 pH.
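  • The figures above allow a quick plausibility check, sketched below with the example numbers from this section (500000 capacitors of about 100 fF, a roughly 40 A pulse, an inductance below 100 pH); the charging voltage and the current rise time are assumed values for illustration.

```python
# Plausibility check with the example figures from this section; the
# charging voltage and the current rise time are assumed values.
N_CAPS = 500_000                    # capacitors per laser diode (from text)
C_EACH_F = 100e-15                  # 100 fF per capacitor (from text)
V_SUPPLY = 10.0                     # assumed charging voltage
I_PULSE_A = 40.0                    # pulse current mentioned in the text
T_RISE_S = 1e-9                     # assumed current rise time
L_PATH_H = 100e-12                  # inductance bound from the text

c_total = N_CAPS * C_EACH_F         # total storage capacitance
q_stored = c_total * V_SUPPLY       # charge available per shot
t_pulse = q_stored / I_PULSE_A      # pulse length this charge can sustain
v_l = L_PATH_H * I_PULSE_A / T_RISE_S   # inductive drop, V = L * dI/dt

print(f"total capacitance: {c_total * 1e9:.0f} nF")            # 50 nF
print(f"stored charge:     {q_stored * 1e6:.1f} uC")           # 0.5 uC
print(f"sustainable pulse: {t_pulse * 1e9:.1f} ns at 40 A")    # 12.5 ns
print(f"inductive drop:    {v_l:.1f} V during the 1 ns rise")  # 4.0 V
```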
  • the charge stored in the capacitors may dissipate in case the charge is not used, e.g. after a certain period of time.
  • a regular re-charging (illustratively, a refreshment) of the capacitors may be provided (e.g., at predefined time intervals).
  • the charge dissipation may reduce the risk of unintentional emission of a laser pulse.
  • the optical package may be provided or may operate without a high-resistivity resistor configured to discharge the storage capacitors over time periods longer than the laser pulse repetition period.
  • the driver circuit may be fabricated using DRAM manufacturing methods, e.g. CMOS technology methods.
  • the capacitors may be deep trench capacitors or stacked capacitors (illustratively, at least one capacitor may be a deep trench capacitor and/or at least one capacitor may be a stacked capacitor).
  • Each switch may include a transistor, e.g., a field effect transistor (e.g., a metal oxide semiconductor field effect transistor, such as a complementary metal oxide semiconductor field effect transistor).
  • the driver circuit may be provided (and fabricated) in a cost-efficient manner (e.g., without expensive, high-performance high-speed power transistors, such as without GaN FET).
  • the laser diode may include a III-V semiconductor material as active material (e.g. from the AlGaAs or GaN family of semiconductors).
  • the laser diode may include an edge-emitting laser diode.
  • the laser diode may include a vertical cavity surface-emitting laser diode (e.g., the optical package may be a VCSEL package).
  • the processor may be configured to individually control the plurality of switches to control the first current flow to charge the plurality of capacitors.
  • the processor may be configured to control the amount of charge to be delivered to the laser diode.
  • the processor may be configured to individually control the plurality of switches to control the second current flow to drive the at least one laser diode with a current discharged from at least one capacitor.
  • the processor may be configured to individually control the switches such that a variable number of capacitors associated with the laser diode may be discharged (illustratively, at a specific time) to drive the laser diode (e.g., only one capacitor, or some capacitors, or all capacitors). This may provide control over the total current for the current pulse and over the intensity of the outgoing laser pulse.
  • Variable laser output power may be provided, e.g. based on a precisely adjusted current waveform.
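  • Selecting how many of the per-diode capacitors are discharged sets the pulse energy, as in the following sketch; the linear charge-to-current model, the voltage, and the pulse width are simplifying assumptions for illustration.

```python
# The pulse current scales with the number of capacitors discharged
# simultaneously (simplified linear model that ignores resistive and
# inductive losses; voltage and pulse width are assumed values).
C_EACH_F = 100e-15       # 100 fF per capacitor (from text)
V_SUPPLY = 10.0          # assumed charging voltage
T_PULSE_S = 12.5e-9      # assumed pulse width

def pulse_current_a(n_discharged: int) -> float:
    """Approximate pulse current when n capacitors feed one pulse."""
    charge = n_discharged * C_EACH_F * V_SUPPLY
    return charge / T_PULSE_S

for n in (125_000, 250_000, 500_000):   # quarter, half and full array
    print(n, f"{pulse_current_a(n):.0f} A")
# 125000 -> 10 A, 250000 -> 20 A, 500000 -> 40 A: a coarse intensity dial.
```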
  • the optical package may include one or more access lines (e.g., similar to a DRAM circuit) for selectively charging and/or selectively discharging the capacitors (e.g., for charging and/or discharging a subset or a sub-array of capacitors).
  • the optical package may include a plurality of laser diodes, for example arranged as a one-dimensional array (e.g., a line array) or as a two-dimensional array (e.g., a matrix array).
  • the optical package may include a VCSEL array.
  • Each laser diode may be associated with (e.g., driven by) a corresponding portion of the driver circuit (e.g., corresponding capacitors and switches, for example corresponding 500000 capacitors).
  • the optical package may include one or more heat dissipation components, such as one or more through vias, e.g. through-silicon vias (TSV), one or more metal layers, and/or one or more heat sink devices.
  • the optical package may include one or more heat sink devices arranged underneath the substrate (for example, in direct physical contact with the substrate).
  • the optical package may include one or more through-silicon vias arranged outside and/or inside an area of the substrate including the switches and the capacitors. The one or more through-silicon vias may provide an improved (e.g., greater) heat conduction from the laser diode to a bottom surface of the substrate (illustratively, the mounting surface below the capacitor/switch array).
  • the optical package may include a metal layer arranged between the capacitors and the switches.
  • the metal layer may improve heat transfer towards the sides of the optical package.
  • the metal layer may have an additional electrical functionality, such as electrically contacting some of the capacitors with the sides of the optical package.
  • the heat dissipation components may be provided to dissipate the thermal load related to the high-density integration of the components of the optical package (e.g., laser diode and driver circuit).
  • FIG. 17A shows an optical package 15500 in a schematic side view in accordance with various embodiments.
  • the optical package 15500 may include a substrate 15502 .
  • the substrate 15502 may be a semiconductor substrate.
  • the substrate 15502 may include silicon or may essentially consist of silicon.
  • the substrate 15502 may include or essentially consist of a compound semiconductor material (e.g., GaAs, InP, GaN, or the like).
  • the substrate 15502 may include a plurality of capacitors 15504 .
  • the capacitors 15504 may be formed in the substrate 15502 , e.g. the capacitors 15504 may be monolithically integrated in the substrate 15502 .
  • a capacitor 15504 may be surrounded on three sides or more by the substrate 15502 (e.g., by the substrate material).
  • the capacitors 15504 may be fabricated, for example, by means of DRAM-manufacturing processes.
  • At least one capacitor 15504 may be a deep trench capacitor.
  • a trench (or a plurality of trenches) may be formed into the substrate 15502 (e.g., via etching).
  • a dielectric material may be deposited in the trench.
  • a plate may be formed surrounding a lower portion of the trench. The plate may be or may serve as first electrode for the deep trench capacitor.
  • the plate may be, for example, a doped region (e.g., an n-doped region) in the substrate 15502 .
  • a metal (e.g., a p-type metal) may be deposited on top of the dielectric layer. The metal may be or may serve as second electrode for the deep trench capacitor.
  • At least one capacitor 15504 may be a stacked capacitor.
  • an active area (or a plurality of separate active areas) may be formed in the substrate.
  • a gate dielectric layer may be deposited on top of the active area (e.g., on top of each active area).
  • a sequence of conductive layers and dielectric layers may be deposited on top of the gate dielectric layer. Electrical contacts may be formed, for example, via a masking and etching process followed by metal deposition.
  • the capacitors 15504 may be arranged in an ordered fashion in the substrate 15502 , e.g. the plurality of capacitors 15504 may form an array.
  • the capacitors 15504 may be arranged in one direction to form a one-dimensional capacitor array.
  • the capacitors 15504 may be arranged in two directions to form a two-dimensional capacitor array.
  • the capacitors 15504 of the array of capacitors 15504 may be arranged in rows and columns (e.g., a number N of rows and a number M of columns, wherein N may be equal to M or may be different from M).
  • the plurality of capacitors 15504 may include capacitors 15504 of the same type or of different types (e.g., one or more deep trench capacitors and one or more stacked capacitors), for example different types of capacitors 15504 in different portions of the array (e.g., in different sub-arrays).
  • the substrate 15502 may include a plurality of switches 15506 .
  • the switches 15506 may be formed in the substrate 15502 , e.g. the switches 15506 may be monolithically integrated in the substrate 15502 .
  • Each switch 15506 may be connected between at least one capacitor 15504 and at least one laser diode 15508 (e.g., each switch 15506 may be electrically coupled with at least one capacitor 15504 and at least one laser diode 15508 ).
  • a switch 15506 may be arranged along an electrical path connecting a capacitor 15504 with the laser diode 15508 .
  • a switch 15506 may be controlled (e.g., opened or closed) to control a current flow from the associated capacitor 15504 to the laser diode 15508 .
  • each switch 15506 may include a transistor.
  • At least one transistor (or more than one transistor, or all transistors) may be a field effect transistor, such as a metal oxide semiconductor field effect transistor (e.g., a complementary metal oxide semiconductor field effect transistor).
  • It is understood that the plurality of switches 15506 may include switches 15506 of the same type or of different types.
  • a switch 15506 may be assigned to more than one capacitor 15504 (e.g., a switch 15506 may be controlled to control a current flow between more than one capacitor 15504 and the laser diode 15508 ).
  • each switch 15506 may be assigned to exactly one respective capacitor 15504 .
  • the substrate 15502 may include a plurality of switch-capacitor pairs (e.g., similar to a plurality of DRAM cells). This may be illustrated by the circuit equivalents shown, for example, in FIG. 17B and FIG. 17C .
  • a switch 15506 s may be controlled (e.g., via a control terminal 15506 g , such as a gate terminal) to allow or prevent a current flow from the assigned capacitor 15504 c to the laser diode 15508 d (or to an associated laser diode, as shown in FIG. 17C ).
  • the switches 15506 may have a same or similar arrangement as the capacitors 15504 (e.g., the substrate 15502 may include an array of switches 15506 , such as a one-dimensional array or a two-dimensional array).
  • the optical package 15500 may include the at least one laser diode 15508 .
  • the laser diode 15508 may be mounted on the substrate 15502 (e.g., the laser diode 15508 may be arranged on a surface of the substrate 15502 , such as a top surface, for example on an insulating layer of the substrate 15502 ).
  • the laser diode 15508 may laterally cover at least a portion of the plurality of capacitors 15504 .
  • the laser diode 15508 may be mounted on the substrate 15502 in correspondence with (e.g., directly above) the plurality of capacitors 15504 or at least a portion of the plurality of capacitors 15504 .
  • This may provide a low inductance for an electrical path between a capacitor 15504 (or a switch 15506 ) and the laser diode 15508 .
  • the electrical path (e.g., between a capacitor 15504 and the laser diode 15508 and/or between a switch 15506 and the laser diode 15508 ) may have an inductance in the range between 70 pH and 200 pH, for example lower than 100 pH.
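As a quick illustration of why a low-inductance path matters, the sketch below estimates the inductive voltage overshoot V = L·di/dt for the path inductance quoted above and the roughly 40 A drive current quoted further below; the current rise time is an assumed illustrative value, not taken from the disclosure.

```python
# Rough estimate of the switching overshoot caused by the parasitic
# inductance of the capacitor-to-laser-diode path.
# DT (current rise time) is an assumed illustrative value.
L_PATH = 100e-12   # path inductance, 100 pH (figure quoted above)
DI = 40.0          # current step into the laser diode, about 40 A
DT = 1e-9          # assumed current rise time, 1 ns

v_overshoot = L_PATH * DI / DT  # V = L * di/dt
print(f"inductive voltage overshoot: {v_overshoot:.1f} V")  # -> 4.0 V
```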
  • the laser diode 15508 may be a laser diode suitable for LIDAR applications (e.g., the optical package 15500 may be included in a LIDAR system, for example in the LIDAR Sensor System 10 ).
  • the laser diode 15508 may be or may include an edge-emitting laser diode.
  • the laser diode 15508 may be or may include a vertical cavity surface-emitting laser diode.
  • the laser diode 15508 may be configured to receive current discharged from the capacitors 15504 .
  • the substrate 15502 may include a plurality of electrical contacts (e.g., each electrical contact may be connected with a respective capacitor 15504 , for example via the respective switch 15506 ).
  • the laser diode 15508 may be mounted on the electrical contacts or may be electrically connected with the electrical contacts.
  • a first terminal of the laser diode 15508 may be electrically connected to the electrical contacts, for example via an electrically conductive common line 15510 , as described in further detail below (e.g., the first terminal of the laser diode 15508 may be electrically coupled to the common line 15510 ).
  • a second terminal of the laser diode 15508 may be electrically connected to a second potential, e.g., to ground.
  • the laser diode 15508 may be associated with a number of capacitors 15504 for providing a predefined laser output power.
  • the laser diode 15508 may be configured to receive current discharged from a number of capacitors 15504 such that a predefined laser output power may be provided, for example above a predefined threshold.
  • the laser diode 15508 may be configured to receive current discharged from a number of capacitors 15504 such that a predefined current may flow in or through the laser diode 15508 , for example a current above a current threshold.
  • the laser diode 15508 may be associated with a number of capacitors 15504 in the range from a few hundred capacitors 15504 to a few million capacitors 15504 , for example in the range from about 100000 capacitors 15504 to about 1000000 capacitors 15504 , for example in the range from about 400000 capacitors 15504 to about 600000 capacitors 15504 , for example about 500000 capacitors 15504 .
  • Each capacitor 15504 may have a capacitance in the femtofarad range, for example in the range from about 50 fF to about 200 fF, for example about 100 fF.
  • the capacitance of a capacitor 15504 may be selected or adjusted depending on the number of capacitors 15504 associated with the laser diode 15508 (illustratively, the capacitance may increase for a decreasing number of associated capacitors 15504 and may decrease for an increasing number of associated capacitors 15504 ).
  • the capacitance of a capacitor 15504 may be selected or adjusted depending on the current flow to drive the laser diode 15508 (e.g., in combination with the number of associated capacitors 15504 ).
  • At least one capacitor 15504 , or some capacitors 15504 , or all capacitors 15504 associated with the laser diode 15508 may be discharged (e.g., for each laser pulse emission). This may provide control over the emitted laser pulse, as described in further detail below.
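The figures above allow a rough plausibility check of the bank sizing. The following sketch computes the total capacitance, stored charge, and the average current available over a 10 ns pulse; the charging voltage is an assumed value, not stated in the disclosure.

```python
# Back-of-the-envelope sizing of the capacitor bank, using the figures above.
# V_CHARGE (charging voltage) is an assumed value for illustration only.
N_CAPS = 500_000     # capacitors associated with one laser diode
C_CELL = 100e-15     # capacitance per cell, 100 fF
V_CHARGE = 10.0      # assumed charging voltage, volts
T_PULSE = 10e-9      # pulse duration, 10 ns (see pulse-width figures below)

c_bank = N_CAPS * C_CELL    # total bank capacitance: 50 nF
q_bank = c_bank * V_CHARGE  # stored charge: 0.5 uC
i_avg = q_bank / T_PULSE    # average current if fully discharged: 50 A

print(f"bank {c_bank * 1e9:.0f} nF, charge {q_bank * 1e6:.2f} uC, "
      f"average current {i_avg:.0f} A over {T_PULSE * 1e9:.0f} ns")
```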
  • the optical package 15500 may include more than one laser diode 15508 (e.g., a plurality of laser diodes), of the same type or of different types.
  • Each laser diode may be associated with a corresponding plurality of capacitors 15504 (e.g., with a corresponding number of capacitors, for example in the range from about 400000 to about 600000, for example about 500000).
  • the laser diode 15508 may be configured to emit light (e.g., a laser pulse) in case the current discharged from the associated capacitors 15504 flows in the laser diode 15508 .
  • the laser diode may be configured to emit light in a predefined wavelength range, e.g. in the near infra-red or in the infra-red wavelength range (e.g., in the range from about 800 nm to about 1600 nm, for example at about 905 nm or at about 1550 nm).
  • the duration of an emitted laser pulse may be dependent on a time constant of the capacitors 15504 .
  • an emitted laser pulse may have a pulse duration (in other words, a pulse width) in the range from below 1 ns to several nanoseconds, for example in the range from about 5 ns to about 20 ns, for example about 10 ns.
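Since the pulse duration is tied to the discharge time constant of the capacitors, a simple exponential-discharge model gives a feel for the numbers; the effective series resistance and initial voltage below are assumed illustrative values, not from the disclosure.

```python
import math

# Exponential discharge of the capacitor bank through the laser diode.
# R_EFF and V0 are assumed illustrative values.
C_BANK = 50e-9   # total bank capacitance (see sizing sketch above)
R_EFF = 0.2      # assumed effective resistance of switches plus diode, ohms
V0 = 10.0        # assumed initial capacitor voltage, volts

tau = R_EFF * C_BANK  # discharge time constant: 10 ns
for t_ns in (0, 5, 10, 20):
    v = V0 * math.exp(-t_ns * 1e-9 / tau)
    print(f"t = {t_ns:2d} ns: v = {v:5.2f} V")
```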
  • the optical package 15500 may include an electrically conductive common line 15510 (e.g., a metal line).
  • the common line 15510 may connect at least some capacitors 15504 of the plurality of capacitors 15504 .
  • the common line 15510 may connect (e.g., may be electrically connected with) the electrical contacts of at least some capacitors 15504 of the plurality of capacitors 15504 .
  • the common line 15510 may connect all capacitors 15504 of the plurality of capacitors 15504 .
  • the optical package 15500 may include a plurality of common lines 15510 , each connecting at least some capacitors 15504 of the plurality of capacitors 15504 .
  • the optical package 15500 may include a power source 15512 (e.g., a source configured to provide a current, for example a battery).
  • the power source 15512 may be electrically connected to the common line 15510 (or to each common line).
  • the power source 15512 may be configured to provide power to charge the plurality of capacitors 15504 (e.g., the capacitors 15504 connected to the common line 15510 ).
  • the optical package 15500 may include a processor 15514 .
  • the processor 15514 may be mounted on the substrate 15502 .
  • the processor 15514 may be monolithically integrated in the substrate 15502 .
  • the processor 15514 may be mounted on the printed circuit board 15602 (see FIG. 18 ).
  • the processor may be configured to control the plurality of switches 15506 (e.g., to open or close the plurality of switches).
  • the optical package 15500 (or the substrate 15502 ) may include a plurality of access lines electrically connected with control terminals of the switches 15506 (e.g., similar to word-lines in a DRAM).
  • the processor 15514 may be configured to control the switches 15506 by providing a control signal (e.g., a voltage, such as a control voltage, or an electric potential) to the plurality of access lines (or to some access lines, or to a single access line).
  • the processor 15514 may be configured to individually control the switches 15506 , e.g. by providing individual control signals to the access line or lines connected to the switch 15506 or the switches 15506 to be controlled.
  • the processor 15514 may include or may be configured to control a voltage supply circuit used for supplying control voltages to the access lines (not shown).
  • the processor 15514 may be configured to control (e.g., to individually control) the plurality of switches 15506 to control a first current flow to charge the plurality of capacitors 15504 .
  • the processor 15514 may be configured to close the plurality of switches 15506 such that current may flow from the common line 15510 (illustratively, from the power source 15512 ) into the capacitors 15504 .
  • the processor 15514 may be configured to control (e.g., to individually control) the plurality of switches 15506 to control a second current flow to discharge the plurality of capacitors 15504 .
  • the processor 15514 may be configured to close the plurality of switches 15506 such that the capacitors 15504 may be discharged (e.g., current may flow from the capacitors 15504 to the laser diode 15508 ).
  • the first current flow may be the same as the second current flow or different from the second current flow (e.g., the first current flow may be greater than the second current flow).
  • the processor 15514 may be configured to control (e.g., to individually control) the plurality of switches 15506 to control the second current flow to drive the laser diode 15508 with current discharged from at least one capacitor 15504 .
  • the processor 15514 may be configured to adjust a current flow through the laser diode 15508 (e.g., to adjust a laser output power) by controlling (e.g., opening) the switches 15506 (e.g., by discharging a certain number of the capacitors 15504 associated with the laser diode 15508 ).
  • the second current flow to drive the at least one laser diode 15508 may include a current proportional to the number of discharged capacitors 15504 (e.g., a current in the range from a few milliamperes up to about 100 A, for example in the range from about 10 mA to about 100 A, for example from about 1 A to about 50 A, for example about 40 A).
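The amplitude control described above can be pictured as choosing how many cells to discharge per pulse. A minimal sketch follows; the per-cell current contribution is an assumption chosen so that the totals land in the range quoted above.

```python
# Minimal model of amplitude control: the drive current scales with the
# number of cells discharged. I_PER_CELL is an assumed contribution.
I_PER_CELL = 100e-6  # assumed average current contribution per cell, 100 uA

def drive_current(n_discharged: int) -> float:
    """Drive current in amperes when n_discharged cell switches are closed."""
    return n_discharged * I_PER_CELL

for n in (100_000, 250_000, 400_000):
    print(f"{n:7d} cells -> {drive_current(n):5.1f} A")
```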
  • the processor 15514 may be configured to control an emitted light pulse.
  • the processor may be configured to control or select the properties of an emitted light pulse (e.g., a shape, a duration, and an amplitude of an emitted light pulse) by controlling the arrangement and/or the number of capacitors 15504 to be discharged (e.g., of discharged capacitors 15504 ).
  • a shape of the emitted light pulse may be controlled by discharging capacitors 15504 arranged in different locations within an array of capacitors 15504 .
  • an amplitude of the emitted light pulse may be increased (or decreased) by discharging a higher (or lower) number of capacitors 15504 .
  • the processor 15514 may be configured to control the plurality of switches 15506 to discharge at least some capacitors 15504 to drive the laser diode 15508 to emit a light pulse (e.g., a laser pulse) of a predefined pulse shape (in other words, a light pulse having a certain waveform).
  • the processor 15514 may be configured to encode data in the emitted light pulse (e.g., to select a shape associated with data to be transmitted).
  • the emitted light pulse may be modulated (e.g., electrically modulated) such that data may be encoded in the light pulse.
  • the processor 15514 may be configured to control the discharge of the capacitors 15504 to modulate an amplitude of the emitted light pulse, for example to include one or more hump-like structure elements in the emitted light pulse.
  • the processor 15514 may have access to a memory storing data (e.g., to be transmitted) associated with a corresponding pulse shape (e.g., storing a codebook mapping data with a corresponding pulse shape).
  • the processor 15514 may be configured to control the plurality of switches 15506 to discharge at least some capacitors 15504 to drive the laser diode 15508 to emit a light pulse dependent on a light emission scheme.
  • the processor 15514 may be configured to control the discharge of the capacitors 15504 to drive the laser diode 15508 to emit a sequence of light pulses, for example structured as a frame (illustratively, the temporal arrangement of the emitted light pulses may encode or describe data).
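One way to picture the pulse-shaping and data-encoding scheme sketched above is to assign groups of cells to successive time slots, so that the envelope of the emitted pulse carries a data symbol. The slot length, group sizes, and codebook below are illustrative assumptions, not values from the disclosure.

```python
# Illustrative pulse shaping: discharge differently sized groups of cells in
# successive time slots so that the pulse envelope carries a data symbol.
# Slot duration, group sizes, and the codebook are assumed values.
SLOT_NS = 2   # assumed time-slot length within one pulse, ns
CODEBOOK = {  # assumed mapping: data symbol -> cells discharged per slot
    0b00: [400_000, 400_000, 400_000],  # flat pulse
    0b01: [200_000, 400_000, 200_000],  # single hump
    0b10: [400_000, 100_000, 400_000],  # double hump
}

def pulse_envelope(symbol: int) -> list[float]:
    """Relative optical amplitude per time slot for a given data symbol."""
    return [n / 400_000 for n in CODEBOOK[symbol]]

print(pulse_envelope(0b10))  # -> [1.0, 0.25, 1.0], a double-hump waveform
```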
  • the optical package 15500 may include one or more further components, not illustrated in FIG. 17A .
  • the optical package 15500 (e.g., the substrate 15502 ) may include one or more additional switches, described in the following.
  • a first additional switch (or a plurality of first additional switches) may be controlled (e.g., opened or closed) to selectively provide a path from the power source 15512 to the capacitors 15504 .
  • a second additional switch (or a plurality of second additional switches) may be controlled to selectively provide a path from the laser diode 15508 to an electrical contact (described in further detail below).
  • an exemplary operation of the optical package 15500 may be as follows.
  • a first additional switch SW B may be opened to disconnect the power source from the capacitors 15504 c (illustratively, the power source may be coupled with the node B, e.g. the terminal B, in FIG. 17B and FIG. 17C ).
  • the node A (e.g. the terminal A) in FIG. 17B and FIG. 17C may indicate the substrate (e.g., may be coupled with the substrate).
  • a second additional switch SW C may be opened to disconnect the laser diode 15508 d from the associated electrical contact (illustratively, the electrical contact may be coupled with the node C, e.g. the terminal C, in FIG. 17B and FIG. 17C ).
  • the second additional switch SW C may be opened to disconnect each laser diode 15508 d from the associated electrical contact.
  • each laser diode 15508 d may have a respective additional switch and/or a respective electrical contact associated thereto.
  • the capacitors 15504 c to be charged may be selected by providing a corresponding control signal to the respective access line (e.g., applying a control voltage to the control terminal of the associated switch 15506 s ), illustratively coupled with the node D, e.g. the terminal D, in FIG. 17B and FIG. 17C .
  • the first additional switch SW B may be closed to charge the selected capacitors.
  • the access lines (e.g., control lines) may be deactivated after charging has been performed.
  • the first additional switch SW B may be opened.
  • the second additional switch SW C may be closed to provide an electrical path from the laser diode 15508 d (e.g., from each laser diode 15508 d ) to the associated electrical contact.
  • the capacitors 15504 c to be discharged may be selected by providing a corresponding control signal to the respective access line. The selected capacitors 15504 c may be discharged via the associated laser diode 15508 d.
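The exemplary operation just described can be condensed into the following control-flow sketch; all class and method names are hypothetical and merely mirror the textual steps one-to-one.

```python
# Control-flow sketch of the charge/fire sequence described above.
# All class and method names are hypothetical; the steps mirror the text.
class DemoDriver:
    def open_switch(self, name): print(f"open  {name}")
    def close_switch(self, name): print(f"close {name}")
    def activate_access_lines(self, cells): print(f"select {len(cells)} cells")
    def deactivate_access_lines(self): print("deselect all cells")

def fire_pulse(driver, selected_cells):
    driver.open_switch("SW_B")                    # disconnect power source (node B)
    driver.open_switch("SW_C")                    # disconnect laser diode contact (node C)
    driver.activate_access_lines(selected_cells)  # select the capacitors to charge
    driver.close_switch("SW_B")                   # charge the selected capacitors
    driver.deactivate_access_lines()              # deactivate access lines after charging
    driver.open_switch("SW_B")                    # isolate the bank from the supply
    driver.close_switch("SW_C")                   # provide the path through the laser diode
    driver.activate_access_lines(selected_cells)  # discharge into the laser diode

fire_pulse(DemoDriver(), selected_cells=range(500_000))
```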
  • FIG. 18 shows a top view of the optical package 15500 in a schematic representation in accordance with various embodiments.
  • the optical package 15500 may include a base support, e.g. a printed circuit board 15602 .
  • the substrate 15502 may be mounted on the printed circuit board 15602 (e.g., integrated in the printed circuit board 15602 ).
  • the processor 15514 may be mounted on the printed circuit board 15602 .
  • the printed circuit board 15602 may include a first electrical contact 15604 .
  • the first electrical contact 15604 may be connected (e.g., electrically coupled) to the common line 15510 of the substrate 15502 (in other words, to the common line 15510 of the optical package 15500 ), as shown, for example, in FIG. 17A .
  • the first electrical contact 15604 may be wire bonded to the common line 15510 .
  • Power to charge the capacitors 15504 may be provided via the first electrical contact 15604 of the printed circuit board 15602 .
  • a power source may be mounted on the printed circuit board 15602 and electrically coupled with the first electrical contact 15604 .
  • the printed circuit board 15602 may include a second electrical contact 15606 .
  • the second terminal 15608 of the laser diode 15508 may be electrically coupled to the second electrical contact 15606 of the printed circuit board 15602 .
  • the second electrical contact 15606 of the printed circuit board 15602 may be wire bonded to the second terminal 15608 of the laser diode 15508 .
  • the second electrical contact 15606 may provide a path for the current to flow through the laser diode 15508 .
  • the optical package 15500 may include a plurality of laser diodes 15508 , for example arranged in a one-dimensional array or in a two-dimensional array (e.g., in a matrix array) over the base support.
  • the optical package 15500 may include a plurality of first electrical contacts 15604 and/or a plurality of second electrical contacts 15606 .
  • the optical package 15500 may include a first electrical contact 15604 and a second electrical contact 15606 associated with each laser diode 15508 .
  • the optical package 15500 may include a first electrical contact 15604 for each line in an array of laser diodes 15508 .
  • FIG. 19A and FIG. 19B show a side view and a top view, respectively, of an optical package 15700 in a schematic representation in accordance with various embodiments.
  • components of the optical package 15700 that may be arranged at different levels are illustrated, e.g. at different vertical positions within the optical package 15700 or within the substrate, according to the representation in FIG. 19A .
  • the optical package 15700 may be configured as the optical package 15500 described, for example, in relation to FIG. 17A to FIG. 18 .
  • the optical package 15700 may be an exemplary realization of the optical package 15500 .
  • the optical package 15700 may include a substrate 15702 .
  • the optical package 15700 may include a plurality of storage capacitors 15704 formed (e.g., monolithically integrated) in the substrate 15702 (e.g., an array of storage capacitors 15704 , for example a two-dimensional array).
  • the optical package 15700 may include a plurality of switches 15706 formed (e.g., monolithically integrated) in the substrate, for example a plurality of transistors (e.g., field effect transistors). Each switch 15706 may be connected between at least one capacitor 15704 (e.g., exactly one capacitor 15704 ) and a laser diode 15708 .
  • the substrate 15702 may include a base 15702 s , e.g. including or essentially consisting of silicon.
  • the substrate 15702 may include an insulating layer 15702 i , for example including an oxide, such as silicon oxide.
  • the laser diode 15708 may be a vertical cavity surface-emitting laser diode (e.g., emitting light from a top surface of the laser diode 15708 ), for example having a pyramid shape.
  • the laser diode 15708 may be mounted on the substrate 15702 (e.g., on the insulating layer 15702 i ).
  • the laser diode 15708 may include an active layer 15708 a (illustratively, a layer of active material).
  • the laser diode 15708 may include one or more optical structures 15708 o , arranged above and/or underneath the active layer 15708 a .
  • the laser diode 15708 may include a first optical structure 15708 o arranged on top of the active layer 15708 a (e.g., in direct physical contact with the active layer 15708 a ).
  • the first optical structure 15708 o may be a top Bragg mirror (e.g., a sequence of alternating thin layers of dielectric materials having high and low refractive index).
  • the laser diode 15708 may include a second optical structure 15708 o arranged underneath the active layer 15708 a (e.g., in direct physical contact with the active layer 15708 a ).
  • the second optical structure 15708 o may be a bottom Bragg mirror.
  • the optical package 15700 may include a printed circuit board 15710 .
  • the substrate 15702 may be mounted on the printed circuit board 15710 .
  • the laser diode 15708 may be electrically connected to the printed circuit board 15710 (e.g., to an electrical contact of the printed circuit board 15710 ), for example via one or more bond wires 15712 .
  • the laser diode 15708 may include a (e.g., second) terminal 15714 arranged on top of the laser diode 15708 (e.g., a top contact).
  • the terminal 15714 may be a ring-like mesa structure (e.g., to allow emission of the laser light), as illustrated, for example, in FIG. 19B .
  • the one or more bond wires 15712 may be connected to the terminal 15714 .
  • the laser diode 15708 may include another (e.g., first) terminal 15716 arranged at a bottom surface of the laser diode 15708 (e.g., a bottom contact).
  • the terminal 15716 may be electrically coupled with a connector structure 15718 (e.g., a connector structure 15718 formed in the substrate 15702 ).
  • the connector structure 15718 may provide electrical coupling (e.g., an electrical path) with the switches 15706 and the capacitors 15704 (e.g., between the terminal 15716 and the switches 15706 and the capacitors 15704 ).
  • the connector structure 15718 may include a plurality of electrical contacts 15718 c .
  • Each electrical contact 15718 c may be connected with a respective capacitor 15704 , for example via the respective switch 15706 .
  • the connector structure 15718 may be selectively coupled to the plurality of storage capacitors 15704 (e.g., pin-like storage capacitors) by the plurality of switching devices 15706 .
  • the connector structure 15718 may be an example for the common line 15510 .
  • the connector structure 15718 may be used to charge the plurality of capacitors 15704 .
  • the connector structure 15718 may be electrically coupled with a power source.
  • the connector structure 15718 may be electrically coupled with the printed circuit board 15710 , for example via one or more bond wires 15720 .
  • the connector structure 15718 may be electrically coupled with an electrical terminal of the printed circuit board 15710 .
  • a power source may be electrically coupled with the electrical terminal of the printed circuit board 15710 .
  • the connector structure 15718 may have a comb-like arrangement including a plurality of connector lines (as shown in FIG. 19B ). Each connector line may optionally include or be associated with a respective switch (e.g., a field effect transistor) for providing additional control over the selection of the capacitors to be charged (e.g., in addition to the selection by means of the access lines 15722 ).
  • the substrate 15702 may include a plurality of access lines 15722 (illustratively, a plurality of word-lines). Each access line may be electrically coupled with one or more switches 15706 (e.g., with respective control terminals, e.g. gate terminals, of one or more switches 15706 ). The access lines 15722 may be used to control (e.g., open or close) the one or more switches 15706 coupled thereto.
  • the optical package 15700 may include a processor configured as the processor 15514 described above, for example in relation to FIG. 17A to FIG. 18 .
  • the processor may be configured to control the switches 15706 by supplying a control signal (e.g., a plurality of control signals) via the plurality of access lines 15722 .
  • the optical package 15700 may include one or more through-vias 15724 (e.g., through-silicon vias), as an example of heat dissipation component.
  • a through-via 15724 may extend through the substrate in the vertical direction (e.g., through the base 15702 s and through the insulating layer 15702 i ).
  • the through-via 15724 may be filled with a heat dissipation or heat conducting material, such as a metal (e.g., deposited or grown in the through-via 15724 ).
  • the through-via 15724 may be arranged outside the area in which the plurality of capacitors 15704 and/or the plurality of switches 15706 are formed in the substrate 15702 .
  • Example 1ad is an optical package.
  • the optical package may include a substrate.
  • the substrate may include an array of a plurality of capacitors formed in the substrate.
  • the substrate may include a plurality of switches formed in the substrate. Each switch may be connected between at least one laser diode and at least one capacitor of the plurality of capacitors.
  • the optical package may include the at least one laser diode mounted on the substrate.
  • the optical package may include a processor configured to control the plurality of switches to control a first current flow to charge the plurality of capacitors.
  • the processor may be configured to control the plurality of switches to control a second current flow to drive the at least one laser diode with current discharged from at least one capacitor of the plurality of capacitors.
  • the subject-matter of example 1ad can optionally include that the plurality of capacitors and the plurality of switches are monolithically integrated in the substrate.
  • the subject-matter of any one of examples 1ad or 2ad can optionally include that each switch of the plurality of switches is assigned to exactly one respective capacitor of the plurality of capacitors.
  • the subject-matter of example 3ad can optionally include that the processor is configured to individually control the plurality of switches to control the first current flow to charge the plurality of capacitors.
  • the processor may be configured to individually control the plurality of switches to control the second current flow to drive the at least one laser diode with current discharged from at least one capacitor of the plurality of capacitors.
  • each switch of the plurality of switches includes a transistor.
  • the subject-matter of example 5ad can optionally include that at least one transistor of the plurality of transistors is a field effect transistor.
  • the subject-matter of example 6ad can optionally include that at least one field effect transistor of the plurality of transistors is a metal oxide semiconductor field effect transistor.
  • the subject-matter of example 7ad can optionally include that at least one metal oxide semiconductor field effect transistor of the plurality of transistors is a complementary metal oxide semiconductor field effect transistor.
  • the subject-matter of any one of examples 1ad to 8ad can optionally include that the array of capacitors includes a number of capacitors in the range from about 400000 capacitors to about 600000 capacitors associated with the at least one laser diode.
  • the subject-matter of any one of examples 1ad to 9ad can optionally include that at least one capacitor of the array of capacitors has a capacitance in the range from about 50 fF to about 200 fF.
  • the subject-matter of any one of examples 1ad to 10ad can optionally include that the current flow to drive the at least one laser diode includes a current in the range from about 10 mA to about 100 A.
  • the subject-matter of any one of examples 1ad to 11ad can optionally include that an electrical path between a capacitor and the at least one laser diode has an inductivity lower than 100 pH.
  • the subject-matter of any one of examples 1ad to 12ad can optionally include that at least one capacitor of the array of capacitors is a deep trench capacitor.
  • the subject-matter of any one of examples 1ad to 13ad can optionally include that at least one capacitor of the array of capacitors is a stacked capacitor.
  • the subject-matter of any one of examples 1ad to 14ad can optionally include that the capacitors of the array of capacitors are arranged in rows and columns.
  • the subject-matter of any one of examples 1ad to 15ad can optionally include an electrically conductive common line connecting at least some capacitors of the plurality of capacitors.
  • the subject-matter of example 16ad can optionally include a power source electrically connected to the common line and configured to provide the power to charge the plurality of capacitors.
  • the subject-matter of any one of examples 1ad to 17ad can optionally include a printed circuit board.
  • the substrate may be mounted on the printed circuit board.
  • the subject-matter of any one of examples 16ad or 17ad can optionally include a printed circuit board.
  • the substrate may be mounted on the printed circuit board.
  • the printed circuit board may include an electrical contact electrically coupled to the common line of the substrate.
  • the subject-matter of example 19ad can optionally include that the electrical contact of the printed circuit board is wire bonded to the common line of the substrate.
  • the subject-matter of any one of examples 16ad to 20ad can optionally include a printed circuit board.
  • the substrate may be mounted on the printed circuit board.
  • a first terminal of the at least one laser diode may be electrically coupled to the common line.
  • a second terminal of the at least one laser diode may be electrically coupled to an electrical contact of the printed circuit board.
  • the subject-matter of example 21ad can optionally include that the electrical contact of the printed circuit board is wire bonded to the second terminal of the at least one laser diode.
  • the subject-matter of any one of examples 1ad to 22ad can optionally include that the substrate includes or essentially consists of silicon.
  • the subject-matter of any one of examples 1ad to 23ad can optionally include that the at least one laser diode laterally covers at least a portion of the plurality of capacitors.
  • the subject-matter of any one of examples 1ad to 24ad can optionally include that the at least one laser diode includes an edge emitting laser diode.
  • the subject-matter of any one of examples 1ad to 24ad can optionally include that the at least one laser diode includes a vertical cavity surface-emitting laser diode.
  • the subject-matter of any one of examples 1ad to 26ad can optionally include that the processor is monolithically integrated in the substrate.
  • the subject-matter of any one of examples 19ad to 26ad can optionally include that the processor is mounted on the printed circuit board.
  • the subject-matter of any one of examples 19ad to 28ad can optionally include that the processor is configured to control the plurality of switches to discharge at least some capacitors of the plurality of capacitors to drive the at least one laser diode to emit a laser pulse of a predefined pulse shape.
  • the subject-matter of example 29ad can optionally include that the laser pulse has a pulse duration of about 10 ns.
  • the subject-matter of any one of examples 29ad or 30ad can optionally include that the processor is configured to control the plurality of switches to discharge at least some capacitors of the plurality of capacitors to drive the at least one laser diode to emit a laser pulse dependent on a light emission scheme.
  • the subject-matter of any one of examples 29ad to 31ad can optionally include that the processor is configured to control the plurality of switches to discharge at least some capacitors of the plurality of capacitors to drive the at least one laser diode to emit a laser pulse of a predefined pulse shape.
  • Example 33ad is a LIDAR Sensor System including an optical package of any one of examples 1ad to 32ad.
  • the above-described embodiments can be implemented in any of numerous ways.
  • the embodiments may be combined in any order and any combination with other embodiments.
  • the embodiments may be implemented using hardware, software or a combination thereof.
  • the software code can be executed on any suitable processor or collection of processors, whether provided in a single computer or distributed among multiple computers.
  • a computer may be embodied in any of a number of forms, such as a rack-mounted computer, a desktop computer, a laptop computer, or a tablet computer. Additionally, a computer may be embedded in a device (e.g. LIDAR Sensor Device) not generally regarded as a computer but with suitable processing capabilities, including a Personal Digital Assistant (PDA), a smart phone or any other suitable portable or fixed electronic device.
  • a computer may have one or more input and output devices. These devices can be used, among other things, to present a user interface. Examples of output devices that can be used to provide a user interface include printers or display screens for visual presentation of output and speakers or other sound generating devices for audible presentation of output. Examples of input devices that can be used for a user interface include keyboards, and pointing devices, such as mice, touch pads, and digitizing tablets. As another example, a computer may receive input information through speech recognition or in other audible format.
  • Such computers may be interconnected by one or more networks in any suitable form, including a local area network or a wide area network, such as an enterprise network, and intelligent network (IN) or the Internet.
  • networks may be based on any suitable technology and may operate according to any suitable protocol and may include wireless networks, wired networks or fiber optic networks.
  • the various methods or processes outlined herein may be coded as software that is executable on one or more processors that employ any one of a variety of operating systems or platforms. Additionally, such software may be written using any of a number of suitable programming languages and/or programming or scripting tools, and also may be compiled as executable machine language code or intermediate code that is executed on a framework or virtual machine.
  • various disclosed concepts may be embodied as a computer readable storage medium (or multiple computer readable storage media) (e.g., a computer memory, one or more floppy discs, compact discs, optical discs, magnetic tapes, flash memories, circuit configurations in Field Programmable Gate Arrays or other semiconductor devices, or other non-transitory medium or tangible computer storage medium) encoded with one or more programs that, when executed on one or more computers or other processors, perform methods that implement the various embodiments of the disclosure discussed above.
  • the computer readable medium or media can be transportable, such that the program or programs stored thereon can be loaded onto one or more different computers or other processors to implement various aspects of the present disclosure as discussed above.
  • The terms “program” or “software” are used herein in a generic sense to refer to any type of computer code or set of computer-executable instructions that can be employed to program a computer or other processor to implement various aspects of embodiments as discussed above. Additionally, it should be appreciated that according to one aspect, one or more computer programs that when executed perform methods of the present disclosure need not reside on a single computer or processor, but may be distributed in a modular fashion amongst a number of different computers or processors to implement various aspects of the present disclosure.
  • Computer-executable instructions may be in many forms, such as program modules, executed by one or more computers or other devices.
  • program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
  • functionality of the program modules may be combined or distributed as desired in various embodiments.
  • data structures may be stored in computer-readable media in any suitable form.
  • data structures may be shown to have fields that are related through location in the data structure. Such relationships may likewise be achieved by assigning storage for the fields with locations in a computer-readable medium that convey relationship between the fields.
  • any suitable mechanism may be used to establish a relationship between information in fields of a data structure, including through the use of pointers, tags or other mechanisms that establish relationship between data elements.
  • a reference to “A and/or B”, when used in conjunction with open-ended language such as “comprising” can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.
  • the phrase “at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements.
  • This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase “at least one” refers, whether related or unrelated to those elements specifically identified.
  • “at least one of A and B” can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.
  • The terms “connection” and “coupled” have been used to describe how various elements interface or “couple”. Such described interfacing or coupling of elements may be either direct or indirect; i.e., these terms describe both a direct and an indirect connection and a direct or indirect coupling.
  • Actuators are components or devices which are able to convert energy (e.g. electric, magnetic, photoelectric, hydraulic, pneumatic) into a mechanical movement (e.g. translation, rotation, oscillation, vibration, shock, pull, push, etc.). Actuators may be used for example in order to move and/or change and/or modify components such as mechanical elements, optical elements, electronic elements, detector elements, etc. and/or materials or material components. Actuators can also be suited to emit, for example, ultrasound waves, and so on.
  • ASIC Application-Specific Integrated Circuit
  • SOC Systems-on-Chip
  • AGV Automated Guided Vehicle
  • An automated guided vehicle or automatic guided vehicle is a robot that follows markers or wires in the floor, or uses vision, magnets, or lasers for navigation.
  • An AGV can be equipped to operate autonomously.
  • SAE International Society of Automotive Engineers
  • Level 0 where automated systems issue warnings or may momentarily intervene
  • Level 5 where no human interaction is required at all
  • ADAS Advanced Driver-Assistance Systems
  • Such systems may comprise basic features such as anti-lock braking systems (ABS) and electronic stability controls (ESC), which are usually considered as Level 0 features, as well as more complex features, such as lane departure warning, lane keep assistant, lane change support, adaptive cruise control, collision avoidance, emergency brake assistant and adaptive high-beam systems (ADB), etc., which may be considered as Level 1 features.
  • Levels 2, 3 and 4 features can be denoted as partial automation, conditional automation and high automation, respectively.
  • Level 5 finally, can be denoted as full automation.
  • Level 5 Alternative and widely used terms for Level 5 are driverless cars, self-driving cars or robot cars.
  • AGV Automated Guided Vehicle
  • UAV Unmanned Aerial Vehicles
  • AV Autonomous Vehicle
  • a Beacon is a device that emits signal data for communication purposes, for example based on Bluetooth or protocols based on DIIA, Thread, ZigBee or MDSIG technology.
  • a Beacon can establish a Wireless Local Area Network.
  • the light beam emitted by the light source may be transmitted into the Field of Illumination (FOI) either in a scanning or a non-scanning manner.
  • the light of the light source is transmitted into the complete FOI in one single instance, i.e. the light beam is broadened (e.g. by a diffusing optical element) in such a way that the whole FOI is illuminated at once.
  • the light beam is directed over the FOI either in a 1-dimensional manner (e.g. by moving a vertical light stripe in a horizontal direction, or vice versa) or in a 2-dimensional manner (e.g. by moving a light spot along a zigzag pattern across the FOI).
  • Mechanical solutions may comprise rotating mirrors, oscillating mirrors, in particular oscillating micro-electromechanical mirrors (MEMS), Digital Mirror Devices (DMD), Galvo-Scanner, etc.
  • the moving mirrors may have plane surface areas (e.g. with circular, oval, rectangular or polygonal shape) and may be tilted or swiveled around one or more axes.
  • Non-mechanical solutions may comprise so called optical phased arrays (OPA) in which the phases of light waves are varied by dynamically controlling the optical properties of an adjustable optical element (e.g. phase modulators, phase shifters, Liquid Crystal Elements (LCD), etc.).
  • Communication interface describes all sorts of interfaces or gateways between two devices, which can be used to exchange signals.
  • Signals in this context may comprise simple voltage or current levels, as well as complex information based on the above described coding or modulation techniques.
  • communication interfaces may be used to transfer information (signals, data, etc.) between different components of the LIDAR Sensor System. Furthermore, communication interfaces may be used to transfer information (signals, data, etc.) between the LIDAR Sensor System or its components or modules and other devices provided in the vehicle, in particular other sensor systems (LIDAR, RADAR, Ultrasonic, Cameras) in order to allow sensor fusion functionalities.
  • a communication unit is an electronic device, which is configured to transmit and/or receive signals to or from other communication units.
  • Communication units may exchange information in a one-directional, bi-directional or multi-directional manner.
  • Communication signals may be exchanged via electromagnetic waves (including radio or microwave frequencies), light waves (including UV, VIS, IR), acoustic waves (including ultrasonic frequencies).
  • the information may be exchanged using all sorts of coding or modulation techniques e.g. pulse width modulation, pulse code modulation, amplitude modulation, frequency modulation, etc.
  • the information may be transmitted in an encrypted or non-encrypted manner and distributed in a trusted or distrusted network (for example a Blockchain ledger).
  • a trusted or distrusted network for example a Blockchain ledger.
  • vehicles and elements of road infrastructure may comprise CUs in order to exchange information with each other via so-called C2C (Car-to-Car) or C2X (Car-to-Infrastructure or Car-to-Environment) communication.
  • such communication units may be part of Internet-of-Things (IoT) Systems, i.e. a network of devices, sensors, vehicles, and other appliances, which connect, interact and exchange data with each other.
  • Component describes the elements, in particular the key elements, which make up the LIDAR System.
  • key elements may comprise a light source unit, a beam steering unit, a photodetector unit, ASIC units, processor units, timing clocks, generators of discrete random or stochastic values, and data storage units.
  • components may comprise optical elements related to the light source, optical elements related to the detector unit, electronic devices related to the light source, electronic devices related to the beam steering unit, electronic devices related to the detector unit and electronic devices related to ASIC, processor and data storage and data executing devices.
  • Components of a LIDAR Sensor System may further include a high-precision clock, a Global Positioning System (GPS) and an inertial measurement unit (IMU).
  • a Computer program device is a device or product, which is able to execute instructions stored in a memory block of the device or which is able to execute instructions that have been transmitted to the device via an input interface.
  • Such computer program products or devices comprise any kind of computer-based system or software-based system, including processors, ASICs or any other electronic device which is capable to execute programmed instructions.
  • Computer program devices may be configured to perform methods, procedures, processes or control activities related to LIDAR Sensor Systems.
  • a Control and Communication System receives input from the LIDAR Data Processing System and communicates with the LIDAR Sensing System, LIDAR Sensor Device and vehicle control and sensing system as well as with other objects/vehicles.
  • Controlled LIDAR Sensor System comprises one or many controlled “First LIDAR Sensor Systems”, and/or one or many controlled “Second LIDAR Sensor Systems”, and/or one or many controlled LIDAR Data Processing Systems, and/or one or many controlled LIDAR Sensor Devices, and/or one or many controlled Control and Communication Systems.
  • Controlled means local or remote checking and fault detection and repair of either of the above-mentioned LIDAR Sensor System components. Controlled can also mean the control of a LIDAR Sensor Device, including a vehicle.
  • Controlled can also mean the inclusion of industry standards, bio feedbacks, safety regulations, autonomous driving levels (e.g. SAE Levels) and ethical and legal frameworks.
  • Controlled can also mean the control of more than one LIDAR Sensor System, or more than one LIDAR Sensor Device, or more than one vehicle and/or other objects.
  • a Controlled LIDAR Sensor System may include the use of artificial intelligence systems, data encryption and decryption, as well as Blockchain technologies using digital records that store a list of transactions (called “blocks”) backed by a cryptographic value. Each block contains a link to the previous block, a timestamp, and data about the transactions that it represents. Blocks are immutable, meaning that they cannot easily be modified once they are created, and the data of a blockchain are stored non-locally, i.e. on different computers.
  • a Controlled LIDAR Sensor System may be configured to perform sensor fusion functions, such as collecting, evaluating and consolidating data from different sensor types (e.g. LIDAR, RADAR, Ultrasonic, Cameras).
  • the Controlled LIDAR Sensor System comprises feedback- and control-loops, i.e. the exchange of signals, data and information between different components, modules and systems which are all employed in order to derive a consistent understanding of the surroundings of a sensor system, e.g. of a sensor system onboard a vehicle.
  • Signals and data may be processed via Edge-Computing or Cloud-Computing systems, using corresponding Communication Units (CUs). Signals and data may be transmitted for that matter in an encrypted manner.
  • Data security can also be enhanced by a combination of security controls, measures and strategies, singly and/or in combination, applied throughout a system's “layers”, including human, physical, endpoint, network, application and data environments.
  • Data analysis may benefit from data deconvolution methods or other suitable methods known from imaging and signal processing, including neural network and deep learning techniques.
  • LIDAR generated data sets can be used for the control and steering of vehicles (e.g. cars, ships, planes, drones), including remote control operations (e.g. parking operations or operations executed for example by an emergency officer in a control room).
  • the data sets can be encrypted and communicated (C2C, C2X), as well as presented to a user (for example by HUD or Virtual/Augmented Reality using wearable glasses or similar designs).
  • LIDAR Systems can also be used for data encryption purposes.
  • Data Usage may also comprise using methods of Artificial Intelligence (AI), i.e. computer-based systems or computer implemented methods which are configured to interpret transmitted data, to learn from such data based on these interpretations and derive conclusions which can be implemented into actions in order to achieve specific targets.
  • the data input for such AI-based methods may come from LIDAR Sensor Systems, as well as other physical or biofeedback sensors (e.g. Cameras which provide video streams from vehicle exterior or interior environments, evaluating e.g. the line of vision of a human driver).
  • AI-based methods may use algorithms for pattern recognition.
  • Data Usage in general, may employ mathematical or statistical methods in order to predict future events or scenarios based on available previous data sets (e.g. Bayesian method). Furthermore, Data Usage may include considerations regarding ethical questions (reflecting situations like for example the well-known “trolley dilemma”).
  • a Detector is a device which is able to provide an output signal (to an evaluation electronics unit) which is qualitatively or quantitatively correlated to the presence or the change of physical (or chemical) properties in its environment. Examples for such physical properties are temperature, pressure, acceleration, brightness of light (UV, VIS, IR), vibrations, electric fields, magnetic fields, electromagnetic fields, acoustic or ultrasound waves, etc.
  • Detector devices may comprise cameras (mono or stereo) using e.g. light-sensitive CCD or CMOS chips or stacked multilayer photodiodes, ultrasound or ultrasonic detectors, detectors for radio waves (RADAR systems), photodiodes, temperature sensors such as NTC-elements (i.e. a thermistor with negative temperature coefficient), acceleration sensors, etc.
  • a photodetector is a detection device, which is sensitive with respect to the exposure to electromagnetic radiation. Typically, light photons are converted into a current signal upon impingement onto the photosensitive element.
  • Photosensitive elements may comprise semiconductor elements with p-n junction areas, in which photons are absorbed and converted into electron-hole pairs.
  • detector types may be used for LIDAR applications, such as photo diodes, PN-diodes, PIN diodes (positive intrinsic negative diodes), APD (Avalanche Photo-Diodes), SPAD (Single Photon Avalanche Diodes), SiPM (Silicon Photomultipliers), CMOS sensors (Complementary Metal-Oxide-Semiconductor), CCD (Charge-Coupled Device) sensors, stacked multilayer photodiodes, etc.
  • In LIDAR systems, a photodetector is used to detect (qualitatively and/or quantitatively) echo signals from light which was emitted by the light source into the FOI and which was reflected or scattered thereafter from at least one object in the FOI.
  • the photodetector may comprise one or more photosensitive elements (of the same type or of different types) which may be arranged in linear stripes or in two-dimensional arrays.
  • the photosensitive area may have a rectangular, quadratic, polygonal, circular or oval shape.
  • a photodetector may be covered with Bayer-like visible or infrared filter segments.
  • a digital map is a collection of data that may be used to be formatted into a virtual image.
  • the primary function of a digital map is to provide accurate representations of measured data values.
  • Digital mapping also allows the calculation of geometrical distances from one object, as represented by its data set, to another object.
  • a digital map may also be called a virtual map.
  • Electronic devices denotes all kinds of electronics components or electronic modules, which may be used in a LIDAR Sensor System in order to facilitate its function or improve its function.
  • such electronic devices may comprise drivers and controllers for the light source, the beam steering unit or the detector unit.
  • Electronic devices may comprise all sorts of electronics components used in order to supply voltage, current or power.
  • Electronic devices may further comprise all sorts of electronics components used in order to manipulate electric or electronic signals, including receiving, sending, transmitting, amplifying, attenuating, filtering, comparing, storing or otherwise handling electric or electronic signals.
  • Besides electronic devices related to the light source, there may be electronic devices related to the beam steering unit, electronic devices related to the detector unit and electronic devices related to ASIC and processor units.
  • Electronic devices may comprise also Timing Units, Positioning Units (e.g. actuators), position tracking units (e.g. GPS, Geolocation, Indoor-Positioning Units, Beacons, etc.), communication units (WLAN, radio communication, Bluetooth, BLE, etc.) or further measurement units (e.g. inertia, accelerations, vibrations, temperature, pressure, position, angle, rotation, etc.).
  • the term Field of Illumination relates to the solid angle sector into which light can be transmitted by the LIDAR light source (including all corresponding downstream optical elements).
  • the FOI is limited along a horizontal direction to an opening angle βH and along a vertical direction to an opening angle βV.
  • the light of the LIDAR light source may be transmitted into the complete FOI in one single instance (non-scanning LIDAR) or may be transmitted into the FOI in a successive, scanning manner (scanning LIDAR).
  • the term Field of View relates to the solid angle sector from which the LIDAR detector (including all corresponding upstream optical elements) can receive light signals.
  • the FOV is limited along a horizontal direction to an opening angle αH and along a vertical direction to an opening angle αV.
  • a LIDAR Sensor System where the angular information (object recognition) about the environment is gained by using an angularly sensitive detector is usually called a Flash LIDAR Sensor System.
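  • To make the opening-angle definitions above concrete, the following Python sketch computes the approximate angular resolution per sensor pixel of a Flash LIDAR detector array for a given FOV, assuming (for illustration only) that the FOV is divided uniformly among the pixels.

        # Minimal sketch: per-pixel angular resolution of a Flash LIDAR detector.
        # The FOV is assumed to be divided uniformly among the pixels of a 2D
        # array; all numerical values are illustrative.

        def angular_resolution(fov_deg: float, n_pixels: int) -> float:
            """Approximate angular resolution (degrees per pixel) along one axis."""
            return fov_deg / n_pixels

        alpha_h, alpha_v = 60.0, 20.0   # illustrative opening angles in degrees
        cols, rows = 320, 80            # illustrative detector array size
        print(f"horizontal: {angular_resolution(alpha_h, cols):.3f} deg/pixel")
        print(f"vertical:   {angular_resolution(alpha_v, rows):.3f} deg/pixel")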
  • the term “frame” may be used to describe a logical structure of a signal (e.g., an electrical signal or a light signal or a LIDAR signal, such as a light signal).
  • the term “frame” may describe or define an arrangement (e.g., a structure) for the content of the frame (e.g., for the signal or the signal components).
  • the arrangement of content within the frame may be configured to provide data or information.
  • a frame may include a sequence of symbols or symbol representations.
  • a symbol or a symbol representation may have a different meaning (e.g., it may represent different types of data) depending on its position within the frame.
  • a frame may have a predefined time duration.
  • a frame may define a time window, within which a signal may have a predefined meaning.
  • a light signal configured to have a frame structure may include a sequence of light pulses representing (or carrying) data or information.
  • a frame may be defined by a code (e.g., a signal modulation code), which code may define the arrangement of the symbols within the frame.
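  • The frame concept can be illustrated with a small sketch: a hypothetical signal modulation code maps each data bit to the position of a light pulse within a fixed time slot, so that the arrangement of pulses within the frame carries the data. The slot layout and the code below are assumptions for illustration, not the modulation scheme of this disclosure.

        # Minimal sketch: a frame as a timed arrangement of light pulses.
        # Each bit occupies one slot of SLOT_NS nanoseconds; a '1' is represented
        # by a pulse early in the slot, a '0' by a pulse late in the slot
        # (a simple pulse-position code, assumed here for illustration).

        SLOT_NS = 100
        EARLY_NS, LATE_NS = 20, 70

        def encode_frame(bits):
            """Return pulse emission times (ns) for a frame carrying the bits."""
            return [i * SLOT_NS + (EARLY_NS if bit else LATE_NS)
                    for i, bit in enumerate(bits)]

        def decode_frame(pulse_times_ns):
            """Recover the bits, assuming the same slot layout."""
            return [1 if (t % SLOT_NS) < SLOT_NS // 2 else 0 for t in pulse_times_ns]

        frame = encode_frame([1, 0, 1, 1, 0])
        print(frame)                # [20, 170, 220, 320, 470]
        print(decode_frame(frame))  # [1, 0, 1, 1, 0]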
  • Gateway means networking hardware equipped for interfacing with another network.
  • a gateway may contain devices such as protocol translators, impedance matching devices, rate converters, fault isolators, or signal translators as necessary to provide system interoperability. It also requires the establishment of mutually acceptable administrative procedures between both networks.
  • a gateway is a node on a network that serves as a ‘gate’ or entrance and exit point to and from the network.
  • a node is an active redistribution and/or communication point with a unique network address that either creates, receives or transmits data, sometimes referred to as a ‘data node’.
  • HMI: Human-Machine Interaction
  • a biofeedback system may detect that the driver of a vehicle shows signs of increased fatigue, which are evaluated by a central control unit, finally leading to a switchover from a lower SAE level to a higher SAE level.
  • a LIDAR Data Processing System may comprise functions of signal processing, signal optimization (signal/noise), data analysis, object detection, object recognition, information exchange with edge and cloud computing, data banks, data libraries and other sensing devices (for example other LIDAR devices, radar, camera, ultrasound, biometrical feedback data, driver control devices, car-to-car (C2C) communication, car-to-environment (C2X) communication, geolocation data (GPS)).
  • a LIDAR Data Processing System may generate point clouds (3D/6D), object location, object movement, environment data, object/vehicle density.
  • a LIDAR Data Processing System may include feedback control to the First LIDAR Sensing System and/or Second LIDAR Sensing System and/or Control and Communication System, etc.
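  • As an illustration of how a LIDAR Data Processing System may turn raw range measurements into a point cloud, the following Python sketch converts (range, azimuth, elevation) tuples into Cartesian points; the input format is an assumption chosen for illustration.

        import math

        # Minimal sketch: build a 3D point cloud from (range, azimuth, elevation)
        # measurements by converting spherical to Cartesian coordinates.

        def to_cartesian(r_m: float, azimuth_deg: float, elevation_deg: float):
            """Convert one range measurement to an (x, y, z) point in meters."""
            az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
            return (r_m * math.cos(el) * math.cos(az),
                    r_m * math.cos(el) * math.sin(az),
                    r_m * math.sin(el))

        measurements = [(15.2, -10.0, 1.5), (15.3, -9.5, 1.5), (48.7, 2.0, 0.0)]
        point_cloud = [to_cartesian(*m) for m in measurements]
        for p in point_cloud:
            print(tuple(round(c, 2) for c in p))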
  • a LIDAR Sensing System may comprise one or many LIDAR emission modules, here termed "First LIDAR Sensing System", and/or one or many LIDAR sensor modules, here termed "Second LIDAR Sensing System".
  • sensor or sensor module describes, in the framework of this patent application, a module which is configured to function as a LIDAR Sensor System. As such, it may comprise a minimum set of LIDAR key components necessary to perform basic LIDAR functions such as a distance measurement.
  • a LIDAR (light detection and ranging) Sensor is to be understood in particular as meaning a system which, in addition to one or more emitters for emitting light beams, for example in pulsed form, and a detector for detecting any reflected beam components, may have further devices, for example optical elements such as lenses and/or a MEMS mirror.
  • a LIDAR Sensor can therefore also be called a LIDAR System or a LIDAR Sensor System or LIDAR detection system.
  • a LIDAR Sensor Device is a LIDAR Sensor System either stand-alone or integrated into a housing, light fixture, headlight or other vehicle components, furniture, ceiling, textile, etc. and/or combined with other objects (e.g. vehicles, pedestrians, traffic participation objects, etc.).
  • LIDAR Sensor Management System receives input from the LIDAR Data Processing System and/or Control and Communication System and/or any other component of the LIDAR Sensor Device, and outputs control and signaling commands to the First LIDAR Sensing System and/or Second LIDAR Sensing System.
  • LIDAR Sensor Management Software (includes feedback software) for use in a LIDAR Sensor Management System.
  • a LIDAR Sensor Module comprises at least one LIDAR Light Source, at least one LIDAR Sensing Element, and at least one driver connected to the at least one LIDAR Light Source. It may further include Optical Components and a LIDAR Data Processing System supported by LIDAR signal processing hard- and software.
  • a LIDAR System is a system that may be, or may be configured as, a LIDAR Sensor System.
  • a LIDAR Sensor System is a system, which uses light or electromagnetic radiation, respectively, to derive information about objects in the environment of the LIDAR system.
  • LIDAR stands for Light Detection and Ranging.
  • Alternative names may comprise LADAR (laser detection and ranging), LEDDAR (Light-Emitting Diode Detection and Ranging) or laser radar.
  • LIDAR systems typically comprise a variety of components, as described below.
  • LIDAR systems are arranged at a vehicle to derive information about objects on a roadway and in the vicinity of a roadway.
  • objects may comprise other road users (e.g. vehicles, pedestrians, cyclists, etc.), elements of road infrastructure (e.g. traffic signs, traffic lights, roadway markings, guardrails, traffic islands, sidewalks, bridge piers, etc.) and generally all kinds of objects which may be found on a roadway or in the vicinity of a roadway, either intentionally or unintentionally.
  • the information derived via such a LIDAR system may comprise the distance, the velocity, the acceleration, the direction of movement, the trajectory, the pose and/or other physical or chemical properties of these objects.
  • the LIDAR system may determine the Time-of-Flight (TOF) or variations of physical properties such as phase, amplitude, frequency, polarization, structured dot pattern, triangulation-based methods, etc. of the electromagnetic radiation emitted by a light source after the emitted radiation was reflected or scattered by at least one object in the Field of Illumination (FOI) and detected by a photodetector.
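  • In its simplest form, the Time-of-Flight principle reduces to d = c·t/2, since the emitted pulse travels to the object and back. A minimal Python sketch with illustrative values:

        # Minimal sketch: distance from a round-trip Time-of-Flight measurement.
        # The pulse travels to the object and back, hence the factor 1/2.

        C_M_PER_S = 299_792_458  # speed of light in vacuum

        def tof_distance(round_trip_s: float) -> float:
            """Return the object distance in meters for a round-trip time in seconds."""
            return 0.5 * C_M_PER_S * round_trip_s

        # Example: an echo received 400 ns after emission corresponds to ~60 m.
        print(f"{tof_distance(400e-9):.2f} m")  # 59.96 m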
  • LIDAR systems may be configured as Flash LIDAR or Solid-State LIDAR (no moving optics), Scanning LIDAR (1- or 2-MEMS mirror systems, Fiber-Oscillator), Hybrid versions as well as in other configurations.
  • the Light Control Unit may be configured to control the at least one First LIDAR Sensing System and/or at least one Second LIDAR Sensing System for operating in at least one operation mode.
  • the Light Control Unit may comprise a light control software. Possible operation modes are e.g.: dimming, pulsed, PWM, boost, irradiation patterns, including illuminating and non-illuminating periods, light communication (including C2C and C2X), synchronization with other elements of the LIDAR Sensor System, such as a second LIDAR Sensor Device.
  • a light source for LIDAR applications provides electromagnetic radiation or light, respectively, which is used to derive information about objects in the environment of the LIDAR system.
  • the light source emits radiation in a non-visible wavelength range, in particular infrared radiation (IR) in the wavelength range from 850 nm up to 8100 nm.
  • the light source emits radiation in a narrow spectral bandwidth with a Full Width at Half Maximum (FWHM) between 1 nm and 100 nm.
  • a LIDAR light source may be configured to emit more than one wavelength, visible or invisible, either at the same time or in a time-sequential fashion.
  • the light source may emit pulsed radiation comprising individual pulses of the same pulse height or trains of multiple pulses with uniform pulse height or with varying pulse heights.
  • the pulses may have a symmetric pulse shape, e.g. a rectangular pulse shape.
  • the pulses may have asymmetric pulse shapes, with differences in their respective rising and falling edges.
  • Pulse lengths can be in the range of picoseconds (ps) up to microseconds (µs).
  • the plurality of pulses may also overlap with each other, at least partially.
  • the light source may be operated also in a continuous wave operation mode, at least temporarily.
  • the light source may be adapted to vary phase, amplitude, frequency, polarization, etc. of the emitted radiation.
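  • The pulse properties listed above (pulse trains, varying pulse heights, asymmetric rising and falling edges, partially overlapping pulses) can be sketched numerically. The following Python example samples a train of pulses with a fast linear rise and a slower exponential fall; all shape parameters are illustrative assumptions.

        import math

        # Minimal sketch: sample an asymmetric pulse train (linear rise,
        # exponential fall) with per-pulse heights. Overlapping tails add up.

        def pulse_train(t_ns, starts_ns, heights, rise_ns=2.0, fall_ns=8.0):
            """Return the emitted power (arbitrary units) at time t_ns."""
            power = 0.0
            for t0, h in zip(starts_ns, heights):
                dt = t_ns - t0
                if 0.0 <= dt < rise_ns:              # rising edge
                    power += h * dt / rise_ns
                elif dt >= rise_ns:                  # falling edge
                    power += h * math.exp(-(dt - rise_ns) / fall_ns)
            return power

        starts = [0.0, 50.0, 100.0]   # pulse start times in ns
        heights = [1.0, 0.6, 0.8]     # varying pulse heights
        print([round(pulse_train(t, starts, heights), 3) for t in range(0, 30, 5)])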
  • the light source may comprise solid-state light sources (e.g. edge-emitting lasers, surface-emitting lasers, semiconductor lasers, VCSEL, VECSEL, LEDs, superluminescent LEDs, etc.).
  • the light source may comprise one or more light emitting elements (of the same type or of different types) which may be arranged in linear stripes or in two-dimensional arrays.
  • the light source may further comprise active or passive heat dissipation elements.
  • the light source may have several interfaces, which facilitate electrical connections to a variety of electronic devices such as power sources, drivers, controllers, processors, etc. Since a vehicle may employ more than one LIDAR system, each of them may have different laser characteristics, for example, regarding laser wavelength, pulse shape and FWHM.
  • the LIDAR light source may be combined with a regular vehicle lighting function, such as headlight, Daytime Running Light (DRL), Indicator Light, Brake Light, Fog Light etc. so that both light sources (LIDAR and another vehicle light source) are manufactured and/or placed on the same substrate, or integrated into the same housing and/or be combined as a non-separable unit.
  • a marker can be any electro-optical unit, for example an array of photodiodes, worn by external objects, in particular pedestrians and bicyclists, that can detect infrared radiation or acoustic waves (infrasound, audible, ultrasound), process the incoming radiation/waves and, as a response, reflect or emit infrared radiation or acoustic waves (infrasound, audible, ultrasound) with the same or different wavelength, and directly or indirectly communicate with other objects, including autonomously driving vehicles.
  • Method may describe a procedure, a process, a technique or a series of steps which are executed in order to accomplish a result or to perform a function.
  • Method may for example refer to a series of steps during manufacturing or assembling a device.
  • Method may also refer to a way of using a product or device to achieve a certain result (e.g. measuring a value, storing data, processing a signal, etc.).
  • a light source module may describe a module, which comprises a light source, several beam forming optical elements and a light source driver as an electronic device, which is configured to supply power to the light source.
  • Objects may generally denote all sorts of physical, chemical or biological matter for which information can be derived via a sensor system.
  • objects may describe other road users (e.g. vehicles, pedestrians, cyclists, etc.), elements of road infrastructure (e.g. traffic signs, traffic lights, roadway markings, guardrails, traffic islands, sidewalks, bridge piers, etc.) and generally all kinds of objects which may be found on a roadway or in the vicinity of a roadway, either intentionally or unintentionally.
  • a processor is an electronic circuit, which performs multipurpose processes based on binary data inputs.
  • microprocessors are processing units based on a single integrated circuit (IC).
  • a processor receives binary data, which may be processed according to instructions stored in a memory block of the processor, and provides binary results as outputs via its interfaces.
  • a LIDAR Sensor System where the angular information is gained by using a moveable mirror for scanning (i.e. angularly emitting) the laser beam across the Field of View (FOV), or any other technique to scan a laser beam across the FOV, is called a Scanning LIDAR Sensor System.
  • a sensor in the context of this disclosure includes one or more sensor pixels (which may also be referred to as pixel). Each sensor pixel includes exactly one photo diode.
  • the sensor pixels may all have the same shape or different shapes.
  • the sensor pixels may all have the same spacing to their respective neighbors or may have different spacings.
  • the sensor pixels may all have the same orientation in space or different orientation in space.
  • the sensor pixels may all be arranged within one plane or within different planes or other non-planar surfaces.
  • the sensor pixels may include the same material combination or different material combinations.
  • the sensor pixels may all have the same surface structure or may have different surface structures. Sensor pixels may be arranged and/or connected in groups.
  • each sensor pixel may have an arbitrary shape.
  • the sensor pixels may all have the same size or different sizes.
  • each sensor pixel may have an arbitrary size.
  • the sensor pixels may all include a photo diode of the same photo diode type or of different photo diode types.
  • a photo diode type may be characterized by one or more of the following features: size of the photo diode; sensitivity of the photo diode regarding conversion of electromagnetic radiation into electrical signals (the variation of the sensitivity may be caused by the application of different reverse-bias voltages); sensitivity of the photo diode regarding light wavelengths; voltage class of the photo diode; structure of the photo diode (e.g. PIN photo diode, avalanche photo diode, or single-photon avalanche photo diode); and material(s) of the photo diode.
  • the sensor pixels may be configured to be in functional relationship with color-filter elements and/or optical components.
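  • The photo diode type characterization above lends itself to a simple data structure. The following Python sketch models sensor pixels whose diode type is described by a subset of the listed features; the field names and example values are assumptions chosen for illustration.

        from dataclasses import dataclass

        # Minimal sketch: a sensor pixel characterized by photo diode features.

        @dataclass(frozen=True)
        class PhotoDiodeType:
            structure: str             # e.g. "PIN", "APD", "SPAD"
            size_um: float             # edge length of the photosensitive area
            peak_wavelength_nm: float  # wavelength of maximum sensitivity
            reverse_bias_v: float      # applied reverse-bias voltage

        @dataclass
        class SensorPixel:
            row: int
            col: int
            diode: PhotoDiodeType

        apd = PhotoDiodeType("APD", 50.0, 905.0, 120.0)
        pixels = [SensorPixel(r, c, apd) for r in range(2) for c in range(2)]
        print(pixels[0])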
  • Sensors are devices, modules or subsystems whose purpose is to detect events or changes in their environment and send the information to other electronics, frequently a computer processor.
  • Sensors are available for all kinds of measurement purposes, for example the measurement of touch, temperature, humidity, air pressure and flow, electromagnetic radiation, toxic substances and the like.
  • a sensor can be an electronic component, module or subsystem that detects events or changes in energy forms in its physical environment (such as motion, light, temperature, sound, etc.) and sends the information to other electronics such as a computer for processing.
  • Sensors can be used to measure resistive, capacitive, inductive, magnetic, optical or chemical properties.
  • Sensors include camera sensors, for example CCD or CMOS chips, LIDAR sensors for measurements in the infrared wavelength range, Radar Sensors, and acoustic sensors for measurement in the infrasound, audible and ultrasound frequency range.
  • Ultrasound denotes acoustic waves with a frequency above 20 kHz.
  • Sensors can be infrared sensitive and measure for example the presence and location of humans or animals.
  • Sensors can be grouped into a network of sensors.
  • a vehicle can employ a wide variety of sensors, including camera sensors, LIDAR sensing devices, RADAR, acoustical sensor systems, and the like. These sensors can be mounted inside or outside of a vehicle at various positions (roof, front, rear, side, corner, below, inside a headlight or any other lighting unit) and can furthermore establish a sensor network that may communicate via a hub or several sub-hubs and/or via the vehicle's electronic control unit (ECU).
  • Sensors can be connected directly or indirectly to data storage, data processing and data communication devices.
  • Sensors in cameras can be connected to a CCTV (Closed Circuit Television).
  • Light sensors can measure the amount and orientation of reflected light from other objects (reflectivity index).
  • the term sensing field describes the surroundings of a sensor system within which objects or any other contents, as well as their physical or chemical properties (or changes thereof), can be detected.
  • For a LIDAR Sensor System, the sensing field describes a solid angle volume into which light is emitted by the LIDAR light source (FOI) and from which light that has been reflected or scattered by an object can be received by the LIDAR detector (FOV).
  • a LIDAR sensing field may comprise a roadway or the vicinity of a roadway close to a vehicle, but also the interior of a vehicle.
  • a sensing field may also describe the air around the sensor or some objects in direct contact with the sensor.
  • Sensor Optics denotes all kinds of optical elements, which may be used in a LIDAR Sensor System in order to facilitate its function or improve its function.
  • optical elements may comprise lenses or sets of lenses, filters, diffusors, mirrors, reflectors, light guides, Diffractive Optical Elements (DOE), Holographic Optical Elements and generally all kind of optical elements which may manipulate light (or electromagnetic radiation) via refraction, diffraction, reflection, transmission, absorption, scattering, etc.
  • Sensor Optics may refer to optical elements related to the light source, to the beam steering unit or the detector unit. Laser emitter and optical elements may be moved, tilted or otherwise shifted and/or modulated with respect to their distance and orientation.
  • Sensor system optimization may rely on a broad range of methods, functions or devices, including for example computing systems utilizing artificial intelligence, sensor fusion (utilizing data and signals from other LIDAR-sensors, RADAR sensors, Ultrasonic sensors, Cameras, Video-streams, etc.), as well as software upload and download functionalities (e.g. for update purposes).
  • Sensor system optimization may further utilize personal data of a vehicle user, for example regarding age, gender, level of fitness, available driving licenses (passenger car, truck) and driving experiences (cross vehicle weight, number of vehicle axes, trailer, horsepower, front-wheel drive/rear-wheel drive).
  • Personal data may include further details regarding driving experience.
  • Personal data may further include information about previous accidents, insurance policies, warning tickets, police reports, entries in central traffic registers (e.g. Flensburg in Germany), as well as data from biofeedback systems, other health-related systems (e.g. cardiac pacemakers) and other data (e.g. regarding driving and break times, level of alcohol intake, etc.).
  • Personal data may be particularly relevant in car sharing scenarios and may include information about the intended ride (starting location, destination, weekday, number of passengers), the type of loading (passengers only, goods, animals, dangerous goods, heavy load, large load, etc.) and personal preferences (time-optimized driving, safety-optimized driving, etc.).
  • Personal data may be provided via smartphone connections (e.g. based on Bluetooth, WiFi, LiFi, etc.).
  • Smartphones or comparable mobile devices may further be utilized as measurement tools (e.g. ambient light, navigation data, traffic density, etc.) and/or as devices which may serve as assistants, decision-making supports, or the like.
  • signal modulation may be used to describe a modulation of a signal for encoding data in such signal (e.g., a light signal or an electrical signal, for example a LIDAR signal).
  • an electrically modulated light signal may include a sequence of light pulses arranged (e.g., temporally spaced) such that data may be extracted or interpreted according to the arrangement of the light pulses.
  • signal demodulation also referred to as “electrical demodulation” may be used to describe a decoding of data from a signal (e.g., from a light signal, such as a sequence of light pulses).
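  • Complementing the frame-encoding sketch given earlier, signal demodulation can be illustrated by recovering bits from the temporal spacing of received pulses. The interval-to-bit mapping below is an assumption for illustration, not the demodulation scheme of this disclosure.

        # Minimal sketch: demodulate data from a sequence of light pulses,
        # assuming a short inter-pulse interval encodes '0' and a long one '1'.

        SHORT_NS, LONG_NS = 50, 100
        THRESHOLD_NS = (SHORT_NS + LONG_NS) / 2

        def demodulate(pulse_times_ns):
            """Decode bits from intervals between consecutive pulse times."""
            intervals = [b - a for a, b in zip(pulse_times_ns, pulse_times_ns[1:])]
            return [1 if dt > THRESHOLD_NS else 0 for dt in intervals]

        received = [0, 50, 150, 200, 300]  # illustrative arrival times in ns
        print(demodulate(received))        # [0, 1, 0, 1]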
  • a vehicle may be any object or device that either is equipped with a LIDAR Sensor System and/or communicates with a LIDAR Sensor System.
  • a vehicle can be: an automotive vehicle, a flying vehicle, any other moving vehicle, a stationary object, a building, a ceiling, a textile, traffic control equipment, etc.

Abstract

The present disclosure relates to various embodiments of an optical component for a LIDAR Sensor System. The optical component includes a first photo diode implementing a LIDAR sensor pixel in a first semiconductor structure and configured to absorb received light in a first wavelength region, a second photo diode implementing a camera sensor pixel in a second semiconductor structure over the first semiconductor structure and configured to absorb received light in a second wavelength region, and an interconnect layer (e.g. arranged between the first semiconductor structure and the second semiconductor structure) including an electrically conductive structure configured to electrically contact the second photo diode. The received light of the second wavelength region has a shorter wavelength than the received light of the first wavelength region.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This is a continuation of U.S. application Ser. No. 16/809,587, filed on Mar. 5, 2020, which claims priority from German Application No.: 10 2019 205 514.1, filed on Apr. 16, 2019, German Application No.: 10 2019 214 455.1, filed on Sep. 23, 2019, German Application No.: 10 2019 216 362.9, filed on Oct. 24, 2019, German Application No.: 10 2020 201 577.5, filed on Feb. 10, 2020, German Application No.: 10 2019 217 097.8, filed on Nov. 6, 2019, German Application No.: 10 2020 202 374.3, filed on Feb. 25, 2020, German Application No.: 10 2020 201 900.2, filed on Feb. 17, 2020, German Application No.: 10 2019 203 175.7, filed on Mar. 8, 2019, German Application No.: 10 2019 218 025.6, filed on Nov. 22, 2019, German Application No.: 10 2019 219 775.2, filed on Dec. 17, 2019, German Application No.: 10 2020 200 833.7, filed on Jan. 24, 2020, German Application No.: 10 2019 208 489.3, filed on Jun. 12, 2019, German Application No.: 10 2019 210 528.9, filed on Jul. 17, 2019, German Application No.: 10 2019 206 939.8, filed on May 14, 2019, and German Application No.: 10 2019 213 210.3, filed on Sep. 2, 2019, the contents of each of the above-identified applications are incorporated herein by reference in their entirety.
  • TECHNICAL FIELD
  • The technical field of the present disclosure relates generally to light detection and ranging (LIDAR) systems and methods that use light detection and ranging technology. This disclosure focuses on Components for LIDAR Sensor Systems, LIDAR Sensor Systems, LIDAR Sensor Devices and on Methods for LIDAR Sensor Systems or LIDAR Sensor Devices.
  • BACKGROUND INFORMATION
  • There are numerous studies and market forecasts which predict that future mobility and transportation will shift from vehicles supervised by a human operator to vehicles with an increasing level of autonomy, towards fully autonomous, self-driving vehicles. This shift, however, will not be an abrupt change but rather a gradual transition with different levels of autonomy in between, defined for example by SAE International (Society of Automotive Engineers) in SAE J3016. Furthermore, this transition will not take place in a simple linear manner, advancing from one level to the next level while rendering all previous levels dispensable. Instead, it is expected that these levels of different extents of autonomy will co-exist over longer periods of time and that many vehicles and their respective sensor systems will be able to support more than one of these levels.
  • Depending on various factors, a human operator may actively switch, for example, between different SAE levels, depending on the vehicle's capabilities, or the vehicle's operating system may request or initiate such a switch, typically with timely information and an acceptance period for possible human operators of the vehicle. These factors may include internal factors such as individual preference, level of driving experience or the biological state of a human driver, and external factors such as a change of environmental conditions like weather, traffic density or unexpected traffic complexities.
  • It is important to note that the above-described scenario of the future is not a theoretical, far-away eventuality. In fact, already today, a large variety of so-called Advanced Driver Assistance Systems (ADAS) has been implemented in modern vehicles, which clearly exhibit characteristics of autonomous vehicle control. Current ADAS systems may be configured, for example, to alert a human operator in dangerous situations (e.g. lane departure warning), but in specific driving situations, some ADAS systems are able to take over control and perform vehicle steering operations without active selection or intervention by a human operator. Examples may include convenience-driven situations such as adaptive cruise control, but also hazardous situations, as in the case of lane keep assistants and emergency brake assistants.
  • The above-described scenarios all require vehicles and transportation systems with a tremendously increased capacity to perceive, interpret and react to their surroundings. Therefore, it is not surprising that remote environmental sensing systems will be at the heart of future mobility.
  • Since modern traffic can be extremely complex due to a large number of heterogeneous traffic participants, changing environments or insufficiently mapped or even unmapped environments, and due to rapid, interrelated dynamics, such sensing systems will have to be able to cover a broad range of different tasks, which have to be performed with a high level of accuracy and reliability. It turns out that there is not a single "one fits all" sensing system that can meet all the required features relevant for semi-autonomous or fully autonomous vehicles. Instead, future mobility requires different sensing technologies and concepts with different advantages and disadvantages. Differences between sensing systems may be related to perception range, vertical and horizontal field of view (FOV), spatial and temporal resolution, speed of data acquisition, etc. Therefore, sensor fusion and data interpretation, possibly assisted by Deep Neural Learning (DNL) methods and other Neural Processing Unit (NPU) methods for more complex tasks, like the judgment of a traffic situation and the generation of derived vehicle control functions, may be necessary to cope with such complexities. Furthermore, driving and steering of autonomous vehicles may require a set of ethical rules and commonly accepted traffic regulations.
  • Among these sensing systems, LIDAR sensing systems are expected to play a vital role, as well as camera-based systems, possibly supported by radar and ultrasonic systems. With respect to a specific perception task, these systems may operate more or less independently of each other. However, in order to increase the level of perception (e.g. in terms of accuracy and range), signals and data acquired by different sensing systems may be brought together in so-called sensor fusion systems. Merging of sensor data is not only necessary to refine and consolidate the measured results but also to increase the confidence in sensor results by resolving possible inconsistencies and contradictions and by providing a certain level of redundancy. Unintended spurious signals and intentional adversarial attacks may play a role in this context as well.
  • For an accurate and reliable perception of a vehicle's surrounding, not only vehicle-internal sensing systems and measurement data may be considered but also data and information from vehicle-external sources. Such vehicle-external sources may include sensing systems connected to other traffic participants, such as preceding and oncoming vehicles, pedestrians and cyclists, but also sensing systems mounted on road infrastructure elements like traffic lights, traffic signals, bridges, elements of road construction sites and central traffic surveillance structures. Furthermore, data and information may come from far-away sources such as traffic teleoperators and satellites of global positioning systems (e.g. GPS).
  • Therefore, apart from sensing and perception capabilities, future mobility will also heavily rely on capabilities to communicate with a wide range of communication partners. Communication may be unilateral or bilateral and may include various wireless transmission technologies, such as WLAN, Bluetooth and communication based on radio frequencies and visual or non-visual light signals. It is to be noted that some sensing systems, for example LIDAR sensing systems, may be utilized for both sensing and communication tasks, which makes them particularly interesting for future mobility concepts. Data safety and security and unambiguous identification of communication partners are examples where light-based technologies have intrinsic advantages over other wireless communication technologies. Communication may need to be encrypted and tamper-proof.
  • From the above description, it also becomes clear that future mobility has to be able to handle vast amounts of data, as several tens of gigabytes may be generated per driving hour. This means that autonomous driving systems have to acquire, collect and store data at very high speed, usually complying with real-time conditions. Furthermore, future vehicles have to be able to interpret these data, i.e. to derive some kind of contextual meaning within a short period of time in order to plan and execute required driving maneuvers. This demands complex software solutions, making use of advanced algorithms. It is expected that autonomous driving systems will include more and more elements of artificial intelligence, machine learning and self-learning, as well as Deep Neural Networks (DNN) for certain tasks, e.g. visual image recognition, and other Neural Processing Unit (NPU) methods for more complex tasks, like the judgment of a traffic situation and the generation of derived vehicle control functions. Data calculation, handling, storing and retrieving may require a large amount of processing power and hence electrical power.
  • In an attempt to summarize and conclude the above paragraphs, future mobility will involve sensing systems, communication units, data storage devices, data computing and signal processing electronics as well as advanced algorithms and software solutions that may include and offer various ethical settings. The combination of all these elements constitutes a cyber-physical world, usually denoted as the Internet of Things (IoT). In that respect, future vehicles represent some kind of IoT device as well and may be called "Mobile IoT devices".
  • Such “Mobile IoT devices” may be suited to transport people and cargo and to gain or provide information. It may be noted that future vehicles are sometimes also called “smartphones on wheels”, a term which surely reflects some of the capabilities of future vehicles. However, the term implies a certain focus towards consumer-related new features and gimmicks. Although these aspects may certainly play a role, it does not necessarily reflect the huge range of future business models, in particular data-driven business models, that can be envisioned only at the present moment of time but which are likely to center not only on personal, convenience-driven features but include also commercial, industrial or legal aspects.
  • New data-driven business models will focus on smart, location-based services, utilizing for example self-learning and prediction aspects, as well as gesture and language processing with Artificial Intelligence as one of the key drivers. All this is fueled by data, which will be generated in vast amounts in automotive industry by a large fleet of future vehicles acting as mobile digital platforms and by connectivity networks linking together mobile and stationary IoT devices.
  • New mobility services including station-based and free-floating car sharing, as well as ride-sharing propositions have already started to disrupt traditional business fields. This trend will continue, finally providing robo-taxi services and sophisticated Transportation-as-a-Service (TaaS) and Mobility-as-a-Service (MaaS) solutions.
  • Electrification, another game-changing trend with respect to future mobility, has to be considered as well. Hence, future sensing systems will have to pay close attention to system efficiency, weight and energy-consumption aspects. In addition to an overall minimization of energy consumption, also context-specific optimization strategies, depending for example on situation-specific or location-specific factors, may play an important role.
  • Energy consumption may impose a limiting factor for autonomously driving electrical vehicles. There are quite a number of energy-consuming devices like sensors, for example RADAR, LIDAR, camera, ultrasound, Global Navigation Satellite System (GNSS/GPS), sensor fusion equipment, processing power, mobile entertainment equipment, heater, fans, Heating, Ventilation and Air Conditioning (HVAC), Car-to-Car (C2C) and Car-to-Environment (C2X) communication, data encryption and decryption, and many more, all adding up to a high power consumption. Data processing units in particular are very power-hungry. Therefore, it is necessary to optimize all equipment and use such devices in intelligent ways so that a higher battery mileage can be sustained.
  • Besides new services and data-driven business opportunities, future mobility is expected also to provide a significant reduction in traffic-related accidents. Based on data from the Federal Statistical Office of Germany (Destatis, 2018), over 98% of traffic accidents are caused, at least in part, by humans. Statistics from other countries display similarly clear correlations.
  • Nevertheless, it has to be kept in mind that automated vehicles will also introduce new types of risks, which have not existed before. This applies to so far unseen traffic scenarios involving only a single automated driving system, as well as to complex scenarios resulting from dynamic interactions between a plurality of automated driving systems. As a consequence, realistic scenarios aim at an overall positive risk balance for automated driving as compared to human driving performance, with a reduced number of accidents, while tolerating to a certain extent some slightly negative impacts in cases of rare and unforeseeable driving situations. This may be regulated by ethical standards that are possibly implemented in soft- and hardware.
  • Any risk assessment for automated driving has to deal with both, safety and security related aspects: safety in this context is focusing on passive adversaries for example due to malfunctioning systems or system components, while security is focusing on active adversaries for example due to intentional attacks by third parties.
  • In the following a non-exhaustive enumeration is given for safety-related and security-related factors, with reference to “Safety first for Automated Driving”, a white paper published in 2019 by authors from various Automotive OEM, Tier-1 and Tier-2 suppliers.
  • Safety assessment: to meet the targeted safety goals, methods of verification and validation have to be implemented and executed for all relevant systems and components. Safety assessment may include safety by design principles, quality audits of the development and production processes, the use of redundant sensing and analysis components and many other concepts and methods.
  • Safe operation: any sensor system or otherwise safety-related system might be prone to degradation, i.e. system performance may decrease over time or a system may even fail completely (e.g. being unavailable). To ensure safe operation, the system has to be able to compensate for such performance losses for example via redundant sensor systems. In any case, the system has to be configured to transfer the vehicle into a safe condition with acceptable risk. One possibility may include a safe transition of the vehicle control to a human vehicle operator.
  • Operational design domain: every safety-relevant system has an operational domain (e.g. with respect to environmental conditions such as temperature or weather conditions including rain, snow and fog) inside which a proper operation of the system has been specified and validated. As soon as the system gets outside of this domain, the system has to be able to compensate for such a situation or has to execute a safe transition of the vehicle control to a human vehicle operator.
  • Safe layer: the automated driving system needs to recognize system limits in order to ensure that it operates only within these specified and verified limits. This includes also recognizing limitations with respect to a safe transition of control to the vehicle operator.
  • User responsibility: it must be clear at all times which driving tasks remain under the user's responsibility. In addition, the system has to be able to determine factors, which represent the biological state of the user (e.g. state of alertness) and keep the user informed about their responsibility with respect to the user's remaining driving tasks.
  • Human Operator-initiated handover: there have to be clear rules and explicit instructions in case that a human operator requests an engaging or disengaging of the automated driving system.
  • Vehicle-initiated handover: requests for such handover operations have to be clear and manageable by the human operator, including a sufficiently long time period for the operator to adapt to the current traffic situation. In case it turns out that the human operator is not available or not capable of a safe takeover, the automated driving system must be able to perform a minimal-risk maneuver.
  • Behavior in traffic: automated driving systems have to act and react in an easy-to-understand way so that their behavior is predictable for other road users. This may include that automated driving systems have to observe and follow traffic rules and that automated driving systems inform other road users about their intended behavior, for example via dedicated indicator signals (optical, acoustic).
  • Security: the automated driving system has to be protected against security threats (e.g. cyber-attacks), including for example unauthorized access to the system by third party attackers. Furthermore, the system has to be able to secure data integrity and to detect data corruption, as well as data forging. Identification of trustworthy data sources and communication partners is another important aspect. Therefore, security aspects are, in general, strongly linked to cryptographic concepts and methods.
  • Data recording: relevant data related to the status of the automated driving system have to be recorded, at least in well-defined cases. In addition, traceability of data has to be ensured, making strategies for data management a necessity, including concepts of bookkeeping and tagging. Tagging may comprise, for example, to correlate data with location information, e.g. GPS-information.
  • In the following disclosure, various aspects are disclosed which may be related to the technologies, concepts and scenarios presented in this chapter “BACKGROUND INFORMATION”. This disclosure is focusing on LIDAR Sensor Systems, Controlled LIDAR Sensor Systems and LIDAR Sensor Devices as well as Methods for LIDAR Sensor Management. As illustrated in the above remarks, automated driving systems are extremely complex systems including a huge variety of interrelated sensing systems, communication units, data storage devices, data computing and signal processing electronics as well as advanced algorithms and software solutions.
  • SUMMARY OF THE DISCLOSURE
  • LIDAR Sensor System and LIDAR Sensor Device
  • The LIDAR Sensor System according to the present disclosure may be combined with a LIDAR Sensor Device for illumination of an environmental space connected to a light control unit.
  • The LIDAR Sensor System may comprise at least one light module. Said light module has a light source and a driver connected to the light source. The LIDAR Sensor System further has an interface unit, in particular a hardware interface, configured to receive, emit, and/or store data signals. The interface unit may connect to the driver and/or to the light source for controlling the operation state of the driver and/or the operation of the light source.
  • The light source may be configured to emit radiation in the visible and/or the non-visible spectral range, for example in the far-red range of the electromagnetic spectrum. It may be configured to emit monochromatic laser light. The light source may be an integral part of the LIDAR Sensor System as well as a remote yet connected element. It may be placed in various geometrical patterns and distance pitches and may be configured for alternating color or wavelength emission or intensity or beam angle. The LIDAR Sensor System and/or light sources may be mounted such that they are moveable or can be inclined, rotated, tilted, etc. The LIDAR Sensor System and/or light source may be configured to be installed inside a LIDAR Sensor Device (e.g. vehicle) or exterior to a LIDAR Sensor Device (e.g. vehicle). In particular, it is possible that the LIDAR light source or selected LIDAR light sources are mounted or adapted such that they are automatically controllable, in some implementations remotely, in their orientation, movement, light emission, light spectrum, sensor, etc.
  • The light source may be selected from the following group or a combination thereof: light emitting diode (LED), super-luminescent laser diode (LD), VCSEL laser diode array.
  • In some embodiments, the LIDAR Sensor System may comprise a sensor, such as a resistive, a capacitive, an inductive, a magnetic, an optical and/or a chemical sensor. It may comprise a voltage or current sensor. The sensor may connect to the interface unit and/or the driver of the LIDAR light source.
  • In some embodiments, the LIDAR Sensor System and/or LIDAR Sensor Device comprise a brightness sensor, for example for sensing environmental light conditions in proximity of vehicle objects, such as houses, bridges, sign posts, and the like. It may be used for sensing daylight conditions and the sensed brightness signal may e.g. be used to improve surveillance efficiency and accuracy. That way, it may be enabled to provide the environment with a required amount of light of a predefined wavelength.
  • In some embodiments, the LIDAR Sensor System and/or LIDAR Sensor Device comprises a sensor for vehicle movement, position and orientation. Such sensor data may allow a better prediction, as to whether the vehicle steering conditions and methods are sufficient.
  • The LIDAR Sensor System and/or LIDAR Sensor Device may also comprise a presence sensor. This may allow adapting the emitted light to the presence of other traffic participants, including pedestrians, in order to provide sufficient illumination and to prevent or minimize eye damage or skin irritation due to illumination in harmful or invisible wavelength regions, such as UV or IR. It may also be enabled to provide light of a wavelength that may warn or frighten away unwanted presences, e.g. the presence of animals such as pets or insects.
  • In some embodiments, the LIDAR Sensor System and/or LIDAR Sensor Device comprises a sensor or multi-sensor for predictive maintenance and/or for detecting a failure of the LIDAR Sensor System and/or LIDAR Sensor Device.
  • In some embodiments, the LIDAR Sensor System and/or LIDAR Sensor Device comprises an operating hour meter. The operating hour meter may connect to the driver.
  • The LIDAR Sensor System may comprise one or more actuators for adjusting the environmental surveillance conditions for the LIDAR Sensor Device (e.g. vehicle). For instance, it may comprise actuators that allow adjusting for instance, laser pulse shape, temporal length, rise- and fall times, polarization, laser power, laser type (IR-diode, VCSEL), Field of View (FOV), laser wavelength, beam changing device (MEMS, DMD, DLP, LCD, Fiber), beam and/or sensor aperture, sensor type (PN-diode, APD, SPAD).
  • While the sensor or actuator has been described as part of the LIDAR Sensor System and/or LIDAR Sensor Device, it is understood, that any sensor or actuator may be an individual element or may form part of a different element of the LIDAR Sensor System. As well, it may be possible to provide an additional sensor or actuator, being configured to perform or performing any of the described activities as individual element or as part of an additional element of the LIDAR Sensor System.
  • In some embodiments, the LIDAR Sensor System and/or LIDAR Light Device further comprises a light control unit that connects to the interface unit.
  • The light control unit may be configured to control the at least one light module for operating in at least one of the following operation modes: dimming, pulsed, PWM, boost, irradiation patterns, including illuminating and non-illuminating periods, light communication (including C2C and C2X), synchronization with other elements of the LIDAR Sensor System, such as a second LIDAR Sensor Device.
  • The interface unit of the LIDAR Sensor System and/or LIDAR Sensor Device may comprise a gateway, such as a wireless gateway, that may connect to the light control unit. It may comprise a beacon, such as a Bluetooth™ beacon.
  • The interface unit may be configured to connect to other elements of the LIDAR Sensor System, e.g. one or more other LIDAR Sensor Systems and/or LIDAR Sensor Devices and/or to one or more sensors and/or one or more actuators of the LIDAR Sensor System.
  • The interface unit may be configured to be connected by any wireless or wireline connectivity, including radio and/or optical connectivity.
  • The LIDAR Sensor System and/or LIDAR Sensor Device may be configured to enable customer-specific and/or vehicle-specific light spectra. The LIDAR Sensor Device may be configured to change the form and/or position and/or orientation of the at least one LIDAR Sensor System. Further, the LIDAR Sensor System and/or LIDAR Sensor Device may be configured to change the light specifications of the light emitted by the light source, such as direction of emission, angle of emission, beam divergence, color, wavelength, and intensity, as well as other characteristics like laser pulse shape, temporal length, rise- and fall times, polarization, pulse synchronization, laser power, laser type (IR-diode, VCSEL), Field of View (FOV), laser wavelength, beam changing device (MEMS, DMD, DLP, LCD, Fiber), beam and/or sensor aperture, sensor type (PN-diode, APD, SPAD).
  • In some embodiments, the LIDAR Sensor System and/or LIDAR Sensor Device may comprise a data processing unit. The data processing unit may connect to the LIDAR light driver and/or to the interface unit. It may be configured for data processing, for data and/or signal conversion and/or data storage. The data processing unit may advantageously be provided for communication with local, network-based or web-based platforms, data sources or providers, in order to transmit, store or collect relevant information on the light module, the road to be travelled, or other aspects connected with the LIDAR Sensor System and/or LIDAR Sensor Device.
  • In some embodiments, the LIDAR Sensor Device can encompass one or many LIDAR Sensor Systems that themselves can be comprised of infrared or visible light emitting modules, photoelectric sensors, optical components, interfaces for data communication, actuators, like MEMS mirror systems, computing and data storage devices, software and software databank, communication systems for communication with IoT, edge or cloud systems.
  • The LIDAR Sensor System and/or LIDAR Sensor Device can further include light emitting and light sensing elements that can be used for illumination purposes, like road lighting, or for data communication purposes, for example car-to-car, car-to-environment (for example drones, pedestrian, traffic signs, traffic posts etc.).
  • The LIDAR Sensor Device can further comprise one or more LIDAR Sensor Systems as well as other sensor systems, like optical camera sensor systems (CCD, CMOS), RADAR sensing systems, and ultrasonic sensing systems.
  • The LIDAR Sensor Device can be functionally designed as vehicle headlight, rear light, side light, daytime running light (DRL), corner light etc. and comprise LIDAR sensing functions as well as visible illuminating and signaling functions.
  • The LIDAR Sensor System may further comprise a control unit (Controlled LIDAR Sensor System). The control unit may be configured for operating a management system. It is configured to connect to one or more LIDAR Sensor Systems and/or LIDAR Sensor Devices. It may connect to a data bus. The data bus may be configured to connect to an interface unit of an LIDAR Sensor Device. As part of the management system, the control unit may be configured for controlling an operating state of the LIDAR Sensor System and/or LIDAR Sensor Device.
  • The LIDAR Sensor Management System may comprise a light control system which may comprise any of the following elements: monitoring and/or controlling the status of the at least one LIDAR Sensor System and/or LIDAR Sensor Device, monitoring and/or controlling the use of the at least one LIDAR Sensor System and/or LIDAR Sensor Device, scheduling the lighting of the at least one LIDAR Sensor System and/or LIDAR Sensor Device, adjusting the light spectrum of the at least one LIDAR Sensor System and/or LIDAR Sensor Device, defining the light spectrum of the at least one LIDAR Sensor System and/or LIDAR Sensor Device, monitoring and/or controlling the use of at least one sensor of the at least one LIDAR Sensor System and/or LIDAR Sensor Device.
  • In some embodiments, the method for LIDAR Sensor System can be configured and designed to select, operate and control, based on internal or external data input, laser power, pulse shapes, pulse length, measurement time windows, wavelength, single wavelength or multiple wavelength approach, day and night settings, sensor type, sensor fusion, as well as laser safety functions according to relevant safety regulations.
  • The method for LIDAR Sensor Management System can be configured to initiate data encryption, data decryption and data communication protocols.
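  • As a rough illustration of how such a method might select operating parameters from internal or external data input, the following Python sketch adjusts laser power scaling, pulse length and measurement window based on simple day/night and speed conditions; the rules and values are invented for illustration and do not reflect any specific product or safety regulation.

        # Minimal sketch: parameter selection for a LIDAR Sensor Management
        # System. All rules and values are illustrative assumptions only;
        # a real system must comply with applicable laser safety regulations.

        def select_parameters(ambient_lux: float, vehicle_speed_kmh: float) -> dict:
            """Return an illustrative operating parameter set."""
            night = ambient_lux < 10.0
            fast = vehicle_speed_kmh > 100.0
            return {
                "laser_power_scale": 0.6 if night else 1.0,  # less ambient noise at night
                "pulse_length_ns": 5 if fast else 10,
                "measurement_window_us": 2 if fast else 4,
                "day_night_mode": "night" if night else "day",
            }

        print(select_parameters(ambient_lux=3.0, vehicle_speed_kmh=130.0))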
  • LIDAR Sensor System, Controlled LIDAR Sensor System, LIDAR Sensor Management System and Software
  • In a Controlled LIDAR Sensor System according to the present disclosure, the computing device may be locally based, network based, and/or cloud-based. That means, the computing may be performed in the Controlled LIDAR Sensor System or on any directly or indirectly connected entities. In the latter case, the Controlled LIDAR Sensor System is provided with some connecting means, which allow establishment of at least a data connection with such connected entities.
  • In some embodiments, the Controlled LIDAR Sensor System comprises a LIDAR Sensor Management System connected to the at least one hardware interface. The LIDAR Sensor Management System may comprise one or more actuators for adjusting the surveillance conditions for the environment. Surveillance conditions may, for instance, be vehicle speed, vehicle road density, vehicle distance to other objects, object type, object classification, emergency situations, weather conditions, day or night conditions, day or night time, vehicle and environmental temperatures, and driver biofeedback signals.
  • The present disclosure further comprises a LIDAR Sensor Management Software. The present disclosure further comprises a data storage device with the LIDAR Sensor Management Software, wherein the data storage device is enabled to run the LIDAR Sensor Management Software. The data storage device may be a hard disk, a RAM, or another common data storage utility such as USB storage devices, CDs, DVDs and similar.
  • The LIDAR Sensor System, in particular the LIDAR Sensor Management Software, may be configured to control the steering of Automatically Guided Vehicles (AGV).
  • In some embodiments, the computing device is configured to perform the LIDAR Sensor Management Software.
  • The LIDAR Sensor Management Software may comprise any member selected from the following group or a combination thereof: software rules for adjusting light to outside conditions, adjusting the light intensity of the at least one LIDAR Sensor System and/or LIDAR Sensor Device to environmental conditions, adjusting the light spectrum of the at least one LIDAR Sensor System and/or LIDAR Sensor Device to environmental conditions, adjusting the light spectrum of the at least one LIDAR Sensor System and/or LIDAR Sensor Device to traffic density conditions, adjusting the light spectrum of the at least one LIDAR Sensor System and/or LIDAR Sensor Device according to customer specification or legal requirements.
  • According to some embodiments, the Controlled LIDAR Sensor System further comprises a feedback system connected to the at least one hardware interface. The feedback system may comprise one or more sensors for monitoring the state of surveillance for which the Controlled LIDAR Sensor System is provided. The state of surveillance may for example, be assessed by at least one of the following: road accidents, required driver interaction, Signal-to-Noise ratios, driver biofeedback signals, close encounters, fuel consumption, and battery status.
  • The Controlled LIDAR Sensor System may further comprise a feedback software.
  • The feedback software may in some embodiments comprise algorithms for vehicle (LIDAR Sensor Device) steering assessment on the basis of the data of the sensors.
  • The feedback software of the Controlled LIDAR Sensor System may in some embodiments comprise algorithms for deriving surveillance strategies and/or lighting strategies on the basis of the data of the sensors.
  • The feedback software of the Controlled LIDAR Sensor System may in some embodiments of the present disclosure comprise LIDAR lighting schedules and characteristics depending on any member selected from the following group or a combination thereof: road accidents, required driver interaction, Signal-to-Noise ratios, driver biofeedback signals, close encounters, road warnings, fuel consumption, battery status, other autonomously driving vehicles.
  • The feedback software may be configured to provide instructions to the LIDAR Sensor Management Software for adapting the surveillance conditions of the environment autonomously.
  • The feedback software may comprise algorithms for interpreting sensor data and suggesting corrective actions to the LIDAR Sensor Management Software.
  • In some embodiments of the LIDAR Sensor System, the instructions to the LIDAR Sensor Management Software are based on measured values and/or data of any member selected from the following group or a combination thereof: vehicle (LIDAR Sensor Device) speed, distance, density, vehicle specification and class.
  • The LIDAR Sensor System therefore may have a data interface to receive the measured values and/or data. The data interface may be provided for wire-bound transmission or wireless transmission. In particular, it is possible that the measured values or the data are received from an intermediate storage, such as a cloud-based, web-based, network-based or local type storage unit.
  • Further, the sensors for sensing environmental conditions may be connected with or interconnected by means of cloud-based services, often also referred to as Internet of Things.
  • In some embodiments, the Controlled LIDAR Sensor System comprises a software user interface (UI), particularly a graphical user interface (GUI). The software user interface may be provided for the light control software and/or the LIDAR Sensor Management Software and/or the feedback software.
  • The software user interface (UI) may further comprise means for data communication with an output device, such as an augmented and/or virtual reality display.
  • The user interface may be implemented as an application for a mobile device, such as a smartphone, a tablet, a mobile computer or similar devices.
  • The Controlled LIDAR Sensor System may further comprise an application programming interface (API) for controlling the LIDAR Sensing System by third parties and/or for third party data integration, for example road or traffic conditions, street fares, energy prices, weather data, GPS.
  • In some embodiments, the Controlled LIDAR Sensor System comprises a software platform for providing at least one of surveillance data, vehicle (LIDAR Sensor Device) status, driving strategies, and emitted sensing light.
  • In some embodiments, the LIDAR Sensor System and/or the Controlled LIDAR Sensor System can include infrared or visible light emitting modules, photoelectric sensors, optical components, interfaces for data communication, and actuators, like MEMS mirror systems, a computing and data storage device, a software and software databank, a communication system for communication with IoT, edge or cloud systems.
  • The LIDAR Sensor System and/or the Controlled LIDAR Sensor System can include light emitting and light sensing elements that can be used for illumination or signaling purposes, like road lighting, or for data communication purposes, for example car-to-car, car-to-environment.
  • In some embodiments, the LIDAR Sensor System and/or the Controlled LIDAR Sensor System may be installed inside the driver cabin in order to perform driver monitoring functionalities (such as occupancy detection, eye tracking, face recognition, drowsiness detection, access authorization, gesture control, etc.) and/or to communicate with a Head-up Display (HUD).
  • The software platform may cumulate data from one's own or other vehicles (LIDAR Sensor Devices) to train machine learning algorithms for improving surveillance and car steering strategies.
  • The Controlled LIDAR Sensor System may also comprise a plurality of LIDAR Sensor Systems arranged in adjustable groups.
  • The present disclosure further refers to a vehicle (LIDAR Sensor Device) with at least one LIDAR Sensor System. The vehicle may be planned and built particularly for integration of the LIDAR Sensor System. However, it is also possible that the Controlled LIDAR Sensor System is integrated into a pre-existing vehicle. According to the present disclosure, both cases, as well as a combination of these cases, shall be referred to.
  • Method for a LIDAR Sensor System
  • According to yet another aspect of the present disclosure, a method for a LIDAR Sensor System is provided, which comprises at least one LIDAR Sensor System. The method may comprise the steps of controlling the light emitted by the at least one LIDAR Sensor System by providing light control data to the hardware interface of the Controlled LIDAR Sensor System and/or sensing the sensors and/or controlling the actuators of the Controlled LIDAR Sensor System via the LIDAR Sensor Management System.
  • According to yet another aspect of the present disclosure, the method for the LIDAR Sensor System can be configured and designed to select, operate and control, based on internal or external data input, laser power, pulse shapes, pulse length, measurement time windows, wavelength, single-wavelength or multiple-wavelength approaches, day and night settings, sensor type, sensor fusion, as well as laser safety functions according to relevant safety regulations.
  • The method according to the present disclosure may further comprise the step of generating light control data for adjusting the light of the at least one LIDAR Sensor System to environmental conditions.
  • In some embodiments, the light control data is generated by using data provided by the daylight or night vision sensor.
  • According to some embodiments, the light control data is generated by using data provided by a weather or traffic control station.
  • The light control data may also be generated by using data provided by a utility company in some embodiments.
  • Advantageously, the data may be obtained from one data source, where that one data source may be connected, e.g. by means of Internet of Things devices, to the aforementioned data providers (e.g. weather or traffic control stations, utility companies). That way, data may be pre-analyzed before being released to the LIDAR Sensor System, missing data could be identified, and, in further advantageous developments, specific pre-defined data could also be supported or replaced by “best-guess” values of a machine learning software.
  • In some embodiments, the method further comprises the step of using the light of the at least one LIDAR Sensor Device for example during the time of day or night when traffic conditions are the best. Of course, other conditions for the application of the light may also be considered.
  • In some embodiments, the method may comprise a step of switching off the light of the at least one LIDAR Sensor System depending on a predetermined condition. Such condition may for instance occur, if the vehicle (LIDAR Sensor Device) speed or a distance to another traffic object is lower than a pre-defined or required safety distance or safety condition.
  • The method may also comprise the step of pushing notifications regarding risks, malfunctions, or vehicle health status to the user interface.
  • In some embodiments, the method comprises analyzing sensor data for deducing traffic density and vehicle movement.
  • The LIDAR Sensor System features may be adjusted or triggered by way of a user interface or other user feedback data. The adjustment may further be triggered by way of a machine learning process, as far as the characteristics which are to be improved or optimized are accessible by sensors. It is also possible that individual users adjust the surveillance conditions and/or further surveillance parameters to individual needs or desires.
  • The method may also comprise the step of uploading LIDAR sensing conditions to a software platform and/or downloading sensing conditions from a software platform.
  • In at least one embodiment, the method comprises a step of logging performance data to a LIDAR sensing notebook.
  • The data cumulated in the Controlled LIDAR Sensor System may, in a step of the method, be analyzed in order to directly or indirectly determine maintenance periods of the LIDAR Sensor System, expected failures of system components, or the like.
  • According to another aspect, the present disclosure comprises a computer program product comprising a plurality of program instructions, which when executed by a computer system of a LIDAR Sensor System, cause the Controlled LIDAR Sensor System to execute the method according to the present disclosure. The disclosure further comprises a data storage device.
  • Yet another aspect of the present disclosure refers to a data storage device with a computer program adapted to execute at least one of a method for a LIDAR Sensor System or a LIDAR Sensor Device.
  • Preferred embodiments can be found in the independent and dependent claims and in the entire disclosure, wherein the description and representation of the features do not always differentiate in detail between the different claim categories; in any case, the disclosure is implicitly directed both to the method and to appropriately equipped motor vehicles (LIDAR Sensor Devices) and/or a corresponding computer program product.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The detailed description is described with reference to the accompanying figures. The use of the same reference number in different instances in the description and the figures may indicate a similar or identical item. The drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating the principles of the present disclosure.
  • In the following description, various embodiments of the present disclosure are described with reference to the following drawings, in which:
  • FIG. 1 shows a portion of a sensor in accordance with various embodiments.
  • FIG. 2 shows a portion of a sensor in accordance with various embodiments in more detail.
  • FIG. 3 shows a portion of a sensor in accordance with various embodiments in more detail.
  • FIG. 4 shows a portion of a sensor in accordance with various embodiments in more detail.
  • FIG. 5 shows a recorded scene and the sensor pixels used to detect the scene in accordance with various embodiments in more detail.
  • FIG. 6 shows a recorded scene and the sensor pixels used to detect the scene in accordance with various embodiments in more detail.
  • FIG. 7 shows a flow diagram illustrating a method for a LIDAR Sensor System in accordance with various embodiments in more detail.
  • FIG. 8 shows a flow diagram illustrating another method for a LIDAR Sensor System in accordance with various embodiments in more detail.
  • FIG. 9 shows a cross sectional view of an optical component for a LIDAR Sensor System in accordance with various embodiments.
  • FIGS. 10A and 10B show a cross sectional view of an optical component for a LIDAR Sensor System (FIG. 10A) and a corresponding wavelength/transmission diagram (FIG. 10B) in accordance with various embodiments.
  • FIGS. 11A and 11B show a cross sectional view of an optical component for a LIDAR Sensor System (FIG. 11A) and a corresponding wavelength/transmission diagram (FIG. 11B) in accordance with various embodiments.
  • FIG. 12 shows a cross sectional view of a sensor for a LIDAR Sensor System in accordance with various embodiments.
  • FIG. 13 shows a top view of a sensor for a LIDAR Sensor System in accordance with various embodiments.
  • FIG. 14 shows a top view of a sensor for a LIDAR Sensor System in accordance with various embodiments.
  • FIG. 15 shows a top view of a sensor for a LIDAR Sensor System in accordance with various embodiments.
  • FIG. 16 shows a cross sectional view of an optical component for a LIDAR Sensor System in accordance with various embodiments.
  • FIG. 17A shows a side view of an optical package in a schematic representation in accordance with various embodiments.
  • FIG. 17B shows a circuit equivalent in a schematic representation in accordance with various embodiments.
  • FIG. 17C shows a circuit equivalent in a schematic representation in accordance with various embodiments.
  • FIG. 18 shows a top view of an optical package in a schematic representation in accordance with various embodiments.
  • FIG. 19A shows a side view of an optical package in a schematic representation in accordance with various embodiments.
  • FIG. 19B shows a top view of an optical package in a schematic representation in accordance with various embodiments.
  • FIG. 20 shows schematically an embodiment of the proposed LIDAR Sensor System, Controlled LIDAR Sensor System and LIDAR Sensor Device.
  • DETAILED DESCRIPTION
  • Introduction
  • Autonomously driving vehicles need sensing methods that detect objects and map their distances in a fast and reliable manner. Light detection and ranging (LIDAR), sometimes called laser detection and ranging (LADAR), time-of-flight measurement (TOF), laser scanning or laser radar, is such a sensing method. The technology works by illuminating a target with an optical pulse and measuring the characteristics of the reflected return signal. The width of the optical pulse can range from a few nanoseconds to several microseconds.
  • In order to steer and guide autonomous cars in a complex driving environment, it is essential to equip vehicles with fast and reliable sensing technologies that provide high-resolution, three-dimensional information (Data Cloud) about the surrounding environment, thus enabling proper vehicle control by using on-board or cloud-based computer systems.
  • For distance and speed measurement, light-detection-and-ranging (LIDAR) Sensor Systems are known from the prior art. With LIDAR Sensor Systems, it is possible to quickly scan the environment and detect the speed and direction of movement of individual objects (vehicles, pedestrians, static objects). LIDAR Sensor Systems are used, for example, in partially autonomous vehicles or fully autonomously driving prototypes, as well as in aircraft and drones. A high-resolution LIDAR Sensor System emits a (mostly infrared) laser beam and further uses lenses, mirrors or micro-mirror systems, as well as suitable sensor devices.
  • The disclosure relates to a LIDAR Sensor System for environment detection, wherein the LIDAR Sensor System is designed to carry out repeated measurements for detecting the environment, wherein the LIDAR Sensor System has an emitting unit (First LIDAR Sensing System) which is designed to perform a measurement with at least one laser pulse and wherein the LIDAR system has a detection unit (Second LIDAR Sensing Unit), which is designed to detect an object-reflected laser pulse during a measurement time window. Furthermore, the LIDAR system has a control device (LIDAR Data Processing System/Control and Communication System/LIDAR Sensor Management System), which is designed, in the event that at least one reflected beam component is detected, to associate the detected beam component on the basis of a predetermined assignment with a solid angle range from which the beam component originates. The disclosure also includes a method for operating a LIDAR Sensor System.
  • The distance measurement in question is based on a transit time measurement of emitted electromagnetic pulses. The employed electromagnetic spectrum may range from the ultraviolet via the visible to the infrared, including violet and blue radiation in the range from 405 nm to 480 nm. If such a pulse hits an object, it is proportionately reflected back to the distance-measuring unit and can be recorded as an echo pulse with a suitable sensor. If the emission of the pulse takes place at a time t0 and the echo pulse is detected at a later time t1, the distance d to the reflecting surface of the object can be determined from the transit time ΔtA = t1 − t0 according to Eq. 1.

  • d = ΔtA · c / 2  (Eq. 1)
  • Since these are electromagnetic pulses, c is the value of the speed of light. In the context of this disclosure, the word electromagnetic comprises the entire electromagnetic spectrum, thus including the ultraviolet, visible and infrared spectrum range.
  • The LIDAR method typically works with light pulses generated, for example, by semiconductor laser diodes having a wavelength between about 850 nm and about 1600 nm and a FWHM pulse width of 1 ns to 100 ns (FWHM = Full Width at Half Maximum). Wavelengths up to, in particular approximately, 8100 nm are also generally conceivable.
  • Furthermore, each light pulse is typically associated with a measurement time window, which begins with the emission of the measurement light pulse. If objects that are very far away are to be detectable by a measurement, such as, for example, objects at a distance of 300 meters and farther, this measurement time window, within which it is checked whether at least one reflected beam component has been received, must last at least two microseconds. In addition, such measuring time windows typically have a temporal distance from each other.
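  • As a minimal illustration of Eq. 1 and the resulting measurement time window, the following sketch (in Python; the function names and numeric examples are illustrative and not part of the disclosure) computes the distance from a measured transit time and the minimum measurement time window for a given maximum range:

```python
# Illustration of Eq. 1 and the measurement time window (a sketch;
# function names and example values are illustrative, not from the disclosure).

C = 299_792_458.0  # speed of light in m/s

def distance_from_transit_time(t0, t1):
    """Eq. 1: d = dtA * c / 2, with dtA = t1 - t0 (pulse travels out and back)."""
    return (t1 - t0) * C / 2.0

def measurement_time_window(max_range_m):
    """Minimum window length in which an echo from max_range_m can still arrive."""
    return 2.0 * max_range_m / C

print(distance_from_transit_time(0.0, 2.0e-6))  # ~300 m
print(measurement_time_window(300.0))           # ~2e-6 s, i.e. about 2 microseconds
```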
  • LIDAR sensors are now increasingly used in the automotive sector and, correspondingly, increasingly installed in motor vehicles.
  • The disclosure also relates to a method for operating a LIDAR Sensor System arrangement comprising a First LIDAR Sensor System with a first LIDAR sensor and at least one Second LIDAR Sensor System with a second LIDAR sensor, wherein the first LIDAR sensor and the second LIDAR sensor repeatedly perform respective measurements, wherein the measurements of the first LIDAR Sensor are performed in respective first measurement time windows, at the beginning of which a first measurement beam is emitted by the first LIDAR sensor and it is checked whether at least one reflected beam component of the first measurement beam is detected within the respective first measurement time window. Furthermore, the measurements of the at least one second LIDAR sensor are performed in the respective second measurement time windows, at the beginning of which a second measurement beam is emitted by the at least one second LIDAR sensor, and it is checked whether within the respective second measurement time window at least one reflected beam portion of the second measuring beam is detected. The disclosure also includes a LIDAR Sensor System arrangement with a first LIDAR sensor and at least one second LIDAR sensor.
  • A LIDAR (light detection and ranging) Sensor System is to be understood in particular as meaning a system which, in addition to one or more emitters for emitting light beams, for example in pulsed form, and a detector for detecting any reflected beam components, may have further devices, for example optical elements such as lenses and/or a MEMS mirror.
  • The oscillating mirrors or micro-mirrors of the MEMS (Micro-Electro-Mechanical System) system, in some embodiments in cooperation with a remotely located optical system, allow a field of view to be scanned in a horizontal angular range of e.g. 60° or 120° and in a vertical angular range of e.g. 30°. The receiver unit or the sensor can measure the incident radiation without spatial resolution. The receiver unit can also be a spatially angle-resolving measurement device. The receiver unit or sensor may comprise a photo diode, e.g. an avalanche photo diode (APD) or a single-photon avalanche diode (SPAD), a PIN diode or a photomultiplier. Objects can be detected, for example, at a distance of up to 60 m, up to 300 m or up to 600 m using the LIDAR system. A range of 300 m corresponds to a signal path of 600 m, from which, for example, a measurement time window or a measurement duration of 2 μs can result.
  • As already described, optical reflection elements in a LIDAR Sensor System may include micro-electro-mechanical mirror systems (MEMS) and/or digital mirrors (DMD) and/or digital light processing elements (DLP) and/or a galvo-scanner for control of the emitted laser beam pulses and/or for reflection of object-back-scattered laser pulses onto a sensor surface. Advantageously, a plurality of mirrors is provided. These may particularly be arranged, in some implementations, in the manner of a matrix. The mirrors may be individually, separately and independently of each other rotatable or movable.
  • The individual mirrors can each be part of a so-called micro-mirror unit or “Digital Micro-Mirror Device” (DMD). A DMD can have a multiplicity of mirrors, in particular micro-mirrors, which can be rotated at high frequency between at least two positions. Each mirror can be individually adjustable in its angle and can have at least two stable positions or, in other words, in particular stable, final states, between which it can alternate. The number of mirrors can correspond to the resolution of a projected image, wherein a respective mirror can represent a light pixel on the area to be irradiated. A “Digital Micro-Mirror Device” is a micro-electro-mechanical component for the dynamic modulation of light.
  • Thus, the DMD can, for example, provide suitable illumination for a vehicle low beam and/or high beam. Furthermore, the DMD may also serve as projection light for projecting images, logos, and information onto a surface, such as a street or a surrounding object. The mirrors or the DMD can be designed as a micro-electromechanical system (MEMS). A movement of the respective mirror can be caused, for example, by energizing the MEMS. Such micro-mirror arrays are available, for example, from Texas Instruments. The micro-mirrors are in particular arranged like a matrix, for example in an array of 854×480 micro-mirrors, as in the DLP3030-Q1 0.3-inch DMD mirror system optimized for automotive applications by Texas Instruments, or a 1920×1080 micro-mirror system designed for home projection applications, or a 4096×2160 micro-mirror system designed for 4K cinema projection applications but also usable in a vehicle application. The position of the micro-mirrors is, in particular, individually adjustable, for example with a clock rate of up to 32 kHz, so that predetermined light patterns can be coupled out of the headlamp by corresponding adjustment of the micro-mirrors.
  • In some embodiments, the used MEMS arrangement may be provided as a 1D or 2D MEMS arrangement. In a 1D MEMS, the movement of an individual mirror takes place in a translatory or rotational manner about an axis. In 2D MEMS, the individual mirror is gimballed and oscillates about two axes, whereby the two axes can be individually employed so that the amplitude of each vibration can be adjusted and controlled independently of the other.
  • Furthermore, beam radiation from the light source can be deflected by a structure with at least one liquid crystal element, wherein the molecular orientation of the at least one liquid crystal element is adjustable by means of an electric field. The structure through which the radiation to be aligned is guided can comprise at least two sheet-like elements coated with an electrically conductive and transparent coating material. The plate elements are in some embodiments transparent and spaced apart from each other in parallel. The transparency of the plate elements and the electrically conductive coating material allows transmission of the radiation. The electrically conductive and transparent coating material can be at least partially or completely made of a material with a high electrical conductivity or a small electrical resistance, such as indium tin oxide (ITO), and/or of a material with a low electrical conductivity or a large electrical resistance, such as poly-3,4-ethylenedioxythiophene (PEDOT).
  • The generated electric field can be adjustable in its strength. The electric field can be adjustable in particular by applying an electrical voltage to the coating material or the coatings of the plate elements. Depending on the magnitude of the electrical voltages applied to the coating materials or coatings of the plate elements formed as described above, differently sized potential differences, and thus different electric fields, are formed between the coating materials or coatings.
  • Depending on the strength of the electric field, that is, depending on the strength of the voltages applied to the coatings, the molecules of the liquid crystal elements may align with the field lines of the electric field.
  • Due to the differently oriented liquid crystal elements within the structure, different refractive indices can be achieved. As a result, the radiation passing through the structure, depending on the molecular orientation, moves at different speeds through the liquid crystal elements located between the plate elements. Overall, the liquid crystal elements located between the plate elements have the function of a prism, which can deflect or direct incident radiation. As a result, with a correspondingly applied voltage to the electrically conductive coatings of the plate elements, the radiation passing through the structure can be oriented or deflected, whereby the deflection angle can be controlled and varied by the level of the applied voltage.
  • Furthermore, a combination of white or colored light sources and infrared laser light sources is possible, in which the light sources are followed by an adaptive mirror arrangement via which radiation emitted by both light sources can be steered or modulated, with a sensor system being used for the infrared light source intended for environmental detection. The advantage of such an arrangement is that the two light systems and the sensor system use a common adaptive mirror arrangement. It is therefore not necessary to provide the light system and the sensor system each with their own mirror arrangement. Due to the high degree of integration, space, weight and in particular costs can be reduced.
  • In LIDAR systems, differently designed transmitter and receiver concepts are also known in order to be able to record the distance information in different spatial directions. Based on this, a two-dimensional image of the environment is then generated, which contains the complete three-dimensional coordinates for each resolved spatial point. The different LIDAR topologies can be abstractly distinguished by how the image resolution is realized: namely either exclusively by an angle-sensitive detector, exclusively by an angle-sensitive emitter, or by a combination of both. A LIDAR system which generates its resolution exclusively by means of the detector is called a Flash LIDAR. It includes an emitter which illuminates the entire field of view as homogeneously as possible. In contrast, the detector in this case includes a plurality of individually readable segments or pixels arranged in a matrix. Each of these pixels is correspondingly assigned a solid angle range. If light is received in a certain pixel, then that light is correspondingly derived from the solid angle region assigned to this pixel. In contrast to this, a raster or scanning LIDAR has an emitter which emits the measuring pulses selectively and in particular temporally sequentially in different spatial directions. Here, a single sensor segment is sufficient as a detector. If, in this case, light is received by the detector in a specific measuring time window, then this light comes from the solid angle range into which the light was emitted by the emitter in the same measuring time window.
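  • The fixed pixel-to-solid-angle assignment of a Flash LIDAR described above can be sketched as follows (a simplified model; the field-of-view values, array size and function name are assumed for illustration and are not taken from the disclosure):

```python
# Sketch: fixed assignment of detector pixels to solid-angle ranges in a
# Flash LIDAR. Field-of-view values and array size are assumed examples.

H_FOV_DEG, V_FOV_DEG = 60.0, 30.0  # full horizontal / vertical field of view
COLS, ROWS = 64, 32                # detector matrix (example dimensions)

def pixel_to_angles(col, row):
    """Center direction (azimuth, elevation in degrees) assigned to a pixel."""
    az = (col + 0.5) / COLS * H_FOV_DEG - H_FOV_DEG / 2.0
    el = (row + 0.5) / ROWS * V_FOV_DEG - V_FOV_DEG / 2.0
    return az, el

# Light received in pixel (0, 0) is attributed to one corner of the field
# of view, pixel (63, 31) to the opposite corner.
print(pixel_to_angles(0, 0), pixel_to_angles(63, 31))
```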
  • To improve the Signal-to-Noise Ratio (SNR), a plurality of the above-described measurements or single-pulse measurements can be combined with each other in a LIDAR Sensor System, for example by averaging the determined measured values.
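  • A minimal sketch of such single-pulse averaging follows (synthetic signal and noise values; assuming independent Gaussian noise, the residual noise of the average drops by about the square root of the number of pulses):

```python
import random

# Sketch: averaging N single-pulse measurements; with independent Gaussian
# noise the noise of the average drops by ~sqrt(N). Values are synthetic.

def single_pulse(true_signal=1.0, noise_sigma=0.5):
    """One noisy single-pulse measurement."""
    return true_signal + random.gauss(0.0, noise_sigma)

def averaged_measurement(n_pulses):
    """Average of n_pulses single-pulse measurements."""
    return sum(single_pulse() for _ in range(n_pulses)) / n_pulses

# For n_pulses = 100 the residual noise is about 10x smaller than for a
# single pulse (sqrt(100) = 10), improving the SNR accordingly.
print(averaged_measurement(1), averaged_measurement(100))
```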
  • The radiation emitted by the light source is in some embodiments infrared (IR) radiation emitted by a laser diode in a wavelength range of 600 nm to 850 nm. However, wavelengths up to 1064 nm, up to 1600 nm, up to 5600 nm or up to 8100 nm are also possible. The radiation of the laser diode can be emitted in a pulse-like manner with a frequency between 1 kHz and 1 MHz, in some implementations with a frequency between 10 kHz and 100 kHz. The laser pulse duration may be between 0.1 ns and 100 ns, in some implementations between 1 ns and 2 ns. As a type of IR-radiation-emitting laser diode, a VCSEL (Vertical Cavity Surface Emitting Laser) can be used, which emits radiation with a radiation power in the milliwatt range. However, it is also possible to use a VECSEL (Vertical External Cavity Surface Emitting Laser), which can be operated with high pulse powers in the watt range. Both the VCSEL and the VECSEL may be in the form of an array, e.g. 15×20 or 20×20 laser diodes may be arranged so that the summed radiation power can be several hundred watts. The largest summed radiation powers can be achieved if the lasers of an array arrangement pulse simultaneously. The emitter units may differ, for example, in the wavelengths of the respectively emitted radiation. If the receiver unit is then also configured to be wavelength-sensitive, the pulses can also be differentiated according to their wavelength.
  • Further embodiments relating to the functionality of various components of a LIDAR Sensor System, for example light sources, sensors, mirror systems, laser driver, control equipment, are described in Chapter “Components”.
  • The appendix “EXPLANATIONS AND GLOSSARY” describes further aspects of the referenced and used technical terms.
  • It is an object of the disclosure to propose improved components for a LIDAR Sensor System and/or to propose improved solutions for a LIDAR Sensor System and/or for a LIDAR Sensor Device and/or to propose improved methods for a LIDAR Sensor System and/or for a LIDAR Sensor Device.
  • The object is achieved according to the features of the independent claims. Further aspects of the disclosure are given in the dependent claims and the following description.
  • FIG. 20 shows schematically an embodiment of the proposed LIDAR Sensor System, Controlled LIDAR Sensor System and LIDAR Sensor Device.
  • The LIDAR Sensor System 10 comprises a First LIDAR Sensing System 40 that may comprise a Light Source 42 configured to emit electro-magnetic or other radiation 120, in particular continuous-wave or pulsed laser radiation in the blue and/or infrared wavelength range; a Light Source Controller 43 and related software; Beam Steering and Modulation Devices 41, in particular light steering and reflection devices, for example Micro-Mechanical Mirror Systems (MEMS), with a related control unit 150; Optical Components 80, for example lenses and/or holographic elements; and a LIDAR Sensor Management System 90 configured to manage input and output data that are required for the proper operation of the First LIDAR Sensing System 40.
  • The First LIDAR Sensing System 40 may be connected to other LIDAR Sensor System devices, for example to a Control and Communication System 70 that is configured to manage input and output data that are required for the proper operation of the First LIDAR Sensing System 40.
  • The LIDAR Sensor System 10 may include a Second LIDAR Sensing System 50 that is configured to receive and measure electromagnetic or other radiation using a variety of Sensors 52 and a Sensor Controller 53.
  • The Second LIDAR Sensing System may comprise Detection Optics 82, as well as Actuators for Beam Steering and Control 51.
  • The LIDAR Sensor System 10 may further comprise a LIDAR Data Processing System 60 that performs Signal Processing 61, Data Analysis and Computing 62, Sensor Fusion and other sensing Functions 63.
  • The LIDAR Sensor System 10 may further comprise a Control and Communication System 70 that receives and outputs a variety of signal and control data 160 and serves as a Gateway between various functions and devices of the LIDAR Sensor System 10.
  • The LIDAR Sensor System 10 may further comprise one or more Camera Systems 81, either stand-alone or combined with another LIDAR Sensor System 10 component or embedded into another LIDAR Sensor System 10 component, and data-connected to various other devices, like components of the Second LIDAR Sensing System 50, components of the LIDAR Data Processing System 60, or the Control and Communication System 70.
  • The LIDAR Sensor System 10 may be integrated or embedded into a LIDAR Sensor Device 30, for example a housing, a vehicle, a vehicle headlight.
  • The Controlled LIDAR Sensor System 20 is configured to control the LIDAR Sensor System 10 and its various components and devices, and performs or at least assists in the navigation of the LIDAR Sensor Device 30. The Controlled LIDAR Sensor System 20 may be further configured to communicate, for example, with another vehicle or a communication network, and thus to assist in navigating the LIDAR Sensor Device 30.
  • As explained above, the LIDAR Sensor System 10 is configured to emit electro-magnetic or other radiation in order to probe the environment 100 for other objects, like cars, pedestrians, road signs, and road obstacles. The LIDAR Sensor System 10 is further configured to receive and measure electromagnetic or other types of object-reflected or object-emitted radiation 130, but also other wanted or unwanted electromagnetic radiation 140, in order to generate signals 110 that can be used for the environmental mapping process, usually generating a point cloud that is representative of the detected objects.
  • Various components of the Controlled LIDAR Sensor System 20 use Other Components or Software 150 to accomplish signal recognition and processing as well as signal analysis. This process may include the use of signal information that comes from other sensor devices.
  • Chapter “Components”
  • The LIDAR Sensor System according to the present disclosure may be combined with a LIDAR Sensor Device connected to a light control unit for illumination of an environmental space.
  • As already described in this disclosure, various types of photo diodes may be used for the detection of light or light pulses in a respective sensor pixel, e.g. one or more of the following types of photo diodes:
      • pin photo diode;
      • passive and active pixel sensors (APS), like CCD or CMOS;
      • avalanche photo diode operated in a linear mode (APD);
      • avalanche photo diode operated in the Geiger mode to detect single photons (single-photon avalanche photo diode, SPAD).
  • It should be noted that, in the context of this disclosure, photo diodes are understood to be of different photo diode types even if the structure of the photo diodes is the same (e.g. the photo diodes are all pin photo diodes) but the photo diodes are of different size, shape or orientation and/or have different sensitivities (e.g. due to the application of different reverse-bias voltages to the photo diodes). Illustratively, a photo diode type in the context of this disclosure is not only defined by the type of construction of the photo diode, but also by its size, shape, orientation and/or way of operation, and the like.
  • A two-dimensional array of sensor pixels (and thus a two-dimensional array of photo diodes) may be provided for an imaging of two-dimensional images. In this case, an optical signal converted into an electronic signal may be read out individually per sensor pixel, comparable with a CCD or CMOS image sensor. However, it may be provided to interconnect a plurality of sensor pixels in order to achieve a higher sensitivity by achieving a higher signal strength. This principle may be applied to, but is not limited to, the principle of the “silicon photomultiplier” (SiPM). In this case, a plurality (in the order of 10 to 1000 or even more) of individual SPADs are connected in parallel. Although each single SPAD reacts to the first incoming photon (taking into consideration the detection probability), the sum of many SPAD signals results in a quasi-analog signal, which may be used to derive the incoming optical signal.
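  • The quasi-analog summation of many binary SPAD responses in a SiPM can be sketched as follows (cell count and detection probability are assumed example values, not taken from the disclosure):

```python
import random

# Sketch: quasi-analog sum signal of a SiPM built from many binary SPAD
# cells. Cell count and detection probability are assumed example values.

def sipm_sum_signal(n_photons, n_cells=1000, detection_prob=0.3):
    """Each cell fires at most once per pulse; the parallel connection
    effectively outputs the number of fired cells as a quasi-analog amplitude."""
    fired = set()
    for _ in range(n_photons):
        cell = random.randrange(n_cells)
        if random.random() < detection_prob:
            fired.add(cell)  # a cell that has already fired adds nothing
    return len(fired)

# The summed output grows with the number of incident photons until the
# cells saturate (the saturation effect mentioned later in the text).
print(sipm_sum_signal(10), sipm_sum_signal(1000), sipm_sum_signal(100000))
```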
  • In contrast to the so-called Flash LIDAR Sensor System, in which the entire sensor array (which may also be referred to as detector array) is illuminated at once, there are several LIDAR concepts which combine a one-dimensional or two-dimensional beam deflection with a two-dimensional detector array. In such a case, a circular or linear (straight or curved) laser beam may be transmitted and may be imaged via a separate, fixedly mounted receiver optics onto the sensor array (detector array). In this case, only predefined pixels of the sensor array are illuminated, dependent on the transmitter/receiver optics and the position of the beam deflection device. The illuminated pixels are read out, and the non-illuminated pixels are not read out. Thus, unwanted signals (e.g. background light) coming from the non-illuminated and therefore not read-out pixels are suppressed. Depending on the dimensions of the transmitter/receiver optics it may be feasible to illuminate more or fewer pixels, e.g. by de-focusing of the receiver optics. The de-focusing process may be adjusted adaptively, for example depending on the illuminated scene and the signal response of back-scattered light. The most suitable size of the illumination spot on the surface of the sensor 52 does not necessarily need to coincide with the geometric layout of the pixels on the sensor array. By way of example, if the spot is positioned between two (or four) pixels, then two (or four) pixels will only be partially illuminated. This may also result in a bad signal-to-noise ratio due to the non-illuminated pixel regions.
  • In various embodiments, control lines (e.g. column select lines carrying the column select signals and row select lines carrying the row select signals) may be provided to selectively interconnect a plurality of photo diodes to define a “virtual pixel”, which may be optimally adapted to the respective application scenario and the size of the laser spot on the sensor array. This may be implemented by row selection lines and column selection lines, similar to the access and control of memory cells of a DRAM memory. Furthermore, various types of photo diodes (in other words, various photo diode types) may be implemented (e.g. monolithically integrated) on one common sensor 52 and may be driven, accessed and read out separately, for example.
  • Moreover, in combination or independent from the interconnection of a plurality of pixels, the sensor may include several pixels including different types of photo diodes. In other words, various photo diode types may be monolithically integrated on the sensor 52 and may be accessed, controlled, or driven separately or the sensor pixel signals from pixels having the same or different photo diode types may be combined and analysed as one common signal.
  • By way of example, different photo diode types may be provided and individually controlled and read out, for example:
      • one or more pixels may have a single-photon avalanche photo diode for LIDAR applications;
      • one or more pixels may have a pin photo diode for camera applications (e.g. for the detection of the taillight or a headlight of a vehicle, or for thermal imaging using infrared sensitive sensors); and/or
      • one or more pixels may have an avalanche photo diode for LIDAR applications.
  • Depending on the respective application, a photo diode of a pixel may be provided with an additional optical bandpass filter and/or polarization filter on pixel level connected upstream.
  • In general, a plurality of pixels of the sensor 52 may be interconnected.
  • There are many options as to how the pixels having the same or different photo diode types may be interconnected, such as:
      • pixels may have different photo diode types, such as photo diodes of the same physical structure but with different sizes of their respective sensor surface regions;
      • pixels may have different photo diode types, such as photo diodes of the same physical structure but with different sensitivities (e.g. due to different operation modes such as the application of different reverse-bias voltages); or
      • pixels may have different photo diode types, such as photo diodes of different physical structures such as e.g. one or more pixels having a pin photo diode and/or one or more pixels having an avalanche photo diode and/or one or more pixels having a SPAD.
  • The interconnecting of pixels, and thus the interconnecting of photo diodes (e.g. of pin photo diodes), may be provided based on the illumination conditions (in other words, lighting conditions) of both camera and LIDAR. With improving lighting conditions, a smaller number of sensor pixels of the plurality of sensor pixels may be selected and combined. In other words, in case of good lighting conditions fewer pixels may be interconnected. This results in a lower light sensitivity, but may achieve a higher resolution. In case of bad lighting conditions, e.g. when driving at night, more pixels may be interconnected. This results in a higher light sensitivity, but may suffer from a lower resolution.
  • In various embodiments, the sensor controller may be configured to control the selection network (see below for further explanation) based on the level of illuminance of the LIDAR Sensor System such that the better the lighting conditions (visible and/or infrared spectral range) are, the fewer selected sensor pixels of the plurality of sensor pixels will be combined.
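  • A minimal sketch of such illuminance-dependent pixel combination follows (the lux thresholds and binning factors are assumed example values, not part of the disclosure):

```python
# Sketch: choosing how many pixels to combine as a function of ambient
# illuminance. Thresholds and factors are assumed examples, not from the
# disclosure.

def binning_factor(illuminance_lux):
    """Better lighting -> fewer combined pixels (higher resolution);
    worse lighting -> more combined pixels (higher sensitivity)."""
    if illuminance_lux > 10_000:  # bright daylight
        return 1                  # 1x1: full resolution
    if illuminance_lux > 100:     # dusk / street lighting
        return 2                  # 2x2: four pixels per virtual pixel
    return 4                      # night: 4x4, sixteen pixels per virtual pixel

print(binning_factor(50_000), binning_factor(500), binning_factor(1))
```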
  • The interconnecting of the individual pixels, and thus of the individual photo diodes, to a “virtual sensor pixel” allows an accurate adaptation of the size of the sensor pixel to the demands of the entire system, such as e.g. the entire LIDAR Sensing System. This may occur e.g. in a scenario in which it is to be expected that the non-illuminated regions of the photo diodes provide a significant noise contribution to the wanted signal. By way of example, a variable definition (selection) of the size of a “pixel” (“virtual pixel”) may be provided e.g. with avalanche photo diodes and/or silicon photomultipliers (SiPM), where the sensor 52 includes a large number of individual pixels including SPADs. In order to increase the dynamic range of a sensor having a distinct saturation effect (e.g. SiPM), the following interconnection may be implemented: the laser beam has a beam profile of decreasing intensity with increasing distance from the center of the laser beam. In principle, laser beam profiles can have different shapes, for example a Gaussian or a flat-top shape. It is also to be noted that for a LIDAR measurement function, infrared as well as visible laser diodes and respectively suited sensor elements may be used.
  • If pixels are interconnected in the sensor array in the form of rings, for example circular or elliptical rings, around the expected center of the impinging (e.g. laser) beam, the center may, as a result, be saturated. However, the sensor pixels located in one or more rings further outside the sensor array may operate in the linear (non-saturated) mode due to the decreasing intensity, and the signal intensity may be estimated. In various embodiments, the pixels of a ring may be interconnected to provide a plurality of pixel rings or pixel ring segments. The pixel rings may further be interconnected in a temporally successive manner, e.g. in case only one sum signal output is available for the interconnected sensor pixels. In alternative embodiments, a plurality of sum signal outputs may be provided or implemented in the sensor array, which may be coupled to different groups of sensor pixels. In general, the pixels may be grouped in an arbitrary manner dependent on the respective requirements. The combination of different types of sensor pixels within one sensor 52 allows, for example, combining the functionality of a LIDAR sensor with the functionality of a camera in one common optics arrangement without the risk that a deviation will occur with respect to adjustment and calibration between the LIDAR and the camera. This may reduce costs for a combined LIDAR/camera sensor and may further improve the data fusion of LIDAR data and camera data. As already mentioned above, camera sensors may be sensitive in the visible and/or infrared spectral range (thermographic camera).
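  • The ring-wise grouping of pixels around the expected beam center can be sketched as follows (array size, spot center and ring width are assumed example values):

```python
import math

# Sketch: grouping sensor pixels into concentric rings around the expected
# beam center so that outer, non-saturated rings can be evaluated separately.
# Array size, spot center and ring width are assumed example values.

def ring_index(col, row, center=(8.0, 8.0), ring_width=2.0):
    """Assign a pixel to a ring by its distance from the expected spot center."""
    return int(math.hypot(col - center[0], row - center[1]) / ring_width)

rings = {}
for r in range(16):
    for c in range(16):
        rings.setdefault(ring_index(c, r), []).append((c, r))

# rings[0] holds the (possibly saturated) center pixels; rings[1], rings[2],
# ... hold pixels that still operate in the linear mode and can be summed
# separately or read out in a temporally successive manner.
print({k: len(v) for k, v in sorted(rings.items())})
```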
  • Furthermore, the sensor controller 53 may control the sensor pixels taking into consideration the integration time (read-out time) required by the respective photo diode of a pixel. The integration time may be dependent on the size of the photo diode. Thus, the clocking to control the read-out process, e.g. provided by the sensor controller 53, may be different for the different types of pixels and may change depending on the configuration of the pixel selection network.
  • FIG. 1 shows a portion 3800 of the sensor 52 in accordance with various embodiments. It is to be noted that the sensor 52 does not need to be a SiPM detector array. The sensor 52 includes a plurality of pixels 3802. Each pixel 3802 includes a photo diode. A light (laser) spot 3804 impinging on the surface of the portion 3800 of the sensor 52 is symbolized in FIG. 1 by a circle 3806. The light (laser) spot 3804 covers a plurality of sensor pixels 3802. A selection network may be provided which may be configured to selectively combine some pixels 3802 of the plurality of pixels 3802 to form an enlarged sensor pixel. The electrical signals provided by the photo diodes of the combined sensor pixels are accumulated. A read-out circuit may be provided which may be configured to read-out the accumulated electrical signals from the combined sensor pixels as one common signal.
  • The selection network may be configured to apply a plurality of row select signals 3808, 3810, 3812 (the number of row select signals may be equal to the number of rows of the sensor 52) to select the sensor pixels 3802 of the respectively selected row. To do this, the selection network may include a row multiplexer (not shown in FIG. 1). Furthermore, the selection network may be configured to apply a plurality of column select signals 3814, 3816, 3818 (the number of column select signals may be equal to the number of columns of the sensor 52) to select the pixels of the respectively selected column. To do this, the selection network may include a column multiplexer (not shown in FIG. 1).
  • FIG. 1 illustrates nine selected sensor pixels 3820 selected by the plurality of row select signals 3808, 3810, 3812 and the plurality of column select signals 3814, 3816, 3818. The light (laser) spot 3804 fully covers the nine selected sensor pixels 3820. Furthermore, the sensor controller 53 may provide a supply voltage 3822 to the sensor 52. The sensor signals 3824 provided by the selected sensor pixels 3820 are read out from the sensor 52 and supplied to one or more amplifiers via the selection network. It is to be noted that a light (laser) spot 3804 does not need to fully cover a selected sensor pixel 3820.
  • The individual selectability of each sensor pixel 3802 of the sensor 52, in a manner comparable with the selection mechanism of memory cells in a Dynamic Random Access Memory (DRAM), allows a simple and thus cost-efficient sensor circuit architecture to quickly and reliably select one or more sensor pixels 3802 and to achieve an evaluation of a plurality of sensor pixels at the same time. This may improve the reliability of the sensor signal evaluation of the Second LIDAR Sensing System 50.
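  • The DRAM-like row/column selection that combines several pixels into one virtual pixel, whose accumulated signal is read out as one common signal, can be sketched as follows (a simplified behavioral model with synthetic photocurrents; it does not model the switch-level circuit described below):

```python
import random

# Sketch: DRAM-like row/column selection combining several pixels into one
# "virtual pixel" whose signals are accumulated and read out as one common
# signal. A behavioral model with synthetic photocurrents, not a model of
# the switch-level circuit.

ROWS, COLS = 8, 8
photocurrents = [[random.random() for _ in range(COLS)] for _ in range(ROWS)]

def read_virtual_pixel(row_select, col_select):
    """Accumulate the signals of all pixels addressed by the asserted row
    select and column select lines (cf. signals 3808-3818 in FIG. 1)."""
    return sum(photocurrents[r][c] for r in row_select for c in col_select)

# A 3x3 virtual pixel placed under the expected laser spot position:
common_signal = read_virtual_pixel([2, 3, 4], [4, 5, 6])
print(common_signal)
```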
  • FIG. 2 shows a portion 3900 of the sensor 52 in accordance with various embodiments in more detail.
  • The sensor 52 may include a plurality of row selection lines 3902, each row selection line 3902 being coupled to an input of the selection network, e.g. to an input of a row multiplexer of the selection network. The sensor 52 may further include a plurality of column selection lines 3904, each column selection line 3904 being coupled to another input of the selection network, e.g. to an input of a column multiplexer of the selection network. A respective column switch 3906 is coupled respectively to one of the column selection lines 3904 and is connected to couple the electrical supply voltage 3908 present on a supply voltage line 3910 to the sensor pixels 3802 coupled to the respective column selection line 3904 or to decouple the electrical supply voltage 3908 therefrom. Each sensor pixel 3802 may be coupled to a column read out line 3912, which is in turn coupled to a collection read out line 3914 via a respective column read out switch 3916. The column read out switches 3916 may be part of the column multiplexer. The sum of the currents of the selected sensor pixels 3802, in other words the sensor signals 3824, may be provided on the collection read out line 3914. Each sensor pixel 3802 may further be coupled downstream of an associated column selection line 3904 via a respective column pixel switch 3918 (in other words, a respective column pixel switch 3918 is connected between a respective associated column selection line 3904 and an associated sensor pixel 3802). Moreover, each sensor pixel 3802 may further be coupled upstream of an associated column read out line 3912 via a respective column pixel read out switch 3920 (in other words, a respective column pixel read out switch 3920 is connected between a respective associated column read out line 3912 and an associated sensor pixel 3802). Each switch in the sensor 52 may be implemented by a transistor such as e.g. a field effect transistor (FET), e.g. a MOSFET. A control input (e.g. the gate terminal of a MOSFET) of each column pixel switch 3918 and of each column pixel read out switch 3920 may be electrically conductively coupled to an associated one of the plurality of row selection lines 3902. Thus, the row multiplexer may “activate” the column pixel switches 3918 and the column pixel read out switches 3920 via an associated row selection line 3902. In case a respective column pixel switch 3918 and the associated column pixel read out switch 3920 are activated, the associated column switch 3906 finally activates the respective sensor pixel 3802 by applying the supply voltage 3908, e.g. to the source of the MOSFET, and (since e.g. the associated column pixel switch 3918 is closed) the supply voltage 3908 is also applied to the respective sensor pixel 3802. A sensor signal detected by the “activated” selected sensor pixel 3802 can be forwarded to the associated column read out line 3912 (since e.g. the associated column pixel read out switch 3920 is also closed), and, if also the associated column read out switch 3916 is closed, the respective sensor signal is transmitted to the collection read out line 3914 and finally to an associated amplifier (such as an associated TIA).
  • By way of example and as shown in FIG. 3,
      • the column switch 3906 may be implemented by a column switch MOSFET 4002;
      • the column read out switch 3916 may be implemented by a column read out switch MOSFET 4004;
      • the column pixel switch 3918 may be implemented by a column pixel switch MOSFET 4006; and
      • the column pixel read out switch 3920 may be implemented by a column pixel read out switch MOSFET 4008.
  • FIG. 4 shows a portion 4100 of the sensor 52 in accordance with various embodiments in more detail.
  • In various embodiments, the column pixel read out switch 3920 may be dispensed with in a respective sensor pixel 3802. The embodiments shown in FIG. 4 may e.g. be applied to a SiPM as the sensor 52. Thus, the pixels 3802 may in this case be implemented as SPADs 3802. The sensor 52 further includes a first summation output 4102 for fast sensor signals. The first summation output 4102 may be coupled to the anode of each SPAD via a respective coupling capacitor 4104. The sensor 52 in this example further includes a second summation output 4106 for slow sensor signals. The second summation output 4106 may be coupled to the anode of each SPAD via a respective coupling resistor 4108 (which, in the case of a SPAD as the photo diode of the pixel, may also be referred to as quenching resistor).
  • FIG. 5 shows a recorded scene 4200 and the sensor pixels used to detect the scene in accordance with various embodiments in more detail.
  • As described above, the sensor 52 may have sensor pixels 3802 with photo diodes having different sensitivities. In various embodiments, an edge region 4204 may at least partially surround a center region 4202. In various embodiments, the center region 4202 may be provided for a larger operating range of the LIDAR Sensor System and the edge region 4204 may be provided for a shorter operating range. The center region 4202 may represent the main moving (driving, flying or swimming) direction of a vehicle and thus usually needs a far view to recognize an object at a far distance. The edge region 4204 may represent the edge region of the scene and usually, in a scenario where a vehicle (e.g. a car) is moving, objects 100, which may be detected, are usually nearer than in the main moving direction in which the vehicle is moving. The larger operating range means that the target object 100 return signal has a rather low signal intensity. Thus, sensor pixels 3802 with photo diodes having a higher sensitivity may be provided in the center region 4202. The shorter operating range means that the target object 100 return signal has a rather high (strong) signal intensity. Thus, sensor pixels 3802 with photo diodes having a lower sensitivity may be provided in the edge region 4204. In principle, however, the patterning of the sensor pixels (type, size, and sensitivity) may be configured for specific driving scenarios and vehicle types (bus, car, truck, construction vehicles, drones, and the like). This means that, for example, the sensor pixels 3802 of the edge regions 4204 may have a high sensitivity. It should also be stated that, if a vehicle uses a variety of LIDAR/Camera sensor systems, these may be configured differently, even when illuminating and detecting the same Field-of-View.
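  • The zoning of the sensor into a high-sensitivity center region and a lower-sensitivity edge region can be sketched as follows (array dimensions and region boundaries are assumed example values, not taken from the disclosure):

```python
# Sketch: zoning a sensor into a high-sensitivity center region (long range,
# weak return signals) and a lower-sensitivity edge region (short range,
# strong return signals). Dimensions and boundaries are assumed examples.

ROWS, COLS = 16, 32
CENTER_ROWS = range(4, 12)
CENTER_COLS = range(8, 24)

layout = [["high" if r in CENTER_ROWS and c in CENTER_COLS else "low"
           for c in range(COLS)] for r in range(ROWS)]

# The center region (cf. 4202 in FIG. 5) gets high-sensitivity pixels, the
# surrounding edge region (cf. 4204) gets low-sensitivity pixels.
print(layout[8][16], layout[0][0])  # -> high low
```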
  • FIG. 6 shows a recorded scene 4300 and the sensor pixels 3802 used to detect the scene 4300 in accordance with various embodiments in more detail.
  • In various embodiments, a row-wise arrangement of the sensor pixels of the same photo diode type may be provided. By way of example, a first row 4302 may include pixels having APDs for a Flash LIDAR Sensor System and a second row 4304 may include pixels having pin photo diodes for a camera. The two respectively adjacent pixel rows may be provided repeatedly, so that the rows of different pixels are provided, for example, in an alternating manner. However, the sequence and number of pixel rows of the same photo diode type could vary, and likewise the grouping into specific selection networks. It is also to be noted that a row or column of pixels may employ different photo diode types. Also, a row or column need not be completely filled up with photo diodes. The vehicle's own motion may compensate for the reduced resolution of the sensor array (“push-broom scanning” principle).
  • The different rows may include various photo diode types, such as for example:
      • first row: pixels having APDs (LIDAR); second row: pixels having pin photo diodes (camera).
      • first row: pixels having a first polarization plane; second row: pixels having a different second polarization plane.
      • This may allow the differentiation between directly incoming light beams and reflected light beams (e.g. vehicle, different surfaces of an object).
      • first row: pixels having first pin photo diodes (configured to detect light having wavelengths in the visible spectrum)
      • second row: pixels having second pin photo diodes (configured to detect light having wavelengths in the near infrared (NIR) spectrum).
      • This may allow the detection of taillights as well as an infrared (IR) illumination.
  • The sensor controller 53 may be configured to select the respective pixels 3802 in accordance with the desired photo diode type in a current application.
  • FIG. 7 shows a flow diagram illustrating a method 4400 for a LIDAR Sensor System in accordance with various embodiments in more detail.
  • The LIDAR Sensor System may include a plurality of sensor pixels. Each sensor pixel includes at least one photo diode. The LIDAR Sensor System may further include a selection network, and a read-out circuit. The method 4400 may include, in 4402, the selection network selectively combining some sensor pixels of the plurality of sensor pixels to form an enlarged sensor pixel. The electrical signals provided by the photo diodes of the combined sensor pixels are accumulated. The method 4400 may further include, in 4404, the read-out circuit reading-out the accumulated electrical signals from the combined sensor pixels as one common signal.
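  • The following minimal Python sketch illustrates method 4400 under simplifying assumptions: per-pixel photo currents are modeled as numbers in an array, a 2×2 block of pixels is combined into one enlarged pixel, and the accumulated signal is read out as one common value. The block size and the signal values are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)
photo_current = rng.uniform(0.0, 1.0, size=(4, 4))  # per-pixel signals (a.u.)

def read_enlarged_pixel(rows: slice, cols: slice) -> float:
    """Accumulate the signals of the combined pixels (4402) and read them
    out as one common signal (4404)."""
    return float(photo_current[rows, cols].sum())

common_signal = read_enlarged_pixel(slice(0, 2), slice(0, 2))  # 2x2 binning
print(common_signal)
```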
  • FIG. 8 shows a flow diagram illustrating another method 4500 for a LIDAR Sensor System in accordance with various embodiments in more detail.
  • The LIDAR Sensor System may include a plurality of pixels. A first pixel of the plurality of pixels includes a photo diode of a first photo diode type, and a second pixel of the plurality of pixels includes a photo diode of a second photo diode type. The second photo diode type is different from the first photo diode type. The LIDAR Sensor System may further include a pixel sensor selector and a sensor controller. The method 4500 may include, in 4502, the pixel sensor selector selecting at least one of the first pixel including a photo diode of the first photo diode type and/or at least one of the second pixel including a photo diode of the second photo diode type, and, in 4504, the sensor controller controlling the pixel selector to select at least one first pixel and/or at least one second pixel.
  • Moreover, it is to be noted that the light (laser) emission (e.g. provided by a plurality of light (laser) sources, which may be operated in a group-wise manner) may be adapted in its light intensity pattern to the pixel distribution or arrangement of the sensor 52, e.g. it may be adapted such that larger pixels may be charged with light having a higher intensity than smaller pixels. This may be provided in an analog manner with respect to photo diodes having a higher and lower sensitivity, respectively.
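  • A minimal sketch of this intensity adaptation follows, assuming two emission zones and a simple linear scaling of the emitted intensity with the pixel area of the zone each beam is imaged onto; the zone names, areas and scaling rule are illustrative assumptions, not a method stated in this disclosure.

```python
# Scale the per-zone emitted intensity to the pixel arrangement so that
# larger (e.g. combined) pixels receive light of higher intensity.
zones = [
    {"name": "edge",   "pixel_area": 1.0},  # small pixels
    {"name": "center", "pixel_area": 4.0},  # enlarged (combined) pixels
]
BASE_INTENSITY = 0.25  # arbitrary units per unit pixel area (assumed)

for zone in zones:
    zone["emit_intensity"] = BASE_INTENSITY * zone["pixel_area"]
    print(zone["name"], zone["emit_intensity"])  # edge 0.25, center 1.0
```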
  • In various embodiments, in the LIDAR Sensor System as described with reference to FIG. 1 to FIG. 8, a first sensor pixel may include a photo diode of a first photo diode type and a second pixel of the plurality of pixels may include a photo diode of a second photo diode type. The second photo diode type is different from the first photo diode type. In various embodiments, both photo diodes may be stacked one above the other as generally described in the embodiments described with reference to FIG. 9 to FIG. 16.
  • In the following, various aspects of this disclosure will be illustrated:
  • Example 1d is a LIDAR Sensor System. The LIDAR Sensor System includes a plurality of sensor pixels, each sensor pixel including at least one photo diode. The LIDAR Sensor System further includes a selection network configured to selectively combine some sensor pixels of the plurality of sensor pixels to form an enlarged sensor pixel. The electrical signals provided by the photo diodes of the combined sensor pixels are accumulated. The LIDAR Sensor System further includes a read-out circuit configured to read-out the accumulated electrical signals from the combined sensor pixels as one common signal.
  • In Example 2d, the subject matter of Example 1d can optionally include that the at least one photo diode includes at least one pin diode.
  • In Example 3d, the subject matter of Example 1d can optionally include that the at least one photo diode includes at least one avalanche photo diode.
  • In Example 4d, the subject matter of Example 3d can optionally include that the at least one avalanche photo diode includes at least one single-photon avalanche photo diode.
  • In Example 5d, the subject matter of any one of Examples 1d to 4d can optionally include that the plurality of sensor pixels are arranged in a sensor matrix in rows and columns.
  • In Example 6d, the subject matter of any one of Examples 1d to 5d can optionally include that the selection network includes a plurality of row selection lines, each row selection line being electrically conductively coupled to at least some sensor pixels of the same row, a plurality of column selection lines, each column selection line being electrically conductively coupled to at least some sensor pixels of the same column, and a plurality of read-out lines, each read-out line being electrically conductively coupled to at least some sensor pixels of the same column or the same row to accumulate the electrical signals provided by the combined sensor pixels.
  • In Example 7d, the subject matter of any one of Examples 1d to 6d can optionally include that each sensor pixel of at least some of the sensor pixels includes a first switch connected between the selection network and a first terminal of the sensor pixel, and/or a second switch connected between a second terminal of the sensor pixel and the selection network.
  • In Example 8d, the subject matter of Examples 6d and 7d can optionally include that the first switch is connected between a column selection line of the plurality of column selection lines and the first terminal of the sensor pixel, wherein a control terminal of the first switch is coupled to a row selection line of the plurality of row selection lines. The second switch is connected between the second terminal of the sensor pixel and a read-out line of the plurality of read-out lines. A control terminal of the second switch is coupled to a row selection line of the plurality of row selection lines.
  • In Example 9d, the subject matter of any one of Examples 7d or 8d can optionally include that at least one first switch and/or at least one second switch includes a field effect transistor.
  • In Example 10d, the subject matter of any one of Examples 1d to 9d can optionally include that the LIDAR Sensor System further includes a sensor controller configured to control the selection network to selectively combine some sensor pixels of the plurality of sensor pixels to form the enlarged sensor pixel.
  • In Example 11d, the subject matter of Example 10d can optionally include that the sensor controller is configured to control the selection network based on the level of illuminance of the LIDAR Sensor System such that with improving lighting conditions a smaller number of sensor pixels of the plurality of sensor pixels will be selected and combined.
  • In Example 12d, the subject matter of any one of Examples 1d to 11d can optionally include that the LIDAR Sensor System further includes a plurality of read-out amplifiers, each read-out amplifier coupled to an associated read-out line of the plurality of read-out lines.
  • In Example 13d, the subject matter of Example 12d can optionally include that the common signal is an electrical current. The plurality of read-out amplifiers includes a plurality of transimpedance amplifiers, each transimpedance amplifier configured to convert the associated electrical current into an electrical voltage.
  • Example 14d is a LIDAR Sensor System. The LIDAR Sensor System may include a plurality of pixels. A first pixel of the plurality of pixels includes a photo diode of a first photo diode type, and a second pixel of the plurality of pixels includes a photo diode of a second photo diode type. The second photo diode type is different from the first photo diode type. The LIDAR Sensor System may further include a pixel selector configured to select at least one of the first pixel including a photo diode of the first photo diode type and/or at least one of the second pixel including the photo diode of the second photo diode type, and a sensor controller configured to control the pixel selector to select at least one first pixel and/or at least one second pixel.
  • In Example 15d, the subject matter of Example 14d can optionally include that the sensor controller and the pixels are configured to individually read-out the photo diode of the first photo diode type and the photo diode of the second photo diode type.
  • In Example 16d, the subject matter of any one of Examples 14d or 15d can optionally include that the sensor controller and the pixels are configured to read-out the photo diode of the first photo diode type and the photo diode of the second photo diode type as one combined signal.
  • In Example 17d, the subject matter of any one of Examples 14d to 16d can optionally include that the photo diode of a first photo diode type and/or the photo diode of a second photo diode type are/is selected from a group consisting of: a pin photo diode, an avalanche photo diode, and a single-photon photo diode.
  • In Example 18d, the subject matter of any one of Examples 14d to 17d can optionally include that the LIDAR Sensor System further includes a selection network configured to selectively combine some pixels of the plurality of pixels to form an enlarged pixel, wherein the electrical signals provided by the photo diodes of the combined pixels are accumulated, and a read-out circuit configured to read-out the accumulated electrical signals from the combined pixels as one common signal.
  • In Example 19d, the subject matter of any one of Examples 14d to 18d can optionally include that the plurality of pixels are arranged in a sensor matrix in rows and columns.
  • In Example 20d, the subject matter of any one of Examples 14d to 19d can optionally include that the selection network includes a plurality of row selection lines, each row selection line being electrically conductively coupled to at least some pixels of the same row, a plurality of column selection lines, each column selection line being electrically conductively coupled to at least some pixels of the same column, and a plurality of read-out lines, each read-out line being electrically conductively coupled to at least some pixels of the same column or the same row to accumulate the electrical signals provided by the combined pixels.
  • In Example 21d, the subject matter of any one of Examples 14d to 20d can optionally include that each pixel of at least some of the pixels includes a first switch connected between the selection network and a first terminal of the pixel, and/or a second switch connected between a second terminal of the pixel and the selection network.
  • In Example 22d, the subject matter of Examples 20d and 21d can optionally include that the first switch is connected between a column selection line of the plurality of column selection lines and the first terminal of the pixel. A control terminal of the first switch is coupled to a row selection line of the plurality of row selection lines. The second switch is connected between the second terminal of the pixel and a read-out line of the plurality of read-out lines. A control terminal of the second switch is coupled to a row selection line of the plurality of row selection lines.
  • In Example 23d, the subject matter of any one of Examples 21d or 22d can optionally include that at least one first switch and/or at least one second switch comprises a field effect transistor.
  • In Example 24d, the subject matter of any one of Examples 14d to 23d can optionally include that the sensor controller is further configured to control the selection network to selectively combine some pixels of the plurality of pixels to form the enlarged pixel.
  • In Example 25d, the subject matter of Example 24d can optionally include that the sensor controller is configured to control the selection network based on the level of illuminance of the LIDAR Sensor System such that with improving lighting conditions a smaller number of sensor pixels of the plurality of sensor pixels will be selected and combined.
  • In Example 26d, the subject matter of any one of Examples 14d to 25d can optionally include that the LIDAR Sensor System further includes a plurality of read-out amplifiers, each read-out amplifier coupled to an associated read-out line of the plurality of read-out lines.
  • In Example 27d, the subject matter of Example 26d can optionally include that the common signal is an electrical current. The plurality of read-out amplifiers includes a plurality of transimpedance amplifiers, each transimpedance amplifier configured to convert the associated electrical current into an electrical voltage.
  • Example 28d is a method for a LIDAR Sensor System. The LIDAR Sensor System may include a plurality of sensor pixels. Each sensor pixel includes at least one photo diode. The LIDAR Sensor System may further include a selection network, and a read-out circuit. The method may include the selection network selectively combining some sensor pixels of the plurality of sensor pixels to form an enlarged sensor pixel, wherein the electrical signals provided by the photo diodes of the combined sensor pixels are accumulated, and the read-out circuit reading-out the accumulated electrical signals from the combined sensor pixels as one common signal.
  • In Example 29d, the subject matter of Example 28d can optionally include that the at least one photo diode includes at least one pin diode.
  • In Example 30d, the subject matter of Example 28d can optionally include that the at least one photo diode includes at least one avalanche photo diode.
  • In Example 31d, the subject matter of Example 30d can optionally include that the at least one avalanche photo diode includes at least one single-photon avalanche photo diode.
  • In Example 32d, the subject matter of any one of Examples 28d to 31d can optionally include that the plurality of sensor pixels are arranged in a sensor matrix in rows and columns.
  • In Example 33d, the subject matter of any one of Examples 28d to 32d can optionally include that the selection network includes a plurality of row selection lines, each row selection line being electrically conductively coupled to at least some sensor pixels of the same row, a plurality of column selection lines, each column selection line being electrically conductively coupled to at least some sensor pixels of the same column, and a plurality of read-out lines, each read-out line being electrically conductively coupled to at least some sensor pixels of the same column or the same row to accumulate the electrical signals provided by the combined sensor pixels.
  • In Example 34d, the subject matter of any one of Examples 28d to 33d can optionally include that each sensor pixel of at least some of the sensor pixels includes a first switch connected between the selection network and a first terminal of the sensor pixel, and/or a second switch connected between a second terminal of the sensor pixel and the selection network.
  • In Example 35d, the subject matter of Example 33d and Example 34d can optionally include that the first switch is connected between a column selection line of the plurality of column selection lines and the first terminal of the sensor pixel. A control terminal of the first switch is controlled via a row selection line of the plurality of row selection lines. The second switch is connected between the second terminal of the sensor pixel and a read-out line of the plurality of read-out lines. A control terminal of the second switch is controlled via a row selection line of the plurality of row selection lines.
  • In Example 36d, the subject matter of any one of Examples 34d or 35d can optionally include that at least one first switch and/or at least one second switch comprises a field effect transistor.
  • In Example 37d, the subject matter of any one of Examples 28d to 36d can optionally include that the method further includes a sensor controller controlling the selection network to selectively combine some sensor pixels of the plurality of sensor pixels to form the enlarged sensor pixel.
  • In Example 38d, the subject matter of Example 37d can optionally include that the sensor controller controls the selection network based on the level of illuminance of the LIDAR Sensor System such that with improving lighting conditions a smaller number of sensor pixels of the plurality of sensor pixels will be selected and combined.
  • In Example 39d, the subject matter of any one of Examples 28d to 38d can optionally include that the LIDAR Sensor System further includes a plurality of read-out amplifiers, each read-out amplifier coupled to an associated read-out line of the plurality of read-out lines.
  • In Example 40d, the subject matter of Example 39d can optionally include that the common signal is an electrical current. The plurality of read-out amplifiers includes a plurality of transimpedance amplifiers. Each transimpedance amplifier converts the associated electrical current into an electrical voltage.
  • Example 41d is a method for a LIDAR Sensor System. The LIDAR Sensor System may include a plurality of pixels. A first pixel of the plurality of pixels includes a photo diode of a first photo diode type, and a second pixel of the plurality of pixels includes a photo diode of a second photo diode type. The second photo diode type is different from the first photo diode type. The LIDAR Sensor System may further include a pixel sensor selector and a sensor controller. The method may include the pixel sensor selector selecting at least one of the first pixel including a photo diode of the first photo diode type and/or at least one of the second pixel including a photo diode of the second photo diode type, and the sensor controller controlling the pixel selector to select at least one first pixel and/or at least one second pixel.
  • In Example 42d, the subject matter of Example 41d can optionally include that the photo diode of a first photo diode type and/or the photo diode of a second photo diode type are/is selected from a group consisting of: a pin photo diode, an avalanche photo diode, and a single-photon photo diode.
  • In Example 43d, the subject matter of any one of Examples 41d or 42d can optionally include that the method further includes a selection network selectively combining some pixels of the plurality of pixels to form an enlarged pixel, wherein the electrical signals provided by the photo diodes of the combined pixels are accumulated, and a read-out circuit reading out the accumulated electrical signals from the combined pixels as one common signal.
  • In Example 44d, the subject matter of any one of Examples 41d to 43d can optionally include that the plurality of pixels are arranged in a sensor matrix in rows and columns.
  • In Example 45d, the subject matter of any one of Examples 41d to 44d can optionally include that the selection network includes a plurality of row selection lines, each row selection line being electrically conductively coupled to at least some pixels of the same row, a plurality of column selection lines, each column selection line being electrically conductively coupled to at least some pixels of the same column, and a plurality of read-out lines, each read-out line being electrically conductively coupled to at least some pixels of the same column or the same row to accumulate the electrical signals provided by the combined pixels.
  • In Example 46d, the subject matter of any one of Examples 41d to 45d can optionally include that each pixel of at least some of the pixels includes a first switch connected between the selection network and a first terminal of the pixel, and/or a second switch connected between a second terminal of the pixel and the selection network.
  • In Example 47d, the subject matter of Example 45d and Example 46d can optionally include that the first switch is connected between a column selection line of the plurality of column selection lines and the first terminal of the pixel. A control terminal of the first switch is controlled via a row selection line of the plurality of row selection lines, and the second switch is connected between the second terminal of the pixel and a read-out line of the plurality of read-out lines. A control terminal of the second switch is controlled via a row selection line of the plurality of row selection lines.
  • In Example 48d, the subject matter of any one of Examples 46d or 47d can optionally include that at least one first switch and/or at least one second switch includes a field effect transistor.
  • In Example 49d, the subject matter of any one of Examples 41d to 48d can optionally include that the method further includes the sensor controller controlling the selection network to selectively combine some pixels of the plurality of pixels to form the enlarged pixel.
  • In Example 50d, the subject matter of Example 49d can optionally include that the sensor controller controls the selection network based on the level of illuminance of the LIDAR Sensor System such that with improving lighting conditions a smaller number of sensor pixels of the plurality of sensor pixels will be selected and combined.
  • In Example 51d, the subject matter of any one of Examples 41d to 50d can optionally include that the LIDAR Sensor System further includes a plurality of read-out amplifiers, each read-out amplifier coupled to an associated read-out line of the plurality of read-out lines.
  • In Example 52d, the subject matter of Example 51d can optionally include that the common signal is an electrical current. The plurality of read-out amplifiers includes a plurality of transimpedance amplifiers, each transimpedance amplifier converting the associated electrical current into an electrical voltage.
  • Example 53d is a computer program product. The computer program product may include a plurality of program instructions that may be embodied in a non-transitory computer readable medium, which when executed by a computer program device of a LIDAR Sensor System according to any one of Examples 1d to 27d, cause the LIDAR Sensor System to execute the method according to any one of Examples 28d to 52d.
  • Example 54d is a data storage device with a computer program that may be embodied in a non-transitory computer readable medium, adapted to execute at least one of a method for a LIDAR Sensor System according to any one of the above method examples or a LIDAR Sensor System according to any one of the above LIDAR Sensor System examples.
  • The LIDAR Sensor System according to the present disclosure may be combined with a LIDAR Sensor Device for illumination of an environmental space connected to a light control unit.
  • In the LIDAR Sensor System, a combination of a LIDAR sensor and a camera sensor may be desired e.g. in order to identify an object or characteristics of an object by means of data fusion. Furthermore, depending on the situation, either a three dimensional measurement by means of a LIDAR sensor or a two dimensional mapping by means of a camera sensor may be desired. By way of example, a LIDAR sensor alone usually cannot determine whether taillights of a vehicle are switched on or switched off.
  • In a conventional combination of a LIDAR sensor and a camera sensor, two separate image sensors are provided and these are combined by means of a suitable optics arrangement (e.g. semitransparent mirrors, prisms, and the like). As a consequence, a rather large LIDAR sensor space is required and both partial optics arrangements of the optics arrangement and both sensors (LIDAR sensor and camera sensor) have to be aligned to each other with high accuracy. As an alternative, in the case of two separate mapping systems and thus two sensors, the relative positions of the optical axes of the two sensors to each other have to be determined with high accuracy in order to take into consideration, in a subsequent image processing, effects resulting from the geometric distance of the sensors from each other, so that the images provided by the sensors can be accurately matched. Furthermore, deviations of the relative orientation of the optical axes of the sensors should also be taken into consideration, since they have an effect on the calibration state. This also includes the fact that the fields of view of both sensors do not necessarily coincide with each other and that regions may exist in close proximity to the sensors in which an object cannot be detected by all sensors simultaneously.
  • Various aspects of this disclosure may provide a LIDAR functionality at two different wavelengths or the combination of a LIDAR function and a camera function in a visible wavelength region or the combination of a LIDAR function and a camera function in a wavelength region of the thermal infrared as will be described in more detail below.
  • In a conventional LIDAR Sensor System, a combination of a LIDAR function with a camera function is usually implemented by means of two separate sensor systems, and the relative position of the sensor systems to each other is taken into consideration in the image processing. In the context of a (movie or video) camera, there is an approach to use three individual image sensors instead of a CCD/CMOS image sensor array with color filters (Bayer pattern). The incoming light may be distributed over the three image sensors by means of an optics arrangement having full-faced color filters (e.g. a trichroic beam splitter prism). In the context of a conventional photo camera, efforts have been made to avoid the disadvantageous effects of the Bayer pattern color filter by providing a CMOS image sensor which uses the wavelength-dependent absorption of silicon in order to register different spectral colors at different depths of penetration.
  • Illustratively, the physical principle of the wavelength dependent depth of penetration of light into a carrier such as a semiconductor (e.g. silicon) substrate, which has (up to now) only been used in photo applications, is used in the field of the integration of a LIDAR sensor and a camera sensor in accordance with various embodiments.
  • To achieve this, two or more different types of photo diodes may be stacked above one another, i.e. one type of photo diode is placed over another type of photo diode. This may be implemented e.g. by a monolithic integration of the different types of photo diodes in one common manufacturing process (or by other types of integration processes such as wafer bonding or other three-dimensional processes). In various embodiments, a pin photo diode for the detection of visible light (e.g. the red spectral region for the detection of car taillights) may be provided near the surface of the carrier (e.g. substrate). In a deeper region of the carrier (e.g. in a deeper region of the substrate), there may be provided an avalanche photo diode (APD), which may be configured to detect light emitted by a laser emitter and having a wavelength in the near infrared region (NIR). The red light may in this case be detected by the pin photo diode near the surface due to its smaller depth of penetration. Substantially smaller portions of the light of the visible spectrum (VIS) penetrate into the deeper region (e.g. deeper layers) in this case, so that the avalanche photo diode which is implemented there is primarily sensitive to NIR light.
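  • The effect can be estimated with a simple Beer-Lambert calculation: the fraction of light absorbed within the top layer is 1 − exp(−αd), where α is the wavelength-dependent absorption coefficient and d the layer thickness. The sketch below uses rough order-of-magnitude absorption coefficients for silicon and an assumed pin diode thickness; both are illustrative, not values from this disclosure.

```python
import math

# Approximate absorption coefficients of silicon (1/um); red light is
# absorbed much more strongly than NIR light, so a thin top layer catches
# most of the red light while most NIR light reaches the deeper APD.
alpha_per_um = {"red_620nm": 0.35, "nir_905nm": 0.03}
top_layer_um = 3.0  # assumed thickness of the pin photo diode layer

for name, alpha in alpha_per_um.items():
    absorbed = 1.0 - math.exp(-alpha * top_layer_um)
    print(f"{name}: {absorbed:.0%} absorbed in the top {top_layer_um} um")
# red_620nm: ~65% absorbed near the surface; nir_905nm: ~9% (rest goes deeper)
```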
  • The stacking of the photo diodes one above the other may be useful in that:
      • the sensor functions of the pin photo diodes (camera) and the APDs (LIDAR) are always accurately aligned with respect to each other and only one receiving optical arrangement is required; in various embodiments, CCD or CMOS sensors may be provided; moreover, the camera may be configured as an infrared (IR) camera, as a camera for visible light, as a thermal camera, or as a combination thereof;
      • the incoming light is efficiently used.
  • FIG. 9 shows schematically in a cross sectional view an optical component 5100 for a LIDAR Sensor System in accordance with various embodiments.
  • The optical component 5100 may include a carrier, which may include a substrate, e.g. including a semiconductor material and/or a semiconductor compound material. Examples of materials that may be used for the carrier and/or the semiconductor structure include one or more of the following materials: GaAs, AlGaInP, GaP, AlP, AlGaAs, GaAsP, GaInN, GaN, Si, SiGe, Ge, HgCdTe, InSb, InAs, GaInSb, GaSb, CdSe, HgSe, AlSb, CdS, ZnS, ZnSb, ZnTe. The substrate may optionally include a device layer 5102. One or more electronic devices 5104 such as (field effect) transistors 5104 or other electronic devices (resistors, capacitors, inductors, and the like) 5104 may be completely or partially formed in the device layer 5102. The one or more electronic devices 5104 may be configured to process signals generated by the first photo diode 5110 and the second photo diode 5120, which will be described in more detail below. The substrate may optionally include a bottom interconnect layer 5106. Alternatively, the interconnect layer 5106 may be configured as a separate layer, e.g. as a separate layer arranged above the device layer 5102 (as shown in FIG. 9). The carrier may have a thickness in the range from about 100 μm to about 3000 μm.
  • One or more electronic contacts 5108 configured to contact the electronic devices 5104 or an anode or a cathode of a first photo diode 5110, in other words a first portion of the first photo diode 5110 (which will be described in more detail below), may be connected to an electronic contact 5108 of the bottom interconnect layer 5106. Furthermore, one or more contact vias 5112 may be formed in the bottom interconnect layer 5106. The one or more contact vias 5112 extend through the entire layer structure implementing the first photo diode 5110 into an intermediate interconnect/device layer 5114. The one or more electronic contacts 5108 as well as the one or more contact vias 5112 may be made of electrically conductive material such as a metal (e.g. Cu or Al) or any other suitable electrically conductive material. The one or more electronic contacts 5108 and the one or more contact vias 5112 may form an electrically conductive connection network in the bottom interconnect layer 5106.
  • The first photo diode 5110 may be an avalanche type photo diode such as an avalanche photo diode (APD) or a single-photon photo diode (SPAD). The first photo diode 5110 may be operated in the linear mode/in the Geiger mode. Illustratively, the first photo diode 5110 implements a LIDAR sensor pixel in a first semiconductor structure over the carrier. The first photo diode 5110 is configured to absorb received light in a first wavelength region. The first photo diode 5110 and thus the first semiconductor structure may have a layer thickness in the range from about 500 nm to about 50 μm.
  • One or more further electronic devices 5116 such as (field effect) transistors 5116 or other further electronic devices (resistors, capacitors, inductors, and the like) 5116 may be completely or partially formed in the intermediate interconnect/device layer 5114. One or more further electronic contacts 5118 configured to contact the further electronic devices 5116 or an anode or a cathode of the first photo diode 5110, in other words a second portion of the first photo diode 5110, may be connected to a further electronic contact 5118 of the intermediate interconnect/device layer 5114. The one or more further electronic contacts 5118 and the one or more contact vias 5112 may form an electrically conductive connection network (electrically conductive structure configured to electrically contact the first photo diode 5110 and the second photo diode 5120) in the intermediate interconnect/device layer 5114. Illustratively, the intermediate interconnect/device layer 5114 (which may also be referred to as interconnect layer 5114) is arranged between the first semiconductor structure and the second semiconductor structure.
  • One or more further electronic contacts 5118 and/or one or more contact vias 5112 may be configured to contact the further electronic devices 5116 or an anode or a cathode of a second photo diode 5120, in other words a first portion of the second photo diode 5120 (which will be described in more detail below) may be connected to a further electronic contact 5118 of the intermediate interconnect/device layer 5114.
  • The second photo diode 5120 may be arranged over (e.g. in direct physical contact with) the intermediate interconnect/device layer 5114. The second photo diode 5120 may be a pin photo diode (e.g. configured to receive light of the visible spectrum). Illustratively, the second photo diode 5120 implements a camera sensor pixel in a second semiconductor structure over the intermediate interconnect/device layer 5114 and thus also over the first semiconductor structure. In other words, the second photo diode 5120 is vertically stacked over the first photo diode 5110. The second photo diode 5120 is configured to absorb received light in a second wavelength region. The received light of the second wavelength region has a shorter wavelength than the received light of the first wavelength region.
  • FIGS. 10A and 10B show schematically in a cross sectional view an optical component 5200 for a LIDAR Sensor System (FIG. 10A) and a corresponding wavelength/transmission diagram 5250 (FIG. 10B) in accordance with various embodiments.
  • The optical component 5200 of FIG. 10A is substantially similar to the optical component 5100 of FIG. 9 as described above. Therefore, only the main differences of the optical component 5200 of FIG. 10A with respect to the optical component 5100 of FIG. 9 will be described in more detail below.
  • The optical component 5200 of FIG. 10A may further optionally include one or more microlenses 5202, which may be arranged over the second photo diode 5120 (e.g. directly above, in other words in physical contact with the second photo diode 5120). The one or more microlenses 5202 may be embedded in or at least partially surrounded by a suitable filler material 5204 such as silicone. The one or more microlenses 5202 together with the filler material 5204 may, for a layer structure, have a layer thickness in the range from about 1 μm to about 500 μm.
  • Furthermore, a filter layer 5206, which may be configured to implement a bandpass filter, may be arranged over the optional one or more microlenses 5202 or the second photo diode 5120 (e.g. directly above, in other words in physical contact with the optional filler material 5204 or with the second photo diode 5120). The filter layer 5206 may have a layer thickness in the range from about 1 μm to about 500 μm.
  • As shown in FIG. 10A, light impinges on the upper (exposed) surface 5208 of the filter layer 5206. The light may include various wavelengths, such as a first wavelength range λ1 (e.g. in the ultra-violet spectral region), a second wavelength range λ2 (e.g. in the visible spectral region), and a third wavelength range λ3 (e.g. in the near-infrared spectral region). Light having the first wavelength range λ1 is symbolized in FIG. 10A by a first arrow 5210. Light having the second wavelength range λ2 is symbolized in FIG. 10A by a second arrow 5212. Light having the third wavelength range λ3 is symbolized in FIG. 10A by a third arrow 5214.
  • The wavelength/transmission diagram 5250 as shown in FIG. 10B illustrates the wavelength-dependent transmission characteristic of the filter layer 5206. As illustrated, the filter layer 5206 has a bandpass filter characteristic. In more detail, the filter layer 5206 has a low, ideally negligible transmission for light having the first wavelength range λ1. In other words, the filter layer 5206 may completely block the light portions having the first wavelength range λ1 impinging on the upper (exposed) surface 5208 of the filter layer 5206. Furthermore, the transmission characteristic 5252 shows that the filter layer 5206 is substantially fully transparent (transmission factor close to “1”) for light having the second wavelength range λ2 and for light having the third wavelength range λ3.
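  • An idealized model of this bandpass characteristic is sketched below; the pass band edges are assumptions chosen so that visible light (λ2) and a 905 nm LIDAR band (λ3) are transmitted while shorter-wavelength light (λ1) is blocked.

```python
PASS_LOW_NM, PASS_HIGH_NM = 380.0, 950.0  # assumed pass band edges

def transmission(wavelength_nm: float) -> float:
    """Idealized filter 5206: ~1.0 inside the pass band, ~0.0 outside."""
    return 1.0 if PASS_LOW_NM <= wavelength_nm <= PASS_HIGH_NM else 0.0

for wl in (300.0, 550.0, 905.0):  # lambda1 (UV), lambda2 (VIS), lambda3 (NIR)
    print(wl, transmission(wl))   # 300 blocked, 550 and 905 transmitted
```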
  • In various embodiments, the second photo diode 5120 may include or be a pin photo diode (configured to detect light of the visible spectrum) and the first photo diode 5110 may include or be an avalanche photo diode (in the linear mode/in the Geiger mode) (configured to detect light of the near infrared (NIR) spectrum or in the infrared (IR) spectrum).
  • FIGS. 11A and 11B show schematically in a cross sectional view an optical component 5300 for a LIDAR Sensor System (FIG. 11A) and a corresponding wavelength/transmission diagram 5250 (FIG. 11B) in accordance with various embodiments.
  • The optical component 5300 of FIG. 11A is substantially similar to the optical component 5200 of FIG. 10A as described above. Therefore, only the main differences of the optical component 5300 of FIG. 11A from the optical component 5200 of FIG. 10A will be described in more detail below.
  • The optical component 5300 of FIG. 11A may further optionally include a mirror structure (e.g. a Bragg mirror structure). The second photo diode 5120 may be arranged (in other words sandwiched) between the two mirrors (e.g. two Bragg mirrors) 5302, 5304 of the mirror structure. In other words, the optical component 5300 of FIG. 11A may further optionally include a bottom mirror (e.g. a bottom Bragg mirror) 5302. The bottom mirror (e.g. the bottom Bragg mirror) 5302 may be arranged over (e.g. in direct physical contact with) the intermediate interconnect/device layer 5114. In this case, the second photo diode 5120 may be arranged over (e.g. in direct physical contact with) the bottom mirror 5302. Furthermore, a top mirror (e.g. a top Bragg mirror) 5304 may be arranged over (e.g. in direct physical contact with) the second photo diode 5120. In this case, the optional one or more microlenses 5202 or the filter layer 5206 may be arranged over (e.g. in direct physical contact with) the top mirror 5304.
  • In various embodiments, the second photo diode 5120 may include or be a pin photo diode (configured to detect light of the visible spectrum) and the first photo diode 5110 may include or be an avalanche photo diode (in the linear mode/in the Geiger mode) (configured to detect light of the near infrared (NIR) spectrum or in the infrared (IR) spectrum).
  • FIG. 12 shows schematically a cross sectional view 5400 of a sensor 52 for a LIDAR Sensor System in accordance with various embodiments. As shown in FIG. 12, the sensor 52 may include a plurality of optical components (e.g. a plurality of optical components 5100 as shown in FIG. 9) in accordance with any one of the embodiments as described above or as will be described further below. The optical components may be arranged in an array, e.g. in a matrix arrangement, e.g. in rows and columns. In various embodiments, more than 10, more than 100, more than 1000, more than 10000, or even more optical components may be provided.
  • FIG. 13 shows a top view 5500 of the sensor 52 of FIG. 12 for a LIDAR Sensor System in accordance with various embodiments. The top view 5500 illustrates a plurality of color filter portions (each color filter may be implemented as a filter layer 5206). The different color filter portions may be configured to transmit (transfer) light of different wavelengths in the visible spectrum (to be detected by the second photo diode 5120) and light of one or more wavelengths to be absorbed or detected by the first photo diode 5110 for LIDAR detection. By way of example, a red pixel filter portion 5502 may be configured to transmit light having a wavelength to represent red color (to be detected by the second photo diode 5120) and light of one or more wavelengths to be absorbed or detected by the first photo diode 5110 for LIDAR detection and to block light outside these wavelength regions. Furthermore, a green pixel filter portion 5504 may be configured to transmit light having a wavelength to represent green color (to be detected by the second photo diode 5120) and light of one or more wavelengths to be absorbed or detected by the first photo diode 5110 for LIDAR detection and to block light outside these wavelength regions. Moreover, a blue pixel filter portion 5506 may be configured to transmit light having a wavelength to represent blue color (to be detected by the second photo diode 5120) and light of one or more wavelengths to be absorbed or detected by the first photo diode 5110 for LIDAR detection and to block light outside these wavelength regions. The color filter portions 5502, 5504, 5506 may each have a lateral size corresponding to a sensor pixel, in this case a size similar to the lateral sizes of the second photo diodes 5120. In these embodiments, the first photo diodes 5110 may have the same lateral size as the second photo diodes 5120. The color filter portions 5502, 5504, 5506 may be arranged in accordance with a Bayer pattern.
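  • For illustration, a Bayer arrangement of the color filter portions can be expressed as a simple tiling rule; the RGGB phase and the array size in the sketch below are assumptions.

```python
def bayer_filter(row: int, col: int) -> str:
    """Classic RGGB Bayer tiling: green on one diagonal, red and blue on
    the other; each portion additionally passes the LIDAR wavelength."""
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

for r in range(4):
    print(" ".join(bayer_filter(r, c) for c in range(4)))
# R G R G
# G B G B
# R G R G
# G B G B
```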
  • FIG. 14 shows a top view 5600 of a sensor 52 for a LIDAR Sensor System in accordance with various embodiments.
  • The sensor of FIG. 14 is substantially similar to the sensor of FIG. 13 as described above. Therefore, only the main difference of the sensor of FIG. 14 from the sensor of FIG. 13 will be described in more detail below.
  • In various embodiments, the color filter portions 5502, 5504, 5506 may each have a lateral size corresponding to a sensor pixel, in this case a size similar to the lateral size of the second photo diodes 5120. In these embodiments, the first photo diodes 5110 may have a larger lateral size than the second photo diodes 5120. By way of example, the surface area of the first photo diodes 5110 may be larger than the surface area of the second photo diodes 5120. In one implementation, the surface area of the first photo diodes 5110 may be larger than the surface area of the second photo diodes 5120 by a factor of two, or by a factor of four, or by a factor of eight, or by a factor of sixteen. The larger size of the first photo diodes 5110 is symbolized by rectangles 5602 in FIG. 14. The color filter portions 5502, 5504, 5506 may also be arranged in accordance with a Bayer pattern. In these examples, the resolution of the first photo diodes 5110 may not be of high importance, but the sensitivity of the first photo diodes 5110 may be important.
  • FIG. 15 shows a top view 5700 of a sensor 52 for a LIDAR Sensor System in accordance with various embodiments.
  • The sensor of FIG. 15 is substantially similar to the sensor of FIG. 13 as described above. Therefore, only the main difference of the sensor of FIG. 15 from the sensor of FIG. 13 will be described in more detail below.
  • The top view 5700 illustrates a plurality of color filter portions (each color filter may be implemented as a filter layer 5206) different from the color filter portions of the sensor as shown in FIG. 13 or FIG. 14. In these examples, a red pixel filter portion 5702 may be configured to transmit light having a wavelength to represent red color (to be detected by the second photo diode 5120 in order to detect a taillight of a vehicle) and light of one or more wavelengths to be absorbed or detected by the first photo diode 5110 for LIDAR detection and to block light outside these wavelength regions. Furthermore, a yellow (or orange) pixel filter portion 5704 may be configured to transmit light having a wavelength to represent yellow (or orange) color (to be detected by the second photo diode 5120 in order to detect a warning light or a blinking light of a vehicle) and light of one or more wavelengths to be absorbed or detected by the first photo diode 5110 for LIDAR detection and to block light outside these wavelength regions. In these embodiments, the first photo diodes 5110 may have a larger lateral size than the second photo diodes 5120. By way of example, the surface area of the first photo diodes 5110 may be larger than the surface area of the second photo diodes 5120. In one implementation, the surface area of the first photo diodes 5110 may be larger than the surface area of the second photo diodes 5120 by a factor of two, or by a factor of four, or by a factor of eight, or by a factor of sixteen. The larger size of the first photo diodes 5110 is symbolized by rectangles 5602 in FIG. 15. The color filter portions 5702 and 5704 may be arranged in accordance with a checkerboard pattern. In these examples, the resolution of the first photo diodes 5110 may not be of high importance, but the sensitivity of the first photo diodes 5110 may be important.
  • It is to be noted that the structure and the transmission characteristics of the color filter portions may vary as a function of the desired color space. In the above described embodiments, an RGB color space was considered. Other possible color spaces that may be provided are CYMG (cyan, yellow, magenta and green), RGBE (red, green, blue, and emerald), CMYW (cyan, magenta, yellow, and white), and the like. The color filter portions would be adapted accordingly. Optional further color filter types may mimic the scotopic sensitivity curve of the human eye.
  • FIG. 16 shows an optical component 5800 for a LIDAR Sensor System in accordance with various embodiments.
  • The optical component 5800 of FIG. 16 is substantially similar to the optical component 5200 of FIG. 10A as described above. Therefore, the main differences of the optical component 5800 of FIG. 16 from the optical component 5200 of FIG. 10A will be described in more detail below.
  • To begin with, the optical component 5800 may have or may not have the optional one or more microlenses 5202 and the filler material 5204. Furthermore, a reflector layer 5802 may be arranged over (e.g. in direct physical contact with) the filter layer 5206. The reflector layer 5802 may be configured to reflect light in a wavelength region of a fourth wavelength range λ4. The fourth wavelength range λ4 may have longer wavelengths than the first wavelength range λ1, the second wavelength range λ2, and the third wavelength range λ3. A light portion of the fourth wavelength range λ4 is symbolized in FIG. 16 by a fourth arrow 5804. This light impinges on the reflector layer 5802 and is reflected by it. The light portion that is reflected by the reflector layer 5802 is symbolized in FIG. 16 by a fifth arrow 5806. The reflector layer 5802 may be configured to reflect light in the wavelength region of thermal infrared light or infrared light. The reflector layer 5802 may include a Bragg stack of layers configured to reflect light of a desired wavelength or wavelength region. The optical component 5800 may further include a micromechanically defined IR absorber structure 5808 arranged over the reflector layer 5802. The IR absorber structure 5808 may be provided for a temperature-dependent resistivity measurement (based on the so-called microbolometer principle). To electrically contact the IR absorber structure 5808 for the resistivity measurement, one or more conductor lines may be provided, e.g. in the intermediate interconnect/device layer 5114. The reflector layer 5802 may be configured to reflect thermal infrared radiation having a wavelength greater than approximately 2 μm.
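  • A minimal sketch of the microbolometer read-out principle mentioned above: absorbed thermal IR power heats the absorber structure, and the resulting temperature rise changes its resistance. The baseline resistance, temperature coefficient and thermal conductance below are illustrative assumptions for a vanadium-oxide-like absorber, not values from this disclosure.

```python
R0 = 100e3        # resistance at the reference temperature (ohm), assumed
TCR = -0.02       # temperature coefficient of resistance (1/K), assumed
G_THERMAL = 1e-7  # thermal conductance to the substrate (W/K), assumed

def bolometer_resistance(ir_power_w: float) -> float:
    """Steady-state resistance after self-heating by the absorbed IR power."""
    delta_t = ir_power_w / G_THERMAL  # temperature rise of the absorber
    return R0 * (1.0 + TCR * delta_t)

print(bolometer_resistance(0.0))    # 100000.0 ohm (no IR)
print(bolometer_resistance(50e-9))  # 50 nW absorbed -> resistance drops
```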
  • Various embodiments such as e.g. the embodiments illustrated above may include a stack of different photo diodes, such as:
      • a stack of a pin photo diode (configured to detect light of the visible spectrum) over a pin photo diode (configured to detect light of the near infrared (NIR) spectrum);
      • a stack of a pin photo diode (configured to detect light of the visible spectrum) over an avalanche photo diode (in the linear mode/in the Geiger mode) (configured to detect light of the near infrared (NIR) spectrum);
      • a stack of a resonant cavity photo diode (configured to detect light of the visible spectrum) over an avalanche photo diode (in the linear mode/in the Geiger mode) (configured to detect light of the near infrared (NIR) spectrum);
      • a stack of a pin photo diode (configured to detect light of the visible spectrum) over a further photo diode configured to provide indirect ToF measurements by means of phase differences (e.g. PMD approach);
      • a stack of a resonant cavity photo diode (configured to detect light of the visible spectrum) over a further photo diode configured to provide indirect ToF measurements by means of phase differences (e.g. PMD approach);
  • As described above, the above-mentioned embodiments may be complemented by a filter, e.g. a bandpass filter, which is configured to transmit the portions of the light which should be detected by the photo diode near the surface of the carrier (e.g. of the visible spectrum), such as red light for vehicle taillights, as well as the portions of the light having the wavelength of the LIDAR source used (e.g. laser source).
  • The above-mentioned embodiments may further be complemented by one or more microlenses per pixel to increase the fill factor (a reduced fill factor may occur due to circuit regions of an image sensor pixel required by the manufacturing process). The fill factor is to be understood as the area ratio between the optically active area and the total area of the pixel. The optically active area may be reduced e.g. by electronic components. A microlens may extend over the entire area of the pixel and may guide the light to the optically active area. This would increase the fill factor.
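  • As a short worked example of the fill factor definition above (the pixel geometry is assumed for illustration):

```python
pixel_pitch_um = 10.0   # total pixel width (assumed)
active_width_um = 7.0   # active area shrunk by in-pixel circuitry (assumed)

fill_factor = (active_width_um ** 2) / (pixel_pitch_um ** 2)
print(f"fill factor without microlens: {fill_factor:.0%}")  # 49%
# A microlens spanning the full 10 um pitch funnels the light onto the
# active area, raising the effective fill factor toward 100%.
```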
  • In various embodiments, a front-side illuminated image sensor or a back-side illuminated image sensor may be provided. In a front-side illuminated image sensor, the device layer is positioned in a layer facing the light impinging on the sensor 52. In a back-side illuminated image sensor, the device layer is positioned in a layer facing away from the light impinging on the sensor 52.
  • In various embodiments, two APD photo diodes may be provided which are configured to detect light at different NIR wavelengths and which may be stacked over each other, e.g. to use the wavelength-dependent absorption characteristics of water (vapor) and to obtain information about the amount of water present in the atmosphere and/or on surfaces such as a roadway by comparing the intensities of the light detected at the different wavelengths.
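  • A minimal sketch of such a two-wavelength comparison follows, assuming one APD band near 905 nm (weak water absorption) and one near 1450 nm (strong water absorption); the wavelengths, intensities and threshold are illustrative assumptions.

```python
def wetness_indicator(i_905: float, i_1450: float, threshold: float = 0.5) -> str:
    """A low 1450/905 intensity ratio suggests water on the surface,
    because water absorbs much more strongly near 1450 nm."""
    return "wet" if i_1450 / i_905 < threshold else "dry"

print(wetness_indicator(1.0, 0.8))  # dry: both bands return strongly
print(wetness_indicator(1.0, 0.2))  # wet: the 1450 nm band is absorbed
```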
  • Depending on the desired wavelengths, the detector may be implemented in a semiconductor material such as silicon or in semiconductor compound material such as silicon germanium, III-V semiconductor compound material, or II-VI semiconductor compound material, individually or in combination with each other.
  • Various embodiments may allow the manufacturing of a miniaturized and/or cost-efficient sensor system which may combine a camera sensor and a LIDAR sensor with each other in one common carrier (e.g. substrate). Such a sensor system may be provided for pattern recognition, or object recognition, or face recognition. The sensor system may be implemented in a mobile device such as a mobile phone or smartphone.
  • Furthermore, various embodiments may allow the manufacturing of a compact and/or cost-efficient sensor system for a vehicle. Such a sensor system may be configured to detect active taillights of one or more other vehicles and at the same time to perform a three-dimensional measurement of objects by means of the LIDAR sensor portion of the sensor system.
  • Moreover, various embodiments allow the combination of two LIDAR wavelengths in one common detector e.g. to obtain information about the surface characteristic of a reflecting target object by means of a comparison of the respectively reflected light.
  • Various embodiments may allow the combination of a LIDAR sensor, a camera sensor (configured to detect light of the visible spectrum (VIS)) and a camera sensor (configured to detect light of the thermal infrared spectrum) in one common sensor (e.g. monolithically integrated on one common carrier, e.g. one common substrate, e.g. one common wafer).
  • Various embodiments may reduce adjustment variations between different image sensors for camera and LIDAR.
  • In various embodiments, even more than two photo diodes may be stacked one above the other.
  • It is to be noted that in various embodiments, the lateral size (and/or shape) of the one, two or even more photo diodes and the color filter portions of the filter layer (e.g. filter layer 5206) may be the same.
  • Furthermore, in various embodiments, the lateral size (and/or shape) of the one, two, or even more photo diodes may be the same, and the lateral size (and/or shape) of the color filter portions of the filter layer (e.g. filter layer 5206) may be different from each other and/or from the lateral size (and/or shape) of the one, two or even more photo diodes.
  • Moreover, in various embodiments, the lateral size (and/or shape) of the one, two, or even more photo diodes may be different from each other and/or from the lateral size (and/or shape) of the color filter portions, and the lateral size (and/or shape) of the color filter portions of the filter layer (e.g. filter layer 5206) may be the same.
  • Moreover, in various embodiments, the lateral size (and/or shape) of the one, two, or even more photo diodes may be different from each other and the lateral size (and/or shape) of the color filter portions of the filter layer (e.g. filter layer 5206) may be different from each other and/or from the lateral size (and/or shape) of the one, two or even more photo diodes.
  • In addition, as already described above, other types of color filter combinations, like CYMG (cyan, yellow, magenta and green), RGBE (red, green, blue, and emerald), or CMYW (cyan, magenta, yellow, and white) may be used as well. The color filters may have a bandwidth (FWHM) in the range from about 50 nm to about 200 nm. However, monochrome filters (black/white) may also be provided.
  • It is to be noted that standard color value components and luminance factors for retroreflective traffic signs are specified in accordance with DIN EN 12899-1 and DIN 6171-1. The color coordinates of vehicle headlamps (dipped and high beam, daytime running lights) are defined by the ECE white field (CIE diagram) of the automotive industry. The same applies to signal colors, whose color coordinates are defined, for example, by ECE color boundaries. See also CIE No. 2.2 (TC-1.6) 1975, or BGBl. II, issued on 12 Aug. 2005, No. 248. Other national or regional specification standards may apply as well. All these components may be implemented in various embodiments.
  • Accordingly, the transmission curves of the sensor pixel color filters used should comply with the respective color-related traffic regulations. Sensor elements having sensor pixels with color filters need not be arranged only in a Bayer pattern; other pattern configurations may be used as well, for example an X-Trans matrix pixel-filter configuration.
  • A sensor as described with respect to FIGS. 10 to 16 may e.g. be implemented in a photon mixing device (e.g. for an indirect ToF measurement) or in a consumer electronic device in which, for example, a front camera of a smartphone may at the same time generate a three-dimensional image.
  • A sensor as described with respect to FIGS. 10 to 16 may e.g. also be implemented in a sensor to detect the characteristic of a surface, for example whether a street is dry or wet, since the surface usually has different light reflection characteristics depending on its state (e.g. dry state or wet state).
  • As previously described with reference to FIG. 1 to FIG. 8, a stacked photo diode in accordance with various embodiments as described with reference to FIG. 9 to FIG. 16 may implement a first sensor pixel including a photo diode of a first photo diode type and a second pixel of the plurality of pixels including a photo diode of a second photo diode type.
  • By way of example, such a stacked optical component may include a plurality of photo diodes of different photo diode types (e.g. two, three, four or more photo diodes stacked above one another). The stacked optical component may be substantially similar to the optical component 5100 of FIG. 9 as described above. Therefore, only the main differences of the stacked optical component with respect to the optical component 5100 of FIG. 9 will be described in more detail below.
  • The stacked optical component may optionally include one or more microlenses, which may be arranged over the second photo diode (e.g. directly above, in other words in physical contact with the second photo diode). The one or more microlenses may be embedded in or at least partially surrounded by a suitable filler material such as silicone. The one or more microlenses together with the filler material may, for a layer structure, have a layer thickness in the range from about 1 μm to about 500 μm.
  • Furthermore, a filter layer, which may be configured to implement a bandpass filter, may be arranged over the optional one or more microlenses or the second photo diode (e.g. directly above, in other words in physical contact with the optional filler material or with the second photo diode). The filter layer may have a layer thickness in the range from about 1 μm to about 500 μm. The filter layer may have a filter characteristic in accordance with the respective application.
  • In various embodiments, the second photo diode may include or be a pin photo diode (configured to detect light of the visible spectrum) and the first photo diode may include or be an avalanche photo diode (in the linear mode/in the Geiger mode) (configured to detect light of the near infrared (NIR) spectrum or in the infrared (IR) spectrum).
• In various embodiments, a multiplexer may be provided to individually select the sensor signals provided e.g. by the pin photo diode or by the avalanche photo diode. Thus, the multiplexer may select e.g. either the pin photo diode (and thus forward only the sensor signals provided by the pin photo diode) or the avalanche photo diode (and thus forward only the sensor signals provided by the avalanche photo diode), as illustrated in the sketch below.
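• The following is a minimal sketch of such a multiplexer selection logic; the function and signal names are hypothetical stand-ins for the hardware and are not part of this disclosure.

```python
# Minimal sketch of the multiplexer described above: exactly one photo diode
# signal path is forwarded at a time. All names are hypothetical.
from typing import Callable

def multiplex(select_apd: bool,
              read_pin: Callable[[], float],
              read_apd: Callable[[], float]) -> float:
    """Forward the sample of exactly one photo diode, as a multiplexer would."""
    return read_apd() if select_apd else read_pin()

# Usage with stand-in samples from the two photo diode types:
camera_sample = multiplex(False, lambda: 0.42, lambda: 0.07)  # pin photo diode path
lidar_sample = multiplex(True, lambda: 0.42, lambda: 0.07)    # avalanche photo diode path
```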
  • In the following, various aspects of this disclosure will be illustrated:
  • Example 1f is an optical component for a LIDAR Sensor System. The optical component includes a first photo diode implementing a LIDAR sensor pixel in a first semiconductor structure and configured to absorb received light in a first wavelength region, a second photo diode implementing a camera sensor pixel in a second semiconductor structure over the first semiconductor structure and configured to absorb received light in a second wavelength region, and an interconnect layer (e.g. between the first semiconductor structure and the second semiconductor structure) including an electrically conductive structure configured to electrically contact the second photo diode. The received light of the second wavelength region has a shorter wavelength than the received light of the first wavelength region.
  • In Example 2f, the subject matter of Example 1f can optionally include that the second photo diode is vertically stacked over the first photo diode.
  • In Example 3f, the subject matter of any one of Examples 1f or 2f can optionally include that the first photo diode is a first vertical photo diode, and/or that the second photo diode is a second vertical photo diode.
  • In Example 4f, the subject matter of any one of Examples 1f to 3f can optionally include that the optical component further includes a further interconnect layer (e.g. between the carrier and the first semiconductor structure) including an electrically conductive structure configured to electrically contact the second vertical photo diode and/or the first vertical photo diode.
  • In Example 5f, the subject matter of any one of Examples 1f to 4f can optionally include that the optical component further includes a microlens over the second semiconductor structure that laterally substantially covers the first vertical photo diode and/or the second vertical photo diode.
  • In Example 6f, the subject matter of any one of Examples 1f to 5f can optionally include that the optical component further includes a filter layer over the second semiconductor structure that laterally substantially covers the first vertical photo diode and/or the second vertical photo diode and is configured to transmit received light having a wavelength within the first wavelength region and within the second wavelength region, and block light that is outside of the first wavelength region and the second wavelength region.
  • In Example 7f, the subject matter of any one of Examples 1f to 6f can optionally include that the received light of the first wavelength region has a wavelength in the range from about 800 nm to about 1800 nm, and/or that the received light of the second wavelength region has a wavelength in the range from about 380 nm to about 780 nm.
  • In Example 8f, the subject matter of any one of Examples 1f to 6f can optionally include that the received light of the first wavelength region has a wavelength in the range from about 800 nm to about 1800 nm, and/or that the received light of the second wavelength region has a wavelength in the range from about 800 nm to about 1750 nm.
  • In Example 9f, the subject matter of any one of Examples 1f to 8f can optionally include that the received light of the second wavelength region has a shorter wavelength than any received light of the first wavelength region by at least 50 nm, for example at least 100 nm.
  • In Example 10f, the subject matter of any one of Examples 1f to 7f or 9f can optionally include that the received light of the first wavelength region has a wavelength in an infrared spectrum wavelength region, and/or that the received light of the second wavelength region has a wavelength in the visible spectrum wavelength region.
  • In Example 11f, the subject matter of any one of Examples 1f to 10f can optionally include that the optical component further includes a mirror structure including a bottom mirror and a top mirror. The second semiconductor structure is arranged between the bottom mirror and the top mirror. The bottom mirror is arranged between the interconnect layer and the second semiconductor structure.
  • In Example 12f, the subject matter of Example 11f can optionally include that the mirror structure includes a Bragg mirror structure.
  • In Example 13f, the subject matter of any one of Examples 11f or 12f can optionally include that the mirror structure and the second vertical photo diode are configured so that the second vertical photo diode forms a resonant cavity photo diode.
  • In Example 14f, the subject matter of any one of Examples 1f to 13f can optionally include that the optical component further includes a reflector layer over the second semiconductor structure.
  • In Example 15f, the subject matter of Example 14f can optionally include that the reflector layer is configured as a thermal reflector layer configured to reflect radiation having a wavelength equal to or greater than approximately 2 μm, and/or that the reflector layer is configured as an infrared reflector layer.
  • In Example 16f, the subject matter of any one of Examples 1f to 15f can optionally include that the first photo diode is a pin photo diode, and that the second photo diode is a pin photo diode.
  • In Example 17f, the subject matter of any one of Examples 1f to 15f can optionally include that the first photo diode is an avalanche photo diode, and that the second photo diode is a pin photo diode.
  • In Example 18f, the subject matter of any one of Examples 1f to 15f can optionally include that the first photo diode is an avalanche photo diode, and that the second photo diode is a resonant cavity photo diode.
  • In Example 19f, the subject matter of any one of Examples 1f to 15f can optionally include that the first photo diode is a single-photon avalanche photo diode, and that the second photo diode is a resonant cavity photo diode.
  • In Example 20f, the subject matter of any one of Examples 1f to 15f can optionally include that the first photo diode is an avalanche photo diode, and that the second photo diode is an avalanche photo diode.
  • In Example 21f, the subject matter of any one of Examples 2f to 20f can optionally include that the optical component further includes an array of a plurality of photo diode stacks, each photo diode stack comprising a second photo diode vertically stacked over a first photo diode.
  • In Example 22f, the subject matter of any one of Examples 1f to 21f can optionally include that at least one photo diode stack of the plurality of photo diode stacks comprises at least one further second photo diode in the second semiconductor structure adjacent to the second photo diode, and that the first photo diode of the at least one photo diode stack of the plurality of photo diode stacks has a larger lateral extension than the second photo diode and the at least one further second photo diode of the at least one photo diode stack so that the second photo diode and the at least one further second photo diode are arranged laterally within the lateral extension of the first vertical photo diode.
  • In Example 23f, the subject matter of any one of Examples 1f to 22f can optionally include that the carrier is a semiconductor substrate.
  • Example 24f is a sensor for a LIDAR Sensor System. The sensor may include a plurality of optical components according to any one of Examples 1f to 23f. The plurality of optical components are monolithically integrated on the carrier as a common carrier.
  • In Example 25f, the subject matter of Example 24f can optionally include that the sensor is configured as a front-side illuminated sensor.
  • In Example 26f, the subject matter of Example 24f can optionally include that the sensor is configured as a back-side illuminated sensor.
  • In Example 27f, the subject matter of any one of Examples 24f to 26f can optionally include that the sensor further includes a color filter layer covering at least some optical components of the plurality of optical components.
  • In Example 28f, the subject matter of Example 27f can optionally include that the color filter layer includes a first color filter sublayer and a second color filter sublayer. The first color filter sublayer is configured to transmit received light having a wavelength within the first wavelength region and within the second wavelength region, and to block light outside the first wavelength region and outside the second wavelength region. The second color filter sublayer is configured to block received light having a wavelength outside the second wavelength region.
  • In Example 29f, the subject matter of Example 28f can optionally include that the first color filter sublayer and/or the second color filter sublayer includes a plurality of second sublayer pixels.
  • In Example 30f, the subject matter of Example 29f can optionally include that the first color filter sublayer and/or the second color filter sublayer includes a plurality of second sublayer pixels in accordance with a Bayer pattern.
  • In Example 31f, the subject matter of any one of Examples 27f to 30f can optionally include that the first color filter sublayer includes a plurality of first sublayer pixels having the same size as the second sublayer pixels. The first sublayer pixels and the second sublayer pixels coincide with each other.
• In Example 32f, the subject matter of any one of Examples 27f to 30f can optionally include that the first color filter sublayer comprises a plurality of first sublayer pixels having a size larger than the size of the second sublayer pixels. One first sublayer pixel laterally substantially overlaps with a plurality of the second sublayer pixels.
  • Example 33f is a LIDAR Sensor System, including a sensor according to any one of Examples 24f to 32f, and a sensor controller configured to control the sensor.
  • Example 34f is a method for a LIDAR Sensor System according to example 33f, wherein the LIDAR Sensor System is integrated into a LIDAR Sensor Device, and communicates with a second Sensor System and uses the object classification and/or the Probability Factors and/or Traffic Relevance factors measured by the second Sensor System for evaluation of current and future measurements and derived LIDAR Sensor Device control parameters as a function of these factors.
  • Pulsed laser sources may have various applications. An important field of application for pulsed laser sources may be time-of-flight LIDAR sensors or LIDAR systems. In a time-of-flight LIDAR system, a laser pulse may be emitted, the laser pulse may be reflected by a target object, and the reflected pulse may be received again by the LIDAR system. A distance to the object may be calculated by measuring the time that has elapsed between sending out the laser pulse and receiving the reflected pulse. Various types of lasers or laser sources may be used for a LIDAR application (e.g., in a LIDAR system). By way of example, a LIDAR system may include an edge-emitting diode laser, a vertical cavity surface-emitting laser (VCSEL), a fiber laser, or a solid state laser (e.g., a Nd:YAG diode pumped crystal laser, a disc laser, and the like). An edge-emitting diode laser or a VCSEL may be provided, for example, for low-cost applications.
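• The time-of-flight relation mentioned above can be stated as d = c·t/2, where t is the measured round-trip time and the factor 1/2 accounts for the pulse travelling to the target and back. A short worked example:

```python
# Worked example of the time-of-flight distance calculation described above.
C = 299_792_458.0  # speed of light in m/s

def distance_from_round_trip(t_round_trip_s: float) -> float:
    """One-way distance to the target; the factor 1/2 accounts for the round trip."""
    return C * t_round_trip_s / 2.0

# A round-trip time of about 667 ns corresponds to a target at roughly 100 m.
print(distance_from_round_trip(667e-9))  # ~100.0 m
```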
• A special driver circuit may be provided for a laser diode to operate in pulsed mode. A relatively high electrical current pulse may be sent through the laser diode within a short period of time (usually on the order of a few picoseconds up to a few microseconds) to achieve a short and intense optical laser pulse. The driver circuit may include a storage capacitor for supplying the electrical charge for the current pulse. The driver circuit may include a switching device (e.g., one or more transistors) for generating the current pulse. A direct connection between the laser source and a current source may provide an excessive current (illustratively, a current that is much too large). Silicon-based capacitors (e.g., trench capacitors or stacked capacitors) may be integrated into a hybrid or system-in-package for providing higher integration of laser drivers. The switching device for activating the current pulse through the laser diode may be a separate element from the capacitor.
• The storage capacitor and the switching device may be located at a certain distance away from the laser diode. This may be related to the dimensions of the various electrical components included in the capacitor and in the switching device. Illustratively, with discrete components a minimum distance on the order of millimeters may be present. The soldering of the discrete components on a printed circuit board (PCB) and the circuit lanes connecting the components on the PCB may prevent said minimum distance from being reduced further. This may increase the parasitic capacitances and inductances in the system.
  • Various embodiments may be based on integrating in a common substrate one or more charge storage capacitors, one or more switching devices (also referred to as switches), and one or more laser light emitters (e.g., one or more laser diodes). Illustratively, a system including a plurality of capacitors, a plurality of switching devices (e.g., a switching device for each capacitor), and one or more laser diodes integrated in or on a common substrate may be provided. The arrangement of the capacitors and the switching devices in close proximity to the one or more laser diodes (e.g., in the same substrate) may provide reduced parasitic inductances and capacitances (e.g., of an electrical path for a drive current flow). This may provide improved pulse characteristics (e.g., a reduced minimum pulse width, an increased maximum current at a certain pulse width, a higher degree of influence on the actual pulse shape, or a more uniform shape of the pulse).
  • In various embodiments, an optical package may be provided (also referred to as laser diode system). The optical package may include a substrate (e.g., a semiconductor substrate, such as a compound semiconductor material substrate). The substrate may include an array of a plurality of capacitors formed in the substrate. The substrate may include a plurality of switches. Each switch may be connected between at least one capacitor and at least one laser diode. The optical package may include the at least one laser diode mounted on the substrate. The optical package may include a processor (e.g., a laser driver control circuit or part of a laser driver control circuit) configured to control the plurality of switches to control a first current flow to charge the plurality of capacitors. The processor may be configured to control the plurality of switches to control a second current flow to drive the at least one laser diode with a current discharged from at least one capacitor (e.g., a current pulse through the laser diode). Illustratively, the processor may be configured to control the plurality of switches to control the second current flow to discharge the plurality of capacitors. The optical package may be provided, for example, for LIDAR applications. Illustratively, the optical package may be based on an array-distributed approach for the capacitors and the switches.
  • The first current flow may be the same as the second current flow. Illustratively, the current used for charging the capacitors may be the same as the current discharged from the capacitors. Alternatively, the first current flow may be different from the second current flow (for example, in case part of the charge stored in the capacitors has dissipated, as described in further detail below).
• The arrangement of the components of the optical package (e.g., the capacitors, the switches, and the at least one laser diode) may be similar to the arrangement of the components of a dynamic random-access memory (DRAM). By way of example, each switch may be assigned to exactly one respective capacitor. A switch-capacitor pair (e.g., in combination with the associated laser diode) may be similar to a memory cell of a DRAM array (e.g., a memory cell may include a storage capacitor, a transistor, and electrical connections).
• The plurality of capacitors and the plurality of switches may be understood as a driver circuit (illustratively, as part of a driver circuit, for example of a DRAM-like driver circuit) of the at least one laser diode. The laser diode may partially cover the driver circuit (e.g., at least a portion of the array of capacitors). Illustratively, the driver circuit may be arranged underneath the laser diode. The driver circuit may be electrically connected with the laser diode (e.g., by means of a method of 3D-integration of integrated circuits, such as bump bonding). The capacitors (e.g., DRAM-like capacitors) may have sufficient capacitance to provide enough current to the laser diode for high-power laser emission, illustratively for emission in time-of-flight LIDAR applications. In an exemplary arrangement, about 500000 capacitors (for example, each having a capacitance of about 100 fF) may be assigned to the laser diode (e.g., to a VCSEL, for example having a diameter of about 100 μm). The arrangement of the capacitors directly underneath the laser diode may provide small parasitic inductances and capacitances. This may simplify the generation of a short and powerful laser pulse (e.g., based on a current pulse of about 40 A in the exemplary arrangement; see the sketch below). By way of example, a connection (e.g., an electrical path) between a capacitor (and/or a switch) and the laser diode may have an inductance lower than 100 pH.
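• A back-of-the-envelope check of the exemplary arrangement above, as a sketch: the text names only the capacitor count (about 500000), the per-capacitor capacitance (about 100 fF), and the resulting current pulse (about 40 A); the 10 ns pulse width used here is an assumption taken from the pulse-duration discussion further below.

```python
# Back-of-the-envelope numbers for the exemplary arrangement described above.
n_caps = 500_000
c_each = 100e-15           # 100 fF per DRAM-like capacitor
c_total = n_caps * c_each  # 50 nF aggregate storage under the laser diode

i_pulse = 40.0             # target current pulse in A (from the text)
pulse_width = 10e-9        # assumed 10 ns pulse width

charge_needed = i_pulse * pulse_width  # 400 nC delivered per pulse
delta_v = charge_needed / c_total      # ~8 V of capacitor voltage swing
print(f"{c_total*1e9:.0f} nF, {charge_needed*1e9:.0f} nC, {delta_v:.1f} V")
```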
• The charge stored in the capacitors may dissipate in case the charge is not used, e.g. after a certain period of time. A regular re-charging (illustratively, a refreshment) of the capacitors may be provided (e.g., at predefined time intervals). The charge dissipation may reduce the risk of unintentional emission of a laser pulse. The optical package may be provided or may operate without a high-resistivity resistor configured to discharge the storage capacitors over time periods longer than the laser pulse repetition period.
• The driver circuit may be fabricated using DRAM manufacturing methods, e.g. CMOS technology methods. The capacitors may be deep trench capacitors or stacked capacitors (illustratively, at least one capacitor may be a deep trench capacitor and/or at least one capacitor may be a stacked capacitor). Each switch may include a transistor, e.g., a field effect transistor (e.g., a metal oxide semiconductor field effect transistor, such as a complementary metal oxide semiconductor field effect transistor). The driver circuit may be provided (and fabricated) in a cost-efficient manner (e.g., without expensive, high-performance high-speed power transistors such as GaN FETs).
  • The laser diode may include a III-V semiconductor material as active material (e.g. from the AlGaAs or GaN family of semiconductors). By way of example, the laser diode may include an edge-emitting laser diode. As another example, the laser diode may include a vertical cavity surface-emitting laser diode (e.g., the optical package may be a VCSEL package).
  • In various embodiments, the processor may be configured to individually control the plurality of switches to control the first current flow to charge the plurality of capacitors.
  • In various embodiments, the processor may be configured to control the amount of charge to be delivered to the laser diode. The processor may be configured to individually control the plurality of switches to control the second current flow to drive the at least one laser diode with a current discharged from at least one capacitor. Illustratively, the processor may be configured to individually control the switches such that a variable number of capacitors associated with the laser diode may be discharged (illustratively, at a specific time) to drive the laser diode (e.g., only one capacitor, or some capacitors, or all capacitors). This may provide control over the total current for the current pulse and over the intensity of the outgoing laser pulse. Variable laser output power may be provided, e.g. based on a precisely adjusted current waveform.
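• A minimal sketch of this variable-discharge control, assuming (purely for illustration) that each capacitor contributes an equal share of the 40 A full-scale pulse current from the exemplary arrangement above:

```python
# Sketch of amplitude control by the number of discharged capacitors.
# The per-capacitor contribution is an illustrative assumption (uniform sharing).
I_FULL = 40.0                # full-scale pulse current with all capacitors (A)
N_CAPS = 500_000             # capacitors assigned to the laser diode
I_PER_CAP = I_FULL / N_CAPS  # 80 uA contributed per discharged capacitor

def capacitors_for_current(target_current_a: float) -> int:
    """Number of capacitors to discharge for a desired pulse current."""
    return min(N_CAPS, round(target_current_a / I_PER_CAP))

print(capacitors_for_current(40.0))  # 500000 -> full amplitude
print(capacitors_for_current(10.0))  # 125000 -> quarter amplitude
```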
  • By way of example, the optical package may include one or more access lines (e.g., similar to a DRAM circuit) for selectively charging and/or selectively discharging the capacitors (e.g., for charging and/or discharging a subset or a sub-array of capacitors).
  • In various embodiments, the optical package may include a plurality of laser diodes, for example arranged as a one-dimensional array (e.g., a line array) or as a two-dimensional array (e.g., a matrix array). By way of example, the optical package may include a VCSEL array. Each laser diode may be associated with (e.g., driven by) a corresponding portion of the driver circuit (e.g., corresponding capacitors and switches, for example corresponding 500000 capacitors).
  • In various embodiments, the optical package may include one or more heat dissipation components, such as one or more through vias, e.g. through-silicon vias (TSV), one or more metal layers, and/or one or more heat sink devices. By way of example, the optical package may include one or more heat sink devices arranged underneath the substrate (for example, in direct physical contact with the substrate). As another example, the optical package may include one or more through-silicon vias arranged outside and/or inside an area of the substrate including the switches and the capacitors. The one or more through-silicon vias may provide an improved (e.g., greater) heat conduction from the laser diode to a bottom surface of the substrate (illustratively, the mounting surface below the capacitor/switch array). As a further example, the optical package may include a metal layer arranged between the capacitors and the switches. The metal layer may improve heat transfer towards the sides of the optical package. The metal layer may have an additional electrical functionality, such as electrically contacting some of the capacitors with the sides of the optical package. The heat dissipation components may be provided to dissipate the thermal load related to the high-density integration of the components of the optical package (e.g., laser diode and driver circuit).
  • FIG. 17A shows an optical package 15500 in a schematic side view in accordance with various embodiments.
  • The optical package 15500 may include a substrate 15502. The substrate 15502 may be a semiconductor substrate. By way of example, the substrate 15502 may include silicon or may essentially consist of silicon. As another example, the substrate 15502 may include or essentially consist of a compound semiconductor material (e.g., GaAs, InP, GaN, or the like).
  • The substrate 15502 may include a plurality of capacitors 15504. The capacitors 15504 may be formed in the substrate 15502, e.g. the capacitors 15504 may be monolithically integrated in the substrate 15502. Illustratively, a capacitor 15504 may be surrounded on three sides or more by the substrate 15502 (e.g., by the substrate material). The capacitors 15504 may be fabricated, for example, by means of DRAM-manufacturing processes.
  • By way of example, at least one capacitor 15504 (or more than one capacitor 15504, or all capacitors 15504) may be a deep trench capacitor. Illustratively, a trench (or a plurality of trenches) may be formed into the substrate 15502 (e.g., via etching). A dielectric material may be deposited in the trench. A plate may be formed surrounding a lower portion of the trench. The plate may be or may serve as first electrode for the deep trench capacitor. The plate may be, for example, a doped region (e.g., an n-doped region) in the substrate 15502. A metal (e.g., a p-type metal) may be deposited on top of the dielectric layer. The metal may be or may serve as second electrode for the deep trench capacitor.
  • As another example, at least one capacitor 15504 (or more than one capacitor 15504, or all capacitors 15504) may be a stacked capacitor. Illustratively, an active area (or a plurality of separate active areas) may be formed in the substrate. A gate dielectric layer may be deposited on top of the active area (e.g., on top of each active area). A sequence of conductive layers and dielectric layers may be deposited on top of the gate dielectric layer. Electrical contacts may be formed, for example, via a masking and etching process followed by metal deposition.
  • The capacitors 15504 may be arranged in an ordered fashion in the substrate 15502, e.g. the plurality of capacitors 15504 may form an array. By way of example, the capacitors 15504 may be arranged in one direction to form a one-dimensional capacitor array. As another example, the capacitors 15504 may be arranged in two directions to form a two-dimensional capacitor array. Illustratively, the capacitors 15504 of the array of capacitors 15504 may be arranged in rows and columns (e.g., a number N of rows and a number M of columns, wherein N may be equal to M or may be different from M). It is understood that the plurality of capacitors 15504 may include capacitors 15504 of the same type or of different types (e.g., one or more deep trench capacitors and one or more stacked capacitors), for example different types of capacitors 15504 in different portions of the array (e.g., in different sub-arrays).
  • The substrate 15502 may include a plurality of switches 15506. The switches 15506 may be formed in the substrate 15502, e.g. the switches 15506 may be monolithically integrated in the substrate 15502. Each switch 15506 may be connected between at least one capacitor 15504 and at least one laser diode 15508 (e.g., each switch 15506 may be electrically coupled with at least one capacitor 15504 and at least one laser diode 15508). Illustratively, a switch 15506 may be arranged along an electrical path connecting a capacitor 15504 with the laser diode 15508.
• A switch 15506 may be controlled (e.g., opened or closed) to control a current flow from the associated capacitor 15504 to the laser diode 15508. By way of example, each switch 15506 may include a transistor. At least one transistor (or more than one transistor, or all transistors) may be a field effect transistor, such as a metal oxide semiconductor field effect transistor (e.g., a complementary metal oxide semiconductor field effect transistor). It is understood that the plurality of switches 15506 may include switches 15506 of the same type or of different types.
  • A switch 15506 may be assigned to more than one capacitor 15504 (e.g., a switch 15506 may be controlled to control a current flow between more than one capacitor 15504 and the laser diode 15508). Alternatively, each switch 15506 may be assigned to exactly one respective capacitor 15504. Illustratively, the substrate 15502 may include a plurality of switch-capacitor pairs (e.g., similar to a plurality of DRAM cells). This may be illustrated by the circuit equivalents shown, for example, in FIG. 17B and FIG. 17C. A switch 15506 s may be controlled (e.g., via a control terminal 15506 g, such as a gate terminal) to allow or prevent a current flow from the assigned capacitor 15504 c to the laser diode 15508 d (or to an associated laser diode, as shown in FIG. 17C).
  • The switches 15506 may have a same or similar arrangement as the capacitors 15504 (e.g., the substrate 15502 may include an array of switches 15506, such as a one-dimensional array or a two-dimensional array).
• The optical package 15500 may include the at least one laser diode 15508. The laser diode 15508 may be mounted on the substrate 15502 (e.g., the laser diode 15508 may be arranged on a surface of the substrate 15502, such as a top surface, for example on an insulating layer of the substrate 15502). The laser diode 15508 may laterally cover at least a portion of the plurality of capacitors 15504. Illustratively, the laser diode 15508 may be mounted on the substrate 15502 in correspondence with (e.g., directly above) the plurality of capacitors 15504 or at least a portion of the plurality of capacitors 15504. This may provide a low inductance for an electrical path between a capacitor 15504 (or a switch 15506) and the laser diode 15508. The electrical path (e.g., between a capacitor 15504 and the laser diode 15508 and/or between a switch 15506 and the laser diode 15508) may have an inductance in a range between 70 pH and 200 pH, for example lower than 100 pH.
  • The laser diode 15508 may be a laser diode suitable for LIDAR applications (e.g., the optical package 15500 may be included in a LIDAR system, for example in the LIDAR Sensor System 10). By way of example, the laser diode 15508 may be or may include an edge-emitting laser diode. As another example, the laser diode 15508 may be or may include a vertical cavity surface-emitting laser diode.
  • The laser diode 15508 may be configured to receive current discharged from the capacitors 15504. By way of example, the substrate 15502 may include a plurality of electrical contacts (e.g., each electrical contact may be connected with a respective capacitor 15504, for example via the respective switch 15506). The laser diode 15508 may be mounted on the electrical contacts or may be electrically connected with the electrical contacts. By way of example, a first terminal of the laser diode 15508 may be electrically connected to the electrical contacts, for example via an electrically conductive common line 15510, as described in further detail below (e.g., the first terminal of the laser diode 15508 may be electrically coupled to the common line 15510). A second terminal of the laser diode 15508 may be electrically connected to a second potential, e.g., to ground.
• The laser diode 15508 may be associated with a number of capacitors 15504 for providing a predefined laser output power. By way of example, the laser diode 15508 may be configured to receive current discharged from a number of capacitors 15504 such that a predefined laser output power may be provided, for example above a predefined threshold. Stated in another fashion, the laser diode 15508 may be configured to receive current discharged from a number of capacitors 15504 such that a predefined current may flow in or through the laser diode 15508, for example a current above a current threshold. By way of example, the laser diode 15508 may be associated with a number of capacitors 15504 in the range from a few hundred capacitors 15504 to a few million capacitors 15504, for example in the range from about 100000 capacitors 15504 to about 1000000 capacitors 15504, for example in the range from about 400000 capacitors 15504 to about 600000 capacitors 15504, for example about 500000 capacitors 15504. Each capacitor 15504 may have a capacitance in the femtofarad range, for example in the range from about 50 fF to about 200 fF, for example about 100 fF. The capacitance of a capacitor 15504 may be selected or adjusted depending on the number of capacitors 15504 associated with the laser diode 15508 (illustratively, the capacitance may increase for a decreasing number of associated capacitors 15504 and may decrease for an increasing number of associated capacitors 15504; see the sketch after this paragraph). The capacitance of a capacitor 15504 may be selected or adjusted depending on the current flow to drive the laser diode 15508 (e.g., in combination with the number of associated capacitors 15504). At least one capacitor 15504, or some capacitors 15504, or all capacitors 15504 associated with the laser diode 15508 may be discharged (e.g., for each laser pulse emission). This may provide control over the emitted laser pulse, as described in further detail below.
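• The inverse scaling between capacitor count and per-capacitor capacitance mentioned above can be sketched as follows; the fixed aggregate-capacitance target is an illustrative assumption (500000 × 100 fF = 50 nF).

```python
# Sketch of the sizing trade-off: for a fixed aggregate storage capacitance
# (i.e., a fixed charge budget per pulse), the per-capacitor capacitance
# scales inversely with the number of capacitors assigned to the laser diode.
C_TOTAL = 50e-9  # assumed aggregate storage, e.g. 500000 x 100 fF

def capacitance_per_capacitor(n_capacitors: int) -> float:
    return C_TOTAL / n_capacitors

for n in (100_000, 500_000, 1_000_000):
    print(n, f"{capacitance_per_capacitor(n)*1e15:.0f} fF")  # 500, 100, 50 fF
```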
• The optical package 15500 may include more than one laser diode 15508 (e.g., a plurality of laser diodes), of the same type or of different types. Each laser diode may be associated with a corresponding plurality of capacitors 15504 (e.g., with a corresponding number of capacitors, for example in the range from about 400000 to about 600000, for example about 500000).
  • The laser diode 15508 may be configured to emit light (e.g., a laser pulse) in case the current discharged from the associated capacitors 15504 flows in the laser diode 15508. The laser diode may be configured to emit light in a predefined wavelength range, e.g. in the near infra-red or in the infra-red wavelength range (e.g., in the range from about 800 nm to about 1600 nm, for example at about 905 nm or at about 1550 nm). The duration of an emitted laser pulse may be dependent on a time constant of the capacitors 15504. By way of example, an emitted laser pulse may have a pulse duration (in other words, a pulse width) in the range from below 1 ns to several nanoseconds, for example in the range from about 5 ns to about 20 ns, for example about 10 ns.
  • The optical package 15500 may include an electrically conductive common line 15510 (e.g., a metal line). The common line 15510 may connect at least some capacitors 15504 of the plurality of capacitors 15504. Illustratively, the common line 15510 may connect (e.g., may be electrically connected with) the electrical contacts of at least some capacitors 15504 of the plurality of capacitors 15504. By way of example, the common line 15510 may connect all capacitors 15504 of the plurality of capacitors 15504. As another example, the optical package 15500 may include a plurality of common lines 15510, each connecting at least some capacitors 15504 of the plurality of capacitors 15504.
• The optical package 15500 may include a power source 15512 (e.g., a source configured to provide a current, for example a battery). The power source 15512 may be electrically connected to the common line 15510 (or to each common line). The power source 15512 may be configured to provide power to charge the plurality of capacitors 15504 (e.g., the capacitors 15504 connected to the common line 15510).
  • The optical package 15500 may include a processor 15514. By way of example, the processor 15514 may be mounted on the substrate 15502. As another example, the processor 15514 may be monolithically integrated in the substrate 15502. Alternatively, the processor 15514 may be mounted on the printed circuit board 15602 (see FIG. 18). The processor may be configured to control the plurality of switches 15506 (e.g., to open or close the plurality of switches). As an example, the optical package 15500 (or the substrate 15502) may include a plurality of access lines electrically connected with control terminals of the switches 15506 (e.g., similar to word-lines in a DRAM). The processor 15514 may be configured to control the switches 15506 by providing a control signal (e.g., a voltage, such as a control voltage, or an electric potential) to the plurality of access lines (or to some access lines, or to a single access line). The processor 15514 may be configured to individually control the switches 15506, e.g. by providing individual control signals to the access line or lines connected to the switch 15506 or the switches 15506 to be controlled. By way of example, the processor 15514 may include or may be configured to control a voltage supply circuit used for supplying control voltages to the access lines (not shown).
• The processor 15514 may be configured to control (e.g., to individually control) the plurality of switches 15506 to control a first current flow to charge the plurality of capacitors 15504. Illustratively, the processor 15514 may be configured to close the plurality of switches 15506 such that current may flow from the common line 15510 (illustratively, from the power source 15512) into the capacitors 15504.
• The processor 15514 may be configured to control (e.g., to individually control) the plurality of switches 15506 to control a second current flow to discharge the plurality of capacitors 15504. Illustratively, the processor 15514 may be configured to close the plurality of switches 15506 such that the capacitors 15504 may be discharged (e.g., current may flow from the capacitors 15504 to the laser diode 15508). The first current flow may be the same as the second current flow or different from the second current flow (e.g., the first current flow may be greater than the second current flow).
• The processor 15514 may be configured to control (e.g., to individually control) the plurality of switches 15506 to control the second current flow to drive the laser diode 15508 with current discharged from at least one capacitor 15504. The processor 15514 may be configured to adjust a current flow through the laser diode 15508 (e.g., to adjust a laser output power) by controlling (e.g., closing) the switches 15506 (e.g., by discharging a certain number of the capacitors 15504 associated with the laser diode 15508). Illustratively, the second current flow to drive the at least one laser diode 15508 may include a current proportional to the number of discharged capacitors 15504 (e.g., a current in the range from a few milliamperes up to about 100 A, for example in the range from about 10 mA to about 100 A, for example from about 1 A to about 50 A, for example about 40 A).
  • The processor 15514 may be configured to control an emitted light pulse. The processor may be configured to control or select the properties of an emitted light pulse (e.g., a shape, a duration, and an amplitude of an emitted light pulse) by controlling the arrangement and/or the number of capacitors 15504 to be discharged (e.g., of discharged capacitors 15504). By way of example, a shape of the emitted light pulse may be controlled by discharging capacitors 15504 arranged in different locations within an array of capacitors 15504. As another example, an amplitude of the emitted light pulse may be increased (or decreased) by discharging a higher (or lower) number of capacitors 15504.
  • The processor 15514 may be configured to control the plurality of switches 15506 to discharge at least some capacitors 15504 to drive the laser diode 15508 to emit a light pulse (e.g., a laser pulse) of a predefined pulse shape (in other words, a light pulse having a certain waveform). By way of example, the processor 15514 may be configured to encode data in the emitted light pulse (e.g., to select a shape associated with data to be transmitted). Illustratively, the emitted light pulse may be modulated (e.g., electrically modulated) such that data may be encoded in the light pulse. The processor 15514 may be configured to control the discharge of the capacitors 15504 to modulate an amplitude of the emitted light pulse, for example to include one or more hump-like structure elements in the emitted light pulse. The processor 15514 may have access to a memory storing data (e.g., to be transmitted) associated with a corresponding pulse shape (e.g., storing a codebook mapping data with a corresponding pulse shape).
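• A sketch of the codebook idea described above; the symbols and shapes below are invented for illustration only, with each data symbol mapped to a sequence of capacitor counts (one per time slot) that modulates the pulse amplitude over time.

```python
# Hypothetical codebook mapping 2-bit data symbols to pulse shapes, expressed
# as the number of capacitors discharged in each successive time slot.
CODEBOOK: dict[int, list[int]] = {
    0b00: [500_000, 500_000, 500_000],  # flat, full-amplitude pulse
    0b01: [500_000, 250_000, 500_000],  # dip in the middle (hump-like structure)
    0b10: [250_000, 500_000, 250_000],  # single central hump
    0b11: [500_000, 250_000, 125_000],  # falling-edge shape
}

def pulse_shape_for(symbol: int) -> list[int]:
    """Capacitor counts per time slot for the pulse encoding this symbol."""
    return CODEBOOK[symbol]

print(pulse_shape_for(0b01))
```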
  • The processor 15514 may be configured to control the plurality of switches 15506 to discharge at least some capacitors 15504 to drive the laser diode 15508 to emit a light pulse dependent on a light emission scheme. By way of example, the processor 15514 may be configured to control the discharge of the capacitors 15504 to drive the laser diode 15508 to emit a sequence of light pulses, for example structured as a frame (illustratively, the temporal arrangement of the emitted light pulses may encode or describe data).
  • The optical package 15500 may include one or more further components, not illustrated in FIG. 17A. By way of example, the optical package 15500 (e.g., the substrate 15502) may include one or more additional switches (e.g., as illustrated for example in the circuit equivalent in FIG. 17C). A first additional switch (or a plurality of first additional switches) may be controlled (e.g., opened or closed) to selectively provide a path from the power source 15512 to the capacitors 15504. A second additional switch (or a plurality of second additional switches) may be controlled to selectively provide a path from the laser diode 15508 to an electrical contact (described in further detail below).
• As illustrated in FIG. 17C, an exemplary operation of the optical package 15500 may be as follows. A first additional switch SWB may be opened to disconnect the power source from the capacitors 15504 c (illustratively, the power source may be coupled with the node B, e.g. the terminal B, in FIG. 17B and FIG. 17C). The node A, e.g. the terminal A, in FIG. 17B and FIG. 17C may indicate the substrate (e.g., may be coupled with the substrate). A second additional switch SWC may be opened to disconnect the laser diode 15508 d from the associated electrical contact (illustratively, the electrical contact may be coupled with the node C, e.g. the terminal C, in FIG. 17B and FIG. 17C). As an example, the second additional switch SWC may be opened to disconnect each laser diode 15508 d from the associated electrical contact. As another example, each laser diode 15508 d may have a respective additional switch and/or a respective electrical contact associated thereto. The capacitors 15504 c to be charged may be selected by providing a corresponding control signal to the respective access line (e.g., applying a control voltage to the control terminal of the associated switch 15506 s), illustratively coupled with the node D, e.g. the terminal D, in FIG. 17B and FIG. 17C. The first additional switch SWB may be closed to charge the selected capacitors. The access lines (e.g., control lines) may be deactivated after charging has been performed. The first additional switch SWB may be opened. The second additional switch SWC may be closed to provide an electrical path from the laser diode 15508 d (e.g., from each laser diode 15508 d) to the associated electrical contact. The capacitors 15504 c to be discharged may be selected by providing a corresponding control signal to the respective access line. The selected capacitors 15504 c may be discharged via the associated laser diode 15508 d. This sequence is sketched in code form below.
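• The sequence above, rendered as a sketch; the Switch and AccessLine classes are hypothetical stand-ins for the hardware signals (SWB, SWC, word-line voltages) and serve only to make the order of operations explicit. For simplicity, the same capacitor selection is used for charging and discharging.

```python
# Sketch of the exemplary charge/discharge sequence of FIG. 17C.
class Switch:
    def __init__(self) -> None:
        self.closed = False
    def close(self) -> None:  # allow current flow
        self.closed = True
    def open(self) -> None:   # block current flow
        self.closed = False

class AccessLine:
    def __init__(self) -> None:
        self.active = False
    def activate(self) -> None:    # select the attached switch 15506 s
        self.active = True
    def deactivate(self) -> None:
        self.active = False

def fire_pulse(swb: Switch, swc: Switch, lines: list[AccessLine]) -> None:
    swb.open(); swc.open()     # disconnect power source and laser diode path
    for line in lines:
        line.activate()        # select the capacitors to be charged
    swb.close()                # charge the selected capacitors
    for line in lines:
        line.deactivate()      # deactivate access lines after charging
    swb.open()                 # disconnect the power source again
    swc.close()                # provide a path through the laser diode
    for line in lines:
        line.activate()        # discharge through the laser diode

fire_pulse(Switch(), Switch(), [AccessLine() for _ in range(4)])
```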
  • FIG. 18 shows a top view of the optical package 15500 in a schematic representation in accordance with various embodiments.
  • The optical package 15500 may include a base support, e.g. a printed circuit board 15602. The substrate 15502 may be mounted on the printed circuit board 15602 (e.g., integrated in the printed circuit board 15602). The processor 15514 may be mounted on the printed circuit board 15602.
  • The printed circuit board 15602 may include a first electrical contact 15604. The first electrical contact 15604 may be connected (e.g., electrically coupled) to the common line 15510 of the substrate 15502 (in other words, to the common line 15510 of the optical package 15500), as shown, for example, in FIG. 17A. By way of example, the first electrical contact 15604 may be wire bonded to the common line 15510. Power to charge the capacitors 15504 may be provided via the first electrical contact 15604 of the printed circuit board 15602. By way of example, a power source may be mounted on the printed circuit board 15602 and electrically coupled with the first electrical contact 15604.
  • The printed circuit board 15602 may include a second electrical contact 15606. The second terminal 15608 of the laser diode 15508 may be electrically coupled to the second electrical contact 15606 of the printed circuit board 15602. By way of example, the second electrical contact 15606 of the printed circuit board 15602 may be wire bonded to the second terminal 15608 of the laser diode 15508. The second electrical contact 15606 may provide a path for the current to flow through the laser diode 15508.
  • It is understood that the arrangement shown in FIG. 18 is illustrated as an example, and other configurations of the optical package 15500 may be provided. By way of example, the optical package 15500 may include a plurality of laser diodes 15508, for example arranged in a one-dimensional array or in a two-dimensional array (e.g., in a matrix array) over the base support. The optical package 15500 may include a plurality of first electrical contacts 15604 and/or a plurality of second electrical contacts 15606. As an example, the optical package 15500 may include a first electrical contact 15604 and a second electrical contact 15606 associated with each laser diode 15508. As another example, the optical package 15500 may include a first electrical contact 15604 for each line in an array of laser diodes 15508.
  • FIG. 19A and FIG. 19B show a side view and a top view, respectively, of an optical package 15700 in a schematic representation in accordance with various embodiments. In FIG. 19B, components of the optical package 15700 that may be arranged at different levels are illustrated, e.g. at different vertical positions within the optical package 15700 or within the substrate, according to the representation in FIG. 19A.
• The optical package 15700 may be configured as the optical package 15500 described, for example, in relation to FIG. 17A to FIG. 18. Illustratively, the optical package 15700 may be an exemplary realization of the optical package 15500.
  • The optical package 15700 may include a substrate 15702. The optical package 15700 may include a plurality of storage capacitors 15704 formed (e.g., monolithically integrated) in the substrate 15702 (e.g., an array of storage capacitors 15704, for example a two-dimensional array). The optical package 15700 may include a plurality of switches 15706 formed (e.g., monolithically integrated) in the substrate, for example a plurality of transistors (e.g., field effect transistors). Each switch 15706 may be connected between at least one capacitor 15704 (e.g., exactly one capacitor 15704) and a laser diode 15708. The substrate 15702 may include a base 15702 s, e.g. including or essentially consisting of silicon. The substrate 15702 may include an insulating layer 15702 i, for example including an oxide, such as silicon oxide.
  • The laser diode 15708 may be a vertical cavity surface-emitting laser diode (e.g., emitting light from a top surface of the laser diode 15708), for example having a pyramid shape. The laser diode 15708 may be mounted on the substrate 15702 (e.g., on the insulating layer 15702 i). The laser diode 15708 may include an active layer 15708 a (illustratively, a layer of active material).
  • The laser diode 15708 may include one or more optical structures 15708 o, arranged above and/or underneath the active layer 15708 a. By way of example, the laser diode 15708 may include a first optical structure 15708 o arranged on top of the active layer 15708 a (e.g., in direct physical contact with the active layer 15708 a). The first optical structure 15708 o may be a top Bragg mirror (e.g., a sequence of alternating thin layers of dielectric materials having high and low refractive index). The laser diode 15708 may include a second optical structure 15708 o arranged underneath the active layer 15708 a (e.g., in direct physical contact with the active layer 15708 a). The second optical structure 15708 o may be a bottom Bragg mirror.
  • The optical package 15700 may include a printed circuit board 15710. The substrate 15702 may be mounted on the printed circuit board 15710. The laser diode 15708 may be electrically connected to the printed circuit board 15710 (e.g., to an electrical contact of the printed circuit board 15710), for example via one or more bond wires 15712. By way of example, the laser diode 15708 may include a (e.g., second) terminal 15714 arranged on top of the laser diode 15708 (e.g., a top contact). The terminal 15714 may be a ring-like mesa structure (e.g., to allow emission of the laser light), as illustrated, for example, in FIG. 19B. The one or more bond wires 15712 may be connected to the terminal 15714.
• The laser diode 15708 may include another (e.g., first) terminal 15716 arranged at a bottom surface of the laser diode 15708 (e.g., a bottom contact). The terminal 15716 may be electrically coupled with a connector structure 15718 (e.g., a connector structure 15718 formed in the substrate 15702). The connector structure 15718 may provide electrical coupling (e.g., an electrical path) with the switches 15706 and the capacitors 15704 (e.g., between the terminal 15716 and the switches 15706 and the capacitors 15704). By way of example, the connector structure 15718 may include a plurality of electrical contacts 15718 c, e.g. a grid structure with individual pin-like elements. Each electrical contact 15718 c may be connected with a respective capacitor 15704, for example via the respective switch 15706. Illustratively, the connector structure 15718 may be selectively coupled to the plurality of storage capacitors 15704 (e.g., pin-like storage capacitors) by the plurality of switching devices 15706. The connector structure 15718 may be an example for the common line 15510.
• The connector structure 15718 may be used to charge the plurality of capacitors 15704. By way of example, the connector structure 15718 may be electrically coupled with a power source. As another example, the connector structure 15718 may be electrically coupled with the printed circuit board 15710, for example via one or more bond wires 15720. The connector structure 15718 may be electrically coupled with an electrical terminal of the printed circuit board 15710. A power source may be electrically coupled with the electrical terminal of the printed circuit board 15710. Illustratively, the connector structure 15718 may have a comb-like arrangement including a plurality of connector lines (as shown in FIG. 19B). Each connector line may optionally include or be associated with a respective switch (e.g., a field effect transistor) for providing additional control over the selection of the capacitors to be charged (e.g., in addition to the selection by means of the access lines 15722).
  • The substrate 15702 may include a plurality of access lines 15722 (illustratively, a plurality of word-lines). Each access line may be electrically coupled with one or more switches 15706 (e.g., with respective control terminals, e.g. gate terminals, of one or more switches 15706). The access lines 15722 may be used to control (e.g., open or close) the one or more switches 15706 coupled thereto.
• The optical package 15700 may include a processor configured as the processor 15514 described above, for example in relation to FIG. 17A to FIG. 18. The processor may be configured to control the switches 15706 by supplying a control signal (e.g., a plurality of control signals) via the plurality of access lines 15722.
  • The optical package 15700 (e.g., the substrate 15702) may include one or more through-vias 15724 (e.g., through-silicon vias), as an example of heat dissipation component. By way of example, a through-via 15724 may extend through the substrate in the vertical direction (e.g., through the base 15702 s and through the insulating layer 15702 i). The through-via 15724 may be filled with a heat dissipation or heat conducting material, such as a metal (e.g., deposited or grown in the through-via 15724). The through-via 15724 may be arranged outside the area in which the plurality of capacitors 15704 and/or the plurality of switches 15706 are formed in the substrate 15702.
  • In the following, various aspects of this disclosure will be illustrated:
  • Example 1ad is an optical package. The optical package may include a substrate. The substrate may include an array of a plurality of capacitors formed in the substrate. The substrate may include a plurality of switches formed in the substrate. Each switch may be connected between at least one laser diode and at least one capacitor of the plurality of capacitors. The optical package may include the at least one laser diode mounted on the substrate. The optical package may include a processor configured to control the plurality of switches to control a first current flow to charge the plurality of capacitors. The processor may be configured to control the plurality of switches to control a second current flow to drive the at least one laser diode with current discharged from at least one capacitor of the plurality of capacitors.
  • In example 2ad, the subject-matter of example 1ad can optionally include that the plurality of capacitors and the plurality of switches are monolithically integrated in the substrate.
  • In example 3ad, the subject-matter of any one of examples 1ad or 2ad can optionally include that each switch of the plurality of switches is assigned to exactly one respective capacitor of the plurality of capacitors.
  • In example 4ad, the subject-matter of example 3ad can optionally include that the processor is configured to individually control the plurality of switches to control the first current flow to charge the plurality of capacitors. The processor may be configured to individually control the plurality of switches to control the second current flow to drive the at least one laser diode with current discharged from at least one capacitor of the plurality of capacitors.
  • In example 5ad, the subject-matter of any one of examples 1ad to 4ad can optionally include that each switch of the plurality of switches includes a transistor.
  • In example 6ad, the subject-matter of example 5ad can optionally include that at least one transistor of the plurality of transistors is a field effect transistor.
  • In example 7ad, the subject-matter of example 6ad can optionally include that at least one field effect transistor of the plurality of transistors is a metal oxide semiconductor field effect transistor.
  • In example 8ad, the subject-matter of example 7ad can optionally include that at least one metal oxide semiconductor field effect transistor of the plurality of transistors is a complementary metal oxide semiconductor field effect transistor.
• In example 9ad, the subject-matter of any one of examples 1ad to 8ad can optionally include that the array of capacitors includes a number of capacitors in the range from about 400000 capacitors to about 600000 capacitors associated with the at least one laser diode.
  • In example 10ad, the subject-matter of any one of examples 1ad to 9ad can optionally include that at least one capacitor of the array of capacitors has a capacitance in the range from about 50 fF to about 200 fF.
  • In example 11ad, the subject-matter of any one of examples 1ad to 10ad can optionally include that the current flow to drive the at least one laser diode includes a current in the range from about 10 mA to about 100 A.
• In example 12ad, the subject-matter of any one of examples 1ad to 11ad can optionally include that an electrical path between a capacitor and the at least one laser diode has an inductance lower than 100 pH.
  • In example 13ad, the subject-matter of any one of examples 1ad to 12ad can optionally include that at least one capacitor of the array of capacitors is a deep trench capacitor.
  • In example 14ad, the subject-matter of any one of examples 1ad to 13ad can optionally include that at least one capacitor of the array of capacitors is a stacked capacitor.
  • In example 15ad, the subject-matter of any one of examples 1ad to 14ad can optionally include that the capacitors of the array of capacitors are arranged in rows and columns.
  • In example 16ad, the subject-matter of any one of examples 1ad to 15ad can optionally include an electrically conductive common line connecting at least some capacitors of the plurality of capacitors.
  • In example 17ad, the subject-matter of example 16ad can optionally include a power source electrically connected to the common line and configured to provide the power to charge the plurality of capacitors.
  • In example 18ad, the subject-matter of any one of examples 1ad to 17ad can optionally include a printed circuit board. The substrate may be mounted on the printed circuit board.
  • In example 19ad, the subject-matter of any one of examples 16ad or 17ad can optionally include a printed circuit board. The substrate may be mounted on the printed circuit board. The printed circuit board may include an electrical contact electrically coupled to the common line of the substrate.
  • In example 20ad, the subject-matter of example 19ad can optionally include that the electrical contact of the printed circuit board is wire bonded to the common line of the substrate.
  • In example 21ad, the subject-matter of any one of examples 16ad to 20ad can optionally include a printed circuit board. The substrate may be mounted on the printed circuit board. A first terminal of the at least one laser diode may be electrically coupled to the common line. A second terminal of the at least one laser diode may be electrically coupled to an electrical contact of the printed circuit board.
  • In example 22ad, the subject-matter of example 21ad can optionally include that the electrical contact of the printed circuit board is wire bonded to the second terminal of the at least one laser diode.
  • In example 23ad, the subject-matter of any one of examples 1ad to 22ad can optionally include that the substrate includes or essentially consists of silicon.
  • In example 24ad, the subject-matter of any one of examples 1ad to 23ad can optionally include that the at least one laser diode laterally covers at least a portion of the plurality of capacitors.
  • In example 25ad, the subject-matter of any one of examples 1ad to 24ad can optionally include that the at least one laser diode includes an edge emitting laser diode.
  • In example 26ad, the subject-matter of any one of examples 1ad to 24ad can optionally include that the at least one laser diode includes a vertical cavity surface-emitting laser diode.
  • In example 27ad, the subject-matter of any one of examples 1ad to 26ad can optionally include that the processor is monolithically integrated in the substrate.
  • In example 28ad, the subject-matter of any one of examples 19ad to 26ad can optionally include that the processor is mounted on the printed circuit board.
  • In example 29ad, the subject-matter of any one of examples 19ad to 28ad can optionally include that the processor is configured to control the plurality of switches to discharge at least some capacitors of the plurality of capacitors to drive the at least one laser diode to emit a laser pulse of a predefined pulse shape.
  • In example 30ad, the subject-matter of example 29ad can optionally include that the laser pulse has a pulse duration of about 10 ns.
  • In example 31ad, the subject-matter of any one of examples 29ad or 30ad can optionally include that the processor is configured to control the plurality of switches to discharge at least some capacitors of the plurality of capacitors to drive the at least one laser diode to emit a laser pulse dependent on a light emission scheme.
  • In example 32ad, the subject-matter of any one of examples 29ad to 31ad can optionally include that the processor is configured to control the plurality of switches to discharge at least some capacitors of the plurality of capacitors to drive the at least one laser diode to emit a laser pulse of a predefined pulse shape.
  • Example 33ad is a LIDAR Sensor System including an optical package of any one of examples 1ad to 32ad.
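  • The charge-and-discharge control described in examples 1ad to 33ad can be pictured with a short behavioral sketch. The following Python fragment is a minimal model under assumed values, not the claimed hardware: the class and function names (CapacitorBank, charge_all, fire_pulse) are hypothetical, and the numbers are merely taken from the ranges given above (about 100 fF per capacitor, about 500000 capacitors per laser diode, a pulse duration of about 10 ns).

    # Hypothetical behavioral model: a processor first closes the switches to
    # charge the capacitor array (first current flow), then discharges a chosen
    # subset into the laser diode (second current flow) to form a pulse.
    class CapacitorBank:
        def __init__(self, n_caps=500_000, capacitance_f=100e-15, v_charge=3.3):
            self.capacitance_f = capacitance_f    # ~100 fF per capacitor
            self.v_charge = v_charge              # assumed charging voltage
            self.charged = [False] * n_caps       # per-capacitor switch state

        def charge_all(self):
            # First current flow: connect every capacitor to the supply rail.
            self.charged = [True] * len(self.charged)

        def fire_pulse(self, n_to_discharge):
            # Second current flow: discharge n capacitors into the laser diode
            # and return the total delivered charge Q = n * C * V.
            charge_c = 0.0
            for i, is_charged in enumerate(self.charged):
                if n_to_discharge == 0:
                    break
                if is_charged:
                    charge_c += self.capacitance_f * self.v_charge
                    self.charged[i] = False
                    n_to_discharge -= 1
            return charge_c

    bank = CapacitorBank()
    bank.charge_all()
    q = bank.fire_pulse(300_000)          # discharge a subset of the array
    i_mean = q / 10e-9                    # mean current over a ~10 ns pulse
    print(f"{q:.2e} C delivered -> {i_mean:.1f} A mean current")

  • With these assumed numbers, discharging 300000 capacitors yields roughly 0.1 µC, i.e. a mean current of about 10 A over a 10 ns pulse, which lies inside the 10 mA to 100 A range of example 11ad.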
  • CONCLUSION
  • While various embodiments have been described and illustrated herein, those of ordinary skill in the art will readily envision a variety of other means and/or structures for performing the function and/or obtaining the results and/or one or more of the advantages described herein, and each of such variations and/or modifications is deemed to be within the scope of the embodiments described herein. More generally, those skilled in the art will readily appreciate that all parameters, dimensions, materials, and configurations described herein are meant to be exemplary and that the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the teachings are used. Those skilled in the art will recognize, or be able to ascertain using no more than routine experimentation, many equivalents to the specific advantageous embodiments described herein. It is, therefore, to be understood that the foregoing embodiments are presented by way of example only and that, within the scope of the appended claims and equivalents thereto, embodiments may be practiced otherwise than as specifically described and claimed. Embodiments of the present disclosure are directed to each individual feature, system, article, material, kit, and/or method described herein. In addition, any combination of two or more such features, systems, articles, materials, kits, and/or methods, if such features, systems, articles, materials, kits, and/or methods are not mutually inconsistent, is included within the scope of the present disclosure.
  • The above-described embodiments can be implemented in any of numerous ways. The embodiments may be combined in any order and any combination with other embodiments. For example, the embodiments may be implemented using hardware, software or a combination thereof. When implemented in software, the software code can be executed on any suitable processor or collection of processors, whether provided in a single computer or distributed among multiple computers.
  • Further, it should be appreciated that a computer may be embodied in any of a number of forms, such as a rack-mounted computer, a desktop computer, a laptop computer, or a tablet computer. Additionally, a computer may be embedded in a device (e.g. LIDAR Sensor Device) not generally regarded as a computer but with suitable processing capabilities, including a Personal Digital Assistant (PDA), a smart phone or any other suitable portable or fixed electronic device.
  • Also, a computer may have one or more input and output devices. These devices can be used, among other things, to present a user interface. Examples of output devices that can be used to provide a user interface include printers or display screens for visual presentation of output and speakers or other sound generating devices for audible presentation of output. Examples of input devices that can be used for a user interface include keyboards, and pointing devices, such as mice, touch pads, and digitizing tablets. As another example, a computer may receive input information through speech recognition or in other audible format.
  • Such computers may be interconnected by one or more networks in any suitable form, including a local area network or a wide area network, such as an enterprise network, and intelligent network (IN) or the Internet. Such networks may be based on any suitable technology and may operate according to any suitable protocol and may include wireless networks, wired networks or fiber optic networks.
  • The various methods or processes outlined herein may be coded as software that is executable on one or more processors that employ any one of a variety of operating systems or platforms. Additionally, such software may be written using any of a number of suitable programming languages and/or programming or scripting tools, and also may be compiled as executable machine language code or intermediate code that is executed on a framework or virtual machine.
  • In this respect, various disclosed concepts may be embodied as a computer readable storage medium (or multiple computer readable storage media) (e.g., a computer memory, one or more floppy discs, compact discs, optical discs, magnetic tapes, flash memories, circuit configurations in Field Programmable Gate Arrays or other semiconductor devices, or other non-transitory medium or tangible computer storage medium) encoded with one or more programs that, when executed on one or more computers or other processors, perform methods that implement the various embodiments of the disclosure discussed above. The computer readable medium or media can be transportable, such that the program or programs stored thereon can be loaded onto one or more different computers or other processors to implement various aspects of the present disclosure as discussed above.
  • The terms “program” or “software” are used herein in a generic sense to refer to any type of computer code or set of computer-executable instructions that can be employed to program a computer or other processor to implement various aspects of embodiments as discussed above. Additionally, it should be appreciated that according to one aspect, one or more computer programs that when executed perform methods of the present disclosure need not reside on a single computer or processor, but may be distributed in a modular fashion amongst a number of different computers or processors to implement various aspects of the present disclosure.
  • Computer-executable instructions may be in many forms, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Typically the functionality of the program modules may be combined or distributed as desired in various embodiments.
  • Also, data structures may be stored in computer-readable media in any suitable form. For simplicity of illustration, data structures may be shown to have fields that are related through location in the data structure. Such relationships may likewise be achieved by assigning storage for the fields with locations in a computer-readable medium that convey relationship between the fields. However, any suitable mechanism may be used to establish a relationship between information in fields of a data structure, including through the use of pointers, tags or other mechanisms that establish relationship between data elements.
  • Also, various advantageous concepts may be embodied as one or more methods, of which an example has been provided. The acts performed as part of the method may be ordered in any suitable way. Accordingly, embodiments may be constructed in which acts are performed in an order different than illustrated, which may include performing some acts simultaneously, even though shown as sequential acts in illustrative embodiments.
  • All definitions, as defined and used herein, should be understood to control over dictionary definitions, definitions in documents incorporated by reference, and/or ordinary meanings of the defined terms.
  • The indefinite articles “a” and “an,” as used herein in the specification and in the claims, unless clearly indicated to the contrary, should be understood to mean “at least one.”
  • The phrase “and/or,” as used herein in the specification and in the claims, should be understood to mean “either or both” of the elements so conjoined, i.e., elements that are conjunctively present in some cases and disjunctively present in other cases. Multiple elements listed with “and/or” should be construed in the same fashion, i.e., “one or more” of the elements so conjoined. Other elements may optionally be present other than the elements specifically identified by the “and/or” clause, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, a reference to “A and/or B”, when used in conjunction with open-ended language such as “comprising” can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.
  • As used herein in the specification and in the claims, “or” should be understood to have the same meaning as “and/or” as defined above. For example, when separating items in a list, “or” or “and/or” shall be interpreted as being inclusive, i.e., the inclusion of at least one, but also including more than one, of a number or list of elements, and, optionally, additional unlisted items. Only terms clearly indicated to the contrary, such as “only one of” or “exactly one of,” or, when used in the claims, “consisting of,” will refer to the inclusion of exactly one element of a number or list of elements. In general, the term “or” as used herein shall only be interpreted as indicating exclusive alternatives (i.e. “one or the other but not both”) when preceded by terms of exclusivity, such as “either,” “one of,” “only one of,” or “exactly one of.” “Consisting essentially of,” when used in the claims, shall have its ordinary meaning as used in the field of patent law.
  • As used herein in the specification and in the claims, the phrase “at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements. This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase “at least one” refers, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, “at least one of A and B” (or, equivalently, “at least one of A or B,” or, equivalently “at least one of A and/or B”) can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.
  • In the claims, as well as in the disclosure above, all transitional phrases such as “comprising,” “including,” “carrying,” “having,” “containing,” “involving,” “holding,” “composed of,” and the like are to be understood to be open-ended, i.e., to mean including but not limited to. Only the transitional phrases “consisting of” and “consisting essentially of” shall be closed or semi-closed transitional phrases, respectively, as set forth in the eighth edition as revised in July 2010 of the United States Patent Office Manual of Patent Examining Procedures, Section 2111.03.
  • For the purpose of this disclosure and the claims that follow, the term “connect” has been used to describe how various elements interface or “couple”. Such described interfacing or coupling of elements may be either direct or indirect. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as preferred forms of implementing the claims.
  • In the context of this description, the terms “connected” and “coupled” are used to describe both a direct and an indirect connection and a direct or indirect coupling.
  • Appendix: Explanations and Glossary
  • This section provides explanations and descriptions of certain aspects and meanings of the referenced technical terms; these explanations are not limiting.
  • Actuators
  • Actuators are components or devices which are able to convert energy (e.g. electric, magnetic, photoelectric, hydraulic, pneumatic) into a mechanical movement (e.g. translation, rotation, oscillation, vibration, shock, pull, push, etc.). Actuators may be used for example in order to move and/or change and/or modify components such as mechanical elements, optical elements, electronic elements, detector elements, etc. and/or materials or material components. Actuators can also be suited to emit, for example, ultrasound waves, and so on.
  • ASIC
  • An Application-Specific Integrated Circuit (ASIC) is an integrated circuit device, which was designed to perform particular, customized functions. As building blocks, ASICs may comprise a large number of logic gates. In addition, ASICs may comprise further building blocks such as microprocessors and memory blocks, forming so-called Systems-on-Chip (SOC).
  • Automated Guided Vehicle (AGV)
  • An automated guided vehicle or automatic guided vehicle (AGV) is a robot that follows markers or wires in the floor, or uses vision, magnets, or lasers for navigation. An AGV can be equipped to operate autonomously.
  • Autonomous Vehicle (AV)
  • There are numerous terms currently in use to describe vehicles with a certain extent of automatic driving capability. Such vehicles are capable of performing, without direct human interaction, at least some of the activities which previously could be performed only by a human driver. According to SAE International (Society of Automotive Engineers), six levels of automation can be defined (SAE J3016), starting with Level 0 (where automated systems issue warnings or may momentarily intervene) up to Level 5 (where no human interaction is required at all).
  • An increasing number of modern vehicles are already equipped with so-called Advanced Driver-Assistance Systems (ADAS), which are configured to help the driver during the driving process or to intervene in specific driving situations. Such systems may comprise basic features such as anti-lock braking systems (ABS) and electronic stability control (ESC), which are usually considered as Level 0 features, as well as more complex features, such as lane departure warning, lane keeping assistance, lane change support, adaptive cruise control, collision avoidance, emergency brake assistance and adaptive high-beam systems (ADB), etc., which may be considered as Level 1 features. Level 2, 3 and 4 features can be denoted as partial automation, conditional automation and high automation, respectively. Level 5, finally, can be denoted as full automation.
  • Alternative and widely used terms for Level 5 are driverless cars, self-driving cars or robot cars. In the case of industrial applications, the term Automated Guided Vehicle (AGV) is widely used to denote vehicles with partial or full automation for specific tasks, such as material transportation in a manufacturing facility or a warehouse. Furthermore, Unmanned Aerial Vehicles (UAV) or drones may also exhibit different levels of automation. Unless otherwise stated, the term “Autonomous Vehicle (AV)” is considered to comprise, in the context of the present patent application, all the above-mentioned embodiments of vehicles with partial, conditional, high or full automation.
  • Beacon
  • A Beacon is a device that emits signal data for communication purposes, for example based on Bluetooth or on protocols based on DIIA, THREAD, ZigBee or MDSIG technology. A Beacon can establish a Wireless Local Area Network.
  • Beam Steering
  • Generally speaking, the light beam emitted by the light source may be transmitted into the Field of Illumination (FOI) either in a scanning or a non-scanning manner. In case of a non-scanning LIDAR (e.g. Flash LIDAR), the light of the light source is transmitted into the complete FOI in one single instance, i.e. the light beam is broadened (e.g. by a diffusing optical element) in such a way that the whole FOI is illuminated at once.
  • Alternatively, in case of a scanning illumination, the light beam is directed over the FOI either in a 1-dimensional manner (e.g. by moving a vertical light stripe in a horizontal direction, or vice versa) or in a 2-dimensional manner (e.g. by moving a light spot along a zigzag pattern across the FOI). To perform such beam steering operations both mechanical and non-mechanical solutions are applicable.
  • Mechanical solutions may comprise rotating mirrors, oscillating mirrors, in particular oscillating micro-electromechanical mirrors (MEMS), Digital Mirror Devices (DMD), Galvo-Scanner, etc. The moving mirrors may have plane surface areas (e.g. with circular, oval, rectangular or polygonal shape) and may be tilted or swiveled around one or more axes. Non-mechanical solutions may comprise so called optical phased arrays (OPA) in which the phases of light waves are varied by dynamically controlling the optical properties of an adjustable optical element (e.g. phase modulators, phase shifters, Liquid Crystal Elements (LCD), etc.).
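  • As an illustration of the 2-dimensional scanning described above, the following sketch generates a zigzag sequence of deflection angles over a rectangular FOI. It is a schematic example only; the function name and the angular ranges are assumptions, and a real beam steering unit would be driven through its vendor-specific controller interface.

    def zigzag_scan(h_deg=60.0, v_deg=20.0, h_steps=120, v_steps=40):
        # Yield (horizontal, vertical) deflection angles in degrees that move
        # a light spot across the FOI row by row, reversing on odd rows.
        for row in range(v_steps):
            v = -v_deg / 2 + row * v_deg / (v_steps - 1)
            cols = range(h_steps)
            if row % 2 == 1:
                cols = reversed(cols)
            for col in cols:
                h = -h_deg / 2 + col * h_deg / (h_steps - 1)
                yield h, v

    # First three mirror positions of the scan:
    angles = zigzag_scan()
    print(next(angles), next(angles), next(angles))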
  • Communication Interface
  • Communication interface describes all sorts of interfaces or gateways between two devices, which can be used to exchange signals. Signals in this context may comprise simple voltage or current levels, as well as complex information based on the above described coding or modulation techniques.
  • In case of a LIDAR Sensor System, communication interfaces may be used to transfer information (signals, data, etc.) between different components of the LIDAR Sensor System. Furthermore, communication interfaces may be used to transfer information (signals, data, etc.) between the LIDAR Sensor System or its components or modules and other devices provided in the vehicle, in particular other sensor systems (LIDAR, RADAR, Ultrasonic, Cameras) in order to allow sensor fusion functionalities.
  • Communication Unit (CU)
  • A communication unit is an electronic device, which is configured to transmit and/or receive signals to or from other communication units. Communication units may exchange information in a one-directional, bi-directional or multi-directional manner. Communication signals may be exchanged via electromagnetic waves (including radio or microwave frequencies), light waves (including UV, VIS, IR), acoustic waves (including ultrasonic frequencies). The information may be exchanged using all sorts of coding or modulation techniques e.g. pulse width modulation, pulse code modulation, amplitude modulation, frequency modulation, etc.
  • The information may be transmitted in an encrypted or non-encrypted manner and distributed in a trusted or distrusted network (for example a Blockchain ledger). As an example, vehicles and elements of road infrastructure may comprise CUs in order to exchange information with each other via so-called C2C (Car-to-Car) or C2X (Car-to-Infrastructure or Car-to-Environment). Furthermore, such communication units may be part of Internet-of-Things (IoT) Systems, i.e. a network of devices, sensors, vehicles, and other appliances, which connect, interact and exchange data with each other.
  • Component
  • Component describes the elements, in particular the key elements, which make up the LIDAR System. Such key elements may comprise a light source unit, a beam steering unit, a photodetector unit, ASIC units, processor units, timing clocks, generators of discrete random or stochastic values, and data storage units. Further, components may comprise optical elements related to the light source, optical elements related to the detector unit, electronic devices related to the light source, electronic devices related to the beam steering unit, electronic devices related to the detector unit and electronic devices related to ASIC, processor and data storage and data executing devices. Components of a LIDAR Sensor System may further include a high-precision clock, a Global-Positioning-System (GPS) and an inertial navigation measurement system (IMU).
  • Computer Program Device
  • A Computer program device is a device or product, which is able to execute instructions stored in a memory block of the device or which is able to execute instructions that have been transmitted to the device via an input interface. Such computer program products or devices comprise any kind of computer-based system or software-based system, including processors, ASICs or any other electronic device which is capable to execute programmed instructions. Computer program devices may be configured to perform methods, procedures, processes or control activities related to LIDAR Sensor Systems.
  • Control and Communication System
  • A Control and Communication System receives input from the LIDAR Data Processing System and communicates with the LIDAR Sensing System, LIDAR Sensor Device and vehicle control and sensing system as well as with other objects/vehicles.
  • Controlled LIDAR Sensor System
  • Controlled LIDAR Sensor System comprises one or many controlled “First LIDAR Sensor Systems”, and/or one or many controlled “Second LIDAR Sensor Systems”, and/or one or many controlled LIDAR Data Processing Systems, and/or one or many controlled LIDAR Sensor Devices, and/or one or many controlled Control and Communication Systems.
  • Controlled means local or remote checking and fault detection and repair of either of the above-mentioned LIDAR Sensor System components. Controlled can also mean the control of a LIDAR Sensor Device, including a vehicle.
  • Controlled can also mean the inclusion of industry standards, bio feedbacks, safety regulations, autonomous driving levels (e.g. SAE Levels) and ethical and legal frameworks.
  • Controlled can also mean the control of more than one LIDAR Sensor System, or more than one LIDAR Sensor Device, or more than one vehicle and/or other objects.
  • A Controlled LIDAR Sensor System may include the use of artificial intelligence systems, data encryption and decryption, as well as Blockchain technologies using digital records that store a list of transactions (called “blocks”) backed by a cryptographic value. Each block contains a link to the previous block, a timestamp, and data about the transactions that it represents. Blocks are immutable, meaning that they cannot easily be modified once they have been created, and the data of a blockchain are stored non-locally, i.e. on different computers (a minimal hash-chain sketch follows this entry).
  • A Controlled LIDAR Sensor System may be configured to perform sensor fusion functions, such as collecting, evaluating and consolidating data from different sensor types (e.g. LIDAR, RADAR, Ultrasonic, Cameras). Thus, the Controlled LIDAR Sensor System comprises feedback- and control-loops, i.e. the exchange of signals, data and information between different components, modules and systems which are all employed in order to derive a consistent understanding of the surroundings of a sensor system, e.g. of a sensor system onboard a vehicle.
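  • The block linkage mentioned above can be pictured with a generic hash-chain sketch in Python; this is not the ledger implementation of any particular embodiment, and all names and the example payloads are illustrative:

    import hashlib, json, time

    def make_block(transactions, prev_hash):
        # A block stores its transaction data, a timestamp and the hash of the
        # previous block; its own hash is computed over all of these fields.
        block = {"transactions": transactions,
                 "timestamp": time.time(),
                 "prev_hash": prev_hash}
        payload = json.dumps(block, sort_keys=True).encode()
        block["hash"] = hashlib.sha256(payload).hexdigest()
        return block

    genesis = make_block(["genesis"], prev_hash="0" * 64)
    block_1 = make_block(["sensor data record"], prev_hash=genesis["hash"])
    # Modifying genesis["transactions"] would change its hash and thereby
    # invalidate the prev_hash link stored in block_1.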
  • Data Analysis
  • Various components (e.g. detectors, ASICs) and processes (such as signal/noise measurement and optimization, or the fusion of sensor signals from other LIDAR Sensor Systems, radar, camera or ultrasound measurements) are provided as necessary to reliably measure backscattered LIDAR signals and to derive information for the recognition of point clouds and the subsequent object recognition and classification. Signals and data may be processed via Edge-Computing or Cloud-Computing systems, using corresponding Communication Units (CUs). For this purpose, signals and data may be transmitted in an encrypted manner.
  • For increased data security and data permanence, further provisions may be taken such as implementation of methods based on Blockchain or smart contracts. Data security can also be enhanced by a combination of security controls, measures and strategies, singly and/or in combination, applied throughout a system's “layers”, including human, physical, endpoint, network, application and data environments.
  • Data analysis may benefit from data deconvolution methods or other suitable methods known from imaging and signal processing, including neural network and deep learning techniques.
  • Data Usage
  • LIDAR generated data sets can be used for the control and steering of vehicles (e.g. cars, ships, planes, drones), including remote control operations (e.g. parking operations or operations executed for example by an emergency officer in a control room). The data sets can be encrypted and communicated (C2C, C2X), as well as presented to a user (for example by HUD or Virtual/Augmented Reality using wearable glasses or similar designs). LIDAR Systems can also be used for data encryption purposes.
  • Data Usage may also comprise using methods of Artificial Intelligence (AI), i.e. computer-based systems or computer implemented methods which are configured to interpret transmitted data, to learn from such data based on these interpretations and derive conclusions which can be implemented into actions in order to achieve specific targets. The data input for such AI-based methods may come from LIDAR Sensor Systems, as well as other physical or biofeedback sensors (e.g. Cameras which provide video streams from vehicle exterior or interior environments, evaluating e.g. the line of vision of a human driver). AI-based methods may use algorithms for pattern recognition. Data Usage, in general, may employ mathematical or statistical methods in order to predict future events or scenarios based on available previous data sets (e.g. Bayesian method). Furthermore, Data Usage may include considerations regarding ethical questions (reflecting situations like for example the well-known “trolley dilemma”).
  • Detector
  • A Detector is a device which is able to provide an output signal (to an evaluation electronics unit) which is qualitatively or quantitatively correlated to the presence or the change of physical (or chemical) properties in its environment. Examples for such physical properties are temperature, pressure, acceleration, brightness of light (UV, VIS, IR), vibrations, electric fields, magnetic fields, electromagnetic fields, acoustic or ultrasound waves, etc. Detector devices may comprise cameras (mono or stereo) using e.g. light-sensitive CCD or CMOS chips or stacked multilayer photodiodes, ultrasound or ultrasonic detectors, detectors for radio waves (RADAR systems), photodiodes, temperature sensors such as NTC-elements (i.e. a thermistor with negative temperature coefficient), acceleration sensors, etc.
  • A photodetector is a detection device, which is sensitive with respect to the exposure to electromagnetic radiation. Typically, light photons are converted into a current signal upon impingement onto the photosensitive element. Photosensitive elements may comprise semiconductor elements with p-n junction areas, in which photons are absorbed and converted into electron-hole pairs. Many different detector types may be used for LIDAR applications, such as photo diodes, PN-diodes, PIN diodes (positive intrinsic negative diodes), APD (Avalanche Photo-Diodes), SPAD (Single Photon Avalanche Diodes), SiPM (Silicon Photomultipliers), CMOS sensors (Complementary Metal-Oxide-Semiconductor), CCD (Charge-Coupled Device), stacked multilayer photodiodes, etc.
  • In LIDAR systems, a photodetector is used to detect (qualitatively and/or quantitatively) echo signals from light which was emitted by the light source into the FOI and which was reflected or scattered thereafter from at least one object in the FOI. The photodetector may comprise one or more photosensitive elements (of the same type or of different types) which may be arranged in linear stripes or in two-dimensional arrays. The photosensitive area may have a rectangular, quadratic, polygonal, circular or oval shape. A photodetector may be covered with Bayer-like visible or infrared filter segments.
  • Digital Map
  • A digital map is a collection of data that may be used to be formatted into a virtual image. The primary function of a digital map is to provide accurate representations of measured data values. Digital mapping also allows the calculation of geometrical distances from one object, as represented by its data set, to another object. A digital map may also be called a virtual map.
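  • The geometrical distance calculation mentioned above reduces, for point-like object representations, to a Euclidean norm. A minimal sketch, assuming each object is stored as (x, y, z) coordinates in meters in a common map reference frame (the data layout is an assumption made for this example):

    import math

    def map_distance_m(obj_a, obj_b):
        # Euclidean distance between two map objects given as (x, y, z) tuples.
        return math.dist(obj_a, obj_b)

    # Example: a vehicle at the origin and a traffic sign 30 m ahead, 2 m up.
    print(map_distance_m((0.0, 0.0, 0.0), (30.0, 0.0, 2.0)))  # ~30.07 m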
  • Electronic Devices
  • Electronic devices denotes all kinds of electronics components or electronic modules, which may be used in a LIDAR Sensor System in order to facilitate its function or improve its function. As example, such electronic devices may comprise drivers and controllers for the light source, the beam steering unit or the detector unit. Electronic devices may comprise all sorts of electronics components used in order to supply voltage, current or power. Electronic devices may further comprise all sorts of electronics components used in order to manipulate electric or electronic signals, including receiving, sending, transmitting, amplifying, attenuating, filtering, comparing, storing or otherwise handling electric or electronic signals.
  • In a LIDAR system, there may be electronic devices related to the light source, electronic devices related to the beam steering unit, electronic devices related to the detector unit and electronic devices related to ASIC and processor units. Electronic devices may comprise also Timing Units, Positioning Units (e.g. actuators), position tracking units (e.g. GPS, Geolocation, Indoor-Positioning Units, Beacons, etc.), communication units (WLAN, radio communication, Bluetooth, BLE, etc.) or further measurement units (e.g. inertia, accelerations, vibrations, temperature, pressure, position, angle, rotation, etc.).
  • Field of Illumination
  • The term Field of Illumination (FOI) relates to the solid angle sector into which light can be transmitted by the LIDAR light source (including all corresponding downstream optical elements). The FOI is limited along a horizontal direction to an opening angle αH and along a vertical direction to an opening angle αV. The light of the LIDAR light source may be transmitted into the complete FOI in one single instance (non-scanning LIDAR) or may be transmitted into the FOI in a successive, scanning manner (scanning LIDAR).
  • Field of View
  • The term Field of View (FOV) relates to the solid angle sector from which the LIDAR detector (including all corresponding upstream optical elements) can receive light signals. The FOV is limited along a horizontal direction to an opening angle αH and along a vertical direction to an opening angle αV.
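  • For a rectangular field limited to opening angles αH and αV, the subtended solid angle is Ω = 4·arcsin(sin(αH/2)·sin(αV/2)). A short numerical check (the angles chosen here are an arbitrary example, not values from any embodiment):

    import math

    def rect_solid_angle_sr(alpha_h_deg, alpha_v_deg):
        # Solid angle in steradian of a field limited to the given horizontal
        # and vertical opening angles.
        a = math.radians(alpha_h_deg) / 2
        b = math.radians(alpha_v_deg) / 2
        return 4 * math.asin(math.sin(a) * math.sin(b))

    print(rect_solid_angle_sr(60.0, 20.0))  # ~0.35 sr for a 60° x 20° field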
  • Flash LIDAR Sensor System
  • A LIDAR Sensor System where the angular information (object recognition) about the environment is gained by using an angularly sensitive detector is usually called a Flash LIDAR Sensor System.
  • Frame (Physical Layer)
  • In the context of the present application the term “frame” may be used to describe a logical structure of a signal (e.g., an electrical signal or a light signal or a LIDAR signal, such as a light signal). Illustratively, the term “frame” may describe or define an arrangement (e.g., a structure) for the content of the frame (e.g., for the signal or the signal components). The arrangement of content within the frame may be configured to provide data or information. A frame may include a sequence of symbols or symbol representations. A symbol or a symbol representation may have a different meaning (e.g., it may represent different type of data) depending on its position within the frame. A frame may have a predefined time duration. Illustratively, a frame may define a time window, within which a signal may have a predefined meaning. By way of example, a light signal configured to have a frame structure may include a sequence of light pulses representing (or carrying) data or information. A frame may be defined by a code (e.g., a signal modulation code), which code may define the arrangement of the symbols within the frame.
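  • As a simplified illustration of such a frame, the sketch below maps a symbol sequence onto pulse emission times inside a fixed-duration time window. The slot count, frame duration and binary symbol alphabet are assumptions made only for this example:

    FRAME_DURATION_S = 1e-6                 # assumed 1 us frame (time window)
    N_SLOTS = 20                            # symbol positions within the frame
    SLOT_S = FRAME_DURATION_S / N_SLOTS

    def frame_to_pulse_times(symbols):
        # '1' = emit a light pulse in this slot, '0' = stay dark; the position
        # of each symbol within the frame determines its meaning.
        assert len(symbols) <= N_SLOTS, "sequence exceeds the frame"
        return [i * SLOT_S for i, s in enumerate(symbols) if s == "1"]

    # A header-like preamble followed by payload bits:
    print(frame_to_pulse_times("1101" + "00101"))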
  • Gateway
  • Gateway means a networking hardware equipped for interfacing with another network. A gateway may contain devices such as protocol translators, impedance matching devices, rate converters, fault isolators, or signal translators as necessary to provide system interoperability; it also requires the establishment of mutually acceptable administrative procedures between both networks. In other words, a gateway is a node on a network that serves as a ‘gate’, i.e. an entrance and exit point, to and from the network. A node, in turn, is an active redistribution and/or communication point with a unique network address that creates, receives or transmits data, sometimes referred to as a ‘data node’.
  • Human Machine Interaction (HMI)
  • For Human-Machine Interactions (HMI), for example the interaction between a vehicle and a driver, it might be necessary to process data and information such that they can be provided as graphical representations or in other forms of visualizations, e.g. HUD or methods of Augmented Reality (AR) or Virtual Reality (VR). Biofeedback systems, which may evaluate biological parameters such as fatigue, dizziness, increased heartbeat, nervousness, etc., may be included into such Human-Machine Interaction systems. As an example, a biofeedback system may detect that the driver of a vehicle shows signs of increased fatigue, which are evaluated by a central control unit, finally leading to a switchover from a lower SAE level to a higher SAE level.
  • LIDAR DATA Processing System
  • A LIDAR Data Processing System may comprise functions of signal processing, signal optimization (signal/noise), data analysis, object detection, object recognition, information exchange with edge and cloud computing, data banks, data libraries and other sensing devices (for example other LIDAR Devices, radar, camera, ultrasound, biometrical feedback data, driver control devices, car-to-car (C2C) communication, car-to-environment (C2X) communication, geolocation data (GPS).
  • A LIDAR Data Processing System may generate point clouds (3D/6D), object location, object movement, environment data, object/vehicle density.
  • A LIDAR Data Processing System may include feedback control to First LIDAR Sensing System and/or Second LIDAR Sensing System and/or Control and Communication System . . . .
  • LIDAR Sensing System
  • A LIDAR Sensing System may comprise one or many LIDAR emission modules, here termed “First LIDAR Sensing System”, and/or one or many LIDAR Sensor modules, here termed “Second LIDAR Sensing System”.
  • LIDAR Sensor
  • Unless otherwise stated, the term sensor or sensor module describes, in the framework of this patent application, a module which is configured to function as a LIDAR Sensor System. As such, it may comprise a minimum set of LIDAR key components necessary to perform basic LIDAR functions such as a distance measurement.
  • A LIDAR (light detection and ranging) Sensor is to be understood in particular as meaning a system which, in addition to one or more emitters for emitting light beams, for example in pulsed form, and a detector for detecting any reflected beam components, may have further devices, for example optical elements such as lenses and/or a MEMS mirror. A LIDAR Sensor can therefore also be called a LIDAR System or a LIDAR Sensor System or LIDAR detection system.
  • LIDAR Sensor Device
  • A LIDAR Sensor Device is a LIDAR Sensor System either stand alone or integrated into a housing, light fixture, headlight or other vehicle components, furniture, ceiling, textile, etc. and/or combined with other objects (e.g. vehicles, pedestrians, traffic participation objects, . . . ).
  • LIDAR Sensor Management System
  • LIDAR Sensor Management System receives input from the LIDAR Data Processing System and/or Control and Communication System and/or any other component of the LIDAR Sensor Device, and outputs control and signaling commands to the First LIDAR Sensing System and/or Second LIDAR Sensing System.
  • LIDAR Sensor Management Software
  • LIDAR Sensor Management Software (includes feedback software) for use in a LIDAR Sensor Management System.
  • LIDAR Sensor Module
  • A LIDAR Sensor Module comprises at least one LIDAR Light Source, at least one LIDAR Sensing Element, and at least one driver connected to the at least one LIDAR Light Source. It may further include Optical Components and a LIDAR Data Processing System supported by LIDAR signal processing hard- and software.
  • LIDAR System
  • A LIDAR System is a system that may be, or may be configured as, a LIDAR Sensor System.
  • LIDAR Sensor System
  • A LIDAR Sensor System is a system, which uses light or electromagnetic radiation, respectively, to derive information about objects in the environment of the LIDAR system. The acronym LIDAR stands for Light Detection and Ranging. Alternative names may comprise LADAR (laser detection and ranging), LEDDAR (Light-Emitting Diode Detection and Ranging) or laser radar.
  • LIDAR systems typically comprise a variety of components, as will be described below. In an exemplary application, such LIDAR systems are arranged at a vehicle to derive information about objects on a roadway and in the vicinity of a roadway. Such objects may comprise other road users (e.g. vehicles, pedestrians, cyclists, etc.), elements of road infrastructure (e.g. traffic signs, traffic lights, roadway markings, guardrails, traffic islands, sidewalks, bridge piers, etc.) and generally all kinds of objects which may be found on a roadway or in the vicinity of a roadway, either intentionally or unintentionally.
  • The information derived via such a LIDAR system may comprise the distance, the velocity, the acceleration, the direction of movement, the trajectory, the pose and/or other physical or chemical properties of these objects. To derive this information, the LIDAR system may determine the Time-of-Flight (TOF) or variations of physical properties such as phase, amplitude, frequency, polarization, structured dot pattern, triangulation-based methods, etc. of the electromagnetic radiation emitted by a light source after the emitted radiation was reflected or scattered by at least one object in the Field of Illumination (FOI) and detected by a photodetector.
  • LIDAR systems may be configured as Flash LIDAR or Solid-State LIDAR (no moving optics), Scanning LIDAR (1- or 2-MEMS mirror systems, Fiber-Oscillator), Hybrid versions as well as in other configurations.
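  • For the Time-of-Flight determination mentioned above, the range follows directly from the measured round-trip time t as R = c·t/2, the factor 1/2 accounting for the out-and-back path of the light. A minimal worked example:

    C_M_PER_S = 299_792_458.0  # speed of light in vacuum

    def tof_range_m(round_trip_s):
        # The emitted pulse travels to the object and back, hence the factor 0.5.
        return 0.5 * C_M_PER_S * round_trip_s

    # A round-trip time of 667 ns corresponds to an object ~100 m away:
    print(tof_range_m(667e-9))  # ~99.98 m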
  • Light Control Unit
  • The Light Control Unit may be configured to control the at least one First LIDAR Sensing System and/or at least one Second LIDAR Sensing System for operating in at least one operation mode. The Light Control Unit may comprise a light control software. Possible operation modes are e.g.: dimming, pulsed, PWM, boost, irradiation patterns, including illuminating and non-illuminating periods, light communication (including C2C and C2X), synchronization with other elements of the LIDAR Sensor System, such as a second LIDAR Sensor Device.
  • Light Source
  • A light source for LIDAR applications provides electromagnetic radiation or light, respectively, which is used to derive information about objects in the environment of the LIDAR system. In some implementations, the light source emits radiation in a non-visible wavelength range, in particular infrared radiation (IR) in the wavelength range from 850 nm up to 8100 nm. In some implementations, the light source emits radiation in a narrow bandwidth range with a Full Width at Half Maximum (FWHM) between 1 nm and 100 nm.
  • A LIDAR light source may be configured to emit more than one wavelength, visible or invisible, either at the same time or in a time-sequential fashion.
  • The light source may emit pulsed radiation comprising individual pulses of the same pulse height or trains of multiple pulses with uniform pulse height or with varying pulse heights. The pulses may have a symmetric pulse shape, e.g. a rectangular pulse shape. Alternatively, the pulses may have asymmetric pulse shapes, with differences in their respective rising and falling edges. Pulse length can be in the range of pico-seconds (ps) up to micro-seconds (μs).
  • The plurality of pulses may also overlap with each other, at least partially. Apart from such a pulsed operation, the light source may be operated also in a continuous wave operation mode, at least temporarily. In continuous wave operation mode, the light source may be adapted to vary phase, amplitude, frequency, polarization, etc. of the emitted radiation. The light source may comprise solid-state light sources (e.g. edge-emitting lasers, surface-emitting lasers, semiconductor lasers, VCSEL, VECSEL, LEDs, superluminescent LEDs, etc.).
  • The light source may comprise one or more light emitting elements (of the same type or of different types) which may be arranged in linear stripes or in two-dimensional arrays. The light source may further comprise active or passive heat dissipation elements.
  • The light source may have several interfaces, which facilitate electrical connections to a variety of electronic devices such as power sources, drivers, controllers, processors, etc. Since a vehicle may employ more than one LIDAR system, each of them may have different laser characteristics, for example, regarding laser wavelength, pulse shape and FWHM.
  • The LIDAR light source may be combined with a regular vehicle lighting function, such as headlight, Daytime Running Light (DRL), Indicator Light, Brake Light, Fog Light etc. so that both light sources (LIDAR and another vehicle light source) are manufactured and/or placed on the same substrate, or integrated into the same housing and/or be combined as a non-separable unit.
  • Marker
  • A marker can be any electro-optical unit, for example an array of photodiodes, worn by external objects, in particular pedestrians and bicyclists, that can detect infrared radiation or acoustic waves (infrasound, audible, ultrasound), process the incoming radiation/waves and, as a response, reflect or emit infrared radiation or acoustic waves (infrasound, audible, ultrasound) with the same or different wavelength, and directly or indirectly communicate with other objects, including autonomously driving vehicles.
  • Method
  • The term method may describe a procedure, a process, a technique or a series of steps, which are executed in order to accomplish a result or in order to perform a function. Method may for example refer to a series of steps during manufacturing or assembling a device. Method may also refer to a way of using a product or device to achieve a certain result (e.g. measuring a value, storing data, processing a signal, etc.).
  • Module
  • Module describes any aggregation of components, which may set up a LIDAR system. As example, a light source module may describe a module, which comprises a light source, several beam forming optical elements and a light source driver as an electronic device, which is configured to supply power to the light source.
  • Objects
  • Objects may generally denote all sorts of physical, chemical or biological matter for which information can be derived via a sensor system. With respect to a LIDAR Sensor System, objects may describe other road users (e.g. vehicles, pedestrians, cyclists, etc.), elements of road infrastructure (e.g. traffic signs, traffic lights, roadway markings, guardrails, traffic islands, sidewalks, bridge piers, etc.) and generally all kinds of objects which may be found on a roadway or in the vicinity of a roadway, either intentionally or unintentionally.
  • Processor
  • A processor is an electronic circuit, which performs multipurpose processes based on binary data inputs. Specifically, microprocessors are processing units based on a single integrated circuit (IC). Generally speaking, a processor receives binary data, which may be processed according to instructions stored in a memory block of the processor, and provides binary results as outputs via its interfaces.
  • Scanning LIDAR Sensor System
  • A LIDAR Sensor System where the angular information is gained by using a moveable mirror for scanning (i.e. angularly emitting) the laser beam across the Field of View (FOV), or any other technique to scan a laser beam across the FOV, is called a Scanning LIDAR Sensor System.
  • Sensor/Sensor Pixel
  • A sensor in the context of this disclosure includes one or more sensor pixels (which may also be referred to as pixel). Each sensor pixel includes exactly one photo diode. The sensor pixels may all have the same shape or different shapes. The sensor pixels may all have the same spacing to their respective neighbors or may have different spacings. The sensor pixels may all have the same orientation in space or different orientation in space. The sensor pixels may all be arranged within one plane or within different planes or other non-planar surfaces. The sensor pixels may include the same material combination or different material combinations. The sensor pixels may all have the same surface structure or may have different surface structures. Sensor pixels may be arranged and/or connected in groups.
  • In general, each sensor pixel may have an arbitrary shape. The sensor pixels may all have the same size or different sizes. In general, each sensor pixel may have an arbitrary size. Furthermore, the sensor pixels may all include a photo diode of the same photo diode type or of different photo diode types.
  • A photo diode type may be characterized by one or more of the following features: size of the photo diode; sensitivity of the photo diode regarding conversion of electromagnetic radiation into electrical signals (a variation of the sensitivity may be caused by the application of different reverse-bias voltages); sensitivity of the photo diode regarding light wavelengths; voltage class of the photo diode; structure of the photo diode (e.g. pin photo diode, avalanche photo diode, or single-photon avalanche photo diode); and material(s) of the photo diode.
  • The sensor pixels may be configured to be in functional relationship with color-filter elements and/or optical components.
  • Sensors
  • Sensors are devices, modules or subsystems whose purpose is to detect events or changes in their environment and send the information to other electronics, frequently a computer processor. Nowadays, there is a broad range of sensors available for all kinds of measurement purposes, for example the measurement of touch, temperature, humidity, air pressure and flow, electromagnetic radiation, toxic substances and the like. In other words, a sensor can be an electronic component, module or subsystem that detects events or changes in energy forms in its physical environment (such as motion, light, temperature, sound, etc.) and sends the information to other electronics such as a computer for processing.
  • Sensors can be used to measure resistive, capacitive, inductive, magnetic, optical or chemical properties.
  • Sensors include camera sensors, for example CCD or CMOS chips, LIDAR sensors for measurements in the infrared wavelength range, radar sensors, and acoustic sensors for measurements in the infrasound, audible and ultrasound frequency ranges. Ultrasound refers to acoustic waves with frequencies above 20 kHz.
  • Sensors can be infrared sensitive and measure for example the presence and location of humans or animals.
  • Sensors can be grouped into a network of sensors. A vehicle can employ a wide variety of sensors, including camera sensors, LIDAR sensing devices, RADAR, acoustical sensor systems, and the like. These sensors can be mounted inside or outside of a vehicle at various positions (roof, front, rear, side, corner, below, inside a headlight or any other lighting unit) and can furthermore establish a sensor network that may communicate via a hub or several sub-hubs and/or via the vehicle's electronic control unit (ECU).
  • Sensors can be connected directly or indirectly to data storage, data processing and data communication devices.
  • Sensors in cameras can be connected to a CCTV (Closed Circuit Television). Light sensors can measure the amount and orientation of reflected light from other objects (reflectivity index).
  • Sensing Field
  • The term sensing field describes the surroundings of a sensor system within which objects or any other contents, as well as their physical or chemical properties (or changes thereof), can be detected. In case of a LIDAR Sensor System, it describes a solid angle volume into which light is emitted by the LIDAR light source (FOI) and from which light that has been reflected or scattered by an object can be received by the LIDAR detector (FOV). As an example, a LIDAR sensing field may comprise a roadway or the vicinity of a roadway close to a vehicle, but also the interior of a vehicle. For other types of sensors, sensing field may describe the air around the sensor or some objects in direct contact with the sensor.
  • Sensor Optics
  • Sensor Optics denotes all kinds of optical elements, which may be used in a LIDAR Sensor System in order to facilitate its function or improve its function. As example, such optical elements may comprise lenses or sets of lenses, filters, diffusors, mirrors, reflectors, light guides, Diffractive Optical Elements (DOE), Holographic Optical Elements and generally all kind of optical elements which may manipulate light (or electromagnetic radiation) via refraction, diffraction, reflection, transmission, absorption, scattering, etc. Sensor Optics may refer to optical elements related to the light source, to the beam steering unit or the detector unit. Laser emitter and optical elements may be moved, tilted or otherwise shifted and/or modulated with respect to their distance and orientation.
  • Sensor System Optimization
  • Some LIDAR-related business models may deal with methods of sensor system optimization. Sensor system optimization may rely on a broad range of methods, functions or devices, including for example computing systems utilizing artificial intelligence, sensor fusion (utilizing data and signals from other LIDAR sensors, RADAR sensors, Ultrasonic sensors, Cameras, Video-streams, etc.), as well as software upload and download functionalities (e.g. for update purposes). Sensor system optimization may further utilize personal data of a vehicle user, for example regarding age, gender, level of fitness, available driving licenses (passenger car, truck) and driving experiences (gross vehicle weight, number of vehicle axles, trailer, horsepower, front-wheel drive/rear-wheel drive). Personal data may also include further details regarding driving experience (e.g. beginner level, experienced level, professional motorist level) and/or driving experiences based on data such as average mileage per year, experience for certain road classes, road environments or driving conditions (e.g. motorway, mountain roads, offroad, high altitude, bridges, tunnels, reversing, parking, etc.), as well as experiences with certain weather conditions or other relevant conditions (snow, ice, fog, day/night, snow tires, snow chains, etc.).
  • Personal data may further include information about previous accidents, insurance policies, warning tickets, police reports, entries in central traffic registers (e.g. Flensburg in Germany), as well as data from biofeedback systems, other health-related systems (e.g. cardiac pacemakers) and other data (e.g. regarding driving and break times, level of alcohol intake, etc.).
  • Personal data may be particularly relevant in car sharing scenarios and may include information about the intended ride (starting location, destination, weekday, number of passengers), the type of loading (passengers only, goods, animals, dangerous goods, heavy load, large load, etc.) and personal preferences (time-optimized driving, safety-optimized driving, etc.). Personal data may be provided via smartphone connections (e.g. based on Bluetooth, WiFi, LiFi, etc.). Smartphones or comparable mobile devices may further be utilized as measurement tools (e.g. ambient light, navigation data, traffic density, etc.) and/or as devices which may serve as assistants, decision-making supports, or the like.
  • Signal Modulation
  • In the context of the present application the term “signal modulation” (also referred to as “electrical modulation”) may be used to describe a modulation of a signal for encoding data in such signal (e.g., a light signal or an electrical signal, for example a LIDAR signal). By way of example, a light signal (e.g., a light pulse) may be electrically modulated such that the light signal carries or transmits data or information. Illustratively, an electrically modulated light signal may include a sequence of light pulses arranged (e.g., temporally spaced) such that data may be extracted or interpreted according to the arrangement of the light pulses. Analogously, the term “signal demodulation” (also referred to as “electrical demodulation”) may be used to describe a decoding of data from a signal (e.g., from a light signal, such as a sequence of light pulses).
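  • A compact way to picture such a modulation is pulse-position coding, in which the temporal gap between successive pulses carries one bit. The encode/decode pair below is a hypothetical sketch, not the coding scheme of any specific embodiment; the gap durations and all names are assumptions:

    SHORT_GAP_S = 50e-9    # assumed gap encoding bit 0
    LONG_GAP_S = 100e-9    # assumed gap encoding bit 1

    def modulate(bits):
        # Return pulse emission times; each bit selects the gap that separates
        # a pulse from its predecessor.
        times, t = [0.0], 0.0
        for b in bits:
            t += LONG_GAP_S if b else SHORT_GAP_S
            times.append(t)
        return times

    def demodulate(times):
        # Recover the bits from the measured gaps between received pulses.
        threshold = (SHORT_GAP_S + LONG_GAP_S) / 2
        return [1 if b - a > threshold else 0 for a, b in zip(times, times[1:])]

    bits = [1, 0, 1, 1, 0]
    assert demodulate(modulate(bits)) == bits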
  • Virtual Map
  • A digital map is a collection of data that may be formatted into a virtual image. The primary function of a digital map is to provide accurate representations of measured data values. Digital mapping also allows the calculation of geometrical distances from one object, as represented by its data set, to another object. A digital map may also be called a virtual map.
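  • A minimal sketch (in Python) of the distance calculation a digital map enables, assuming map objects are stored as data sets with metric coordinates; the object layout and names are illustrative assumptions only.

import math

def distance_m(obj_a, obj_b):
    """Euclidean distance between two map objects given as (x, y, z) in meters."""
    return math.dist(obj_a["position"], obj_b["position"])

vehicle = {"id": "ego", "position": (0.0, 0.0, 0.0)}
sign = {"id": "stop-sign", "position": (12.0, 5.0, 0.0)}
print(f"{distance_m(vehicle, sign):.1f} m")  # 13.0 m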
  • Vehicle
  • A vehicle may be any object or device that is equipped with a LIDAR Sensor System and/or communicates with a LIDAR Sensor System. In particular, a vehicle can be an automotive vehicle, a flying vehicle or any other moving vehicle, as well as a stationary object such as a building, a ceiling, a textile or traffic control equipment.
  • List of Abbreviations
    • ACK=Acknowledgment
    • ADAS=Advanced Driver-Assistance Systems
    • ADB=Adaptive high-beam systems
    • AGV=Automatically Guided Vehicles
    • AI=Artificial Intelligence
    • APD=Avalanche Photo-Diodes
    • API=Application Programming Interface
    • APP=Application software, especially as downloaded by a user to a mobile device
    • AR=Augmented Reality
    • ASCII=American Standard Code for Information Interchange
    • ASIC=Application-Specific Integrated Circuit
    • ASSP=Application Specific Standard Product
    • AV=Autonomous Vehicle
    • BCU=Board Control Unit
    • C2C=Car-to-Car
    • C2X=Car-to-Infrastructure or Car-to-Environment
    • CCD=Charge-Coupled Device
    • CCTV=Closed Circuit Television
    • CD=Compact Disc
    • CD=Collision Detection
    • CDe=Computing Device
    • CdTe=Cadmium telluride
    • CIS=CMOS Image Sensor
    • CMOS=Complementary Metal-Oxide-Semiconductor
    • CMYW=cyan, magenta, yellow, and white
    • CU=Communication Unit/Data/Device
    • CYMG=cyan, yellow, green and magenta
    • DLP=Digital Light Processing
    • DMD=Digital Mirror Devices
    • DNL=Deep Neural Learning
    • DNN=Deep Neural Networks
    • DOE=Diffractive Optical Elements
    • DRAM=Dynamic Random Access Memory
    • DRL=Daytime Running Light
    • ECU=Electronic Control Unit/Vehicle Control Unit
    • FET=Field Effect Transistor
    • FOI=Field of Illumination
    • FOV=Field of View
    • FWHM=Full Width at Half Maximum
    • GNSS=Global Navigation Satellite System
    • GPS=Global-Positioning-System
    • GUI=Graphical User Interface
    • HMI=Human Machine Interaction
    • HUD=Head-up-Display
    • HVAC=Heating, Ventilation and Air Conditioning
    • IC=Integrated Circuit
    • ID=Identification
    • IMU=Inertial Measurement Unit (system)
    • IoT=Internet of Things
    • IR=Infrared Radiation
    • ITO=Indium Tin Oxide
    • iTOF=Indirect TOF
    • LADAR=Laser Detection and Ranging
    • LAS=Laser File Format
    • LCD=Liquid Crystal Display
    • LED=Light-Emitting Diodes
    • LEDDAR=Light-Emitting Diode Detection and Ranging
    • LIDAR=Light detection and ranging
    • LPaaS=LiDAR Platform as a Service
    • MaaS=Mobility-as-a-Service
    • MEMS=Micro-Electro-Mechanical System
    • ML=Machine Learning
    • MOSFET=Metal-Oxide-Semiconductor Field-Effect Transistor
    • NFU=Neural Processor Units
    • NIR=Near Infrared
    • OPA=Optical Phased Arrays
    • PaaS=Platform as a Service
    • PCB=Printed Circuit Board
    • PD=Photo-Diode
    • PEDOT=Poly-3,4-ethylenedioxythiophene
    • PIN=Positive Intrinsic Negative diode
    • PWM=Pulse-width Modulation
    • QR-Code=Quick Response Code
    • RADAR=Radio Detection And Ranging
    • RAM=Random Access Memory
    • RGB=Red Green Blue
    • RGBE=red, green, blue, and emerald
    • SAE=Society of Automotive Engineers
    • SiPM=Silicon Photomultipliers
    • SNR=Signal-to-Noise Ratio
    • SOC=Systems-on-Chip
    • SPAD=Single Photon Avalanche Diodes
    • TaaS=Transportation-as-a-Service
    • TIA=Transimpedance Amplifier
    • TOF=Time of Flight
    • TSV=Through-Silicon-Via
    • UAV=Unmanned Aerial Vehicles
    • UI=User Interface
    • USB=Universal Serial Bus
    • UV=Ultraviolet radiation
    • VCSEL=Vertical Cavity Surface Emitting Laser
    • VECSEL=Vertical-External-Cavity-Surface-Emitting-Laser
    • VIS=Visible Spectrum
    • VR=Virtual Reality
    • WiFi=Wireless Fidelity
    • WLAN=Wireless Local Area Network
    • ZnS=Zinc Sulphide
    • fC=femtocoulomb
    • pC=picocoulomb
    • fps=frames per second
    • ms=milliseconds
    • ns=nanoseconds
    • ps=picoseconds
    • μs=microseconds
    • i.e.=that is/in other words
    • e.g.=for example
    LIST OF REFERENCE SIGNS
    • 10 LIDAR Sensor System
    • 20 Controlled LIDAR Sensor System
    • 30 LIDAR Sensor Device
    • 40 First LIDAR Sensing System
    • 41 Light scanner/Actuator for Beam Steering and Control
    • 42 Light Source
    • 43 Light Source Controller/Software
    • 50 Second LIDAR Sensing System
    • 51 Detection Optic/Actuator for Beam Steering and Control
    • 52 Sensor or Sensor element
    • 53 Sensor Controller
    • 60 LIDAR Data Processing System
    • 61 Advanced Signal Processing
    • 62 Data Analysis and Computing
    • 63 Sensor Fusion and other Sensing Functions
    • 70 Control and Communication System
    • 80 Optics
    • 81 Camera System and Camera sensors
    • 82 Camera Data and Signal exchange
    • 90 LIDAR Sensor Management System
    • 92 Basic Signal Processing
    • VSPAD SPAD potential
    • 3800 Sensor portion
    • 3802 Sensor pixel
    • 3804 Light spot
    • 3806 Circle
    • 3808 Row select signal
    • 3810 Row select signal
    • 3812 Row select signal
    • 3814 Column select signal
    • 3816 Column select signal
    • 3818 Column select signal
    • 3820 Selected sensor pixel
    • 3822 Supply voltage
    • 3900 Sensor portion
    • 3902 Row selection line
    • 3904 Column selection line
    • 3906 Column switch
    • 3908 Supply voltage
    • 3910 Supply voltage line
    • 3912 Column read out line
    • 3914 Collection read out line
    • 3916 Column read out switch
    • 3918 Column pixel switch
    • 3920 Column pixel read out switch
    • 4002 Column switch MOSFET
    • 4004 Column read out switch MOSFET
    • 4006 Column pixel switch MOSFET
    • 4008 Column pixel read out switch MOSFET
    • 4100 Sensor portion
    • 4102 First summation output
    • 4104 Coupling capacitor
    • 4106 Second summation output
    • 4108 Coupling resistor
    • 4200 Recorded scene
    • 4202 Center region
    • 4204 Edge region
    • 4300 Recorded scene
    • 4302 First row
    • 4304 Second row
    • 4400 Method
    • 4402 Partial process
    • 4404 Partial process
    • 4500 Method
    • 4502 Partial process
    • 4504 Partial process
    • 5100 Optical component for a LIDAR Sensor System
    • 5102 Device layer
    • 5104 One or more electronic devices
    • 5106 Bottom interconnect layer
    • 5108 One or more electronic contacts
    • 5110 First photo diode
    • 5112 One or more contact vias
    • 5114 Intermediate interconnect/device layer
    • 5116 One or more further electronic devices
    • 5118 One or more further electronic contacts
    • 5120 Second photo diode
    • 5200 Optical component for a LIDAR Sensor System
    • 5202 One or more microlenses
    • 5204 Filler material
    • 5206 Filter layer
    • 5208 Upper (exposed) surface of filter layer
    • 5210 First arrow
    • 5212 Second arrow
    • 5214 Third arrow
    • 5250 Wavelength/transmission diagram
    • 5252 Transmission characteristic
    • 5300 Optical component for a LIDAR Sensor System
    • 5302 Bottom mirror
    • 5304 Top mirror
    • 5400 Cross sectional view of a sensor for a LIDAR Sensor System
    • 5500 Top view of a sensor for a LIDAR Sensor System
    • 5502 Red pixel filter portion
    • 5504 Green pixel filter portion
    • 5506 Blue pixel filter portion
    • 5600 Top view of a sensor for a LIDAR Sensor System
    • 5602 Rectangle
    • 5700 Top view of a sensor for a LIDAR Sensor System
    • 5702 Red pixel filter portion
    • 5704 Yellow or orange pixel filter portion
    • 5800 Optical component for a LIDAR Sensor System
    • 5802 Reflector layer
    • 5804 Fourth arrow
    • 5806 Fifth arrow
    • 5808 Micromechanically defined IR absorber structure
    • 15500 Optical package
    • 15502 Substrate
    • 15504 Capacitor
    • 15504 c Capacitor
    • 15506 Switch
    • 15506 g Control terminal
    • 15506 s Switch
    • 15508 Laser diode
    • 15508 d Laser diode
    • 15510 Common line
    • 15512 Power source
    • 15514 Processor
    • 15602 Printed circuit board
    • 15604 First electrical contact
    • 15606 Second electrical contact
    • 15608 Terminal
    • 15700 Optical package
    • 15702 Substrate
    • 15702 i Insulating layer
    • 15702 s Base
    • 15704 Capacitor
    • 15706 Switch
    • 15708 Laser diode
    • 15708 a Active layer
    • 15708 o Optical structure
    • 15710 Printed circuit board
    • 15712 Bond wire
    • 15714 Terminal
    • 15716 Terminal
    • 15718 Connector structure
    • 15718 c Electrical contact
    • 15720 Bond wire
    • 15722 Access line
    • 15724 Through via

Claims (34)

What is claimed is:
1. An optical component for a LIDAR Sensor System, the optical component comprising:
a first photo diode implementing a LIDAR sensor pixel in a first semiconductor structure and configured to absorb received light in a first wavelength region;
a second photo diode implementing a camera sensor pixel in a second semiconductor structure over the first semiconductor structure and configured to absorb received light in a second wavelength region;
an interconnect layer comprising an electrically conductive structure configured to electrically contact the second photo diode;
wherein the received light of the second wavelength region has a shorter wavelength than the received light of the first wavelength region.
2. The optical component according to claim 1,
wherein the second photo diode is vertically stacked over the first photo diode.
3. The optical component according to claim 1,
wherein the first photo diode is a first vertical photo diode; and/or
wherein the second photo diode is a second vertical photo diode.
4. The optical component according to claim 1, further comprising:
a further interconnect layer comprising an electrically conductive structure configured to electrically contact the second vertical photo diode and/or the first vertical photo diode.
5. The optical component according to claim 1, further comprising:
a microlens over the second semiconductor structure that laterally substantially covers the first vertical photo diode and/or the second vertical photo diode.
6. The optical component according to claim 1, further comprising:
a filter layer over the second semiconductor structure that laterally substantially covers the first vertical photo diode and/or the second vertical photo diode and is configured to transmit received light having a wavelength within the first wavelength region and within the second wavelength region, and block light outside the first wavelength region and outside the second wavelength region.
7. The optical component according to claim 1,
wherein the received light of the first wavelength region has a wavelength in the range from about 800 nm to about 1800 nm; and/or
wherein the received light of the second wavelength region has a wavelength in the range from about 380 nm to about 780 nm.
8. The optical component according to claim 1,
wherein the received light of the first wavelength region has a wavelength in the range from about 800 nm to about 1800 nm; and/or
wherein the received light of the second wavelength region has a wavelength in the range from about 800 nm to about 1750 nm.
9. The optical component according to claim 1,
wherein the received light of the second wavelength region has a shorter wavelength than received light of the first wavelength region by at least 50 nm, preferably by at least 100 nm.
10. The optical component according to claim 1,
wherein the received light of the first wavelength region has a wavelength in an infrared spectrum wavelength region; and/or
wherein the received light of the second wavelength region has a wavelength in the visible spectrum wavelength region.
11. The optical component according to claim 1, further comprising:
a mirror structure comprising a bottom mirror and a top mirror;
wherein the second semiconductor structure is arranged between the bottom mirror and the top mirror;
wherein the bottom mirror is arranged between the interconnect layer and the second semiconductor structure.
12. The optical component according to claim 11,
wherein the mirror structure comprises a Bragg mirror structure.
13. The optical component according to claim 11,
wherein the mirror structure and the second vertical photo diode are configured so that the second vertical photo diode forms a resonant cavity photo diode.
14. The optical component according to claim 1, further comprising:
a reflector layer over the second semiconductor structure.
15. The optical component according to claim 14,
wherein the reflector layer is configured to reflect radiation having a wavelength equal to or greater than approximately 2 μm; and/or
wherein the reflector layer is configured as an infrared reflector layer.
16. The optical component according to claim 1,
wherein the first photo diode is a pin photo diode; and
wherein the second photo diode is a pin photo diode.
17. The optical component according to claim 1,
wherein the first photo diode is an avalanche photo diode; and
wherein the second photo diode is a pin photo diode.
18. The optical component according to claim 1,
wherein the first photo diode is an avalanche photo diode; and
wherein the second photo diode is a resonant cavity photo diode.
19. The optical component according to claim 1,
wherein the first photo diode is a single-photon avalanche photo diode; and
wherein the second photo diode is a resonant cavity photo diode.
20. The optical component according to claim 1,
wherein the first photo diode is an avalanche photo diode; and
wherein the second photo diode is an avalanche photo diode.
21. The optical component according to claim 2, further comprising:
an array of a plurality of photo diode stacks, each photo diode stack comprising a second photo diode vertically stacked over a first photo diode.
22. The optical component according to claim 1,
wherein at least one photo diode stack of the plurality of photo diode stacks comprises at least one further second photo diode in the second semiconductor structure adjacent to the second photo diode;
wherein the first photo diode of the at least one photo diode stack of the plurality of photo diode stacks has a larger lateral extension than the second photo diode and the at least one further second photo diode of the at least one photo diode stack so that the second photo diode and the at least one further second photo diode are arranged laterally within the lateral extension of the first vertical photo diode.
23. The optical component according to claim 1,
wherein the carrier is a semiconductor substrate.
24. A sensor for a LIDAR Sensor System, the sensor comprising:
a plurality of optical components, each optical component of the plurality of optical components comprising:
a first photo diode implementing a LIDAR sensor pixel in a first semiconductor structure and configured to absorb received light in a first wavelength region;
a second photo diode implementing a camera sensor pixel in a second semiconductor structure over the first semiconductor structure and configured to absorb received light in a second wavelength region;
an interconnect layer comprising an electrically conductive structure configured to electrically contact the second photo diode;
wherein the received light of the second wavelength region has a shorter wavelength than the received light of the first wavelength region; and
wherein the plurality of optical components are monolithically integrated on the carrier as a common carrier.
25. The sensor according to claim 24,
wherein the sensor is configured as a front-side illuminated sensor.
26. The sensor according to claim 24,
wherein the sensor is configured as a back-side illuminated sensor.
27. The sensor according to claim 24, further comprising:
a color filter layer covering at least some optical components of the plurality of optical components.
28. The sensor according to claim 27,
wherein the color filter layer comprises a first color filter sublayer and a second color filter sublayer;
wherein the first color filter sublayer is configured to transmit received light having a wavelength within the first wavelength region and within the second wavelength region, and block light outside the first wavelength region and outside the second wavelength region; and
wherein the second color filter sublayer is configured to block received light having a wavelength outside the second wavelength region.
29. The sensor according to claim 28,
wherein the first color filter sublayer and/or the second color filter sublayer comprises a plurality of second sublayer pixels.
30. The sensor according to claim 29,
wherein the first color filter sublayer and/or the second color filter sublayer comprises a plurality of second sublayer pixels in accordance with a Bayer pattern.
31. The sensor according to claim 27,
wherein the first color filter sublayer comprises a plurality of first sublayer pixels having the same size as the second sublayer pixels;
wherein the first sublayer pixels and the second sublayer pixels coincide with each other.
32. The sensor according to claim 27,
wherein the first color filter sublayer comprises a plurality of first sublayer pixels having a size larger than the size of the second sublayer pixels;
wherein one first sublayer pixel laterally substantially overlaps with a plurality of the second sublayer pixels.
33. A LIDAR Sensor System, comprising:
a sensor for a LIDAR Sensor System, the sensor comprising:
a plurality of optical components, each optical component of the plurality of optical components comprising:
a first photo diode implementing a LIDAR sensor pixel in a first semiconductor structure and configured to absorb received light in a first wavelength region;
a second photo diode implementing a camera sensor pixel in a second semiconductor structure over the first semiconductor structure and configured to absorb received light in a second wavelength region;
an interconnect layer comprising an electrically conductive structure configured to electrically contact the second photo diode;
wherein the received light of the second wavelength region has a shorter wavelength than the received light of the first wavelength region; and
wherein the plurality of optical components are monolithically integrated on the carrier as a common carrier; and
a sensor controller configured to control the sensor.
34. A method for a LIDAR Sensor System, wherein the LIDAR Sensor System comprises:
a sensor for a LIDAR Sensor System, the sensor comprising:
a plurality of optical components, each optical component of the plurality of optical components comprising:
a first photo diode implementing a LIDAR sensor pixel in a first semiconductor structure and configured to absorb received light in a first wavelength region;
a second photo diode implementing a camera sensor pixel in a second semiconductor structure over the first semiconductor structure and configured to absorb received light in a second wavelength region;
an interconnect layer comprising an electrically conductive structure configured to electrically contact the second photo diode;
wherein the received light of the second wavelength region has a shorter wavelength than the received light of the first wavelength region; and
wherein the plurality of optical components are monolithically integrated on the carrier as a common carrier; and
a sensor controller configured to control the sensor;
wherein the method comprises:
integrating the LIDAR Sensor System into a LIDAR Sensor Device; and
communicating with a second Sensor System and using an object classification and/or Probability Factors and/or Traffic Relevance factors measured by the second Sensor System for evaluation of current and future measurements and derived LIDAR Sensor Device control parameters as a function of these factors.
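The wavelength relationship recited in claims 1, 9 and 10 above can be checked with a short, purely illustrative sketch (in Python): the second (camera) photo diode receives shorter-wavelength light, e.g. visible light, than the first (LIDAR) photo diode, e.g. near-infrared light, by at least 50 nm. The helper name and the example center wavelengths are assumptions for illustration, not part of the claims.

def regions_consistent(camera_nm, lidar_nm, min_separation_nm=50.0):
    """True if the camera wavelength is shorter than the LIDAR wavelength
    by at least min_separation_nm (cf. claim 9)."""
    return lidar_nm - camera_nm >= min_separation_nm

assert regions_consistent(camera_nm=550.0, lidar_nm=905.0)      # VIS vs. NIR: ok
assert not regions_consistent(camera_nm=880.0, lidar_nm=905.0)  # only 25 nm apart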

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/742,426 US20220276351A1 (en) 2019-03-08 2022-05-12 Component and sensor for a lidar sensor system, lidar sensor system and method for a lidar sensor system

Applications Claiming Priority (32)

Application Number Priority Date Filing Date Title
DE102019203175.7 2019-03-08
DE102019203175 2019-03-08
DE102019205514 2019-04-16
DE102019205514.1 2019-04-16
DE102019206939.8 2019-05-14
DE102019206939 2019-05-14
DE102019208489.3 2019-06-12
DE102019208489 2019-06-12
DE102019210528 2019-07-17
DE102019210528.9 2019-07-17
DE102019213210.3 2019-09-02
DE102019213210 2019-09-02
DE102019214455 2019-09-23
DE102019214455.1 2019-09-23
DE102019216362.9 2019-10-24
DE102019216362 2019-10-24
DE102019217097 2019-11-06
DE102019217097.8 2019-11-06
DE102019218025 2019-11-22
DE102019218025.6 2019-11-22
DE102019219775 2019-12-17
DE102019219775.2 2019-12-17
DE102020200833 2020-01-24
DE102020200833.7 2020-01-24
DE102020201577.5 2020-02-10
DE102020201577 2020-02-10
DE102020201900 2020-02-17
DE102020201900.2 2020-02-17
DE102020202374 2020-02-25
DE102020202374.3 2020-02-25
US16/809,587 US11726184B2 (en) 2019-03-08 2020-03-05 Component for a LIDAR sensor system, LIDAR sensor system, LIDAR sensor device, method for a LIDAR sensor system and method for a LIDAR sensor device
US17/742,426 US20220276351A1 (en) 2019-03-08 2022-05-12 Component and sensor for a lidar sensor system, lidar sensor system and method for a lidar sensor system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US16/809,587 Continuation US11726184B2 (en) 2019-03-08 2020-03-05 Component for a LIDAR sensor system, LIDAR sensor system, LIDAR sensor device, method for a LIDAR sensor system and method for a LIDAR sensor device

Publications (1)

Publication Number Publication Date
US20220276351A1 true US20220276351A1 (en) 2022-09-01

Family

ID=69770900

Family Applications (4)

Application Number Title Priority Date Filing Date
US16/809,587 Active 2041-09-15 US11726184B2 (en) 2019-03-08 2020-03-05 Component for a LIDAR sensor system, LIDAR sensor system, LIDAR sensor device, method for a LIDAR sensor system and method for a LIDAR sensor device
US17/742,426 Pending US20220276351A1 (en) 2019-03-08 2022-05-12 Component and sensor for a lidar sensor system, lidar sensor system and method for a lidar sensor system
US17/742,448 Pending US20220276352A1 (en) 2019-03-08 2022-05-12 Optical package for a lidar sensor system and lidar sensor system technical field
US18/514,827 Pending US20240094353A1 (en) 2019-03-08 2023-11-20 Lidar system, apparatus communicating with the lidar system, and apparatus located in a field of view (fov) of the lidar system

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US16/809,587 Active 2041-09-15 US11726184B2 (en) 2019-03-08 2020-03-05 Component for a LIDAR sensor system, LIDAR sensor system, LIDAR sensor device, method for a LIDAR sensor system and method for a LIDAR sensor device

Family Applications After (2)

Application Number Title Priority Date Filing Date
US17/742,448 Pending US20220276352A1 (en) 2019-03-08 2022-05-12 Optical package for a lidar sensor system and lidar sensor system technical field
US18/514,827 Pending US20240094353A1 (en) 2019-03-08 2023-11-20 Lidar system, apparatus communicating with the lidar system, and apparatus located in a field of view (fov) of the lidar system

Country Status (6)

Country Link
US (4) US11726184B2 (en)
EP (1) EP3963355A1 (en)
CN (3) CN113795773A (en)
CA (1) CA3173966A1 (en)
DE (1) DE112020001131T5 (en)
WO (1) WO2020182591A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220291388A1 (en) * 2021-03-15 2022-09-15 Argo AI, LLC Compressive sensing for photodiode data
DE102022124675A1 (en) 2022-09-26 2024-03-28 Ifm Electronic Gmbh PMD sensor with multiple semiconductor levels

Families Citing this family (363)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110116520A1 (en) * 2008-07-07 2011-05-19 Koninklijke Philips Electronics N.V. Eye-safe laser-based lighting
US11885887B1 (en) * 2012-04-16 2024-01-30 Mazed Mohammad A Imaging subsystem
US10074417B2 (en) * 2014-11-20 2018-09-11 Rambus Inc. Memory systems and methods for improved power management
WO2017170005A1 (en) * 2016-03-30 2017-10-05 日本電気株式会社 Recording medium on which indoor/outdoor determination program is recorded, indoor/outdoor determination system, indoor/outdoor determination method, mobile terminal, and means for classifying and determining indoor/outdoor environment
US10142196B1 (en) 2016-04-15 2018-11-27 Senseware, Inc. System, method, and apparatus for bridge interface communication
JP6712936B2 (en) * 2016-09-23 2020-06-24 株式会社小松製作所 Work vehicle management system and work vehicle management method
KR102569539B1 (en) * 2016-11-07 2023-08-24 주식회사 에이치엘클레무브 Object sensing system for vehicle and object sensing method for vehicle
TW201826993A (en) * 2016-12-09 2018-08-01 美商泰華施股份有限公司 Robotic cleaning device with operating speed variation based on environment
DE102017106959A1 (en) * 2017-03-31 2018-10-04 Osram Opto Semiconductors Gmbh Lighting device and lighting system
GB201707973D0 (en) * 2017-05-18 2017-07-05 Jaguar Land Rover Ltd A vehicle control system, method and computer program for a vehicle control multilayer architecture
US11226403B2 (en) * 2017-07-12 2022-01-18 GM Global Technology Operations LLC Chip-scale coherent lidar with integrated high power laser diode
EP3438699A1 (en) * 2017-07-31 2019-02-06 Hexagon Technology Center GmbH Range finder using spad arrangement for acquiring multiple targets
WO2019028205A1 (en) 2017-08-03 2019-02-07 The Research Foundation For The State University Of New York Dual-screen digital radiography with asymmetric reflective screens
JP2019032206A (en) * 2017-08-07 2019-02-28 ソニーセミコンダクタソリューションズ株式会社 Distance sensor, distance measuring apparatus, and image sensor
DE102017214346A1 (en) * 2017-08-17 2019-02-21 Volkswagen Aktiengesellschaft Headlight for a vehicle
EP3534113B1 (en) * 2017-09-13 2022-11-02 ClearMotion, Inc. Road surface-based vehicle control
DE102017122711A1 (en) * 2017-09-29 2019-04-04 Claas E-Systems Kgaa Mbh & Co. Kg Method for operating a self-propelled agricultural machine
JP6773002B2 (en) * 2017-10-30 2020-10-21 株式会社デンソー Vehicle equipment and computer programs
US10914110B2 (en) * 2017-11-02 2021-02-09 Magna Closures Inc. Multifunction radar based detection system for a vehicle liftgate
US11175388B1 (en) * 2017-11-22 2021-11-16 Insight Lidar, Inc. Digital coherent LiDAR with arbitrary waveforms
US11232350B2 (en) * 2017-11-29 2022-01-25 Honda Motor Co., Ltd. System and method for providing road user classification training using a vehicle communications network
EP3503457B1 (en) * 2017-12-22 2020-08-12 ID Quantique S.A. Method and device for recognizing blinding attacks in a quantum encrypted channel
US11262756B2 (en) * 2018-01-15 2022-03-01 Uatc, Llc Discrete decision architecture for motion planning system of an autonomous vehicle
JP7246863B2 (en) * 2018-04-20 2023-03-28 ソニーセミコンダクタソリューションズ株式会社 Photodetector, vehicle control system and rangefinder
US11787346B2 (en) * 2018-04-20 2023-10-17 Axon Enterprise, Inc. Systems and methods for a housing equipment for a security vehicle
KR102025012B1 (en) * 2018-05-08 2019-09-24 재단법인 다차원 스마트 아이티 융합시스템 연구단 Multi pixel micro lens pixel array and camera system for solving color mix and operating method thereof
DE102018209192B3 (en) * 2018-05-17 2019-05-09 Continental Automotive Gmbh Method and device for operating a camera monitor system for a motor vehicle
US11303632B1 (en) * 2018-06-08 2022-04-12 Wells Fargo Bank, N.A. Two-way authentication system and method
DE102018214354A1 (en) * 2018-08-24 2020-02-27 Robert Bosch Gmbh First vehicle-side terminal, method for operating the first terminal, second vehicle-side terminal and method for operating the second vehicle-side terminal
US10928498B1 (en) * 2018-09-18 2021-02-23 Apple Inc. Electronic device with circular radar-antenna array
US10955857B2 (en) * 2018-10-02 2021-03-23 Ford Global Technologies, Llc Stationary camera localization
US11195418B1 (en) * 2018-10-04 2021-12-07 Zoox, Inc. Trajectory prediction on top-down scenes and associated model
WO2020086972A1 (en) * 2018-10-26 2020-04-30 Current Lighting Solutions, Llc Identification of lighting fixtures for indoor positioning using color band code
US11388797B2 (en) * 2018-10-29 2022-07-12 Signify Holding B.V. Lighting system with connected light sources
US10908409B2 (en) * 2018-12-07 2021-02-02 Beijing Voyager Technology Co., Ltd. Coupled and synchronous mirror elements in a LiDAR-based micro-mirror array
US11082535B2 (en) * 2018-12-20 2021-08-03 Here Global B.V. Location enabled augmented reality (AR) system and method for interoperability of AR applications
JP2020118567A (en) * 2019-01-24 2020-08-06 ソニーセミコンダクタソリューションズ株式会社 Distance measurement device, on-vehicle system, and distance measurement method
CN113474618B (en) * 2019-02-01 2024-03-15 砺铸智能设备(天津)有限公司 System and method for object inspection using multispectral 3D laser scanning
AU2020218753A1 (en) 2019-02-04 2021-09-30 Anduril Industries, Inc. Advanced computational pixel imagers with multiple in-pixel counters
EP3693243A1 (en) * 2019-02-06 2020-08-12 Zenuity AB Method and system for controlling an automated driving system of a vehicle
WO2020164021A1 (en) * 2019-02-13 2020-08-20 北京百度网讯科技有限公司 Driving control method and apparatus, device, medium, and system
US11003195B2 (en) * 2019-02-28 2021-05-11 GM Global Technology Operations LLC Method to prioritize the process of receiving for cooperative sensor sharing objects
US11644549B2 (en) * 2019-03-06 2023-05-09 The University Court Of The University Of Edinburgh Extended dynamic range and reduced power imaging for LIDAR detector arrays
JP6970703B2 (en) * 2019-03-18 2021-11-24 株式会社東芝 Electronics and methods
JP2020153798A (en) * 2019-03-19 2020-09-24 株式会社リコー Optical device, distance measuring optical unit, distance measuring device, and distance measuring system
JP7147651B2 (en) * 2019-03-22 2022-10-05 トヨタ自動車株式会社 Object recognition device and vehicle control system
US11521309B2 (en) * 2019-05-30 2022-12-06 Bruker Nano, Inc. Method and apparatus for rapid inspection of subcomponents of manufactured component
DE102019208269A1 (en) * 2019-06-06 2020-12-10 Robert Bosch Gmbh Lidar device
CN112114322A (en) * 2019-06-21 2020-12-22 广州印芯半导体技术有限公司 Time-of-flight distance measuring device and time-of-flight distance measuring method
US11267590B2 (en) * 2019-06-27 2022-03-08 Nxgen Partners Ip, Llc Radar system and method for detecting and identifying targets using orbital angular momentum correlation matrix
US11153010B2 (en) * 2019-07-02 2021-10-19 Waymo Llc Lidar based communication
US20210005085A1 (en) * 2019-07-03 2021-01-07 Cavh Llc Localized artificial intelligence for intelligent road infrastructure
TWI786311B (en) * 2019-07-04 2022-12-11 先進光電科技股份有限公司 Mobile vehicle assist system and parking control method thereof
US11269076B2 (en) * 2019-07-11 2022-03-08 Mtd Products Inc Solid state LIDAR machine vision for power equipment device
US11663378B2 (en) * 2019-07-16 2023-05-30 Here Global B.V. Method, apparatus, and system for providing traffic simulations in a smart-city infrastructure
EP3770881B1 (en) * 2019-07-26 2023-11-15 Volkswagen AG Methods, computer programs, apparatuses, a vehicle, and a traffic entity for updating an environmental model of a vehicle
US11592537B2 (en) * 2019-07-29 2023-02-28 Infineon Technologies Ag Optical crosstalk mitigation in LIDAR using digital signal processing
US11403077B2 (en) * 2019-08-01 2022-08-02 Dspace Gmbh Method and system for preparing block diagrams for code generation
US10802120B1 (en) 2019-08-20 2020-10-13 Luminar Technologies, Inc. Coherent pulsed lidar system
US11237612B2 (en) * 2019-08-22 2022-02-01 Micron Technology, Inc. Charge-sharing capacitive monitoring circuit in a multi-chip package to control power
US11164784B2 (en) 2019-08-22 2021-11-02 Micron Technology, Inc. Open-drain transistor monitoring circuit in a multi-chip package to control power
CN112532342B (en) * 2019-09-17 2023-05-16 华为技术有限公司 Data transmission method and device in back reflection communication
US11455515B2 (en) * 2019-09-24 2022-09-27 Robert Bosch Gmbh Efficient black box adversarial attacks exploiting input data structure
US11144769B2 (en) * 2019-09-30 2021-10-12 Pony Ai Inc. Variable resolution sensors
US11619945B2 (en) * 2019-09-30 2023-04-04 GM Cruise Holdings LLC. Map prior layer
JP7238722B2 (en) * 2019-10-11 2023-03-14 トヨタ自動車株式会社 vehicle parking assist device
KR20210044433A (en) * 2019-10-15 2021-04-23 에스케이하이닉스 주식회사 Image sensor and image processing device including the same
FR3102253B1 (en) * 2019-10-16 2022-01-14 Commissariat Energie Atomique Obstacle detection method, detection device, detection system and associated vehicle
US11573302B2 (en) * 2019-10-17 2023-02-07 Argo AI, LLC LiDAR system comprising a Geiger-mode avalanche photodiode-based receiver having pixels with multiple-return capability
US11838400B2 (en) * 2019-11-19 2023-12-05 International Business Machines Corporation Image encoding for blockchain
CN112909034A (en) 2019-12-04 2021-06-04 半导体元件工业有限责任公司 Semiconductor device with a plurality of transistors
US11652176B2 (en) * 2019-12-04 2023-05-16 Semiconductor Components Industries, Llc Semiconductor devices with single-photon avalanche diodes and light scattering structures with different densities
US11420645B2 (en) * 2019-12-11 2022-08-23 At&T Intellectual Property I, L.P. Method and apparatus for personalizing autonomous transportation
US11592575B2 (en) * 2019-12-20 2023-02-28 Waymo Llc Sensor steering for multi-directional long-range perception
US20210209377A1 (en) * 2020-01-03 2021-07-08 Cawamo Ltd System and method for identifying events of interest in images from one or more imagers in a computing network
US10907960B1 (en) * 2020-01-06 2021-02-02 Outsight SA Calibration system for combined depth and texture sensor
US20210215798A1 (en) * 2020-01-10 2021-07-15 Continental Automotive Systems, Inc. Lidar system
US11619722B2 (en) * 2020-02-12 2023-04-04 Ford Global Technologies, Llc Vehicle lidar polarization
US11506786B2 (en) 2020-02-14 2022-11-22 Arete Associates Laser detection and ranging
US11643105B2 (en) 2020-02-21 2023-05-09 Argo AI, LLC Systems and methods for generating simulation scenario definitions for an autonomous vehicle system
US11429107B2 (en) 2020-02-21 2022-08-30 Argo AI, LLC Play-forward planning and control system for an autonomous vehicle
US11567173B2 (en) * 2020-03-04 2023-01-31 Caterpillar Paving Products Inc. Systems and methods for increasing lidar sensor coverage
US11450116B2 (en) * 2020-03-09 2022-09-20 Ford Global Technologies, Llc Systems and methods for sharing camera setting control among multiple image processing components in a vehicle
US20210288726A1 (en) * 2020-03-12 2021-09-16 Uber Technologies Inc. Location accuracy using local transmitters
US11521504B2 (en) * 2020-03-18 2022-12-06 Rosemount Aerospace Inc. Method and system for aircraft taxi strike alerting
JP7383542B2 (en) * 2020-03-24 2023-11-20 株式会社東芝 Photodetector and distance measuring device
US11644551B2 (en) * 2020-03-30 2023-05-09 Semiconductor Components Industries, Llc Lidar systems with improved time-to-digital conversion circuitry
US11381399B2 (en) * 2020-04-01 2022-07-05 Ford Global Technologies, Llc Enhanced vehicle operation
CN115427753A (en) * 2020-04-15 2022-12-02 三菱电机株式会社 Parameter adjustment device, learning device, measurement system, parameter adjustment method, and program
US11443447B2 (en) * 2020-04-17 2022-09-13 Samsung Electronics Co., Ltd. Three-dimensional camera system
KR20210138201A (en) * 2020-05-11 2021-11-19 현대자동차주식회사 Method and apparatus for controlling autonomous driving
US11032530B1 (en) * 2020-05-15 2021-06-08 Microsoft Technology Licensing, Llc Gradual fallback from full parallax correction to planar reprojection
EP3916424A1 (en) * 2020-05-25 2021-12-01 Scantinel Photonics GmbH Device and method for scanning measurement of the distance to an object
US11595619B1 (en) * 2020-06-02 2023-02-28 Aurora Operations, Inc. Autonomous vehicle teleoperations system
US11481884B2 (en) * 2020-06-04 2022-10-25 Nuro, Inc. Image quality enhancement for autonomous vehicle remote operations
US11428785B2 (en) * 2020-06-12 2022-08-30 Ours Technology, Llc Lidar pixel with active polarization control
JP2023530118A (en) * 2020-06-16 2023-07-13 テレフオンアクチーボラゲット エルエム エリクソン(パブル) Techniques for reporting network traffic activity
CN112639514B (en) * 2020-07-07 2024-02-23 深圳市速腾聚创科技有限公司 Laser receiving device, laser radar and intelligent induction equipment
FR3112653A1 (en) * 2020-07-15 2022-01-21 STMicroelectronics (Alps) SAS INTEGRATED CIRCUIT AND METHOD FOR DIAGNOSING SUCH AN INTEGRATED CIRCUIT
WO2022016276A1 (en) 2020-07-21 2022-01-27 Leddartech Inc. Beam-steering device particularly for lidar systems
CA3125716C (en) 2020-07-21 2024-04-09 Leddartech Inc. Systems and methods for wide-angle lidar using non-uniform magnification optics
CA3125718C (en) * 2020-07-21 2023-10-03 Leddartech Inc. Beam-steering devices and methods for lidar applications
US11119219B1 (en) * 2020-08-10 2021-09-14 Luminar, Llc Lidar system with input optical element
US20220050201A1 (en) * 2020-08-17 2022-02-17 Litexel Inc. Fmcw imaging lidar based on coherent pixel array
US10884130B1 (en) * 2020-08-18 2021-01-05 Aeva, Inc. LIDAR system noise calibration and target detection
US11553618B2 (en) * 2020-08-26 2023-01-10 PassiveLogic, Inc. Methods and systems of building automation state load and user preference via network systems activity
US11722590B1 (en) 2020-08-28 2023-08-08 Apple Inc. Electronic devices with conductive tape
US11882752B1 (en) 2020-08-28 2024-01-23 Apple Inc. Electronic devices with through-display sensors
CA3130336A1 (en) 2020-09-10 2022-03-10 Saco Technologies Inc. Light shaping assembly having a two-dimensional array of light sources and a fresnel lens
US11238729B1 (en) * 2020-09-11 2022-02-01 Toyota Motor Engineering & Manufacturing North America, Inc. Systems and methods for traffic flow prediction
US11212195B1 (en) * 2020-09-11 2021-12-28 Microsoft Technology Licensing, Llc IT monitoring recommendation service
CN112162259B (en) * 2020-09-15 2024-02-13 中国电子科技集团公司第四十四研究所 Pulse laser time-voltage conversion circuit and control method thereof
US20220091266A1 (en) * 2020-09-18 2022-03-24 Denso International America, Inc. Systems and methods for enhancing outputs of a lidar
WO2022061656A1 (en) * 2020-09-24 2022-03-31 深圳市大疆创新科技有限公司 Laser ranging device
US11561289B2 (en) * 2020-09-25 2023-01-24 Beijing Voyager Technology Co., Ltd. Scanning LiDAR system with a wedge prism
US11966811B2 (en) * 2020-09-28 2024-04-23 Cognex Corporation Machine vision system and method with on-axis aimer and distance measurement assembly
US11057115B1 (en) * 2020-09-30 2021-07-06 Visera Technologies Company Limited Optical communication device
US11943294B1 (en) * 2020-09-30 2024-03-26 Amazon Technologies, Inc. Storage medium and compression for object stores
CN112372631B (en) * 2020-10-05 2022-03-15 华中科技大学 Rapid collision detection method and device for robot machining of large complex component
US20220113390A1 (en) * 2020-10-09 2022-04-14 Silc Technologies, Inc. Increasing signal-to-noise ratios in lidar systems
CN112202800B (en) * 2020-10-10 2021-10-01 中国科学技术大学 VR video edge prefetching method and system based on reinforcement learning in C-RAN architecture
US11726383B2 (en) * 2020-10-14 2023-08-15 California Institute Of Technology Modular hybrid optical phased arrays
US11327158B1 (en) * 2020-10-19 2022-05-10 Aeva, Inc. Techniques to compensate for mirror Doppler spreading in coherent LiDAR systems using matched filtering
US11385351B2 (en) * 2020-10-19 2022-07-12 Aeva, Inc. Techniques for automatically adjusting detection threshold of FMCW LIDAR
CN112270059A (en) * 2020-10-19 2021-01-26 泰州市气象局 Weather radar networking strategy evaluation method and system
CN112305961B (en) * 2020-10-19 2022-04-12 武汉大学 Novel signal detection and acquisition equipment
KR20220052176A (en) * 2020-10-20 2022-04-27 현대자동차주식회사 Text message analysis and notification device
US11648959B2 (en) 2020-10-20 2023-05-16 Argo AI, LLC In-vehicle operation of simulation scenarios during autonomous vehicle runs
CN112285746B (en) * 2020-10-21 2024-02-13 厦门大学 Spoofing detection method and device based on multipath signals
CN112821930A (en) * 2020-10-25 2021-05-18 泰州物族信息科技有限公司 Adaptive antenna state management platform
CN112558053B (en) * 2020-10-28 2022-05-31 电子科技大学 Optical beam forming network device and method based on microwave photon true time delay
CN112379393B (en) * 2020-10-29 2023-04-25 中车株洲电力机车研究所有限公司 Train collision early warning method and device
CN112348344B (en) * 2020-10-30 2022-09-06 天津市赛英工程建设咨询管理有限公司 Public transport reachable index calculation method
US11105904B1 (en) 2020-10-30 2021-08-31 Aeva, Inc. Techniques for mitigating lag-angle effects for LIDARs scans
US11965989B2 (en) * 2020-11-04 2024-04-23 Beijing Voyager Technology Co., Ltd. Copackaging photodetector and readout circuit for improved LiDAR detection
US10976415B1 (en) 2020-11-09 2021-04-13 Aeva, Inc. Techniques for image conjugate pitch reduction
US10948598B1 (en) * 2020-11-25 2021-03-16 Aeva, Inc. Coherent LiDAR system utilizing polarization-diverse architecture
CN116583959A (en) * 2020-11-27 2023-08-11 趣眼有限公司 Method and system for infrared sensing
CN112417798B (en) * 2020-11-27 2023-05-23 成都海光微电子技术有限公司 Time sequence testing method and device, electronic equipment and storage medium
JP2023552735A (en) * 2020-11-27 2023-12-19 トライアイ リミテッド Method and system for infrared sensing
US11985433B2 (en) * 2020-11-30 2024-05-14 Microsoft Technology Licensing, Llc SPAD array for intensity image sensing on head-mounted displays
CN112347993B (en) * 2020-11-30 2023-03-17 吉林大学 Expressway vehicle behavior and track prediction method based on vehicle-unmanned aerial vehicle cooperation
WO2022120290A1 (en) * 2020-12-05 2022-06-09 Alpha Fiber, Inc. System and method for detection and monitoring of impact
CN112529799A (en) * 2020-12-07 2021-03-19 中国工程物理研究院流体物理研究所 Optical aberration distortion correction system based on FPGA convolutional neural network structure
CN112540363B (en) * 2020-12-07 2023-08-08 西安电子科技大学芜湖研究院 Silicon photomultiplier readout circuit for laser radar
CN112464870B (en) * 2020-12-08 2024-04-16 未来汽车科技(深圳)有限公司 Target object live-action fusion method, system, equipment and storage medium for AR-HUD
CN114615397B (en) * 2020-12-09 2023-06-30 华为技术有限公司 TOF device and electronic equipment
KR20220083059A (en) * 2020-12-11 2022-06-20 삼성전자주식회사 Time of flight camera device and driving method thereof
CN112698356B (en) * 2020-12-14 2023-08-01 哈尔滨工业大学(深圳) Non-blind area pulse coherent wind-measuring laser radar system based on multi-aperture transceiving
KR102512347B1 (en) * 2020-12-14 2023-03-22 현대모비스 주식회사 Apparatus for Time-to-digital converter and method for aligning signal using the same
CN114640394B (en) * 2020-12-15 2023-05-26 中国联合网络通信集团有限公司 Communication method and communication device
CN112686105B (en) * 2020-12-18 2021-11-02 云南省交通规划设计研究院有限公司 Fog concentration grade identification method based on video image multi-feature fusion
CN112686842B (en) * 2020-12-21 2021-08-24 苏州炫感信息科技有限公司 Light spot detection method and device, electronic equipment and readable storage medium
US20210109881A1 (en) * 2020-12-21 2021-04-15 Intel Corporation Device for a vehicle
US20220200626A1 (en) * 2020-12-21 2022-06-23 Intel Corporation Flexible compression header and code generation
KR20230162131A (en) * 2020-12-26 2023-11-28 트라이아이 엘티디. Systems, methods and computer program products for generating depth images based on short-wave infrared detection information
CN112731357A (en) * 2020-12-30 2021-04-30 清华大学 Real-time correction method and system for positioning error of laser point cloud odometer
US20230107571A1 (en) * 2021-10-01 2023-04-06 Liberty Reach Inc. Machine Vision-Based Method and System for Locating Objects within a Scene Containing the Objects
CN112883997B (en) * 2021-01-11 2023-05-12 武汉坤能轨道系统技术有限公司 Rail transit fastener detection system and detection method
WO2022153126A1 (en) * 2021-01-14 2022-07-21 Innoviz Technologies Ltd. Synchronization of multiple lidar systems
CN112765809B (en) * 2021-01-14 2022-11-11 成都理工大学 Railway line comparing and selecting method and device based on typical disasters of high-ground stress tunnel
CN112651382B (en) * 2021-01-15 2024-04-02 北京中科虹霸科技有限公司 Focusing data calibration system and iris image acquisition system
US11309854B1 (en) * 2021-01-26 2022-04-19 Saudi Arabian Oil Company Digitally controlled grounded capacitance multiplier
US20220236412A1 (en) * 2021-01-26 2022-07-28 Omron Corporation Laser scanner apparatus and method of operation
TWI766560B (en) * 2021-01-27 2022-06-01 國立臺灣大學 Object recognition and ranging system using image semantic segmentation and lidar point cloud
CN112954586B (en) * 2021-01-29 2022-09-09 北京邮电大学 Deception jamming source positioning method, electronic equipment and storage medium
CN112953634A (en) * 2021-01-29 2021-06-11 Oppo广东移动通信有限公司 Optimization method of visible light communication transmission, electronic device and storage medium
CN112924960B (en) * 2021-01-29 2023-07-18 重庆长安汽车股份有限公司 Target size real-time detection method, system, vehicle and storage medium
US11763425B2 (en) * 2021-02-03 2023-09-19 Qualcomm Incorporated High resolution time-of-flight depth imaging
CN112965054B (en) * 2021-02-03 2023-09-19 南京众控电子科技有限公司 Cabin door opening and closing recognition method based on radar technology
US11770701B2 (en) 2021-02-05 2023-09-26 Argo AI, LLC Secure communications with autonomous vehicles
EP4291918A1 (en) * 2021-02-11 2023-12-20 Sony Semiconductor Solutions Corporation Configuration control circuitry and configuration control method
US20220260340A1 (en) * 2021-02-17 2022-08-18 Bae Systems Information And Electronic Systems Integration Inc. Push broom clutter rejection using a multimodal filter
CN112918214B (en) * 2021-02-22 2022-05-31 一汽奔腾轿车有限公司 Method and control device for realizing remote air conditioner control
CN112883872B (en) * 2021-02-22 2023-11-07 业泓科技(成都)有限公司 Identification sensing structure, fingerprint identification module and terminal
TWI770834B (en) * 2021-02-23 2022-07-11 國立陽明交通大學 Angle estimation method and detection device using the same
US11802945B2 (en) 2021-03-10 2023-10-31 Allegro Microsystems, Llc Photonic ROIC having safety features
US11581697B2 (en) 2021-03-10 2023-02-14 Allegro Microsystems, Llc Detector system comparing pixel response with photonic energy decay
US11757893B2 (en) 2021-03-11 2023-09-12 Bank Of America Corporation System and method for authorizing entity users based on augmented reality and LiDAR technology
CN115079131A (en) * 2021-03-11 2022-09-20 中强光电股份有限公司 Light-emitting device
CN113093222B (en) * 2021-03-11 2023-08-01 武汉大学 Single-spectrum temperature measurement laser radar system based on volume Bragg grating
CN113063327B (en) * 2021-03-22 2023-04-25 贵州航天电子科技有限公司 Full-wave sampling laser fuze signal processing circuit and signal processing method
US20220311938A1 (en) * 2021-03-24 2022-09-29 Qualcomm Incorporated Image capture with expanded field of view
US11604264B2 (en) 2021-03-26 2023-03-14 Aeye, Inc. Switchable multi-lens Lidar receiver
US11474213B1 (en) 2021-03-26 2022-10-18 Aeye, Inc. Hyper temporal lidar with dynamic laser control using marker shots
US20220308184A1 (en) 2021-03-26 2022-09-29 Aeye, Inc. Hyper Temporal Lidar with Controllable Detection Intervals
US11635495B1 (en) 2021-03-26 2023-04-25 Aeye, Inc. Hyper temporal lidar with controllable tilt amplitude for a variable amplitude scan mirror
US11630188B1 (en) * 2021-03-26 2023-04-18 Aeye, Inc. Hyper temporal lidar with dynamic laser control using safety models
US11811456B2 (en) 2021-03-30 2023-11-07 Honeywell International Inc. Multi-pixel waveguide optical receiver
US20220315220A1 (en) * 2021-03-31 2022-10-06 Skydio, Inc. Autonomous Aerial Navigation In Low-Light And No-Light Conditions
US11861896B1 (en) 2021-03-31 2024-01-02 Skydio, Inc. Autonomous aerial navigation in low-light and no-light conditions
US11867650B2 (en) 2021-04-06 2024-01-09 Apple Inc. Enclosure detection for reliable optical failsafe
CN114184091A (en) * 2021-04-08 2022-03-15 西安龙飞电气技术有限公司 Infrared radar dual-mode digital processing method for air-to-air missile seeker
WO2022221153A1 (en) * 2021-04-11 2022-10-20 Allen Samuels Remote vehicle operation with high latency communications
CN113345185B (en) * 2021-04-12 2022-08-30 中国地质大学(武汉) Passive door and window alarm device based on LoRa scattering communication method
US11770632B2 (en) 2021-04-14 2023-09-26 Allegro Microsystems, Llc Determining a temperature of a pixel array by measuring voltage of a pixel
US11409000B1 (en) * 2021-04-15 2022-08-09 Aeva, Inc. Techniques for simultaneous determination of range and velocity with passive modulation
US11646787B2 (en) * 2021-04-16 2023-05-09 Qualcomm Incorporated Utilization of sensor information by repeaters
DE102021203829A1 (en) * 2021-04-19 2022-10-20 Robert Bosch Gesellschaft mit beschränkter Haftung Range-optimized LiDAR system and LiDAR device (110) and control device for such a LiDAR system
EP4327119A1 (en) 2021-04-23 2024-02-28 OSRAM GmbH Amplitude shift keying lidar
CN113094460B (en) * 2021-04-25 2023-07-28 南京大学 Three-dimensional building progressive coding and transmission method and system of structure level
CN113219493B (en) * 2021-04-26 2023-08-25 中山大学 End-to-end cloud data compression method based on three-dimensional laser radar sensor
US11816187B2 (en) * 2021-04-30 2023-11-14 Intuit Inc. Anomaly detection in event-based systems using image processing
US11592674B2 (en) * 2021-05-03 2023-02-28 Microsoft Technology Licensing, Llc External illumination with reduced detectability
US11754683B2 (en) 2021-05-10 2023-09-12 nEYE Systems, Inc. Pseudo monostatic LiDAR with two-dimensional silicon photonic mems switch array
CN113219980B (en) * 2021-05-14 2024-04-12 深圳中智永浩机器人有限公司 Robot global self-positioning method, device, computer equipment and storage medium
US20220371533A1 (en) * 2021-05-18 2022-11-24 Motional Ad Llc Distributed vehicle body sensors for event detection
EP4323809A1 (en) * 2021-05-19 2024-02-21 Neye Systems, Inc. Lidar with microlens array and integrated photonic switch array
CN113298964A (en) * 2021-05-21 2021-08-24 深圳市大道至简信息技术有限公司 Roadside parking linkage management method and system based on high-level video
US20220374428A1 (en) * 2021-05-24 2022-11-24 Nvidia Corporation Simulation query engine in autonomous machine applications
CN113328802B (en) * 2021-05-27 2022-04-22 北方工业大学 OCC-VLC heterogeneous networking system
CN113488489B (en) * 2021-06-02 2024-02-23 汇顶科技私人有限公司 Pixel unit, light sensor and ranging system based on flight time
CN113219990B (en) * 2021-06-02 2022-04-26 西安电子科技大学 Robot path planning method based on adaptive neighborhood and steering cost
CN113407465B (en) * 2021-06-16 2024-02-09 深圳市同泰怡信息技术有限公司 Switch configuration method and device of baseboard management controller and computer equipment
KR20220170151A (en) * 2021-06-22 2022-12-29 현대자동차주식회사 Method and Apparatus for Intrusion Response to In-Vehicle Network
CN113345106A (en) * 2021-06-24 2021-09-03 西南大学 Three-dimensional point cloud analysis method and system based on multi-scale multi-level converter
CN113469907B (en) * 2021-06-28 2023-04-07 西安交通大学 Data simplification method and system based on blade profile characteristics
CN113466924B (en) * 2021-07-01 2023-05-05 成都理工大学 Symmetrical warhead pulse forming device and method
CN113253219B (en) * 2021-07-05 2021-09-17 天津所托瑞安汽车科技有限公司 No-reference object self-calibration method, device, equipment and medium of millimeter wave radar
CN113341427B (en) * 2021-07-09 2024-05-17 中国科学技术大学 Distance measurement method, distance measurement device, electronic equipment and storage medium
CN113238237B (en) * 2021-07-12 2021-10-01 天津天瞳威势电子科技有限公司 Library position detection method and device
US20230015697A1 (en) * 2021-07-13 2023-01-19 Citrix Systems, Inc. Application programming interface (api) authorization
CN113724146A (en) * 2021-07-14 2021-11-30 北京理工大学 Single-pixel imaging method based on plug-and-play prior
CN113341402A (en) * 2021-07-15 2021-09-03 哈尔滨工程大学 Sonar device for sonar monitoring robot
US11804951B2 (en) * 2021-07-19 2023-10-31 Infineon Technologies Ag Advanced sensor security protocol
US20230023043A1 (en) * 2021-07-21 2023-01-26 Waymo Llc Optimized multichannel optical system for lidar sensors
US11875548B2 (en) * 2021-07-22 2024-01-16 GM Global Technology Operations LLC System and method for region of interest window generation for attention based perception
DE102021119423A1 (en) * 2021-07-27 2023-02-02 Sick Ag Photoelectric sensor and method for detecting an object using the triangulation principle
CN113485997B (en) * 2021-07-27 2023-10-31 中南大学 Trajectory data deviation rectifying method based on probability distribution deviation estimation
TWI788939B (en) * 2021-08-03 2023-01-01 崑山科技大學 Method and system for auxiliary detection of Parkinson's disease
US20230044279A1 (en) * 2021-08-04 2023-02-09 Atieva, Inc. Sensor-based control of lidar resolution configuration
WO2023023106A2 (en) * 2021-08-18 2023-02-23 Lyte Technologies Inc. Optical transceiver arrays
US11929325B2 (en) * 2021-08-18 2024-03-12 Qualcomm Incorporated Mixed pitch track pattern
CN117859083A (en) * 2021-08-18 2024-04-09 莱特人工智能公司 Integrated array for coherent optical detection
CN215990972U (en) * 2021-08-20 2022-03-08 深圳市首力智能科技有限公司 Rotatable monitoring device
US20230060383A1 (en) * 2021-08-25 2023-03-02 Cyngn, Inc. System and method of off-board-centric autonomous driving computation
CN113758480B (en) * 2021-08-26 2022-07-26 南京英尼格玛工业自动化技术有限公司 Surface type laser positioning system, AGV positioning calibration system and AGV positioning method
CN113676484B (en) * 2021-08-27 2023-04-18 绿盟科技集团股份有限公司 Attack tracing method and device and electronic equipment
CN113722796B (en) * 2021-08-29 2023-07-18 中国长江电力股份有限公司 Vision-laser radar coupling-based lean texture tunnel modeling method
CN113743769B (en) * 2021-08-30 2023-07-11 广东电网有限责任公司 Data security detection method and device, electronic equipment and storage medium
CN113687429B (en) * 2021-08-30 2023-07-04 四川启睿克科技有限公司 Device and method for determining boundary of millimeter wave radar monitoring area
DE102021122418A1 (en) * 2021-08-31 2023-03-02 Sick Ag Photoelectric sensor and method for detecting objects in a surveillance area
FR3126506A1 (en) * 2021-08-31 2023-03-03 Valeo Vision Automotive lighting device and object detection method
US20230061830A1 (en) * 2021-09-02 2023-03-02 Canoo Technologies Inc. Metamorphic labeling using aligned sensor data
CN113781339B (en) * 2021-09-02 2023-06-23 中科联芯(广州)科技有限公司 Silicon-based multispectral signal processing method and device and mobile terminal
TWI787988B (en) * 2021-09-03 2022-12-21 啟碁科技股份有限公司 Detection system and detection method
US20230071312A1 (en) * 2021-09-08 2023-03-09 PassiveLogic, Inc. External Activation of Quiescent Device
US11830383B2 (en) * 2021-09-08 2023-11-28 PassiveLogic, Inc. External activating of quiescent device
CN117941410A (en) 2021-09-09 2024-04-26 大众汽车股份公司 Apparatus, method and computer program for vehicle
CN113766218B (en) * 2021-09-14 2024-05-14 北京集创北方科技股份有限公司 Position detection method of optical lens, electronic device and storage medium
CN113884034B (en) * 2021-09-16 2023-08-15 北方工业大学 Lei Dawei vibration target deformation inversion method and device
TWI780916B (en) * 2021-09-16 2022-10-11 英業達股份有限公司 Quantum chip cooling management device and method
CN113820695A (en) * 2021-09-17 2021-12-21 深圳市睿联技术股份有限公司 Ranging method and apparatus, terminal system, and computer-readable storage medium
US20230089124A1 (en) * 2021-09-20 2023-03-23 DC-001, Inc. dba Spartan Radar Systems and methods for determining the local position of a vehicle using radar
EP4156107A1 (en) * 2021-09-24 2023-03-29 Beijing Xiaomi Mobile Software Co., Ltd. Method and apparatus of encoding/decoding point cloud geometry data sensed by at least one sensor
CN113741541A (en) * 2021-09-28 2021-12-03 广州极飞科技股份有限公司 Unmanned aerial vehicle flight control method, device, system, equipment and storage medium
DE102021210798A1 (en) 2021-09-28 2023-03-30 Volkswagen Aktiengesellschaft Beam deflection device for a laser device of a motor vehicle, and laser device
US20230099674A1 (en) * 2021-09-29 2023-03-30 Subaru Corporation Vehicle backup warning systems
US11378664B1 (en) * 2021-10-05 2022-07-05 Aeva, Inc. Techniques for compact optical sensing module with hybrid multi-chip integration
US11706853B2 (en) * 2021-10-06 2023-07-18 Microsoft Technology Licensing, Llc Monitoring an emission state of light sources
WO2023058671A1 (en) * 2021-10-08 2023-04-13 ソニーセミコンダクタソリューションズ株式会社 Image sensor, data processing device, and image sensor system
WO2023058670A1 (en) * 2021-10-08 2023-04-13 ソニーセミコンダクタソリューションズ株式会社 Image sensor, data processing device, and image sensor system
WO2023058669A1 (en) * 2021-10-08 2023-04-13 ソニーセミコンダクタソリューションズ株式会社 Image sensor, data processing device, and image sensor system
CN113949629A (en) * 2021-10-15 2022-01-18 深圳忆联信息系统有限公司 Server baseboard management controller initialization method and device, and computer equipment
US11847756B2 (en) * 2021-10-20 2023-12-19 Snap Inc. Generating ground truths for machine learning
CN113879435B (en) * 2021-11-01 2022-11-01 深圳市乐骑智能科技有限公司 Internet of Things-based automatic control method for electric scooter turn signals, and electric scooter
CN114244907B (en) * 2021-11-23 2024-01-16 华为技术有限公司 Radar data compression method and device
DE102021130609A1 (en) 2021-11-23 2023-05-25 Scantinel Photonics GmbH Device and method for scanning the distance to an object
CN114475650B (en) * 2021-12-01 2022-11-01 中铁十九局集团矿业投资有限公司 Vehicle driving behavior determination method, device, equipment and medium
CN114268368B (en) * 2021-12-01 2023-09-19 重庆邮电大学 Design method of an unmanned aerial vehicle high-capacity chaotic space laser secure emergency communication system
WO2023108177A2 (en) * 2021-12-06 2023-06-15 Quantum Biotek Inc. Pulse wave velocity detection device and the hemodynamic determination of HbA1c, arterial age and calcium score
CN114157358B (en) * 2021-12-10 2022-12-27 中国科学院西安光学精密机械研究所 Ground simulation device for sun outage in laser communication
JP7097647B1 (en) * 2021-12-16 2022-07-08 Dolphin株式会社 Adjustment method and program of optical scanning device, object detection device, optical scanning device
CN116265983A (en) * 2021-12-17 2023-06-20 上海禾赛科技有限公司 Laser radar control method and multichannel laser radar
US20230204725A1 (en) * 2021-12-23 2023-06-29 Suteng Innovation Technology Co., Ltd. Lidar anti-interference method and apparatus, storage medium, and lidar
CN114056352A (en) * 2021-12-24 2022-02-18 上海海积信息科技股份有限公司 Automatic driving control device and vehicle
CN116338634A (en) * 2021-12-24 2023-06-27 深圳市速腾聚创科技有限公司 Waveguide assembly, integrated chip and laser radar
US20230204781A1 (en) * 2021-12-28 2023-06-29 Nio Technology (Anhui) Co., Ltd. Time of flight cameras using passive image sensors and existing light sources
CN114370828B (en) * 2021-12-28 2023-06-20 中国铁路设计集团有限公司 Shield tunnel diameter convergence and radial dislocation detection method based on laser scanning
CN113985420B (en) * 2021-12-28 2022-05-03 四川吉埃智能科技有限公司 Method for compensating scanning light path error of laser radar inclined by 45 degrees
CN114413961B (en) * 2021-12-30 2024-04-26 军事科学院系统工程研究院军事新能源技术研究所 Test evaluation device for dynamic laser wireless energy transmission system
CN114325741B (en) * 2021-12-31 2023-04-07 探维科技(北京)有限公司 Detection module and laser ranging system
US11927814B2 (en) * 2022-01-05 2024-03-12 Scidatek Inc. Semiconductor photodetector array sensor integrated with optical-waveguide-based devices
CN114527483B (en) * 2022-01-06 2022-09-13 北京福通互联科技集团有限公司 Active detection photoelectric image acquisition system
US20230246601A1 (en) * 2022-01-31 2023-08-03 Qorvo Us, Inc. Protection circuit for acoustic filter and power amplifier stage
CN114705081B (en) * 2022-02-11 2023-09-08 广东空天科技研究院 Deformable and recyclable backpack-type rocket-aircraft combination air launch system
CN114444615B (en) * 2022-02-14 2023-04-07 烟台大学 Bayesian classification recognition system based on industrial PaaS platform and recognition method thereof
DE102023103823A1 (en) 2022-02-16 2023-08-17 Elmos Semiconductor Se Module for emitting electromagnetic radiation, in particular a laser light module
DE102023100352B3 (en) 2022-02-16 2023-04-27 Elmos Semiconductor Se LIDAR VCSEL laser module with low parasitic inductances
CN114567632B (en) * 2022-02-23 2023-09-19 中煤能源研究院有限责任公司 Progressive coding edge intelligent image transmission method, system, equipment and medium
CN114545405B (en) * 2022-02-24 2023-05-02 电子科技大学 Real-beam scanning radar angle super-resolution method based on neural network
CN114723672A (en) * 2022-03-09 2022-07-08 杭州易现先进科技有限公司 Method, system, device and medium for three-dimensional reconstruction data acquisition and verification
CN114624818B (en) * 2022-03-18 2024-03-29 苏州山河光电科技有限公司 Fiber Bragg grating device and sensing equipment
CN114719830B (en) * 2022-03-23 2023-06-23 深圳市维力谷无线技术股份有限公司 Backpack type mobile mapping system and mapping instrument with same
CN114419260B (en) * 2022-03-30 2022-06-17 山西建筑工程集团有限公司 Method for measuring earthwork quantities in three-dimensional topographic surveying and mapping using a composite point cloud network
CN114724368B (en) * 2022-03-31 2023-04-25 海南龙超信息科技集团有限公司 Smart city traffic management system
CN114814880A (en) * 2022-04-01 2022-07-29 深圳市灵明光子科技有限公司 Laser radar detection parameter adjustment control method and device
DE102022107842A1 (en) 2022-04-01 2023-10-05 Valeo Schalter Und Sensoren Gmbh Detection device and method for detecting a person in a vehicle interior of a vehicle
CN114861587B (en) * 2022-04-07 2023-03-10 珠海妙存科技有限公司 Chip carrier plate pin arrangement design method, system, device and storage medium
US20230333255A1 (en) * 2022-04-14 2023-10-19 Aurora Operations, Inc. Lidar system
US11886095B2 (en) 2022-04-15 2024-01-30 Raytheon Company Scalable unit cell device for large two-dimensional arrays with integrated phase control
CN114743269B (en) * 2022-04-19 2022-12-02 国网湖北省电力有限公司黄石供电公司 Method and system for identifying non-standard operations of transformer substation workers
CN114519403B (en) * 2022-04-19 2022-09-02 清华大学 Optical graph neural classification network and method based on an on-chip diffractive neural network
DE102022203850A1 (en) * 2022-04-20 2023-10-26 Robert Bosch Gesellschaft mit beschränkter Haftung Device and method for determining a pupil position
EP4270050A1 (en) * 2022-04-25 2023-11-01 Leica Geosystems AG Method for coordinative measuring by terrestrial scanning with image-based interference detection of moving objects
CN114935751B (en) * 2022-05-13 2024-04-12 中国科学院西安光学精密机械研究所 High-digital dynamic target simulator and simulation method
CN114999581B (en) * 2022-06-13 2023-11-10 华东交通大学 Time lag identification method and system for rare earth extraction and separation process
CN114758311B (en) * 2022-06-14 2022-09-02 北京航空航天大学 Traffic flow prediction method and system based on heterogeneous feature fusion
CN114764911B (en) * 2022-06-15 2022-09-23 小米汽车科技有限公司 Obstacle information detection method, obstacle information detection device, electronic device, and storage medium
WO2023245145A2 (en) * 2022-06-16 2023-12-21 Nanopath Inc. Multiplexed pathogen detection using nanoplasmonic sensor for human papillomavirus
EP4300133A1 (en) * 2022-06-27 2024-01-03 VoxelSensors SRL Optical sensing system
CN115168345B (en) * 2022-06-27 2023-04-18 天翼爱音乐文化科技有限公司 Database classification method, system, device and storage medium
DE102022116331A1 (en) * 2022-06-30 2024-01-04 Connaught Electronics Ltd. Camera for a vehicle, method for operating a camera and system with a camera, e.g. for use as a camera monitor system or all-round view system
CN115313128B (en) * 2022-07-07 2024-04-26 北京工业大学 Interference system based on multispectral mid-wave infrared picosecond all-fiber laser
CN115222791B (en) * 2022-07-15 2023-08-15 小米汽车科技有限公司 Target association method, device, readable storage medium and chip
CN114935739B (en) * 2022-07-20 2022-11-01 南京恩瑞特实业有限公司 Compensation system for an internal test source in a phased array weather radar
CN115100503B (en) * 2022-07-29 2024-05-07 电子科技大学 Method, system, storage medium and terminal for generating adversarial point clouds based on curvature distance and Hard Concrete distribution
CN115253141B (en) * 2022-08-03 2023-06-02 杭州智缤科技有限公司 Low-power-consumption intelligent fire hydrant system, control method and control system
KR20240040057A (en) * 2022-08-05 2024-03-27 코어포토닉스 리미티드 System and method for zoom digital camera with automatically adjustable zoom field of view
CN115341165B (en) * 2022-08-22 2023-10-10 中国科学院长春应用化学研究所 Photographing equipment system for powder coating thermal spraying
CN115144842B (en) * 2022-09-02 2023-03-14 深圳阜时科技有限公司 Transmitting module, photoelectric detection device, electronic equipment and three-dimensional information detection method
EP4336211A1 (en) * 2022-09-07 2024-03-13 Nokia Technologies Oy Controlling devices using lidar signals
CN115279038B (en) * 2022-09-26 2022-12-27 深圳国人无线通信有限公司 Wiring method suitable for high-speed signal transmission and PCB
CN115356748B (en) * 2022-09-29 2023-01-17 江西财经大学 Method and system for extracting atmospheric pollution information based on laser radar observation results
WO2024081258A1 (en) * 2022-10-14 2024-04-18 Motional Ad Llc Plenoptic sensor devices, systems, and methods
DE102022127124A1 (en) 2022-10-17 2024-04-18 Bayerische Motoren Werke Aktiengesellschaft Method for generating a test data set for assessing a blockage of a LIDAR sensor for a driver assistance system of a motor vehicle
DE102022127121A1 (en) 2022-10-17 2024-04-18 Bayerische Motoren Werke Aktiengesellschaft LIDAR system for a driver assistance system of a motor vehicle
DE102022127122A1 (en) 2022-10-17 2024-04-18 Bayerische Motoren Werke Aktiengesellschaft LIDAR system for a driver assistance system
WO2024092177A1 (en) * 2022-10-27 2024-05-02 Analog Photonics LLC Doppler processing in coherent lidar
CN115390164B (en) * 2022-10-27 2023-01-31 南京信息工程大学 Radar echo extrapolation forecasting method and system
CN116050243B (en) * 2022-11-16 2023-09-05 南京玻璃纤维研究设计院有限公司 Glass resistivity prediction method and system based on functional statistical model
CN115603849B (en) * 2022-11-24 2023-04-07 北京理工大学深圳汽车研究院(电动车辆国家工程实验室深圳研究院) Multi-sensor trigger control method, device, equipment and storage medium
CN115599799B (en) * 2022-11-30 2023-03-10 中南大学 Blockchain and federated learning fusion method for medical big data
CN115599025B (en) * 2022-12-12 2023-03-03 南京芯驰半导体科技有限公司 Resource grouping control system, method and storage medium of chip array
CN115664426B (en) * 2022-12-27 2023-03-21 深圳安德空间技术有限公司 Real-time lossless compression method and system for ground penetrating radar data
CN115825952A (en) * 2023-01-19 2023-03-21 中国科学院空天信息创新研究院 Satellite-borne SAR imaging method for simultaneous dual side-looking imaging
US11906623B1 (en) * 2023-01-25 2024-02-20 Plusai, Inc. Velocity estimation using light detection and ranging (LIDAR) system
CN115827938B (en) * 2023-02-20 2023-04-21 四川省煤田测绘工程院 Territorial spatial planning data acquisition method, electronic equipment and computer-readable medium
CN115981375B (en) * 2023-03-17 2023-07-28 南京信息工程大学 Design method of multi-unmanned aerial vehicle time-varying formation controller based on event triggering mechanism
CN116030212B (en) * 2023-03-28 2023-06-02 北京集度科技有限公司 Map construction method, device, vehicle and storage medium
CN116051429B (en) * 2023-03-31 2023-07-18 深圳时识科技有限公司 Data augmentation method, spiking neural network training method, storage medium and chip
CN116184368B (en) * 2023-04-25 2023-07-11 山东科技大学 Gaussian-Markov-based airborne radar placement error interpolation correction method
CN116232123B (en) * 2023-05-06 2023-08-08 太原理工大学 Energy self-adaptive conversion device and method based on mining air duct vibration spectrum
CN116242414B (en) * 2023-05-12 2023-08-11 深圳深浦电气有限公司 Response time detection system and detection device
US11927673B1 (en) * 2023-05-16 2024-03-12 Wireless Photonics, Llc Method and system for vehicular lidar and communication utilizing a vehicle head light and/or taillight
CN116671900B (en) * 2023-05-17 2024-03-19 安徽理工大学 Blink recognition and control method based on brain wave instrument
CN116466328A (en) * 2023-06-19 2023-07-21 深圳市矽赫科技有限公司 Flash intelligent optical radar device and system
CN117075130A (en) * 2023-07-07 2023-11-17 中国电子科技集团公司第三十八研究所 Low-speed small target laser tracking device and working method thereof
CN116609766B (en) * 2023-07-21 2023-11-07 深圳市速腾聚创科技有限公司 Laser radar and mobile device
CN116629183B (en) * 2023-07-24 2023-10-13 湖南大学 Silicon carbide MOSFET interference source modeling method, equipment and storage medium
CN116660866B (en) * 2023-07-31 2023-12-05 今创集团股份有限公司 Laser radar visual detection box and manufacturing method and application thereof
CN116721301B (en) * 2023-08-10 2023-10-24 中国地质大学(武汉) Training method, classification method, device and storage medium for target scene classification model
CN116886637B (en) * 2023-09-05 2023-12-19 北京邮电大学 Single-feature encrypted traffic detection method and system based on graph integration
CN117036647B (en) * 2023-10-10 2023-12-12 中国电建集团昆明勘测设计研究院有限公司 Ground surface extraction method based on an oblique real-scene three-dimensional model
CN117098255B (en) * 2023-10-19 2023-12-15 南京波达电子科技有限公司 Edge computing-based decentralized radar ad hoc network method
CN117170093B (en) * 2023-11-03 2024-01-09 山东创瑞激光科技有限公司 Optical path system for surface scanning
CN117278328B (en) * 2023-11-21 2024-02-06 广东车卫士信息科技有限公司 Data processing method and system based on Internet of vehicles
CN117478278B (en) * 2023-12-26 2024-03-15 南京信息工程大学 Method, device, terminal and storage medium for realizing zero-error communication
CN117470719B (en) * 2023-12-27 2024-03-12 山西省生态环境监测和应急保障中心(山西省生态环境科学研究院) Multifunctional environment monitoring robot
CN117556221B (en) * 2024-01-09 2024-03-26 四川大学 Data analysis method and system based on intelligent electrical control interaction session
CN117590353B (en) * 2024-01-19 2024-03-29 山东省科学院海洋仪器仪表研究所 Method for rapidly extracting and imaging weak echo signals of photon counting laser radar
CN117890898B (en) * 2024-03-01 2024-05-14 清华大学 Bistatic radar encryption target detection method based on phase center agile array
CN117974369A (en) * 2024-03-29 2024-05-03 陕西交控通宇交通研究有限公司 Intelligent bridge construction monitoring method and device

Family Cites Families (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7436038B2 (en) * 2002-02-05 2008-10-14 E-Phocus, Inc Visible/near infrared image sensor array
US8569700B2 (en) * 2012-03-06 2013-10-29 Omnivision Technologies, Inc. Image sensor for two-dimensional and three-dimensional image capture
CN103346476B (en) * 2013-06-24 2015-10-28 中国科学院长春光学精密机械与物理研究所 Photonic crystal nanocavity quantum ring single-photon emission device and preparation method thereof
US20150362587A1 (en) * 2014-06-17 2015-12-17 Microsoft Corporation Lidar sensor calibration using surface pattern detection
IL233356A (en) * 2014-06-24 2015-10-29 Brightway Vision Ltd Gated sensor based imaging system with minimized delay time between sensor exposures
EP3161520B1 (en) * 2014-06-27 2021-10-13 HRL Laboratories, LLC Compressive scanning lidar
IL235359A0 (en) * 2014-10-27 2015-11-30 Ofer David High dynamic range imaging of environment with a high-intensity reflecting/transmitting source
CN104682194A (en) * 2014-11-02 2015-06-03 北京工业大学 Double-resonance vertical-cavity surface-emitting laser structure for generating terahertz wave and microwave
TWI744196B (en) 2015-08-04 2021-10-21 光程研創股份有限公司 Method for fabricating image sensor array
WO2017112416A1 (en) * 2015-12-20 2017-06-29 Apple Inc. Light detection and ranging sensor
US10761195B2 (en) * 2016-04-22 2020-09-01 OPSYS Tech Ltd. Multi-wavelength LIDAR system
US10451740B2 (en) * 2016-04-26 2019-10-22 Cepton Technologies, Inc. Scanning lidar systems for three-dimensional sensing
US20170372602A1 (en) * 2016-06-24 2017-12-28 Continental Advanced Lidar Solutions Us, Llc Ladar enabled traffic control
US10120214B2 (en) * 2016-06-24 2018-11-06 Qualcomm Incorporated Systems and methods for light beam position detection
DE102017208052A1 (en) 2017-05-12 2018-11-15 Robert Bosch Gmbh Transmitter optics for a LiDAR system, optical arrangement for a LiDAR system, LiDAR system and working device
DE202018006696U1 (en) * 2017-05-15 2022-04-01 Ouster, Inc. Optical image transmitter with brightness improvement
FR3066621A1 (en) * 2017-05-17 2018-11-23 Valeo Systemes D'essuyage OPTICAL SENSOR PROTECTION DEVICE, DRIVING ASSISTANCE SYSTEM AND CORRESPONDING ASSEMBLY METHOD
DE102017213298A1 (en) 2017-08-01 2019-02-07 Osram Gmbh Data transmission to a motor vehicle
DE102017213465A1 (en) 2017-08-03 2019-02-07 Robert Bosch Gmbh Fiber optic based LiDAR system
DE102017216198A1 (en) 2017-09-13 2019-03-14 Osram Gmbh DATA TRANSMISSION BY A MOTOR VEHICLE
DE102017127963A1 (en) 2017-11-27 2019-05-29 Valeo Schalter Und Sensoren Gmbh Circuit arrangement for detecting light
CN208111471U (en) * 2018-04-25 2018-11-16 孙刘杰 Flip-chip RCLED based on MJT technology
CN109541569A (en) 2018-09-30 2019-03-29 北醒(北京)光子科技有限公司 Laser radar APD temperature compensation system and measurement method
DE102019001005A1 (en) 2019-02-11 2019-08-01 Daimler Ag Device and method for the compression of sensor data
CN110620169B (en) * 2019-09-10 2020-08-28 北京工业大学 Transverse current limiting high-efficiency light-emitting diode based on resonant cavity
CN115986033A (en) * 2022-12-28 2023-04-18 厦门大学 Resonant cavity light-emitting diode simultaneously radiating orthogonal linearly polarized light

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220291388A1 (en) * 2021-03-15 2022-09-15 Argo AI, LLC Compressive sensing for photodiode data
US11567212B2 (en) * 2021-03-15 2023-01-31 Argo AI, LLC Compressive sensing for photodiode data
DE102022124675A1 (en) 2022-09-26 2024-03-28 Ifm Electronic Gmbh PMD sensor with multiple semiconductor levels

Also Published As

Publication number Publication date
EP3963355A1 (en) 2022-03-09
CN113795773A (en) 2021-12-14
WO2020182591A1 (en) 2020-09-17
US20220276352A1 (en) 2022-09-01
US20200284883A1 (en) 2020-09-10
CN114942454A (en) 2022-08-26
DE112020001131T5 (en) 2022-01-27
US20240094353A1 (en) 2024-03-21
US11726184B2 (en) 2023-08-15
CN114942453A (en) 2022-08-26
CA3173966A1 (en) 2020-09-17

Similar Documents

Publication Publication Date Title
US20220276351A1 (en) Component and sensor for a lidar sensor system, lidar sensor system and method for a lidar sensor system
US20230205222A1 (en) Planar-Beam, Light Detection and Ranging System
WO2020170841A1 (en) Avalanche-photodiode sensor and distance measurement device
US20220082660A1 (en) Light Detection and Ranging (Lidar) Device having a Light-Guide Manifold
US20230358870A1 (en) Systems and methods for tuning filters for use in lidar systems
US20230194717A1 (en) Utilizing light detection and ranging sensors for vehicle-to-everything communications
CN117561458A (en) LIDAR system and method for vehicle corner mounting
Chai et al. Technologies for autonomous driving
US20230305160A1 (en) Multimodal detection with integrated sensors
US20230213623A1 (en) Systems and methods for scanning a region of interest using a light detection and ranging scanner
US11947048B2 (en) Crosstalk reduction for light detection and ranging (lidar) devices using wavelength locking
US11871130B2 (en) Compact perception device
US20240103138A1 (en) Stray light filter structures for lidar detector array
US20230305124A1 (en) Methods and systems of window blockage detection for lidar
US20230138819A1 (en) Compact lidar systems for detecting objects in blind-spot areas
US20240159518A1 (en) Unevenly distributed illumination for depth sensor
WO2024049500A2 (en) Multimodal detection with integrated sensors
WO2023130125A1 (en) Systems and methods for scanning a region of interest using a light detection and ranging scanner
Padullaparthi Automotive LiDARs
WO2024072664A1 (en) Improvement on wavelength stability of multijunction diode laser in lidar
WO2023107099A1 (en) Light detection and ranging (lidar) device having a light-guide manifold
WO2023076635A1 (en) Compact lidar systems for detecting objects in blind-spot areas
WO2023183425A1 (en) Methods and systems of window blockage detection for lidar

Legal Events

Date Code Title Description
AS Assignment

Owner name: OSRAM GMBH, GERMANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FERREIRA, RICARDO;HADRATH, STEFAN;HOEHMANN, PETER;AND OTHERS;SIGNING DATES FROM 20211105 TO 20211125;REEL/FRAME:059896/0979

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION