USRE48042E1 - Devices and methods for a LIDAR platform with a shared transmit/receive path - Google Patents


Info

Publication number
USRE48042E1
USRE48042E1 (application US15/919,479; US201815919479A)
Authority
US
United States
Prior art keywords: light, lens, lidar device, receive, block
Prior art date
Legal status
Active
Application number
US15/919,479
Inventor
Gaetan Pennecot
Pierre-Yves Droz
Drew Eugene Ulrich
Daniel Gruver
Zachary Morriss
Anthony Levandowski
Current Assignee
Waymo LLC
Waymo Holding Inc
Original Assignee
Waymo LLC
Priority date
Filing date
Publication date
Family has litigation
First worldwide family litigation filed: https://patents.darts-ip.com/?family=51493404&utm_source=google_patent&utm_medium=platform_link&utm_campaign=public_patent_search&patent=USRE48042(E1) ("Global patent litigation dataset" by Darts-ip, licensed under a Creative Commons Attribution 4.0 International License)
Application filed by Waymo LLC
Priority to US15/919,479 (USRE48042E1)
Priority to US16/890,789 (USRE48874E1)
Application granted
Publication of USRE48042E1
Assigned to GOOGLE INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: DROZ, PIERRE-YVES; GRUVER, DANIEL; LEVANDOWSKI, ANTHONY; MORRISS, ZACHARY; PENNECOT, GAETAN; ULRICH, DREW EUGENE
Assigned to WAYMO HOLDING INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: GOOGLE INC.
Assigned to WAYMO LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: WAYMO HOLDING INC.

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00: Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/48: Details of systems according to group G01S 17/00
    • G01S 7/481: Constructional features, e.g. arrangements of optical elements
    • G01S 7/4811: Constructional features common to transmitter and receiver
    • G01S 7/4812: Transmitted and received beams following a coaxial path
    • G01S 7/4813: Housing arrangements
    • G01S 7/4814: Constructional features of transmitters alone
    • G01S 7/4815: Constructional features of transmitters using multiple transmitters
    • G01S 7/4816: Constructional features of receivers alone
    • G01S 7/4817: Constructional features relating to scanning
    • G01S 17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02: Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/06: Systems determining position data of a target
    • G01S 17/42: Simultaneous measurement of distance and other co-ordinates
    • G01S 17/88: Lidar systems specially adapted for specific applications
    • G01S 17/89: Lidar systems for mapping or imaging
    • G01S 17/93: Lidar systems for anti-collision purposes
    • G01S 17/931: Lidar systems for anti-collision purposes of land vehicles

Definitions

  • Vehicles can be configured to operate in an autonomous mode in which the vehicle navigates through an environment with little or no input from a driver.
  • Such autonomous vehicles can include one or more sensors that are configured to detect information about the environment in which the vehicle operates.
  • A LIDAR can estimate distances to environmental features while scanning through a scene to assemble a "point cloud" indicative of reflective surfaces in the environment. Individual points in the point cloud can be determined by transmitting a laser pulse and detecting a returning pulse, if any, reflected from an object in the environment, and determining the distance to the object according to the time delay between the transmitted pulse and the reception of the reflected pulse.
  • a laser, or set of lasers can be rapidly and repeatedly scanned across a scene to provide continuous real-time information on distances to reflective objects in the scene. Combining the measured distances and the orientation of the laser(s) while measuring each distance allows for associating a three-dimensional position with each returning pulse. In this way, a three-dimensional map of points indicative of locations of reflective features in the environment can be generated for the entire scanning zone.
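As a rough illustration of the time-of-flight calculation and point assembly described in the two preceding paragraphs, the following Python sketch converts a pulse's round-trip delay and the laser orientation at the time of measurement into a 3D point. It is not taken from the patent; the helper names, the 200 ns delay, and the example angles are illustrative assumptions.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def range_from_time_delay(t_emit_s: float, t_receive_s: float) -> float:
    """Round-trip time of flight converted to a one-way distance to the reflecting object."""
    return C * (t_receive_s - t_emit_s) / 2.0

def point_from_pulse(distance_m: float, azimuth_rad: float, elevation_rad: float):
    """Combine a measured distance with the laser's orientation to get an (x, y, z) point."""
    x = distance_m * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = distance_m * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = distance_m * math.sin(elevation_rad)
    return (x, y, z)

# Example: a return pulse detected 200 ns after emission while the laser points
# 30 degrees to the left and 2 degrees below horizontal.
d = range_from_time_delay(0.0, 200e-9)                       # ~30 m
p = point_from_pulse(d, math.radians(30.0), math.radians(-2.0))
```

Repeating this for every pulse while the laser orientation sweeps across the scene yields the point cloud described above.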
  • In one example, a light detection and ranging (LIDAR) device includes a housing configured to rotate about an axis.
  • the housing has an interior space that includes a transmit block, a receive block, and a shared space.
  • the transmit block has an exit aperture and the receive block has an entrance aperture.
  • the LIDAR device also includes a plurality of light sources in the transmit block.
  • the plurality of light sources is configured to emit a plurality of light beams that enter the shared space through the exit aperture and traverse the shared space via a transmit path.
  • the light beams include light having wavelengths in a wavelength range.
  • the LIDAR device also includes a plurality of detectors in the receive block. The plurality of detectors is configured to detect light having wavelengths in the wavelength range.
  • the LIDAR device also includes a lens mounted to the housing.
  • the lens is configured to (i) receive the light beams via the transmit path, (ii) collimate the light beams for transmission into an environment of the LIDAR device, (iii) collect light that includes light from one or more of the collimated light beams reflected by one or more objects in the environment of the LIDAR device, and (iv) focus the collected light onto the detectors via a receive path that extends through the shared space and the entrance aperture of the receive block.
  • In another example, a method involves rotating a housing of a light detection and ranging (LIDAR) device about an axis.
  • the housing has an interior space that includes a transmit block, a receive block, and a shared space.
  • the transmit block has an exit aperture and the receive block has an entrance aperture.
  • the method further involves emitting a plurality of light beams by a plurality of light sources in the transmit block.
  • the plurality of light beams enter the shared space via a transmit path.
  • the light beams include light having wavelengths in a wavelength range.
  • the method further involves receiving the light beams at a lens mounted to the housing along the transmit path.
  • the method further involves collimating, by the lens, the light beams for transmission into an environment of the LIDAR device.
  • the method further involves collecting, by the lens, light from one or more of the collimated light beams reflected by one or more objects in the environment of the LIDAR device.
  • the method further involves focusing, by the lens, the collected light onto a plurality of detectors in the receive block via a receive path that extends through the shared space and the entrance aperture of the receive block.
  • the method further involves detecting, by the plurality of detectors in the receive block, light from the focused light having wavelengths in the wavelength range.
  • FIG. 1 is a block diagram of an example LIDAR device.
  • FIG. 2 is a cross-section view of an example LIDAR device.
  • FIG. 3A is a perspective view of an example LIDAR device fitted with various components, in accordance with at least some embodiments described herein.
  • FIG. 3B is a perspective view of the example LIDAR device shown in FIG. 3A with the various components removed to illustrate interior space of the housing.
  • FIG. 4 illustrates an example transmit block, in accordance with at least some embodiments described herein.
  • FIG. 5A is a view of an example light source, in accordance with an example embodiment.
  • FIG. 5B is a view of the light source of FIG. 5A in combination with a cylindrical lens, in accordance with an example embodiment.
  • FIG. 5C is another view of the light source and cylindrical lens combination of FIG. 5B , in accordance with an example embodiment.
  • FIG. 6A illustrates an example receive block, in accordance with at least some embodiments described herein.
  • FIG. 6B illustrates a side view of three detectors included in the receive block of FIG. 6A .
  • FIG. 7A illustrates an example lens with an aspheric surface and a toroidal surface, in accordance with at least some embodiments described herein.
  • FIG. 7B illustrates a cross-section view of the example lens 750 shown in FIG. 7A .
  • FIG. 8A illustrates an example LIDAR device mounted on a vehicle, in accordance with at least some embodiments described herein.
  • FIG. 8B illustrates a scenario where the LIDAR device shown in FIG. 8A is scanning an environment that includes one or more objects, in accordance with at least some embodiments described herein.
  • FIG. 9 is a flowchart of a method, in accordance with at least some embodiments described herein.
  • a light detection and ranging (LIDAR) device may transmit light pulses originating from a plurality of light sources and may receive reflected light pulses that are then detected by a plurality of detectors.
  • a LIDAR device is provided that includes a transmit/receive lens that both collimates the light from the plurality of light sources and focuses the reflected light onto the plurality of detectors.
  • the LIDAR device comprises a housing that is configured to rotate about an axis.
  • the axis is substantially vertical.
  • the housing may have an interior space that includes various components such as a transmit block that includes the plurality of light sources, a receive block that includes the plurality of detectors, a shared space where emitted light traverses from the transmit block to the transmit/receive lens and reflected light traverses from the transmit/receive lens to the receive block, and the transmit/receive lens that collimates the emitted light and focuses the reflected light.
  • the housing may include radio frequency (RF) and optical shielding between the transmit block and the receive block.
  • the housing can be formed from and/or coated by a metal, metallic ink, or metallic foam to provide the RF shielding.
  • Metals used for shielding can include, for example, copper or nickel.
  • the plurality of light sources included in the transmit block can include, for example, laser diodes.
  • the light sources emit light with wavelengths of approximately 905 nm.
  • a transmit path through which the transmit/receive lens receives the light emitted by the light sources may include a reflective element, such as a mirror or prism. By including the reflective element, the transmit path can be folded to provide a smaller size of the transmit block and, hence, a smaller housing of the LIDAR device.
  • the transmit path includes an exit aperture of the transmit block through which the emitted light enters the shared space and traverses to the transmit/receive lens.
  • each light source of the plurality of light sources includes a respective lens, such as a cylindrical or acylindrical lens.
  • the light source may emit an uncollimated light beam that diverges more in a first direction than in a second direction.
  • the light source's respective lens may pre-collimate the uncollimated light beam in the first direction to provide a partially collimated light beam, thereby reducing the divergence in the first direction.
  • the partially collimated light beam diverges less in the first direction than in the second direction.
  • the transmit/receive lens receives the partially collimated light beams from the one or more light sources via an exit aperture of the transmit block and the transmit/receive lens collimates the partially collimated light beams to provide collimated light beams that are transmitted into the environment of the LIDAR device.
  • the light emitted by the light sources may have a greater divergence in the second direction than in the first direction, and the exit aperture can accommodate vertical and horizontal extents of the beams of light from the light sources.
  • the housing mounts the transmit/receive lens through which light from the plurality of light sources can exit the housing, and reflected light can enter the housing to reach the receive block.
  • the transmit/receive lens can have an optical power that is sufficient to collimate the light emitted by the plurality of light sources and to focus the reflected light onto the plurality of detectors in the receive block.
  • the transmit/receive lens has a surface with an aspheric shape that is at the outside of the housing, a surface with a toroidal shape that is inside the housing, and a focal length of approximately 120 mm.
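As a hedged back-of-the-envelope check of what the stated ~120 mm focal length implies, the residual divergence of a collimated beam is roughly the emitter size divided by the focal length. The 200 micron emitter dimension and the 60 m example range below are assumptions drawn from the laser-diode example given later in this document, not figures stated in this paragraph.

```python
import math

focal_length_m = 0.120     # approximate focal length of the transmit/receive lens
emitter_size_m = 200e-6    # assumed slow-axis emitter dimension (see the laser diode example below)

# Residual full-angle divergence of the collimated beam, roughly source size / focal length:
divergence_rad = emitter_size_m / focal_length_m
divergence_deg = math.degrees(divergence_rad)      # ~0.095 degrees

# Beam growth contributed by that divergence at an assumed 60 m range:
footprint_m = 60.0 * divergence_rad                # ~0.1 m
```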
  • the plurality of detectors included in the receive block can include, for example, avalanche photodiodes in a sealed environment that is filled with an inert gas, such as nitrogen.
  • the receive block can include an entrance aperture through which focused light from the transmit/receive lens traverses towards the detectors.
  • the entrance aperture can include a filtering window that passes light having wavelengths within the wavelength range emitted by the plurality of light sources and attenuates light having other wavelengths.
  • the collimated light transmitted from the LIDAR device into the environment may reflect from one or more objects in the environment to provide object-reflected light.
  • the transmit/receive lens may collect the object-reflected light and focus the object-reflected light through a focusing path (“receive path”) onto the plurality of detectors.
  • the receive path may include a reflective surface that directs the focused light to the plurality of detectors. Additionally or alternatively, the reflective surface can fold the focused light towards the receive block and thus provide space savings for the shared space and the housing of the LIDAR device.
  • the reflective surface may define a wall that includes the exit aperture between the transmit block and the shared space.
  • the exit aperture of the transmit block corresponds to a transparent and/or non-reflective portion of the reflective surface.
  • the transparent portion can be a hole or cut-away portion of the reflective surface.
  • the reflective surface can be formed by forming a layer of reflective material on a transparent substrate (e.g., glass) and the transparent portion can be a portion of the substrate that is not coated with the reflective material.
  • the shared space can be used for both the transmit path and the receive path.
  • the transmit path at least partially overlaps the receive path in the shared space.
  • the vertical and horizontal extents of the exit aperture are sufficient to accommodate the beam widths of the emitted light beams from the light sources.
  • the non-reflective nature of the exit aperture prevents a portion of the collected and focused light in the receive path from reflecting, at the reflective surface, towards the detectors in the receive block.
  • reducing the beam widths of the emitted light beams from the transmit block is desirable to minimize the size of the exit aperture and reduce the lost portion of the collected light.
  • the reduction of the beam widths traversing through the exit aperture can be achieved by partially collimating the emitted light beams by including a respective lens, such as a cylindrical or acylindrical lens, adjacent to each light source.
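To see why partial collimation shrinks the required exit aperture, the sketch below compares the beam extent at the aperture with and without a pre-collimating cylindrical lens. The 5 cm source-to-aperture path length is an assumption, and the 25 degree and ~1 degree fast-axis divergences are taken from the laser-diode example later in this document.

```python
import math

def beam_extent_at_aperture(source_dim_m: float, full_divergence_deg: float, path_m: float) -> float:
    """Approximate beam width at the exit aperture: source size plus growth from divergence."""
    half_angle = math.radians(full_divergence_deg) / 2.0
    return source_dim_m + 2.0 * path_m * math.tan(half_angle)

path_m = 0.05  # assumed folded path length from a light source to the exit aperture

# Fast axis with no pre-collimation (10 um aperture, ~25 degree divergence):
uncollimated_m = beam_extent_at_aperture(10e-6, 25.0, path_m)     # ~22 mm
# Fast axis after the cylindrical lens (~200 um effective source, ~1 degree divergence):
precollimated_m = beam_extent_at_aperture(200e-6, 1.0, path_m)    # ~1 mm
```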
  • the transmit/receive lens can be configured to define a focal surface that has a substantial curvature in a vertical plane and/or a horizontal plane.
  • the transmit/receive lens can be configured to have the aspheric surface and the toroidal surface described above that provides the curved focal surface along the vertical plane and/or the horizontal plane.
  • the light sources in the transmit block can be arranged along the transmit/receive lens' curved focal surface in the transmit block, and the detectors in the receive block can be arranged on the transmit/receive lens' curved focal surface in the receive block.
  • the emitted light beams from the light sources arranged along the curved focal surface can converge into the exit aperture having a smaller size than an aperture for light beams that are substantially parallel and/or diverging.
  • the light sources can be mounted on a curved edge of one or more vertically-oriented printed circuit boards (PCBs), such that the curved edge of the PCB substantially matches the curvature of the focal surface in the vertical plane of the PCB.
  • the one or more PCBs can be mounted in the transmit block along a horizontal curvature that substantially matches the curvature of the focal surface in the horizontal plane of the one or more PCBs.
  • the transmit block can include four PCBs, with each PCB mounting sixteen light sources, so as to provide 64 light sources along the curved focal plane of the transmit/receive lens in the transmit block.
  • the 64 light sources are arranged in a pattern substantially corresponding to the curved focal surface defined by the transmit/receive lens such that the emitted light beams converge towards the exit aperture of the transmit block.
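One way to picture the 4 x 16 = 64 source arrangement is to place each emitter on a spherical approximation of the lens' focal surface, so that every emitted beam is aimed back through the lens and converges near the exit aperture. The angular spans and the spherical approximation below are illustrative assumptions; only the counts and the ~120 mm focal length come from the text.

```python
import math

FOCAL_LENGTH_M = 0.120   # approximate transmit/receive lens focal length
NUM_PCBS = 4             # PCBs arranged along the horizontal curvature
SOURCES_PER_PCB = 16     # light sources along each PCB's curved edge
H_SPAN_DEG = 8.0         # assumed horizontal angular span of the source array
V_SPAN_DEG = 26.0        # assumed vertical angular span of the source array

def source_positions():
    """Return 64 emitter positions on a focal surface modeled as a sphere of radius
    FOCAL_LENGTH_M centered on the lens; each position is indexed by (PCB, row) angles."""
    positions = []
    for col in range(NUM_PCBS):
        az = math.radians(-H_SPAN_DEG / 2.0 + col * H_SPAN_DEG / (NUM_PCBS - 1))
        for row in range(SOURCES_PER_PCB):
            el = math.radians(-V_SPAN_DEG / 2.0 + row * V_SPAN_DEG / (SOURCES_PER_PCB - 1))
            positions.append((FOCAL_LENGTH_M * math.cos(el) * math.cos(az),
                              FOCAL_LENGTH_M * math.cos(el) * math.sin(az),
                              FOCAL_LENGTH_M * math.sin(el)))
    return positions
```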
  • the plurality of detectors can be disposed on a flexible PCB that is mounted to the receive block to conform with the shape of the transmit/receive lens' focal surface.
  • the flexible PCB may be held between two clamping pieces that have surfaces corresponding to the shape of the focal surface.
  • each of the plurality of detectors can be arranged on the flexible PCB so as to receive focused light from the transmit/receive lens that corresponds to a respective light source of the plurality of light sources.
  • the detectors can be arranged in a pattern substantially corresponding to the curved focal surface of the transmit/receive lens in the receive block.
  • the transmit/receive lens can be configured to focus onto each detector of the plurality of detectors a respective portion of the collected light that comprises light from the detector's corresponding light source.
  • Within examples, a LIDAR device is provided that uses a shared transmit/receive lens.
  • The LIDAR device can include the shared lens configured to provide a curved focal plane for both the transmitting light sources and the receiving detectors, such that light from the light sources passes through a small exit aperture included in a reflective surface that reflects collected light towards the detectors.
  • FIG. 1 is a block diagram of an example LIDAR device 100 .
  • the LIDAR device 100 comprises a housing 110 that houses an arrangement of various components included in the LIDAR device 100 such as a transmit block 120 , a receive block 130 , a shared space 140 , and a lens 150 .
  • The arrangement of these components allows the LIDAR device 100 to provide emitted light beams 102 from the transmit block 120 that are collimated by the lens 150 and transmitted into an environment of the LIDAR device 100 as collimated light beams 104, and to collect, via the lens 150, reflected light 106 from one or more objects in the environment of the LIDAR device 100 and focus it towards the receive block 130 as focused light 108.
  • the reflected light 106 comprises light from the collimated light beams 104 that was reflected by the one or more objects in the environment of the LIDAR device 100 .
  • the emitted light beams 102 and the focused light 108 traverse in the shared space 140 also included in the housing 110 .
  • the emitted light beams 102 propagate along a transmit path through the shared space 140, and the focused light 108 propagates along a receive path through the shared space 140.
  • the transmit path at least partially overlaps the receive path in the shared space 140 .
  • the LIDAR device 100 can determine an aspect of the one or more objects (e.g., location, shape, etc.) in the environment of the LIDAR device 100 by processing the focused light 108 received by the receive block 130 . For example, the LIDAR device 100 can compare a time when pulses included in the emitted light beams 102 were emitted by the transmit block 120 with a time when corresponding pulses included in the focused light 108 were received by the receive block 130 and determine the distance between the one or more objects and the LIDAR device 100 based on the comparison.
  • the housing 110 included in the LIDAR device 100 can provide a platform for mounting the various components included in the LIDAR device 100 .
  • the housing 110 can be formed from any material capable of supporting the various components of the LIDAR device 100 included in an interior space of the housing 110 .
  • the housing 110 may be formed from a structural material such as plastic or metal.
  • the housing 110 can be configured for optical shielding to reduce ambient light and/or unintentional transmission of the emitted light beams 102 from the transmit block 120 to the receive block 130 .
  • Optical shielding from ambient light of the environment of the LIDAR device 100 can be achieved by forming and/or coating the outer surface of the housing 110 with a material that blocks the ambient light from the environment.
  • inner surfaces of the housing 110 can include and/or be coated with the material described above to optically isolate the transmit block 120 from the receive block 130 to prevent the receive block 130 from receiving the emitted light beams 102 before the emitted light beams 102 reach the lens 150 .
  • the housing 110 can be configured for electromagnetic shielding to reduce electromagnetic noise (e.g., Radio Frequency (RF) noise, etc.) from the ambient environment of the LIDAR device 100 and/or electromagnetic noise between the transmit block 120 and the receive block 130.
  • Electromagnetic shielding can improve quality of the emitted light beams 102 emitted by the transmit block 120 and reduce noise in signals received and/or provided by the receive block 130 .
  • Electromagnetic shielding can be achieved by forming and/or coating the housing 110 with a material that absorbs electromagnetic radiation such as a metal, metallic ink, metallic foam, carbon foam, or any other material configured to absorb electromagnetic radiation.
  • Metals that can be used for the electromagnetic shielding can include, for example, copper or nickel.
  • the housing 110 can be configured to have a substantially cylindrical shape and to rotate about an axis of the LIDAR device 100 .
  • the housing 110 can have the substantially cylindrical shape with a diameter of approximately 10 centimeters.
  • the axis is substantially vertical.
  • a three-dimensional map of a 360 degree view of the environment of the LIDAR device 100 can be determined without frequent recalibration of the arrangement of the various components of the LIDAR device 100 .
  • the LIDAR device 100 can be configured to tilt the axis of rotation of the housing 110 to control the field of view of the LIDAR device 100 .
  • the LIDAR device 100 can optionally include a mounting structure for the housing 110 .
  • the mounting structure can include a motor or other means for rotating the housing 110 about the axis of the LIDAR device 100 .
  • the mounting structure can be included in a device and/or system other than the LIDAR device 100 .
  • the various components of the LIDAR device 100, such as the transmit block 120, the receive block 130, and the lens 150, can be removably mounted to the housing 110 in predetermined positions to reduce the burden of calibrating the arrangement of each component and/or subcomponents included in each component.
  • the housing 110 provides the platform for the various components of the LIDAR device 100 for ease of assembly, maintenance, calibration, and manufacture of the LIDAR device 100 .
  • the transmit block 120 includes a plurality of light sources 122 that can be configured to emit the plurality of emitted light beams 102 via an exit aperture 124 .
  • each of the plurality of emitted light beams 102 corresponds to one of the plurality of light sources 122 .
  • the transmit block 120 can optionally include a mirror 126 along the transmit path of the emitted light beams 102 between the light sources 122 and the exit aperture 124 .
  • the light sources 122 can include laser diodes, light emitting diodes (LED), vertical cavity surface emitting lasers (VCSEL), organic light emitting diodes (OLED), polymer light emitting diodes (PLED), light emitting polymers (LEP), liquid crystal displays (LCD), microelectromechanical systems (MEMS), or any other device configured to selectively transmit, reflect, and/or emit light to provide the plurality of emitted light beams 102 .
  • the light sources 122 can be configured to emit the emitted light beams 102 in a wavelength range that can be detected by detectors 132 included in the receive block 130 .
  • the wavelength range could, for example, be in the ultraviolet, visible, and/or infrared portions of the electromagnetic spectrum.
  • the wavelength range can be a narrow wavelength range, such as provided by lasers. In one example, the wavelength range includes wavelengths that are approximately 905 nm.
  • the light sources 122 can be configured to emit the emitted light beams 102 in the form of pulses. In some examples, the plurality of light sources 122 can be disposed on one or more substrates (e.g., printed circuit boards (PCB), flexible PCBs, etc.) and arranged to emit the plurality of light beams 102 towards the exit aperture 124 .
  • the plurality of light sources 122 can be configured to emit uncollimated light beams included in the emitted light beams 102 .
  • the emitted light beams 102 can diverge in one or more directions along the transmit path due to the uncollimated light beams emitted by the plurality of light sources 122 .
  • vertical and horizontal extents of the emitted light beams 102 at any position along the transmit path can be based on an extent of the divergence of the uncollimated light beams emitted by the plurality of light sources 122 .
  • the exit aperture 124 arranged along the transmit path of the emitted light beams 102 can be configured to accommodate the vertical and horizontal extents of the plurality of light beams 102 emitted by the plurality of light sources 122 at the exit aperture 124 .
  • the block diagram shown in FIG. 1 is described in connection with functional modules for convenience in description. However, the functional modules in the block diagram of FIG. 1 can be physically implemented in other locations.
  • the exit aperture 124 is included in the transmit block 120
  • the exit aperture 124 can be physically included in both the transmit block 120 and the shared space 140 .
  • the transmit block 120 and the shared space 140 can be separated by a wall that includes the exit aperture 124 .
  • the exit aperture 124 can correspond to a transparent portion of the wall.
  • the transparent portion can be a hole or cut-away portion of the wall.
  • the wall can be formed from a transparent substrate (e.g., glass) coated with a non-transparent material, and the exit aperture 124 can be a portion of the substrate that is not coated with the non-transparent material.
  • In the LIDAR device 100, it may be desirable to minimize the size of the exit aperture 124 while accommodating the vertical and horizontal extents of the plurality of light beams 102.
  • minimizing the size of the exit aperture 124 can improve the optical shielding of the light sources 122 described above in the functions of the housing 110 .
  • the wall separating the transmit block 120 and the shared space 140 can be arranged along the receive path of the focused light 108 , and thus, the exit aperture 124 can be minimized to allow a larger portion of the focused light 108 to reach the wall.
  • the wall can be coated with a reflective material (e.g., reflective surface 142 in shared space 140 ) and the receive path can include reflecting the focused light 108 by the reflective material towards the receive block 130 .
  • minimizing the size of the exit aperture 124 can allow a larger portion of the focused light 108 to reflect off the reflective material that the wall is coated with.
  • each light source of the plurality of light sources 122 can include a cylindrical lens arranged adjacent to the light source.
  • the light source may emit a corresponding uncollimated light beam that diverges more in a first direction than in a second direction.
  • the cylindrical lens may pre-collimate the uncollimated light beam in the first direction to provide a partially collimated light beam, thereby reducing the divergence in the first direction.
  • the partially collimated light beam diverges less in the first direction than in the second direction.
  • uncollimated light beams from other light sources of the plurality of light sources 122 can have a reduced beam width in the first direction and thus the emitted light beams 102 can have a smaller divergence due to the partially collimated light beams.
  • at least one of the vertical and horizontal extents of the exit aperture 124 can be reduced due to partially collimating the light beams 102 .
  • the light sources 122 can be arranged along a substantially curved surface defined by the transmit block 120 .
  • the curved surface can be configured such that the emitted light beams 102 converge towards the exit aperture 124 , and thus the vertical and horizontal extents of the emitted light beams 102 at the exit aperture 124 can be reduced due to the arrangement of the light sources 122 along the curved surface of the transmit block 120 .
  • the curved surface of the transmit block 120 can include a curvature along the first direction of divergence of the emitted light beams 102 and a curvature along the second direction of divergence of the emitted light beams 102 , such that the plurality of light beams 102 converge towards a central area in front of the plurality of light sources 122 along the transmit path.
  • the light sources 122 can be disposed on a flexible substrate (e.g., flexible PCB) having a curvature along one or more directions.
  • the curved flexible substrate can be curved along the first direction of divergence of the emitted light beams 102 and the second direction of divergence of the emitted light beams 102 .
  • the light sources 122 can be disposed on a curved edge of one or more vertically-oriented printed circuit boards (PCBs), such that the curved edge of the PCB substantially matches the curvature of the first direction (e.g., the vertical plane of the PCB).
  • the one or more PCBs can be mounted in the transmit block 120 along a horizontal curvature that substantially matches the curvature of the second direction (e.g., the horizontal plane of the one or more PCBs).
  • the transmit block 120 can include four PCBs, with each PCB mounting sixteen light sources, so as to provide 64 light sources along the curved surface of the transmit block 120 .
  • the 64 light sources are arranged in a pattern such that the emitted light beams 102 converge towards the exit aperture 124 of the transmit block 120 .
  • the transmit block 120 can optionally include the mirror 126 along the transmit path of the emitted light beams 102 between the light sources 122 and the exit aperture 124 .
  • the transmit path of the emitted light beams 102 can be folded to provide a smaller size of the transmit block 120 and the housing 110 of the LIDAR device 100 than would be possible with a transmit path that is not folded.
  • the receive block 130 includes a plurality of detectors 132 that can be configured to receive the focused light 108 via an entrance aperture 134 .
  • each of the plurality of detectors 132 is configured and arranged to receive a portion of the focused light 108 corresponding to a light beam emitted by a corresponding light source of the plurality of light sources 122 and reflected off the one or more objects in the environment of the LIDAR device 100.
  • the receive block 130 can optionally include the detectors 132 in a sealed environment having an inert gas 136 .
  • the detectors 132 may comprise photodiodes, avalanche photodiodes, phototransistors, cameras, active pixel sensors (APS), charge coupled devices (CCD), cryogenic detectors, or any other sensor of light configured to receive focused light 108 having wavelengths in the wavelength range of the emitted light beams 102 .
  • the detectors 132 can be disposed on one or more substrates and arranged accordingly.
  • the light sources 122 can be arranged along a curved surface of the transmit block 120
  • the detectors 132 can also be arranged along a curved surface of the receive block 130 .
  • the curved surface of the receive block 130 can similarly be curved along one or more axes of the curved surface of the receive block 130 .
  • each of the detectors 132 is configured to receive light that was originally emitted by a corresponding light source of the plurality of light sources 122.
  • the detectors 132 can be disposed on the one or more substrates similarly to the light sources 122 disposed in the transmit block 120 .
  • the detectors 132 can be disposed on a flexible substrate (e.g., flexible PCB) and arranged along the curved surface of the flexible substrate to each receive focused light originating from a corresponding light source of the light sources 122 .
  • the flexible substrate may be held between two clamping pieces that have surfaces corresponding to the shape of the curved surface of the receive block 130 .
  • assembly of the receive block 130 can be simplified by sliding the flexible substrate onto the receive block 130 and using the two clamping pieces to hold it at the correct curvature.
  • the focused light 108 traversing along the receive path can be received by the detectors 132 via the entrance aperture 134 .
  • the entrance aperture 134 can include a filtering window that passes light having wavelengths within the wavelength range emitted by the plurality of light sources 122 and attenuates light having other wavelengths.
  • the detectors 132 receive the focused light 108 substantially comprising light having the wavelengths within the wavelength range.
  • the plurality of detectors 132 included in the receive block 130 can include, for example, avalanche photodiodes in a sealed environment that is filled with the inert gas 136 .
  • the inert gas 136 may comprise, for example, nitrogen.
  • the shared space 140 includes the transmit path for the emitted light beams 102 from the transmit block 120 to the lens 150 , and includes the receive path for the focused light 108 from the lens 150 to the receive block 130 .
  • the transmit path at least partially overlaps with the receive path in the shared space 140 .
  • the shared space 140 can include a reflective surface 142 .
  • the reflective surface 142 can be arranged along the receive path and configured to reflect the focused light 108 towards the entrance aperture 134 and onto the detectors 132 .
  • the reflective surface 142 may comprise a prism, mirror or any other optical element configured to reflect the focused light 108 towards the entrance aperture 134 in the receive block 130 .
  • a wall separates the shared space 140 from the transmit block 120 .
  • the wall may comprise a transparent substrate (e.g., glass) and the reflective surface 142 may comprise a reflective coating on the wall with an uncoated portion for the exit aperture 124 .
  • the reflective surface 142 can reduce the size of the shared space 140 by folding the receive path similarly to the mirror 126 in the transmit block 120. Additionally or alternatively, in some examples, the reflective surface 142 can direct the focused light 108 to the receive block 130, further providing flexibility to the placement of the receive block 130 in the housing 110. For example, varying the tilt of the reflective surface 142 can cause the focused light 108 to be reflected to various portions of the interior space of the housing 110, and thus the receive block 130 can be placed in a corresponding position in the housing 110. Additionally or alternatively, in this example, the LIDAR device 100 can be calibrated by varying the tilt of the reflective surface 142.
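As a rough feel for the calibration sensitivity mentioned above, the law of reflection implies that tilting the reflective surface by an angle theta steers the reflected (focused) light by about 2 * theta. The tilt value and mirror-to-detector distance below are illustrative assumptions, not values from the patent.

```python
import math

mirror_tilt_deg = 0.1          # assumed small tilt adjustment of the reflective surface 142
mirror_to_detector_m = 0.06    # assumed path length from the reflective surface to the detectors

# A mirror tilted by theta deflects the reflected beam by 2 * theta, so the focused
# spot walks across the detector plane by approximately:
spot_shift_m = 2.0 * math.radians(mirror_tilt_deg) * mirror_to_detector_m   # ~0.21 mm
```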
  • the lens 150 mounted to the housing 110 can have an optical power to both collimate the emitted light beams 102 from the light sources 122 in the transmit block 120 , and focus the reflected light 106 from the one or more objects in the environment of the LIDAR device 100 onto the detectors 132 in the receive block 130 .
  • the lens 150 has a focal length of approximately 120 mm.
  • collimating the emitted light beams 102 to provide the collimated light beams 104 allows determining the distance travelled by the collimated light beams 104 to the one or more objects in the environment of the LIDAR device 100 .
  • the emitted light beams 102 from the light sources 122 traversing along the transmit path can be collimated by the lens 150 to provide the collimated light beams 104 to the environment of the LIDAR device 100 .
  • the collimated light beams 104 may then reflect off the one or more objects in the environment of the LIDAR device 100 and return to the lens 150 as the reflected light 106 .
  • the lens 150 may then collect and focus the reflected light 106 as the focused light 108 onto the detectors 132 included in the receive block 130 .
  • aspects of the one or more objects in the environment of the LIDAR device 100 can be determined by comparing the emitted light beams 102 with the focused light 108.
  • the aspects can include, for example, distance, shape, color, and/or material of the one or more objects.
  • By rotating the housing 110, a three-dimensional map of the surroundings of the LIDAR device 100 can be determined.
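The bullet above summarizes how a rotating housing builds a 360 degree map. A minimal sketch of that accumulation step follows; the per-channel elevation angles and sample measurements are assumptions, and the conversion reuses the spherical-to-Cartesian step shown earlier.

```python
import math

def scan_to_point_cloud(measurements, channel_elevations_deg):
    """measurements: iterable of (azimuth_deg, channel_index, range_m) tuples gathered
    while the housing rotates; returns a list of (x, y, z) points in the device frame."""
    cloud = []
    for azimuth_deg, channel, range_m in measurements:
        az = math.radians(azimuth_deg)
        el = math.radians(channel_elevations_deg[channel])
        cloud.append((range_m * math.cos(el) * math.cos(az),
                      range_m * math.cos(el) * math.sin(az),
                      range_m * math.sin(el)))
    return cloud

# Example: 64 channels assumed to span a -13 to +13 degree vertical field of view.
elevations = [-13.0 + i * 26.0 / 63.0 for i in range(64)]
cloud = scan_to_point_cloud([(0.0, 0, 25.4), (0.2, 1, 25.1)], elevations)
```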
  • the lens 150 can be configured to have a focal surface corresponding to the curved surface of the transmit block 120 .
  • the lens 150 can include an aspheric surface outside the housing 110 and a toroidal surface inside the housing 110 facing the shared space 140 .
  • the shape of the lens 150 allows the lens 150 to both collimate the emitted light beams 102 and focus the reflected light 106 .
  • the shape of the lens 150 allows the lens 150 to have the focal surface corresponding to the curved surface of the transmit block 120 .
  • the focal surface provided by the lens 150 substantially matches the curved shape of the transmit block 120 .
  • the detectors 132 can be arranged similarly in the curved shape of the receive block 130 to receive the focused light 108 along the curved focal surface provided by the lens 150 .
  • the curved surface of the receive block 130 may also substantially match the curved focal surface provided by the lens 150 .
  • FIG. 2 is a cross-section view of an example LIDAR device 200 .
  • the LIDAR device 200 includes a housing 210 that houses a transmit block 220 , a receive block 230 , a shared space 240 , and a lens 250 .
  • FIG. 2 shows an x-y-z axis, in which the z-axis is in a substantially vertical direction and the x-axis and y-axis define a substantially horizontal plane.
  • the structure, function, and operation of various components included in the LIDAR device 200 are similar to corresponding components included in the LIDAR device 100 described in FIG. 1 .
  • the housing 210, the transmit block 220, the receive block 230, the shared space 240, and the lens 250 are similar, respectively, to the housing 110, the transmit block 120, the receive block 130, the shared space 140, and the lens 150 described in FIG. 1.
  • the transmit block 220 includes a plurality of light sources 222 a-c arranged along a curved focal surface 228 defined by the lens 250 .
  • the plurality of light sources 222 a-c can be configured to emit, respectively, the plurality of light beams 202 a-c having wavelengths within a wavelength range.
  • the plurality of light sources 222 a-c may comprise laser diodes that emit the plurality of light beams 202 a-c having the wavelengths within the wavelength range.
  • the plurality of light beams 202 a-c are reflected by mirror 224 through an exit aperture 226 into the shared space 240 and towards the lens 250 .
  • the structure, function, and operation of the plurality of light sources 222a-c, the mirror 224, and the exit aperture 226 can be similar, respectively, to the plurality of light sources 122, the mirror 126, and the exit aperture 124 discussed in the description of the LIDAR device 100 of FIG. 1.
  • Although FIG. 2 shows that the curved focal surface 228 is curved in the x-y plane (horizontal plane), the plurality of light sources 222a-c may additionally or alternatively be arranged along a focal surface that is curved in a vertical plane.
  • the curved focal surface 228 can have a curvature in a vertical plane, and the plurality of light sources 222 a-c can include additional light sources arranged vertically along the curved focal surface 228 and configured to emit light beams directed at the mirror 224 and reflected through the exit aperture 226 .
  • the plurality of light beams 202 a-c may converge towards the exit aperture 226 .
  • the exit aperture 226 may be minimally sized while being capable of accommodating vertical and horizontal extents of the plurality of light beams 202 a-c.
  • the curved focal surface 228 can be defined by the lens 250 .
  • the curved focal surface 228 may correspond to a focal surface of the lens 250 due to shape and composition of the lens 250 .
  • the plurality of light sources 222 a-c can be arranged along the focal surface defined by the lens 250 at the transmit block.
  • the plurality of light beams 202 a-c propagate in a transmit path that extends through the transmit block 220 , the exit aperture 226 , and the shared space 240 towards the lens 250 .
  • the lens 250 collimates the plurality of light beams 202 a-c to provide collimated light beams 204 a-c into an environment of the LIDAR device 200 .
  • the collimated light beams 204 a-c correspond, respectively, to the plurality of light beams 202 a-c.
  • the collimated light beams 204 a-c reflect off one or more objects in the environment of the LIDAR device 200 as reflected light 206 .
  • the reflected light 206 may be focused by the lens 250 into the shared space 240 as focused light 208 traveling along a receive path that extends through the shared space 240 onto the receive block 230 .
  • the focused light 208 may be reflected by the reflective surface 242 as focused light 208 a-c propagating towards the receive block 230 .
  • the lens 250 may be capable of both collimating the plurality of light beams 202a-c and focusing the reflected light 206 as the focused light 208 along the receive path towards the receive block 230 due to the shape and composition of the lens 250.
  • the lens 250 can have an aspheric surface 252 facing outside of the housing 210 and a toroidal surface 254 facing the shared space 240 .
  • the exit aperture 226 is included in a wall 244 that separates the transmit block 220 from the shared space 240 .
  • the wall 244 can be formed from a transparent material (e.g., glass) that is coated with a reflective material 242 .
  • the exit aperture 226 may correspond to the portion of the wall 244 that is not coated by the reflective material 242 . Additionally or alternatively, the exit aperture 226 may comprise a hole or cut-away in the wall 244 .
  • the focused light 208 is reflected by the reflective surface 242 and directed towards an entrance aperture 234 of the receive block 230 .
  • the entrance aperture 234 may comprise a filtering window configured to allow wavelengths in the wavelength range of the plurality of light beams 202 a-c emitted by the plurality of light sources 222 a-c and attenuate other wavelengths.
  • the focused light 208 a-c reflected by the reflective surface 242 from the focused light 208 propagates, respectively, onto a plurality of detectors 232 a-c.
  • the structure, function, and operation of the entrance aperture 234 and the plurality of detectors 232 a-c is similar, respectively, to the entrance aperture 134 and the plurality of detectors 132 included in the LIDAR device 100 described in FIG. 1 .
  • the plurality of detectors 232 a-c can be arranged along a curved focal surface 238 of the receive block 230 .
  • Although FIG. 2 shows that the curved focal surface 238 is curved along the x-y plane (horizontal plane), the curved focal surface 238 can additionally or alternatively be curved in a vertical plane.
  • the curvature of the focal surface 238 is also defined by the lens 250 .
  • the curved focal surface 238 may correspond to a focal surface of the light projected by the lens 250 along the receive path at the receive block 230 .
  • Each portion of the focused light 208a-c corresponds, respectively, to one of the emitted light beams 202a-c and is directed onto, respectively, one of the plurality of detectors 232a-c.
  • the detector 232a is configured and arranged to receive focused light 208a that corresponds to collimated light beam 204a reflected off the one or more objects in the environment of the LIDAR device 200.
  • the collimated light beam 204 a corresponds to the light beam 202 a emitted by the light source 222 a.
  • the detector 232 a receives light that was emitted by the light source 222 a
  • the detector 232 b receives light that was emitted by the light source 222 b
  • the detector 232 c receives light that was emitted by the light source 222 c.
  • At least one aspect of the one or more objects in the environment of the LIDAR device 200 may be determined. For example, by comparing a time when the plurality of light beams 202a-c were emitted by the plurality of light sources 222a-c and a time when the plurality of detectors 232a-c received the focused light 208a-c, a distance between the LIDAR device 200 and the one or more objects in the environment of the LIDAR device 200 may be determined. In some examples, other aspects such as shape, color, material, etc. may also be determined.
  • the LIDAR device 200 may be rotated about an axis to determine a three-dimensional map of the surroundings of the LIDAR device 200 .
  • the LIDAR device 200 may be rotated about a substantially vertical axis as illustrated by arrow 290 .
  • the LIDAR device 200 may be rotated counterclockwise about the axis as illustrated by the arrow 290; additionally or alternatively, the LIDAR device 200 may be rotated in the clockwise direction.
  • the LIDAR device 200 may be rotated 360 degrees about the axis.
  • the LIDAR device 200 may be rotated back and forth along a portion of the 360 degree view of the LIDAR device 200 .
  • the LIDAR device 200 may be mounted on a platform that wobbles back and forth about the axis without making a complete rotation.
  • FIG. 3A is a perspective view of an example LIDAR device 300 fitted with various components, in accordance with at least some embodiments described herein.
  • FIG. 3B is a perspective view of the example LIDAR device 300 shown in FIG. 3A with the various components removed to illustrate interior space of the housing 310 .
  • the structure, function, and operation of the LIDAR device 300 are similar to the LIDAR devices 100 and 200 described, respectively, in FIGS. 1 and 2.
  • the LIDAR device 300 includes a housing 310 that houses a transmit block 320, a receive block 330, and a lens 350; the housing 310, transmit block 320, receive block 330, and lens 350 are similar, respectively, to the housing 110, the transmit block 120, the receive block 130, and the lens 150 described in FIG. 1.
  • collimated light beams 304 propagate from the lens 350 toward an environment of the LIDAR device 300 and reflect off one or more objects in the environment as reflected light 306, similarly to the collimated light beams 104 and reflected light 106 described in FIG. 1.
  • the LIDAR device 300 can be mounted on a mounting structure 360 and rotated about an axis to provide a 360 degree view of the environment surrounding the LIDAR device 300 .
  • the mounting structure 360 may comprise a movable platform that may tilt in one or more directions to change the axis of rotation of the LIDAR device 300 .
  • the various components of the LIDAR device 300 can be removably mounted to the housing 310 .
  • the transmit block 320 may comprise one or more printed circuit boards (PCBs) that are fitted in the portion of the housing 310 where the transmit block 320 can be mounted.
  • the receive block 330 may comprise a plurality of detectors 332 mounted to a flexible substrate and can be removably mounted to the housing 310 as a block that includes the plurality of detectors.
  • the lens 350 can be mounted to another side of the housing 310 .
  • a plurality of light beams 302 can be transmitted by the transmit block 320 into the shared space 340 and towards the lens 350 to be collimated into the collimated light beams 304 .
  • the reflected light 306 can be focused by the lens 350 and directed through the shared space 340 onto the receive block 330.
  • FIG. 4 illustrates an example transmit block 420 , in accordance with at least some embodiments described herein.
  • Transmit block 420 can correspond to the transmit blocks 120 , 220 , and 320 described in FIGS. 1-3 .
  • the transmit block 420 includes a plurality of light sources 422 a-c similar to the plurality of light sources 222 a-c included in the transmit block 220 of FIG. 2 .
  • the light sources 422 a-c are arranged along a focal surface 428 , which is curved in a vertical plane.
  • the light sources 422 a-c are configured to emit a plurality of light beams 402 a-c that converge and propagate through an exit aperture 426 in a wall 444 .
  • Although FIG. 4 shows the plurality of light sources 422a-c arranged along a focal surface 428 that is curved in a vertical plane, the plurality of light sources 422a-c can additionally or alternatively be arranged along a focal surface that is curved in a horizontal plane, or along a focal surface that is curved both vertically and horizontally.
  • the plurality of light sources 422 a-c can be arranged in a curved three dimensional grid pattern.
  • the transmit block 420 may comprise a plurality of printed circuit boards (PCBs) mounted vertically, such that a column of light sources, such as the plurality of light sources 422a-c, runs along the vertical axis of each PCB, and each of the plurality of PCBs can be arranged adjacent to other vertically mounted PCBs along a horizontally curved plane to provide the three-dimensional grid pattern.
  • the light beams 402 a-c converge towards the exit aperture 426, which allows the size of the exit aperture 426 to be minimized while accommodating vertical and horizontal extents of the light beams 402 a-c, similarly to the exit aperture 226 described in FIG. 2.
  • FIGS. 5A, 5B, and 5C illustrate an example of how such partial collimation could be achieved.
  • a light source 500 is made up of a laser diode 502 and a cylindrical lens 504 .
  • laser diode 502 has an aperture 506 with a shorter dimension corresponding to a fast axis 508 and a longer dimension corresponding to a slow axis 510 .
  • FIGS. 5B and 5C show an uncollimated laser beam 512 being emitted from laser diode 502 .
  • Laser beam 512 diverges in two directions, one direction defined by fast axis 508 and another, generally orthogonal direction defined by slow axis 510 .
  • FIG. 5B shows the divergence of laser beam 512 along fast axis 508, and FIG. 5C shows the divergence of laser beam 512 along slow axis 510.
  • Laser beam 512 diverges more quickly along fast axis 508 than along slow axis 510 .
  • laser diode 502 is an Osram SPL DL90_3 nanostack pulsed laser diode that emits pulses of light with a range of wavelengths from about 896 nm to about 910 nm (a nominal wavelength of 905 nm).
  • the aperture has a shorter dimension of about 10 microns, corresponding to its fast axis, and a longer dimension of about 200 microns, corresponding to its slow axis.
  • the divergence of the laser beam in this specific example is about 25 degrees along the fast axis and about 11 degrees along the slow axis. It is to be understood that this specific example is illustrative only.
  • Laser diode 502 could have a different configuration, different aperture sizes, different beam divergences, and/or emit different wavelengths.
  • cylindrical lens 504 may be positioned in front of aperture 506 with its cylinder axis 514 generally parallel to slow axis 510 and perpendicular to fast axis 508 .
  • cylindrical lens 504 can pre-collimate laser beam 512 along fast axis 508 , resulting in partially collimated laser beam 516 .
  • this pre-collimation may reduce the divergence along fast axis 508 to about one degree or less. Nonetheless, laser beam 516 is only partially collimated because the divergence along slow axis 510 may be largely unchanged by cylindrical lens 504 .
  • partially collimated laser beam 516 provided by cylindrical lens 504 may have a higher divergence along slow axis 510 than along fast axis 508 .
  • the divergences along slow axis 510 in uncollimated laser beam 512 and in partially collimated laser beam 516 may be substantially equal.
  • cylindrical lens 504 is a microrod lens with a diameter of about 600 microns that is placed about 250 microns in front of aperture 506 .
  • the material of the microrod lens could be, for example, fused silica or a borosilicate crown glass, such as Schott BK7.
  • the microrod lens could be a molded plastic cylinder or acylinder.
  • Cylindrical lens 504 could also be used to provide magnification along fast axis 508 . For example, if the dimensions of aperture 506 are 10 microns by 200 microns, as previously described, and cylindrical lens 504 is a microrod lens as described above, then cylindrical lens 504 may magnify the shorter dimension (corresponding to fast axis 508 ) by about 20 times.
  • This magnification effectively stretches out the shorter dimension of aperture 506 to about the same as the longer dimension (a short worked sketch of this arithmetic appears after this list).
  • the focused spot could have a substantially square shape instead of the rectangular slit shape of aperture 506 .
  • FIG. 6A illustrates an example receive block 630 , in accordance with at least some embodiments described herein.
  • FIG. 6B illustrates a side view of three detectors 632 a-c included in the receive block 630 of FIG. 6A .
  • Receive block 630 can correspond to the receive blocks 130 , 230 , and 330 described in FIGS. 1-3 .
  • the receive block 630 includes a plurality of detectors 632 a-c arranged along a curved surface 638 defined by a lens 650, similarly to the receive block 230, the detectors 232 a-c, and the curved focal surface 238 described in FIG. 2.
  • Focused light 608 a-c from lens 650 propagates along a receive path that includes a reflective surface 642 onto the detectors 632 a-c similar, respectively, to the focused light 208 a-c, the lens 250 , the reflective surface 242 , and the detectors 232 a-c described in FIG. 2 .
  • the receive block 630 comprises a flexible substrate 680 on which the plurality of detectors 632 a-c are arranged along the curved surface 638 .
  • the flexible substrate 680 conforms to the curved surface 638 by being mounted to a receive block housing 690 having the curved surface 638 .
  • the curved surface 638 along which the detectors 632 a-c are arranged can be curved along both a vertical axis and a horizontal axis of the receive block 630.
  • FIGS. 7A and 7B illustrate an example lens 750 with an aspheric surface 752 and a toroidal surface 754 , in accordance with at least some embodiments described herein.
  • FIG. 7B illustrates a cross-section view of the example lens 750 shown in FIG. 7A .
  • the lens 750 can correspond to the lenses 150, 250, and 350 described in FIGS. 1-3.
  • the lens 750 can be configured to both collimate light incident on the toroidal surface 754 from a light source into collimated light propagating out of the aspheric surface 752 , and focus reflected light entering from the aspheric surface 752 onto a detector.
  • the structure of the lens 750 including the aspheric surface 752 and the toroidal surface 754 allows the lens 750 to perform both functions of collimating and focusing described in the example above.
  • the lens 750 defines a focal surface of the light propagating through the lens 750 due to the aspheric surface 752 and the toroidal surface 754 .
  • the light sources providing the light entering the toroidal surface 754 can be arranged along the defined focal surface, and the detectors receiving the light focused from the light entering the aspheric surface 752 can also be arranged along the defined focal surface.
  • by using the lens 750 to perform both of these functions (collimating transmitted light and focusing received light), instead of a transmit lens for collimating and a receive lens for focusing, advantages with respect to size, cost, and/or complexity can be provided.
  • FIG. 8A illustrates an example LIDAR device 810 mounted on a vehicle 800 , in accordance with at least some embodiments described herein.
  • FIG. 8A shows a Right Side View, Front View, Back View, and Top View of the vehicle 800 .
  • although vehicle 800 is illustrated in FIG. 8A as a car, other examples are possible.
  • the vehicle 800 could represent a truck, a van, a semi-trailer truck, a motorcycle, a golf cart, an off-road vehicle, or a farm vehicle, among other examples.
  • the structure, function, and operation of the LIDAR device 810 shown in FIG. 8A are similar to those of the example LIDAR devices 100, 200, and 300 shown in FIGS. 1-3.
  • the LIDAR device 810 can be configured to rotate about an axis and determine a three-dimensional map of a surrounding environment of the LIDAR device 810 .
  • the LIDAR device 810 can be mounted on a platform 802 .
  • the platform 802 may comprise a movable mount that allows the vehicle 800 to control the axis of rotation of the LIDAR device 810 .
  • although the LIDAR device 810 is shown mounted in a particular location on the vehicle 800, in some examples the LIDAR device 810 may be mounted elsewhere on the vehicle 800.
  • the LIDAR device 810 may be mounted anywhere on top of the vehicle 800 , on a side of the vehicle 800 , under the vehicle 800 , on a hood of the vehicle 800 , and/or on a trunk of the vehicle 800 .
  • the LIDAR device 810 includes a lens 812 through which collimated light is transmitted from the LIDAR device 810 to the surrounding environment of the LIDAR device 810 , similarly to the lens 150 , 250 , and 350 described in FIGS. 1-3 .
  • the lens 812 can also be configured to receive reflected light from the surrounding environment of the LIDAR device 810 that was reflected off one or more objects in the surrounding environment.
  • FIG. 8B illustrates a scenario where the LIDAR device 810 shown in FIG. 8A is scanning an environment 830 that includes one or more objects, in accordance with at least some embodiments described herein.
  • vehicle 800 can be traveling on a road 822 in the environment 830 .
  • the LIDAR device 810 may be able to determine aspects of objects in the surrounding environment 830 , such as lane lines 824 a-b, other vehicles 826 a-c, and/or street sign 828 .
  • the LIDAR device 810 can provide the vehicle 800 with information about the objects in the surrounding environment 830 , including distance, shape, color, and/or material type of the objects.
  • FIG. 9 is a flowchart of a method 900 of operating a LIDAR device, in accordance with at least some embodiments described herein.
  • Method 900 shown in FIG. 9 presents an embodiment of a method that could be used with the LIDAR devices 100 , 200 , and 300 , for example.
  • Method 900 may include one or more operations, functions, or actions as illustrated by one or more of blocks 902 - 912 . Although the blocks are illustrated in a sequential order, these blocks may in some instances be performed in parallel, and/or in a different order than those described herein. Also, the various blocks may be combined into fewer blocks, divided into additional blocks, and/or removed based upon the desired implementation.
  • each block may represent a module, a segment, or a portion of a manufacturing or operation process.
  • the method 900 includes rotating a housing of a light detection and ranging (LIDAR) device about an axis, wherein the housing has an interior space that includes a transmit block, a receive block, and a shared space, wherein the transmit block has an exit aperture, and wherein the receive block has an entrance aperture.
  • the method 900 includes emitting, by a plurality of light sources in the transmit block, a plurality of light beams that enter the shared space via a transmit path, the light beams comprising light having wavelengths in a wavelength range.
  • the method 900 includes receiving the light beams at a lens mounted to the housing along the transmit path.
  • the method 900 includes collimating, by the lens, the light beams for transmission into an environment of the LIDAR device.
  • the method 900 includes focusing, by the lens, light collected by the lens from the environment of the LIDAR device (reflections of the collimated light beams off one or more objects) onto a plurality of detectors in the receive block via a receive path that extends through the shared space and the entrance aperture of the receive block.
  • the method 900 includes detecting, by the plurality of detectors in the receive block, light from the focused light having wavelengths in the wavelength range.
  • a LIDAR device such as the LIDAR device 200 can be rotated about an axis (block 902 ).
  • a transmit block such as the transmit block 220 , can include a plurality of light sources that emit light beams having wavelengths in a wavelength range, through an exit aperture and a shared space to a lens (block 904 ).
  • the light beams can be received by the lens (block 906 ) and collimated for transmission to an environment of the LIDAR device (block 908 ).
  • the collimated light may then reflect off one or more objects in the environment of the LIDAR device and return as reflected light collected by the lens.
  • the lens may then focus the collected light onto a plurality of detectors in the receive block via a receive path that extends through the shared space and an entrance aperture of the receive block (block 910 ).
  • the plurality of detectors in the receive block may then detect light from the focused light having wavelengths in the wavelength range of the emitted light beams from the light sources (block 912 ); a brief procedural sketch of this sequence of blocks appears after this list.
  • the devices and methods of operation described herein include a LIDAR device that is rotated about an axis and configured to transmit collimated light and focus reflected light.
  • the collimation and focusing can be performed by a shared lens.
  • by using a shared lens that performs both of these functions, instead of a transmit lens for collimating and a receive lens for focusing, advantages with respect to size, cost, and/or complexity can be provided.
  • the shared lens can define a curved focal surface.
  • the light sources emitting light through the shared lens and the detectors receiving light focused by the shared lens can be arranged along the curved focal surface defined by the shared lens.
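
The fast-axis magnification described for the cylindrical lens of FIG. 5 can be checked with simple arithmetic. The short Python sketch below uses only the illustrative dimensions given above (a 10 micron by 200 micron aperture and roughly 20 times magnification along the fast axis); the variable names are placeholders and are not part of this disclosure.

    aperture_fast_um = 10.0          # shorter aperture dimension (fast axis), in microns
    aperture_slow_um = 200.0         # longer aperture dimension (slow axis), in microns
    fast_axis_magnification = 20.0   # approximate magnification provided by the microrod lens

    effective_fast_um = aperture_fast_um * fast_axis_magnification
    print(effective_fast_um)         # 200.0 -- now comparable to the slow-axis dimension
    print(aperture_slow_um)          # 200.0

    # Because the two extents are now comparable, the focused spot is roughly
    # square rather than the 10 x 200 micron slit shape of the bare aperture.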
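
The sequence of blocks 902-912 of method 900 can also be summarized as a short procedural sketch. The Python outline below is illustrative only; none of the object or function names come from this disclosure, and each call merely stands in for the corresponding hardware action described above.

    def run_scan_cycle(housing, transmit_block, lens, receive_block):
        housing.rotate_about_axis()                      # block 902: rotate the housing about an axis
        beams = transmit_block.emit_light_beams()        # block 904: emit beams into the shared space
        lens.receive(beams)                              # block 906: beams reach the lens via the transmit path
        lens.collimate(beams)                            # block 908: collimate for transmission into the environment
        collected = lens.collect_reflected_light()       # reflections of the collimated beams return to the lens
        focused = lens.focus_onto_detectors(collected)   # block 910: focus via the shared space and entrance aperture
        return receive_block.detect_in_band(focused)     # block 912: detect light in the emitted wavelength range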

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Mechanical Optical Scanning Systems (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

A LIDAR device may transmit light pulses originating from one or more light sources and may receive reflected light pulses that are then detected by one or more detectors. The LIDAR device may include a lens that both (i) collimates the light from the one or more light sources to provide collimated light for transmission into an environment of the LIDAR device and (ii) focuses the reflected light onto the one or more detectors. The lens may define a curved focal surface in a transmit path of the light from the one or more light sources and a curved focal surface in a receive path of the one or more detectors. The one or more light sources may be arranged along the curved focal surface in the transmit path. The one or more detectors may be arranged along the curved focal surface in the receive path.

Description

CROSS REFERENCE TO RELATED APPLICATION
The present application is a continuation of U.S. patent application Ser. No. 13/971,606, filed Aug. 20, 2013, which application is incorporated herein by reference.
BACKGROUND
Unless otherwise indicated herein, the materials described in this section are not prior art to the claims in this application and are not admitted to be prior art by inclusion in this section.
Vehicles can be configured to operate in an autonomous mode in which the vehicle navigates through an environment with little or no input from a driver. Such autonomous vehicles can include one or more sensors that are configured to detect information about the environment in which the vehicle operates.
One such sensor is a light detection and ranging (LIDAR) device. A LIDAR can estimate distances to environmental features while scanning through a scene to assemble a “point cloud” indicative of reflective surfaces in the environment. Individual points in the point cloud can be determined by transmitting a laser pulse and detecting a returning pulse, if any, reflected from an object in the environment, and determining the distance to the object according to the time delay between the transmitted pulse and the reception of the reflected pulse. A laser, or set of lasers, can be rapidly and repeatedly scanned across a scene to provide continuous real-time information on distances to reflective objects in the scene. Combining the measured distances and the orientation of the laser(s) while measuring each distance allows for associating a three-dimensional position with each returning pulse. In this way, a three-dimensional map of points indicative of locations of reflective features in the environment can be generated for the entire scanning zone.
SUMMARY
In one example, a light detection and ranging (LIDAR) device is provided that includes a housing configured to rotate about an axis. The housing has an interior space that includes a transmit block, a receive block, and a shared space. The transmit block has an exit aperture and the receive block has an entrance aperture. The LIDAR device also includes a plurality of light sources in the transmit block. The plurality of light sources is configured to emit a plurality of light beams that enter the shared space through the exit aperture and traverse the shared space via a transmit path. The light beams include light having wavelengths in a wavelength range. The LIDAR device also includes a plurality of detectors in the receive block. The plurality of detectors is configured to detect light having wavelengths in the wavelength range. The LIDAR device also includes a lens mounted to the housing. The lens is configured to (i) receive the light beams via the transmit path, (ii) collimate the light beams for transmission into an environment of the LIDAR device, (iii) collect light that includes light from one or more of the collimated light beams reflected by one or more objects in the environment of the LIDAR device, and (iv) focus the collected light onto the detectors via a receive path that extends through the shared space and the entrance aperture of the receive block.
In another example, a method is provided that involves rotating a housing of a light detection and ranging (LIDAR) device about an axis. The housing has an interior space that includes a transmit block, a receive block, and a shared space. The transmit block has an exit aperture and the receive block has an entrance aperture. The method further involves emitting a plurality of light beams by a plurality of light sources in the transmit block. The plurality of light beams enter the shared space via a transmit path. The light beams include light having wavelengths in a wavelength range. The method further involves receiving the light beams at a lens mounted to the housing along the transmit path. The method further involves collimating, by the lens, the light beams for transmission into an environment of the LIDAR device. The method further involves collecting, by the lens, light from one or more of the collimated light beams reflected by one or more objects in the environment of the LIDAR device. The method further involves focusing, by the lens, the collected light onto a plurality of detectors in the receive block via a receive path that extends through the shared space and the entrance aperture of the receive block. The method further involves detecting, by the plurality of detectors in the receive block, light from the focused light having wavelengths in the wavelength range.
These as well as other aspects, advantages, and alternatives, will become apparent to those of ordinary skill in the art by reading the following detailed description, with reference where appropriate to the accompanying figures.
BRIEF DESCRIPTION OF THE FIGURES
FIG. 1 is a block diagram of an example LIDAR device.
FIG. 2 is a cross-section view of an example LIDAR device.
FIG. 3A is a perspective view of an example LIDAR device fitted with various components, in accordance with at least some embodiments described herein.
FIG. 3B is a perspective view of the example LIDAR device shown in FIG. 3A with the various components removed to illustrate interior space of the housing.
FIG. 4 illustrates an example transmit block, in accordance with at least some embodiments described herein.
FIG. 5A is a view of an example light source, in accordance with an example embodiment.
FIG. 5B is a view of the light source of FIG. 5A in combination with a cylindrical lens, in accordance with an example embodiment.
FIG. 5C is another view of the light source and cylindrical lens combination of FIG. 5B, in accordance with an example embodiment.
FIG. 6A illustrates an example receive block, in accordance with at least some embodiments described herein.
FIG. 6B illustrates a side view of three detectors included in the receive block of FIG. 6A.
FIG. 7A illustrates an example lens with an aspheric surface and a toroidal surface, in accordance with at least some embodiments described herein.
FIG. 7B illustrates a cross-section view of the example lens 750 shown in FIG. 7A.
FIG. 8A illustrates an example LIDAR device mounted on a vehicle, in accordance with at least some embodiments described herein.
FIG. 8B illustrates a scenario where the LIDAR device shown in FIG. 8A is scanning an environment that includes one or more objects, in accordance with at least some embodiments described herein.
FIG. 9 is a flowchart of a method, in accordance with at least some embodiments described herein.
DETAILED DESCRIPTION
The following detailed description describes various features and functions of the disclosed systems, devices and methods with reference to the accompanying figures. In the figures, similar symbols identify similar components, unless context dictates otherwise. The illustrative system, device and method embodiments described herein are not meant to be limiting. It may be readily understood by those skilled in the art that certain aspects of the disclosed systems, devices and methods can be arranged and combined in a wide variety of different configurations, all of which are contemplated herein.
A light detection and ranging (LIDAR) device may transmit light pulses originating from a plurality of light sources and may receive reflected light pulses that are then detected by a plurality of detectors. Within examples described herein, a LIDAR device is provided that includes a transmit/receive lens that both collimates the light from the plurality of light sources and focuses the reflected light onto the plurality of detectors. By using a transmit/receive lens that performs both of these functions, instead of a transmit lens for collimating and a receive lens for focusing, advantages with respect to size, cost, and/or complexity can be provided.
The LIDAR device comprises a housing that is configured to rotate about an axis. In some examples, the axis is substantially vertical. The housing may have an interior space that includes various components such as a transmit block that includes the plurality of light sources, a receive block that includes the plurality of detectors, a shared space where emitted light traverses from the transmit block to the transmit/receive lens and reflected light traverses from the transmit/receive lens to the receive block, and the transmit/receive lens that collimates the emitted light and focuses the reflected light. By rotating the housing that includes the various components, in some examples, a three-dimensional map of a 360-degree field of view of an environment of the LIDAR device can be determined without frequent recalibration of the arrangement of the various components.
In some examples, the housing may include radio frequency (RF) and optical shielding between the transmit block and the receive block. For example, the housing can be formed from and/or coated by a metal, metallic ink, or metallic foam to provide the RF shielding. Metals used for shielding can include, for example, copper or nickel.
The plurality of light sources included in the transmit block can include, for example, laser diodes. In one example, the light sources emit light with wavelengths of approximately 905 nm. In some examples, a transmit path through which the transmit/receive lens receives the light emitted by the light sources may include a reflective element, such as a mirror or prism. By including the reflective element, the transmit path can be folded to provide a smaller size of the transmit block and, hence, a smaller housing of the LIDAR device. Additionally, the transmit path includes an exit aperture of the transmit block through which the emitted light enters the shared space and traverses to the transmit/receive lens.
In some examples, each light source of the plurality of light sources includes a respective lens, such as a cylindrical or acylindrical lens. The light source may emit an uncollimated light beam that diverges more in a first direction than in a second direction. In these examples, the light source's respective lens may pre-collimate the uncollimated light beam in the first direction to provide a partially collimated light beam, thereby reducing the divergence in the first direction. In some examples, the partially collimated light beam diverges less in the first direction than in the second direction. The transmit/receive lens receives the partially collimated light beams from the one or more light sources via an exit aperture of the transmit block and the transmit/receive lens collimates the partially collimated light beams to provide collimated light beams that are transmitted into the environment of the LIDAR device. In this example, the light emitted by the light sources may have a greater divergence in the second direction than in the first direction, and the exit aperture can accommodate vertical and horizontal extents of the beams of light from the light sources.
The housing mounts the transmit/receive lens through which light from the plurality of light sources can exit the housing, and reflected light can enter the housing to reach the receive block. The transmit/receive lens can have an optical power that is sufficient to collimate the light emitted by the plurality of light sources and to focus the reflected light onto the plurality of detectors in the receive block. In one example, the transmit/receive lens has a surface with an aspheric shape that is at the outside of the housing, a surface with a toroidal shape that is inside the housing, and a focal length of approximately 120 mm.
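As a rough first-order check, which is an assumption and not a figure stated in this disclosure, the residual divergence of a collimated beam is approximately the extent of the emitting source divided by the focal length of the collimating lens. The Python sketch below uses the approximately 120 mm focal length noted above and the illustrative 200 micron slow-axis aperture dimension given later for the example light source of FIG. 5; the variable names are placeholders.

    import math

    focal_length_mm = 120.0    # approximate focal length of the transmit/receive lens
    source_extent_mm = 0.2     # illustrative 200 micron slow-axis dimension of a light source aperture

    divergence_rad = source_extent_mm / focal_length_mm   # small-angle approximation
    print(divergence_rad * 1000.0)                         # ~1.7 milliradians
    print(math.degrees(divergence_rad))                    # ~0.1 degrees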
The plurality of detectors included in the receive block can include, for example, avalanche photodiodes in a sealed environment that is filled with an inert gas, such as nitrogen. The receive block can include an entrance aperture through which focused light from the transmit/receive lens traverses towards the detectors. In some examples, the entrance aperture can include a filtering window that passes light having wavelengths within the wavelength range emitted by the plurality of light sources and attenuates light having other wavelengths.
The collimated light transmitted from the LIDAR device into the environment may reflect from one or more objects in the environment to provide object-reflected light. The transmit/receive lens may collect the object-reflected light and focus the object-reflected light through a focusing path (“receive path”) onto the plurality of detectors. In some examples, the receive path may include a reflective surface that directs the focused light to the plurality of detectors. Additionally or alternatively, the reflective surface can fold the focused light towards the receive block and thus provide space savings for the shared space and the housing of the LIDAR device.
In some examples, the reflective surface may define a wall that includes the exit aperture between the transmit block and the shared space. In this case, the exit aperture of the transmit block corresponds to a transparent and/or non-reflective portion of the reflective surface. The transparent portion can be a hole or cut-away portion of the reflective surface. Alternatively, the reflective surface can be formed by forming a layer of reflective material on a transparent substrate (e.g., glass) and the transparent portion can be a portion of the substrate that is not coated with the reflective material. Thus, the shared space can be used for both the transmit path and the receive path. In some examples, the transmit path at least partially overlaps the receive path in the shared space.
The vertical and horizontal extents of the exit aperture are sufficient to accommodate the beam widths of the emitted light beams from the light sources. However, the non-reflective nature of the exit aperture prevents a portion of the collected and focused light in the receive path from reflecting, at the reflective surface, towards the detectors in the receive block. Thus, reducing the beam widths of the emitted light beams from the transmit blocks is desirable to minimize the size of the exit aperture and reduce the lost portion of the collected light. In some examples noted above, the reduction of the beam widths traversing through the exit aperture can be achieved by partially collimating the emitted light beams by including a respective lens, such as a cylindrical or acylindrical lens, adjacent to each light source.
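One rough way to see why a smaller exit aperture reduces the lost portion of the collected light is to model the loss as the ratio of the exit-aperture area to the area of the focused-light footprint on the reflective surface. This model and the dimensions in the Python sketch below are assumptions for illustration only, not values from this disclosure.

    def lost_fraction(aperture_w_mm, aperture_h_mm, footprint_w_mm, footprint_h_mm):
        # Fraction of the focused receive light that falls on the non-reflective
        # exit aperture instead of reflecting towards the detectors.
        return (aperture_w_mm * aperture_h_mm) / (footprint_w_mm * footprint_h_mm)

    print(lost_fraction(4.0, 2.0, 40.0, 40.0))   # 0.005   -- larger exit aperture
    print(lost_fraction(2.0, 1.0, 40.0, 40.0))   # 0.00125 -- halving each dimension quarters the loss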
Additionally or alternatively, to reduce the beam widths of the emitted light beams, in some examples, the transmit/receive lens can be configured to define a focal surface that has a substantial curvature in a vertical plane and/or a horizontal plane. For example, the transmit/receive lens can be configured to have the aspheric surface and the toroidal surface described above that provides the curved focal surface along the vertical plane and/or the horizontal plane. In this configuration, the light sources in the transmit block can be arranged along the transmit/receive lens' curved focal surface in the transmit block, and the detectors in the receive block can be arranged on the transmit/receive lens' curved focal surface in the receive block. Thus, the emitted light beams from the light sources arranged along the curved focal surface can converge into the exit aperture having a smaller size than an aperture for light beams that are substantially parallel and/or diverging.
To facilitate such curved arrangement of the light sources, in some examples, the light sources can be mounted on a curved edge of one or more vertically-oriented printed circuit boards (PCBs), such that the curved edge of the PCB substantially matches the curvature of the focal surface in the vertical plane of the PCB. In this example, the one or more PCBs can be mounted in the transmit block along a horizontal curvature that substantially matches the curvature of the focal surface in the horizontal plane of the one or more PCBs. For example, the transmit block can include four PCBs, with each PCB mounting sixteen light sources, so as to provide 64 light sources along the curved focal plane of the transmit/receive lens in the transmit block. In this example, the 64 light sources are arranged in a pattern substantially corresponding to the curved focal surface defined by the transmit/receive lens such that the emitted light beams converge towards the exit aperture of the transmit block.
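The arrangement of 64 light sources along the curved focal surface can be pictured with a small geometric sketch. The Python example below assumes a spherical focal surface and arbitrary angular spans; the disclosure only states that the focal surface is curved in the vertical and/or horizontal planes, so every numeric value here other than the 4-board-by-16-source count is an assumption.

    import math

    focal_radius_mm = 120.0     # assumed radius of curvature for the focal surface
    boards = 4                  # PCBs spread along the horizontal curvature
    sources_per_board = 16      # light sources along each board's curved vertical edge
    horiz_span_deg = 8.0        # assumed horizontal angular span of the arrangement
    vert_span_deg = 20.0        # assumed vertical angular span of the arrangement

    positions = []
    for b in range(boards):
        az = math.radians((b - (boards - 1) / 2.0) * horiz_span_deg / (boards - 1))
        for s in range(sources_per_board):
            el = math.radians((s - (sources_per_board - 1) / 2.0) * vert_span_deg / (sources_per_board - 1))
            # A point on the assumed focal surface; the exit aperture is near the origin,
            # so a source placed here emits a beam directed towards the aperture.
            x = focal_radius_mm * math.cos(el) * math.sin(az)
            y = focal_radius_mm * math.cos(el) * math.cos(az)
            z = focal_radius_mm * math.sin(el)
            positions.append((x, y, z))

    print(len(positions))       # 64 source positions on the assumed focal surface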
For the receive block, in some examples, the plurality of detectors can be disposed on a flexible PCB that is mounted to the receive block to conform with the shape of the transmit/receive lens' focal surface. For example, the flexible PCB may be held between two clamping pieces that have surfaces corresponding to the shape of the focal surface. Additionally, in this example, each of the plurality of detectors can be arranged on the flexible PCB so as to receive focused light from the transmit/receive lens that corresponds to a respective light source of the plurality of light sources. In this example, the detectors can be arranged in a pattern substantially corresponding to the curved focal surface of the transmit/receive lens in the receive block. Thus, in this example, the transmit/receive lens can be configured to focus onto each detector of the plurality of detectors a respective portion of the collected light that comprises light from the detector's corresponding light source.
Some embodiments of the present disclosure therefore provide systems and methods for a LIDAR device that uses a shared transmit/receive lens. In some examples, such LIDAR device can include the shared lens configured to provide a curved focal plane for transmitting light sources and receiving detectors such that light from the light sources passes through a small exit aperture included in a reflective surface that reflects collected light towards the detectors.
FIG. 1 is a block diagram of an example LIDAR device 100. The LIDAR device 100 comprises a housing 110 that houses an arrangement of various components included in the LIDAR device 100, such as a transmit block 120, a receive block 130, a shared space 140, and a lens 150. The arrangement of the various components provides emitted light beams 102 from the transmit block 120 that are collimated by the lens 150 and transmitted to an environment of the LIDAR device 100 as collimated light beams 104, and collects reflected light 106 from one or more objects in the environment of the LIDAR device 100 by the lens 150 for focusing towards the receive block 130 as focused light 108. The reflected light 106 comprises light from the collimated light beams 104 that was reflected by the one or more objects in the environment of the LIDAR device 100. The emitted light beams 102 and the focused light 108 traverse the shared space 140 also included in the housing 110. In some examples, the emitted light beams 102 propagate in a transmit path through the shared space 140 and the focused light 108 propagates in a receive path through the shared space 140. In some examples, the transmit path at least partially overlaps the receive path in the shared space 140. The LIDAR device 100 can determine an aspect of the one or more objects (e.g., location, shape, etc.) in the environment of the LIDAR device 100 by processing the focused light 108 received by the receive block 130. For example, the LIDAR device 100 can compare a time when pulses included in the emitted light beams 102 were emitted by the transmit block 120 with a time when corresponding pulses included in the focused light 108 were received by the receive block 130, and determine the distance between the one or more objects and the LIDAR device 100 based on the comparison.
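The pulse-time comparison described above reduces to the usual time-of-flight relation: the range is the speed of light multiplied by half the round-trip delay. A minimal Python sketch, with placeholder names not taken from this disclosure:

    SPEED_OF_LIGHT_M_S = 299792458.0

    def range_from_pulse_times(t_emit_s, t_receive_s):
        # Distance to the reflecting object for one emitted/received pulse pair.
        return SPEED_OF_LIGHT_M_S * (t_receive_s - t_emit_s) / 2.0

    print(range_from_pulse_times(0.0, 400e-9))   # ~59.96 m for a 400 ns round trip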
The housing 110 included in the LIDAR device 100 can provide a platform for mounting the various components included in the LIDAR device 100. The housing 110 can be formed from any material capable of supporting the various components of the LIDAR device 100 included in an interior space of the housing 110. For example, the housing 110 may be formed from a structural material such as plastic or metal.
In some examples, the housing 110 can be configured for optical shielding to reduce ambient light and/or unintentional transmission of the emitted light beams 102 from the transmit block 120 to the receive block 130. Optical shielding from ambient light of the environment of the LIDAR device 100 can be achieved by forming and/or coating the outer surface of the housing 110 with a material that blocks the ambient light from the environment. Additionally, inner surfaces of the housing 110 can include and/or be coated with the material described above to optically isolate the transmit block 120 from the receive block 130 to prevent the receive block 130 from receiving the emitted light beams 102 before the emitted light beams 102 reach the lens 150.
In some examples, the housing 110 can be configured for electromagnetic shielding to reduce electromagnetic noise (e.g., Radio Frequency (RF) Noise, etc.) from the ambient environment of the LIDAR device 100 and/or electromagnetic noise between the transmit block 120 and the receive block 130. Electromagnetic shielding can improve quality of the emitted light beams 102 emitted by the transmit block 120 and reduce noise in signals received and/or provided by the receive block 130. Electromagnetic shielding can be achieved by forming and/or coating the housing 110 with a material that absorbs electromagnetic radiation such as a metal, metallic ink, metallic foam, carbon foam, or any other material configured to absorb electromagnetic radiation. Metals that can be used for the electromagnetic shielding can include, for example, copper or nickel.
In some examples, the housing 110 can be configured to have a substantially cylindrical shape and to rotate about an axis of the LIDAR device 100. For example, the housing 110 can have the substantially cylindrical shape with a diameter of approximately 10 centimeters. In some examples, the axis is substantially vertical. By rotating the housing 110 that includes the various components, in some examples, a three-dimensional map of a 360 degree view of the environment of the LIDAR device 100 can be determined without frequent recalibration of the arrangement of the various components of the LIDAR device 100. Additionally or alternatively, the LIDAR device 100 can be configured to tilt the axis of rotation of the housing 110 to control the field of view of the LIDAR device 100.
Although not illustrated in FIG. 1, the LIDAR device 100 can optionally include a mounting structure for the housing 110. The mounting structure can include a motor or other means for rotating the housing 110 about the axis of the LIDAR device 100. Alternatively, the mounting structure can be included in a device and/or system other than the LIDAR device 100.
In some examples, the various components of the LIDAR device 100 such as the transmit block 120, receive block 130, and the lens 150 can be removably mounted to the housing 110 in predetermined positions to reduce burden of calibrating the arrangement of each component and/or subcomponents included in each component. Thus, the housing 110 provides the platform for the various components of the LIDAR device 100 for ease of assembly, maintenance, calibration, and manufacture of the LIDAR device 100.
The transmit block 120 includes a plurality of light sources 122 that can be configured to emit the plurality of emitted light beams 102 via an exit aperture 124. In some examples, each of the plurality of emitted light beams 102 corresponds to one of the plurality of light sources 122. The transmit block 120 can optionally include a mirror 126 along the transmit path of the emitted light beams 102 between the light sources 122 and the exit aperture 124.
The light sources 122 can include laser diodes, light emitting diodes (LED), vertical cavity surface emitting lasers (VCSEL), organic light emitting diodes (OLED), polymer light emitting diodes (PLED), light emitting polymers (LEP), liquid crystal displays (LCD), microelectromechanical systems (MEMS), or any other device configured to selectively transmit, reflect, and/or emit light to provide the plurality of emitted light beams 102. In some examples, the light sources 122 can be configured to emit the emitted light beams 102 in a wavelength range that can be detected by detectors 132 included in the receive block 130. The wavelength range could, for example, be in the ultraviolet, visible, and/or infrared portions of the electromagnetic spectrum. In some examples, the wavelength range can be a narrow wavelength range, such as provided by lasers. In one example, the wavelength range includes wavelengths that are approximately 905 nm. Additionally, the light sources 122 can be configured to emit the emitted light beams 102 in the form of pulses. In some examples, the plurality of light sources 122 can be disposed on one or more substrates (e.g., printed circuit boards (PCB), flexible PCBs, etc.) and arranged to emit the plurality of light beams 102 towards the exit aperture 124.
In some examples, the plurality of light sources 122 can be configured to emit uncollimated light beams included in the emitted light beams 102. For example, the emitted light beams 102 can diverge in one or more directions along the transmit path due to the uncollimated light beams emitted by the plurality of light sources 122. In some examples, vertical and horizontal extents of the emitted light beams 102 at any position along the transmit path can be based on an extent of the divergence of the uncollimated light beams emitted by the plurality of light sources 122.
The exit aperture 124 arranged along the transmit path of the emitted light beams 102 can be configured to accommodate the vertical and horizontal extents of the plurality of light beams 102 emitted by the plurality of light sources 122 at the exit aperture 124. It is noted that the block diagram shown in FIG. 1 is described in connection with functional modules for convenience in description. However, the functional modules in the block diagram of FIG. 1 can be physically implemented in other locations. For example, although illustrated that the exit aperture 124 is included in the transmit block 120, the exit aperture 124 can be physically included in both the transmit block 120 and the shared space 140. For example, the transmit block 120 and the shared space 140 can be separated by a wall that includes the exit aperture 124. In this case, the exit aperture 124 can correspond to a transparent portion of the wall. In one example, the transparent portion can be a hole or cut-away portion of the wall. In another example, the wall can be formed from a transparent substrate (e.g., glass) coated with a non-transparent material, and the exit aperture 124 can be a portion of the substrate that is not coated with the non-transparent material.
In some examples of the LIDAR device 100, it may be desirable to minimize size of the exit aperture 124 while accommodating the vertical and horizontal extents of the plurality of light beams 102. For example, minimizing the size of the exit aperture 124 can improve the optical shielding of the light sources 122 described above in the functions of the housing 110. Additionally or alternatively, the wall separating the transmit block 120 and the shared space 140 can be arranged along the receive path of the focused light 108, and thus, the exit aperture 124 can be minimized to allow a larger portion of the focused light 108 to reach the wall. For example, the wall can be coated with a reflective material (e.g., reflective surface 142 in shared space 140) and the receive path can include reflecting the focused light 108 by the reflective material towards the receive block 130. In this case, minimizing the size of the exit aperture 124 can allow a larger portion of the focused light 108 to reflect off the reflective material that the wall is coated with.
To minimize the size of the exit aperture 124, in some examples, the divergence of the emitted light beams 102 can be reduced by partially collimating the uncollimated light beams emitted by the light sources 122 to minimize the vertical and horizontal extents of the emitted light beams 102 and thus minimize the size of the exit aperture 124. For example, each light source of the plurality of light sources 122 can include a cylindrical lens arranged adjacent to the light source. The light source may emit a corresponding uncollimated light beam that diverges more in a first direction than in a second direction. The cylindrical lens may pre-collimate the uncollimated light beam in the first direction to provide a partially collimated light beam, thereby reducing the divergence in the first direction. In some examples, the partially collimated light beam diverges less in the first direction than in the second direction. Similarly, uncollimated light beams from other light sources of the plurality of light sources 122 can have a reduced beam width in the first direction and thus the emitted light beams 102 can have a smaller divergence due to the partially collimated light beams. In this example, at least one of the vertical and horizontal extents of the exit aperture 124 can be reduced due to partially collimating the light beams 102.
Additionally or alternatively, to minimize the size of the exit aperture 124, in some examples, the light sources 122 can be arranged along a substantially curved surface defined by the transmit block 120. The curved surface can be configured such that the emitted light beams 102 converge towards the exit aperture 124, and thus the vertical and horizontal extents of the emitted light beams 102 at the exit aperture 124 can be reduced due to the arrangement of the light sources 122 along the curved surface of the transmit block 120. In some examples, the curved surface of the transmit block 120 can include a curvature along the first direction of divergence of the emitted light beams 102 and a curvature along the second direction of divergence of the emitted light beams 102, such that the plurality of light beams 102 converge towards a central area in front of the plurality of light sources 122 along the transmit path.
To facilitate such curved arrangement of the light sources 122, in some examples, the light sources 122 can be disposed on a flexible substrate (e.g., flexible PCB) having a curvature along one or more directions. For example, the curved flexible substrate can be curved along the first direction of divergence of the emitted light beams 102 and the second direction of divergence of the emitted light beams 102. Additionally or alternatively, to facilitate such curved arrangement of the light sources 122, in some examples, the light sources 122 can be disposed on a curved edge of one or more vertically-oriented printed circuit boards (PCBs), such that the curved edge of the PCB substantially matches the curvature of the first direction (e.g., the vertical plane of the PCB). In this example, the one or more PCBs can be mounted in the transmit block 120 along a horizontal curvature that substantially matches the curvature of the second direction (e.g., the horizontal plane of the one or more PCBs). For example, the transmit block 120 can include four PCBs, with each PCB mounting sixteen light sources, so as to provide 64 light sources along the curved surface of the transmit block 120. In this example, the 64 light sources are arranged in a pattern such that the emitted light beams 102 converge towards the exit aperture 124 of the transmit block 120.
The transmit block 120 can optionally include the mirror 126 along the transmit path of the emitted light beams 102 between the light sources 122 and the exit aperture 124. By including the mirror 126 in the transmit block 120, the transmit path of the emitted light beams 102 can be folded to provide a smaller size of the transmit block 120 and the housing 110 of the LIDAR device 100 than the size of another transmit block in which the transmit path is not folded.
The receive block 130 includes a plurality of detectors 132 that can be configured to receive the focused light 108 via an entrance aperture 134. In some examples, each of the plurality of detectors 132 is configured and arranged to receive a portion of the focused light 108 corresponding to a light beam emitted by a corresponding light source of the plurality of light sources 122 and reflected off the one or more objects in the environment of the LIDAR device 100. The receive block 130 can optionally include the detectors 132 in a sealed environment having an inert gas 136.
The detectors 132 may comprise photodiodes, avalanche photodiodes, phototransistors, cameras, active pixel sensors (APS), charge coupled devices (CCD), cryogenic detectors, or any other sensor of light configured to receive focused light 108 having wavelengths in the wavelength range of the emitted light beams 102.
To facilitate receiving, by each of the detectors 132, the portion of the focused light 108 from the corresponding light source of the plurality of light sources 122, the detectors 132 can be disposed on one or more substrates and arranged accordingly. For example, the light sources 122 can be arranged along a curved surface of the transmit block 120, and the detectors 132 can also be arranged along a curved surface of the receive block 130. The curved surface of the receive block 130 can similarly be curved along one or more axes. Thus, each of the detectors 132 is configured to receive light that was originally emitted by a corresponding light source of the plurality of light sources 122.
To provide the curved surface of the receive block 130, the detectors 132 can be disposed on the one or more substrates similarly to the light sources 122 disposed in the transmit block 120. For example, the detectors 132 can be disposed on a flexible substrate (e.g., flexible PCB) and arranged along the curved surface of the flexible substrate to each receive focused light originating from a corresponding light source of the light sources 122. In this example, the flexible substrate may be held between two clamping pieces that have surfaces corresponding to the shape of the curved surface of the receive block 130. Thus, in this example, assembly of the receive block 130 can be simplified by sliding the flexible substrate onto the receive block 130 and using the two clamping pieces to hold it at the correct curvature.
The focused light 108 traversing along the receive path can be received by the detectors 132 via the entrance aperture 134. In some examples, the entrance aperture 134 can include a filtering window that passes light having wavelengths within the wavelength range emitted by the plurality of light sources 122 and attenuates light having other wavelengths. In this example, the detectors 132 receive the focused light 108 substantially comprising light having the wavelengths within the wavelength range.
In some examples, the plurality of detectors 132 included in the receive block 130 can include, for example, avalanche photodiodes in a sealed environment that is filled with the inert gas 136. The inert gas 136 may comprise, for example, nitrogen.
The shared space 140 includes the transmit path for the emitted light beams 102 from the transmit block 120 to the lens 150, and includes the receive path for the focused light 108 from the lens 150 to the receive block 130. In some examples, the transmit path at least partially overlaps with the receive path in the shared space 140. By including the transmit path and the receive path in the shared space 140, advantages with respect to size, cost, and/or complexity of assembly, manufacture, and/or maintenance of the LIDAR device 100 can be provided.
In some examples, the shared space 140 can include a reflective surface 142. The reflective surface 142 can be arranged along the receive path and configured to reflect the focused light 108 towards the entrance aperture 134 and onto the detectors 132. The reflective surface 142 may comprise a prism, mirror, or any other optical element configured to reflect the focused light 108 towards the entrance aperture 134 in the receive block 130. In some examples, a wall separates the shared space 140 from the transmit block 120. In these examples, the wall may comprise a transparent substrate (e.g., glass) and the reflective surface 142 may comprise a reflective coating on the wall with an uncoated portion for the exit aperture 124.
In embodiments including the reflective surface 142, the reflective surface 142 can reduce the size of the shared space 140 by folding the receive path, similarly to the mirror 126 in the transmit block 120. Additionally or alternatively, in some examples, the reflective surface 142 can direct the focused light 108 to the receive block 130, further providing flexibility to the placement of the receive block 130 in the housing 110. For example, varying the tilt of the reflective surface 142 can cause the focused light 108 to be reflected to various portions of the interior space of the housing 110, and thus the receive block 130 can be placed in a corresponding position in the housing 110. Additionally or alternatively, in this example, the LIDAR device 100 can be calibrated by varying the tilt of the reflective surface 142.
The lens 150 mounted to the housing 110 can have an optical power to both collimate the emitted light beams 102 from the light sources 122 in the transmit block 120, and focus the reflected light 106 from the one or more objects in the environment of the LIDAR device 100 onto the detectors 132 in the receive block 130. In one example, the lens 150 has a focal length of approximately 120 mm. By using the same lens 150 to perform both of these functions, instead of a transmit lens for collimating and a receive lens for focusing, advantages with respect to size, cost, and/or complexity can be provided. In some examples, collimating the emitted light beams 102 to provide the collimated light beams 104 allows determining the distance travelled by the collimated light beams 104 to the one or more objects in the environment of the LIDAR device 100.
In an example scenario, the emitted light beams 102 from the light sources 122 traversing along the transmit path can be collimated by the lens 150 to provide the collimated light beams 104 to the environment of the LIDAR device 100. The collimated light beams 104 may then reflect off the one or more objects in the environment of the LIDAR device 100 and return to the lens 150 as the reflected light 106. The lens 150 may then collect and focus the reflected light 106 as the focused light 108 onto the detectors 132 included in the receive block 130. In some examples, aspects of the one or more objects in the environment of the LIDAR device 100 can be determined by comparing the emitted light beams 102 with the focused light 108. The aspects can include, for example, distance, shape, color, and/or material of the one or more objects. Additionally, in some examples, by rotating the housing 110, a three-dimensional map of the surroundings of the LIDAR device 100 can be determined.
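A single entry in such a three-dimensional map can be illustrated by combining one range measurement with the housing rotation angle and the elevation angle associated with the light source/detector pair that produced it. The Python sketch below is an illustrative conversion only; the angle conventions and names are assumptions, not part of this disclosure.

    import math

    def to_xyz(range_m, azimuth_deg, elevation_deg):
        # azimuth: rotation of the housing about its (substantially vertical) axis
        # elevation: vertical angle of the beam/detector pair for this measurement
        az = math.radians(azimuth_deg)
        el = math.radians(elevation_deg)
        x = range_m * math.cos(el) * math.cos(az)
        y = range_m * math.cos(el) * math.sin(az)
        z = range_m * math.sin(el)
        return (x, y, z)

    print(to_xyz(60.0, 45.0, -2.0))   # one point of the surrounding-environment map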
In some examples where the plurality of light sources 122 are arranged along the curved surface of the transmit block 120, the lens 150 can be configured to have a focal surface corresponding to the curved surface of the transmit block 120. For example, the lens 150 can include an aspheric surface outside the housing 110 and a toroidal surface inside the housing 110 facing the shared space 140. In this example, the shape of the lens 150 allows the lens 150 to both collimate the emitted light beams 102 and focus the reflected light 106. Additionally, in this example, the shape of the lens 150 allows the lens 150 to have the focal surface corresponding to the curved surface of the transmit block 120. In some examples, the focal surface provided by the lens 150 substantially matches the curved shape of the transmit block 120. Additionally, in some examples, the detectors 132 can be arranged similarly in the curved shape of the receive block 130 to receive the focused light 108 along the curved focal surface provided by the lens 150. Thus, in some examples, the curved surface of the receive block 130 may also substantially match the curved focal surface provided by the lens 150.
FIG. 2 is a cross-section view of an example LIDAR device 200. In this example, the LIDAR device 200 includes a housing 210 that houses a transmit block 220, a receive block 230, a shared space 240, and a lens 250. For purposes of illustration, FIG. 2 shows an x-y-z axis, in which the z-axis is in a substantially vertical direction and the x-axis and y-axis define a substantially horizontal plane.
The structure, function, and operation of various components included in the LIDAR device 200 are similar to corresponding components included in the LIDAR device 100 described in FIG. 1. For example, the housing 210, the transmit block 220, the receive block 230, the shared space 240, and the lens 250 are similar, respectively, to the housing 110, the transmit block 120, the receive block 130, the shared space 140, and the lens 150 described in FIG. 1.
The transmit block 220 includes a plurality of light sources 222a-c arranged along a curved focal surface 228 defined by the lens 250. The plurality of light sources 222a-c can be configured to emit, respectively, the plurality of light beams 202a-c having wavelengths within a wavelength range. For example, the plurality of light sources 222a-c may comprise laser diodes that emit the plurality of light beams 202a-c having the wavelengths within the wavelength range. The plurality of light beams 202a-c are reflected by mirror 224 through an exit aperture 226 into the shared space 240 and towards the lens 250. The structure, function, and operation of the plurality of light sources 222a-c, the mirror 224, and the exit aperture 226 can be similar, respectively, to the plurality of light sources 122, the mirror 126, and the exit aperture 124 discussed in the description of the LIDAR device 100 of FIG. 1.
Although FIG. 2 shows that the curved focal surface 228 is curved in the x-y plane (horizontal plane), additionally or alternatively, the plurality of light sources 222a-c may be arranged along a focal surface that is curved in a vertical plane. For example, the curved focal surface 228 can have a curvature in a vertical plane, and the plurality of light sources 222a-c can include additional light sources arranged vertically along the curved focal surface 228 and configured to emit light beams directed at the mirror 224 and reflected through the exit aperture 226.
Due to the arrangement of the plurality of light sources 222a-c along the curved focal surface 228, the plurality of light beams 202a-c, in some examples, may converge towards the exit aperture 226. Thus, in these examples, the exit aperture 226 may be minimally sized while being capable of accommodating vertical and horizontal extents of the plurality of light beams 202a-c. Additionally, in some examples, the curved focal surface 228 can be defined by the lens 250. For example, the curved focal surface 228 may correspond to a focal surface of the lens 250 due to shape and composition of the lens 250. In this example, the plurality of light sources 222a-c can be arranged along the focal surface defined by the lens 250 at the transmit block.
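The converging geometry can be made concrete with a rough estimate of how large the exit aperture must be to pass a single partially collimated beam; the sketch below uses a simple linear-divergence model, and the emitter size, residual divergence, and emitter-to-aperture distance are hypothetical values, not figures from this disclosure.

```python
import math

def beam_footprint_at_wall(emitter_dim_mm, divergence_deg, distance_mm):
    """Approximate width of one diverging beam by the time it reaches the exit
    aperture: the emitting dimension plus linear growth from the divergence."""
    growth = 2.0 * distance_mm * math.tan(math.radians(divergence_deg) / 2.0)
    return emitter_dim_mm + growth

# Hypothetical values: 0.2 mm emitting dimension, ~1 degree residual divergence,
# and a 40 mm path from the emitter to the exit aperture.
print(round(beam_footprint_at_wall(0.2, 1.0, 40.0), 3), "mm")
```

Because the beams converge toward the exit aperture, the aperture only needs to accommodate the overlapping beam footprints at the wall rather than the full lateral spread of the sources.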
The plurality of light beams 202a-c propagate in a transmit path that extends through the transmit block 220, the exit aperture 226, and the shared space 240 towards the lens 250. The lens 250 collimates the plurality of light beams 202a-c to provide collimated light beams 204a-c into an environment of the LIDAR device 200. The collimated light beams 204a-c correspond, respectively, to the plurality of light beams 202a-c. In some examples, the collimated light beams 204a-c reflect off one or more objects in the environment of the LIDAR device 200 as reflected light 206. The reflected light 206 may be focused by the lens 250 into the shared space 240 as focused light 208 traveling along a receive path that extends through the shared space 240 onto the receive block 230. For example, the focused light 208 may be reflected by the reflective surface 242 as focused light 208a-c propagating towards the receive block 230.
The lens 250 may be capable of both collimating the plurality of light beams 202a-c and focusing the reflected light 206 along the receive path, as the focused light 208, towards the receive block 230 due to the shape and composition of the lens 250. For example, the lens 250 can have an aspheric surface 252 facing outside of the housing 210 and a toroidal surface 254 facing the shared space 240. By using the same lens 250 to perform both of these functions, instead of a transmit lens for collimating and a receive lens for focusing, advantages with respect to size, cost, and/or complexity can be provided.
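This disclosure does not give a numerical prescription for the aspheric surface 252, but the standard even-asphere sag equation sketched below indicates the kind of surface description involved; the radius of curvature, conic constant, and polynomial coefficient in the example are purely hypothetical.

```python
import math

def aspheric_sag(r, radius_of_curvature, conic_k, higher_order=()):
    """Even-asphere sag z(r): base conic term plus optional higher-order terms
    A4*r**4 + A6*r**6 + ... (units follow the inputs)."""
    c = 1.0 / radius_of_curvature
    z = c * r * r / (1.0 + math.sqrt(1.0 - (1.0 + conic_k) * c * c * r * r))
    for i, coeff in enumerate(higher_order, start=2):
        z += coeff * r ** (2 * i)
    return z

# Hypothetical prescription: 60 mm radius of curvature, conic constant -1,
# one small 4th-order term, evaluated 10 mm off axis.
print(round(aspheric_sag(10.0, 60.0, -1.0, (1e-6,)), 4), "mm")
```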
The exit aperture 226 is included in a wall 244 that separates the transmit block 220 from the shared space 240. In some examples, the wall 244 can be formed from a transparent material (e.g., glass) that is coated with a reflective material 242. In this example, the exit aperture 226 may correspond to the portion of the wall 244 that is not coated by the reflective material 242. Additionally or alternatively, the exit aperture 226 may comprise a hole or cut-away in the wall 244.
The focused light 208 is reflected by the reflective surface 242 and directed towards an entrance aperture 234 of the receive block 230. In some examples, the entrance aperture 234 may comprise a filtering window configured to allow wavelengths in the wavelength range of the plurality of light beams 202a-c emitted by the plurality of light sources 222a-c and attenuate other wavelengths. The focused light 208a-c reflected by the reflective surface 242 from the focused light 208 propagates, respectively, onto a plurality of detectors 232a-c. The structure, function, and operation of the entrance aperture 234 and the plurality of detectors 232a-c are similar, respectively, to the entrance aperture 134 and the plurality of detectors 132 included in the LIDAR device 100 described in FIG. 1.
The plurality of detectors 232a-c can be arranged along a curved focal surface 238 of the receive block 230. Although FIG. 2 shows that the curved focal surface 238 is curved along the x-y plane (horizontal plane), additionally or alternatively, the curved focal surface 238 can be curved in a vertical plane. The curvature of the focal surface 238 is also defined by the lens 250. For example, the curved focal surface 238 may correspond to a focal surface of the light projected by the lens 250 along the receive path at the receive block 230.
Each of the focused light 208a-c corresponds, respectively, to the emitted light beams 202a-c and is directed onto, respectively, the plurality of detectors 232a-c. For example, the detector 232a is configured and arranged to receive focused light 208a that corresponds to the collimated light beam 204a reflected off the one or more objects in the environment of the LIDAR device 200. In this example, the collimated light beam 204a corresponds to the light beam 202a emitted by the light source 222a. Thus, the detector 232a receives light that was emitted by the light source 222a, the detector 232b receives light that was emitted by the light source 222b, and the detector 232c receives light that was emitted by the light source 222c.
By comparing the received light 208a-c with the emitted light beams 202a-c, at least one aspect of the one or more objects in the environment of the LIDAR device 200 may be determined. For example, by comparing a time when the plurality of light beams 202a-c were emitted by the plurality of light sources 222a-c and a time when the plurality of detectors 232a-c received the focused light 208a-c, a distance between the LIDAR device 200 and the one or more objects in the environment of the LIDAR device 200 may be determined. In some examples, other aspects such as shape, color, material, etc. may also be determined.
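The time-of-flight comparison reduces to dividing the round-trip travel time by two and multiplying by the speed of light; the timestamps in the snippet below are hypothetical, and the snippet ignores detector and electronics delays.

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def range_from_round_trip(emit_time_s, detect_time_s):
    """Range estimate from the emit/detect times of one pulse: the light covers
    the distance to the object twice, so halve the round-trip path."""
    return SPEED_OF_LIGHT_M_PER_S * (detect_time_s - emit_time_s) / 2.0

# Hypothetical timestamps: a pulse detected 400 ns after emission -> about 60 m.
print(round(range_from_round_trip(0.0, 400e-9), 2), "m")
```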
In some examples, the LIDAR device 200 may be rotated about an axis to determine a three-dimensional map of the surroundings of the LIDAR device 200. For example, the LIDAR device 200 may be rotated about a substantially vertical axis as illustrated by arrow 290. Although the arrow 290 illustrates the LIDAR device 200 rotating counterclockwise about the axis, additionally or alternatively, the LIDAR device 200 may be rotated in the clockwise direction. In some examples, the LIDAR device 200 may be rotated 360 degrees about the axis. In other examples, the LIDAR device 200 may be rotated back and forth along a portion of the 360 degree view of the LIDAR device 200. For example, the LIDAR device 200 may be mounted on a platform that wobbles back and forth about the axis without making a complete rotation.
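One common way to assemble such a map is to convert each range sample, together with the housing azimuth and beam elevation at which it was taken, into a Cartesian point; the function below is a generic polar-to-Cartesian conversion with hypothetical sample values, not a description of any particular implementation in this disclosure.

```python
import math

def to_cartesian(range_m, azimuth_deg, elevation_deg):
    """Convert one range sample at a given housing azimuth and beam elevation
    into an (x, y, z) point, with z along the rotation axis."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    horizontal = range_m * math.cos(el)
    return (horizontal * math.cos(az), horizontal * math.sin(az), range_m * math.sin(el))

# Hypothetical sweep: the same 60 m return seen at three azimuths with a -2 degree beam.
print([to_cartesian(60.0, az, -2.0) for az in (0.0, 90.0, 180.0)])
```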
FIG. 3A is a perspective view of an example LIDAR device 300 fitted with various components, in accordance with at least some embodiments described herein. FIG. 3B is a perspective view of the example LIDAR device 300 shown in FIG. 3A with the various components removed to illustrate the interior space of the housing 310. The structure, function, and operation of the LIDAR device 300 are similar to those of the LIDAR devices 100 and 200 described, respectively, in FIGS. 1 and 2. For example, the LIDAR device 300 includes a housing 310 that houses a transmit block 320, a receive block 330, and a lens 350 that are similar, respectively, to the housing 110, the transmit block 120, the receive block 130, and the lens 150 described in FIG. 1. Additionally, collimated light beams 304 propagate from the lens 350 toward an environment of the LIDAR device 300 and reflect off one or more objects in the environment as reflected light 306, similarly to the collimated light beams 104 and reflected light 106 described in FIG. 1.
The LIDAR device 300 can be mounted on a mounting structure 360 and rotated about an axis to provide a 360 degree view of the environment surrounding the LIDAR device 300. In some examples, the mounting structure 360 may comprise a movable platform that may tilt in one or more directions to change the axis of rotation of the LIDAR device 300.
As illustrated in FIG. 3B, the various components of the LIDAR device 300 can be removably mounted to the housing 310. For example, the transmit block 320 may comprise one or more printed circuit boards (PCBs) that are fitted in the portion of the housing 310 where the transmit block 320 can be mounted. Additionally, the receive block 330 may comprise a plurality of detectors 332 mounted to a flexible substrate and can be removably mounted to the housing 310 as a block that includes the plurality of detectors 332. Similarly, the lens 350 can be mounted to another side of the housing 310.
A plurality of light beams 302 can be transmitted by the transmit block 320 into the shared space 340 and towards the lens 350 to be collimated into the collimated light beams 304. Similarly, the reflected light 306 can be focused by the lens 350 and directed through the shared space 340 onto the receive block 330.
FIG. 4 illustrates an example transmit block 420, in accordance with at least some embodiments described herein. Transmit block 420 can correspond to the transmit blocks 120, 220, and 320 described in FIGS. 1-3. For example, the transmit block 420 includes a plurality of light sources 422a-c similar to the plurality of light sources 222a-c included in the transmit block 220 of FIG. 2. Additionally, the light sources 422a-c are arranged along a focal surface 428, which is curved in a vertical plane. The light sources 422a-c are configured to emit a plurality of light beams 402a-c that converge and propagate through an exit aperture 426 in a wall 444.
Although the plurality of light sources 422a-c can be arranged along a focal surface 428 that is curved in a vertical plane, additionally or alternatively, the plurality of light sources 422a-c can be arranged along a focal surface that is curved in a horizontal plane or a focal surface that is curved both vertically and horizontally. For example, the plurality of light sources 422a-c can be arranged in a curved three-dimensional grid pattern. For instance, the transmit block 420 may comprise a plurality of printed circuit boards (PCBs) mounted vertically such that a column of light sources, such as the plurality of light sources 422a-c, runs along the vertical axis of each PCB, and each of the PCBs can be arranged adjacent to other vertically mounted PCBs along a horizontally curved plane to provide the three-dimensional grid pattern.
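As a rough illustration of such a curved three-dimensional grid, the sketch below lays out vertical columns of emitters (one column per PCB) along a horizontal arc; the counts, arc radius, arc span, and vertical pitch are hypothetical values chosen for this sketch.

```python
import math

def pcb_grid_positions(num_pcbs, sources_per_pcb, arc_radius_mm, arc_span_deg, vertical_pitch_mm):
    """One vertical column of sources per PCB, with the PCBs spaced evenly along
    a horizontal arc, producing a curved 3-D grid of (x, y, z) positions in mm."""
    grid = []
    for i in range(num_pcbs):
        # Spread the PCBs evenly across the horizontal arc.
        angle = math.radians(-arc_span_deg / 2.0 + i * arc_span_deg / max(num_pcbs - 1, 1))
        x = arc_radius_mm * math.cos(angle)
        y = arc_radius_mm * math.sin(angle)
        for j in range(sources_per_pcb):
            z = (j - (sources_per_pcb - 1) / 2.0) * vertical_pitch_mm
            grid.append((x, y, z))
    return grid

# Hypothetical layout: 4 PCBs over a 20 degree arc of radius 120 mm, 3 sources per column.
print(len(pcb_grid_positions(4, 3, 120.0, 20.0, 5.0)))
```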
As shown in FIG. 4, the light beams 402a-c converge towards the exit aperture 426 which allows the size of the exit aperture 426 to be minimized while accommodating vertical and horizontal extents of the light beams 402a-c similarly to the exit aperture 226 described in FIG. 2.
As noted above in the description of FIG. 1, the light from light sources 122 could be partially collimated to fit through the exit aperture 126. FIGS. 5A, 5B, and 5C illustrate an example of how such partial collimation could be achieved. In this example, a light source 500 is made up of a laser diode 502 and a cylindrical lens 504. As shown in FIG. 5A, laser diode 502 has an aperture 506 with a shorter dimension corresponding to a fast axis 508 and a longer dimension corresponding to a slow axis 510. FIGS. 5B and 5C show an uncollimated laser beam 512 being emitted from laser diode 502. Laser beam 512 diverges in two directions, one direction defined by fast axis 508 and another, generally orthogonal direction defined by slow axis 510. FIG. 5B shows the divergence of laser beam 512 along fast axis 508, whereas FIG. 5C shows the divergence of laser beam 512 along slow axis 510. Laser beam 512 diverges more quickly along fast axis 508 than along slow axis 510.
In one specific example, laser diode 502 is an Osram SPL DL90_3 nanostack pulsed laser diode that emits pulses of light with a range of wavelengths from about 896 nm to about 910 nm (a nominal wavelength of 905 nm). In this specific example, the aperture has a shorter dimension of about 10 microns, corresponding to its fast axis, and a longer dimension of about 200 microns, corresponding to its slow axis. The divergence of the laser beam in this specific example is about 25 degrees along the fast axis and about 11 degrees along the slow axis. It is to be understood that this specific example is illustrative only. Laser diode 502 could have a different configuration, different aperture sizes, different beam divergences, and/or emit different wavelengths.
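Using the figures quoted above, a simple linear model shows why the fast axis dominates the beam size a short distance from the diode even though its aperture dimension is smaller; the 5 mm evaluation distance below is an arbitrary choice for illustration.

```python
import math

def beam_width_at_distance(aperture_dim_um, full_divergence_deg, distance_mm):
    """Approximate beam width (mm) at a given distance from the emitting
    aperture: starting width plus linear growth from the full divergence angle."""
    start_mm = aperture_dim_um / 1000.0
    growth_mm = 2.0 * distance_mm * math.tan(math.radians(full_divergence_deg) / 2.0)
    return start_mm + growth_mm

# Fast axis: 10 um aperture, ~25 degree divergence. Slow axis: 200 um, ~11 degrees.
print(round(beam_width_at_distance(10, 25, 5.0), 3), "mm (fast axis at 5 mm)")
print(round(beam_width_at_distance(200, 11, 5.0), 3), "mm (slow axis at 5 mm)")
```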
As shown in FIGS. 5B and 5C, cylindrical lens 504 may be positioned in front of aperture 506 with its cylinder axis 514 generally parallel to slow axis 510 and perpendicular to fast axis 508. In this arrangement, cylindrical lens 504 can pre-collimate laser beam 512 along fast axis 508, resulting in partially collimated laser beam 516. In some examples, this pre-collimation may reduce the divergence along fast axis 508 to about one degree or less. Nonetheless, laser beam 516 is only partially collimated because the divergence along slow axis 510 may be largely unchanged by cylindrical lens 504. Thus, whereas uncollimated laser beam 512 emitted by laser diode 502 has a higher divergence along fast axis 508 than along slow axis 510, partially collimated laser beam 516 provided by cylindrical lens 504 may have a higher divergence along slow axis 510 than along fast axis 508. Further, the divergences along slow axis 510 in uncollimated laser beam 512 and in partially collimated laser beam 516 may be substantially equal.
In one example, cylindrical lens 504 is a microrod lens with a diameter of about 600 microns that is placed about 250 microns in front of aperture 506. The material of the microrod lens could be, for example, fused silica or a borosilicate crown glass, such as Schott BK7. Alternatively, the microrod lens could be a molded plastic cylinder or acylinder. Cylindrical lens 504 could also be used to provide magnification along fast axis 508. For example, if the dimensions of aperture 506 are 10 microns by 200 microns, as previously described, and cylindrical lens 504 is a microrod lens as described above, then cylindrical lens 504 may magnify the shorter dimension (corresponding to fast axis 508) by about 20 times. This magnification effectively stretches out the shorter dimension of aperture 506 to about the same size as the longer dimension. As a result, when light from laser beam 516 is focused, for example, onto a detector, the focused spot could have a substantially square shape instead of the rectangular slit shape of aperture 506.
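The magnification arithmetic in this example can be checked directly: stretching the 10 micron fast-axis dimension by about 20 times yields roughly 200 microns, matching the slow-axis dimension and giving the nearly square effective source described above. The snippet below simply carries out that multiplication.

```python
def effective_aperture_after_fast_axis_magnification(fast_um, slow_um, magnification):
    """Apply a cylindrical-lens magnification to the fast-axis dimension only and
    report the effective source dimensions and their aspect ratio."""
    stretched_fast_um = fast_um * magnification
    return stretched_fast_um, slow_um, stretched_fast_um / slow_um

# Figures quoted above: a 10 um x 200 um aperture and roughly 20x fast-axis magnification.
fast, slow, aspect = effective_aperture_after_fast_axis_magnification(10.0, 200.0, 20.0)
print(f"effective source: {fast:.0f} um x {slow:.0f} um (aspect ratio {aspect:.2f})")
```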
FIG. 6A illustrates an example receive block 630, in accordance with at least some embodiments described herein. FIG. 6B illustrates a side view of three detectors 632a-c included in the receive block 630 of FIG. 6A. Receive block 630 can correspond to the receive blocks 130, 230, and 330 described in FIGS. 1-3. For example, the receive block 630 includes a plurality of detectors 632a-c arranged along a curved surface 638 defined by a lens 650, similarly to the receive block 230, the detectors 232a-c, and the curved focal surface 238 described in FIG. 2. Focused light 608a-c from the lens 650 propagates along a receive path that includes a reflective surface 642 onto the detectors 632a-c, similar, respectively, to the focused light 208a-c, the lens 250, the reflective surface 242, and the detectors 232a-c described in FIG. 2.
The receive block 630 comprises a flexible substrate 680 on which the plurality of detectors 632a-c are arranged along the curved surface 638. The flexible substrate 680 conforms to the curved surface 638 by being mounted to a receive block housing 690 having the curved surface 638. As illustrated in FIGS. 6A and 6B, the curved surface 638 includes the arrangement of the detectors 632a-c curved along a vertical and horizontal axis of the receive block 630.
FIGS. 7A and 7B illustrate an example lens 750 with an aspheric surface 752 and a toroidal surface 754, in accordance with at least some embodiments described herein. FIG. 7B illustrates a cross-section view of the example lens 750 shown in FIG. 7A. The lens 750 can correspond to the lenses 150, 250, and 350 included in FIGS. 1-3. For example, the lens 750 can be configured to both collimate light incident on the toroidal surface 754 from a light source into collimated light propagating out of the aspheric surface 752, and focus reflected light entering from the aspheric surface 752 onto a detector. The structure of the lens 750, including the aspheric surface 752 and the toroidal surface 754, allows the lens 750 to perform both the collimating and focusing functions described in the example above.
In some examples, the lens 750 defines a focal surface of the light propagating through the lens 750 due to the aspheric surface 752 and the toroidal surface 754. In these examples, the light sources providing the light entering the toroidal surface 754 can be arranged along the defined focal surface, and the detectors receiving the light focused from the light entering the aspheric surface 752 can also be arranged along the defined focal surface.
By using the lens 750 that performs both of these functions (collimating transmitted light and focusing received light), instead of a transmit lens for collimating and a receive lens for focusing, advantages with respect to size, cost, and/or complexity can be provided.
FIG. 8A illustrates an example LIDAR device 810 mounted on a vehicle 800, in accordance with at least some embodiments described herein. FIG. 8A shows a Right Side View, Front View, Back View, and Top View of the vehicle 800. Although vehicle 800 is illustrated in FIG. 8A as a car, other examples are possible. For instance, the vehicle 800 could represent a truck, a van, a semi-trailer truck, a motorcycle, a golf cart, an off-road vehicle, or a farm vehicle, among other examples.
The structure, function, and operation of the LIDAR device 810 shown in FIG. 8A is similar to the example LIDAR devices 100, 200, and 300 shown in FIGS. 1-3. For example, the LIDAR device 810 can be configured to rotate about an axis and determine a three-dimensional map of a surrounding environment of the LIDAR device 810. To facilitate the rotation, the LIDAR device 810 can be mounted on a platform 802. In some examples, the platform 802 may comprise a movable mount that allows the vehicle 800 to control the axis of rotation of the LIDAR device 810.
While the LIDAR device 810 is shown to be mounted in a particular location on the vehicle 800, in some examples, the LIDAR device 810 may be mounted elsewhere on the vehicle 800. For example, the LIDAR device 810 may be mounted anywhere on top of the vehicle 800, on a side of the vehicle 800, under the vehicle 800, on a hood of the vehicle 800, and/or on a trunk of the vehicle 800.
The LIDAR device 810 includes a lens 812 through which collimated light is transmitted from the LIDAR device 810 to the surrounding environment of the LIDAR device 810, similarly to the lenses 150, 250, and 350 described in FIGS. 1-3. The lens 812 can also be configured to receive reflected light from the surrounding environment of the LIDAR device 810 that was reflected off one or more objects in the surrounding environment.
FIG. 8B illustrates a scenario in which the LIDAR device 810 shown in FIG. 8A is scanning an environment 830 that includes one or more objects, in accordance with at least some embodiments described herein. In this example scenario, vehicle 800 can be traveling on a road 822 in the environment 830. By rotating the LIDAR device 810 about the axis defined by the platform 802, the LIDAR device 810 may be able to determine aspects of objects in the surrounding environment 830, such as lane lines 824a-b, other vehicles 826a-c, and/or a street sign 828. Thus, the LIDAR device 810 can provide the vehicle 800 with information about the objects in the surrounding environment 830, including distance, shape, color, and/or material type of the objects.
FIG. 9 is a flowchart of a method 900 of operating a LIDAR device, in accordance with at least some embodiments described herein. Method 900 shown in FIG. 9 presents an embodiment of a method that could be used with the LIDAR devices 100, 200, and 300, for example. Method 900 may include one or more operations, functions, or actions as illustrated by one or more of blocks 902-912. Although the blocks are illustrated in a sequential order, these blocks may in some instances be performed in parallel, and/or in a different order than those described herein. Also, the various blocks may be combined into fewer blocks, divided into additional blocks, and/or removed based upon the desired implementation.
In addition, for the method 900 and other processes and methods disclosed herein, the flowchart shows functionality and operation of one possible implementation of present embodiments. In this regard, each block may represent a module, a segment, or a portion of a manufacturing or operation process.
At block 902, the method 900 includes rotating a housing of a light detection and ranging (LIDAR) device about an axis, wherein the housing has an interior space that includes a transmit block, a receive block, and a shared space, wherein the transmit block has an exit aperture, and wherein the receive block has an entrance aperture.
At block 904, the method 900 includes emitting, by a plurality of light sources in the transmit block, a plurality of light beams that enter the shared space via a transmit path, the light beams comprising light having wavelengths in a wavelength range.
At block 906, the method 900 includes receiving the light beams at a lens mounted to the housing along the transmit path.
At block 908, the method 900 includes collimating, by the lens, the light beams for transmission into an environment of the LIDAR device.
At block 910, the method 900 includes focusing, by the lens, the collected light onto a plurality of detectors in the receive block via a receive path that extends through the shared space and the entrance aperture of the receive block.
At block 912, the method 900 includes detecting, by the plurality of detectors in the receive block, light from the focused light having wavelengths in the wavelength range.
For example, a LIDAR device such as the LIDAR device 200 can be rotated about an axis (block 902). A transmit block, such as the transmit block 220, can include a plurality of light sources that emit light beams having wavelengths in a wavelength range, through an exit aperture and a shared space to a lens (block 904). The light beams can be received by the lens (block 906) and collimated for transmission to an environment of the LIDAR device (block 908). The collimated light may then reflect off one or more objects in the environment of the LIDAR device and return as reflected light collected by the lens. The lens may then focus the collected light onto a plurality of detectors in the receive block via a receive path that extends through the shared space and an entrance aperture of the receive block (block 910). The plurality of detectors in the receive block may then detect light from the focused light having wavelengths in the wavelength range of the emitted light beams from the light sources (block 912).
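For orientation only, the sketch below arranges the blocks of method 900 as a software control loop around a hypothetical driver object; the `rotate_to`, `fire_light_sources`, and `read_detectors` methods are placeholders invented for this sketch and do not correspond to any interface described in this disclosure.

```python
import time

def run_scan(lidar, revolutions=1, steps_per_revolution=360):
    """Illustrative loop mirroring blocks 902-912: step the housing through a
    rotation, fire the light sources, and record per-detector round-trip times."""
    samples = []
    for _ in range(revolutions):
        for step in range(steps_per_revolution):
            lidar.rotate_to(step * 360.0 / steps_per_revolution)    # block 902
            emit_time = time.monotonic()
            lidar.fire_light_sources()                               # block 904
            # Blocks 906-910 (receiving, collimating, focusing) occur in the shared optics.
            for detector_id, detect_time in lidar.read_detectors():  # block 912
                samples.append((step, detector_id, detect_time - emit_time))
    return samples
```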
Within examples, devices and operation methods described include a LIDAR device rotated about an axis and configured to transmit collimated light and focus reflected light. The collimation and focusing can be performed by a shared lens. By using a shared lens that performs both of these functions, instead of a transmit lens for collimating and a receive lens for focusing, advantages with respect to size, cost, and/or complexity can be provided. Additionally, in some examples, the shared lens can define a curved focal surface. In these examples, the light sources emitting light through the shared lens and the detectors receiving light focused by the shared lens can be arranged along the curved focal surface defined by the shared lens.
It should be understood that arrangements described herein are for purposes of example only. As such, those skilled in the art will appreciate that other arrangements and other elements (e.g. machines, interfaces, functions, orders, and groupings of functions, etc.) can be used instead, and some elements may be omitted altogether according to the desired results. Further, many of the elements that are described are functional entities that may be implemented as discrete or distributed components or in conjunction with other components, in any suitable combination and location, or other structural elements described as independent structures may be combined.
While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope being indicated by the following claims, along with the full scope of equivalents to which such claims are entitled. It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only, and is not intended to be limiting.

Claims (20)

What is claimed is:
1. A light detection and ranging (LIDAR) device, comprising:
a lens mounted to a housing, wherein the housing is configured to rotate about an axis and has an interior space that includes a transmit block, a receive block, a transmit path, and a receive path, wherein the transmit block has an exit aperture, wherein the receive block has an entrance aperture, wherein the transmit path extends from the exit aperture to the lens, wherein the receive path extends from the lens to the entrance aperture, and wherein the transmit path at least partially overlaps the receive path in the interior space between the transmit block and the receive block;
a plurality of light sources in the transmit block, wherein the plurality of light sources are configured to emit a plurality of light beams through the exit aperture in a plurality of different directions, the light beams comprising light having wavelengths in a wavelength range;
a plurality of detectors in the receive block, wherein the plurality of detectors are configured to detect light having wavelengths in the wavelength range; and
wherein the lens is configured to receive the light beams via the transmit path, collimate the light beams for transmission into an environment of the LIDAR device, collect light comprising light from one or more of the collimated light beams reflected by one or more objects in the environment of the LIDAR device, and focus the collected light onto the detectors via the receive path.
2. The LIDAR device of claim 1, wherein each detector in the plurality of detectors is associated with a corresponding light source in the plurality of light sources, and wherein the lens is configured to focus onto each detector a respective portion of the collected light that comprises light from the detector's corresponding light source.
3. The LIDAR device of claim 1, wherein the exit aperture is in a wall that comprises a reflective surface.
4. The LIDAR device of claim 3, wherein the receive path extends from the lens to the entrance aperture via the reflective surface.
5. The LIDAR device of claim 3, wherein the wall comprises a transparent material, the reflective surface covers a portion of the transparent material, and the exit aperture corresponds to a portion of the transparent material that is not covered by the reflective surface.
6. The LIDAR device of claim 1, wherein the lens defines a curved focal surface in the transmit block and a curved focal surface in the receive block.
7. The LIDAR device of claim 6, wherein the light sources in the plurality of light sources are arranged in a pattern substantially corresponding to the curved focal surface in the transmit block, and wherein the detectors in the plurality of detectors are arranged in a pattern substantially corresponding to the curved focal surface in the receive block.
8. The LIDAR device of claim 1, wherein the lens has an aspheric surface and a toroidal surface.
9. The LIDAR device of claim 8, wherein the toroidal surface is in the interior space within the housing and the aspheric surface is outside of the housing.
10. The LIDAR device of claim 1, wherein the [axis is substantially vertical] housing is configured to rotate about an axis.
11. The LIDAR device of claim 1, further comprising a mirror in the transmit block, wherein the mirror is configured to reflect the light beams toward the exit aperture.
12. The LIDAR device of claim 1, wherein the receive block comprises a sealed environment containing an inert gas.
13. The LIDAR device of claim 1, wherein the entrance aperture comprises a [material that passes light having wavelengths in the wavelength range and attenuates light having other wavelengths] filtering window.
14. The LIDAR device of claim 1, wherein each light source in the plurality of light sources comprises a respective laser diode.
15. The LIDAR device of claim 1, wherein each detector in the plurality of detectors comprises a respective avalanche photodiode.
16. A method comprising:
[rotating a housing of] operating a light detection and ranging (LIDAR) device [about an axis] comprising a housing, wherein the housing mounts a lens and has an interior space that includes a transmit block, a receive block, a transmit path, and a receive path, wherein the transmit block has an exit aperture, wherein the receive block has an entrance aperture, wherein the transmit path extends from the exit aperture to the lens, wherein the receive path extends from the lens to the entrance aperture, and wherein the transmit path at least partially overlaps the receive path in the interior space between the transmit block and the receive block[;], wherein operating the LIDAR device comprises:
emitting, by a plurality of light sources in the transmit block, a plurality of light beams through the exit aperture in a plurality of different directions, the light beams comprising light having wavelengths in a wavelength range;
receiving, by the lens, the light beams via the transmit path;
collimating, by the lens, the light beams for transmission into an environment of the LIDAR device;
collecting, by the lens, light from one or more of the collimated light beams reflected by one or more objects in the environment of the LIDAR device;
focusing, by the lens, the collected light onto a plurality of detectors in the receive block via the receive path; and
detecting, by the plurality of detectors in the receive block, light from the focused light having wavelengths in the wavelength range.
17. The method of claim 16, wherein each detector in the plurality of detectors is associated with a corresponding light source in the plurality of light sources, [the method further comprising] wherein operating the LIDAR device further comprises:
focusing onto each detector, by the lens, a respective portion of the collected light that comprises light from the detector's corresponding light source.
18. The method of claim 16, wherein the exit aperture is in a wall that comprises a reflective surface, and wherein the receive path extends from the lens to the entrance aperture via the reflective surface, [further comprising] wherein operating the LIDAR device further comprises:
reflecting, by the reflective surface, the collected light that is focused by the lens onto the plurality of detectors in the receive block via the receive path.
19. The method of claim 16, [further comprising] wherein operating the LIDAR device further comprises:
reflecting, by a mirror in the transmit block, the emitted light beams toward the exit aperture.
20. The LIDAR device of claim 10, wherein the axis is substantially vertical.
US15/919,479 2013-08-20 2018-03-13 Devices and methods for a LIDAR platform with a shared transmit/receive path Active USRE48042E1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US15/919,479 USRE48042E1 (en) 2013-08-20 2018-03-13 Devices and methods for a LIDAR platform with a shared transmit/receive path
US16/890,789 USRE48874E1 (en) 2013-08-20 2020-06-02 Devices and methods for a rotating LIDAR platform with a shared transmit/receive path

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US13/971,606 US8836922B1 (en) 2013-08-20 2013-08-20 Devices and methods for a rotating LIDAR platform with a shared transmit/receive path
US14/462,075 US9285464B2 (en) 2013-08-20 2014-08-18 Devices and methods for a rotating LIDAR platform with a shared transmit/receive path
US15/919,479 USRE48042E1 (en) 2013-08-20 2018-03-13 Devices and methods for a LIDAR platform with a shared transmit/receive path

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/462,075 Reissue US9285464B2 (en) 2013-08-20 2014-08-18 Devices and methods for a rotating LIDAR platform with a shared transmit/receive path

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/462,075 Continuation US9285464B2 (en) 2013-08-20 2014-08-18 Devices and methods for a rotating LIDAR platform with a shared transmit/receive path

Publications (1)

Publication Number Publication Date
USRE48042E1 true USRE48042E1 (en) 2020-06-09

Family

ID=51493404

Family Applications (4)

Application Number Title Priority Date Filing Date
US13/971,606 Active US8836922B1 (en) 2013-08-20 2013-08-20 Devices and methods for a rotating LIDAR platform with a shared transmit/receive path
US14/462,075 Ceased US9285464B2 (en) 2013-08-20 2014-08-18 Devices and methods for a rotating LIDAR platform with a shared transmit/receive path
US15/919,479 Active USRE48042E1 (en) 2013-08-20 2018-03-13 Devices and methods for a LIDAR platform with a shared transmit/receive path
US16/890,789 Active USRE48874E1 (en) 2013-08-20 2020-06-02 Devices and methods for a rotating LIDAR platform with a shared transmit/receive path

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US13/971,606 Active US8836922B1 (en) 2013-08-20 2013-08-20 Devices and methods for a rotating LIDAR platform with a shared transmit/receive path
US14/462,075 Ceased US9285464B2 (en) 2013-08-20 2014-08-18 Devices and methods for a rotating LIDAR platform with a shared transmit/receive path

Family Applications After (1)

Application Number Title Priority Date Filing Date
US16/890,789 Active USRE48874E1 (en) 2013-08-20 2020-06-02 Devices and methods for a rotating LIDAR platform with a shared transmit/receive path

Country Status (6)

Country Link
US (4) US8836922B1 (en)
EP (2) EP3036562B1 (en)
JP (2) JP6249577B2 (en)
KR (3) KR102095895B1 (en)
CN (2) CN111487600A (en)
WO (1) WO2015026471A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11933967B2 (en) 2019-08-22 2024-03-19 Red Creamery, LLC Distally actuated scanning mirror
US12123950B2 (en) 2016-02-15 2024-10-22 Red Creamery, LLC Hybrid LADAR with co-planar scanning and imaging field-of-view

Families Citing this family (269)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USRE46672E1 (en) 2006-07-13 2018-01-16 Velodyne Lidar, Inc. High definition LiDAR system
US9455783B2 (en) 2013-05-06 2016-09-27 Federal Law Enforcement Development Services, Inc. Network security and variable pulse wave form with continuous communication
US20090129782A1 (en) 2007-05-24 2009-05-21 Federal Law Enforcement Development Service, Inc. Building illumination apparatus with integrated communications, security and energy management
US11265082B2 (en) 2007-05-24 2022-03-01 Federal Law Enforcement Development Services, Inc. LED light control assembly and system
US9100124B2 (en) 2007-05-24 2015-08-04 Federal Law Enforcement Development Services, Inc. LED Light Fixture
US9414458B2 (en) 2007-05-24 2016-08-09 Federal Law Enforcement Development Services, Inc. LED light control assembly and system
US8890773B1 (en) 2009-04-01 2014-11-18 Federal Law Enforcement Development Services, Inc. Visible light transceiver glasses
DE102011119707A1 (en) * 2011-11-29 2013-05-29 Valeo Schalter Und Sensoren Gmbh Optical measuring device
US9063549B1 (en) * 2013-03-06 2015-06-23 Google Inc. Light detection and ranging device with oscillating mirror driven by magnetically interactive coil
US10132928B2 (en) 2013-05-09 2018-11-20 Quanergy Systems, Inc. Solid state optical phased array lidar and method of using same
US10126412B2 (en) * 2013-08-19 2018-11-13 Quanergy Systems, Inc. Optical phased array lidar system and method of using same
US8836922B1 (en) 2013-08-20 2014-09-16 Google Inc. Devices and methods for a rotating LIDAR platform with a shared transmit/receive path
US9368936B1 (en) 2013-09-30 2016-06-14 Google Inc. Laser diode firing system
US10203399B2 (en) 2013-11-12 2019-02-12 Big Sky Financial Corporation Methods and apparatus for array based LiDAR systems with reduced interference
US20150198941A1 (en) 2014-01-15 2015-07-16 John C. Pederson Cyber Life Electronic Networking and Commerce Operating Exchange
US9360554B2 (en) 2014-04-11 2016-06-07 Facet Technology Corp. Methods and apparatus for object detection and identification in a multiple detector lidar array
US9753351B2 (en) 2014-06-30 2017-09-05 Quanergy Systems, Inc. Planar beam forming and steering optical phased array chip and method of using same
US9869753B2 (en) 2014-08-15 2018-01-16 Quanergy Systems, Inc. Three-dimensional-mapping two-dimensional-scanning lidar based on one-dimensional-steering optical phased arrays and method of using same
US10036803B2 (en) * 2014-10-20 2018-07-31 Quanergy Systems, Inc. Three-dimensional lidar sensor based on two-dimensional scanning of one-dimensional optical emitter and method of using same
US10036801B2 (en) 2015-03-05 2018-07-31 Big Sky Financial Corporation Methods and apparatus for increased precision and improved range in a multiple detector LiDAR array
US10088557B2 (en) 2015-03-20 2018-10-02 MSOTEK Co., Ltd LIDAR apparatus
US9625582B2 (en) 2015-03-25 2017-04-18 Google Inc. Vehicle with multiple light detection and ranging devices (LIDARs)
US9880263B2 (en) * 2015-04-06 2018-01-30 Waymo Llc Long range steerable LIDAR system
CN106291580B (en) 2015-06-12 2019-08-23 上海珏芯光电科技有限公司 Laser infrared radar imaging system
DE102015110767A1 (en) * 2015-07-03 2017-01-05 Valeo Schalter Und Sensoren Gmbh Detector unit for an optical sensor device
KR102422783B1 (en) 2015-08-03 2022-07-19 엘지이노텍 주식회사 Apparatus for light detection and ranging
US20170048953A1 (en) 2015-08-11 2017-02-16 Federal Law Enforcement Development Services, Inc. Programmable switch and system
US9992477B2 (en) 2015-09-24 2018-06-05 Ouster, Inc. Optical system for collecting distance information within a field
US10063849B2 (en) 2015-09-24 2018-08-28 Ouster, Inc. Optical system for collecting distance information within a field
US10557939B2 (en) 2015-10-19 2020-02-11 Luminar Technologies, Inc. Lidar system with improved signal-to-noise ratio in the presence of solar background noise
US9720415B2 (en) 2015-11-04 2017-08-01 Zoox, Inc. Sensor-based object-detection optimization for autonomous vehicles
EP3371625A4 (en) 2015-11-05 2019-05-15 Luminar Technologies, Inc. Lidar system with improved scanning speed for high-resolution depth mapping
US10557940B2 (en) 2015-11-30 2020-02-11 Luminar Technologies, Inc. Lidar system
KR101909327B1 (en) * 2015-12-11 2018-10-17 전자부품연구원 Scanning lidar having optical structures that share a transmission receiving lens
DE102015121839A1 (en) * 2015-12-15 2017-06-22 Sick Ag Optoelectronic sensor and method for detecting an object
KR20170071181A (en) * 2015-12-15 2017-06-23 한화테크윈 주식회사 Cooling apparatus for rotating device
DE102015121840A1 (en) 2015-12-15 2017-06-22 Sick Ag Optoelectronic sensor and method for detecting an object
WO2017110573A1 (en) * 2015-12-24 2017-06-29 コニカミノルタ株式会社 Light projection/reception unit, and radar
EP3396403A4 (en) * 2015-12-24 2018-12-26 Konica Minolta, Inc. Light projection/reception unit, and radar
KR102204980B1 (en) * 2015-12-29 2021-01-19 한국전자기술연구원 Scanning lidar with variable vertical scanning range
JP6763971B2 (en) * 2016-01-29 2020-09-30 アウスター インコーポレイテッド Systems and methods for calibrating optical distance sensors
US10627490B2 (en) 2016-01-31 2020-04-21 Velodyne Lidar, Inc. Multiple pulse, LIDAR based 3-D imaging
EP3408684A4 (en) * 2016-01-31 2019-10-02 Velodyne LiDAR, Inc. Lidar based 3-d imaging with far-field illumination overlap
CA3017735C (en) 2016-03-19 2023-03-14 Velodyne Lidar, Inc. Integrated illumination and detection for lidar based 3-d imaging
EP3226031A1 (en) * 2016-03-29 2017-10-04 Leica Geosystems AG Laser scanner
US10761195B2 (en) 2016-04-22 2020-09-01 OPSYS Tech Ltd. Multi-wavelength LIDAR system
US10838062B2 (en) 2016-05-24 2020-11-17 Veoneer Us, Inc. Direct detection LiDAR system and method with pulse amplitude modulation (AM) transmitter and quadrature receiver
US10416292B2 (en) 2016-05-24 2019-09-17 Veoneer Us, Inc. Direct detection LiDAR system and method with frequency modulation (FM) transmitter and quadrature receiver
KR102235710B1 (en) * 2016-05-31 2021-04-02 한국전자기술연구원 Scanning lidar having optical structures with transmission receiving single lens
WO2017210418A1 (en) 2016-06-01 2017-12-07 Velodyne Lidar, Inc. Multiple pixel scanning lidar
US10212785B2 (en) 2016-06-13 2019-02-19 Google Llc Staggered array of individually addressable light-emitting elements for sweeping out an angular range
US9909862B2 (en) * 2016-06-13 2018-03-06 Google Llc Curved array of light-emitting elements for sweeping out an angular range
WO2018021800A1 (en) * 2016-07-25 2018-02-01 엘지이노텍 주식회사 Light-receiving device and lidar
KR102209500B1 (en) 2016-08-02 2021-02-01 연용현 Lidar apparatus
US10948572B2 (en) 2016-08-24 2021-03-16 Ouster, Inc. Optical system for collecting distance information within a field
US10066986B2 (en) * 2016-08-31 2018-09-04 GM Global Technology Operations LLC Light emitting sensor having a plurality of secondary lenses of a moveable control structure for controlling the passage of light between a plurality of light emitters and a primary lens
CN109997057B (en) * 2016-09-20 2020-07-14 创新科技有限公司 Laser radar system and method
KR102210101B1 (en) * 2016-09-22 2021-02-02 한국전자기술연구원 Optical structure and scanning LiDAR having the same
RU2718483C2 (en) * 2016-09-23 2020-04-08 Общество с ограниченной ответственностью "Гардиан Стекло Сервиз" System and/or method of identifying coating for glass
FR3056524B1 (en) * 2016-09-28 2018-10-12 Valeo Systemes D'essuyage DETECTION SYSTEM FOR MOTOR VEHICLE
KR102455389B1 (en) * 2016-09-30 2022-10-14 매직 립, 인코포레이티드 Projector with spatial light modulation
US10502830B2 (en) * 2016-10-13 2019-12-10 Waymo Llc Limitation of noise on light detectors using an aperture
US10379540B2 (en) 2016-10-17 2019-08-13 Waymo Llc Light detection and ranging (LIDAR) device having multiple receivers
DE102016220468A1 (en) 2016-10-19 2018-04-19 Robert Bosch Gmbh Lidar sensor for detecting an object
US10845470B2 (en) 2016-11-16 2020-11-24 Waymo Llc Methods and systems for protecting a light detection and ranging (LIDAR) device
USD871412S1 (en) * 2016-11-21 2019-12-31 Datalogic Ip Tech S.R.L. Optical scanner
US10605984B2 (en) 2016-12-01 2020-03-31 Waymo Llc Array of waveguide diffusers for light detection using an aperture
US10502618B2 (en) 2016-12-03 2019-12-10 Waymo Llc Waveguide diffuser for light detection using an aperture
DE102016224509A1 (en) 2016-12-08 2018-06-14 Zf Friedrichshafen Ag Receiver arrangement and method for receiving at least one light pulse and for outputting a received signal
US10001551B1 (en) * 2016-12-19 2018-06-19 Waymo Llc Mirror assembly
US10690754B2 (en) * 2016-12-23 2020-06-23 Cepton Technologies, Inc. Scanning apparatuses and methods for a lidar system
TW201823673A (en) * 2016-12-28 2018-07-01 鴻海精密工業股份有限公司 Laser distance measuring device
US10295660B1 (en) 2016-12-30 2019-05-21 Panosense Inc. Aligning optical components in LIDAR systems
US10109183B1 (en) 2016-12-30 2018-10-23 Panosense Inc. Interface for transferring data between a non-rotating body and a rotating body
US10122416B2 (en) 2016-12-30 2018-11-06 Panosense Inc. Interface for transferring power and data between a non-rotating body and a rotating body
US10591740B2 (en) 2016-12-30 2020-03-17 Panosense Inc. Lens assembly for a LIDAR system
US10742088B2 (en) 2016-12-30 2020-08-11 Panosense Inc. Support assembly for rotating body
US10359507B2 (en) 2016-12-30 2019-07-23 Panosense Inc. Lidar sensor assembly calibration based on reference surface
US10830878B2 (en) * 2016-12-30 2020-11-10 Panosense Inc. LIDAR system
US10048358B2 (en) 2016-12-30 2018-08-14 Panosense Inc. Laser power calibration and correction
US10942257B2 (en) 2016-12-31 2021-03-09 Innovusion Ireland Limited 2D scanning high precision LiDAR using combination of rotating concave mirror and beam steering devices
US10520592B2 (en) * 2016-12-31 2019-12-31 Waymo Llc Light detection and ranging (LIDAR) device with an off-axis receiver
DE102017101945A1 (en) * 2017-02-01 2018-08-02 Osram Opto Semiconductors Gmbh Measuring arrangement with an optical transmitter and an optical receiver
DE102017202957B4 (en) 2017-02-23 2024-06-27 Zf Friedrichshafen Ag Receiver arrangement, semiconductor device and method for receiving light pulses and for outputting a received signal
US11105925B2 (en) 2017-03-01 2021-08-31 Ouster, Inc. Accurate photo detector measurements for LIDAR
WO2018160886A1 (en) * 2017-03-01 2018-09-07 Ouster, Inc. Accurate photo detector measurements for lidar
US10338594B2 (en) * 2017-03-13 2019-07-02 Nio Usa, Inc. Navigation of autonomous vehicles to enhance safety under one or more fault conditions
JP7037830B2 (en) 2017-03-13 2022-03-17 オプシス テック リミテッド Eye safety scanning lidar system
US9810786B1 (en) 2017-03-16 2017-11-07 Luminar Technologies, Inc. Optical parametric oscillator for lidar system
US9810775B1 (en) 2017-03-16 2017-11-07 Luminar Technologies, Inc. Q-switched laser for LIDAR system
US9905992B1 (en) 2017-03-16 2018-02-27 Luminar Technologies, Inc. Self-Raman laser for lidar system
US10365351B2 (en) * 2017-03-17 2019-07-30 Waymo Llc Variable beam spacing, timing, and power for vehicle sensors
CN110691983A (en) * 2017-03-20 2020-01-14 威力登激光雷达有限公司 LIDAR-based 3-D imaging with structured light and integrated illumination and detection
US9869754B1 (en) 2017-03-22 2018-01-16 Luminar Technologies, Inc. Scan patterns for lidar systems
US10254388B2 (en) 2017-03-28 2019-04-09 Luminar Technologies, Inc. Dynamically varying laser output in a vehicle in view of weather conditions
US10209359B2 (en) 2017-03-28 2019-02-19 Luminar Technologies, Inc. Adaptive pulse rate in a lidar system
US10121813B2 (en) 2017-03-28 2018-11-06 Luminar Technologies, Inc. Optical detector having a bandpass filter in a lidar system
US10139478B2 (en) 2017-03-28 2018-11-27 Luminar Technologies, Inc. Time varying gain in an optical detector operating in a lidar system
US10732281B2 (en) 2017-03-28 2020-08-04 Luminar Technologies, Inc. Lidar detector system having range walk compensation
US10267899B2 (en) 2017-03-28 2019-04-23 Luminar Technologies, Inc. Pulse timing based on angle of view
US10061019B1 (en) 2017-03-28 2018-08-28 Luminar Technologies, Inc. Diffractive optical element in a lidar system to correct for backscan
US10007001B1 (en) 2017-03-28 2018-06-26 Luminar Technologies, Inc. Active short-wave infrared four-dimensional camera
US10114111B2 (en) 2017-03-28 2018-10-30 Luminar Technologies, Inc. Method for dynamically controlling laser power
US10545240B2 (en) 2017-03-28 2020-01-28 Luminar Technologies, Inc. LIDAR transmitter and detector system using pulse encoding to reduce range ambiguity
US11119198B2 (en) 2017-03-28 2021-09-14 Luminar, Llc Increasing operational safety of a lidar system
US11181622B2 (en) 2017-03-29 2021-11-23 Luminar, Llc Method for controlling peak and average power through laser receiver
US10663595B2 (en) 2017-03-29 2020-05-26 Luminar Technologies, Inc. Synchronized multiple sensor head system for a vehicle
US10088559B1 (en) 2017-03-29 2018-10-02 Luminar Technologies, Inc. Controlling pulse timing to compensate for motor dynamics
US10983213B2 (en) 2017-03-29 2021-04-20 Luminar Holdco, Llc Non-uniform separation of detector array elements in a lidar system
US10976417B2 (en) 2017-03-29 2021-04-13 Luminar Holdco, Llc Using detectors with different gains in a lidar system
US11002853B2 (en) 2017-03-29 2021-05-11 Luminar, Llc Ultrasonic vibrations on a window in a lidar system
US10191155B2 (en) 2017-03-29 2019-01-29 Luminar Technologies, Inc. Optical resolution in front of a vehicle
US10254762B2 (en) * 2017-03-29 2019-04-09 Luminar Technologies, Inc. Compensating for the vibration of the vehicle
US10969488B2 (en) 2017-03-29 2021-04-06 Luminar Holdco, Llc Dynamically scanning a field of regard using a limited number of output beams
US10641874B2 (en) 2017-03-29 2020-05-05 Luminar Technologies, Inc. Sizing the field of view of a detector to improve operation of a lidar system
US10684360B2 (en) 2017-03-30 2020-06-16 Luminar Technologies, Inc. Protecting detector in a lidar system using off-axis illumination
US9989629B1 (en) 2017-03-30 2018-06-05 Luminar Technologies, Inc. Cross-talk mitigation using wavelength switching
US10241198B2 (en) 2017-03-30 2019-03-26 Luminar Technologies, Inc. Lidar receiver calibration
US10295668B2 (en) 2017-03-30 2019-05-21 Luminar Technologies, Inc. Reducing the number of false detections in a lidar system
US10401481B2 (en) 2017-03-30 2019-09-03 Luminar Technologies, Inc. Non-uniform beam power distribution for a laser operating in a vehicle
US11022688B2 (en) 2017-03-31 2021-06-01 Luminar, Llc Multi-eye lidar system
DE102017205504A1 (en) 2017-03-31 2018-10-04 Robert Bosch Gmbh Optical scanning system
US20180284246A1 (en) 2017-03-31 2018-10-04 Luminar Technologies, Inc. Using Acoustic Signals to Modify Operation of a Lidar System
CA3057988A1 (en) 2017-03-31 2018-10-04 Velodyne Lidar, Inc. Integrated lidar illumination power control
US10641876B2 (en) 2017-04-06 2020-05-05 Quanergy Systems, Inc. Apparatus and method for mitigating LiDAR interference through pulse coding and frequency shifting
US10908282B2 (en) 2017-04-07 2021-02-02 General Electric Company LiDAR system and method
US9904375B1 (en) 2017-04-10 2018-02-27 Uber Technologies, Inc. LIDAR display systems and methods
US10556585B1 (en) 2017-04-13 2020-02-11 Panosense Inc. Surface normal determination for LIDAR range samples by detecting probe pulse stretching
US10677897B2 (en) 2017-04-14 2020-06-09 Luminar Technologies, Inc. Combining lidar and camera data
US10423162B2 (en) 2017-05-08 2019-09-24 Nio Usa, Inc. Autonomous vehicle logic to identify permissioned parking relative to multiple classes of restricted parking
JP2020519881A (en) * 2017-05-08 2020-07-02 ベロダイン ライダー, インク. LIDAR data collection and control
US10649072B2 (en) 2017-05-10 2020-05-12 Massachusetts Institute Of Technology LiDAR device based on scanning mirrors array and multi-frequency laser modulation
US10564261B2 (en) 2017-05-11 2020-02-18 Ford Global Technologies, Llc Autonomous vehicle LIDAR mirror
DE102017208047A1 (en) * 2017-05-12 2018-11-15 Robert Bosch Gmbh LIDAR device and method with simplified detection
US11150347B2 (en) * 2017-05-15 2021-10-19 Ouster, Inc. Micro-optics for optical imager with non-uniform filter
DE102017208900A1 (en) 2017-05-26 2018-11-29 Robert Bosch Gmbh Method and device for scanning a solid angle
US10094916B1 (en) 2017-06-09 2018-10-09 Waymo Llc LIDAR optics alignment systems and methods
US11294035B2 (en) 2017-07-11 2022-04-05 Nuro, Inc. LiDAR system with cylindrical lenses
US11061116B2 (en) 2017-07-13 2021-07-13 Nuro, Inc. Lidar system with image size compensation mechanism
US10369974B2 (en) 2017-07-14 2019-08-06 Nio Usa, Inc. Control and coordination of driverless fuel replenishment for autonomous vehicles
US10710633B2 (en) 2017-07-14 2020-07-14 Nio Usa, Inc. Control of complex parking maneuvers and autonomous fuel replenishment of driverless vehicles
JP2020527805A (en) 2017-07-20 2020-09-10 ニューロ・インコーポレーテッドNuro Incorporated Relocation of autonomous vehicles
US11009868B2 (en) 2017-07-20 2021-05-18 Nuro, Inc. Fleet of autonomous vehicles with lane positioning and platooning behaviors
US20190033460A1 (en) * 2017-07-27 2019-01-31 GM Global Technology Operations LLC Apparatus for increase field of view for lidar detector and illuminator
KR102350621B1 (en) 2017-07-27 2022-01-12 주식회사 엠쏘텍 Lidar apparatus
KR102350613B1 (en) 2017-07-27 2022-01-12 주식회사 엠쏘텍 Irrotational omnidirectional lidar apparatus
JP7319958B2 (en) 2017-07-28 2023-08-02 ニューロ・インコーポレーテッド Designing Adaptive Compartments for Autonomous and Semi-Autonomous Vehicles
KR20220119769A (en) 2017-07-28 2022-08-30 옵시스 테크 엘티디 Vcsel array lidar transmitter with small angular divergence
US10698088B2 (en) 2017-08-01 2020-06-30 Waymo Llc LIDAR receiver using a waveguide and an aperture
DE102017213726A1 (en) 2017-08-08 2019-02-14 Robert Bosch Gmbh Sensor device for detecting an object
DE202017105001U1 (en) * 2017-08-21 2017-09-14 Jenoptik Advanced Systems Gmbh LIDAR scanner with MEMS mirror and at least two scan angle ranges
DE102017214705A1 (en) * 2017-08-23 2019-02-28 Robert Bosch Gmbh Coaxial LIDAR system with elongated mirror opening
US10890650B2 (en) 2017-09-05 2021-01-12 Waymo Llc LIDAR with co-aligned transmit and receive paths
JP6935007B2 (en) * 2017-09-05 2021-09-15 ウェイモ エルエルシー Shared waveguides for lidar transmitters and receivers
CN107576954A (en) * 2017-09-15 2018-01-12 中科和光(天津)应用激光技术研究所有限公司 A kind of transmitting collimater based on laser radar
US11460550B2 (en) 2017-09-19 2022-10-04 Veoneer Us, Llc Direct detection LiDAR system and method with synthetic doppler processing
US10613200B2 (en) 2017-09-19 2020-04-07 Veoneer, Inc. Scanning lidar system and method
US10838043B2 (en) * 2017-11-15 2020-11-17 Veoneer Us, Inc. Scanning LiDAR system and method with spatial filtering for reduction of ambient light
EP3460519A1 (en) * 2017-09-25 2019-03-27 Hexagon Technology Center GmbH Laser scanner
EP3460520B1 (en) * 2017-09-25 2023-07-19 Hexagon Technology Center GmbH Multi-beam laser scanner
US10523880B2 (en) * 2017-09-28 2019-12-31 Waymo Llc Synchronized spinning LIDAR and rolling shutter camera system
JP6967929B2 (en) * 2017-09-28 2021-11-17 シャープ株式会社 Optical sensors and electronic devices
US10684370B2 (en) * 2017-09-29 2020-06-16 Veoneer Us, Inc. Multifunction vehicle detection system
US11194022B2 (en) 2017-09-29 2021-12-07 Veoneer Us, Inc. Detection system with reflection member and offset detection array
EP3470872B1 (en) * 2017-10-11 2021-09-08 Melexis Technologies NV Sensor device
US10003168B1 (en) 2017-10-18 2018-06-19 Luminar Technologies, Inc. Fiber laser with free-space components
KR102093637B1 (en) * 2017-10-20 2020-03-27 전자부품연구원 Lidar device and system comprising the same
US10775485B2 (en) 2017-10-20 2020-09-15 Korea Electronics Technology Institute LIDAR device and system comprising the same
US10824862B2 (en) 2017-11-14 2020-11-03 Nuro, Inc. Three-dimensional object detection for autonomous robotic systems using image proposals
EP3710855A4 (en) 2017-11-15 2021-08-04 Opsys Tech Ltd. Noise adaptive solid-state lidar system
US11585901B2 (en) 2017-11-15 2023-02-21 Veoneer Us, Llc Scanning lidar system and method with spatial filtering for reduction of ambient light
US10451716B2 (en) 2017-11-22 2019-10-22 Luminar Technologies, Inc. Monitoring rotation of a mirror in a lidar system
US10324185B2 (en) 2017-11-22 2019-06-18 Luminar Technologies, Inc. Reducing audio noise in a lidar scanner with a polygon mirror
US11592530B2 (en) 2017-11-30 2023-02-28 Cepton Technologies, Inc. Detector designs for improved resolution in lidar systems
US11353556B2 (en) 2017-12-07 2022-06-07 Ouster, Inc. Light ranging device with a multi-element bulk lens system
US11294041B2 (en) * 2017-12-08 2022-04-05 Velodyne Lidar Usa, Inc. Systems and methods for improving detection of a return signal in a light ranging and detection system
US10942244B2 (en) * 2017-12-12 2021-03-09 Waymo Llc Systems and methods for LIDARs with adjustable resolution and failsafe operation
US11340339B2 (en) * 2017-12-22 2022-05-24 Waymo Llc Systems and methods for adaptive range coverage using LIDAR
US11493601B2 (en) 2017-12-22 2022-11-08 Innovusion, Inc. High density LIDAR scanning
CN108061904B (en) 2017-12-29 2020-12-22 华为技术有限公司 Multi-line laser radar
US11022971B2 (en) 2018-01-16 2021-06-01 Nio Usa, Inc. Event data recordation to identify and resolve anomalies associated with control of driverless vehicles
US11195353B2 (en) 2018-01-17 2021-12-07 Uatc, Llc Methods, devices, and systems for communicating autonomous-vehicle status
US11592527B2 (en) 2018-02-16 2023-02-28 Cepton Technologies, Inc. Systems for incorporating LiDAR sensors in a headlamp module of a vehicle
WO2019165294A1 (en) 2018-02-23 2019-08-29 Innovusion Ireland Limited 2-dimensional steering system for lidar systems
WO2020013890A2 (en) 2018-02-23 2020-01-16 Innovusion Ireland Limited Multi-wavelength pulse steering in lidar systems
JP7324518B2 (en) 2018-04-01 2023-08-10 オプシス テック リミテッド Noise adaptive solid-state lidar system
US10324170B1 (en) 2018-04-05 2019-06-18 Luminar Technologies, Inc. Multi-beam lidar system with polygon mirror
US11029406B2 (en) 2018-04-06 2021-06-08 Luminar, Llc Lidar system with AlInAsSb avalanche photodiode
JP2019191018A (en) 2018-04-26 2019-10-31 ソニー株式会社 Ranging device and ranging module
JP6806835B2 (en) * 2018-04-28 2021-01-06 SZ DJI Technology Co., Ltd. Semiconductor device
WO2019205163A1 (en) * 2018-04-28 2019-10-31 SZ DJI Technology Co., Ltd. Light detection and ranging sensors with multiple emitters and multiple receivers, and associated systems and methods
US10348051B1 (en) 2018-05-18 2019-07-09 Luminar Technologies, Inc. Fiber-optic amplifier
US10884105B2 (en) 2018-05-31 2021-01-05 Eagle Technology, Llc Optical system including an optical body with waveguides aligned along an imaginary curved surface for enhanced beam steering and related methods
WO2019237581A1 (en) * 2018-06-13 2019-12-19 Hesai Photonics Technology Co., Ltd. Lidar systems and methods
KR102637175B1 (en) * 2018-07-02 2024-02-14 현대모비스 주식회사 Lidar sensing device
US10591601B2 (en) 2018-07-10 2020-03-17 Luminar Technologies, Inc. Camera-gated lidar system
US10627516B2 (en) 2018-07-19 2020-04-21 Luminar Technologies, Inc. Adjustable pulse characteristics for ground detection in lidar systems
US10760957B2 (en) 2018-08-09 2020-09-01 Ouster, Inc. Bulk optics for a scanning array
US10739189B2 (en) 2018-08-09 2020-08-11 Ouster, Inc. Multispectral ranging/imaging sensor arrays and systems
US10551501B1 (en) 2018-08-09 2020-02-04 Luminar Technologies, Inc. Dual-mode lidar system
US10340651B1 (en) 2018-08-21 2019-07-02 Luminar Technologies, Inc. Lidar system with optical trigger
US11971507B2 (en) 2018-08-24 2024-04-30 Velodyne Lidar Usa, Inc. Systems and methods for mitigating optical crosstalk in a light ranging and detection system
WO2020049055A1 (en) * 2018-09-06 2020-03-12 Blickfeld GmbH Coaxial setup for light detection and ranging, lidar, measurements
US11408999B2 (en) 2018-09-14 2022-08-09 Goodrich Corporation LIDAR camera systems
US10712434B2 (en) 2018-09-18 2020-07-14 Velodyne Lidar, Inc. Multi-channel LIDAR illumination driver
US11536845B2 (en) * 2018-10-31 2022-12-27 Waymo Llc LIDAR systems with multi-faceted mirrors
US11662465B2 (en) * 2018-11-01 2023-05-30 Waymo Llc LIDAR with tilted and offset optical cavity
US11474211B2 (en) 2018-11-01 2022-10-18 Waymo Llc Optimized high speed lidar mirror design
US11082010B2 (en) 2018-11-06 2021-08-03 Velodyne Lidar Usa, Inc. Systems and methods for TIA base current detection and compensation
CA3114399A1 (en) 2018-11-13 2020-05-22 Nuro, Inc. Lidar for vehicle blind spot detection
DE102018129972A1 (en) 2018-11-27 2020-05-28 Sick Ag Optoelectronic sensor and method for detecting objects
US11391822B2 (en) * 2018-11-30 2022-07-19 Seagate Technology Llc Rotating pyramidal mirror
EP3671261A1 (en) * 2018-12-21 2020-06-24 Leica Geosystems AG 3d surveillance system comprising lidar and multispectral imaging for object classification
CN109581328B (en) * 2018-12-21 2023-06-02 宁波傲视智绘光电科技有限公司 Laser radar
CN111381218B (en) * 2018-12-27 2022-06-24 余姚舜宇智能光学技术有限公司 Hybrid solid-state laser radar and manufacturing method and detection method thereof
US11460578B2 (en) * 2018-12-28 2022-10-04 Robert Bosch Gmbh 3D lidar sensing unit with oscillating axis of rotation for autonomous driving
FR3091525B1 (en) * 2019-01-04 2021-01-29 Balyo Self-guided handling equipment incorporating detection means
KR102674814B1 (en) * 2019-01-04 2024-06-12 엘지전자 주식회사 Automotive LiDAR device
US11885958B2 (en) 2019-01-07 2024-01-30 Velodyne Lidar Usa, Inc. Systems and methods for a dual axis resonant scanning mirror
US12061263B2 (en) 2019-01-07 2024-08-13 Velodyne Lidar Usa, Inc. Systems and methods for a configurable sensor system
US10935637B2 (en) * 2019-01-29 2021-03-02 Cepton Technologies, Inc. Lidar system including a transceiver array
US11774561B2 (en) 2019-02-08 2023-10-03 Luminar Technologies, Inc. Amplifier input protection circuits
US11681030B2 (en) 2019-03-05 2023-06-20 Waymo Llc Range calibration of light detectors
EP3918368A4 (en) * 2019-03-05 2022-10-12 Waymo Llc Lidar transmitter and receiver optics
WO2020186236A1 (en) * 2019-03-14 2020-09-17 Waymo Llc Methods and systems for detecting obstructions on a sensor housing
WO2020198235A1 (en) 2019-03-25 2020-10-01 Cepton Technologies, Inc. Mounting configurations for optoelectronic components in lidar systems
JP7535313B2 (en) 2019-04-09 2024-08-16 オプシス テック リミテッド Solid-state LIDAR transmitter with laser control
KR102663206B1 (en) * 2019-04-23 2024-05-03 현대자동차주식회사 Lidar ntegrated lamp device for vehicle
CN110161511B (en) * 2019-04-30 2021-11-19 探维科技(北京)有限公司 Laser radar system
CN113906316A (en) 2019-05-30 2022-01-07 欧普赛斯技术有限公司 Eye-safe long-range LIDAR system using actuators
JP7438564B2 (en) 2019-06-10 2024-02-27 オプシス テック リミテッド Eye-safe long-range solid-state LIDAR system
JP2022539706A (en) 2019-06-25 2022-09-13 オプシス テック リミテッド Adaptive multi-pulse LIDAR system
US11525892B2 (en) 2019-06-28 2022-12-13 Waymo Llc Beam homogenization for occlusion resistance
US10613203B1 (en) 2019-07-01 2020-04-07 Velodyne Lidar, Inc. Interference mitigation for light detection and ranging
US11579257B2 (en) 2019-07-15 2023-02-14 Veoneer Us, Llc Scanning LiDAR system and method with unitary optical element
US11474218B2 (en) 2019-07-15 2022-10-18 Veoneer Us, Llc Scanning LiDAR system and method with unitary optical element
CN114341673A (en) * 2019-08-22 2022-04-12 深圳市大疆创新科技有限公司 Compact bearing for multi-element optical scanning devices, and associated systems and methods
CN110412544A (en) * 2019-08-23 2019-11-05 上海禾赛光电科技有限公司 Laser transmitting system and laser radar including the laser transmitting system
EP4025930B1 (en) 2019-09-03 2024-02-07 Xenomatix NV Spot pattern projector for solid-state lidar system
WO2021060919A1 (en) * 2019-09-25 2021-04-01 문명일 Lidar optical device and scanning method therefor
WO2021051727A1 (en) * 2020-01-06 2021-03-25 深圳市速腾聚创科技有限公司 Lidar and device having lidar
US11964627B2 (en) 2019-09-30 2024-04-23 Nuro, Inc. Methods and apparatus for supporting compartment inserts in autonomous delivery vehicles
JP7375185B2 (en) * 2019-10-18 2023-11-07 エイエムエス センサーズ エイジア プライヴェット リミテッド LIDAR transmitters and LIDAR systems with curved laser devices and methods of manufacturing LIDAR transmitters and LIDAR systems
KR20210046466A (en) 2019-10-18 2021-04-28 현대자동차주식회사 Liquid crystal based optical deflector and optical scanner using the same
US11313969B2 (en) 2019-10-28 2022-04-26 Veoneer Us, Inc. LiDAR homodyne transceiver using pulse-position modulation
US11747453B1 (en) 2019-11-04 2023-09-05 Waymo Llc Calibration system for light detection and ranging (lidar) devices
CN113030911A (en) * 2019-12-09 2021-06-25 觉芯电子(无锡)有限公司 Laser radar system
CN110988893B (en) * 2019-12-11 2022-04-12 武汉万集信息技术有限公司 Laser radar device
US11592574B2 (en) * 2019-12-12 2023-02-28 GM Global Technology Operations LLC LiDAR vision systems
US11988775B1 (en) * 2019-12-18 2024-05-21 Zoox, Inc. Dynamic sensor illumination
JP2023508459A (en) * 2019-12-27 2023-03-02 華為技術有限公司 Ranging system and mobile platform
US11867837B2 (en) 2020-01-06 2024-01-09 Suteng Innovation Technology Co., Ltd. LiDAR and device having LiDAR
CN113075644B (en) * 2020-01-06 2023-08-04 深圳市速腾聚创科技有限公司 Laser radar and device with same
GB2592584A (en) * 2020-02-28 2021-09-08 Rushmere Tech Limited Optical system and LIDAR system
RU198797U1 (en) * 2020-03-11 2020-07-28 Федеральное государственное бюджетное учреждение науки Институт оптики атмосферы им. В.Е. Зуева Сибирского отделения Российской академии наук (ИОА СО РАН) Lidar photodetector module
EP4113168A4 (en) * 2020-03-20 2023-04-26 Huawei Technologies Co., Ltd. Ranging system and vehicle
US11907887B2 (en) 2020-03-23 2024-02-20 Nuro, Inc. Methods and apparatus for unattended deliveries
WO2021223183A1 (en) * 2020-05-07 2021-11-11 深圳市速腾聚创科技有限公司 Laser transceiving assembly, laser radar, and automatic driving device
US20220043124A1 (en) * 2020-08-07 2022-02-10 Uatc, Llc Light Detection and Ranging (LIDAR) System Having Transmit Optics for Pre-Collimation Steering
US11320124B1 (en) 2020-10-29 2022-05-03 Waymo Llc Infrared light module uniformity rotational test module
WO2022147652A1 (en) * 2021-01-05 2022-07-14 深圳市速腾聚创科技有限公司 Laser radar and device having laser radar
US12044800B2 (en) 2021-01-14 2024-07-23 Magna Electronics, Llc Scanning LiDAR system and method with compensation for transmit laser pulse effects
DE102021100788A1 (en) 2021-01-15 2022-07-21 Deutsches Zentrum für Luft- und Raumfahrt e.V. Optical device for near- and far-range imaging and systems including the optical device
US11326758B1 (en) 2021-03-12 2022-05-10 Veoneer Us, Inc. Spotlight illumination system using optical element
US20220308175A1 (en) * 2021-03-24 2022-09-29 Waymo Llc Optical Sensor for Mirror Zero Angle in a Scanning Lidar
CN115236639A (en) * 2021-04-06 2022-10-25 上海禾赛科技有限公司 Optical component detection system for laser radar and laser radar
US11732858B2 (en) 2021-06-18 2023-08-22 Veoneer Us, Llc Headlight illumination system using optical element
US12092278B2 (en) 2022-10-07 2024-09-17 Magna Electronics, Llc Generating a spotlight
DE102023108881A1 (en) * 2023-04-06 2024-10-10 Valeo Detection Systems GmbH Optical receiving unit for a lidar system, lidar system for a vehicle, and method for operating a lidar system

Citations (42)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3790277A (en) 1971-09-20 1974-02-05 Blount & George Inc Optical instrument tracking system
US4516158A (en) 1981-07-31 1985-05-07 British Aerospace Public Limited Company Ground reconnaissance
US4700301A (en) 1983-11-02 1987-10-13 Dyke Howard L Method of automatically steering agricultural type vehicles
US4709195A (en) 1986-09-12 1987-11-24 Spectra-Physics, Inc. Bar code scanner with DC brushless motor
JPH01240884A (en) 1988-03-23 1989-09-26 Kougakushiya Eng Kk Optical type non-contact position measuring apparatus
US5202742A (en) 1990-10-03 1993-04-13 Aisin Seiki Kabushiki Kaisha Laser radar for a vehicle lateral guidance system
US5231401A (en) 1990-08-10 1993-07-27 Kaman Aerospace Corporation Imaging lidar system
US5241481A (en) 1987-06-22 1993-08-31 Arnex Handelsbolag Method and a device for laser optical navigation
EP0296405B1 (en) 1987-06-22 1994-03-30 Arnex Handelsbolag A method and a device for laser-optical navigation
JPH06214027A (en) 1992-12-08 1994-08-05 Erwin Sick Gmbh Opt Elektron Laser range detector
JPH08327738A (en) 1995-06-05 1996-12-13 Koito Mfg Co Ltd Distance measuring instrument
US5703351A (en) 1996-11-18 1997-12-30 Eastman Kodak Company Autofocus module having a diffractively achromatized toroidal lens
US6046800A (en) * 1997-01-31 2000-04-04 Kabushiki Kaisha Topcon Position detection surveying device
US6115128A (en) 1997-09-17 2000-09-05 The Regents Of The University Of California Multi-dimensional position sensor using range detectors
US20020140924A1 (en) * 1999-01-08 2002-10-03 Richard J. Wangler Vehicle classification and axle counting sensor system and method
US6778732B1 (en) * 2002-06-07 2004-08-17 Boston Laser, Inc. Generation of high-power, high brightness optical beams by optical cutting and beam-shaping of diode lasers
US7089114B1 (en) 2003-07-03 2006-08-08 Baojia Huang Vehicle collision avoidance system and method
US7248342B1 (en) 2003-02-14 2007-07-24 United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Three-dimension imaging lidar
US7255275B2 (en) 2005-09-12 2007-08-14 Symbol Technologies, Inc. Laser power control arrangements in electro-optical readers
US7259838B2 (en) 2005-06-17 2007-08-21 Specialty Minerals (Michigan) Inc. Optical beam separation element, measuring apparatus and method of measuring
US7311000B2 (en) * 2003-07-11 2007-12-25 Qinetiq Limited Wind speed measurement apparatus and method
US7361948B2 (en) * 2003-03-25 2008-04-22 Uv Craftory Co., Ltd. Filter function-equipped optical sensor and flame sensor
CN101241182A (en) 2007-02-06 2008-08-13 电装波动株式会社 Laser radar apparatus for measuring direction and distance of an object
US7417716B2 (en) 2004-02-25 2008-08-26 Sharp Kabushiki Kaisha Multiple ranging apparatus
US7428041B2 (en) 2002-02-28 2008-09-23 Vaisala Oyj Lidar
US7544945B2 (en) 2006-02-06 2009-06-09 Avago Technologies General Ip (Singapore) Pte. Ltd. Vertical cavity surface emitting laser (VCSEL) array laser scanner
JP2009128238A (en) 2007-11-26 2009-06-11 Toyota Central R&D Labs Inc Laser radar equipment
US7616293B2 (en) 2004-04-29 2009-11-10 Sigma Space Corporation System and method for traffic monitoring, speed determination, and traffic light violation detection and recording
US20100020306A1 (en) 2006-07-13 2010-01-28 Velodyne Acoustics, Inc. High definition lidar system
US20100220141A1 (en) * 2007-05-14 2010-09-02 Mastermind Co., Ltd. Ink-jet printer
US20100302528A1 (en) 2009-06-02 2010-12-02 Velodyne Acoustics, Inc. Color lidar scanner
US20110216304A1 (en) 2006-07-13 2011-09-08 Velodyne Acoustics, Inc. High definition lidar system
US20110255070A1 (en) 2010-04-14 2011-10-20 Digital Ally, Inc. Traffic scanning lidar
US20120013917A1 (en) 2010-07-16 2012-01-19 Kabushiki Kaisha Topcon Measuring Device
EP2410358A1 (en) 2010-07-19 2012-01-25 European Space Agency Imaging optics and optical device for mapping a curved image field
US20120133917A1 (en) 2010-11-30 2012-05-31 Hilti Aktiengesellschaft Distance measuring device and surveying system
JP2012181144A (en) 2011-03-02 2012-09-20 Jvc Kenwood Corp Distance measuring instrument and manufacturing method thereof
US20120300190A1 (en) 2011-05-26 2012-11-29 Esw Gmbh Measuring device for distance measurement
US20130278939A1 (en) 2010-11-30 2013-10-24 Technische Universität Dresden Apparatus for non-incremental position and form measurement of moving solid bodies
US20140168631A1 (en) 2012-12-18 2014-06-19 Pouch Holdings LLC Multi-clad Fiber Based Optical Apparatus and Methods for Light Detection and Ranging Sensors
US8836922B1 (en) 2013-08-20 2014-09-16 Google Inc. Devices and methods for a rotating LIDAR platform with a shared transmit/receive path
US8946637B2 (en) 2010-11-23 2015-02-03 The United States Of America As Represented By The Secretary Of The Army Compact fiber-based scanning laser detection and ranging system

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6288906A (en) * 1985-10-15 1987-04-23 Fujitsu Ltd Measuring method for solid shape
JPH07244167A (en) * 1994-01-17 1995-09-19 Omron Corp Optical detection device and vehicle applying said device
JPH09269375A (en) * 1996-03-29 1997-10-14 Nabco Ltd Optical apparatus for a door sensor
EP1159636B1 (en) * 1999-03-18 2003-05-28 Siemens Aktiengesellschaft Spatially resolving range-finding device
DE10114362C2 (en) * 2001-03-22 2003-12-24 Martin Spies Laser scanning system for distance measurement
CN1967285A (en) * 2006-09-14 2007-05-23 中国科学院安徽光学精密机械研究所 Laser radar transmission type confocal distance light receiving and emitting optical system
JP5305749B2 (en) * 2008-06-18 2013-10-02 オリンパスイメージング株式会社 Optical scanning device
US8508721B2 (en) * 2009-08-18 2013-08-13 The Boeing Company Multifunction aircraft LIDAR
LU91714B1 (en) * 2010-07-29 2012-01-30 Iee Sarl Active illumination scanning imager
DE102011076493A1 (en) * 2011-05-26 2012-11-29 Hilti Aktiengesellschaft Measuring device for distance measurement
CN102540195B (en) * 2011-12-29 2014-06-25 东风汽车公司 Five-path laser radar for vehicle and control method thereof

Patent Citations (49)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3790277A (en) 1971-09-20 1974-02-05 Blount & George Inc Optical instrument tracking system
US4516158A (en) 1981-07-31 1985-05-07 British Aerospace Public Limited Company Ground reconnaissance
US4700301A (en) 1983-11-02 1987-10-13 Dyke Howard L Method of automatically steering agricultural type vehicles
US4709195A (en) 1986-09-12 1987-11-24 Spectra-Physics, Inc. Bar code scanner with DC brushless motor
US5241481A (en) 1987-06-22 1993-08-31 Arnex Handelsbolag Method and a device for laser optical navigation
EP0296405B1 (en) 1987-06-22 1994-03-30 Arnex Handelsbolag A method and a device for laser-optical navigation
JPH01240884A (en) 1988-03-23 1989-09-26 Kougakushiya Eng Kk Optical type non-contact position measuring apparatus
US5231401A (en) 1990-08-10 1993-07-27 Kaman Aerospace Corporation Imaging lidar system
US5202742A (en) 1990-10-03 1993-04-13 Aisin Seiki Kabushiki Kaisha Laser radar for a vehicle lateral guidance system
US5455669A (en) 1992-12-08 1995-10-03 Erwin Sick Gmbh Optik-Elektronik Laser range finding apparatus
JPH06214027A (en) 1992-12-08 1994-08-05 Erwin Sick Gmbh Opt Elektron Laser range detector
JPH08327738A (en) 1995-06-05 1996-12-13 Koito Mfg Co Ltd Distance measuring instrument
US5703351A (en) 1996-11-18 1997-12-30 Eastman Kodak Company Autofocus module having a diffractively achromatized toroidal lens
US6046800A (en) * 1997-01-31 2000-04-04 Kabushiki Kaisha Topcon Position detection surveying device
US6115128A (en) 1997-09-17 2000-09-05 The Regents Of The University Of California Multi-dimensional position sensor using range detectors
US20020140924A1 (en) * 1999-01-08 2002-10-03 Richard J. Wangler Vehicle classification and axle counting sensor system and method
US7428041B2 (en) 2002-02-28 2008-09-23 Vaisala Oyj Lidar
US6778732B1 (en) * 2002-06-07 2004-08-17 Boston Laser, Inc. Generation of high-power, high brightness optical beams by optical cutting and beam-shaping of diode lasers
US7248342B1 (en) 2003-02-14 2007-07-24 United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Three-dimension imaging lidar
US7361948B2 (en) * 2003-03-25 2008-04-22 Uv Craftory Co., Ltd. Filter function-equipped optical sensor and flame sensor
US7089114B1 (en) 2003-07-03 2006-08-08 Baojia Huang Vehicle collision avoidance system and method
US7311000B2 (en) * 2003-07-11 2007-12-25 Qinetiq Limited Wind speed measurement apparatus and method
US7417716B2 (en) 2004-02-25 2008-08-26 Sharp Kabushiki Kaisha Multiple ranging apparatus
US7616293B2 (en) 2004-04-29 2009-11-10 Sigma Space Corporation System and method for traffic monitoring, speed determination, and traffic light violation detection and recording
US7259838B2 (en) 2005-06-17 2007-08-21 Specialty Minerals (Michigan) Inc. Optical beam separation element, measuring apparatus and method of measuring
US7255275B2 (en) 2005-09-12 2007-08-14 Symbol Technologies, Inc. Laser power control arrangements in electro-optical readers
US7544945B2 (en) 2006-02-06 2009-06-09 Avago Technologies General Ip (Singapore) Pte. Ltd. Vertical cavity surface emitting laser (VCSEL) array laser scanner
US20110216304A1 (en) 2006-07-13 2011-09-08 Velodyne Acoustics, Inc. High definition lidar system
US20100020306A1 (en) 2006-07-13 2010-01-28 Velodyne Acoustics, Inc. High definition lidar system
US8767190B2 (en) 2006-07-13 2014-07-01 Velodyne Acoustics, Inc. High definition LiDAR system
US7969558B2 (en) * 2006-07-13 2011-06-28 Velodyne Acoustics Inc. High definition lidar system
JP2008216238A (en) 2007-02-06 2008-09-18 Denso Wave Inc Laser radar apparatus
US20080316463A1 (en) * 2007-02-06 2008-12-25 Denso Wave Incorporated Laser radar apparatus that measures direction and distance of an object
CN101241182A (en) 2007-02-06 2008-08-13 电装波动株式会社 Laser radar apparatus for measuring direction and distance of an object
US20100220141A1 (en) * 2007-05-14 2010-09-02 Mastermind Co., Ltd. Ink-jet printer
JP2009128238A (en) 2007-11-26 2009-06-11 Toyota Central R&D Labs Inc Laser radar equipment
US20100302528A1 (en) 2009-06-02 2010-12-02 Velodyne Acoustics, Inc. Color lidar scanner
US20110255070A1 (en) 2010-04-14 2011-10-20 Digital Ally, Inc. Traffic scanning lidar
US20120013917A1 (en) 2010-07-16 2012-01-19 Kabushiki Kaisha Topcon Measuring Device
JP2012021949A (en) 2010-07-16 2012-02-02 Topcon Corp Measuring apparatus
EP2410358A1 (en) 2010-07-19 2012-01-25 European Space Agency Imaging optics and optical device for mapping a curved image field
US8946637B2 (en) 2010-11-23 2015-02-03 The United States Of America As Represented By The Secretary Of The Army Compact fiber-based scanning laser detection and ranging system
US20120133917A1 (en) 2010-11-30 2012-05-31 Hilti Aktiengesellschaft Distance measuring device and surveying system
JP2012118076A (en) 2010-11-30 2012-06-21 Hilti Ag Range finder
US20130278939A1 (en) 2010-11-30 2013-10-24 Technische Universität Dresden Apparatus for non-incremental position and form measurement of moving solid bodies
JP2012181144A (en) 2011-03-02 2012-09-20 Jvc Kenwood Corp Distance measuring instrument and manufacturing method thereof
US20120300190A1 (en) 2011-05-26 2012-11-29 Esw Gmbh Measuring device for distance measurement
US20140168631A1 (en) 2012-12-18 2014-06-19 Pouch Holdings LLC Multi-clad Fiber Based Optical Apparatus and Methods for Light Detection and Ranging Sensors
US8836922B1 (en) 2013-08-20 2014-09-16 Google Inc. Devices and methods for a rotating LIDAR platform with a shared transmit/receive path

Non-Patent Citations (36)

* Cited by examiner, † Cited by third party
Title
"Complaint", Waymo LLC v. Uber Technologies, Inc.; Ottomottto LLC; Otto Trucking LLC, Case 3:17-cv-00939, Document 1, Filed Feb. 23, 2017, 28 pages.
"First Amended Complaint", Waymo LLC v. Uber Technologies, Inc.; Ottomottto LLC; Otto Trucking LLC, Case 3:17-cv-00939-WHA, Document 23, Filed Mar. 10, 2017, 31 pages.
"HDL-64E Resource Manual", Velodyne, http://velodynelidar.com/lidar/products/manual/HDLResource%20Manual_lowres.pdf, Nov. 9, 2007, 71 pages.
"Invalidity Claim Charts", Exhibits 1-10, Waymo LLC v. Uber Technologies, Inc.; Ottomottto LLC; Otto Trucking LLC, Case 3:17-cv-00939-WHA, Jun. 2, 2017, 339 pages.
"Laser Radar: Progress and Opportunities in Active Electro-Optical Sensing", National Academies Press, http://www.nap.edu/catalog.php?record_id=18733, 2014, e-book, 311 pages.
"LiDAR: Driving the Future of Autonomous Navigation, Frost & Sullivan Exclusive Whitepaper for Analysis of LiDAR Technology for Advanced Safety", 2016, 30 pages.
"Redacted Declaration of Gregory Kintz", Waymo LLC v. Uber Technologies, Inc.; Ottomottto LLC; Otto Trucking LLC, Case 3:17-cv-00939-WHA, Document 24-26, Filed Mar. 10, 2017, 43 pages.
"Redacted Declaration of Paul McManamon in Support of Defendant's Opposition to Plaintiff Waymo LLC's Motion for Preliminary Injunction", Waymo LLC v. Uber Technologies, Inc.; Ottomottto LLC; Otto Trucking LLC, Case 3:17-cv-00939-WHA, Document 174-7, Filed Apr. 7, 2017, 34 pages.
"Redacted Declaration of Pierre Yves Droz", Waymo LLC v. Uber Technologies, Inc.; Ottomottto LLC; Otto Trucking LLC, Case 3:17-cv-00939-WHA, Document 24-3, filed Mar. 10, 2017, 14 pages.
"Redacted Defendants' Uber Technologies, Inc., Ottomotto LLC, and Otto Trucking LLC's Opposition to Plaintiff Waymo LLC's Motion for Preliminary Injunction", Waymo LLC v. Uber Technologies, Inc.; Ottomottto LLC; Otto Trucking LLC, Case 3:17-cv-00939-WHA, Document 173-3, Filed Apr. 7, 2017, 32 pages.
"Redacted Plaintiff Waymo LLC's Notice of Motion and Motion for a Preliminary Injunction", Waymo LLC v. Uber Technologies, Inc.; Ottomottto LLC; Otto Trucking LLC, Case 3:17-cv-00939-WHA, Document 25-4, Filed Mar. 10, 2017, 30 pages.
"Transcript of Proceedings Appearances", Waymo LLC v. Uber Technologies, Inc.; Ottomottto LLC; Otto Trucking LLC, No. C 17-00939 WHA, Apr. 12, 2017, 119 pages.
"Uber Technologies, Inc. and Ottomotto LLC's Invalidity Contentions Pursuant to Patent L.R. 3-3 and 3-4", Waymo LLC v. Uber Technologies, Inc.; Ottomottto LLC; Otto Trucking LLC, Case 3:17-cv-00939-WHA, Jun. 2, 2017, 19 pages.
"Uber Technology Tutorial", Waymo LLC v. Uber Technologies, Inc. et al., Apr. 12, 2017, 44 pages.
"Waymo's Technology Tutorial", Waymo LLC v. Uber Technologies, Inc. et al., Civil Action No. 3:17-cv-00939, Apr. 12, 2017, 64 pages.
Carl D. Crane III et al., "Development of an Integrated Sensor System for Obstacle Detection and Terrain Evaluation for Application to Unmanned Ground Vehicles," Proc. SPIE 5804, Unmanned Ground Vehicle Technology VII, May 27, 2005.
Ceilometer CT25K, User's Guide, M210345-en-A, Published by Vaisala Oyj, Dec. 2002, 161 pages.
Clifford, S.F. et al., "Monostatic Diffraction-Limited Lidars: The Impact of Optical Refractive Turbulence", Applied Optics, vol. 22, No. 11, Jun. 1, 1983, pp. 1696-1701.
European Search Report, European Patent Application No. 14838560.2 dated Mar. 10, 2017, 9 pages.
Hall, David et al., "Team DAD Technical Paper", http://archive.darpa.mil/grandchallenge05/TechPapers/TeamDAD.pdf, Aug. 26, 2005, 12 pages.
IBEO LUX data sheet, "Reliable in all Weathers-The ibeo LUX Laserscanner for the Automotive Section", http://www.abott-mf.com/pdf/ibeo%20LUX%20data%20sheet001.pdf, 2009, 2 pages.
International Search Report dated Nov. 19, 2014 of PCT/US2014/047864 filed Jul. 23, 2014.
James Paul Odenthal, "A Linear Photodiode Array Employed in a Short Range Laser Triangulation Obstacle Avoidance Sensor," RPI Technical Report MP-74, School of Engineering, Rensselaer Polytechnic Institute, Troy, New York, Dec. 1980, 101 pages.
James Paul Odenthal, "A Linear Photodiode Array Employed in a Short Range Laser Triangulation Obstacle Avoidance Sensor," RPI Technical Report MP-74, School of Engineering, Rensselaer Polytechnic Institute, Troy, New York, Dec. 1980, Appendix B and Appendix C, 28 pages.
McManamon, Dr. Paul F. et al., "A History of Laser Radar in the United States", Laser Radar Technology and Applications XV, Proceedings of SPIE, vol. 7684, 2010, pp. 1-11.
McManamon, Paul, "Field Guide to Lidar", SPIE Field Guides, vol. FG36, John E. Grievenkamp, Series Editor, SPIE Press, 2015, 29 pages.
Molebny, Vasyl et al., "Laser Radar: Historical Prospective-From the East to the West", Optical Engineering, vol. 56, No. 3, Mar. 2017, 25 pages.
Mundhenk, T. Nathan et al., "PanDAR: A Wide-Area, Frame-Rate, and Full Color LIDAR With Foveated Region Using Backfilling Interpolation Upsampling", Intelligent Robots and Computer Vision XXXII, Algorithms and Techniques, Proc. of SPIE-IS&T Electronic Imaging, vol. 9406, 2015, pp. 94060K-1-94060K-13.
Munkel, Christoph et al., "Retrieval of Mixing Height and Dust Concentration With Lidar Ceilometer", Boundary-Layer Meteorol, vol. 124, 2007, pp. 117-128.
Oliver Wulf and Bernardo Wagner, "Fast 3D Scanning Methods for Laser Measurement Systems," 14th International Conference on Control Systems and Computer Science (CSCS14), Jul. 2-5, 2003, Bucharest, Romania.
Richmond, Richard D. et al., "Direct-Detection LADAR Systems" SPIE Press / Tutorial Text in Optical Engineering, vol. TT85, 2010, 40 pages.
Stone, William C., et al., "Performance Analysis of Next-Generation LADAR for Manufacturing, Construction, and Mobility", United States Department of Commerce, Technology Administration, National Institute of Standards and Technology, NISTIR 7117, May 2004, 198 pages.
Touretzky, David S. et al., "What's Hidden in the Hidden Layers", BYTE, Aug. 1989, pp. 227-233.
Urmson, Chris et al., "Autonomous Driving in Urban Environments: Boss and the Urban Challenge", Journal of Field Robotics, vol. 25, No. 8, 2008, pp. 425-466.

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US12123950B2 (en) 2016-02-15 2024-10-22 Red Creamery, LLC Hybrid LADAR with co-planar scanning and imaging field-of-view
US11933967B2 (en) 2019-08-22 2024-03-19 Red Creamery, LLC Distally actuated scanning mirror

Also Published As

Publication number Publication date
KR102095895B1 (en) 2020-04-01
EP3036562B1 (en) 2020-11-25
US9285464B2 (en) 2016-03-15
JP2018028555A (en) 2018-02-22
CN111487600A (en) 2020-08-04
US8836922B1 (en) 2014-09-16
US20150055117A1 (en) 2015-02-26
KR20160043109A (en) 2016-04-20
JP2016534346A (en) 2016-11-04
KR20190026956A (en) 2019-03-13
KR101872799B1 (en) 2018-06-29
EP3036562A1 (en) 2016-06-29
KR101956045B1 (en) 2019-03-11
EP3798672A1 (en) 2021-03-31
JP6249577B2 (en) 2017-12-20
WO2015026471A1 (en) 2015-02-26
EP3798672B1 (en) 2023-10-25
CN105659108A (en) 2016-06-08
KR20180077293A (en) 2018-07-06
EP3036562A4 (en) 2017-04-12
USRE48874E1 (en) 2022-01-04

Similar Documents

Publication Publication Date Title
USRE48874E1 (en) Devices and methods for a rotating LIDAR platform with a shared transmit/receive path
JP2018028555A5 (en)
US11822022B2 (en) Methods and systems for LIDAR optics alignment
KR102319494B1 (en) Variable beam spacing, timing, and power for vehicle sensors
JP2022552293A (en) Lidar sensor calibration
CN112703418A (en) Photodetector array with corresponding array of optical elements
US20220179051A1 (en) Lidar assembly with modularized components

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

AS Assignment

Owner name: GOOGLE INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PENNECOT, GAETAN;DROZ, PIERRE-YVES;ULRICH, DREW EUGENE;AND OTHERS;REEL/FRAME:053475/0728

Effective date: 20130731

Owner name: WAYMO LLC, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:WAYMO HOLDING INC.;REEL/FRAME:053475/0928

Effective date: 20170315

Owner name: WAYMO HOLDING INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:GOOGLE INC.;REEL/FRAME:053475/0765

Effective date: 20170315

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8