CN113016153B - Verifying timing of sensors used in an autonomous vehicle - Google Patents

Verifying timing of sensors used in an autonomous vehicle

Publication number
CN113016153B
Authority
CN
China
Prior art keywords
sensor
time
data
module
delay
Prior art date
Legal status
Active
Application number
CN201980004347.4A
Other languages
Chinese (zh)
Other versions
CN113016153A
Inventor
王帅
张满江
申耀明
周翔飞
李领昌
李贤飞
Current Assignee
Baidu com Times Technology Beijing Co Ltd
Baidu USA LLC
Original Assignee
Baidu com Times Technology Beijing Co Ltd
Baidu USA LLC
Priority date
Filing date
Publication date
Application filed by Baidu com Times Technology Beijing Co Ltd and Baidu USA LLC
Publication of CN113016153A
Application granted
Publication of CN113016153B

Classifications

    • G01S7/40 Means for monitoring or calibrating (radar systems)
    • H04J3/0608 Special codes used as synchronising signal; detectors therefor, e.g. correlators, state machines
    • B60W50/045 Monitoring control system parameters
    • B60W60/00 Drive control systems specially adapted for autonomous road vehicles
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/862 Combination of radar systems with sonar systems
    • G01S13/865 Combination of radar systems with lidar systems
    • G01S13/867 Combination of radar systems with cameras
    • G01S13/87 Combinations of radar systems, e.g. primary radar and secondary radar
    • G01S13/931 Radar or analogous systems specially adapted for anti-collision purposes of land vehicles
    • G01S15/86 Combinations of sonar systems with lidar systems; combinations of sonar systems with systems not using wave reflection
    • G01S15/87 Combinations of sonar systems
    • G01S15/931 Sonar systems specially adapted for anti-collision purposes of land vehicles
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S17/87 Combinations of systems using electromagnetic waves other than radio waves
    • G01S17/931 Lidar systems specially adapted for anti-collision purposes of land vehicles
    • G01S7/497 Means for monitoring or calibrating (lidar systems)
    • G01S7/52004 Means for monitoring or calibrating (sonar systems)
    • G06F18/24137 Classification techniques based on distances to cluster centroïds
    • G06V10/764 Image or video recognition or understanding using classification, e.g. of video objects
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • H04J3/065 Synchronisation among TDM nodes using timestamps
    • H04L67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. sensor networks or networks in vehicles
    • B60W2400/00 Indexing codes relating to detected, measured or calculated conditions or factors
    • B60W2420/403 Image sensing, e.g. optical camera
    • B60W2420/408
    • B60W2554/4041 Position (characteristics of dynamic objects)
    • B60W2556/25 Data precision
    • G01S2013/9316 Anti-collision radar for land vehicles combined with communication equipment with other vehicles or with base stations
    • G01S2013/9323 Alternative operation using light waves
    • G01S2013/9324 Alternative operation using ultrasonic waves
    • G04G3/00 Producing timing pulses

Abstract

A method of verifying operation of a sensor comprising: causing a sensor to acquire sensor data at a first time (1215), wherein the sensor acquires the sensor data by transmitting a wave to a detector; determining that the detector detected a wave at a second time (1220); receiving sensor data from the sensor at a third time (1225); and verifying operation of the sensor based on at least one of the first time, the second time, or the third time (1230).

Description

Verifying timing of sensors used in an autonomous vehicle
Technical Field
Embodiments of the present disclosure relate generally to operating an autonomous vehicle. More particularly, embodiments of the present disclosure relate to verifying operation of a sensor used in an autonomous vehicle.
Background
A vehicle operating in an autonomous mode (e.g., driverless) can relieve the occupants, especially the driver, of some driving-related responsibilities. When operating in the autonomous mode, the vehicle can navigate to various locations using onboard sensors, allowing it to travel with minimal human interaction or, in some cases, without any passengers.
Motion planning and control are key operations in autonomous driving. A path describes the geometry of the motion of the autonomous vehicle. An autonomous vehicle may use various sensors to detect objects in the environment in which it is located, and may determine a path through that environment based on the detected objects.
Disclosure of Invention
According to a first aspect, embodiments of the present disclosure provide a method for verifying operation of a sensor, comprising: causing a sensor to acquire sensor data at a first time, wherein the sensor acquires the sensor data by transmitting a wave to a detector; determining that the detector detected the wave at a second time; receiving sensor data from the sensor at a third time; and verifying operation of a sensor based on at least one of the first time, the second time, and the third time, wherein the sensor is for sensing a driving environment during autonomous driving of the autonomous vehicle.
According to a second aspect, embodiments of the present disclosure provide a method for verifying operation of a sensor, comprising: causing the sensor to acquire sensor data at a first time; generating an excitation for causing the sensor to detect at a second time; receiving sensor data from the sensor at a third time; and verifying operation of a sensor based on at least one of the first time, the second time, and the third time, wherein the sensor is for sensing a driving environment during autonomous driving of the autonomous vehicle.
According to a third aspect, embodiments of the present disclosure provide a non-transitory machine-readable medium having instructions stored therein, which when executed by a processor, cause the processor to perform operations comprising: causing a sensor to acquire sensor data at a first time, wherein the sensor acquires the sensor data by transmitting a wave to a detector; determining that the detector detected the wave at a second time; receiving sensor data from the sensor at a third time; and verifying operation of the sensor based on at least one of the first time, the second time, and the third time.
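For illustration only (this sketch is not part of the patent disclosure), the timing check described above can be expressed as a comparison of the three timestamps against expected delay bounds. The threshold names and values below are assumptions, not values given in the patent.

```python
from dataclasses import dataclass

@dataclass
class TimingThresholds:
    # Assumed example bounds; a real system would calibrate these per sensor type.
    max_trigger_to_detect_s: float = 0.010   # first time -> second time
    max_detect_to_receive_s: float = 0.050   # second time -> third time

def verify_sensor_timing(first_time: float,
                         second_time: float,
                         third_time: float,
                         thresholds: TimingThresholds = TimingThresholds()) -> bool:
    """Return True if the sensor's timing looks consistent.

    first_time:  when the sensor was commanded to acquire data (wave transmitted)
    second_time: when the detector observed the transmitted wave
    third_time:  when the sensor data arrived back at the verification system
    """
    transmit_delay = second_time - first_time
    receive_delay = third_time - second_time
    if transmit_delay < 0 or receive_delay < 0:
        return False  # timestamps out of order: clock or sensor fault
    if transmit_delay > thresholds.max_trigger_to_detect_s:
        return False  # wave was detected too late after the trigger
    if receive_delay > thresholds.max_detect_to_receive_s:
        return False  # sensor data took too long to arrive
    return True

# Example: trigger at t=0.000 s, detector fires at t=0.004 s, data received at t=0.030 s
print(verify_sensor_timing(0.000, 0.004, 0.030))  # True under the assumed bounds
```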
Drawings
Embodiments of the present disclosure are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings in which like reference numerals refer to similar elements.
FIG. 1 is a block diagram illustrating a networked system according to one embodiment.
Fig. 2 is a block diagram illustrating an example of an autonomous vehicle according to one embodiment.
Fig. 3A-3B are block diagrams illustrating examples of a perception and planning system for use with an autonomous vehicle according to one embodiment.
Fig. 4A is a block diagram illustrating an example of a decision and planning process according to one embodiment.
Fig. 4B is a block diagram illustrating a system architecture for autopilot in accordance with one embodiment.
Fig. 5 is a block diagram showing an example of a sensor unit according to one embodiment.
Fig. 6A is a block diagram showing an example of a high-precision time generation unit according to one embodiment.
Fig. 6B is a block diagram illustrating an example of a high-precision time generation unit having three counter generators according to one embodiment.
FIG. 7 is a block diagram of an exemplary synchronization module according to one embodiment.
Fig. 8A is a graph illustrating timing of operation of two sensors according to one embodiment.
Fig. 8B is a graph illustrating the timing of the operation of two sensors according to one embodiment.
FIG. 9 is a flowchart illustrating an exemplary process for synchronizing sensors of an autonomous vehicle, according to one embodiment.
FIG. 10A is a block diagram illustrating an exemplary sensor verification system according to one embodiment.
FIG. 10B is a block diagram illustrating an exemplary sensor verification system according to one embodiment.
FIG. 11 is a block diagram illustrating an exemplary verification system according to one embodiment.
FIG. 12 is a flowchart illustrating an exemplary process for verifying a sensor used in an autonomous vehicle, according to one embodiment.
FIG. 13 is a block diagram illustrating an exemplary verification system according to one embodiment.
FIG. 14 is a flowchart illustrating an exemplary process for verifying a sensor used in an autonomous vehicle, according to one embodiment.
Detailed Description
Various embodiments and aspects of the disclosure will be described with reference to details discussed below, which are illustrated in the accompanying drawings. The following description and drawings are illustrative of the disclosure and are not to be construed as limiting the disclosure. Numerous specific details are described to provide a thorough understanding of various embodiments of the disclosure. However, in some instances, well-known or conventional details are not described in order to provide a concise discussion of embodiments of the present disclosure.
Reference in the specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the disclosure. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment.
Some embodiments relate to methods, devices, and systems for synchronizing the sensors of an autonomous driving vehicle (ADV). As described above, the ADV may use various sensors to detect objects in the environment and/or driving environment in which the ADV is located. If the sensors are not synchronized, they may not capture, acquire, record, or sense data at the same time, which makes it more difficult for the ADV to correlate and/or use the sensor data. For example, it may be more difficult for the ADV to determine whether an object detected by one sensor is the same as an object detected by a second sensor. Thus, it may be useful to synchronize the sensors such that multiple sensors capture and/or acquire sensor data at the same time (e.g., simultaneously).
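As one way to picture such synchronization (a minimal sketch, not the patent's mechanism), sensors with different acquisition delays can be triggered at staggered times so that their acquisitions complete at the same instant. The delay figures below are purely illustrative.

```python
from typing import Dict

def trigger_offsets(acquisition_delays_ms: Dict[str, float]) -> Dict[str, float]:
    """Return how long after the earliest trigger each sensor should be triggered
    so that all sensors finish acquiring data at the same instant. The sensor with
    the longest acquisition delay is triggered first (offset 0)."""
    slowest = max(acquisition_delays_ms.values())
    return {name: slowest - delay for name, delay in acquisition_delays_ms.items()}

# Illustrative delays only; real values depend on the specific camera/LIDAR/radar units.
delays_ms = {"camera": 5.0, "lidar": 100.0, "radar": 20.0}
print(trigger_offsets(delays_ms))  # {'camera': 95.0, 'lidar': 0.0, 'radar': 80.0}
```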
Fig. 1 is a block diagram illustrating an autonomous vehicle network configuration according to one embodiment of the present disclosure. Referring to fig. 1, a network configuration 100 includes an autonomous vehicle 101 that may be communicatively coupled to one or more servers 103-104 through a network 102. Although one autonomous vehicle is shown, multiple autonomous vehicles may be coupled to each other and/or to servers 103-104 through network 102. The network 102 may be any type of network, for example, a wired or wireless Local Area Network (LAN), a Wide Area Network (WAN) such as the Internet, a cellular network, a satellite network, or a combination thereof. Servers 103-104 may be any type of server or cluster of servers, such as a network or cloud server, an application server, a backend server, or a combination thereof. The servers 103 to 104 may be data analysis servers, content servers, traffic information servers, map and point of interest (MPOI) servers, location servers, or the like.
An autonomous vehicle refers to a vehicle that may be configured to be in an autonomous mode in which the vehicle navigates through an environment with little or no input from the driver. Such autonomous vehicles may include a sensor system having one or more sensors configured to detect information related to the vehicle operating environment. The vehicle and its associated controller use the detected information to navigate through the environment. The autonomous vehicle 101 may operate in a manual mode, a fully autonomous driving mode, or a partially autonomous driving mode. Hereinafter, the terms "autonomous vehicle" and "autonomous driving vehicle" (ADV) are used interchangeably.
In one embodiment, autonomous vehicle 101 includes, but is not limited to, a perception and planning system 110, a vehicle control system 111, a wireless communication system 112, a user interface system 113, an infotainment system (not shown), and a sensor system 115. The autonomous vehicle 101 may also include certain common components found in ordinary vehicles, which may be controlled by the vehicle control system 111 and/or the perception and planning system 110 using a variety of communication signals and/or commands, such as acceleration signals or commands, deceleration signals or commands, steering signals or commands, braking signals or commands, and the like.
The components 110-115 may be communicatively coupled to each other via an interconnect, bus, network, or combination thereof. For example, the components 110-115 may be communicatively coupled to each other via a Controller Area Network (CAN) bus. The CAN bus is a vehicle bus standard designed to allow microcontrollers and devices to communicate with each other in applications without a host. It is a message-based protocol originally designed for multiplexing electrical wiring within automobiles, but is also used in many other environments.
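As a side illustration of message-based CAN communication (not taken from the patent), the third-party python-can package can send and receive frames on a Linux socketcan interface; the channel name, arbitration ID, and payload below are placeholders.

```python
import can  # third-party "python-can" package

# Open a socketcan interface (e.g., a virtual 'vcan0' bus on Linux).
bus = can.interface.Bus(channel="vcan0", bustype="socketcan")

# CAN is message-based: each frame carries an identifier and up to 8 data bytes.
msg = can.Message(arbitration_id=0x123, data=[0x01, 0x02, 0x03], is_extended_id=False)
bus.send(msg)

# Any node on the bus may receive the frame; there is no host/master node.
received = bus.recv(timeout=1.0)
print(received)
```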
Referring now to fig. 2, in one embodiment, sensor system 115 includes, but is not limited to, one or more cameras 211, a Global Positioning System (GPS) unit 212, an Inertial Measurement Unit (IMU) 213, a radar unit 214, and a light detection and ranging (LIDAR) unit 215. The GPS system 212 may include a transceiver operable to provide information regarding the location of the autonomous vehicle. The IMU unit 213 may sense position and orientation changes of the autonomous vehicle based on inertial acceleration. Radar unit 214 may represent a system that utilizes radio signals to sense objects within the local environment of an autonomous vehicle. In some implementations, in addition to sensing an object, radar unit 214 may additionally sense a speed and/or a heading of the object. The LIDAR unit 215 may use a laser to sense objects in the environment of the autonomous vehicle. The LIDAR unit 215 may include, among other system components, one or more laser sources, a laser scanner, and one or more detectors. The camera 211 may include one or more devices to capture images of the surroundings of the autonomous vehicle. The camera 211 may be a still camera and/or a video camera. The camera may be mechanically movable, for example, by mounting the camera on a rotating and/or tilting platform.
The sensor system 115 may also include other sensors such as: sonar sensors, infrared sensors, steering sensors, throttle sensors, brake sensors, and audio sensors (e.g., microphones). The audio sensor may be configured to collect sound from an environment surrounding the autonomous vehicle. The steering sensor may be configured to sense a steering angle of a steering wheel, wheels of a vehicle, or a combination thereof. The throttle sensor and the brake sensor sense a throttle position and a brake position of the vehicle, respectively. In some cases, the throttle sensor and the brake sensor may be integrated as an integrated throttle/brake sensor.
In one embodiment, the vehicle control system 111 includes, but is not limited to, a steering unit 201, a throttle unit 202 (also referred to as an acceleration unit), and a brake unit 203. The steering unit 201 is used to adjust the direction or forward direction of the vehicle. The throttle unit 202 is used to control the speed of the motor or engine, which in turn controls the speed and acceleration of the vehicle. The brake unit 203 decelerates the vehicle by providing friction to decelerate the wheels or tires of the vehicle. It should be noted that the components shown in fig. 2 may be implemented in hardware, software, or a combination thereof.
Returning to fig. 1, the wireless communication system 112 allows communication between the autonomous vehicle 101 and external systems such as devices, sensors, other vehicles, and the like. For example, wireless communication system 112 may communicate wirelessly with one or more devices directly or via a communication network, such as with servers 103-104 through network 102. The wireless communication system 112 may use any cellular communication network or Wireless Local Area Network (WLAN), for example, using WiFi, to communicate with another component or system. The wireless communication system 112 may communicate directly with devices (e.g., a passenger's mobile device, a display device, speakers within the vehicle 101), for example, using an infrared link, bluetooth, or the like. The user interface system 113 may be part of peripheral devices implemented within the vehicle 101, including, for example, a keyboard, a touch screen display device, a microphone, a speaker, and the like.
Some or all of the functions of the autonomous vehicle 101 may be controlled or managed by the perception and planning system 110, particularly when operating in an autonomous mode. The perception and planning system 110 includes the necessary hardware (e.g., processors, memory, storage devices) and software (e.g., operating systems, planning and routing programs) to receive information from the sensor system 115, the control system 111, the wireless communication system 112, and/or the user interface system 113, process the received information, plan a route or path from a starting point to a destination point, and then drive the vehicle 101 based on the planning and control information. Alternatively, the perception and planning system 110 may be integrated with the vehicle control system 111.
For example, a user as a passenger may specify a starting location and destination of a trip, e.g., via a user interface. The perception and planning system 110 obtains trip related data. For example, the awareness and planning system 110 may obtain location and route information from an MPOI server, which may be part of the servers 103-104. The location server provides location services and the MPOI server provides map services and POIs for certain locations. Alternatively, such location and MPOI information may be cached locally in persistent storage of the awareness and planning system 110.
The perception and planning system 110 may also obtain real-time traffic information from a traffic information system or server (TIS) as the autonomous vehicle 101 moves along the route. It should be noted that the servers 103 to 104 may be operated by third party entities. Alternatively, the functionality of servers 103-104 may be integrated with the awareness and planning system 110. Based on the real-time traffic information, the MPOI information, and the location information, and the real-time local environment data (e.g., obstacles, objects, nearby vehicles) detected or sensed by the sensor system 115, the awareness and planning system 110 may plan a path or route and drive the vehicle 101 according to the planned route, e.g., via the control system 111, to safely and efficiently reach the specified destination.
The server 103 may be a data analysis system that performs data analysis services for various clients. In one embodiment, the data analysis system 103 includes a data collector 121 and a machine learning engine 122. The data collector 121 collects driving statistics 123 from various vehicles, which are autonomous vehicles or conventional vehicles driven by human drivers. The driving statistics 123 include information indicating issued driving commands (e.g., throttle, brake, steering commands) and responses of the vehicle (e.g., speed, acceleration, deceleration, direction) captured by sensors of the vehicle at different points in time. The driving statistics 123 may also include information describing driving environments at different points in time, such as routes (including start and destination locations), MPOI, road conditions, weather conditions, and the like.
Based on the driving statistics 123, the machine learning engine 122 generates or trains a set of rules, algorithms, and/or predictive models 124 for various purposes. In one embodiment, the algorithms 124 may include a path algorithm that receives inputs, constraints, and a cost function, and generates a path for the ADV taking into account the comfort level associated with the path and the preference for the path to stay close to the lane centerline and away from obstacles with a buffer. A cost function for path planning may also be generated as part of the algorithms 124. The algorithms 124 may then be uploaded onto the ADV to be utilized in real time during autonomous driving.
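A hedged sketch of what such a path cost function might look like is shown below; the terms, weights, and buffer value are assumptions for illustration, not the patent's actual cost function.

```python
def path_point_cost(lateral_offset_m: float,
                    distance_to_nearest_obstacle_m: float,
                    curvature: float,
                    w_center: float = 1.0,
                    w_obstacle: float = 5.0,
                    w_smooth: float = 2.0,
                    obstacle_buffer_m: float = 0.5) -> float:
    """Illustrative per-point path cost: penalize deviation from the lane centerline,
    proximity to obstacles inside a buffer, and sharp curvature. The terms and
    weights are assumptions, not the patent's actual function."""
    center_cost = w_center * lateral_offset_m ** 2
    obstacle_cost = 0.0
    if distance_to_nearest_obstacle_m < obstacle_buffer_m:
        obstacle_cost = w_obstacle * (obstacle_buffer_m - distance_to_nearest_obstacle_m) ** 2
    smooth_cost = w_smooth * curvature ** 2
    return center_cost + obstacle_cost + smooth_cost

# Total cost of a small candidate path sampled at three points (offset, clearance, curvature).
total_cost = sum(path_point_cost(l, d, k)
                 for l, d, k in [(0.2, 2.0, 0.01), (0.5, 0.3, 0.02), (0.1, 1.5, 0.0)])
print(total_cost)
```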
Fig. 3A and 3B are block diagrams illustrating examples of a perception and planning system for use with an autonomous vehicle according to one embodiment. The system 300 may be implemented as part of the autonomous vehicle 101 of fig. 1, including but not limited to the perception and planning system 110, the control system 111, and the sensor system 115. Referring to fig. 3A-3B, perception and planning system 110 includes, but is not limited to, a positioning module 301, a perception module 302, a prediction module 303, a decision module 304, a planning module 305, a control module 306, a route formulation module 307, a static obstacle mapper 308, and a path planner 309.
Some or all of the modules 301 to 309 may be implemented in software, hardware or a combination thereof. For example, the modules may be installed in persistent storage 352, loaded into memory 351, and executed by one or more processors (not shown). It should be noted that some or all of these modules may be communicatively coupled to or integrated with some or all of the modules of the vehicle control system 111 of fig. 2. Some of the modules 301 to 309 may be integrated together as an integrated module.
The positioning module 301 (also referred to as a map and route module) determines the current location of the autonomous vehicle 300 (e.g., using the GPS unit 212) and manages any data related to the user's journey or route. The user may log in and specify a starting location and destination of the trip, for example, via a user interface. The positioning module 301 communicates with other components of the autonomous vehicle 300, such as the map and route information 311, to obtain trip related data. For example, the positioning module 301 may obtain location and route information from a location server and a Map and POI (MPOI) server. The location server provides location services and the MPOI server provides map services and POIs for certain locations so that they can be cached as part of the map and route information 311. The positioning module 301 may also obtain real-time traffic information from a traffic information system or server as the autonomous vehicle 300 moves along a route.
Based on the sensor data provided by the sensor system 115 and the positioning information obtained by the positioning module 301, the perception module 302 determines a perception of the surrounding environment. The perception information may represent what an average driver would perceive around the vehicle the driver is driving. The perception may include, for example, in the form of objects, the lane configuration, traffic light signals, the relative position of another vehicle, pedestrians, buildings, crosswalks, or other traffic-related signs (e.g., stop signs, yield signs), etc. The lane configuration includes information describing the lane or lanes, such as the shape of the lane (straight or curved), the width of the lane, the number of lanes in the road, one-way or two-way lanes, merging or diverging lanes, exit lanes, etc.
The perception module 302 may include a computer vision system or functionality of a computer vision system to process and analyze images captured by one or more cameras to identify objects and/or features in an autonomous vehicle environment. The objects may include traffic signals, road boundaries, other vehicles, pedestrians and/or obstacles, etc. Computer vision systems may use object recognition algorithms, video tracking, and other computer vision techniques. In some implementations, the computer vision system can map the environment, track objects, and estimate the speed of objects, among other things. The perception module 302 may also detect objects based on other sensor data provided by other sensors, such as radar and/or LIDAR.
For each object, the prediction module 303 predicts the behavior that the object will exhibit in the environment. The prediction is performed based on perception data obtained by perceiving the driving environment at each point in time, in view of the map/route information 311 and the traffic rules 312. For example, if the object is a vehicle in an opposing direction and the current driving environment includes an intersection, the prediction module 303 will predict whether the vehicle is likely to move straight ahead or make a turn. If the perception data indicates that the intersection has no traffic light, the prediction module 303 may predict that the vehicle may have to come to a complete stop before entering the intersection. If the perception data indicates that the vehicle is currently in a left-turn-only lane or a right-turn-only lane, the prediction module 303 may predict that the vehicle will be more likely to make a left turn or a right turn, respectively.
For each object, the decision module 304 makes a decision about how to handle the object. For example, for a particular object (e.g., another vehicle at an intersection) and its metadata describing the object (e.g., speed, direction, steering angle), the decision module 304 decides how to encounter the object (e.g., overtake, yield, stop, pass). The decision module 304 may make such decisions according to a set of rules, such as traffic rules or driving rules 312, which may be stored in the persistent storage 352.
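A toy rule set in the spirit of such per-object decisions might look like the following; the thresholds and rule structure are assumptions, not the rules 312 themselves.

```python
from enum import Enum

class Decision(Enum):
    OVERTAKE = "overtake"
    YIELD = "yield"
    STOP = "stop"
    IGNORE = "ignore"

def decide(object_distance_m: float, object_speed_mps: float,
           ego_speed_mps: float, in_ego_lane: bool) -> Decision:
    """Toy rule set, loosely in the spirit of traffic/driving rules 312; the
    thresholds are assumptions and not taken from the patent."""
    if not in_ego_lane:
        return Decision.IGNORE
    if object_distance_m < 10.0:
        return Decision.STOP            # too close: come to a stop
    if object_speed_mps < ego_speed_mps - 2.0:
        return Decision.OVERTAKE        # slow obstacle ahead: pass it
    return Decision.YIELD               # otherwise keep distance / yield

print(decide(object_distance_m=25.0, object_speed_mps=3.0,
             ego_speed_mps=12.0, in_ego_lane=True))  # Decision.OVERTAKE
```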
The route formulation module 307 is configured to provide one or more routes or paths from the starting point to the destination point. For a given journey from a starting location to a destination location, received for example from a user, the route formulation module 307 obtains route and map information 311 and determines all possible routes or paths from the starting location to the destination location. The route formulation module 307 may generate a reference line in the form of a topographical map for each route it determines from the starting location to the destination location. Reference lines refer to ideal routes or paths without any interference from other vehicles, obstacles or traffic conditions. That is, if there are no other vehicles, pedestrians, or obstacles on the road, the ADV should follow the reference line precisely or closely. The topography map is then provided to decision module 304 and/or planning module 305. The decision module 304 and/or the planning module 305 examine all possible routes to choose and modify one of the best routes taking into account other data provided by other modules, such as traffic conditions from the positioning module 301, driving circumstances perceived by the perception module 302 and traffic conditions predicted by the prediction module 303. The actual path or route used to control the ADV may be close to or different from the reference line provided by the route formulation module 307, depending on the particular driving environment at that point in time.
Based on the decisions for each of the perceived objects, the planning module 305 plans the path or route of the autonomous vehicle and driving parameters (e.g., distance, speed, and/or steering angle) using the reference line provided by the route formulation module 307 as a basis. That is, for a given object, the decision module 304 decides what to do with the object, while the planning module 305 decides how to do it. For example, for a given object, decision module 304 may decide to pass through the object, while planning module 305 may determine whether to pass to the left or right of the object. Planning and control data is generated by a planning module 305, wherein the planning module 305 includes information describing how the vehicle 300 will move in the next movement cycle (e.g., the next route/path segment). For example, the planning and control data may instruct the vehicle 300 to move 10 meters at a speed of 30 miles per hour (mph) and then change to the right lane at a speed of 25 mph.
As part of the planning process, the path planner 309 may generate a plurality of planned ADV states based on the cost function 313, which may be stored in the persistent storage 352.
Based on the planning and control data, the control module 306 controls and drives the autonomous vehicle by sending appropriate commands or signals to the vehicle control system 111 according to the route or path defined by the planning and control data. The planning and control data includes information sufficient to drive the vehicle from a first point to a second point of the path or route at different points in time along the path or route using appropriate vehicle settings or driving parameters (e.g., throttle, brake, steering commands).
In one embodiment, the planning phase is performed in a plurality of planning cycles (also referred to as driving cycles), for example, at time intervals of 100 milliseconds (ms). For each planning cycle or driving cycle, one or more control commands will be issued based on the planning and control data. That is, every 100 ms, the planning module 305 plans the next route segment or path segment, including, for example, the target location and the time required for the ADV to reach the target location. Alternatively, the planning module 305 may further specify particular speeds, directions, and/or steering angles, etc. In one embodiment, the planning module 305 plans the route segment or path segment for the next predetermined period of time (e.g., 5 seconds). For each planning cycle, the planning module 305 plans the target location for the current cycle (e.g., the next 5 seconds) based on the target location planned in the previous cycle. The control module 306 then generates one or more control commands (e.g., throttle, brake, steering control commands) based on the planning and control data for the current cycle.
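The periodic planning cycle described above can be sketched as a simple loop (assumed structure; `plan_segment` and `issue_control_commands` are placeholder callables, not patent interfaces):

```python
import time

PLANNING_CYCLE_S = 0.100   # 100 ms planning/driving cycle, as described above
HORIZON_S = 5.0            # each cycle plans roughly the next 5 seconds

def run_planning_loop(plan_segment, issue_control_commands, cycles=10):
    """Skeleton of the periodic planning loop described above. `plan_segment`
    and `issue_control_commands` are placeholder callables, not patent APIs."""
    previous_target = None
    for _ in range(cycles):
        cycle_start = time.monotonic()
        # Plan the next route/path segment (target location plus the time needed
        # to reach it), starting from the target planned in the previous cycle.
        previous_target = plan_segment(previous_target, HORIZON_S)
        # Issue throttle/brake/steering commands for the current cycle.
        issue_control_commands(previous_target)
        # Sleep out the remainder of the 100 ms cycle.
        elapsed = time.monotonic() - cycle_start
        time.sleep(max(0.0, PLANNING_CYCLE_S - elapsed))

# Trivial stubs so the skeleton runs end to end.
run_planning_loop(lambda prev, horizon: (prev or 0.0) + 1.0,
                  lambda target: print("command toward target", target),
                  cycles=3)
```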
It should be noted that the decision module 304 and the planning module 305 may be integrated as an integrated module. The decision module 304/planning module 305 may include a navigation system or functionality of a navigation system to determine a driving path of an autonomous vehicle. For example, the navigation system may determine a series of speeds and forward directions for enabling the autonomous vehicle to move along the following paths: the path substantially avoids perceived obstructions while advancing the autonomous vehicle along a roadway-based path to a final destination. The destination may be set according to user input via the user interface system 113. The navigation system may dynamically update the driving path while the autonomous vehicle is running. The navigation system may combine data from the GPS system and one or more maps to determine a driving path for the autonomous vehicle.
In one embodiment, the path is planned in the SL coordinate system. The SL coordinate system may be defined with respect to a reference line (road/lane centerline). The longitudinal distance or s-distance represents the distance in the tangential direction along the reference line. Accordingly, the lateral distance or l distance represents the distance perpendicular to the s-direction. The longitudinal dimension in SL space represents the longitudinal distance of a particular object from the current position of the vehicle that is supposed to be driving along the reference line. The lateral dimension in the SL space represents the shortest distance between the object and the reference line at a particular time or location represented by the longitudinal dimension. Such a diagram in SL space is referred to as a SL diagram. In one embodiment, the lateral distance may be defined simply as the distance from the reference line. Thus, in addition to being represented in a cartesian coordinate system (XY plane), the vehicle position (pose) may be represented in a SL coordinate system as ordered pairs (longitudinal pose/position "s pose", lateral pose/position "l pose") with respect to a reference line, or simply as (s, l).
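A minimal sketch of converting a Cartesian (x, y) point into (s, l) coordinates against a polyline reference line is shown below; production code would account for curvature and other edge cases.

```python
import math

def to_sl(point_xy, reference_line_xy):
    """Project a Cartesian point onto a polyline reference line and return its
    (s, l) coordinates: s = arc length along the line, l = signed lateral offset
    (positive to the left of the reference line)."""
    px, py = point_xy
    best = None
    s_so_far = 0.0
    for (x0, y0), (x1, y1) in zip(reference_line_xy, reference_line_xy[1:]):
        dx, dy = x1 - x0, y1 - y0
        seg_len = math.hypot(dx, dy)
        if seg_len == 0.0:
            continue
        # Parameter of the closest point on this segment, clamped to [0, 1].
        t = max(0.0, min(1.0, ((px - x0) * dx + (py - y0) * dy) / seg_len ** 2))
        cx, cy = x0 + t * dx, y0 + t * dy
        dist = math.hypot(px - cx, py - cy)
        # Cross product sign tells whether the point lies left (+) or right (-).
        cross = dx * (py - y0) - dy * (px - x0)
        candidate = (dist, s_so_far + t * seg_len, math.copysign(dist, cross))
        if best is None or candidate[0] < best[0]:
            best = candidate
        s_so_far += seg_len
    _, s, l = best
    return s, l

print(to_sl((3.0, 1.0), [(0.0, 0.0), (5.0, 0.0), (10.0, 0.0)]))  # approximately (3.0, 1.0)
```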
In some implementations, one or more components of the perception and planning system 110 may include and/or may use one or more neural networks. For example, the planning module 305 may include and/or may use one or more neural networks to perform various tasks, functions, operations, actions, and the like. In another example, the perception module 302 may include and/or may use one or more neural networks to perform various tasks, functions, operations, actions, etc. In one embodiment, one or more neural networks, which may be included in and/or used by one or more components of the perception system, may detect lanes (e.g., roadway lanes) in images captured and/or generated by the sensor of the ADV. For example, the neural network used by the perception module 302 may determine a line indicator that may indicate a lane of a road in the image.
As shown in fig. 3A, the system 300 includes an Artificial Intelligence (AI) accelerator 360. The AI accelerator 360 may be a processing device capable of being designed, customized, and/or configured for an artificial intelligence application. For example, the AI accelerator 360 can be used to accelerate the operation of the artificial neural network to perform machine vision and/or machine learning operations, and the like. Examples of AI accelerator 360 may include a Graphics Processing Unit (GPU), an application specific integrated circuit, a field programmable gate array, or the like.
Fig. 4A is a block diagram illustrating an example of a decision and planning process 400 according to one embodiment. The decision and planning process 400 includes positioning/perception data 401, a path decision process 403, a speed decision process 405, a path planning process 407, a speed planning process 409, an aggregator 411, and a trajectory calculator 413.
The path decision process 403 and the speed decision process 405 may be performed by the decision module 304 shown in fig. 3B. Referring to fig. 3B, decision module 304 may use dynamic programming to generate a coarse path profile as an initial constraint for path planning process 407/speed planning process 409. The decision module 304 may use a path state machine that provides previous planning results and important information (e.g., whether the ADV is driving or changing lanes) and traffic rules. Based on the status, traffic rules, reference lines provided by the route formulation module 307, and obstacles perceived by the ADV, the path decision process 403 may decide how to deal with the perceived obstacles (i.e., ignore, overtake, yield, stop, pass) as part of the rough path profile.
For example, in one embodiment, a rough path profile is generated by a cost function consisting of costs based on the curvature of the path and the distance from the reference line and/or reference points to obstacles. Points on the reference line are selected and moved to the left or right of the reference line as candidate motions representing candidate paths. Each candidate motion has an associated cost. The associated costs of the candidate motions for one or more points on the reference line may be solved sequentially, one point at a time, using dynamic programming for the optimal cost. Dynamic programming (or dynamic optimization) is a mathematical optimization method that decomposes the problem to be solved into a sequence of value functions, so that each of these value functions is solved only once and its solution is stored. The next time the same value function occurs, the previously computed solution is simply looked up rather than recomputed.
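A compact dynamic-programming sketch of building such a rough path profile is given below; the cost callables and candidate offsets are placeholders, since the patent does not specify their exact form.

```python
import math

def rough_path_profile(stations, lateral_candidates, point_cost, transition_cost):
    """Dynamic-programming search for a rough path profile: at each station (point
    along the reference line) pick one lateral offset so that the summed per-point
    costs plus transition costs are minimal."""
    n = len(stations)
    # dp[i][j]: best cost to reach candidate j at station i; back[i][j]: best predecessor.
    dp = [[math.inf] * len(lateral_candidates) for _ in range(n)]
    back = [[0] * len(lateral_candidates) for _ in range(n)]
    for j, l in enumerate(lateral_candidates):
        dp[0][j] = point_cost(stations[0], l)
    for i in range(1, n):
        for j, l in enumerate(lateral_candidates):
            for k, prev_l in enumerate(lateral_candidates):
                cost = dp[i - 1][k] + transition_cost(prev_l, l) + point_cost(stations[i], l)
                if cost < dp[i][j]:
                    dp[i][j], back[i][j] = cost, k
    # Recover the cheapest sequence of lateral offsets.
    j = min(range(len(lateral_candidates)), key=lambda j: dp[-1][j])
    profile = [lateral_candidates[j]]
    for i in range(n - 1, 0, -1):
        j = back[i][j]
        profile.append(lateral_candidates[j])
    return list(reversed(profile))

# Toy usage: stay near the centerline (l = 0) but avoid a blocked cell at s = 20.
profile = rough_path_profile(
    stations=[0, 10, 20, 30],
    lateral_candidates=[-1.0, 0.0, 1.0],
    point_cost=lambda s, l: (100.0 if (s == 20 and l == 0.0) else 0.0) + l * l,
    transition_cost=lambda a, b: (a - b) ** 2,
)
print(profile)  # e.g. [0.0, 0.0, -1.0 (or 1.0), 0.0]
```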
The speed decision process 405 may use a speed state machine, speed traffic rules, and one or more distance travelled-time (station-time) diagrams. The velocity decision process 405 may use dynamic programming to generate a coarse velocity profile as an initial constraint for the path planning process 407/velocity planning process 409. Based on the state of the speed state machine, the speed traffic rules, the rough path profile generated by the decision process 403, and the perceived obstacles, the speed decision process 405 may generate a rough speed profile to control when to accelerate and/or decelerate the ADV.
The path planning process 407 may use a coarse path profile (e.g., a station-lateral map of lateral offset versus distance travelled) as an initial constraint to recalculate the optimal reference line using quadratic programming. Quadratic programming involves minimizing or maximizing an objective function (e.g., a quadratic function with several variables) subject to boundary, linear equality, and/or inequality constraints. One difference between dynamic programming and quadratic programming is that quadratic programming optimizes all candidate motions for all points on the reference line at once. The path planning process 407 may apply a smoothing algorithm (e.g., B-spline or regression) to the output station-lateral map. A path cost function may be utilized to recalculate the reference line with a path cost that optimizes the total cost of the candidate motions for the reference points, e.g., using quadratic programming (QP) optimization.
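The quadratic-programming step can be illustrated with a stripped-down, unconstrained smoothing problem (an assumption-laden sketch, not the patent's formulation, which also involves boundary and inequality constraints):

```python
import numpy as np

def smooth_lateral_profile(rough_l, w_ref=1.0, w_smooth=10.0):
    """Smooth a rough lateral-offset profile by solving the unconstrained
    quadratic program  min  w_ref * ||l - rough_l||^2 + w_smooth * ||D l||^2,
    where D takes successive differences. Only a sketch of the QP idea; the
    weights are illustrative assumptions."""
    rough_l = np.asarray(rough_l, dtype=float)
    n = rough_l.size
    D = np.diff(np.eye(n), axis=0)               # (n-1) x n difference operator
    H = w_ref * np.eye(n) + w_smooth * D.T @ D   # Hessian of the quadratic objective
    g = w_ref * rough_l                          # linear term from the tracking cost
    return np.linalg.solve(H, g)                 # closed-form minimizer

print(smooth_lateral_profile([0.0, 0.0, 1.0, 0.0]))  # pulled toward a smoother curve
```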
The speed planning process 409 may use the coarse speed profile (e.g., a distance travelled versus time map) and the one or more S-T maps to determine one or more speeds of the ADV. In some embodiments, the S-T map may include S-T trajectories. An S-T trajectory may indicate the distance that the ADV travels along the path at different points in time, as discussed in more detail below. Thus, the S-T trajectory (of the S-T graph) may be used to indicate or determine the speed of the vehicle at different points in time. The S-T map may also include or indicate one or more stop conditions. For example, the S-T map may also indicate the distance and the time at which the ADV should stop to avoid an obstacle such as a pedestrian, a sidewalk, a road divider (e.g., a center divider), another vehicle, etc. Although S-T maps are described herein, other types of maps (e.g., SL maps, maps using a Cartesian coordinate system, etc.) may be used in other embodiments. The speed planning process 409 may also use one or more constraints to determine the one or more speeds of the ADV. A constraint may be one or more conditions that should be met when the speed planning process 409 determines the set of speeds. For example, a constraint may be a condition that a candidate solution to the QP optimization problem is required to satisfy. One or more constraints may be represented using a speed constraint function, as discussed in more detail below.
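A small sketch of reading speeds and a stop condition off an S-T trajectory, with an assumed speed-constraint value, is shown below.

```python
def speeds_from_st(st_points, max_speed_mps=15.0):
    """Derive per-interval speeds from an S-T trajectory (list of (t, s) points,
    with s = distance travelled along the path at time t), and check a simple
    speed constraint. The constraint value is an assumption for illustration."""
    speeds = []
    for (t0, s0), (t1, s1) in zip(st_points, st_points[1:]):
        v = (s1 - s0) / (t1 - t0)
        if v > max_speed_mps:
            raise ValueError(f"speed constraint violated: {v:.1f} m/s > {max_speed_mps} m/s")
        speeds.append(v)
    return speeds

# S-T trajectory with a stop condition: the ADV holds at s = 30 m between t = 4 s and t = 6 s.
st = [(0.0, 0.0), (2.0, 15.0), (4.0, 30.0), (6.0, 30.0), (8.0, 40.0)]
print(speeds_from_st(st))  # [7.5, 7.5, 0.0, 5.0]
```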
The aggregator 411 performs the function of aggregating the path and speed planning results. For example, in one embodiment, the aggregator 411 may combine the two-dimensional S-T map and the S-L map into a three-dimensional SLT map. In another embodiment, the aggregator 411 may interpolate (or fill in additional points) based on two consecutive points on the S-L reference line or the S-T curve. In another embodiment, the aggregator 411 may convert reference points from (S, L) coordinates to (x, y) coordinates. The trajectory calculator 413 may calculate the final trajectory to control the ADV. For example, based on the SLT graph provided by the aggregator 411, the trajectory calculator 413 calculates a list of (x, y, T) points indicating when the ADV should pass a particular (x, y) coordinate.
Thus, referring back to fig. 4A, the path decision process 403 and the speed decision process 405 will generate a coarse path profile and a coarse speed profile taking into account obstacles and/or traffic conditions. Given all path and speed decisions about the obstacle, path planning process 407 and speed planning process 409 will optimize the coarse path profile and coarse speed profile with the obstacle in mind using QP planning to generate an optimal trajectory with minimal path cost and/or speed cost.
Fig. 4B is a block diagram illustrating a system architecture 450 for autonomous driving according to one embodiment. The system architecture 450 may represent the system architecture of an autonomous driving system as shown in figs. 3A and 3B. Referring to fig. 4B, the system architecture 450 includes, but is not limited to, an application layer 451, a planning and control (PNC) layer 452, a perception layer 453, a driver layer 454, a firmware layer 455, and a hardware layer 456. The application layer 451 may include a user interface or configuration application that interacts with a user or passenger of the autonomous vehicle, e.g., functionality associated with the user interface system 113. The PNC layer 452 may include at least the functionality of the planning module 305 and the control module 306. The perception layer 453 may include at least the functionality of the perception module 302. In one embodiment, there are additional layers including the functionality of the prediction module 303 and/or the decision module 304. Alternatively, such functionality may be included in the PNC layer 452 and/or the perception layer 453. The system architecture 450 also includes the driver layer 454, the firmware layer 455, and the hardware layer 456. The firmware layer 455 may represent at least the functionality of the sensor system 115, which may be implemented in the form of a field programmable gate array (FPGA). The hardware layer 456 may represent the hardware of the autonomous vehicle, such as the control system 111 and/or the sensor system 115. Layers 451-453 may communicate with the firmware layer 455 and the hardware layer 456 via the device driver layer 454.
Fig. 5 is a block diagram illustrating an example of a sensor system according to one embodiment of the present disclosure. Referring to fig. 5, the sensor system 115 includes a plurality of sensors 510 and a sensor unit 500 coupled to the host system 110. Host system 110 represents a planning and control system as described above, which may include at least some of the modules shown in fig. 3A and 3B. The sensor unit 500 may be implemented in the form of an FPGA device or an ASIC (application specific integrated circuit) device. In one embodiment, the sensor unit 500 includes, among other things, one or more sensor data processing modules 501 (also simply referred to as sensor processing modules), a data transfer module 502, and a sensor control module or logic 503. Modules 501-503 may communicate with sensor 510 via sensor interface 504 and with host system 110 via host interface 505. Alternatively, an internal or external buffer 506 may be utilized to buffer data for processing.
In one embodiment, for a receive path or upstream direction, the sensor processing module 501 is configured to receive sensor data from sensors via the sensor interface 504 and process the sensor data (e.g., format conversion, error checking), which may be temporarily stored in the buffer 506. The data transfer module 502 is configured to send the processed data to the host system 110 using a communication protocol compatible with the host interface 505. Similarly, for a transmit path or downstream direction, the data transfer module 502 is configured to receive data or commands from the host system 110. The sensor processing module 501 then processes the data into a format compatible with the corresponding sensor. The processed data is then sent to the sensor.
In one embodiment, the sensor control module or logic 503 is configured to control certain operations of the sensors 510, such as the timing of the activation of capturing sensor data, in response to commands received from a host system (e.g., the perception module 302) via the host interface 505. The host system 110 may configure the sensors 510 to capture sensor data in a coordinated and/or synchronized manner such that the sensor data may be used to sense the driving environment surrounding the vehicle at any point in time.
The sensor interface 504 may include one or more of ethernet, USB (universal serial bus), LTE (long term evolution) or cellular, WiFi, GPS, camera, CAN, serial (e.g., universal asynchronous receiver transmitter or UART), SIM (subscriber identity module) card, and other general purpose input/output (GPIO) interfaces. The host interface 505 may be any high speed or high bandwidth interface, such as a PCIe interface. The sensors 510 may include, for example, various sensors used in an autonomous vehicle, such as cameras, LIDAR devices, RADAR devices, GPS receivers, IMUs, ultrasonic sensors, GNSS (global navigation satellite system) receivers, LTE or cellular SIM cards, vehicle sensors (e.g., throttle, brake, steering sensors), and system sensors (e.g., temperature, humidity, pressure sensors), etc.
For example, the cameras may be coupled through an ethernet or GPIO interface. The GPS sensor may be coupled via a USB or a specific GPS interface. Vehicle sensors may be coupled through a CAN interface. The RADAR sensor or the ultrasonic sensor may be coupled through a GPIO interface. The LIDAR device may be coupled via an ethernet interface. The external SIM module may be coupled via an LTE interface. Similarly, an internal SIM module may be inserted onto the SIM socket of the sensor unit 500. A serial interface, such as UART, may be coupled with the console system for debugging purposes.
Note that the sensors 510 may be any type of sensors and may be provided by various vendors or suppliers. The sensor processing module 501 is configured to handle different types of sensors and their respective data formats and communication protocols. According to one embodiment, each of the sensors 510 is associated with a particular channel for processing sensor data and transmitting the processed sensor data between the host system 110 and the corresponding sensor. Each channel includes a particular sensor processing module and a particular data transfer module that have been configured or programmed to handle the corresponding sensor data and protocols.
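Purely as an illustration of the per-sensor channel idea, a channel table might pair each sensor with its own interface, processing module, and data transfer module; the names below are hypothetical and not taken from the patent.

```python
# Hypothetical per-sensor channel configuration for a sensor unit: one processing
# module (format/protocol handling) and one data transfer module per sensor.
channels = {
    "camera_front": {"interface": "ethernet", "processing": "camera_proc", "transfer": "xfer_0"},
    "lidar_top":    {"interface": "ethernet", "processing": "lidar_proc",  "transfer": "xfer_1"},
    "gps_primary":  {"interface": "usb",      "processing": "gps_proc",    "transfer": "xfer_2"},
}

for name, channel in channels.items():
    print(f"{name}: {channel['interface']} -> {channel['processing']} -> {channel['transfer']}")
```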
In one embodiment, the sensor unit 500 includes a high precision time generation circuit 517. As shown in fig. 6A-6B, high precision time generation circuitry 517 may generate a time and/or timestamp used by each sensor 510 to track when sensor data is transmitted or captured/triggered by each sensor 510 and/or received by sensor unit 500.
The sensor system 115 also includes a synchronization module 519. In an embodiment, the synchronization module 519 may synchronize the data acquisition times of one or more following sensors with a lead sensor. As discussed in more detail below, this may allow the lead sensor and the following sensors to acquire, record, capture, etc., sensor data at the same time (e.g., simultaneously). If the sensors are not able to capture, acquire, or record sensor data at the same time (e.g., simultaneously), it may be more difficult for the ADV to correlate and/or use the sensor data from the different sensors. For example, it may be more difficult for the ADV to determine whether an object detected by one sensor is the same as an object detected by a second sensor. In another example, if two sensors detect an object at different points in time, it may be more difficult for the ADV to determine the position of the object. In some embodiments, synchronizing the sensors of the ADV may allow the ADV to correlate, or more easily correlate, different sensor data from different sensors. Correlating sensor data from different sensors may allow the ADV to detect vehicles, objects, obstacles, pedestrians, driveways, etc., in the environment more quickly, efficiently, easily, and/or accurately.
Referring now to fig. 6A, the high-precision time generation circuit 517 may include a time synchronization unit 550, a GPS sensor 551, and a local timer 553. The time synchronization unit 550 may synchronize the local timer 553 with respect to time derived from a Pulse Per Second (PPS) signal from the GPS sensor 551. The PPS signal may be used to align the local timer 553 for time measurement accurate to the nanosecond. The GPS sensor 551 may be part of the GPS unit 212 of the sensor system 115 of fig. 2, or the GPS sensor 551 may be a dedicated GPS sensor integrated within the high precision time generation circuit 517. The local timer 553 may generate a time for the sensor unit 500. The local timer 553 may be any local RTC (e.g., a CPU RTC or an FPGA RTC) of the sensor unit 500, a timer of a sensor, or a time retrieved from an external source such as a cellular source (e.g., 4G, Long Term Evolution (LTE), 5G), a WiFi source, an FM receiver, etc.
Referring to fig. 6B, the time synchronization unit 550 may include a monitoring module 555, an adjustment module 557, a millisecond generator 603, a microsecond generator 605, a nanosecond generator 607, a demultiplexer 609, and a configuration 611. The millisecond generator 603, microsecond generator 605, and nanosecond generator 607 may generate millisecond, microsecond, and nanosecond oscillation periods (e.g., three oscillator counters of different granularity), respectively, based on the oscillator of the local timer 553. The configuration 611 may configure a select signal to select which of the outputs of the millisecond generator 603, microsecond generator 605, and nanosecond generator 607 is to be routed to the monitoring module 555. The monitoring module 555 may monitor the generated oscillation cycles to count the cycles. The adjustment module 557 may adjust the count (or modify the count representation) to synchronize the local timer 553 with the PPS signal from the GPS sensor 551. In one embodiment, the select signal of the configuration 611 may be programmed by a user of the sensor unit 500 or by the monitoring module 555/adjustment module 557 in a feedback loop. For example, if it is determined that the local timer 553 is relatively accurate, the user may configure the select signal to disable the millisecond generator.
Depending on the type of crystal oscillator used, the local timer 553 may have a precision ranging from 0.1 to 100 ppm, i.e., any pulse may be offset by 0.1 to 100 microseconds, while the Pulse Per Second (PPS) signal from the GPS sensor 551 has a precision of better than 0.1 ppm, or a deviation of less than 0.1 microseconds. For a GPS PPS signal of 0.1 ppm, consecutive pulses received from the GPS sensor 551 may be between 999,999.9 and 1,000,000.1 microseconds apart, while for a typical 100 ppm local timer 553 consecutive pulses may be between 999,900 and 1,000,100 microseconds apart. Furthermore, the pulse deviation of the local timer 553 may change in real time as the ambient temperature of the crystal oscillator IC used by the local timer 553 varies. Thus, the goal is to adjust or synchronize the local timer 553 to match the GPS sensor 551 in real time.
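The arithmetic behind these bounds is simple; the sketch below (with an assumed helper name) shows how a ppm rating translates into the spacing of consecutive one-second pulses.

```python
def pulse_interval_bounds_us(ppm, nominal_interval_us=1_000_000):
    """(min, max) spacing, in microseconds, of consecutive 1 Hz pulses for an
    oscillator rated at the given ppm accuracy."""
    deviation_us = nominal_interval_us * ppm / 1_000_000
    return nominal_interval_us - deviation_us, nominal_interval_us + deviation_us

print(pulse_interval_bounds_us(0.1))  # GPS PPS:       (999999.9, 1000000.1)
print(pulse_interval_bounds_us(100))  # 100 ppm timer: (999900.0, 1000100.0)
```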
In order to synchronize the local timer 553 with the GPS sensor 551, in one embodiment, the GPS sensor 551 receives a GPS pulse per second (PPS) signal, which is an RF signal transmitted by a satellite broadcasting its signal in space with a certain accuracy (e.g., <0.1 ppm). In some embodiments, the GPS sensor 551 receives the PPS signal from a first GPS satellite, and then receives the PPS signal from a second GPS satellite if the first GPS satellite goes out of range. Because the GPS satellites use their own accurate time measurements and each satellite has its own onboard set of atomic clocks, the PPS signals from the GPS satellites can be treated as one or more reference timers. Note, however, that because the local timer 553 is adjusted in real time to match any one GPS PPS signal, any time difference between the GPS PPS signals of two or more different GPS satellites should not matter, as the local timer 553 can be smoothly synchronized in real time, as described further below.
Upon receiving the GPS PPS signal, the monitoring module 555 may determine any deviation of the time of the PPS signal from the time of the local timer 553, and may generate a second local real-time clock/timer based on the determined deviation. For example, based on the PPS signal, date and time information (in Coordinated Universal Time, or UTC, format) may initially be provided by GPS NMEA (National Marine Electronics Association) data, accurate to a few seconds. Next, in one embodiment, the millisecond generator 603 may generate an oscillation count of approximately 1 millisecond (e.g., a first granularity) using the local timer 553. The frequency of the signal of the local timer 553 may be divided using a divider circuit to generate the oscillation count of approximately 1 millisecond. The monitoring module 555 may then count the number of cycles from the millisecond generator 603 over a one-second GPS PPS signal time interval (e.g., 999 cycles), i.e., the local timer 553 lags the GPS PPS signal by approximately one millisecond. Because the millisecond generator 603 lags the GPS PPS, in one embodiment, the adjustment module 557 adjusts the millisecond generator output to represent 1.001 milliseconds per oscillation. The millisecond generator 603 then generates the following 1000 oscillation representations for each second: 0.000, 1.001, 2.002, …, 999.999, and 1001 milliseconds. Thus, the 999th cycle from the millisecond generator 603 counts to 999.999 milliseconds.
Next, the microsecond generator 605 may generate an oscillation count of approximately one microsecond (e.g., a second granularity) using the local timer 553. The signal frequency of the local timer 553 may be divided using a second divider circuit to generate the oscillation count of approximately one microsecond. The monitoring module 555 may count 998 cycles from the microsecond generator 605 over a one-millisecond GPS PPS-derived time interval, i.e., a deviation of 2 microseconds. Again, since the microsecond generator 605 lags the GPS PPS, the adjustment module 557 adjusts the microsecond generator output to represent 1.002 microseconds per oscillation. The microsecond generator then generates the following 1000 oscillation representations for each millisecond: 0.000, 1.002, 2.004, …, 999.996, 1000.998, and 1002 microseconds. Thus, the 998th cycle counts to 999.996 microseconds.
Next, the nanosecond generator 607 may generate an oscillation count of approximately one nanosecond (e.g., a third granularity) using the local timer 553. The signal frequency of the local timer 553 may be divided using a third divider circuit to generate the oscillation count of approximately one nanosecond. The monitoring module 555 may count 997 cycles from the nanosecond generator 607 over a one-microsecond GPS PPS-derived time interval, i.e., a deviation of 3 nanoseconds. Likewise, the adjustment module 557 may adjust the nanosecond generator output to represent 1.003 nanoseconds per oscillation. The nanosecond generator then generates the following 1000 oscillation representations for each microsecond: 0.000, 1.003, 2.006, …, 999.991, 1000.994, 1001.997, and 1003 nanoseconds. Thus, the 997th cycle from the nanosecond generator 607 counts to 999.991 nanoseconds. In this way, any generator output (e.g., representation), or a combination thereof, may generate a high precision time in real time. The high precision time may then be provided to the sensors of the sensor unit 500. In the above example, the time generated using the nanosecond generator has an accuracy of up to one nanosecond. Note that although three generators (e.g., three granularities) are described, any number of generators and granularities may be used to generate the high-precision time.
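A minimal sketch of the adjustment described in the last three paragraphs, assuming the simple rule that each oscillation should represent the GPS-derived interval divided by the number of cycles counted in that interval (the function names are illustrative):

```python
def adjusted_tick(true_interval, cycles_counted):
    """Value each oscillation should represent so that cycles_counted oscillations
    span the GPS-derived true_interval (e.g., 1000 ms per PPS second)."""
    return true_interval / cycles_counted

# Numbers from the example above (the text rounds to three decimal places):
print(round(adjusted_tick(1000.0, 999), 3))  # ~1.001 ms per millisecond-generator cycle
print(round(adjusted_tick(1000.0, 998), 3))  # ~1.002 us per microsecond-generator cycle
print(round(adjusted_tick(1000.0, 997), 3))  # ~1.003 ns per nanosecond-generator cycle

def elapsed(cycles, tick):
    """Time represented after a number of adjusted oscillations."""
    return cycles * tick

print(round(elapsed(999, 1.001), 3))  # 999.999 ms, matching the 999th-cycle count in the text
print(round(elapsed(997, 1.003), 3))  # 999.991 ns, matching the 997th-cycle count in the text
```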
In some embodiments, the configuration 611 may selectively enable/disable any of the generators 603-607 via the demultiplexer 609. Any generator may be selectively turned on/off. This selectivity is useful when only a subset of the generator outputs is required (e.g., only the nanosecond generator). In another embodiment, the monitoring module 555 buffers (e.g., saves) the deviations at the different granularities and, if the GPS sensor signal is lost, maintains the first count value, the second count value, and the third count value (e.g., each oscillation value representation) until the GPS sensor signal is restored.
FIG. 7 is a block diagram of an exemplary synchronization module 519 according to one embodiment. As described above, synchronization module 519 may be part of sensor system 115. The synchronization module 519 may be coupled to the sensors 710A, 720A-720Z, 740A, and 760A-760Z via a sensor interface (e.g., the sensor interface 504 shown in FIG. 5). The sensor interface may include one or more of ethernet, USB (universal serial bus), LTE (long term evolution) or cellular, wiFi, GPS, camera, CAN, serial (e.g., universal asynchronous receiver transmitter or UART), SIM (subscriber identity module) card, PCIe, and/or other general purpose input/output (GPIO) interfaces. The sensors 710A, 720A-720Z, 740A, and 760A-760Z may be part of an autonomous vehicle and may be used by the autonomous vehicle to detect objects in the environment in which the autonomous vehicle is located (e.g., detect objects, vehicles, pedestrians, cyclists, driveways, buildings, signs, etc.). Examples of sensors may include cameras, LIDAR device/sensors, RADAR device sensors, GPS receivers, IMUs, ultrasound devices/sensors, GNSS (global navigation satellite system) receiver sensors, LTE or cellular SIM cards, vehicle sensors (e.g., throttle, brake, steering sensors), and system sensors (e.g., temperature, humidity, pressure sensors), etc.
The sensors 710A and 740A may be lead sensors, and the sensors 720A-720Z may be following sensors. A lead (or guidance) sensor may be associated with a set of following sensors (e.g., one or more following sensors). The guidance sensor may be a sensor used to determine when other sensors (e.g., a set of one or more following sensors) should record, capture, acquire, sense, etc., sensor data. For example, the guidance sensor may capture or acquire sensor data at some time. The set of following sensors associated with the lead sensor may be configured to capture or acquire sensor data at the same time. For example, the lead sensor and the set of following sensors may capture/acquire sensor data simultaneously. A following sensor may be associated with (e.g., grouped with) a guidance sensor. As described above, a following sensor may be a sensor that captures/acquires sensor data based on the time at which the associated lead sensor captures/acquires sensor data. In various embodiments, any combination and/or any number of lead and/or following sensors may be used in the sensor system 115. For example, the sensors may be organized into different sets, and each set may have one or more lead sensors and one or more following sensors.
Each sensor may have one or more data acquisition characteristics. The data acquisition characteristics of a sensor may be characteristics, properties, parameters, attributes, functions, etc., which may indicate and/or may be used to determine how long it may take for the sensor to capture data. For example, if the sensor is a camera, the data acquisition characteristics of the camera may include shutter speed, exposure settings, aperture settings, ISO speed, and the like. These data acquisition characteristics may be indicative of and/or may be used to determine (e.g., calculate) the time that the camera may take to capture or record an image. In another example, if the sensor is a LIDAR sensor/device, the data acquisition characteristics of the LIDAR sensor/device may include the rotational speed of a laser or mirror. The rotational speed may be indicative of and/or may be used to determine (e.g., calculate) how long it may take for the LIDAR sensor/device to capture LIDAR data.
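For illustration only, the sketch below estimates an acquisition delay from the kinds of characteristics mentioned here; the function names and the assumption that exposure time or sweep time dominates are mine, not the patent's.

```python
def camera_acquisition_delay_ms(shutter_speed_s):
    """Rough time to capture one image, assumed to be dominated by the exposure time."""
    return shutter_speed_s * 1000.0

def lidar_acquisition_delay_ms(rotation_rate_hz, sweep_fraction=1.0):
    """Rough time for a LIDAR to complete (a fraction of) one sweep of its laser/mirror."""
    return (1000.0 / rotation_rate_hz) * sweep_fraction

print(camera_acquisition_delay_ms(1 / 100))  # 10 ms for a 1/100 s shutter
print(lidar_acquisition_delay_ms(10))        # 100 ms for one full sweep at 10 Hz
```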
In one embodiment, the data acquisition characteristics of a sensor may include the amount of time it takes for the sensor to begin capturing/acquiring sensor data after receiving an instruction, message, packet, etc. For example, a camera may take time to activate the CCD (charge coupled device) or CMOS (complementary metal oxide semiconductor) sensor in the camera. In another example, a radar device/sensor may take time to power up its transmitter. In another example, a LIDAR device/sensor may take time to power up its laser. In yet another example, for a LIDAR device/sensor, rotating the laser or mirror to a particular position/location may take time. The time between receipt of an instruction, message, packet, etc., and the start of capturing sensor data may be referred to as the start-up delay of the sensor. The start-up delay may be an example of a data acquisition characteristic of the sensor.
In one embodiment, the data acquisition characteristics of a sensor may include the amount of time it takes for the sensor to capture/acquire sensor data. For example, a camera may take some time to capture or record an image. In another example, a radar sensor/device may take time to perform enough measurements to determine the distance to an object. The amount of time the sensor takes to capture, obtain, record, generate, etc., the sensor data may be referred to as the acquisition delay. The acquisition delay may be an example of a data acquisition characteristic of the sensor. The acquisition delay may be based on other data acquisition characteristics of the sensor (e.g., the shutter speed, exposure time, etc., of a camera).
The synchronization module 519 includes guidance sensor modules 731A and 751A. In one embodiment, a guidance sensor module may determine one or more data acquisition characteristics of a guidance sensor. For example, guidance sensor module 731A may determine one or more data acquisition characteristics of the sensor 710A (e.g., a guidance sensor), and guidance sensor module 751A may determine one or more data acquisition characteristics of the sensor 740A (e.g., a guidance sensor). The guidance sensor module may determine the one or more characteristics of the guidance sensor by accessing settings/parameters of the guidance sensor, reading a configuration file, querying the guidance sensor (e.g., requesting the data acquisition characteristics from the guidance sensor), and so forth.
The synchronization module 519 also includes follower sensor modules 733A-733Z and 753A-753Z. In one embodiment, a follower sensor module may determine one or more data acquisition characteristics of a following sensor. For example, the follower sensor module 733A may determine one or more data acquisition characteristics of the sensor 720A (e.g., a following sensor). The follower sensor module may determine the one or more characteristics of the following sensor by accessing settings/parameters of the following sensor, reading a configuration file, querying the following sensor (e.g., requesting the data acquisition characteristics from the following sensor), and so forth.
The synchronization module 519 also includes launch modules 732A-732Z and 752A-752Z. In one embodiment, a launch module may initiate the capture, acquisition, recording, etc., of sensor data. For example, the launch module 732A may send a frame, message, packet, instruction, etc., to the sensor 720A to cause the sensor to begin capturing sensor data. The message, packet, instruction, etc., may initiate, trigger, or otherwise cause the sensor 720A to capture sensor data. The time at which the launch module sends the message, packet, instruction, etc., to the sensor (to trigger or cause the sensor to capture sensor data) may be referred to as the start (or initiation) time.
In one embodiment, as described above, synchronization module 519 (e.g., a guidance sensor module or a following sensor module) may determine a set of data acquisition characteristics of a guidance sensor (e.g., sensor 710A) and a set of following sensors associated with the guidance sensor (e.g., sensors 720A-720Z). The synchronization module 519 (e.g., start-up module) may synchronize data acquisition times of the lead sensor (e.g., sensor 710A) and the one or more following sensors (e.g., sensors 720A-720Z) based on data acquisition characteristics of the lead sensor (e.g., sensor 710A) and the one or more following sensors (e.g., sensors 720A-720Z). For example, the synchronization module 519 may cause the lead sensor to acquire sensor data at a first data acquisition time (e.g., a first time or period of time) and may cause the following sensor to acquire sensor data at a second data acquisition time (e.g., a second time or period of time). The data acquisition time may also be referred to as a time period, time frame, etc. in which the sensor may acquire sensor data. The first data acquisition time and the second data acquisition time may overlap. For example, the synchronization module 519 may cause the lead sensor and one or more following sensors (associated with the lead sensor) to obtain data simultaneously (e.g., time periods of acquisition delay for the sensors may at least partially overlap).
In one embodiment, the synchronization module 519 (e.g., a start module) may determine a data acquisition time of the lead sensor and/or one or more of the following sensors based on data acquisition characteristics of the lead sensor and/or one or more of the following sensors. For example, as discussed in more detail below, the synchronization module 519 may determine a data acquisition time of the guidance sensor and/or one or more follower sensors based on a start delay and/or an acquisition delay of the guidance sensor and/or one or more follower sensors. In one embodiment, the data acquisition time of the sensor may include an acquisition delay, and/or may be the same as the acquisition delay (e.g., the data acquisition time is the acquisition delay).
In one embodiment, the synchronization module 519 (e.g., a start module) may determine a data acquisition time of the lead sensor and/or one or more following sensors based on a sensor history of the lead sensor and/or one or more following sensors. For example, the synchronization module 519 may track, record, etc., previous start-up delays and acquisition delays of the guidance sensor and/or one or more follow-up sensors. The synchronization module 519 may determine an average start delay and an average acquisition delay for the guidance sensor and/or one or more following sensors. The synchronization module 519 may determine a data acquisition time for the lead sensor and/or the one or more following sensors based on the average start delay and the average acquisition delay for the lead sensor and/or the one or more following sensors.
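One plausible way to implement the history-based estimate described here is to average previously observed delays; the data layout below is an assumption for illustration.

```python
from statistics import mean

# Hypothetical history of (startup_delay_ms, acquisition_delay_ms) from past captures.
history = [(5.2, 10.1), (4.8, 9.8), (5.0, 10.3)]

avg_startup_ms = mean(d[0] for d in history)      # ~5.0 ms
avg_acquisition_ms = mean(d[1] for d in history)  # ~10.07 ms

# The expected acquisition window then begins roughly avg_startup_ms after the
# trigger and lasts roughly avg_acquisition_ms.
print(avg_startup_ms, avg_acquisition_ms)
```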
In one embodiment, the synchronization module 519 (e.g., a start module) may determine the start time of a sensor based on the data acquisition time of the sensor. Based on the amount of time the sensor takes to capture data and the desired data acquisition time, the synchronization module 519 may determine when to activate or trigger the sensor such that the sensor will capture/obtain sensor data at the desired data acquisition time (e.g., during the desired time, time frame, time period, etc.). For example, as discussed in more detail below, if the desired data acquisition time of the sensor (for capturing/acquiring sensor data) is time T, the synchronization module 519 may determine that the start time of the sensor should be T minus the start-up delay of the sensor (and minus some or all of the acquisition delay).
In one embodiment, both the lead sensor and the one or more following sensors may support the use of a start time. For example, the guidance sensor and the one or more following sensors may each be triggered to begin acquiring sensor data at a particular time (e.g., may be instructed to begin acquiring sensor data at a particular time). In another embodiment, the lead sensor may not support the use of a start time while the one or more following sensors do. For example, once a sensor begins to obtain sensor data, the sensor may not be able to adjust the frequency or timing with which it captures sensor data (e.g., the sensor may capture data every 15 milliseconds and may not be able to capture data at different time intervals). Because such a lead sensor cannot be triggered or activated at a chosen time, the timing (e.g., the acquisition delay and/or start-up delay) of the lead sensor may be used by the one or more following sensors to determine when the one or more following sensors should be triggered to begin acquiring sensor data.
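A minimal sketch of the start-time rule just described, assuming a desired acquisition time T and a known start-up delay per sensor (the function name is illustrative):

```python
def start_time(desired_acquisition_time_ms, startup_delay_ms):
    """Time at which to trigger a sensor so that it begins acquiring data at the
    desired acquisition time (trigger time = T minus the start-up delay)."""
    return desired_acquisition_time_ms - startup_delay_ms

# Hypothetical example: both sensors should begin acquiring data at t = 100 ms.
print(start_time(100, 8))   # trigger a sensor with an 8 ms start-up delay at 92 ms
print(start_time(100, 20))  # trigger a sensor with a 20 ms start-up delay at 80 ms
```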
Fig. 8A is a graph showing the timing of the operation of the two sensors S1 and S2 according to one embodiment. The sensors S1 and S2 may be sensors in an autonomous vehicle. For example, sensor S1 may be a camera and sensor S2 may be a LIDAR sensor/device. Sensor S2 may be a guidance sensor, while sensor S1 may be a follow sensor associated with sensor S2 (e.g., with a guidance sensor). As described above, the guidance sensor may be a sensor for determining when one or more following sensors should record, capture, acquire, sense, etc., sensor data. The sensors S1 and S2 may each have one or more data acquisition characteristics. As described above, the data acquisition characteristics of the sensor may be characteristics, properties, parameters, attributes, functions, etc. that can indicate and/or can be used to determine how long it may take for the sensor to capture data. For example, the start-up delay and/or the acquisition delay may be data acquisition characteristics of the sensor.
As shown in fig. 8A, the sensors S1 and S2 may be activated at 5 ms. For example, the sensors S1 and S2 may receive a message, instruction, data, etc., to begin capturing, recording, acquiring, etc., sensor data at 5 ms. T1A may be the start-up delay of the sensor S1, and T2A may be the start-up delay of the sensor S2. As described above, the start-up delay may be the time it takes for the sensor to begin capturing, recording, obtaining, etc., sensor data after receiving the message, instruction, data, etc. T1A represents a start-up delay of 5 ms, and T2A represents a start-up delay of 15 ms. T1B may be the acquisition delay of the sensor S1, and T2B may be the acquisition delay of the sensor S2. As described above, the acquisition delay may be the amount of time it takes for the sensor to capture sensor data (e.g., the time it takes for a camera to capture an image). T1B represents an acquisition delay of 10 ms, and T2B represents an acquisition delay of 15 ms.
As shown in fig. 8A, the sensors S1 and S2 may not collect, record, acquire, capture, etc., sensor data at the same time. For example, the sensor S1 records/acquires sensor data during the acquisition delay T1B, which starts at 10 ms and ends at 20 ms. The sensor S2 records/acquires sensor data during the acquisition delay T2B, which starts at 20 ms and ends at 35 ms. Because the sensors S1 and S2 capture/acquire sensor data at different points in time, they may detect objects in the environment at different times. For example, the sensor S1 may detect another vehicle during the acquisition delay T1B, while the sensor S2 may detect the same vehicle during the acquisition delay T2B. However, because the times, time periods, time frames, etc., at which the sensors S1 and S2 detect the vehicle do not overlap (e.g., because the sensors S1 and S2 do not acquire/capture sensor data simultaneously), it may be more difficult for the ADV to correlate and/or use the sensor data from the sensors S1 and S2. For example, the sensor S1 may capture an image of the vehicle during the acquisition delay T1B. The sensor S2 may detect the distance and/or shape of the vehicle using a laser during the acquisition delay T2B. However, because the sensor S1 first captures the image of the vehicle and the sensor S2 only later detects the distance of the vehicle, the vehicle may no longer be in the same location (e.g., may no longer be at the same distance from the ADV) when the sensor S2 detects it. Thus, it may be more difficult for the ADV to correlate the sensor data received from the sensors S1 and S2. For example, it may be more difficult (or impossible) for the ADV to determine that the vehicle captured in the image from the sensor S1 is the same as the vehicle detected by the sensor S2.
Fig. 8B is a graph showing the timing of the operation of the two sensors S1 and S2 according to one embodiment. The sensors S1 and S2 may be sensors in an autonomous vehicle. Sensor S2 may be a guidance sensor, while sensor S1 may be a follow sensor associated with sensor S2 (e.g., with a guidance sensor). As described above, the guidance sensor may be a sensor for determining when one or more following sensors should record, capture, acquire, sense, etc., sensor data. The sensors S1 and S2 may each have one or more data acquisition characteristics. As described above, the data acquisition characteristics of the sensor may be characteristics, properties, parameters, attributes, functions, etc. that may indicate and/or may be used to determine how long it may take for the sensor to capture data.
In one embodiment, the ADV may determine the activation times of the sensors S1 and S2 based on the data acquisition characteristics of the sensors S1 and S2. For example, the sensor S1 has a start-up delay of 5 ms and an acquisition delay of 10 ms. The sensor S2 has a start-up delay of 15 ms and an acquisition delay of 15 ms. The ADV may determine the data acquisition times of the sensors S1 and S2 such that the data acquisition times of the sensors S1 and S2 at least partially overlap, or such that the sensors S1 and S2 capture, record, acquire, etc., sensor data at the same time (e.g., there is at least some overlap between the data acquisition time of the sensor S1 and the data acquisition time of the sensor S2).
As shown in fig. 8B, the ADV (e.g., the synchronization module 519 shown in figs. 5 and 7) may determine the start-up times of the sensors S1 and S2. As described above, the activation times of the sensors S1 and S2 may be the times at which the sensors S1 and S2 are triggered to begin recording, capturing, collecting, etc., sensor data. The activation time of the sensor S1 may be 15 ms and the activation time of the sensor S2 may be 5 ms. Because the start-up delay of the sensor S1 is 5 ms and the start-up delay of the sensor S2 is 15 ms, the sensor S2 is activated earlier. This allows the sensors S1 and S2 to begin capturing, recording, etc., sensor data simultaneously. For example, both sensors S1 and S2 may begin capturing, recording, etc., sensor data at 20 ms. Thus, the sensors S1 and S2 can record, capture, etc., sensor data at the same time.
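Using the figure 8B numbers, the sketch below computes each sensor's acquisition window from its activation time, start-up delay, and acquisition delay, and checks that the two windows overlap; the helper names are illustrative.

```python
def acquisition_window(activation_ms, startup_delay_ms, acquisition_delay_ms):
    """(begin, end) of the interval during which the sensor is acquiring data."""
    begin = activation_ms + startup_delay_ms
    return begin, begin + acquisition_delay_ms

def windows_overlap(w1, w2):
    """True if the two acquisition windows share any time."""
    return max(w1[0], w2[0]) < min(w1[1], w2[1])

s1 = acquisition_window(activation_ms=15, startup_delay_ms=5, acquisition_delay_ms=10)
s2 = acquisition_window(activation_ms=5, startup_delay_ms=15, acquisition_delay_ms=15)
print(s1, s2, windows_overlap(s1, s2))  # (20, 30) (20, 35) True: both begin at 20 ms
```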
In other embodiments, the start times and/or the data acquisition times may be offset, as long as there is at least some overlap between the data acquisition times of the sensors (e.g., some overlap between the time periods of the acquisition delays of the sensors). For example, the activation time of the sensor S1 may instead be 20 ms, and the acquisition delay T1B may then run from 25 ms to 35 ms. This may allow the sensors S1 and S2 to finish capturing, recording, acquiring, etc., sensor data at the same time.
As described above, the ADV may have difficulty correlating sensor data from different sensors when the sensors do not capture, record, obtain, etc., sensor data at the same time. As shown in fig. 8B, the ADV may synchronize the sensors S1 and S2 (e.g., synchronize the operation of the sensors S1 and S2) such that there is at least some overlap in the time periods during which the sensors S1 and S2 capture, record, acquire, etc., sensor data. This may allow the ADV to more easily correlate the sensor data from the sensors S1 and S2 because the times, time periods, time frames, etc., at which the sensors S1 and S2 acquire sensor data overlap. For example, the sensor S1 may be a camera and may capture an image of the vehicle during the acquisition delay T1B. The sensor S2 may be a LIDAR device/sensor and may detect the distance and/or shape of the vehicle using a laser during the acquisition delay T2B. Because the sensor S1 captures the image of the vehicle and the sensor S2 detects the distance to the vehicle at the same time, the ADV can determine that the vehicle detected by the sensor S2 (using the laser) is the same as the vehicle in the image generated by the sensor S1.
In some embodiments, correlating sensor data from different sensors may allow an ADV to more easily and/or accurately detect vehicles, objects, obstacles, pedestrians, driveways, etc. in the environment. For example, if the camera and LIDAR device/sensor detect a vehicle at the same time, the ADV can more easily correlate the sensor data and can be more confident that an object is present in the environment. In another example, if an ADV can more easily correlate sensor data from multiple sensors, the ADV can more easily determine the location, speed, and/or direction of travel of objects in the environment.
FIG. 9 is a flowchart illustrating an exemplary process 900 for synchronizing sensors of an autonomous vehicle according to one embodiment. Process 900 may be performed by processing logic that may comprise hardware (e.g., circuitry, dedicated logic, programmable logic, a processor, a processing device, a Central Processing Unit (CPU), a system-on-a-chip (SoC), etc.), software (e.g., instructions run/executed on a processing device), firmware (e.g., microcode), or a combination thereof. In some implementations, process 900 may be performed by a processing device, a synchronization module (e.g., the synchronization module 519 shown in figs. 5 and 7), portions of a synchronization module (e.g., the guidance sensor modules, launch modules, and/or follower sensor modules shown in fig. 7), and so on.
At block 905, the process 900 may determine one or more data acquisition characteristics of the guidance sensor. For example, the process 900 may query the guidance sensor or may access a configuration file. At block 910, the process 900 may determine one or more data acquisition characteristics of the one or more following sensors. For example, the process 900 may query the one or more following sensors or may access one or more configuration files.
At block 915, the process 900 may synchronize the data acquisition time of the guidance sensor with the data acquisition times of the one or more following sensors. For example, at block 916, the process 900 may determine a data acquisition time for the guidance sensor and the one or more following sensors (e.g., when the sensors should be acquiring, collecting, recording, etc., sensor data simultaneously). The data acquisition times of the guidance sensor and the one or more following sensors may be determined based on the data acquisition characteristics of the guidance sensor and the one or more following sensors (e.g., based on start-up delays, acquisition delays, etc.). At block 917, the process 900 may determine the start times of the guidance sensor and/or the one or more following sensors. For example, if the guidance sensor supports the use of a start time, the process 900 may determine the start times of the guidance sensor and the one or more following sensors. In another example, if the guidance sensor does not support the use of a start time, the process 900 may determine the start times of the one or more following sensors. At block 918, the process 900 may activate the sensors (e.g., the guidance sensor and/or the one or more following sensors) at their start times. This may cause the guidance sensor and/or the one or more following sensors to collect, record, acquire, etc., sensor data.
At block 920, the process 900 may determine a path of the ADV based on the sensor data. For example, a path on the road may be determined that avoids an obstacle detected by the sensor of the ADV. At block 925, the process 900 may control the ADV based on the path. For example, process 900 may cause the ADV to travel along a path.
Fig. 10A is a block diagram illustrating an exemplary sensor verification system 1000 according to one embodiment. Verification system 1000 includes sensor system 115 and verification device 1020. The sensor system 115 includes a sensor 510, a verification module 1010, and a sensor unit 500. As described above, the sensor system 115 may also include a host system (not shown in FIG. 10A). The sensors 510 may include various sensors used in autonomous vehicles, such as cameras, LIDAR devices, RADAR devices, GPS receivers, IMUs, ultrasonic sensors, GNSS (global navigation satellite system) receivers, LTE or cellular SIM cards, vehicle sensors (e.g., throttle, brake, steering sensors), and system sensors (e.g., temperature, humidity, pressure sensors), among others. Although one sensor 510 is shown in fig. 10A, in other embodiments, the sensor system 115 may include multiple sensors.
In one embodiment, the sensor 510 may be an active sensor. The active sensor may be a sensor that may emit, send, propagate, generate, etc. waves. For example, an active sensor may transmit or emit electromagnetic waves (e.g., radio waves, light waves, infrared waves, etc.), and may detect reflections of the electromagnetic waves to detect objects in the environment surrounding the autonomous vehicle. In another example, an active sensor may transmit or emit sound waves and may detect reflections of the sound waves to detect objects in the environment surrounding the autonomous vehicle. Examples of active sensors may include radar sensors/devices, LIDAR sensors/devices, ultrasonic sensors/devices, and the like. In another embodiment, sensor 510 may be a passive sensor. The passive sensor may be a sensor that does not emit, transmit, propagate, generate waves, etc. For example, passive sensors may detect electromagnetic or acoustic waves to detect objects in the environment surrounding an autonomous vehicle. Examples of passive sensors may include microphones, cameras, video cameras, and the like.
The sensor unit 500 may be implemented in the form of an FPGA device or an ASIC device. As described above, the sensor unit 500 may communicate with the sensor 510 via a sensor interface and may communicate with a host system via a host interface. The sensor interface may include one or more of an ethernet, USB, LTE or cellular, wiFi, GPS, camera, CAN, serial, SIM card, and/or other GPIO interface. The host interface 505 may be any high speed or high bandwidth interface, such as a PCIe interface. Sensor unit 500 may receive sensor data from sensor 510 via a sensor interface and process the sensor data (e.g., format conversion, error checking). The sensor unit 500 may transmit the processed data to the host system using a communication protocol compatible with the host interface. Similarly, for a transmit path or downstream direction, the sensor unit 500 may receive data or commands from a host system. The sensor unit 500 then processes the data into a format compatible with the sensor 510. The processed data is then sent to the sensor 510.
In one embodiment, the sensor control module or logic 503 is configured to control certain operations of the sensor 510, such as timing of the activation of capturing sensor data, in response to commands received from a host system (e.g., the perception module 302) via the host interface 505. Host system 110 may configure sensor 510 to capture sensor data in a coordinated and/or synchronized manner such that the sensor data may be used to sense the driving environment surrounding the vehicle at any point in time. Note that the sensor 510 may be any type of sensor and be provided by various suppliers or suppliers. The sensor unit 500 is configured to handle different types of sensors and their respective data formats and communication protocols.
In one embodiment, the sensor unit 500 further includes a time generation circuit (not shown in fig. 10A). The time generation circuit may generate signals for components of the verification system 1000 that may be indicative of time and/or may be used to determine time (e.g., global time, reference time, etc.). For example, the time generation circuit may be a clock that can continuously output the current time or a timing signal indicating the current time. The sensor 510 and verification module may use the timing signals to determine a time stamp and/or to determine when to perform an operation, action, etc. In one embodiment, the time generation circuit may be part of the verification module 1010. In another example, as described above, the time generation circuit may be part of the sensor unit 500. In another embodiment, the time generation circuit may be separate from the verification module 1010 and the sensor unit 500.
The verification system 1000 includes a verification device 1020. In one embodiment, the verification device 1020 may be a detector. The detector may be a device that can detect waves generated, emitted, transmitted, etc., by the sensor 510. For example, the sensor 510 may be a LIDAR device (e.g., an active sensor), and the verification device 1020 may be a photodetector that can detect the light waves (e.g., laser light) emitted by the LIDAR device. In another example, the sensor 510 may be a radar device (e.g., an active sensor), and the verification device 1020 may be an antenna that detects the radio waves emitted by the radar device. In another example, the sensor 510 may be an ultrasonic device (e.g., an active sensor), and the verification device 1020 may be a microphone that detects the sound waves emitted by the ultrasonic device.
The sensor system 115 also includes a verification module 1010. In an embodiment, the verification module 1010 may verify the operation of the sensor 510. For example, the verification module 1010 may determine whether the sensor 510 is capable of initiating capture of sensor data quickly enough (e.g., whether the initiation delay is within an acceptable time/range). In another example, the verification module 1010 may determine whether the sensor 510 is capable of capturing sensor data quickly enough (e.g., whether the acquisition delay is within an acceptable time/range). In another example, the verification module 1010 may determine whether the sensor 510 is capable of transmitting sensor data to another device (e.g., to a host system) quickly enough. In yet another example, the verification module 1010 may determine whether the sensor 510 is capable of detecting an object based on sensor data acquired by the sensor 510.
As described above, if the sensors do not capture, acquire, or record sensor data at the same time (e.g., if the sensors are not synchronized), it may be more difficult for the ADV to correlate and/or use sensor data from multiple sensors. For example, it may be more difficult for the ADV to determine whether an object detected by one sensor is the same as an object detected by a second sensor, or to determine the position of an object if two sensors detect the object at different points in time. Synchronizing the sensors of the ADV may allow the ADV to correlate, or more easily correlate, different sensor data from different sensors. Correlating sensor data from different sensors may allow the ADV to detect vehicles, objects, obstacles, pedestrians, driveways, etc., in the environment more quickly, efficiently, easily, and/or accurately. For example, the ADV can more easily correlate the sensor data and can be more confident that an object is present in the environment. In another example, if the ADV can more easily correlate sensor data from multiple sensors, the ADV can more easily determine the location, speed, and/or direction of travel of objects in the environment.
Verifying operation of the sensor 510 may allow the sensor system 115 (e.g., the synchronization module 519 shown in fig. 5 and 7) to properly synchronize the sensor 510 with other sensors in the autonomous vehicle. For example, if the sensor is not operating properly (e.g., takes too long to acquire sensor data, takes too long to start, etc.), the sensor system 115 may not be able to properly synchronize the sensor 510 with other sensors. The verification module 1010 may allow the sensor system 115 to verify that the sensor 510 is operating properly, which may allow the sensor to be properly synchronized with other sensors in the autonomous vehicle. If the sensor 510 is not operating properly, the verification module 1010 can provide an indication that the sensor 510 is not operating properly (e.g., send an error message, display an error message, etc.). This may allow a user (e.g., driver/passenger, mechanic, technician, etc.) to know when sensor 510 is not operating properly and to replace sensor 510.
FIG. 10B is a block diagram illustrating an exemplary sensor verification system 1050 according to one embodiment. The verification system 1050 includes the sensor 510, the verification module 1010, and the verification device 1020. The sensors 510 may include various sensors (e.g., LIDAR devices, radar devices, ultrasonic sensors, cameras, video cameras, GPS receivers, etc.) used in autonomous vehicles. Although one sensor 510 is shown in fig. 10B, in other embodiments, the verification system 1050 may include multiple sensors.
In one embodiment, the sensor 510 may be an active sensor (e.g., a LIDAR sensor, device, radar sensor/device, IR sensor/device, ultrasonic sensor/device). The active sensor may be a sensor that may emit, transmit, propagate, generate waves (e.g., radio waves, light waves, infrared waves, acoustic waves, etc.), etc., and may detect reflections of the waves to detect objects in the environment surrounding the autonomous vehicle. In another embodiment, sensor 510 may be a passive sensor. The passive sensor may be a sensor (e.g., camera, video camera, GPS receiver, microphone, etc.) that does not emit, transmit, propagate, generate waves, etc. The sensor 510 may be coupled to the authentication module 1010 via a sensor interface. As described above, the sensor interface may include one or more of an ethernet, USB, LTE or cellular, wiFi, GPS, camera, CAN, serial, SIM card, PCIe interface, and/or other GPIO interface. The verification module 1010 may receive sensor data from the sensor 510 via a sensor interface and process the sensor data (e.g., format conversion, error checking). In one embodiment, the verification module 1010 is configured to control certain operations of the sensor 510, such as timing of activation of captured sensor data. This may allow the verification module 1010 to verify the operation of the sensor 510.
In one embodiment, the sensor unit 500 further includes a time generation circuit (not shown in fig. 10A). The time generation circuit may generate signals for the components of the verification system that may be indicative of time and/or may be used to determine time (e.g., a global time, a reference time, etc.). For example, the time generation circuit may be a clock that can continuously output the current time or a timing signal indicating the current time. The sensor 510, the verification device 1020, and the verification module 1010 may use the timing signals to determine a timestamp and/or to determine when to perform an operation, action, etc. In one embodiment, the time generation circuit may be part of the verification module 1010. In another example, as described above, the time generation circuit may be part of the sensor unit 500. In another embodiment, the time generation circuit may be separate from the verification module 1010 and the sensor unit 500.
The verification system 1050 includes the verification device 1020. In one embodiment, the verification device 1020 may be a detector. The detector may be a device that can detect waves generated, emitted, transmitted, etc., by the sensor 510. For example, the sensor 510 may be a LIDAR device (e.g., an active sensor), and the verification device 1020 may be a photodetector that can detect the light waves (e.g., laser light) emitted by the LIDAR device. In another example, the sensor 510 may be a radar device (e.g., an active sensor), and the verification device 1020 may be an antenna that detects the radio waves emitted by the radar device. In another example, the sensor 510 may be an ultrasonic device (e.g., an active sensor), and the verification device 1020 may be a microphone that detects the sound waves emitted by the ultrasonic device.
The sensor system 115 also includes a verification module 1010. In an embodiment, the verification module 1010 may verify the operation of the sensor 510. For example, the verification module 1010 may determine whether the sensor 510 is capable of initiating capture of sensor data quickly enough (e.g., whether the initiation delay is within an acceptable time/range). In another example, the verification module 1010 may determine whether the sensor 510 is capable of capturing sensor data quickly enough (e.g., whether the acquisition delay is within an acceptable time/range). In another example, the verification module 1010 may determine whether the sensor 510 is capable of transmitting sensor data to another device (e.g., to a host system) quickly enough. In yet another example, the verification module 1010 may determine whether the sensor 510 is capable of detecting an object based on sensor data acquired by the sensor 510.
As described above, if the sensors do not capture, acquire, or record sensor data at the same time (e.g., simultaneously), it may be more difficult for the ADV to correlate and/or use sensor data from multiple sensors. Synchronizing the sensors of the ADV may allow the ADV to correlate, or more easily correlate, different sensor data from different sensors. Correlating sensor data from different sensors may allow the ADV to detect vehicles, objects, obstacles, pedestrians, driveways, etc., in the environment more quickly, efficiently, easily, and/or accurately. Verifying the operation of the sensor 510 may allow a user (e.g., a technician, mechanic, etc.) to determine whether the sensor 510 can properly be synchronized with other sensors in the autonomous vehicle. The verification module 1010 may allow the sensor system 115 to verify that the sensor 510 is operating properly, which may allow the sensor 510 to be properly synchronized with other sensors in the autonomous vehicle. If the sensor 510 is not operating properly, the verification module 1010 can provide an indication that the sensor 510 is not operating properly (e.g., send an error message, display an error message, etc.), which can allow the user to know when the sensor 510 is not operating properly and to replace the sensor 510.
Fig. 11 is a block diagram illustrating an exemplary verification system 1100 according to one embodiment. Verification system 1100 includes a sensor 510, a verification module 1010, and a detector 1150. The sensor 510 may be an active sensor such as a LIDAR device, a radar device, an ultrasound device, an infrared device/sensor (e.g., an infrared wave emitting device), an ultraviolet device/sensor (e.g., an ultraviolet wave emitting device), or the like. As described above, the active sensor may be a device that emits, transmits, generates, etc., waves (e.g., radio waves, optical waves, electromagnetic waves, acoustic waves, etc.). The verification module 1010 includes a start module 1110, a detection module 1120, an operation module 1130, and a timing module 1140.
In one embodiment, the start module 1110 (e.g., of the verification module 1010) can cause the sensor 510 to obtain sensor data and/or to begin obtaining sensor data (e.g., initiate an operation, process, function, action, etc., to obtain sensor data). For example, the start module 1110 can initiate the capture, acquisition, recording, etc., of sensor data. In one embodiment, the start module 1110 can send a frame, message, packet, instruction, etc., to the sensor 510 to cause the sensor 510 to begin capturing sensor data. The message, packet, instruction, etc., may activate, trigger, or otherwise cause the sensor 510 to capture sensor data. In another embodiment, the start module 1110 may send a signal (e.g., one or more voltages on a wire) to the sensor 510 to cause the sensor 510 to begin capturing sensor data. As discussed in more detail below, the start module 1110 can determine the time at which it causes the sensor to begin capturing sensor data based on the timing signal generated by the timing module 1140. The start module 1110 can generate a timestamp or some other information to indicate when it caused the sensor to begin capturing sensor data.
In one embodiment, the detection module 1120 (e.g., of the verification module 1010) may determine whether the detector 1150 has detected a wave transmitted, generated, sent, etc., by the sensor 510. As described above, the detector 1150 may be any type of detector and/or detection device capable of detecting the waves (e.g., electromagnetic waves, acoustic waves, etc.) emitted, generated, transmitted, etc., by the sensor 510. The detector 1150 may be a photodetector, an antenna, a microphone, etc. The detection module 1120 may be coupled to the detector 1150. The detection module 1120 may also determine when the detector 1150 detects a wave emitted by the sensor 510. For example, the detection module 1120 may receive a signal (e.g., one or more voltages on a wire) and/or a message from the detector 1150 indicating that the detector 1150 has detected a wave emitted by the sensor 510. The detection module 1120 may determine when the signal and/or message was received from the detector 1150 based on the timing signal generated by the timing module 1140 (discussed in more detail below). The detection module 1120 may generate a timestamp or some other information to indicate the time at which the signal and/or message was received from the detector 1150.
In one embodiment, the operations module 1130 may receive sensor data from the sensor 510. The sensor data may be data obtained and/or generated by the sensor. The sensor data may indicate information about the environment in which the autonomous vehicle is operating. For example, the sensor data may indicate and/or identify one or more objects in an environment in which the autonomous vehicle is located. The operations module 1130 may determine when sensor data is received from the sensor 510. For example, the operation module 1130 may determine when sensor data is received from the sensor 510 based on timing signals generated by the timing module 1140 (discussed in more detail below). The operation module 1130 may generate a time stamp or some other information to indicate when sensor data was received from the sensor 510.
The operation module 1130 may verify operation of the sensor 510 based on one or more times and/or timestamps determined by the start module 1110, the detection module 1120, and the operation module 1130. In one embodiment, the operation module 1130 may determine the start-up delay of the sensor 510 based on the time (e.g., a first time and/or timestamp) at which the start module 1110 causes the sensor 510 to obtain sensor data and the time (e.g., a second time and/or timestamp) at which the detector 1150 detects the wave emitted by the sensor 510. For example, the operation module 1130 may determine the start-up delay based on the difference between the first time and the second time. If the start-up delay is less than or equal to a threshold time (e.g., less than a threshold period of time), the operation module 1130 may determine that the sensor 510 is operating correctly and/or within an acceptable performance level. If the start-up delay is greater than the threshold time, the operation module 1130 may determine that the sensor 510 is operating incorrectly and/or not within an acceptable performance level. In another example, the operation module 1130 may determine whether the start-up delay of the sensor 510 is within a threshold of a reference start-up delay of the sensor. The reference start-up delay may be based on the data acquisition characteristics of the sensor 510. If the start-up delay is within the threshold of the reference start-up delay, the operation module 1130 may determine that the sensor 510 is operating properly; otherwise, it may determine that the sensor 510 is not operating properly.
In one embodiment, the operation module 1130 may verify operation of the sensor by determining an acquisition delay of the sensor 510 based on the time (e.g., the first time and/or timestamp) at which the start module 1110 causes the sensor 510 to acquire sensor data, the time (e.g., the second time and/or timestamp) at which the detector 1150 detects the wave transmitted by the sensor 510, and the time (e.g., a third time and/or timestamp) at which the operation module 1130 receives the sensor data from the sensor 510. For example, the operation module 1130 may determine the acquisition delay based on the difference between the second time and the third time. If the acquisition delay is less than or equal to a threshold time (e.g., less than a threshold period of time), the operation module 1130 may determine that the sensor 510 is operating correctly and/or within an acceptable performance level. If the acquisition delay is greater than the threshold time, the operation module 1130 may determine that the sensor 510 is not operating properly and/or is not within an acceptable performance level. In another example, the operation module 1130 may determine whether the acquisition delay of the sensor 510 is within a threshold of a reference acquisition delay of the sensor. The reference acquisition delay may be based on the data acquisition characteristics of the sensor 510. If the acquisition delay is within the threshold of the reference acquisition delay, the operation module 1130 may determine that the sensor 510 is operating properly; otherwise, it may determine that the sensor 510 is not operating properly.
In one implementation, the operation module 1130 may verify operation of the sensor 510 based on the difference between the first time and the third time. The difference between the first time and the third time may be indicative of the total amount of time it takes for the sensor 510 to obtain sensor data. The operation module 1130 may determine whether the total amount of time is less than or equal to a threshold (e.g., whether the total amount of time the sensor 510 uses to obtain sensor data is less than 10 milliseconds, 200 milliseconds, or some other suitable period of time). If the total amount of time is less than or equal to the threshold, the operation module 1130 may determine that the sensor 510 is operating properly. If the total amount of time is greater than the threshold, the operation module 1130 may determine that the sensor 510 is not operating properly.
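As an illustration of the three checks described above, the following Python sketch computes the start-up delay (first to second time), the acquisition delay (second to third time), and the total time (first to third time), and compares each against a limit. The function name, dataclass, and threshold values are assumptions for illustration only, not the patented implementation.

```python
# Minimal sketch (assumed names and thresholds): verify an active sensor from
# the three timestamps described above.
from dataclasses import dataclass

@dataclass
class Thresholds:
    max_startup_delay_s: float = 0.010      # assumed limit for first -> second time
    max_acquisition_delay_s: float = 0.200  # assumed limit for second -> third time
    max_total_time_s: float = 0.200         # assumed limit for first -> third time

def verify_active_sensor(t_trigger: float, t_wave_detected: float,
                         t_data_received: float, th: Thresholds) -> bool:
    """t_trigger: first time, t_wave_detected: second time, t_data_received: third time."""
    startup_delay = t_wave_detected - t_trigger
    acquisition_delay = t_data_received - t_wave_detected
    total_time = t_data_received - t_trigger
    return (0.0 <= startup_delay <= th.max_startup_delay_s
            and 0.0 <= acquisition_delay <= th.max_acquisition_delay_s
            and total_time <= th.max_total_time_s)

# Example: sensor triggered at 0.000 s, wave detected at 0.004 s, data received at 0.120 s.
print(verify_active_sensor(0.000, 0.004, 0.120, Thresholds()))  # True
```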
In one embodiment, the sensor data received by the operation module 1130 may include a timestamp. The timestamp may indicate the time at which the sensor 510 generated the sensor data. The operation module 1130 may determine whether the timestamp falls between the time at which the detector 1150 detected the wave emitted by the sensor 510 (e.g., the second time and/or timestamp) and the time at which the operation module 1130 received the sensor data from the sensor 510 (e.g., the third time and/or timestamp). For example, the operation module 1130 may determine whether the timestamp indicates a time that is earlier than the third time and later than the first time. If the timestamp indicates a time that is earlier than the third time and later than the first time, the operation module 1130 may determine that the sensor is operating properly; otherwise, it may determine that the sensor is not operating properly.
In one embodiment, the operation module 1130 may determine whether the sensor data received from the sensor 510 indicates that the sensor detected an object. For example, a reference object may be present in the environment. The operation module 1130 may determine whether the sensor data indicates that the sensor 510 detected the presence of the reference object in the environment. If the sensor 510 detected the presence, location, speed, and/or direction of the reference object, the operation module 1130 may determine that the sensor 510 is operating properly; otherwise, it may determine that the sensor 510 is not operating properly.
In one embodiment, the timing module 1140 may generate a timing signal. The timing signal may be used by the sensor 510 and the detector 1150 to determine a reference time. For example, the timing signal may indicate the current time and/or may be used by the sensor 510 and/or the detector 1150 to determine the current time. This may allow all components of the verification system 1100 (e.g., the sensor 510, the detector 1150, and the verification module 1010) to operate using the same reference time.
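For illustration, the timing signal can be thought of as a shared clock that every component queries, so that the first, second, and third times are all expressed in the same reference. The sketch below assumes such a clock object; the class and variable names are illustrative, not taken from the patent.

```python
# Minimal sketch, assuming the timing signal can be modeled as one shared clock.
import time

class TimingModule:
    def __init__(self) -> None:
        self._t0 = time.monotonic()  # reference epoch shared by all components

    def now(self) -> float:
        """Current time, in seconds, relative to the shared reference."""
        return time.monotonic() - self._t0

timing = TimingModule()
t_first = timing.now()   # when the start module triggers the sensor
t_second = timing.now()  # when the detector reports the emitted wave
t_third = timing.now()   # when the sensor data is received
print(t_first <= t_second <= t_third)  # True: all stamps share one reference time
```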
As described above, synchronizing the sensors of an ADV may allow an ADV to correlate or more easily correlate different sensor data from different sensors. Correlating sensor data from different sensors may allow an ADV to detect vehicles, objects, obstacles, pedestrians, driveways, etc. in an environment more quickly, efficiently, easily, and/or more accurately. Verifying operation of the sensor 510 may allow a user (e.g., technician, mechanic, etc.) to determine whether the sensor 510 is able to properly synchronize with other sensors in the autonomous vehicle. The verification module 1010 may allow the sensor system 115 to verify that the sensor 510 is operating properly, which may allow the sensor 510 to be properly synchronized with other sensors in the autonomous vehicle. If the sensor 510 is not operating properly, the verification module 1010 can provide an indication that the sensor 510 is not operating properly (e.g., send an error message, display an error message, etc.), which can allow a user to know when the sensor 510 is operating improperly (e.g., whether the sensor is defective) and replace the sensor 510.
FIG. 12 is a flowchart illustrating an exemplary process for verifying a sensor used in an autonomous vehicle, according to one embodiment. Process 1200 may be performed by processing logic that may comprise hardware (e.g., circuitry, dedicated logic, programmable logic, a processor, a processing device, a Central Processing Unit (CPU), a system-on-a-chip (SoC), etc.), software (e.g., instructions run/executed on a processing device), firmware (e.g., microcode), or a combination thereof. In some implementations, the process 1200 may be performed by a processing device, a verification module (e.g., the verification module 1010 shown in figs. 10A, 10B, 11, and 13), portions of a verification module (e.g., the start module, the detection module, the operation module, and/or the timing module shown in fig. 11), and so forth.
At block 1205, the process 1200 may generate a timing signal. For example, the process 1200 may generate a continuous signal indicating the current time. At block 1210, the process 1200 may provide the timing signal to the sensor and/or the detector. This may allow the sensor and/or detector to operate using the same current time. At block 1215, the process 1200 may cause the sensor to obtain sensor data at a first time. For example, the process 1200 can send a message and/or signal to the sensor to cause the sensor to obtain sensor data. At block 1220, the process 1200 may determine that the detector has detected the wave generated by the sensor at a second time. For example, the process 1200 may receive a signal or message from the detector indicating that the detector has detected an electromagnetic or acoustic wave generated by the sensor. At block 1225, the process 1200 may receive the sensor data from the sensor at a third time.
At block 1230, the process 1200 may verify operation of the sensor based on one or more of the first time, the second time, and/or the third time. For example, as described above, the process 1200 may determine whether the start-up delay of the sensor is within a threshold of a reference start-up delay. In another example, as discussed above, the process 1200 may determine whether the acquisition delay of the sensor is within a threshold of a reference acquisition delay.
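A brief sketch of the reference-delay comparison mentioned above: instead of absolute limits, the measured delays are compared against reference values within a tolerance. The reference and tolerance values here are assumed for illustration.

```python
# Hedged sketch of comparing measured delays to reference delays within a tolerance.
def within_reference(measured_s: float, reference_s: float, tolerance_s: float) -> bool:
    return abs(measured_s - reference_s) <= tolerance_s

# Assumed reference characteristics of the sensor under test
REF_STARTUP_S = 0.005
REF_ACQUISITION_S = 0.100
TOLERANCE_S = 0.002

startup_ok = within_reference(0.004, REF_STARTUP_S, TOLERANCE_S)          # True
acquisition_ok = within_reference(0.116, REF_ACQUISITION_S, TOLERANCE_S)  # False
print(startup_ok, acquisition_ok)
```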
Fig. 13 is a block diagram illustrating an exemplary verification system 1300 according to one embodiment. Verification system 1300 includes sensor 510, verification module 1010, and stimulus generator 1350. The sensor 510 may be a passive sensor such as a camera, video camera, microphone, GPS receiver, or the like. As described above, a passive sensor may be a device that does not emit, transmit, or otherwise generate waves (e.g., radio waves, optical waves, electromagnetic waves, acoustic waves, etc.). The verification module 1010 includes a start module 1310, an excitation module 1320, an operation module 1330, and a timing module 1340.
In one embodiment, as described above, the start module 1310 (e.g., of the verification module 1010) may cause the sensor 510 to obtain sensor data and/or begin obtaining sensor data (e.g., initiate an operation, process, function, action, etc., to obtain sensor data). The start module 1310 may send frames, messages, packets, instructions, etc., to the sensor 510 to cause the sensor 510 to begin capturing sensor data. The start module 1310 may also send a signal (e.g., one or more voltages on a wire) to the sensor 510 to cause the sensor 510 to begin capturing sensor data. The start module 1310 may determine the time at which it causes the sensor to begin capturing sensor data based on the timing signal generated by the timing module 1340. The start module 1310 may generate a timestamp or some other information to indicate when it caused the sensor to begin capturing sensor data.
In one embodiment, the excitation module 1320 may generate a stimulus that can be detected by the sensor 510. The excitation module 1320 may generate the stimulus over a period of time. For example, the excitation module 1320 may generate light, sound, etc., over a period of time. The time period may be based on the data acquisition characteristics of the sensor 510. For example, the time period may be based on the start-up delay and the acquisition delay of the sensor 510. Based on the start-up delay, the excitation module 1320 may begin the time period at or after the time that the sensor 510 has been activated. Based on the acquisition delay, the excitation module 1320 may end the time period such that its duration is the same as, or within a threshold of, the acquisition delay. By generating the stimulus during the acquisition delay of the sensor 510, the excitation module 1320 may allow the operation module 1330 to determine whether the sensor 510 detects the stimulus during the acquisition delay. The excitation module 1320 may also allow the operation module 1330 to determine whether the start-up delay of the sensor 510 is acceptable. The excitation module 1320 may also allow the operation module 1330 to determine whether the acquisition delay of the sensor 510 is acceptable. The excitation module 1320 may send a message and/or signal to the stimulus generator 1350 to cause the stimulus generator 1350 to generate the stimulus. For example, the message may indicate when and for how long the stimulus generator 1350 should generate the stimulus.
In one embodiment, the excitation module 1320 may determine the period of time during which the stimulus generator 1350 should generate the stimulus based on the reference start-up delay and the reference acquisition delay of the sensor 510. For example, based on the reference start-up delay and the time at which the start module 1310 causes the sensor 510 to obtain sensor data, the excitation module 1320 may determine when to begin generating the stimulus. In another example, based on the reference acquisition delay of the sensor 510, the excitation module 1320 may determine how long the stimulus should be generated (e.g., when to stop generating the stimulus).
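A small sketch of how such a stimulus window could be derived from the trigger time and the reference delays; the function name and sample values are assumptions, not the patented method.

```python
# Minimal sketch: choose when the stimulus generator should start and stop,
# based on the sensor's reference start-up and acquisition delays.
def stimulus_window(t_trigger_s: float,
                    ref_startup_delay_s: float,
                    ref_acquisition_delay_s: float) -> tuple[float, float]:
    """Return (start, end) times for driving the stimulus generator."""
    start = t_trigger_s + ref_startup_delay_s  # begin once the sensor should be capturing
    end = start + ref_acquisition_delay_s      # hold for the expected acquisition period
    return start, end

print(stimulus_window(t_trigger_s=0.0,
                      ref_startup_delay_s=0.005,
                      ref_acquisition_delay_s=0.100))  # (0.005, 0.105)
```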
In one embodiment, the operations module 1330 may receive sensor data from the sensor 510. The sensor data may be data obtained and/or generated by the sensor. The sensor data may indicate information about the environment in which the autonomous vehicle is operating. The operations module 1330 may determine when sensor data is received from the sensor 510. The operation module 1330 may generate a timestamp or some other information to indicate the time at which the sensor data was received from the sensor 510.
The operation module 1330 may verify operation of the sensor 510 based on one or more times and/or timestamps determined by the start module 1310, the excitation module 1320, and the operation module 1330. In one embodiment, the operation module 1330 may determine the start-up delay of the sensor 510 based on the time (e.g., a first time and/or timestamp) at which the start module 1310 causes the sensor 510 to obtain sensor data, the time (e.g., a second time and/or timestamp) at which the stimulus generator 1350 generated the stimulus for detection by the sensor 510, and whether the sensor 510 detected the stimulus. For example, if the start-up delay is between times T0 and T1, and the stimulus generator 1350 generates the stimulus from time T1 to time T2, the operation module 1330 may determine whether the stimulus was detected during times T1 to T2 or within a threshold outside of times T1 to T2 (e.g., within a threshold time before time T1 or after time T2). If the start-up delay is greater than a threshold time, the operation module 1330 may determine that the sensor 510 is operating incorrectly and/or is not within an acceptable performance level. In another example, the operation module 1330 may determine whether the start-up delay of the sensor 510 is within a threshold of a reference start-up delay of the sensor. The reference start-up delay may be based on the data acquisition characteristics of the sensor 510. If the start-up delay is within the threshold of the reference start-up delay, the operation module 1330 may determine that the sensor 510 is operating properly; otherwise, it may determine that the sensor 510 is not operating properly.
In one embodiment, the operation module 1330 may verify operation of the sensor by determining an acquisition delay of the sensor 510 based on the time (e.g., the second time and/or timestamp) at which the excitation module 1320 caused the stimulus generator 1350 to generate the stimulus and the time (e.g., a third time and/or timestamp) at which the operation module 1330 received the sensor data from the sensor 510. For example, the operation module 1330 may determine the acquisition delay based on the difference between the second time and the third time, and may determine whether the stimulus was detected during the acquisition delay. If the acquisition delay is less than or equal to a threshold time (e.g., less than a threshold period of time), the operation module 1330 may determine that the sensor 510 is operating correctly and/or within an acceptable performance level. If the acquisition delay is greater than the threshold time, the operation module 1330 may determine that the sensor 510 is not operating properly and/or is not within an acceptable performance level. In another example, the operation module 1330 may determine whether the acquisition delay of the sensor 510 is within a threshold of a reference acquisition delay of the sensor. The reference acquisition delay may be based on the data acquisition characteristics of the sensor 510. If the acquisition delay is within the threshold of the reference acquisition delay, the operation module 1330 may determine that the sensor 510 is operating properly; otherwise, it may determine that the sensor 510 is not operating properly.
In one embodiment, the excitation module 1320 may determine whether, and for how long, the sensor 510 detected the stimulus during the acquisition delay. If the stimulus was detected during a threshold amount of the acquisition delay (e.g., during 90% of the acquisition delay, during 85% of the acquisition delay, or some other suitable amount), the sensor 510 may be determined to be operating properly; otherwise, it may be determined to be operating improperly.
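One way to express this coverage check, under assumed interval representations and with the 90% figure used above as an example threshold:

```python
# Hedged sketch: fraction of the acquisition interval during which the sensor
# actually reported the stimulus. Interval values below are illustrative.
def stimulus_coverage(acq_start: float, acq_end: float,
                      seen_start: float, seen_end: float) -> float:
    overlap = max(0.0, min(acq_end, seen_end) - max(acq_start, seen_start))
    duration = acq_end - acq_start
    return overlap / duration if duration > 0 else 0.0

coverage = stimulus_coverage(acq_start=0.005, acq_end=0.105,
                             seen_start=0.010, seen_end=0.100)
print(coverage >= 0.90)  # True: stimulus seen for 90% of the acquisition window
```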
In one implementation, the operation module 1330 may verify operation of the sensor 510 based on the difference between the first time and the third time. The difference between the first time and the third time may be indicative of the total amount of time it takes for the sensor 510 to obtain sensor data. The operation module 1330 may determine whether the total amount of time is less than or equal to a threshold. If the total amount of time is less than or equal to the threshold, operation module 1330 may determine that sensor 510 is operating properly. If the total amount of time is greater than the threshold, operation module 1330 may determine that sensor 510 is not operating properly.
In one embodiment, the operation module 1330 may determine whether the sensor data received from the sensor 510 indicates that the stimulus (e.g., light, sound, etc.) generated by the stimulus generator 1350 was detected by the sensor 510. For example, the excitation module 1320 may cause a lamp (e.g., the stimulus generator 1350) to turn on for a period of time. The operation module 1330 may determine whether the sensor data from the sensor 510 (which may be a camera) indicates that the sensor 510 detected the light. If the sensor data indicates that the sensor 510 detected the stimulus, the operation module 1330 may determine that the sensor 510 is operating properly; otherwise, it may determine that the sensor 510 is not operating properly.
In one embodiment, the timing module 1340 may generate a timing signal. The timing signal may be used by the sensor 510 and the stimulus generator 1350 to determine a reference time. The timing signal may indicate the current time and/or may be used by the sensor 510 and/or the stimulus generator 1350 to determine the current time. This may allow all components of the verification system 1300 (e.g., the sensor 510, the stimulus generator 1350, and the verification module 1010) to operate using the same reference time.
As described above, synchronizing the sensors of an ADV may allow an ADV to correlate or more easily correlate different sensor data from different sensors. This may allow the ADV to detect vehicles, objects, obstacles, pedestrians, driveways, etc. in the environment more quickly, efficiently, easily, and/or more accurately. Verifying operation of the sensor 510 may allow a user to determine whether the sensor 510 may be properly synchronized with other sensors in the autonomous vehicle. The verification module 1010 may allow the verification system 1300 to verify that the sensor 510 is operating properly, which may allow the sensor 510 to be properly synchronized with other sensors in the autonomous vehicle. If the sensor 510 is not operating properly, the verification module 1010 can provide an indication that the sensor 510 is not operating properly. This may allow a user to know when the sensor 510 is not operating properly and to replace the sensor 510.
FIG. 14 is a flowchart illustrating an exemplary process for verifying a sensor used in an autonomous vehicle, according to one embodiment. Process 1400 may be performed by processing logic that may comprise hardware (e.g., circuitry, dedicated logic, programmable logic, a processor, a processing device, a Central Processing Unit (CPU), a system-on-a-chip (SoC), etc.), software (e.g., instructions run/executed on a processing device), firmware (e.g., microcode), or a combination thereof. In some implementations, the process 1400 may be performed by a processing device, a verification module (e.g., the verification module 1010 shown in figs. 10A, 10B, 11, and 13), a portion of a verification module (e.g., the start module, the excitation module, the operation module, and/or the timing module shown in fig. 13), and so forth.
At block 1405, the process 1400 may generate a timing signal. For example, the process 1400 may generate a continuous signal that indicates the current time. At block 1410, the process 1400 may provide the timing signal to the sensor and/or the stimulus generator. This may allow the sensor and/or the stimulus generator to operate using the same current time. At block 1415, the process 1400 may cause the sensor to obtain sensor data at a first time. For example, the process 1400 can send a message and/or signal to the sensor to cause the sensor to obtain sensor data. At block 1420, the process 1400 may generate a stimulus for the sensor to detect at a second time. For example, the process 1400 may cause the stimulus generator to generate sound or light that can be detected by the sensor. At block 1425, the process 1400 may receive the sensor data from the sensor at a third time.
At block 1430, the process 1400 may verify operation of the sensor based on one or more of the first time, the second time, and/or the third time. For example, as described above, the process 1400 may determine whether the start-up delay of the sensor is within a threshold of a reference start-up delay. In another example, as discussed above, the process 1400 may determine whether the acquisition delay of the sensor is within a threshold of a reference acquisition delay.
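The following end-to-end sketch ties the blocks of Fig. 14 together for a passive sensor, using fixed sample timings in place of real hardware. All timings, tolerances, and names are illustrative assumptions rather than the patented design.

```python
# Hedged end-to-end sketch of the passive-sensor verification flow.
def run_passive_check() -> bool:
    t_first = 0.000   # block 1415: sensor is triggered
    t_second = 0.005  # block 1420: stimulus is generated
    t_third = 0.110   # block 1425: sensor data is received
    ref_startup_s, ref_acquisition_s, tol_s = 0.005, 0.100, 0.010  # assumed references

    startup_ok = abs((t_second - t_first) - ref_startup_s) <= tol_s
    acquisition_ok = abs((t_third - t_second) - ref_acquisition_s) <= tol_s
    return startup_ok and acquisition_ok  # block 1430: verify operation

print(run_passive_check())  # True for the sample timings above
```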
Some portions of the preceding detailed description have been presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, considered to be a self-consistent sequence of operations leading to a desired result. The operations are those requiring physical manipulations of physical quantities.
It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the above discussion, it is appreciated that throughout the description, discussions utilizing terms such as those set forth in the appended claims, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
Embodiments of the present disclosure also relate to an apparatus for performing the operations herein. Such an apparatus may include a computer program stored in a non-transitory computer readable medium. A machine-readable medium includes any mechanism for storing information in a form readable by a machine (e.g., a computer). For example, a machine-readable (e.g., computer-readable) medium includes a machine (e.g., a computer) readable storage medium (e.g., read only memory ("ROM"), random access memory ("RAM"), magnetic disk storage media, optical storage media, flash memory devices).
The processes or methods depicted in the preceding figures may be performed by processing logic that comprises hardware (e.g., circuitry, dedicated logic, etc.), software (e.g., embodied on a non-transitory computer readable medium), or a combination of both. Although the process or method is described above in terms of some sequential operations, it should be appreciated that some of the operations may be performed in a different order. Further, some operations may be performed in parallel rather than sequentially.
Embodiments of the present disclosure are not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of embodiments of the disclosure as described herein.
In the foregoing specification, embodiments of the present disclosure have been described with reference to specific exemplary embodiments thereof. It will be evident that various modifications may be made thereto without departing from the broader spirit and scope of the disclosure as set forth in the following claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.

Claims (20)

1. A method for verifying operation of a sensor used in an autonomous vehicle, comprising:
causing the sensor to acquire sensor data at a first time, wherein the sensor acquires the sensor data by transmitting a wave to a detector;
determining that the detector detects the wave at a second time;
receiving the sensor data from the sensor at a third time; and
determining a start-up delay of the sensor based on the first time and the second time, determining an acquisition delay of the sensor based on the second time and the third time, and verifying operation of the sensor based on whether the start-up delay or the acquisition delay is within respective thresholds, wherein the sensor is configured to sense a driving environment during autonomous driving of the autonomous vehicle.
2. The method of claim 1, wherein causing the sensor to acquire the sensor data at the first time comprises at least one of:
transmitting a signal to the sensor; and
sending a message to the sensor, wherein the message indicates that the sensor is to acquire the sensor data.
3. The method of claim 1, wherein determining that the detector has detected the wave at the second time comprises at least one of:
receiving a signal from the detector; and
receiving a message from the detector.
4. The method of claim 1, wherein verifying operation of the sensor comprises:
determining, based on the second time and the third time, whether a data acquisition delay of the sensor is less than or equal to a threshold acquisition delay.
5. The method of claim 1, wherein verifying operation of the sensor comprises:
determining whether a start-up delay of the sensor is less than or equal to a threshold start-up delay based on the first time and the second time.
6. The method of claim 1, wherein verifying operation of the sensor comprises:
determining whether a difference between the first time and the third time is less than a threshold time.
7. The method of claim 1, further comprising:
generating a timing signal; and
providing the timing signal to at least one of the sensor and the detector.
8. The method of claim 7, wherein the sensor data comprises a timestamp determined by the sensor based on the timing signal.
9. The method of claim 1, further comprising:
determining whether the sensor data indicates that the sensor detected an object.
10. The method of claim 1, wherein the sensor comprises an active sensor.
11. The method of claim 10, wherein the active sensor comprises at least one of:
a light detection and ranging (LIDAR) sensor;
a radar sensor; and
an ultrasonic sensor.
12. A method for verifying operation of a sensor used in an autonomous vehicle, comprising:
causing the sensor to acquire sensor data at a first time, wherein:
the sensor acquires the sensor data by detecting waves; and
the sensor is unable to generate the waves detected by the sensor;
generating, at a second time, an excitation for causing the sensor to detect;
receiving sensor data from the sensor at a third time; and
determining a start-up delay of the sensor based on the first time and the second time, determining an acquisition delay of the sensor based on the second time and the third time, and verifying operation of the sensor based on whether the start-up delay or the acquisition delay is within respective thresholds, wherein the sensor is configured to sense a driving environment during autonomous driving of the autonomous vehicle.
13. The method of claim 12, wherein causing the sensor to acquire the sensor data at the first time comprises at least one of:
transmitting a signal to the sensor; and
sending a message to the sensor, wherein the message indicates that the sensor is to acquire the sensor data.
14. The method of claim 12, wherein generating an excitation for causing the sensor to detect comprises:
generating the excitation for a time period beginning at the second time, wherein the time period is determined based on data acquisition characteristics of the sensor.
15. The method of claim 12, wherein verifying operation of the sensor comprises:
determining, based on the second time and the third time, whether a data acquisition delay of the sensor is less than or equal to a threshold acquisition delay.
16. The method of claim 12, wherein verifying operation of the sensor comprises:
determining whether a start-up delay of the sensor is less than or equal to a threshold start-up delay based on the first time and the second time.
17. The method of claim 12, wherein verifying operation of the sensor comprises:
determining whether a difference between the first time and the third time is less than a threshold time.
18. The method of claim 12, further comprising:
generating a timing signal; and
providing the timing signal to the sensor.
19. The method of claim 18, wherein the sensor data comprises a timestamp determined by the sensor based on the timing signal.
20. A non-transitory machine-readable medium having instructions stored therein, which when executed by a processor, cause the processor to perform operations comprising:
causing a sensor to acquire sensor data at a first time, the sensor acquiring the sensor data by transmitting a wave to a detector;
determining that the detector detects the wave at a second time;
receiving sensor data from the sensor at a third time; and
determining a start-up delay of the sensor based on the first time and the second time, determining an acquisition delay of the sensor based on the second time and the third time, and verifying operation of the sensor based on whether the start-up delay or the acquisition delay is within respective thresholds, wherein the sensor is configured to sense a driving environment during autonomous driving of the autonomous vehicle.
CN201980004347.4A 2019-08-30 2019-08-30 Verifying timing of sensors used in an autonomous vehicle Active CN113016153B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/103811 WO2021035722A1 (en) 2019-08-30 2019-08-30 Verifying timing of sensors used in autonomous driving vehicles

Publications (2)

Publication Number Publication Date
CN113016153A (en) 2021-06-22
CN113016153B (en) 2023-12-05

Family

ID=74683757

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980004347.4A Active CN113016153B (en) 2019-08-30 2019-08-30 Verifying timing of sensors used in an autonomous vehicle

Country Status (6)

Country Link
US (1) US11488389B2 (en)
EP (1) EP3888276B1 (en)
JP (1) JP7309886B2 (en)
KR (1) KR102491563B1 (en)
CN (1) CN113016153B (en)
WO (1) WO2021035722A1 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10969760B2 (en) * 2018-04-12 2021-04-06 Faro Technologies, Inc. Coordinate measurement system with auxiliary axis
US11874101B2 (en) 2018-04-12 2024-01-16 Faro Technologies, Inc Modular servo cartridges for precision metrology
US11861957B2 (en) * 2019-05-09 2024-01-02 Argo AI, LLC Time master and sensor data collection for robotic system
KR102635388B1 (en) * 2019-10-24 2024-02-13 현대모비스 주식회사 Automotive sensor integration module and system using the same
DE102019216517B3 (en) * 2019-10-25 2021-03-18 Daimler Ag Method for synchronizing at least two sensor systems
KR102525191B1 (en) * 2020-08-07 2023-04-26 한국전자통신연구원 System and method for generating and controlling driving paths in autonomous vehicles
KR102343059B1 (en) * 2021-08-05 2021-12-27 주식회사 인피닉 Data collecting system for artificial intelligence machine learning, and device therefor
KR102493764B1 (en) * 2021-08-31 2023-02-06 국방과학연구소 Time synchronization method of electronic apparatus
DE102022207189B3 (en) 2022-07-14 2023-09-28 Continental Autonomous Mobility Germany GmbH Sensor device, vehicle and method for operating a sensor device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6313741B1 (en) * 1988-09-14 2001-11-06 Wabco Standard Gmbh Fault detection circuit for sensors
CN109194436A (en) * 2018-11-01 2019-01-11 百度在线网络技术(北京)有限公司 Sensor time stabs synchronous detecting method, device, equipment, medium and vehicle
CN109743128A (en) * 2019-01-29 2019-05-10 领目科技(上海)有限公司 A kind of vehicle-mounted multi information synchronous control system and method
CN109815555A (en) * 2018-12-29 2019-05-28 百度在线网络技术(北京)有限公司 The environmental modeling capability assessment method and system of automatic driving vehicle
CN109952600A (en) * 2016-06-30 2019-06-28 奥克托信息技术股份公司 A method of for estimating the running time of vehicle based on the determination of the state of vehicle

Family Cites Families (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2762397B1 (en) 1997-04-18 1999-07-09 Thomson Csf DEVICE FOR SELF TESTING THE TRANSMISSION AND RECEPTION CHAIN OF A RADAR, PARTICULARLY FOR AUTOMOBILE
US6400308B1 (en) * 1998-02-20 2002-06-04 Amerigon Inc. High performance vehicle radar system
US8525723B2 (en) * 1999-06-14 2013-09-03 Escort Inc. Radar detector with navigation function
US7358892B2 (en) * 2005-04-04 2008-04-15 Raytheon Company System and method for coherently combining a plurality of radars
DE102007017522A1 (en) * 2007-04-13 2008-10-16 Sick Ag Test method for checking the functionality of a monitoring sensor, monitoring method and monitoring sensor
AU2012325362B2 (en) * 2011-10-19 2014-08-07 Balu Subramanya Directional speed and distance sensor
DE102013010924A1 (en) * 2013-06-29 2014-12-31 Man Truck & Bus Ag Motor vehicle with at least one driver assistance system using the data of a radar sensor
US9442184B2 (en) * 2014-02-21 2016-09-13 Nxp B.V. Functional safety monitor pin
US9098753B1 (en) * 2014-04-25 2015-08-04 Google Inc. Methods and systems for object detection using multiple sensors
US9733348B2 (en) * 2014-07-03 2017-08-15 GM Global Technology Operations LLC Vehicle radar with beam adjustment
DE102015209129B3 (en) * 2015-05-19 2016-11-10 Robert Bosch Gmbh Method for sensor synchronization
US10101747B2 (en) 2015-12-11 2018-10-16 Uber Technologies, Inc. Formatting sensor data for use in autonomous vehicle communications platform
EP3223034B1 (en) * 2016-03-16 2022-07-20 Ricoh Company, Ltd. Object detection apparatus and moveable apparatus
US10114103B2 (en) 2016-03-31 2018-10-30 Uber Technologies, Inc. System and method for sensor triggering for synchronized operation
US10145948B2 (en) 2016-07-13 2018-12-04 Texas Instruments Incorporated Methods and apparatus for narrowband ranging systems using coarse and fine delay estimation
US10209709B2 (en) * 2016-08-15 2019-02-19 Ford Global Technologies, Llc LIDAR sensor frost detection
EP3407185B1 (en) * 2017-05-24 2019-10-16 Continental Automotive GmbH Algorithm triggered sensor data acquisition
DE102017112789A1 (en) * 2017-06-09 2018-12-13 Valeo Schalter Und Sensoren Gmbh Optoelectronic detection device for a motor vehicle with transmit and receive paths that can be tested via the same signal processing path and method for operating such a detection device
US11636077B2 (en) * 2018-01-05 2023-04-25 Nio Technology (Anhui) Co., Ltd. Methods, devices, and systems for processing sensor data of vehicles
US10884115B2 (en) * 2018-03-09 2021-01-05 Waymo Llc Tailoring sensor emission power to map, vehicle state, and environment
CN108445808A (en) * 2018-03-30 2018-08-24 深圳前海清科技有限公司 The sensing device and method that data synchronize
US10404261B1 (en) * 2018-06-01 2019-09-03 Yekutiel Josefsberg Radar target detection system for autonomous vehicles with ultra low phase noise frequency synthesizer
CN109271880B (en) * 2018-08-27 2021-08-24 深圳一清创新科技有限公司 Vehicle detection method, device, computer equipment and storage medium
CN109450582A (en) 2018-11-01 2019-03-08 百度在线网络技术(北京)有限公司 Sensor time stabs synchronous detecting method, device, equipment, medium and vehicle
CN109725572A (en) * 2018-12-25 2019-05-07 初速度(苏州)科技有限公司 A kind of multisensor accurate clock synchronization system and method
CN109462454A (en) * 2018-12-31 2019-03-12 武汉环宇智行科技有限公司 Automobile sensor method for synchronizing time and system based on PTP protocol

Also Published As

Publication number Publication date
EP3888276B1 (en) 2023-05-31
US20210383133A1 (en) 2021-12-09
EP3888276A1 (en) 2021-10-06
KR102491563B1 (en) 2023-01-26
JP2022534339A (en) 2022-07-29
CN113016153A (en) 2021-06-22
KR20210106460A (en) 2021-08-30
WO2021035722A1 (en) 2021-03-04
EP3888276A4 (en) 2022-06-29
US11488389B2 (en) 2022-11-01
JP7309886B2 (en) 2023-07-18

Similar Documents

Publication Publication Date Title
CN113016153B (en) Verifying timing of sensors used in an autonomous vehicle
EP3614176B1 (en) A hardware centralized time synchronization hub for an autonomous driving vehicle
US11807265B2 (en) Synchronizing sensors of autonomous driving vehicles
EP3614687B1 (en) A gps based high precision timestamp generation circuit for an autonomous driving vehicle
EP3613648B1 (en) A time source recovery system for an autonomous driving vehicle
EP3614222B1 (en) A time source ranking system for an autonomous driving vehicle
CN112543876B (en) System for sensor synchronicity data analysis in an autonomous vehicle
US20210356599A1 (en) Partial point cloud-based pedestrians&#39; velocity estimation method
US11592581B2 (en) Dual inertial measurement units for inertial navigation system
US11662745B2 (en) Time determination of an inertial navigation system in autonomous driving systems

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant