WO2024011408A1 - Method and apparatus for synchronously collecting data, synchronization determination method and apparatus, and autonomous vehicle - Google Patents


Info

Publication number
WO2024011408A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
image sensor
image
radar sensor
information
Prior art date
Application number
PCT/CN2022/105187
Other languages
French (fr)
Chinese (zh)
Inventor
李贤飞
黄自瑞
张满江
Original Assignee
阿波罗智能技术(北京)有限公司
百度(美国)有限责任公司
Priority date
Filing date
Publication date
Application filed by 阿波罗智能技术(北京)有限公司 and 百度(美国)有限责任公司
Priority to PCT/CN2022/105187 priority Critical patent/WO2024011408A1/en
Publication of WO2024011408A1 publication Critical patent/WO2024011408A1/en

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 — Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 — Lidar systems specially adapted for specific applications
    • G01S17/89 — Lidar systems specially adapted for specific applications for mapping or imaging

Definitions

  • The present disclosure relates to the field of artificial intelligence, specifically to technical fields such as autonomous driving, computer vision, and cloud computing, and in particular to a method for synchronously collecting data, a synchronization determination method, an apparatus, a vehicle-mounted terminal, a controller, a vehicle, a cloud system, an electronic device, and a readable storage medium.
  • With the development of computer technology and network technology, autonomous driving technology has developed rapidly. In autonomous driving, it is usually necessary to rely on image data collected by image sensors and point cloud data collected by radar sensors to perceive the environmental information around the vehicle, and to determine the autonomous driving strategy based on the perception results.
  • the present disclosure aims to provide a method for synchronously collecting data, a synchronization determination method, a device, a vehicle-mounted terminal, a controller, a vehicle, a cloud system, an electronic device and a readable storage medium that are conducive to improving data alignment accuracy.
  • A method for synchronously collecting data includes: in response to receiving, from a radar sensor, a data packet for a predetermined angle, determining first time information at which the radar sensor collects point cloud data at the predetermined angle, wherein the predetermined angle is within the viewing angle range of an image sensor; determining delay information for the image sensor according to the first time information; and sending the delay information to a controller, so that the controller controls the image sensor to collect data synchronously with the radar sensor rotated to the predetermined angle, wherein the data packet includes the point cloud data collected by the radar sensor at the predetermined angle.
  • A method for synchronously collecting data includes: in response to receiving delay information for an image sensor, adjusting the trigger moment of the image sensor according to the delay information; in response to reaching the trigger moment, sending a trigger signal to the image sensor; and in response to receiving image data collected by the image sensor, adding time information to the image data, so as to use the time information to align the image data with point cloud data collected by a radar sensor, wherein the delay information is determined based on time information at which the radar sensor collects point cloud data at a predetermined angle, and the predetermined angle is within the viewing angle range of the image sensor.
  • A method for synchronously collecting data includes: in response to receiving, from a radar sensor, a data packet for a predetermined angle, determining first time information at which the radar sensor collects point cloud data at the predetermined angle, wherein the predetermined angle is within the viewing angle range of an image sensor; determining delay information for the image sensor according to the first time information; adjusting the trigger moment of the image sensor according to the delay information, so that the image sensor collects data synchronously with the radar sensor rotated to the predetermined angle; in response to reaching the trigger moment, sending a trigger signal to the image sensor; and in response to receiving image data collected by the image sensor, adding second time information to the image data, so as to use the second time information to align the image data with the point cloud data collected by the radar sensor.
  • A synchronization determination method includes: acquiring a data packet sequence collected by a radar sensor and an image data sequence collected by an image sensor; for each data packet in the data packet sequence, determining the image data matching the data packet according to the time information of the data packet and the time information of each image data in the image data sequence; determining the synchronization relationship between each data packet and the matching image data based on the difference in time information between the data packet and the matching image data and a difference threshold; and determining the synchronization relationship between the radar sensor and the image sensor based on the synchronization relationships between the multiple data packets in the data packet sequence and their matching image data.
  • A vehicle-mounted terminal is configured to: in response to receiving, from a radar sensor, a data packet for a predetermined angle, determine first time information at which the radar sensor collects point cloud data at the predetermined angle, wherein the predetermined angle is within the viewing angle range of an image sensor; determine delay information for the image sensor according to the first time information; and send the delay information to a controller, so that the controller controls the image sensor to collect data synchronously with the radar sensor rotated to the predetermined angle, wherein the data packet includes the point cloud data collected by the radar sensor at the predetermined angle.
  • A controller is configured to: in response to receiving delay information for an image sensor, adjust the trigger moment of the image sensor according to the delay information; in response to reaching the trigger moment, send a trigger signal to the image sensor; and in response to receiving image data collected by the image sensor, add time information to the image data, so as to use the time information to align the image data with point cloud data collected by a radar sensor, wherein the delay information is determined based on time information at which the radar sensor collects point cloud data at a predetermined angle, and the predetermined angle is within the viewing angle range of the image sensor.
  • An autonomous vehicle includes a vehicle-mounted terminal, a controller, a radar sensor, and an image sensor. The vehicle-mounted terminal is configured to: in response to receiving, from the radar sensor, a data packet for a predetermined angle, determine first time information at which the radar sensor collects point cloud data at the predetermined angle, wherein the predetermined angle is within the viewing angle range of the image sensor, and the data packet includes the point cloud data collected by the radar sensor at the predetermined angle; determine delay information for the image sensor according to the first time information; and send the delay information to the controller.
  • The controller is configured to: in response to receiving the delay information, adjust the trigger moment of the image sensor according to the delay information; in response to reaching the trigger moment, send a trigger signal to the image sensor; and in response to receiving image data collected by the image sensor, add second time information to the image data, so as to use the second time information to align the image data with the point cloud data collected by the radar sensor.
  • A cloud system is configured to: obtain a data packet sequence collected by a radar sensor and an image data sequence collected by an image sensor; for each data packet in the data packet sequence, determine the image data matching the data packet according to the time information of the data packet and the time information of each image data in the image data sequence; determine the synchronization relationship between each data packet and the matching image data based on the difference in time information between the data packet and the matching image data and a difference threshold; and determine the synchronization relationship between the radar sensor and the image sensor based on the synchronization relationships between the multiple data packets in the data packet sequence and their matching image data.
  • A device for synchronously collecting data includes: a time information determination module, configured to, in response to receiving, from a radar sensor, a data packet for a predetermined angle, determine first time information at which the radar sensor collects point cloud data at the predetermined angle, wherein the predetermined angle is within the viewing angle range of an image sensor; a delay information determination module, configured to determine delay information for the image sensor based on the first time information; and an information sending module, configured to send the delay information to a controller, so that the controller controls the image sensor to collect data synchronously with the radar sensor rotated to the predetermined angle, wherein the data packet includes the point cloud data collected by the radar sensor at the predetermined angle.
  • A device for synchronously collecting data includes: a time adjustment module, configured to, in response to receiving delay information for an image sensor, adjust the trigger moment of the image sensor according to the delay information; a signal sending module, configured to send a trigger signal to the image sensor in response to reaching the trigger moment; and a time adding module, configured to, in response to receiving image data collected by the image sensor, add time information to the image data, so as to use the time information to align the image data with point cloud data collected by a radar sensor, wherein the delay information is determined based on time information at which the radar sensor collects point cloud data at a predetermined angle, and the predetermined angle is within the viewing angle range of the image sensor.
  • A device for synchronously collecting data includes: a time information determination module, configured to, in response to receiving, from a radar sensor, a data packet for a predetermined angle, determine first time information at which the radar sensor collects point cloud data at the predetermined angle, wherein the predetermined angle is within the viewing angle range of an image sensor; a delay information determination module, configured to determine delay information for the image sensor based on the first time information; a time adjustment module, configured to adjust the trigger moment of the image sensor according to the delay information, so that the image sensor collects data synchronously with the radar sensor rotated to the predetermined angle; a signal sending module, configured to send a trigger signal to the image sensor in response to reaching the trigger moment; and a time adding module, configured to, in response to receiving image data collected by the image sensor, add second time information to the image data, so as to use the second time information to align the image data with the point cloud data collected by the radar sensor.
  • A synchronization determination device includes: a data acquisition module, configured to acquire a data packet sequence collected by a radar sensor and an image data sequence collected by an image sensor; a data matching module, configured to, for each data packet in the data packet sequence, determine the image data matching the data packet based on the time information of the data packet and the time information of each image data in the image data sequence; a data relationship determination module, configured to determine the synchronization relationship between each data packet and the matching image data based on the difference in time information between the data packet and the matching image data and a difference threshold; and a synchronization relationship determination module, configured to determine the synchronization relationship between the radar sensor and the image sensor based on the synchronization relationships between the multiple data packets in the data packet sequence and their matching image data.
  • An electronic device includes: at least one processor; and a memory communicatively connected to the at least one processor, wherein the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor so that the at least one processor can execute the method for synchronously collecting data or the synchronization determination method provided by the present disclosure.
  • a non-transitory computer-readable storage medium storing computer instructions, wherein the computer instructions are used to cause the computer to execute the method of synchronously collecting data or the method of synchronization determination provided by the present disclosure.
  • a computer program product including a computer program/instruction that, when executed by a processor, implements the method for synchronously collecting data or the method for synchronization determination provided by the present disclosure.
  • Figure 1 is a schematic diagram of an application scenario of a method for synchronously collecting data, a method for determining synchronization, and a device according to an embodiment of the present disclosure
  • Figure 2 is a schematic flowchart of a method for synchronously collecting data according to an embodiment of the present disclosure
  • Figure 3 is a schematic diagram of the principle of determining delay information for an image sensor according to an embodiment of the present disclosure
  • Figure 4 is a schematic flowchart of a method for synchronously collecting data according to another embodiment of the present disclosure
  • Figure 5 is a schematic flowchart of adjusting the triggering time of an image sensor according to an embodiment of the present disclosure
  • Figure 6 is a schematic diagram of the principle of a method for synchronously collecting data according to an embodiment of the present disclosure
  • Figure 7 is a schematic diagram of the principle of synchronous data collection according to an embodiment of the present disclosure.
  • Figure 8 is a schematic flowchart of a synchronization determination method according to an embodiment of the present disclosure.
  • Figure 9 is a schematic diagram of a vehicle-mounted terminal according to an embodiment of the present disclosure.
  • Figure 10 is a schematic diagram of a controller according to an embodiment of the present disclosure.
  • Figure 11 is a schematic diagram of an autonomous vehicle according to an embodiment of the present disclosure.
  • Figure 12 is a schematic diagram of a cloud system according to an embodiment of the present disclosure.
  • Figure 13 is a structural block diagram of a device for synchronously collecting data according to an embodiment of the present disclosure
  • Figure 14 is a structural block diagram of a device for synchronously collecting data according to another embodiment of the present disclosure.
  • Figure 15 is a structural block diagram of a device for synchronously collecting data according to another embodiment of the present disclosure.
  • Figure 16 is a structural block diagram of a synchronization determination device according to an embodiment of the present disclosure.
  • FIG. 17 is a block diagram of an electronic device used to implement the method of synchronously collecting data or the method of synchronization determination according to an embodiment of the present disclosure.
  • autonomous vehicles need to sense rich environmental information around them and take timely safety measures when there are safety hazards in the surrounding environment.
  • autonomous vehicles are usually equipped with radar sensors and image sensors.
  • Image sensors are used to collect image data to obtain rich texture and color information in the environment
  • radar sensors are used to collect point cloud data to obtain distance information of objects in the environment.
  • reconstructed three-dimensional environmental information can be obtained by fusing image data and point cloud data.
  • If the collection times of the image data and the point cloud data are not synchronized, the environmental information obtained by fusing the image data and the point cloud data will deviate from the actual environmental information. This deviation is amplified as the speed of the autonomous vehicle increases, which affects the safe driving of the autonomous vehicle.
  • In view of this, the present disclosure provides a method and apparatus for synchronously collecting data that improve the collection synchronization of image data and point cloud data, and also provides a synchronization determination method and apparatus that efficiently and accurately evaluate the synchronization of data collection.
  • Figure 1 is a schematic diagram of application scenarios of a method for synchronously collecting data, a method for determining synchronization, and a device according to an embodiment of the present disclosure.
  • the application scenario 100 of this embodiment may include an autonomous vehicle 110 , a road 120 and a cloud system 130 .
  • the autonomous vehicle 110 may be integrated with a vehicle-mounted terminal, a radar sensor 111 and an image sensor 112.
  • the vehicle-mounted terminal can be connected to the radar sensor 111 and the image sensor 112 through communication cables.
  • the vehicle-mounted terminal can obtain the point cloud data collected by the radar sensor 111 and the image data collected by the image sensor 112 through the communication cable.
  • the vehicle-mounted terminal can fuse image data and point cloud data in real time, decide the driving strategy of the vehicle based on the fusion results, and send control signals to the power system of the autonomous vehicle 110 according to the driving strategy to achieve autonomous driving.
  • the radar sensor 111 may be, for example, a laser radar, a millimeter wave radar, or other sensor with a mechanical rotation function.
  • the radar sensor 111 may be, for example, installed on the roof of an autonomous vehicle.
  • the radar sensor 111 can rotate 360° under the control of pulse signals to collect all-round point cloud data around the autonomous vehicle.
  • the radar sensor can obtain the position point information of objects in the environment around the autonomous vehicle by emitting and receiving laser beams, and performs three-dimensional modeling based on the position point information to obtain point cloud data.
  • The image sensor 112 may be, for example, any one of the following cameras: a front-view camera installed on the autonomous vehicle 110, a surround-view camera installed in any direction of the autonomous vehicle 110, a rear-view camera, a side-view camera, etc.
  • the autonomous vehicle 110 may also be integrated with a positioning device.
  • The positioning device may be composed of a Global Positioning System (GPS) and a Geographic Information System (GIS), so as to realize tracking and positioning of the autonomous vehicle.
  • the vehicle-mounted terminal can, for example, use the positioning device time of the positioning device as the reference time to control the timing of data collection by the radar sensor and the image sensor, so that the radar sensor and the image sensor collect data synchronously.
  • the vehicle-mounted terminal may, for example, control the triggering of the image acquisition device based on the time information of the radar sensor collecting point cloud data, so that the radar sensor and the image sensor collect data synchronously.
  • the autonomous vehicle 110 may also be integrated with a controller that controls the triggering of the image sensor 112 , and the controller may be an artificial intelligence chip, for example.
  • the vehicle-mounted terminal can be communicatively connected with the controller to send delay information determined based on the time information of the radar sensor collecting point cloud data to the controller, so that the controller controls the triggering of the image sensor based on the delay information.
  • the vehicle-mounted terminal can also communicate with the cloud system 130 through a wireless communication link.
  • the cloud system 130 can monitor the driving of the autonomous vehicle based on the data uploaded by the vehicle terminal.
  • For example, the cloud system 130 can obtain, from the vehicle-mounted terminal, the point cloud data and image data collected by the autonomous vehicle within a predetermined period, and evaluate the synchronization of the point cloud data and image data collected by the autonomous vehicle based on the fusion result of the point cloud data and the image data, so as to provide a reference for adjusting the synchronous collection strategy of point cloud data and image data.
  • the method for synchronously collecting data provided by the present disclosure can be executed by the autonomous vehicle 110 .
  • some operations can be executed by the vehicle-mounted terminal in the autonomous vehicle 110 and some operations can be executed by the controller.
  • the device for synchronously collecting data provided by the present disclosure can be provided in the autonomous vehicle 110.
  • some modules can be provided in the vehicle-mounted terminal and some modules can be provided in the controller.
  • the synchronization determination method provided by the present disclosure can be executed by the cloud system 130 .
  • the synchronization determining device provided by the present disclosure can be provided in the cloud system 130 .
  • the structures and types of the autonomous vehicle 110, radar sensor 111, image sensor 112 and cloud system 130 in Figure 1 are only schematic. Depending on implementation requirements, the autonomous vehicle 110, radar sensor 111, image sensor 112 and cloud system 130 may have any structure and type.
  • Figure 2 is a schematic flowchart of a method for synchronously collecting data according to an embodiment of the present disclosure.
  • the method 200 for synchronously collecting data in this embodiment may include operations S210 to S230.
  • the method 200 may be executed, for example, by a vehicle-mounted terminal in an autonomous vehicle.
  • delay information for the image sensor is determined according to the first time information.
  • delay information is sent to the controller.
  • the vehicle-mounted terminal may receive a data packet sent by a radar sensor, where the data packet includes point cloud data collected by the radar sensor.
  • The vehicle-mounted terminal can use the timestamp of the data packet as the time information at which the radar sensor collects the point cloud data, where the timestamp can be added by the radar sensor when encapsulating the collected point cloud data to obtain the data packet.
  • the radar sensor may send data packets using UDP/IP protocol.
  • The data packet includes an Ethernet header and User Datagram Protocol data (UDP data).
  • The UDP data may include ranging data and additional information.
  • The ranging data consists of multiple data blocks, and each data block includes an azimuth angle value, a distance value, etc. The additional information may include the speed of the motor that drives the radar sensor to rotate, as well as time data.
  • the vehicle-mounted terminal can parse the received data packet and determine whether a data packet for a predetermined angle is received based on the parsing result. For example, a data packet whose azimuth angle value is a predetermined angle is a data packet for the predetermined angle.
  • This embodiment can obtain the first time information of the radar sensor collecting point cloud data at a predetermined angle based on the time data in the data packet for the predetermined angle.
  • the predetermined angle is an angle within the viewing angle range of the image sensor.
  • the predetermined angle can be determined based on the angle between the main optical axis of the image sensor and the horizontal plane.
  • the predetermined angle can be the value of the included angle.
  • the angle at which the radar sensor collects point cloud data can be represented by the angle between the laser beam emitted by the radar sensor when collecting point cloud data and the horizontal plane.
  • the time data in the data packet can include standard time (such as Universal Time Coordinated, UTC) and the periodic encapsulation time of the data packet.
  • The value range of the periodic encapsulation time of the data packet can be [0 µs, 1 s].
  • Standard time can represent the year, month, day, hour, minute and second of the collected data.
  • the standard time and the periodic encapsulation time can be added to obtain the time information of the ranging data in the collected data packet.
  • the data packet whose azimuth angle value is a predetermined angle can be used as the target data packet, and the time information for collecting the ranging data in the target data packet can be used as the time information determined in operation S210. It can be understood that the combination of ranging data and azimuth angle values can represent point cloud data.
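  • For illustration only, the following Python sketch shows how such a target packet and its timestamp might be identified from already-parsed fields; the field names (utc_seconds, cycle_time_us, azimuth_centideg) are assumptions, not the names used by any particular radar sensor.

```python
# Hedged sketch: derive per-packet time information from hypothetical parsed fields.
# utc_seconds is the standard (UTC) time in whole seconds; cycle_time_us is the
# periodic encapsulation time in microseconds, assumed to lie in [0, 1 s).

def packet_timestamp(utc_seconds: int, cycle_time_us: int) -> float:
    """Sum of the standard time and the periodic encapsulation time."""
    return utc_seconds + cycle_time_us / 1_000_000.0

def is_target_packet(azimuth_centideg: int, predetermined_angle_deg: float) -> bool:
    """A packet whose azimuth equals the predetermined angle is the target packet.
    The azimuth is assumed to be reported in units of 0.01 degrees."""
    return azimuth_centideg == round(predetermined_angle_deg * 100)

# Example: a packet captured 345678 microseconds into the current second.
print(packet_timestamp(utc_seconds=1_700_000_000, cycle_time_us=345_678))       # 1700000000.345678
print(is_target_packet(azimuth_centideg=9000, predetermined_angle_deg=90.0))    # True
```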
  • Alternatively, this embodiment can also calculate the time information at which the radar sensor collects point cloud data at the predetermined angle based on the time data in the data packet whose azimuth angle value is 0°. For example, if the time information determined based on the time data in the data packet whose azimuth angle value is 0° is T0, the predetermined angle is angle, and the rotation period of the radar sensor is LIDAR_ROT_INTERVAL, the following formula (1) can be used to calculate the time information T' at which the radar sensor rotated to the predetermined angle collects point cloud data:
  • T' = T0 + angle * LIDAR_ROT_INTERVAL / 360    (1)
  • the delay information for the image sensor can be determined based on the difference between the time information T and the time information T'.
  • After the difference between the time information T and the time information T' is obtained, the difference can also be taken modulo the time interval at which the image sensor collects images, and the remainder can be used as the delay information.
  • the value of the delay information can be limited to the time interval during which the image sensor collects images.
  • the adjustment amount of the acquisition time can be reduced.
  • The time interval at which the image sensor collects images may, for example, be positively correlated with the reciprocal of the acquisition frame rate of the image sensor.
  • the unit of the acquisition frame rate may be, for example, fps, that is, the acquisition frame rate is the number of image data collected by the image sensor per second.
  • The reason the value of the delay information can be limited to within the time interval at which the image sensor collects images is that the image sensor collects image data periodically according to the acquisition frame rate; for synchronous data collection, the image data collected by the image sensor only needs to include data collected simultaneously with the point cloud data collected by the radar sensor rotated to the predetermined angle.
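  • As a minimal sketch of formula (1) and the remainder step described above, assuming times in seconds and a fixed camera frame rate (all names are illustrative, not the patent's reference implementation):

```python
def time_at_angle(t0: float, angle_deg: float, lidar_rot_interval: float) -> float:
    """Formula (1): time T' at which the radar reaches angle_deg, given the time
    t0 of the 0-degree packet and the rotation period LIDAR_ROT_INTERVAL."""
    return t0 + angle_deg * lidar_rot_interval / 360.0

def delay_information(t_measured: float, t_expected: float, frame_rate_fps: float) -> float:
    """Take the difference T - T' modulo the image acquisition interval, so the
    delay never exceeds one camera frame period."""
    frame_interval = 1.0 / frame_rate_fps
    return (t_measured - t_expected) % frame_interval

t_expected = time_at_angle(t0=10.0, angle_deg=90.0, lidar_rot_interval=0.1)      # 10.025 s
print(delay_information(t_measured=10.043, t_expected=t_expected, frame_rate_fps=30.0))
# 0.018 s, already below the 33.3 ms frame interval
```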
  • The delay information can be sent to the controller that controls the triggering timing of the image sensor, so that the controller controls the triggering timing of the image sensor, that is, the acquisition time of the image sensor, such that the image sensor collects data synchronously with the radar sensor rotated to the predetermined angle. That is, at the moment the radar sensor rotated to the predetermined angle collects point cloud data, the image sensor collects image data synchronously.
  • This embodiment determines the delay information based on the time information at which the radar sensor collects point cloud data and sends the delay information to the controller, so that the time at which the image sensor collects image data is aligned with the time at which the radar sensor collects point cloud data within the viewing angle range of the image sensor. The collected image data and point cloud data therefore express the environmental information at the same moment, which improves the alignment accuracy of the collected image data and point cloud data, helps the autonomous vehicle make correct driving strategies, and improves the driving safety of the autonomous vehicle.
  • The technical solution of this embodiment can improve the synchronization of the data collected by the two sensors and ensure that the image data acquired by the image sensor includes data aligned with the point cloud data collected by the radar sensor.
  • the collection angle data and time data in the data packet can be obtained by parsing the data packet from the radar sensor.
  • For example, the data packet from the lidar can be parsed, and the value of the parsed Azimuth Angle parameter can be used as the collection angle data, that is, as the azimuth angle value.
  • the parsed UTC parameter value and the GPS Timestamp parameter value can be used as time data.
  • This embodiment can use any parsing tool that parses UDP packets to parse the data packets from the radar sensor.
  • the above-mentioned radar sensor models and analysis tools are only used as examples to facilitate understanding of the present disclosure, and the present disclosure does not limit this.
  • the values of the corresponding parameters obtained by analysis can be used as the collection angle data and time data.
  • corresponding parsing tools can be used to parse the data packets.
  • The reference time described above may, for example, be aligned with the positioning device time of the positioning device in the vehicle where the radar sensor is located.
  • the positioning device can send a recommended positioning information (Recommended Minimum Specific GPS/TRANSIT Data, GPRMC) data packet to the radar sensor.
  • the radar sensor can use the UTC time in the GPRMC packet as the initial time.
  • the sum of the time difference between the time when the point cloud data is collected and the time when the GPRMC data packet is received and the initial time can be used as the reference time.
  • The UTC time in the GPRMC data packet is usually in seconds, while the time elapsed since the GPRMC data packet was received provides sub-second precision. In this way, the data packets of point cloud data collected by the radar sensor can carry high-precision timestamps, which is conducive to achieving high-precision control of the image sensor.
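  • A hedged sketch of this reference-time computation: the whole-second UTC time from the GPRMC packet plus the time elapsed since that packet was received (names and units are assumptions):

```python
def reference_time(gprmc_utc_seconds: int, elapsed_since_gprmc_s: float) -> float:
    """Initial UTC time (whole seconds from the GPRMC packet) plus the time elapsed
    between receiving the GPRMC packet and collecting the point cloud data."""
    return gprmc_utc_seconds + elapsed_since_gprmc_s

print(reference_time(1_700_000_000, 0.437512))  # timestamp with sub-second precision
```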
  • FIG. 3 is a schematic diagram of the principle of determining delay information for an image sensor according to an embodiment of the present disclosure.
  • the time information of the radar sensor when the rotation angle is 0° can be calculated based on the time information of point cloud data collected at a predetermined angle. And based on the calculated time information when the rotation angle is 0° and the rotation period of the radar sensor, the rotation deviation value of the radar sensor is determined. Delay information is then determined based on this rotational deviation value.
  • Specifically, the second time information 304, at which the radar sensor collects point cloud data at 0°, can be calculated based on the first time information 301 of the point cloud data collected by the radar sensor at the predetermined angle, the predetermined angle 302, and the rotation period 303 of the radar sensor, for example as T0' = T - angle * LIDAR_ROT_INTERVAL / 360 (formula (2)), where T denotes the first time information 301.
  • It should be noted that if the minimum angle resolution of the radar sensor is 0.01° and the Azimuth Angle in the data packet is measured in units of 0.01°, then the predetermined angle in this embodiment should also be measured in units of 0.01°. In this case, 360 in formula (2) should be replaced by 36000.
  • Subsequently, the second time information 304 can be taken modulo the rotation period 303, and the rotation deviation value 305 of the radar sensor can be determined based on the remainder.
  • the initial angle of the radar sensor is 0°
  • the starting time for the radar sensor to collect point cloud data is usually in seconds
  • The rotation period of the radar sensor is usually on the order of milliseconds. If there is no deviation in the rotation of the radar sensor, the time information at which the radar sensor collects point cloud data at an angle of 0° should be an integer multiple of the rotation period. Therefore, the remainder obtained from the modulo operation can be used to represent the rotation deviation value. It can be understood that the difference between T and T' mentioned above can be understood as the rotation deviation value determined in this embodiment.
  • this embodiment may use the rotational deviation value 305 as the delay information of the image sensor.
  • Alternatively, a method similar to that described above can be used, in which the rotation deviation value 305 is taken modulo the time interval at which the image sensor collects images, and the remainder is used as the delay information.
  • Alternatively, the delay information may also be determined based on the sum of the rotation deviation value 305 and the rotation period 303. In this way, it can be avoided that, when the first time information T is small, the T0' obtained by the above formula (2) is a negative number and cannot be taken modulo the time interval at which the image sensor collects images.
  • For example, the sum of the rotation deviation value 305 and the rotation period 303 may be taken modulo the time interval at which the image sensor collects images, and the remainder may be used as the delay information 306 of the image sensor.
  • a predetermined error value 307 determined based on the optical center position of the image sensor and the position of the target object within the viewing angle range of the image sensor may also be considered.
  • The target object may be, for example, the ground, and the predetermined error value may be determined based on the angle between the main optical axis of the image sensor and the straight line connecting the optical center of the image sensor and a point on the central axis of the vehicle at a predetermined distance from the vehicle.
  • the predetermined error value is a value greater than 0.
  • the predetermined error value 307 may be an empirical value. For example, the predetermined error value 307 may be 5 ms.
  • this embodiment may determine the delay information of the image sensor based on the sum of the predetermined error value 307 and the rotational deviation value 305 .
  • the accuracy of the determined delay information can be improved. This is because in the image data collected by the image sensor, the data that can reflect the objects on the road that affect the driving of the vehicle are the lower image pixels. Therefore, when aligning data, it is usually preferable to focus on the lower image pixels in the image data collected by the image sensor.
  • the angle at which the radar sensor can collect objects on the road usually deviates from the predetermined angle, and this deviation will cause a deviation in determining the acquisition time of the image sensor.
  • This embodiment can compensate for this deviation by setting a predetermined error value 307. In this way, controlling the time at which the image sensor collects image data based on the delay information determined in this embodiment can enable the image sensor and the radar sensor to synchronously collect objects on the road, thereby improving the synchronization accuracy of data collection.
  • this embodiment may also determine the delay information of the image sensor based on the sum of the predetermined error value 307, the rotation deviation value 305 and the rotation period 303.
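  • Combining the steps described for Figure 3, one possible reading of the delay computation is sketched below: the 0° time is estimated from the time at the predetermined angle, its remainder modulo the rotation period gives the rotation deviation, and the deviation plus the rotation period plus the predetermined error value is taken modulo the camera frame interval. This is an interpretation of the text, not the patent's code; the 5 ms error value is the example quoted above.

```python
def rotation_deviation(t_angle: float, angle_deg: float, rot_period: float) -> float:
    """Estimate the 0-degree time from the time at the predetermined angle,
    then take it modulo the rotation period to obtain the deviation."""
    t0 = t_angle - angle_deg * rot_period / 360.0
    return t0 % rot_period

def image_delay(t_angle: float, angle_deg: float, rot_period: float,
                frame_rate_fps: float, error_value: float = 0.005) -> float:
    """Delay information = (rotation deviation + rotation period + predetermined
    error value) modulo the image acquisition interval."""
    frame_interval = 1.0 / frame_rate_fps
    dev = rotation_deviation(t_angle, angle_deg, rot_period)
    return (dev + rot_period + error_value) % frame_interval

print(image_delay(t_angle=10.043, angle_deg=90.0, rot_period=0.1, frame_rate_fps=30.0))
```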
  • the controller that controls the image sensor can be, for example, an artificial intelligence chip, such as a field programmable gate array chip (Field Programmable Gate Array, FPGA), which is suitable for processing data based on delay information at a high sampling frequency.
  • In this way, the acquisition time of the image sensor, which has a high sampling frequency, can be adjusted, and the solution can be applied to scenarios where data is collected synchronously in real time. Accordingly, the synchronous data collection of this embodiment can be performed periodically.
  • After the controller receives the delay information, it can, for example, adjust the triggering time of the image sensor based on the delay information, so that the image sensor collects data synchronously with the radar sensor rotated to the predetermined angle.
  • the method of synchronously collecting data executed by the controller will be described in detail below with reference to Figures 4 to 5 .
  • Figure 4 is a schematic flowchart of a method for synchronously collecting data according to an embodiment of the present disclosure.
  • the method 400 of synchronously collecting data in this embodiment may include operations S410 to S430.
  • the method 400 can be executed by a controller, for example, it can be executed by an artificial intelligence chip (such as FPGA).
  • a trigger signal is sent to the image sensor.
  • time information is added to the image data.
  • the system time of the controller may, for example, be aligned with the positioning device time of the positioning device in the vehicle where the controller is located.
  • the positioning device can send a GPRMC data packet to the controller.
  • the controller can use the UTC time in the GPRMC packet as the initial time.
  • the controller can control the image sensor to collect image data at an acquisition frame rate of 30fps starting from the initial time.
  • the delay information is sent by the terminal device to the controller through operation S230 described above.
  • the controller can adjust the time at which the image sensor collects image data after the current time according to the delay information.
  • the adjusted trigger time can be obtained by adding the trigger time after the current time and the delay duration indicated by the delay information.
  • the time interval between two adjacent triggering moments of the image sensor can be considered.
  • For example, this embodiment can use the remainder obtained by taking the delay duration indicated by the delay information modulo the time interval as the adjustment amount, and add the adjustment amount to the trigger time after the current time to obtain the adjusted trigger time.
  • the degree of adjustment of the trigger time can be reduced while ensuring synchronous data collection, and frame loss caused by adjusting the trigger time can be avoided.
  • the controller can control the trigger of the image sensor according to the adjusted trigger time, so that the image sensor collects data synchronously with the radar sensor rotated to a predetermined angle.
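  • A minimal sketch of this controller-side adjustment, assuming the controller knows the next scheduled trigger time and the camera frame rate (names are illustrative):

```python
def adjusted_trigger(next_trigger: float, delay: float, frame_rate_fps: float) -> float:
    """Add only the remainder of the delay modulo the frame interval, so the
    trigger moment moves as little as possible and no frame is dropped."""
    frame_interval = 1.0 / frame_rate_fps
    return next_trigger + (delay % frame_interval)

# The camera was due to trigger at 0.90 s; a 20 ms delay shifts it to 0.92 s.
print(adjusted_trigger(next_trigger=0.90, delay=0.02, frame_rate_fps=30.0))  # 0.92
```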
  • the trigger signal sent to the image sensor is any type of trigger signal that the image sensor can recognize and receive, and the present disclosure does not limit this.
  • the image sensor is triggered in response to the trigger signal and begins collecting image data. After collecting the image data, the image sensor can send the collected image data to the controller. After receiving the image data, the controller can add time information to the image data according to the time information of the current moment.
  • the point cloud data and image data collected by the radar sensor in the present disclosure all have time information.
  • the vehicle-mounted terminal can timely align the two data based on the time information of the two data, and fuse the two aligned data to reconstruct the environmental information.
  • the data packets for the predetermined angle and the determined delay information received by the vehicle-mounted controller are both periodic.
  • According to this periodicity, the controller periodically uses the delay information to adjust the triggering moment of the image sensor, which can avoid inaccurate data alignment caused by the rotation deviation of the radar sensor during vehicle operation. This helps improve the driving safety of the autonomous vehicle.
  • FIG. 5 is a schematic flowchart of adjusting the triggering time of an image sensor according to an embodiment of the present disclosure.
  • When adjusting the triggering time of the image sensor, if the delay duration indicated by the delay information is long, the triggering time can be adjusted step by step, so as to prevent the images collected by the image sensor from losing frames relative to the radar sensor because of a single large adjustment, which would affect the autonomous driving of the vehicle.
  • the number of times to adjust the trigger moment and the step size for each adjustment may be determined based on the acquisition frame rate and delay duration of the image sensor. Then adjust the triggering time of the image sensor according to the number of times and step size.
  • For example, the time interval at which the image sensor collects image data can first be determined based on the acquisition frame rate; the time interval can be the reciprocal of the acquisition frame rate. Then, the delay duration is divided by the time interval and rounded up, and the rounded-up value is used as the number of times the trigger moment is adjusted. Let the number of times be n; then for the 1st to (n-1)th adjustments, the adjustment step is the value of the time interval, and for the nth adjustment, the adjustment step is the remainder of the delay duration divided by the time interval. Alternatively, for the first adjustment, the step is the remainder of the delay duration divided by the time interval, and for the subsequent adjustments, the step is the value of the time interval.
  • the delay duration can be equally divided into n parts, and the step size of each adjustment in n adjustments is the length of one part of the delay duration.
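  • One of the stepping strategies described above can be sketched as follows: the number of adjustments is the delay duration divided by the frame interval rounded up, with the full interval applied in all but the last step and the remainder applied last (times in seconds; a sketch, not the reference implementation):

```python
import math

def adjustment_plan(delay: float, frame_rate_fps: float) -> list:
    """Split a long delay into per-cycle steps no larger than one frame interval."""
    if delay <= 0:
        return []
    interval = 1.0 / frame_rate_fps
    n = math.ceil(delay / interval)          # number of adjustments
    remainder = delay - (n - 1) * interval   # what is left for the final step
    return [interval] * (n - 1) + [remainder]

print(adjustment_plan(delay=0.08, frame_rate_fps=30.0))
# three steps: [0.0333..., 0.0333..., 0.0133...]
```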
  • the value interval 502 of the adjustment step size at the triggering moment can be determined first based on the acquisition frame rate 501 of the image sensor. Then, the number of adjustments 503 and the step size of each adjustment 504 are determined based on the value interval 502 and the delay length.
  • the collection frame rate 501 may be negatively correlated with the length of the value interval 502 .
  • the length of the value interval 502 may be proportional to the length of the time interval for collecting image data.
  • the upper limit of the value in the value interval 502 can be set to the value of the time interval, or the upper limit of the value can be set to 0.5 times the time interval, etc. This disclosure does not limit this.
  • this embodiment may divide the delay duration by the value of the value interval 502, round up, and use the value obtained by rounding up as the number of adjustments.
  • In some embodiments, the controller may, for example, maintain the delay information received in the past. After receiving the latest delay information, the controller may first find, from the maintained delay information, the delay information whose reception time is adjacent to that of the latest delay information, and then determine whether the latest delay information has changed relative to the adjacent delay information. If a change has occurred, the triggering time of the image sensor is adjusted; specifically, the triggering time of the image sensor can be adjusted according to the change value of the delay information. In this way, iterative adjustment of the image sensor can be realized, the degree of adjustment of the triggering moment within a single adjustment cycle can be reduced, and the stability of the image data collected by the image sensor can be improved.
  • For example, this embodiment can calculate the difference between the delay duration indicated by the m-th delay information 505 and the delay duration indicated by the (m-1)-th delay information, and determine the number of adjustments 503 and the step size 504 of each adjustment based on this difference and the value interval 502.
  • In some embodiments, the lower limit of the value interval 502 determined based on the acquisition frame rate 501 may be, for example, a value less than 0.
  • the absolute value of the difference between the lower limit value and 0 may be smaller than the absolute value of the difference between the upper limit value of the value interval and 0. This is due to the limitation of the acquisition frame rate of the image sensor.
  • the difference between this lower limit value and 0 is inversely related to the acquisition frame rate.
  • For example, if the acquisition frame rate of the image sensor is 30 fps, the value interval can be set to [-200 µs, 32 ms]; if the acquisition frame rate is 15 fps, the value interval can be set to [-400 µs, 64 ms].
  • the controller may, for example, send a trigger signal every two frames. After receiving the trigger signal, the image sensor continuously collects three frames of image data. The differences between the collection time of the second frame of image data, the third frame of image data and the collection time of the first frame of image data are 33.3ms and 66.7ms respectively.
  • the controller may only adjust the sending timing of the trigger signal sent every two frames, thereby adjusting the triggering moment of the image sensor. If the number of adjustments is multiple, the controller can adjust the signal sending mechanism that sends the trigger signal every two frames to the signal sending mechanism that sends the trigger signal every frame. After completing the adjustment of the trigger time and sending the trigger signal to the image sensor according to the adjusted trigger time, the controller can adjust the signal sending mechanism back to the mechanism of sending the trigger signal every two frames.
  • It can be understood that this embodiment may adjust only the most recent p consecutive triggering moments.
  • the adjustment to the latest p+1th trigger moment can be determined based on the delay information received in the next cycle.
  • the above-mentioned controller and vehicle-mounted terminal can both be integrated into the automatic driving system of the vehicle.
  • the image sensor and the radar sensor can collect data synchronously by executing the synchronous data collection methods of the above embodiments.
  • Figure 6 is a schematic diagram of the principle of a method for synchronously collecting data according to an embodiment of the present disclosure.
  • the execution subject of the method 600 for synchronously collecting data is the automatic driving system of the automatic driving vehicle.
  • the automatic driving system includes a vehicle-mounted terminal 610 and a controller 620.
  • the controller 620 may specifically be an FPGA chip.
  • the autonomous vehicle may also be provided with a radar sensor 601 and an image sensor 602.
  • The radar sensor 601 may encapsulate the point cloud data collected at each angle into a data packet for that angle, and send the data packets for the respective angles to the vehicle-mounted terminal 610.
  • the vehicle-mounted terminal 610 can analyze the received data packet and determine the angle targeted by the data packet. And after it is determined that the data packet for the predetermined angle is received, the first time information for the radar sensor to collect point cloud data at the predetermined angle is determined. The predetermined angle is within the viewing angle range of the image sensor. It can be understood that the implementation manner of determining the first time information is similar to the implementation manner of operation S210 described above, and will not be described again here.
  • the vehicle-mounted terminal 610 can determine the delay information for the image sensor 602 based on the first time information, and send the delay information to the FPGA 620. It can be understood that the implementation of determining the delay information may be similar to the implementation of operation S220 described above, and will not be described again here.
  • the FPGA chip 620 can adjust the triggering time of the image sensor 602 according to the delay information, so that the image sensor 602 and the radar sensor 601 can collect data synchronously.
  • the implementation of adjusting the triggering time is similar to the implementation of operation S410 described above.
  • the FPGA chip 620 may send a trigger signal to the image sensor 602 when the adjusted trigger time is reached. Image sensor 602 may begin collecting image data in response to the trigger signal.
  • the image sensor 602 may send the collected image data to the FPGA chip 620, so that the FPGA chip 620 adds second time information to the received image data. Specifically, a timestamp can be added to the image data to obtain image data 631 with a timestamp.
  • In this way, the obtained series of image data 631 includes image data that is aligned with the point cloud data subsequently collected by the radar sensor 601.
  • the image data and the point cloud data can be aligned according to the second time information of the image data and the time information of the point cloud data subsequently collected by the radar sensor.
  • point cloud data and image data with the same time information can be used as an aligned pair of data, or point cloud data and image data with time information that differs less than a threshold can be used as an aligned pair of data.
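  • A hedged sketch of this timestamp-based pairing: each timestamped point cloud packet is paired with the image whose timestamp is closest, and the pair is kept only if the difference is below a threshold (the 3 ms value is illustrative):

```python
def align(pointcloud_times, image_times, threshold=0.003):
    """Return (point cloud time, image time) pairs whose timestamps differ by
    less than `threshold` seconds; packets without a close image are skipped."""
    pairs = []
    for pc_t in pointcloud_times:
        img_t = min(image_times, key=lambda t: abs(t - pc_t))
        if abs(img_t - pc_t) < threshold:
            pairs.append((pc_t, img_t))
    return pairs

print(align([0.92, 1.02], [0.90, 0.92, 0.9533, 1.02]))  # [(0.92, 0.92), (1.02, 1.02)]
```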
  • Figure 7 is a schematic diagram of the principle of synchronous data collection according to an embodiment of the present disclosure.
  • the reference time based on which the FPGA adds time information to image data and the reference time used by the radar sensor to collect point cloud data are both set to the GPS clock.
  • For example, the times at which the radar sensor collects point cloud data at 0° should be 0.1 s, ..., 0.9 s, 1 s, 1.1 s, ..., and the times at which the camera collects image data should be 33.3 ms, 66.7 ms, 0.1 s, ..., 0.8667 s, 0.9 s, 0.9333 s, ....
  • the time at which the radar sensor collects point cloud data is actually 0.92s, 1.02s,...2.01s.
  • the time when the camera collects image data is actually 0.89s, 0.9233s, ....
  • the trigger time of the camera can be adjusted from 0.89s to 0.92s, and the delay length based on the adjustment is t1, which is 30ms.
  • the trigger time of the camera is adjusted from 1.99s to 2.01s, and the delay length based on the adjustment is t2, which is 20ms.
  • the image data collected by the camera and the point cloud data collected by the radar are temporally aligned.
  • the present disclosure also provides a synchronization determination method for evaluating the synchronization of an image sensor and a radar sensor.
  • the synchronization determination method will be described in detail below with reference to FIG. 8 .
  • Figure 8 is a schematic flowchart of a synchronization determination method according to an embodiment of the present disclosure.
  • the synchronization determination method 800 of this embodiment may include operations S810 to S840.
  • the synchronization determination method 800 can be executed by a cloud system.
  • the cloud system can obtain the data packet sequence and image data sequence collected by the autonomous vehicle within the unit time on a daily basis.
  • the cloud system may obtain the sequence of data packets and the sequence of image data via a wireless communication link, for example.
  • image data matching each data packet is determined based on the time information of each data packet and the time information of each image data in the image data sequence.
  • For example, the cloud system can parse each data packet in the data packet sequence and determine the first time information at which the radar sensor collects the point cloud data in the data packet. Then, the image data in the image data sequence whose added time information is the same as or close to the first time information is used as the image data matching the data packet. For example, if the difference between the second time information of the image data and the first time information of the point cloud data is within ±3 ms, it can be determined that the image data matches the point cloud data. A data packet for which no matching image data is determined can be discarded.
  • a synchronization relationship between each data packet and the matching image data is determined based on the difference in time information between each data packet and the matching image data and the difference threshold.
  • the difference threshold may be set based on experience, for example, the difference threshold may be set on the premise that it does not affect the safe driving of the vehicle.
  • the difference threshold may be 15 ms, 7 ms, etc., which is not limited in this disclosure.
  • Alternatively, the difference threshold may be determined based on the fusion result of historical image data and the point cloud data in a historical data packet. If the positional deviation between the pixels projected from the point cloud data into the image coordinate system in the fusion result and the corresponding pixels in the historical image data reaches a critical value (such as a predetermined deviation value), the difference between the time information of the historical image data and the time information of the historical data packet can be used as the difference threshold. In this way, the accuracy of the set difference threshold, and therefore of the evaluation of the synchronization relationship, can be improved.
  • If the difference in time information between a data packet and the matching image data is greater than the difference threshold, it can be determined that the synchronization relationship between the data packet and the matching image data is out of synchronization. If the difference is less than or equal to the difference threshold, it can be determined that the synchronization relationship between the data packet and the matching image data is synchronized.
  • the synchronization relationship between the radar sensor and the image sensor is determined according to the synchronization relationship between each of the plurality of data packets in the data packet sequence and the matching image data.
  • the synchronization relationship between each data packet and the matching image data in the data packet sequence can be counted. If the proportion of data packets synchronized with the matching image data is greater than or equal to the predetermined proportion threshold, it can be determined that the synchronization relationship between the radar sensor and the image sensor is synchronous, otherwise it is determined that the synchronization relationship is out of synchronization.
  • the predetermined ratio threshold may be, for example, a value less than 1 but close to 1 such as 0.8, which is not limited in this disclosure.
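  • The ratio-based decision described above can be sketched as follows; the 7 ms difference threshold and the 0.8 ratio threshold are example values only, and the function is an illustrative assumption rather than the disclosed implementation.

```python
def sensors_synchronized(matches, diff_threshold_s=0.007, ratio_threshold=0.8):
    """Decide the radar/image synchronization relationship.

    `matches` is a list of (packet_time_s, image_time_s) pairs for matched
    data packets and image data; the thresholds are example values only.
    Returns True when the proportion of in-sync pairs reaches the ratio
    threshold, mirroring the counting step described above.
    """
    if not matches:
        return False
    in_sync = sum(
        1 for pkt_t, img_t in matches if abs(pkt_t - img_t) <= diff_threshold_s
    )
    return in_sync / len(matches) >= ratio_threshold
```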
  • Embodiments of the present disclosure can improve the efficiency of determining the synchronization relationship by setting a difference threshold and using it as the basis for judging whether the data packets collected by the radar sensor and the image data collected by the image sensor are synchronized. Compared with the technical solution of fusing the image data with the point cloud data in the data packet and determining the synchronization relationship based on the fusion result, this can reduce manual effort, reduce the consumption of computing resources, and improve parallel processing capabilities.
  • a predetermined containerization tool may be used to fuse the point cloud data and historical image data in the historical data package.
  • By using the predetermined containerization tool, the fusion can be made compatible with the different platforms installed on different cloud systems, which is beneficial to improving the applicability of the synchronization determination method provided by the embodiments of the present disclosure.
  • the predetermined containerization tool can, for example, fuse the point cloud data and the image data based on the following principle: align and fuse the point cloud data and the image data according to the external parameters of the radar sensor and the image sensor.
  • the point cloud data can be projected into the camera coordinate system to obtain the pixels projected from the point cloud data into the camera coordinate system.
  • the pixels are superimposed on the corresponding pixels of the image data to obtain the fused data.
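  • A minimal sketch of this projection-and-overlay principle is given below, assuming a 4x4 radar-to-camera extrinsic matrix, a 3x3 camera intrinsic matrix, and an image stored as an H x W x 3 array; all names and conventions are illustrative assumptions, not the disclosed implementation.

```python
import numpy as np

def project_points_to_image(points_lidar, extrinsic, intrinsic):
    """Project lidar points into pixel coordinates.

    `points_lidar`: (N, 3) array in the radar frame; `extrinsic`: 4x4
    radar-to-camera transform; `intrinsic`: 3x3 camera matrix.
    """
    pts = np.hstack([np.asarray(points_lidar), np.ones((len(points_lidar), 1))])
    cam = (extrinsic @ pts.T)[:3]          # points in the camera frame
    in_front = cam[2] > 0                  # keep points in front of the camera
    uvw = intrinsic @ cam[:, in_front]
    return (uvw[:2] / uvw[2]).T            # (M, 2) pixel coordinates

def overlay(image, pixels, color=(0, 255, 0)):
    """Superimpose the projected pixels on the image to obtain fused data."""
    h, w = image.shape[:2]
    for u, v in pixels.astype(int):
        if 0 <= u < w and 0 <= v < h:
            image[v, u] = color
    return image
```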
  • when the synchronization relationship indicates out-of-synchronization, this embodiment may also send prompt information to the target object. For example, a reminder message can be sent to the target communication account.
  • the target communication account can be an email account or a mobile communication account, etc., and this disclosure does not limit this.
  • Through the prompt information, monitoring personnel can be informed of the out-of-synchronization between the image sensor and the radar sensor in time, so that relevant strategies can be adopted to solve the out-of-synchronization problem and reduce the risk of unsafe driving of the autonomous vehicle caused by the out-of-synchronization.
  • Based on the method for synchronously collecting data performed by a vehicle-mounted terminal provided by the present disclosure, the present disclosure also provides a vehicle-mounted terminal.
  • Figure 9 is a schematic diagram of a vehicle-mounted terminal according to an embodiment of the present disclosure.
  • this embodiment provides a vehicle-mounted terminal 900 , which can be integrated into any vehicle such as an autonomous vehicle.
  • the vehicle-mounted terminal 900 may, for example, be configured to perform the method of synchronously collecting data performed by the vehicle-mounted terminal described above.
  • the vehicle-mounted terminal 900 may be configured to determine first time information when the radar sensor collects point cloud data at a predetermined angle in response to receiving a data packet from the radar sensor for a predetermined angle.
  • the predetermined angle is within the viewing angle range of the image sensor in the vehicle equipped with the vehicle-mounted terminal, and the data package includes point cloud data collected by the radar sensor at the predetermined angle.
  • the implementation manner of determining the first time information in this embodiment is similar to the implementation manner of operation S210 described above, and will not be described again here.
  • the vehicle-mounted terminal 900 may also be configured to determine delay information for the image sensor based on the first time information, and send the delay information to the controller that controls the image sensor, so that the controller controls the image sensor to collect data synchronously with the radar sensor rotated to the predetermined angle.
  • the vehicle-mounted terminal 900 may, for example, use the principle described above with respect to FIG. 3 to determine the delay information, which is not limited in this disclosure. It can be understood that the structure of the vehicle-mounted terminal 900 in FIG. 9 is only schematic to facilitate understanding of the present disclosure, and the present disclosure does not limit it.
  • Based on the method for synchronously collecting data performed by the controller provided by the present disclosure, the present disclosure also provides a controller.
  • Figure 10 is a schematic diagram of a controller according to an embodiment of the present disclosure.
  • this embodiment provides a controller 1000 that can be integrated into any vehicle such as an autonomous vehicle, and the controller 1000 can also be communicatively connected with a vehicle-mounted terminal in the vehicle.
  • the controller 1000 may, for example, be configured to perform the method of synchronously collecting data performed by the controller described above.
  • the controller 1000 may be configured to, in response to receiving delay information for the image sensor, adjust the triggering moment of the image sensor according to the delay information.
  • the principle of adjusting the triggering time in this embodiment may be similar to the adjustment principle in operation S410 described above, and will not be described again here. It can be understood that the delay information may be sent by a vehicle-mounted terminal that is communicatively connected to the controller.
  • the controller 1000 may also be configured to send a trigger signal to the image sensor in the vehicle in response to reaching the trigger moment, and, in response to receiving the image data collected by the image sensor, to add time information to the image data so that the vehicle-mounted terminal can use the time information to align the image data with the point cloud data collected by the radar sensor.
  • the controller 1000 may, for example, use the principle described above with respect to FIG. 5 to determine the number of times and the step size for adjusting the triggering moment, and adjust the triggering moment according to the determined number of times and step size, which is not limited in this disclosure. It can be understood that the structure of the controller 1000 in FIG. 10 is only schematic to facilitate understanding of the present disclosure, and the present disclosure does not limit this.
  • the present disclosure also provides an autonomous driving vehicle.
  • Figure 11 is a schematic diagram of an autonomous vehicle according to an embodiment of the present disclosure.
  • this embodiment provides an autonomous vehicle 1100 in which a vehicle-mounted terminal, a controller, a radar sensor, and an image sensor can be integrated.
  • the vehicle-mounted terminal may be the vehicle-mounted terminal 900 described above, and the controller may be the controller 1000 described above.
  • the image sensor can periodically collect image data, and the radar sensor can periodically collect point cloud data, and encapsulate the point cloud data to obtain a data package.
  • the vehicle-mounted terminal in the autonomous vehicle may determine the first time information that the radar sensor collects point cloud data at a predetermined angle in response to receiving a data packet from the radar sensor for a predetermined angle. The predetermined angle is within the viewing angle range of the image sensor, and the data package includes point cloud data collected by the radar sensor at the predetermined angle.
  • the vehicle-mounted terminal can also determine the delay information for the image sensor based on the first time information, and send the determined delay information to the controller. It can be understood that the vehicle-mounted terminal determines the delay information in a manner similar to the implementation principle of the method 200 for synchronously collecting data described above, which will not be described again here.
  • the controller can adjust the trigger time of the image sensor according to the delay information, and when it is determined that the trigger time of the image sensor is reached, send a trigger signal to the image sensor.
  • the controller can also add second time information to the image data after receiving the image data collected by the image sensor, so that the vehicle-mounted terminal uses the second time information to align the image data with the point cloud data collected by the radar sensor.
  • the present disclosure also provides a cloud system for executing the synchronization determination method.
  • Figure 12 is a schematic diagram of a cloud system according to an embodiment of the present disclosure.
  • the present disclosure provides a cloud system 1200.
  • the cloud system 1200 can communicate with the autonomous vehicle provided by the present disclosure through a communication link, so as to obtain data collected by sensors of the autonomous vehicle or to receive data sent by the autonomous vehicle.
  • the cloud system 1200 may be configured to obtain the data packet sequence collected by the radar sensor and the image data sequence collected by the image sensor in the autonomous vehicle. After acquiring the data packet sequence and the image data sequence, the cloud system 1200 can determine, for each data packet in the data packet sequence, the image data matching the data packet based on the time information of the data packet and the time information of each image data in the image data sequence. Subsequently, the synchronization relationship between each data packet and the matching image data is determined based on the difference in time information between the data packet and the matching image data and the difference threshold. Finally, the cloud system 1200 can determine the synchronization relationship between the radar sensor and the image sensor based on the synchronization relationship between each of the multiple data packets in the data packet sequence and the matching image data.
  • the cloud system 1200 can be used to perform the synchronization determination method described above to determine the synchronization relationship between the radar sensor and the image sensor, which is not limited by the present disclosure.
  • Based on the method for synchronously collecting data performed by a vehicle-mounted terminal provided by the present disclosure, the present disclosure also provides a device for synchronously collecting data, which can be integrated into the vehicle-mounted terminal.
  • Figure 13 is a structural block diagram of a device for synchronously collecting data according to an embodiment of the present disclosure.
  • the device 1300 for synchronously collecting data in this embodiment can be integrated into a vehicle-mounted terminal.
  • the device 1300 for synchronously collecting data can include a time information determination module 1310, a delay information determination module 1320 and an information sending module 1330.
  • the time information determination module 1310 is configured to determine first time information that the radar sensor collects point cloud data at a predetermined angle in response to receiving a data packet from the radar sensor for a predetermined angle; wherein the predetermined angle is within the viewing angle range of the image sensor.
  • the data package includes point cloud data collected by the radar sensor at a predetermined angle.
  • the time information determination module 1310 may be configured to perform the above-described operation S210, which will not be described again here.
  • the delay information determining module 1320 is configured to determine delay information for the image sensor according to the first time information.
  • the delay information determination module 1320 may be configured to perform the above-described operation S220, which will not be described again here.
  • the information sending module 1330 is used to send delay information to the controller, so that the controller controls the image sensor to collect data synchronously with the radar sensor rotated to a predetermined angle.
  • the information sending module 1330 may be configured to perform the above-described operation S230, which will not be described again here.
  • the time information determination module 1310 may include a parsing sub-module and a first determination sub-module.
  • the parsing sub-module is used to parse the data packet and obtain the time data in the data packet; the time data includes the base time and the periodic encapsulation time.
  • the first determination sub-module is used to determine the first time information at which the radar sensor collects point cloud data at the predetermined angle based on the base time and the periodic encapsulation time. The base time is aligned with the time of the positioning device in the vehicle where the radar sensor is located.
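  • As a hedged sketch, if the parsed packet exposes the base time and the periodic encapsulation time as two fields, the first time information could simply be their sum; the field names below are hypothetical and the exact packet layout is sensor specific.

```python
def first_time_from_packet(packet):
    """Recover the first time information from a parsed data packet.

    Assumes the parsed packet carries a base time (aligned with the clock of
    the positioning device in the vehicle) and a periodic encapsulation time,
    i.e. the offset of the point cloud at the predetermined angle within the
    current encapsulation period; field names are illustrative only.
    """
    return packet["base_time_s"] + packet["periodic_encapsulation_time_s"]
```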
  • the delay information determination module 1320 may include a second determination sub-module, a deviation value determination sub-module and a delay determination sub-module.
  • the second determination sub-module is used to determine the second time information for the radar sensor to collect point cloud data at 0° based on the first time information, the rotation period of the radar sensor and the predetermined angle.
  • the deviation value determination sub-module is used to determine the rotation deviation value of the radar sensor based on the remainder obtained by taking the second time information modulo the rotation period.
  • the delay determination sub-module is used to determine delay information for the image sensor according to the rotation deviation value.
  • the delay determination sub-module may be specifically configured to determine the delay information of the image sensor based on the sum of the predetermined error value and the rotation deviation value.
  • the predetermined error value is greater than 0, and the predetermined error value is determined based on the optical center position of the image sensor and the position of the target object within the viewing angle range of the image sensor.
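  • The chain of sub-modules above can be summarized with the following sketch, which assumes the radar rotates at constant speed so that the 0° time can be extrapolated backwards from the time at the predetermined angle; the exact formulas, argument names, and default error value are illustrative assumptions rather than the disclosed computation.

```python
def delay_for_image_sensor(first_time_s, rotation_period_s,
                           predetermined_angle_deg, predetermined_error_s=0.0):
    """Sketch of the delay-information computation for the image sensor.

    The second time information (radar at 0 deg) is extrapolated from the
    first time information, the rotation period and the predetermined angle;
    the rotation deviation is that time taken modulo the rotation period, and
    the delay is the deviation plus a predetermined error accounting for the
    optical-centre / target geometry.
    """
    second_time_s = first_time_s - rotation_period_s * predetermined_angle_deg / 360.0
    rotation_deviation_s = second_time_s % rotation_period_s
    return rotation_deviation_s + predetermined_error_s
```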
  • the controller includes an artificial intelligence chip.
  • the present disclosure also provides a device for synchronously collecting data, which device can be integrated into the controller.
  • Figure 14 is a structural block diagram of a device for synchronously collecting data according to another embodiment of the present disclosure.
  • the device 1400 for synchronously collecting data in this embodiment may include a time adjustment module 1410, a signal sending module 1420 and a time adding module 1430.
  • the timing adjustment module 1410 is configured to respond to receiving delay information for the image sensor and adjust the triggering timing of the image sensor according to the delay information.
  • the delay information is determined based on the time information of the radar sensor collecting point cloud data at a predetermined angle, and the predetermined angle is within the viewing angle range of the image sensor.
  • the time adjustment module 1410 may be used to perform the above-described operation S410, which will not be described again here.
  • the signal sending module 1420 is configured to send a trigger signal to the image sensor in response to reaching the trigger moment.
  • the signal sending module 1420 may be configured to perform the above-described operation S420, which will not be described again here.
  • the time adding module 1430 is configured to add time information to the image data in response to receiving the image data collected by the image sensor, so as to use the time information to align the image data with the point cloud data collected by the radar sensor.
  • the time adding module 1430 may be used to perform the operation S430 described above, which will not be described again.
  • the delay information indicates a delay duration.
  • the above-mentioned time adjustment module 1410 may include an adjustment information determination sub-module and an adjustment sub-module.
  • the adjustment information determination submodule is used to determine the number of times the trigger moment is adjusted and the step size of each adjustment based on the acquisition frame rate of the image sensor and the delay duration.
  • the adjustment sub-module is used to adjust the triggering moment of the image sensor based on the number of times and step size.
  • the adjustment information determination sub-module may include an adjustment interval determination unit and an information determination unit.
  • the adjustment interval determination unit is used to determine the value interval of the step size for adjusting the trigger moment based on the acquisition frame rate of the image sensor.
  • the information determination unit is used to determine the number of times and step size based on the value interval and delay length. Among them, the acquisition frame rate is negatively related to the length of the value interval.
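  • A possible, non-authoritative reading of this step-planning logic is sketched below: the maximum step is taken as a fraction of the frame interval, so a higher acquisition frame rate yields a smaller permissible step (the value interval is negatively related to the frame rate); the fraction 0.1 is an illustrative assumption.

```python
import math

def plan_trigger_adjustment(delay_s, frame_rate_hz, max_step_fraction=0.1):
    """Split a delay duration into several small trigger-time adjustments.

    The maximum per-adjustment step is derived from the frame interval, so a
    higher acquisition frame rate gives a smaller step; the returned tuple is
    (number of adjustments, step size per adjustment).
    """
    frame_interval_s = 1.0 / frame_rate_hz
    max_step_s = frame_interval_s * max_step_fraction
    times = max(1, math.ceil(abs(delay_s) / max_step_s))
    step_s = delay_s / times
    return times, step_s
```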
  • the above-mentioned time adjustment module 1410 is specifically configured to, in response to receiving delay information that has changed relative to adjacent delay information among the received delay information, adjust the triggering moment of the image sensor according to the change value of the delay information.
  • the present disclosure also provides a device for synchronously collecting data, which device can be integrated into the autonomous driving vehicle.
  • Figure 15 is a structural block diagram of a device for synchronously collecting data according to another embodiment of the present disclosure.
  • the device 1500 for synchronously collecting data in this embodiment may include a time information determination module 1510, a delay information determination module 1520, a time adjustment module 1530, a signal sending module 1540 and a time adding module 1550.
  • the time information determination module 1510 is configured to determine first time information that the radar sensor collects point cloud data at a predetermined angle in response to receiving a data packet from the radar sensor for a predetermined angle; wherein the predetermined angle is within the viewing angle range of the image sensor.
  • the data package includes point cloud data collected by the radar sensor at predetermined angles.
  • the time information determination module 1510 may be used to perform the operation S210 described above, which will not be described again.
  • the delay information determination module 1520 is configured to determine delay information for the image sensor according to the first time information. In an embodiment, the delay information determination module 1520 may be configured to perform the operation S220 described above, which will not be described again.
  • the time adjustment module 1530 is used to adjust the triggering time of the image sensor according to the delay information, so that the image sensor collects data synchronously with the radar sensor rotated to a predetermined angle. In one embodiment, the time adjustment module 1530 may be used to perform the operation S410 described above, which will not be described again.
  • the signal sending module 1540 is configured to send a trigger signal to the image sensor in response to reaching the trigger moment.
  • the signal sending module 1540 may be used to perform the operation S420 described above, which will not be described again here.
  • the time adding module 1550 is configured to add second time information to the image data in response to receiving the image data collected by the image sensor, so as to utilize the second time information to align the image data with the point cloud data collected by the radar sensor.
  • the time adding module 1550 may be used to perform the above-described operation S430, which will not be described again here.
  • the present disclosure also provides a synchronization determination device, which can be integrated in a cloud system.
  • Figure 16 is a structural block diagram of a synchronization determination device according to an embodiment of the present disclosure.
  • the synchronization determination device 1600 of this embodiment may include a data acquisition module 1610, a data matching module 1620, a data relationship determination module 1630 and a synchronization relationship determination module 1640.
  • the data acquisition module 1610 is used to acquire the data packet sequence collected by the radar sensor and the image data sequence collected by the image sensor. In an embodiment, the data acquisition module 1610 may be used to perform the above-described operation S810, which will not be described again here.
  • the data matching module 1620 is configured to determine, for each data packet in the data packet sequence, image data matching each data packet based on the time information of each data packet and the time information of each image data in the image data sequence. In one embodiment, the data matching module 1620 may be used to perform the above-described operation S820, which will not be described again here.
  • the data relationship determination module 1630 is configured to determine a synchronization relationship between each data packet and the matching image data based on the difference in time information between each data packet and the matching image data and the difference threshold. In one embodiment, the data relationship determination module 1630 may be used to perform the above-described operation S830, which will not be described again here.
  • the synchronization relationship determination module 1640 is used to determine the synchronization relationship between the radar sensor and the image sensor based on the synchronization relationship between each of the multiple data packets in the data packet sequence and the matching image data. In an embodiment, the synchronization relationship determination module 1640 may be used to perform the above-described operation S840, which will not be described again here.
  • the above-mentioned synchronization determining device may further include a prompt information sending module, configured to send prompt information to the target object in response to the synchronization relationship between the radar sensor and the image sensor indicating out-of-synchronization.
  • the above-mentioned synchronization determination device may further include a threshold determination module for determining a difference threshold.
  • the threshold determination module may include a data fusion sub-module and a threshold determination sub-module.
  • the data fusion sub-module is used to obtain fusion data based on the historical data packets collected by the radar sensor and the historical image data matching the historical data packets; the fusion data represents the position deviation value between the pixels obtained by projecting the point cloud data in the historical data packets into the image coordinate system and the corresponding pixels in the historical image data.
  • the threshold determination sub-module is used to determine the difference in time information between the historical data packet and the historical image data as a difference threshold when the position deviation value reaches a predetermined deviation value.
  • the above-mentioned data fusion sub-module is specifically used to fuse the point cloud data in the historical data packets with the historical image data by using the predetermined containerization tool to obtain the fused data.
  • In the technical solutions of the present disclosure, the collection, storage, use, processing, transmission, provision, disclosure and application of the user personal information involved are all in compliance with relevant laws and regulations, necessary confidentiality measures have been taken, and public order and good customs are not violated.
  • the user's authorization or consent is obtained before obtaining or collecting the user's personal information.
  • the present disclosure also provides an electronic device, a readable storage medium, and a computer program product.
  • FIG. 17 shows a schematic block diagram of an example electronic device 1700 that can be used to implement the method of synchronously collecting data or the method of synchronization determination according to embodiments of the present disclosure.
  • Electronic devices are intended to refer to various forms of digital computers, such as laptop computers, desktop computers, workstations, personal digital assistants, servers, blade servers, mainframe computers, and other suitable computers.
  • Electronic devices may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices.
  • the components shown herein, their connections and relationships, and their functions are examples only and are not intended to limit implementations of the disclosure described and/or claimed herein.
  • the device 1700 includes a computing unit 1701 that can perform various appropriate actions and processes according to a computer program stored in a read-only memory (ROM) 1702 or loaded from a storage unit 1708 into a random access memory (RAM) 1703. The RAM 1703 can also store various programs and data required for the operation of the device 1700.
  • Computing unit 1701, ROM 1702 and RAM 1703 are connected to each other via bus 1704.
  • Input/output (I/O) interface 1705 is also connected to bus 1704.
  • Multiple components in the device 1700 are connected to the I/O interface 1705, including: an input unit 1706, such as a keyboard or a mouse; an output unit 1707, such as various types of displays and speakers; a storage unit 1708, such as a magnetic disk or an optical disk; and a communication unit 1709, such as a network card, a modem, or a wireless communication transceiver.
  • the communication unit 1709 allows the device 1700 to exchange information/data with other devices through computer networks such as the Internet and/or various telecommunications networks.
  • The computing unit 1701 may be any of various general and/or special purpose processing components having processing and computing capabilities. Some examples of the computing unit 1701 include, but are not limited to, a central processing unit (CPU), a graphics processing unit (GPU), various dedicated artificial intelligence (AI) computing chips, various computing units that run machine learning model algorithms, a digital signal processor (DSP), and any appropriate processor, controller, microcontroller, etc.
  • the computing unit 1701 performs various methods and processes described above, such as the method of synchronously collecting data or the method of synchronization determination.
  • the synchronization data collection method or the synchronization determination method may be implemented as a computer software program, which is tangibly included in a machine-readable medium, such as the storage unit 1708.
  • part or all of the computer program may be loaded and/or installed onto device 1700 via ROM 1702 and/or communication unit 1709.
  • When the computer program is loaded into the RAM 1703 and executed by the computing unit 1701, one or more steps of the above-described method of synchronously collecting data or the synchronization determination method may be performed.
  • the computing unit 1701 may be configured to perform the method of synchronously collecting data or the synchronization determination method in any other suitable manner (e.g., by means of firmware).
  • Various implementations of the systems and techniques described above may be implemented in digital electronic circuit systems, integrated circuit systems, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), application specific standard products (ASSPs), systems on chip (SOC), complex programmable logic devices (CPLD), computer hardware, firmware, software, and/or combinations thereof.
  • These various embodiments may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be a special purpose or general purpose programmable processor that can receive data and instructions from a storage system, at least one input device, and at least one output device, and transmit data and instructions to the storage system, the at least one input device, and the at least one output device.
  • Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general-purpose computer, special-purpose computer, or other programmable data processing device, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowcharts and/or block diagrams to be implemented.
  • the program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
  • a machine-readable medium may be a tangible medium that may contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • the machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium.
  • Machine-readable media may include, but are not limited to, electronic, magnetic, optical, electromagnetic, infrared, or semiconductor systems, apparatuses, or devices, or any suitable combination of the foregoing.
  • More specific examples of machine-readable storage media would include an electrical connection based on one or more wires, a portable computer disk, a hard disk, random access memory (RAM), read only memory (ROM), erasable programmable read only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
  • To provide interaction with a user, the systems and techniques described herein may be implemented on a computer having a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to the user, and a keyboard and pointing device (e.g., a mouse or a trackball) through which the user can provide input to the computer.
  • Other kinds of devices may also be used to provide interaction with the user; for example, the feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback), and input from the user may be received in any form, including acoustic input, voice input, or tactile input.
  • The systems and techniques described herein may be implemented in a computing system that includes back-end components (e.g., as a data server), or a computing system that includes middleware components (e.g., an application server), or a computing system that includes front-end components (e.g., a user's computer having a graphical user interface or web browser through which the user can interact with implementations of the systems and techniques described herein), or a computing system that includes any combination of such back-end components, middleware components, or front-end components.
  • the components of the system may be interconnected by any form or medium of digital data communication (e.g., a communications network). Examples of communication networks include: local area network (LAN), wide area network (WAN), and the Internet.
  • Computer systems may include clients and servers.
  • Clients and servers are generally remote from each other and typically interact over a communications network.
  • the relationship of client and server is created by computer programs running on corresponding computers and having a client-server relationship with each other.
  • The server can be a cloud server, also known as a cloud computing server or cloud host, which is a host product in the cloud computing service system that solves the defects of difficult management and weak business scalability existing in traditional physical host and VPS ("Virtual Private Server", or "VPS" for short) services.
  • the server can also be a distributed system server or a server combined with a blockchain.

Abstract

A method (200, 400, 600) and apparatus (1300, 1400, 1500) for synchronously collecting data, a synchronization determination method (800) and apparatus (1600), a vehicle (110, 1100), and an electronic device (1700), which relate to the field of artificial intelligence, and in particular to the technical field of autonomous driving, computer vision and cloud computing. The specific implementation solution of the method (200) for synchronously collecting data is: in response to receiving a data packet for a predetermined angle (302) from a radar sensor (111, 601), determining first time information (301) of when the radar sensor (111, 601) collects point cloud data at the predetermined angle (302) (S210), wherein the predetermined angle (302) is within an angle-of-view range of an image sensor (112, 602); according to the first time information (301), determining delay information (306) for the image sensor (112, 602) (S220); and sending the delay information (306) to a controller (1000) (S230), such that the controller (1000) controls the image sensor (112, 602) to synchronously collect data with the radar sensor (111, 601) which is rotated to the predetermined angle (302), wherein the data packet comprises the point cloud data collected by the radar sensor (111, 601) at the predetermined angle (302).

Description

Method for synchronously collecting data, synchronization determination method, apparatus, and autonomous vehicle
Technical Field
The present disclosure relates to the field of artificial intelligence, specifically to technical fields such as autonomous driving, computer vision, and cloud computing, and in particular to a method for synchronously collecting data, a synchronization determination method, an apparatus, a vehicle-mounted terminal, a controller, a vehicle, a cloud system, an electronic device, and a readable storage medium.
Background
With the development of computer technology and network technology, autonomous driving technology has developed rapidly. In autonomous driving technology, it is usually necessary to rely on image data collected by image sensors and point cloud data collected by radar sensors to perceive the environmental information around the vehicle, and to determine the autonomous driving strategy based on the perception results.
Summary of the Invention
The present disclosure aims to provide a method for synchronously collecting data, a synchronization determination method, an apparatus, a vehicle-mounted terminal, a controller, a vehicle, a cloud system, an electronic device, and a readable storage medium that are conducive to improving data alignment accuracy.
According to one aspect of the present disclosure, a method for synchronously collecting data is provided, including: in response to receiving a data packet from a radar sensor collected for a predetermined angle, determining first time information at which the radar sensor collects point cloud data at the predetermined angle, wherein the predetermined angle is within the viewing angle range of an image sensor; determining delay information for the image sensor according to the first time information; and sending the delay information to a controller so that the controller controls the image sensor to collect data synchronously with the radar sensor rotated to the predetermined angle, wherein the data packet includes the point cloud data collected by the radar sensor at the predetermined angle.
According to another aspect of the present disclosure, a method for synchronously collecting data is provided, including: in response to receiving delay information for an image sensor, adjusting the trigger moment of the image sensor according to the delay information; in response to reaching the trigger moment, sending a trigger signal to the image sensor; and in response to receiving the image data collected by the image sensor, adding time information to the image data so as to utilize the time information to align the image data with the point cloud data collected by a radar sensor, wherein the delay information is determined based on time information at which the radar sensor collects point cloud data at a predetermined angle, and the predetermined angle is within the viewing angle range of the image sensor.
According to another aspect of the present disclosure, a method for synchronously collecting data is provided, including: in response to receiving a data packet from a radar sensor for a predetermined angle, determining first time information at which the radar sensor collects point cloud data at the predetermined angle, wherein the predetermined angle is within the viewing angle range of an image sensor; determining delay information for the image sensor according to the first time information; adjusting the trigger moment of the image sensor according to the delay information, so that the image sensor collects data synchronously with the radar sensor rotated to the predetermined angle; in response to reaching the trigger moment, sending a trigger signal to the image sensor; and in response to receiving the image data collected by the image sensor, adding second time information to the image data so as to utilize the second time information to align the image data with the point cloud data collected by the radar sensor.
According to another aspect of the present disclosure, a synchronization determination method is provided, including: acquiring a data packet sequence collected by a radar sensor and an image data sequence collected by an image sensor; for each data packet in the data packet sequence, determining image data matching the data packet based on time information of the data packet and time information of each image data in the image data sequence; determining a synchronization relationship between each data packet and the matching image data based on the difference in time information between the data packet and the matching image data and a difference threshold; and determining a synchronization relationship between the radar sensor and the image sensor based on the synchronization relationships between the multiple data packets in the data packet sequence and the respective matching image data.
According to another aspect of the present disclosure, a vehicle-mounted terminal is provided, the vehicle-mounted terminal being configured to: in response to receiving a data packet from a radar sensor for a predetermined angle, determine first time information at which the radar sensor collects point cloud data at the predetermined angle, wherein the predetermined angle is within the viewing angle range of an image sensor; determine delay information for the image sensor according to the first time information; and send the delay information to a controller so that the controller controls the image sensor to collect data synchronously with the radar sensor rotated to the predetermined angle, wherein the data packet includes the point cloud data collected by the radar sensor at the predetermined angle.
According to another aspect of the present disclosure, a controller is provided, configured to: in response to receiving delay information for an image sensor, adjust the trigger moment of the image sensor according to the delay information; in response to reaching the trigger moment, send a trigger signal to the image sensor; and in response to receiving the image data collected by the image sensor, add time information to the image data so as to utilize the time information to align the image data with the point cloud data collected by a radar sensor, wherein the delay information is determined based on time information at which the radar sensor collects point cloud data at a predetermined angle, and the predetermined angle is within the viewing angle range of the image sensor.
According to another aspect of the present disclosure, an autonomous vehicle is provided, including a vehicle-mounted terminal, a controller, a radar sensor, and an image sensor, wherein the vehicle-mounted terminal is configured to: in response to receiving a data packet from the radar sensor for a predetermined angle, determine first time information at which the radar sensor collects point cloud data at the predetermined angle, wherein the predetermined angle is within the viewing angle range of the image sensor, and the data packet includes the point cloud data collected by the radar sensor at the predetermined angle; determine delay information for the image sensor according to the first time information; and send the delay information to the controller; and wherein the controller is configured to: in response to receiving the delay information, adjust the trigger moment of the image sensor according to the delay information; in response to reaching the trigger moment, send a trigger signal to the image sensor; and in response to receiving the image data collected by the image sensor, add second time information to the image data so as to utilize the second time information to align the image data with the point cloud data collected by the radar sensor.
According to another aspect of the present disclosure, a cloud system is provided, configured to: obtain a data packet sequence collected by a radar sensor and an image data sequence collected by an image sensor; for each data packet in the data packet sequence, determine image data matching the data packet based on time information of the data packet and time information of each image data in the image data sequence; determine a synchronization relationship between each data packet and the matching image data based on the difference in time information between the data packet and the matching image data and a difference threshold; and determine a synchronization relationship between the radar sensor and the image sensor based on the synchronization relationships between the multiple data packets in the data packet sequence and the respective matching image data.
According to another aspect of the present disclosure, a device for synchronously collecting data is provided, the device including: a time information determination module, configured to, in response to receiving a data packet from a radar sensor for a predetermined angle, determine first time information at which the radar sensor collects point cloud data at the predetermined angle, wherein the predetermined angle is within the viewing angle range of an image sensor; a delay information determination module, configured to determine delay information for the image sensor based on the first time information; and an information sending module, configured to send the delay information to a controller so that the controller controls the image sensor to collect data synchronously with the radar sensor rotated to the predetermined angle, wherein the data packet includes the point cloud data collected by the radar sensor at the predetermined angle.
According to another aspect of the present disclosure, a device for synchronously collecting data is provided, including: a time adjustment module, configured to, in response to receiving delay information for an image sensor, adjust the trigger moment of the image sensor according to the delay information; a signal sending module, configured to send a trigger signal to the image sensor in response to reaching the trigger moment; and a time adding module, configured to, in response to receiving the image data collected by the image sensor, add time information to the image data so as to utilize the time information to align the image data with the point cloud data collected by a radar sensor, wherein the delay information is determined based on time information at which the radar sensor collects point cloud data at a predetermined angle, and the predetermined angle is within the viewing angle range of the image sensor.
According to another aspect of the present disclosure, a device for synchronously collecting data is provided, including: a time information determination module, configured to, in response to receiving a data packet from a radar sensor for a predetermined angle, determine first time information at which the radar sensor collects point cloud data at the predetermined angle, wherein the predetermined angle is within the viewing angle range of an image sensor; a delay information determination module, configured to determine delay information for the image sensor based on the first time information; a time adjustment module, configured to adjust the trigger moment of the image sensor according to the delay information so that the image sensor collects data synchronously with the radar sensor rotated to the predetermined angle; a signal sending module, configured to send a trigger signal to the image sensor in response to reaching the trigger moment; and a time adding module, configured to, in response to receiving the image data collected by the image sensor, add second time information to the image data so as to utilize the second time information to align the image data with the point cloud data collected by the radar sensor.
According to another aspect of the present disclosure, a synchronization determination device is provided, including: a data acquisition module, configured to acquire a data packet sequence collected by a radar sensor and an image data sequence collected by an image sensor; a data matching module, configured to, for each data packet in the data packet sequence, determine image data matching the data packet based on time information of the data packet and time information of each image data in the image data sequence; a data relationship determination module, configured to determine a synchronization relationship between each data packet and the matching image data based on the difference in time information between the data packet and the matching image data and a difference threshold; and a synchronization relationship determination module, configured to determine a synchronization relationship between the radar sensor and the image sensor based on the synchronization relationships between the multiple data packets in the data packet sequence and the respective matching image data.
According to another aspect of the present disclosure, an electronic device is provided, including: at least one processor; and a memory communicatively connected to the at least one processor; wherein the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to execute the method for synchronously collecting data or the synchronization determination method provided by the present disclosure.
According to another aspect of the present disclosure, a non-transitory computer-readable storage medium storing computer instructions is provided, wherein the computer instructions are used to cause a computer to execute the method for synchronously collecting data or the synchronization determination method provided by the present disclosure.
According to another aspect of the present disclosure, a computer program product is provided, including a computer program/instructions that, when executed by a processor, implements the method for synchronously collecting data or the synchronization determination method provided by the present disclosure.
It should be understood that what is described in this section is not intended to identify key or important features of the embodiments of the present disclosure, nor is it intended to limit the scope of the present disclosure. Other features of the present disclosure will become readily understood from the following description.
Description of Drawings
The accompanying drawings are used to better understand the present solution and do not constitute a limitation of the present disclosure, in which:
Figure 1 is a schematic diagram of an application scenario of a method for synchronously collecting data, a synchronization determination method, and an apparatus according to an embodiment of the present disclosure;
Figure 2 is a schematic flowchart of a method for synchronously collecting data according to an embodiment of the present disclosure;
Figure 3 is a schematic diagram of the principle of determining delay information for an image sensor according to an embodiment of the present disclosure;
Figure 4 is a schematic flowchart of a method for synchronously collecting data according to another embodiment of the present disclosure;
Figure 5 is a schematic flowchart of adjusting the trigger moment of an image sensor according to an embodiment of the present disclosure;
Figure 6 is a schematic diagram of the principle of a method for synchronously collecting data according to an embodiment of the present disclosure;
Figure 7 is a schematic diagram of the principle of synchronously collecting data according to an embodiment of the present disclosure;
Figure 8 is a schematic flowchart of a synchronization determination method according to an embodiment of the present disclosure;
Figure 9 is a schematic diagram of a vehicle-mounted terminal according to an embodiment of the present disclosure;
Figure 10 is a schematic diagram of a controller according to an embodiment of the present disclosure;
Figure 11 is a schematic diagram of an autonomous vehicle according to an embodiment of the present disclosure;
Figure 12 is a schematic diagram of a cloud system according to an embodiment of the present disclosure;
Figure 13 is a structural block diagram of a device for synchronously collecting data according to an embodiment of the present disclosure;
Figure 14 is a structural block diagram of a device for synchronously collecting data according to another embodiment of the present disclosure;
Figure 15 is a structural block diagram of a device for synchronously collecting data according to another embodiment of the present disclosure;
Figure 16 is a structural block diagram of a synchronization determination device according to an embodiment of the present disclosure; and
Figure 17 is a block diagram of an electronic device used to implement the method for synchronously collecting data or the synchronization determination method according to an embodiment of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below with reference to the accompanying drawings, including various details of the embodiments of the present disclosure to facilitate understanding; they should be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the present disclosure. Likewise, descriptions of well-known functions and structures are omitted from the following description for clarity and conciseness.
During autonomous driving, an autonomous vehicle needs to perceive rich environmental information around it and take timely safety measures when there are safety hazards in the surrounding environment. In order to perceive such rich environmental information, autonomous vehicles are usually equipped with a radar sensor and an image sensor. The image sensor collects image data to obtain rich texture and color information about the environment, and the radar sensor collects point cloud data to obtain distance information of objects in the environment. During autonomous driving, reconstructed three-dimensional environmental information can be obtained by fusing the image data and the point cloud data.
If the collection times of the image data and the point cloud data are not synchronized, the environmental information obtained by fusing the image data and the point cloud data will deviate from the actual environmental information. This deviation is amplified as the speed of the autonomous vehicle increases, which in turn affects the safe driving of the autonomous vehicle.
On this basis, the present disclosure provides a method and an apparatus for synchronously collecting data that improve the synchronization with which image data and point cloud data are collected, and further provides a synchronization determination method and apparatus for efficiently and accurately evaluating the synchronization of data collection.
The application scenario of the method and apparatus provided by the present disclosure will be described below with reference to Figure 1.
Figure 1 is a schematic diagram of an application scenario of the method for synchronously collecting data, the synchronization determination method, and the apparatuses according to an embodiment of the present disclosure.
As shown in Figure 1, the application scenario 100 of this embodiment may include an autonomous vehicle 110, a road 120, and a cloud system 130.
The autonomous vehicle 110 may be integrated with a vehicle-mounted terminal, a radar sensor 111, and an image sensor 112. The vehicle-mounted terminal may be communicatively connected to the radar sensor 111 and the image sensor 112 via communication cables. While the autonomous vehicle 110 is driving on the road 120, the vehicle-mounted terminal may obtain, via the communication cables, the point cloud data collected by the radar sensor 111 and the image data captured by the image sensor 112. The vehicle-mounted terminal may, for example, fuse the image data and the point cloud data in real time, decide the driving strategy of the vehicle based on the fusion result, and send control signals to the power system of the autonomous vehicle 110 according to the driving strategy to achieve autonomous driving.
The radar sensor 111 may be, for example, a lidar, a millimeter-wave radar, or another sensor with a mechanical rotation function, and may be installed, for example, on the roof of the autonomous vehicle. For example, the radar sensor 111 may rotate through 360° under the control of a pulse signal to collect omnidirectional point cloud data around the autonomous vehicle. Taking a lidar as an example, the radar sensor may obtain position point information of objects in the environment around the autonomous vehicle by emitting and receiving laser beams, and perform three-dimensional modeling based on that position point information to obtain the point cloud data.
The image sensor 112 may be, for example, any one of the following cameras: a front-view camera installed on the autonomous vehicle 110, any camera of a surround-view camera set installed in any orientation of the autonomous vehicle 110, a rear-view camera, a side-view camera, and the like.
In an embodiment, the autonomous vehicle 110 may, for example, also be integrated with a positioning device. The positioning device may be composed of a Global Positioning System (GPS) and a Geographic Information System (GIS) and is used to track and position the autonomous vehicle. In this embodiment, the vehicle-mounted terminal may, for example, use the positioning device time of the positioning device as the reference time to control the timing at which the radar sensor and the image sensor collect data, so that the radar sensor and the image sensor collect data synchronously.
In an embodiment, the vehicle-mounted terminal may, for example, control the triggering of the image acquisition device according to the time information at which the radar sensor collects point cloud data, so that the radar sensor and the image sensor collect data synchronously.
In an embodiment, the autonomous vehicle 110 may, for example, also be integrated with a controller that controls the triggering of the image sensor 112, and the controller may be, for example, an artificial intelligence chip. In this embodiment, the vehicle-mounted terminal may be communicatively connected with the controller to send the delay information, determined from the time information at which the radar sensor collects point cloud data, to the controller, so that the controller controls the triggering of the image sensor according to the delay information.
As shown in Figure 1, in the application scenario 100, the vehicle-mounted terminal may also be communicatively connected with the cloud system 130 through a wireless communication link. The cloud system 130 may monitor the driving of the autonomous vehicle based on the data uploaded by the vehicle-mounted terminal. For example, the cloud system 130 may obtain from the vehicle-mounted terminal the point cloud data and image data collected by the autonomous vehicle within a predetermined period, and, based on the fusion result of the point cloud data and the image data, evaluate the synchronization with which the autonomous vehicle collects the point cloud data and the image data, providing a reference for adjusting the synchronous collection strategy for the point cloud data and the image data.
It should be noted that the method for synchronously collecting data provided by the present disclosure may be executed by the autonomous vehicle 110; specifically, some operations may be executed by the vehicle-mounted terminal in the autonomous vehicle 110 and other operations may be executed by the controller. Accordingly, the apparatus for synchronously collecting data provided by the present disclosure may be provided in the autonomous vehicle 110; specifically, some modules may be provided in the vehicle-mounted terminal and other modules may be provided in the controller. The synchronization determination method provided by the present disclosure may be executed by the cloud system 130. Accordingly, the synchronization determination apparatus provided by the present disclosure may be provided in the cloud system 130.
It should be understood that the structures and types of the autonomous vehicle 110, the radar sensor 111, the image sensor 112, and the cloud system 130 in Figure 1 are merely schematic. Depending on implementation requirements, the autonomous vehicle 110, the radar sensor 111, the image sensor 112, and the cloud system 130 may have any structure and type.
The method for synchronously collecting data provided by the present disclosure will be described in detail below with reference to Figures 2 to 6.
Figure 2 is a schematic flowchart of a method for synchronously collecting data according to an embodiment of the present disclosure.
As shown in Figure 2, the method 200 for synchronously collecting data in this embodiment may include operations S210 to S230. The method 200 may be executed, for example, by a vehicle-mounted terminal in an autonomous vehicle.
In operation S210, in response to receiving a data packet for a predetermined angle from the radar sensor, first time information at which the radar sensor collects point cloud data at the predetermined angle is determined.
In operation S220, delay information for the image sensor is determined according to the first time information.
In operation S230, the delay information is sent to the controller.
According to embodiments of the present disclosure, the vehicle-mounted terminal may receive a data packet sent by the radar sensor, where the data packet includes the point cloud data collected by the radar sensor. The vehicle-mounted terminal may use the timestamp of the data packet as the time information at which the radar sensor collected the point cloud data. The timestamp may be added by the radar sensor when encapsulating the collected point cloud data into the data packet.
In an embodiment, the radar sensor may send data packets using the UDP/IP protocol. Such a data packet includes an Ethernet header and User Datagram Protocol data (UDP data). The UDP data may include ranging data and additional information, where the ranging data consists of multiple data blocks and each data block includes an azimuth angle value, a distance value, and so on. The additional information may include the rotational speed of the motor that drives the radar sensor and time data, among others.
In this embodiment, the vehicle-mounted terminal may parse the received data packets and determine, based on the parsing result, whether a data packet for the predetermined angle has been received. For example, a data packet whose azimuth angle value equals the predetermined angle is a data packet for the predetermined angle. This embodiment may obtain the first time information at which the radar sensor collects point cloud data at the predetermined angle from the time data in the data packet for the predetermined angle. The predetermined angle is an angle within the viewing angle range of the image sensor. For example, the predetermined angle may be determined from the angle between the main optical axis of the image sensor and the horizontal plane; for example, the predetermined angle may take the value of that angle. The angle at which the radar sensor collects point cloud data may be represented by the angle between the laser beam emitted by the radar sensor when collecting the point cloud data and the horizontal plane.
For example, the time data in the data packet may include a standard time (for example, Universal Time Coordinated, UTC) and a periodic encapsulation time of the data packet, where the value range of the periodic encapsulation time of the data packet may be [0 μs, 1 s]. The standard time may represent the year, month, day, hour, minute, and second at which the data was collected. In this embodiment, the standard time and the periodic encapsulation time may be added together to obtain the time information at which the ranging data in the data packet was collected. In this embodiment, a data packet whose azimuth angle value equals the predetermined angle may be taken as the target data packet, and the time information at which the ranging data in the target data packet was collected may be taken as the time information determined in operation S210. It can be understood that the combination of the ranging data and the azimuth angle values can represent the point cloud data.
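As an illustration of how the per-packet timestamp described above might be assembled, the following is a minimal Python sketch; it is not part of the patent, and the function name, field names, and the microsecond unit of the encapsulation time are assumptions.

```python
# Minimal sketch (an assumption, not the patent's reference implementation):
# combine the packet's standard (UTC) time with its periodic encapsulation time.

def packet_collection_time(utc_seconds: float, periodic_encapsulation_us: int) -> float:
    """Return the collection time of the ranging data in a packet, in seconds.

    utc_seconds: standard (UTC) time carried by the packet, second-level precision.
    periodic_encapsulation_us: periodic encapsulation time in [0 us, 1 s), in microseconds.
    """
    return utc_seconds + periodic_encapsulation_us / 1_000_000.0
```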
According to embodiments of the present disclosure, while obtaining the first time information T of the point cloud data collected at the predetermined angle, this embodiment may also estimate, from the time data in the data packet whose azimuth angle value is 0°, the time at which the radar sensor collects point cloud data at the predetermined angle. For example, if the time information determined from the time data in the data packet with an azimuth angle value of 0° is T0, the predetermined angle is angle, and the rotation period of the radar sensor is LIDAR_ROT_INTERVAL, then the time information T' at which the radar sensor collects point cloud data at the predetermined angle can be estimated using the following formula (1):
T' = T0 + angle * LIDAR_ROT_INTERVAL / 360    Formula (1)
The delay information for the image sensor can be determined based on the difference between the time information T and the time information T'.
In an embodiment, after the difference between the time information T and the time information T' is obtained, the difference may further be taken modulo the time interval at which the image sensor captures images, and the resulting remainder may be used as the delay information. In this way, the value of the delay information is limited to within the time interval at which the image sensor captures images, so that when the capture moment of the image sensor is controlled according to the delay information, the adjustment amount of the capture moment can be reduced. The time interval at which the image sensor captures images may, for example, be positively correlated with the reciprocal of the capture frame rate of the image sensor, where the unit of the capture frame rate may be, for example, fps, i.e., the capture frame rate is the number of image frames captured by the image sensor per second. The value of the delay information can be limited to within the image capture interval because the image sensor captures image data periodically at the capture frame rate; for data collection to be synchronized, it is sufficient that the image data captured by the image sensor include data captured at the same time as the point cloud data collected by the radar sensor rotated to the predetermined angle.
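The embodiment described above can be summarized in a short sketch. This is only an illustrative assumption of how formula (1), the difference T - T', and the remainder operation might be combined; the function and variable names are hypothetical and all quantities are in seconds.

```python
# Illustrative sketch: estimate T' from the 0-degree packet via formula (1),
# then fold the difference T - T' into one camera frame interval.

def delay_from_zero_degree_packet(t_at_angle: float,
                                  t_at_zero: float,
                                  angle_deg: float,
                                  lidar_rot_interval: float,
                                  frame_interval: float) -> float:
    # Formula (1): expected time of the predetermined angle, extrapolated from 0 deg.
    t_expected = t_at_zero + angle_deg * lidar_rot_interval / 360.0
    # Difference between the measured and the extrapolated time.
    diff = t_at_angle - t_expected
    # Limit the delay to within one image capture interval (remainder operation).
    return diff % frame_interval

# Example: 10 Hz lidar (0.1 s rotation period), 30 fps camera (1/30 s frame interval).
delay = delay_from_zero_degree_packet(t_at_angle=12.348, t_at_zero=12.320,
                                      angle_deg=90.0, lidar_rot_interval=0.1,
                                      frame_interval=1.0 / 30.0)
```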
After the delay information is obtained, it may be sent to the controller that controls the trigger timing of the image sensor, so that the controller controls the trigger timing of the image sensor, i.e., controls the moment at which the image sensor captures images, thereby enabling the image sensor to collect data synchronously with the radar sensor rotated to the predetermined angle. That is, at the moment when the radar sensor rotated to the predetermined angle collects point cloud data, the image sensor is synchronously capturing image data.
In this embodiment, by determining the delay information from the time information at which the radar sensor collects point cloud data and sending the delay information to the controller, the time at which the image sensor captures image data can be aligned with the time at which the radar sensor collects point cloud data within the viewing angle range of the image sensor. The collected image data and point cloud data can therefore express the environmental information of the same moment, which improves the alignment accuracy of the collected image data and point cloud data, helps the autonomous vehicle make correct driving decisions, and improves the driving safety of the autonomous vehicle.
It can be understood that, compared with the prior-art solution in which the image sensor and the radar sensor merely collect data based on the same reference time, the technical solution of this embodiment improves the synchronization of the data collected by the two sensors and ensures that the image data captured by the image sensor include data aligned with the point cloud data collected by the radar sensor.
It can be understood that the collection angle data and time data in a data packet can be obtained by parsing the data packet from the radar sensor. For example, for a Pandar 40P lidar, the data packet from the lidar is parsed, and the value of the parsed Azimuth Angle parameter is used as the collection angle data, i.e., as the azimuth angle value. The values of the parsed UTC parameter and GPS Timestamp parameter may be used as the time data. This embodiment may use any parsing tool for UDP data packets to parse the data packets from the radar sensor. It can be understood that the above radar sensor model and parsing tools are only examples to facilitate understanding of the present disclosure, and the present disclosure is not limited thereto. For example, for radar sensors of different models, the values of the corresponding parsed parameters may be used as the collection angle data and time data. For data packets transmitted using communication protocols other than UDP, corresponding parsing tools may be used to parse the data packets.
In an embodiment, the reference time described above may, for example, be aligned with the positioning device time of the positioning device in the vehicle where the radar sensor is located. Specifically, when the vehicle is started, the positioning device may send a Recommended Minimum Specific GPS/TRANSIT Data (GPRMC) packet to the radar sensor. The radar sensor may use the UTC time in the GPRMC packet as the initial time. Subsequently, while the radar sensor collects point cloud data, the sum of the initial time and the time difference between the moment at which the point cloud data is collected and the moment at which the GPRMC packet was received may be used as the reference time. The UTC time in the GPRMC packet is usually at second-level precision. In this way, the data packets of the point cloud data collected by the radar sensor carry high-precision timestamps, which facilitates high-precision control of the image sensor.
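A minimal sketch of this reference-time construction, assuming the GPRMC UTC second and the local receive and collection moments are available as plain floating-point seconds (names are hypothetical):

```python
# Sketch (an assumption about concrete names): the GPRMC UTC second is the
# initial time, and each point cloud timestamp is that initial time plus the
# time elapsed since the GPRMC packet was received.

def point_cloud_reference_time(gprmc_utc: float,
                               t_gprmc_received: float,
                               t_collect: float) -> float:
    return gprmc_utc + (t_collect - t_gprmc_received)
```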
Figure 3 is a schematic diagram of the principle of determining delay information for an image sensor according to an embodiment of the present disclosure.
According to embodiments of the present disclosure, when determining the delay information, the time information of the radar sensor at a rotation angle of 0° may, for example, be estimated from the time information of the point cloud data collected at the predetermined angle. The rotation deviation value of the radar sensor is then determined from the estimated time information at the 0° rotation angle and the rotation period of the radar sensor, and the delay information is then determined from that rotation deviation value.
Specifically, as shown in Figure 3, when determining the delay information in this embodiment 300, the second time information 304 at which the radar sensor collects point cloud data at 0° may first be estimated from the first time information 301 at which the radar sensor collects point cloud data at the predetermined angle, the predetermined angle 302, and the rotation period 303 of the radar sensor. For example, if the first time information is T, the predetermined angle is angle, and the rotation period of the radar sensor is LIDAR_ROT_INTERVAL, the second time information T0' can be calculated using the following formula (2):
T0' = T - angle * LIDAR_ROT_INTERVAL / 360    Formula (2)
In an embodiment, if the minimum angular resolution of the radar sensor is 0.01° and the Azimuth Angle in the data packet is measured in units of 0.01°, then the predetermined angle in this embodiment should accordingly also be measured in units of 0.01°. In this case, the 360 in formula (2) should be replaced by 36000.
In this embodiment, after the second time information 304 is obtained, the second time information 304 may be taken modulo the rotation period 303, and the rotation deviation value 305 of the radar sensor may be determined from the resulting remainder. This is because the initial angle of the radar sensor is 0°, the start time at which the radar sensor begins collecting point cloud data usually has the second as its smallest unit, and the rotation period of the radar sensor is usually on the order of milliseconds. If the rotation of the radar sensor had no deviation, the time at which the radar sensor collects point cloud data at the 0° angle would be an integer multiple of the rotation period. Therefore, the remainder of the modulo operation can be used to represent the rotation deviation value. It can be understood that the difference between T and T' mentioned above can be regarded as the rotation deviation value determined in this embodiment.
After the rotation deviation value 305 is obtained, this embodiment may use the rotation deviation value 305 as the delay information of the image sensor. Alternatively, in a manner similar to that described above, the rotation deviation value 305 may be taken modulo the time interval at which the image sensor captures images, and the resulting remainder may be used as the delay information.
In an embodiment, after the rotation deviation value 305 is obtained, the delay information may also be determined, for example, from the sum of the rotation deviation value 305 and the rotation period 303. This avoids the situation in which, when the first time information T is small, the T0' obtained by formula (2) above is negative and cannot be taken modulo the time interval at which the image sensor captures images. For example, this embodiment may take the sum of the rotation deviation value 305 and the rotation period 303 modulo the time interval at which the image sensor captures images, and use the resulting remainder as the delay information 306 of the image sensor.
In an embodiment, when determining the delay information, a predetermined error value 307 may also be taken into account, which is determined from the position of the optical center of the image sensor and the position of a target object within the viewing angle range of the image sensor. The target object may be, for example, the ground, and the error value may be determined from the angle between the main optical axis of the image sensor and the straight line through the optical center of the image sensor and a point on the vehicle's central axis on the ground at a predetermined distance from the vehicle. The predetermined error value is greater than 0 and may be an empirical value; for example, the predetermined error value 307 may be 5 ms.
For example, this embodiment may determine the delay information of the image sensor from the sum of the predetermined error value 307 and the rotation deviation value 305. In this way, the accuracy of the determined delay information can be improved. This is because, in the image data captured by the image sensor, the data that reflect objects on the road surface that affect the driving of the vehicle are the image pixels located toward the bottom of the image. Therefore, when aligning data, attention is usually focused on those lower image pixels in the image data captured by the image sensor, whereas the angle at which the radar sensor can capture such objects on the road surface usually deviates somewhat from the predetermined angle, and this deviation introduces an error into the determination of the capture moment of the image sensor. By setting the predetermined error value 307, this embodiment can compensate for that deviation. Controlling the moment at which the image sensor captures image data based on the delay information determined in this embodiment thus allows the image sensor and the radar sensor to capture objects on the road surface synchronously, improving the synchronization accuracy of data collection.
It can be understood that, in order to avoid negative values, this embodiment may also determine the delay information of the image sensor from the sum of the predetermined error value 307, the rotation deviation value 305, and the rotation period 303.
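The computation of Figure 3 described above can be sketched as follows. This is an illustrative assumption combining formula (2), the modulo operation that yields the rotation deviation value, the predetermined error value, and the final remainder over the camera frame interval; the names and the 5 ms default are not the patent's reference implementation.

```python
# Hedged sketch of the Figure 3 embodiment; all quantities are in seconds.

def delay_from_rotation_deviation(t_at_angle: float,
                                  angle_deg: float,
                                  lidar_rot_interval: float,
                                  frame_interval: float,
                                  error_value: float = 0.005) -> float:
    # Formula (2): back-calculate the time at which the lidar pointed at 0 deg.
    t_zero = t_at_angle - angle_deg * lidar_rot_interval / 360.0
    # If rotation had no deviation, t_zero would be an integer multiple of the
    # rotation period; the remainder is therefore the rotation deviation value.
    rotation_deviation = t_zero % lidar_rot_interval
    # Add the rotation period (to keep the sum positive) and the predetermined
    # error value, then limit the result to one camera frame interval.
    return (rotation_deviation + lidar_rot_interval + error_value) % frame_interval
```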
In an embodiment, the controller that controls the image sensor may be, for example, an artificial intelligence chip, such as a Field Programmable Gate Array (FPGA) chip, which is suited to adjusting the capture moment of a high-sampling-frequency image sensor according to delay information produced at a high sampling frequency and is therefore applicable to scenarios of real-time synchronous data collection. Accordingly, the synchronous data collection of this embodiment may be performed periodically.
After the controller receives the delay information, it may, for example, adjust the trigger moment of the image sensor according to the delay information, so that the image sensor collects data synchronously with the radar sensor rotated to the predetermined angle. The method of synchronously collecting data executed by the controller will be described in detail below with reference to Figures 4 and 5.
Figure 4 is a schematic flowchart of a method for synchronously collecting data according to an embodiment of the present disclosure.
As shown in Figure 4, the method 400 for synchronously collecting data in this embodiment may include operations S410 to S430. The method 400 may be executed by a controller, for example by an artificial intelligence chip (such as an FPGA).
In operation S410, in response to receiving delay information for the image sensor, the trigger moment of the image sensor is adjusted according to the delay information.
In operation S420, in response to the trigger moment being reached, a trigger signal is sent to the image sensor.
In operation S430, in response to receiving the image data captured by the image sensor, time information is added to the image data.
According to embodiments of the present disclosure, the system time of the controller may, for example, be aligned with the positioning device time of the positioning device in the vehicle where the controller is located. For example, when the vehicle is started, the positioning device may send a GPRMC packet to the controller. The controller may use the UTC time in the GPRMC packet as the initial time. The controller may, for example, control the image sensor to capture image data at a capture frame rate of 30 fps starting from the initial time.
The delay information is sent to the controller by the vehicle-mounted terminal through operation S230 described above. After receiving the delay information, the controller may adjust, according to the delay information, the moments after the current moment at which the image sensor captures image data. For example, the adjusted trigger moment may be obtained by adding the delay duration indicated by the delay information to the trigger moment that follows the current moment.
Alternatively, when adjusting the trigger moment, the time interval between two adjacent trigger moments of the image sensor may be taken into account. For example, this embodiment may use the remainder of the delay duration indicated by the delay information modulo that time interval as the adjustment amount, and add the adjustment amount to the trigger moment following the current moment to obtain the adjusted trigger moment. In this way, while synchronous data collection is still ensured, the degree of adjustment of the trigger moment is reduced, avoiding frame loss caused by adjusting the trigger moment.
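A minimal sketch of this controller-side adjustment, under the assumption that trigger moments and the delay are plain timestamps in seconds (the names are hypothetical, not the patent's API):

```python
# Sketch of operation S410: reduce the delay duration modulo the interval
# between two adjacent trigger moments, then shift the next scheduled trigger.

def adjust_next_trigger(next_trigger: float, delay: float, trigger_interval: float) -> float:
    adjustment = delay % trigger_interval   # keep the adjustment within one interval
    return next_trigger + adjustment
```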
After the trigger moment is adjusted, the controller can control the triggering of the image sensor according to the adjusted trigger moment, so that the image sensor collects data synchronously with the radar sensor rotated to the predetermined angle. It can be understood that the trigger signal sent to the image sensor may be any type of trigger signal that the image sensor can recognize and receive, which is not limited in the present disclosure. The image sensor is triggered in response to the trigger signal and begins capturing image data. After capturing the image data, the image sensor may send the captured image data to the controller. After receiving the image data, the controller may add time information to the image data according to the time information of the current moment.
In this way, in the present disclosure, both the point cloud data collected by the radar sensor and the captured image data carry time information. The vehicle-mounted terminal can then align the two kinds of data in time according to their time information, fuse the aligned data, and reconstruct the environmental information.
It can be understood that, since the radar sensor rotates periodically, the data packets for the predetermined angle received by the vehicle-mounted terminal and the delay information determined from them are also periodic. In the embodiments of the present disclosure, by adjusting the trigger moment of the image sensor according to the periodic delay information, the controller can avoid inaccurate data alignment caused by rotation deviations of the radar sensor while the vehicle is running, which helps improve the safety of autonomous driving.
Figure 5 is a schematic flowchart of adjusting the trigger moment of an image sensor according to an embodiment of the present disclosure.
According to embodiments of the present disclosure, when adjusting the trigger moment of the image sensor, if the delay duration indicated by the delay information is long, the trigger moment may be adjusted in multiple steps, so as to avoid a single large adjustment causing the images captured by the image sensor to drop frames relative to the radar sensor and thereby affecting the autonomous driving of the vehicle.
In an embodiment, the number of times the trigger moment is adjusted and the step size of each adjustment may be determined from the capture frame rate of the image sensor and the delay duration. The trigger moment of the image sensor is then adjusted according to that number of times and step size.
For example, the time interval at which the image sensor captures image data may first be determined from the capture frame rate; this time interval may be the reciprocal of the capture frame rate. The delay duration is then divided by the time interval and rounded up, and the rounded value is used as the number of times the trigger moment is adjusted. If the number of times is n, then for the 1st to the (n-1)th adjustment the step size is the value of the time interval, and for the nth adjustment the step size is the remainder of dividing the delay duration by the time interval. Alternatively, for the first adjustment the step size is the remainder of dividing the delay duration by the time interval, and for the subsequent adjustments the step size is the value of the time interval. Alternatively, the delay duration may be divided into n equal parts, and the step size of each of the n adjustments is the length of one such part.
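One of the variants just described (full-interval steps followed by a final remainder step) can be sketched as follows; the function name and the assumption that the delay is positive are illustrative only.

```python
# Illustrative sketch of the stepwise adjustment: ceil(delay / frame_interval)
# steps, the first n-1 of a full frame interval and the last of the remainder.
import math

def split_delay_into_steps(delay: float, frame_interval: float) -> list[float]:
    n = math.ceil(delay / frame_interval)          # number of adjustments
    remainder = delay - (n - 1) * frame_interval   # step size of the last adjustment
    return [frame_interval] * (n - 1) + [remainder]

# Example: a 50 ms delay with a 30 fps camera (about 33.3 ms interval)
# yields two steps, roughly [0.0333, 0.0167].
steps = split_delay_into_steps(0.050, 1.0 / 30.0)
```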
As shown in Figure 5, in embodiment 500, the value interval 502 for the adjustment step size of the trigger moment may first be determined from the capture frame rate 501 of the image sensor. The number of adjustments 503 and the step size 504 of each adjustment are then determined from the value interval 502 and the delay duration.
The capture frame rate 501 may, for example, be negatively correlated with the length of the value interval 502. For example, the length of the value interval 502 may be proportional to the length of the time interval at which image data is captured. For example, the upper limit of the value interval 502 may be set to the value of that time interval, or to 0.5 times the time interval, and so on, which is not limited in the present disclosure. After the value interval 502 is determined, this embodiment may divide the delay duration by the value of the value interval 502, round up, and use the rounded-up value as the number of adjustments.
In an embodiment, the controller may, for example, maintain the previously received delay information. After receiving the latest delay information, the controller may, for example, first find, among the maintained received delay information, the delay information adjacent to the latest delay information in reception time, and then determine whether the latest delay information has changed relative to that adjacent delay information. The trigger moment of the image sensor is adjusted only if a change has occurred; specifically, the trigger moment of the image sensor may be adjusted according to the change value of the delay information. In this way, iterative adjustment of the image sensor is achieved, the degree of adjustment of the trigger moment of the image sensor within a single adjustment cycle is reduced, and the stability of the image data captured by the image sensor is improved.
For example, as shown in Figure 5, if the latest delay information received by the controller is the m-th delay information 505, this embodiment may calculate the difference 507 between the delay duration indicated by the m-th delay information 505 and the delay duration indicated by the (m-1)-th delay information 506. The number of adjustments 503 and the step size 504 of each adjustment are then determined from the difference 507 and the value interval 502.
Illustratively, when the trigger moment of the image sensor is adjusted according to the change value of the delay information, considering that the difference between the delay durations indicated by two adjacent pieces of delay information may be negative, the lower limit of the value interval 502 determined from the capture frame rate 501 in this embodiment may, for example, be a value less than 0. The absolute value of the difference between this lower limit and 0 may, for example, be smaller than the absolute value of the difference between the upper limit of the value interval and 0, owing to the limitation imposed by the capture frame rate of the image sensor. The difference between this lower limit and 0 is negatively correlated with the capture frame rate.
Illustratively, if the capture frame rate of the image sensor is 30 fps, the value interval may be set, for example, to [-200 μs, 32 ms]. If the capture frame rate is 15 fps, the value interval may be set, for example, to [-400 μs, 64 ms].
It can be understood that, after the number of adjustments of the trigger moment and the step size of each adjustment have been determined, the most recent trigger moment or several consecutive trigger moments can be adjusted according to that number of times and step size. For example, if the rotation period of the radar sensor is 100 ms and the capture frame rate of the image sensor is 30 fps, the image sensor can capture 3 frames of image data within one rotation period. In order to reduce the delay of signal transmission, the controller may, for example, send a trigger signal once every interval of two frames. After receiving the trigger signal, the image sensor captures three consecutive frames of image data. The differences between the capture moments of the second and third frames of image data and the capture moment of the first frame are 33.3 ms and 66.7 ms, respectively.
Illustratively, when the determined number of adjustments is one, the controller may adjust only the sending timing of the trigger signal that is sent once every interval of two frames, thereby adjusting the trigger moment of the image sensor. If the number of adjustments is more than one, the controller may switch the signal sending mechanism from sending a trigger signal once every interval of two frames to sending a trigger signal for every frame. After completing the adjustment of the trigger moments and sending the trigger signals to the image sensor according to the adjusted trigger moments, the controller may switch the signal sending mechanism back to sending the trigger signal once every interval of two frames.
According to embodiments of the present disclosure, if the image sensor can capture p frames of image data within one rotation period of the radar sensor and the number of adjustments determined from the delay information is greater than p, this embodiment may adjust only the most recent p consecutive trigger moments. The adjustment of the most recent (p+1)-th trigger moment can then be determined from the delay information received in the next period.
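A sketch of this per-period cap, assuming the per-step adjustments computed earlier are available as a list (hypothetical names; in the description above, the deferred portion is not carried over directly but re-determined from the next period's delay information):

```python
# Sketch: apply at most p consecutive adjustments within one rotation period.

def apply_steps_this_period(steps: list[float],
                            frames_per_period: int) -> tuple[list[float], list[float]]:
    applied = steps[:frames_per_period]    # adjust at most p consecutive trigger moments now
    deferred = steps[frames_per_period:]   # handled based on the next period's delay information
    return applied, deferred
```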
The above controller and vehicle-mounted terminal may both be integrated into the autonomous driving system of the vehicle. The autonomous driving system can then make the image sensor and the radar sensor collect data synchronously by executing the methods for synchronously collecting data of the above embodiments.
Figure 6 is a schematic diagram of the principle of a method for synchronously collecting data according to an embodiment of the present disclosure.
As shown in Figure 6, in this embodiment, the method 600 for synchronously collecting data is executed by the autonomous driving system of an autonomous vehicle. The autonomous driving system includes a vehicle-mounted terminal 610 and a controller 620. The controller 620 may specifically be an FPGA chip. The autonomous vehicle may also be provided with a radar sensor 601 and an image sensor 602.
In the method 600, the radar sensor 601 may encapsulate the point cloud data collected at each angle into data packets, as data packets for the respective angles, and send the data packets for the respective angles to the vehicle-mounted terminal 610.
The vehicle-mounted terminal 610 may parse the received data packets and determine the angle to which each data packet relates. After determining that a data packet for the predetermined angle has been received, it determines the first time information at which the radar sensor collects point cloud data at that predetermined angle, where the predetermined angle is within the viewing angle range of the image sensor. It can be understood that this determination of the first time information is implemented similarly to operation S210 described above and is not repeated here.
Subsequently, the vehicle-mounted terminal 610 may determine the delay information for the image sensor 602 from the first time information and send the delay information to the FPGA 620. It can be understood that the determination of the delay information may be implemented similarly to operation S220 described above and is not repeated here.
After receiving the delay information, the FPGA chip 620 may adjust the trigger moment of the image sensor 602 according to the delay information, so that the image sensor 602 and the radar sensor 601 can collect data synchronously. This adjustment of the trigger moment is implemented similarly to operation S410 described above. After adjusting the trigger moment, the FPGA chip 620 may send a trigger signal to the image sensor 602 when the adjusted trigger moment is reached. The image sensor 602 may begin capturing image data in response to the trigger signal.
The image sensor 602 may, for example, send the captured image data to the FPGA chip 620, so that the FPGA chip 620 adds second time information to the received image data. Specifically, a timestamp may be added to the image data to obtain timestamped image data 631.
In this embodiment, since the moment at which the image sensor captures image data is adjusted according to the time information at which the radar sensor collects point cloud data, the resulting series of image data 631 includes image data aligned with the point cloud data subsequently collected by the radar sensor 601. Specifically, the image data and the point cloud data can be aligned according to the second time information of the image data and the time information of the point cloud data subsequently collected by the radar sensor. For example, point cloud data and image data with the same time information may be taken as an aligned pair of data, or point cloud data and image data whose time information differs by less than a threshold may be taken as an aligned pair of data.
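A hedged sketch of such timestamp-based pairing, assuming both timestamp sequences are available as lists of seconds; the 3 ms tolerance mirrors the matching example given later for the cloud-side evaluation and is only an assumption here.

```python
# Sketch: for each point cloud timestamp, pick the image frame whose second
# time information is closest and accept the pair only if the gap is small.

def align_by_timestamp(cloud_times: list[float],
                       image_times: list[float],
                       threshold: float = 0.003) -> list[tuple[int, int]]:
    pairs: list[tuple[int, int]] = []
    if not image_times:
        return pairs
    for i, tc in enumerate(cloud_times):
        j = min(range(len(image_times)), key=lambda k: abs(image_times[k] - tc))
        if abs(image_times[j] - tc) <= threshold:
            pairs.append((i, j))    # (point cloud index, image index) aligned pair
    return pairs
```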
Figure 7 is a schematic diagram of the principle of synchronous data collection according to an embodiment of the present disclosure.
As shown in Figure 7, in embodiment 700, the reference time according to which the FPGA adds time information to the image data and the reference time according to which the radar sensor collects point cloud data are both the GPS clock. The positioning period of the GPS is set to 1 s, the period at which the radar sensor collects point cloud data is 100 ms, and the capture frame rate of the image sensor (for example, a camera) is 30 fps.
For example, taking the GPS clock as the reference, with the predetermined angle being 0° and the start moment being 0, the moments at which the radar sensor collects point cloud data at 0° should be 0.1 s, 0.9 s, 1 s, 1.1 s, ..., and the moments at which the camera captures image data should be 33.3 ms, 66.7 ms, 0.1 s, ..., 0.8667 s, 0.9 s, 0.9333 s, .... In this embodiment 700, the moments at which the radar sensor collects point cloud data are actually 0.92 s, 1.02 s, ..., 2.01 s, and the moments at which the camera captures image data are actually 0.89 s, 0.9233 s, ....
Using the method of synchronously collecting data provided by the present disclosure, the trigger moment of the camera can be adjusted from 0.89 s to 0.92 s, the adjustment being based on a delay duration t1 of 30 ms. By analogy, the trigger moment of the camera is adjusted from 1.99 s to 2.01 s, the adjustment being based on a delay duration t2 of 20 ms. Through these adjustments, the image data captured by the camera and the point cloud data collected by the radar are aligned in time.
The present disclosure also provides a synchronization determination method for evaluating the synchronization of an image sensor and a radar sensor. This synchronization determination method will be described in detail below with reference to Figure 8.
Figure 8 is a schematic flowchart of a synchronization determination method according to an embodiment of the present disclosure.
As shown in Figure 8, the synchronization determination method 800 of this embodiment may include operations S810 to S840. The synchronization determination method 800 may be executed by a cloud system.
In operation S810, a data packet sequence collected by the radar sensor and an image data sequence captured by the image sensor are obtained.
According to embodiments of the present disclosure, the cloud system may obtain, on a per-day basis, the data packet sequence and the image data sequence collected by the autonomous vehicle within that unit of time. The cloud system may, for example, obtain the data packet sequence and the image data sequence via a wireless communication link.
In operation S820, for each data packet in the data packet sequence, the image data matching that data packet is determined according to the time information of the data packet and the time information of each image data item in the image data sequence.
根据本公开的实施例,云端系统可以对数据包序列中的各个数据包进行解析,确定雷达传感器采集数据包中的点云数据的第一时间信息。随后将图像数据序列中添加的时间信息与第一时间信息相同或相近的图像数据,作为与每个数据包匹配的图像数据。例如,若图像数据的第二时间信息与点云数据的第一时间信息的时间差小于±3ms,则可以确定该图像数据与点云数据匹配。对于没有确定得到匹配的图像数据的数据包,则可以丢弃该数据包。According to embodiments of the present disclosure, the cloud system can parse each data packet in the data packet sequence and determine the first time information of the point cloud data in the data packet collected by the radar sensor. Then, the image data whose time information is added to the image data sequence is the same as or similar to the first time information is used as the image data matching each data packet. For example, if the time difference between the second time information of the image data and the first time information of the point cloud data is less than ±3 ms, it can be determined that the image data matches the point cloud data. For data packets for which matching image data is not determined, the data packet can be discarded.
在操作S830,根据每个数据包与匹配的图像数据之间的时间信息的差异及差异阈值,确定每个数据包与匹配的图像数据之间的同步关系。In operation S830, a synchronization relationship between each data packet and the matching image data is determined based on the difference in time information between each data packet and the matching image data and the difference threshold.
在一实施例中,差异阈值例如可以根据经验设定,例如,可以以不影响车辆的安全行驶为前提条件,设定该差异阈值。例如,该差异阈值可以为15ms、7ms等,本公开对此不做限定。In one embodiment, the difference threshold may be set based on experience, for example, the difference threshold may be set on the premise that it does not affect the safe driving of the vehicle. For example, the difference threshold may be 15 ms, 7 ms, etc., which is not limited in this disclosure.
在一实施例中,差异阈值可以根据历史图像数据与历史数据包中点云数据的融合结果来确定。若融合结果中点云数据投影至图像坐标系中的像素点与历史图像数据中对应像素点的位置偏差值达到临界值(例如预定偏差值),则可以将历史图像数据的时间信息与历史数据包的时间信息的差值作为差异阈值。通过该方式,可以提高设置的差异阈值的精度,并因此提高同步关系的评估精度。In one embodiment, the difference threshold may be determined based on the fusion result of historical image data and point cloud data in the historical data package. If the positional deviation between the pixels projected from the point cloud data into the image coordinate system in the fusion result and the corresponding pixels in the historical image data reaches a critical value (such as a predetermined deviation value), the time information of the historical image data can be combined with the historical data. The difference in packet time information is used as the difference threshold. In this way, the accuracy of the set difference threshold and therefore the evaluation of the synchronization relationship can be increased.
In this embodiment, if the difference in time information between a data packet and the matching image data is greater than the difference threshold, the synchronization relationship between the data packet and the matching image data may be determined to be out of synchronization. If the difference is less than or equal to the difference threshold, the synchronization relationship may be determined to be synchronized.
In operation S840, the synchronization relationship between the radar sensor and the image sensor is determined according to the synchronization relationships between the plurality of data packets in the data packet sequence and their matching image data.
According to embodiments of the present disclosure, the synchronization relationships between the data packets in the data packet sequence and their matching image data may be counted, for example. If the proportion of data packets synchronized with their matching image data is greater than or equal to a predetermined proportion threshold, the synchronization relationship between the radar sensor and the image sensor may be determined to be synchronized; otherwise it is determined to be out of synchronization. The predetermined proportion threshold may be a value smaller than but close to 1, such as 0.8, which is not limited in the present disclosure.
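A minimal sketch of the pair-wise decision of operation S830 and the sensor-level decision of operation S840 is given below; the 15 ms difference threshold and the 0.8 proportion threshold are simply the example values mentioned above and are not prescribed by the disclosure.

```python
# Illustrative sketch of operations S830 and S840.
def pair_is_synchronized(packet_time_s, image_time_s, diff_threshold_s=0.015):
    # a pair is synchronized when its time difference stays within the threshold
    return abs(packet_time_s - image_time_s) <= diff_threshold_s

def sensors_are_synchronized(matched_pairs, diff_threshold_s=0.015,
                             ratio_threshold=0.8):
    """matched_pairs: (packet_time_s, image_time_s) tuples from the matching step."""
    if not matched_pairs:
        return False
    synced = sum(pair_is_synchronized(p, i, diff_threshold_s)
                 for p, i in matched_pairs)
    return synced / len(matched_pairs) >= ratio_threshold
```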
By setting a difference threshold and using it as the basis for deciding whether the data packets collected by the radar sensor and the image data collected by the image sensor are synchronized, embodiments of the present disclosure can improve the efficiency of determining the synchronization relationship. Compared with the technical solution of fusing the image data with the point cloud data in the data packets and determining the synchronization relationship from the fusion result, this reduces manual effort, lowers the consumption of computing resources, and improves parallel processing capability.
According to embodiments of the present disclosure, a predetermined containerization tool may be used to fuse the point cloud data in the historical data packets with the historical image data. By using the predetermined containerization tool, compatibility with the different platforms installed on different cloud systems can be achieved, which helps improve the applicability of the synchronization determination method provided by the embodiments of the present disclosure.
The predetermined containerization tool may, for example, fuse the point cloud data and the image data based on the following principle: the point cloud data and the image data are aligned and fused according to the extrinsic parameters of the radar sensor and the image sensor. Specifically, the point cloud data may be projected into the camera coordinate system to obtain the pixels corresponding to the projected point cloud data, and these pixels may be overlaid as a layer on the pixels at the corresponding positions in the image data to obtain the fused data.
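The projection step could, for example, be sketched with a standard pinhole camera model as below; the matrix names K, R, and t, the use of NumPy, and the dense-array layout are assumptions of this sketch and not details of the predetermined containerization tool.

```python
# Illustrative sketch: project lidar points into the image plane using the
# lidar-to-camera extrinsics (R, t) and the camera intrinsics K.
import numpy as np

def project_points_to_image(points_lidar, K, R, t):
    """points_lidar: (N, 3) array in the radar/lidar frame.
    K: (3, 3) intrinsics; R: (3, 3), t: (3,) lidar-to-camera extrinsics.
    Returns (M, 2) pixel coordinates for the points in front of the camera."""
    points_cam = points_lidar @ R.T + t            # transform to camera frame
    points_cam = points_cam[points_cam[:, 2] > 0]  # keep points in front of camera
    pixels_h = points_cam @ K.T                    # homogeneous pixel coordinates
    return pixels_h[:, :2] / pixels_h[:, 2:3]      # perspective division
```

The resulting pixel coordinates can then be overlaid on the image at the corresponding positions to obtain the fused data described above.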
According to embodiments of the present disclosure, when the synchronization relationship between the radar sensor and the image sensor is determined to be out of synchronization, this embodiment may further send prompt information to a target object, for example to a target communication account. The target communication account may be an email account, a mobile communication account, or the like, which is not limited in the present disclosure. By sending the prompt information, monitoring personnel can be informed in time that the image sensor and the radar sensor are out of synchronization, so that appropriate measures can be taken to resolve the problem and reduce the probability of unsafe driving of the autonomous vehicle caused by the lack of synchronization.
Based on the method for synchronously collecting data performed by a vehicle-mounted terminal provided by the present disclosure, the present disclosure further provides a vehicle-mounted terminal.
FIG. 9 is a schematic diagram of a vehicle-mounted terminal according to an embodiment of the present disclosure.
As shown in FIG. 9, this embodiment provides a vehicle-mounted terminal 900, which may be integrated into any vehicle such as an autonomous vehicle. The vehicle-mounted terminal 900 may, for example, be configured to perform the method of synchronously collecting data performed by the vehicle-mounted terminal described above.
Specifically, the vehicle-mounted terminal 900 may be configured to determine, in response to receiving a data packet for a predetermined angle from the radar sensor, first time information at which the radar sensor collected point cloud data at the predetermined angle. The predetermined angle is within the viewing angle range of the image sensor in the vehicle equipped with the vehicle-mounted terminal, and the data packet includes the point cloud data collected by the radar sensor at the predetermined angle. The way in which the first time information is determined in this embodiment is similar to the implementation of operation S210 described above and will not be repeated here.
The vehicle-mounted terminal 900 may further be configured to determine delay information for the image sensor according to the first time information, and to send the delay information to the controller that controls the image sensor, so that the controller controls the image sensor to collect data synchronously with the radar sensor rotated to the predetermined angle.
In an embodiment, the vehicle-mounted terminal 900 may, for example, determine the delay information using the principle described above with respect to FIG. 3, which is not limited in the present disclosure. It should be understood that the structure of the vehicle-mounted terminal 900 in FIG. 9 is merely illustrative to facilitate understanding of the present disclosure and does not limit it.
Based on the method for synchronously collecting data performed by a controller provided by the present disclosure, the present disclosure further provides a controller.
FIG. 10 is a schematic diagram of a controller according to an embodiment of the present disclosure.
As shown in FIG. 10, this embodiment provides a controller 1000, which may be integrated into any vehicle such as an autonomous vehicle and which may be communicatively connected with a vehicle-mounted terminal in the vehicle. The controller 1000 may, for example, be configured to perform the method of synchronously collecting data performed by the controller described above.
Specifically, the controller 1000 may be configured to, in response to receiving delay information for the image sensor, adjust the trigger time of the image sensor according to the delay information. The principle of adjusting the trigger time in this embodiment may be similar to the adjustment principle in operation S410 described above and will not be repeated here. It should be understood that the delay information may be sent by the vehicle-mounted terminal communicatively connected to the controller.
The controller 1000 may further be configured to send a trigger signal to the image sensor in the vehicle in response to the trigger time being reached, and, in response to receiving the image data collected by the image sensor, to add time information to the image data so that the vehicle-mounted terminal can use the time information to align the image data with the point cloud data collected by the radar sensor.
In an embodiment, the controller 1000 may, for example, determine the number of adjustments to the trigger time and the step size of each adjustment using the principle described above with respect to FIG. 5, and adjust the trigger time according to the determined number and step size, which is not limited in the present disclosure. It should be understood that the structure of the controller 1000 in FIG. 10 is merely illustrative to facilitate understanding of the present disclosure and does not limit it.
Based on the vehicle-mounted terminal and the controller provided by the present disclosure, the present disclosure further provides an autonomous driving vehicle.
FIG. 11 is a schematic diagram of an autonomous vehicle according to an embodiment of the present disclosure.
As shown in FIG. 11, this embodiment provides an autonomous vehicle 1100 in which a vehicle-mounted terminal, a controller, a radar sensor, and an image sensor may be integrated.
The vehicle-mounted terminal may be the vehicle-mounted terminal 900 described above, and the controller may be the controller 1000 described above.
During autonomous driving, the image sensor may periodically collect image data, and the radar sensor may periodically collect point cloud data and encapsulate the point cloud data to obtain data packets. In response to receiving a data packet for a predetermined angle from the radar sensor, the vehicle-mounted terminal in the autonomous vehicle may determine first time information at which the radar sensor collected point cloud data at the predetermined angle. The predetermined angle is within the viewing angle range of the image sensor, and the data packet includes the point cloud data collected by the radar sensor at the predetermined angle. After obtaining the first time information, the vehicle-mounted terminal may further determine delay information for the image sensor according to the first time information and send the determined delay information to the controller. It should be understood that the way in which the vehicle-mounted terminal determines the delay information may be similar to the implementation principle of the method 200 for synchronously collecting data described above and will not be repeated here.
After receiving the delay information, the controller may adjust the trigger time of the image sensor according to the delay information and, when determining that the trigger time of the image sensor is reached, send a trigger signal to the image sensor. After receiving the image data collected by the image sensor, the controller may further add second time information to the image data, so that the vehicle-mounted terminal can use the second time information to align the image data with the point cloud data collected by the radar sensor.
Based on the synchronization determination method provided by the present disclosure, the present disclosure further provides a cloud system for executing the synchronization determination method.
FIG. 12 is a schematic diagram of a cloud system according to an embodiment of the present disclosure.
As shown in FIG. 12, the present disclosure provides a cloud system 1200. The cloud system 1200 may communicate with the autonomous vehicle provided by the present disclosure through a communication link to obtain data collected by the sensors from the autonomous vehicle or to receive data sent by the autonomous vehicle.
Specifically, the cloud system 1200 may be configured to obtain the data packet sequence collected by the radar sensor and the image data sequence collected by the image sensor in the autonomous vehicle. After obtaining the data packet sequence and the image data sequence, the cloud system 1200 may determine, for each data packet in the data packet sequence, the image data matching the data packet according to the time information of the data packet and the time information of each piece of image data in the image data sequence. Subsequently, the synchronization relationship between each data packet and the matching image data is determined according to the difference in time information between the data packet and the matching image data and the difference threshold. Finally, the cloud system 1200 may determine the synchronization relationship between the radar sensor and the image sensor according to the synchronization relationships between the plurality of data packets in the data packet sequence and their matching image data.
In an embodiment, the cloud system 1200 may be used to perform the synchronization determination method described above to determine the synchronization relationship between the radar sensor and the image sensor, which is not limited in the present disclosure.
Based on the method for synchronously collecting data performed by a vehicle-mounted terminal provided by the present disclosure, the present disclosure further provides an apparatus for synchronously collecting data, which may be integrated into a vehicle-mounted terminal.
FIG. 13 is a structural block diagram of an apparatus for synchronously collecting data according to an embodiment of the present disclosure.
As shown in FIG. 13, the apparatus 1300 for synchronously collecting data of this embodiment may be integrated into a vehicle-mounted terminal and may include a time information determination module 1310, a delay information determination module 1320, and an information sending module 1330.
The time information determination module 1310 is configured to determine, in response to receiving a data packet for a predetermined angle from the radar sensor, first time information at which the radar sensor collected point cloud data at the predetermined angle, where the predetermined angle is within the viewing angle range of the image sensor and the data packet includes the point cloud data collected by the radar sensor at the predetermined angle. In an embodiment, the time information determination module 1310 may be used to perform operation S210 described above, which will not be repeated here.
The delay information determination module 1320 is configured to determine delay information for the image sensor according to the first time information. In an embodiment, the delay information determination module 1320 may be used to perform operation S220 described above, which will not be repeated here.
The information sending module 1330 is configured to send the delay information to the controller, so that the controller controls the image sensor to collect data synchronously with the radar sensor rotated to the predetermined angle. In an embodiment, the information sending module 1330 may be used to perform operation S230 described above, which will not be repeated here.
According to embodiments of the present disclosure, the time information determination module 1310 may include a parsing sub-module and a first determination sub-module. The parsing sub-module is configured to parse the data packet to obtain the time data in the data packet, where the time data includes a reference time and a periodic encapsulation time. The first determination sub-module is configured to determine, according to the reference time and the periodic encapsulation time, the first time information at which the radar sensor collected the point cloud data at the predetermined angle. The reference time is aligned with the positioning device time of the positioning device in the vehicle where the radar sensor is located.
According to embodiments of the present disclosure, the delay information determination module 1320 may include a second determination sub-module, a deviation value determination sub-module, and a delay determination sub-module. The second determination sub-module is configured to determine, according to the first time information, the rotation period of the radar sensor, and the predetermined angle, second time information at which the radar sensor collects point cloud data at 0°. The deviation value determination sub-module is configured to determine the rotation deviation value of the radar sensor according to the remainder obtained by taking the second time information modulo the rotation period. The delay determination sub-module is configured to determine the delay information for the image sensor according to the rotation deviation value.
According to embodiments of the present disclosure, the delay determination sub-module may specifically be configured to determine the delay information for the image sensor according to the sum of a predetermined error value and the rotation deviation value. The predetermined error value is greater than 0 and is determined according to the optical center position of the image sensor and the position of the target object within the viewing angle range of the image sensor.
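As a non-limiting illustration of how the second time information, the rotation deviation value, and the predetermined error value could be combined, consider the following sketch; the exact formulas, in particular the subtraction used to obtain the 0° time, are assumptions made for this example rather than the definitive implementation.

```python
# Hedged sketch of the delay computation performed by these sub-modules.
def compute_delay_s(first_time_s, rotation_period_s, predetermined_angle_deg,
                    predetermined_error_s=0.001):  # error value is an illustrative placeholder
    # second time information: assumed time at which the radar swept 0 deg
    second_time_s = first_time_s - (predetermined_angle_deg / 360.0) * rotation_period_s
    # rotation deviation: remainder of the second time modulo the rotation period
    rotation_deviation_s = second_time_s % rotation_period_s
    # delay information: sum of the predetermined error value and the deviation
    return rotation_deviation_s + predetermined_error_s
```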
According to embodiments of the present disclosure, the controller includes an artificial intelligence chip.
Based on the method for synchronously collecting data performed by a controller provided by the present disclosure, the present disclosure further provides an apparatus for synchronously collecting data, which may be integrated into a controller.
FIG. 14 is a structural block diagram of an apparatus for synchronously collecting data according to another embodiment of the present disclosure.
As shown in FIG. 14, the apparatus 1400 for synchronously collecting data of this embodiment may include a time adjustment module 1410, a signal sending module 1420, and a time adding module 1430.
The time adjustment module 1410 is configured to adjust, in response to receiving delay information for the image sensor, the trigger time of the image sensor according to the delay information. The delay information is determined according to the time information at which the radar sensor collects point cloud data at a predetermined angle, and the predetermined angle is within the viewing angle range of the image sensor. In an embodiment, the time adjustment module 1410 may be used to perform operation S410 described above, which will not be repeated here.
The signal sending module 1420 is configured to send a trigger signal to the image sensor in response to the trigger time being reached. In an embodiment, the signal sending module 1420 may be used to perform operation S420 described above, which will not be repeated here.
The time adding module 1430 is configured to add, in response to receiving the image data collected by the image sensor, time information to the image data, so that the time information can be used to align the image data with the point cloud data collected by the radar sensor. In an embodiment, the time adding module 1430 may be used to perform operation S430 described above, which will not be repeated here.
According to embodiments of the present disclosure, the delay information indicates a delay duration, and the time adjustment module 1410 may include an adjustment information determination sub-module and an adjustment sub-module. The adjustment information determination sub-module is configured to determine the number of adjustments to the trigger time and the step size of each adjustment according to the acquisition frame rate of the image sensor and the delay duration. The adjustment sub-module is configured to adjust the trigger time of the image sensor according to the number of adjustments and the step size.
According to embodiments of the present disclosure, the adjustment information determination sub-module may include an adjustment interval determination unit and an information determination unit. The adjustment interval determination unit is configured to determine the value interval of the adjustment step size of the trigger time according to the acquisition frame rate of the image sensor. The information determination unit is configured to determine the number of adjustments and the step size according to the value interval and the delay duration. The acquisition frame rate is negatively correlated with the length of the value interval.
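One hedged way to realize the negative correlation between the acquisition frame rate and the step-size interval is sketched below; making the maximum step a fixed fraction of the frame period is an assumption chosen for this example, not a requirement of the disclosure.

```python
# Illustrative sketch: split a required delay into several small trigger-time
# adjustments whose step size shrinks as the acquisition frame rate grows.
import math

def plan_trigger_adjustment(delay_s, frame_rate_hz, interval_scale=0.1):
    """Returns (number_of_adjustments, step_s)."""
    max_step_s = interval_scale / frame_rate_hz  # shorter admissible step at higher frame rates
    num_adjustments = max(1, math.ceil(delay_s / max_step_s))
    step_s = delay_s / num_adjustments
    return num_adjustments, step_s
```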
According to embodiments of the present disclosure, the time adjustment module 1410 is specifically configured to, in response to receiving the delay information where the delay information has changed relative to the adjacent delay information among the previously received delay information, adjust the trigger time of the image sensor according to the change value of the delay information.
Based on the method for synchronously collecting data performed by an autonomous vehicle provided by the present disclosure, the present disclosure further provides an apparatus for synchronously collecting data, which may be integrated into an autonomous vehicle.
FIG. 15 is a structural block diagram of an apparatus for synchronously collecting data according to another embodiment of the present disclosure.
As shown in FIG. 15, the apparatus 1500 for synchronously collecting data of this embodiment may include a time information determination module 1510, a delay information determination module 1520, a time adjustment module 1530, a signal sending module 1540, and a time adding module 1550.
The time information determination module 1510 is configured to determine, in response to receiving a data packet for a predetermined angle from the radar sensor, first time information at which the radar sensor collected point cloud data at the predetermined angle, where the predetermined angle is within the viewing angle range of the image sensor and the data packet includes the point cloud data collected by the radar sensor at the predetermined angle. In an embodiment, the time information determination module 1510 may be used to perform operation S210 described above, which will not be repeated here.
The delay information determination module 1520 is configured to determine delay information for the image sensor according to the first time information. In an embodiment, the delay information determination module 1520 may be used to perform operation S220 described above, which will not be repeated here.
The time adjustment module 1530 is configured to adjust the trigger time of the image sensor according to the delay information, so that the image sensor collects data synchronously with the radar sensor rotated to the predetermined angle. In an embodiment, the time adjustment module 1530 may be used to perform operation S410 described above, which will not be repeated here.
The signal sending module 1540 is configured to send a trigger signal to the image sensor in response to the trigger time being reached. In an embodiment, the signal sending module 1540 may be used to perform operation S420 described above, which will not be repeated here.
The time adding module 1550 is configured to add, in response to receiving the image data collected by the image sensor, second time information to the image data, so that the second time information can be used to align the image data with the point cloud data collected by the radar sensor. In an embodiment, the time adding module 1550 may be used to perform operation S430 described above, which will not be repeated here.
Based on the synchronization determination method provided by the present disclosure, the present disclosure further provides a synchronization determination apparatus, which may be integrated into a cloud system.
FIG. 16 is a structural block diagram of a synchronization determination apparatus according to an embodiment of the present disclosure.
As shown in FIG. 16, the synchronization determination apparatus 1600 of this embodiment may include a data acquisition module 1610, a data matching module 1620, a data relationship determination module 1630, and a synchronization relationship determination module 1640.
The data acquisition module 1610 is configured to acquire the data packet sequence collected by the radar sensor and the image data sequence collected by the image sensor. In an embodiment, the data acquisition module 1610 may be used to perform operation S810 described above, which will not be repeated here.
The data matching module 1620 is configured to determine, for each data packet in the data packet sequence, the image data matching the data packet according to the time information of the data packet and the time information of each piece of image data in the image data sequence. In an embodiment, the data matching module 1620 may be used to perform operation S820 described above, which will not be repeated here.
The data relationship determination module 1630 is configured to determine the synchronization relationship between each data packet and the matching image data according to the difference in time information between the data packet and the matching image data and the difference threshold. In an embodiment, the data relationship determination module 1630 may be used to perform operation S830 described above, which will not be repeated here.
The synchronization relationship determination module 1640 is configured to determine the synchronization relationship between the radar sensor and the image sensor according to the synchronization relationships between the plurality of data packets in the data packet sequence and their matching image data. In an embodiment, the synchronization relationship determination module 1640 may be used to perform operation S840 described above, which will not be repeated here.
According to embodiments of the present disclosure, the synchronization determination apparatus may further include a prompt information sending module configured to send prompt information to the target object in response to the synchronization relationship between the radar sensor and the image sensor indicating that they are out of synchronization.
According to embodiments of the present disclosure, the synchronization determination apparatus may further include a threshold determination module configured to determine the difference threshold. The threshold determination module may include a data fusion sub-module and a threshold determination sub-module. The data fusion sub-module is configured to obtain fused data according to a historical data packet collected by the radar sensor and historical image data matching the historical data packet; the fused data characterizes the positional deviation between the pixels obtained by projecting the point cloud data in the historical data packet into the image coordinate system and the corresponding pixels in the historical image data. The threshold determination sub-module is configured to determine, when the positional deviation reaches a predetermined deviation value, the difference in time information between the historical data packet and the historical image data as the difference threshold.
According to embodiments of the present disclosure, the data fusion sub-module is specifically configured to fuse the point cloud data in the historical data packet with the historical image data using a predetermined containerization tool to obtain the fused data.
It should be noted that, in the technical solutions of the present disclosure, the collection, storage, use, processing, transmission, provision, disclosure, and application of the user personal information involved all comply with the provisions of relevant laws and regulations, necessary confidentiality measures are taken, and public order and good customs are not violated. In the technical solutions of the present disclosure, the user's authorization or consent is obtained before the user's personal information is obtained or collected.
According to embodiments of the present disclosure, the present disclosure further provides an electronic device, a readable storage medium, and a computer program product.
FIG. 17 shows a schematic block diagram of an example electronic device 1700 that can be used to implement the method for synchronously collecting data or the synchronization determination method according to embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptop computers, desktop computers, workstations, personal digital assistants, servers, blade servers, mainframe computers, and other suitable computers. Electronic devices may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions are examples only and are not intended to limit the implementations of the present disclosure described and/or claimed herein.
As shown in FIG. 17, the device 1700 includes a computing unit 1701, which can perform various appropriate actions and processing according to a computer program stored in a read-only memory (ROM) 1702 or a computer program loaded from a storage unit 1708 into a random access memory (RAM) 1703. Various programs and data required for the operation of the device 1700 may also be stored in the RAM 1703. The computing unit 1701, the ROM 1702, and the RAM 1703 are connected to one another via a bus 1704. An input/output (I/O) interface 1705 is also connected to the bus 1704.
A plurality of components in the device 1700 are connected to the I/O interface 1705, including: an input unit 1706 such as a keyboard or a mouse; an output unit 1707 such as various types of displays and speakers; a storage unit 1708 such as a magnetic disk or an optical disk; and a communication unit 1709 such as a network card, a modem, or a wireless communication transceiver. The communication unit 1709 allows the device 1700 to exchange information/data with other devices through a computer network such as the Internet and/or various telecommunication networks.
The computing unit 1701 may be any of various general-purpose and/or special-purpose processing components having processing and computing capabilities. Some examples of the computing unit 1701 include, but are not limited to, a central processing unit (CPU), a graphics processing unit (GPU), various dedicated artificial intelligence (AI) computing chips, various computing units that run machine learning model algorithms, a digital signal processor (DSP), and any appropriate processor, controller, microcontroller, etc. The computing unit 1701 performs the various methods and processes described above, such as the method for synchronously collecting data or the synchronization determination method. For example, in some embodiments, the method for synchronously collecting data or the synchronization determination method may be implemented as a computer software program that is tangibly embodied in a machine-readable medium such as the storage unit 1708. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 1700 via the ROM 1702 and/or the communication unit 1709. When the computer program is loaded into the RAM 1703 and executed by the computing unit 1701, one or more steps of the method for synchronously collecting data or the synchronization determination method described above may be performed. Alternatively, in other embodiments, the computing unit 1701 may be configured to perform the method for synchronously collecting data or the synchronization determination method in any other suitable manner (for example, by means of firmware).
Various implementations of the systems and techniques described above herein may be implemented in a digital electronic circuit system, an integrated circuit system, a field programmable gate array (FPGA), an application-specific integrated circuit (ASIC), an application-specific standard product (ASSP), a system on chip (SOC), a complex programmable logic device (CPLD), computer hardware, firmware, software, and/or combinations thereof. These various implementations may include being implemented in one or more computer programs that can be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor and which can receive data and instructions from a storage system, at least one input device, and at least one output device, and transmit data and instructions to the storage system, the at least one input device, and the at least one output device.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. The program code may be provided to a processor or controller of a general-purpose computer, a special-purpose computer, or another programmable data processing apparatus, such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowcharts and/or block diagrams to be implemented. The program code may execute entirely on the machine, partly on the machine, partly on the machine and partly on a remote machine as a stand-alone software package, or entirely on a remote machine or server.
In the context of the present disclosure, a machine-readable medium may be a tangible medium that may contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of the machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide interaction with a user, the systems and techniques described herein may be implemented on a computer having: a display device (for example, a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to the user; and a keyboard and a pointing device (for example, a mouse or a trackball) through which the user can provide input to the computer. Other kinds of devices may also be used to provide interaction with the user; for example, the feedback provided to the user may be any form of sensory feedback (for example, visual feedback, auditory feedback, or tactile feedback), and input from the user may be received in any form (including acoustic input, voice input, or tactile input).
The systems and techniques described herein may be implemented in a computing system that includes back-end components (for example, as a data server), or a computing system that includes middleware components (for example, an application server), or a computing system that includes front-end components (for example, a user computer having a graphical user interface or a web browser through which the user can interact with implementations of the systems and techniques described herein), or a computing system that includes any combination of such back-end components, middleware components, or front-end components. The components of the system may be interconnected by any form or medium of digital data communication (for example, a communication network). Examples of communication networks include a local area network (LAN), a wide area network (WAN), and the Internet.
A computer system may include a client and a server. The client and the server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship with each other. The server may be a cloud server, also known as a cloud computing server or cloud host, which is a host product in the cloud computing service system that overcomes the drawbacks of difficult management and weak business scalability present in traditional physical hosts and VPS ("Virtual Private Server") services. The server may also be a server of a distributed system or a server combined with a blockchain.
It should be understood that steps may be reordered, added, or deleted using the various forms of process shown above. For example, the steps described in the present disclosure may be performed in parallel, sequentially, or in a different order, as long as the desired results of the technical solutions disclosed in the present disclosure can be achieved, which is not limited herein.
The above specific embodiments do not constitute a limitation on the protection scope of the present disclosure. Those skilled in the art should understand that various modifications, combinations, sub-combinations, and substitutions may be made according to design requirements and other factors. Any modification, equivalent substitution, improvement, etc. made within the spirit and principles of the present disclosure shall be included in the protection scope of the present disclosure.

Claims (35)

  1. A method for synchronously collecting data, comprising:
    in response to receiving a data packet for a predetermined angle from a radar sensor, determining first time information at which the radar sensor collects point cloud data at the predetermined angle, wherein the predetermined angle is within a viewing angle range of an image sensor;
    determining delay information for the image sensor according to the first time information; and
    sending the delay information to a controller, so that the controller controls the image sensor to collect data in synchronization with the radar sensor rotated to the predetermined angle,
    wherein the data packet comprises the point cloud data collected by the radar sensor at the predetermined angle.
  2. The method according to claim 1, wherein the determining, in response to receiving a data packet for a predetermined angle from a radar sensor, first time information at which the radar sensor collects point cloud data at the predetermined angle comprises:
    parsing the data packet to obtain time data in the data packet, wherein the time data comprises a reference time and a periodic encapsulation time; and
    determining, according to the reference time and the periodic encapsulation time, the first time information at which the radar sensor collects the point cloud data at the predetermined angle,
    wherein the reference time is aligned with a positioning device time of a positioning device in a vehicle where the radar sensor is located.
  3. The method according to claim 1, wherein the determining delay information for the image sensor according to the first time information comprises:
    determining, according to the first time information, a rotation period of the radar sensor, and the predetermined angle, second time information at which the radar sensor collects point cloud data at 0°;
    determining a rotation deviation value of the radar sensor according to a remainder obtained by taking the second time information modulo the rotation period; and
    determining the delay information for the image sensor according to the rotation deviation value.
  4. The method according to claim 3, wherein the determining the delay information for the image sensor according to the rotation deviation value comprises:
    determining the delay information of the image sensor according to a sum of a predetermined error value and the rotation deviation value,
    wherein the predetermined error value is greater than 0, and the predetermined error value is determined according to an optical center position of the image sensor and a position of a target object within the viewing angle range of the image sensor.
  5. The method according to claim 1, wherein the controller comprises an artificial intelligence chip.
  6. A method for synchronously collecting data, comprising:
    in response to receiving delay information for an image sensor, adjusting a trigger time of the image sensor according to the delay information;
    in response to the trigger time being reached, sending a trigger signal to the image sensor; and
    in response to receiving image data collected by the image sensor, adding time information to the image data, so that the time information is used to align the image data with point cloud data collected by a radar sensor,
    wherein the delay information is determined according to time information at which the radar sensor collects the point cloud data at a predetermined angle, and the predetermined angle is within a viewing angle range of the image sensor.
  7. The method according to claim 6, wherein the delay information indicates a delay duration, and the adjusting the trigger time of the image sensor according to the delay information comprises:
    determining, according to an acquisition frame rate of the image sensor and the delay duration, a number of adjustments to the trigger time and a step size for each adjustment; and
    adjusting the trigger time of the image sensor according to the number of adjustments and the step size.
  8. The method according to claim 7, wherein the determining, according to the acquisition frame rate of the image sensor and the delay duration, the number of adjustments to the trigger time and the step size for each adjustment comprises:
    determining a value interval of the adjustment step size of the trigger time according to the acquisition frame rate of the image sensor; and
    determining the number of adjustments and the step size according to the value interval and the delay duration,
    wherein the acquisition frame rate is negatively correlated with a length of the value interval.
  9. The method according to claim 6, wherein the adjusting, in response to receiving delay information for an image sensor, the trigger time of the image sensor according to the delay information comprises:
    in response to receiving the delay information where the delay information has changed relative to adjacent delay information among previously received delay information, adjusting the trigger time of the image sensor according to a change value of the delay information.
  10. A method for synchronously collecting data, comprising:
    in response to receiving a data packet for a predetermined angle from a radar sensor, determining first time information at which the radar sensor collects point cloud data at the predetermined angle, wherein the predetermined angle is within a viewing angle range of an image sensor;
    determining delay information for the image sensor according to the first time information;
    adjusting a trigger time of the image sensor according to the delay information, so that the image sensor collects data in synchronization with the radar sensor rotated to the predetermined angle;
    in response to the trigger time being reached, sending a trigger signal to the image sensor; and
    in response to receiving image data collected by the image sensor, adding second time information to the image data, so that the second time information is used to align the image data with the point cloud data collected by the radar sensor.
  11. A synchronization determination method, comprising:
    obtaining a data packet sequence collected by a radar sensor and an image data sequence collected by an image sensor;
    for each data packet in the data packet sequence, determining image data matching the data packet according to time information of the data packet and time information of each piece of image data in the image data sequence;
    determining a synchronization relationship between each data packet and the matching image data according to a difference in time information between the data packet and the matching image data and a difference threshold; and
    determining a synchronization relationship between the radar sensor and the image sensor according to the synchronization relationships between a plurality of data packets in the data packet sequence and their respective matching image data.
  12. The method according to claim 11, further comprising:
    in response to the synchronization relationship between the radar sensor and the image sensor indicating that they are out of synchronization, sending prompt information to a target object.
  13. The method according to claim 11, further comprising determining the difference threshold by:
    obtaining fused data according to a historical data packet collected by the radar sensor and historical image data matching the historical data packet, wherein the fused data characterizes a positional deviation value between pixels obtained by projecting point cloud data in the historical data packet into an image coordinate system and pixels in the historical image data; and
    determining, when the positional deviation value reaches a predetermined deviation value, a difference in time information between the historical data packet and the historical image data as the difference threshold.
  14. The method according to claim 13, wherein the obtaining fused data according to the historical data packet collected by the radar sensor and the historical image data matching the historical data packet comprises:
    fusing the point cloud data in the historical data packet and the historical image data using a predetermined containerization tool to obtain the fused data.
  15. A vehicle-mounted terminal, configured to:
    in response to receiving a data packet for a predetermined angle from a radar sensor, determine first time information at which the radar sensor collects point cloud data at the predetermined angle, wherein the predetermined angle is within a viewing angle range of an image sensor;
    determine delay information for the image sensor according to the first time information; and
    send the delay information to a controller, so that the controller controls the image sensor to collect data in synchronization with the radar sensor rotated to the predetermined angle,
    wherein the data packet comprises the point cloud data collected by the radar sensor at the predetermined angle.
  16. A controller, configured to:
    in response to receiving delay information for an image sensor, adjust a triggering moment of the image sensor according to the delay information;
    in response to reaching the triggering moment, send a trigger signal to the image sensor; and
    in response to receiving image data collected by the image sensor, add time information to the image data, so that the image data can be aligned, using the time information, with point cloud data collected by a radar sensor,
    wherein the delay information is determined according to time information at which the radar sensor collects the point cloud data at a predetermined angle, and the predetermined angle is within a viewing angle range of the image sensor.
  17. An autonomous vehicle, comprising a vehicle-mounted terminal, a controller, a radar sensor and an image sensor, wherein:
    the vehicle-mounted terminal is configured to:
    in response to receiving a data packet for a predetermined angle from the radar sensor, determine first time information at which the radar sensor collects point cloud data at the predetermined angle, wherein the predetermined angle is within a viewing angle range of the image sensor, and the data packet includes the point cloud data collected by the radar sensor at the predetermined angle;
    determine delay information for the image sensor according to the first time information; and
    send the delay information to the controller; and
    the controller is configured to:
    in response to receiving the delay information, adjust a triggering moment of the image sensor according to the delay information;
    in response to reaching the triggering moment, send a trigger signal to the image sensor; and
    in response to receiving image data collected by the image sensor, add second time information to the image data, so that the image data can be aligned, using the second time information, with the point cloud data collected by the radar sensor.
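To make the division of labour in claim 17 concrete, the sketch below wires a terminal-side handler to the controller loop sketched earlier. It assumes a spinning lidar with constant angular velocity and uses only illustrative names; the on-vehicle interfaces are not specified by the claim.

```python
class VehicleTerminal:
    """Terminal side: on each lidar packet for the predetermined angle, estimate
    the rotation deviation and forward it to the controller as delay information."""

    def __init__(self, controller, rotation_period_s: float, predetermined_angle_deg: float):
        self.controller = controller                        # e.g. the TriggerController sketch above
        self.rotation_period_s = rotation_period_s           # time of a full 360° sweep
        self.predetermined_angle_deg = predetermined_angle_deg

    def on_lidar_packet(self, first_time_s: float) -> None:
        # first_time_s: when the sweep crossed the predetermined angle (claim 20
        # describes how it can be recovered from the packet itself).
        zero_time_s = first_time_s - self.rotation_period_s * self.predetermined_angle_deg / 360.0
        delay_s = zero_time_s % self.rotation_period_s        # rotation deviation, see claim 21
        self.controller.apply_delay(delay_s)
```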
  18. A cloud system, configured to:
    acquire a data packet sequence collected by a radar sensor and an image data sequence collected by an image sensor;
    for each data packet in the data packet sequence, determine the image data matching the data packet according to the time information of the data packet and the time information of each image data in the image data sequence;
    determine a synchronization relationship between each data packet and its matching image data according to a difference between their time information and a difference threshold; and
    determine a synchronization relationship between the radar sensor and the image sensor according to the synchronization relationships between the plurality of data packets in the data packet sequence and their respective matching image data.
  19. An apparatus for synchronously collecting data, comprising:
    a time information determination module configured to, in response to receiving a data packet for a predetermined angle from a radar sensor, determine first time information at which the radar sensor collects point cloud data at the predetermined angle, wherein the predetermined angle is within a viewing angle range of an image sensor;
    a delay information determination module configured to determine delay information for the image sensor according to the first time information; and
    an information sending module configured to send the delay information to a controller, so that the controller controls the image sensor to collect data synchronously with the radar sensor rotated to the predetermined angle,
    wherein the data packet includes the point cloud data collected by the radar sensor at the predetermined angle.
  20. The apparatus according to claim 19, wherein the time information determination module comprises:
    a parsing sub-module configured to parse the data packet to obtain time data in the data packet, wherein the time data includes a reference time and a periodic encapsulation time; and
    a first determination sub-module configured to determine, according to the reference time and the periodic encapsulation time, the first time information at which the radar sensor collects the point cloud data at the predetermined angle,
    wherein the reference time is aligned with a positioning device time of a positioning device in the vehicle where the radar sensor is located.
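Claim 20 splits the packet time into a reference time (aligned with the vehicle's positioning device, e.g. GNSS time) and a periodic encapsulation time. A sketch of recovering the first time information from such a packet is shown below; the byte layout, field widths and units are pure assumptions about a generic lidar packet, not the format of any particular sensor.

```python
import struct

def parse_packet_time(packet: bytes) -> float:
    """Hypothetical layout: the last 8 bytes hold a whole-second reference time
    (positioning-device aligned) and a microsecond offset within that second,
    both unsigned 32-bit little-endian. The first time information is their sum."""
    base_s, offset_us = struct.unpack_from("<II", packet, len(packet) - 8)
    return base_s + offset_us * 1e-6
```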
  21. The apparatus according to claim 19, wherein the delay information determination module comprises:
    a second determination sub-module configured to determine, according to the first time information, a rotation period of the radar sensor and the predetermined angle, second time information at which the radar sensor collects point cloud data at 0°;
    a deviation value determination sub-module configured to determine a rotation deviation value of the radar sensor according to a remainder obtained by taking the second time information modulo the rotation period; and
    a delay determination sub-module configured to determine the delay information for the image sensor according to the rotation deviation value.
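Claims 21 and 22 describe the delay derivation in three steps: shift the first time information back to the 0° crossing, take the remainder modulo the rotation period as the rotation deviation, and add a small positive error value. A minimal sketch, assuming a constant rotation speed and illustrative names:

```python
def compute_delay(first_time_s: float, rotation_period_s: float,
                  predetermined_angle_deg: float, error_margin_s: float = 0.0) -> float:
    """Return the delay information for the image sensor. error_margin_s stands
    for the predetermined error value of claim 22 and should be a small positive
    number in practice."""
    # Time at which the sweep was at 0°, assuming constant angular velocity.
    zero_angle_time_s = first_time_s - rotation_period_s * predetermined_angle_deg / 360.0
    # Rotation deviation: how far into the current sweep the 0° crossing falls.
    rotation_deviation_s = zero_angle_time_s % rotation_period_s
    return rotation_deviation_s + error_margin_s
```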
  22. The apparatus according to claim 21, wherein the delay determination sub-module is configured to:
    determine the delay information for the image sensor according to a sum of a predetermined error value and the rotation deviation value,
    wherein the predetermined error value is greater than 0, and the predetermined error value is determined according to an optical center position of the image sensor and a position of a target object within the viewing angle range of the image sensor.
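For instance, the sketch above could be driven as follows for a 10 Hz lidar with the camera facing the 90° direction; the 2 ms margin stands in for the optical-center-based error value and is an assumed figure, not one taken from the disclosure.

```python
# All numbers below are illustrative.
delay_s = compute_delay(first_time_s=1657600000.0371,   # sweep crossed 90° at this time
                        rotation_period_s=0.1,          # 10 Hz lidar
                        predetermined_angle_deg=90.0,
                        error_margin_s=0.002)           # predetermined error value > 0
```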
  23. The apparatus according to claim 19, wherein the controller includes an artificial intelligence chip.
  24. An apparatus for synchronously collecting data, comprising:
    a moment adjustment module configured to, in response to receiving delay information for an image sensor, adjust a triggering moment of the image sensor according to the delay information;
    a signal sending module configured to send a trigger signal to the image sensor in response to reaching the triggering moment; and
    a time adding module configured to, in response to receiving image data collected by the image sensor, add time information to the image data, so that the image data can be aligned, using the time information, with point cloud data collected by a radar sensor,
    wherein the delay information is determined according to time information at which the radar sensor collects the point cloud data at a predetermined angle, and the predetermined angle is within a viewing angle range of the image sensor.
  25. The apparatus according to claim 24, wherein the delay information indicates a delay duration, and the moment adjustment module comprises:
    an adjustment information determination sub-module configured to determine, according to an acquisition frame rate of the image sensor and the delay duration, a number of times the triggering moment is to be adjusted and a step size for each adjustment; and
    an adjustment sub-module configured to adjust the triggering moment of the image sensor according to the number of times and the step size.
  26. The apparatus according to claim 25, wherein the adjustment information determination sub-module comprises:
    an adjustment interval determination unit configured to determine, according to the acquisition frame rate of the image sensor, a value interval for the adjustment step size of the triggering moment; and
    an information determination unit configured to determine the number of times and the step size according to the value interval and the delay duration,
    wherein the acquisition frame rate is negatively correlated with a length of the value interval.
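Claims 25 and 26 spread a large delay over several frames instead of applying it in one jump: the per-adjustment step is drawn from an interval whose length shrinks as the frame rate grows. The sketch below illustrates that idea; the particular cap formula (interval_scale_s / frame_rate_hz) is an assumption used only to show the negative correlation.

```python
import math

def plan_trigger_adjustment(delay_s: float, frame_rate_hz: float,
                            interval_scale_s: float = 0.05):
    """Return (number_of_adjustments, step_s) for shifting the trigger moment.
    The maximum step shrinks as the frame rate grows, so a fast camera is nudged
    in smaller increments; all constants are illustrative."""
    frame_period_s = 1.0 / frame_rate_hz
    max_step_s = min(interval_scale_s / frame_rate_hz, frame_period_s / 2.0)
    num_steps = max(1, math.ceil(abs(delay_s) / max_step_s))
    return num_steps, delay_s / num_steps
```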
  27. The apparatus according to claim 24, wherein the moment adjustment module is configured to:
    in response to receiving the delay information, and the delay information having changed relative to adjacent delay information among the received delay information, adjust the triggering moment of the image sensor according to a change value of the delay information.
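Claim 27 only re-adjusts when a newly received delay differs from the previous one, and then only by the change. A minimal sketch, reusing the controller interface assumed earlier; the handling of the very first delay is an assumption:

```python
class DelayChangeFilter:
    """Apply a trigger adjustment only when the delay information changes, and
    apply only the change value; all names are illustrative."""

    def __init__(self, controller):
        self.controller = controller
        self.last_delay_s = None

    def on_delay(self, delay_s: float) -> None:
        if self.last_delay_s is None:
            self.controller.apply_delay(delay_s)                      # first value: apply in full
        elif delay_s != self.last_delay_s:
            self.controller.apply_delay(delay_s - self.last_delay_s)  # apply only the change
        self.last_delay_s = delay_s
```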
  28. An apparatus for synchronously collecting data, comprising:
    a time information determination module configured to, in response to receiving a data packet for a predetermined angle from a radar sensor, determine first time information at which the radar sensor collects point cloud data at the predetermined angle, wherein the predetermined angle is within a viewing angle range of an image sensor;
    a delay information determination module configured to determine delay information for the image sensor according to the first time information;
    a moment adjustment module configured to adjust a triggering moment of the image sensor according to the delay information, so that the image sensor collects data synchronously with the radar sensor rotated to the predetermined angle;
    a signal sending module configured to send a trigger signal to the image sensor in response to reaching the triggering moment; and
    a time adding module configured to, in response to receiving image data collected by the image sensor, add second time information to the image data, so that the image data can be aligned, using the second time information, with the point cloud data collected by the radar sensor,
    wherein the data packet includes the point cloud data collected by the radar sensor at the predetermined angle.
  29. A synchronization determination apparatus, comprising:
    a data acquisition module configured to acquire a data packet sequence collected by a radar sensor and an image data sequence collected by an image sensor;
    a data matching module configured to, for each data packet in the data packet sequence, determine the image data matching the data packet according to the time information of the data packet and the time information of each image data in the image data sequence;
    a data relationship determination module configured to determine a synchronization relationship between each data packet and its matching image data according to a difference between their time information and a difference threshold; and
    a synchronization relationship determination module configured to determine a synchronization relationship between the radar sensor and the image sensor according to the synchronization relationships between the plurality of data packets in the data packet sequence and their respective matching image data.
  30. The apparatus according to claim 29, further comprising:
    a prompt information sending module configured to send prompt information to a target object in response to the synchronization relationship between the radar sensor and the image sensor indicating that they are out of synchronization.
  31. The apparatus according to claim 29, further comprising a threshold determination module configured to determine the difference threshold, the threshold determination module comprising:
    a data fusion sub-module configured to obtain fusion data according to a historical data packet collected by the radar sensor and historical image data matching the historical data packet, wherein the fusion data represents a position deviation value between pixels obtained by projecting the point cloud data in the historical data packet into an image coordinate system and pixels in the historical image data; and
    a threshold determination sub-module configured to, in a case where the position deviation value reaches a predetermined deviation value, determine the difference between the time information of the historical data packet and the time information of the historical image data as the difference threshold.
  32. The apparatus according to claim 31, wherein the data fusion sub-module is configured to:
    fuse the point cloud data in the historical data packet with the historical image data by using a predetermined containerization tool to obtain the fusion data.
  33. An electronic device, comprising:
    at least one processor; and
    a memory communicatively connected to the at least one processor,
    wherein the memory stores instructions executable by the at least one processor, and the instructions, when executed by the at least one processor, cause the at least one processor to perform the method according to any one of claims 1 to 14.
  34. A non-transitory computer-readable storage medium storing computer instructions, wherein the computer instructions are configured to cause a computer to perform the method according to any one of claims 1 to 14.
  35. A computer program product, comprising a computer program stored on at least one of a readable storage medium and an electronic device, wherein the computer program, when executed by a processor, implements the method according to any one of claims 1 to 14.
PCT/CN2022/105187 2022-07-12 2022-07-12 Method and apparatus for synchronously collecting data, synchronization determination method and apparatus, and autonomous vehicle WO2024011408A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/105187 WO2024011408A1 (en) 2022-07-12 2022-07-12 Method and apparatus for synchronously collecting data, synchronization determination method and apparatus, and autonomous vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/105187 WO2024011408A1 (en) 2022-07-12 2022-07-12 Method and apparatus for synchronously collecting data, synchronization determination method and apparatus, and autonomous vehicle

Publications (1)

Publication Number Publication Date
WO2024011408A1 true WO2024011408A1 (en) 2024-01-18

Family

ID=89535235

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/105187 WO2024011408A1 (en) 2022-07-12 2022-07-12 Method and apparatus for synchronously collecting data, synchronization determination method and apparatus, and autonomous vehicle

Country Status (1)

Country Link
WO (1) WO2024011408A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004028601A (en) * 2002-06-21 2004-01-29 Mitsubishi Heavy Ind Ltd Monitoring laser radar system, and imaging method
CN110135485A (en) * 2019-05-05 2019-08-16 浙江大学 The object identification and localization method and system that monocular camera is merged with millimetre-wave radar
CN111435162A (en) * 2020-03-03 2020-07-21 深圳市镭神智能系统有限公司 Laser radar and camera synchronization method, device, equipment and storage medium
CN111522026A (en) * 2020-04-21 2020-08-11 北京三快在线科技有限公司 Data fusion method and device
CN112787740A (en) * 2020-12-26 2021-05-11 武汉光庭信息技术股份有限公司 Multi-sensor time synchronization device and method
CN113076383A (en) * 2020-01-06 2021-07-06 阿里巴巴集团控股有限公司 Road data acquisition vehicle and data acquisition system thereof
CN113985431A (en) * 2021-11-24 2022-01-28 杭州海康汽车软件有限公司 Data acquisition method, system, device, electronic equipment and storage medium
CN114488181A (en) * 2022-01-07 2022-05-13 北京理工大学 Multi-source heterogeneous sensor fusion method and device for camera and laser radar

Similar Documents

Publication Publication Date Title
US10789771B2 (en) Method and apparatus for fusing point cloud data
US10917617B2 (en) Tunnel deformation monitoring system
WO2021204144A1 (en) Data processing system and method, sensor, mobile acquisition backpack, and device
US20210356915A1 (en) Systems and methods for time synchronization
WO2020135382A1 (en) System, method, and apparatus for synchronizing time service of multiple sensors, and electronic device
CN111435162B (en) Laser radar and camera synchronization method, device, equipment and storage medium
CN112230240A (en) Space-time synchronization system, device and readable medium for laser radar and camera data
WO2021047271A1 (en) Time synchronization method and apparatus
EP3291551A1 (en) Image delay detection method and system
WO2019047575A1 (en) Fpga based data acquisition card, data acquisition system and data acquisition method
CN107229219A (en) It is a kind of based on GPS module, the computer precision time service method of embedded system and its to realize system
WO2022061799A1 (en) Pose estimation method and device
WO2024011408A1 (en) Method and apparatus for synchronously collecting data, synchronization determination method and apparatus, and autonomous vehicle
CN113219479A (en) Camera and laser radar synchronization method and system of intelligent driving control system
CN114754769A (en) Data synchronization time service system and method for laser radar and inertial sensor
WO2020107195A1 (en) Information synchronization method, unmanned aerial vehicle, load device, system and storage medium
CN112769516A (en) Data synchronous acquisition method and device, electronic equipment and storage medium
WO2023165569A1 (en) Multi-sensor simultaneous positioning method and apparatus, system, and storage medium
CN114415489B (en) Time synchronization method, device, equipment and medium for vehicle-mounted sensor
CN108964825A (en) Calibration method, correction device and time server
CN114911887B (en) Data processing method, device, equipment and storage medium
WO2022033097A1 (en) Method and apparatus for capturing target object, and system and storage medium
CN113985431A (en) Data acquisition method, system, device, electronic equipment and storage medium
CN114063089A (en) Rotation angle error detection method, device, equipment and storage medium
CN113099211A (en) Stereoscopic vision data acquisition system and method with time synchronization

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22950539

Country of ref document: EP

Kind code of ref document: A1