WO2024011408A1 - Method and apparatus for synchronous data collection, synchronization determination method and apparatus, and autonomous vehicle - Google Patents

Method and apparatus for synchronous data collection, synchronization determination method and apparatus, and autonomous vehicle

Info

Publication number
WO2024011408A1
WO2024011408A1 (PCT/CN2022/105187)
Authority
WO
WIPO (PCT)
Prior art keywords
data
image sensor
image
radar sensor
information
Prior art date
Application number
PCT/CN2022/105187
Other languages
English (en)
Chinese (zh)
Inventor
李贤飞
黄自瑞
张满江
Original Assignee
阿波罗智能技术(北京)有限公司
百度(美国)有限责任公司
Priority date
Filing date
Publication date
Application filed by 阿波罗智能技术(北京)有限公司, 百度(美国)有限责任公司 filed Critical 阿波罗智能技术(北京)有限公司
Priority to PCT/CN2022/105187
Publication of WO2024011408A1


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging

Definitions

  • the present disclosure relates to the field of artificial intelligence, specifically to technical fields such as autonomous driving, computer vision, and cloud computing, and in particular to a method for synchronously collecting data, a synchronization determination method, devices, a vehicle-mounted terminal, a controller, a vehicle, a cloud system, an electronic device, and readable storage media.
  • with the development of computer technology and network technology, autonomous driving technology has developed rapidly. In autonomous driving, it is usually necessary to rely on image data collected by image sensors and point cloud data collected by radar sensors to perceive the environmental information around the vehicle, and to determine the autonomous driving strategy based on the perception results.
  • the present disclosure aims to provide a method for synchronously collecting data, a synchronization determination method, a device, a vehicle-mounted terminal, a controller, a vehicle, a cloud system, an electronic device and a readable storage medium that are conducive to improving data alignment accuracy.
  • a method for synchronously collecting data, including: in response to receiving a data packet for a predetermined angle from a radar sensor, determining first time information at which the radar sensor collects point cloud data at the predetermined angle, wherein the predetermined angle is within the viewing angle range of an image sensor; determining delay information for the image sensor according to the first time information; and sending the delay information to a controller so that the controller controls the image sensor to collect data synchronously with the radar sensor rotated to the predetermined angle, where the data packet includes the point cloud data collected by the radar sensor at the predetermined angle.
  • a method for synchronously collecting data, including: in response to receiving delay information for an image sensor, adjusting the trigger moment of the image sensor according to the delay information; in response to reaching the trigger moment, sending a trigger signal to the image sensor; and in response to receiving image data collected by the image sensor, adding time information to the image data so as to use the time information to align the image data with point cloud data collected by a radar sensor, wherein the delay information is determined based on time information at which the radar sensor collects point cloud data at a predetermined angle, and the predetermined angle is within the viewing angle range of the image sensor.
  • a method for synchronously collecting data, including: in response to receiving a data packet for a predetermined angle from a radar sensor, determining first time information at which the radar sensor collects point cloud data at the predetermined angle, wherein the predetermined angle is within the viewing angle range of an image sensor; determining delay information for the image sensor according to the first time information; adjusting the trigger moment of the image sensor according to the delay information so that the image sensor collects data synchronously with the radar sensor rotated to the predetermined angle; in response to reaching the trigger moment, sending a trigger signal to the image sensor; and in response to receiving image data collected by the image sensor, adding second time information to the image data so as to use the second time information to align the image data with the point cloud data collected by the radar sensor.
  • a synchronization determination method, including: acquiring a data packet sequence collected by a radar sensor and an image data sequence collected by an image sensor; for each data packet in the data packet sequence, determining image data matching the data packet according to the time information of the data packet and the time information of each image data in the image data sequence; determining the synchronization relationship between each data packet and the matching image data based on the difference in time information between them and a difference threshold; and determining the synchronization relationship between the radar sensor and the image sensor based on the synchronization relationship between each of the multiple data packets in the data packet sequence and its matching image data.
  • a vehicle-mounted terminal configured to: in response to receiving a data packet for a predetermined angle from a radar sensor, determine first time information at which the radar sensor collects point cloud data at the predetermined angle, wherein the predetermined angle is within the viewing angle range of an image sensor; determine delay information for the image sensor according to the first time information; and send the delay information to a controller so that the controller controls the image sensor to collect data synchronously with the radar sensor rotated to the predetermined angle, where the data packet includes the point cloud data collected by the radar sensor at the predetermined angle.
  • a controller configured to: in response to receiving delay information for an image sensor, adjust the trigger moment of the image sensor according to the delay information; in response to reaching the trigger moment, send a trigger signal to the image sensor; and in response to receiving image data collected by the image sensor, add time information to the image data so as to use the time information to align the image data with point cloud data collected by a radar sensor, wherein the delay information is determined based on time information at which the radar sensor collects point cloud data at a predetermined angle, and the predetermined angle is within the viewing angle range of the image sensor.
  • an autonomous vehicle including a vehicle-mounted terminal, a controller, a radar sensor, and an image sensor, wherein the vehicle-mounted terminal is configured to: in response to receiving a data packet for a predetermined angle from the radar sensor, determine first time information at which the radar sensor collects point cloud data at the predetermined angle, wherein the predetermined angle is within the viewing angle range of the image sensor, and the data packet includes the point cloud data collected by the radar sensor at the predetermined angle; determine delay information for the image sensor according to the first time information; and send the delay information to the controller;
  • the controller is configured to: in response to receiving the delay information, adjust the trigger moment of the image sensor according to the delay information; in response to reaching the trigger moment, send a trigger signal to the image sensor; and in response to receiving the image data collected by the image sensor, add second time information to the image data so as to use the second time information to align the image data with the point cloud data collected by the radar sensor.
  • a cloud system configured to: obtain a data packet sequence collected by a radar sensor and an image data sequence collected by an image sensor; for each data packet in the data packet sequence, determine image data matching the data packet according to the time information of the data packet and the time information of each image data in the image data sequence; determine the synchronization relationship between each data packet and the matching image data based on the difference in time information between them and a difference threshold; and determine the synchronization relationship between the radar sensor and the image sensor based on the synchronization relationship between each of the multiple data packets in the data packet sequence and its matching image data.
  • a device for synchronously collecting data, including: a time information determination module, configured to determine, in response to receiving a data packet for a predetermined angle from a radar sensor, first time information at which the radar sensor collects point cloud data at the predetermined angle, wherein the predetermined angle is within the viewing angle range of an image sensor; a delay information determination module, configured to determine delay information for the image sensor based on the first time information; and an information sending module, configured to send the delay information to a controller so that the controller controls the image sensor to collect data synchronously with the radar sensor rotated to the predetermined angle, wherein the data packet includes the point cloud data collected by the radar sensor at the predetermined angle.
  • a device for synchronously collecting data, including: a time adjustment module, configured to adjust the trigger moment of an image sensor according to delay information in response to receiving the delay information for the image sensor; a signal sending module, configured to send a trigger signal to the image sensor in response to reaching the trigger moment; and a time adding module, configured to add time information to image data in response to receiving the image data collected by the image sensor, so as to use the time information to align the image data with point cloud data collected by a radar sensor, wherein the delay information is determined based on time information at which the radar sensor collects the point cloud data at a predetermined angle, and the predetermined angle is within the viewing angle range of the image sensor.
  • a device for synchronously collecting data, including: a time information determination module, configured to determine, in response to receiving a data packet for a predetermined angle from a radar sensor, first time information at which the radar sensor collects point cloud data at the predetermined angle, wherein the predetermined angle is within the viewing angle range of an image sensor; a delay information determination module, configured to determine delay information for the image sensor based on the first time information; a time adjustment module, configured to adjust the trigger moment of the image sensor based on the delay information so that the image sensor collects data synchronously with the radar sensor rotated to the predetermined angle; a signal sending module, configured to send a trigger signal to the image sensor in response to reaching the trigger moment; and a time adding module, configured to add second time information to the image data in response to receiving the image data collected by the image sensor, so as to align the image data with the point cloud data collected by the radar sensor using the second time information.
  • a synchronization determination device, including: a data acquisition module, configured to acquire a data packet sequence collected by a radar sensor and an image data sequence collected by an image sensor; a data matching module, configured to determine, for each data packet in the data packet sequence, the image data matching the data packet based on the time information of the data packet and the time information of each image data in the image data sequence; a data relationship determination module, configured to determine the synchronization relationship between each data packet and the matching image data based on the difference in time information between them and a difference threshold; and a synchronization relationship determination module, configured to determine the synchronization relationship between the radar sensor and the image sensor based on the synchronization relationship between each of the multiple data packets in the data packet sequence and its matching image data.
  • an electronic device, including: at least one processor; and a memory communicatively connected to the at least one processor, wherein the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to execute the method for synchronously collecting data or the synchronization determination method provided by the present disclosure.
  • a non-transitory computer-readable storage medium storing computer instructions, wherein the computer instructions are used to cause the computer to execute the method of synchronously collecting data or the method of synchronization determination provided by the present disclosure.
  • a computer program product including a computer program/instruction that, when executed by a processor, implements the method for synchronously collecting data or the method for synchronization determination provided by the present disclosure.
  • Figure 1 is a schematic diagram of an application scenario of a method for synchronously collecting data, a synchronization determination method, and devices according to an embodiment of the present disclosure.
  • Figure 2 is a schematic flowchart of a method for synchronously collecting data according to an embodiment of the present disclosure.
  • Figure 3 is a schematic diagram of the principle of determining delay information for an image sensor according to an embodiment of the present disclosure.
  • Figure 4 is a schematic flowchart of a method for synchronously collecting data according to another embodiment of the present disclosure.
  • Figure 5 is a schematic flowchart of adjusting the triggering time of an image sensor according to an embodiment of the present disclosure.
  • Figure 6 is a schematic diagram of the principle of a method for synchronously collecting data according to an embodiment of the present disclosure.
  • Figure 7 is a schematic diagram of the principle of synchronous data collection according to an embodiment of the present disclosure.
  • Figure 8 is a schematic flowchart of a synchronization determination method according to an embodiment of the present disclosure.
  • Figure 9 is a schematic diagram of a vehicle-mounted terminal according to an embodiment of the present disclosure.
  • Figure 10 is a schematic diagram of a controller according to an embodiment of the present disclosure.
  • Figure 11 is a schematic diagram of an autonomous vehicle according to an embodiment of the present disclosure.
  • Figure 12 is a schematic diagram of a cloud system according to an embodiment of the present disclosure.
  • Figure 13 is a structural block diagram of a device for synchronously collecting data according to an embodiment of the present disclosure.
  • Figure 14 is a structural block diagram of a device for synchronously collecting data according to another embodiment of the present disclosure.
  • Figure 15 is a structural block diagram of a device for synchronously collecting data according to another embodiment of the present disclosure.
  • Figure 16 is a structural block diagram of a synchronization determination device according to an embodiment of the present disclosure.
  • Figure 17 is a block diagram of an electronic device used to implement the method for synchronously collecting data or the synchronization determination method according to an embodiment of the present disclosure.
  • autonomous vehicles need to sense rich environmental information around them and take timely safety measures when there are safety hazards in the surrounding environment.
  • autonomous vehicles are usually equipped with radar sensors and image sensors.
  • Image sensors are used to collect image data to obtain rich texture and color information in the environment
  • radar sensors are used to collect point cloud data to obtain distance information of objects in the environment.
  • reconstructed three-dimensional environmental information can be obtained by fusing image data and point cloud data.
  • if the collection times of the image data and the point cloud data are not synchronized, the environmental information obtained by fusing them will deviate from the actual environmental information. This deviation is amplified as the speed of the autonomous vehicle increases, which affects the safe driving of the autonomous vehicle.
  • the present disclosure provides a method and device for synchronous data collection that improve the collection synchronization of image data and point cloud data, and also provides a synchronization determination method and device that efficiently and accurately evaluate the synchronization of data collection.
  • Figure 1 is a schematic diagram of application scenarios of a method for synchronously collecting data, a method for determining synchronization, and a device according to an embodiment of the present disclosure.
  • the application scenario 100 of this embodiment may include an autonomous vehicle 110 , a road 120 and a cloud system 130 .
  • the autonomous vehicle 110 may be integrated with a vehicle-mounted terminal, a radar sensor 111 and an image sensor 112.
  • the vehicle-mounted terminal can be connected to the radar sensor 111 and the image sensor 112 through communication cables.
  • the vehicle-mounted terminal can obtain the point cloud data collected by the radar sensor 111 and the image data collected by the image sensor 112 through the communication cable.
  • the vehicle-mounted terminal can fuse image data and point cloud data in real time, decide the driving strategy of the vehicle based on the fusion results, and send control signals to the power system of the autonomous vehicle 110 according to the driving strategy to achieve autonomous driving.
  • the radar sensor 111 may be, for example, a laser radar, a millimeter wave radar, or other sensor with a mechanical rotation function.
  • the radar sensor 111 may be, for example, installed on the roof of an autonomous vehicle.
  • the radar sensor 111 can rotate 360° under the control of pulse signals to collect all-round point cloud data around the autonomous vehicle.
  • the radar sensor can obtain the position point information of objects in the environment around the autonomous vehicle by emitting and receiving laser beams, and performs three-dimensional modeling based on the position point information to obtain point cloud data.
  • the image sensor 112 may be, for example, any one of the following cameras installed on the autonomous vehicle 110: a front-view camera, a surround-view camera installed in any direction, a rear-view camera, a side-view camera, etc.
  • the autonomous vehicle 110 may also be integrated with a positioning device.
  • the positioning device may be composed of a Global Positioning System (GPS) and a Geographic Information System (GIS) to realize tracking and positioning of the autonomous vehicle.
  • the vehicle-mounted terminal can, for example, use the time of the positioning device as the reference time to control the timing of data collection by the radar sensor and the image sensor, so that the radar sensor and the image sensor collect data synchronously.
  • the vehicle-mounted terminal may, for example, control the triggering of the image acquisition device based on the time information of the radar sensor collecting point cloud data, so that the radar sensor and the image sensor collect data synchronously.
  • the autonomous vehicle 110 may also be integrated with a controller that controls the triggering of the image sensor 112 , and the controller may be an artificial intelligence chip, for example.
  • the vehicle-mounted terminal can be communicatively connected with the controller to send delay information determined based on the time information of the radar sensor collecting point cloud data to the controller, so that the controller controls the triggering of the image sensor based on the delay information.
  • the vehicle-mounted terminal can also communicate with the cloud system 130 through a wireless communication link.
  • the cloud system 130 can monitor the driving of the autonomous vehicle based on the data uploaded by the vehicle terminal.
  • the cloud system 130 can obtain, from the vehicle-mounted terminal, the point cloud data and image data collected by the autonomous vehicle within a predetermined period, and evaluate the synchronicity of the collected point cloud data and image data based on their fusion result, so as to provide a reference for adjusting the synchronous collection strategy for point cloud data and image data.
  • the method for synchronously collecting data provided by the present disclosure can be executed by the autonomous vehicle 110 .
  • some operations can be executed by the vehicle-mounted terminal in the autonomous vehicle 110 and some operations can be executed by the controller.
  • the device for synchronously collecting data provided by the present disclosure can be provided in the autonomous vehicle 110.
  • some modules can be provided in the vehicle-mounted terminal and some modules can be provided in the controller.
  • the synchronization determination method provided by the present disclosure can be executed by the cloud system 130 .
  • the synchronization determining device provided by the present disclosure can be provided in the cloud system 130 .
  • the structures and types of the autonomous vehicle 110, radar sensor 111, image sensor 112 and cloud system 130 in Figure 1 are only schematic. Depending on implementation requirements, the autonomous vehicle 110, radar sensor 111, image sensor 112 and cloud system 130 may have any structure and type.
  • Figure 2 is a schematic flowchart of a method for synchronously collecting data according to an embodiment of the present disclosure.
  • the method 200 for synchronously collecting data in this embodiment may include operations S210 to S230.
  • the method 200 may be executed, for example, by a vehicle-mounted terminal in an autonomous vehicle.
  • in operation S210, in response to receiving a data packet for a predetermined angle from the radar sensor, first time information at which the radar sensor collects point cloud data at the predetermined angle is determined. In operation S220, delay information for the image sensor is determined according to the first time information. In operation S230, the delay information is sent to the controller.
  • the vehicle-mounted terminal may receive a data packet sent by a radar sensor, where the data packet includes point cloud data collected by the radar sensor.
  • the vehicle-mounted terminal can use the timestamp of the data packet as the time information at which the radar sensor collects the point cloud data. Here, the timestamp can be added by the radar sensor when encapsulating the collected point cloud data into the data packet.
  • the radar sensor may send data packets using the UDP/IP protocol.
  • the data packet includes an Ethernet header and User Datagram Protocol data (UDP data).
  • the UDP data may include ranging data and additional information.
  • the ranging data consists of multiple data blocks, each of which includes an azimuth angle value, a distance value, etc. The additional information may include the speed of the motor that drives the radar sensor to rotate and time data.
  • the vehicle-mounted terminal can parse the received data packet and determine whether a data packet for a predetermined angle is received based on the parsing result. For example, a data packet whose azimuth angle value is a predetermined angle is a data packet for the predetermined angle.
  • This embodiment can obtain the first time information of the radar sensor collecting point cloud data at a predetermined angle based on the time data in the data packet for the predetermined angle.
  • the predetermined angle is an angle within the viewing angle range of the image sensor.
  • the predetermined angle can be determined based on the angle between the main optical axis of the image sensor and the horizontal plane.
  • the predetermined angle can be the value of the included angle.
  • the angle at which the radar sensor collects point cloud data can be represented by the angle between the laser beam emitted by the radar sensor when collecting point cloud data and the horizontal plane.
  • the time data in the data packet can include a standard time (such as Coordinated Universal Time, UTC) and the periodic encapsulation time of the data packet.
  • the value range of the periodic encapsulation time of the data packet can be [0 μs, 1 s].
  • the standard time can represent the year, month, day, hour, minute, and second at which the data was collected.
  • the standard time and the periodic encapsulation time can be added to obtain the time information at which the ranging data in the data packet was collected.
  • the data packet whose azimuth angle value is a predetermined angle can be used as the target data packet, and the time information for collecting the ranging data in the target data packet can be used as the time information determined in operation S210. It can be understood that the combination of ranging data and azimuth angle values can represent point cloud data.
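  • as an illustrative sketch only (not part of the original disclosure), the computation described above, adding the standard time and the periodic encapsulation time and treating the packet whose azimuth matches the predetermined angle as the target packet, could look as follows in Python; the field names and the 0.01° azimuth unit are assumptions:

      # Hypothetical sketch: packet time information and target-packet test.
      PREDETERMINED_ANGLE = 1500  # assumed: 15.00 deg expressed in 0.01 deg units

      def packet_time(standard_time_s: int, periodic_time_us: int) -> float:
          # time info = standard time (whole seconds, UTC) + periodic
          # encapsulation time (cycles within [0 us, 1 s])
          return standard_time_s + periodic_time_us / 1e6

      def is_target_packet(azimuth_0_01deg: int) -> bool:
          # a packet for the predetermined angle
          return azimuth_0_01deg == PREDETERMINED_ANGLE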
  • this embodiment can also calculate the time information at which the radar sensor collects point cloud data at the predetermined angle based on the time data in the data packet whose azimuth angle value is 0°. For example, let the time information determined based on the time data in the data packet with an azimuth angle value of 0° be T0, the predetermined angle be angle, and the rotation period of the radar sensor be LIDAR_ROT_INTERVAL. The following formula (1) can be used to calculate the time information T' at which the radar sensor collects point cloud data at the predetermined angle:
  • T' = T0 + angle * LIDAR_ROT_INTERVAL / 360    (1)
  • the delay information for the image sensor can be determined based on the difference between the time information T and the time information T'.
  • after obtaining the difference between the time information T and the time information T', the difference can be taken modulo the time interval at which the image sensor collects images, and the remainder can be used as the delay information.
  • the value of the delay information can be limited to the time interval during which the image sensor collects images.
  • the adjustment amount of the acquisition time can be reduced.
  • the time interval at which the image sensor collects images may, for example, be positively correlated with the reciprocal of the acquisition frame rate of the image sensor.
  • the unit of the acquisition frame rate may be, for example, fps, that is, the acquisition frame rate is the number of image data collected by the image sensor per second.
  • the value of the delay information can be limited to within the time interval at which the image sensor collects images because the image sensor collects image data periodically according to the acquisition frame rate; for synchronous data collection, it is sufficient that the image data collected by the image sensor includes data collected simultaneously with the point cloud data collected by the radar sensor rotated to the predetermined angle.
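  • a minimal sketch of this first variant, under assumed names and rates (10 Hz radar, 30 fps camera): formula (1) gives the ideal time T', and the difference T - T' is folded into the camera's acquisition interval:

      LIDAR_ROT_INTERVAL = 0.1   # assumed radar rotation period, s (10 Hz)
      CAM_INTERVAL = 1.0 / 30.0  # assumed camera acquisition interval, s (30 fps)

      def ideal_time(t0: float, angle_deg: float) -> float:
          # formula (1): T' = T0 + angle * LIDAR_ROT_INTERVAL / 360
          return t0 + angle_deg * LIDAR_ROT_INTERVAL / 360.0

      def delay_info(t: float, t_prime: float) -> float:
          # remainder of the difference modulo the camera interval,
          # which keeps the delay within one acquisition period
          return (t - t_prime) % CAM_INTERVAL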
  • the delay information can be sent to the controller that controls the triggering timing of the image sensor, so that the controller controls the triggering timing of the image sensor, that is, the acquisition time of the image sensor, enabling the image sensor to collect data synchronously with the radar sensor rotated to the predetermined angle. That is, at the moment the radar sensor rotated to the predetermined angle collects point cloud data, the image sensor synchronously collects image data.
  • this embodiment determines the delay information based on the time information at which the radar sensor collects point cloud data and sends the delay information to the controller, so that the time at which the image sensor collects image data is aligned with the time at which the radar sensor collects point cloud data within the viewing angle range of the image sensor. The collected image data and point cloud data thus express the environmental information at the same moment, which can improve the alignment accuracy of the collected image data and point cloud data, help the autonomous vehicle make correct driving strategies, and improve the driving safety of the autonomous vehicle.
  • the technical solution of this embodiment can improve the synchronization of the data collected by the two sensors and ensure that the image data collected by the image sensor includes data aligned with the point cloud data collected by the radar sensor.
  • the collection angle data and time data in the data packet can be obtained by parsing the data packet from the radar sensor.
  • for example, the data packet from the lidar can be parsed, and the value of the parsed Azimuth Angle parameter can be used as the collection angle data, that is, as the azimuth angle value.
  • the parsed UTC parameter value and the GPS Timestamp parameter value can be used as time data.
  • This embodiment can use any parsing tool that parses UDP packets to parse the data packets from the radar sensor.
  • the above-mentioned radar sensor models and parsing tools are only examples to facilitate understanding of the present disclosure, and the present disclosure is not limited thereto.
  • the values of the corresponding parameters obtained by analysis can be used as the collection angle data and time data.
  • corresponding parsing tools can be used to parse the data packets.
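  • as a purely hypothetical sketch of such parsing (the byte layout below is invented for illustration; a real lidar's layout comes from its manual), the standard struct module can unpack the azimuth and time fields from the UDP payload:

      import struct

      def parse_packet(udp_payload: bytes):
          # assumed layout: little-endian; azimuth (uint16, 0.01 deg units) at
          # offset 2 of the first data block; UTC seconds (uint32) and a
          # microsecond timestamp (uint32) in the last 8 bytes of the packet
          azimuth, = struct.unpack_from('<H', udp_payload, 2)
          utc_s, ts_us = struct.unpack_from('<II', udp_payload, len(udp_payload) - 8)
          return azimuth, utc_s + ts_us / 1e6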
  • the reference time described above may, for example, be aligned with the time of the positioning device in the vehicle where the radar sensor is located.
  • the positioning device can send a recommended positioning information (Recommended Minimum Specific GPS/TRANSIT Data, GPRMC) data packet to the radar sensor.
  • the radar sensor can use the UTC time in the GPRMC packet as the initial time.
  • the sum of the initial time and the time difference between the moment the point cloud data is collected and the moment the GPRMC data packet is received can be used as the reference time.
  • the UTC time in the GPRMC data packet is usually accurate only to the second; by adding the locally measured time difference, the data packets of point cloud data collected by the radar sensor can carry high-precision timestamps, which is conducive to achieving high-precision control of the image sensor.
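  • a one-line sketch of this reference-time computation, with assumed variable names (the GPRMC UTC time in whole seconds plus the locally measured offset since the GPRMC packet arrived):

      def reference_time(gprmc_utc_s: int, now_local_us: int, gprmc_local_us: int) -> float:
          # seconds-precision UTC base + sub-second locally measured offset
          return gprmc_utc_s + (now_local_us - gprmc_local_us) / 1e6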
  • FIG. 3 is a schematic diagram of the principle of determining delay information for an image sensor according to an embodiment of the present disclosure.
  • the time information of the radar sensor when its rotation angle is 0° can be calculated based on the time information of the point cloud data collected at the predetermined angle. Based on the calculated time information at 0° and the rotation period of the radar sensor, the rotation deviation value of the radar sensor is determined, and the delay information is then determined based on this rotation deviation value.
  • based on the first time information 301 of the point cloud data collected by the radar sensor at the predetermined angle, the predetermined angle 302, and the rotation period 303 of the radar sensor, the second time information 304 at which the radar sensor collects point cloud data at 0° can be calculated, for example, according to formula (2): T0' = T - angle * LIDAR_ROT_INTERVAL / 360.
  • if the minimum angular resolution of the radar sensor is 0.01° and the Azimuth Angle in the data packet is expressed in units of 0.01°, then the predetermined angle in this embodiment should also be expressed in units of 0.01°. In this case, 360 in formula (2) should be replaced by 36000.
  • the second time information 304 can be taken modulo the rotation period 303, and the rotation deviation value 305 of the radar sensor can be determined based on the remainder.
  • assuming the initial angle of the radar sensor is 0°, the starting time at which the radar sensor collects point cloud data is usually a whole second, while the rotation period of the radar sensor is usually on the order of milliseconds. If there were no deviation in the rotation of the radar sensor, the time information at which the radar sensor collects point cloud data at an angle of 0° would be an integer multiple of the rotation period. Therefore, the remainder obtained by the modulo operation can represent the rotation deviation value. It can be understood that the difference between T and T' mentioned above can be understood as the rotation deviation value determined in this embodiment.
  • this embodiment may use the rotational deviation value 305 as the delay information of the image sensor.
  • a method similar to that described above can also be used, in which the rotation deviation value 305 is taken modulo the time interval at which the image sensor collects images, and the remainder is used as the delay information.
  • the delay information may also be determined based on the sum of the rotation deviation value 305 and the rotation period 303, for example. In this way, it can be avoided that, when the first time information T is small, the T0' obtained by the above formula (2) is negative and cannot be taken modulo the time interval at which the image sensor collects images.
  • the sum of the rotation deviation value 305 and the rotation period 303 may be taken modulo the time interval at which the image sensor collects images, and the remainder may be used as the delay information 306 of the image sensor.
  • a predetermined error value 307 determined based on the optical center position of the image sensor and the position of the target object within the viewing angle range of the image sensor may also be considered.
  • the target object may be, for example, the ground, and the predetermined error value may be determined based on the angle between the main optical axis of the image sensor and the straight line connecting the optical center of the image sensor with a point on the central axis of the vehicle at a predetermined distance from the vehicle.
  • the predetermined error value is a value greater than 0.
  • the predetermined error value 307 may be an empirical value. For example, the predetermined error value 307 may be 5 ms.
  • this embodiment may determine the delay information of the image sensor based on the sum of the predetermined error value 307 and the rotational deviation value 305 .
  • in this way, the accuracy of the determined delay information can be improved. This is because, in the image data collected by the image sensor, the data that reflect objects on the road affecting the driving of the vehicle are the pixels in the lower part of the image; therefore, when aligning data, it is usually preferable to focus on the pixels in the lower part of the image data collected by the image sensor.
  • the angle at which the radar sensor can collect objects on the road usually deviates from the predetermined angle, and this deviation will cause a deviation in determining the acquisition time of the image sensor.
  • This embodiment can compensate for this deviation by setting a predetermined error value 307. In this way, controlling the time at which the image sensor collects image data based on the delay information determined in this embodiment can enable the image sensor and the radar sensor to synchronously collect objects on the road, thereby improving the synchronization accuracy of data collection.
  • this embodiment may also determine the delay information of the image sensor based on the sum of the predetermined error value 307, the rotation deviation value 305 and the rotation period 303.
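  • the variant of Figure 3 can be sketched end to end as follows (names and values are assumptions; formula (2) recovers the 0° time, the modulo gives the rotation deviation 305, and the rotation period 303 and predetermined error value 307 are added before folding into the camera interval to obtain the delay information 306):

      LIDAR_ROT_INTERVAL = 0.1     # assumed rotation period 303, s
      CAM_INTERVAL = 1.0 / 30.0    # assumed camera acquisition interval, s
      PREDETERMINED_ERROR = 0.005  # assumed error value 307, s (e.g. 5 ms)

      def delay_from_first_time(t_first: float, angle_deg: float) -> float:
          # formula (2): time at which the radar passed 0 deg
          t0 = t_first - angle_deg * LIDAR_ROT_INTERVAL / 360.0
          # rotation deviation 305: remainder modulo the rotation period
          deviation = t0 % LIDAR_ROT_INTERVAL
          # adding one rotation period keeps the operand positive for small
          # t_first; the error value compensates the road-surface geometry
          return (deviation + LIDAR_ROT_INTERVAL + PREDETERMINED_ERROR) % CAM_INTERVAL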
  • the controller that controls the image sensor can be, for example, an artificial intelligence chip such as a Field Programmable Gate Array (FPGA) chip, which is suitable for adjusting the acquisition time of an image sensor with a high sampling frequency based on the delay information, and can therefore be applied to scenarios where data is collected synchronously in real time. Accordingly, the synchronous data collection of this embodiment can be performed periodically.
  • after receiving the delay information, the controller can, for example, adjust the triggering time of the image sensor based on the delay information, so that the image sensor collects data synchronously with the radar sensor rotated to the predetermined angle.
  • the method of synchronously collecting data executed by the controller will be described in detail below with reference to Figures 4 and 5.
  • Figure 4 is a schematic flowchart of a method for synchronously collecting data according to an embodiment of the present disclosure.
  • the method 400 of synchronously collecting data in this embodiment may include operations S410 to S430.
  • the method 400 can be executed by a controller, for example, by an artificial intelligence chip (such as an FPGA).
  • in operation S410, in response to receiving delay information for the image sensor, the trigger moment of the image sensor is adjusted according to the delay information. In operation S420, in response to reaching the trigger moment, a trigger signal is sent to the image sensor. In operation S430, in response to receiving the image data collected by the image sensor, time information is added to the image data.
  • the system time of the controller may, for example, be aligned with the time of the positioning device in the vehicle where the controller is located.
  • the positioning device can send a GPRMC data packet to the controller.
  • the controller can use the UTC time in the GPRMC packet as the initial time.
  • the controller can control the image sensor to collect image data at an acquisition frame rate of 30fps starting from the initial time.
  • the delay information is sent by the vehicle-mounted terminal to the controller through operation S230 described above.
  • the controller can adjust the time at which the image sensor collects image data after the current time according to the delay information.
  • the adjusted trigger time can be obtained by adding the trigger time after the current time and the delay duration indicated by the delay information.
  • the time interval between two adjacent triggering moments of the image sensor can be considered.
  • this embodiment can use, as the adjustment amount, the remainder obtained by taking the delay duration indicated by the delay information modulo the time interval, and add the adjustment amount to the trigger time after the current time to obtain the adjusted trigger time.
  • the degree of adjustment of the trigger time can be reduced while ensuring synchronous data collection, and frame loss caused by adjusting the trigger time can be avoided.
  • the controller can control the trigger of the image sensor according to the adjusted trigger time, so that the image sensor collects data synchronously with the radar sensor rotated to a predetermined angle.
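  • a minimal sketch of this adjustment, with assumed names (the delay is folded into the interval between adjacent trigger moments so that the shift never exceeds one frame):

      CAM_INTERVAL = 1.0 / 30.0  # assumed interval between adjacent triggers, s

      def adjusted_trigger(next_trigger_s: float, delay_s: float) -> float:
          adjustment = delay_s % CAM_INTERVAL  # remainder used as adjustment amount
          return next_trigger_s + adjustment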
  • the trigger signal sent to the image sensor is any type of trigger signal that the image sensor can recognize and receive, and the present disclosure does not limit this.
  • the image sensor is triggered in response to the trigger signal and begins collecting image data. After collecting the image data, the image sensor can send the collected image data to the controller. After receiving the image data, the controller can add time information to the image data according to the time information of the current moment.
  • in the present disclosure, both the point cloud data collected by the radar sensor and the image data collected by the image sensor have time information.
  • the vehicle-mounted terminal can align the two kinds of data in time based on their time information and fuse the aligned data to reconstruct the environmental information.
  • the data packets for the predetermined angle received on the vehicle, and thus the determined delay information, are both periodic. The controller accordingly uses the periodically received delay information to adjust the triggering moment of the image sensor, which can avoid inaccurate data alignment caused by rotation deviation of the radar sensor during vehicle operation and helps improve the driving safety of the autonomous vehicle.
  • FIG. 5 is a schematic flowchart of adjusting the triggering time of an image sensor according to an embodiment of the present disclosure.
  • when adjusting the triggering time of the image sensor, if the delay duration indicated by the delay information is long, the triggering time can be adjusted step by step, so as to avoid the images collected by the image sensor losing frames relative to the radar sensor due to a large single adjustment, which would affect the autonomous driving of the vehicle.
  • the number of times to adjust the trigger moment and the step size of each adjustment may be determined based on the acquisition frame rate of the image sensor and the delay duration. The triggering time of the image sensor is then adjusted according to this number of times and step size.
  • the time interval at which the image sensor collects image data can first be determined based on the acquisition frame rate; the time interval can be the reciprocal of the acquisition frame rate. Then, the delay duration is divided by the time interval and rounded up, and the rounded value is used as the number of times to adjust the trigger moment. Let the number of times be n; then for the 1st to (n-1)-th adjustments, the adjustment step is the value of the time interval, and for the n-th adjustment, the adjustment step is the remainder of the delay duration divided by the time interval. Alternatively, for the first adjustment, the step is the remainder of the delay duration divided by the time interval, and for the subsequent adjustments, the step is the value of the time interval.
  • the delay duration can be equally divided into n parts, and the step size of each adjustment in n adjustments is the length of one part of the delay duration.
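  • a sketch of this step-size schedule under assumed names (ceil-division gives the number of adjustments n; here the first of the two orderings described above, n-1 full intervals followed by the remainder):

      import math

      def adjustment_schedule(delay_s: float, frame_rate_fps: float) -> list:
          interval = 1.0 / frame_rate_fps           # reciprocal of the frame rate
          n = math.ceil(delay_s / interval)         # number of adjustments
          if n == 0:
              return []
          remainder = delay_s - (n - 1) * interval  # what is left for the last step
          return [interval] * (n - 1) + [remainder]

      # e.g. adjustment_schedule(0.07, 30) -> [0.0333..., 0.0333..., 0.0033...]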
  • the value interval 502 of the adjustment step size at the triggering moment can be determined first based on the acquisition frame rate 501 of the image sensor. Then, the number of adjustments 503 and the step size of each adjustment 504 are determined based on the value interval 502 and the delay length.
  • the collection frame rate 501 may be negatively correlated with the length of the value interval 502 .
  • the length of the value interval 502 may be proportional to the length of the time interval for collecting image data.
  • the upper limit of the value in the value interval 502 can be set to the value of the time interval, or the upper limit of the value can be set to 0.5 times the time interval, etc. This disclosure does not limit this.
  • this embodiment may divide the delay duration by the value of the value interval 502, round up, and use the value obtained by rounding up as the number of adjustments.
  • the controller may, for example, maintain the delay information received in the past. After receiving the latest delay information, the controller may, for example, first find, among the maintained delay information, the delay information whose reception time is adjacent to that of the latest delay information, and then determine whether the latest delay information has changed relative to the adjacent delay information. If a change has occurred, the triggering time of the image sensor is adjusted; specifically, it can be adjusted according to the change value of the delay information. In this way, iterative adjustment of the image sensor can be realized, the degree of adjustment of the triggering moment within a single adjustment cycle can be reduced, and the stability of the image data collected by the image sensor can be improved.
  • for example, this embodiment can calculate the difference between the delay duration indicated by the m-th delay information 505 and that indicated by the (m-1)-th delay information, and determine the number of adjustments 503 and the step size 504 of each adjustment based on this difference.
  • the lower limit of the value interval 502 determined based on the acquisition frame rate 501 may be, for example, a value less than 0.
  • the absolute value of the difference between the lower limit value and 0 may be smaller than the absolute value of the difference between the upper limit value of the value interval and 0. This is due to the limitation of the acquisition frame rate of the image sensor.
  • the difference between this lower limit value and 0 is inversely related to the acquisition frame rate.
  • for example, if the acquisition frame rate of the image sensor is 30 fps, the value interval can be set to [-200 μs, 32 ms]; if the acquisition frame rate is 15 fps, the value interval can be set to [-400 μs, 64 ms].
  • the controller may, for example, send a trigger signal every two frames. After receiving the trigger signal, the image sensor continuously collects three frames of image data. The differences between the collection time of the second frame of image data, the third frame of image data and the collection time of the first frame of image data are 33.3ms and 66.7ms respectively.
  • the controller may only adjust the sending timing of the trigger signal sent every two frames, thereby adjusting the triggering moment of the image sensor. If the number of adjustments is multiple, the controller can adjust the signal sending mechanism that sends the trigger signal every two frames to the signal sending mechanism that sends the trigger signal every frame. After completing the adjustment of the trigger time and sending the trigger signal to the image sensor according to the adjusted trigger time, the controller can adjust the signal sending mechanism back to the mechanism of sending the trigger signal every two frames.
  • this embodiment may adjust only the most recent p consecutive triggering moments.
  • the adjustment to the (p+1)-th most recent trigger moment can be determined based on the delay information received in the next cycle.
  • the above-mentioned controller and vehicle-mounted terminal can both be integrated into the automatic driving system of the vehicle.
  • the image sensor and the radar sensor can collect data synchronously by executing the synchronous data collection methods of the above embodiments.
  • Figure 6 is a schematic diagram of the principle of a method for synchronously collecting data according to an embodiment of the present disclosure.
  • the execution subject of the method 600 for synchronously collecting data is the autonomous driving system of the autonomous vehicle.
  • the automatic driving system includes a vehicle-mounted terminal 610 and a controller 620.
  • the controller 620 may specifically be an FPGA chip.
  • the autonomous vehicle may also be provided with a radar sensor 601 and an image sensor 602.
  • the radar sensor 601 may encapsulate the point cloud data collected at each angle into a data packet for that angle, and send the data packets for each angle to the vehicle-mounted terminal 610.
  • the vehicle-mounted terminal 610 can parse the received data packet and determine the angle targeted by the data packet. After it is determined that a data packet for the predetermined angle has been received, the first time information at which the radar sensor collects point cloud data at the predetermined angle is determined, the predetermined angle being within the viewing angle range of the image sensor. It can be understood that the implementation of determining the first time information is similar to that of operation S210 described above and will not be repeated here.
  • the vehicle-mounted terminal 610 can determine the delay information for the image sensor 602 based on the first time information and send the delay information to the FPGA chip 620. It can be understood that the implementation of determining the delay information may be similar to that of operation S220 described above and will not be repeated here.
  • the FPGA chip 620 can adjust the triggering time of the image sensor 602 according to the delay information, so that the image sensor 602 and the radar sensor 601 can collect data synchronously.
  • the implementation of adjusting the triggering time is similar to the implementation of operation S410 described above.
  • the FPGA chip 620 may send a trigger signal to the image sensor 602 when the adjusted trigger time is reached. Image sensor 602 may begin collecting image data in response to the trigger signal.
  • the image sensor 602 may send the collected image data to the FPGA chip 620, so that the FPGA chip 620 adds second time information to the received image data. Specifically, a timestamp can be added to the image data to obtain image data 631 with a timestamp.
  • the obtained series of image data 631 includes image data that is aligned with the point cloud data subsequently collected by the radar sensor 601.
  • the image data and the point cloud data can be aligned according to the second time information of the image data and the time information of the point cloud data subsequently collected by the radar sensor.
  • point cloud data and image data with the same time information can be used as an aligned pair of data, or point cloud data and image data with time information that differs less than a threshold can be used as an aligned pair of data.
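  • a minimal sketch of this pairing rule, with an assumed threshold (each point cloud frame is matched with the image whose second time information is nearest, and the pair is kept only if the difference is below the threshold):

      ALIGN_THRESHOLD_S = 0.003  # assumed threshold, e.g. 3 ms

      def align(pointcloud_t: float, image_times: list):
          # nearest image timestamp; None if nothing is close enough
          if not image_times:
              return None
          best = min(image_times, key=lambda t: abs(t - pointcloud_t))
          return best if abs(best - pointcloud_t) <= ALIGN_THRESHOLD_S else None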
  • Figure 7 is a schematic diagram of the principle of synchronous data collection according to an embodiment of the present disclosure.
  • the reference time based on which the FPGA adds time information to image data and the reference time used by the radar sensor to collect point cloud data are both set to the GPS clock.
  • the times at which the radar sensor collects point cloud data at 0° should be 0.1s, ..., 0.9s, 1s, 1.1s, ..., and the times at which the camera collects image data should be 33.3ms, 66.7ms, 0.1s, ..., 0.8667s, 0.9s, 0.9333s, ....
  • the time at which the radar sensor collects point cloud data is actually 0.92s, 1.02s,...2.01s.
  • the time when the camera collects image data is actually 0.89s, 0.9233s, ....
  • the trigger time of the camera can be adjusted from 0.89s to 0.92s, and the delay length based on the adjustment is t1, which is 30ms.
  • the trigger time of the camera is adjusted from 1.99s to 2.01s, and the delay length based on the adjustment is t2, which is 20ms.
  • the image data collected by the camera and the point cloud data collected by the radar are temporally aligned.
  • the present disclosure also provides a synchronization determination method for evaluating the synchronization of an image sensor and a radar sensor.
  • the synchronization determination method will be described in detail below with reference to FIG. 8 .
  • Figure 8 is a schematic flowchart of a synchronization determination method according to an embodiment of the present disclosure.
  • the synchronization determination method 800 of this embodiment may include operations S810 to S840.
  • the synchronization determination method 800 can be executed by a cloud system.
  • the cloud system can obtain the data packet sequence and image data sequence collected by the autonomous vehicle within the unit time on a daily basis.
  • the cloud system may obtain the sequence of data packets and the sequence of image data via a wireless communication link, for example.
  • image data matching each data packet is determined based on the time information of each data packet and the time information of each image data in the image data sequence.
  • the cloud system can parse each data packet in the data packet sequence and determine the first time information at which the radar sensor collected the point cloud data in the data packet. Then, the image data in the image data sequence whose added time information is the same as or close to the first time information is used as the image data matching the data packet. For example, if the time difference between the second time information of the image data and the first time information of the point cloud data is within ±3 ms, it can be determined that the image data matches the point cloud data. Data packets for which no matching image data is determined can be discarded.
  • a synchronization relationship between each data packet and the matching image data is determined based on the difference in time information between each data packet and the matching image data and the difference threshold.
  • the difference threshold may be set based on experience, for example, the difference threshold may be set on the premise that it does not affect the safe driving of the vehicle.
  • the difference threshold may be 15 ms, 7 ms, etc., which is not limited in this disclosure.
  • the difference threshold may also be determined based on the fusion result of historical image data and the point cloud data in a historical data packet. If the positional deviation between the pixels projected from the point cloud data into the image coordinate system and the corresponding pixels in the historical image data reaches a critical value (such as a predetermined deviation value), the difference between the time information of the historical image data and the time information of the historical data packet can be used as the difference threshold. In this way, the accuracy of the difference threshold, and therefore of the evaluation of the synchronization relationship, can be improved.
  • the difference in time information between each data packet and the matching image data is greater than the difference threshold, it may be determined that the synchronization relationship between each data packet and the matching image data is asynchronous. If the difference is less than or equal to the difference threshold, it can be determined that the synchronization relationship between each data packet and the matching image data is synchronization.
  • the synchronization relationship between the radar sensor and the image sensor is determined according to the synchronization relationship between each of the plurality of data packets in the data packet sequence and the matching image data.
  • the synchronization relationship between each data packet and the matching image data in the data packet sequence can be counted. If the proportion of data packets synchronized with the matching image data is greater than or equal to the predetermined proportion threshold, it can be determined that the synchronization relationship between the radar sensor and the image sensor is synchronous, otherwise it is determined that the synchronization relationship is out of synchronization.
  • the predetermined ratio threshold may be, for example, a value less than 1 but close to 1 such as 0.8, which is not limited in this disclosure.
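• the sequence-level decision can then be sketched as follows (the default thresholds reuse the example values above and are illustrative, not values mandated by the disclosure):

```python
def sensors_synchronized(time_diffs_ms, diff_threshold_ms=7.0, ratio_threshold=0.8):
    """Decide the radar/image synchronization relationship over a sequence.

    time_diffs_ms: |packet time - matched image time| for every matched pair.
    """
    if not time_diffs_ms:
        return False
    in_sync = sum(1 for d in time_diffs_ms if d <= diff_threshold_ms)
    return in_sync / len(time_diffs_ms) >= ratio_threshold
```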
• Embodiments of the present disclosure can improve the efficiency of determining the synchronization relationship by setting a difference threshold and using it as the basis for judging whether the data packets collected by the radar sensor and the image data collected by the image sensor are synchronized. Compared with a solution that fuses the image data with the point cloud data in each data packet and determines the synchronization relationship from the fusion results, this approach reduces manual effort, lowers the consumption of computing resources, and improves parallel processing capability.
  • a predetermined containerization tool may be used to fuse the point cloud data and historical image data in the historical data package.
• by using the predetermined containerization tool, the fusion can be made compatible with the different platforms installed on different cloud systems, which is beneficial to improving the applicability of the synchronization determination method provided by the embodiments of the present disclosure.
• the predetermined containerization tool can, for example, fuse the point cloud data and the image data based on the following principle: align and fuse the point cloud data and the image data according to the extrinsic parameters of the radar sensor and the image sensor.
• specifically, the point cloud data can be projected into the camera coordinate system to obtain the pixels corresponding to the point cloud data, and these pixels are then superimposed on the corresponding pixels of the image data to obtain the fused data; a projection sketch follows.
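• a minimal projection sketch, assuming a pinhole camera model with a known intrinsic matrix K and a 4x4 extrinsic transform from the radar/LiDAR frame to the camera frame (the matrix and function names are illustrative):

```python
import numpy as np

def project_points(points_lidar, T_cam_from_lidar, K):
    """Project Nx3 radar/LiDAR points into pixel coordinates.

    Returns an Mx2 array of (u, v) pixels for the points in front of the camera.
    """
    n = points_lidar.shape[0]
    homogeneous = np.hstack([points_lidar, np.ones((n, 1))])  # N x 4
    cam = (T_cam_from_lidar @ homogeneous.T).T[:, :3]         # N x 3, camera frame
    cam = cam[cam[:, 2] > 0]                                  # keep points with z > 0
    pix = (K @ cam.T).T                                       # M x 3
    return pix[:, :2] / pix[:, 2:3]                           # perspective divide
```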
• in response to the synchronization relationship indicating out-of-synchronization, this embodiment may also send prompt information to a target object. For example, a reminder message can be sent to a target communication account.
  • the target communication account can be an email account or a mobile communication account, etc., and this disclosure does not limit this.
• through the prompt information, monitoring personnel can be informed in time of the out-of-synchronization between the image sensor and the radar sensor, so that relevant strategies can be adopted to resolve the problem and reduce the risk of unsafe driving of the autonomous vehicle caused by the loss of synchronization.
• Based on the method of synchronously collecting data performed by a vehicle-mounted terminal provided by the present disclosure, the present disclosure also provides a vehicle-mounted terminal.
  • Figure 9 is a schematic diagram of a vehicle-mounted terminal according to an embodiment of the present disclosure.
  • this embodiment provides a vehicle-mounted terminal 900 , which can be integrated into any vehicle such as an autonomous vehicle.
  • the vehicle-mounted terminal 900 may, for example, be configured to perform the method of synchronously collecting data performed by the vehicle-mounted terminal described above.
  • the vehicle-mounted terminal 900 may be configured to determine first time information when the radar sensor collects point cloud data at a predetermined angle in response to receiving a data packet from the radar sensor for a predetermined angle.
  • the predetermined angle is within the viewing angle range of the image sensor in the vehicle equipped with the vehicle-mounted terminal, and the data package includes point cloud data collected by the radar sensor at the predetermined angle.
  • the implementation manner of determining the first time information in this embodiment is similar to the implementation manner of operation S210 described above, and will not be described again here.
• the vehicle-mounted terminal 900 may also be configured to determine delay information for the image sensor based on the first time information, and send the delay information to the controller that controls the image sensor, so that the controller controls the image sensor to collect data synchronously with the radar sensor rotated to the predetermined angle.
• the vehicle-mounted terminal 900 may, for example, use the principle described above with respect to FIG. 3 to determine the delay information, which is not limited in this disclosure. It can be understood that the structure of the vehicle-mounted terminal 900 in FIG. 9 is only an illustration to facilitate understanding of the present disclosure, and the present disclosure does not limit it.
• Based on the method of synchronously collecting data executed by the controller provided by the present disclosure, the present disclosure also provides a controller.
  • Figure 10 is a schematic diagram of a controller according to an embodiment of the present disclosure.
  • this embodiment provides a controller 1000 that can be integrated into any vehicle such as an autonomous vehicle, and the controller 1000 can also be communicatively connected with a vehicle-mounted terminal in the vehicle.
  • the controller 1000 may, for example, be configured to perform the method of synchronously collecting data performed by the controller described above.
  • the controller 1000 may be configured to, in response to receiving delay information for the image sensor, adjust the triggering moment of the image sensor according to the delay information.
  • the principle of adjusting the triggering time in this embodiment may be similar to the adjustment principle in operation S410 described above, and will not be described again here. It can be understood that the delay information may be sent by a vehicle-mounted terminal that is communicatively connected to the controller.
• the controller 1000 may also be configured to send a trigger signal to the image sensor in the vehicle in response to reaching the trigger moment, and, in response to receiving the image data collected by the image sensor, add time information to the image data so that the vehicle-mounted terminal can use the time information to align the image data with the point cloud data collected by the radar sensor.
• the controller 1000 may, for example, use the principle described above with respect to FIG. 5 to determine the number of times and the step size for adjusting the triggering moment, and adjust the triggering moment according to the determined number of times and step size, which is not limited in this disclosure. It can be understood that the structure of the controller 1000 in FIG. 10 is only an illustration to facilitate understanding of the present disclosure, and the present disclosure does not limit this.
  • the present disclosure also provides an autonomous driving vehicle.
  • Figure 11 is a schematic diagram of an autonomous vehicle according to an embodiment of the present disclosure.
  • this embodiment provides an autonomous vehicle 1100 in which a vehicle-mounted terminal, a controller, a radar sensor, and an image sensor can be integrated.
  • the vehicle-mounted terminal may be the vehicle-mounted terminal 900 described above, and the controller may be the controller 1000 described above.
  • the image sensor can periodically collect image data, and the radar sensor can periodically collect point cloud data, and encapsulate the point cloud data to obtain a data package.
  • the vehicle-mounted terminal in the autonomous vehicle may determine the first time information that the radar sensor collects point cloud data at a predetermined angle in response to receiving a data packet from the radar sensor for a predetermined angle. The predetermined angle is within the viewing angle range of the image sensor, and the data package includes point cloud data collected by the radar sensor at the predetermined angle.
  • the vehicle-mounted terminal can also determine the delay information for the image sensor based on the first time information, and send the determined delay information to the controller. It can be understood that the vehicle-mounted terminal determines the delay information in a manner similar to the implementation principle of the method 200 for synchronously collecting data described above, which will not be described again here.
  • the controller can adjust the trigger time of the image sensor according to the delay information, and when it is determined that the trigger time of the image sensor is reached, send a trigger signal to the image sensor.
  • the controller can also add second time information to the image data after receiving the image data collected by the image sensor, so that the vehicle-mounted terminal uses the second time information to align the image data with the point cloud data collected by the radar sensor.
  • the present disclosure also provides a cloud system for executing the synchronization determination method.
  • Figure 12 is a schematic diagram of a cloud system according to an embodiment of the present disclosure.
  • the present disclosure provides a cloud system 1200.
• the cloud system 1200 can communicate with the autonomous vehicle provided by the present disclosure through a communication link, so as to obtain data collected by the sensors of the autonomous vehicle, or to receive data sent by the autonomous vehicle.
• the cloud system 1200 may be configured to obtain the data packet sequence collected by the radar sensor and the image data sequence collected by the image sensor in the autonomous vehicle. After acquiring the two sequences, the cloud system 1200 can determine, for each data packet in the data packet sequence, the image data matching that data packet based on the time information of each data packet and the time information of each image data in the image data sequence. Subsequently, the synchronization relationship between each data packet and the matching image data is determined based on the difference in time information between them and the difference threshold. Finally, the cloud system 1200 can determine the synchronization relationship between the radar sensor and the image sensor based on the synchronization relationship between each of the multiple data packets in the data packet sequence and the matching image data.
  • the cloud system 1200 can be used to perform the synchronization determination method described above to determine the synchronization relationship between the radar sensor and the image sensor, which is not limited by the present disclosure.
• Based on the method of synchronously collecting data performed by a vehicle-mounted terminal provided by the present disclosure, the present disclosure also provides a device for synchronously collecting data, which can be integrated into the vehicle-mounted terminal.
  • Figure 13 is a structural block diagram of a device for synchronously collecting data according to an embodiment of the present disclosure.
  • the device 1300 for synchronously collecting data in this embodiment can be integrated into a vehicle-mounted terminal.
  • the device 1300 for synchronously collecting data can include a time information determination module 1310, a delay information determination module 1320 and an information sending module 1330.
  • the time information determination module 1310 is configured to determine first time information that the radar sensor collects point cloud data at a predetermined angle in response to receiving a data packet from the radar sensor for a predetermined angle; wherein the predetermined angle is within the viewing angle range of the image sensor.
  • the data package includes point cloud data collected by the radar sensor at a predetermined angle.
  • the time information determination module 1310 may be configured to perform the above-described operation S210, which will not be described again here.
  • the delay information determining module 1320 is configured to determine delay information for the image sensor according to the first time information.
  • the delay information determination module 1320 may be configured to perform the above-described operation S220, which will not be described again here.
  • the information sending module 1330 is used to send delay information to the controller, so that the controller controls the image sensor to collect data synchronously with the radar sensor rotated to a predetermined angle.
  • the information sending module 1330 may be configured to perform the above-described operation S230, which will not be described again here.
  • the time information determination module 1310 may include a parsing sub-module and a first determination sub-module.
• the parsing sub-module is used to parse the data packet and obtain the time data in the data packet; the time data includes a reference time and a periodic encapsulation time.
• the first determination sub-module is used to determine, based on the reference time and the periodic encapsulation time, the first time information at which the radar sensor collects point cloud data at the predetermined angle. The reference time is aligned with the time of the positioning device in the vehicle where the radar sensor is located; a sketch of this recovery step follows.
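• a minimal sketch of this recovery step, assuming the packet carries a GNSS-aligned reference time plus per-block offsets and that the block holding the predetermined angle is known (the packet layout is vendor-specific; the names are illustrative assumptions):

```python
def first_time_info(reference_time_s, block_offsets_s, block_index):
    """Recover when the radar swept the predetermined angle.

    reference_time_s: base time in the packet, aligned with the vehicle's
    positioning device; block_offsets_s: periodic encapsulation times, i.e.,
    per-block offsets accumulated inside the packet.
    """
    return reference_time_s + block_offsets_s[block_index]
```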
  • the delay information determination module 1320 may include a second determination sub-module, a deviation value determination sub-module and a delay determination sub-module.
  • the second determination sub-module is used to determine the second time information for the radar sensor to collect point cloud data at 0° based on the first time information, the rotation period of the radar sensor and the predetermined angle.
• the deviation value determination sub-module is used to determine the rotation deviation value of the radar sensor based on the remainder obtained by taking the second time information modulo the rotation period.
  • the delay determination sub-module is used to determine delay information for the image sensor according to the rotation deviation value.
  • the delay determination sub-module may be specifically configured to determine the delay information of the image sensor based on the sum of the predetermined error value and the rotation deviation value.
  • the predetermined error value is greater than 0, and the predetermined error value is determined based on the optical center position of the image sensor and the position of the target object within the viewing angle range of the image sensor.
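• the computation chain of the three sub-modules can be sketched as follows, assuming the radar rotates at a constant speed so that the 0° crossing precedes the predetermined angle by angle/360 of one rotation period (this relation and the names are illustrative assumptions, not the disclosure's definitive implementation):

```python
def delay_info_ms(first_time_ms, rotation_period_ms, predetermined_angle_deg, error_ms):
    """Derive the delay information from the first time information.

    error_ms: predetermined error value (> 0) derived from the optical-center
    position of the image sensor and the position of the target object.
    """
    # Second time information: when the radar passed the 0-degree direction.
    second_time_ms = first_time_ms - rotation_period_ms * predetermined_angle_deg / 360.0
    # Rotation deviation: remainder of the second time modulo the period.
    rotation_deviation_ms = second_time_ms % rotation_period_ms
    return rotation_deviation_ms + error_ms
```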
  • the controller includes an artificial intelligence chip.
  • the present disclosure also provides a device for synchronously collecting data, which device can be integrated into the controller.
  • Figure 14 is a structural block diagram of a device for synchronously collecting data according to another embodiment of the present disclosure.
  • the device 1400 for synchronously collecting data in this embodiment may include a time adjustment module 1410, a signal sending module 1420 and a time adding module 1430.
• the time adjustment module 1410 is configured to, in response to receiving delay information for the image sensor, adjust the triggering moment of the image sensor according to the delay information.
  • the delay information is determined based on the time information of the radar sensor collecting point cloud data at a predetermined angle, and the predetermined angle is within the viewing angle range of the image sensor.
  • the time adjustment module 1410 may be used to perform the above-described operation S410, which will not be described again here.
  • the signal sending module 1420 is configured to send a trigger signal to the image sensor in response to reaching the trigger moment.
  • the signal sending module 1420 may be configured to perform the above-described operation S420, which will not be described again here.
  • the time adding module 1430 is configured to add time information to the image data in response to receiving the image data collected by the image sensor, so as to use the time information to align the image data with the point cloud data collected by the radar sensor.
  • the time adding module 1430 may be used to perform the operation S430 described above, which will not be described again.
• in an embodiment, the delay information indicates a delay duration.
  • the above-mentioned time adjustment module 1410 may include an adjustment information determination sub-module and an adjustment sub-module.
  • the adjustment information determination submodule is used to determine the number of adjustment trigger moments and the step size for each adjustment based on the acquisition frame rate and delay time of the image sensor.
  • the adjustment sub-module is used to adjust the triggering moment of the image sensor based on the number of times and step size.
  • the adjustment information determination sub-module may include an adjustment interval determination unit and an information determination unit.
  • the adjustment interval determination unit is used to determine the value interval of the adjustment step at the triggering moment based on the acquisition frame rate of the image sensor.
• the information determination unit is used to determine the number of times and the step size based on the value interval and the delay duration, where the acquisition frame rate is negatively correlated with the length of the value interval; a planning sketch follows.
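• a minimal planning sketch under these constraints; capping the step at a fixed fraction of the frame period reproduces the negative correlation between frame rate and value interval, and max_step_fraction is an illustrative assumption, not a value from the disclosure:

```python
import math

def plan_trigger_adjustment(delay_ms, frame_rate_hz, max_step_fraction=0.1):
    """Split a trigger-time shift into several small steps.

    Returns (times, step_ms): apply a signed step of step_ms, `times` times.
    """
    frame_period_ms = 1000.0 / frame_rate_hz
    max_step_ms = frame_period_ms * max_step_fraction  # upper end of the interval
    times = max(1, math.ceil(abs(delay_ms) / max_step_ms))
    return times, delay_ms / times

# e.g., a 12 ms delay at 30 fps -> 4 adjustments of 3 ms each.
```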
• the above-mentioned time adjustment module 1410 may be specifically configured to, in response to receiving delay information that changes relative to the adjacent delay information among the received delay information, adjust the triggering moment of the image sensor according to the change value of the delay information.
  • the present disclosure also provides a device for synchronously collecting data, which device can be integrated into the autonomous driving vehicle.
  • Figure 15 is a structural block diagram of a device for synchronously collecting data according to another embodiment of the present disclosure.
  • the device 1500 for synchronously collecting data in this embodiment may include a time information determination module 1510, a delay information determination module 1520, a time adjustment module 1530, a signal sending module 1540 and a time adding module 1550.
  • the time information determination module 1510 is configured to determine first time information that the radar sensor collects point cloud data at a predetermined angle in response to receiving a data packet from the radar sensor for a predetermined angle; wherein the predetermined angle is within the viewing angle range of the image sensor.
  • the data package includes point cloud data collected by the radar sensor at predetermined angles.
  • the time information determination module 1510 may be used to perform the operation S210 described above, which will not be described again.
  • the delay information determination module 1520 is configured to determine delay information for the image sensor according to the first time information. In an embodiment, the delay information determination module 1520 may be configured to perform the operation S220 described above, which will not be described again.
  • the time adjustment module 1530 is used to adjust the triggering time of the image sensor according to the delay information, so that the image sensor collects data synchronously with the radar sensor rotated to a predetermined angle. In one embodiment, the time adjustment module 1530 may be used to perform the operation S410 described above, which will not be described again.
  • the signal sending module 1540 is configured to send a trigger signal to the image sensor in response to reaching the trigger moment.
  • the signal sending module 1540 may be used to perform the operation S420 described above, which will not be described again here.
  • the time adding module 1550 is configured to add second time information to the image data in response to receiving the image data collected by the image sensor, so as to utilize the second time information to align the image data with the point cloud data collected by the radar sensor.
  • the time adding module 1550 may be used to perform the above-described operation S430, which will not be described again here.
  • the present disclosure also provides a synchronization determination device, which can be integrated in a cloud system.
  • Figure 16 is a structural block diagram of a synchronization determination device according to an embodiment of the present disclosure.
  • the synchronization determination device 1600 of this embodiment may include a data acquisition module 1610, a data matching module 1620, a data relationship determination module 1630 and a synchronization relationship determination module 1640.
  • the data acquisition module 1610 is used to acquire the data packet sequence collected by the radar sensor and the image data sequence collected by the image sensor. In an embodiment, the data acquisition module 1610 may be used to perform the above-described operation S810, which will not be described again here.
  • the data matching module 1620 is configured to determine, for each data packet in the data packet sequence, image data matching each data packet based on the time information of each data packet and the time information of each image data in the image data sequence. In one embodiment, the data matching module 1620 may be used to perform the above-described operation S820, which will not be described again here.
  • the data relationship determination module 1630 is configured to determine a synchronization relationship between each data packet and the matching image data based on the difference in time information between each data packet and the matching image data and the difference threshold. In one embodiment, the data relationship determination module 1630 may be used to perform the above-described operation S830, which will not be described again here.
  • the synchronization relationship determination module 1640 is used to determine the synchronization relationship between the radar sensor and the image sensor based on the synchronization relationship between each of the multiple data packets in the data packet sequence and the matching image data. In an embodiment, the synchronization relationship determination module 1640 may be used to perform the above-described operation S840, which will not be described again here.
  • the above-mentioned synchronization determining device may further include a prompt information sending module, configured to send prompt information to the target object in response to the synchronization relationship between the radar sensor and the image sensor indicating out-of-synchronization.
  • the above-mentioned synchronization determination device may further include a threshold determination module for determining a difference threshold.
  • the threshold determination module may include a data fusion sub-module and a threshold determination sub-module.
• the data fusion sub-module is used to obtain fusion data based on a historical data packet collected by the radar sensor and the historical image data matching the historical data packet; the fusion data represents the positional deviation value between the pixels obtained by projecting the point cloud data in the historical data packet into the image coordinate system and the corresponding pixels in the historical image data.
  • the threshold determination sub-module is used to determine the difference in time information between the historical data packet and the historical image data as a difference threshold when the position deviation value reaches a predetermined deviation value.
  • the above-mentioned data fusion sub-module is specifically used to use predetermined containerization tools to fuse point cloud data and historical image data in historical data packages to obtain fused data.
• in the technical solutions of the present disclosure, the collection, storage, use, processing, transmission, provision, disclosure, and application of user personal information all comply with relevant laws and regulations, necessary confidentiality measures have been taken, and public order and good customs are not violated.
  • the user's authorization or consent is obtained before obtaining or collecting the user's personal information.
  • the present disclosure also provides an electronic device, a readable storage medium, and a computer program product.
  • FIG. 17 shows a schematic block diagram of an example electronic device 1700 that can be used to implement the method of synchronously collecting data or the method of synchronization determination according to embodiments of the present disclosure.
  • Electronic devices are intended to refer to various forms of digital computers, such as laptop computers, desktop computers, workstations, personal digital assistants, servers, blade servers, mainframe computers, and other suitable computers.
  • Electronic devices may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices.
  • the components shown herein, their connections and relationships, and their functions are examples only and are not intended to limit implementations of the disclosure described and/or claimed herein.
• the device 1700 includes a computing unit 1701, which can perform various appropriate actions and processes according to a computer program stored in a read-only memory (ROM) 1702 or loaded from a storage unit 1708 into a random access memory (RAM) 1703. The RAM 1703 can also store various programs and data required for the operation of the device 1700.
  • Computing unit 1701, ROM 1702 and RAM 1703 are connected to each other via bus 1704.
  • Input/output (I/O) interface 1705 is also connected to bus 1704.
• Multiple components in the device 1700 are connected to the I/O interface 1705, including: an input unit 1706, such as a keyboard or a mouse; an output unit 1707, such as various types of displays and speakers; a storage unit 1708, such as a magnetic disk or an optical disk; and a communication unit 1709, such as a network card, a modem, or a wireless communication transceiver.
  • the communication unit 1709 allows the device 1700 to exchange information/data with other devices through computer networks such as the Internet and/or various telecommunications networks.
• The computing unit 1701 may be any of various general-purpose and/or special-purpose processing components having processing and computing capabilities. Some examples of the computing unit 1701 include, but are not limited to, a central processing unit (CPU), a graphics processing unit (GPU), various dedicated artificial intelligence (AI) computing chips, various computing units that run machine learning model algorithms, a digital signal processor (DSP), and any appropriate processor, controller, microcontroller, etc.
  • the computing unit 1701 performs various methods and processes described above, such as the method of synchronously collecting data or the method of synchronization determination.
• in some embodiments, the method of synchronously collecting data or the synchronization determination method may be implemented as a computer software program that is tangibly embodied in a machine-readable medium, such as the storage unit 1708.
  • part or all of the computer program may be loaded and/or installed onto device 1700 via ROM 1702 and/or communication unit 1709.
• when the computer program is loaded into the RAM 1703 and executed by the computing unit 1701, one or more steps of the method of synchronously collecting data or the synchronization determination method described above may be performed.
• alternatively, in other embodiments, the computing unit 1701 may be configured to perform the method of synchronously collecting data or the synchronization determination method in any other suitable manner (e.g., by means of firmware).
• Various implementations of the systems and techniques described above may be implemented in digital electronic circuit systems, integrated circuit systems, field programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof.
• These various embodiments may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor capable of receiving data and instructions from a storage system, at least one input device, and at least one output device, and transmitting data and instructions to the storage system, the at least one input device, and the at least one output device.
• Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. The program code may be provided to a processor or controller of a general-purpose computer, a special-purpose computer, or another programmable data processing device, such that, when executed by the processor or controller, the program code causes the functions/operations specified in the flowcharts and/or block diagrams to be implemented.
  • the program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
  • a machine-readable medium may be a tangible medium that may contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • the machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium.
• Machine-readable media may include, but are not limited to, electronic, magnetic, optical, electromagnetic, infrared, or semiconductor systems, apparatuses, or devices, or any suitable combination of the foregoing.
• more specific examples of a machine-readable storage medium include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
• To provide interaction with a user, the systems and techniques described herein may be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to the user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which the user can provide input to the computer.
• Other kinds of devices may also be used to provide interaction with the user; for example, the feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback), and input from the user may be received in any form (including acoustic input, voice input, or tactile input).
• The systems and techniques described herein may be implemented in a computing system that includes back-end components (e.g., as a data server), or a computing system that includes middleware components (e.g., an application server), or a computing system that includes front-end components (e.g., a user computer having a graphical user interface or a web browser through which the user can interact with implementations of the systems and techniques described herein), or a computing system that includes any combination of such back-end, middleware, or front-end components.
  • the components of the system may be interconnected by any form or medium of digital data communication (e.g., a communications network). Examples of communication networks include: local area network (LAN), wide area network (WAN), and the Internet.
  • Computer systems may include clients and servers.
  • Clients and servers are generally remote from each other and typically interact over a communications network.
  • the relationship of client and server is created by computer programs running on corresponding computers and having a client-server relationship with each other.
• The server can be a cloud server, also known as a cloud computing server or cloud host; it is a host product in the cloud computing service system that remedies the defects of difficult management and weak business scalability existing in traditional physical host and VPS ("Virtual Private Server") services.
  • the server can also be a distributed system server or a server combined with a blockchain.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to a method (200, 400, 600) and an apparatus (1300, 1400, 1500) for synchronously collecting data, a synchronization determination method (800) and apparatus (1600), a vehicle (110, 1100), and an electronic device (1700), which relate to the field of artificial intelligence, and in particular to the technical fields of autonomous driving, computer vision, and cloud computing. The specific implementation solution of the method (200) for synchronously collecting data is as follows: in response to receiving a data packet for a predetermined angle (302) from a radar sensor (111, 601), determining first time information (301) of when the radar sensor (111, 601) collects point cloud data at the predetermined angle (302) (S210), the predetermined angle (302) being within a viewing-angle range of an image sensor (112, 602); determining delay information (306) for the image sensor (112, 602) according to the first time information (301) (S220); and sending the delay information (306) to a controller (1000) (S230), so that the controller (1000) controls the image sensor (112, 602) to synchronously collect data with the radar sensor (111, 601) rotated to the predetermined angle (302), the data packet comprising the point cloud data collected by the radar sensor (111, 601) at the predetermined angle (302).
PCT/CN2022/105187 2022-07-12 2022-07-12 Procédé et appareil de collecte synchrone de données, procédé et appareil de détermination de synchronisation, et véhicule autonome WO2024011408A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/105187 WO2024011408A1 (fr) 2022-07-12 2022-07-12 Procédé et appareil de collecte synchrone de données, procédé et appareil de détermination de synchronisation, et véhicule autonome

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/105187 WO2024011408A1 (fr) 2022-07-12 2022-07-12 Procédé et appareil de collecte synchrone de données, procédé et appareil de détermination de synchronisation, et véhicule autonome

Publications (1)

Publication Number Publication Date
WO2024011408A1 true WO2024011408A1 (fr) 2024-01-18

Family

ID=89535235

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/105187 WO2024011408A1 (fr) 2022-07-12 2022-07-12 Procédé et appareil de collecte synchrone de données, procédé et appareil de détermination de synchronisation, et véhicule autonome

Country Status (1)

Country Link
WO (1) WO2024011408A1 (fr)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
  • JP2004028601A (ja) * 2002-06-21 2004-01-29 Mitsubishi Heavy Ind Ltd Monitoring laser radar system and imaging method
  • CN110135485A (zh) * 2019-05-05 2019-08-16 Zhejiang University Object recognition and positioning method and system based on fusion of a monocular camera and millimeter-wave radar
  • CN111435162A (zh) * 2020-03-03 2020-07-21 Shenzhen LeiShen Intelligent System Co., Ltd. Laser radar and camera synchronization method, apparatus, device and storage medium
  • CN111522026A (zh) * 2020-04-21 2020-08-11 Beijing Sankuai Online Technology Co., Ltd. Data fusion method and apparatus
  • CN112787740A (zh) * 2020-12-26 2021-05-11 Wuhan Kotei Informatics Co., Ltd. Multi-sensor time synchronization apparatus and method
  • CN113076383A (zh) * 2020-01-06 2021-07-06 Alibaba Group Holding Ltd. Road data collection vehicle and data collection system thereof
  • CN113985431A (zh) * 2021-11-24 2022-01-28 Hangzhou Hikvision Automotive Software Co., Ltd. Data acquisition method, system, apparatus, electronic device and storage medium
  • CN114488181A (zh) * 2022-01-07 2022-05-13 Beijing Institute of Technology Multi-source heterogeneous sensor fusion method and device for a camera and laser radar


Similar Documents

Publication Publication Date Title
  • US10789771B2 (en) Method and apparatus for fusing point cloud data
  • US10917617B2 (en) Tunnel deformation monitoring system
  • WO2021204144A1 (fr) Data processing system and method, sensor, mobile acquisition backpack, and device
  • US20210356915A1 (en) Systems and methods for time synchronization
  • WO2020135382A1 (fr) Multi-sensor time service synchronization system, method and apparatus, and electronic device
  • CN111435162B (zh) Laser radar and camera synchronization method, apparatus, device and storage medium
  • CN112230240A (zh) Spatio-temporal synchronization system, apparatus and readable medium for laser radar and camera data
  • WO2021047271A1 (fr) Time synchronization method and apparatus
  • WO2019047575A1 (fr) FPGA-based data acquisition card, data acquisition system and data acquisition method
  • CN107229219A (zh) Precise computer time service method based on a GPS module and an embedded system, and implementation system thereof
  • WO2022061799A1 (fr) Pose estimation method and device
  • CN111031278A (zh) Monitoring method and system based on structured light and TOF
  • WO2024011408A1 (fr) Method and apparatus for synchronously collecting data, synchronization determination method and apparatus, and autonomous vehicle
  • CN113219479A (zh) Camera and laser radar synchronization method and system for an intelligent driving control system
  • CN114754769A (zh) Data synchronization and time service system and method for a laser radar and an inertial sensor
  • CN108964825A (zh) Time correction method, time correction apparatus and time correction server
  • CN117152589A (zh) Target recognition method, system and storage medium
  • WO2023165569A1 (fr) Multi-sensor simultaneous positioning method and apparatus, system, and storage medium
  • WO2023141817A1 (fr) Time synchronization method and device, and storage medium
  • CN114415489B (zh) Vehicle-mounted sensor time synchronization method, apparatus, device and medium
  • CN114911887B (zh) Data processing method, apparatus, device and storage medium
  • CN113985431A (zh) Data acquisition method, system, apparatus, electronic device and storage medium
  • CN113866789A (zh) Unmanned aerial vehicle laser radar point cloud data processing system
  • CN113099211A (zh) Stereoscopic vision data acquisition system and method with time synchronization
  • CN115967462A (zh) Image frame synchronization method and apparatus, electronic device and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22950539

Country of ref document: EP

Kind code of ref document: A1