WO2020258901A1 - Method and apparatus for processing data of sensor, electronic device, and system


Info

Publication number
WO2020258901A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
sensor
pose
time
smart device
Prior art date
Application number
PCT/CN2020/076813
Other languages
French (fr)
Chinese (zh)
Inventor
Deng Long (邓龙)
Original Assignee
Shanghai SenseTime Intelligent Technology Co., Ltd. (上海商汤智能科技有限公司)
Priority date
Filing date
Publication date
Application filed by Shanghai SenseTime Intelligent Technology Co., Ltd.
Priority to KR1020217016721A, published as KR20210087495A
Priority to JP2021533178A, published as JP7164721B2
Publication of WO2020258901A1


Classifications

    • G - PHYSICS
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/16 - Navigation by integrating acceleration or speed, i.e. inertial navigation; dead reckoning executed aboard the object being navigated
    • G01C 21/1652 - Inertial navigation combined with non-inertial navigation instruments, with ranging devices, e.g. LIDAR or RADAR
    • G01C 21/1656 - Inertial navigation combined with non-inertial navigation instruments, with passive imaging devices, e.g. cameras
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 - Control of position or course in two dimensions
    • G05D 1/0246 - Control of position or course in two dimensions, specially adapted to land vehicles, using a video camera in combination with image processing means
    • G05D 1/0257 - Control of position or course in two dimensions, specially adapted to land vehicles, using a radar
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/56 - Context or environment of the image exterior to a vehicle, by using sensors mounted on the vehicle

Definitions

  • the embodiments of the present disclosure relate to smart driving technology, and in particular to a sensor data processing method, device, electronic equipment and system.
  • assisted driving and automatic driving are two important technologies in the field of intelligent driving; both can reduce the occurrence of traffic accidents, and therefore play an important role in this field.
  • the implementation of assisted driving technology and autonomous driving technology requires the cooperation of multiple sensors.
  • a variety of sensors are installed at different positions of the vehicle to collect road images, vehicle operation data and the like in real time, and the assisted driving system or automatic driving system performs path planning and other control operations based on the data collected by each sensor. Since the trigger times and trigger sources of the sensors installed on the vehicle may differ, the data of multiple sensors, and even the sub-data within a single sensor's frame, may be out of synchronization.
  • the embodiments of the present disclosure provide a sensor data processing method, device, electronic equipment and system.
  • an embodiment of the present disclosure provides a sensor data processing method, including: acquiring first target data of a first sensor of a smart device at a first moment; acquiring second target data of a second sensor of the smart device at a second moment, where the first moment and the second moment are different moments under the clock of the smart device; acquiring a first pose of the smart device at the first moment and a second pose of the smart device at the second moment, the first pose and the second pose being different; and performing compensation processing on the first target data according to the first pose and the second pose to obtain compensation data of the first sensor at the second moment.
  • an embodiment of the present disclosure further provides a sensor data processing device, including: a first acquisition module, configured to acquire first target data of a first sensor of a smart device at a first moment; a second acquisition module, configured to acquire second target data of a second sensor of the smart device at a second moment, where the first moment and the second moment are different moments under the clock of the smart device; a third acquisition module, configured to acquire a first pose of the smart device at the first moment and a second pose of the smart device at the second moment, the first pose and the second pose being different; and a compensation module, configured to perform compensation processing on the first target data according to the first pose and the second pose to obtain compensation data of the first sensor at the second moment.
  • embodiments of the present disclosure also provide an intelligent driving control method, including: an intelligent driving control device acquires detection data of a sensor provided on a smart device, the detection data being obtained using the sensor data processing method described in the first aspect above; and the intelligent driving control device performs intelligent driving control on the smart device according to the detection data.
  • an embodiment of the present disclosure further provides an intelligent driving control device, including: an acquisition module, configured to acquire detection data of a sensor set on a smart device, the detection data being obtained using the sensor data processing method described in the first aspect; and an intelligent driving control module, configured to perform intelligent driving control on the smart device according to the detection data.
  • embodiments of the present disclosure further provide an electronic device, including: a memory, configured to store program instructions; and a processor, configured to call and execute the program instructions in the memory to perform the method steps described in the first aspect above.
  • embodiments of the present disclosure also provide an intelligent driving system, including, connected in communication: a sensor, the electronic device described in the fifth aspect, and the intelligent driving control device described in the fourth aspect.
  • the embodiments of the present disclosure also provide a readable storage medium in which a computer program is stored, the computer program being used to execute the method steps described in the first aspect, or to execute the method steps described in the third aspect.
  • in the embodiments of the present disclosure, the first target data of the first sensor at the first moment and the second target data of the second sensor at the second moment are acquired, and compensation processing is performed on the first target data according to the pose information of the smart device at the first moment and its pose information at the second moment, to obtain the compensation data of the first sensor at the second moment.
  • since the second target data of the second sensor is also detection data at the second moment, performing compensation processing on the data of the first sensor yields the first sensor's data at the moment corresponding to the second sensor; that is, the compensation data of the first sensor and the second target data of the second sensor are data at the same moment, thereby achieving synchronization of the first sensor and the second sensor.
  • the method realizes sensor synchronization through software; therefore, there is no need to deploy additional dedicated hardware for synchronously triggering multiple sensors, which reduces the hardware cost required for multi-sensor data synchronization.
  • FIG. 1 is a schematic diagram of an application scenario of a sensor data processing method provided by an embodiment of the disclosure
  • FIG. 2 is a first flowchart of a sensor data processing method provided by an embodiment of the disclosure
  • FIG. 3 is a schematic diagram of data synchronization between the first sensor and the second sensor through the above process
  • FIG. 4 is a second schematic flowchart of a sensor data processing method provided by an embodiment of the disclosure.
  • FIG. 5 is a third schematic flowchart of a sensor data processing method provided by an embodiment of the disclosure.
  • FIG. 6 is a fourth schematic flowchart of a sensor data processing method provided by an embodiment of the disclosure.
  • FIG. 7 is a fifth schematic flowchart of a sensor data processing method provided by an embodiment of the disclosure.
  • FIG. 8 is a sixth schematic flowchart of a sensor data processing method provided by an embodiment of the disclosure.
  • FIG. 9 is an example diagram of performing intra-frame data synchronization on the first original data to obtain the first target data
  • FIG. 10 is a seventh schematic flowchart of a sensor data processing method provided by an embodiment of the disclosure.
  • Figure 11 is an example diagram of synchronizing sensors of the same type
  • FIG. 12 is a first module structure diagram of a sensor data processing device provided by an embodiment of the disclosure.
  • FIG. 13 is a second module structure diagram of a sensor data processing device provided by an embodiment of the disclosure.
  • FIG. 14 is the third module structure diagram of the sensor data processing device provided by an embodiment of the disclosure.
  • FIG. 15 is a fourth module structure diagram of a sensor data processing device provided by an embodiment of the disclosure.
  • FIG. 16 is the fifth module structure diagram of the sensor data processing device provided by the embodiments of the disclosure.
  • FIG. 17 is a sixth module structure diagram of a sensor data processing device provided by an embodiment of the disclosure.
  • FIG. 19 is a schematic flowchart of an intelligent driving control method provided by an embodiment of the disclosure.
  • FIG. 20 is a schematic structural diagram of an intelligent driving control device provided by an embodiment of the disclosure.
  • FIG. 21 is a schematic diagram of a smart driving system provided by an embodiment of the disclosure.
  • FIG. 1 is a schematic diagram of an application scenario of a sensor data processing method provided by an embodiment of the disclosure.
  • the method can be applied to smart devices such as vehicles, robots, and blind guide devices that are installed with sensors.
  • the types of sensors installed on smart devices may include at least one of a camera, LiDAR (Light Detection and Ranging), millimeter-wave radar (RADAR, Radio Detection and Ranging), high-precision inertial navigation, a Controller Area Network bus (CANBUS), and the like.
  • the number of sensors of the same type set on the smart device may be one or more, for example, one or more cameras, one or more lidars, etc. may be set. Different sensors can be set in different positions of the smart device.
  • Figure 1 shows a camera and LiDAR as an example.
  • the camera can capture images of the surrounding environment of the smart device in real time, and report the captured images to the smart driving system of the smart device.
  • LiDAR can obtain three-dimensional point coordinates around the smart device by emitting and receiving laser pulses, forming point cloud data, and reporting the point cloud data to the smart driving system of the smart device.
  • RADAR uses electromagnetic waves to detect the ground, vehicles, trees and other objects around the smart device and receive their echoes to obtain the object's position, height, distance and other information, and report it to the smart device's smart driving system.
  • CANBUS transmits the operating parameters of the smart device, such as the vehicle's accelerator operating parameters, steering wheel operating parameters, wheel speed, etc., to the smart driving system of the smart device using serial data transmission.
  • the intelligent driving system performs intelligent driving control based on the data reported by each sensor, such as vehicle positioning, route planning, route deviation warning, and traffic flow analysis.
  • the following embodiments of the present disclosure refer to the smart driving system of the smart device as a "system" for short.
  • FIG. 2 is a schematic flowchart 1 of a sensor data processing method provided by an embodiment of the disclosure. As shown in FIG. 2, the method includes:
  • the system may send a data report instruction to the first sensor and the second sensor, and the first sensor and the second sensor detect data after receiving the data report instruction, and send the detected data to the system.
  • the first sensor and the second sensor may also automatically detect data according to a preset cycle, and send the detected data to the system.
  • S204 Perform compensation processing on the first target data according to the first pose and the second pose to obtain compensation data of the first sensor at the second moment.
  • the first sensor and the second sensor may be any two sensors on the smart device.
  • One or more sensors of the same type may be provided on the smart device, and the first sensor and the second sensor may be two sensors of different types, or two sensors of the same type.
  • the first sensor may be LiDAR
  • the second sensor may be a camera
  • the first sensor is LiDAR and the second sensor is a camera
  • since the data detected by the camera is two-dimensional information while the data detected by LiDAR is three-dimensional information, the data needs to be rotated and translated in the subsequent compensation processing. Therefore, taking the camera, which detects two-dimensional information, as the reference second sensor, and performing motion compensation operations such as rotation and translation on the three-dimensional data detected by the LiDAR, ensures that no additional depth error is introduced during the compensation process.
  • the first sensor and the second sensor may both be LiDAR.
  • each sensor may use its own independent clock, or may use the same clock as the intelligent driving system of the vehicle.
  • the first time and the second time mentioned in this embodiment both refer to the time under the clock of the intelligent driving system of the vehicle.
  • Each sensor may report the detection time of the data under the sensor's clock to the system by carrying time information in the reported data.
  • the time when each sensor sends data is equal to or approximate to the time when the system receives the data.
  • the above-mentioned first moment may be the time at which the first sensor sends the data, i.e., the time under the first sensor's own clock, or the time under the system clock at which the system receives the data. If the first sensor's own clock differs from the system's clock, the system needs to derive the first moment from the data sending time under the first sensor's clock and the clock difference between the first sensor and the system.
  • this process will be described in detail in the following embodiments. The processing for the above-mentioned second moment is similar to that for the first moment and will not be repeated here.
  • the first sensor and the second sensor may be out of synchronization due to factors such as different trigger sources and different trigger moments.
  • the first sensor is LiDAR
  • the second sensor is a camera.
  • the LiDAR reports one frame of data to the system each time it completes one rotation.
  • the camera reports data to the system according to its own shooting cycle. Therefore, even if the LiDAR and the camera start working at the same time, the data they report to the system may not be data from the same moment.
  • LiDAR detects a person 100 meters in front of the vehicle, and the camera may capture a person 120 meters in front of the vehicle.
  • in the embodiments of the present disclosure, the second sensor is used as the reference sensor of the first sensor, and the second moment, at which the system receives the second target data of the second sensor, is used as the reference moment for the first moment at which the first sensor sends data. Compensation processing is then performed on the first target data to obtain the compensation data of the first sensor at the second moment.
  • since the second target data of the second sensor is also detection data at the second moment, performing compensation processing on the data of the first sensor yields the first sensor's data at the moment corresponding to the second sensor; that is, the data of the first sensor and the data of the second sensor are data at the same moment, thereby realizing synchronization of the first sensor and the second sensor.
  • the synchronization of the sensors is realized through software, so there is no need to deploy additional dedicated hardware for triggering synchronization of multiple sensors, which reduces the hardware cost required for data synchronization of multiple sensors.
  • Figure 3 is a schematic diagram of data synchronization between the first sensor and the second sensor through the above process.
  • when the first sensor is LiDAR and the second sensor is a camera, the data frame reported by the LiDAR may be called a radar frame (or LiDAR frame), and the data frame reported by the camera may be called a camera frame.
  • the system receives a radar frame at time Tl (i.e., the first moment) and a camera frame at time Tc (i.e., the second moment). Using the camera as the reference sensor, the above process compensates the radar frame received at time Tl to time Tc. The compensated radar frame at time Tc is equivalent to radar data and camera data acquired at the same moment Tc, thus realizing synchronization of the LiDAR and the camera.
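  • as a sketch of how a system might pair frames for this synchronization, each radar frame can be matched to the camera frame closest in time; this nearest-timestamp pairing policy is an illustrative assumption, since the embodiment only states that the second moment serves as the reference moment:

```python
import bisect

def pair_frames(radar_times, camera_times):
    """For each radar-frame time Tl, pick the camera-frame time Tc
    closest to it as the reference moment for compensation.
    camera_times must be sorted ascending."""
    pairs = []
    for tl in radar_times:
        i = bisect.bisect_left(camera_times, tl)
        # candidates: the camera frame just before and just after Tl
        candidates = camera_times[max(i - 1, 0):i + 1]
        tc = min(candidates, key=lambda t: abs(t - tl))
        pairs.append((tl, tc))
    return pairs

# Camera at ~30 Hz (33 ms period), LiDAR at 10 Hz (100 ms period).
camera = [0.000, 0.033, 0.066, 0.100, 0.133, 0.166, 0.200]
radar = [0.050, 0.150]
print(pair_frames(radar, camera))  # [(0.05, 0.066), (0.15, 0.166)]
```

Each resulting pair (Tl, Tc) then goes through the compensation processing described above, with Tc as the reference moment.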
  • in the above scenario, each sensor sends its detected data to the system in real time, and sensor data synchronization is performed on this data.
  • alternatively, synchronization processing may be performed on the recorded data of each sensor during data playback, as illustrated in FIG. 4.
  • the detection data of the camera is recorded in advance to obtain a series of camera frames, each of which records its detection time; likewise, each radar frame records its detection time stamp.
  • during data playback, a camera frame and a radar frame are read. At the same time, the vehicle's pose queue is obtained from the recorded CANBUS/high-precision inertial navigation detection data. The vehicle's pose at the camera frame's detection time is then obtained according to the camera frame's time stamp, and the pose at the radar frame's detection time according to the radar frame's time stamp. The radar frame is compensated to the detection time of the camera frame according to these two poses, so that the camera frame and the radar frame are synchronized, and operations such as driving control can be performed based on the synchronized camera frame and radar frame.
  • when there are more than two sensors, one of them can be selected as the second sensor, i.e., the reference sensor, and each of the other sensors is synchronized with the reference sensor, thereby realizing synchronization of the sensors on the smart device.
  • the first target data is compensated to obtain the compensation data of the first sensor at the second moment. Since the second target data of the second sensor is also detection data at the second moment, performing compensation processing on the data of the first sensor yields the first sensor's data at the moment corresponding to the second sensor; that is, the compensation data of the first sensor and the second target data of the second sensor are data at the same moment, thereby achieving synchronization of the first sensor and the second sensor.
  • the method realizes sensor synchronization through software; therefore, there is no need to deploy additional dedicated hardware for synchronously triggering multiple sensors, which reduces the hardware cost required for multi-sensor data synchronization.
  • FIG. 5 is a third schematic flowchart of a sensor data processing method provided by an embodiment of the disclosure. As shown in FIG. 5, an optional manner of performing compensation processing on the first target data according to the first pose and the second pose in step S204 includes:
  • S501 Determine the first coordinate system of the first sensor at the first moment according to the first pose
  • S502 Determine the second coordinate system of the first sensor at the second moment according to the second pose
  • S503 Perform compensation processing on the first target data according to the first coordinate system and the second coordinate system to obtain compensation data of the first sensor at the second time.
  • the vehicle has a corresponding pose at each moment in the running state, and the pose at different moments may change, and the pose change may include rotation and translation.
  • the pose of the sensor is based on the pose of the vehicle; from the vehicle's pose at each moment, that is, its rotation and translation, the pose of the sensor can be obtained.
  • each pose of the sensor corresponds to a coordinate system, and a point in the world coordinate system detected by the sensor in a given pose is expressed as a point in that coordinate system; that is, the coordinate value of the detected point is its coordinate value in that coordinate system.
  • the vehicle has corresponding poses at the first moment and the second moment; accordingly, there is a first coordinate system corresponding to the first pose and a second coordinate system corresponding to the second pose.
  • based on the first coordinate system and the second coordinate system, the coordinate value of a point in the second coordinate system, i.e., its coordinate value at the second moment, can be derived. Performing this processing on each point in the data detected by the first sensor yields the detection data of the first sensor at the second moment.
  • the following illustrates the process of performing compensation processing on the first target data of the first sensor based on the first coordinate system and the second coordinate system through examples.
  • suppose the first moment is t0 and the second moment is tn; the pose of the vehicle at t0 is P0, and the coordinate system corresponding to P0 is the first coordinate system; similarly, the pose of the vehicle at tn is Pn, and the coordinate system corresponding to Pn is the second coordinate system.
  • the coordinate data of a world point X obtained in the first coordinate system at time t0 is x0, and, taking the pose as the transform from the vehicle coordinate system to the world coordinate system, the relationship between x0 and X satisfies the following formula (1):
  • X = P0 · x0 (1)
  • that is, what the first sensor detects at the first moment t0 are the coordinates of the point in the first coordinate system corresponding to the pose at t0. From the poses at t0 and tn respectively, the coordinates of the point in the second coordinate system corresponding to the pose at tn can be derived, namely xn = Pn⁻¹ · P0 · x0.
  • the above processing is performed on each point corresponding to the first target data, and then the detection data of the first sensor at time tn can be obtained.
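  • the per-point processing above can be sketched in code. The following is a minimal illustration using planar (x, y, yaw) vehicle poses represented as 3x3 homogeneous transforms from the vehicle frame to the world frame; the function names and the planar simplification are assumptions for illustration, not the patent's actual implementation:

```python
import math

def pose_matrix(x, y, yaw):
    """3x3 homogeneous transform from vehicle frame to world frame
    for a planar pose (x, y, yaw in radians)."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [[c, -s, x],
            [s,  c, y],
            [0,  0, 1]]

def invert(T):
    """Invert a rigid 3x3 homogeneous transform:
    inverse rotation is the transpose, inverse translation is -R^T t."""
    (r00, r01, tx), (r10, r11, ty), _ = T
    return [[r00, r10, -(r00 * tx + r10 * ty)],
            [r01, r11, -(r01 * tx + r11 * ty)],
            [0.0, 0.0, 1.0]]

def apply(T, p):
    x, y = p
    return (T[0][0] * x + T[0][1] * y + T[0][2],
            T[1][0] * x + T[1][1] * y + T[1][2])

def compensate(points_t0, P0, Pn):
    """x_n = Pn^{-1} . P0 . x_0 for each point detected at t0."""
    Pn_inv = invert(Pn)
    return [apply(Pn_inv, apply(P0, p)) for p in points_t0]

# Vehicle at the origin at t0; 2 m further along x (no rotation) at tn.
P0 = pose_matrix(0.0, 0.0, 0.0)
Pn = pose_matrix(2.0, 0.0, 0.0)
# A static obstacle seen 10 m ahead at t0 appears 8 m ahead at tn.
print(compensate([(10.0, 0.0)], P0, Pn))  # [(8.0, 0.0)]
```

For real sensors the same composition Pn⁻¹ · P0 applies, with 4x4 transforms over three-dimensional points.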
  • the compensation processing is performed based on the first pose of the vehicle at the first moment and the second pose of the vehicle at the second moment; therefore, the first pose of the vehicle at the first moment and the second pose of the vehicle at the second moment need to be obtained before the compensation processing.
  • the pose queue of the smart device may be generated first, and then, based on the pose queue of the smart device, the first pose of the smart device at the first moment and the second pose of the smart device at the second moment are obtained.
  • FIG. 6 is a fourth flowchart of a sensor data processing method provided by an embodiment of the present disclosure. As shown in FIG. 6, the process of generating the pose queue of the smart device and determining the first pose and the second pose based on the pose queue of the smart device includes:
  • S602 Generate a pose queue of the smart device according to the above-mentioned pose detection data.
  • the sensors with a pose detection function provided on the vehicle may include CANBUS, high-precision inertial navigation and other sensors.
  • the system can receive real-time pose detection data reported by the CANBUS, high-precision inertial navigation and other sensors, such as the vehicle's wheel speed, steering wheel and other operating data. Based on these pose detection data, the system can calculate the vehicle's pose at multiple moments and then build a pose queue.
  • S603 Determine the first pose and the second pose according to the pose queue of the smart device, the first moment and the second moment.
  • the pose queue of the smart device is composed of poses at each moment.
  • the first moment may be a moment corresponding to a certain pose in the pose queue, that is, the mapping relationship between the first moment and the pose information exists directly in the pose queue.
  • in this case, the pose information corresponding to the first moment can be directly obtained from the pose queue.
  • the same processing applies to the pose information at the second moment: if the mapping relationship between the second moment and the pose information exists directly in the pose queue, the pose information corresponding to the second moment can be directly obtained from the pose queue.
  • in response to the pose queue not including the pose at the first moment, compensation processing is performed on the pose queue of the smart device according to the first moment to obtain the first pose; and/or, in response to the pose queue not including the pose at the second moment, compensation processing is performed on the pose queue of the smart device according to the second moment to obtain the second pose.
  • the pose information corresponding to the first moment may not exist in the pose queue; in that case, the pose queue of the smart device can be compensated according to the first moment to obtain the first pose of the smart device at the first moment.
  • the compensation processing for the pose queue may be interpolation processing, for example.
  • if the first moment is t3 and there is no pose information corresponding to t3 in the pose queue of the smart device, the two adjacent moments closest to t3 can be found in the pose queue, for example t4 and t5, where t4 and t5 are adjacent and t3 lies between them. Interpolation processing is then performed using the pose information at time t4 and the pose information at time t5 to obtain the pose information corresponding to time t3.
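  • the interpolation just described can be sketched as follows; planar (x, y, yaw) poses and linear blending are simplifying assumptions for illustration:

```python
import bisect

def interpolate_pose(pose_queue, t):
    """pose_queue: time-sorted list of (time, (x, y, yaw)) entries.
    Returns the pose at t. If t has no entry in the queue, linearly
    interpolates between the two neighbouring entries that bracket t
    (e.g. t4 and t5 for a query at t3). Note: linear yaw blending
    does not handle wrap-around at +/-pi; a full implementation would
    interpolate rotations properly."""
    times = [entry[0] for entry in pose_queue]
    i = bisect.bisect_left(times, t)
    if i < len(times) and times[i] == t:
        return pose_queue[i][1]          # exact hit: no interpolation
    # as in the text, t is assumed to lie between two queue entries
    t_a, (xa, ya, ha) = pose_queue[i - 1]
    t_b, (xb, yb, hb) = pose_queue[i]
    w = (t - t_a) / (t_b - t_a)          # interpolation weight in [0, 1]
    return (xa + w * (xb - xa), ya + w * (yb - ya), ha + w * (hb - ha))

# Pose known at t=0 s and t=1 s; query the pose at t=0.5 s.
queue = [(0.0, (0.0, 0.0, 0.0)), (1.0, (2.0, 0.0, 0.2))]
print(interpolate_pose(queue, 0.5))  # (1.0, 0.0, 0.1)
```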
  • the second pose at the second moment can also be obtained through the above process, which will not be repeated here.
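The interpolation-based compensation of the pose queue can be sketched as follows. This is a minimal illustration with hypothetical helper names, assuming planar (x, y, yaw) poses for brevity, whereas a real system would interpolate full 6-DoF poses:

```python
import numpy as np

def interpolate_pose(t3, t4, pose4, t5, pose5):
    """Linearly interpolate a 2D pose (x, y, yaw) at time t3, given the
    poses at the two nearest queue times t4 <= t3 <= t5.
    Illustrative only; a real system would interpolate SE(3) poses."""
    if not (t4 <= t3 <= t5):
        raise ValueError("t3 must lie between t4 and t5")
    alpha = (t3 - t4) / (t5 - t4)          # interpolation weight in [0, 1]
    x = (1 - alpha) * pose4[0] + alpha * pose5[0]
    y = (1 - alpha) * pose4[1] + alpha * pose5[1]
    # interpolate yaw on the circle to handle wrap-around at +/- pi
    dyaw = np.arctan2(np.sin(pose5[2] - pose4[2]), np.cos(pose5[2] - pose4[2]))
    yaw = pose4[2] + alpha * dyaw
    return (x, y, yaw)

# pose queue has entries at t=0.0 and t=0.1; the first moment is t=0.05
pose = interpolate_pose(0.05, 0.0, (0.0, 0.0, 0.0), 0.1, (1.0, 2.0, 0.2))
```

The yaw is interpolated on the circle rather than linearly, so that poses straddling the +/- pi boundary still interpolate correctly.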
  • the data used is the foregoing first target data.
  • the above-mentioned first target data may refer to unprocessed data directly detected by the first sensor, or the above-mentioned first target data may also be data obtained by pre-synchronizing raw data detected by the first sensor.
  • for example, the first sensor is a LiDAR.
  • the LiDAR rotates one full circle and, after detecting one circle of data, reports one frame of data to the system. Since one rotation takes a certain amount of time, the actual detection times of the sub-data within a single LiDAR frame reported to the system differ from one another.
  • one LiDAR frame data may include multiple data packets, and each data packet is a sub-data.
  • the sub-data in the data sent by the first sensor can therefore be synchronized first, so that each frame of data sent by the first sensor achieves intra-frame synchronization.
  • synchronizing the sub-data means taking the detection time of one of the sub-data as a reference time and performing compensation processing on the remaining sub-data to obtain their equivalents at the reference time.
  • based on the third time carried when the first sensor reports the first raw data corresponding to the first target data, and on the difference information between the clock of the first sensor and the clock of the smart device, the first moment can first be determined.
  • the first raw data refers to data that has not undergone intra-frame synchronization processing reported by the first sensor to the system
  • the first target data refers to data that has undergone intra-frame synchronization processing
  • the first raw data may carry the time at which the data was detected, that is, the aforementioned third time.
  • the third time identifies the time at which the first sensor detected the first raw data corresponding to the first target data, and it is a time under the clock of the first sensor.
  • according to the difference information between the clock of the first sensor and the clock of the smart device, it can be determined that the third time corresponds to the first time under the clock of the smart device.
  • the difference information between the clock of the first sensor and the clock of the smart device may be obtained in advance through a specific means.
  • if the clock of the first sensor is based on the Global Positioning System (GPS), the difference information between the clock of the first sensor and the clock of the smart device may be determined according to the error between the GPS clock and the clock of the smart device.
  • similarly, it may be determined, according to the difference information between the clock of the second sensor and the clock of the smart device, that the fourth time carried when the second sensor reports the second target data corresponds to the second time under the smart device clock; the fourth time identifies the detection time of the second target data and is a time under the clock of the second sensor.
  • for example, the second sensor can be used to shoot multiple video frames of a stopwatch of the smart device, and the timestamp of each video frame can be compared and analyzed against the time displayed by the stopwatch in that frame to obtain the difference information between the clock of the second sensor and the clock of the smart device.
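The stopwatch-based calibration and the clock conversion it enables can be sketched as follows. This is a minimal illustration: the function names and the averaging scheme are assumptions, and actually reading the stopwatch value out of the pixels of each frame is omitted:

```python
def estimate_clock_offset(frame_times, stopwatch_times):
    """Estimate (device_clock - sensor_clock) by averaging, over frames,
    the difference between the stopwatch reading visible in a frame
    (device clock) and that frame's capture timestamp (sensor clock)."""
    diffs = [sw - ft for ft, sw in zip(frame_times, stopwatch_times)]
    return sum(diffs) / len(diffs)

def sensor_time_to_device_time(sensor_time, offset):
    """Map a timestamp under the sensor clock (e.g. the third or fourth
    time) to the smart-device clock (the first or second time)."""
    return sensor_time + offset

# camera frame timestamps vs. the stopwatch values shown in those frames
offset = estimate_clock_offset([10.00, 10.10, 10.20], [12.51, 12.60, 12.71])
first_time = sensor_time_to_device_time(10.30, offset)
```

Averaging over several frames damps out the small per-frame reading noise; a single frame pair would also work in principle.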
  • the following describes the process of performing intra-frame data synchronization based on the third moment carried when the first sensor reports the first original data.
  • FIG. 7 is a schematic flow chart 5 of the sensor data processing method provided by an embodiment of the disclosure. As shown in FIG. 7, the process of performing intra-frame data synchronization on the first raw data to obtain the first target data includes:
  • the first raw data reported by the first sensor includes multiple sub-data, and carries the detection time of each sub-data at the same time, that is, each sub-data has a corresponding detection time.
  • the system can select one of these detection times as the reference time, i.e., the third time, and compensate the sub-data at the other times to the reference time, thereby obtaining all the sub-data at the reference time; the data composed of these sub-data is the first target data, which realizes intra-frame data synchronization for the first sensor.
  • the latest one of the multiple detection moments may be selected as the third moment.
  • in this way, intra-frame synchronization of the multiple sub-data in the first raw data can be completed, thereby further improving the accuracy of sensor synchronization.
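The reference-time selection can be sketched as follows, picking the latest detection time as suggested above (names are illustrative):

```python
def pick_reference_time(packet_times):
    """Choose the reference (third) time for intra-frame synchronization:
    the latest detection time among the sub-data, so that every other
    packet is compensated forward to it."""
    return max(packet_times)

# one LiDAR frame of n+1 packets, each carrying its own detection time
times = [0.00, 0.01, 0.02, 0.03]
third_time = pick_reference_time(times)
to_compensate = [t for t in times if t != third_time]
```

Choosing the latest time means all other packets are compensated forward, which avoids extrapolating the pose queue beyond its newest entry.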
  • FIG. 8 is a schematic flow chart 6 of a sensor data processing method provided by an embodiment of the disclosure. As shown in FIG. 8, an optional implementation manner of the foregoing step S702 includes:
  • S801 Determine the coordinate system of the first sensor at the detection time of each sub-data according to the pose of the smart device at the detection time of each sub-data.
  • the third time is used as the reference time, and the sub-data of other detection time is compensated to the third time to realize the synchronization of the sub-data.
  • the process of determining the sub-data corresponding to the sub-data at the third time according to the coordinate system at the detection time of the sub-data and the coordinate system at the third time is consistent with the processing in the above step S503.
  • S803 Perform integration processing on the sub-data corresponding to each sub-data at the third time except the third time to obtain the first target data of the first sensor at the first time.
  • the sub-data corresponding to each sub-data at the third time can be sorted and combined according to their original detection times to obtain the above-mentioned first target data.
  • all sub-data in the first target data are data in the coordinate system corresponding to the third time, that is, all sub-data in the first target data are synchronized.
  • the third time corresponds to the first time under the clock of the smart device; therefore, under the clock of the smart device, the above-mentioned first target data is the detection data at the first moment.
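Steps S801-S803 can be sketched as a coordinate transform between the sensor frame at a packet's detection time and the sensor frame at the third time. This is a planar (x, y, yaw) sketch with hypothetical names; a real system would use full SE(3) poses and sensor extrinsics:

```python
import numpy as np

def pose_to_matrix(x, y, yaw):
    """Homogeneous 2D transform of the sensor frame in the world at a given time."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0.0, 0.0, 1.0]])

def compensate_sub_data(points, pose_det, pose_ref):
    """Re-express points measured in the sensor frame at the detection time
    in the sensor frame at the reference (third) time:
    p_ref = T_ref^-1 * T_det * p_det."""
    T = np.linalg.inv(pose_to_matrix(*pose_ref)) @ pose_to_matrix(*pose_det)
    pts_h = np.hstack([points, np.ones((points.shape[0], 1))])  # homogeneous
    return (pts_h @ T.T)[:, :2]

# the device moved 1 m along x between this packet's detection time and
# the third time; a point at x=5 then appears at x=4 in the reference frame
pts = np.array([[5.0, 0.0]])
out = compensate_sub_data(pts, pose_det=(0.0, 0.0, 0.0), pose_ref=(1.0, 0.0, 0.0))
```

Applying this per packet and concatenating the results in their original detection order corresponds to the integration step S803.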
  • Figure 9 is an example diagram of performing intra-frame data synchronization on the first raw data to obtain the first target data.
  • as shown in Figure 9, each frame of data reported by the LiDAR includes n+1 data packets, and each data packet corresponds to a detection time. The detection time of the nth data packet (data packet n) can be used as the reference time, and data packets 0 to n-1 are each compensated to the time corresponding to data packet n, thereby realizing intra-frame synchronization of one frame of data.
  • the above describes the process of performing intra-frame synchronization on the first original data composed of multiple sub-data to obtain the first target data.
  • if the smart device contains multiple sensors of the same type, and the detection data of each such sensor takes the above-mentioned form of multiple sub-data, the first sensor is one of these multiple sensors of the same type.
  • the detection data of multiple sensors of the same type may be synchronized based on the multiple sub-data synchronization method described above.
  • FIG. 10 is a schematic flow diagram 7 of a sensor data processing method provided by an embodiment of the disclosure. As shown in FIG. 10, the process of synchronizing detection data of multiple sensors of the same type includes:
  • the type of the third sensor is the same as that of the first sensor.
  • the second raw data includes multiple sub-data. Each sub-data has a corresponding detection time.
  • the third time is the reference time for the detection times of the multiple sub-data included in the second raw data;
  • that is, one of the detection times of the first sensor, namely the third time, is used as the reference time when synchronizing the third sensor.
  • each sub-data in the second raw data is compensated to the third time, so that all sub-data in the second raw data are synchronized, yielding the synchronized third target data.
  • since the third time of the first sensor is used as the reference time, the third target data synchronized for the third sensor and the first target data synchronized for the first sensor are synchronized with each other.
  • Figure 11 is an example diagram of synchronizing sensors of the same type. As shown in Figure 11, taking the first, third, and fourth sensors all being LiDARs as an example, each frame of data reported by each LiDAR includes n+1 data packets, and each data packet corresponds to a detection time.
  • the detection time of the nth data packet (data packet n) of the first sensor can be used as the reference time: data packets 0 to n-1 of the first sensor are compensated to the time of data packet n, data packets 0 to n of the third sensor are compensated to the time of data packet n of the first sensor, and data packets 0 to n of the fourth sensor are likewise compensated to the time of data packet n of the first sensor.
  • the intra-frame synchronization of the first sensor, the third sensor, and the fourth sensor, and the inter-frame synchronization between the first sensor, the third sensor, and the fourth sensor are realized.
  • that is, the detection time of one sub-data in the detection data of one sensor can be used as the reference time, and for each of the remaining sensors of the same type, all sub-data of that sensor are compensated to the reference time. After this processing, not only is intra-frame synchronization of each sensor realized, but inter-frame synchronization between sensors of the same type is also realized.
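The multi-sensor scheme above can be sketched as follows. The data layout and function names are assumptions, and the per-packet motion compensation itself is stubbed out as a placeholder:

```python
def sync_same_type_sensors(frames, ref_sensor=0, ref_packet=-1):
    """Synchronize several same-type sensors (e.g. several LiDARs): the
    detection time of one packet of one sensor (here the last packet of
    the reference sensor) becomes the common reference time, and every
    packet of every sensor is compensated to it."""
    t_ref = frames[ref_sensor][ref_packet][0]

    def compensate(packet, t_from, t_to):
        # placeholder: a real system applies the pose-based transform here
        return packet

    return t_ref, [
        [(t_ref, compensate(p, t, t_ref)) for t, p in frame] for frame in frames
    ]

# two LiDARs, three packets each, with their per-packet detection times
lidar0 = [(0.00, "p0"), (0.01, "p1"), (0.02, "p2")]
lidar1 = [(0.005, "q0"), (0.015, "q1"), (0.025, "q2")]
t_ref, synced = sync_same_type_sensors([lidar0, lidar1])
```

After this step every packet of every same-type sensor carries the same reference time, which is exactly the intra-frame plus inter-frame synchronization described above.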
  • in this case, performing compensation processing on the first target data according to the first pose and the second pose to obtain the compensation data of the first sensor at the second moment includes: performing compensation processing on both the first target data and the third target data according to the first pose and the second pose, to obtain the compensation data of the first sensor at the second moment and the compensation data of the third sensor at the second moment.
  • that is, when the first target data is compensated according to the first pose and the second pose, the third target data can also be compensated according to the first pose and the second pose, to obtain the compensation data of the first sensor at the second moment and the compensation data of the third sensor at the second moment. Since the first sensor and the third sensor are sensors of the same type and have already been synchronized, synchronizing them with the second sensor on this basis can further improve the accuracy of sensor synchronization.
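The compensation of both target data sets to the second moment can be sketched with the same planar-pose machinery. This is a sketch under the assumption of 2D (x, y, yaw) poses; the field values are invented for illustration:

```python
import numpy as np

def rigid(x, y, yaw):
    """Homogeneous 2D transform of the device frame in the world."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0.0, 0.0, 1.0]])

def compensate_to_second_moment(target_data, first_pose, second_pose):
    """Re-express data valid at the first moment (device at first_pose)
    in the device frame at the second moment (device at second_pose), so
    it lines up with the second sensor's data."""
    T = np.linalg.inv(rigid(*second_pose)) @ rigid(*first_pose)
    pts = np.hstack([target_data, np.ones((len(target_data), 1))])
    return (pts @ T.T)[:, :2]

first_target = np.array([[2.0, 0.0]])   # first sensor's synchronized data
third_target = np.array([[0.0, 3.0]])   # same-type third sensor's data
# the device advanced 0.5 m along x between the first and second moment
comp_first = compensate_to_second_moment(first_target, (0.0, 0.0, 0.0), (0.5, 0.0, 0.0))
comp_third = compensate_to_second_moment(third_target, (0.0, 0.0, 0.0), (0.5, 0.0, 0.0))
```

Both data sets are transformed by the same pose pair, so after this step the first sensor, the third sensor, and the second sensor all describe the scene at the second moment.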
  • FIG. 12 is the first module structure diagram of a sensor data processing device provided by an embodiment of the disclosure. As shown in FIG. 12, the device includes:
  • the first acquiring module 1201 is configured to acquire the first target data of the first sensor of the smart device at the first moment;
  • the second acquisition module 1202 is configured to acquire second target data of the second sensor of the smart device at a second time, where the first time and the second time are different time under the clock of the smart device;
  • the third acquiring module 1203 is configured to acquire the first pose of the smart device at the first moment and the second pose at the second moment, the first pose and the second pose different;
  • the compensation module 1204 is configured to perform compensation processing on the first target data according to the first pose and the second pose to obtain compensation data of the first sensor at the second time.
  • the device is used to implement the foregoing method embodiments, and its implementation principles and technical effects are similar, and will not be repeated here.
  • the compensation module 1204 is configured to determine the first coordinate system of the first sensor at the first moment according to the first pose; determine the second coordinate system of the first sensor at the second moment according to the second pose; and perform compensation processing on the first target data according to the first coordinate system and the second coordinate system to obtain the compensation data of the first sensor at the second moment.
  • the third acquisition module 1203 is configured to respectively acquire the pose detection data of the smart device detected at multiple times by the sensor with the pose detection function set on the smart device.
  • Each of the two moments is a moment under the clock of the smart device; according to the pose detection data of the smart device detected at multiple moments by the sensor with pose detection function set on the smart device, generate The pose queue of the smart device; the first pose and the second pose are determined according to the pose queue of the smart device, the first moment and the second moment.
  • the third acquisition module 1203 is further configured to, in response to the pose queue not including the pose at the first moment, perform compensation processing on the pose queue of the smart device according to the first moment to obtain the first pose; and/or, in response to the pose queue not including the pose at the second moment, perform compensation processing on the pose queue of the smart device according to the second moment to obtain the second pose.
  • FIG. 13 is the second module structure diagram of the sensor data processing device provided by the embodiment of the disclosure.
  • the device further includes: a first determining module 1205, configured to determine, according to the difference information between the clock of the first sensor and the clock of the smart device, that the third time corresponds to the first time under the smart device clock, where the third time is used to identify the time at which the first sensor detected the first raw data corresponding to the first target data and is a time under the clock of the first sensor.
  • the first acquisition module 1201 is configured to receive the first raw data reported by the first sensor, the first raw data including multiple sub-data, each sub-data having a corresponding detection time, and the third time being the reference time for the detection times of the multiple sub-data included in the first raw data; and to acquire the first target data of the first sensor at the first moment according to the pose of the smart device at the detection time of each sub-data and each sub-data.
  • the first acquisition module 1201 is configured to determine the coordinate system of the first sensor at the detection time of each sub-data according to the pose of the smart device at the detection time of each sub-data; The coordinate system of the first sensor at the detection time of each sub-data except the third time and the coordinate system of the first sensor at the third time, respectively determine each sub-data except the third time The sub-data corresponding to the third time; the sub-data corresponding to each sub-data at the third time except the third time is integrated to obtain the data of the first sensor at the first time The first target data.
  • FIG. 14 is the third module structure diagram of the sensor data processing device provided by the embodiment of the disclosure.
  • the device further includes: a receiving module 1206, configured to receive the second raw data reported by the third sensor, where the type of the third sensor is the same as the type of the first sensor, the second raw data includes multiple sub-data, each sub-data has a corresponding detection time, and the third time is the reference time for the detection times of the multiple sub-data included in the second raw data;
  • a fourth acquisition module 1207, configured to acquire the third target data of the third sensor at the first moment according to the pose of the smart device at the detection time of each sub-data of the second raw data and each sub-data of the second raw data;
  • the compensation module 1204 is configured to perform compensation processing on the first target data and the third target data according to the first pose and the second pose, to obtain that the first sensor is in the second The compensation data at the time and the compensation data of the third sensor at the second time.
  • the device further includes: a fifth acquisition module 1208, configured to, before the first determining module 1205 determines according to the difference information between the clock of the first sensor and the clock of the smart device that the third time corresponds to the first time under the smart device clock, obtain the difference information between the clock of the first sensor and the clock of the smart device according to the clock error between the GPS clock and the smart device.
  • FIG. 16 is the fifth module structure diagram of the sensor data processing device provided by the embodiments of the disclosure.
  • the device further includes: a second determining module 1209, configured to, before the second acquisition module 1202 acquires the second target data of the second sensor of the smart device at the second time, determine, according to the difference information between the clock of the second sensor and the clock of the smart device, that the fourth time carried when the second sensor reports the second target data corresponds to the second time under the smart device clock, where the fourth time is used to identify the detection time of the second target data and is a time under the clock of the second sensor.
  • FIG. 17 is a module structure diagram 6 of the sensor data processing device provided by an embodiment of the disclosure. As shown in FIG. 17, the device further includes: a photographing module 1210 for photographing the stopwatch of the smart device using the second sensor Multiple video frames;
  • an analysis module 1211, configured to compare and analyze the time information of each video frame and the time information displayed by the stopwatch in each video frame to obtain the difference information between the clock of the second sensor and the clock of the smart device.
  • the second sensor is a camera
  • the first sensor is a lidar or millimeter wave radar.
  • the division of the various modules of the above device is only a division of logical functions, and may be fully or partially integrated into a physical entity in actual implementation, or may be physically separated.
  • these modules can all be implemented in the form of software called by processing elements; they can also be implemented in the form of hardware; some modules can be implemented in the form of calling software by processing elements, and some of the modules can be implemented in the form of hardware.
  • the determining module may be a separately established processing element, or it may be integrated into a certain chip of the above-mentioned device for implementation.
  • each step of the above method or each of the above modules can be completed by hardware integrated logic circuits in the processor element or instructions in the form of software.
  • the above modules may be one or more integrated circuits configured to implement the above methods, for example: one or more application-specific integrated circuits (ASIC), one or more digital signal processors (DSP), or one or more field-programmable gate arrays (FPGA), etc.
  • the processing element may be a general-purpose processor, such as a central processing unit (CPU) or other processors that can call program codes.
  • these modules can be integrated together and implemented in the form of a System-On-a-Chip (SOC).
  • the computer program product includes one or more computer instructions.
  • the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server or data center integrated with one or more available media.
  • the usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)), etc.
  • FIG. 18 is a schematic structural diagram of an electronic device provided by an embodiment of the disclosure.
  • the electronic device 1800 may include: a processor 181 and a memory 182; the memory 182 is used to store computer instructions, and when the processor 181 executes the computer instructions, the solutions of the embodiments shown in FIGS. 1 to 10 above are implemented.
  • the electronic device 1800 may further include a communication interface 183 for communicating with other devices. It can be understood that the electronic device 1800 may further include a system bus 184, and the system bus 184 is used to implement connection and communication between these components.
  • the system bus 184 may be a Peripheral Component Interconnect (PCI) bus or an Extended Industry Standard Architecture (EISA) bus.
  • the system bus 184 can be divided into an address bus, a data bus, a control bus, and the like. For ease of representation, only one thick line is used in the figure, but it does not mean that there is only one bus or one type of bus.
  • the communication interface 183 is used to implement communication between the database access device and other devices (for example, a client, a read-write library, and a read-only library).
  • the memory 182 may include a random access memory (Random Access Memory, RAM), and may also include a non-volatile memory (Non-Volatile Memory), such as at least one disk memory.
  • the aforementioned processor 181 may be a general-purpose processor, including a CPU, a network processor (NP), etc.; it may also be a DSP, ASIC, FPGA or other programmable logic device, discrete gate or transistor logic device, or discrete hardware component.
  • FIG. 19 is a schematic flow chart of the intelligent driving control method provided by the embodiments of the present disclosure.
  • the embodiments of the present disclosure also provide an intelligent driving control method, including:
  • S1901 respectively obtain detection data of a first sensor and a second sensor set on the smart device, and the detection data is obtained by using the sensor data processing method provided in the embodiment of the present disclosure
  • S1902 Perform intelligent driving control on the smart device according to the detection data.
  • the execution subject of this embodiment may be a smart driving control device.
  • the smart driving control device of this embodiment and the electronic device described in the above embodiments may be located in the same device, or may be separately located in different devices.
  • the intelligent driving control device of this embodiment is in communication connection with the above-mentioned electronic equipment.
  • the detection data of the first sensor and the second sensor are obtained by the method of the above-mentioned embodiment, and the specific process is referred to the description of the above-mentioned embodiment, which will not be repeated here.
  • the electronic device executes the above-mentioned sensor data processing method, obtains detection data of the first sensor and the second sensor set on the smart device, and outputs the detection data of the first sensor and the second sensor set on the smart device .
  • the intelligent driving control device obtains the detection data of the first sensor and the second sensor, and performs intelligent driving control on the smart device according to the detection data.
  • the smart driving in this embodiment includes assisted driving, automatic driving, and/or driving mode switching between assisted driving and automatic driving, and the like.
  • the above-mentioned intelligent driving control may include: braking, changing the driving speed, changing the driving direction, keeping to the lane line, changing the state of the lights, switching the driving mode, etc., where driving mode switching may be switching between assisted driving and automatic driving, for example, switching from assisted driving to automatic driving through intelligent driving control.
  • in this embodiment, the smart driving control device obtains the detection data of the sensors provided on the smart device and performs smart driving control based on that detection data, thereby improving the safety and reliability of smart driving.
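As a toy illustration of step S1902, the control decision can be sketched as a mapping from synchronized detection data to one of the actions listed above. The `nearest_obstacle_m` field and the distance thresholds are wholly hypothetical; the disclosure only enumerates the possible actions, not the decision logic:

```python
def intelligent_driving_control(detection_data):
    """Map synchronized detection data to a control action (sketch).
    Field name and thresholds are invented for illustration."""
    distance = detection_data["nearest_obstacle_m"]
    if distance < 5.0:
        return "brake"
    if distance < 20.0:
        return "reduce_speed"
    return "keep_lane"

action = intelligent_driving_control({"nearest_obstacle_m": 12.0})
```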
  • FIG. 20 is a schematic structural diagram of an intelligent driving control device provided by an embodiment of the disclosure.
  • the intelligent driving control device 2000 of the embodiment of the present disclosure includes: an acquisition module 2001, configured to respectively acquire detection data of the first sensor and the second sensor set on the smart device, the detection data being obtained by using the aforementioned sensor data processing method;
  • the intelligent driving control module 2002 is used to perform intelligent driving control on the smart device according to the detection data.
  • the intelligent driving control device of the embodiment of the present disclosure may be used to execute the technical solution of the method embodiment shown above, and its implementation principle and technical effect are similar, and will not be repeated here.
  • FIG. 21 is a schematic diagram of a smart driving system provided by an embodiment of the disclosure.
  • the smart driving system 2100 of this embodiment includes: a sensor 2101, an electronic device 1800, and a smart driving control device 2000 connected in communication, wherein the electronic device 1800 is shown in FIG. 18, and the intelligent driving control device 2000 is shown in FIG. 20.
  • the sensor 2101 may include at least one of the cameras, LiDAR, RADAR, high-precision inertial navigation and other sensors described in the foregoing embodiments.
  • the sensor 2101 detects the surrounding environment of the smart device to obtain raw detection data and sends this detection data to the electronic device 1800; the electronic device 1800 receives the raw detection data and performs data synchronization according to the above-mentioned sensor data processing method to obtain synchronized detection data.
  • the electronic device 1800 sends the synchronized detection data to the smart driving control device 2000, and the smart driving control device 2000 performs smart driving control on the smart device according to the synchronized detection data.
  • an embodiment of the present application also provides a storage medium storing instructions which, when run on a computer, cause the computer to execute the method of any one of the embodiments shown in FIGS. 1 to 10, or cause the computer to execute the method of the embodiment shown in FIG. 19 above.
  • An embodiment of the present application also provides a chip for executing instructions.
  • the chip is used to execute the method of any one of the embodiments shown in FIG. 1 to FIG. 10; or, the chip is used to execute the method of the embodiment shown in FIG. 19.
  • An embodiment of the present application further provides a program product, the program product includes a computer program, the computer program is stored in a storage medium, at least one processor can read the computer program from the storage medium, and the at least one When the processor executes the computer program, the method of any one of the embodiments shown in FIGS. 1 to 10 may be implemented; or, when the at least one processor executes the computer program, the method of the embodiment shown in FIG. 19 may be implemented .
  • "at least one" refers to one or more, and "multiple" refers to two or more.
  • “And/or” describes the association relationship of the associated objects, indicating that there can be three relationships, for example, A and/or B, which can mean: A alone exists, both A and B exist, and B exists alone, where A, B can be singular or plural.
  • the character “/” generally indicates that the associated objects before and after are in an “or” relationship; in the formula, the character “/” indicates that the associated objects before and after are in a “division” relationship.
  • “The following at least one item (a)” or similar expressions refers to any combination of these items, including any combination of a single item (a) or plural items (a).
  • "at least one of a, b, or c" can mean: a, b, c, a and b, a and c, b and c, or a, b and c, where a, b, and c can each be singular or plural.
  • the size of the sequence numbers of the foregoing processes does not imply their order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present disclosure.

Abstract

A method and apparatus for processing data of a sensor, an electronic device, and a system. The method comprises: obtaining first target data of a first sensor of a smart device at a first moment (S201); obtaining second target data of a second sensor of the smart device at a second moment, the first moment and the second moment being different moments on a clock of the smart device (S202); obtaining a first pose at the first moment and a second pose at the second moment of the smart device, the first pose and the second pose being different (S203); and performing compensation processing on the first target data according to the first pose and the second pose to obtain compensation data of the first sensor at the second moment (S204).

Description

Sensor Data Processing Method, Apparatus, Electronic Device, and System
Cross-Reference to Related Applications
This application is based on, and claims priority to, Chinese Patent Application No. 201910556258.2, filed on June 25, 2019, the entire contents of which are incorporated herein by reference.
Technical Field
The embodiments of the present disclosure relate to intelligent driving technology, and in particular to a sensor data processing method and apparatus, an electronic device, and a system.
Background
Assisted driving and autonomous driving are two important technologies in the field of intelligent driving. Because they can reduce the occurrence of traffic accidents, they play an important role in this field. The implementation of assisted driving and autonomous driving requires the cooperation of multiple sensors. The sensors are mounted at different positions on the vehicle and collect road images, vehicle operating data, and other information in real time; the assisted driving system or autonomous driving system then performs path planning and other control operations based on the data collected by each sensor. Since the sensors installed on a vehicle may differ in trigger time and trigger source, synchronization problems may arise both among different types of sensors and among multiple sensors of the same type.
Summary
The embodiments of the present disclosure provide a sensor data processing method and apparatus, an electronic device, and a system.
In a first aspect, an embodiment of the present disclosure provides a sensor data processing method, including: acquiring first target data of a first sensor of a smart device at a first time; acquiring second target data of a second sensor of the smart device at a second time, where the first time and the second time are different times under the clock of the smart device; acquiring a first pose of the smart device at the first time and a second pose of the smart device at the second time, where the first pose and the second pose are different; and performing compensation processing on the first target data according to the first pose and the second pose to obtain compensation data of the first sensor at the second time.
In a second aspect, an embodiment of the present disclosure further provides a sensor data processing apparatus, including: a first acquisition module, configured to acquire first target data of a first sensor of a smart device at a first time; a second acquisition module, configured to acquire second target data of a second sensor of the smart device at a second time, where the first time and the second time are different times under the clock of the smart device; a third acquisition module, configured to acquire a first pose of the smart device at the first time and a second pose of the smart device at the second time, where the first pose and the second pose are different; and a compensation module, configured to perform compensation processing on the first target data according to the first pose and the second pose to obtain compensation data of the first sensor at the second time.
In a third aspect, embodiments of the present disclosure further provide an intelligent driving control method, including: an intelligent driving control apparatus acquires detection data of a sensor provided on a smart device, where the detection data is obtained using the sensor data processing method described in the first aspect; and the intelligent driving control apparatus performs intelligent driving control on the smart device according to the detection data.
In a fourth aspect, embodiments of the present disclosure further provide an intelligent driving control apparatus, including: an acquisition module, configured to acquire detection data of a sensor provided on a smart device, where the detection data is obtained using the sensor data processing method described in the first aspect; and an intelligent driving control module, configured to perform intelligent driving control on the smart device according to the detection data.
In a fifth aspect, embodiments of the present disclosure further provide an electronic device, including: a memory, configured to store program instructions; and a processor, configured to call and execute the program instructions in the memory to perform the method steps described in the first aspect.
In a sixth aspect, embodiments of the present disclosure further provide an intelligent driving system, including: a sensor, the electronic device described in the fifth aspect, and the intelligent driving control apparatus described in the fourth aspect, which are communicatively connected.
In a seventh aspect, embodiments of the present disclosure further provide a readable storage medium storing a computer program, where the computer program is configured to perform the method steps described in the first aspect, or the computer program is configured to perform the method steps described in the third aspect.
According to the sensor data processing method and apparatus, electronic device, and system provided by the embodiments of the present disclosure, after the first target data of the first sensor at the first time and the second target data of the second sensor at the second time are acquired, compensation processing is performed on the first target data according to the pose of the smart device at the first time and the pose of the smart device at the second time, thereby obtaining the compensation data of the first sensor at the second time. Since the second target data of the second sensor is also detection data at the second time, compensating the data of the first sensor yields the data of the first sensor at the time corresponding to the second sensor; that is, the resulting first target data of the first sensor and the second target data of the second sensor correspond to the same time, thereby achieving synchronization of the first sensor and the second sensor. Because this method achieves sensor synchronization in software, no additional dedicated hardware for synchronously triggering multiple sensors needs to be deployed, which reduces the hardware cost required for multi-sensor data synchronization.
Brief Description of the Drawings
To explain the technical solutions in the present disclosure or the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below illustrate some embodiments of the present disclosure; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a schematic diagram of an application scenario of the sensor data processing method provided by an embodiment of the present disclosure;
Fig. 2 is a first schematic flowchart of the sensor data processing method provided by an embodiment of the present disclosure;
Fig. 3 is a schematic diagram of data synchronization between the first sensor and the second sensor through the above process;
Fig. 4 is a second schematic flowchart of the sensor data processing method provided by an embodiment of the present disclosure;
Fig. 5 is a third schematic flowchart of the sensor data processing method provided by an embodiment of the present disclosure;
Fig. 6 is a fourth schematic flowchart of the sensor data processing method provided by an embodiment of the present disclosure;
Fig. 7 is a fifth schematic flowchart of the sensor data processing method provided by an embodiment of the present disclosure;
Fig. 8 is a sixth schematic flowchart of the sensor data processing method provided by an embodiment of the present disclosure;
Fig. 9 is an example diagram of performing intra-frame data synchronization on first original data to obtain first target data;
Fig. 10 is a seventh schematic flowchart of the sensor data processing method provided by an embodiment of the present disclosure;
Fig. 11 is an example diagram of synchronizing sensors of the same type;
Fig. 12 is a first module structure diagram of the sensor data processing apparatus provided by an embodiment of the present disclosure;
Fig. 13 is a second module structure diagram of the sensor data processing apparatus provided by an embodiment of the present disclosure;
Fig. 14 is a third module structure diagram of the sensor data processing apparatus provided by an embodiment of the present disclosure;
Fig. 15 is a fourth module structure diagram of the sensor data processing apparatus provided by an embodiment of the present disclosure;
Fig. 16 is a fifth module structure diagram of the sensor data processing apparatus provided by an embodiment of the present disclosure;
Fig. 17 is a sixth module structure diagram of the sensor data processing apparatus provided by an embodiment of the present disclosure;
Fig. 18 is a schematic structural diagram of an electronic device provided by an embodiment of the present disclosure;
Fig. 19 is a schematic flowchart of the intelligent driving control method provided by an embodiment of the present disclosure;
Fig. 20 is a schematic structural diagram of the intelligent driving control apparatus provided by an embodiment of the present disclosure;
Fig. 21 is a schematic diagram of the intelligent driving system provided by an embodiment of the present disclosure.
Detailed Description
To make the objectives, technical solutions, and advantages of the present disclosure clearer, the technical solutions in the embodiments of the present disclosure are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are some, rather than all, of the embodiments of the present disclosure. Based on these embodiments, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of the present disclosure.
Fig. 1 is a schematic diagram of an application scenario of the sensor data processing method provided by an embodiment of the present disclosure. As shown in Fig. 1, the method is applicable to smart devices equipped with sensors, such as vehicles, robots, and guide devices for the blind. The types of sensors installed on a smart device may include at least one of a camera, a LiDAR (Light Detection and Ranging) sensor, a millimeter-wave RADAR (Radio Detection and Ranging) sensor, a high-precision inertial navigation unit, a Controller Area Network bus (CANBUS), and the like. There may be one or more sensors of the same type on the smart device, for example, one or more cameras, one or more LiDARs, and so on, and different sensors may be mounted at different positions on the smart device. Fig. 1 takes a camera and a LiDAR as examples. The camera captures images of the environment around the smart device in real time and reports the captured images to the intelligent driving system of the smart device. The LiDAR obtains three-dimensional point coordinates around the smart device by emitting and receiving laser pulses, forms point cloud data, and reports the point cloud data to the intelligent driving system of the smart device. The RADAR uses electromagnetic waves to detect objects such as the ground, vehicles, and trees around the smart device and receives their echoes, thereby obtaining information such as the azimuth, height, and distance of each object, and reports this information to the intelligent driving system of the smart device. The CANBUS transmits the operating parameters of the smart device, such as the vehicle's throttle parameters, steering wheel parameters, and wheel speed, to the intelligent driving system of the smart device via serial data transmission. Based on the data reported by each sensor, the intelligent driving system performs intelligent driving control, such as vehicle positioning, path planning, path deviation warning, and traffic flow analysis.
For ease of description, the following embodiments of the present disclosure refer to the intelligent driving system of the smart device simply as the "system".
Fig. 2 is a first schematic flowchart of the sensor data processing method provided by an embodiment of the present disclosure. As shown in Fig. 2, the method includes:
S201: Acquire first target data of a first sensor of the smart device at a first time.
S202: Acquire second target data of a second sensor of the smart device at a second time, where the first time and the second time are different times under the clock of the smart device.
Optionally, the system may send a data reporting instruction to the first sensor and the second sensor; after receiving the data reporting instruction, the first sensor and the second sensor detect data and send the detected data to the system. Alternatively, the first sensor and the second sensor may automatically detect data according to a preset cycle and send the detected data to the system.
S203: Acquire a first pose of the smart device at the first time and a second pose of the smart device at the second time, where the first pose and the second pose are different.
S204: Perform compensation processing on the first target data according to the first pose and the second pose to obtain compensation data of the first sensor at the second time.
The first sensor and the second sensor may be any two sensors on the smart device. Since one or more sensors of the same type may be provided on the smart device, the first sensor and the second sensor may be two sensors of different types or two sensors of the same type.
In one example, the first sensor may be a LiDAR and the second sensor may be a camera.
When the first sensor is a LiDAR and the second sensor is a camera, the data detected by the camera is two-dimensional, while the data detected by the LiDAR is three-dimensional, and the subsequent compensation processing involves operations such as rotation and translation on the data. Therefore, taking the camera, which detects two-dimensional information, as the reference second sensor, and applying motion compensation operations such as rotation and translation to the three-dimensional data detected by the LiDAR, ensures that no additional depth error is introduced during the compensation processing.
In another example, the first sensor and the second sensor may both be LiDARs.
In the embodiments of the present disclosure, taking a vehicle as an example of the smart device, each sensor may use its own independent clock or may use the same clock as the vehicle's intelligent driving system. The first time and the second time described in this embodiment both refer to times under the clock of the vehicle's intelligent driving system. Each sensor may report to the system the detection time of its data under the sensor's own clock, for example by carrying time information in the reported data. In the embodiments of the present disclosure, the time at which a sensor sends data is equal or approximately equal to the time at which the system receives the data. If the clock of the first sensor is the same as the clock of the system, the first time is both the time under the first sensor's own clock when the first sensor sends the data and the time under the system clock when the system receives the data. If the first sensor's own clock differs from the system clock, the system needs to derive the first time from the data sending time under the first sensor's own clock and the clock offset between the first sensor and the system; this process is described in detail in the following embodiments. The second time is handled similarly to the first time and is not described again here.
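The clock-offset handling described above can be sketched as follows. This is a minimal illustration, not the disclosed implementation: the function names `estimate_clock_offset` and `to_system_time` are hypothetical, and averaging the receive-minus-send differences is only one simple way to estimate the offset under the embodiment's assumption that the send time is approximately equal to the receive time.

```python
def estimate_clock_offset(sensor_send_times, system_receive_times):
    """Estimate (system clock - sensor clock) from paired timestamps.

    The embodiment assumes the time a sensor sends data is equal or
    approximately equal to the time the system receives it, so the
    average receive-minus-send difference approximates the offset.
    """
    diffs = [rx - tx for tx, rx in zip(sensor_send_times, system_receive_times)]
    return sum(diffs) / len(diffs)


def to_system_time(sensor_timestamp, clock_offset):
    """Convert a timestamp under the sensor's own clock to the system clock."""
    return sensor_timestamp + clock_offset
```

With such helpers, every sensor timestamp can be mapped onto the system clock before any two sensors are compared, which is the precondition for the compensation step that follows.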
Due to factors such as different trigger sources and different trigger times, the first sensor and the second sensor may be out of synchronization. For example, suppose the first sensor is a LiDAR and the second sensor is a camera: the LiDAR reports one frame of data to the system for each full rotation, while the camera reports data to the system according to its own capture cycle. Therefore, even if the LiDAR and the camera start working at the same time, the data they report to the system may not correspond to the same time. For example, the LiDAR may detect a person 100 meters in front of the vehicle while the camera captures a person 120 meters in front of the vehicle. To address this problem, in this embodiment, under the premise that the first time and the second time are both times under the system clock, the second sensor serves as the reference sensor for the first sensor, and the second time at which the system receives the second target data of the second sensor serves as the reference time for the first time at which the first sensor sends its data. According to the first pose of the vehicle at the first time and the second pose of the vehicle at the second time, compensation processing is performed on the first target data to obtain the compensation data of the first sensor at the second time. Since the second target data of the second sensor is also detection data at the second time, compensating the data of the first sensor yields the data of the first sensor at the time corresponding to the second sensor; that is, the data of the first sensor and the data of the second sensor correspond to the same time, thereby achieving synchronization of the first sensor and the second sensor. Because synchronization is achieved in software, no additional dedicated hardware for synchronously triggering multiple sensors needs to be deployed, which reduces the hardware cost required for multi-sensor data synchronization.
Fig. 3 is a schematic diagram of data synchronization between the first sensor and the second sensor through the above process. As shown in Fig. 3, taking the first sensor as a LiDAR and the second sensor as a camera, the first sensor and the second sensor report data frames at certain intervals; the data frames reported by the LiDAR may be called radar frames (or LiDAR frames), and the data frames reported by the camera may be called camera frames. In one cycle, the system receives a radar frame at time Tl (that is, the first time) and a camera frame at time Tc (that is, the second time). Using the camera as the reference sensor, the above process yields the data of the Tl radar frame as it would appear at time Tc, which is equivalent to acquiring radar data and camera data simultaneously at time Tc, thus achieving synchronization of the LiDAR and the camera.
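The pairing of a radar frame at Tl with a camera frame at Tc can be sketched as a nearest-timestamp match. The helper below is illustrative only, not part of the disclosure: the function name and the `max_gap` threshold are assumptions, and a real system might instead pair frames per reporting cycle.

```python
def pair_frames(radar_times, camera_times, max_gap=0.1):
    """For each camera frame (the reference sensor), pick the radar frame
    whose timestamp is closest, provided the gap is within max_gap seconds.
    Returns a list of (Tl, Tc) pairs, all under the system clock."""
    pairs = []
    for tc in camera_times:
        tl = min(radar_times, key=lambda t: abs(t - tc))
        if abs(tl - tc) <= max_gap:
            pairs.append((tl, tc))
    return pairs
```

Each resulting (Tl, Tc) pair then feeds the pose-based compensation step, which moves the Tl radar frame to time Tc.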
In the above example, each sensor sends its detected data to the system in real time, and sensor data synchronization is performed on this data. In another scenario, the data detected by each sensor may be recorded in advance and played back later. During data playback, the data of the sensors also needs to be synchronized.
Fig. 4 is a schematic diagram of the synchronization of the recorded data of each sensor during data playback. As shown in Fig. 4, the detection data of the camera is recorded in advance to obtain a series of camera frames, each of which records the timestamp of its detection; the detection data of the CANBUS/high-precision inertial navigation unit is recorded in advance to obtain a series of vehicle operating data; and the detection data of the LiDAR is recorded in advance to obtain a series of radar frames, each of which records the timestamp of its detection. During data playback, one camera frame and one radar frame are read. At the same time, the vehicle's pose queue is derived from the recorded CANBUS/high-precision inertial navigation detection data; the vehicle's pose at the detection time of the camera frame is then obtained from the camera frame's timestamp, and the pose at the detection time of the radar frame is obtained from the radar frame's timestamp. Based on these two poses, the radar frame is compensated to the detection time of the camera frame, thereby synchronizing the camera frame and the radar frame; operations such as driving control can then be performed based on the synchronized camera frame and radar frame.
In a specific implementation, for the multiple sensors on the smart device, one of them may be selected as the second sensor, that is, the reference sensor, and each of the other sensors may be synchronized with that reference sensor, thereby achieving synchronization of all sensors on the smart device.
In this embodiment, after the first target data of the first sensor at the first time and the second target data of the second sensor at the second time are acquired, compensation processing is performed on the first target data according to the first pose of the smart device at the first time and the second pose of the smart device at the second time, thereby obtaining the compensation data of the first sensor at the second time. Since the second target data of the second sensor is also detection data at the second time, compensating the data of the first sensor yields the data of the first sensor at the time corresponding to the second sensor; that is, the first target data of the first sensor and the second target data of the second sensor correspond to the same time, thereby achieving synchronization of the first sensor and the second sensor. Because this method achieves sensor synchronization in software, no additional dedicated hardware for synchronously triggering multiple sensors needs to be deployed, which reduces the hardware cost required for multi-sensor data synchronization.
Fig. 5 is a third schematic flowchart of the sensor data processing method provided by an embodiment of the present disclosure. As shown in Fig. 5, an optional way of performing compensation processing on the first target data according to the first pose and the second pose in step S204 includes:
S501: Determine, according to the first pose, a first coordinate system of the first sensor at the first time.
S502: Determine, according to the second pose, a second coordinate system of the first sensor at the second time.
S503: Perform compensation processing on the first target data according to the first coordinate system and the second coordinate system to obtain the compensation data of the first sensor at the second time.
Optionally, taking a vehicle as an example of the smart device, a running vehicle has a corresponding pose at every moment, and the pose may change from moment to moment; the change in pose may include rotation and translation. Based on the pose of the vehicle at each moment, that is, its rotation and translation, the pose of the sensor can be obtained. Each pose of the sensor corresponds to a coordinate system; at a given pose, every point in the world coordinate system detected by the sensor is expressed as a point in that coordinate system, that is, the coordinate values of the detected points are values in that coordinate system. In this embodiment, the vehicle has corresponding poses at the first time and the second time, and therefore has a first coordinate system corresponding to the first pose and a second coordinate system corresponding to the second pose. For a point in the world coordinate system, based on the transformation between the first coordinate system and the second coordinate system, the coordinate value of the point in the second coordinate system, that is, its coordinate value at the second time, can be derived. Performing this processing on every point in the data detected by the first sensor yields the detection data of the first sensor at the second time.
The following example illustrates the process of performing compensation processing on the first target data of the first sensor based on the first coordinate system and the second coordinate system.
Suppose there is a point X in the world coordinate system, the first time is t0, and the second time is tn. At time t0 the pose of the vehicle is P0, and the coordinate system corresponding to P0 is the first coordinate system. If the coordinate data of point X obtained in the first coordinate system at time t0 is x0, then the relationship between x0 and X satisfies the following formula (1):
X=P0*x0       (1)X=P0*x0 (1)
At the second time tn, the pose of the vehicle is Pn, and the coordinate system corresponding to Pn is the second coordinate system. If the coordinate data of point X obtained in the second coordinate system at time tn is xn, then the relationship between xn and X satisfies the following formula (2):
X=Pn*xn      (2)X=Pn*xn (2)
From formulas (1) and (2), the relationship between x0 and xn satisfies the following formula (3):
P0*x0=Pn*xn       (3)P0*x0=Pn*xn (3)
Transforming formula (3) yields xn, which can be expressed by the following formula (4):
xn=Pn⁻¹*P0*x0       (4)
In the above process, for a point in the world coordinate system, given the coordinates of the point, detected by the first sensor at the first time t0, in the first coordinate system corresponding to the pose at t0, together with the poses corresponding to times t0 and tn, the coordinates of the point in the second coordinate system corresponding to the pose at time tn, as the first sensor would detect them at the second time tn, can be derived. Performing this processing on every point corresponding to the first target data yields the detection data of the first sensor at time tn.
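Formula (4) can be implemented directly with homogeneous coordinates. The sketch below is an illustration under the assumption that each pose is given as a 4x4 homogeneous transform mapping the vehicle frame at that time into the world frame; the function name `compensate_points` is hypothetical, not from the disclosure.

```python
import numpy as np

def compensate_points(points_t0, pose_t0, pose_tn):
    """Re-express points detected at t0 in the coordinate system at tn,
    following formula (4): xn = Pn^{-1} * P0 * x0.

    points_t0: (N, 3) array of points in the t0 coordinate system.
    pose_t0, pose_tn: 4x4 homogeneous transforms (frame -> world).
    """
    n = points_t0.shape[0]
    # Lift to homogeneous coordinates so rotation and translation
    # are applied in a single matrix product.
    homo = np.hstack([points_t0, np.ones((n, 1))])
    transform = np.linalg.inv(pose_tn) @ pose_t0
    return (transform @ homo.T).T[:, :3]
```

For example, if the vehicle moves 1 m forward between t0 and tn (pure translation along x), a point straight ahead at t0 appears 1 m closer in the tn frame, as the transform predicts.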
In the above processing, the compensation is performed based on the first pose of the vehicle at the first time and the second pose of the vehicle at the second time. Therefore, before the compensation processing, the first pose of the vehicle at the first time and the second pose of the vehicle at the second time can be obtained first.
In an optional implementation, a pose queue of the smart device may be generated first, and then the first pose of the smart device at the first time and the second pose of the smart device at the second time are obtained based on the pose queue.
FIG. 6 is a fourth schematic flowchart of the sensor data processing method provided by an embodiment of the present disclosure. As shown in FIG. 6, the process of generating the pose queue of the smart device and determining the first pose and the second pose based on the pose queue includes:
S601. Respectively obtain pose detection data detected at multiple times by a sensor with a pose detection function provided on the smart device, where each of the multiple times is a time under the clock of the smart device.
S602. Generate the pose queue of the smart device according to the above pose detection data.
Optionally, taking the smart device being a vehicle as an example, the sensors with a pose detection function provided on the vehicle may include CANBUS, high-precision inertial navigation, and other sensors. The system can receive, in real time, the pose detection data reported by the CANBUS, high-precision inertial navigation, and other sensors, such as operating data including the vehicle's wheel speed and steering-wheel angle. From these pose detection data, the system can calculate the vehicle's pose at multiple times and then build the pose queue.
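Building a pose queue from wheel-speed and heading-rate data amounts to dead reckoning. The sketch below is illustrative only and assumes a planar (x, y, yaw) model with hypothetical CANBUS samples; the patent does not specify the integration scheme.

```python
import math

def dead_reckon(pose, v, yaw_rate, dt):
    """Advance an (x, y, yaw) pose by one step from speed and yaw-rate data."""
    x, y, yaw = pose
    x += v * math.cos(yaw) * dt
    y += v * math.sin(yaw) * dt
    yaw += yaw_rate * dt
    return (x, y, yaw)

# Hypothetical CANBUS samples: (time s, speed m/s, yaw rate rad/s).
samples = [(0.0, 10.0, 0.0), (0.1, 10.0, 0.0), (0.2, 10.0, 0.0)]
queue, pose = [], (0.0, 0.0, 0.0)
prev_t = samples[0][0]
for t, v, w in samples:
    pose = dead_reckon(pose, v, w, t - prev_t)
    queue.append((t, pose))        # one (time, pose) entry per sample
    prev_t = t
print(queue[-1])                   # after 0.2 s at 10 m/s: (0.2, (2.0, 0.0, 0.0))
```

Each queue entry pairs a device-clock timestamp with a pose, which is the structure the lookup and interpolation steps below rely on.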
S603. Determine the first pose and the second pose according to the pose queue of the smart device, the first time, and the second time.
Optionally, the pose queue of the smart device is composed of the poses at individual times. In a specific implementation, in one case, the first time may be a time corresponding to some pose in the pose queue, that is, a mapping between the first time and pose information exists directly in the pose queue. In this case, the pose information corresponding to the first time can be obtained directly from the pose queue. The pose information at the second time is handled by the same processing: if a mapping between the second time and pose information exists directly in the pose queue, the pose information corresponding to the second time can be obtained directly from the pose queue.
In another case, in response to the pose queue not including the pose at the first time, compensation processing is performed on the pose queue of the smart device according to the first time to obtain the first pose; and/or, in response to the pose queue not including the pose at the second time, compensation processing is performed on the pose queue of the smart device according to the second time to obtain the second pose.
In this embodiment, if the pose information corresponding to the first time does not exist in the pose queue, compensation processing can be performed on the pose queue of the smart device according to the first time to obtain the first pose of the smart device at the first time. The compensation processing on the pose queue may be, for example, interpolation. Exemplarily, suppose the first time is t3 and no pose information corresponding to t3 exists in the pose queue of the smart device. The two adjacent times closest to t3 can then be looked up in the pose queue, for example t4 and t5, where t4 and t5 are adjacent and t3 lies between them. Interpolation is performed using the pose information at t4 and the pose information at t5 to obtain the pose information corresponding to t3. The second time can be handled by the same process, which is not repeated here.
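The interpolation step for t3 between the queue entries at t4 and t5 can be sketched as follows. This is a simplified linear blend of an (x, y, yaw) pose with hypothetical values; a production system would typically interpolate rotation on SO(3) (for example, quaternion slerp), which the patent does not prescribe.

```python
import numpy as np

def interpolate_pose(t, t4, pose4, t5, pose5):
    """Linearly interpolate an (x, y, yaw) pose between two queue entries.

    Linear yaw interpolation is only reasonable for small angle differences
    between adjacent queue entries.
    """
    alpha = (t - t4) / (t5 - t4)               # fraction of the way from t4 to t5
    pose4 = np.asarray(pose4, dtype=float)
    pose5 = np.asarray(pose5, dtype=float)
    return (1.0 - alpha) * pose4 + alpha * pose5

# t3 = 150 ms lies midway between the queue entries at t4 = 100 ms and t5 = 200 ms.
pose_t3 = interpolate_pose(150, 100, (0.0, 0.0, 0.0), 200, (1.0, 0.0, 0.2))
print(pose_t3)    # midpoint pose: x = 0.5, y = 0.0, yaw = 0.1
```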
In the foregoing embodiments, the data used when performing compensation processing on the first target data of the first sensor according to the first pose and the second pose is the first target data. The first target data may be unprocessed data directly detected by the first sensor; alternatively, it may be data obtained by pre-synchronizing the raw data detected by the first sensor. In some scenarios, for example, when the first sensor is a LiDAR, the LiDAR rotates through one full revolution, detects the data of that revolution, and then reports one frame of data to the system. Since one revolution takes a certain amount of time, the sub-data within a single LiDAR frame reported to the system differ in their actual detection times. One LiDAR frame may include multiple data packets, each data packet being one sub-datum. In such scenarios, before the data of the first sensor is compensated to synchronize it with the data of the second sensor serving as the reference sensor, the sub-data within the data sent by the first sensor can first be synchronized with each other, so that each frame of data sent by the first sensor achieves intra-frame synchronization. Here, synchronizing the sub-data means taking the sending time of one of the sub-data as a reference time and performing compensation processing on the remaining sub-data to obtain the sub-data of the remaining sub-data at that reference time.
In an optional manner, the first time may first be determined according to the third time carried when the first sensor reports the first raw data corresponding to the first target data, together with the difference information between the clock of the first sensor and the clock of the smart device.
In this embodiment, the first raw data refers to the data reported by the first sensor to the system that has not undergone intra-frame synchronization, and the first target data refers to the data after intra-frame synchronization.
When reporting the first raw data, the first sensor may carry in the first raw data the time at which the data was detected, that is, the aforementioned third time.
Then, before the first target data of the first sensor of the smart device at the first time is obtained, after the system receives the first raw data carrying the third time, it can determine, according to the difference information between the clock of the first sensor and the clock of the smart device, the first time to which the third time corresponds under the clock of the smart device. The third time is used to identify the time at which the first sensor detected the first raw data corresponding to the first target data, and it is a time under the clock of the first sensor.
Optionally, the difference information between the clock of the first sensor and the clock of the smart device can be obtained in advance by specific means. Exemplarily, when the first sensor is based on the Global Positioning System (GPS), the difference information between the clock of the first sensor and the clock of the smart device can be determined according to the error between the GPS clock and the clock of the smart device.
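Once the clock difference has been calibrated, mapping the third time onto the device clock is a single subtraction. The sketch below uses a hypothetical, pre-calibrated offset value purely for illustration.

```python
# Hypothetical pre-calibrated offset: sensor clock minus device clock, in seconds.
CLOCK_OFFSET_S = 0.120

def sensor_to_device_time(t_sensor):
    """Map a timestamp under the first sensor's clock (the third time)
    onto the smart device's clock (the first time)."""
    return t_sensor - CLOCK_OFFSET_S

t3 = 1000.500                     # detection time stamped by the sensor
t1 = sensor_to_device_time(t3)
print(t1)                         # 1000.38
```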
In addition, for the second time: before the second target data of the second sensor of the smart device at the second time is obtained, the second time can likewise be determined according to the difference information between the clock of the second sensor and the clock of the smart device, as the time under the clock of the smart device corresponding to a fourth time carried when the second sensor reports the second target data. The fourth time is used to identify the detection time of the second target data and is a time under the clock of the second sensor.
In an example, if the second sensor is a sensor capable of capturing video frames, such as a camera, the second sensor can first be used to capture multiple video frames of a stopwatch of the smart device. The time information of capturing each video frame is then compared and analyzed against the time information displayed by the stopwatch in the corresponding video frame, yielding the difference information between the clock of the second sensor and the clock of the smart device.
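The stopwatch comparison described above reduces to averaging per-frame timestamp differences. The snippet below is a sketch with invented sample values; reading the stopwatch digits out of the frames (for example, by OCR) is assumed to have happened already.

```python
# Hypothetical per-frame readings: (camera capture timestamp, stopwatch time shown
# in that frame), both in seconds. The stopwatch displays the device clock.
samples = [(10.000, 10.082), (10.040, 10.121), (10.080, 10.159)]

# Average the per-frame differences to estimate camera clock minus device clock.
offset = sum(cap - sw for cap, sw in samples) / len(samples)
print(round(offset, 3))    # roughly -0.081: the camera clock lags the device clock
```

Averaging over several frames reduces the effect of per-frame jitter (exposure timing, stopwatch display refresh) on the estimated offset.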
The following describes the process of intra-frame data synchronization based on the third time carried when the first sensor reports the first raw data.
FIG. 7 is a fifth schematic flowchart of the sensor data processing method provided by an embodiment of the present disclosure. As shown in FIG. 7, the process of performing intra-frame data synchronization on the first raw data to obtain the first target data includes:
S701. Receive the first raw data reported by the first sensor, where the first raw data includes multiple sub-data, each sub-datum has a corresponding detection time, and the third time is the reference time among the detection times of the multiple sub-data included in the first raw data.
Optionally, the first raw data reported by the first sensor includes multiple sub-data and simultaneously carries the detection time of each sub-datum, that is, each sub-datum has a corresponding detection time. The system can select one of these detection times as the reference time, that is, the third time, and compensate the sub-data of the other times to that reference time, thereby obtaining all sub-data at the reference time. The data composed of these sub-data is the first target data, thus achieving intra-frame data synchronization for the first sensor. Exemplarily, the latest of the multiple detection times may be selected as the third time.
S702. Obtain the first target data of the first sensor at the first time according to the pose of the smart device at the detection time of each sub-datum and each sub-datum.
In this embodiment, after the first raw data reported by the first sensor is received, intra-frame synchronization of the multiple sub-data in the first raw data can be completed based on the pose at the detection time of each sub-datum composing the first raw data, thereby further improving the accuracy of sensor synchronization.
FIG. 8 is a sixth schematic flowchart of the sensor data processing method provided by an embodiment of the present disclosure. As shown in FIG. 8, an optional implementation of the above step S702 includes:
S801. Determine the coordinate system of the first sensor at the detection time of each sub-datum according to the pose of the smart device at the detection time of each sub-datum.
The process of determining the coordinate system at each detection time is the same as the process of determining the first coordinate system at the first time described above; reference may be made to the foregoing embodiments, and it is not repeated here.
S802. According to the coordinate system of the first sensor at the detection time of each sub-datum other than the third time and the coordinate system of the first sensor at the third time, respectively determine, for each sub-datum other than the third time, the corresponding sub-datum at the third time.
In this step, the third time is used as the reference time, and the sub-data of the other detection times are respectively compensated to the third time to achieve synchronization of the sub-data.
The process of determining, from the coordinate system at a sub-datum's detection time and the coordinate system at the third time, the sub-datum corresponding to that sub-datum at the third time is consistent with the processing in step S503 above; reference may be made to the description of step S503 and formulas (1) to (4), and it is not repeated here.
S803. Integrate the sub-data corresponding, at the third time, to each sub-datum other than the third time, to obtain the first target data of the first sensor at the first time.
Optionally, the sub-data corresponding to each sub-datum at the third time can be sorted and combined according to the original detection times to obtain the first target data. All sub-data in the first target data are data in the coordinate system corresponding to the third time, that is, all sub-data in the first target data are synchronized data. Under the clock of the first sensor, the first target data is data in the coordinate system corresponding to the third time; and, as explained above, the third time expressed under the clock of the smart device is the first time. Therefore, under the clock of the smart device, the first target data is the detection data at the first time.
FIG. 9 is an example diagram of performing intra-frame data synchronization on the first raw data to obtain the first target data. As shown in FIG. 9, taking the first sensor being a LiDAR as an example, each frame of data reported by the LiDAR includes n+1 data packets, each corresponding to a detection time. The detection time of the n-th data packet (packet n) can then be used as the reference time, and packets 0 to n-1 are respectively compensated to the time corresponding to packet n, thereby achieving intra-frame synchronization of one frame of data.
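The packet-by-packet compensation of FIG. 9 can be sketched by applying formula (4) to every packet with the last packet's pose as reference. This is a simplified 2-D illustration with hypothetical data, not the patent's implementation; poses are again 3x3 homogeneous transforms.

```python
import numpy as np

def pose_matrix(x, y, yaw):
    """2-D homogeneous vehicle pose (vehicle-frame-to-world transform)."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, x], [s, c, y], [0, 0, 1]])

def sync_frame(packets, poses, ref_index=-1):
    """Compensate every packet of one LiDAR frame to the reference packet's time.

    packets: list of (m, 2) point arrays, one per packet, each expressed in the
             sensor frame at that packet's own detection time.
    poses:   list of 3x3 pose matrices, one per packet detection time.
    Returns a single (total_points, 2) array in the reference-time frame.
    """
    P_ref_inv = np.linalg.inv(poses[ref_index])
    out = []
    for pts, P in zip(packets, poses):
        pts_h = np.hstack([pts, np.ones((len(pts), 1))])   # homogeneous points
        out.append((P_ref_inv @ P @ pts_h.T).T[:, :2])     # x_ref = P_ref^-1 P x
    return np.vstack(out)

# Two packets observe the same static point; the vehicle advanced 1 m in x
# between their detection times.
packets = [np.array([[4.0, 0.0]]), np.array([[3.0, 0.0]])]
poses = [pose_matrix(0, 0, 0), pose_matrix(1, 0, 0)]
merged = sync_frame(packets, poses)
print(merged)    # both rows agree at [3, 0]: the frame is internally consistent
```

After compensation, both observations of the static point coincide, which is exactly the intra-frame consistency the reference-time compensation is meant to restore.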
The above describes the process of performing intra-frame synchronization on first raw data composed of multiple sub-data to obtain the first target data. In a specific implementation, if the smart device includes multiple sensors of the same type, the detection data of that type of sensor takes the above form including multiple sub-data, and the first sensor is one of those multiple sensors of the same type, then before the first sensor is synchronized with the second sensor, the detection data of the multiple sensors of the same type can also first be synchronized based on the above method of synchronizing multiple sub-data.
FIG. 10 is a seventh schematic flowchart of the sensor data processing method provided by an embodiment of the present disclosure. As shown in FIG. 10, the process of synchronizing the detection data of multiple sensors of the same type includes:
S1001. Receive second raw data reported by a third sensor, where the type of the third sensor is the same as that of the first sensor, the second raw data includes multiple sub-data, each sub-datum has a corresponding detection time, and the third time is the reference time among the detection times of the multiple sub-data included in the second raw data.
S1002. Obtain third target data of the third sensor at the first time according to the pose of the smart device at the detection time of each sub-datum of the second raw data and each sub-datum of the second raw data.
In this embodiment, one of the detection times of the first sensor, namely the third time, is used as the reference time when synchronizing the third sensor. According to the pose at the detection time of each sub-datum in the second raw data reported by the third sensor, each sub-datum in the second raw data can be compensated to the third time. After this processing, all sub-data in the second raw data are synchronized, yielding the synchronized third target data. At the same time, since the third sensor uses the third time of the first sensor as the reference time, the synchronized third target data of the third sensor is synchronized with the synchronized first target data of the first sensor.
FIG. 11 is an example diagram of synchronizing sensors of the same type. As shown in FIG. 11, take as an example the first sensor being a LiDAR and the third and fourth sensors also being LiDARs, where each frame of data reported by each LiDAR includes n+1 data packets, each corresponding to a detection time. The detection time of the n-th data packet (packet n) of the first sensor can be used as the reference time: packets 0 to n-1 of the first sensor are respectively compensated to the time corresponding to packet n; packets 0 to n of the third sensor are respectively compensated to the time corresponding to packet n of the first sensor; and packets 0 to n of the fourth sensor are respectively compensated to the time corresponding to packet n of the first sensor. This achieves intra-frame synchronization for each of the first, third, and fourth sensors, as well as inter-frame synchronization among the first, third, and fourth sensors.
In this embodiment, when multiple sensors of the same type exist on the smart device and the detection data reported by sensors of that type includes multiple sub-data, the detection time of one sub-datum in the detection data of one sensor can be used as the reference time, and for each of the remaining sensors of the same type, all sub-data of that sensor are compensated to that reference time. With this processing, not only can intra-frame synchronization of each sensor be achieved, but inter-frame synchronization among sensors of the same type is achieved at the same time.
In some optional embodiments, performing compensation processing on the first target data according to the first pose and the second pose to obtain the compensation data of the first sensor at the second time includes: performing compensation processing on the first target data and the third target data according to the first pose and the second pose to obtain the compensation data of the first sensor at the second time and the compensation data of the third sensor at the second time.
After the third sensor and the first sensor of the same type are synchronized through the above process, while the first target data is compensated according to the first pose and the second pose, the third target data can also be compensated according to the first pose and the second pose, thereby obtaining the compensation data of the first sensor at the second time and the compensation data of the third sensor at the second time. Since the first sensor and the third sensor, being sensors of the same type, have already been synchronized with each other, synchronizing them with the second sensor on this basis can further improve the accuracy of sensor synchronization.
FIG. 12 is a first module structure diagram of a sensor data processing apparatus provided by an embodiment of the present disclosure. As shown in FIG. 12, the apparatus includes:
a first acquisition module 1201, configured to acquire first target data of a first sensor of a smart device at a first time;
a second acquisition module 1202, configured to acquire second target data of a second sensor of the smart device at a second time, where the first time and the second time are different times under the clock of the smart device;
a third acquisition module 1203, configured to acquire a first pose of the smart device at the first time and a second pose of the smart device at the second time, where the first pose and the second pose are different; and
a compensation module 1204, configured to perform compensation processing on the first target data according to the first pose and the second pose to obtain compensation data of the first sensor at the second time.
The apparatus is used to implement the foregoing method embodiments; its implementation principles and technical effects are similar and are not repeated here.
In another embodiment, the compensation module 1204 is configured to: determine a first coordinate system of the first sensor at the first time according to the first pose; determine a second coordinate system of the first sensor at the second time according to the second pose; and perform compensation processing on the first target data according to the first coordinate system and the second coordinate system to obtain the compensation data of the first sensor at the second time.
In another embodiment, the third acquisition module 1203 is configured to: respectively acquire pose detection data of the smart device detected at multiple times by a sensor with a pose detection function provided on the smart device, where each of the multiple times is a time under the clock of the smart device; generate a pose queue of the smart device according to the pose detection data of the smart device detected at the multiple times by the sensor with the pose detection function; and determine the first pose and the second pose according to the pose queue of the smart device, the first time, and the second time.
In another embodiment, the third acquisition module 1203 is configured to: in response to the pose queue not including the pose at the first time, perform compensation processing on the pose queue of the smart device according to the first time to obtain the first pose; and/or, in response to the pose queue not including the pose at the second time, perform compensation processing on the pose queue of the smart device according to the second time to obtain the second pose.
FIG. 13 is a second module structure diagram of the sensor data processing apparatus provided by an embodiment of the present disclosure. As shown in FIG. 13, the apparatus further includes: a first determination module 1205, configured to determine, according to difference information between the clock of the first sensor and the clock of the smart device, the first time to which a third time corresponds under the clock of the smart device, where the third time is used to identify the time at which the first sensor detects the first raw data corresponding to the first target data, and the third time is a time under the clock of the first sensor.
In another embodiment, the first acquisition module 1201 is configured to: receive the first raw data reported by the first sensor, where the first raw data includes multiple sub-data, each sub-datum has a corresponding detection time, and the third time is the reference time among the detection times of the multiple sub-data included in the first raw data; and obtain the first target data of the first sensor at the first time according to the pose of the smart device at the detection time of each sub-datum and each sub-datum.
In another embodiment, the first acquisition module 1201 is configured to: determine the coordinate system of the first sensor at the detection time of each sub-datum according to the pose of the smart device at the detection time of each sub-datum; determine, according to the coordinate system of the first sensor at the detection time of each sub-datum other than the third time and the coordinate system of the first sensor at the third time, the sub-datum corresponding, at the third time, to each sub-datum other than the third time; and integrate the sub-data corresponding to each sub-datum at the third time other than the third time to obtain the first target data of the first sensor at the first time.
FIG. 14 is a third module structure diagram of the sensor data processing apparatus provided by an embodiment of the present disclosure. As shown in FIG. 14, the apparatus further includes: a receiving module 1206, configured to receive second raw data reported by a third sensor, where the type of the third sensor is the same as that of the first sensor, the second raw data includes multiple sub-data, each sub-datum has a corresponding detection time, and the third time is the reference time among the detection times of the multiple sub-data included in the second raw data; and
a fourth acquisition module 1207, configured to obtain third target data of the third sensor at the first time according to the pose of the smart device at the detection time of each sub-datum of the second raw data and each sub-datum of the second raw data;
the compensation module 1204 being configured to perform compensation processing on the first target data and the third target data according to the first pose and the second pose to obtain the compensation data of the first sensor at the second time and the compensation data of the third sensor at the second time.
图15为本公开实施例提供的传感器数据处理装置的模块结构图四,如图15所示,该装置还包括:第五获取模块1208,用于所述第一确定模块1205根据所述第一传感器的时钟与所述智能设备的时钟的差异信息,确定第三时刻在所述智能设备时钟下的所述第一时刻之前,根据GPS时钟与所述智能设备的时钟误差,获取所述第一传感器的时钟与所述智能设备的时钟的差异信息。15 is the fourth module structure diagram of the sensor data processing device provided by the embodiments of the disclosure. As shown in FIG. 15, the device further includes: a fifth acquisition module 1208, which is used by the first determination module 1205 according to the first The difference information between the clock of the sensor and the clock of the smart device determines that the third time is before the first time under the smart device clock, and obtains the first time according to the clock error between the GPS clock and the smart device Difference information between the clock of the sensor and the clock of the smart device.
FIG. 16 is a fifth module structure diagram of the sensor data processing apparatus provided by an embodiment of the present disclosure. As shown in FIG. 16, the apparatus further includes: a second determination module 1209, configured to, before the second acquisition module 1202 acquires the second target data of the second sensor of the smart device at the second time, determine, according to difference information between the clock of the second sensor and the clock of the smart device, the second time under the clock of the smart device corresponding to a fourth time carried when the second sensor reports the second target data, where the fourth time is used to identify the detection time of the second target data and is a time under the clock of the second sensor.
FIG. 17 is a sixth module structure diagram of the sensor data processing apparatus provided by an embodiment of the present disclosure. As shown in FIG. 17, the apparatus further includes: a photographing module 1210, configured to photograph multiple video frames of a stopwatch of the smart device using the second sensor; and
an analysis module 1211, configured to compare and analyze the time information at which each video frame was captured against the time information displayed by the stopwatch in that video frame, to obtain the difference information between the clock of the second sensor and the clock of the smart device.
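The stopwatch procedure of modules 1210 and 1211 can be sketched as below (hypothetical names; the stopwatch readings are assumed to have been extracted from the frames already, e.g., by optical character recognition). Each frame pairs a camera-clock capture timestamp with the device-clock time displayed by the stopwatch, and averaging the per-frame differences yields the clock difference information.

```python
def camera_clock_offset(frame_capture_times, stopwatch_readings):
    """Estimate the camera-clock minus device-clock offset, in seconds.

    frame_capture_times: timestamps under the camera's clock, one per frame.
    stopwatch_readings: device-clock times shown by the stopwatch in each frame.
    Averaging over many frames suppresses per-frame reading noise.
    """
    if len(frame_capture_times) != len(stopwatch_readings):
        raise ValueError("one stopwatch reading per frame is required")
    diffs = [c - s for c, s in zip(frame_capture_times, stopwatch_readings)]
    return sum(diffs) / len(diffs)
```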
In another embodiment, the second sensor is a camera, and the first sensor is a lidar or a millimeter-wave radar.
It should be noted that the division of the above apparatus into modules is merely a division of logical functions; in an actual implementation, the modules may be fully or partially integrated into one physical entity, or may be physically separate. These modules may all be implemented in the form of software invoked by a processing element, or all in the form of hardware; alternatively, some modules may be implemented as software invoked by a processing element while others are implemented in hardware. For example, the determination module may be a separately disposed processing element, or may be integrated into a chip of the above apparatus; it may also be stored in the memory of the above apparatus in the form of program code, with a processing element of the apparatus invoking and executing the functions of the determination module. The other modules are implemented similarly. In addition, these modules may be fully or partially integrated together, or implemented independently. The processing element described here may be an integrated circuit with signal processing capability. During implementation, the steps of the above method or the above modules may be completed by integrated logic circuits of hardware in a processor element, or by instructions in the form of software.
For example, the above modules may be one or more integrated circuits configured to implement the above method, such as one or more Application Specific Integrated Circuits (ASICs), one or more Digital Signal Processors (DSPs), or one or more Field Programmable Gate Arrays (FPGAs). For another example, when one of the above modules is implemented in the form of a processing element scheduling program code, the processing element may be a general-purpose processor, such as a Central Processing Unit (CPU) or another processor that can invoke program code. For another example, these modules may be integrated together and implemented in the form of a System-on-a-Chip (SoC).
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented by software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions described in the embodiments of the present disclosure are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) manner. The computer-readable storage medium may be any usable medium accessible to a computer, or a data storage device such as a server or data center integrating one or more usable media. The usable medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a Solid State Disk (SSD)).
FIG. 18 is a schematic structural diagram of an electronic device provided by an embodiment of the present disclosure. As shown in FIG. 18, the electronic device 1800 may include a processor 181 and a memory 182; the memory 182 is configured to store computer instructions, and when the processor 181 executes the computer instructions, the solutions of the embodiments shown in FIGS. 1 to 10 above are implemented.
Optionally, the electronic device 1800 may further include a communication interface 183 for communicating with other devices. It can be understood that the electronic device 1800 may further include a system bus 184, and the system bus 184 is configured to implement connection and communication among these components.
The system bus 184 may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The system bus 184 may be divided into an address bus, a data bus, a control bus, and so on. For ease of representation, only one thick line is used in the figure, but this does not mean that there is only one bus or one type of bus. The communication interface 183 is configured to implement communication between the database access apparatus and other devices (for example, a client, a read-write library, and a read-only library). The memory 182 may include a Random Access Memory (RAM), and may also include a non-volatile memory, for example, at least one disk memory.
The processor 181 may be a general-purpose processor, including a CPU, a Network Processor (NP), or the like; or may be a DSP, an ASIC, an FPGA or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
FIG. 19 is a schematic flowchart of an intelligent driving control method provided by an embodiment of the present disclosure. On the basis of the above embodiments, an embodiment of the present disclosure further provides an intelligent driving control method, including:
S1901: separately acquiring detection data of a first sensor and a second sensor disposed on a smart device, the detection data being obtained by the sensor data processing method provided by the embodiments of the present disclosure; and
S1902: performing intelligent driving control on the smart device according to the detection data.
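Steps S1901 and S1902 can be sketched as one control cycle. All names and the toy braking policy below are illustrative assumptions, not the disclosed implementation; the `synchronize` callable stands in for the sensor data processing method described above.

```python
def decide(detection_data, brake_distance_m=5.0):
    """Toy policy: brake if any detected obstacle distance is below the threshold."""
    nearest = min(detection_data, default=float("inf"))
    return "brake" if nearest < brake_distance_m else "keep_speed"

def intelligent_driving_step(first_sensor_data, second_sensor_data, synchronize):
    """One cycle of S1901 + S1902: acquire synchronized detection data, then control."""
    detection_data = synchronize(first_sensor_data, second_sensor_data)  # S1901
    return decide(detection_data)                                        # S1902
```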
The execution subject of this embodiment may be an intelligent driving control apparatus. The intelligent driving control apparatus of this embodiment and the electronic device described in the above embodiments may be located in the same device, or may be separately disposed in different devices, with the intelligent driving control apparatus of this embodiment communicatively connected to the above electronic device.
The detection data of the first sensor and the second sensor are obtained by the method of the above embodiments; for the specific process, reference is made to the description of the above embodiments, which is not repeated here.
Specifically, the electronic device executes the above sensor data processing method, obtains the detection data of the first sensor and the second sensor disposed on the smart device, and outputs that detection data. The intelligent driving control apparatus acquires the detection data of the first sensor and the second sensor, and performs intelligent driving control on the smart device according to the detection data.
The intelligent driving in this embodiment includes assisted driving, automatic driving, and/or driving-mode switching between assisted driving and automatic driving. The above intelligent driving control may include braking, changing the driving speed, changing the driving direction, lane keeping, changing the state of the vehicle lights, driving-mode switching, and the like, where the driving-mode switching may be switching between assisted driving and automatic driving, for example, switching from assisted driving to automatic driving through intelligent driving control.
In the intelligent driving control method provided by this embodiment, the intelligent driving control apparatus acquires the detection data of the sensors disposed on the smart device and performs intelligent driving control according to that detection data, thereby improving the safety and reliability of intelligent driving.
FIG. 20 is a schematic structural diagram of an intelligent driving control apparatus provided by an embodiment of the present disclosure. As shown in FIG. 20, on the basis of the above embodiments, the intelligent driving control apparatus 2000 of the embodiment of the present disclosure includes: an acquisition module 2001, configured to separately acquire detection data of a first sensor and a second sensor disposed on a smart device, the detection data being obtained by the above sensor data processing method; and
an intelligent driving control module 2002, configured to perform intelligent driving control on the smart device according to the detection data.
The intelligent driving control apparatus of the embodiment of the present disclosure may be used to execute the technical solutions of the method embodiments shown above; its implementation principles and technical effects are similar and are not repeated here.
FIG. 21 is a schematic diagram of an intelligent driving system provided by an embodiment of the present disclosure. As shown in FIG. 21, the intelligent driving system 2100 of this embodiment includes a sensor 2101, an electronic device 1800, and an intelligent driving control apparatus 2000 that are communicatively connected, where the electronic device 1800 is shown in FIG. 18 and the intelligent driving control apparatus 2000 is shown in FIG. 20. The sensor 2101 may include at least one of the sensors described in the foregoing embodiments, such as a camera, LiDAR, RADAR, or high-precision inertial navigation.
Specifically, as shown in FIG. 21, in actual use, the sensor 2101 detects the environment around the smart device to obtain raw detection data, and sends the detection data to the electronic device 1800. After receiving the raw detection data, the electronic device 1800 performs data synchronization according to the above sensor data processing method to obtain synchronized detection data. The electronic device 1800 sends the synchronized detection data to the intelligent driving control apparatus 2000, and the intelligent driving control apparatus 2000 performs intelligent driving control on the smart device according to the synchronized detection data.
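The synchronization performed by the electronic device 1800 requires the device pose at arbitrary timestamps; as described for the pose queue, a pose missing at the requested time can be compensated from neighboring entries. Below is a minimal sketch under the assumption of planar poses and simple linear interpolation (hypothetical names; the actual compensation scheme of the disclosure is not limited to this).

```python
def interpolate_pose(pose_queue, t):
    """Pose of the device at time t, interpolated from a sorted pose queue.

    pose_queue: list of (timestamp, x, y, yaw) tuples, sorted by timestamp.
    Returns an (x, y, yaw) tuple linearly interpolated between the two
    neighboring queue entries that bracket t.
    """
    for (t0, *p0), (t1, *p1) in zip(pose_queue, pose_queue[1:]):
        if t0 <= t <= t1:
            w = (t - t0) / (t1 - t0)
            return tuple(a + w * (b - a) for a, b in zip(p0, p1))
    raise ValueError("requested time lies outside the pose queue")
```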
An embodiment of the present application further provides a storage medium. The storage medium stores instructions that, when run on a computer, cause the computer to execute the method of any one of the embodiments shown in FIGS. 1 to 10, or cause the computer to execute the method of the embodiment shown in FIG. 19.
An embodiment of the present application further provides a chip for running instructions. The chip is configured to execute the method of any one of the embodiments shown in FIGS. 1 to 10, or the chip is configured to execute the method of the embodiment shown in FIG. 19.
An embodiment of the present application further provides a program product. The program product includes a computer program stored in a storage medium. At least one processor can read the computer program from the storage medium, and when the at least one processor executes the computer program, the method of any one of the embodiments shown in FIGS. 1 to 10, or the method of the embodiment shown in FIG. 19, can be implemented.
In the embodiments of the present disclosure, "at least one" means one or more, and "multiple" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate that A exists alone, both A and B exist, or B exists alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects; in a formula, the character "/" indicates a "division" relationship between the associated objects. "At least one of the following items" or a similar expression refers to any combination of these items, including any combination of a single item or multiple items. For example, at least one of a, b, or c may indicate a, b, c, a-b, a-c, b-c, or a-b-c, where each of a, b, and c may be single or multiple.
It can be understood that the various numerals involved in the embodiments of the present disclosure are merely for ease of distinction in the description and are not intended to limit the scope of the embodiments of the present disclosure.
It can be understood that, in the embodiments of the present disclosure, the magnitudes of the sequence numbers of the above processes do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present disclosure.
Finally, it should be noted that the above embodiments are merely intended to illustrate, rather than limit, the technical solutions of the embodiments of the present disclosure. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they may still modify the technical solutions described in the foregoing embodiments, or make equivalent replacements of some or all of the technical features thereof; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present disclosure.

Claims (29)

  1. A sensor data processing method, comprising:
    acquiring first target data of a first sensor of a smart device at a first time;
    acquiring second target data of a second sensor of the smart device at a second time, the first time and the second time being different times under a clock of the smart device;
    acquiring a first pose of the smart device at the first time and a second pose of the smart device at the second time, the first pose and the second pose being different; and
    performing compensation processing on the first target data according to the first pose and the second pose, to obtain compensation data of the first sensor at the second time.
  2. The method according to claim 1, wherein the performing compensation processing on the first target data according to the first pose and the second pose to obtain the compensation data of the first sensor at the second time comprises:
    determining a first coordinate system of the first sensor at the first time according to the first pose;
    determining a second coordinate system of the first sensor at the second time according to the second pose; and
    performing compensation processing on the first target data according to the first coordinate system and the second coordinate system, to obtain the compensation data of the first sensor at the second time.
  3. The method according to claim 2, wherein the acquiring the first pose of the smart device at the first time and the second pose of the smart device at the second time comprises:
    separately acquiring pose detection data detected at multiple times by a sensor with a pose detection function disposed on the smart device, each of the multiple times being a time under the clock of the smart device;
    generating a pose queue of the smart device according to the pose detection data; and
    determining the first pose and the second pose according to the pose queue of the smart device, the first time, and the second time.
  4. The method according to claim 3, wherein the determining the first pose and the second pose according to the pose queue of the smart device, the first time, and the second time comprises:
    in response to the pose queue not including a pose at the first time, performing compensation processing on the pose queue of the smart device according to the first time, to obtain the first pose; and/or
    in response to the pose queue not including a pose at the second time, performing compensation processing on the pose queue of the smart device according to the second time, to obtain the second pose.
  5. The method according to any one of claims 1 to 4, wherein before the acquiring the first target data of the first sensor of the smart device at the first time, the method further comprises:
    determining, according to difference information between a clock of the first sensor and the clock of the smart device, the first time under the clock of the smart device corresponding to a third time, the third time being used to identify a time at which the first sensor detects first raw data corresponding to the first target data, and the third time being a time under the clock of the first sensor.
  6. The method according to claim 5, wherein the acquiring the first target data of the first sensor of the smart device at the first time comprises:
    receiving the first raw data reported by the first sensor, the first raw data including multiple pieces of sub-data, each piece of sub-data having a corresponding detection time, and the third time being a reference time for the detection times of the multiple pieces of sub-data included in the first raw data; and
    acquiring the first target data of the first sensor at the first time according to the pose of the smart device at the detection time of each piece of sub-data and each piece of sub-data.
  7. The method according to claim 6, wherein the acquiring the first target data of the first sensor at the first time according to the pose of the smart device at the detection time of each piece of sub-data and each piece of sub-data comprises:
    determining a coordinate system of the first sensor at the detection time of each piece of sub-data according to the pose of the smart device at the detection time of that piece of sub-data;
    separately determining, according to the coordinate system of the first sensor at the detection time of each piece of sub-data other than the third time and the coordinate system of the first sensor at the third time, the sub-data corresponding, at the third time, to each piece of sub-data other than the third time; and
    integrating the sub-data corresponding, at the third time, to each piece of sub-data other than the third time, to obtain the first target data of the first sensor at the first time.
  8. The method according to any one of claims 5 to 7, wherein before the performing compensation processing on the first target data according to the first pose and the second pose to obtain the compensation data of the first sensor at the second time, the method further comprises:
    receiving second raw data reported by a third sensor, the type of the third sensor being the same as the type of the first sensor, the second raw data including multiple pieces of sub-data, each piece of sub-data having a corresponding detection time, and the third time being a reference time for the detection times of the multiple pieces of sub-data included in the second raw data; and
    acquiring third target data of the third sensor at the first time according to the pose of the smart device at the detection time of each piece of sub-data of the second raw data and each piece of sub-data of the second raw data;
    and wherein the performing compensation processing on the first target data according to the first pose and the second pose to obtain the compensation data of the first sensor at the second time comprises:
    performing compensation processing on the first target data and the third target data according to the first pose and the second pose, to obtain the compensation data of the first sensor at the second time and compensation data of the third sensor at the second time.
  9. The method according to any one of claims 5 to 8, wherein before the determining, according to the difference information between the clock of the first sensor and the clock of the smart device, the first time under the clock of the smart device corresponding to the third time, the method further comprises:
    determining the difference information between the clock of the first sensor and the clock of the smart device according to a clock error between a Global Positioning System (GPS) clock and the clock of the smart device.
  10. The method according to any one of claims 1 to 9, wherein before the acquiring the second target data of the second sensor of the smart device at the second time, the method further comprises:
    determining, according to difference information between a clock of the second sensor and the clock of the smart device, the second time under the clock of the smart device corresponding to a fourth time carried when the second sensor reports the second target data, the fourth time being used to identify a detection time of the second target data, and the fourth time being a time under the clock of the second sensor.
  11. The method according to claim 10, wherein before the determining, according to the difference information between the clock of the second sensor and the clock of the smart device, the second time under the clock of the smart device corresponding to the fourth time carried when the second sensor reports the second target data, the method further comprises:
    photographing multiple video frames of a stopwatch of the smart device using the second sensor; and
    comparing and analyzing the time information at which each video frame was captured against the time information displayed by the stopwatch in that video frame, to obtain the difference information between the clock of the second sensor and the clock of the smart device.
  12. The method according to any one of claims 1 to 11, wherein the second sensor is a camera, and the first sensor is a lidar or a millimeter-wave radar.
  13. A sensor data processing apparatus, comprising:
    a first acquisition module, configured to acquire first target data of a first sensor of a smart device at a first time;
    a second acquisition module, configured to acquire second target data of a second sensor of the smart device at a second time, the first time and the second time being different times under a clock of the smart device;
    a third acquisition module, configured to acquire a first pose of the smart device at the first time and a second pose of the smart device at the second time, the first pose and the second pose being different; and
    a compensation module, configured to perform compensation processing on the first target data according to the first pose and the second pose, to obtain compensation data of the first sensor at the second time.
  14. The apparatus according to claim 13, wherein the compensation module is configured to: determine a first coordinate system of the first sensor at the first time according to the first pose; determine a second coordinate system of the first sensor at the second time according to the second pose; and perform compensation processing on the first target data according to the first coordinate system and the second coordinate system, to obtain the compensation data of the first sensor at the second time.
  15. The apparatus according to claim 14, wherein the third acquisition module is configured to: separately acquire pose detection data detected at multiple times by a sensor with a pose detection function disposed on the smart device, each of the multiple times being a time under the clock of the smart device; generate a pose queue of the smart device according to the pose detection data; and determine the first pose and the second pose according to the pose queue of the smart device, the first time, and the second time.
  16. The apparatus according to claim 15, wherein the third acquisition module is configured to: in response to the pose queue not including a pose at the first time, perform compensation processing on the pose queue of the smart device according to the first time to obtain the first pose; and/or, in response to the pose queue not including a pose at the second time, perform compensation processing on the pose queue of the smart device according to the second time to obtain the second pose.
  17. The apparatus according to any one of claims 13-16, further comprising:
    a first determining module, configured to determine, according to difference information between a clock of the first sensor and the clock of the smart device, the first time under the clock of the smart device that corresponds to a third time, where the third time identifies the time at which the first sensor detects first raw data corresponding to the first target data, and the third time is a time under the clock of the first sensor.
  18. The apparatus according to claim 17, wherein the first acquisition module is configured to: receive the first raw data reported by the first sensor, where the first raw data includes multiple pieces of sub-data, each piece of sub-data has a corresponding detection time, and the third time is a reference time for the detection times of the multiple pieces of sub-data included in the first raw data; and acquire the first target data of the first sensor at the first time according to each piece of sub-data and the pose of the smart device at the detection time of that piece of sub-data.
  19. The apparatus according to claim 18, wherein the first acquisition module is configured to: determine a coordinate system of the first sensor at the detection time of each piece of sub-data according to the pose of the smart device at that detection time; determine, for each piece of sub-data other than that detected at the third time, the corresponding sub-data at the third time according to the coordinate system of the first sensor at the detection time of that piece of sub-data and the coordinate system of the first sensor at the third time; and integrate the corresponding sub-data at the third time to obtain the first target data of the first sensor at the first time.
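Claims 18-19 amount to motion-compensating a sweep: each sub-datum (e.g. one lidar return) is mapped from the sensor frame at its own detection time into the frame at the reference time, then the mapped pieces are merged. A deliberately simplified sketch, reducing poses to a 1-D device translation along the scan axis (the claims operate on full coordinate systems):

```python
import numpy as np

def unify_sweep(sub_data, device_pose_at, t_ref):
    """Map each (t_i, point) of a sweep into the sensor frame at the
    reference time t_ref, then merge the results into one scan.
    device_pose_at(t) returns the device translation at time t
    (1-D here for brevity)."""
    x_ref = device_pose_at(t_ref)
    merged = []
    for t_i, pt in sub_data:
        # World position of the point, then re-expressed relative to the
        # device position at the reference time.
        merged.append(pt + device_pose_at(t_i) - x_ref)
    return np.array(merged)

# Device moving at 10 m/s along the scan axis; sweep lasts 0.1 s, and an
# object keeps reading "3 m ahead" because the device is closing on it.
pose = lambda t: 10.0 * t
sweep = [(0.00, 3.0), (0.05, 3.0), (0.10, 3.0)]
unify_sweep(sweep, pose, t_ref=0.00)  # -> [3.0, 3.5, 4.0]
```

Without this step, points measured late in the sweep would be smeared toward the device by its own motion; after unification all points are consistent with the single reference-time frame.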
  20. The apparatus according to any one of claims 17-19, further comprising:
    a receiving module, configured to receive second raw data reported by a third sensor, where the type of the third sensor is the same as the type of the first sensor, the second raw data includes multiple pieces of sub-data, each piece of sub-data has a corresponding detection time, and the third time is a reference time for the detection times of the multiple pieces of sub-data included in the second raw data; and
    a fourth acquisition module, configured to acquire third target data of the third sensor at the first time according to each piece of sub-data of the second raw data and the pose of the smart device at the detection time of that piece of sub-data;
    wherein the compensation module is configured to perform compensation processing on the first target data and the third target data according to the first pose and the second pose to obtain the compensation data of the first sensor at the second time and compensation data of the third sensor at the second time.
  21. The apparatus according to any one of claims 17-20, further comprising:
    a fifth acquisition module, configured to acquire, before the first determining module determines the first time under the clock of the smart device that corresponds to the third time, the difference information between the clock of the first sensor and the clock of the smart device according to an error between a Global Positioning System (GPS) clock and the clock of the smart device.
  22. The apparatus according to any one of claims 13-21, further comprising:
    a second determining module, configured to determine, before the second acquisition module acquires the second target data of the second sensor of the smart device at the second time and according to difference information between a clock of the second sensor and the clock of the smart device, the second time under the clock of the smart device that corresponds to a fourth time carried when the second sensor reports the second target data, where the fourth time identifies the detection time of the second target data and is a time under the clock of the second sensor.
  23. The apparatus according to claim 22, further comprising:
    a photographing module, configured to capture, using the second sensor, multiple video frames of a stopwatch of the smart device; and
    an analysis module, configured to compare the time information at which each video frame is captured with the time information displayed by the stopwatch in that video frame to obtain the difference information between the clock of the second sensor and the clock of the smart device.
  24. The apparatus according to any one of claims 13-23, wherein the second sensor is a camera and the first sensor is a lidar or a millimeter-wave radar.
  25. An intelligent driving control method, comprising:
    acquiring detection data of a first sensor and a second sensor provided on a smart device respectively, where the detection data is obtained using the sensor data processing method according to any one of claims 1-12; and
    performing intelligent driving control on the smart device according to the detection data.
  26. An intelligent driving control apparatus, comprising:
    an acquisition module, configured to acquire detection data of a first sensor and a second sensor provided on a smart device respectively, where the detection data is obtained using the sensor data processing method according to any one of claims 1-12; and
    an intelligent driving control module, configured to perform intelligent driving control on the smart device according to the detection data.
  27. An electronic device, comprising:
    a memory, configured to store computer instructions; and
    a processor, configured to call and execute the computer instructions in the memory to perform the method steps of any one of claims 1-12.
  28. An intelligent driving system, comprising: a communicatively connected sensor, the electronic device according to claim 27, and the intelligent driving control apparatus according to claim 26.
  29. A readable storage medium storing a computer program, where the computer program is used to execute the method steps of any one of claims 1-12, or to execute the method steps of claim 25.
PCT/CN2020/076813 2019-06-25 2020-02-26 Method and apparatus for processing data of sensor, electronic device, and system WO2020258901A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020217016721A KR20210087495A (en) 2019-06-25 2020-02-26 Sensor data processing methods, devices, electronic devices and systems
JP2021533178A JP7164721B2 (en) 2019-06-25 2020-02-26 Sensor data processing method, device, electronic device and system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910556258.2A CN112214009B (en) 2019-06-25 2019-06-25 Sensor data processing method and device, electronic equipment and system
CN201910556258.2 2019-06-25

Publications (1)

Publication Number Publication Date
WO2020258901A1 true WO2020258901A1 (en) 2020-12-30

Family

ID=74048283

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/076813 WO2020258901A1 (en) 2019-06-25 2020-02-26 Method and apparatus for processing data of sensor, electronic device, and system

Country Status (4)

Country Link
JP (1) JP7164721B2 (en)
KR (1) KR20210087495A (en)
CN (1) CN112214009B (en)
WO (1) WO2020258901A1 (en)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112902951A (en) * 2021-01-21 2021-06-04 深圳市镭神智能系统有限公司 Positioning method, device and equipment of driving equipment and storage medium
CN114520855B (en) * 2021-12-31 2024-03-15 广州文远知行科技有限公司 Image frame rendering method and device based on multi-module data and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5906655A (en) * 1997-04-02 1999-05-25 Caterpillar Inc. Method for monitoring integrity of an integrated GPS and INU system
CN103940434A (en) * 2014-04-01 2014-07-23 西安交通大学 Real-time lane line detecting system based on monocular vision and inertial navigation unit
CN104501814A (en) * 2014-12-12 2015-04-08 浙江大学 Attitude and position estimation method based on vision and inertia information
CN105745604A (en) * 2013-11-03 2016-07-06 微软技术许可有限责任公司 Sensor data time alignment
CN106546238A (en) * 2016-10-26 2017-03-29 北京小鸟看看科技有限公司 Wearable device and the method that user's displacement is determined in wearable device
CN107462892A (en) * 2017-07-28 2017-12-12 深圳普思英察科技有限公司 Mobile robot synchronous superposition method based on more sonacs
CN107976193A (en) * 2017-11-21 2018-05-01 出门问问信息科技有限公司 A kind of pedestrian's flight path estimating method, device, flight path infer equipment and storage medium
CN109270534A (en) * 2018-05-07 2019-01-25 西安交通大学 A kind of intelligent vehicle laser sensor and camera online calibration method

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008003779A (en) * 2006-06-21 2008-01-10 Hitachi Ltd Measurement data processing apparatus of preventive safety car
JP2009222438A (en) 2008-03-13 2009-10-01 Toyota Motor Corp Positioning device for movable body
JP5416026B2 (en) 2010-04-23 2014-02-12 本田技研工業株式会社 Vehicle periphery monitoring device
CN103353304B (en) * 2013-06-25 2016-02-03 深圳市宇恒互动科技开发有限公司 A kind ofly three-dimensional inertial motion is sensed to the method and device that compensate
CN104112363B (en) * 2014-07-04 2016-05-25 西安交通大学 Many sensing datas space-time synchronous method and many sensing datas of road vehicular collecting system
DE102016212326A1 (en) * 2016-07-06 2018-01-11 Robert Bosch Gmbh Method for processing sensor data for a position and / or orientation of a vehicle
JP6787102B2 (en) 2016-12-14 2020-11-18 株式会社デンソー Object detection device, object detection method
US10145945B2 (en) * 2017-01-11 2018-12-04 Toyota Research Institute, Inc. Systems and methods for automatically calibrating a LIDAR using information from a secondary vehicle
US10599931B2 (en) * 2017-08-21 2020-03-24 2236008 Ontario Inc. Automated driving system that merges heterogenous sensor data
CN108168918B (en) * 2017-12-25 2019-12-27 中铁第四勘察设计院集团有限公司 Synchronous automatic control system and method for synchronous measurement of automatic track measuring vehicle
CN108957466B (en) * 2018-04-18 2022-01-25 广东宝乐机器人股份有限公司 Radar data compensation method, device, equipment and storage medium for mobile robot
CN109218562B (en) * 2018-09-07 2021-04-27 百度在线网络技术(北京)有限公司 Clock synchronization method, device, equipment, storage medium and vehicle


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112880674A (en) * 2021-01-21 2021-06-01 深圳市镭神智能系统有限公司 Positioning method, device and equipment of driving equipment and storage medium
CN113033483A (en) * 2021-04-20 2021-06-25 北京百度网讯科技有限公司 Method and device for detecting target object, electronic equipment and storage medium
CN113033483B (en) * 2021-04-20 2024-02-02 北京百度网讯科技有限公司 Method, device, electronic equipment and storage medium for detecting target object
CN113724303A (en) * 2021-09-07 2021-11-30 广州文远知行科技有限公司 Point cloud and image matching method and device, electronic equipment and storage medium
CN113724303B (en) * 2021-09-07 2024-05-10 广州文远知行科技有限公司 Point cloud and image matching method and device, electronic equipment and storage medium
CN115167612A (en) * 2022-07-14 2022-10-11 北京中科心研科技有限公司 Wall time and supplementary packet method, device and medium for synchronizing data

Also Published As

Publication number Publication date
JP7164721B2 (en) 2022-11-01
CN112214009B (en) 2022-07-26
CN112214009A (en) 2021-01-12
KR20210087495A (en) 2021-07-12
JP2022513780A (en) 2022-02-09

Similar Documents

Publication Publication Date Title
WO2020258901A1 (en) Method and apparatus for processing data of sensor, electronic device, and system
US10789771B2 (en) Method and apparatus for fusing point cloud data
CN108900272B (en) Sensor data acquisition method and system and packet loss judgment method
US20200372672A1 (en) Image-based localization
WO2019119282A1 (en) Method and device for associating image and location information, and movable platform
JP2022509302A (en) Map generation method, operation control method, device, electronic device and system
CN109100730B (en) Multi-vehicle cooperative rapid map building method
US20170337701A1 (en) Method and system for 3d capture based on structure from motion with simplified pose detection
IL269560A (en) Distributed device mapping
CN113160327A (en) Method and system for realizing point cloud completion
US20220229759A1 (en) Method, device, and system for simulation test
JPWO2020039937A1 (en) Position coordinate estimation device, position coordinate estimation method and program
US20220163976A1 (en) Systems and methods for creating and using risk profiles for fleet management of a fleet of vehicles
Zingoni et al. Real-time 3D reconstruction from images taken from an UAV
CN113256683B (en) Target tracking method and related equipment
CN113129382A (en) Method and device for determining coordinate conversion parameters
WO2019080879A1 (en) Data processing method, computer device, and storage medium
CN113378605B (en) Multi-source information fusion method and device, electronic equipment and storage medium
WO2023213083A1 (en) Object detection method and apparatus and driverless car
KR20190134303A (en) Apparatus and method for image recognition
CN116684740A (en) Perception training data generation method, device, computer equipment and storage medium
CN115661014A (en) Point cloud data processing method and device, electronic equipment and storage medium
KR20210008647A (en) Apparatus and method for detecting vehicle type, speed and traffic using radar device and image processing
US20220373661A1 (en) Sensor triggering to synchronize sensor data
KR102019990B1 (en) Method and apparatus for estimating vehicle position based on visible light communication that considering motion blur compensation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20831377

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 20217016721

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2021533178

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20831377

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 12/09/2022)