WO2020258901A1 - Method and apparatus for processing data of sensor, electronic device, and system
- Publication number: WO2020258901A1
- Application: PCT/CN2020/076813
- Authority: WIPO (PCT)
Classifications
- G01C21/1652 — Dead reckoning by integrating acceleration or speed, i.e. inertial navigation, combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
- G01C21/1656 — Dead reckoning by integrating acceleration or speed, i.e. inertial navigation, combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
- G05D1/0246 — Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means, using a video camera in combination with image processing means
- G05D1/0257 — Control of position or course in two dimensions specially adapted to land vehicles using a radar
- G06V20/56 — Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
Description
- The embodiments of the present disclosure relate to smart driving technology, and in particular to a sensor data processing method and apparatus, an electronic device, and a system.
- Assisted driving and automatic driving are two important technologies in the field of intelligent driving; both can reduce the occurrence of traffic accidents, so they play an important role in this field.
- the implementation of assisted driving technology and autonomous driving technology requires the cooperation of multiple sensors.
- In the related art, a variety of sensors are installed at different positions on the vehicle and collect road images, vehicle operating data, and the like in real time, and the assisted driving system or automatic driving system performs path planning and other control operations based on the data collected by each sensor. Since the sensors installed on the vehicle may have different trigger times and trigger sources, the data of different sensors, and even the data within a single sensor, may be out of synchronization.
- the embodiments of the present disclosure provide a sensor data processing method, device, electronic equipment and system.
- In a first aspect, an embodiment of the present disclosure provides a sensor data processing method, including: acquiring first target data of a first sensor of a smart device at a first time; acquiring second target data of a second sensor of the smart device at a second time, the first time and the second time being different times under the clock of the smart device; acquiring a first pose of the smart device at the first time and a second pose at the second time, the first pose and the second pose being different; and performing compensation processing on the first target data according to the first pose and the second pose to obtain compensation data of the first sensor at the second time.
- In a second aspect, an embodiment of the present disclosure further provides a sensor data processing device, including: a first acquisition module, configured to acquire first target data of a first sensor of a smart device at a first time; a second acquisition module, configured to acquire second target data of a second sensor of the smart device at a second time, the first time and the second time being different times under the clock of the smart device; a third acquisition module, configured to acquire a first pose of the smart device at the first time and a second pose at the second time, the first pose and the second pose being different; and a compensation module, configured to perform compensation processing on the first target data according to the first pose and the second pose to obtain compensation data of the first sensor at the second time.
- In a third aspect, embodiments of the present disclosure also provide an intelligent driving control method, including: an intelligent driving control device acquiring detection data of a sensor provided on a smart device, the detection data being obtained by using the sensor data processing method described in the first aspect above; and the intelligent driving control device performing intelligent driving control on the smart device according to the detection data.
- In a fourth aspect, an embodiment of the present disclosure further provides an intelligent driving control device, including: an acquisition module, configured to acquire detection data of a sensor provided on a smart device, the detection data being obtained by using the sensor data processing method described in the first aspect; and an intelligent driving control module, configured to perform intelligent driving control on the smart device according to the detection data.
- In a fifth aspect, embodiments of the present disclosure further provide an electronic device, including: a memory, configured to store program instructions; and a processor, configured to call and execute the program instructions in the memory to perform the steps of the method described in the first aspect above.
- In a sixth aspect, embodiments of the present disclosure also provide an intelligent driving system, including: a sensor, the electronic device described in the fifth aspect, and the intelligent driving control device described in the fourth aspect, which are connected in communication.
- In a seventh aspect, the embodiments of the present disclosure also provide a readable storage medium in which a computer program is stored, the computer program being used to execute the method steps described in the first aspect, or to execute the method steps described in the third aspect.
- In the sensor data processing method provided by the embodiments of the present disclosure, the first target data of the first sensor at the first time and the second target data of the second sensor at the second time are acquired, and the first target data is compensated according to the pose of the smart device at the first time and its pose at the second time to obtain the compensation data of the first sensor at the second time.
- Since the second target data of the second sensor is detection data at the second time, performing compensation processing on the data of the first sensor yields the data of the first sensor at the time corresponding to the second sensor; that is, the obtained compensation data of the first sensor and the second target data of the second sensor are data at the same time, thereby achieving synchronization of the first sensor and the second sensor.
- The method realizes sensor synchronization through software; therefore, there is no need to additionally deploy dedicated hardware for synchronously triggering multiple sensors, which reduces the hardware cost required for multi-sensor data synchronization.
- FIG. 1 is a schematic diagram of an application scenario of a sensor data processing method provided by an embodiment of the disclosure
- FIG. 2 is a first flowchart of a sensor data processing method provided by an embodiment of the disclosure
- FIG. 3 is a schematic diagram of data synchronization between the first sensor and the second sensor through the above process
- FIG. 4 is a second schematic flowchart of a sensor data processing method provided by an embodiment of the disclosure.
- FIG. 5 is a third schematic flowchart of a sensor data processing method provided by an embodiment of the disclosure.
- FIG. 6 is a fourth schematic flowchart of a sensor data processing method provided by an embodiment of the disclosure.
- FIG. 7 is a fifth schematic flowchart of a sensor data processing method provided by an embodiment of the disclosure.
- FIG. 8 is a sixth schematic flowchart of a sensor data processing method provided by an embodiment of the disclosure.
- FIG. 9 is an example diagram of performing intra-frame data synchronization on the first original data to obtain the first target data
- FIG. 10 is a seventh schematic flowchart of a sensor data processing method provided by an embodiment of the disclosure.
- Figure 11 is an example diagram of synchronizing sensors of the same type
- FIG. 12 is a first module structure diagram of a sensor data processing device provided by an embodiment of the disclosure.
- FIG. 13 is a second module structure diagram of a sensor data processing device provided by an embodiment of the disclosure.
- FIG. 14 is the third module structure diagram of the sensor data processing device provided by an embodiment of the disclosure.
- FIG. 15 is a fourth module structure diagram of a sensor data processing device provided by an embodiment of the disclosure.
- FIG. 16 is the fifth module structure diagram of the sensor data processing device provided by the embodiments of the disclosure.
- FIG. 17 is a sixth module structure diagram of a sensor data processing device provided by an embodiment of the disclosure.
- FIG. 18 is a schematic structural diagram of an electronic device provided by an embodiment of the disclosure.
- FIG. 19 is a schematic flowchart of an intelligent driving control method provided by an embodiment of the disclosure.
- FIG. 20 is a schematic structural diagram of an intelligent driving control device provided by an embodiment of the disclosure.
- FIG. 21 is a schematic diagram of a smart driving system provided by an embodiment of the disclosure.
- FIG. 1 is a schematic diagram of an application scenario of a sensor data processing method provided by an embodiment of the disclosure.
- The method can be applied to smart devices that are equipped with sensors, such as vehicles, robots, and blind guide devices.
- The sensors installed on a smart device may include at least one of a camera, a LiDAR (Light Detection and Ranging) sensor, a millimeter-wave radar (RADAR, Radio Detection and Ranging), a high-precision inertial navigation unit, a Controller Area Network bus (CANBUS), and the like.
- the number of sensors of the same type set on the smart device may be one or more, for example, one or more cameras, one or more lidars, etc. may be set. Different sensors can be set in different positions of the smart device.
- Figure 1 shows a camera and LiDAR as an example.
- the camera can capture images of the surrounding environment of the smart device in real time, and report the captured images to the smart driving system of the smart device.
- LiDAR can obtain three-dimensional point coordinates around the smart device by emitting and receiving laser pulses, forming point cloud data, and reporting the point cloud data to the smart driving system of the smart device.
- RADAR uses electromagnetic waves to detect the ground, vehicles, trees and other objects around the smart device and receive their echoes to obtain the object's position, height, distance and other information, and report it to the smart device's smart driving system.
- CANBUS transmits the operating parameters of the smart device, such as the vehicle's accelerator operating parameters, steering wheel operating parameters, wheel speed, etc., to the smart driving system of the smart device using serial data transmission.
- the intelligent driving system performs intelligent driving control based on the data reported by each sensor, such as vehicle positioning, route planning, route deviation warning, and traffic flow analysis.
- the following embodiments of the present disclosure refer to the smart driving system of the smart device as a "system" for short.
- FIG. 2 is a schematic flowchart 1 of a sensor data processing method provided by an embodiment of the disclosure. As shown in FIG. 2, the method includes:
- S201: Acquire first target data of a first sensor of a smart device at a first time.
- S202: Acquire second target data of a second sensor of the smart device at a second time, where the first time and the second time are different times under the clock of the smart device.
- S203: Acquire a first pose of the smart device at the first time and a second pose at the second time, where the first pose and the second pose are different.
- the system may send a data report instruction to the first sensor and the second sensor, and the first sensor and the second sensor detect data after receiving the data report instruction, and send the detected data to the system.
- the first sensor and the second sensor may also automatically detect data according to a preset cycle, and send the detected data to the system.
- S204: Perform compensation processing on the first target data according to the first pose and the second pose to obtain compensation data of the first sensor at the second time.
- the first sensor and the second sensor may be any two sensors on the smart device.
- One or more sensors of the same type may be provided on the smart device, and the first sensor and the second sensor may be two sensors of different types, or two sensors of the same type.
- For example, the first sensor may be a LiDAR and the second sensor may be a camera.
- In this case, since the data detected by the camera is two-dimensional while the data detected by the LiDAR is three-dimensional, the data needs to be rotated and translated in the subsequent compensation processing. Therefore, using the camera, which detects two-dimensional information, as the reference sensor, and performing motion compensation operations such as rotation and translation on the three-dimensional data detected by the LiDAR, ensures that no additional depth error is introduced during the compensation process.
- the first sensor and the second sensor may both be LiDAR.
- each sensor may use its own independent clock, or may use the same clock as the intelligent driving system of the vehicle.
- the first time and the second time mentioned in this embodiment both refer to the time under the clock of the intelligent driving system of the vehicle.
- Each sensor may report the detection time of the data under the sensor's clock to the system by carrying time information in the reported data.
- In general, the time at which each sensor sends data is equal or approximately equal to the time at which the system receives the data.
- The above-mentioned first time corresponds to the time at which the first sensor sends its data: it is a time under the system clock, at which the system receives the data, while the sensor records the sending time under its own clock. If the clock of the first sensor differs from the clock of the system, the system needs to obtain the first time according to the data sending time under the clock of the first sensor and the clock difference between the first sensor and the system.
- This process will be described in detail in the following embodiments. The processing for the above second time is similar to that for the first time and will not be repeated here.
- the first sensor and the second sensor may be out of synchronization due to factors such as different trigger sources and different trigger moments.
- For example, suppose the first sensor is a LiDAR and the second sensor is a camera.
- The LiDAR reports one frame of data to the system every time it completes one rotation.
- The camera reports data to the system according to its own shooting cycle. Therefore, even if the LiDAR and the camera start working at the same time, the data they report to the system may not correspond to the same moment.
- For example, the LiDAR may detect a person 100 meters in front of the vehicle while the camera captures a person 120 meters in front of the vehicle.
- In the embodiments of the present disclosure, the second sensor is used as the reference sensor of the first sensor, and the second time, at which the system receives the second target data of the second sensor, is used as the reference time for the first time at which the first sensor sends its data.
- Performing compensation processing on the first target data yields the compensation data of the first sensor at the second time.
- Since the second target data of the second sensor is detection data at the second time, performing compensation processing on the data of the first sensor yields the data of the first sensor at the time corresponding to the second sensor; that is, the data of the first sensor and the data of the second sensor are data at the same time, thereby realizing synchronization of the first sensor and the second sensor.
- The synchronization of the sensors is realized through software, so there is no need to deploy additional dedicated hardware for triggering multiple sensors synchronously, which reduces the hardware cost required for multi-sensor data synchronization.
- Figure 3 is a schematic diagram of data synchronization between the first sensor and the second sensor through the above process.
- Taking the first sensor as a LiDAR and the second sensor as a camera, the data frame reported by the LiDAR may be called a radar frame (or LiDAR frame), and the data frame reported by the camera may be called a camera frame.
- As shown in Figure 3, the system receives a radar frame at time Tl (i.e., the first time) and a camera frame at time Tc (i.e., the second time). Using the camera as the reference sensor, the above process compensates the radar frame received at time Tl to time Tc.
- Obtaining the data of the radar frame at time Tc is equivalent to acquiring radar data and camera data at the same time Tc, thereby realizing synchronization of the LiDAR and the camera.
- In the scenario above, each sensor sends its detected data to the system in real time, and that data is used to synchronize the sensors online.
- Figure 4 illustrates the synchronization processing of the recorded data of each sensor during data playback.
- The detection data of the camera is recorded in advance to obtain a series of camera frames, each of which records its detection time stamp; likewise, each radar frame records its detection time stamp.
- During data playback, a camera frame and a radar frame are read. At the same time, the vehicle's pose queue is obtained from the recorded CANBUS/high-precision inertial navigation detection data. The pose of the vehicle at the detection time of the camera frame is then obtained according to the camera frame's time stamp, and the pose at the detection time of the radar frame according to the radar frame's time stamp. The radar frame is compensated to the detection time of the camera frame according to these two poses.
- The camera frame and radar frame are thereby synchronized, and operations such as driving control can be performed based on the synchronized camera frame and radar frame.
- When more than two sensors are provided on the smart device, one of them can be selected as the second sensor, that is, the reference sensor, and the other sensors are each synchronized with the reference sensor, so as to realize synchronization of the sensors on the smart device.
- In this embodiment, the first target data is compensated to obtain the compensation data of the first sensor at the second time. Since the second target data of the second sensor is detection data at the second time, performing compensation processing on the data of the first sensor yields the data of the first sensor at the time corresponding to the second sensor.
- That is, the compensated first target data and the second target data of the second sensor are data at the same time, so as to achieve synchronization of the first sensor and the second sensor.
- The method realizes the synchronization of the sensors through software; therefore, there is no need to additionally deploy dedicated hardware for synchronous triggering of multiple sensors, which reduces the hardware cost required for data synchronization of multiple sensors.
- FIG. 5 is a schematic diagram of the third flow of a sensor data processing method provided by an embodiment of the disclosure. As shown in FIG. 5, an optional implementation of performing compensation processing on the first target data according to the first pose and the second pose in step S204 includes:
- S501: Determine the first coordinate system of the first sensor at the first time according to the first pose.
- S502: Determine the second coordinate system of the first sensor at the second time according to the second pose.
- S503: Perform compensation processing on the first target data according to the first coordinate system and the second coordinate system to obtain the compensation data of the first sensor at the second time.
- While running, the vehicle has a corresponding pose at each moment; the pose may change between moments, and the change may include rotation and translation.
- The pose of a sensor is based on the pose of the vehicle at each moment; given the vehicle's rotation and translation, the pose of the sensor can be obtained.
- Each pose of the sensor corresponds to a coordinate system, and a point in the world coordinate system detected by the sensor in a given pose is a point in that coordinate system; that is, the coordinate value of the detected point is its coordinate value in that coordinate system.
- Accordingly, the vehicle has corresponding poses at the first time and the second time, and hence there are a first coordinate system corresponding to the first pose and a second coordinate system corresponding to the second pose.
- For each point detected by the first sensor, the coordinate value of the point in the second coordinate system, that is, its coordinate value at the second time, can be derived. Performing this processing on each point in the data detected by the first sensor yields the detection data of the first sensor at the second time.
- The following example illustrates the process of performing compensation processing on the first target data of the first sensor based on the first coordinate system and the second coordinate system.
- Suppose the first time is t0 and the second time is tn; the pose of the vehicle at t0 is P0, and the coordinate system corresponding to P0 is the first coordinate system; similarly, the pose of the vehicle at tn is Pn, and the coordinate system corresponding to Pn is the second coordinate system.
- For a point X in the world coordinate system, let x0 be the coordinate data of X obtained in the first coordinate system at time t0. Treating the pose as the rigid-body transform (rotation and translation) from the local coordinate system to the world coordinate system, the relationship between x0 and X satisfies the following formula (1): X = P0 · x0 (1).
- Thus the first sensor detects, at the first time t0, the coordinates of the point in the first coordinate system corresponding to the pose at t0; from the poses at t0 and tn respectively, the coordinates of the point in the second coordinate system corresponding to the pose at time tn can be derived as xn = Pn⁻¹ · P0 · x0.
- Performing the above processing on each point of the first target data yields the detection data of the first sensor at time tn.
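- To make the transform chain concrete, here is a minimal Python sketch of this compensation step (an illustration only, not the patent's implementation; the 4×4 homogeneous pose matrices, the point layout, and the function name compensate_points are assumptions):

```python
import numpy as np

def compensate_points(points_t0: np.ndarray, pose_t0: np.ndarray, pose_tn: np.ndarray) -> np.ndarray:
    """Re-express points detected at t0 in the coordinate system of the pose at tn.

    points_t0: (N, 3) array of points in the first coordinate system (pose P0 at t0).
    pose_t0, pose_tn: 4x4 homogeneous transforms from the local frame at that time
    to the world frame, i.e. X_world = P @ x_local, matching formula (1) above.
    """
    # Chain of transforms: local(t0) -> world -> local(tn), i.e. xn = Pn^-1 @ P0 @ x0.
    transform = np.linalg.inv(pose_tn) @ pose_t0
    ones = np.ones((points_t0.shape[0], 1))
    homogeneous = np.hstack([points_t0, ones])      # promote to homogeneous coordinates
    compensated = (transform @ homogeneous.T).T     # apply the combined transform
    return compensated[:, :3]                       # drop the homogeneous 1s
```

- In the Figure 3 example, pose_t0 would be the vehicle pose at time Tl and pose_tn the pose at time Tc, so the returned points are the radar frame's data expressed at time Tc.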
- The above compensation processing is performed based on the first pose of the vehicle at the first time and the second pose of the vehicle at the second time. Therefore, the first pose of the vehicle at the first time and the second pose of the vehicle at the second time are obtained before the compensation processing.
- Optionally, the pose queue of the smart device may be generated first, and then the first pose of the smart device at the first time and the second pose of the smart device at the second time are obtained based on the pose queue.
- FIG. 6 is a fourth flowchart of a sensor data processing method provided by an embodiment of the present disclosure. As shown in FIG. 6, the pose queue of the smart device is generated, and the first pose and the second pose are determined based on the pose queue of the smart device The process includes:
- S601: Acquire pose detection data of the smart device detected at multiple times by a sensor with a pose detection function provided on the smart device, each of the multiple times being a time under the clock of the smart device.
- S602: Generate a pose queue of the smart device according to the above-mentioned pose detection data.
- the sensors with a pose detection function provided on the vehicle may include CANBUS, high-precision inertial navigation and other sensors.
- The system can receive real-time pose detection data reported by the CANBUS, high-precision inertial navigation and other sensors, such as the vehicle's wheel speed, steering wheel operating data and the like. Based on these pose detection data, the system can calculate the vehicle's pose at multiple times and then build a pose queue.
- S603 Determine the first pose and the second pose according to the pose queue of the smart device, the first moment and the second moment.
- the pose queue of the smart device is composed of poses at each moment.
- The first time may be a time corresponding to a certain pose in the pose queue; that is, the mapping relationship between the first time and the pose information exists directly in the pose queue.
- In this case, the pose information corresponding to the first time can be directly obtained from the pose queue.
- The same processing applies to the pose information at the second time: if the mapping relationship between the second time and the pose information exists directly in the pose queue, the pose information corresponding to the second time can be directly obtained from the pose queue.
- Optionally, in response to a situation where the pose queue does not include the pose at the first time, compensation processing is performed on the pose queue of the smart device according to the first time to obtain the first pose; and/or, in response to a situation where the pose queue does not include the pose at the second time, compensation processing is performed on the pose queue of the smart device according to the second time to obtain the second pose.
- That is, the pose information corresponding to the first time may not exist in the pose queue, in which case the pose queue of the smart device can be compensated according to the first time to obtain the first pose of the smart device at the first time.
- The compensation processing for the pose queue may be, for example, interpolation processing.
- For example, if the first time is t3 and there is no pose information corresponding to t3 in the pose queue of the smart device, the two adjacent times closest to t3 can be found in the pose queue, for example times t4 and t5, where t4 and t5 are adjacent and t3 lies between them.
- Interpolation is then performed using the pose information at time t4 and the pose information at time t5 to obtain the pose information corresponding to time t3, as in the sketch below.
- The second time can be processed in the same way, which will not be repeated here.
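- A minimal sketch of such interpolation (an illustration, assuming each queue entry stores a translation vector and a rotation; the use of SciPy's Rotation/Slerp is an assumption, not the patent's method):

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def interpolate_pose(t3, t4, pose4, t5, pose5):
    """Interpolate the pose at t3 from the bracketing queue entries at t4 < t3 < t5.

    Each pose is a pair (translation: (3,) ndarray, rotation: scipy Rotation).
    """
    alpha = (t3 - t4) / (t5 - t4)                        # interpolation weight in [0, 1]
    trans4, rot4 = pose4
    trans5, rot5 = pose5
    translation = (1 - alpha) * trans4 + alpha * trans5  # linear interpolation of position
    slerp = Slerp([t4, t5], Rotation.concatenate([rot4, rot5]))
    rotation = slerp(t3)                                 # spherical interpolation of orientation
    return translation, rotation
```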
- In the above embodiments, the data on which compensation processing is performed is the foregoing first target data.
- The above-mentioned first target data may refer to unprocessed data directly detected by the first sensor, or it may be data obtained by pre-synchronizing the raw data detected by the first sensor.
- For example, take the first sensor to be a LiDAR.
- The LiDAR rotates one full circle and reports one frame of data to the system after detecting one circle of data. Since one rotation of the LiDAR takes a certain period of time, within one LiDAR frame reported to the system the actual detection time of each sub-data is different.
- For example, one LiDAR frame may include multiple data packets, each data packet being one sub-data.
- In this case, the sub-data in the data sent by the first sensor can first be synchronized so that each frame of data sent by the first sensor achieves intra-frame synchronization.
- Synchronizing the above-mentioned sub-data means taking the detection time of one of the sub-data as a reference time and performing compensation processing on the remaining sub-data to obtain their corresponding sub-data at the reference time.
- Optionally, the first time may first be determined according to the third time carried when the first sensor reports the first raw data corresponding to the first target data, together with the difference information between the clock of the first sensor and the clock of the smart device.
- Here, the first raw data refers to data reported by the first sensor to the system that has not undergone intra-frame synchronization processing, while the first target data refers to data that has undergone intra-frame synchronization processing.
- The time at which the data was detected may be carried in the first raw data, that is, the aforementioned third time.
- The third time corresponds to the first time under the clock of the smart device: the third time identifies the time at which the first sensor detected the first raw data corresponding to the first target data, and it is a time under the clock of the first sensor.
- The difference information between the clock of the first sensor and the clock of the smart device may be obtained in advance by specific means.
- For example, when the clock of the first sensor is based on the Global Positioning System (GPS), the difference information between the clock of the first sensor and the clock of the smart device may be determined according to the error between the GPS clock and the clock of the smart device.
- Similarly, it may be determined, according to the difference information between the clock of the second sensor and the clock of the smart device, that the fourth time carried when the second sensor reports the second target data corresponds to the second time under the smart device clock.
- The fourth time is used to identify the detection time of the second target data, and it is a time under the clock of the second sensor.
- For example, the second sensor can be used to shoot multiple video frames of the stopwatch of the smart device, and the time information of each video frame is compared and analyzed against the time displayed by the stopwatch in the corresponding frame to obtain the difference information between the clock of the second sensor and the clock of the smart device, as sketched below.
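- A minimal sketch of how such a clock offset might be estimated from the stopwatch frames and then applied to sensor timestamps (the function names and the simple averaging are illustrative assumptions):

```python
from statistics import mean

def estimate_clock_offset(frame_times, stopwatch_times):
    """Estimate (sensor clock - device clock) from paired observations.

    frame_times: timestamps of the video frames under the sensor's own clock.
    stopwatch_times: device-clock times displayed by the stopwatch in those frames.
    """
    return mean(f - s for f, s in zip(frame_times, stopwatch_times))

def to_device_clock(sensor_time, offset):
    """Map a sensor-clock timestamp (e.g. the third or fourth time) to the device clock."""
    return sensor_time - offset
```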
- the following describes the process of performing intra-frame data synchronization based on the third moment carried when the first sensor reports the first original data.
- FIG. 7 is a schematic flow chart 5 of the sensor data processing method provided by an embodiment of the disclosure. As shown in FIG. 7, the process of performing intra-frame data synchronization on the first raw data to obtain the first target data includes:
- S701: Receive the first raw data reported by the first sensor, where the first raw data includes multiple sub-data and carries the detection time of each sub-data; that is, each sub-data has a corresponding detection time.
- S702: Acquire the first target data of the first sensor at the first time according to the pose of the smart device at the detection time of each sub-data and each sub-data.
- The system can select one of these detection times as the reference time, that is, the third time, and compensate the sub-data at the other times to the reference time so as to obtain all the sub-data at the reference time; the combined sub-data form the first target data, thereby realizing intra-frame data synchronization of the first sensor.
- Optionally, the latest of the multiple detection times may be selected as the third time.
- In this way, intra-frame synchronization of the multiple sub-data in the first raw data can be completed, thereby further improving the accuracy of sensor synchronization.
- FIG. 8 is a schematic flow chart 6 of a sensor data processing method provided by an embodiment of the disclosure. As shown in FIG. 8, an optional implementation manner of the foregoing step S702 includes:
- S801: Determine the coordinate system of the first sensor at the detection time of each sub-data according to the pose of the smart device at the detection time of each sub-data.
- S802: According to the coordinate system of the first sensor at the detection time of each sub-data other than that of the third time and the coordinate system of the first sensor at the third time, respectively determine the sub-data corresponding to each such sub-data at the third time.
- Here the third time is used as the reference time, and the sub-data at the other detection times are compensated to the third time to synchronize the sub-data.
- The process of determining the sub-data corresponding to a sub-data at the third time, according to the coordinate system at that sub-data's detection time and the coordinate system at the third time, is consistent with the processing in step S503 above.
- S803: Integrate the sub-data corresponding at the third time to each sub-data, to obtain the first target data of the first sensor at the first time.
- Specifically, the sub-data corresponding to each sub-data at the third time can be sorted and combined according to their original detection times to obtain the above-mentioned first target data.
- All sub-data in the first target data are then data in the coordinate system corresponding to the third time; that is, all sub-data in the first target data are synchronized.
- Since the third time corresponds to the first time under the clock of the smart device, under the clock of the smart device the first target data is the detection data at the first time.
- Figure 9 is an example diagram of performing intra-frame data synchronization on the first raw data to obtain the first target data.
- As shown in Figure 9, each frame of data reported by the LiDAR includes n+1 data packets, and each data packet corresponds to a detection time. The detection time of the nth data packet (data packet n) can be used as the reference time, and data packets 0 to n-1 are each compensated to the time corresponding to data packet n, thereby realizing intra-frame synchronization of one frame of data, as sketched below.
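- A minimal sketch of this intra-frame synchronization, reusing the compensate_points helper sketched earlier (the packet layout and the pose_at lookup, e.g. backed by the pose-queue interpolation above, are assumptions):

```python
import numpy as np

def sync_frame(packets, pose_at):
    """Compensate all packets of one LiDAR frame to the last packet's detection time.

    packets: list of (detection_time, points) pairs, ordered packet 0..n,
             where points is an (N, 3) ndarray.
    pose_at: callable mapping a detection time to the vehicle's 4x4 pose matrix.
    """
    t_ref = packets[-1][0]                  # detection time of data packet n
    pose_ref = pose_at(t_ref)
    synced = []
    for t, points in packets:
        if t == t_ref:
            synced.append(points)           # reference packet needs no compensation
        else:
            synced.append(compensate_points(points, pose_at(t), pose_ref))
    return np.vstack(synced)                # one intra-frame-synchronized LiDAR frame
```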
- the above describes the process of performing intra-frame synchronization on the first original data composed of multiple sub-data to obtain the first target data.
- When the smart device contains multiple sensors of the same type, and the detection data of each such sensor takes the above-mentioned form comprising multiple sub-data, the first sensor is one of the multiple sensors of the same type.
- the detection data of multiple sensors of the same type may be synchronized based on the multiple sub-data synchronization method described above.
- FIG. 10 is a schematic flow diagram 7 of a sensor data processing method provided by an embodiment of the disclosure. As shown in FIG. 10, the process of synchronizing detection data of multiple sensors of the same type includes:
- Here the type of the third sensor is the same as that of the first sensor.
- The second raw data includes multiple sub-data, and each sub-data has a corresponding detection time.
- The third time is also the reference time for the detection times of the multiple sub-data included in the second raw data; that is, one of the detection times of the first sensor, namely the third time, is used as the reference time when synchronizing the third sensor.
- Each sub-data in the second raw data is compensated to the third time, so that all sub-data in the second raw data are synchronized, yielding the synchronized third target data.
- Since the third time of the first sensor is used as the reference time, the synchronized third target data of the third sensor and the synchronized first target data of the first sensor are synchronized with each other.
- Figure 11 is an example diagram of synchronizing sensors of the same type. As shown in Figure 11, taking the first sensor, the third sensor, and the fourth sensor all to be LiDARs as an example, each frame of data reported by each LiDAR includes n+1 data packets, and each data packet corresponds to a detection time.
- The detection time of the nth data packet (data packet n) of the first sensor can be used as the reference time: data packets 0 to n-1 of the first sensor are compensated to the time corresponding to its data packet n; data packets 0 to n of the third sensor are compensated to the time corresponding to data packet n of the first sensor; and data packets 0 to n of the fourth sensor are likewise compensated to the time corresponding to data packet n of the first sensor.
- In this way, the intra-frame synchronization of the first sensor, the third sensor, and the fourth sensor, and the inter-frame synchronization between the first sensor, the third sensor, and the fourth sensor, are realized.
- In summary, the detection time of one sub-data in the detection data of one sensor can be used as the reference time, and for each of the remaining sensors of the same type, all sub-data of that sensor are compensated to the reference time, as in the sketch below. After this processing, not only is the intra-frame synchronization of each sensor realized, but inter-frame synchronization between sensors of the same type is realized as well.
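- Building on the same assumed helpers, synchronizing several same-type LiDARs amounts to compensating every packet of every sensor to the one shared reference time (again a sketch, not the patent's implementation):

```python
import numpy as np

def sync_same_type_sensors(frames_by_sensor, pose_at):
    """Synchronize one frame from each same-type LiDAR to a shared reference time.

    frames_by_sensor: dict mapping sensor id -> list of (detection_time, points)
    packets; the first sensor's packet n supplies the reference (third) time.
    """
    first_packets = next(iter(frames_by_sensor.values()))
    t_ref = first_packets[-1][0]            # the third time of the first sensor
    pose_ref = pose_at(t_ref)
    synced = {}
    for sensor_id, packets in frames_by_sensor.items():
        # Compensate every packet of every sensor to the shared reference time.
        synced[sensor_id] = np.vstack([
            points if t == t_ref else compensate_points(points, pose_at(t), pose_ref)
            for t, points in packets
        ])
    return synced
```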
- On this basis, performing compensation processing on the first target data according to the first pose and the second pose to obtain the compensation data of the first sensor at the second time includes: performing compensation processing on the first target data and the third target data according to the first pose and the second pose to obtain the compensation data of the first sensor at the second time and the compensation data of the third sensor at the second time.
- That is, when the first target data is compensated according to the first pose and the second pose, the third target data can also be compensated based on the first pose and the second pose, yielding the compensation data of the first sensor at the second time and the compensation data of the third sensor at the second time. Since the first sensor and the third sensor are sensors of the same type and have already been synchronized with each other, synchronizing them with the second sensor on this basis can further improve the accuracy of sensor synchronization.
- FIG. 12 is the first module structure diagram of a sensor data processing device provided by an embodiment of the disclosure. As shown in FIG. 12, the device includes:
- the first acquiring module 1201 is configured to acquire the first target data of the first sensor of the smart device at the first moment;
- the second acquisition module 1202 is configured to acquire second target data of the second sensor of the smart device at a second time, where the first time and the second time are different times under the clock of the smart device;
- the third acquisition module 1203 is configured to acquire the first pose of the smart device at the first time and the second pose at the second time, the first pose and the second pose being different;
- the compensation module 1204 is configured to perform compensation processing on the first target data according to the first pose and the second pose to obtain compensation data of the first sensor at the second time.
- the device is used to implement the foregoing method embodiments, and its implementation principles and technical effects are similar, and will not be repeated here.
- Optionally, the compensation module 1204 is configured to: determine the first coordinate system of the first sensor at the first time according to the first pose; determine the second coordinate system of the first sensor at the second time according to the second pose; and perform compensation processing on the first target data according to the first coordinate system and the second coordinate system to obtain the compensation data of the first sensor at the second time.
- Optionally, the third acquisition module 1203 is configured to: acquire the pose detection data of the smart device detected at multiple times by the sensor with a pose detection function provided on the smart device, each of the multiple times being a time under the clock of the smart device; generate the pose queue of the smart device according to that pose detection data; and determine the first pose and the second pose according to the pose queue of the smart device, the first time and the second time.
- Optionally, the third acquisition module 1203 is configured to: in response to a situation where the pose at the first time is not included in the pose queue, perform compensation processing on the pose queue of the smart device according to the first time to obtain the first pose; and/or, in response to a situation where the pose at the second time is not included in the pose queue, perform compensation processing on the pose queue of the smart device according to the second time to obtain the second pose.
- FIG. 13 is the second module structure diagram of the sensor data processing device provided by the embodiment of the disclosure.
- The device further includes: a first determining module 1205, configured to determine, according to the difference information between the clock of the first sensor and the clock of the smart device, that the third time corresponds to the first time under the smart device clock, where the third time is used to identify the time at which the first sensor detected the first raw data corresponding to the first target data, and the third time is a time under the clock of the first sensor.
- Optionally, the first acquisition module 1201 is configured to: receive the first raw data reported by the first sensor, where the first raw data includes multiple sub-data, each sub-data has a corresponding detection time, and the third time is the reference time of the detection times of the multiple sub-data included in the first raw data; and acquire the first target data of the first sensor at the first time according to the pose of the smart device at the detection time of each sub-data and each sub-data.
- Optionally, the first acquisition module 1201 is configured to: determine the coordinate system of the first sensor at the detection time of each sub-data according to the pose of the smart device at that detection time; according to the coordinate system of the first sensor at the detection time of each sub-data other than that of the third time and the coordinate system of the first sensor at the third time, respectively determine the sub-data corresponding to each such sub-data at the third time; and integrate the sub-data corresponding to each sub-data at the third time to obtain the first target data of the first sensor at the first time.
- FIG. 14 is the third module structure diagram of the sensor data processing device provided by the embodiment of the disclosure.
- The device further includes: a receiving module 1206, configured to receive the second raw data reported by the third sensor,
- where the type of the third sensor is the same as the type of the first sensor, the second raw data includes multiple sub-data, each sub-data has a corresponding detection time, and the third time is the reference time of the detection times of the multiple sub-data included in the second raw data;
- and a fourth acquisition module 1207, configured to acquire the third target data of the third sensor at the first time according to the pose of the smart device at the detection time of each sub-data of the second raw data and each sub-data of the second raw data.
- Accordingly, the compensation module 1204 is configured to perform compensation processing on the first target data and the third target data according to the first pose and the second pose to obtain the compensation data of the first sensor at the second time and the compensation data of the third sensor at the second time.
- The device further includes: a fifth acquisition module 1208, configured to, before the first determining module 1205 determines according to the difference information between the clock of the first sensor and the clock of the smart device that the third time corresponds to the first time under the smart device clock, obtain the difference information between the clock of the first sensor and the clock of the smart device according to the clock error between the GPS clock and the clock of the smart device.
- FIG. 16 is the fifth module structure diagram of the sensor data processing device provided by the embodiments of the disclosure.
- The device further includes: a second determining module 1209, configured to, before the second acquisition module 1202 acquires the second target data of the second sensor of the smart device at the second time, determine, according to the difference information between the clock of the second sensor and the clock of the smart device, that the fourth time carried when the second sensor reports the second target data corresponds to the second time under the smart device clock,
- where the fourth time is used to identify the detection time of the second target data, and the fourth time is a time under the clock of the second sensor.
- FIG. 17 is a module structure diagram 6 of the sensor data processing device provided by an embodiment of the disclosure. As shown in FIG. 17, the device further includes: a photographing module 1210, configured to use the second sensor to capture multiple video frames of the stopwatch of the smart device;
- and an analysis module 1211, configured to compare and analyze the time information of each video frame and the time information displayed by the stopwatch in the corresponding video frame to obtain the difference information between the clock of the second sensor and the clock of the smart device.
- Optionally, the second sensor is a camera, and the first sensor is a LiDAR or a millimeter-wave radar.
- the division of the various modules of the above device is only a division of logical functions, and may be fully or partially integrated into a physical entity in actual implementation, or may be physically separated.
- these modules can all be implemented in the form of software called by processing elements; they can also be implemented in the form of hardware; some modules can be implemented in the form of calling software by processing elements, and some of the modules can be implemented in the form of hardware.
- the determining module may be a separately established processing element, or it may be integrated into a certain chip of the above-mentioned device for implementation.
- each step of the above method or each of the above modules can be completed by hardware integrated logic circuits in the processor element or instructions in the form of software.
- For example, the above modules may be one or more integrated circuits configured to implement the above methods, such as one or more application-specific integrated circuits (ASIC), one or more digital signal processors (DSP), or one or more field-programmable gate arrays (FPGA), etc.
- the processing element may be a general-purpose processor, such as a central processing unit (CPU) or other processors that can call program codes.
- these modules can be integrated together and implemented in the form of a System-On-a-Chip (SOC).
- the computer program product includes one or more computer instructions.
- the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable devices.
- The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired or wireless manner.
- the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server or data center integrated with one or more available media.
- the usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)), etc.
- FIG. 18 is a schematic structural diagram of an electronic device provided by an embodiment of the disclosure.
- The electronic device 1800 may include: a processor 181 and a memory 182, where the memory 182 is used to store computer instructions, and when the processor 181 executes the computer instructions, the solutions of the embodiments shown in FIGS. 1 to 10 above are implemented.
- the electronic device 1800 may further include a communication interface 183 for communicating with other devices. It can be understood that the electronic device 1800 may further include a system bus 184, and the system bus 184 is used to implement connection and communication between these components.
- the system bus 184 may be a Peripheral Component Interconnect (PCI) bus or an Extended Industry Standard Architecture (EISA) bus.
- the system bus 184 can be divided into an address bus, a data bus, a control bus, and the like. For ease of representation, only one thick line is used in the figure, but it does not mean that there is only one bus or one type of bus.
- the communication interface 183 is used to implement communication between the database access device and other devices (for example, a client, a read-write library, and a read-only library).
- the memory 182 may include a random access memory (Random Access Memory, RAM), and may also include a non-volatile memory (Non-Volatile Memory), such as at least one disk memory.
- The aforementioned processor 181 may be a general-purpose processor, including a CPU, a network processor (NP), and the like; it may also be a DSP, an ASIC, an FPGA or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
- FIG. 19 is a schematic flow chart of the intelligent driving control method provided by the embodiments of the present disclosure.
- the embodiments of the present disclosure also provide an intelligent driving control method, including:
- S1901: Respectively obtain detection data of the first sensor and the second sensor provided on the smart device, where the detection data is obtained by using the sensor data processing method provided in the embodiments of the present disclosure.
- S1902: Perform intelligent driving control on the smart device according to the detection data.
- the execution subject of this embodiment may be a smart driving control device.
- the smart driving control device of this embodiment and the electronic device described in the above embodiments may be located in the same device, or may be separately located in different devices.
- the intelligent driving control device of this embodiment is communicatively connected to the above-mentioned electronic device.
- the detection data of the first sensor and the second sensor is obtained by the method of the above embodiments; for the specific process, refer to the description of the above embodiments, which is not repeated here.
- the electronic device executes the above sensor data processing method to obtain the detection data of the first sensor and the second sensor set on the smart device, and outputs the detection data.
- the intelligent driving control device obtains the detection data of the first sensor and the second sensor, and performs intelligent driving control on the smart device according to the detection data.
- the smart driving in this embodiment includes assisted driving, automatic driving, and/or driving mode switching between assisted driving and automatic driving, and the like.
- the above-mentioned intelligent driving control may include: braking, changing the driving speed, changing the driving direction, keeping to the lane line, changing the state of the lights, switching the driving mode, etc., where the driving mode switching may be switching between assisted driving and automatic driving, for example, switching from assisted driving to automatic driving through intelligent driving control.
- the smart driving control device obtains the detection data of the sensors set on the smart device, and performs smart driving control based on that detection data, thereby improving the safety and reliability of smart driving.
- FIG. 20 is a schematic structural diagram of an intelligent driving control device provided by an embodiment of the disclosure.
- the intelligent driving control device 2000 of the embodiment of the present disclosure includes: an acquisition module 2001, configured to respectively acquire detection data of the first sensor and the second sensor set on the smart device, where the detection data is obtained by using the aforementioned sensor data processing method;
- the intelligent driving control module 2002 is used to perform intelligent driving control on the smart device according to the detection data.
- the intelligent driving control device of the embodiment of the present disclosure may be used to execute the technical solution of the method embodiment shown above, and its implementation principle and technical effect are similar, and will not be repeated here.
- FIG. 21 is a schematic diagram of a smart driving system provided by an embodiment of the disclosure.
- the smart driving system 2100 of this embodiment includes a sensor 2101, an electronic device 1800, and an intelligent driving control device 2000 that are communicatively connected, where the electronic device 1800 is shown in FIG. 18, and the intelligent driving control device 2000 is shown in FIG. 20.
- the sensor 2101 may include at least one of the sensors described in the foregoing embodiments, such as a camera, LiDAR, RADAR, or high-precision inertial navigation.
- the sensor 2101 detects the surrounding environment of the smart device to obtain original detection data, and sends these detection data to the electronic device 1800.
- the electronic device 1800 receives the original detection data and performs data synchronization according to the above sensor data processing method to obtain synchronized detection data.
- the electronic device 1800 sends the synchronized detection data to the smart driving control device 2000, and the smart driving control device 2000 performs smart driving control on the smart device according to the synchronized detection data.
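- To make the synchronization step above concrete, here is a minimal planar (2D) sketch of pose-based compensation, assuming simple (x, y, yaw) poses; the function names and numeric values are illustrative assumptions, and a real system would use full SE(3) rotations and translations as in the embodiments of FIGS. 1 to 10.

```python
import numpy as np

def pose_to_matrix(x, y, yaw):
    """Homogeneous 2D transform (world <- device) for a pose; a full
    implementation would use an SE(3) rotation + translation instead."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]])

def compensate_to_second_time(points_first, pose_first, pose_second):
    """Re-express first-sensor data captured at the first time in the
    coordinate system of the second time, using the two device poses."""
    T1 = pose_to_matrix(*pose_first)    # first coordinate system (first time)
    T2 = pose_to_matrix(*pose_second)   # second coordinate system (second time)
    T = np.linalg.inv(T2) @ T1          # maps first-time coords to second-time coords
    pts = np.c_[points_first, np.ones(len(points_first))]
    return (pts @ T.T)[:, :2]

# Assumed example: the device moved 0.5 m forward between the two times.
points = np.array([[10.0, 0.0], [8.0, 1.5]])   # e.g., LiDAR returns at the first time
print(compensate_to_second_time(points, (0.0, 0.0, 0.0), (0.5, 0.0, 0.0)))
# points shift by -0.5 m along x, as expected for a forward-moving device
```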
- the embodiment of the present application also provides a storage medium, where the storage medium stores instructions that, when run on a computer, cause the computer to execute the method of any one of the embodiments shown in FIGS. 1 to 10, or the method of the embodiment shown in FIG. 19 above.
- An embodiment of the present application also provides a chip for executing instructions.
- the chip is used to execute the method of any one of the embodiments shown in FIGS. 1 to 10; or, the chip is used to execute the method of the embodiment shown in FIG. 19.
- An embodiment of the present application further provides a program product, where the program product includes a computer program stored in a storage medium; at least one processor can read the computer program from the storage medium, and when the at least one processor executes the computer program, the method of any one of the embodiments shown in FIGS. 1 to 10, or the method of the embodiment shown in FIG. 19, may be implemented.
- “at least one” refers to one or more, and “multiple” refers to two or more.
- “and/or” describes the association relationship of associated objects, indicating that three relationships may exist; for example, A and/or B may mean: A exists alone, both A and B exist, or B exists alone, where A and B may be singular or plural.
- the character “/” generally indicates that the associated objects before and after are in an “or” relationship; in the formula, the character “/” indicates that the associated objects before and after are in a “division” relationship.
- “at least one of the following items” or similar expressions refers to any combination of these items, including a single item or any combination of plural items.
- for example, at least one of a, b, or c can mean: a; b; c; a and b; a and c; b and c; or a, b, and c, where a, b, and c can each be single or multiple.
- the size of the sequence numbers of the foregoing processes does not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present disclosure.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Aviation & Aerospace Engineering (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Electromagnetism (AREA)
- Arrangements For Transmission Of Measured Signals (AREA)
- Traffic Control Systems (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Optical Radar Systems And Details Thereof (AREA)
- Electric Clocks (AREA)
Abstract
Description
Claims (29)
- A sensor data processing method, comprising: acquiring first target data of a first sensor of a smart device at a first time; acquiring second target data of a second sensor of the smart device at a second time, wherein the first time and the second time are different times under a clock of the smart device; acquiring a first pose of the smart device at the first time and a second pose of the smart device at the second time, wherein the first pose and the second pose are different; and performing compensation processing on the first target data according to the first pose and the second pose to obtain compensation data of the first sensor at the second time.
- The method according to claim 1, wherein the performing compensation processing on the first target data according to the first pose and the second pose to obtain the compensation data of the first sensor at the second time comprises: determining a first coordinate system of the first sensor at the first time according to the first pose; determining a second coordinate system of the first sensor at the second time according to the second pose; and performing compensation processing on the first target data according to the first coordinate system and the second coordinate system to obtain the compensation data of the first sensor at the second time.
- The method according to claim 2, wherein the acquiring the first pose of the smart device at the first time and the second pose at the second time comprises: respectively acquiring pose detection data detected at multiple times by a sensor with a pose detection function set on the smart device, wherein each of the multiple times is a time under the clock of the smart device; generating a pose queue of the smart device according to the pose detection data; and determining the first pose and the second pose according to the pose queue of the smart device, the first time, and the second time.
- The method according to claim 3, wherein the determining the first pose and the second pose according to the pose queue of the smart device, the first time, and the second time comprises: in response to the pose queue not including a pose at the first time, performing compensation processing on the pose queue of the smart device according to the first time to obtain the first pose; and/or, in response to the pose queue not including a pose at the second time, performing compensation processing on the pose queue of the smart device according to the second time to obtain the second pose.
- The method according to any one of claims 1-4, wherein before the acquiring the first target data of the first sensor of the smart device at the first time, the method further comprises: determining, according to difference information between a clock of the first sensor and the clock of the smart device, the first time that corresponds, under the clock of the smart device, to a third time, wherein the third time is used to identify the time at which the first sensor detects first raw data corresponding to the first target data, and the third time is a time under the clock of the first sensor.
- The method according to claim 5, wherein the acquiring the first target data of the first sensor of the smart device at the first time comprises: receiving the first raw data reported by the first sensor, wherein the first raw data comprises multiple pieces of sub-data, each piece of sub-data has a corresponding detection time, and the third time is a reference time for the detection times of the multiple pieces of sub-data comprised in the first raw data; and acquiring the first target data of the first sensor at the first time according to the pose of the smart device at the detection time of each piece of sub-data and each piece of sub-data.
- The method according to claim 6, wherein the acquiring the first target data of the first sensor at the first time according to the pose of the smart device at the detection time of each piece of sub-data and each piece of sub-data comprises: determining a coordinate system of the first sensor at the detection time of each piece of sub-data according to the pose of the smart device at that detection time; respectively determining, according to the coordinate system of the first sensor at the detection time of each piece of sub-data other than at the third time and the coordinate system of the first sensor at the third time, the sub-data corresponding at the third time to each piece of sub-data whose detection time is not the third time; and integrating the sub-data corresponding at the third time to obtain the first target data of the first sensor at the first time.
- The method according to any one of claims 5-7, wherein before the performing compensation processing on the first target data according to the first pose and the second pose to obtain the compensation data of the first sensor at the second time, the method further comprises: receiving second raw data reported by a third sensor, wherein the third sensor is of the same type as the first sensor, the second raw data comprises multiple pieces of sub-data, each piece of sub-data has a corresponding detection time, and the third time is a reference time for the detection times of the multiple pieces of sub-data comprised in the second raw data; and acquiring third target data of the third sensor at the first time according to the pose of the smart device at the detection time of each piece of sub-data of the second raw data and each piece of sub-data of the second raw data; and wherein the performing compensation processing on the first target data according to the first pose and the second pose to obtain the compensation data of the first sensor at the second time comprises: performing compensation processing on the first target data and the third target data according to the first pose and the second pose to obtain the compensation data of the first sensor at the second time and compensation data of the third sensor at the second time.
- The method according to any one of claims 5-8, wherein before the determining, according to the difference information between the clock of the first sensor and the clock of the smart device, the first time that corresponds to the third time under the clock of the smart device, the method further comprises: determining the difference information between the clock of the first sensor and the clock of the smart device according to a clock error between a Global Positioning System (GPS) clock and the clock of the smart device.
- The method according to any one of claims 1-9, wherein before the acquiring the second target data of the second sensor of the smart device at the second time, the method further comprises: determining, according to difference information between a clock of the second sensor and the clock of the smart device, the second time that corresponds, under the clock of the smart device, to a fourth time carried when the second sensor reports the second target data, wherein the fourth time is used to identify the detection time of the second target data, and the fourth time is a time under the clock of the second sensor.
- The method according to claim 10, wherein before the determining the second time that corresponds, under the clock of the smart device, to the fourth time carried when the second sensor reports the second target data, the method further comprises: using the second sensor to capture multiple video frames of a stopwatch of the smart device; and comparing and analyzing the time information at which each of the video frames was captured against the time information displayed by the stopwatch in each of the video frames, to obtain the difference information between the clock of the second sensor and the clock of the smart device.
- The method according to any one of claims 1-11, wherein the second sensor is a camera, and the first sensor is a LiDAR or a millimeter-wave radar.
- A sensor data processing apparatus, comprising: a first acquisition module, configured to acquire first target data of a first sensor of a smart device at a first time; a second acquisition module, configured to acquire second target data of a second sensor of the smart device at a second time, wherein the first time and the second time are different times under a clock of the smart device; a third acquisition module, configured to acquire a first pose of the smart device at the first time and a second pose of the smart device at the second time, wherein the first pose and the second pose are different; and a compensation module, configured to perform compensation processing on the first target data according to the first pose and the second pose to obtain compensation data of the first sensor at the second time.
- The apparatus according to claim 13, wherein the compensation module is configured to: determine a first coordinate system of the first sensor at the first time according to the first pose; determine a second coordinate system of the first sensor at the second time according to the second pose; and perform compensation processing on the first target data according to the first coordinate system and the second coordinate system to obtain the compensation data of the first sensor at the second time.
- The apparatus according to claim 14, wherein the third acquisition module is configured to: respectively acquire pose detection data detected at multiple times by a sensor with a pose detection function set on the smart device, wherein each of the multiple times is a time under the clock of the smart device; generate a pose queue of the smart device according to the pose detection data; and determine the first pose and the second pose according to the pose queue of the smart device, the first time, and the second time.
- The apparatus according to claim 15, wherein the third acquisition module is configured to: in response to the pose queue not including a pose at the first time, perform compensation processing on the pose queue of the smart device according to the first time to obtain the first pose; and/or, in response to the pose queue not including a pose at the second time, perform compensation processing on the pose queue of the smart device according to the second time to obtain the second pose.
- The apparatus according to any one of claims 13-16, further comprising: a first determination module, configured to determine, according to difference information between a clock of the first sensor and the clock of the smart device, the first time that corresponds, under the clock of the smart device, to a third time, wherein the third time is used to identify the time at which the first sensor detects first raw data corresponding to the first target data, and the third time is a time under the clock of the first sensor.
- The apparatus according to claim 17, wherein the first acquisition module is configured to: receive the first raw data reported by the first sensor, wherein the first raw data comprises multiple pieces of sub-data, each piece of sub-data has a corresponding detection time, and the third time is a reference time for the detection times of the multiple pieces of sub-data comprised in the first raw data; and acquire the first target data of the first sensor at the first time according to the pose of the smart device at the detection time of each piece of sub-data and each piece of sub-data.
- The apparatus according to claim 18, wherein the first acquisition module is configured to: determine a coordinate system of the first sensor at the detection time of each piece of sub-data according to the pose of the smart device at that detection time; respectively determine, according to the coordinate system of the first sensor at the detection time of each piece of sub-data other than at the third time and the coordinate system of the first sensor at the third time, the sub-data corresponding at the third time to each piece of sub-data whose detection time is not the third time; and integrate the sub-data corresponding at the third time to obtain the first target data of the first sensor at the first time.
- The apparatus according to any one of claims 17-19, further comprising: a receiving module, configured to receive second raw data reported by a third sensor, wherein the third sensor is of the same type as the first sensor, the second raw data comprises multiple pieces of sub-data, each piece of sub-data has a corresponding detection time, and the third time is a reference time for the detection times of the multiple pieces of sub-data comprised in the second raw data; and a fourth acquisition module, configured to acquire third target data of the third sensor at the first time according to the pose of the smart device at the detection time of each piece of sub-data of the second raw data and each piece of sub-data of the second raw data; wherein the compensation module is configured to perform compensation processing on the first target data and the third target data according to the first pose and the second pose to obtain the compensation data of the first sensor at the second time and compensation data of the third sensor at the second time.
- The apparatus according to any one of claims 17-20, further comprising: a fifth acquisition module, configured to, before the first determination module determines the first time that corresponds to the third time under the clock of the smart device according to the difference information between the clock of the first sensor and the clock of the smart device, acquire the difference information between the clock of the first sensor and the clock of the smart device according to a clock error between a Global Positioning System (GPS) clock and the clock of the smart device.
- The apparatus according to any one of claims 13-21, further comprising: a second determination module, configured to, before the second acquisition module acquires the second target data of the second sensor of the smart device at the second time, determine, according to difference information between a clock of the second sensor and the clock of the smart device, the second time that corresponds, under the clock of the smart device, to a fourth time carried when the second sensor reports the second target data, wherein the fourth time is used to identify the detection time of the second target data, and the fourth time is a time under the clock of the second sensor.
- The apparatus according to claim 22, further comprising: a shooting module, configured to use the second sensor to capture multiple video frames of a stopwatch of the smart device; and an analysis module, configured to compare and analyze the time information at which each of the video frames was captured against the time information displayed by the stopwatch in each of the video frames, to obtain the difference information between the clock of the second sensor and the clock of the smart device.
- The apparatus according to any one of claims 13-23, wherein the second sensor is a camera, and the first sensor is a LiDAR or a millimeter-wave radar.
- An intelligent driving control method, comprising: respectively acquiring detection data of a first sensor and a second sensor set on a smart device, wherein the detection data is obtained by using the sensor data processing method according to any one of claims 1-12; and performing intelligent driving control on the smart device according to the detection data.
- An intelligent driving control apparatus, comprising: an acquisition module, configured to respectively acquire detection data of a first sensor and a second sensor set on a smart device, wherein the detection data is obtained by using the sensor data processing method according to any one of claims 1-12; and an intelligent driving control module, configured to perform intelligent driving control on the smart device according to the detection data.
- An electronic device, comprising: a memory, configured to store computer instructions; and a processor, configured to call and execute the computer instructions in the memory to perform the method steps according to any one of claims 1-12.
- An intelligent driving system, comprising: a communicatively connected sensor, the electronic device according to claim 27, and the intelligent driving control apparatus according to claim 26.
- A readable storage medium, wherein a computer program is stored in the readable storage medium, and the computer program is used to perform the method steps according to any one of claims 1-12; or, the computer program is used to perform the method steps according to claim 25.
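- As a non-authoritative reading aid for claims 3, 4, 15, and 16, the following Python sketch shows one possible compensation of a pose queue that lacks an entry at the queried time; the claims do not prescribe a specific scheme, so the linear interpolation, the queue layout, and all names here are assumptions.

```python
from bisect import bisect_left

def interpolate_pose(pose_queue, t):
    """Given a time-sorted pose queue [(time, x, y, yaw), ...] under the
    smart device's clock, return a pose at time t, linearly interpolating
    when the queue has no entry at exactly t (one possible 'compensation
    processing' for the missing pose)."""
    times = [entry[0] for entry in pose_queue]
    i = bisect_left(times, t)
    if i < len(times) and times[i] == t:   # exact hit: no compensation needed
        return pose_queue[i][1:]
    if i == 0 or i == len(times):          # outside the queue: cannot interpolate
        raise ValueError("time outside pose queue; extrapolation not attempted")
    (t0, x0, y0, yaw0), (t1, x1, y1, yaw1) = pose_queue[i - 1], pose_queue[i]
    a = (t - t0) / (t1 - t0)
    return (x0 + a * (x1 - x0), y0 + a * (y1 - y0), yaw0 + a * (yaw1 - yaw0))

# Assumed queue from a pose-detection sensor (e.g., inertial navigation):
queue = [(0.00, 0.0, 0.0, 0.00), (0.10, 1.0, 0.0, 0.05)]
print(interpolate_pose(queue, 0.04))   # pose compensated for t = 0.04 s
```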
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020217016721A KR20210087495A (en) | 2019-06-25 | 2020-02-26 | Sensor data processing methods, devices, electronic devices and systems |
JP2021533178A JP7164721B2 (en) | 2019-06-25 | 2020-02-26 | Sensor data processing method, device, electronic device and system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910556258.2A CN112214009B (en) | 2019-06-25 | 2019-06-25 | Sensor data processing method and device, electronic equipment and system |
CN201910556258.2 | 2019-06-25 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020258901A1 true WO2020258901A1 (en) | 2020-12-30 |
Family
ID=74048283
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/076813 WO2020258901A1 (en) | 2019-06-25 | 2020-02-26 | Method and apparatus for processing data of sensor, electronic device, and system |
Country Status (4)
Country | Link |
---|---|
JP (1) | JP7164721B2 (en) |
KR (1) | KR20210087495A (en) |
CN (1) | CN112214009B (en) |
WO (1) | WO2020258901A1 (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112880674A (en) * | 2021-01-21 | 2021-06-01 | 深圳市镭神智能系统有限公司 | Positioning method, device and equipment of driving equipment and storage medium |
CN113033483A (en) * | 2021-04-20 | 2021-06-25 | 北京百度网讯科技有限公司 | Method and device for detecting target object, electronic equipment and storage medium |
CN113724303A (en) * | 2021-09-07 | 2021-11-30 | 广州文远知行科技有限公司 | Point cloud and image matching method and device, electronic equipment and storage medium |
CN115167612A (en) * | 2022-07-14 | 2022-10-11 | 北京中科心研科技有限公司 | Wall time and supplementary packet method, device and medium for synchronizing data |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112902951A (en) * | 2021-01-21 | 2021-06-04 | 深圳市镭神智能系统有限公司 | Positioning method, device and equipment of driving equipment and storage medium |
CN114520855B (en) * | 2021-12-31 | 2024-03-15 | 广州文远知行科技有限公司 | Image frame rendering method and device based on multi-module data and storage medium |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5906655A (en) * | 1997-04-02 | 1999-05-25 | Caterpillar Inc. | Method for monitoring integrity of an integrated GPS and INU system |
CN103940434A (en) * | 2014-04-01 | 2014-07-23 | 西安交通大学 | Real-time lane line detecting system based on monocular vision and inertial navigation unit |
CN104501814A (en) * | 2014-12-12 | 2015-04-08 | 浙江大学 | Attitude and position estimation method based on vision and inertia information |
CN105745604A (en) * | 2013-11-03 | 2016-07-06 | 微软技术许可有限责任公司 | Sensor data time alignment |
CN106546238A (en) * | 2016-10-26 | 2017-03-29 | 北京小鸟看看科技有限公司 | Wearable device and the method that user's displacement is determined in wearable device |
CN107462892A (en) * | 2017-07-28 | 2017-12-12 | 深圳普思英察科技有限公司 | Mobile robot synchronous superposition method based on more sonacs |
CN107976193A (en) * | 2017-11-21 | 2018-05-01 | 出门问问信息科技有限公司 | A kind of pedestrian's flight path estimating method, device, flight path infer equipment and storage medium |
CN109270534A (en) * | 2018-05-07 | 2019-01-25 | 西安交通大学 | A kind of intelligent vehicle laser sensor and camera online calibration method |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008003779A (en) * | 2006-06-21 | 2008-01-10 | Hitachi Ltd | Measurement data processing apparatus of preventive safety car |
JP2009222438A (en) | 2008-03-13 | 2009-10-01 | Toyota Motor Corp | Positioning device for movable body |
JP5416026B2 (en) | 2010-04-23 | 2014-02-12 | 本田技研工業株式会社 | Vehicle periphery monitoring device |
CN103353304B (en) * | 2013-06-25 | 2016-02-03 | 深圳市宇恒互动科技开发有限公司 | A kind ofly three-dimensional inertial motion is sensed to the method and device that compensate |
CN104112363B (en) * | 2014-07-04 | 2016-05-25 | 西安交通大学 | Many sensing datas space-time synchronous method and many sensing datas of road vehicular collecting system |
DE102016212326A1 (en) * | 2016-07-06 | 2018-01-11 | Robert Bosch Gmbh | Method for processing sensor data for a position and / or orientation of a vehicle |
JP6787102B2 (en) | 2016-12-14 | 2020-11-18 | 株式会社デンソー | Object detection device, object detection method |
US10145945B2 (en) * | 2017-01-11 | 2018-12-04 | Toyota Research Institute, Inc. | Systems and methods for automatically calibrating a LIDAR using information from a secondary vehicle |
US10599931B2 (en) * | 2017-08-21 | 2020-03-24 | 2236008 Ontario Inc. | Automated driving system that merges heterogenous sensor data |
CN108168918B (en) * | 2017-12-25 | 2019-12-27 | 中铁第四勘察设计院集团有限公司 | Synchronous automatic control system and method for synchronous measurement of automatic track measuring vehicle |
CN108957466B (en) * | 2018-04-18 | 2022-01-25 | 广东宝乐机器人股份有限公司 | Radar data compensation method, device, equipment and storage medium for mobile robot |
CN109218562B (en) * | 2018-09-07 | 2021-04-27 | 百度在线网络技术(北京)有限公司 | Clock synchronization method, device, equipment, storage medium and vehicle |
- 2019
- 2019-06-25 CN CN201910556258.2A patent/CN112214009B/en active Active
- 2020
- 2020-02-26 WO PCT/CN2020/076813 patent/WO2020258901A1/en active Application Filing
- 2020-02-26 JP JP2021533178A patent/JP7164721B2/en active Active
- 2020-02-26 KR KR1020217016721A patent/KR20210087495A/en not_active Application Discontinuation
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5906655A (en) * | 1997-04-02 | 1999-05-25 | Caterpillar Inc. | Method for monitoring integrity of an integrated GPS and INU system |
CN105745604A (en) * | 2013-11-03 | 2016-07-06 | 微软技术许可有限责任公司 | Sensor data time alignment |
CN103940434A (en) * | 2014-04-01 | 2014-07-23 | 西安交通大学 | Real-time lane line detecting system based on monocular vision and inertial navigation unit |
CN104501814A (en) * | 2014-12-12 | 2015-04-08 | 浙江大学 | Attitude and position estimation method based on vision and inertia information |
CN106546238A (en) * | 2016-10-26 | 2017-03-29 | 北京小鸟看看科技有限公司 | Wearable device and the method that user's displacement is determined in wearable device |
CN107462892A (en) * | 2017-07-28 | 2017-12-12 | 深圳普思英察科技有限公司 | Mobile robot synchronous superposition method based on more sonacs |
CN107976193A (en) * | 2017-11-21 | 2018-05-01 | 出门问问信息科技有限公司 | A kind of pedestrian's flight path estimating method, device, flight path infer equipment and storage medium |
CN109270534A (en) * | 2018-05-07 | 2019-01-25 | 西安交通大学 | A kind of intelligent vehicle laser sensor and camera online calibration method |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112880674A (en) * | 2021-01-21 | 2021-06-01 | 深圳市镭神智能系统有限公司 | Positioning method, device and equipment of driving equipment and storage medium |
CN113033483A (en) * | 2021-04-20 | 2021-06-25 | 北京百度网讯科技有限公司 | Method and device for detecting target object, electronic equipment and storage medium |
CN113033483B (en) * | 2021-04-20 | 2024-02-02 | 北京百度网讯科技有限公司 | Method, device, electronic equipment and storage medium for detecting target object |
CN113724303A (en) * | 2021-09-07 | 2021-11-30 | 广州文远知行科技有限公司 | Point cloud and image matching method and device, electronic equipment and storage medium |
CN113724303B (en) * | 2021-09-07 | 2024-05-10 | 广州文远知行科技有限公司 | Point cloud and image matching method and device, electronic equipment and storage medium |
CN115167612A (en) * | 2022-07-14 | 2022-10-11 | 北京中科心研科技有限公司 | Wall time and supplementary packet method, device and medium for synchronizing data |
Also Published As
Publication number | Publication date |
---|---|
JP7164721B2 (en) | 2022-11-01 |
CN112214009B (en) | 2022-07-26 |
CN112214009A (en) | 2021-01-12 |
KR20210087495A (en) | 2021-07-12 |
JP2022513780A (en) | 2022-02-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020258901A1 (en) | Method and apparatus for processing data of sensor, electronic device, and system | |
US10789771B2 (en) | Method and apparatus for fusing point cloud data | |
CN108900272B (en) | Sensor data acquisition method and system and packet loss judgment method | |
US20200372672A1 (en) | Image-based localization | |
WO2019119282A1 (en) | Method and device for associating image and location information, and movable platform | |
JP2022509302A (en) | Map generation method, operation control method, device, electronic device and system | |
CN109100730B (en) | Multi-vehicle cooperative rapid map building method | |
US20170337701A1 (en) | Method and system for 3d capture based on structure from motion with simplified pose detection | |
IL269560A (en) | Distributed device mapping | |
CN113160327A (en) | Method and system for realizing point cloud completion | |
US20220229759A1 (en) | Method, device, and system for simulation test | |
JPWO2020039937A1 (en) | Position coordinate estimation device, position coordinate estimation method and program | |
US20220163976A1 (en) | Systems and methods for creating and using risk profiles for fleet management of a fleet of vehicles | |
Zingoni et al. | Real-time 3D reconstruction from images taken from an UAV | |
CN113256683B (en) | Target tracking method and related equipment | |
CN113129382A (en) | Method and device for determining coordinate conversion parameters | |
WO2019080879A1 (en) | Data processing method, computer device, and storage medium | |
CN113378605B (en) | Multi-source information fusion method and device, electronic equipment and storage medium | |
WO2023213083A1 (en) | Object detection method and apparatus and driverless car | |
KR20190134303A (en) | Apparatus and method for image recognition | |
CN116684740A (en) | Perception training data generation method, device, computer equipment and storage medium | |
CN115661014A (en) | Point cloud data processing method and device, electronic equipment and storage medium | |
KR20210008647A (en) | Apparatus and method for detecting vehicle type, speed and traffic using radar device and image processing | |
US20220373661A1 (en) | Sensor triggering to synchronize sensor data | |
KR102019990B1 (en) | Method and apparatus for estimating vehicle position based on visible light communication that considering motion blur compensation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20831377 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 20217016721 Country of ref document: KR Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2021533178 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 20831377 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 12/09/2022) |
|