CN116246029A - Data synchronization method, device, terminal equipment and computer readable storage medium - Google Patents

Data synchronization method, device, terminal equipment and computer readable storage medium

Info

Publication number
CN116246029A
CN116246029A (application CN202211656935.6A)
Authority
CN
China
Prior art keywords
point
radar
camera
image
mapping
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211656935.6A
Other languages
Chinese (zh)
Inventor
王永
张龙洋
王亚军
王邓江
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Wanji Iov Technology Co ltd
Original Assignee
Suzhou Wanji Iov Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Wanji Iov Technology Co ltd filed Critical Suzhou Wanji Iov Technology Co ltd
Priority to CN202211656935.6A priority Critical patent/CN116246029A/en
Publication of CN116246029A publication Critical patent/CN116246029A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01DMEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D21/00Measuring or testing not otherwise provided for
    • G01D21/02Measuring two or more variables by means not covered by a single other subclass
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29Geographical information databases
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05Geographic models
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/0104Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0108Measuring and analyzing of parameters relative to traffic conditions based on the source of data
    • G08G1/0116Measuring and analyzing of parameters relative to traffic conditions based on the source of data from roadside infrastructure, e.g. beacons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10032Satellite or aerial image; Remote sensing
    • G06T2207/10044Radar image

Abstract

The application is applicable to the technical field of data processing, and provides a data synchronization method, a device, terminal equipment and a computer readable storage medium, which are applied to a detection system, wherein the detection system comprises a camera and a radar, and the method comprises the following steps: acquiring a first mapping point corresponding to a first pixel point in a first image shot by the camera, wherein the mapping point represents three-dimensional geographic coordinates in an offline point cloud map, and the area covered by the offline point cloud map comprises a shooting area of the camera and a detection area of the radar; acquiring a first target point corresponding to the first mapping point, wherein the first target point is a three-dimensional coordinate point in a first point cloud image detected by the radar, and the detection time of the first point cloud image corresponds to the shooting time of the first image; synchronizing the first pixel point with the first target point. By the method, the success rate of data synchronization of various sensors can be effectively improved.

Description

Data synchronization method, device, terminal equipment and computer readable storage medium
Technical Field
The application belongs to the technical field of data processing, and particularly relates to a data synchronization method, a data synchronization device, terminal equipment and a computer readable storage medium.
Background
Roadside sensing systems typically include a variety of sensors, such as cameras, lidar, millimeter-wave radar, and the like. The roadside sensing system uses these sensors to cooperatively detect road traffic conditions so as to ensure the smoothness and safety of road traffic.
In a roadside sensing system, the data of the plurality of sensors need to be unified into one coordinate system, that is, the data of the plurality of sensors are spatially synchronized. However, because the detection range of each sensor is limited, the detection ranges of different sensors may not overlap, and in this case data synchronization easily fails.
Disclosure of Invention
The embodiment of the application provides a data synchronization method, a data synchronization device, terminal equipment and a computer readable storage medium, which can effectively improve the success rate of data synchronization of various sensors.
In a first aspect, an embodiment of the present application provides a data synchronization method applied to a detection system, where the detection system includes a camera and a radar, the method includes:
acquiring a first mapping point corresponding to a first pixel point in a first image shot by the camera, wherein the mapping point represents three-dimensional geographic coordinates in an offline point cloud map, and the area covered by the offline point cloud map comprises a shooting area of the camera and a detection area of the radar;
acquiring a first target point corresponding to the first mapping point, wherein the first target point is a three-dimensional coordinate point in a first point cloud image detected by the radar, and the detection time of the first point cloud image corresponds to the shooting time of the first image;
synchronizing the first pixel point with the first target point.
In the embodiment of the application, the offline point cloud map and the camera detection result are used to realize the first spatial synchronization, and then the camera detection result and the multi-line lidar are used to realize the second spatial synchronization. Because the offline point cloud map covers a wider area, data synchronization failures caused by non-overlapping sensor detection ranges can be effectively avoided. By this method, the success rate of data synchronization of various sensors can be effectively improved. In addition, because the offline point cloud map is generated in advance, the time for generating the point cloud map is saved in the data synchronization process, which improves data synchronization efficiency.
Optionally, the data acquisition precision of the offline point cloud map is higher than the data acquisition precision of the camera and the data acquisition precision of the radar in the detection system.
The offline point cloud map with higher data acquisition precision can compensate for the loss of acquired data caused by the lower data acquisition precision of the sensors in the detection system, thereby avoiding data synchronization failures caused by such data loss.
In a possible implementation manner of the first aspect, before acquiring the first mapping point corresponding to the first pixel point in the first image captured by the camera, the method further includes:
and establishing a first mapping relation, wherein the first mapping relation represents a corresponding relation between the three-dimensional geographic coordinates in the offline point cloud map and pixel points in the image shot by the camera.
In a possible implementation manner of the first aspect, the establishing a first mapping relationship includes:
converting each first geographic coordinate in the offline point cloud map to a radar coordinate system of the radar to obtain a three-dimensional first coordinate point;
mapping the first coordinate point to the pixel coordinate system according to the calibration parameters of the camera to obtain a two-dimensional second pixel point;
and establishing the first mapping relation according to the first geographic coordinates and the second pixel points.
In a possible implementation manner of the first aspect, the first geographic coordinate includes a first longitude, a first latitude, and a first altitude;
converting the first geographic coordinate to a radar coordinate system of the radar to obtain a three-dimensional first coordinate point, wherein the method comprises the following steps of:
converting the first longitude, the first latitude and the first altitude into a reference coordinate system to obtain a three-dimensional second coordinate point, wherein the origin of the reference coordinate system is determined according to the second longitude, the second latitude and the second altitude of the radar;
and converting the second coordinate point into the radar coordinate system according to the north offset angle of the radar to obtain the first coordinate point, wherein the north offset angle of the radar represents an included angle between a coordinate axis of the radar and a geographic north pole.
In a possible implementation manner of the first aspect, before acquiring the first mapping relationship, the method further includes:
lane line data are obtained, and the lane line data represent three-dimensional geographic coordinates on lane lines in a shooting area of the camera;
projecting the lane line data into a second image shot by the camera to obtain a two-dimensional third pixel point;
if the position of the third pixel point in the second image is inconsistent with the position of the lane line, recalibrating the parameters of the camera until the position of the third pixel point in the second image is consistent with the position of the lane line;
and if the position of the third pixel point in the second image is consistent with the position of the lane line, determining the current parameter of the camera as the calibration parameter.
In the embodiment of the application, the camera parameters are calibrated through the three-dimensional lane line data, and the reliability of the camera calibration parameters can be improved through the above calibration mode because the lane line data are accurate.
In a possible implementation manner of the first aspect, the synchronizing the first pixel point with the first target point includes:
converting the first pixel point into the radar coordinate system to obtain a second target point;
calculating a spatial distance between the first target point and the second target point;
and if the spatial distance is within a preset range, marking the second target point in the first point cloud image.
In a possible implementation manner of the first aspect, after calculating the spatial distance between the first target point and the second target point, the method further includes:
and if the spatial distance is not in the preset range, re-selecting the first pixel point from the first image until a first target point corresponding to the re-selected first pixel point is acquired.
In a second aspect, an embodiment of the present application provides a data synchronization device, where the detection system includes a camera and a radar, and the device includes:
a first obtaining unit, configured to obtain a first mapping point corresponding to a first pixel point in a first image captured by the camera, where the mapping point represents three-dimensional geographic coordinates in an offline point cloud map, and an area covered by the offline point cloud map includes a capturing area of the camera and a detection area of the radar;
a second obtaining unit, configured to obtain a first target point corresponding to the first mapping point, where the first target point is a three-dimensional coordinate point in a first point cloud image detected by the radar, and a detection time of the first point cloud image corresponds to a shooting time of the first image;
and the data synchronization unit is used for synchronizing the first pixel point and the first target point.
In a third aspect, an embodiment of the present application provides a terminal device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the data synchronization method according to any one of the first aspects when the processor executes the computer program.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium storing a computer program which, when executed by a processor, implements a data synchronization method as in any one of the first aspects above.
In a fifth aspect, embodiments of the present application provide a computer program product, which, when run on a terminal device, causes the terminal device to perform the data synchronization method according to any one of the first aspects above.
It will be appreciated that the advantages of the second to fifth aspects may be found in the relevant description of the first aspect, and are not described here again.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required for the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic flow chart of a data synchronization method provided in an embodiment of the present application;
FIG. 2 is a schematic diagram of a data synchronization effect provided by an embodiment of the present application;
FIG. 3 is a block diagram of a data synchronization device according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a terminal device provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system configurations, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in this specification and the appended claims, the term "if" may be interpreted as "when", "upon", "in response to determining" or "in response to detecting", depending on the context. Similarly, the phrase "if it is determined" or "if a [described condition or event] is detected" may be interpreted, depending on the context, to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
In addition, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used merely to distinguish between descriptions and are not to be construed as indicating or implying relative importance.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise.
Roadside sensing systems typically include a variety of sensors, such as cameras, lidar, millimeter-wave radar, and the like. The roadside sensing system uses these sensors to cooperatively detect road traffic conditions so as to ensure the smoothness and safety of road traffic.
In a roadside sensing system, the data of the plurality of sensors need to be unified into one coordinate system, that is, the data of the plurality of sensors are spatially synchronized. However, because the detection range of each sensor is limited, the detection ranges of different sensors may not overlap, and in this case data synchronization easily fails.
In order to solve the above problems, embodiments of the present application provide a data synchronization method, a data synchronization device, a terminal device, and a computer readable storage medium. In the embodiment of the application, the offline high-precision point cloud map and the camera detection result are used to realize the first spatial synchronization, and then the camera detection result and the multi-line lidar are used to realize the second spatial synchronization. Because the offline high-precision point cloud map covers a wider area, data synchronization failures caused by non-overlapping sensor detection ranges can be effectively avoided. By this method, the success rate of data synchronization of various sensors can be effectively improved.
In the following embodiments, a method of data synchronization between different sensors is described by taking a camera and a radar in a detection system as an example. The camera may be a 3D camera, a 2D camera, an RGB camera, a depth camera, or the like. The radar may be a lidar, a multi-line lidar or a millimeter wave radar, etc.
It can be appreciated that the method in the embodiment of the present application may be applied not only to data synchronization between a camera and a radar, but also to data synchronization between other sensors, for example, data synchronization between two radars or data synchronization between two cameras. Secondly, the detection system according to the embodiments of the present application may be a roadside sensing system as described in the background above, a vehicle-mounted sensing system, or another detection system, which is not specifically limited. In addition, the method in the embodiment of the application may be applied to data synchronization among a plurality of sensors in the same detection system, or to data synchronization among a plurality of sensors in different detection systems. This is not limited here.
Referring to fig. 1, which is a schematic flow chart of a data synchronization method according to an embodiment of the present application, by way of example and not limitation, the method may include the following steps:
s101, acquiring a first mapping point corresponding to a first pixel point in a first image shot by the camera.
The offline point cloud map includes the three-dimensional geographic coordinates, such as longitude, latitude, and altitude, of the points in each of a plurality of regions.
In this embodiment of the present application, the mapping points represent three-dimensional geographic coordinates in an offline point cloud map, and the area covered by the offline point cloud map includes a shooting area of the camera and a detection area of the radar.
Alternatively, the offline point cloud map in the embodiment of the present application may be a high-precision point cloud map. It will be appreciated that "high-precision" herein refers to high data accuracy, relative to cameras and radars in the detection system to be synchronized. In other words, the data acquisition precision of the high-precision point cloud map is higher than the data acquisition precision of the camera and the data acquisition precision of the radar in the detection system. Through the high-precision point cloud map, acquired data loss caused by low data acquisition precision of a sensor in a detection system can be compensated, so that the occurrence of data synchronization failure caused by the acquired data loss is avoided.
A high-precision point cloud map may be generated in advance. In the data synchronization process, a pre-generated high-precision point cloud map is used as an offline point cloud map. In this way, the efficiency of data synchronization can be improved.
In some application scenarios, a high-precision point cloud map is collected within a target area by means of multiple sensors mounted on a map-collection vehicle. The sensors on the collection vehicle may include a lidar, a camera, a global navigation satellite system (Global Navigation Satellite System, GNSS), an inertial measurement unit (Inertial Measurement Unit, IMU), etc. The lidar is used to acquire point cloud data, the camera is used to acquire live-action images of the environment, and the GNSS and the IMU are used for positioning.
For example, in the traffic field, the features collected for a high-precision map mainly include lane lines, road traffic facilities (crosswalks, turn markings, traffic facilities, and the like), lane topology network data, and other features.
S102, acquiring a first target point corresponding to the first mapping point.
The first target point is a three-dimensional coordinate point in a first point cloud image detected by the radar, and the detection time of the first point cloud image corresponds to the shooting time of the first image.
In the embodiment of the application, the radar detects in real time to obtain a first point cloud image. The first point cloud image includes a plurality of three-dimensional coordinate points within a radar detection area. The three-dimensional coordinate points may include x, y, z coordinates.
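As an illustration of this step, the first target point can be obtained by a nearest-neighbour search over the first point cloud image once the first mapping point has been transformed into the radar coordinate system. The following is a minimal sketch, assuming a KD-tree search and a distance tolerance that are not specified in this embodiment:

# Hypothetical sketch of step S102: find the radar point closest to a first
# mapping point that has already been converted into the radar coordinate
# system (x_lm, y_lm, z_lm). The KD-tree search and the 0.5 m tolerance are
# assumptions, not taken from this embodiment.
import numpy as np
from scipy.spatial import cKDTree

def find_first_target_point(point_cloud_xyz, mapping_point_xyz, max_dist=0.5):
    """point_cloud_xyz: (N, 3) points of the first point cloud image."""
    tree = cKDTree(point_cloud_xyz)
    dist, idx = tree.query(np.asarray(mapping_point_xyz))  # nearest neighbour
    return point_cloud_xyz[idx] if dist <= max_dist else None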
S103, synchronizing the first pixel point with the first target point.
One implementation converts camera data into radar data. Specifically: the first pixel point is converted into the radar coordinate system to obtain a second target point; the spatial distance between the first target point and the second target point is calculated; if the spatial distance is within a preset range, the second target point is marked in the first point cloud image; if the spatial distance is not within the preset range, the first pixel point is reselected from the first image, and steps S101 to S103 are re-executed until a first target point corresponding to the reselected first pixel point is acquired.
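A minimal sketch of this camera-to-radar check is given below; the helper pixel_to_radar() (the pixel-to-radar-frame conversion derived from the first mapping relation) and the 1.0 m preset range are assumptions introduced for illustration:

# Hypothetical sketch of the camera-to-radar synchronization check described
# above. pixel_to_radar() stands in for the pixel -> radar-frame conversion
# built from the first mapping relation; its name and the 1.0 m preset range
# are assumptions.
import numpy as np

def synchronize_pixel_with_target(first_pixel, first_target_point,
                                  pixel_to_radar, preset_range=1.0):
    second_target_point = pixel_to_radar(first_pixel)        # convert the pixel
    spatial_distance = np.linalg.norm(np.asarray(second_target_point)
                                      - np.asarray(first_target_point))
    if spatial_distance <= preset_range:
        return second_target_point  # caller marks it in the first point cloud image
    return None                     # caller re-selects a first pixel point (S101-S103)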
Another implementation converts radar data into camera data. Specifically: the first target point is converted into the pixel coordinate system to obtain a target pixel point; the pixel distance between the target pixel point and the first pixel point is calculated; if the pixel distance is smaller than a preset threshold, the target pixel point is marked in the first image captured by the camera; if the pixel distance is greater than the preset threshold, the first pixel point is reselected from the first image, and steps S101 to S103 are re-executed until a first target point corresponding to the reselected first pixel point is acquired.
The above steps S101 to S103 describe the process of searching for the corresponding radar point cloud data from a pixel point in the image captured by the camera. It can be appreciated that another implementation of the data synchronization method provided in the embodiments of the present application is to search, from a point in the first point cloud image detected by the radar, for the corresponding pixel point in the image captured by the camera. Specifically, the method may include: acquiring a first mapping point corresponding to a first target point in the first point cloud image detected by the radar; acquiring a first pixel point corresponding to the first mapping point; and synchronizing the first target point with the first pixel point.
In the embodiment of the application, the offline high-precision point cloud map and the camera detection result are used to realize the first spatial synchronization, and then the camera detection result and the multi-line lidar are used to realize the second spatial synchronization. Because the offline high-precision point cloud map covers a wider area, data synchronization failures caused by non-overlapping sensor detection ranges can be effectively avoided. By this method, the success rate of data synchronization of various sensors can be effectively improved. In addition, because the offline point cloud map is generated in advance, the time for generating the point cloud map is saved in the data synchronization process, which improves data synchronization efficiency.
In some embodiments, the step S101 may include the following steps:
1. Acquiring a first mapping relation, wherein the first mapping relation represents a corresponding relation between the three-dimensional geographic coordinates in the offline point cloud map and the pixel points in the image shot by the camera.
2. Acquiring the first mapping point corresponding to the first pixel point according to the first mapping relation.
The first mapping relation in step 1 may be established in advance. Optionally, the establishing process of the first mapping relationship includes:
1) Converting each first geographic coordinate in the offline point cloud map to a radar coordinate system of the radar to obtain a three-dimensional first coordinate point.
2) Mapping the first coordinate point to the pixel coordinate system according to the calibration parameters of the camera to obtain a two-dimensional second pixel point.
3) Establishing the first mapping relation according to the first geographic coordinates and the second pixel points.
In the embodiment of the application, the longitude lon_0 (second longitude), latitude lat_0 (second latitude), altitude alt_0 (second altitude), and north offset angle of the radar position are known. The geographic coordinates in the offline point cloud map include a longitude lon_m (first longitude), a latitude lat_m (first latitude), and an altitude alt_m (first altitude). The north offset angle of the radar represents the included angle between a coordinate axis of the radar and geographic north. Geographic north refers to the intersection of the earth's axis of rotation and the earth's surface in the northern hemisphere. In practical applications, the north offset angle may be the angle between the x-axis of the radar and geographic north, or the angle between the y-axis of the radar and geographic north; the angle between the y-axis of the radar and geographic north is typically used.
In step 1), the first geographic coordinate is converted into the radar coordinate system of the radar to obtain a three-dimensional first coordinate point through the following steps: converting the longitude lon_m, latitude lat_m, and altitude alt_m to a reference coordinate system to obtain a three-dimensional second coordinate point (x, y, z), wherein the origin of the reference coordinate system is determined according to the longitude lon_0, latitude lat_0, and altitude alt_0 of the radar; and converting the second coordinate point (x, y, z) into the radar coordinate system according to the north offset angle of the radar to obtain the first coordinate point (x_lm, y_lm, z_lm).
Illustratively, the earth's semi-major axis is Ra = 6378137.0 and the earth's semi-minor axis is Rb = 6356752.3142. The longitude, latitude, and altitude (lon_m, lat_m, alt_m) of the high-precision map are converted to (x, y, z):
[equation images: x and y are computed from the longitude and latitude offsets (lon_m − lon_0, lat_m − lat_0) using the earth's semi-axes Ra and Rb]
z = alt_m − alt_0
Rotation matrices:
[equation images: the rotation matrices R_x, R_y, and R_z about the X-axis, Y-axis, and Z-axis]
Rotate = R_z × R_y × R_x
where R_x, R_y, and R_z are the rotation matrices about the X-axis, Y-axis, and Z-axis, respectively. Transforming (x, y, z) into the radar coordinate system yields the first coordinate point (x_lm, y_lm, z_lm):
(x_lm, y_lm, z_lm)^T = Rotate · (x, y, z)^T
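As a concrete illustration of step 1), a minimal sketch of the geographic-to-radar conversion is given below. The equirectangular local approximation and the single z-axis rotation by the north offset angle are assumptions; the exact expressions of this embodiment are given as equation images above and may additionally involve x- and y-axis rotations:

# Hypothetical sketch of step 1): geographic coordinates -> radar coordinate
# system. The equirectangular local approximation and the single z-axis
# rotation by the north offset angle are assumptions and may differ from the
# equations shown as images above.
import numpy as np

RA = 6378137.0  # earth's semi-major axis (m), as stated above

def geo_to_radar(lon_m, lat_m, alt_m, lon_0, lat_0, alt_0, north_offset_deg):
    # Reference coordinate system centred at the radar's position.
    x = np.radians(lon_m - lon_0) * RA * np.cos(np.radians(lat_0))
    y = np.radians(lat_m - lat_0) * RA
    z = alt_m - alt_0
    # Rotate about the z-axis by the north offset angle (angle between a radar
    # coordinate axis and geographic north).
    g = np.radians(north_offset_deg)
    rot_z = np.array([[np.cos(g), -np.sin(g), 0.0],
                      [np.sin(g),  np.cos(g), 0.0],
                      [0.0,        0.0,       1.0]])
    return rot_z @ np.array([x, y, z])  # (x_lm, y_lm, z_lm)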
2) According to the calibration parameters of the camera, the first coordinate point is mapped to the pixel coordinate system to obtain a two-dimensional second pixel point through the following steps: mapping the first coordinate point to the camera coordinate system to obtain a third coordinate point (x_cm, y_cm, z_cm); and mapping the third coordinate point to the pixel coordinate system to obtain the two-dimensional second pixel point (u, v).
Illustratively, the camera is calibrated in advance to obtain the intrinsic and extrinsic parameters of the camera, which include a rotation matrix, a translation matrix, distortion coefficients, and an intrinsic parameter matrix.
The third coordinate point is calculated by the following formula:
(x_cm, y_cm, z_cm)^T = R · (x_lm, y_lm, z_lm)^T + T
where R is the rotation matrix of the camera extrinsic parameters and T is the translation matrix of the camera extrinsic parameters. The pixel point (u', v') is then calculated by the following formula:
u' = f_x · x_cm / z_cm + c_x,  v' = f_y · y_cm / z_cm + c_y
where f_x and f_y are the focal lengths in the x and y directions in the camera intrinsic parameters, and c_x and c_y are the principal point coordinates in the camera intrinsic parameters, i.e., the coordinates of the image center. The second pixel point (u, v) is then calculated based on the distortion coefficients.
For example, when radial distortion occurs, the second pixel point may be calculated according to the following equation:
[equation image: radial distortion correction of (u', v') using the coefficients k_1, k_2, k_3 and the radial distance r]
where k_1, k_2, and k_3 are the radial distortion coefficients and r is the radial distance.
When tangential distortion occurs, the second pixel point may be calculated according to the following equation:
[equation image: tangential distortion correction of (u', v') using the coefficients p_1, p_2 and the radial distance r]
where p_1 and p_2 are the tangential distortion coefficients and r is the radial distance.
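A minimal sketch of step 2) is given below. It follows the common pinhole formulation in which the distortion coefficients are applied to the normalized camera coordinates (as in OpenCV's cv2.projectPoints); the exact ordering of this embodiment's equations, shown as images above, may differ:

# Hypothetical sketch of step 2): radar-frame point -> camera frame -> pixel,
# with radial (k1, k2, k3) and tangential (p1, p2) distortion applied to the
# normalized coordinates. This is the common pinhole model and is an
# assumption about the exact form of the equations shown as images above.
import numpy as np

def radar_point_to_pixel(p_lm, R, T, fx, fy, cx, cy, k1, k2, k3, p1, p2):
    x_cm, y_cm, z_cm = R @ np.asarray(p_lm) + np.asarray(T)   # camera extrinsics
    xn, yn = x_cm / z_cm, y_cm / z_cm                          # normalized coordinates
    r2 = xn * xn + yn * yn
    radial = 1.0 + k1 * r2 + k2 * r2**2 + k3 * r2**3           # radial distortion
    xd = xn * radial + 2 * p1 * xn * yn + p2 * (r2 + 2 * xn * xn)  # tangential
    yd = yn * radial + p1 * (r2 + 2 * yn * yn) + 2 * p2 * xn * yn
    u = fx * xd + cx                                            # camera intrinsics
    v = fy * yd + cy
    return u, v  # the two-dimensional second pixel point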
According to the above conversion process, a first mapping relationship between the first geographic coordinate and the second pixel point can be established. It will be appreciated that the first mapping relationship is used to represent a conversion relationship between the three-dimensional geographical coordinates and the two-dimensional pixel coordinates.
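Once every first geographic coordinate has been projected to a second pixel point, the first mapping relation can be stored for lookup. Representing it as a pixel-indexed KD-tree with a nearest-pixel query, as sketched below, is an assumption about one possible realization:

# Hypothetical sketch of building and querying the first mapping relation.
# Storing it as a pixel -> geographic-coordinate lookup with a nearest-pixel
# tolerance is an assumption; the embodiment only requires that the
# correspondence exist.
import numpy as np
from scipy.spatial import cKDTree

def build_first_mapping(geo_coords, geo_to_pixel):
    """geo_coords: list of (lon, lat, alt); geo_to_pixel: the projection above."""
    pixels = np.array([geo_to_pixel(c) for c in geo_coords])  # second pixel points
    return cKDTree(pixels), list(geo_coords)

def query_first_mapping_point(mapping, first_pixel, max_pixel_dist=3.0):
    tree, geo_coords = mapping
    dist, idx = tree.query(np.asarray(first_pixel))
    return geo_coords[idx] if dist <= max_pixel_dist else None  # first mapping point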
In some embodiments, before obtaining the first mapping relation, the method further comprises:
lane line data are obtained, and the lane line data represent three-dimensional geographic coordinates on lane lines in a shooting area of the camera; projecting the lane line data into a second image shot by the camera to obtain a two-dimensional third pixel point; if the position of the third pixel point in the second image is inconsistent with the position of the lane line, recalibrating the parameters of the camera until the position of the third pixel point in the second image is consistent with the position of the lane line; and if the position of the third pixel point in the second image is consistent with the position of the lane line, determining the current parameter of the camera as the calibration parameter.
In this embodiment of the present application, the lane line data may be acquired by a collection vehicle in the same manner as the high-precision point cloud map; for details, reference may be made to the acquisition of the high-precision point cloud map, which is not repeated here.
In the embodiment of the application, the camera parameters are calibrated through the three-dimensional lane line data, and the reliability of the camera calibration parameters can be improved through the above calibration mode because the lane line data are accurate.
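A minimal sketch of this calibration check is given below; project_geo_to_pixel() (projection of a lane-line geographic coordinate with the current camera parameters) and the 5-pixel tolerance are assumptions introduced for illustration:

# Hypothetical sketch of the lane-line calibration check: project lane-line
# geographic coordinates into the second image and compare them with the
# lane-line pixels detected in that image. project_geo_to_pixel() and the
# 5-pixel tolerance are assumptions.
import numpy as np

def calibration_is_consistent(lane_line_geo, detected_lane_pixels,
                              project_geo_to_pixel, tol=5.0):
    third_pixels = np.array([project_geo_to_pixel(p) for p in lane_line_geo])
    detected = np.asarray(detected_lane_pixels)
    # Distance from each projected (third) pixel point to the closest detected
    # lane-line pixel; if any exceeds the tolerance, recalibrate the camera.
    dists = np.min(np.linalg.norm(third_pixels[:, None, :] - detected[None, :, :],
                                  axis=-1), axis=1)
    return bool(np.all(dists <= tol))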
By way of example, refer to fig. 2, which is a schematic diagram of a data synchronization effect provided in an embodiment of the present application. As shown in fig. 2, the upper two images are images captured by the camera, and the lower two images are point clouds detected by the radar. With the data synchronization method provided in the embodiment of the present application, the pixel points on the target vehicle in the captured image are mapped into the point cloud image (such as the points in the boxes in the two lower images of fig. 2), so that data synchronization of the camera data and the radar data is realized.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and the sequence numbers should not constitute any limitation on the implementation process of the embodiments of the present application.
Corresponding to the data synchronization method described in the above embodiments, fig. 3 is a block diagram of the data synchronization device provided in the embodiment of the present application, and for convenience of explanation, only the portion related to the embodiment of the present application is shown.
Referring to fig. 3, the apparatus includes:
a first obtaining unit 31, configured to obtain a first mapping point corresponding to a first pixel point in a first image captured by the camera, where the mapping point represents three-dimensional geographic coordinates in an offline point cloud map, and an area covered by the offline point cloud map includes a capturing area of the camera and a detection area of the radar.
A second obtaining unit 32, configured to obtain a first target point corresponding to the first mapping point, where the first target point is a three-dimensional coordinate point in a first point cloud image detected by the radar, and a detection time of the first point cloud image corresponds to a shooting time of the first image.
The data synchronization unit 33 is configured to synchronize the first pixel point with the first target point.
Optionally, the first obtaining unit 31 is further configured to:
and establishing a first mapping relation, wherein the first mapping relation represents a corresponding relation between the three-dimensional geographic coordinates in the offline point cloud map and pixel points in the image shot by the camera.
Optionally, the first obtaining unit 31 is further configured to:
converting each first geographic coordinate in the offline point cloud map to a radar coordinate system of the radar to obtain a three-dimensional first coordinate point;
mapping the first coordinate point to the pixel coordinate system according to the calibration parameters of the camera to obtain a two-dimensional second pixel point;
and establishing the first mapping relation according to the first geographic coordinates and the second pixel points.
Optionally, the first geographic coordinate includes a first longitude, a first latitude, and a first altitude. Correspondingly, the first obtaining unit 31 is further configured to:
converting the first longitude, the first latitude and the first altitude into a reference coordinate system to obtain a three-dimensional second coordinate point, wherein the origin of the reference coordinate system is determined according to the second longitude, the second latitude and the second altitude of the radar;
and converting the second coordinate point into the radar coordinate system according to the north offset angle of the radar to obtain the first coordinate point, wherein the north offset angle of the radar represents an included angle between a coordinate axis of the radar and a geographic north pole.
Optionally, the apparatus 3 further comprises:
the calibration unit 34 is used for acquiring lane line data, wherein the lane line data represents three-dimensional geographic coordinates on a lane line in a shooting area of the camera; projecting the lane line data into a second image shot by the camera to obtain a two-dimensional third pixel point; if the position of the third pixel point in the second image is inconsistent with the position of the lane line, recalibrating the parameters of the camera until the position of the third pixel point in the second image is consistent with the position of the lane line; and if the position of the third pixel point in the second image is consistent with the position of the lane line, determining the current parameter of the camera as the calibration parameter.
Optionally, the data synchronization unit 33 is further configured to:
converting the first pixel point into the radar coordinate system to obtain a second target point;
calculating a spatial distance between the first target point and the second target point;
if the spatial distance is within a preset range, marking the second target point in the first point cloud image;
and if the spatial distance is not in the preset range, re-selecting the first pixel point from the first image until a first target point corresponding to the re-selected first pixel point is acquired.
It should be noted that, because the content of information interaction and execution process between the above devices/units is based on the same concept as the method embodiment of the present application, specific functions and technical effects thereof may be referred to in the method embodiment section, and will not be described herein again.
In addition, the data synchronization device shown in fig. 3 may be a software unit, a hardware unit, or a unit combining software and hardware built into an existing terminal device, may be integrated into the terminal device as an independent component, or may exist as an independent terminal device.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
Fig. 4 is a schematic structural diagram of a terminal device provided in an embodiment of the present application. As shown in fig. 4, the terminal device 4 of this embodiment includes: at least one processor 40 (only one shown in fig. 4), a memory 41 and a computer program 42 stored in the memory 41 and executable on the at least one processor 40, the processor 40 implementing the steps in any of the various data synchronization method embodiments described above when executing the computer program 42.
The terminal equipment can be computing equipment such as a desktop computer, a notebook computer, a palm computer, a cloud server and the like. The terminal device may include, but is not limited to, a processor, a memory. It will be appreciated by those skilled in the art that fig. 4 is merely an example of the terminal device 4 and is not meant to be limiting as to the terminal device 4, and may include more or fewer components than shown, or may combine certain components, or different components, such as may also include input-output devices, network access devices, etc.
The processor 40 may be a central processing unit (Central Processing Unit, CPU), the processor 40 may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), off-the-shelf programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 41 may in some embodiments be an internal storage unit of the terminal device 4, such as a hard disk or a memory of the terminal device 4. The memory 41 may in other embodiments also be an external storage device of the terminal device 4, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card) or the like, which are provided on the terminal device 4. Further, the memory 41 may also include both an internal storage unit and an external storage device of the terminal device 4. The memory 41 is used for storing an operating system, application programs, boot Loader (Boot Loader), data, other programs, etc., such as program codes of the computer program. The memory 41 may also be used for temporarily storing data that has been output or is to be output.
Embodiments of the present application also provide a computer readable storage medium storing a computer program, which when executed by a processor, may implement the steps in the above-described method embodiments.
The embodiments of the present application provide a computer program product which, when run on a terminal device, causes the terminal device to perform the steps of the method embodiments described above.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present application may implement all or part of the processes in the methods of the above embodiments by instructing related hardware through a computer program, which may be stored in a computer readable storage medium; when executed by a processor, the computer program may implement the steps of each of the method embodiments described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, etc. The computer readable medium may include at least: any entity or device capable of carrying the computer program code to the apparatus/terminal device, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, for example a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In some jurisdictions, computer readable media may not be electrical carrier signals and telecommunications signals in accordance with legislation and patent practice.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts that are not described or detailed in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other manners. For example, the apparatus/terminal device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical function division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (10)

1. A data synchronization method applied to a detection system, the detection system comprising a camera and a radar, the method comprising:
acquiring a first mapping point corresponding to a first pixel point in a first image shot by the camera, wherein the mapping point represents three-dimensional geographic coordinates in an offline point cloud map, and an area covered by the offline point cloud map comprises a shooting area of the camera and a detection area of the radar;
acquiring a first target point corresponding to the first mapping point, wherein the first target point is a three-dimensional coordinate point in a first point cloud image detected by the radar, and the detection time of the first point cloud image corresponds to the shooting time of the first image;
synchronizing the first pixel point with the first target point.
2. The data synchronization method of claim 1, wherein prior to acquiring a first mapping point corresponding to a first pixel point in a first image captured by the camera, the method further comprises:
and establishing a first mapping relation, wherein the first mapping relation represents a corresponding relation between the three-dimensional geographic coordinates in the offline point cloud map and pixel points in the image shot by the camera.
3. The method for synchronizing data according to claim 2, wherein the establishing a first mapping relationship comprises:
converting each first geographic coordinate in the offline point cloud map to a radar coordinate system of the radar to obtain a three-dimensional first coordinate point;
mapping the first coordinate point to the pixel coordinate system according to the calibration parameters of the camera to obtain a two-dimensional second pixel point;
and establishing the first mapping relation according to the first geographic coordinates and the second pixel points.
4. The data synchronization method of claim 3, wherein the first geographic coordinates comprise a first longitude, a first latitude, and a first altitude;
converting the first geographic coordinate to a radar coordinate system of the radar to obtain a three-dimensional first coordinate point, wherein the method comprises the following steps of:
converting the first longitude, the first latitude and the first altitude into a reference coordinate system to obtain a three-dimensional second coordinate point, wherein the origin of the reference coordinate system is determined according to the second longitude, the second latitude and the second altitude of the radar;
and converting the second coordinate point into the radar coordinate system according to the north offset angle of the radar to obtain the first coordinate point, wherein the north offset angle of the radar represents an included angle between a coordinate axis of the radar and a geographic north pole.
5. A data synchronization method according to claim 2 or 3, characterized in that before the first mapping relation is obtained, the method further comprises:
lane line data are obtained, and the lane line data represent three-dimensional geographic coordinates on lane lines in a shooting area of the camera;
projecting the lane line data into a second image shot by the camera to obtain a two-dimensional third pixel point;
if the position of the third pixel point in the second image is inconsistent with the position of the lane line, recalibrating the parameters of the camera until the position of the third pixel point in the second image is consistent with the position of the lane line;
and if the position of the third pixel point in the second image is consistent with the position of the lane line, determining the current parameter of the camera as the calibration parameter.
6. The data synchronization method of claim 1, wherein the synchronizing the first pixel point with the first target point comprises:
converting the first pixel point into the radar coordinate system to obtain a second target point;
calculating a spatial distance between the first target point and the second target point;
and if the spatial distance is within a preset range, marking the second target point in the first point cloud image.
7. The data synchronization method of claim 6, wherein after calculating the spatial distance between the first target point and the second target point, the method further comprises:
and if the spatial distance is not in the preset range, re-selecting the first pixel point from the first image until a first target point corresponding to the re-selected first pixel point is acquired.
8. A data synchronization device for use in a detection system, the detection system including a camera and a radar, the device comprising:
a first obtaining unit, configured to obtain a first mapping point corresponding to a first pixel point in a first image captured by the camera, where the mapping point represents three-dimensional geographic coordinates in an offline point cloud map, and an area covered by the offline point cloud map includes a capturing area of the camera and a detection area of the radar;
a second obtaining unit, configured to obtain a first target point corresponding to the first mapping point, where the first target point is a three-dimensional coordinate point in a first point cloud image detected by the radar, and a detection time of the first point cloud image corresponds to a shooting time of the first image;
and the data synchronization unit is used for synchronizing the first pixel point and the first target point.
9. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 7 when executing the computer program.
10. A computer readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the method according to any one of claims 1 to 7.
CN202211656935.6A 2022-12-22 2022-12-22 Data synchronization method, device, terminal equipment and computer readable storage medium Pending CN116246029A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211656935.6A CN116246029A (en) 2022-12-22 2022-12-22 Data synchronization method, device, terminal equipment and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211656935.6A CN116246029A (en) 2022-12-22 2022-12-22 Data synchronization method, device, terminal equipment and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN116246029A true CN116246029A (en) 2023-06-09

Family

ID=86626758

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211656935.6A Pending CN116246029A (en) 2022-12-22 2022-12-22 Data synchronization method, device, terminal equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN116246029A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117129956A (en) * 2023-10-27 2023-11-28 深圳绿米联创科技有限公司 Positioning correction method, device, detection equipment, computer equipment and storage medium
CN117129956B (en) * 2023-10-27 2024-04-09 深圳绿米联创科技有限公司 Positioning correction method, device, detection equipment, computer equipment and storage medium

Similar Documents

Publication Publication Date Title
EP3469306B1 (en) Geometric matching in visual navigation systems
CN110927708B (en) Calibration method, device and equipment of intelligent road side unit
US10324195B2 (en) Visual inertial odometry attitude drift calibration
CN109285188B (en) Method and apparatus for generating position information of target object
CN111127563A (en) Combined calibration method and device, electronic equipment and storage medium
JP5837092B2 (en) Position determination using horizontal angle
WO2021057612A1 (en) Sensor calibration method and apparatus
KR101442703B1 (en) GPS terminal and method for modifying location position
CN113587934B (en) Robot, indoor positioning method and device and readable storage medium
CN111080784A (en) Ground three-dimensional reconstruction method and device based on ground image texture
JP2019078700A (en) Information processor and information processing system
CN111353453A (en) Obstacle detection method and apparatus for vehicle
CN116246029A (en) Data synchronization method, device, terminal equipment and computer readable storage medium
CN111982132B (en) Data processing method, device and storage medium
CN115273027A (en) Environment sensing method, domain controller, storage medium and vehicle
CN112652062B (en) Point cloud map construction method, device, equipment and storage medium
CN112455502B (en) Train positioning method and device based on laser radar
CN111383264B (en) Positioning method, positioning device, terminal and computer storage medium
CN109034214B (en) Method and apparatus for generating a mark
JP2018189463A (en) Vehicle position estimating device and program
CN114413849A (en) Three-dimensional geographic information data processing method and device for power transmission and transformation project
CN115883969B (en) Unmanned aerial vehicle shooting method, unmanned aerial vehicle shooting device, unmanned aerial vehicle shooting equipment and unmanned aerial vehicle shooting medium
WO2023179032A1 (en) Image processing method and apparatus, and electronic device, storage medium, computer program and computer program product
CN110930455B (en) Positioning method, positioning device, terminal equipment and storage medium
CN113009533A (en) Vehicle positioning method and device based on visual SLAM and cloud server

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination