CN114332228A - Data processing method, electronic device and computer storage medium - Google Patents

Data processing method, electronic device and computer storage medium Download PDF

Info

Publication number
CN114332228A
CN114332228A
Authority
CN
China
Prior art keywords
source
points
point
target object
acquisition device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111659713.5A
Other languages
Chinese (zh)
Inventor
钟礼山
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Autonavi Software Co Ltd
Original Assignee
Autonavi Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Autonavi Software Co Ltd filed Critical Autonavi Software Co Ltd
Priority to CN202111659713.5A
Publication of CN114332228A
Legal status: Pending

Abstract

The application provides a data processing method, an electronic device and a computer storage medium. The data processing method comprises: acquiring a plurality of source poses of an acquisition device during acquisition of a source point cloud, wherein the source poses comprise the positions and orientations of the acquisition device in a world coordinate system at different moments; obtaining a plurality of source truth points marked on a target object in the source point cloud, wherein each source truth point comprises a timestamp and a first three-dimensional coordinate in the world coordinate system; calculating the three-dimensional coordinates of the source truth points in the coordinate system of the acquisition device according to the target moment represented by the timestamp and the source pose of the acquisition device at the target moment; acquiring a plurality of new poses of the acquisition device under the test point cloud; converting the three-dimensional coordinates of the source truth points in the acquisition device coordinate system into second three-dimensional coordinates in the world coordinate system corresponding to the test point cloud according to the new pose of the acquisition device at the target moment; and calculating feature points of the target object in the test point cloud based on the second three-dimensional coordinates. The method can reduce the marking cost.

Description

Data processing method, electronic device and computer storage medium
Technical Field
The embodiment of the application relates to the technical field of high-precision maps, in particular to a data processing method, electronic equipment and a computer storage medium.
Background
In the map production and localization of autonomous vehicles, point cloud matching is required. For example, if a target object is acquired at least twice by an acquisition device to obtain multiple groups of point cloud data, the poses of the acquisition device differ between acquisitions, so the three-dimensional coordinates of the same target object in the world coordinate system, as calculated from point cloud data acquired at different times, also differ; that is, the point cloud data have a deviation, and must be aligned to eliminate it.
In order to verify the matching accuracy of the aligned point cloud data, some feature points (such as corner points or end points of a target object) need to be selected from the point cloud data, and the matching accuracy is determined from the distance deviation between those feature points. However, point cloud data is sampled in a real environment, so gaps exist between its points, and during sampling feature points are easily missed because of interference, occlusion and the like, or some feature points are simply invisible. Determining feature points is therefore difficult, and they must be marked manually on the point cloud data. Because the data iterates quickly and every iteration must be marked again, the labor cost is high and a large amount of manpower and material resources are wasted.
Disclosure of Invention
In view of the above, embodiments of the present application provide a data processing scheme to at least partially solve the above problems.
According to a first aspect of embodiments of the present application, there is provided a data processing method, including: acquiring a plurality of source poses of an acquisition device during acquisition of a source point cloud, wherein the source poses comprise positions and directions of the acquisition device in a world coordinate system at different moments; obtaining a plurality of source truth points marked on a target object in the source point cloud, wherein the source truth points comprise timestamps and a first three-dimensional coordinate in a world coordinate system; calculating the three-dimensional coordinates of the source true value point in the coordinate system of the acquisition device according to the target time represented by the timestamp and the source pose of the acquisition device in the target time; acquiring a plurality of new poses of the acquisition device under the test point cloud; converting the three-dimensional coordinates of the source true value point in the acquisition device coordinate system into second three-dimensional coordinates in a world coordinate system corresponding to the test point cloud according to the new pose of the acquisition device in the target moment; and calculating characteristic points of the target object in the test point cloud based on the second three-dimensional coordinates.
According to a second aspect of embodiments of the present application, there is provided a data processing method, including: displaying a source point cloud of a target object in a display interface; receiving a selection operation of a plurality of shape points in the source point cloud; determining a source true value point of the target object according to the selected plurality of shape points; based on the source true value point, determining feature points of the target object in the test point cloud by using the method of the first aspect.
According to a third aspect of embodiments of the present application, there is provided an electronic apparatus, including: the system comprises a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface complete mutual communication through the communication bus; the memory is used for storing at least one executable instruction, and the executable instruction causes the processor to execute the corresponding operation of the data processing method according to the first or second aspect.
According to a fourth aspect of embodiments of the present application, there is provided a computer storage medium having stored thereon a computer program which, when executed by a processor, implements the data processing method according to the first or second aspect.
According to a fifth aspect of embodiments herein, there is provided a computer program which, when executed by a processor, implements a method as in the first or second aspect.
According to the data processing scheme provided by the embodiments of the application, for a test point cloud obtained by transforming a source point cloud, the three-dimensional coordinates of a source truth point in the acquisition device coordinate system are determined from the timestamp of the source truth point and the source pose corresponding to the source point cloud, and the second three-dimensional coordinates in the world coordinate system of the test point cloud are then determined from those device-frame coordinates and the new pose of the test point cloud. Truth points therefore do not need to be re-marked under the test point cloud: the source truth points of the source point cloud can be reused, reducing cost and time.
Drawings
In order to illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. The drawings in the following description are only some embodiments of the present application; other drawings can be obtained from them by those skilled in the art.
FIG. 1A is a flowchart illustrating steps of a data processing method according to an embodiment of the present disclosure;
FIG. 1B is a schematic diagram of source truth points marked on a first-type road marking in the embodiment shown in FIG. 1A;
FIG. 1C is a schematic diagram of source truth points marked on a second-type road marking in the embodiment shown in FIG. 1A;
FIG. 1D is a schematic diagram of a source truth point marked by an arrow on a road surface in the embodiment shown in FIG. 1A;
FIG. 1E is a schematic diagram of source truth points marked on a sign in the embodiment shown in FIG. 1A;
FIG. 1F is a schematic illustration of a projection fit in the embodiment of FIG. 1A;
fig. 1G is a schematic diagram of the sub-step of step S106 according to the first embodiment of the present application;
fig. 1H is a schematic diagram of the sub-step of step S112 according to the first embodiment of the present application;
FIG. 2 is a flow chart illustrating steps of a data processing method according to a second embodiment of the present application;
fig. 3 is a block diagram of a cloud server according to a third embodiment of the present application;
fig. 4 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present application.
Detailed Description
In order to make those skilled in the art better understand the technical solutions in the embodiments of the present application, the technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, but not all embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application shall fall within the scope of the protection of the embodiments in the present application.
The following further describes specific implementations of embodiments of the present application with reference to the drawings of the embodiments of the present application.
Example one
Referring to fig. 1A, a flowchart illustrating steps of a data processing method according to a first embodiment of the present application is shown.
For the sake of clear description of the method of the present embodiment, before describing the implementation of the method in detail, the application scenario is exemplarily described as follows:
the method can be applied to the scenes of map production, positioning in the automatic driving process and the like. Taking map production as an example, in order to obtain map data, a vehicle (such as a high-precision map vehicle carrying a laser radar) carrying an acquisition device can run along a certain road, target objects (such as trees, signboards, road marking lines and the like) on two sides of the road are scanned by the acquisition device in the running process, so that point cloud data are obtained, points of the point cloud data correspond to certain points on the target objects, and three-dimensional coordinates P of the points in the point cloud data under a coordinate system of the acquisition device are based onlPose T of acquisition device under world coordinate systemwiAnd parameter T of the acquisition deviceilThe three-dimensional coordinate P of the target object in the world coordinate system can be determinedwI.e. its geographical location, thereby enabling the production of a map. And the three-dimensional coordinates of the target object under the world coordinate system can be determined based on the point cloud data, so that the map is manufactured. In particular, three-dimensional coordinates PwCan be expressed as: pw=Twi·Til·Pl
Because the real environment changes, to ensure that the map can be updated in time, the vehicle can drive on the road once every period of time to acquire newer point cloud data. However, it cannot be guaranteed that the vehicle follows exactly the same track on each drive; that is, the poses of the acquisition device may differ. Therefore, even when the position of the target object has not changed, the three-dimensional coordinate P_w of the same target object in the world coordinate system determined from point cloud data acquired at different times may differ, so the point cloud data acquired at different times must be aligned.
The alignment method is, for example: point cloud data A is used as the target point cloud, point cloud data B is used as the source point cloud, new point cloud data C (denoted the test point cloud) is obtained by transforming point cloud data B, and the distance between the same feature point (that is, the same point on the same target object, often called a homonymous point) in point cloud data C and point cloud data A is calculated to determine the matching precision. The feature points are, for example, corner points or end points of the target object, but are not limited to these; they may also be central axes or edges of the target object.
As can be seen from the alignment process, in order to calculate the distance between feature points, the feature points need to be determined from point cloud data C and point cloud data A. However, the feature points often cannot be read off directly, because the point cloud is not dense enough, or feature points are lost to occlusion or are invisible. Some points therefore need to be manually marked in the point cloud data (the manually marked points can be called truth points because of their high accuracy); the matching precision can then be determined directly from the truth points, or feature points can be calculated from the truth points and the matching precision calculated from the feature points.
To solve this problem, embodiments of the present application provide a method by which truth points can be reused, reducing the cost and time of labeling truth points on point cloud data.
In this embodiment, the method includes the steps of:
step S102: a plurality of source poses of an acquisition device during acquisition of a source point cloud are acquired.
The source pose includes the position and orientation of the acquisition device in the world coordinate system at different times. Multiple source poses constitute the source trajectory of the acquisition device. In general, the source pose of the acquisition device can be detected based on an IMU (inertial measurement unit) on the vehicle carrying the acquisition device.
Step S104: and acquiring a plurality of source true value points marked on the target object in the source point cloud.
In this embodiment, for ease of labeling, the source true value point is a shape point (denoted as SP) visible in the target object or a point calculated based on the visible shape point. The source truth points include a timestamp and a first three-dimensional coordinate in a world coordinate system. The source point cloud may be a point cloud obtained by scanning a target object by a laser radar.
The number and locations of the source truth points marked on target objects of different geometries may differ. For example, the target object includes at least one of: a first-type road marking whose length is greater than a set value, a second-type road marking whose length is less than or equal to the set value, a sign, a direction indicator with an arrow on the road surface, and a round bar body. The set value may be the maximum length of a second-type road marking, for example 3 m or 5 m.
As shown in fig. 1B, the target object is the first type of road marking (also called a long marking, such as a long solid line, a stop line, etc. on the road), and has a rectangular geometric shape and a large ratio of length to width. For such a target object, the source truth points of the target object include at least two pairs of points, and two points of a pair of points are located on different sides of the central axis of the target object. Of course, in order to obtain more accurate feature points (i.e., points on the central axis) later, multiple pairs of points may be marked along the central axis. The timestamp of the source truth point may be the time at which the point was acquired.
As shown in fig. 1C, the target object is the second type of road marking (also called short marking, such as a dashed line or a zebra crossing on the road) and has a rectangular geometry, but the ratio of the length to the width is small compared to the first type of road marking. For such a target object, a 6-point method may be used to label the source true points, that is, the source true points include at least two points located on each side of the target object, and a top point and a bottom point in the direction of the central axis of the target object. The timestamp of the source truth point may be the time at which the point was acquired.
An example labeling manner: mark 2 source truth points on the left long edge along the central-axis direction of the target object, to determine the direction of the left edge; mark 2 source truth points on the right long edge, to determine the direction of the right edge; select 1 source truth point on the bottom short edge, which need not lie on the central axis (in practice there may be no point exactly on the central axis) but should be the lowest point on that edge recognizable to the naked eye; similarly select 1 source truth point on the top short edge, the highest recognizable point on that edge.
As shown in fig. 1D and 1E, the target object may be a sign or a direction indicator with an arrow on the road surface. When the target object is a sign, its geometric shape is a rectangle or a rounded rectangle, and the source truth points include at least two points on each edge of the target object. The timestamp of a source truth point may be the time at which the point was acquired.
An example labeling manner is, for example, 8-point labeling, that is, starting from the left edge and in a counterclockwise order, 2 source true value points are selected from each edge.
When the target object is a direction indicator with an arrow on the road surface, its geometric shape is more complex and can be a single arrow or a combination of multiple arrows; the source truth points include at least two points on each edge of the target object. The timestamp of a source truth point may be the time at which the point was acquired.
An example way of labeling is, for example: starting from the bottom short edge, 2 source true points are chosen on each edge in a counterclockwise order.
When the target object is a round rod body, the geometric shape of the target object is cylindrical, and the source truth point comprises a central point of a top end section and a central point of a bottom end section of the target object. Optionally, to further fully represent the cylindrical shape, a radius of the top end cross-section and a radius of the bottom end cross-section may also be included.
It should be noted that, because the source truth point of a round bar body is not a point actually present in the source point cloud but a point calculated from points in the source point cloud, its timestamp may be the average time of the points contained in the round bar body.
One possible process for determining the source true value point is for example:
Procedure A1: determine the main-axis direction of the round bar body according to the point cloud it contains.
The point cloud contained in the round bar body can be manually selected and denoised to remove interference points and improve accuracy. The main-axis direction of the point cloud can be determined by principal component analysis.
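Principal component analysis of this kind can be sketched as follows (an illustrative example, not the patent's implementation; the synthetic pole data and function names are assumptions). The main axis is the eigenvector of the point covariance with the largest eigenvalue:

```python
import numpy as np

def principal_axis(points):
    """Estimate the main-axis direction of a point cloud by PCA:
    the eigenvector of the covariance matrix with the largest eigenvalue."""
    centered = points - points.mean(axis=0)
    cov = centered.T @ centered / len(points)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    axis = eigvecs[:, -1]                    # direction of largest variance
    return axis / np.linalg.norm(axis)

# Synthetic vertical pole: points scattered tightly around the z-axis.
rng = np.random.default_rng(0)
z = rng.uniform(0.0, 5.0, 200)
noise = rng.normal(scale=0.02, size=(200, 2))
pole = np.column_stack([noise[:, 0], noise[:, 1], z])
axis = principal_axis(pole)                   # close to (0, 0, +/-1)
```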
Procedure B1: and selecting a first end point set and a second end point set from the point clouds contained in the round rod body according to the main shaft direction.
One possible way to obtain the first and second end point sets: slice the point cloud of the round bar body perpendicular to the main axis at a first distance below its topmost point, obtaining a top sliced cylindrical section; the points in this section form the first end point set. Similarly, slice perpendicular to the main axis at a second distance above the lowest point, obtaining a bottom sliced cylindrical section; the points in this section form the second end point set.
The first distance and the second distance may be determined as needed, and may be the same or different, without limitation. For example, the first distance and the second distance are the same and are one fifth of the distance from the apex to the nadir. Since the greater the number of points included in the first and second end point sets, the greater the computational load, the first and second distances may also be determined from the computational effort.
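The slicing described above can be sketched as follows (illustrative only; the synthetic points and the choice of one fifth of the pole height are assumptions drawn from the example in the text):

```python
import numpy as np

def end_point_sets(points, axis, first_dist, second_dist):
    """Slice the pole perpendicular to the main axis: points within first_dist
    of the top form the first end point set, points within second_dist of the
    bottom form the second end point set."""
    proj = points @ axis                  # scalar position of each point along the axis
    top, bottom = proj.max(), proj.min()
    first_set = points[proj >= top - first_dist]
    second_set = points[proj <= bottom + second_dist]
    return first_set, second_set

axis = np.array([0.0, 0.0, 1.0])
pts = np.array([[0.0, 0.0, z] for z in np.linspace(0.0, 5.0, 11)])
# Both distances one fifth of the 5 m pole height, i.e. 1 m slices.
top_set, bottom_set = end_point_sets(pts, axis, 1.0, 1.0)
# top_set holds z in {4.0, 4.5, 5.0}; bottom_set holds z in {0.0, 0.5, 1.0}
```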
Procedure C1: and projecting the concentrated points of the first end points to the top end surface of the round rod body along the main shaft direction, and projecting the concentrated points of the second end points to the bottom end surface of the round rod body along the main shaft direction.
The top end face can be the plane through the topmost point and perpendicular to the main-axis direction. Projecting the points of the first end point set onto it along the main-axis direction yields the shape and size of the top cross-section of the round bar body, which avoids the problem that the top cross-section cannot be determined accurately because the point cloud is sparse.
Similarly, the bottom cross-section of the round bar body can be obtained by projecting the points of the second end point set onto the bottom end face along the main-axis direction.
Procedure D1: and determining a source true value point of the top end of the round rod body according to the projection result of the first end face.
The projection result of the first end surface can be fitted to the cross section of the top end of the circular rod body, as shown in fig. 1F. And then the center point of the circle center of the top end section can be determined as the source true value point of the top end.
Procedure E1: and determining a source true value point of the bottom end of the round rod body according to the projection result of the second end surface.
The projection result of the second end surface can be fitted to form a bottom end section of the round rod body, and then the center point of the circle center of the bottom end section can be determined to serve as a source true value point of the bottom end.
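The patent does not specify a fitting method; a common least-squares (Kåsa) circle fit is sketched here as one possibility (the data and names are illustrative, not from the patent):

```python
import numpy as np

def fit_circle(xy):
    """Least-squares (Kasa) circle fit: solve x^2 + y^2 = 2*a*x + 2*b*y + c
    for the center (a, b) and radius sqrt(c + a^2 + b^2)."""
    x, y = xy[:, 0], xy[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
    rhs = x**2 + y**2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return np.array([a, b]), np.sqrt(c + a**2 + b**2)

# Projected end-face points on a circle of radius 0.15 centered at (2.0, -1.0).
theta = np.linspace(0, 2 * np.pi, 40, endpoint=False)
ring = np.column_stack([2.0 + 0.15 * np.cos(theta), -1.0 + 0.15 * np.sin(theta)])
center, radius = fit_circle(ring)   # the center is the source truth point of this end
```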
In summary, no matter what the geometric shape of the target object is, the source true value point in the embodiment can be determined by labeling the visible points in the source point cloud, so that the labeling difficulty is reduced, and the labeling efficiency is improved.
Step S106: and calculating the three-dimensional coordinates of the source true value point in the coordinate system of the acquisition device according to the target time represented by the timestamp and the source pose of the acquisition device in the target time.
A source truth point includes not only its three-dimensional coordinates in the world coordinate system but also a timestamp, so the source pose of the acquisition device at acquisition time can be determined from the timestamp, and the three-dimensional coordinates of the source truth point in the acquisition device coordinate system can then be solved inversely. The principle of the inverse solution is as follows:
first three-dimensional coordinate PwCan be expressed as: pw=Twi·Til·Pl. Wherein, PwIs a first three-dimensional coordinate. T iswiThe source pose of the acquisition device at the ith moment. T isilAre intrinsic parameters of the acquisition device and the IMU unit (inertial measurement unit). PlMay be the three-dimensional coordinates of the source truth points in the coordinate system of the acquisition device. The intrinsic parameters of the acquisition device (such as installation position and angle) are not changed once determined, and the three-dimensional coordinates of the source true point in the acquisition device coordinate system are not changed after acquisition. Thus, when P is knownw、TwiAnd TilThe three-dimensional coordinates can be inversely calculated in the case of (3).
The three-dimensional coordinates calculated by the inverse solution can be denoted P_i, which can be expressed as: P_i = T_wi^-1 · P_w.
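The inverse solution P_i = T_wi^-1 · P_w can be sketched as follows (an illustrative example, not the patent's implementation; the pose values are assumptions):

```python
import numpy as np

def inverse_to_device(P_w, T_wi):
    """Recover P_i = inv(T_wi) . P_w: the point's coordinates relative to the
    acquisition device, given its world coordinates and the source pose."""
    p = np.append(P_w, 1.0)
    return (np.linalg.inv(T_wi) @ p)[:3]

# Pose: 90-degree rotation about z plus a translation.
c, s = 0.0, 1.0                       # cos(90 deg), sin(90 deg)
T_wi = np.array([[c, -s, 0, 4.0],
                 [s,  c, 0, 1.0],
                 [0,  0, 1, 0.0],
                 [0,  0, 0, 1.0]])
P_w = np.array([4.0, 3.0, 0.5])
P_i = inverse_to_device(P_w, T_wi)    # point expressed in the device frame

# Round trip back to world coordinates reproduces P_w.
assert np.allclose(T_wi @ np.append(P_i, 1.0), np.append(P_w, 1.0))
```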
In an example, as shown in fig. 1G, step S106 can be implemented by the following sub-steps:
substep S1061: and acquiring the source position and posture of the acquisition device at the target moment.
Because the frequency of the point cloud data acquired by the acquisition device is higher than the frequency of the source pose output by the IMU unit, the source point cloud at some moments does not have a directly corresponding source pose. To solve this problem, the sub-step S1061 may be implemented by:
Procedure A1: determine whether the moments corresponding to the source poses include the target moment.
If they do, the source pose corresponding to the target moment can be obtained directly. Otherwise, procedure B1 is performed.
Procedure B1: from the moments corresponding to the source poses, obtain the previous moment and the following moment adjacent to the target moment.
For example, the target moment may be moment t, the previous moment t-1, and the following moment t+1.
Procedure C1: obtain the source pose at the previous moment and the source pose at the following moment.
Procedure D1: interpolate between the source pose at the previous moment and the source pose at the following moment, and determine the source pose of the acquisition device at the target moment from the interpolation result.
For example, linear interpolation is performed between the two source poses, and the interpolated value is taken as the source pose at the target moment.
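Pose interpolation can be sketched as follows (illustrative, not the patent's implementation: positions are interpolated linearly as the text describes, while the orientation part is interpolated here with quaternion slerp, which is one common choice):

```python
import numpy as np

def lerp(p0, p1, u):
    """Linear interpolation of positions."""
    return (1.0 - u) * p0 + u * p1

def slerp(q0, q1, u):
    """Spherical linear interpolation of unit quaternions (orientation part of a pose)."""
    dot = np.dot(q0, q1)
    if dot < 0.0:                      # take the shorter arc
        q1, dot = -q1, -dot
    if dot > 0.9995:                   # nearly parallel: fall back to normalized lerp
        q = (1.0 - u) * q0 + u * q1
        return q / np.linalg.norm(q)
    omega = np.arccos(dot)
    return (np.sin((1.0 - u) * omega) * q0 + np.sin(u * omega) * q1) / np.sin(omega)

# Poses at t-1 and t+1; the target moment t sits halfway between them (u = 0.5).
p_prev, p_next = np.array([0.0, 0.0, 0.0]), np.array([2.0, 4.0, 0.0])
q_prev = np.array([1.0, 0.0, 0.0, 0.0])                              # identity (w, x, y, z)
q_next = np.array([np.cos(np.pi / 4), 0.0, 0.0, np.sin(np.pi / 4)])  # 90 deg about z
p_t = lerp(p_prev, p_next, 0.5)        # -> [1. 2. 0.]
q_t = slerp(q_prev, q_next, 0.5)       # 45 deg rotation about z
```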
Substep S1062: and determining the three-dimensional coordinates of the source true value point in the coordinate system of the acquisition device according to the first three-dimensional coordinates of the source true value point at the target time and the source position and posture of the acquisition device at the target time.
Denote the first three-dimensional coordinate by P_w and the source pose by T_wi; then the three-dimensional coordinate P_i of the source truth point in the acquisition device coordinate system can be expressed as: P_i = T_wi^-1 · P_w.
Step S108: and acquiring a plurality of new poses of the acquisition device under the test point cloud.
The new pose of the acquisition device is used to map the source point cloud into a test point cloud. For example, when aligning a source point cloud with a target point cloud, the source point cloud maps out a test point cloud by changing the source pose to a new pose. In this embodiment, the new poses correspond to the source poses one-to-one, and a plurality of new poses can form a new trajectory of the acquisition device. In the process of aligning the source point cloud and the target point cloud, errors in alignment can be reduced by constantly changing new poses.
Step S110: and converting the three-dimensional coordinates of the source true value point in the acquisition device coordinate system into second three-dimensional coordinates in a world coordinate system corresponding to the test point cloud according to the new pose of the acquisition device in the target moment.
When the source truth points are solved forward, the new pose at the target moment needs to be obtained; because the sampling frequency of the source point cloud is higher than that of the IMU, the source point cloud at some moments has no directly corresponding new pose.
To solve this problem, the new pose corresponding to the target moment can be obtained as follows: determine whether the moments corresponding to the new poses include the target moment; if they do, the corresponding new pose can be obtained directly.
Otherwise, obtain the previous moment and the following moment adjacent to the target moment from the moments corresponding to the new poses, obtain the new poses at those two moments, perform linear interpolation between them, and determine the new pose of the acquisition device at the target moment from the interpolation result.
Based on the obtained new pose, the second three-dimensional coordinate P'_w of the source truth point in the world coordinate system corresponding to the test point cloud is calculated by the forward solution: P'_w = T'_wi · P_i.
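The forward solution P'_w = T'_wi · P_i can be sketched as follows (illustrative only; the pose values are assumptions, and P_i is taken as a device-frame point from the inverse step):

```python
import numpy as np

def to_new_world(P_i, T_wi_new):
    """P'_w = T'_wi . P_i: map the device-frame coordinates of a source truth point
    into the world coordinate system of the test point cloud using the new pose."""
    p = np.append(P_i, 1.0)
    return (T_wi_new @ p)[:3]

P_i = np.array([2.0, 0.0, 0.5])        # device-frame coordinates from the inverse step
T_wi_new = np.eye(4)
T_wi_new[:3, 3] = [5.0, -1.0, 0.0]     # new pose: a pure translation for the example
P_w_new = to_new_world(P_i, T_wi_new)  # -> [7. -1. 0.5]
```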
Step S112: and calculating characteristic points of the target object in the test point cloud based on the second three-dimensional coordinates.
The feature points in the test point cloud can be calculated in different ways for target objects of different geometric shapes. In one example, as shown in fig. 1H, step S112 may include the following sub-steps:
step S1121: and determining the corresponding geometric shape of the target object.
For example, the first-type and second-type road markings may be rectangular; a sign may be rectangular, rounded-rectangular, circular, or the like; a round bar body may be cylindrical; and a direction indicator with an arrow on the road surface may be a single arrow or a complex geometric shape formed by fusing multiple arrows.
Step S1122: calculate the three-dimensional coordinates, in the world coordinate system, of the feature points of the target object in the test point cloud according to the geometric shape of the target object and the second three-dimensional coordinates of the source true value points.
When the target object is a first-type road marking, the three-dimensional coordinates, in the world coordinate system, of at least two feature points on the central axis of the target object are calculated from the second three-dimensional coordinates of at least two pairs of source true value points. The central axis can then be determined from these feature points, and it allows the matching accuracy to be evaluated in the two directions other than the central axis direction.
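One plausible construction of the axis points, assuming each labeled pair is symmetric about the central axis so that the midpoint of a pair lies on the axis (the helper name is illustrative, not from the source):

```python
def axis_points_from_pairs(pairs):
    """For a first-type road marking, return one axis point per labeled pair:
    the midpoint of two source true value points on opposite sides of the axis.
    Two or more such midpoints determine the central axis line."""
    return [tuple((a[i] + b[i]) / 2 for i in range(3)) for a, b in pairs]
```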
When the target object is a second-type road marking, the corresponding edges are calculated from the second three-dimensional coordinates of the source true value points on the edges, and the central axis of the marking is calculated from the edges. The second three-dimensional coordinates of the top point are then projected onto the central axis to obtain the top end point (the foot of the perpendicular on the axis), and the second three-dimensional coordinates of the bottom point are projected onto the central axis to obtain the bottom end point. The central axis is determined by the top and bottom end points, and with the end points of the axis fixed, the matching accuracy can be evaluated in all three directions.
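The projection of the top or bottom point onto the central axis (the foot of the perpendicular) is a standard point-to-line projection; a minimal sketch with an illustrative function name:

```python
def project_onto_axis(p, a, b):
    """Foot of the perpendicular from point p onto the line through axis
    points a and b, all given as (x, y, z) tuples."""
    ab = [b[i] - a[i] for i in range(3)]
    ap = [p[i] - a[i] for i in range(3)]
    # parameter of the closest point on the line a + t * (b - a)
    t = sum(ap[i] * ab[i] for i in range(3)) / sum(c * c for c in ab)
    return tuple(a[i] + t * ab[i] for i in range(3))
```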
When the target object is a sign or a road-surface arrowed direction mark, the edges are calculated from the second three-dimensional coordinates of the source true value points on the edges; the three-dimensional coordinates, in the world coordinate system, of the intersection point of two adjacent edges are the three-dimensional coordinates of a corner point.
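In the plane of the sign, a corner point is the intersection of the two fitted edge lines. A sketch assuming each edge is given by two points in 2D plane coordinates (the reduction to 2D and the function name are assumptions for illustration):

```python
def edge_intersection(p1, p2, p3, p4):
    """Intersection of the line through p1, p2 with the line through p3, p4;
    all points are (x, y) tuples in the plane of the sign."""
    d1 = (p2[0] - p1[0], p2[1] - p1[1])
    d2 = (p4[0] - p3[0], p4[1] - p3[1])
    denom = d1[0] * d2[1] - d1[1] * d2[0]  # 2D cross product of the directions
    if denom == 0:
        raise ValueError("edges are parallel; no corner point")
    t = ((p3[0] - p1[0]) * d2[1] - (p3[1] - p1[1]) * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])
```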
If the target object is a round rod body, the second three-dimensional coordinates of the source true value points are used directly as the three-dimensional coordinates of the feature points in the world coordinate system.
When determining the matching precision of the test point cloud against the target point cloud, the distance between the three-dimensional coordinates of a feature point in the test point cloud and those of the same feature point in the target point cloud (both in the world coordinate system) is calculated. If the distance is smaller than a distance threshold (which can be set as required), the matching precision of the test point cloud meets the requirement; otherwise, it does not.
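The precision check reduces to a per-feature-point distance test; a minimal sketch (the function name is assumed):

```python
import math

def matching_ok(test_points, target_points, dist_threshold):
    """Return True when every feature point of the test point cloud lies
    within dist_threshold of the same-named feature point of the target
    point cloud, both expressed in the world coordinate system."""
    return all(math.dist(p, q) < dist_threshold
               for p, q in zip(test_points, target_points))
```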
With the marking manner of this embodiment, the problem that the actual feature points of some target objects are not directly visible is solved: clearly distinguishable shape points are taken as source true value points, and accurate three-dimensional coordinates of the feature points can be calculated from them. As for the problem that true value points would otherwise need to be re-marked every time the source point cloud is converted (because the resulting test point cloud differs each time), by adding timestamps to the shape points and recalculating the source true value points using the source poses and new poses of the acquisition device, the true value points can be marked once and used permanently.
With this method, only easily distinguishable shape points need to be selected as source true value points, and accurate semantic feature points can be calculated automatically from them (even when the feature points are invisible in the source laser point cloud), ensuring the reliability of the evaluation indexes. The method is also suitable for low-cost laser radars with low point density, which greatly widens its application range.
By adding a timestamp to each source true value point and combining the source pose and the new pose, the three-dimensional coordinates of the source true value point in the world coordinate system of the test point cloud can be calculated and matched against the test point cloud, so the true values are labeled once and used permanently, greatly saving time and labor costs.
This method avoids the inaccuracy of existing automatic homonymous-point selection, where the selected points are affected by the sampling density of the laser point cloud and may not be actual homonymous points. It also addresses the cases where, due to gaps in the point cloud, actual feature points (corner points, end points, and the like) may be missing so that feature lines (such as axes) cannot be selected directly; where rod-shaped objects can only be sampled on their outer surface and cannot be marked directly; and where ground marking lines, which mostly have a certain width, introduce large errors when the axis is selected by direct clicking.
In conclusion, with this method the source true value points can be easily distinguishable shape points, from which accurate semantic feature points are calculated automatically, ensuring the reliability of the evaluation indexes and greatly expanding the application range. Meanwhile, since the source true value points carry timestamps, the source true value points of the test point cloud can be determined from the timestamps, the source poses, and the new poses after the source point cloud transformation, achieving true-value reuse and greatly saving time and labor costs.
Example two
Referring to fig. 2, a flowchart illustrating steps of a data processing method according to a second embodiment of the present application is shown.
The data processing method can be configured on a display terminal and comprises the following steps:
Step S202: display the source point cloud of the target object in a display interface.
The display interface can be an interface of the display terminal; displaying the source point cloud of the target object in it makes it convenient for a user to mark source true value points.
Step S204: acquire a plurality of shape points in the source point cloud as source true value points.
In one example, a user may select shape points in the displayed source point cloud as source true value points. For the first type of road marking, two points on different sides of the central axis can be selected as a pair, and at least two pairs are selected from the source point cloud as source true value points.
For the second type of road marking, two points can be selected on each of the two side edges as source true value points, together with a top point and a bottom point.
For a sign or a road-surface arrowed direction mark, at least two points on each edge can be selected as source true value points.
For a round rod body, interference point clouds can be removed manually, and the point cloud of the round rod body is then analyzed automatically by an algorithm to determine the center point of the top cross section and the center point of the bottom cross section as source true value points.
The automatic generation can be implemented as follows: determine the principal axis direction of the round rod body from the point cloud it contains; select a first end point set and a second end point set from that point cloud according to the principal axis direction; project the points of the first end point set onto the top end surface of the round rod body along the principal axis direction, and the points of the second end point set onto the bottom end surface; determine the source true value point of the top end from the projection result on the top end surface; and determine the source true value point of the bottom end from the projection result on the bottom end surface.
The point cloud of the round rod body can be determined after manual screening and denoising. The principal axis direction can be determined by principal component analysis. The first end point set and the second end point set can be determined by manually clicking slice positions, or by directly setting a slice distance.
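The principal-axis step can be sketched with a plain power iteration on the covariance matrix (a stand-in for a library principal component analysis; the helper name and iteration count are illustrative):

```python
def principal_axis(points):
    """Dominant direction (unit vector) of a round rod body's point cloud:
    the leading eigenvector of the 3x3 covariance matrix, found by power
    iteration. points is a list of (x, y, z) tuples."""
    n = len(points)
    mean = [sum(p[i] for p in points) / n for i in range(3)]
    cov = [[sum((p[a] - mean[a]) * (p[b] - mean[b]) for p in points) / n
            for b in range(3)] for a in range(3)]
    v = [1.0, 1.0, 1.0]
    for _ in range(50):  # power iteration converges to the leading eigenvector
        w = [sum(cov[r][k] * v[k] for k in range(3)) for r in range(3)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v
```

The end point sets would then be sliced along this direction and projected onto the end surfaces, where a circle fit yields the cross-section centers.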
The timestamp of these source true value points may be the average time corresponding to the point cloud contained in the round rod body.
Step S206: based on the source true value points, feature points of the target object in the test point cloud are determined using the aforementioned method.
For a source true value point, its three-dimensional coordinates in the acquisition device coordinate system can be inversely solved from its first three-dimensional coordinates in the world coordinate system and the corresponding source pose of the acquisition device. Then, from the new pose of the acquisition device and the inversely solved device-frame coordinates, the second three-dimensional coordinates of the source true value point in the world coordinate system of the test point cloud can be solved.
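The inverse-then-forward computation can be sketched with a pose given as a rotation matrix R and translation t (a representational assumption; for a rotation matrix the inverse is its transpose):

```python
def resolve_in_new_frame(src_pose, new_pose, p_world):
    """Inverse-solve the device-frame point P_i = R_src^T (P_w - t_src) from
    the source pose, then forward-solve P'_w = R_new . P_i + t_new under the
    new pose. A pose is (R, t) with R a 3x3 rotation given as nested lists."""
    R, t = src_pose
    d = [p_world[i] - t[i] for i in range(3)]
    p_i = [sum(R[r][c] * d[r] for r in range(3)) for c in range(3)]  # R^T d
    Rn, tn = new_pose
    return tuple(sum(Rn[r][c] * p_i[c] for c in range(3)) + tn[r]
                 for r in range(3))
```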
The three-dimensional coordinates, in the world coordinate system, of the feature points of the target object in the test point cloud are then calculated from the second three-dimensional coordinates of the source true value points and the geometric shape of the target object.
With this data processing method, accurate feature points can be generated automatically by an algorithm, with the user only clicking easily distinguishable shape points as source true value points. This reduces the difficulty of marking feature points, makes the method suitable also for low-cost laser radars with low point density, and greatly widens the application range.
Example three
Referring to fig. 3, a schematic structural diagram of a cloud server according to a third embodiment of the present application is shown.
The cloud server 302 may be connected to a client terminal. The cloud server receives source true value points marked on a source point cloud and transmitted by the client terminal, and executes the foregoing method based on them, so that the source true value points marked on the source point cloud are reused on the test point cloud after the source point cloud is converted. There is then no need to mark true value points on the test point cloud again, which reduces the marking workload.
Example four
Referring to fig. 4, a schematic structural diagram of an electronic device according to a fourth embodiment of the present application is shown, and the specific embodiment of the present application does not limit a specific implementation of the electronic device.
As shown in fig. 4, the electronic device may include: a processor (processor)402, a Communications Interface 404, a memory 406, and a Communications bus 408.
Wherein:
the processor 402, communication interface 404, and memory 406 communicate with each other via a communication bus 408.
A communication interface 404 for communicating with other electronic devices or servers.
The processor 402 is configured to execute the program 410, and may specifically perform relevant steps in the above-described data processing method embodiment.
In particular, program 410 may include program code comprising computer operating instructions.
The processor 402 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present application. The electronic device may comprise one or more processors, which may be of the same type, such as one or more CPUs, or of different types, such as one or more CPUs and one or more ASICs.
The memory 406 is used for storing the program 410. The memory 406 may comprise a high-speed RAM memory, and may also include a non-volatile memory, such as at least one disk memory.
The program 410 may be specifically configured to cause the processor 402 to perform operations corresponding to the foregoing methods.
For specific implementation of each step in the program 410, reference may be made to corresponding steps and corresponding descriptions in units in the foregoing data processing method embodiments, which are not described herein again. It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described devices and modules may refer to the corresponding process descriptions in the foregoing method embodiments, and are not described herein again.
It should be noted that, according to the implementation requirement, each component/step described in the embodiment of the present application may be divided into more components/steps, and two or more components/steps or partial operations of the components/steps may also be combined into a new component/step to achieve the purpose of the embodiment of the present application.
The above-described methods according to embodiments of the present application may be implemented in hardware or firmware, or as software or computer code that can be stored in a recording medium such as a CD-ROM, RAM, floppy disk, hard disk, or magneto-optical disk, or as computer code originally stored in a remote recording medium or a non-transitory machine-readable medium, downloaded over a network, and stored in a local recording medium. The methods described herein can thus be processed by software stored on a recording medium using a general-purpose computer, a dedicated processor, or programmable or dedicated hardware such as an ASIC or FPGA. It will be appreciated that the computer, processor, microprocessor controller, or programmable hardware includes memory components (e.g., RAM, ROM, flash memory, etc.) that can store or receive software or computer code that, when accessed and executed by the computer, processor, or hardware, implements the data processing methods described herein. Further, when a general-purpose computer accesses code for implementing the data processing methods shown herein, execution of the code converts the general-purpose computer into a special-purpose computer for executing those methods.
Those of ordinary skill in the art will appreciate that the various illustrative elements and method steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the present application.
The above embodiments are only used for illustrating the embodiments of the present application, and not for limiting the embodiments of the present application, and those skilled in the relevant art can make various changes and modifications without departing from the spirit and scope of the embodiments of the present application, so that all equivalent technical solutions also belong to the scope of the embodiments of the present application, and the scope of patent protection of the embodiments of the present application should be defined by the claims.

Claims (14)

1. A method of data processing, comprising:
acquiring a plurality of source poses of an acquisition device during acquisition of a source point cloud, wherein the source poses comprise positions and directions of the acquisition device in a world coordinate system at different moments;
obtaining a plurality of source truth points marked on a target object in the source point cloud, wherein the source truth points comprise timestamps and a first three-dimensional coordinate in a world coordinate system;
calculating the three-dimensional coordinates of the source true value point in the coordinate system of the acquisition device according to the target time represented by the timestamp and the source pose of the acquisition device at the target time;
acquiring a plurality of new poses of the acquisition device under the test point cloud;
converting the three-dimensional coordinates of the source true value point in the acquisition device coordinate system into second three-dimensional coordinates in a world coordinate system corresponding to the test point cloud according to the new pose of the acquisition device at the target time;
and calculating characteristic points of the target object in the test point cloud based on the second three-dimensional coordinates.
2. The method of claim 1, wherein said calculating three-dimensional coordinates of the source true value point in the acquisition device coordinate system based on a target time represented by the timestamp and a source pose of the acquisition device at the target time comprises:
acquiring the source position and posture of the acquisition device at the target moment;
and determining the three-dimensional coordinates of the source true value point in the coordinate system of the acquisition device according to the first three-dimensional coordinates of the source true value point at the target time and the source position and posture of the acquisition device at the target time.
3. The method of claim 2, wherein the obtaining a source pose of the acquisition device at the target time comprises:
determining whether the time corresponding to the source poses comprises the target time;
if the target time is not included, acquiring a previous time and a subsequent time which are adjacent to the target time from the times corresponding to the source poses;
acquiring the source pose of the previous moment and the source pose of the later moment;
and interpolating the source pose at the previous moment and the source pose at the later moment, and determining the source pose of the acquisition device at the target moment according to an interpolation result.
4. The method of claim 1, wherein said calculating feature points of the target object in the test point cloud based on the second three-dimensional coordinates comprises:
determining a geometric shape corresponding to the target object;
and calculating the three-dimensional coordinates of the characteristic points of the target object in the test point cloud in the world coordinate system according to the geometric shape of the target object and the second three-dimensional coordinates of the source truth points at the target time.
5. The method of claim 1, wherein the target object comprises at least one of a first type of road marking having a length greater than a set value, a second type of road marking having a length less than or equal to the set value, a sign, a road surface arrowed direction sign, and a round bar body.
6. The method of claim 5, wherein,
if the target object is a first-class road marking, the source truth points of the target object comprise at least two pairs of points, and two points in one pair of points are positioned on different sides of a central axis of the target object; and/or,
if the target object is a second type of road marking, the source true value points comprise at least two points located on each side of the target object, and a top point and a bottom point in the central axis direction of the target object; and/or,
if the target object is a sign or a direction mark with an arrow on a road surface, the source true value points comprise at least two points on each side edge of the target object; and/or,
if the target object is a round bar body, the source true point comprises a center point of a top cross section and a center point of a bottom cross section of the target object.
7. A method of data processing, comprising:
displaying a source point cloud of a target object in a display interface;
receiving a selection operation of a plurality of shape points in the source point cloud;
determining a source true value point of the target object according to the selected plurality of shape points;
determining feature points of the target object in the test point cloud using the method of any of claims 1-6 based on the source true value points.
8. The method of claim 7, wherein the determining source truth points for the target object from the selected plurality of shape points comprises:
if the target object is a round rod body, determining the main shaft direction of the round rod body according to the point cloud contained in the round rod body;
selecting a first end point set and a second end point set from the point clouds contained in the round rod body according to the main shaft direction;
projecting the concentrated points of the first end points to the top end surface of the round rod body along the main shaft direction, and projecting the concentrated points of the second end points to the bottom end surface of the round rod body along the main shaft direction;
determining a source true value point of the top end of the round rod body according to the projection result of the first end face;
and determining a source true value point of the bottom end of the round rod body according to the projection result of the second end surface.
9. The method of claim 8, wherein the determining source truth points for the target object from the selected plurality of shape points further comprises:
and determining the average time corresponding to the point cloud contained in the round bar body as the time stamps of the source true value point at the top end and the source true value point at the bottom end of the round bar body.
10. The method according to claim 8 or 9, wherein the determining a source true point of the circular rod tip from the projection of the first end face comprises:
and fitting the top end section of the round rod body according to the projection result of the first end surface, and taking the central point of the top end section as a source true point of the top end.
11. The method according to claim 8 or 9, wherein the determining a source true value point of the bottom end of the circular rod body according to the projection result of the second end surface comprises:
and fitting a bottom cross section of the round rod body according to the projection result of the second end surface, and taking the central point of the bottom cross section as a source true point of the bottom.
12. An electronic device, comprising: the system comprises a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface complete mutual communication through the communication bus;
the memory is used for storing at least one executable instruction, and the executable instruction causes the processor to execute the corresponding operation of the data processing method according to any one of claims 1 to 6 or any one of claims 7 to 11.
13. A computer storage medium on which a computer program is stored which, when executed by a processor, implements a data processing method as claimed in any one of claims 1 to 6 or any one of claims 7 to 11.
14. A computer program which, when executed by a processor, implements a data processing method as claimed in any one of claims 1 to 6 or any one of claims 7 to 11.
CN202111659713.5A (priority date 2021-12-30, filed 2021-12-30): Data processing method, electronic device and computer storage medium — Pending; publication CN114332228A.

Publications (1)

Publication Number: CN114332228A; Publication Date: 2022-04-12

Family ID: 81019671
Country Status (1): CN — CN114332228A (en)


Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101533529A (en) * 2009-01-23 2009-09-16 北京建筑工程学院 Range image-based 3D spatial data processing method and device
WO2016157802A1 (en) * 2015-03-27 2016-10-06 日本電気株式会社 Information processing apparatus, information processing system, information processing method, and storage medium
CN110411464A (en) * 2019-07-12 2019-11-05 中南大学 Three-dimensional point cloud ground drawing generating method, device, equipment and storage medium
CN111442722A (en) * 2020-03-26 2020-07-24 达闼科技成都有限公司 Positioning method, positioning device, storage medium and electronic equipment
CN112053365A (en) * 2019-06-06 2020-12-08 武汉星源云意科技有限公司 Method for extracting laser point cloud data of power transmission line conductor based on linear characteristics
US20200402300A1 (en) * 2019-06-21 2020-12-24 Harbin Institute Of Technology Terrain modeling method that fuses geometric characteristics and mechanical charateristics, computer readable storage medium, and terrain modeling system thereof
CN112733812A (en) * 2021-03-01 2021-04-30 知行汽车科技(苏州)有限公司 Three-dimensional lane line detection method, device and storage medium
CN113137973A (en) * 2020-01-20 2021-07-20 北京初速度科技有限公司 Image semantic feature point truth value determining method and device
CN113255578A (en) * 2021-06-18 2021-08-13 湖北亿咖通科技有限公司 Traffic identification recognition method and device, electronic equipment and storage medium


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115102932A (en) * 2022-06-09 2022-09-23 腾讯科技(深圳)有限公司 Data processing method, device, equipment, storage medium and product of point cloud media
CN115102932B (en) * 2022-06-09 2024-01-12 腾讯科技(深圳)有限公司 Data processing method, device, equipment, storage medium and product of point cloud media


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination