CN114895250A - Radar data processing method and device, electronic equipment and storage medium - Google Patents

Radar data processing method and device, electronic equipment and storage medium

Info

Publication number
CN114895250A
Authority
CN
China
Prior art keywords
ith
radar data
reference frame
straight line
current frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210411698.0A
Other languages
Chinese (zh)
Inventor
王三喜
方晓波
张勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Newpoint Intelligent Technology Group Co ltd
Original Assignee
Newpoint Intelligent Technology Group Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Newpoint Intelligent Technology Group Co ltd filed Critical Newpoint Intelligent Technology Group Co ltd

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00: Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/02: Details of systems according to group G01S 13/00
    • G01S 7/021: Auxiliary means for detecting or identifying radar signals or the like, e.g. radar jamming signals
    • G01S 7/022: Road traffic radar detectors
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/88: Radar or analogous systems specially adapted for specific applications
    • G01S 13/91: Radar or analogous systems specially adapted for specific applications for traffic control

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

An embodiment of the invention provides a radar data processing method and apparatus, an electronic device, and a storage medium, where the method comprises the following steps: acquiring radar data of a current frame and radar data of a reference frame; acquiring a first object that is present in the reference frame but absent from the current frame, and a second object that is absent from the reference frame but present in the current frame; acquiring a following object of the ith first object in the reference frame; when the following object is acquired, acquiring a spliced object of the ith first object in the current frame from the second objects according to the radar data of the following object in the reference frame and the radar data of the ith first object in the reference frame; and when the spliced object is acquired, determining the radar data of the spliced object in the current frame as the radar data of the ith first object in the current frame. The embodiment of the present invention thereby alleviates the problem that missing radar data causes a detected object within the detection range to disappear.

Description

Radar data processing method and device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of data processing technologies, and in particular, to a radar data processing method and apparatus, an electronic device, and a storage medium.
Background
With the rapid development of the intelligent transportation industry, intersection radar equipment is attracting increasing attention. Radar equipment senses the speed and size of dynamic road objects within a certain surrounding range according to the reflection characteristics of radio waves, thereby reconstructing the continuous traffic flow state of the sensing area.
At present, because intersections require large numbers of radar devices and are constrained by cost, they usually cannot install high-performance, high-precision radars at scale, and large-scale replacement with high-precision radars in the short term is unlikely; consequently, most intersection radar equipment is of medium or low precision.
Medium- and low-precision radar equipment generates signals at a low frequency, so the returned images have low resolution and track objects are easily lost during detection. Moreover, as the distance between the radar and the detected object increases, the emission and return of the radar signal are more easily interfered with, and the corresponding detection precision decreases. The radar's detection range is also affected by many factors, such as installation angle and height, weather, obstruction by obstacles, and overlapping acquisition targets.
Therefore, in the prior art, when medium- or low-precision radar devices are used to detect objects at an intersection, the returned detection data may be missing, causing a detected object within the detection range to disappear.
Disclosure of Invention
The embodiment of the invention provides a radar data processing method and device, electronic equipment and a storage medium, and aims to solve the problem that a detection object in a detection range disappears due to the loss of radar data.
In a first aspect of embodiments of the present invention, a radar data processing method is provided, where the method includes:
acquiring radar data of a current frame and radar data of a reference frame, wherein the reference frame is a previous frame of the current frame, and the radar data comprises radar data detected by a radar on at least one object;
obtaining a first object that is present in the reference frame and that is not present in the current frame;
acquiring a following object of the ith first object in the reference frame, wherein i is an integer from 1 to N, and N is the total number of the first objects;
under the condition that the following object is obtained, obtaining a spliced object of the ith first object in the current frame according to radar data of the following object in the reference frame and radar data of the ith first object in the reference frame;
when the spliced object is obtained, determining the radar data of the spliced object in the current frame as the radar data of the ith first object in the current frame.
In a second aspect of the embodiments of the present invention, there is also provided a radar data processing apparatus, including:
the radar data acquisition module is used for acquiring radar data of a current frame and radar data of a reference frame, wherein the reference frame is a previous frame of the current frame, and the radar data comprises radar data of at least one object detected by a radar;
a first object obtaining module for obtaining a first object that is present in the reference frame and is not present in the current frame;
the following object acquisition module is used for acquiring the following object of the ith first object in the reference frame, wherein i is an integer from 1 to N, and N is the total number of the first objects;
a spliced object acquisition module, configured to, when the following object is acquired, acquire a spliced object of an ith first object in the current frame according to radar data of the following object in the reference frame and radar data of the ith first object in the reference frame;
and the determining module is used for determining the radar data of the spliced object in the current frame as the radar data of the ith first object in the current frame when the spliced object is obtained.
In a third aspect of the embodiments of the present invention, there is further provided an electronic device, including a processor, a communication interface, a memory, and a communication bus, where the processor, the communication interface, and the memory complete communication with each other through the communication bus;
a memory for storing a computer program;
and the processor is used for realizing the radar data processing method when executing the program stored in the memory.
In a fourth aspect of the embodiments of the present invention, there is also provided a computer-readable storage medium having stored therein instructions, which, when run on a computer, cause the computer to execute any of the above-described radar data processing methods.
In a fifth aspect of the embodiments of the present invention, there is also provided a computer program product containing instructions which, when run on a computer, cause the computer to perform any of the radar data processing methods described above.
The embodiment of the invention at least comprises the following technical effects:
the radar data processing method provided by the embodiment of the invention can acquire radar data of a current frame and radar data of a reference frame, wherein the reference frame is a previous frame of the current frame, and the radar data comprises radar data of at least one object detected by a radar; then, a first object existing in the reference frame and not existing in the current frame and a second object not existing in the reference frame and existing in the current frame are obtained; thirdly, acquiring a following object of the ith first object in the reference frame, wherein i is an integer from 1 to N, and N is the total number of the first objects; thus, when the following object is acquired, acquiring a spliced object of the ith first object in the current frame from the second object according to the radar data of the following object in the reference frame and the radar data of the ith first object in the reference frame; and when the spliced object is obtained, determining the radar data of the spliced object in the current frame as the radar data of the ith first object in the current frame.
Therefore, in the embodiment of the invention, when the radar data of the current frame is acquired, the missing track of an object whose radar data is absent from the current frame can be predicted and spliced according to the radar data of the current frame and of the previous frame; that is, the spliced object is determined according to the following object, realizing real-time track splicing for objects with missing radar data. The processed current frame is then used as the reference frame of the next frame, so that the radar data acquired in real time can be continuously repaired, which alleviates, to a certain extent, the problem that missing radar data causes a detected object within the detection range to disappear.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below.
Fig. 1 is a schematic diagram of a radar data processing method provided in an embodiment of the present invention;
FIG. 2 is a schematic illustration of factors influencing vehicle prediction within an intersection provided in an embodiment of the present invention;
FIG. 3 is a schematic view of a semicircular area for determining an object to which a disturbance is applied, provided in an embodiment of the present invention;
fig. 4 is a flowchart of a specific implementation of a radar data processing method provided in an embodiment of the present invention;
FIG. 5 is a schematic diagram of a radar track before a vehicle disappears, provided in an embodiment of the present invention;
FIG. 6 is a schematic illustration of radar track prediction for a disappearing vehicle as provided in an embodiment of the present invention;
FIG. 7 is a schematic illustration of a radar track stitching for a disappearing vehicle as provided in an embodiment of the present disclosure;
fig. 8 is a schematic structural diagram of a radar data processing apparatus according to an embodiment of the present invention;
fig. 9 is a block diagram of an electronic device provided in an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
In various embodiments of the present invention, it should be understood that the sequence numbers of the following processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
The technical solution in the embodiment of the present invention is described below with reference to the drawings in the embodiment of the present invention.
Fig. 1 is a schematic flow chart of a radar data processing method according to an embodiment of the present invention, where the method includes the following steps:
step 101: and acquiring radar data of the current frame and radar data of the reference frame.
The reference frame is a frame before the current frame, and the radar data comprises radar data detected by a radar on at least one object.
For example, within a time period, if the current frame is the 10th frame, the reference frame is the 9th frame; each frame spans 0.1 seconds.
In addition, the radar data may include the type of object, the position (e.g., latitude and longitude information), the speed, and the moving direction (e.g., direction angle information).
Step 102: a first object present in the reference frame and absent in the current frame and a second object absent in the reference frame and present in the current frame are obtained.
Wherein, being present in the reference frame indicates that the object has radar data in the reference frame; being absent from the current frame indicates that the object has no radar data in the current frame.
For example, suppose the radar device detects two objects, A and B, where A has no recorded radar data in the reference frame but has recorded radar data in the current frame, while B has recorded radar data in the reference frame but none in the current frame; then A is determined to be a second object and B a first object.
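As a sketch, the frame comparison in step 102 amounts to two set differences over object IDs (assuming, hypothetically, that each frame is represented as a dict mapping object ID to its radar record):

```python
def split_objects(reference_frame, current_frame):
    """Split tracked objects into 'first' (lost) and 'second' (new) sets.

    Both frames are assumed to be dicts mapping object ID -> radar record,
    e.g. {"1234": {"x": ..., "y": ..., "speed": ..., "heading": ...}}.
    """
    ref_ids = set(reference_frame)
    cur_ids = set(current_frame)
    first_objects = ref_ids - cur_ids   # present in reference, absent now
    second_objects = cur_ids - ref_ids  # newly appearing in the current frame
    return first_objects, second_objects
```

With the A/B example above, `split_objects({"B": ...}, {"A": ...})` yields `({"B"}, {"A"})`.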
Step 103: and acquiring the following objects of the ith first object in the reference frame, wherein i is an integer from 1 to N, and N is the total number of the first objects.
Wherein the following object represents an object located in front of the first object in the reference frame.
For example, if the first object is a vehicle A, the following object is chosen from the vehicles traveling in a straight line ahead of vehicle A within a predetermined distance; among them, the vehicle B closest to vehicle A can be identified as the following object based on the travel angle and position of vehicle A.
In addition, when N is an integer greater than 1 (i.e., when the first object is plural), it is necessary to acquire the following object of each first object.
Step 104: when the following object is acquired, acquiring a spliced object of the ith first object in the current frame from the second object according to the radar data of the following object in the reference frame and the radar data of the ith first object in the reference frame.
Wherein, since the first object is present in the reference frame but absent from the current frame, it may be referred to as a "disappeared object"; and since the second object is absent from the reference frame but present in the current frame, it may be referred to as an "object to be spliced". In the embodiment of the present invention, the following object of the disappeared object may be determined first, so that the spliced object of the disappeared object is selected from the "objects to be spliced" according to the following object.
Step 105: when the spliced object is obtained, determining the radar data of the spliced object in the current frame as the radar data of the ith first object in the current frame.
After the spliced object is acquired, the radar data of the spliced object in the current frame is determined as the radar data of the ith first object in the current frame, and the identification information (ID) included in the radar data of the spliced object is modified to the ID of the ith first object. For example, the ID of vehicle A is "1234" and the ID of vehicle B is "1212"; if vehicle A is the spliced object of vehicle B, the ID of vehicle A is modified to "1212".
In addition, after the step 105 is executed, the current frame is used as a reference frame of the next frame, so that the steps 101 to 105 are repeatedly executed, and the radar data obtained in real time is continuously repaired.
In addition, the radar data may be repaired (i.e., steps 101 to 105 performed once) at preset time intervals, or when a preset operation is received (for example, a click on a preset control on the radar data display interface).
It should be noted that, when there are a plurality of radars, the operations of the above steps 101 to 105 may be performed individually for the radar data of each radar.
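Under the same hypothetical frame representation, steps 101 to 105 can be sketched as one repair pass, with the following-object search, position prediction, and splice matching (each detailed later in the description) supplied as helper callables:

```python
def repair_frame(reference_frame, current_frame,
                 find_follower, predict_position, match_splice):
    """One repair pass over steps 101-105 (a sketch; the three helper
    callables stand in for the following-object search, target-position
    prediction, and splice matching described later in the text).

    Frames are assumed to be dicts mapping object ID -> radar record."""
    first = set(reference_frame) - set(current_frame)   # lost objects
    second = set(current_frame) - set(reference_frame)  # new objects
    repaired = dict(current_frame)
    for obj_id in first:
        follower = find_follower(obj_id, reference_frame)
        if follower is None:
            continue  # no following object acquired: skip steps 104-105
        target = predict_position(obj_id, follower, reference_frame)
        splice_id = match_splice(target, second, current_frame)
        if splice_id is not None:
            # step 105: relabel the spliced object's data with the lost ID
            repaired[obj_id] = dict(current_frame[splice_id])
            repaired.pop(splice_id, None)
            second.discard(splice_id)
    return repaired  # serves as the reference frame for the next pass
```

The returned frame can be fed back in as the next reference frame, matching the repeated execution described above.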
As can be seen from the foregoing steps 101 to 105, the radar data processing method provided in the embodiment of the present invention can acquire radar data of a current frame and radar data of a reference frame, where the reference frame is a frame previous to the current frame, and the radar data includes radar data detected by a radar on at least one object; then, a first object existing in the reference frame and not existing in the current frame and a second object not existing in the reference frame and existing in the current frame are obtained; thirdly, acquiring a following object of the ith first object in the reference frame, wherein i is an integer from 1 to N, and N is the total number of the first objects; thus, when the following object is acquired, acquiring a spliced object of the ith first object in the current frame from the second object according to the radar data of the following object in the reference frame and the radar data of the ith first object in the reference frame; and when the spliced object is obtained, determining the radar data of the spliced object in the current frame as the radar data of the ith first object in the current frame.
Therefore, in the embodiment of the invention, when the radar data of the current frame is acquired, the missing track of an object whose radar data is absent from the current frame can be predicted and spliced according to the radar data of the current frame and of the previous frame; that is, the spliced object is determined according to the following object, realizing real-time track splicing for objects with missing radar data. The processed current frame is then used as the reference frame of the next frame, so that the radar data acquired in real time can be continuously repaired, which alleviates, to a certain extent, the problem that missing radar data causes a detected object within the detection range to disappear.
Optionally, the acquiring a following object of the ith first object in the reference frame includes:
determining a reference area where the ith first object is located according to radar data of the ith first object in the reference frame;
acquiring a third object within the reference region in the reference frame;
and determining an object which is closest to the ith first object and is positioned in front of the ith first object in the third objects as a following object of the ith first object in the reference frame.
Wherein the reference region may be a rectangular region in front of the first object in the reference frame; a third object is then an object located within that region, in front of the first object, in the reference frame.
In addition, the above-mentioned "front" means a front in the moving direction of the i-th first object.
As can be seen from the above description, in the embodiment of the present invention, a reference area may be defined based on the position of the ith first object in the reference frame, so that an object closest to the ith first object and located in front of the ith first object in the reference area is taken as a following object of the ith first object.
Optionally, the radar data includes a position and a direction of motion of the object;
the determining a reference area where the ith first object is located according to the radar data of the ith first object in the reference frame includes:
determining a target straight line, wherein the target straight line passes through the position of the ith first object in the reference frame and is parallel to the motion direction of the ith first object in the reference frame;
obtaining a first straight line, a second straight line, a third straight line and a fourth straight line based on the target straight line, and determining a rectangular area formed by intersecting the first straight line, the second straight line, the third straight line and the fourth straight line as the reference area;
the first straight line and the second straight line are positioned on two sides of the target straight line and are parallel to the first straight line and the second straight line, the distance between the first straight line and the target straight line is a first preset value, and the distance between the second straight line and the target straight line is the first preset value;
the third straight line passes through the position of the ith first object in the reference frame and is perpendicular to the target straight line, the fourth straight line passes through a first target reference point and is perpendicular to the target straight line, the first target reference point is a point on the target straight line forward along the motion direction of the ith first object in the reference frame, and the distance between the first target reference point and the third straight line is the second preset value.
For example, the first object is a vehicle A whose radar data includes its position and motion direction. Taking the motion direction of vehicle A as the line direction, a straight line is drawn through the position of vehicle A and recorded as the target straight line. Two straight lines parallel to the target straight line, one on each side of it, are drawn and recorded as the first straight line and the second straight line, where the distance from each of them to the target straight line may be a first preset value (e.g., 1.5 meters). A straight line perpendicular to the target straight line and passing through the position of vehicle A is recorded as the third straight line. A fourth straight line parallel to the third straight line is drawn forward along the moving direction of vehicle A, at a distance from the third straight line that may be a second preset value (e.g., 100 meters). The rectangular area enclosed by the first, second, third, and fourth straight lines is the reference region where vehicle A is located.
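The reference-region test above can be sketched by projecting each candidate's offset onto the heading of vehicle A. This is a hypothetical illustration: the field names `x`, `y`, `heading`, the local metric coordinates, and the defaults (mirroring the example values of 1.5 and 100 meters) are assumptions, not the patent's specification.

```python
import math

def find_following_object(first_obj, candidates, half_width=1.5, length=100.0):
    """Pick the following object: the nearest candidate inside the rectangular
    reference region ahead of `first_obj`.

    Objects are dicts with 'x', 'y' in metres; `first_obj` also carries
    'heading' in radians (field names are assumptions for illustration)."""
    hx, hy = math.cos(first_obj["heading"]), math.sin(first_obj["heading"])
    best, best_dist = None, float("inf")
    for cand in candidates:
        dx, dy = cand["x"] - first_obj["x"], cand["y"] - first_obj["y"]
        longitudinal = dx * hx + dy * hy   # distance ahead along the target line
        lateral = -dx * hy + dy * hx       # signed offset from the target line
        inside = 0.0 < longitudinal <= length and abs(lateral) <= half_width
        if inside and longitudinal < best_dist:
            best, best_dist = cand, longitudinal
    return best  # None when no object lies in the reference region
```

For a vehicle at the origin heading along +x, a candidate at (10, 0) beats one at (50, 1), while one at (5, 3) is rejected for lying outside the 1.5 m half-width.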
It should be noted that the first preset value and the second preset value may be set according to a motion scene in which the first object is located. For example, in a driving lane, the first preset value and the second preset value are determined according to lane grades.
Optionally, the obtaining, according to the radar data of the following object in the reference frame and the radar data of the ith first object in the reference frame, a spliced object of the ith first object in the current frame from the second object includes:
under the condition that the following object is obtained, predicting the target position of the ith first object in the current frame according to the radar data of the following object in the reference frame and the radar data of the ith first object in the reference frame;
and acquiring a splicing object of the ith first object in the current frame from the second object according to the target position.
As can be seen from the above, in the embodiment of the present invention, when a following object is acquired, a target position of an ith first object in a current frame may be predicted according to radar data of the following object and the ith first object in a reference frame, so that a spliced object of the ith first object may be selected from the second objects according to the target position.
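One plausible way to select the spliced object from the second objects by the predicted target position is a nearest-neighbor match within a tolerance. This is a sketch; the threshold `max_dist` and the planar coordinate fields are assumptions not specified by the patent.

```python
import math

def match_splice_object(target_xy, second_objects, max_dist=5.0):
    """Choose the second object closest to the predicted target position.

    second_objects: dict mapping object ID -> record with 'x', 'y' (metres,
    assumed). Returns None when nothing lies within `max_dist`."""
    best_id, best_d = None, max_dist
    for obj_id, rec in second_objects.items():
        d = math.hypot(rec["x"] - target_xy[0], rec["y"] - target_xy[1])
        if d <= best_d:  # keep the nearest candidate within tolerance
            best_id, best_d = obj_id, d
    return best_id
```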
Optionally, the radar data includes a position and a velocity of the object;
the predicting the target position of the ith first object in the current frame according to the radar data of the following object in the reference frame and the radar data of the ith first object in the reference frame under the condition of acquiring the following object includes:
in the case that the following object is acquired, calculating a first acceleration of the ith first object in the reference frame according to the speed and position of the following object in the reference frame, the speed and position of the ith first object in the reference frame, and a predetermined following model;
calculating a second acceleration of the ith first object in the reference frame, wherein the second acceleration is an acceleration generated by a resultant force of a boundary acting force and a repulsive force applied to the first object in the reference frame;
determining the comprehensive acceleration of the ith first object in the reference frame according to the first acceleration and the second acceleration;
and predicting the target position of the ith first object in the current frame according to the comprehensive acceleration.
Wherein the following model describes the interaction between two adjacent vehicles in a queue driving on a single-lane road where overtaking is not possible; it is a mathematical model of the driving state of a rear vehicle following the vehicle ahead of it under such conditions.
In addition, the speed and position (i.e., longitude and latitude coordinates) of the following object in the reference frame and the speed and position of the ith first object in the reference frame are substituted into the following-model formulas to calculate the first acceleration:

$$a = a'\left[1 - \left(\frac{v_a}{v_o}\right)^{\alpha} - \left(\frac{s^{*}}{s_a}\right)^{\beta}\right]$$

$$s^{*} = s_0 + v_a T + \frac{v_a\,\Delta v}{2\sqrt{a' b}}$$

where $a$ denotes the first acceleration; $a'$ denotes the desired acceleration on the current road section; $v_a$ denotes the velocity of the first object; $v_o$ denotes the desired velocity of the first object in the current motion scene; $v$ denotes the velocity of the following object; $\Delta v = v_a - v$ denotes the velocity difference between the first object and the following object; $T$ denotes the safe following time headway (the ratio of the longitudinal distance between the two vehicles to the speed of the rear vehicle, commonly used to characterize the safety of an adaptive cruise function; it can be regarded as the time before the rear vehicle would collide with the front vehicle if the front vehicle braked to a stop and the rear vehicle did not decelerate, e.g., 1.5 seconds); $b$ denotes the desired deceleration (the braking deceleration, i.e., the ratio of the speed change under braking to the time over which that change occurs); $s_a$ denotes the longitudinal distance between the first object and the following object; $s_0$ denotes the standstill distance, 2 meters; $\alpha$ is a constant with a default value of 4; and $\beta$ is a constant with a default value of 2.
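The following-model formula above can be sketched directly as an intelligent-driver-model-style function. The document fixes T = 1.5 s, s0 = 2 m, alpha = 4, and beta = 2; the default values for the desired acceleration and deceleration are assumptions for illustration.

```python
import math

def idm_acceleration(v_a, v_o, delta_v, gap,
                     a_prime=1.5, b=2.0, T=1.5, s0=2.0, alpha=4.0, beta=2.0):
    """First acceleration of the lost (first) object from the following model.

    v_a: speed of the first object; v_o: its desired speed; delta_v: v_a minus
    the speed of the following (lead) object; gap: longitudinal distance s_a
    to the lead object. a_prime and b are the desired acceleration and
    deceleration (default values assumed)."""
    s_star = s0 + v_a * T + v_a * delta_v / (2.0 * math.sqrt(a_prime * b))
    return a_prime * (1.0 - (v_a / v_o) ** alpha - (s_star / gap) ** beta)
```

With a large gap the result approaches free acceleration; with a short gap the gap term dominates and the model commands braking.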
It should be noted that the motion state of the first object involves greater uncertainty in different scenes. As shown in Fig. 2, for a vehicle driving through a complex intersection without a definite driving direction and speed, its driving direction may be disturbed by neighboring vehicles, interacting left-turning vehicles, neighboring non-motorized vehicles, and the like, which increases the uncertainty of target position prediction; real-time trajectory prediction must also account for many other complex environmental influences. Therefore, at a complex intersection, since the following model cannot accurately represent finer behaviors such as turning, a social force model is further required to predict the lost vehicle.
The social force model is a traffic flow model that accounts for the driving direction, the target direction, and interference from surrounding vehicles, pedestrians, and non-motorized vehicles. The resultant force $\vec{F}_i$ on a vehicle $i$ driving through a complex intersection mainly comprises a driving force $\vec{F}_i^{\mathrm{drv}}$, a boundary force $\vec{F}_i^{\mathrm{bnd}}$, and a repulsive force $\vec{F}_i^{\mathrm{rep}}$. The boundary force represents the driver's tendency to avoid traveling beyond lane lines and other boundary lines, and the repulsive force is generated by the driver's avoidance of collisions with vehicles in the field of view.
It should be noted that the driving force is represented by the acceleration of the car-following model, which may be regarded as the first acceleration; the first acceleration and the second acceleration are then added to obtain the comprehensive acceleration, and the target position of the first object in the current frame is predicted based on the comprehensive acceleration.
In the embodiment of the invention, the comprehensive acceleration and the radar data of the first object in the reference frame are substituted into the following target position prediction formulas to obtain the longitude and latitude coordinates of the first object in the current frame:

s = v·Δt + (1/2)·a·Δt²

x_1 = x_0 + |s|·cos θ_1

y_1 = y_0 + |s|·sin θ_1

where s denotes the displacement from the position of the first object in the reference frame to its position in the current frame; v denotes the velocity of the first object in the reference frame; a denotes the comprehensive acceleration; Δt denotes the time interval between the reference frame and the current frame; x_1 denotes the predicted abscissa of the target position; y_1 denotes the predicted ordinate of the target position; θ_1 denotes the direction angle of s; x_0 denotes the abscissa of the first object in the reference frame; and y_0 denotes the ordinate of the first object in the reference frame.
Optionally, the method further includes:
under the condition that the following object is not acquired, setting the first acceleration of the ith first object as a target preset value;
calculating a second acceleration of the ith first object in the reference frame, wherein the second acceleration is an acceleration generated by a resultant force of a boundary acting force and a repulsive force applied to the first object in the reference frame;
determining the comprehensive acceleration of the ith first object in the reference frame according to the first acceleration and the second acceleration;
and predicting the target position of the ith first object in the current frame according to the comprehensive acceleration.
Here, no following object is acquired when the first object is the leading object in the current scene. For example, when the first object is vehicle A and the current scene is a queuing area, the motion state of vehicle A is determined according to the motion state of the vehicles in the adjacent lane: if the signal in the moving direction of vehicle A is a red light (no passing), the speed and the acceleration of vehicle A are kept at 0; if the signal in the moving direction of vehicle A is a green light (passing allowed), the acceleration of vehicle A is the maximum acceleration of the current road section.
If the first object is vehicle A and the current scene is a non-queuing area, the acceleration of vehicle A is 0, and its speed and moving direction are the same as in the previous frame.
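The preset rules above for a first object without a following object can be sketched as a small decision function; a minimal Python sketch with the scene and signal labels as hypothetical placeholders:

```python
def preset_motion(scene, signal=None, max_accel=2.0):
    """Motion preset for a first object with no following object (a sketch).

    Returns (acceleration, hold_speed_zero) following the rules in the text:
    queuing area + red light   -> speed and acceleration held at 0;
    queuing area + green light -> maximum acceleration of the road section;
    non-queuing area           -> acceleration 0, keep previous speed/heading.
    """
    if scene == "queuing":
        if signal == "red":
            return 0.0, True     # red light: hold v = 0 as well
        return max_accel, False  # green light: pull away at the road maximum
    return 0.0, False            # non-queuing: coast at the previous speed
```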
As can be seen from the above, if the following object is not acquired, the first acceleration of the ith first object is set as the target preset value, and then the first acceleration is combined with the acceleration (i.e., the second acceleration) generated by the resultant force of the boundary acting force and the repulsive force applied to the first object in the reference frame, so that the target position of the ith first object in the current frame can be predicted according to the combined integrated acceleration.
Optionally, the calculating a second acceleration of the ith first object in the reference frame includes:
determining the boundary acting force of the ith first object in the reference frame;
acquiring an interference application object of the ith first object in the reference frame;
acquiring a repulsive force of each of the interference applying objects received by the ith first object;
calculating a resultant force of the boundary acting force and each of the repulsive forces;
and calculating the second acceleration according to the resultant force.
In the embodiment of the present invention, a force may be obtained as the product of the mass of the object and its acceleration; since the mass of the first object may be set to a default value of 1, the resultant force obtained by adding the boundary acting force and the repulsive forces may be regarded directly as the second acceleration.
In addition, the boundary acting force can be obtained according to the following boundary acting force formula:

F_B = A_B · exp(−|d_B| / B_B) · n_B

where A_B and B_B are constants, the gain factors of the boundary force; d_B is a vector whose magnitude is the distance of the first object from the boundary line and whose direction points from the first object to the boundary line; and n_B is a normal vector perpendicular to the boundary line.
In addition, the repulsive force can be obtained according to the following repulsive force formula:

F_ij = ω · A_V · exp(−|d_ij| / B_V) · n_ij

where i denotes the ith first object and j denotes the jth interference applying object; A_V and B_V are constants, the gain coefficients of the force between the first object and the interference applying object; ω is a constant; d_ij is a vector whose direction points from the ith first object to the jth interference applying object and whose magnitude is the distance between the ith first object and the jth interference applying object; and n_ij is a unit vector whose direction points from the jth interference applying object to the ith first object.
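The boundary and repulsive forces, and the second acceleration obtained from their resultant under unit mass, can be sketched as follows; a minimal Python sketch assuming the common exponential social-force form F = A·exp(−d/B)·n (function names and parameter defaults hypothetical):

```python
import math

def boundary_force(d_b, n_b, A_B=1.0, B_B=1.0):
    """F_B = A_B * exp(-|d_b| / B_B) * n_b: decays with distance to the line."""
    mag = math.hypot(d_b[0], d_b[1])
    k = A_B * math.exp(-mag / B_B)
    return (k * n_b[0], k * n_b[1])

def repulsive_force(d_ij, n_ij, A_V=1.0, B_V=1.0, omega=1.0):
    """F_ij = omega * A_V * exp(-|d_ij| / B_V) * n_ij: pushes i away from j."""
    mag = math.hypot(d_ij[0], d_ij[1])
    k = omega * A_V * math.exp(-mag / B_V)
    return (k * n_ij[0], k * n_ij[1])

def second_acceleration(fb, repulsions, mass=1.0):
    """With unit mass, the resultant of the boundary force and all
    repulsive forces is read directly as the second acceleration."""
    fx = fb[0] + sum(f[0] for f in repulsions)
    fy = fb[1] + sum(f[1] for f in repulsions)
    return (fx / mass, fy / mass)
```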
Optionally, the obtaining an interference-applying object of the ith first object in the reference frame includes:
taking a second target reference point as a circle center and a preset length as a radius to obtain a semicircle of which the boundary line passes through the position of the ith first object in the reference frame, wherein the second target reference point is a point along the motion direction of the ith first object;
determining an object located within the semi-circle defined region in the reference frame as the interference applying object.
That is, as shown in fig. 3, a position at the preset length forward of the ith first object along its moving direction in the reference frame is taken as the circle center; a semicircle with the preset length as its radius is then drawn between the circle-center position and the position of the ith first object, and any object within this semicircular area is an interference applying object of the ith first object.
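The semicircular interference region can be tested point by point. A minimal Python sketch of the membership check, under the assumption (one reading of the text) that the semicircle is the half of the circle lying between the circle center and the object's own position; names are hypothetical:

```python
import math

def in_interference_region(p, obj_pos, heading_rad, preset_len):
    """True if point p lies in the semicircle used to pick interference
    objects: the centre c sits preset_len ahead of the object along its
    moving direction, the radius is preset_len, and we keep the half of
    the circle on the object's side of the centre."""
    ux, uy = math.cos(heading_rad), math.sin(heading_rad)
    cx = obj_pos[0] + preset_len * ux
    cy = obj_pos[1] + preset_len * uy
    dx, dy = p[0] - cx, p[1] - cy
    inside_circle = dx * dx + dy * dy <= preset_len ** 2
    behind_centre = dx * ux + dy * uy <= 0.0  # half nearer the object
    return inside_circle and behind_centre
```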
Optionally, the radar data includes object type, speed, position and direction of motion;
the obtaining a splicing object of the ith first object in the current frame according to the target position includes:
acquiring objects which accord with preset conditions in the second objects to serve as candidate splicing objects, wherein the preset conditions comprise that the object types are the same as those of the ith first object, the deviation angle between the motion direction and the motion direction of the ith first object is smaller than a third preset value, and the deviation between the speed and the speed of the ith first object is smaller than a fourth preset value;
and acquiring an object which is closest to the ith first object in the candidate splicing objects to serve as the splicing object of the ith first object in the current frame.
When the object is a vehicle, the fourth preset value may take a different value according to whether the second object is a motor vehicle or a non-motor vehicle.
For example, suppose there are a vehicle A, a vehicle B, a vehicle C, and a vehicle D, where vehicle A is of type X with moving-direction angle a1 and speed a2; vehicle B is of type Y with moving-direction angle b1 and speed b2; vehicle C is of type X with moving-direction angle c1 and speed c2; and vehicle D is of type X with moving-direction angle d1 and speed d2. If the absolute difference between a1 and c1 is smaller than the third preset value and the absolute difference between a2 and c2 is smaller than the fourth preset value, vehicle C is determined to be a candidate splicing object; likewise, if the absolute difference between a1 and d1 is smaller than the third preset value and the absolute difference between a2 and d2 is smaller than the fourth preset value, vehicle D is determined to be a candidate splicing object. Of vehicles C and D, the one closest to the position of vehicle A is determined to be the splicing object.
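The candidate selection in this example is a filter-then-nearest search: filter on type, heading deviation, and speed deviation, then take the closest survivor. A minimal Python sketch (dictionary keys and threshold defaults hypothetical):

```python
import math

def find_splice_object(first_obj, candidates, max_angle_deg=30.0, max_dv=5.0):
    """Pick the splicing object for a disappeared (first) object.

    first_obj / candidates are dicts with keys: type, heading_deg, speed,
    x, y.  A candidate must share the object type, deviate in heading by
    less than the third preset value (max_angle_deg) and in speed by less
    than the fourth preset value (max_dv); among those, the nearest wins.
    """
    best, best_d = None, float("inf")
    for c in candidates:
        if c["type"] != first_obj["type"]:
            continue
        d_ang = abs(c["heading_deg"] - first_obj["heading_deg"]) % 360.0
        d_ang = min(d_ang, 360.0 - d_ang)  # wrap-around heading difference
        if d_ang >= max_angle_deg or abs(c["speed"] - first_obj["speed"]) >= max_dv:
            continue
        d = math.hypot(c["x"] - first_obj["x"], c["y"] - first_obj["y"])
        if d < best_d:
            best, best_d = c, d
    return best
```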
Optionally, the method further includes:
when the splicing object is not obtained, determining the target position as the position of the ith first object in the radar data in the current frame, and recording that the ith first object is not successfully spliced for one time;
and stopping predicting the position of the ith first object when the continuous times of the first object which is not spliced successfully reach the preset times.
After the predicted position (namely the target position) of the first object in the current frame is obtained and no splicing object is acquired, the target position is taken as the position of the first object in the current frame; thus, when the current frame serves as the reference frame of the next frame, the radar data can continue to be repaired. Prediction of the first object stops once the number of consecutive unsuccessful splicing attempts recorded for the same first object reaches a preset number.
In addition, when the ith first object has not been spliced successfully X times in succession, its prediction has already lasted for the duration of X frames; therefore, by judging whether the number of consecutive unsuccessful splices of the ith first object exceeds the preset number, it can be determined whether the prediction duration of the ith first object exceeds a preset duration.
If a disappeared object never reappears in the radar track, it would otherwise be predicted indefinitely, so that the number of objects at a given moment would no longer match the actual number, the predicted objects exceeding the actual ones. Therefore, an upper limit is set on the number or duration of predictions; for example, if no splicing object can be found within 5 seconds after the object disappears, the prediction is terminated.
In summary, the specific implementation of the radar data processing method according to the embodiment of the present invention can be as follows:
as shown in fig. 4, step H1: acquiring radar data acquired by a radar device for at least one detection object in a reference frame and a current frame;
step H2: determining a first object from the radar data of the reference frame and the current frame, wherein the first object is an object existing in the reference frame and not existing in the current frame;
step H3: determining whether the first object is a following object; calculating a first acceleration of the first object in a case where a following object exists in the first object; under the condition that the first object does not have a following object, the first acceleration of the first object is a preset value;
step H4: acquiring boundary acting force of a first object;
step H5: acquiring a repulsive force of the first object;
step H6: predicting the position of the first object in the current frame according to the first acceleration and a second acceleration generated by the resultant force of the boundary acting force and the repulsive force of the first object;
step H7: determining whether a splicing object exists according to the predicted position of the first object in the current frame; under the condition that the spliced object exists, modifying the ID of the spliced object into the ID of the first object, namely completing the track prediction of the first object; judging whether the predicted time length of the first object reaches the maximum predicted time length or not under the condition that no splicing object exists, and stopping predicting the first object under the condition that the maximum predicted time length is reached; under the condition that the maximum prediction duration is not reached, recording radar data predicted by a first object as radar data of the first object in a current frame (so that when the current frame is used as a reference frame of a next frame, the first object is considered to be present in the reference frame of the next frame), and entering the next frame;
step H8: determining whether the current frame is the last frame, and stopping prediction under the condition that the current frame is the last frame; in the case where the current frame is not the last frame, step H2 is performed.
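Steps H1 to H8 can be sketched as a frame-to-frame loop. A heavily simplified Python sketch in which the prediction and matching steps are left as stubs (all names hypothetical):

```python
def process_frames(frames, max_miss=5):
    """Frame-to-frame ID stitching loop (steps H1-H8), heavily simplified.

    frames: list of dicts {id: state}.  Objects present in the reference
    frame but absent from the current frame are predicted forward and
    written back, so the next iteration still sees them; after max_miss
    consecutive misses prediction stops for that ID.
    """
    misses = {}
    for k in range(1, len(frames)):
        ref, cur = frames[k - 1], frames[k]          # H1: reference + current
        disappeared = set(ref) - set(cur)            # H2: set D, vanished objects
        appeared = set(cur) - set(ref)               # set A, splice candidates
        for obj_id in disappeared:
            misses[obj_id] = misses.get(obj_id, 0) + 1
            if misses[obj_id] > max_miss:
                continue                              # H7: give up on this track
            predicted = ref[obj_id]                   # stub: H3-H6 predict here
            match = next(iter(appeared), None)        # stub: real code matches here
            if match is not None:
                cur[obj_id] = cur.pop(match)          # splice: reuse the old ID
                appeared.discard(match)
                misses.pop(obj_id, None)
            else:
                cur[obj_id] = predicted               # carry prediction forward
    return frames
```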
Specifically, for example, a reference frame data set is named as M, and a current frame data set is named as N;
the set of IDs present in M, absent in N, named D, refers to the disappearing vehicle;
the set of IDs present in N, absent in M, named a, refers to vehicles to be spliced that are present again.
Case one:
as shown in fig. 5, M exists in the graph, i.e., the vehicle (r) is not detected in N, i.e., the vehicle (r) disappears in N. Therefore, the track of the vehicle I is predicted in the N, the vehicle I is located in the intersection of the intersection at the moment, the direction angle of the vehicle is turned left before disappearance, and the vehicle I continues to run according to the original angle and the original speed under the condition that the effective following object of the vehicle I cannot be identified, namely the following acceleration of the vehicle I is 0 at the moment. Because the vehicle (I) is a left-turning vehicle, the vehicle is limited by the running boundary of the left-turning vehicle, namely, an acceleration a2 which is perpendicular to the current running direction and is towards the left is applied to the left-turning vehicle in the whole course inside the intersection of the intersection, so that the running angle of the left-turning vehicle is continuously changed, and the whole steering process is completed. Under the action of a2, the vehicle obtains a velocity v2 in the vertical direction.
The vehicle (I) has a vehicle (II) in a visible range, wherein the driving direction of the vehicle (II) faces the vehicle (I), and the speed of the vehicle (II) is greater than a threshold value, so that the vehicle (II) has a repulsive force acting on the predicted vehicle (I), the repulsive force is perpendicular to the driving direction of the vehicle (I) and faces the vehicle (II), after the original speed v1, the speed v2 caused by a boundary acting force and the speed v3 caused by repulsive force acceleration are combined, the driving speed of the vehicle (I) is v0, as shown in FIG. 6, the driving speed and the boundary acting force are considered, the repelling force of the vehicle (II) to the vehicle (I) is considered on the basis of the fifth speed, and the result is most suitable for the actual situation.
Case two:
In the current frame, vehicle ③ is present in M and absent in N, i.e., it disappears in N, and thus trajectory prediction is performed on vehicle ③. In the reference frame, vehicle ③ did not recognize a front vehicle and was acted on by the driving boundary force of the right-turn lane, so the predicted driving track of vehicle ③ assumes a right-turn posture. The trajectory of the target vehicle continues to be predicted over the following several frames.
As shown in fig. 7, while vehicle ③ is being predicted in the current frame, vehicle ④ appears in the next frame of the current frame: the distance between the position of vehicle ④ and the predicted position of vehicle ③ is less than 5 m, the vehicle types are the same, the difference in travelling direction is less than 30 degrees, and the speed difference is less than 5 m/s. The preset conditions for determining a splicing object are therefore satisfied, and vehicle ④ is considered to be vehicle ③ reappearing after its disappearance; the prediction of vehicle ③ is cancelled in the next frame, the trajectories of vehicles ③ and ④ are spliced, and the ID of vehicle ④ in all subsequent frames is renamed to the ID of vehicle ③.
As can be seen from the above, the radar data processing method according to the embodiment of the present invention predicts the movement tracks of a plurality of targets in real time based on a microscopic driving behavior model and the radar detection data transmitted in real time. It identifies the predicted target vehicle from the real-time data of two consecutive frames, predicts the driving acceleration of the disappearing vehicle in the straight-line direction in real time with the car-following model, and obtains the acceleration of the vehicle in other directions from the social force model on that basis, thereby predicting the travelling track of the disappearing vehicle in real time until a reappearing vehicle with a similar type, shape, and position is found; at that point the prediction stops and the IDs of the two vehicles are unified, completing the missing track.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the illustrated order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Referring to fig. 8, a block diagram illustrating a radar data processing apparatus according to an embodiment of the present invention is shown, where the radar data processing apparatus 800 may include the following modules:
a radar data obtaining module 801, configured to obtain radar data of a current frame and radar data of a reference frame, where the reference frame is a previous frame of the current frame, and the radar data includes radar data detected by a radar on at least one object;
a first object obtaining module 802, configured to obtain a first object existing in the reference frame and not existing in the current frame, and a second object not existing in the reference frame and existing in the current frame;
a following object obtaining module 803, configured to obtain a following object of the ith first object in the reference frame, where i is an integer from 1 to N, and N is the total number of the first objects;
a spliced object obtaining module 804, configured to, when the following object is obtained, obtain a spliced object of an ith first object in the current frame from the second object according to radar data of the following object in the reference frame and radar data of the ith first object in the reference frame;
a determining module 805, configured to determine, when the spliced object is obtained, radar data of the spliced object in the current frame as radar data of the ith first object in the current frame.
Optionally, the following object obtaining module 803 includes:
a reference area determining submodule, configured to determine, according to radar data of an ith first object in the reference frame, a reference area where the ith first object is located;
a third object acquisition sub-module for acquiring a third object within the reference region in the reference frame;
and the following object determining submodule is used for determining an object which is closest to the ith first object and is positioned in front of the ith first object in the third objects as a following object of the ith first object in the reference frame.
Optionally, the radar data includes a position and a direction of motion of the object;
the reference region determination sub-module includes:
a first determining unit, configured to determine a target straight line, where the target straight line passes through a position of an ith first object in the reference frame and is parallel to a motion direction of the ith first object in the reference frame;
a second determining unit, configured to obtain a first straight line, a second straight line, a third straight line, and a fourth straight line based on the target straight line, and determine a rectangular region formed by intersecting the first straight line, the second straight line, the third straight line, and the fourth straight line as the reference region;
the first straight line and the second straight line are positioned on two sides of the target straight line and are parallel to the first straight line and the second straight line, the distance between the first straight line and the target straight line is a first preset value, and the distance between the second straight line and the target straight line is the first preset value;
the third straight line passes through the position of the ith first object in the reference frame and is perpendicular to the target straight line, the fourth straight line passes through a first target reference point and is perpendicular to the target straight line, the first target reference point is a point on the target straight line forward along the motion direction of the ith first object in the reference frame, and the distance between the first target reference point and the third straight line is the second preset value.
Optionally, the spliced object obtaining module 804 includes:
the prediction sub-module is used for predicting the target position of the ith first object in the current frame according to the radar data of the following object in the reference frame and the radar data of the ith first object in the reference frame under the condition that the following object is obtained;
and the splicing object determining submodule is used for acquiring a splicing object of the ith first object in the current frame from the second object according to the target position.
Optionally, the radar data includes a position and a velocity of the object;
the prediction sub-module includes:
a first acceleration calculation unit, configured to, when the car-following object is acquired, calculate a first acceleration of an ith first object in the reference frame according to a speed and a position of the car-following object in the reference frame, a speed and a position of the ith first object in the reference frame, and a predetermined car-following model;
a second acceleration calculating unit, configured to calculate a second acceleration of the ith first object in the reference frame, where the second acceleration is an acceleration generated by a resultant force of a boundary acting force and a repulsive force applied to the ith first object in the reference frame;
a comprehensive acceleration determining unit, configured to determine, according to the first acceleration and the second acceleration, a comprehensive acceleration of the ith first object in the reference frame;
and the target position prediction unit is used for predicting the target position of the ith first object in the current frame according to the comprehensive acceleration.
Optionally, the radar data processing apparatus 800 further includes:
the setting module is used for setting the first acceleration of the ith first object as a target preset value under the condition that the following object is not obtained;
a second acceleration calculating module, configured to calculate a second acceleration of the ith first object in the reference frame, where the second acceleration is an acceleration generated by a resultant force of a boundary acting force and a repulsive force applied to the ith first object in the reference frame;
a comprehensive acceleration determining module, configured to determine, according to the first acceleration and the second acceleration, a comprehensive acceleration of the ith first object in the reference frame;
and the target position prediction module is used for predicting the target position of the ith first object in the current frame according to the comprehensive acceleration.
Optionally, the second acceleration calculation module includes:
a boundary acting force determining submodule for determining the boundary acting force of the ith first object in the reference frame;
an interference applying object obtaining sub-module, configured to obtain an interference applying object of the ith first object in the reference frame;
a repulsive force acquiring submodule for acquiring a repulsive force of each of the disturbance applying objects to which the ith one of the first objects is subjected;
the resultant force calculation submodule is used for calculating the resultant force of the boundary acting force and each repulsive force;
and the second acceleration calculation submodule is used for calculating the second acceleration according to the resultant force.
Optionally, the interference applying object obtaining sub-module includes:
a third determining unit, configured to obtain a semicircle where a boundary line passes through a position of an ith first object in the reference frame by using a second target reference point as a circle center and a preset length as a radius, where the second target reference point is a point along a moving direction of the ith first object;
a fourth determination unit configured to determine an object located in the semicircular encircled area in the reference frame as the interference applying object.
Optionally, the radar data includes object type, speed, position and direction of motion;
the spliced object determination submodule includes:
the first obtaining unit is used for obtaining an object which meets a preset condition in the second objects to serve as a candidate splicing object, wherein the preset condition comprises that the object type is the same as that of the ith first object, the deviation angle between the motion direction and the motion direction of the ith first object is smaller than a third preset value, and the deviation between the speed and the speed of the ith first object is smaller than a fourth preset value;
and a second obtaining unit, configured to obtain, from the candidate splicing objects, an object closest to the ith first object as a splicing object of the ith first object in the current frame.
Optionally, the radar data processing apparatus 800 further includes:
the position determining module is used for determining the target position as the position of the ith first object in the radar data in the current frame when the spliced object is not obtained, and recording that the ith first object is not spliced successfully for one time;
and the stopping prediction module is used for stopping predicting the position of the ith first object when the continuous times of the first object which is not spliced successfully reach the preset times.
Therefore, the radar data processing method provided by the embodiment of the invention can acquire the radar data of a current frame and the radar data of a reference frame, wherein the reference frame is a previous frame of the current frame, and the radar data comprises radar data detected by a radar on at least one object; then, a first object existing in the reference frame and not existing in the current frame and a second object not existing in the reference frame and existing in the current frame are obtained; thirdly, acquiring a following object of the ith first object in the reference frame, wherein i is an integer from 1 to N, and N is the total number of the first objects; thus, when the following object is acquired, acquiring a spliced object of the ith first object in the current frame from the second object according to the radar data of the following object in the reference frame and the radar data of the ith first object in the reference frame; and when the spliced object is obtained, determining the radar data of the spliced object in the current frame as the radar data of the ith first object in the current frame.
Therefore, in the embodiment of the invention, when the radar data of the current frame is acquired, the radar data prediction and the splicing of the missing track of the object of the current frame missing radar data can be performed according to the radar data of the current frame and the radar data of the previous frame, namely, the spliced object is determined according to the following object, so that the real-time track splicing of the object missing radar data is realized; therefore, the processed current frame data is used as a reference frame of the next frame, so that the obtained radar data can be continuously repaired in real time, and the problem that the detection object in the detection range disappears due to the loss of the radar data is solved to a certain extent.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
An embodiment of the present invention further provides an electronic device, as shown in fig. 9, which includes a processor 901, a communication interface 902, a memory 903, and a communication bus 904, where the processor 901, the communication interface 902, and the memory 903 complete mutual communication through the communication bus 904,
a memory 903 for storing computer programs;
the processor 901 is configured to implement the radar data processing method described above when executing the program stored in the memory 903.
The Memory may include a Random Access Memory (RAM) or a non-volatile Memory (non-volatile Memory), such as at least one disk Memory. Optionally, the memory may also be at least one memory device located remotely from the processor.
The Processor may be a general-purpose Processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
In yet another embodiment of the present invention, a computer-readable storage medium is further provided, which has instructions stored therein, and when the instructions are executed on a computer, the instructions cause the computer to execute the radar data processing method described in any one of the above embodiments.
In a further embodiment provided by the present invention, there is also provided a computer program product containing instructions which, when run on a computer, cause the computer to perform the radar data processing method of any one of the above embodiments.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the invention are produced, in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, from one website, computer, server, or data center to another website, computer, server, or data center via a wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) connection. The computer-readable storage medium can be any available medium accessible by a computer, or a data storage device, such as a server or data center, that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, hard disk, or magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid state disk (SSD)).
It is noted that, herein, relational terms such as first and second are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual such relationship or order between those entities or actions. Moreover, the terms "comprises," "comprising," and any other variation thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The embodiments in this specification are described in a related manner; for identical or similar parts, the embodiments may refer to one another, and each embodiment focuses on its differences from the others. In particular, the system embodiment is substantially similar to the method embodiment, so its description is brief; for relevant details, reference may be made to the corresponding parts of the method embodiment description.
The above description covers only preferred embodiments of the present invention and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall fall within its protection scope.

Claims (13)

1. A method of radar data processing, the method comprising:
acquiring radar data of a current frame and radar data of a reference frame, wherein the reference frame is a previous frame of the current frame, and the radar data comprises radar data detected by a radar on at least one object;
acquiring a first object existing in the reference frame and not existing in the current frame, and a second object not existing in the reference frame and existing in the current frame;
acquiring a following object of the ith first object in the reference frame, wherein i is an integer from 1 to N, and N is the total number of the first objects;
when the following object is obtained, obtaining a spliced object of the ith first object in the current frame from the second object according to radar data of the following object in the reference frame and radar data of the ith first object in the reference frame;
when the spliced object is obtained, determining the radar data of the spliced object in the current frame as the radar data of the ith first object in the current frame.
2. The method according to claim 1, wherein the obtaining a following object of the ith first object in the reference frame comprises:
determining a reference area where the ith first object is located according to radar data of the ith first object in the reference frame;
acquiring a third object within the reference region in the reference frame;
and determining an object which is closest to the ith first object and is positioned in front of the ith first object in the third objects as a following object of the ith first object in the reference frame.
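The selection step of claim 2, picking among the objects in the reference region the nearest one located in front of the ith first object, might look like the following minimal sketch. Function and parameter names are illustrative assumptions; "in front" is taken to mean a positive projection onto the motion direction:

```python
import math

def following_object(ego_pos, ego_dir, candidates):
    """Pick the nearest candidate located ahead of the ego object.

    ego_dir: unit vector of the motion direction.
    candidates: list of (x, y) positions, assumed already filtered to the
    reference region (the "third objects" of claim 2).
    Returns the index of the following object, or None if nothing is ahead.
    """
    best, best_d = None, math.inf
    for i, (x, y) in enumerate(candidates):
        dx, dy = x - ego_pos[0], y - ego_pos[1]
        ahead = dx * ego_dir[0] + dy * ego_dir[1]  # projection on motion direction
        if ahead > 0:  # strictly in front of the ego object
            d = math.hypot(dx, dy)
            if d < best_d:
                best, best_d = i, d
    return best
```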
3. The method of claim 2, wherein the radar data includes a position and a direction of motion of the object;
the determining a reference area where the ith first object is located according to the radar data of the ith first object in the reference frame includes:
determining a target straight line, wherein the target straight line passes through the position of the ith first object in the reference frame and is parallel to the motion direction of the ith first object in the reference frame;
obtaining a first straight line, a second straight line, a third straight line and a fourth straight line based on the target straight line, and determining a rectangular area formed by intersecting the first straight line, the second straight line, the third straight line and the fourth straight line as the reference area;
the first straight line and the second straight line are located on two sides of the target straight line and are both parallel to the target straight line, the distance between the first straight line and the target straight line is a first preset value, and the distance between the second straight line and the target straight line is the first preset value;
the third straight line passes through the position of the ith first object in the reference frame and is perpendicular to the target straight line, the fourth straight line passes through a first target reference point and is perpendicular to the target straight line, the first target reference point is a point on the target straight line forward along the motion direction of the ith first object in the reference frame, and the distance between the first target reference point and the third straight line is the second preset value.
4. The radar data processing method according to claim 1, wherein the obtaining, from the second object, a stitched object of an ith first object in the current frame according to the radar data of the following object in the reference frame and the radar data of the ith first object in the reference frame comprises:
under the condition that the following object is obtained, predicting the target position of the ith first object in the current frame according to the radar data of the following object in the reference frame and the radar data of the ith first object in the reference frame;
and acquiring a splicing object of the ith first object in the current frame from the second object according to the target position.
5. The radar data processing method of claim 4, wherein the radar data includes a position and a velocity of an object;
the predicting the target position of the ith first object in the current frame according to the radar data of the following object in the reference frame and the radar data of the ith first object in the reference frame under the condition of acquiring the following object includes:
under the condition that the following object is obtained, calculating a first acceleration of the ith first object in the reference frame according to the speed and the position of the following object in the reference frame, the speed and the position of the ith first object in the reference frame, and a predetermined car-following model;
calculating a second acceleration of the ith first object in the reference frame, wherein the second acceleration is an acceleration generated by a resultant force of a boundary acting force and a repulsive force applied to the first object in the reference frame;
determining the comprehensive acceleration of the ith first object in the reference frame according to the first acceleration and the second acceleration;
and predicting the target position of the ith first object in the current frame according to the comprehensive acceleration.
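Claim 5 does not specify which car-following model is predetermined. Purely as a placeholder, the sketch below substitutes the well-known Intelligent Driver Model for the first acceleration, and uses constant-acceleration kinematics for the one-frame position prediction; all parameter values are illustrative assumptions:

```python
def idm_acceleration(v, v_lead, gap, v0=30.0, T=1.5, a_max=2.0, b=3.0, s0=2.0):
    """First acceleration from a car-following model (claim 5).

    Intelligent Driver Model, used here only as a stand-in for the
    patent's unspecified "predetermined car-following model".
    v, v_lead: speeds of the first object and its following object;
    gap: distance between them; remaining parameters are IDM constants.
    """
    s_star = s0 + v * T + v * (v - v_lead) / (2.0 * (a_max * b) ** 0.5)
    return a_max * (1.0 - (v / v0) ** 4 - (s_star / max(gap, 0.1)) ** 2)

def predict_position(x, v, a1, a2, dt):
    """Predict the next-frame position from the comprehensive acceleration.

    a1: first (car-following) acceleration; a2: second acceleration from the
    boundary force and repulsions. One frame of duration dt is assumed.
    """
    a = a1 + a2  # "comprehensive acceleration" of claim 5
    return x + v * dt + 0.5 * a * dt * dt
```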
6. The method of claim 4, further comprising:
under the condition that the following object is not acquired, setting the first acceleration of the ith first object as a target preset value;
calculating a second acceleration of the ith first object in the reference frame, wherein the second acceleration is an acceleration generated by a resultant force of a boundary acting force and a repulsive force applied to the first object in the reference frame;
determining the comprehensive acceleration of the ith first object in the reference frame according to the first acceleration and the second acceleration;
and predicting the target position of the ith first object in the current frame according to the comprehensive acceleration.
7. The method according to claim 5 or 6, wherein the calculating a second acceleration of the ith first object in the reference frame comprises:
determining the boundary acting force of the ith first object in the reference frame;
acquiring an interference application object of the ith first object in the reference frame;
acquiring a repulsive force of each of the interference applying objects received by the ith first object;
calculating a resultant force of the boundary acting force and each of the repulsive forces;
and calculating the second acceleration according to the resultant force.
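The second-acceleration computation of claim 7 is a vector sum of the boundary force and every repulsive force. The claim does not state how the resultant force maps to an acceleration, so the sketch below assumes Newton's second law with a unit mass:

```python
def second_acceleration(boundary_force, repulsive_forces, mass=1.0):
    """Second acceleration of claim 7.

    boundary_force: (fx, fy) from the detection boundary.
    repulsive_forces: list of (fx, fy) tuples, one per interference-applying
    object. mass=1.0 is an assumption; the claim only defines the resultant.
    """
    fx = boundary_force[0] + sum(f[0] for f in repulsive_forces)
    fy = boundary_force[1] + sum(f[1] for f in repulsive_forces)
    return (fx / mass, fy / mass)
```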
8. The method of claim 7, wherein obtaining the interference-applying object of the ith first object in the reference frame comprises:
taking a second target reference point as a circle center and a preset length as a radius to obtain a semicircle of which the boundary line passes through the position of the ith first object in the reference frame, wherein the second target reference point is a point along the motion direction of the ith first object;
determining an object located within the semi-circle defined region in the reference frame as the interference applying object.
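Claim 8's semicircle admits more than one geometric reading. The sketch below assumes the circle is centered one radius ahead of the object along its unit motion direction (so that its boundary passes through the object's position, as the claim requires) and keeps the rear half of that circle, i.e. the area between the object and the diameter perpendicular to the motion direction:

```python
def in_interference_region(ego_pos, ego_dir, pt, radius):
    """One plausible reading of claim 8's semicircular region.

    ego_dir: unit motion direction; radius stands in for the preset length.
    Center = ego_pos + radius * ego_dir, so the circle's boundary passes
    through ego_pos; the half kept is the one behind the center.
    """
    cx = ego_pos[0] + radius * ego_dir[0]
    cy = ego_pos[1] + radius * ego_dir[1]
    dx, dy = pt[0] - cx, pt[1] - cy
    inside_circle = dx * dx + dy * dy <= radius * radius
    behind_center = dx * ego_dir[0] + dy * ego_dir[1] <= 0.0
    return inside_circle and behind_center
```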
9. The radar data processing method of claim 4, wherein the radar data includes an object type, a speed, a position, and a direction of motion;
the obtaining a splicing object of the ith first object in the current frame according to the target position includes:
acquiring objects which accord with preset conditions in the second objects to serve as candidate splicing objects, wherein the preset conditions comprise that the object types are the same as those of the ith first object, the deviation angle between the motion direction and the motion direction of the ith first object is smaller than a third preset value, and the deviation between the speed and the speed of the ith first object is smaller than a fourth preset value;
and acquiring an object which is closest to the ith first object in the candidate splicing objects to serve as the splicing object of the ith first object in the current frame.
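The candidate filtering and nearest-match selection of claim 9 might be sketched as follows. The dictionary keys are illustrative, and the predicted target position from claim 4 is used as the distance reference, which is one possible reading of "closest to the ith first object":

```python
import math

def splice_candidate(pred_pos, first_obj, seconds, max_angle_dev, max_speed_dev):
    """Claim 9's filter over the "second objects".

    Keep objects of the same type whose heading deviates from the first
    object's by less than max_angle_dev (the third preset value) and whose
    speed deviates by less than max_speed_dev (the fourth preset value),
    then return the nearest survivor to pred_pos. Each object is a dict
    with keys: type, speed, heading (radians), x, y.
    """
    best, best_d = None, math.inf
    for s in seconds:
        if s["type"] != first_obj["type"]:
            continue
        dev = abs(s["heading"] - first_obj["heading"])
        dev = min(dev, 2 * math.pi - dev)  # wrap-around angle deviation
        if dev >= max_angle_dev:
            continue
        if abs(s["speed"] - first_obj["speed"]) >= max_speed_dev:
            continue
        d = math.hypot(s["x"] - pred_pos[0], s["y"] - pred_pos[1])
        if d < best_d:
            best, best_d = s, d
    return best  # None when no second object qualifies
```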
10. The method of claim 4, further comprising:
when the spliced object is not obtained, determining the target position as the position of the ith first object in the radar data of the current frame, and recording one unsuccessful splicing of the ith first object;
and stopping predicting the position of the ith first object when the number of consecutive unsuccessful splicings of the ith first object reaches a preset number.
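The miss counting of claim 10 is simple bookkeeping; a hypothetical sketch (class and attribute names are illustrative):

```python
class PredictedTrack:
    """Claim 10's bookkeeping for one predicted object.

    When no spliced object is found, the predicted position is kept as the
    object's radar data and a consecutive-miss counter is incremented;
    prediction stops once the counter reaches a preset limit.
    """
    def __init__(self, max_misses):
        self.max_misses = max_misses
        self.misses = 0
        self.active = True

    def update(self, spliced):
        if not self.active:
            return
        if spliced:
            self.misses = 0          # a successful splice resets the streak
        else:
            self.misses += 1
            if self.misses >= self.max_misses:
                self.active = False  # stop predicting this object
```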
11. A radar data processing apparatus, characterized in that the apparatus comprises:
the radar data acquisition module is used for acquiring radar data of a current frame and radar data of a reference frame, wherein the reference frame is a previous frame of the current frame, and the radar data comprises radar data of at least one object detected by a radar;
a first object obtaining module for obtaining a first object that is present in the reference frame and is not present in the current frame;
the following object acquisition module is used for acquiring the following object of the ith first object in the reference frame, wherein i is an integer from 1 to N, and N is the total number of the first objects;
a spliced object acquisition module, configured to, when the following object is acquired, acquire a spliced object of an ith first object in the current frame according to radar data of the following object in the reference frame and radar data of the ith first object in the reference frame;
and the determining module is used for determining the radar data of the spliced object in the current frame as the radar data of the ith first object in the current frame when the spliced object is obtained.
12. An electronic device, characterized by comprising a processor, a communication interface, a memory, and a communication bus, wherein the processor, the communication interface, and the memory communicate with one another through the communication bus;
a memory for storing a computer program;
a processor for implementing the method steps of any of claims 1-10 when executing a program stored in the memory.
13. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-10.
CN202210411698.0A 2022-04-19 2022-04-19 Radar data processing method and device, electronic equipment and storage medium Pending CN114895250A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210411698.0A CN114895250A (en) 2022-04-19 2022-04-19 Radar data processing method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210411698.0A CN114895250A (en) 2022-04-19 2022-04-19 Radar data processing method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114895250A true CN114895250A (en) 2022-08-12

Family

ID=82717718

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210411698.0A Pending CN114895250A (en) 2022-04-19 2022-04-19 Radar data processing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114895250A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115331469A (en) * 2022-08-15 2022-11-11 北京图盟科技有限公司 Vehicle track online restoration method, device and equipment
CN115168810A (en) * 2022-09-08 2022-10-11 南京慧尔视智能科技有限公司 Traffic data generation method and device, electronic equipment and storage medium
CN115168810B (en) * 2022-09-08 2022-11-29 南京慧尔视智能科技有限公司 Traffic data generation method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN114895250A (en) Radar data processing method and device, electronic equipment and storage medium
CN104554258B (en) Using the path planning of the avoidance steering operation of virtual potential field technology
CN109934164B (en) Data processing method and device based on track safety degree
CN112242069B (en) Method and device for determining vehicle speed
US10160459B2 (en) Vehicle lane direction detection
EP3842316B1 (en) Method and device for controlling operation of self-driving car
US20230400859A1 (en) Predicting Jaywaking Behaviors of Vulnerable Road Users
CN111638520A (en) Obstacle recognition method, obstacle recognition device, electronic device and storage medium
CN114475651A (en) Blind area barrier emergency avoiding method and device based on vehicle-road cooperation
Shin et al. Occlusion handling and track management method of high-level sensor fusion for robust pedestrian tracking
CN114368394A (en) Method and device for attacking V2X equipment based on Internet of vehicles and storage medium
CN114537447A (en) Safe passing method and device, electronic equipment and storage medium
CN114475656A (en) Travel track prediction method, travel track prediction device, electronic device, and storage medium
CN114802251A (en) Control method and device for automatic driving vehicle, electronic device and storage medium
CN110497906B (en) Vehicle control method, apparatus, device, and medium
CN115060279B (en) Path planning method, path planning device, electronic equipment and computer readable medium
CN115662186A (en) Vehicle obstacle avoidance method and system based on artificial intelligence
CN115257720A (en) Emergency collision avoidance method, device, equipment and medium based on turning scene
CN115320581A (en) Real-time vehicle lane change risk assessment method, computer device and computer storage medium
US11353875B2 (en) Target intention predicting method and system thereof
CN112612284B (en) Data storage method and device
CN113721598A (en) Obstacle trajectory prediction method, device, equipment and storage medium
US20230280753A1 (en) Robust behavior prediction neural networks through non-causal agent based augmentation
CN115482679B (en) Automatic driving blind area early warning method and device and message server
US11267465B1 (en) Enhanced threat assessment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination