CN112334946A - Data processing method, data processing device, radar, equipment and storage medium - Google Patents

Data processing method, data processing device, radar, equipment and storage medium

Info

Publication number
CN112334946A
CN112334946A (application CN201980040012.8A)
Authority
CN
China
Prior art keywords
target
target object
information
radar
moment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201980040012.8A
Other languages
Chinese (zh)
Inventor
王石荣
高迪
王俊喜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Publication of CN112334946A
Legal status: Pending

Classifications

    • G PHYSICS
        • G01 MEASURING; TESTING
            • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
                • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
                    • G01S 13/02 Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
                        • G01S 13/06 Systems determining position data of a target
                            • G01S 13/42 Simultaneous measurement of distance and other co-ordinates
                        • G01S 13/50 Systems of measurement based on relative movement of target
                            • G01S 13/58 Velocity or trajectory determination systems; Sense-of-movement determination systems
                • G01S 7/00 Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
                    • G01S 7/02 Details of systems according to group G01S 13/00
                        • G01S 7/41 Details using analysis of echo signal for target characterisation; Target signature; Target cross-section
                            • G01S 7/415 Identification of targets based on measurements of movement associated with the target
        • G06 COMPUTING; CALCULATING OR COUNTING
            • G06F ELECTRIC DIGITAL DATA PROCESSING
                • G06F 18/00 Pattern recognition
                    • G06F 18/20 Analysing
                        • G06F 18/23 Clustering techniques
                            • G06F 18/232 Non-hierarchical techniques
                                • G06F 18/2321 Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
                        • G06F 18/24 Classification techniques
            • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T 7/00 Image analysis
                    • G06T 7/20 Analysis of motion
                        • G06T 7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
                        • G06T 7/277 Analysis of motion involving stochastic approaches, e.g. using Kalman filters
                • G06T 2207/00 Indexing scheme for image analysis or image enhancement
                    • G06T 2207/10 Image acquisition modality
                        • G06T 2207/10028 Range image; Depth image; 3D point clouds
                        • G06T 2207/10032 Satellite or aerial image; Remote sensing
                        • G06T 2207/10044 Radar image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • Probability & Statistics with Applications (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

A data processing method, an apparatus, a radar, a device and a storage medium. The method comprises the following steps: acquiring first spatial information and first energy information of a target object detected by a radar at a target moment (101); calculating a target observation center of the target object at the target moment according to the first energy information and the first spatial information (102); and fusing the target observation center with a first target track of the target object at a historical moment to generate a second target track of the target object (103). The first target track is determined according to second spatial information and second energy information of the target object at the historical moment, and the historical moment is before the target moment. By integrating the spatial information and the energy information of the target object to determine its target track, the method helps improve the accuracy of the target track, and in turn the accuracy of follow-up work that uses the track of the target object.

Description

Data processing method, data processing device, radar, equipment and storage medium
Technical Field
The present application relates to the field of radar technologies, and in particular, to a data processing method and apparatus, a radar, a device, and a storage medium.
Background
A radar is composed of a transmitter, a receiver, an information processing system and the like. It transmits a detection signal, which is reflected back by a target; from the reflected echo signal, related information of the target can be obtained, such as the distance between the radar and the target, and the direction, height and shape of the target. Radar is therefore widely used for target detection and tracking.
In practical applications, the spatial information of each target during movement is usually identified, and a movement track is generated from that spatial information, so that target tracking, accident investigation and the like can be carried out using the track. However, the accuracy of the movement track generated in the related art is low.
Disclosure of Invention
Aspects of the present application provide a data processing method, apparatus, radar, device, and storage medium to improve accuracy of a target movement trajectory.
An embodiment of the present application provides a data processing method, including:
acquiring first space information and first energy information of a target object detected by a radar at a target moment;
calculating a target observation center of the target object at the target moment according to the first energy information and the first space information;
fusing the target observation center with a first target track of the target object at a historical moment to generate a second target track of the target object;
wherein the first target trajectory is determined according to second spatial information and second energy information of the target object at the historical time; the historical time is before the target time.
An embodiment of the present application further provides a data processing apparatus, including: the system comprises a first acquisition module, a calculation module and a first fusion module;
the first acquisition module is used for acquiring first space information and first energy information of a target object detected by the radar at a target moment;
the calculation module is used for calculating a target observation center of the target object at the target moment according to the first energy information and the first space information;
the first fusion module is used for fusing the target observation center and a first target track of the target object at a historical moment to generate a second target track of the target object;
wherein the first target trajectory is determined according to second spatial information and second energy information of the target object at the historical time; the historical time is before the target time.
An embodiment of the present application further provides a radar, including: a memory and a processor; wherein the memory is for storing a computer program and a first target trajectory of a target object at a historical time; the first target track is determined according to second spatial information and second energy information of the target object at the historical moment;
the processor is coupled to the memory for executing the computer program for:
acquiring first space information and first energy information of a target object detected by a radar at the target moment; wherein the historical time is before the target time;
calculating a target observation center of the target object at the target moment according to the first energy information and the first space information;
and fusing the target observation center and the first target track to generate a second target track of the target object.
The embodiment of the application also provides detection equipment which is loaded with the radar and comprises a memory and a processor; the radar is used for acquiring first space information and first energy information of a target object detected at a target moment;
the memory is used for storing a computer program and a first target track of the target object at historical time; the first target track is determined according to second spatial information and second energy information of the target object at historical time; the historical time is before the target time;
the processor is coupled to the memory for executing the computer program for: calculating a target observation center of the target object at the target moment according to the first energy information and the first space information; and fusing the target observation center and the first target track to generate a second target track of the target object.
The embodiment of the application also provides mobile equipment which is loaded with the radar; the radar is configured to: acquiring first space information and first energy information of a target object detected at the target moment; calculating a target observation center of the target object at the target moment according to the first energy information and the first space information; fusing the target observation center with a first target track of the target object at a historical moment to generate a second target track of the target object; wherein the first target trajectory is determined according to second spatial information and second energy information of the target object at historical time; the historical time is before the target time.
Embodiments of the present application also provide a computer-readable storage medium having stored thereon computer instructions that, when executed by one or more processors, cause the one or more processors to perform acts comprising:
acquiring first space information and first energy information of a target object detected by a radar at a target moment;
calculating a target observation center of the target object at the target moment according to the first energy information and the first space information;
fusing the target observation center with a first target track of the target object at a historical moment to generate a second target track of the target object;
wherein the first target trajectory is determined according to second spatial information and second energy information of the target object at historical time; the historical time is before the target time.
In the embodiment of the application, a target observation center of a target object can be calculated according to spatial information and energy information of the target object, and the observation center of the target object and a target track of the target object at a historical moment are fused to generate a new target track of the target object. The method and the device have the advantages that the space information and the energy information of the target object are integrated, the target track of the target object is determined, the accuracy of the target track is improved, and the accuracy of follow-up work by utilizing the track of the target object is improved. For example, when tracking a target, it is helpful to improve the accuracy of positioning the target object, thereby improving the accuracy of target tracking, and so on.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a schematic flowchart of a data processing method according to an embodiment of the present application;
fig. 2 is a schematic flow chart of another data processing method according to an embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of a data processing apparatus according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a radar provided in an embodiment of the present application;
fig. 5 is a schematic structural diagram of a detection apparatus according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of a mobile device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
For the technical problem that the accuracy of generating the target object track is low in the prior art, in some embodiments of the present application, a target observation center of the target object may be calculated according to spatial information and energy information of the target object, and the observation center of the target object and the target track of the target object at the historical time are fused to generate a new target track of the target object. The method and the device have the advantages that the space information and the energy information of the target object are integrated, the target track of the target object is determined, the accuracy of the target track is improved, and the accuracy of follow-up work by utilizing the track of the target object is improved. For example, when tracking a target, it is helpful to improve the accuracy of positioning the target object, thereby improving the accuracy of target tracking, and so on.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic flowchart of a data processing method according to an embodiment of the present application. As shown in fig. 1, the method includes:
101. Acquire first spatial information and first energy information of a target object detected by a radar at a target moment.
102. Calculate a target observation center of the target object at the target moment according to the first energy information and the first spatial information.
103. Fuse the target observation center with the first target track of the target object at a historical moment to generate a second target track of the target object.
In this embodiment, the radar may be a directional radar or a rotary radar. The radar may be a microwave radar, a laser radar, or the like, but is not limited thereto.
In this embodiment, the radar can detect not only the spatial information of the target object but also the energy information of the target object. The spatial information of the target object includes position information and geometric features of the target object. The energy information of the target object is based on the intensity information of the echo signal returned by the target object, and the strength of the echo signal reflects, to some extent, the distance between the target object and the radar. Both the spatial information and the energy information of the target object therefore reflect, to some extent, the relative positional relationship between the target object and the radar. Accordingly, if the spatial information and the energy information of the target object are combined to represent the track information of the target object, the accuracy of the determined track is improved.
Based on the above analysis, in step 101, the first spatial information and the first energy information of the target object detected by the radar at the target moment are acquired. The target moment refers to the moment at which the radar detects the first spatial information and the first energy information. In different application scenarios, the target moment may be the current moment or a past moment. For example, in an application scenario of tracking the target object, the target moment may be the current detection moment of the radar. For another example, in an application scenario in which accident investigation or scene reconstruction is performed using the track of the target object, the target moment is a past moment compared with the moment at which the first spatial information and the first energy information are acquired.
Next, in step 102, a target observation center of the target object is calculated based on the first spatial information and the first energy information. Further, in step 103, the target observation center is fused with the first target trajectory of the target object at the historical time, so as to generate a second target trajectory of the target object. The track generation mode integrates the spatial information and the energy information of the target object, and is beneficial to improving the accuracy of the target track, and further is beneficial to improving the accuracy of follow-up work carried out by utilizing the track of the target object. For example, when tracking a target, it is helpful to improve the accuracy of positioning the target object, thereby improving the accuracy of target tracking, and so on.
In the present embodiment, the first target track is the track information of the target object at the historical moment. The historical moment is a detection moment before the target moment; preferably, it is the detection moment closest to the target moment. Further, the first target track is determined based on the second spatial information and the second energy information of the target object at the historical moment. For a specific implementation of determining the first target track, reference may be made to the description of how the second target track of the target object is determined in the foregoing and following embodiments, which is not repeated herein.
It should be noted that, in the embodiment of the present application, descriptions of "first" and "second" are used to distinguish different information, tracks, devices, modules, and the like, and do not represent a sequential order, nor limit that "first" and "second" are different types.
Optionally, in step 103, a Kalman filtering method may be used to fuse the target observation center with the first target track of the target object. Alternatively, the second target track may be predicted using a Kalman filtering method.
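The Kalman-filter fusion mentioned above can be illustrated with a minimal constant-velocity sketch; the state model, time step, and noise covariances below are illustrative assumptions, not parameters given in the application:

```python
import numpy as np

def kalman_update(x, P, z, dt=0.1, q=0.01, r=0.5):
    """One predict+update cycle of a constant-velocity Kalman filter.

    x: track state [px, py, pz, vx, vy, vz]; P: 6x6 state covariance;
    z: observed target center [px, py, pz] (the "target observation center").
    dt, q, r are an illustrative time step and noise magnitudes.
    """
    F = np.eye(6)
    F[:3, 3:] = dt * np.eye(3)                    # position += velocity * dt
    H = np.hstack([np.eye(3), np.zeros((3, 3))])  # only position is observed
    Q = q * np.eye(6)                             # process noise (assumed)
    R = r * np.eye(3)                             # measurement noise (assumed)

    # Predict the historical track forward to the target moment.
    x = F @ x
    P = F @ P @ F.T + Q

    # Fuse the predicted track with the observation center.
    y = z - H @ x                                 # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                # Kalman gain
    x = x + K @ y
    P = (np.eye(6) - K @ H) @ P
    return x, P
```

The updated state is a compromise between the predicted track and the new observation, which is the "fusion" the method relies on.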
In the embodiment of the application, the radar can be carried on various devices to complete related tasks. For example, radar may be mounted on a mobile device to detect obstacles, or to detect and track targets, and so on. In this embodiment, in different application scenarios, different operations may be performed according to the second target trajectory. For example, in an application scenario of tracking a target, a device mounted with a radar may be guided to track the target object according to the second target track; for another example, in an application scenario of accident investigation, the operating environment of the device with radar may be restored according to the second target trajectory and other target trajectories of the target object before the second target trajectory; and the like, but are not limited thereto.
Further, the radar can be mounted on various devices. For example, the radar may be mounted on a mobile device, wherein the mobile device may be an autonomous mobile device, such as a drone, an unmanned vehicle, or a robot, but is not limited thereto; alternatively, the mobile device may be a mobile device that requires human control, such as, but not limited to, a non-unmanned vehicle, a boat, an airplane, and the like.
In this embodiment, the radar includes a transmitter, a receiver, and an information processing system, wherein the transmitter is configured to transmit a probe signal, the probe signal reflects an echo signal when encountering an obstacle, and the receiver can receive the echo signal. Then, the information processing system can obtain the relevant information of the target, such as the distance between the radar and the target, the azimuth, the height and the shape of the target, and the like according to the reflected echo information.
In this embodiment, what the detection signal actually encounters when it meets an obstacle is a point on the obstacle. For convenience of description and distinction, an obstacle point actually encountered by the detection signal during propagation is defined as a detection point. The detection point may be a point on the target object, or may belong to another object besides the target object, such as dust in the air. Each detection signal returns a corresponding echo signal when it meets a detection point.
Further, the spatial coordinate information and the energy information of the detection points can be obtained based on the detection signals transmitted by the radar and the received echo signals. The spatial coordinate information and the energy information of a plurality of detection points form point cloud information; that is, the point cloud information is a set of spatial coordinate points with associated energy information. In this embodiment, one data point in the point cloud information can be interpreted as the combination of the spatial coordinate information and the energy information of a detection point.
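The data-point notion described above (spatial coordinates combined with echo energy) can be sketched as a small structure; the type and field names are illustrative, not from the application:

```python
from dataclasses import dataclass

@dataclass
class DetectionPoint:
    """One radar detection: spatial coordinates plus echo-derived energy."""
    x: float
    y: float
    z: float
    energy: float  # derived from the echo signal intensity

# Point cloud information is then simply a collection of such data points.
point_cloud = [
    DetectionPoint(1.0, 2.0, 0.5, 0.8),
    DetectionPoint(1.1, 2.1, 0.6, 0.7),
]
```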
Optionally, the spatial coordinate information corresponding to a detection point may be calculated according to the distance between the radar and the detection point and the pose of the radar. The pose of the radar refers to the position and orientation of the radar, where the orientation may refer to the directivity of the radar antenna. From the directivity of the radar antenna, the direction of the detection point relative to the radar can be obtained; then, from this direction, the distance between the radar and the detection point, and the position of the radar, the spatial coordinates of the detection point can be calculated. Optionally, the energy information of the detection point can also be determined according to the intensity information of the echo signal.
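The coordinate calculation described above can be sketched as follows, assuming a conventional azimuth/elevation spherical convention for the antenna directivity (the convention and function name are illustrative):

```python
import math

def detection_point_coords(radar_pos, azimuth, elevation, distance):
    """Spatial coordinates of a detection point from radar pose and range.

    radar_pos: (x, y, z) position of the radar; azimuth and elevation (in
    radians) describe the antenna directivity, i.e. the direction of the
    detection point relative to the radar; distance is the measured range.
    The spherical convention used here is an illustrative assumption.
    """
    x0, y0, z0 = radar_pos
    dx = distance * math.cos(elevation) * math.cos(azimuth)
    dy = distance * math.cos(elevation) * math.sin(azimuth)
    dz = distance * math.sin(elevation)
    return (x0 + dx, y0 + dy, z0 + dz)
```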
In this embodiment, the target object detected by the radar corresponds to a plurality of detection points, and the data points corresponding to these detection points constitute the point cloud information of the target object. In this embodiment, "a plurality" means 2 or more. Based on this, in step 101, a reference observation center and a reference geometric feature of the target object at the target moment may be calculated from the point cloud information of the target object at the target moment and used as the first spatial information of the target object. The point cloud information of the target object at the target moment includes a plurality of position coordinates of the target object. The specific implementation of calculating the reference observation center and the reference geometric feature at the target moment is not limited. Optionally, the average of the plurality of position coordinates may be calculated as the reference observation center of the target object at the target moment. For example, assume that the point cloud information of the target object at the target moment includes K position coordinates, where K >= 2 and K is an integer, and that these are the position coordinates in the point cloud information obtained by the m-th detection of the radar. The K position coordinates are denoted (x_mk, y_mk, z_mk), where k = 1, 2, ..., K. The reference observation center can then be expressed as (x̄_m, ȳ_m, z̄_m), where

x̄_m = (1/K) Σ_{k=1..K} x_mk;  ȳ_m = (1/K) Σ_{k=1..K} y_mk;  z̄_m = (1/K) Σ_{k=1..K} z_mk.
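The averaging described above (mean position as the reference observation center, and mean echo energy as the first energy information) reduces to simple per-component averages; a minimal sketch with illustrative names:

```python
def reference_center_and_energy(points):
    """Mean position (reference observation center) and mean echo energy.

    points: list of (x, y, z, energy) tuples for one detection of the
    target object, i.e. its point cloud at the target moment.
    """
    K = len(points)
    cx = sum(p[0] for p in points) / K
    cy = sum(p[1] for p in points) / K
    cz = sum(p[2] for p in points) / K
    energy = sum(p[3] for p in points) / K
    return (cx, cy, cz), energy
```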
optionally, a cuboid is determined based on the maximum and minimum of the plurality of position coordinates and is lengthenedThe side length of the cube is used as a reference geometric characteristic of the target object at the target moment. These position coordinates are contained within a rectangular parallelepiped. Wherein, the side length of the cuboid refers to the length, width and height of the cuboid. I.e. |m=max(xmk)-min(xmk);wm=max(ymk)-min(ymk);hm=max(zmk)-min(zmk) (ii) a Wherein lm、wm,hmEach represents the length, width and height of the rectangular parallelepiped.
Alternatively, an ellipsoid can be determined according to the maximum value and the minimum value of the position coordinates, and the length of the central axis of the ellipsoid is used as the reference geometric characteristic of the target object. Wherein the position coordinates are contained within the ellipsoid. Optionally, the length of the central axis of the ellipsoid comprises the length of the long central axis and the short central axis of the ellipsoid. Alternatively, a sphere may be determined according to the maximum value and the minimum value of the position coordinates, and the radius of the sphere may be used as the reference geometric feature of the target object at the target time. Wherein the position coordinates are contained within the sphere.
In this embodiment, the position coordinates are three-dimensional space coordinates. The maximum value of the plurality of position coordinates is the three-dimensional coordinate determined by the maximum x-axis, maximum y-axis and maximum z-axis values; the minimum value of the plurality of position coordinates is the three-dimensional coordinate determined by the minimum x-axis, minimum y-axis and minimum z-axis values.
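The cuboid side lengths l_m, w_m, h_m are per-axis extents of the point cloud; a minimal sketch with illustrative names:

```python
def bounding_cuboid(points):
    """Length, width and height of the axis-aligned cuboid enclosing points.

    points: list of (x, y, z) position coordinates of the target object;
    returns (l, w, h) where l = max(x) - min(x), w = max(y) - min(y),
    h = max(z) - min(z), as in the reference geometric feature above.
    """
    xs, ys, zs = zip(*points)
    return (max(xs) - min(xs), max(ys) - min(ys), max(zs) - min(zs))
```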
In the embodiment of the present application, the point cloud information of the target object at the target time further includes a plurality of energy values of the target object at the target time. Alternatively, an average energy value of the plurality of energy values may be calculated as the first energy information of the target object at the target time.
Further, considering that the energy information and the spatial information of the target object account for different proportions when representing the track information of the target object, in step 201 an observation weighting factor may also be calculated according to the first energy information, the reference observation center and the reference geometric feature; the target observation center of the target object at the target moment is then calculated based on the observation weighting factor and the reference observation center.
Alternatively, a ratio of a distance between the reference observation center of the target object at the target time and the first target trajectory to a gate radius of the target object at the historical time may be calculated as the distance weighting factor. Alternatively, the gate radius of the target object at the historical time can be calculated according to the weighted geometric characteristics of the target object at the historical time. And calculating the weighted geometric characteristics of the target object at the historical moment according to the second spatial information and the second energy information. Assuming that the historical time is the time when the radar detects the target object for the nth time, wherein the nth time is before the mth time; and the reference geometric characteristics of the target object at the historical time are assumed to be represented by the length, width and height of the cuboid. Correspondingly, the weighting geometric characteristics are also characterized by the length, width and height of the cuboid, and the weighting geometric characteristics can be ln、wn,hnThen the gate radius of the target object at the historical time can be expressed as:
r = √(l_n² + w_n² + h_n²) / 2
Alternatively, the distance weighting factor may be expressed as:

r_d = r / d

where d represents the distance between the reference observation center and the first target trajectory; the closer the reference observation center is to the first target trajectory, the larger r_d. In some cases, because the distance between the reference observation center and the first target trajectory can be arbitrarily small, r_d may approach infinity, giving the distance weighting factor too great a weight within the observation weighting factor, which may affect the accuracy of the subsequent determination of the second target trajectory. Based on this, the distance weighting factor can also be expressed by a function that grows more gradually. For example, the distance weighting factor may also be expressed as:

r_d = ln(1 + r / d)
Further, the ratio of the reference geometric features of the target object at the target moment to the weighted geometric features of the target object at the historical moment may be calculated as a geometric-feature weighting factor. The weighted geometric features of the target object at the historical moment are the target geometric features of the target object at the historical moment, and can be calculated from the reference geometric features of the target object at the historical moment and the observation weighting factor of the target object at the historical moment; for the specific calculation process, reference can be made to the calculation process of the reference geometric features of the target object at the target moment described in the embodiment of the present application. The geometric-feature weighting factors are: the ratio of each characteristic parameter of the reference geometric features of the target object at the target moment to the characteristic parameter of the same attribute of the weighted geometric features of the target object at the historical moment. For example, the reference geometric features of the target object at the target moment are characterized by a cuboid whose characteristic parameters are its length, width and height l_m, w_m, h_m; correspondingly, the characteristic parameters of the weighted geometric features of the target object at the historical moment are l_n, w_n, h_n. The geometric-feature weighting factors can then be expressed as:

r_l = l_m / l_n,  r_w = w_m / w_n,  r_h = h_m / h_n
Further, the ratio of the first energy information to the weighted energy information of the target object at the historical moment is calculated as an energy weighting factor; the weighted energy information of the target object at the historical moment is calculated according to the second spatial information and the second energy information of the target object. Denoting the first energy information by p_m and the weighted energy information at the historical moment by p_n, the energy weighting factor may be expressed as:

r_p = p_m / p_n
Further, an observation weighting factor may be calculated based on the distance weighting factor, the geometric-feature weighting factor and the energy weighting factor. Alternatively, the product of the distance weighting factor, the geometric-feature weighting factor and the energy weighting factor may be used as the observation weighting factor. That is, if the reference geometric features of the target object are characterized by the length, width and height of a cuboid, the observation weighting factor q_i can be expressed as: q_i = r_l · r_w · r_h · r_p · r_d, where i = 1, 2, …, I, and I represents the number of data points, in the point cloud information of the target object at the target moment, that fall within the gate radius of the target object at the historical moment.
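As a rough, runnable illustration of the product form above (the exact expressions are given as equation images in the original, so the ratio definitions below, which follow the surrounding text, should be read as assumptions), the observation weighting factor can be sketched as:

```python
# Sketch of the observation weighting factor q = r_l * r_w * r_h * r_p * r_d.
# The ratio forms follow the surrounding description; the exact formulas
# appear only as equation images in the patent, so treat these as assumptions.

def observation_weight(ref_geom, hist_geom, p_m, p_n, d, gate_r):
    l_m, w_m, h_m = ref_geom    # reference geometry at the target moment
    l_n, w_n, h_n = hist_geom   # weighted geometry at the historical moment
    r_l, r_w, r_h = l_m / l_n, w_m / w_n, h_m / h_n  # geometric factors
    r_p = p_m / p_n             # energy factor
    r_d = gate_r / d            # distance factor: closer point, larger weight
    return r_l * r_w * r_h * r_p * r_d
```

With identical geometries, equal energies and a distance equal to the gate radius, every factor is 1 and the weight is 1; halving the distance doubles the weight through r_d.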
In the embodiment of the present application, it is considered that the point cloud information detected by the radar at the target time may include point cloud information of a plurality of target objects and some noise that does not belong to any target object, and therefore, before step 102, a target object needs to be identified from at least one object to be identified. Optionally, before step 102, a reference observation center of at least one object to be identified detected by the radar may be calculated according to point cloud information corresponding to the at least one object to be identified at the target time; and identifying the target object from the at least one object to be identified according to the first target track and the reference observation center of the at least one object to be identified. Next, taking the first object to be recognized as an example, a determination process of whether the first object to be recognized is a target object will be exemplarily described. The first object to be identified is any one of the at least one object to be identified.
Optionally, it may be determined whether the distance between the reference observation center of the first object to be identified at the target time and the first target trajectory is less than or equal to the wave gate radius of the target object at the historical time; and if so, determining that the first object to be identified is the target object. Correspondingly, if the judgment result is negative, the first object to be identified is determined not to be the target object.
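A minimal sketch of this gating test, with illustrative names (the Euclidean distance to a representative track point stands in for the distance to the first target trajectory):

```python
import math

def is_target(ref_center, track_point, gate_radius):
    """Gating test: the object is taken to be the tracked target object when
    the distance between its reference observation center and the first
    target trajectory is at most the gate radius at the historical moment."""
    d = math.dist(ref_center, track_point)
    return d <= gate_radius
```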
Further, if a plurality of objects among the at least one object to be recognized are determined as the target object, then when calculating the observation weighting factor at the target moment, a reference observation weighting factor of each of the plurality of objects may be calculated according to the first energy information, the reference observation center and the reference geometric features of each of the plurality of objects determined as the target object; the reference observation weighting factor of the first target object is then normalized by the reference observation weighting factors of the plurality of objects to obtain the observation weighting factor of the first target object, where the first target object is any one of the plurality of objects. Assuming that J objects among the at least one object to be recognized are determined as the target object and that their reference observation weighting factors are denoted q_1, q_2, …, q_J, the normalization reassigns to each q_j, j = 1, 2, …, J, the value:

q_j ← q_j / (q_1 + q_2 + … + q_J)
Further, a target observation center of the target object at the target time may be calculated based on the observation weighting factor of the target object and the reference observation center of the target object at the target time. Alternatively, the target observation center of the target object at the target time may be expressed as: (x)m,ym,zm) Wherein, in the step (A),
Figure BDA0002834377150000113
Figure BDA0002834377150000114
accordingly, the weighted geometry of the target object at the target time instant (target geometry) can be expressed as:
Figure BDA0002834377150000115
the weighted energy information of the target object at the target time instant can be expressed as:
Figure BDA0002834377150000116
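Putting the normalization of the weighting factors and the weighted averaging of the point coordinates together, the target observation center can be sketched as follows (the normalized-weighted-sum form is inferred from the context and is an assumption, since the exact expressions appear only as equation images):

```python
def weighted_center(points, weights):
    """Target observation center as the normalized weighted average of the
    point coordinates (x_i, y_i, z_i) with observation weights q_i."""
    total = sum(weights)
    q = [w / total for w in weights]  # normalize so the weights sum to 1
    xs = sum(qi * p[0] for qi, p in zip(q, points))
    ys = sum(qi * p[1] for qi, p in zip(q, points))
    zs = sum(qi * p[2] for qi, p in zip(q, points))
    return (xs, ys, zs)
```

A point with a larger observation weight pulls the center toward itself, which is the intended effect of weighting by energy, geometry and distance.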
In the embodiment of the application, if the first object to be recognized is neither the target object nor any of the other objects determined at the historical moment, the kinematic parameters of the device carrying the radar at the moment the radar detects the first object to be recognized can be acquired; the trajectory information of the first object to be recognized at a subsequent moment is predicted according to those kinematic parameters; and trajectory fusion is performed between an unknown object and the first object to be recognized according to the predicted trajectory information of the first object to be recognized at the subsequent moment and the reference observation center of the unknown object detected by the radar at the subsequent moment.
Optionally, the kinematic parameters of the device carrying the radar when the radar detects the first object to be recognized include at least one of: position information, acceleration information, velocity information and a direction of motion of the device when the radar detects the first object to be recognized.
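As an illustration of the prediction step, assuming a simple constant-velocity motion model (the embodiment does not fix a particular model), the position of the first object to be recognized at a subsequent moment can be extrapolated as:

```python
def predict_position(position, velocity, dt):
    """Constant-velocity prediction: position after dt seconds, given the
    position and velocity at the detection moment (an assumed motion model)."""
    return tuple(p + v * dt for p, v in zip(position, velocity))
```

The predicted position can then be compared against the reference observation center of an unknown object detected later, to decide whether the two tracks should be fused.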
Further, considering that the point cloud information detected by the radar at the target moment may include point cloud information of a plurality of target objects as well as noise points that do not belong to any target object, the point cloud information detected by the radar needs to be classified, that is, point clouds belonging to the same target object are grouped into one category. Specifically, the point cloud information detected by the radar is acquired and subjected to density clustering so as to divide it into at least one point cloud subset, where one point cloud subset corresponds to one object to be recognized.
Further, in order to determine to which target object the point cloud information detected by the radar at the target time belongs, density clustering processing may be performed on the point cloud information corresponding to the plurality of detection points, and data points belonging to the same target object may be classified into the same cluster.
In practical applications, due to the scanning characteristics of the radar, the point clouds of the detection points are distributed radially with the radar center as the origin: the closer a detection point is to the radar, the denser its corresponding point cloud. In other words, the distribution of points in the point cloud becomes more dispersed as the distance between the detection point and the radar increases. Based on this, the size of the neighborhood used when performing density clustering on the point cloud information, that is, the neighborhood size of the data points in the point cloud information, can be determined according to the distance between the radar and the detection point. Therefore, the neighborhood sizes of the point cloud information corresponding to the plurality of detection points can be determined from the distances between the radar and those detection points, and used as the neighborhood sizes for density clustering of the point cloud information. Determining the neighborhood size in this way takes into account the characteristic that the distribution of points becomes more dispersed with increasing distance between the detection point and the radar, which helps improve the adaptivity of the density clustering and, in turn, its accuracy.
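A sketch of such a distance-adaptive neighborhood: the radius grows with the radar-to-detection-point range so that sparser, more distant returns still cluster together. The linear form and the coefficients are illustrative assumptions, not values from the embodiment:

```python
def neighborhood_radius(range_m, eps0=0.2, k=0.01):
    """Distance-adaptive neighborhood size for density clustering: a base
    radius eps0 plus a term growing linearly with the radar-to-point range.
    eps0 and k are illustrative tuning parameters."""
    return eps0 + k * range_m
```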
The distance between the radar and a detection point can be obtained from the difference between the detection signal and its echo signal; the manner of obtaining it depends on the type of detection signal the radar transmits. For example, if the detection signal transmitted by the radar is a pulse signal, the distance between the radar and the detection point can be calculated from the time difference between the transmitted detection signal and the received echo signal, that is, by the time-of-flight method. Specifically, knowing the propagation speed of the detection and echo signals in the atmosphere, the distance can be calculated from that speed together with the time difference between the transmitted detection signal and the received echo signal. Alternatively, the detection signal may be an electromagnetic wave signal, such as a microwave signal or a laser signal, but is not limited thereto.
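The pulse case reduces to the time-of-flight relation d = v·Δt/2, where v is the propagation speed (approximately the speed of light for electromagnetic signals) and the factor of 2 accounts for the round trip:

```python
# Time-of-flight ranging for a pulsed detection signal (illustrative sketch).
# The propagation speed in the atmosphere is approximated by c.
C = 299_792_458.0  # m/s

def tof_distance(dt_seconds):
    """Distance from the time difference between the transmitted detection
    signal and the received echo; division by 2 accounts for the round trip."""
    return C * dt_seconds / 2.0
```

A round-trip delay of 1 microsecond corresponds to roughly 150 m of range.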
For another example, if the probe signal transmitted by the radar is a continuous wave signal, the distance between the radar and the probe point can be calculated according to the frequency difference between the probe signal transmitted by the radar and the received echo signal. Optionally, the Continuous Wave is a Frequency Modulated Continuous Wave (FMCW). The frequency modulation method may be triangular frequency modulation, sawtooth frequency modulation, code modulation, noise frequency modulation, or the like, but is not limited thereto.
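For a sawtooth-modulated FMCW signal, the standard range relation is R = c·Δf/(2·S), where Δf is the frequency difference (beat frequency) between the transmitted and received signals and S = B/T is the chirp slope, i.e. sweep bandwidth over sweep duration. The sketch below uses this textbook form as an illustration, not the patent's specific implementation:

```python
def fmcw_range(f_beat, bandwidth, sweep_time, c=299_792_458.0):
    """Range from the frequency difference between probe and echo signals
    for a sawtooth FMCW radar: R = c * f_beat / (2 * slope), slope = B / T."""
    slope = bandwidth / sweep_time  # Hz per second
    return c * f_beat / (2.0 * slope)
```

For example, with a 150 MHz sweep over 1 ms, a 100 kHz beat frequency corresponds to a target at roughly 100 m.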
Further, the size of the neighborhood used when the density clustering processing is performed on the point cloud information, that is, the size of the neighborhood of the data point in the point cloud information, can be determined according to the distance between the radar and the detection point. Further, density clustering can be performed on the point cloud information corresponding to the multiple detection points based on the determined size of the neighborhood, so that point cloud information corresponding to at least one object to be identified is obtained.
In the embodiment of the present application, the specific implementation of density clustering on the point cloud information corresponding to the plurality of detection points is not limited. Optionally, a Density-Based Spatial Clustering of Applications with Noise (DBSCAN) algorithm may be adopted to perform density clustering on the point cloud information corresponding to the plurality of detection points to obtain at least one point cloud subset, where one point cloud subset corresponds to one object to be recognized. Optionally, for a first data point, it is determined whether the number of data points contained in the neighborhood of the first data point is greater than or equal to a known number threshold; if so, the first data point and all data points density-reachable from it are clustered into one cluster, yielding the point cloud subset to which the first data point belongs. The first data point is any data point, among the point cloud information corresponding to the plurality of detection points, that has not yet been clustered. Accordingly, the first data point may be determined to be noise if the number of data points contained within its neighborhood is less than the known number threshold.
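The per-point decision described above (core point versus noise by neighbor count) can be sketched as follows, omitting the density-reachable cluster-expansion step of full DBSCAN:

```python
import math

def core_or_noise(point, cloud, eps, min_pts):
    """Return 'core' when the eps-neighborhood of `point` contains at least
    min_pts data points (the point itself included), else 'noise'. This is
    the per-point test DBSCAN applies before expanding clusters."""
    neighbors = [p for p in cloud if math.dist(point, p) <= eps]
    return "core" if len(neighbors) >= min_pts else "noise"
```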
To facilitate understanding of the above data processing, the following description is provided with reference to a specific embodiment shown in fig. 2. As shown in fig. 2, the data processing process mainly includes:
201. and acquiring point cloud information of the object to be identified, which is detected by the radar at the target moment.
202. And calculating a reference observation center, reference geometric characteristics and first energy information of the object to be identified at the target moment according to the point cloud information of the object to be identified detected by the radar at the target moment.
203. And acquiring a first target track, a weighting geometric characteristic and weighting energy information of the target object at historical time.
The first target trajectory of the target object at the historical moment may be the target observation center of the target object at the historical moment, which in some embodiments may also be referred to as the weighted observation center at the historical moment.
204. And calculating the wave gate radius r of the target object at the historical moment according to the weighted geometric characteristics of the target object at the historical moment.
205. And calculating the distance d between the reference observation center of the object to be recognized at the target moment and the first target track of the target object.
206. And judging whether the distance d between the reference observation center of the object to be identified at the target moment and the first target track is smaller than or equal to the wave gate radius r of the target object at the historical moment. If the judgment result is yes, determining that the object to be identified is the target object, and executing step 207; and if the judgment result is negative, determining that the object to be identified is not the target object.
207. And calculating an observation weighting factor according to the first energy information of the object to be identified, the reference observation center of the target moment and the reference geometric characteristics.
208. And calculating the target observation center of the target object at the target moment based on the observation weighting factor and the reference observation center of the object to be identified at the target moment.
209. And fusing the target observation center of the target object at the target moment with the first target track to generate a second target track of the target object.
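Steps 201 to 209 can be sketched end to end as follows; the helper logic (simple coordinate averaging for the observation center, and midpoint fusion for step 209) is an illustrative stand-in for the weighted computations described earlier, not the patent's exact formulas:

```python
import math

def process_frame(cloud, track_point, gate_radius):
    """One update cycle: reference observation center from the frame's point
    cloud (step 202), gating against the first target trajectory (steps
    204-206), then fusing the observation into the track (steps 207-209,
    simplified here to midpoint fusion)."""
    n = len(cloud)
    ref_center = tuple(sum(c[i] for c in cloud) / n for i in range(3))
    d = math.dist(ref_center, track_point)
    if d > gate_radius:
        return None  # the object to be identified is not the target object
    return tuple((r + t) / 2 for r, t in zip(ref_center, track_point))
```

Returning None models the "judgment result is negative" branch of step 206; a real implementation would instead weight the fusion by the observation weighting factor of step 207.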
It should be noted that the execution subjects of the steps of the methods provided in the above embodiments may be the same device, or different devices may be used as the execution subjects of the methods. For example, the execution subject of steps 101 and 102 may be device a; for another example, the execution subject of step 101 may be device a, and the execution subject of step 102 may be device B; and so on.
In addition, in some of the flows described in the above embodiments and the drawings, operations that appear in a particular order are included, but it should be clearly understood that these operations may be performed out of the order they appear herein or in parallel, and the order of the operations, such as 201, 202, etc., is merely used to distinguish between the various operations, and the order itself does not represent any order of execution. Additionally, the flows may include more or fewer operations, and the operations may be performed sequentially or in parallel.
Accordingly, embodiments of the present application also provide a computer-readable storage medium storing computer instructions, which when executed by one or more processors, cause the one or more processors to perform actions comprising: acquiring first space information and first energy information of a target object detected by a radar at a target moment; calculating a target observation center of the target object at the target moment according to the first energy information and the first space information; fusing the target observation center and a first target track of a target object at a historical moment to generate a second target track of the target object; the first target track is determined according to second spatial information and second energy information of the target object at the historical moment; and the historical time is before the target time.
Optionally, when the computer instructions are executed by one or more processors, the one or more processors are caused to perform the relevant steps in fig. 1, fig. 2 and the optional embodiments, which refer to the relevant contents of the above embodiments specifically, and are not described herein again.
Fig. 3 is a data processing apparatus according to an embodiment of the present application. As shown in fig. 3, the apparatus includes: a first acquisition module 30a, a calculation module 30b and a first fusion module 30 c.
In this embodiment, the first obtaining module 30a is configured to obtain first spatial information and first energy information of a target object detected by the radar at a target time. And the calculating module 30b is configured to calculate a target observation center of the target object at the target time according to the first energy information and the first spatial information. The first fusion module 30c is configured to fuse the target observation center with the first target trajectory of the target object at the historical time to generate a second target trajectory of the target object. The first target track is determined according to second spatial information and second energy information of the target object at the historical moment; the historical time is prior to the target time.
Optionally, the historical time is a detection time closest to the target time.
In some embodiments, the first obtaining module 30a, when obtaining the first spatial information of the target object detected by the radar at the target time, is specifically configured to: and calculating a reference observation center and reference geometric characteristics of the target object at the target time according to the point cloud information of the target object at the target time, wherein the reference observation center and the reference geometric characteristics are used as first spatial information of the target object.
Further, the point cloud information of the target object at the target time includes: a plurality of position coordinates of the target object. Correspondingly, when calculating the reference observation center of the target object at the target time, the first obtaining module 30a is specifically configured to: an average coordinate value of the plurality of position coordinates is calculated as a reference observation center of the target object at the target time.
Optionally, the point cloud information of the target object at the target time includes: a plurality of position coordinates of the target object at the target time. Correspondingly, when calculating the reference geometric feature of the target object at the target time, the first obtaining module 30a is specifically configured to: determining a cuboid according to the maximum value and the minimum value of the position coordinates, and taking the side length of the cuboid as a reference geometric characteristic of the target object at the target moment; or determining an ellipsoid according to the maximum value and the minimum value of the position coordinates, and taking the length of the central axis of the ellipsoid as the reference geometric characteristic of the target object at the target moment; or, determining a sphere according to the maximum value and the minimum value of the position coordinates, and taking the radius of the sphere as the reference geometric characteristic of the target object at the target moment.
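The two computations described here, the mean coordinate as reference observation center and the bounding-cuboid side lengths as reference geometric feature, can be sketched as:

```python
def reference_features(points):
    """First spatial information of the target object: the mean coordinate as
    the reference observation center, and the side lengths of the axis-aligned
    bounding cuboid (from the coordinate extrema) as the reference geometric
    feature."""
    n = len(points)
    center = tuple(sum(p[i] for p in points) / n for i in range(3))
    sides = tuple(max(p[i] for p in points) - min(p[i] for p in points)
                  for i in range(3))
    return center, sides
```

The ellipsoid and sphere variants differ only in which quantity (central-axis lengths or radius) is derived from the same extrema.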
In other embodiments, the point cloud information of the target object at the target time comprises: a plurality of energy values of the target object at the target time. Accordingly, when acquiring the first energy information of the target object detected by the radar, the first acquiring module 30a is specifically configured to: an average energy value of the plurality of energy values is calculated as the first energy information.
In still other embodiments, the calculating module 30b is specifically configured to, when calculating the target observation center of the target object: calculating an observation weighting factor according to the first energy information, the reference observation center and the reference geometric characteristics; and calculating a target observation center of the target object based on the observation weighting factor and the reference observation center.
Further, when calculating the observation weighting factor, the calculating module 30b is specifically configured to: calculating the ratio of the reference geometric features to the weighted geometric features of the target object at the historical moment, and taking the ratio as a geometric feature weighting factor; calculating a ratio of the first energy information to weighted energy information of the target object at the historical moment, and taking the ratio as an energy weighting factor; calculating an observation weighting factor according to the distance weighting factor, the geometric feature weighting factor and the energy weighting factor; and the weighted geometric characteristics and the weighted energy information of the target object at the historical moment are calculated according to the second spatial information and the second energy information.
Optionally, the calculation module 30b is further configured to: and calculating the wave gate radius of the target object at the historical moment according to the weighted geometric characteristics of the target object at the historical moment.
Optionally, before calculating the target observation center of the target object at the target time, the calculating module 30b is further configured to: calculate a reference observation center of at least one object to be identified detected by the radar at the target moment, according to point cloud information corresponding to the at least one object to be identified at the target moment; and identify the target object from the at least one object to be identified according to the first target trajectory and the reference observation center of the at least one object to be identified at the target moment.
Further, when the target object is identified from the at least one object to be identified, the calculating module 30b is specifically configured to: for the first object to be recognized, judging whether the distance between the reference observation center of the first object to be recognized at the target moment and the first target track is smaller than or equal to the wave gate radius of the target object at the historical moment; if the judgment result is yes, determining that the first object to be identified is the target object; the first object to be identified is any one of the at least one object to be identified.
Optionally, the calculation module 30b is further configured to, before calculating the reference observation center of the at least one object to be identified detected by the radar: obtaining distances between the radar and a plurality of detection points detected by the radar at a target moment and point cloud information corresponding to the detection points; determining the size of the neighborhood of point cloud information corresponding to the detection points according to the distance between the radar and the detection points at the target moment; and performing density clustering on the point cloud information corresponding to the plurality of detection points based on the size of the neighborhood to obtain the point cloud information corresponding to at least one object to be identified at the target moment.
In some other embodiments, if a plurality of objects among the at least one object to be identified are determined as the target object, the calculating module 30b is specifically configured to: for a first target object, calculate a reference observation weighting factor of each of the plurality of objects according to the first energy information and the reference observation center and reference geometric features of each of the plurality of objects determined as the target object; and normalize the reference observation weighting factor of the first target object by the reference observation weighting factors of the plurality of objects to obtain the observation weighting factor of the first target object, where the first target object is any one of the plurality of objects.
Further, the data processing apparatus further includes: a second acquisition module 30d, a prediction module 30e and a second fusion module 30f. Optionally, if the first object to be identified is neither the target object nor any of the other objects determined at the historical moment, the second acquisition module 30d is configured to: acquire the kinematic parameters of the device carrying the radar when the radar detects the first object to be identified. Accordingly, the prediction module 30e is configured to: predict trajectory information of the first object to be identified at a subsequent moment according to those kinematic parameters. The second fusion module 30f is configured to: perform trajectory fusion between an unknown object and the first object to be identified according to the predicted trajectory information of the first object to be identified at the subsequent moment and the reference observation center of the unknown object detected by the radar at the subsequent moment.
Optionally, the kinematic parameters of the device carrying the radar when the radar detects the first object to be identified include at least one of: position information, acceleration information, velocity information and a direction of motion of the device when the radar detects the first object to be identified.
Further, in some embodiments, the data processing apparatus may further include: and a guide module 30 g. Accordingly, the guidance module 30g is configured to: and guiding the equipment with the radar to track the target object according to the second target track.
In another embodiment, the data processing apparatus may further include: and a reduction module 30 h. Accordingly, the reduction module 30h is configured to: and restoring the working environment of the equipment with the radar according to the second target track and other target tracks of the target object before the second target track.
The data processing apparatus according to this embodiment may calculate a target observation center of the target object based on the spatial information and the energy information of the target object, and fuse this observation center with the target trajectory of the target object at a historical moment to generate a new target trajectory of the target object. By integrating the spatial information and the energy information of the target object when determining its target trajectory, the accuracy of the target trajectory is improved, and thus the accuracy of subsequent work that uses the trajectory of the target object is improved. For example, when tracking a target, this helps improve the accuracy of locating the target object, thereby improving the accuracy of target tracking, and so on.
Fig. 4 is a schematic structural diagram of a radar according to an embodiment of the present application. As shown in fig. 4, the radar includes: a memory 40a and a processor 40 b. In the present embodiment, the memory 40a is used to store a computer program and a first target trajectory of a target object at a historical time; the first target trajectory is determined according to second spatial information and second energy information of the target object at the historical time.
In this embodiment, the processor is coupled to the memory for executing the computer program for: acquiring first space information and first energy information of a target object detected by a radar at a target moment; wherein the historical time is before the target time; calculating a target observation center of the target object at the target moment according to the first energy information and the first space information; and fusing the target observation center with the first target track to generate a second target track of the target object.
Optionally, the historical time is a detection time closest to the target time.
In some embodiments, the processor 40b, when acquiring the first spatial information of the target object detected by the radar at the target time, is specifically configured to: and calculating a reference observation center and reference geometric characteristics of the target object at the target time according to the point cloud information of the target object at the target time, wherein the reference observation center and the reference geometric characteristics are used as first spatial information of the target object.
Further, the point cloud information of the target object at the target time includes: a plurality of position coordinates of the target object. Accordingly, when calculating the reference observation center of the target object at the target time, the processor 40b is specifically configured to: an average coordinate value of the plurality of position coordinates is calculated as a reference observation center of the target object at the target time.
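For instance, the reference observation center described above is simply the per-axis mean of the point-cloud position coordinates:

```python
import numpy as np

def reference_observation_center(coords):
    """Average coordinate value of the plurality of position coordinates,
    used as the reference observation center at the target time."""
    return np.asarray(coords, dtype=float).mean(axis=0)

center = reference_observation_center([[0, 0, 0], [2, 0, 0], [1, 3, 0]])
# center -> array([1., 1., 0.])
```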
Optionally, the point cloud information of the target object at the target time includes: a plurality of position coordinates of the target object at the target time. Accordingly, when calculating the reference geometric feature of the target object at the target time, the processor 40b is specifically configured to: determining a cuboid according to the maximum value and the minimum value of the position coordinates, and taking the side length of the cuboid as a reference geometric characteristic of the target object at the target moment; or determining an ellipsoid according to the maximum value and the minimum value of the position coordinates, and taking the length of the central axis of the ellipsoid as the reference geometric characteristic of the target object at the target moment; or, determining a sphere according to the maximum value and the minimum value of the position coordinates, and taking the radius of the sphere as the reference geometric characteristic of the target object at the target moment.
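A minimal sketch of the first (cuboid) option: the per-axis extents of the point cloud give the side lengths of the bounding cuboid. The ellipsoid and sphere variants follow analogously from the same per-axis minima and maxima.

```python
import numpy as np

def reference_geometric_feature(coords):
    """Side lengths of the axis-aligned cuboid spanned by the per-axis
    maxima and minima of the position coordinates."""
    coords = np.asarray(coords, dtype=float)
    return coords.max(axis=0) - coords.min(axis=0)

sides = reference_geometric_feature([[0, 0, 0], [2, 1, 0], [1, 3, 4]])
# sides -> array([2., 3., 4.])
```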
In other embodiments, the point cloud information of the target object at the target time comprises: a plurality of energy values of the target object at the target time. Accordingly, the processor 40b, when acquiring the first energy information of the target object detected by the radar, is specifically configured to: an average energy value of the plurality of energy values is calculated as the first energy information.
In still other embodiments, the processor 40b, when calculating the target observation center of the target object, is specifically configured to: calculating an observation weighting factor according to the first energy information, the reference observation center and the reference geometric characteristics; and calculating a target observation center of the target object based on the observation weighting factor and the reference observation center.
Further, when calculating the observation weighting factor, the processor 40b is specifically configured to: calculate the ratio of the reference geometric features to the weighted geometric features of the target object at the historical time as a geometric feature weighting factor; calculate the ratio of the first energy information to the weighted energy information of the target object at the historical time as an energy weighting factor; and calculate the observation weighting factor according to the distance weighting factor, the geometric feature weighting factor, and the energy weighting factor. The weighted geometric features and the weighted energy information of the target object at the historical time are calculated according to the second spatial information and the second energy information.
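Treating the geometric feature and energy as scalars, the two ratios named above can be combined with a distance weighting factor as sketched below. The product rule (and the default distance factor of 1.0) is an assumption, since the embodiment does not fix how the three factors are combined.

```python
def observation_weighting_factor(ref_geom, weighted_geom,
                                 first_energy, weighted_energy,
                                 distance_factor=1.0):
    """Geometric-feature and energy weighting factors are the ratios
    named in the text; combining the three factors by multiplication
    is an assumption for illustration."""
    geom_factor = ref_geom / weighted_geom          # reference vs. historical (weighted)
    energy_factor = first_energy / weighted_energy  # current vs. historical (weighted)
    return distance_factor * geom_factor * energy_factor

w = observation_weighting_factor(2.0, 4.0, 3.0, 6.0)
# w -> 0.25
```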
Optionally, the processor 40b is further configured to: and calculating the wave gate radius of the target object at the historical moment according to the weighted geometric characteristics of the target object at the historical moment.
Optionally, before calculating the target observation center of the target object at the target time, the processor 40b is further configured to: calculate a reference observation center of at least one object to be identified that the radar detects at the target time, according to the point cloud information corresponding to the at least one object to be identified at the target time; and identify the target object from the at least one object to be identified according to the first target trajectory and the reference observation center of the at least one object to be identified at the target time.
Further, when identifying the target object from the at least one object to be identified, the processor 40b is specifically configured to: for a first object to be identified, judge whether the distance between the reference observation center of the first object to be identified at the target time and the first target trajectory is less than or equal to the wave gate radius of the target object at the historical time; and if so, determine that the first object to be identified is the target object. The first object to be identified is any one of the at least one object to be identified.
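The gate test can be sketched as a Euclidean-distance check. Using the trajectory's most recent point as its position is an assumption; the text only speaks of the distance to the first target trajectory.

```python
import numpy as np

def within_gate(ref_center, track_point, gate_radius):
    """True when the candidate's reference observation center lies within
    the wave gate radius of the track's historical point."""
    d = np.linalg.norm(np.asarray(ref_center, dtype=float)
                       - np.asarray(track_point, dtype=float))
    return d <= gate_radius

hit = within_gate([1.0, 0.0, 0.0], [0.0, 0.0, 0.0], 1.5)   # -> True
miss = within_gate([1.0, 0.0, 0.0], [0.0, 0.0, 0.0], 0.5)  # -> False
```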
Optionally, the processor 40b, before calculating the reference observation center of the at least one object to be identified detected by the radar, is further configured to: obtaining distances between the radar and a plurality of detection points detected by the radar at a target moment and point cloud information corresponding to the detection points; determining the size of the neighborhood of point cloud information corresponding to the detection points according to the distance between the radar and the detection points at the target moment; and performing density clustering on the point cloud information corresponding to the plurality of detection points based on the size of the neighborhood to obtain the point cloud information corresponding to at least one object to be identified at the target moment.
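A toy version of the distance-dependent density clustering: each point's neighbourhood radius grows with its range from the radar, and points are grouped by flood-filling over those neighbourhoods. The linear radius rule (`base_eps + k * range`) and the single-link grouping are assumptions; the embodiment only requires that the neighbourhood size depend on the radar-to-point distance.

```python
import numpy as np

def adaptive_cluster(points, ranges, base_eps=0.5, k=0.01):
    """Density clustering with a per-point neighbourhood radius that
    increases with the point's range from the radar (an assumed rule)."""
    points = np.asarray(points, dtype=float)
    eps = base_eps + k * np.asarray(ranges, dtype=float)
    labels = -np.ones(len(points), dtype=int)   # -1 means unassigned
    current = 0
    for i in range(len(points)):
        if labels[i] != -1:
            continue
        labels[i] = current
        stack = [i]
        while stack:                            # flood-fill over neighbourhoods
            j = stack.pop()
            d = np.linalg.norm(points - points[j], axis=1)
            for m in np.where((d <= eps[j]) & (labels == -1))[0]:
                labels[m] = current
                stack.append(m)
        current += 1
    return labels

labels = adaptive_cluster([[0, 0], [0.1, 0], [5, 5], [5.1, 5]],
                          ranges=[1.0, 1.0, 7.1, 7.1])
# two clusters: labels[0] == labels[1], labels[2] == labels[3]
```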
In some other embodiments, if multiple objects among the at least one object to be identified are determined to be the target object, the processor 40b is specifically configured to: for a first target object, calculate a reference observation weighting factor for each of the multiple objects determined to be the target object, according to the first energy information and the reference observation center and reference geometric features of each of those objects; and normalize the reference observation weighting factor of the first target object using the reference observation weighting factors of the multiple objects, to obtain the observation weighting factor of the first target object. The first target object is any one of the multiple objects.
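The normalization step may simply divide one object's reference observation weighting factor by the sum over all objects associated with the track; dividing by the sum is the usual reading and is assumed here.

```python
def normalized_weighting_factor(ref_factors, index):
    """Observation weighting factor of the object at `index`, normalised
    by the reference factors of all objects determined to be the target.
    Sum-normalisation is an assumption; the text only says 'normalization'."""
    return ref_factors[index] / sum(ref_factors)

w0 = normalized_weighting_factor([2.0, 1.0, 1.0], 0)
# w0 -> 0.5
```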
Further, if the first object to be identified is neither the target object nor any of the other objects determined at the historical time, the processor 40b is further configured to: acquire kinematic parameters of the device carrying the radar at the moment the radar detects the first object to be identified; predict trajectory information of the first object to be identified at a subsequent time according to those kinematic parameters; and perform trajectory fusion between the first object to be identified and an unknown object detected by the radar at the subsequent time, according to the predicted trajectory information and the reference observation center of the unknown object.
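The prediction step might use a simple constant-acceleration model built from the listed kinematic parameters; the motion model itself is an assumption, since the embodiment lists the available parameters without fixing a model.

```python
import numpy as np

def predict_position(position, velocity, acceleration, dt):
    """Constant-acceleration extrapolation of a position after `dt`
    seconds, from kinematic parameters at detection time (assumed model)."""
    p = np.asarray(position, dtype=float)
    v = np.asarray(velocity, dtype=float)
    a = np.asarray(acceleration, dtype=float)
    return p + v * dt + 0.5 * a * dt ** 2

nxt = predict_position([0, 0, 0], [1, 0, 0], [0, 2, 0], 1.0)
# nxt -> array([1., 1., 0.])
```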
Optionally, the kinematic parameters of the device carrying the radar when the radar detects the first object to be identified include at least one of: position information, acceleration information, velocity information, and a direction of motion of the device at that time.
Further, in some embodiments, the processor 40b is further configured to: guide the device carrying the radar to track the target object according to the second target trajectory; or restore the working environment of the device carrying the radar according to the second target trajectory and other target trajectories of the target object prior to the second target trajectory.
In some alternative embodiments, as shown in fig. 4, the radar may further include optional components such as a communication component 40c, a horizontal angle detection device 40d, an electrical scanning angle measurement device 40e, and a power supply component 40f. Only some of the components are shown schematically in fig. 4; this does not mean that the radar must include all of the components shown in fig. 4, nor that the radar can include only those components.
In the present embodiment, the communication component 40c is configured to transmit a detection signal and receive the echo signal reflected when the detection signal reaches a detection point. Optionally, as shown in fig. 4, the communication component 40c may include, but is not limited to: a transmitter 40c1, a receiver 40c2, and an antenna 40c3. The functions and implementation forms of the transmitter 40c1, the receiver 40c2, and the antenna 40c3 are common knowledge in the art and are not described herein.
In the present embodiment, the horizontal angle detection device 40d may measure the rotational position of the radar, and the electrical scanning angle measurement device 40e may measure the deviation angle of the target relative to the axial direction of the transmitter 40c1. For the arrangement of the horizontal angle detection device 40d, reference may be made to the related contents of the above embodiments, which are not repeated herein. Optionally, the horizontal angle detection device 40d includes: a photoelectric sensor 40d1 and a grating disk 40d2. The arrangement, implementation, and operating principle of the photoelectric sensor 40d1 and the grating disk 40d2 belong to common knowledge in the art and are not repeated herein.
The radar provided by this embodiment can calculate a target observation center of the target object according to the spatial information and the energy information of the target object, and fuse the observation center with the target trajectory of the target object at a historical time to generate a new target trajectory of the target object. Because both the spatial information and the energy information of the target object are taken into account when determining the target trajectory, the accuracy of the trajectory is improved, and so is the accuracy of any subsequent work that relies on it. For example, during target tracking, a more accurate trajectory helps to position the target object more precisely, thereby improving the tracking accuracy.
Fig. 5 is a schematic structural diagram of a detection apparatus according to an embodiment of the present application. As shown in fig. 5, the detection apparatus includes: a memory 50a and a processor 50b. The detection apparatus is further equipped with a radar 50c.
In the present embodiment, the radar 50c is configured to acquire first spatial information and first energy information of a target object detected at a target time.
The memory 50a stores a computer program and a first target trajectory of a target object at a historical time; the first target track is determined according to second spatial information and second energy information of the target object at the historical moment; the historical time is prior to the target time.
The processor 50b is coupled to the memory 50a for executing a computer program for: calculating a target observation center of the target object at the target moment according to the first energy information and the first space information; and fusing the target observation center and the first target track to generate a second target track of the target object.
Optionally, the historical time is a detection time closest to the target time.
In some embodiments, when acquiring the first spatial information of the target object detected at the target time, the radar 50c is specifically configured to: calculate a reference observation center and reference geometric features of the target object at the target time according to the point cloud information of the target object at the target time, and use them as the first spatial information of the target object.
Further, the point cloud information of the target object at the target time includes: a plurality of position coordinates of the target object. Accordingly, when calculating the reference observation center of the target object at the target time, the radar 50c is specifically configured to: an average coordinate value of the plurality of position coordinates is calculated as a reference observation center of the target object at the target time.
Optionally, the point cloud information of the target object at the target time includes: a plurality of position coordinates of the target object at the target time. Accordingly, when calculating the reference geometric feature of the target object at the target time, the radar 50c is specifically configured to: determining a cuboid according to the maximum value and the minimum value of the position coordinates, and taking the side length of the cuboid as a reference geometric characteristic of the target object at the target moment; or determining an ellipsoid according to the maximum value and the minimum value of the position coordinates, and taking the length of the central axis of the ellipsoid as the reference geometric characteristic of the target object at the target moment; or, determining a sphere according to the maximum value and the minimum value of the position coordinates, and taking the radius of the sphere as the reference geometric characteristic of the target object at the target moment.
In other embodiments, the point cloud information of the target object at the target time comprises: a plurality of energy values of the target object at the target time. Accordingly, when acquiring the first energy information of the target object detected by the radar, the radar 50c is specifically configured to: an average energy value of the plurality of energy values is calculated as the first energy information.
In still other embodiments, the processor 50b, when calculating the target observation center of the target object, is specifically configured to: calculating an observation weighting factor according to the first energy information, the reference observation center and the reference geometric characteristics; and calculating a target observation center of the target object based on the observation weighting factor and the reference observation center.
Further, when calculating the observation weighting factor, the processor 50b is specifically configured to: calculate the ratio of the reference geometric features to the weighted geometric features of the target object at the historical time as a geometric feature weighting factor; calculate the ratio of the first energy information to the weighted energy information of the target object at the historical time as an energy weighting factor; and calculate the observation weighting factor according to the distance weighting factor, the geometric feature weighting factor, and the energy weighting factor. The weighted geometric features and the weighted energy information of the target object at the historical time are calculated according to the second spatial information and the second energy information.
Optionally, the processor 50b is further configured to: and calculating the wave gate radius of the target object at the historical moment according to the weighted geometric characteristics of the target object at the historical moment.
Optionally, before calculating the target observation center of the target object at the target time, the processor 50b is further configured to: calculate a reference observation center of at least one object to be identified that the radar detects at the target time, according to the point cloud information corresponding to the at least one object to be identified at the target time; and identify the target object from the at least one object to be identified according to the first target trajectory and the reference observation center of the at least one object to be identified at the target time.
Further, when identifying the target object from the at least one object to be identified, the processor 50b is specifically configured to: for a first object to be identified, judge whether the distance between the reference observation center of the first object to be identified at the target time and the first target trajectory is less than or equal to the wave gate radius of the target object at the historical time; and if so, determine that the first object to be identified is the target object. The first object to be identified is any one of the at least one object to be identified.
Optionally, the processor 50b, before calculating the reference observation center of the at least one object to be identified detected by the radar, is further configured to: obtaining distances between the radar and a plurality of detection points detected by the radar at a target moment and point cloud information corresponding to the detection points; determining the size of the neighborhood of point cloud information corresponding to the detection points according to the distance between the radar and the detection points at the target moment; and performing density clustering on the point cloud information corresponding to the plurality of detection points based on the size of the neighborhood to obtain the point cloud information corresponding to at least one object to be identified at the target moment.
In some other embodiments, if multiple objects among the at least one object to be identified are determined to be the target object, the processor 50b is specifically configured to: for a first target object, calculate a reference observation weighting factor for each of the multiple objects determined to be the target object, according to the first energy information and the reference observation center and reference geometric features of each of those objects; and normalize the reference observation weighting factor of the first target object using the reference observation weighting factors of the multiple objects, to obtain the observation weighting factor of the first target object. The first target object is any one of the multiple objects.
Further, if the first object to be identified is neither the target object nor any of the other objects determined at the historical time, the processor 50b is further configured to: acquire kinematic parameters of the device carrying the radar at the moment the radar detects the first object to be identified; predict trajectory information of the first object to be identified at a subsequent time according to those kinematic parameters; and perform trajectory fusion between the first object to be identified and an unknown object detected by the radar at the subsequent time, according to the predicted trajectory information and the reference observation center of the unknown object.
Optionally, the kinematic parameters of the device carrying the radar when the radar detects the first object to be identified include at least one of: position information, acceleration information, velocity information, and a direction of motion of the device at that time.
Further, in some embodiments, the processor 50b is further configured to: guide the device carrying the radar to track the target object according to the second target trajectory; or restore the working environment of the device carrying the radar according to the second target trajectory and other target trajectories of the target object prior to the second target trajectory.
Optionally, the detection device is a drone, an unmanned vehicle, a robot or a ship, and the like, but is not limited thereto.
In some alternative embodiments, as shown in fig. 5, the detection apparatus may further include optional components such as a power component 50d, a communication component 50f, a drive component 50g, a display component 50h, an audio component 50i, or one or more sensors 50j. The detection apparatus may take different implementation forms, and the other components it includes differ accordingly. These other components are well known in the art of the detection apparatus concerned and are not detailed herein. Only some of the components are shown schematically in fig. 5; this does not mean that the detection apparatus must include all of the components shown in fig. 5, nor that it can include only those components.
The detection apparatus provided by the embodiment of the application includes a radar and a processor. Working together, the radar and the processor can calculate a target observation center of the target object according to the spatial information and the energy information of the target object, and fuse the observation center with the target trajectory of the target object at a historical time to generate a new target trajectory of the target object. Because both the spatial information and the energy information of the target object are taken into account when determining the target trajectory, the accuracy of the trajectory is improved, and so is the accuracy of any subsequent work that relies on it. For example, during target tracking, a more accurate trajectory helps to position the target object more precisely, thereby improving the tracking accuracy.
Fig. 6 is a schematic structural diagram of a mobile device according to an embodiment of the present application. As shown in fig. 6, the mobile device is equipped with a radar 60a. The radar 60a is configured to: acquire first spatial information and first energy information of a target object detected at a target time; calculate a target observation center of the target object at the target time according to the first energy information and the first spatial information; and fuse the target observation center with a first target trajectory of the target object at a historical time to generate a second target trajectory of the target object. The first target trajectory is determined according to second spatial information and second energy information of the target object at the historical time; the historical time is before the target time.
For specific embodiments of the first spatial information and the first energy information of the target object obtained by the radar 60a, calculating the target observation center of the target object at the target time, and fusing the target observation center and the first target trajectory of the target object at the historical time, reference may be made to relevant contents of the above embodiments, and details are not repeated here. For the specific implementation and structure of the radar 60a, reference is also made to the related contents of the above embodiments, and details are not repeated here.
In some embodiments, the mobile device further includes a processor 60b. Optionally, the radar 60a may provide the second target trajectory of the target object, and other target trajectories preceding it, to the processor 60b. Accordingly, the processor 60b may guide the mobile device to track the target object according to the second target trajectory, or restore the motion environment of the mobile device according to the second target trajectory and the other trajectories of the target object preceding it.
In some alternative embodiments, as shown in fig. 6, the mobile device may further include: a memory 60c, a power component 60d, a communication component 60f, a drive component 60g, a display component 60h, an audio component 60i, or one or more sensors 60j. The mobile device may take different implementation forms, and the other components it includes differ accordingly. These other components are well known in the art of mobile devices and are not detailed herein. Only some of the components are shown schematically in fig. 6; this does not mean that the mobile device must include all of the components shown in fig. 6, nor that it can include only those components.
The mobile device provided in this embodiment is equipped with a radar. The radar can calculate a target observation center of the target object according to the spatial information and the energy information of the target object, and fuse the observation center with the target trajectory of the target object at a historical time to generate a new target trajectory of the target object. Because both the spatial information and the energy information of the target object are taken into account when determining the target trajectory, the accuracy of the trajectory is improved, and so is the accuracy of any subsequent work that relies on it. For example, during target tracking, a more accurate trajectory helps to position the target object more precisely, thereby improving the tracking accuracy.
In embodiments of the present application, the memory is used to store computer programs and may be configured to store other various data to support operations on the device on which it is located. The processor may execute a computer program stored in the memory to implement the corresponding control logic. The memory may be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disks, or optical disks.
In the embodiments of the present application, the processor may be any hardware processing device that can execute the above-described method logic. Optionally, the processor may be a central processing unit (CPU), a graphics processing unit (GPU), or a microcontroller unit (MCU); a programmable device such as a field-programmable gate array (FPGA), a programmable array logic device (PAL), a general array logic device (GAL), or a complex programmable logic device (CPLD); or an advanced RISC machine (ARM) processor or a system on chip (SoC), etc., but is not limited thereto.
In embodiments of the present application, the communication component may also be configured to facilitate wired or wireless communication between the device in which it is located and other devices. The device can access a wireless network based on a communication standard, such as WiFi, 2G or 3G, 4G, 5G or a combination thereof. In an exemplary embodiment, the communication component receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component may also be implemented based on Near Field Communication (NFC) technology, Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, or other technologies.
In the embodiment of the present application, the display assembly may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the display assembly includes a touch panel, the display assembly may be implemented as a touch screen to receive input signals from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
In this embodiment, the power supply component is configured to provide power to the various components of the device in which it is located. The power components may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device in which the power component is located.
In this embodiment, the audio component may be configured to output and/or input an audio signal. For example, the audio component includes a Microphone (MIC) configured to receive an external audio signal when the device in which the audio component is located is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may further be stored in a memory or transmitted via a communication component. In some embodiments, the audio assembly further comprises a speaker for outputting audio signals. For example, for devices with language interaction functionality, voice interaction with a user may be enabled through an audio component, and so forth.
It should be noted that, the descriptions of "first", "second", etc. in this document are used for distinguishing different messages, devices, modules, etc., and do not represent a sequential order, nor limit the types of "first" and "second" to be different.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied in the medium.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, special-purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process, such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape/magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, a computer-readable medium does not include a transitory computer-readable medium such as a modulated data signal or a carrier wave.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (23)

1. A data processing method, comprising:
acquiring first spatial information and first energy information of a target object detected by a radar at a target moment;
calculating a target observation center of the target object at the target moment according to the first energy information and the first space information;
fusing the target observation center with a first target track of the target object at a historical moment to generate a second target track of the target object;
wherein the first target trajectory is determined according to second spatial information and second energy information of the target object at the historical time; the historical time is before the target time.
2. The method of claim 1, wherein the obtaining first spatial information of the target object detected by the radar at the target time comprises:
calculating a reference observation center and a reference geometric feature of the target object at the target time according to the point cloud information of the target object at the target time, and using the reference observation center and the reference geometric feature as the first spatial information of the target object.
3. The method of claim 2, wherein the point cloud information of the target object at the target time comprises: a plurality of position coordinates of the target object; the calculating the reference observation center of the target object at the target moment according to the point cloud information of the target object comprises the following steps:
calculating an average coordinate value of the plurality of position coordinates as the reference observation center of the target object at the target time.
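The averaging step of claim 3 can be sketched as follows; the three-point cloud and the function name are hypothetical, chosen only for illustration.

```python
def reference_observation_center(points):
    """Average the position coordinates of an object's point cloud to
    obtain its reference observation center (the step of claim 3)."""
    n = len(points)
    dims = len(points[0])
    return tuple(sum(p[d] for p in points) / n for d in range(dims))

# Hypothetical point cloud of one detected object, as (x, y, z) coordinates.
cloud = [(1.0, 2.0, 0.0), (3.0, 4.0, 0.0), (2.0, 3.0, 3.0)]
center = reference_observation_center(cloud)  # (2.0, 3.0, 1.0)
```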
4. The method of claim 2, wherein the point cloud information of the target object at the target time comprises: a plurality of position coordinates of the target object at the target time; the calculating the reference geometric feature of the target object at the target moment according to the point cloud information of the target object comprises the following steps:
determining a cuboid according to the maximum and minimum values of the position coordinates, and taking the side lengths of the cuboid as the reference geometric feature of the target object at the target moment;
or,
determining an ellipsoid according to the maximum and minimum values of the position coordinates, and taking the axis lengths of the ellipsoid as the reference geometric feature of the target object at the target moment;
or,
determining a sphere according to the maximum and minimum values of the position coordinates, and taking the radius of the sphere as the reference geometric feature of the target object at the target moment.
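The first alternative of claim 4, a cuboid spanned by the coordinate extrema, can be sketched as below; the sample cloud is hypothetical.

```python
def cuboid_side_lengths(points):
    """Side lengths of the axis-aligned cuboid spanned by the maximum and
    minimum coordinates of the point cloud (one alternative of claim 4)."""
    dims = len(points[0])
    return tuple(max(p[d] for p in points) - min(p[d] for p in points)
                 for d in range(dims))

cloud = [(0.0, 0.0, 0.0), (2.0, 1.0, 0.5), (1.0, 3.0, 0.2)]
sides = cuboid_side_lengths(cloud)  # (2.0, 3.0, 0.5)
```

The ellipsoid and sphere alternatives differ only in how the extrema are reduced to axis lengths or a radius.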
5. The method of claim 2, wherein the point cloud information of the target object at the target time comprises: a plurality of energy values of the target object at the target time; the acquiring first energy information of a target object detected by a radar includes:
calculating an average energy value of the plurality of energy values as the first energy information.
6. The method of claim 2, wherein calculating a target observation center of the target object at the target time based on the first energy information and the first spatial information comprises:
calculating an observation weighting factor according to the first energy information, the reference observation center and the reference geometric feature;
and calculating the target observation center of the target object at the target moment based on the observation weighting factor and the reference observation center.
7. The method of claim 6, wherein said calculating an observation weighting factor based on said first energy information, said reference observation center, and said reference geometric feature comprises:
calculating the ratio of the distance between the reference observation center and the first target track to the wave gate radius of the target object at the historical moment as a distance weighting factor;
calculating the ratio of the reference geometric feature to the weighted geometric feature of the target object at the historical moment, and taking the ratio as a geometric feature weighting factor;
calculating a ratio between the first energy information and weighted energy information of the target object at the historical moment as an energy weighting factor;
calculating the observation weighting factor according to the distance weighting factor, the geometric feature weighting factor and the energy weighting factor;
wherein the weighted geometric feature and the weighted energy information of the target object at the historical moment are calculated according to the second spatial information and the second energy information.
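The three ratios of claim 7 can be sketched as follows. The claim does not fix how the factors are combined, so the simple product used here is an assumption, as are the scalar (one-dimensional) geometric feature and the example numbers.

```python
import math

def observation_weighting_factor(ref_center, track_point, gate_radius,
                                 ref_feature, weighted_feature,
                                 first_energy, weighted_energy):
    """Combine the distance, geometric-feature, and energy ratios of
    claim 7; combining them by a plain product is an assumption."""
    distance_factor = math.dist(ref_center, track_point) / gate_radius
    feature_factor = ref_feature / weighted_feature
    energy_factor = first_energy / weighted_energy
    return distance_factor * feature_factor * energy_factor

# Hypothetical values: the reference center is 5 m from the track point,
# inside a 10 m wave gate; the object is half the weighted size and twice
# the weighted energy of the historical track.
w = observation_weighting_factor((3.0, 4.0, 0.0), (0.0, 0.0, 0.0), 10.0,
                                 2.0, 4.0, 6.0, 3.0)  # 0.5
```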
8. The method of claim 7, further comprising:
calculating the wave gate radius of the target object at the historical moment according to the weighted geometric feature of the target object at the historical moment.
9. The method of claim 6, further comprising, prior to calculating a target observation center of the target object at the target time based on the first energy information and the first spatial information:
calculating a reference observation center of at least one object to be identified detected by the radar at the target moment according to the point cloud information corresponding to the at least one object to be identified at the target moment;
and identifying the target object from the at least one object to be identified according to the first target track and the reference observation center of the at least one object to be identified at the target moment.
10. The method according to claim 9, before calculating a reference observation center of at least one object to be identified detected by the radar according to the point cloud information corresponding to the at least one object to be identified at the target time, further comprising:
obtaining distances between the radar and a plurality of detection points detected by the radar at the target moment and point cloud information corresponding to the detection points;
determining the size of the neighborhood of the point cloud information corresponding to the detection points according to the distance between the radar and the detection points at the target moment;
and performing density clustering on the point cloud information corresponding to the plurality of detection points based on the size of the neighborhood to obtain the point cloud information corresponding to the at least one object to be identified at the target moment.
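The range-adaptive clustering of claim 10 can be sketched as below: each point's neighborhood radius grows with its distance from the radar, so distant, sparser returns still cluster together. The linear scaling eps_i = k * range_i, the factor k, and the symmetric neighborhood test are all assumptions for illustration.

```python
import math

def adaptive_density_cluster(points, ranges, k=0.1):
    """Naive density clustering with a per-point neighborhood radius
    proportional to the point's range from the radar (claim 10 spirit)."""
    eps = [k * r for r in ranges]        # adaptive neighborhood sizes
    labels = [-1] * len(points)          # -1 means not yet assigned
    cluster = 0
    for i in range(len(points)):
        if labels[i] != -1:
            continue
        labels[i] = cluster
        stack = [i]
        while stack:  # flood-fill over the adaptive neighborhood graph
            p = stack.pop()
            for q in range(len(points)):
                if labels[q] == -1 and \
                        math.dist(points[p], points[q]) <= min(eps[p], eps[q]):
                    labels[q] = cluster
                    stack.append(q)
        cluster += 1
    return labels

# Two nearby returns at short range form one object; a far return does not.
labels = adaptive_density_cluster([(0.0, 0.0), (0.5, 0.0), (100.0, 0.0)],
                                  ranges=[10.0, 10.0, 100.0])  # [0, 0, 1]
```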
11. The method of claim 9, wherein the identifying the target object from the at least one object to be identified according to the first target trajectory and a reference observation center of the at least one object to be identified at the target time comprises:
for a first object to be identified, judging whether the distance between the reference observation center of the first object to be identified at the target moment and the first target track is smaller than or equal to the wave gate radius of the target object at the historical moment;
if so, determining that the first object to be identified is the target object;
wherein the first object to be identified is any one of the at least one object to be identified.
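The gating test of claim 11 reduces to a distance comparison; the function name and the sample coordinates are hypothetical.

```python
import math

def falls_in_gate(ref_center, track_point, gate_radius):
    """Claim 11 association test: a candidate is taken as the target when
    its reference observation center lies within the wave gate around the
    first target track."""
    return math.dist(ref_center, track_point) <= gate_radius

inside = falls_in_gate((1.0, 1.0, 0.0), (0.0, 0.0, 0.0), 2.0)   # True
outside = falls_in_gate((3.0, 4.0, 0.0), (0.0, 0.0, 0.0), 2.0)  # False
```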
12. The method of claim 11, wherein, if a plurality of objects in the at least one object to be identified are determined to be the target object, the calculating an observation weighting factor according to the first energy information, the reference observation center, and the reference geometric feature comprises:
for a first target object, calculating a reference observation weighting factor for each of the plurality of objects determined to be the target object, according to the first energy information, reference observation center, and reference geometric feature of each of the plurality of objects;
normalizing the reference observation weighting factor of the first target object by the reference observation weighting factors of the plurality of objects, to obtain the observation weighting factor of the first target object;
wherein the first target object is any one of the plurality of objects.
13. The method of claim 11, further comprising:
if the first object to be identified is neither the target object nor any other object determined at the historical moment, acquiring kinematic parameters of the equipment carrying the radar when the radar detects the first object to be identified;
predicting track information of the first object to be identified at a subsequent moment according to the kinematic parameters of the equipment carrying the radar when the radar detects the first object to be identified;
and performing track fusion on an unknown object and the first object to be identified according to the predicted track information of the first object to be identified at the subsequent moment and a reference observation center of the unknown object detected by the radar at the subsequent moment.
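The prediction step of claim 13 can be sketched as a simple kinematic extrapolation; using a constant-velocity model is an assumption (claim 14 also permits position, acceleration, and motion-direction information), and all values below are hypothetical.

```python
def predict_position(position, velocity, dt):
    """Constant-velocity extrapolation of a track point to a subsequent
    moment, as one possible reading of the prediction in claim 13."""
    return tuple(p + v * dt for p, v in zip(position, velocity))

# Predict where the object should appear 0.5 s later, given the last
# observed position and the platform-compensated velocity.
predicted = predict_position((0.0, 0.0, 0.0), (1.0, 2.0, 0.0), 0.5)  # (0.5, 1.0, 0.0)
```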
14. The method according to claim 13, wherein the kinematic parameters of the equipment carrying the radar when the radar detects the first object to be identified comprise: at least one of position information, acceleration information, speed information, and a motion direction of the equipment carrying the radar when the radar detects the first object to be identified.
15. The method according to any of claims 1-14, wherein the historical time is a detection time closest to the target time.
16. The method according to any one of claims 1-14, further comprising performing at least one of:
guiding equipment carrying the radar to track the target object according to the second target track;
or,
restoring the working environment of the equipment carrying the radar according to the second target track and other target tracks of the target object before the second target track.
17. A data processing apparatus, comprising: the system comprises a first acquisition module, a calculation module and a first fusion module;
the first acquisition module is used for acquiring first spatial information and first energy information of a target object detected by the radar at a target moment;
the calculation module is used for calculating a target observation center of the target object at the target moment according to the first energy information and the first space information;
the first fusion module is used for fusing the target observation center and a first target track of the target object at a historical moment to generate a second target track of the target object;
wherein the first target trajectory is determined according to second spatial information and second energy information of the target object at the historical time; the historical time is before the target time.
18. A radar, comprising: a memory and a processor; wherein the memory is for storing a computer program and a first target trajectory of a target object at a historical time; the first target track is determined according to second spatial information and second energy information of the target object at the historical moment;
the processor is coupled to the memory for executing the computer program for:
acquiring first spatial information and first energy information of a target object detected by the radar at a target moment, wherein the historical time is before the target moment;
calculating a target observation center of the target object at the target moment according to the first energy information and the first spatial information;
and fusing the target observation center and the first target track to generate a second target track of the target object.
19. A detection device, equipped with a radar and comprising a memory and a processor; wherein the radar is used for acquiring first spatial information and first energy information of a target object detected at a target moment;
the memory is used for storing a computer program and a first target track of the target object at historical time; the first target track is determined according to second spatial information and second energy information of the target object at historical time; the historical time is before the target time;
the processor is coupled to the memory for executing the computer program for: calculating a target observation center of the target object at the target moment according to the first energy information and the first space information; and fusing the target observation center and the first target track to generate a second target track of the target object.
20. A mobile device, carrying a radar; wherein the radar is configured to: acquire first spatial information and first energy information of a target object detected at a target moment; calculate a target observation center of the target object at the target moment according to the first energy information and the first spatial information; and fuse the target observation center with a first target track of the target object at a historical moment to generate a second target track of the target object; wherein the first target track is determined according to second spatial information and second energy information of the target object at the historical moment, and the historical moment is before the target moment.
21. The mobile device of claim 20, further comprising: a memory and a processor; wherein the memory is for storing a computer program;
the radar is further configured to: providing the second target trajectory to the processor;
the processor is coupled to the memory for executing the computer program for: guiding the mobile equipment to track the target object according to the second target track; or restoring the motion environment of the mobile equipment according to the second target track and other tracks of the target object before the second target track.
22. The mobile device of claim 20 or 21, wherein the mobile device is a drone, an unmanned vehicle, a robot, or a ship.
23. A computer-readable storage medium having stored thereon computer instructions, which when executed by one or more processors, cause the one or more processors to perform acts comprising:
acquiring first spatial information and first energy information of a target object detected by a radar at a target moment;
calculating a target observation center of the target object at the target moment according to the first energy information and the first space information;
fusing the target observation center with a first target track of the target object at a historical moment to generate a second target track of the target object;
wherein the first target trajectory is determined according to second spatial information and second energy information of the target object at historical time; the historical time is before the target time.
CN201980040012.8A 2019-11-05 2019-11-05 Data processing method, data processing device, radar, equipment and storage medium Pending CN112334946A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/115814 WO2021087777A1 (en) 2019-11-05 2019-11-05 Data processing method and apparatus, and radar, device and storage medium

Publications (1)

Publication Number Publication Date
CN112334946A true CN112334946A (en) 2021-02-05

Family

ID=74319384

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980040012.8A Pending CN112334946A (en) 2019-11-05 2019-11-05 Data processing method, data processing device, radar, equipment and storage medium

Country Status (2)

Country Link
CN (1) CN112334946A (en)
WO (1) WO2021087777A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113208566B (en) * 2021-05-17 2023-06-23 深圳大学 Data processing method and device, electronic equipment and storage medium
CN113325383B (en) * 2021-06-17 2023-07-04 广东工业大学 Self-adaptive vehicle millimeter wave radar clustering algorithm and device based on grid and DBSCAN
CN113504796B (en) * 2021-09-07 2022-01-14 特金智能科技(上海)有限公司 Unmanned aerial vehicle trajectory processing method and device, electronic equipment and storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3078935A1 (en) * 2015-04-10 2016-10-12 The European Atomic Energy Community (EURATOM), represented by the European Commission Method and device for real-time mapping and localization
CN108549084B (en) * 2018-01-30 2020-06-02 西安交通大学 Target detection and attitude estimation method based on sparse two-dimensional laser radar
CN109581312B (en) * 2018-11-22 2023-07-14 西安电子科技大学昆山创新研究院 High-resolution millimeter wave radar multi-target clustering method
CN110210389B (en) * 2019-05-31 2022-07-19 东南大学 Multi-target identification tracking method for road traffic scene

Also Published As

Publication number Publication date
WO2021087777A1 (en) 2021-05-14

Similar Documents

Publication Publication Date Title
EP3812793B1 (en) Information processing method, system and equipment, and computer storage medium
US20200409372A1 (en) Data fusion method and related device
US11410482B2 (en) Information processing method and apparatus, electronic device, and storage medium
KR102032070B1 (en) System and Method for Depth Map Sampling
CN112334946A (en) Data processing method, data processing device, radar, equipment and storage medium
CN110865393A (en) Positioning method and system based on laser radar, storage medium and processor
RU2656711C2 (en) Method and system for detecting and tracking of moving objects based on three-dimensional sensor data
US9354305B2 (en) Method for producing at least information for track fusion and association for radar target tracking, and storage medium thereof
US11587445B2 (en) System and method for fusing asynchronous sensor tracks in a track fusion application
KR20210001219A (en) Radar data processing device and method to adjust local resolving power
US11507092B2 (en) Sequential clustering
WO2020250020A9 (en) Lidar and radar based tracking and mapping system and method thereof
JP2018077210A (en) Systems and methods for spatial filtering using data with widely different error magnitudes
CN113448326A (en) Robot positioning method and device, computer storage medium and electronic equipment
EP3842885A1 (en) Autonomous movement device, control method and storage medium
WO2021087760A1 (en) Target detection method, radar, device, and storage medium
US11880209B2 (en) Electronic apparatus and controlling method thereof
US20190187251A1 (en) Systems and methods for improving radar output
US20210080571A1 (en) Classification of Static and Dynamic Objects
CN112558035B (en) Method and device for estimating the ground
CN116699596A (en) Method and device for correcting speed of millimeter wave radar of vehicle
CN112313535A (en) Distance detection method, distance detection device, autonomous mobile platform, and storage medium
US11280899B2 (en) Target recognition from SAR data using range profiles and a long short-term memory (LSTM) network
US20230359186A1 (en) System And Method for Controlling a Mobile Industrial Robot Using a Probabilistic Occupancy Grid
CN114330726A (en) Tracking and positioning method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination