WO2021087777A1 - Data processing method, apparatus, radar, device and storage medium - Google Patents

Data processing method, apparatus, radar, device and storage medium

Info

Publication number
WO2021087777A1
Authority
WO
WIPO (PCT)
Prior art keywords
target
target object
moment
information
radar
Prior art date
Application number
PCT/CN2019/115814
Other languages
English (en)
French (fr)
Inventor
王石荣
高迪
王俊喜
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to PCT/CN2019/115814
Priority to CN201980040012.8A
Publication of WO2021087777A1


Classifications

    • G06T 7/246: Image analysis; analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G01S 13/42: Systems determining position data of a target; simultaneous measurement of distance and other co-ordinates
    • G01S 13/58: Velocity or trajectory determination systems; sense-of-movement determination systems
    • G01S 7/415: Identification of targets based on measurements of movement associated with the target
    • G06F 18/2321: Non-hierarchical clustering techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F 18/24: Classification techniques
    • G06T 7/277: Analysis of motion involving stochastic approaches, e.g. using Kalman filters
    • G06T 2207/10028: Range image; depth image; 3D point clouds
    • G06T 2207/10032: Satellite or aerial image; remote sensing
    • G06T 2207/10044: Radar image

Definitions

  • This application relates to the field of radar technology, and in particular to a data processing method, device, radar, equipment, and storage medium.
  • A radar consists of a transmitter, a receiver, and an information processing system. It transmits a detection signal and receives the signal reflected back from a target. From the reflected signal, relevant information about the target can be obtained, such as the distance between the radar and the target and the target's orientation, height, and shape. Radar is therefore widely used in target detection and tracking.
  • Aspects of the present application provide a data processing method, apparatus, radar, device, and storage medium to improve the accuracy of the target object's motion trajectory.
  • the embodiment of the present application provides a data processing method, including: acquiring the first spatial information and first energy information of the target object detected by the radar at the target moment; calculating the target observation center of the target object at the target moment according to the first energy information and the first spatial information; and fusing the target observation center with the first target trajectory of the target object at the historical moment to generate a second target trajectory of the target object;
  • the first target trajectory is determined according to the second spatial information and second energy information of the target object at the historical moment; the historical moment is located before the target moment.
  • An embodiment of the present application also provides a data processing device, including: a first acquisition module, a calculation module, and a first fusion module;
  • the first acquisition module is configured to acquire the first spatial information and first energy information of the target object detected by the radar at the target moment;
  • the calculation module is configured to calculate the target observation center of the target object at the target moment according to the first energy information and the first spatial information;
  • the first fusion module is configured to merge the target observation center and the first target trajectory of the target object at a historical moment to generate a second target trajectory of the target object;
  • the first target trajectory is determined according to the second spatial information and second energy information of the target object at the historical moment; the historical moment is located before the target moment.
  • An embodiment of the present application also provides a radar, including: a memory and a processor; wherein the memory is used to store a computer program and a first target trajectory of a target object at a historical moment; the first target trajectory is determined according to the second spatial information and second energy information of the target object at the historical moment;
  • the processor is coupled to the memory and is configured to execute the computer program to: acquire the first spatial information and first energy information of the target object detected by the radar at the target moment; calculate the target observation center of the target object at the target moment according to the first energy information and the first spatial information; and fuse the target observation center with the first target trajectory to generate a second target trajectory of the target object.
  • An embodiment of the present application also provides a detection device equipped with a radar and including a memory and a processor; wherein the radar is used to obtain first spatial information and first energy information of a target object detected at the target moment;
  • the memory is used to store a computer program and a first target trajectory of the target object at a historical moment; the first target trajectory is determined according to the second spatial information and second energy information of the target object at the historical moment; The historical moment is before the target moment;
  • the processor is coupled to the memory and is configured to execute the computer program to: calculate the target observation center of the target object at the target moment according to the first energy information and the first spatial information; and fuse the target observation center with the first target trajectory to generate a second target trajectory of the target object.
  • An embodiment of the present application also provides a mobile device equipped with a radar; the radar is used to: obtain first spatial information and first energy information of a target object detected at the target moment; according to the first energy information And the first spatial information, calculating the target observation center of the target object at the target time; fusing the target observation center with the first target trajectory of the target object at the historical time to generate the The second target trajectory of the target object; wherein the first target trajectory is determined according to the second spatial information and second energy information of the target object at a historical moment; the historical moment is located before the target moment.
  • the embodiments of the present application also provide a computer-readable storage medium storing computer instructions that, when executed by one or more processors, cause the one or more processors to perform actions including: obtaining the first spatial information and first energy information of the target object detected by the radar at the target moment; calculating the target observation center of the target object at the target moment according to the first energy information and the first spatial information; and fusing the target observation center with the first target trajectory of the target object at the historical moment to generate a second target trajectory of the target object;
  • the first target trajectory is determined according to the second spatial information and second energy information of the target object at a historical moment; the historical moment is located before the target moment.
  • the target observation center of the target object can be calculated according to the spatial information and energy information of the target object, and the observation center of the target object can be fused with the target trajectory of the target object at historical moments to generate a new target trajectory of the target object.
  • integrating the spatial information and energy information of the target object to determine the target trajectory of the target object helps to improve the accuracy of the target trajectory, which in turn helps to improve the accuracy of subsequent work using the trajectory of the target object. For example, when tracking a target, it helps to improve the accuracy of positioning the target object, thereby helping to improve the accuracy of target tracking.
  • FIG. 1 is a schematic flowchart of a data processing method provided by an embodiment of this application
  • FIG. 2 is a schematic flowchart of another data processing method provided by an embodiment of the application.
  • FIG. 3 is a schematic structural diagram of a data processing device provided by an embodiment of the application.
  • FIG. 4 is a schematic structural diagram of a radar provided by an embodiment of this application.
  • FIG. 5 is a schematic structural diagram of a detection device provided by an embodiment of this application.
  • FIG. 6 is a schematic structural diagram of a mobile device provided by an embodiment of this application.
  • the target observation center of the target object can be calculated according to the spatial information and energy information of the target object, and the observation center of the target object can be fused with the target trajectory of the target object at historical moments to generate a new target trajectory of the target object.
  • integrating the spatial information and energy information of the target object to determine the target trajectory of the target object helps to improve the accuracy of the target trajectory, which in turn helps to improve the accuracy of subsequent work using the trajectory of the target object. For example, when tracking a target, it helps to improve the accuracy of positioning the target object, thereby helping to improve the accuracy of target tracking, and so on.
  • FIG. 1 is a schematic flowchart of a data processing method provided by an embodiment of this application. As shown in Figure 1, the method includes:
  • the radar may be a directional radar or a rotating radar.
  • the radar may be a microwave radar, or a laser radar, etc., but is not limited to this.
  • the radar can detect not only the spatial information of the target object, but also the energy information of the target object.
  • the spatial information of the target object includes the position information and geometric characteristics of the target object.
  • the energy information of the target object is based on the intensity information of the echo signal returned by the target object.
  • the strength of the echo signal can reflect the distance of the target object from the radar to a certain extent. Therefore, the spatial information and energy information of the target object can reflect the relative position relationship between the target object and the radar to a certain extent. Based on this, if the spatial information and energy information of the target object are combined to characterize the trajectory information of the target object, it is helpful to improve the accuracy of the determined trajectory.
  • the target moment refers to the moment at which the radar detects the first spatial information and the first energy information.
  • the target moment can be the current moment or the past moment.
  • the target time may be the current detection time of the radar.
  • the target moment is a past moment compared to the moment when the first spatial information and the first energy information are acquired.
  • in step 102, the target observation center of the target object is calculated according to the first spatial information and the first energy information. Further, in step 103, the target observation center is merged with the first target trajectory of the target object at the historical moment to generate a second target trajectory of the target object.
  • This trajectory generation method integrates the spatial information and energy information of the target object, helps to improve the accuracy of the target trajectory, and further helps to improve the accuracy of subsequent work using the trajectory of the target object. For example, when tracking a target, it helps to improve the accuracy of positioning the target object, thereby helping to improve the accuracy of target tracking, and so on.
  • the first target trajectory is trajectory information of the target object at a historical moment.
  • the historical moment is the detection moment before the target moment.
  • the historical moment is the detection moment closest to the target moment.
  • the first target trajectory is determined according to the second spatial information and second energy information of the target object at a historical moment.
  • the specific implementation manner for determining the first target trajectory please refer to the related content of the method for determining the second target trajectory of the target object in the above and the following embodiments, which will not be repeated here.
  • a Kalman filtering method may be used to fuse the target observation center and the first target trajectory of the target object.
  • the Kalman filter method may be used to predict the second target trajectory.
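  • As an illustration of this fusion step, the sketch below applies a standard Kalman predict/update cycle with a constant-velocity model. The motion model, the noise covariances, and all names are illustrative assumptions; the source states only that a Kalman filtering method may be used for the fusion and prediction.

```python
import numpy as np

def kalman_fuse(state, cov, observation_center, dt=0.1):
    """Fuse a new target observation center with the first target trajectory.

    state: [x, y, z, vx, vy, vz] carried over from the first target trajectory;
    observation_center: (x_m, y_m, z_m) computed at the target moment.
    The constant-velocity model and the noise covariances Q, R are assumptions.
    """
    F = np.eye(6)
    F[:3, 3:] = dt * np.eye(3)                    # constant-velocity motion model
    H = np.hstack([np.eye(3), np.zeros((3, 3))])  # only position is observed
    Q = 1e-3 * np.eye(6)                          # assumed process noise
    R = 1e-2 * np.eye(3)                          # assumed observation noise

    # Predict from the historical trajectory to the target moment.
    state = F @ state
    cov = F @ cov @ F.T + Q

    # Update with the target observation center.
    innovation = np.asarray(observation_center) - H @ state
    S = H @ cov @ H.T + R
    K = cov @ H.T @ np.linalg.inv(S)
    state = state + K @ innovation
    cov = (np.eye(6) - K @ H) @ cov
    return state, cov   # state[:3] is the fused point of the second target trajectory
```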
  • the radar can be mounted on various devices to complete related tasks.
  • radar can be mounted on mobile devices to detect obstacles, or to detect and track targets, and so on.
  • different tasks can be performed according to the second target trajectory.
  • the device equipped with the radar can be guided to track the target object according to the second target trajectory; for another example, in an accident-investigation scenario, the working environment of the device equipped with the radar can be restored according to the second target trajectory and the other target trajectories of the target object before the second target trajectory; and so on, but not limited to this.
  • the radar can be mounted on various devices.
  • the radar can be mounted on a mobile device, where the mobile device can be an autonomous mobile device, such as a drone, an unmanned vehicle, or a robot, but is not limited to these; or the mobile device can be a device that requires human control, such as a manned vehicle, boat, or airplane, but is not limited to these.
  • the radar includes a transmitter, a receiver, and an information processing system.
  • the transmitter is used to transmit a detection signal.
  • when the detection signal encounters an obstacle, an echo signal is reflected, and the receiver can receive the echo signal.
  • the information processing system can obtain relevant information about the target based on the reflected echo signal, such as the distance between the radar and the target and the target's orientation, height, and shape.
  • the essence of the detection signal encountering an obstacle is that the detection signal encounters a certain point on the obstacle.
  • the obstacle points that the detection signal actually encounters in the propagation process are defined as detection points.
  • the detection point may be a certain point on the target object, or it may belong to other objects outside the target object, such as dust in the air, and so on. Among them, when each detection signal encounters a detection point, it can return the corresponding echo signal.
  • the space coordinate information and energy information of the detection point can be obtained.
  • the spatial coordinate information and energy information of multiple detection points form the point cloud information; that is, point cloud information is a collection of spatial coordinate points and their energy values.
  • a data point in the point cloud information can be interpreted as a combination of the space coordinate information and energy information of the probe point.
  • the spatial coordinate information corresponding to the detection point can be calculated according to the distance between the radar and the detection point and the pose of the radar.
  • the pose of the radar refers to the position and orientation of the radar.
  • the orientation of the radar may refer to the directivity of the radar antenna.
  • the direction of the detection point relative to the radar can be obtained; then, according to the direction of the detection point relative to the radar, the distance between the radar and the detection point, and the position of the radar, the spatial coordinates of the detection point can be calculated.
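  • As a sketch of this coordinate computation, the snippet below converts a measured range and direction into Cartesian coordinates. The azimuth/elevation convention and all names are assumptions for illustration; the source does not fix a particular angle convention.

```python
import math

def detection_point_coords(radar_pos, rng, azimuth, elevation):
    """Spatial coordinates of a detection point from the radar position, the
    measured range, and the direction of the detection point relative to the
    radar (azimuth in the x-y plane, elevation out of it; both in radians)."""
    x0, y0, z0 = radar_pos
    x = x0 + rng * math.cos(elevation) * math.cos(azimuth)
    y = y0 + rng * math.cos(elevation) * math.sin(azimuth)
    z = z0 + rng * math.sin(elevation)
    return (x, y, z)
```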
  • the energy information of the detection point can also be determined according to the intensity information of the echo signal.
  • the target object detected by the radar corresponds to multiple detection points, and the data points corresponding to the multiple detection points constitute the point cloud information of the target object.
  • multiple refers to two or more.
  • the reference observation center and reference geometric characteristics of the target object at the target time can be calculated according to the point cloud information of the target object at the target time as the first spatial information of the target object.
  • the point cloud information of the target object at the target time includes: multiple position coordinates of the target object.
  • the specific implementation manner for calculating the reference observation center and reference geometric characteristics of the target object at the target time is not limited.
  • the average coordinate value of multiple position coordinates can be calculated and used as the reference observation center of the target object at the target time.
  • the point cloud information of the target object at the target moment contains: K position coordinates, where K ⁇ 2 and is an integer.
  • the K position coordinates, denoted (x_i, y_i, z_i) for i = 1, ..., K, are the position coordinates in the point cloud information obtained by the radar's m-th detection.
  • the reference observation center can then be expressed as (x̄_m, ȳ_m, z̄_m), where x̄_m = (1/K) Σ_{i=1..K} x_i, ȳ_m = (1/K) Σ_{i=1..K} y_i, and z̄_m = (1/K) Σ_{i=1..K} z_i.
  • a cuboid is determined according to the maximum value and the minimum value of the multiple position coordinates, and the side length of the cuboid is used as the reference geometric feature of the target object at the target moment.
  • These position coordinates should be contained in the cuboid.
  • the side length of the cuboid refers to the length, width and height of the cuboid.
  • an ellipsoid can be determined according to the maximum and minimum values of the multiple position coordinates, and the length of the central axis of the ellipsoid can be used as the reference geometric feature of the target object. Among them, these position coordinates are contained in the ellipsoid.
  • the length of the central axis of the ellipsoid includes the length of the long central axis and the short central axis of the ellipsoid.
  • a sphere may also be determined according to the maximum and minimum values of multiple position coordinates, and the radius of the sphere may be used as the reference geometric feature of the target object at the target moment. Among them, these position coordinates are contained in the sphere.
  • the position coordinates are three-dimensional spatial coordinates;
  • the maximum of the multiple position coordinates is the three-dimensional coordinate value formed by the maximum values along the x-axis, y-axis, and z-axis;
  • the minimum of the multiple position coordinates is the three-dimensional coordinate value formed by the minimum values along the x-axis, y-axis, and z-axis.
  • the point cloud information of the target object at the target time also includes multiple energy values of the target object at the target time.
  • an average energy value of multiple energy values may be calculated as the first energy information of the target object at the target time.
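  • Pulling the above together, a minimal sketch of computing the reference observation center (coordinate mean), the cuboid-based reference geometric features, and the first energy information (energy mean) from one target's point cloud; the names are illustrative.

```python
import numpy as np

def point_cloud_summary(points, energies):
    """points: (K, 3) position coordinates of one target's point cloud;
    energies: (K,) echo energy values.
    Returns the reference observation center, the cuboid side lengths
    (length, width, height of the axis-aligned bounding box) as the
    reference geometric features, and the first energy information."""
    points = np.asarray(points, dtype=float)
    center = points.mean(axis=0)                     # reference observation center
    sides = points.max(axis=0) - points.min(axis=0)  # cuboid l, w, h
    mean_energy = float(np.mean(energies))           # first energy information
    return center, sides, mean_energy
```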
  • based on the first energy information, the reference observation center, and the reference geometric features, an observation weighting factor is calculated; and based on the observation weighting factor and the reference observation center, the target observation center of the target object at the target moment is calculated.
  • the ratio between the gate radius of the target object at the historical moment and the distance from the reference observation center of the target object at the target moment to the first target trajectory may be calculated as a distance weighting factor.
  • the gate radius of the target object at the historical moment can be calculated according to the weighted geometric characteristics of the target object at the historical moment.
  • the weighted geometric feature of the target object at the historical moment is calculated according to the second spatial information and the second energy information. It is assumed that the historical time is the time when the radar detects the target object for the nth time, where the nth time is before the mth time; and it is assumed that the reference geometric characteristics of the target object at the historical time are characterized by the length, width and height of the rectangular parallelepiped.
  • weighted geometric features are also characterized by the length, width and height of the cuboid.
  • the weighted geometric features can be denoted l_n, w_n, h_n, and the gate radius r of the target object at the historical moment is calculated from l_n, w_n, and h_n.
  • the distance weighting factor can then be expressed as r_d = r/d, where d denotes the distance between the reference observation center and the first target trajectory. The closer the reference observation center is to the first target trajectory, the larger r_d becomes. In some cases, the reference observation center may be so close to the first target trajectory that r_d approaches infinity, giving the distance weighting factor an excessive share of the observation weighting factor and degrading the accuracy of the subsequently determined second target trajectory. Based on this, a function that grows more slowly can also be used to express the distance weighting factor, for example a logarithmic function of the same ratio.
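  • A sketch of this computation, assuming the plain ratio r/d and, for the slow-growing variant, a log1p damping that the source text does not spell out:

```python
import math

def distance_weighting_factor(d, gate_radius, damped=True):
    """d: distance between the reference observation center and the first
    target trajectory; gate_radius: gate radius of the target object at the
    historical moment. The plain ratio gate_radius / d blows up as d -> 0;
    the logarithmic damping shown here is only one plausible slow-growing
    alternative, not necessarily the form used in the patent."""
    eps = 1e-9  # guard against division by zero when d is almost 0
    ratio = gate_radius / max(d, eps)
    return math.log1p(ratio) if damped else ratio
```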
  • the ratio of the reference geometric feature of the target object at the target time to the weighted geometric feature of the target object at the historical time can also be calculated as the geometric feature weighting factor.
  • the weighted geometric feature of the target object at the historical moment is the target geometric feature of the target object at the historical moment, which can be calculated according to the reference geometric feature of the target object at the historical moment and the observation weighting factor of the historical moment.
  • the geometric feature weighting factor is: the ratio of each representative parameter of the reference geometric feature of the target object at the target time to the representative parameter of the same attribute of the weighted geometric feature of the target object at the historical time.
  • for example, the reference geometric features of the target object at the target moment are characterized by a cuboid whose characteristic parameters are the length, width, and height l_m, w_m, and h_m; correspondingly, with the weighted geometric features of the target object at the historical moment being l_n, w_n, h_n, the geometric feature weighting factor can be expressed as the attribute-wise ratios (l_m/l_n, w_m/w_n, h_m/h_n).
  • the ratio between the first energy information and the weighted energy information of the target object at the historical moment is calculated as an energy weighting factor; wherein the weighted energy information of the target object at the historical moment is calculated based on the second spatial information and the second energy information of the target object.
  • the energy weighting factor can be expressed as r_e = e_m/e_n, where e_m is the first energy information and e_n is the weighted energy information of the target object at the historical moment.
  • the observation weighting factor can be calculated according to the distance weighting factor, geometric feature weighting factor, and energy weighting factor.
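  • The source text does not state how the three factors are combined; the sketch below takes their product, which is one plausible combination, with the geometric factor collapsed over its three attribute-wise ratios:

```python
def observation_weighting_factor(r_d, r_s, r_e):
    """Combine the distance weighting factor r_d, the geometric-feature
    weighting factors r_s = (l_m/l_n, w_m/w_n, h_m/h_n), and the energy
    weighting factor r_e into one observation weighting factor. Taking
    the product of all components is an illustrative assumption."""
    l_ratio, w_ratio, h_ratio = r_s
    return r_d * l_ratio * w_ratio * h_ratio * r_e
```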
  • since the point cloud information detected by the radar at the target moment may include the point cloud information of multiple targets as well as some noise points that do not belong to any target, the reference observation center of the at least one object to be identified detected by the radar can be calculated according to the point cloud information corresponding to the at least one object to be identified at the target moment; and the target object can then be identified from the at least one object to be identified according to the first target trajectory and the reference observation center of the at least one object to be identified.
  • the process of determining whether the first object to be recognized is a target object is exemplarily described below.
  • the first object to be identified is any one of at least one object to be identified.
  • when multiple objects among the at least one object to be identified are determined to be target objects, the respective reference observation weighting factors of the multiple objects may be calculated according to their first energy information, reference observation centers, and reference geometric features; the reference observation weighting factor of the first target object is then normalized using the reference observation weighting factors of the multiple objects to obtain the observation weighting factor of the first target object, where the first target object is any one of the multiple objects.
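  • A minimal sketch of this normalization, assuming the straightforward divide-by-sum scheme (the source text does not spell the scheme out):

```python
def normalized_observation_weight(ref_factor, all_ref_factors):
    """Normalize one object's reference observation weighting factor by the
    reference observation weighting factors of all objects determined to be
    target objects at the target moment."""
    total = sum(all_ref_factors)
    return ref_factor / total if total else 0.0
```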
  • the target observation center of the target object at the target time can be calculated based on the observation weighting factor of the target object and the reference observation center of the target object at the target time.
  • the target observation center of the target object at the target moment can be expressed as (x_m, y_m, z_m), computed from the observation weighting factor and the reference observation center;
  • the weighted geometric features (target geometric features) of the target object at the target moment are likewise computed from the observation weighting factor and the reference geometric features;
  • the weighted energy information of the target object at the target moment is computed from the observation weighting factor and the first energy information.
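  • The exact update formulas are not reproduced in the source text. One plausible reading, sketched below, blends each reference quantity at the target moment with the corresponding weighted quantity from the historical moment, using the normalized observation weighting factor as the blend weight:

```python
import numpy as np

def weighted_update(weight, reference, historical_weighted):
    """Convex blend of the reference value at the target moment (observation
    center, geometric features l/w/h, or energy) with the weighted value
    carried over from the historical moment. This blend form is an
    assumption, not the patent's stated formula."""
    w = float(np.clip(weight, 0.0, 1.0))
    return (w * np.asarray(reference, dtype=float)
            + (1.0 - w) * np.asarray(historical_weighted, dtype=float))
```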
  • the kinematic parameters of the device equipped with the radar at the moment the radar detects the first object to be identified can be acquired; according to these kinematic parameters, the trajectory information of the first object to be identified at a subsequent moment is predicted; and the unknown object and the first object to be identified are trajectory-fused according to the predicted trajectory information of the first object to be identified at the subsequent moment and the reference observation center of the unknown object detected by the radar at that subsequent moment.
  • the kinematic parameters of the radar-equipped device when the radar detects the first object to be identified include at least one of the position information, acceleration information, velocity information, and movement direction of the radar-equipped device at that moment.
  • since the point cloud information detected by the radar at the target moment may include the point cloud information of multiple targets as well as some noise points that do not belong to any target, it is necessary to cluster the point cloud information detected by the radar so that point clouds belonging to the same target are grouped into one category.
  • the point cloud information detected by the radar may be subjected to density clustering to divide the point cloud information into at least one point cloud subset, and one of the point cloud subsets corresponds to an object to be identified.
  • the point cloud information corresponding to the multiple detection points can be subjected to density clustering so that data points belonging to the same target object are divided into the same cluster.
  • the point cloud of the detection points is distributed radially with the radar center as the origin, and the closer a detection point is to the radar, the denser the corresponding point cloud distribution; that is, the points in the point cloud become more scattered as the distance between the detection point and the radar increases.
  • the size of the neighborhood used for density clustering of the point cloud information can be determined according to the distance between the radar and the detection point, that is, the size of the neighborhood of the data point in the point cloud information is determined.
  • the size of the neighborhood of the point cloud information corresponding to the multiple detection points can be determined according to the distance between the radar and the detection point, as the size of the neighborhood for density clustering of the point cloud information.
  • the size of the neighborhood used for density clustering of the point cloud information is determined according to the distance between the radar and the detection point.
  • this way of determining the neighborhood takes into account that the points in the point cloud become more dispersed as the distance between the detection point and the radar increases, which helps improve the adaptability of the density clustering and, in turn, its accuracy.
  • the distance between the radar and the detection point can be obtained according to the difference between the detection signal and the echo signal.
  • the detection signal emitted by the radar is different, and the way to obtain the distance between the radar and the detection point is also different.
  • if the detection signal emitted by the radar is a pulse signal, the distance between the radar and the detection point can be calculated from the time difference between the radar transmitting the detection signal and receiving the echo signal; that is, the time-of-flight method is used to calculate the distance between the radar and the detection point.
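  • The time-of-flight relation itself is standard; a minimal sketch (names are illustrative):

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_range(round_trip_time_s):
    """Time-of-flight ranging: the pulse travels to the detection point and
    back, so the one-way distance is c * t / 2."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0
```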
  • the detection signal may be an electromagnetic wave signal, such as a microwave signal or a laser signal, etc., but is not limited thereto.
  • the distance between the radar and the detection point can be calculated according to the frequency difference between the detection signal sent by the radar and the received echo signal.
  • the continuous wave is a frequency modulated continuous wave (Frequency Modulated Continuous Wave, FMCW).
  • the frequency modulation mode can be triangular wave frequency modulation, sawtooth frequency modulation, code modulation or noise frequency modulation, but is not limited to this.
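  • For the sawtooth (linear-chirp) case, the textbook relation between the beat frequency and range is R = c * f_b * T / (2 * B); a minimal sketch follows (triangular, coded, or noise modulation uses different processing):

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def fmcw_range(beat_freq_hz, sweep_bandwidth_hz, sweep_period_s):
    """Range from the frequency difference (beat) between the transmitted
    FMCW chirp and the received echo, assuming a linear sawtooth sweep of
    bandwidth B over period T."""
    return (SPEED_OF_LIGHT * beat_freq_hz * sweep_period_s
            / (2.0 * sweep_bandwidth_hz))
```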
  • the size of the neighborhood used when performing density clustering processing on the point cloud information can be determined according to the distance between the radar and the detection point, that is, the size of the neighborhood of the data point in the point cloud information can be determined. Further, based on the determined size of the neighborhood, density clustering may be performed on the point cloud information corresponding to the multiple detection points to obtain the point cloud information corresponding to at least one object to be identified.
  • the specific implementation manner of performing density clustering on the point cloud information corresponding to multiple detection points is not limited.
  • a density-based clustering (Density-Based Spatial Clustering of Applications with Noise, DBSCAN) algorithm can be used to perform density clustering on the point cloud information corresponding to the multiple detection points to obtain at least one point cloud subset, where a point cloud subset corresponds to an object to be identified.
  • for the first data point, it is determined whether the number of data points contained in the neighborhood of the first data point is greater than or equal to a preset number threshold; if so, the first data point and all data points density-reachable from it are clustered into one cluster to obtain the point cloud subset to which the first data point belongs; here, the first data point is any data point, not yet clustered, in the point cloud information corresponding to the multiple detection points.
  • if the number of data points in the neighborhood is below the threshold, the first data point can be determined as a noise point.
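  • A compact sketch of this clustering with a distance-dependent neighborhood; the linear scaling of the neighborhood radius with range and all parameter values are illustrative assumptions (standard DBSCAN uses a single fixed eps):

```python
import numpy as np

def adaptive_dbscan(points, ranges, base_eps=0.5, eps_per_meter=0.02, min_pts=5):
    """DBSCAN-style density clustering in which each data point's neighborhood
    radius grows with its distance from the radar, reflecting the radially
    thinning point cloud described above. Returns one label per point; -1
    marks noise points that do not belong to any object to be identified."""
    points = np.asarray(points, dtype=float)
    eps = base_eps + eps_per_meter * np.asarray(ranges, dtype=float)
    n = len(points)
    labels = np.full(n, -1)
    visited = np.zeros(n, dtype=bool)

    def neighbors(i):
        dist = np.linalg.norm(points - points[i], axis=1)
        return np.flatnonzero(dist <= eps[i])

    cluster = 0
    for i in range(n):
        if visited[i]:
            continue
        visited[i] = True
        seeds = neighbors(i).tolist()
        if len(seeds) < min_pts:
            continue  # provisionally noise; may be claimed later as a border point
        labels[i] = cluster
        k = 0
        while k < len(seeds):  # expand over all density-reachable points
            j = seeds[k]
            k += 1
            if labels[j] == -1:
                labels[j] = cluster
            if visited[j]:
                continue
            visited[j] = True
            j_neighbors = neighbors(j)
            if len(j_neighbors) >= min_pts:  # j is a core point: keep expanding
                seeds.extend(p for p in j_neighbors if p not in seeds)
        cluster += 1
    return labels
```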
  • the data processing process mainly includes:
  • according to the point cloud information of the object to be identified detected by the radar at the target moment, the reference observation center, reference geometric features, and first energy information of the object to be identified at the target moment are calculated.
  • the first target trajectory of the target object at the historical moment may be the target observation center of the target object at the historical moment; in some embodiments, it may also be referred to as the weighted observation center at the historical moment.
  • in step 206, it is determined whether the distance d between the reference observation center of the object to be identified at the target moment and the first target trajectory is less than or equal to the gate radius r of the target object at the historical moment. If so, the object to be identified is determined to be the target object and step 207 is executed; otherwise, the object to be identified is determined not to be the target object.
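  • The gate test of step 206 in code form (a direct restatement of the comparison above):

```python
def passes_gate(distance_to_trajectory, gate_radius):
    """Associate the object to be identified with the tracked target only if
    its reference observation center lies within the gate radius of the
    first target trajectory (step 206)."""
    return distance_to_trajectory <= gate_radius
```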
  • the execution subject of each step of the method provided in the foregoing embodiment may be the same device, or different devices may also be the execution subject of the method.
  • the execution subject of steps 101 and 102 may be device A; for another example, the execution subject of step 101 may be device A, and the execution subject of step 102 may be device B; and so on.
  • an embodiment of the present application also provides a computer-readable storage medium storing computer instructions.
  • when the computer instructions are executed by the one or more processors, the one or more processors are caused to perform actions including: obtaining the first spatial information and first energy information of the target object detected by the radar at the target moment; calculating the target observation center of the target object at the target moment according to the first energy information and the first spatial information; and fusing the target observation center with the first target trajectory of the target object at the historical moment to generate a second target trajectory of the target object; wherein the first target trajectory is determined according to the second spatial information and second energy information of the target object at the historical moment, and the historical moment is located before the target moment.
  • when the computer instructions are executed by the one or more processors, the one or more processors are also caused to execute the relevant steps in FIG. 1, FIG. 2, and the optional embodiments described above.
  • for details, please refer to the relevant content of the above embodiments, which will not be repeated here.
  • Fig. 3 is a data processing device provided by an embodiment of the application. As shown in Fig. 3, the device includes: a first acquisition module 30a, a calculation module 30b, and a first fusion module 30c.
  • the first acquisition module 30a is configured to acquire the first spatial information and first energy information of the target object detected by the radar at the target moment.
  • the calculation module 30b is configured to calculate the target observation center of the target object at the target time according to the first energy information and the first space information.
  • the first fusion module 30c is used to merge the target observation center with the first target trajectory of the target object at a historical moment to generate a second target trajectory of the target object.
  • the first target trajectory is determined according to the second spatial information and second energy information of the target object at the historical moment; the historical moment is located before the target moment.
  • the historical moment is the detection moment closest to the target moment.
  • when the first acquisition module 30a acquires the first spatial information of the target object detected by the radar at the target moment, it is specifically configured to: calculate the reference observation center and reference geometric features of the target object at the target moment according to the point cloud information of the target object at the target moment, as the first spatial information of the target object.
  • the point cloud information of the target object at the target moment includes multiple position coordinates of the target object.
  • the first acquisition module 30a is specifically used to calculate the average coordinate value of multiple position coordinates as the reference observation center of the target object at the target time.
  • the point cloud information of the target object at the target time includes: multiple position coordinates of the target object at the target time.
  • when the first acquisition module 30a calculates the reference geometric features of the target object at the target moment, it is specifically configured to: determine a cuboid according to the maximum and minimum values of the multiple position coordinates, and use the side lengths of the cuboid as the reference geometric features of the target object at the target moment; or determine an ellipsoid according to the maximum and minimum values of the multiple position coordinates, and use the lengths of the central axes of the ellipsoid as the reference geometric features of the target object at the target moment; or determine a sphere according to the maximum and minimum values of the multiple position coordinates, and use the radius of the sphere as the reference geometric feature of the target object at the target moment.
  • the point cloud information of the target object at the target time includes: multiple energy values of the target object at the target time.
  • when the first acquisition module 30a acquires the first energy information of the target object detected by the radar, it is specifically configured to calculate the average energy value of the multiple energy values as the first energy information.
  • when calculating the target observation center of the target object, the calculation module 30b is specifically configured to: calculate an observation weighting factor based on the first energy information, the reference observation center, and the reference geometric features; and calculate the target observation center of the target object based on the observation weighting factor and the reference observation center.
  • when calculating the observation weighting factor, the calculation module 30b is specifically configured to: calculate the ratio of the reference geometric features to the weighted geometric features of the target object at the historical moment as the geometric feature weighting factor; calculate the ratio between the first energy information and the weighted energy information of the target object at the historical moment as the energy weighting factor; and calculate the observation weighting factor according to the distance weighting factor, the geometric feature weighting factor, and the energy weighting factor; the weighted geometric features and weighted energy information of the target object at the historical moment are both calculated based on the second spatial information and the second energy information.
  • the calculation module 30b is further configured to calculate the gate radius of the target object at the historical moment according to the weighted geometric characteristics of the target object at the historical moment.
  • the calculation module 30b is further configured to: calculate the reference observation center of the at least one object to be identified detected by the radar based on the point cloud information corresponding to the at least one object to be identified at the target moment, and identify the target object from the at least one object to be identified.
  • when identifying the target object from the at least one object to be identified, the calculation module 30b is specifically configured to: for the first object to be identified, determine whether the distance between the reference observation center of the first object to be identified at the target moment and the first target trajectory is less than or equal to the gate radius of the target object at the historical moment; if so, determine that the first object to be identified is the target object; the first object to be identified is any one of the at least one object to be identified.
  • the calculation module 30b is also configured to: obtain the distances at the target moment between the radar and the multiple detection points detected by the radar, together with the point cloud information corresponding to the multiple detection points; determine the neighborhood size for the point cloud information corresponding to the multiple detection points according to the distances between the radar and the multiple detection points at the target moment; and perform density clustering on the point cloud information corresponding to the multiple detection points based on the neighborhood size to obtain the point cloud information corresponding to the at least one object to be identified at the target moment.
  • when calculating the observation weighting factor, the calculation module 30b is specifically configured to: for the first target object, calculate the respective reference observation weighting factors of the multiple objects in the at least one object to be identified that are determined to be target objects, according to their first energy information, reference observation centers, and reference geometric features; and normalize the reference observation weighting factor of the first target object using the reference observation weighting factors of the multiple objects to obtain the observation weighting factor of the first target object; the first target object is any one of the multiple objects.
  • the data processing device further includes: a second acquisition module 30d, a prediction module 30e, and a second fusion module 30f.
  • the second acquisition module 30d is configured to acquire the kinematic parameters of the radar-equipped device at the moment the radar detects the first object to be identified.
  • the prediction module 30e is used to predict the trajectory information of the first object to be identified at a subsequent time according to the kinematic parameters of the device equipped with the radar when the radar detects the first object to be identified.
  • the second fusion module 30f is configured to perform trajectory fusion on the unknown object and the first object to be recognized according to the predicted trajectory information of the first object to be recognized at a subsequent time and the reference observation center of the unknown object detected by the radar at the subsequent time.
  • the kinematic parameters of the radar-equipped device when the radar detects the first object to be identified include at least one of the position information, acceleration information, velocity information, and movement direction of the radar-equipped device at that moment.
  • the data processing device may further include: a guiding module 30g.
  • the guiding module 30g is used to guide the device equipped with the radar to track the target object according to the second target trajectory.
  • the data processing device may further include: a restoration module 30h.
  • the restoration module 30h is used to restore the working environment of the radar-equipped device according to the second target trajectory and other target trajectories of the target object before the second target trajectory.
  • the data processing device provided in this embodiment can calculate the target observation center of the target object according to the spatial information and energy information of the target object, and fuse the observation center of the target object with the target trajectory of the target object at historical moments to generate a new target trajectory of the target object.
  • integrating the spatial information and energy information of the target object to determine the target trajectory of the target object helps to improve the accuracy of the target trajectory, which in turn helps to improve the accuracy of subsequent work using the trajectory of the target object. For example, when tracking a target, it helps to improve the accuracy of positioning the target object, thereby helping to improve the accuracy of target tracking, and so on.
  • Fig. 4 is a schematic structural diagram of a radar provided by an embodiment of the application.
  • the radar includes: a memory 40a and a processor 40b.
  • the memory 40a is used to store the computer program and the first target trajectory of the target object at the historical moment; the first target trajectory is determined according to the second spatial information and the second energy information of the target object at the historical moment.
  • the processor is coupled to the memory and configured to execute the computer program to: acquire the first spatial information and first energy information of the target object detected by the radar at the target moment, where the historical moment is before the target moment; calculate the target observation center of the target object at the target moment according to the first energy information and the first spatial information; and fuse the target observation center with the first target trajectory to generate a second target trajectory of the target object.
  • the historical moment is the detection moment closest to the target moment.
  • when the processor 40b obtains the first spatial information of the target object detected by the radar at the target moment, it is specifically configured to: calculate the reference observation center and reference geometric features of the target object at the target moment according to the point cloud information of the target object at the target moment, as the first spatial information of the target object.
  • the point cloud information of the target object at the target moment includes multiple position coordinates of the target object.
  • when calculating the reference observation center of the target object at the target moment, the processor 40b is specifically configured to calculate the average coordinate value of the multiple position coordinates as the reference observation center of the target object at the target moment.
  • the point cloud information of the target object at the target time includes: multiple position coordinates of the target object at the target time.
  • when the processor 40b calculates the reference geometric features of the target object at the target moment, it is specifically configured to: determine a cuboid according to the maximum and minimum values of the multiple position coordinates, and use the side lengths of the cuboid as the reference geometric features of the target object at the target moment; or determine an ellipsoid according to the maximum and minimum values of the multiple position coordinates, and use the lengths of the central axes of the ellipsoid as the reference geometric features of the target object at the target moment; or determine a sphere according to the maximum and minimum values of the multiple position coordinates, and use the radius of the sphere as the reference geometric feature of the target object at the target moment.
  • the point cloud information of the target object at the target time includes: multiple energy values of the target object at the target time.
  • when the processor 40b obtains the first energy information of the target object detected by the radar, it is specifically configured to calculate the average energy value of the multiple energy values as the first energy information.
  • when the processor 40b calculates the target observation center of the target object, it is specifically configured to: calculate an observation weighting factor based on the first energy information, the reference observation center, and the reference geometric features; and calculate the target observation center of the target object based on the observation weighting factor and the reference observation center.
  • when the processor 40b calculates the observation weighting factor, it is specifically configured to: calculate the ratio of the reference geometric features to the weighted geometric features of the target object at the historical moment as the geometric feature weighting factor; calculate the ratio between the first energy information and the weighted energy information of the target object at the historical moment as the energy weighting factor; and calculate the observation weighting factor according to the distance weighting factor, the geometric feature weighting factor, and the energy weighting factor; the weighted geometric features and weighted energy information of the target object at the historical moment are both calculated based on the second spatial information and the second energy information.
  • the processor 40b is further configured to calculate the gate radius of the target object at the historical moment according to the weighted geometric characteristics of the target object at the historical moment.
  • the processor 40b is further configured to: calculate the reference observation center of the at least one object to be identified detected by the radar based on the point cloud information corresponding to the at least one object to be identified at the target moment, and identify the target object from the at least one object to be identified.
  • when identifying the target object from the at least one object to be identified, the processor 40b is specifically configured to: for the first object to be identified, determine whether the distance between the reference observation center of the first object to be identified at the target moment and the first target trajectory is less than or equal to the gate radius of the target object at the historical moment; if so, determine that the first object to be identified is the target object; the first object to be identified is any one of the at least one object to be identified.
  • the processor 40b is also configured to: obtain the distances at the target moment between the radar and the multiple detection points detected by the radar, together with the point cloud information corresponding to the multiple detection points; determine the neighborhood size for the point cloud information corresponding to the multiple detection points according to the distances between the radar and the multiple detection points at the target moment; and perform density clustering on the point cloud information corresponding to the multiple detection points based on the neighborhood size to obtain the point cloud information corresponding to the at least one object to be identified at the target moment.
  • when calculating the observation weighting factor, the processor 40b is specifically configured to: for the first target object, calculate the respective reference observation weighting factors of the multiple objects in the at least one object to be identified that are determined to be target objects, according to their first energy information, reference observation centers, and reference geometric features; and normalize the reference observation weighting factor of the first target object using the reference observation weighting factors of the multiple objects to obtain the observation weighting factor of the first target object; the first target object is any one of the multiple objects.
  • the processor 40b is further configured to: acquire the kinematic parameters of the radar-equipped device at the moment the radar detects the first object to be identified; predict the trajectory information of the first object to be identified at a subsequent moment according to these kinematic parameters; and trajectory-fuse the unknown object and the first object to be identified according to the predicted trajectory information of the first object to be identified at the subsequent moment and the reference observation center of the unknown object detected by the radar at that subsequent moment.
  • the kinematic parameters of the radar-equipped device when the radar detects the first object to be identified include at least one of the position information, acceleration information, velocity information, and movement direction of the radar-equipped device at that moment.
  • the processor 40b is further configured to: guide the radar-equipped device to track the target object according to the second target trajectory; or, according to the second target trajectory and the target object before the second target trajectory Other target trajectories restore the working environment of equipment equipped with radar.
  • the radar may further include optional components such as a communication component 40c, a horizontal angle detection device 40d, an electrical scanning angle measuring device 40e, and a power supply component 40f. Only some of the components are schematically shown in FIG. 4; this does not mean that the radar must include all the components shown in FIG. 4, nor that the radar can only include the components shown in FIG. 4.
  • the communication component 40c is used to transmit the detection signal and to receive the echo signal reflected back when the detection signal reaches a detection point.
  • the communication component 40c may include: a transmitter 40c1, a receiver 40c2, an antenna 40c3, etc., but is not limited thereto.
  • the functions and implementation forms of the transmitter 40c1, the receiver 40c2, and the antenna 40c3 belong to the common knowledge in the art, and will not be repeated here.
  • the horizontal angle detection device 40d can measure the rotational position of the radar; the electrical scanning angle measurement device 40e can measure the deviation angle of the target relative to the axial direction of the transmitter 40c1.
  • the horizontal angle detection device 40d includes a photoelectric sensor 40d1 and a grating disk 40d2. Their arrangement, implementation form, and working principle are common knowledge in the art and are not repeated here.
  • the radar provided in this embodiment can calculate the target observation center of the target object from the target object's spatial information and energy information, and fuse that observation center with the target object's target trajectory at a historical moment to generate a new target trajectory. Combining the spatial information and energy information of the target object to determine its target trajectory helps improve the accuracy of the trajectory, which in turn helps improve the accuracy of subsequent work that uses it; for example, during target tracking it helps improve the accuracy of locating the target object and hence of the tracking itself.
  • FIG. 5 is a schematic structural diagram of a detection device provided by an embodiment of this application. As shown in Figure 5, the detection device includes a memory 50a and a processor 50b. The detection equipment is also equipped with a radar 50c.
  • the radar 50c is used to obtain the first spatial information and first energy information of the target object detected at the target moment.
  • the memory 50a stores the computer program and the first target trajectory of the target object at the historical moment; the first target trajectory is determined according to the second spatial information and the second energy information of the target object at the historical moment; the historical moment is located before the target time.
  • the processor 50b is coupled to the memory 50a for executing a computer program for: calculating the target observation center of the target object at the target time according to the first energy information and the first spatial information; and fusing the target observation center with the first target trajectory , To generate the second target trajectory of the target object.
  • the historical moment is the detection moment closest to the target moment.
  • when the radar 50c acquires the first spatial information of the target object detected at the target moment, it is specifically configured to calculate the target object's reference observation center and reference geometric features at the target moment from the target object's point cloud information at the target moment, as the first spatial information of the target object.
  • the point cloud information of the target object at the target moment includes multiple position coordinates of the target object.
  • when the radar 50c calculates the reference observation center of the target object at the target moment, it is specifically configured to calculate the average of the multiple position coordinates as the reference observation center of the target object at the target moment.
  • the point cloud information of the target object at the target time includes: multiple position coordinates of the target object at the target time.
  • when the radar 50c calculates the reference geometric features of the target object at the target moment, it is specifically configured to: determine a cuboid according to the maxima and minima of the multiple position coordinates, and use the cuboid's side lengths as the reference geometric features of the target object at the target moment; or determine an ellipsoid according to those maxima and minima, and use the lengths of the ellipsoid's central axes as the reference geometric features; or determine a sphere according to those maxima and minima, and use the sphere's radius as the reference geometric features of the target object at the target moment.
  • the point cloud information of the target object at the target time includes: multiple energy values of the target object at the target time.
  • when the radar 50c obtains the first energy information of the target object it has detected, it is specifically configured to calculate the average of the multiple energy values as the first energy information.
  • when the processor 50b calculates the target observation center of the target object, it is specifically configured to: calculate an observation weighting factor from the first energy information, the reference observation center, and the reference geometric features; and calculate the target observation center of the target object based on the observation weighting factor and the reference observation center.
  • when the processor 50b calculates the observation weighting factor, it is specifically configured to: calculate the ratio of the reference geometric features to the target object's weighted geometric features at the historical moment, as the geometric feature weighting factor; calculate the ratio between the first energy information and the target object's weighted energy information at the historical moment, as the energy weighting factor; and calculate the observation weighting factor according to the distance weighting factor, the geometric feature weighting factor, and the energy weighting factor; the weighted geometric features and weighted energy information of the target object at the historical moment are both calculated from the second spatial information and the second energy information.
  • the processor 50b is further configured to calculate the gate radius of the target object at the historical moment according to the weighted geometric characteristics of the target object at the historical moment.
  • the processor 50b is further configured to: calculate the reference observation centers of the at least one object to be recognized detected by the radar, based on the point cloud information corresponding to the at least one object at the target moment; and recognize the target object from the at least one object to be recognized according to the first target trajectory and those reference observation centers.
  • when the processor 50b recognizes the target object from the at least one object to be recognized, it is specifically configured to: for a first object to be recognized, judge whether the distance between the first object's reference observation center at the target moment and the first target trajectory is less than or equal to the gate radius of the target object at the historical moment; if so, determine that the first object to be recognized is the target object; the first object to be recognized is any one of the at least one object to be recognized.
  • the processor 50b is further configured to: obtain the distances between the radar and the multiple detection points it detects at the target moment, together with the point cloud information corresponding to those detection points; determine the neighborhood size of that point cloud information according to the distances; and, based on the neighborhood size, perform density clustering on the point cloud information corresponding to the multiple detection points to obtain the point cloud information corresponding to at least one object to be recognized at the target moment.
  • when the processor 50b calculates the observation weighting factor, it is specifically configured to: for a first target object, calculate the reference observation weighting factors of the multiple objects among the at least one object to be recognized that are determined to be the target object, according to each object's first energy information, reference observation center, and reference geometric features; and normalize the reference observation weighting factor of the first target object using the reference observation weighting factors of the multiple objects, to obtain the observation weighting factor of the first target object; the first target object is any one of the multiple objects.
  • the processor 50b is further configured to: obtain the kinematic parameters of the radar-equipped device at the moment the radar detects the first object to be recognized; predict the trajectory information of the first object to be recognized at subsequent moments according to those kinematic parameters; and perform trajectory fusion between an unknown object and the first object to be recognized, according to the predicted trajectory information and the reference observation center of the unknown object detected by the radar at a subsequent moment.
  • the kinematic parameters of the radar-equipped device when the radar detects the first object to be recognized include at least one of the device's position information, acceleration information, velocity information, and movement direction at that moment.
  • the processor 50b is further configured to: guide the radar-equipped device to track the target object according to the second target trajectory; or restore the working environment of the radar-equipped device according to the second target trajectory and the target object's other target trajectories before the second target trajectory.
  • the detection device is an unmanned aerial vehicle, an unmanned vehicle, a robot, or a ship, etc., but is not limited thereto.
  • the detection device may further include optional components such as a power supply component 50d, a communication component 50f, a driving component 50g, a display component 50h, an audio component 50i, or one or more sensors 50j.
  • depending on the implementation form of the detection device, the other components it includes also differ.
  • the other components included in the detection device are common knowledge in the field to which the detection device belongs and are not repeated here. Only some components are schematically shown in FIG. 5, which neither means that the detection device must include all the components shown in FIG. 5 nor that it can include only those components.
  • the detection equipment provided by the embodiment of the present application includes a radar and a processor.
  • the radar and the processor cooperate to calculate the target observation center of the target object from the target object's spatial information and energy information, and to fuse that observation center with the target object's target trajectory at a historical moment to generate a new target trajectory for the target object. Combining the spatial information and energy information of the target object to determine its target trajectory helps improve the accuracy of the trajectory, which in turn helps improve the accuracy of subsequent work that uses it; for example, during target tracking it helps improve the accuracy of locating the target object and hence of the tracking itself.
  • FIG. 6 is a schematic structural diagram of a mobile device provided by an embodiment of this application.
  • the mobile device is equipped with a radar 60a.
  • the radar 60a is used to: obtain the first spatial information and first energy information of the target object detected at the target moment; calculate the target observation center of the target object at the target moment according to the first energy information and the first spatial information; The target observation center is fused with the first target trajectory of the target object at the historical moment to generate the second target trajectory of the target object; wherein the first target trajectory is based on the second spatial information and the second energy of the target object at the historical moment The information is determined; the historical moment is before the target moment.
  • for the specific implementation of how the radar 60a acquires the target object's first spatial information and first energy information, calculates the target object's target observation center at the target moment, and fuses the target observation center with the target object's first target trajectory at the historical moment, as well as for the specific form and structure of the radar 60a, please refer to the relevant content of the above embodiments, which is not repeated here.
  • the mobile device further includes a processor 60b.
  • the radar 60a may provide the second target trajectory of the target object and other target trajectories before the second target trajectory to the processor 60b.
  • the processor 60b can guide the mobile device to track the target object according to the second target trajectory; or, according to the second target trajectory and other trajectories of the target object before the second target trajectory, restore the motion environment of the mobile device.
  • the mobile device may further include optional components such as a memory 60c, a power supply component 60d, a communication component 60f, a driving component 60g, a display component 60h, an audio component 60i, or one or more sensors 60j.
  • depending on the implementation form of the mobile device, the other components it includes also differ.
  • the other components included in the mobile device belong to common knowledge in the field to which the mobile device itself belongs, and will not be repeated here. Only part of the components are schematically shown in FIG. 6, which does not mean that the mobile device must include all the components shown in FIG. 6, nor does it mean that the mobile device can only include the components shown in FIG. 6.
  • the mobile device provided in this embodiment is equipped with a radar.
  • the radar can calculate the target observation center of the target object according to the spatial information and energy information of the target object, and merge the observation center of the target object with the target trajectory of the target object at a historical moment to generate a new target trajectory of the target object.
  • Combining the spatial information and energy information of the target object to determine its target trajectory helps improve the accuracy of the target trajectory, which in turn helps improve the accuracy of subsequent work that uses the trajectory; for example, during target tracking it helps improve the accuracy of locating the target object and hence of the tracking itself.
  • the memory is used to store a computer program, and can be configured to store various other data to support operations on the device where it is located.
  • the processor can execute the computer program stored in the memory to realize the corresponding control logic.
  • the memory can be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, or a magnetic or optical disk.
  • the processor may be any hardware processing device that can execute the logic of the foregoing method.
  • the processor may be a central processing unit (CPU), a graphics processing unit (GPU), or a microcontroller unit (MCU); it may also be a programmable device such as a field-programmable gate array (FPGA), a programmable array logic device (PAL), a general array logic device (GAL), or a complex programmable logic device (CPLD); or an advanced RISC machine (ARM) processor or a system on chip (SoC), etc., but is not limited thereto.
  • the communication component may also be configured to facilitate wired or wireless communication between the device where it is located and other devices.
  • the device where it is located can access wireless networks based on communication standards, such as WiFi, 2G or 3G, 4G, 5G or a combination of them.
  • the communication component receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel.
  • the communication component may also be implemented based on near field communication (NFC) technology, radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, or other technologies.
  • the display component may include a liquid crystal display (LCD) and a touch panel (TP). If the display component includes a touch panel, the display component may be implemented as a touch screen to receive input signals from the user.
  • the touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action but also detect the duration and pressure associated with the touch or slide operation.
  • the power supply component is configured to provide power to various components of the device in which it is located.
  • the power supply component may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device in which the power supply component is located.
  • the audio component may be configured to output and/or input audio signals.
  • the audio component includes a microphone (MIC).
  • when the device in which the audio component is located is in an operating mode, such as a call mode, a recording mode, or a voice recognition mode, the microphone is configured to receive external audio signals.
  • the received audio signal can be further stored in a memory or sent via a communication component.
  • the audio component further includes a speaker for outputting audio signals.
  • audio components can be used to achieve voice interaction with users.
  • the embodiments of the present invention can be provided as a method, a system, or a computer program product. Therefore, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) containing computer-usable program code.
  • These computer program instructions can be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor, or other programmable data processing device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for realizing the functions specified in one or more flows of a flowchart and/or one or more blocks of a block diagram.
  • These computer program instructions can also be stored in a computer-readable memory that can direct a computer or other programmable data processing device to work in a specific manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that realizes the functions specified in one or more flows of a flowchart and/or one or more blocks of a block diagram.
  • These computer program instructions can also be loaded onto a computer or other programmable data processing device, such that a series of operational steps are executed on the computer or other programmable device to produce computer-implemented processing, so that the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of a flowchart and/or one or more blocks of a block diagram.
  • In a typical configuration, the computing device includes one or more processors (CPUs), input/output interfaces, a network interface, and memory.
  • the memory may include non-persistent memory in computer-readable media, random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
  • Computer-readable media include permanent and non-permanent, removable and non-removable media, and information storage can be realized by any method or technology.
  • the information can be computer-readable instructions, data structures, program modules, or other data.
  • Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, CD-ROM, digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • Probability & Statistics with Applications (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

A data processing method and apparatus, a radar, a device, and a storage medium. The method includes: acquiring first spatial information and first energy information of a target object detected by a radar at a target moment (101); calculating a target observation center of the target object at the target moment according to the first energy information and the first spatial information (102); and fusing the target observation center with a first target trajectory of the target object at a historical moment to generate a second target trajectory of the target object (103). The first target trajectory is determined according to second spatial information and second energy information of the target object at the historical moment, and the historical moment precedes the target moment. By combining the spatial information and energy information of the target object to determine its target trajectory, the method helps improve the accuracy of the target trajectory and, in turn, the accuracy of subsequent work that uses the trajectory of the target object.

Description

Data processing method and apparatus, radar, device, and storage medium. Technical Field
This application relates to the field of radar technology, and in particular to a data processing method and apparatus, a radar, a device, and a storage medium.
Background
A radar consists of a transmitter, a receiver, an information processing system, and the like. It can transmit a detection signal, receive the signal reflected back from a target, and obtain information about the target from the reflected signal, such as the distance between the radar and the target and the target's azimuth, height, and shape. Radar is therefore widely used in target detection and tracking.
In practical applications, the spatial information of each target during its movement is usually identified, a movement trajectory is generated from it, and the trajectory is then used for target tracking, accident investigation, and so on. However, the movement trajectories generated in the prior art have low accuracy.
Summary
Aspects of this application provide a data processing method and apparatus, a radar, a device, and a storage medium, to improve the accuracy of a target's movement trajectory.
An embodiment of this application provides a data processing method, including:
acquiring first spatial information and first energy information of a target object detected by a radar at a target moment;
calculating a target observation center of the target object at the target moment according to the first energy information and the first spatial information;
fusing the target observation center with a first target trajectory of the target object at a historical moment to generate a second target trajectory of the target object;
wherein the first target trajectory is determined according to second spatial information and second energy information of the target object at the historical moment; the historical moment precedes the target moment.
An embodiment of this application further provides a data processing apparatus, including a first acquisition module, a calculation module, and a first fusion module;
wherein the first acquisition module is configured to acquire first spatial information and first energy information of a target object detected by a radar at a target moment;
the calculation module is configured to calculate a target observation center of the target object at the target moment according to the first energy information and the first spatial information;
the first fusion module is configured to fuse the target observation center with a first target trajectory of the target object at a historical moment to generate a second target trajectory of the target object;
wherein the first target trajectory is determined according to second spatial information and second energy information of the target object at the historical moment; the historical moment precedes the target moment.
An embodiment of this application further provides a radar, including a memory and a processor, wherein the memory is configured to store a computer program and a first target trajectory of a target object at a historical moment; the first target trajectory is determined according to second spatial information and second energy information of the target object at the historical moment;
the processor is coupled to the memory and configured to execute the computer program to:
acquire first spatial information and first energy information of the target object detected by the radar at a target moment, wherein the historical moment precedes the target moment;
calculate a target observation center of the target object at the target moment according to the first energy information and the first spatial information;
fuse the target observation center with the first target trajectory to generate a second target trajectory of the target object.
An embodiment of this application further provides a detection device, which is equipped with a radar and includes a memory and a processor; wherein the radar is configured to acquire first spatial information and first energy information of a target object detected at a target moment;
the memory is configured to store a computer program and a first target trajectory of the target object at a historical moment; the first target trajectory is determined according to second spatial information and second energy information of the target object at the historical moment; the historical moment precedes the target moment;
the processor is coupled to the memory and configured to execute the computer program to: calculate a target observation center of the target object at the target moment according to the first energy information and the first spatial information; and fuse the target observation center with the first target trajectory to generate a second target trajectory of the target object.
An embodiment of this application further provides a mobile device equipped with a radar, the radar being configured to: acquire first spatial information and first energy information of a target object detected at the target moment; calculate a target observation center of the target object at the target moment according to the first energy information and the first spatial information; and fuse the target observation center with a first target trajectory of the target object at a historical moment to generate a second target trajectory of the target object; wherein the first target trajectory is determined according to second spatial information and second energy information of the target object at the historical moment; the historical moment precedes the target moment.
An embodiment of this application further provides a computer-readable storage medium storing computer instructions which, when executed by one or more processors, cause the one or more processors to perform actions including:
acquiring first spatial information and first energy information of a target object detected by a radar at a target moment;
calculating a target observation center of the target object at the target moment according to the first energy information and the first spatial information;
fusing the target observation center with a first target trajectory of the target object at a historical moment to generate a second target trajectory of the target object;
wherein the first target trajectory is determined according to second spatial information and second energy information of the target object at the historical moment; the historical moment precedes the target moment.
In the embodiments of this application, the target observation center of a target object can be calculated from the target object's spatial information and energy information, and the observation center can be fused with the target object's target trajectory at a historical moment to generate a new target trajectory. Combining the spatial information and energy information of the target object to determine its target trajectory helps improve the accuracy of the trajectory, which in turn helps improve the accuracy of subsequent work that uses it; for example, during target tracking it helps improve the accuracy of locating the target object and hence of the tracking itself.
Brief Description of the Drawings
The drawings described here are provided for further understanding of this application and constitute a part of it. The exemplary embodiments of this application and their descriptions are used to explain this application and do not unduly limit it. In the drawings:
FIG. 1 is a schematic flowchart of a data processing method provided by an embodiment of this application;
FIG. 2 is a schematic flowchart of another data processing method provided by an embodiment of this application;
FIG. 3 is a schematic structural diagram of a data processing apparatus provided by an embodiment of this application;
FIG. 4 is a schematic structural diagram of a radar provided by an embodiment of this application;
FIG. 5 is a schematic structural diagram of a detection device provided by an embodiment of this application;
FIG. 6 is a schematic structural diagram of a mobile device provided by an embodiment of this application.
Detailed Description
To make the objectives, technical solutions, and advantages of this application clearer, the technical solutions are described clearly and completely below with reference to specific embodiments and the accompanying drawings. Obviously, the described embodiments are only some rather than all of the embodiments of this application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of this application without creative effort fall within the protection scope of this application.
To address the technical problem that trajectories generated for a target object have low accuracy, in some embodiments of this application the target observation center of the target object is calculated from its spatial information and energy information, and the observation center is fused with the target object's target trajectory at a historical moment to generate a new target trajectory. Combining the spatial information and energy information of the target object to determine its target trajectory helps improve the accuracy of the trajectory, which in turn helps improve the accuracy of subsequent work that uses it; for example, during target tracking it helps improve the accuracy of locating the target object and hence of the tracking itself.
The technical solutions provided by the embodiments of this application are described in detail below with reference to the drawings.
FIG. 1 is a schematic flowchart of a data processing method provided by an embodiment of this application. As shown in FIG. 1, the method includes:
101. Acquire first spatial information and first energy information of a target object detected by a radar at a target moment.
102. Calculate a target observation center of the target object at the target moment according to the first energy information and the first spatial information.
103. Fuse the target observation center with a first target trajectory of the target object at a historical moment to generate a second target trajectory of the target object.
In this embodiment, the radar may be a directional radar or a rotating radar, and may be a microwave radar or a lidar, etc., but is not limited thereto.
In this embodiment, the radar can detect not only the spatial information of the target object but also its energy information. The spatial information of the target object includes its position information and geometric features. The energy information of the target object is based on the intensity of the echo signal returned by the target object. The echo intensity reflects, to a certain extent, the distance between the target object and the radar, so both the spatial information and the energy information reflect, to a certain extent, the relative positional relationship between the target object and the radar. On this basis, combining the spatial information and the energy information to characterize the target object's trajectory information helps improve the accuracy of the determined trajectory.
Based on the above analysis, in step 101 the first spatial information and first energy information of the target object detected by the radar at the target moment are acquired. The target moment is the moment at which the radar detects the first spatial information and first energy information. In different application scenarios, the target moment may be the current moment or a past moment. For example, in a target-tracking scenario, the target moment may be the radar's current detection moment; in a scenario where the trajectory is used for accident investigation or scene replay, the target moment is a past moment relative to the moment at which this information is acquired.
Next, in step 102, the target observation center of the target object is calculated from the first spatial information and the first energy information. Further, in step 103, the target observation center is fused with the target object's first target trajectory at the historical moment to generate the second target trajectory. This way of generating trajectories combines the target object's spatial and energy information, which helps improve the accuracy of the target trajectory and, in turn, the accuracy of subsequent work that uses it, for example more accurate localization of the target object during tracking and hence more accurate tracking.
In this embodiment, the first target trajectory is the target object's trajectory information at the historical moment, where the historical moment is a detection moment before the target moment, preferably the detection moment closest to the target moment. Further, the first target trajectory is determined from the target object's second spatial information and second energy information at the historical moment; for the specific way of determining it, refer to the way the second target trajectory is determined in the embodiments above and below, which is not repeated here.
It should be noted that in the embodiments of this application, descriptions such as "first" and "second" are used to distinguish different information, trajectories, devices, modules, etc.; they neither represent an order nor restrict "first" and "second" to different types.
Optionally, in step 103, a Kalman filtering method may be used to fuse the target observation center with the target object's first target trajectory. Optionally, the Kalman filtering method may be used to predict the second target trajectory.
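As an illustration of the fusion in step 103, the following is a minimal sketch of one Kalman predict/update cycle, not the exact filter of this application: it assumes a constant-velocity state model, treats the target observation center as the position measurement, and uses illustrative noise values; all names are hypothetical.

```python
import numpy as np

def kalman_fuse(x, P, z, dt=0.1, q=0.01, r=0.1):
    """One Kalman cycle fusing a target observation center z = [px, py, pz]
    with a track state x = [px, py, pz, vx, vy, vz] and covariance P."""
    F = np.eye(6)
    F[:3, 3:] = dt * np.eye(3)                    # constant-velocity model
    H = np.hstack([np.eye(3), np.zeros((3, 3))])  # observe position only
    Q, R = q * np.eye(6), r * np.eye(3)           # illustrative noise levels

    x = F @ x                                     # predict the trajectory
    P = F @ P @ F.T + Q
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                # Kalman gain
    x = x + K @ (z - H @ x)                       # fuse the observation center
    P = (np.eye(6) - K @ H) @ P
    return x, P
```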
In the embodiments of this application, the radar can be mounted on various devices to perform related tasks, for example on a mobile device to detect obstacles or to inspect and track targets. In different application scenarios, different work can be performed based on the second target trajectory. For example, in a target-tracking scenario, the radar-equipped device can be guided to track the target object according to the second target trajectory; in an accident-investigation scenario, the working environment of the radar-equipped device can be restored according to the second target trajectory and the target object's other target trajectories before it; and so on, but not limited thereto.
Further, the radar can be mounted on various devices. For example, it can be mounted on a mobile device, which may be an autonomous mobile device such as an unmanned aerial vehicle, an unmanned vehicle, or a robot, but is not limited thereto; alternatively, the mobile device may be a movable device requiring human control, such as a non-autonomous vehicle, ship, or aircraft, etc., but is not limited thereto.
In this embodiment, the radar includes a transmitter, a receiver, and an information processing system. The transmitter transmits a detection signal; when the detection signal encounters an obstacle, an echo signal is reflected back, which the receiver receives. The information processing system then obtains information about the target from the reflected echo, such as the distance between the radar and the target and the target's azimuth, height, and shape.
In this embodiment, the detection signal encountering an obstacle in essence means the detection signal encountering a certain point on the obstacle. For ease of description and distinction, the obstacle point actually encountered by the detection signal during propagation is defined as a detection point. In this embodiment, a detection point may be a point on the target object, or may belong to another object outside the target object, such as dust in the air. Each detection signal, upon encountering a detection point, returns a corresponding echo signal.
Further, the spatial coordinate information and energy information of detection points can be obtained from the transmitted detection signals and the received echo signals. The spatial coordinate information and energy information of multiple detection points form point cloud information, i.e., a set composed of a series of spatial coordinate points and energy values. In this embodiment, a data point in the point cloud information can be interpreted as the combination of a detection point's spatial coordinate information and energy information.
Optionally, the spatial coordinate information of a detection point can be calculated from the distance between the radar and the detection point and the radar's pose, i.e., the radar's position and orientation. The radar's orientation may refer to the directivity of the radar antenna. From the antenna's directivity, the direction of the detection point relative to the radar can be obtained; then, from that direction, the radar-to-point distance, and the radar's position, the spatial coordinates of the detection point can be calculated. Optionally, the energy information of the detection point can also be determined from the intensity of the echo signal.
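A short sketch of this coordinate calculation, under the simplifying assumption that the detection point's direction relative to the radar is given as azimuth and elevation angles; the function name and argument layout are illustrative only.

```python
import math

def detection_point_coords(radar_pos, azimuth, elevation, distance):
    """Spatial coordinates of a detection point from the radar position,
    the antenna direction (azimuth/elevation in radians), and the
    measured radar-to-point distance."""
    x0, y0, z0 = radar_pos
    dx = distance * math.cos(elevation) * math.cos(azimuth)
    dy = distance * math.cos(elevation) * math.sin(azimuth)
    dz = distance * math.sin(elevation)
    return (x0 + dx, y0 + dy, z0 + dz)
```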
In this embodiment, a target object detected by the radar corresponds to multiple detection points, and the data points corresponding to those detection points form the target object's point cloud information; here, "multiple" means two or more. On this basis, in step 101 the target object's reference observation center and reference geometric features at the target moment can be calculated from its point cloud information at the target moment and used as the first spatial information. The point cloud information of the target object at the target moment contains multiple position coordinates of the target object. This embodiment does not limit the specific way of calculating the reference observation center and reference geometric features. Optionally, the average of the multiple position coordinates can be calculated as the reference observation center at the target moment. For example, suppose the point cloud information of the target object at the target moment contains K position coordinates, with K ≥ 2 an integer, defined as the position coordinates in the point cloud obtained by the radar's m-th detection and denoted $(x_{mk}, y_{mk}, z_{mk})$, $k = 1, 2, \dots, K$. The reference observation center can then be expressed as $(\bar{x}_m, \bar{y}_m, \bar{z}_m)$, where
$$\bar{x}_m=\frac{1}{K}\sum_{k=1}^{K}x_{mk},\qquad \bar{y}_m=\frac{1}{K}\sum_{k=1}^{K}y_{mk},\qquad \bar{z}_m=\frac{1}{K}\sum_{k=1}^{K}z_{mk}.$$
Optionally, a cuboid can be determined from the maxima and minima of the multiple position coordinates, and its side lengths used as the reference geometric features of the target object at the target moment. The position coordinates are contained in the cuboid, whose side lengths are its length, width, and height: $l_m=\max(x_{mk})-\min(x_{mk})$; $w_m=\max(y_{mk})-\min(y_{mk})$; $h_m=\max(z_{mk})-\min(z_{mk})$, where $l_m$, $w_m$, $h_m$ denote the cuboid's length, width, and height respectively.
Alternatively, an ellipsoid containing the position coordinates can be determined from their maxima and minima, and the lengths of its central axes used as the target object's reference geometric features; optionally, the central axis lengths include the lengths of the ellipsoid's major and minor central axes. Or a sphere containing the position coordinates can be determined from their maxima and minima, and its radius used as the reference geometric features of the target object at the target moment.
In this embodiment, a position coordinate is a three-dimensional spatial coordinate; the maximum of the multiple position coordinates is the three-dimensional value formed by the maxima along the x-, y-, and z-axes, and the minimum is the three-dimensional value formed by the minima along the x-, y-, and z-axes.
In the embodiments of this application, the point cloud information of the target object at the target moment also contains multiple energy values of the target object at the target moment. Optionally, the average of these energy values can be calculated as the target object's first energy information at the target moment.
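The reference observation center, cuboid reference geometric features, and first energy information can be computed together; the sketch below assumes the point cloud is given as an array of coordinates plus an array of energy values, with illustrative names.

```python
import numpy as np

def first_spatial_and_energy(points, energies):
    """points: (K, 3) position coordinates of the target object's point
    cloud at the target moment; energies: (K,) energy values.
    Returns the reference observation center, the cuboid geometric
    feature (l_m, w_m, h_m), and the first energy information."""
    pts = np.asarray(points, dtype=float)
    center = pts.mean(axis=0)                    # average coordinate value
    l, w, h = pts.max(axis=0) - pts.min(axis=0)  # cuboid side lengths
    p_m = float(np.mean(energies))               # average energy value
    return center, (float(l), float(w), float(h)), p_m
```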
Further, considering that the target object's energy information and spatial information carry different weights when characterizing its trajectory information, in step 102 an observation weighting factor can also be calculated from the first energy information, the reference observation center, and the reference geometric features, and the target observation center of the target object at the target moment can be calculated based on the observation weighting factor and the reference observation center.
Optionally, a ratio between the distance from the target object's reference observation center at the target moment to the first target trajectory and the target object's gate radius at the historical moment can be calculated as a distance weighting factor. Optionally, the gate radius at the historical moment can be calculated from the target object's weighted geometric features at the historical moment, which are computed from the second spatial information and second energy information. Suppose the historical moment is the radar's n-th detection of the target object, the n-th detection preceding the m-th detection above, and suppose the target object's reference geometric features at the historical moment are characterized by a cuboid's length, width, and height, so that the weighted geometric features are $l_n$, $w_n$, $h_n$. The gate radius of the target object at the historical moment can then be expressed as:
$$r_n=\sqrt{l_n^2+w_n^2+h_n^2}.$$
Optionally, the distance weighting factor can be expressed as $r_d=\frac{r_n}{d}$, where $d$ denotes the distance between the reference observation center and the first target trajectory; the closer the reference observation center is to the first target trajectory, the larger $r_d$. In some cases, $\frac{r_n}{d}$ may approach infinity because the reference observation center is sufficiently close to the first target trajectory, so that the distance weighting factor takes too large a share of the observation weighting factor and degrades the accuracy with which the second target trajectory is subsequently determined. For this reason, a more slowly growing function may be used to express the distance weighting factor; for example, the distance weighting factor may instead be expressed as a gently increasing function of $r_n$ and $d$ (the concrete formula appears only as an image in the source text).
Further, the ratio of the target object's reference geometric features at the target moment to its weighted geometric features at the historical moment can be calculated as the geometric feature weighting factor. The weighted geometric features of the target object at the historical moment are its target geometric features at the historical moment, which can be computed from the reference geometric features at the historical moment and the observation weighting factor at the historical moment; for the specific calculation, refer to the calculation of the target object's geometric features at the target moment in the embodiments of this application. The geometric feature weighting factor is the ratio of each characterizing parameter of the target object's reference geometric features at the target moment to the characterizing parameter of the same attribute of its weighted geometric features at the historical moment. For example, if the reference geometric features at the target moment are represented by a cuboid with length, width, and height $l_m$, $w_m$, $h_m$, and the weighted geometric features at the historical moment are $l_n$, $w_n$, $h_n$, then the geometric feature weighting factors can be expressed as $r_l=\frac{l_m}{l_n}$, $r_w=\frac{w_m}{w_n}$, $r_h=\frac{h_m}{h_n}$.
Further, the ratio between the first energy information and the target object's weighted energy information at the historical moment is calculated as the energy weighting factor; the weighted energy information of the target object at the historical moment is likewise computed from the target object's second spatial information and second energy information. Optionally, denoting the first energy information by $p_m$ and the weighted energy information at the historical moment by $p_n$, the energy weighting factor can be expressed as $r_p=\frac{p_m}{p_n}$.
Further, the observation weighting factor can be calculated from the distance weighting factor, the geometric feature weighting factor, and the energy weighting factor; optionally, their product is taken as the observation weighting factor. That is, if the target object's reference geometric features are represented by a cuboid's length, width, and height, the observation weighting factor $q_i$ can be expressed as $q_i=r_l\cdot r_w\cdot r_h\cdot r_p\cdot r_d$, where $i=1,2,\dots,I$ and $I$ denotes the number of data points in the target object's point cloud at the target moment that lie within the target object's gate radius at the historical moment.
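Putting the factors together, the sketch below computes one object's observation weighting factor; the $r_n/d$ form of the distance factor follows the reconstruction above and is an assumption, as is the symbol p for energy.

```python
import numpy as np

def observation_weighting_factor(center, geom, p_m,
                                 track_point, geom_hist, p_n, gate_radius):
    """center, geom, p_m: reference observation center, cuboid feature
    (l_m, w_m, h_m), and first energy information at the target moment;
    track_point: position of the first target trajectory; geom_hist, p_n:
    weighted geometric feature and weighted energy at the historical
    moment; gate_radius: gate radius r_n at the historical moment."""
    d = float(np.linalg.norm(np.asarray(center) - np.asarray(track_point)))
    r_d = gate_radius / max(d, 1e-9)       # distance weighting factor r_n/d
    r_l, r_w, r_h = (g / gh for g, gh in zip(geom, geom_hist))
    r_p = p_m / p_n                        # energy weighting factor
    return r_l * r_w * r_h * r_p * r_d     # q = r_l * r_w * r_h * r_p * r_d
```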
In the embodiments of this application, considering that the point cloud information detected by the radar at the target moment may contain the point clouds of multiple targets as well as noise points that belong to no target, the target object needs to be recognized from at least one object to be recognized before step 102. Optionally, before step 102, the reference observation centers of the at least one object to be recognized detected by the radar can be calculated from the point cloud information corresponding to the at least one object at the target moment, and the target object can then be recognized from among them according to the first target trajectory and those reference observation centers. Taking a first object to be recognized as an example, the process of judging whether it is the target object is illustrated below; the first object to be recognized is any one of the at least one object to be recognized.
Optionally, it can be judged whether the distance between the first object's reference observation center at the target moment and the first target trajectory is less than or equal to the target object's gate radius at the historical moment; if the judgment result is yes, the first object to be recognized is determined to be the target object; correspondingly, if the result is no, the first object is determined not to be the target object.
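The gate test itself reduces to a one-line comparison; a minimal sketch:

```python
import numpy as np

def is_target(candidate_center, track_point, gate_radius):
    """An object to be recognized is taken to be the target object iff
    its reference observation center at the target moment lies within
    the historical gate radius of the first target trajectory."""
    d = np.linalg.norm(np.asarray(candidate_center) - np.asarray(track_point))
    return bool(d <= gate_radius)
```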
Further, if multiple of the at least one object to be recognized are determined to be the target object, then when calculating the observation weighting factor at the target moment, the reference observation weighting factors of those multiple objects can be calculated from each object's own first energy information, reference observation center, and reference geometric features; the reference observation weighting factor of the first target object is then normalized using the reference observation weighting factors of the multiple objects, to obtain the observation weighting factor of the first target object, where the first target object is any one of the multiple objects. Suppose J objects among the at least one object to be recognized are determined to be the target object; the observation weighting factors of the J objects can then be expressed as
$$\frac{q_j}{\sum_{j=1}^{J}q_j},\qquad j=1,2,\dots,J,$$
i.e., $\frac{q_j}{\sum_{j=1}^{J}q_j}$ is reassigned to $q_j$.
Further, the target observation center of the target object at the target moment can be calculated based on the target object's observation weighting factor and its reference observation center at the target moment. Optionally, the target observation center at the target moment can be expressed as $(x_m, y_m, z_m)$, each component being the observation-weighted combination of the corresponding reference observation center components, e.g. $x_m=\sum_j q_j\,\bar{x}_{mj}$, $y_m=\sum_j q_j\,\bar{y}_{mj}$, $z_m=\sum_j q_j\,\bar{z}_{mj}$. Correspondingly, the weighted geometric features (target geometric features) of the target object at the target moment can be expressed as the observation-weighted reference geometric features, e.g. $(\sum_j q_j l_{mj},\ \sum_j q_j w_{mj},\ \sum_j q_j h_{mj})$, and the weighted energy information of the target object at the target moment as $\sum_j q_j p_{mj}$.
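A sketch of the normalization and weighted combination just described, assuming the J reference observation weighting factors and reference observation centers are already available:

```python
import numpy as np

def target_observation_center(centers, weights):
    """centers: (J, 3) reference observation centers of the objects
    determined to be the target object; weights: (J,) reference
    observation weighting factors. Normalizes the weights
    (q_j <- q_j / sum_j q_j) and returns the weighted target
    observation center (x_m, y_m, z_m)."""
    q = np.asarray(weights, dtype=float)
    q = q / q.sum()
    return q @ np.asarray(centers, dtype=float)
```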
In the embodiments of this application, if the first object to be recognized is neither the target object nor any of the other objects determined at the historical moment, the kinematic parameters of the radar-equipped device at the moment the radar detected the first object can be acquired; the trajectory information of the first object at subsequent moments can be predicted from those kinematic parameters; and trajectory fusion can be performed between an unknown object and the first object according to the predicted trajectory information of the first object at subsequent moments and the reference observation center of the unknown object detected by the radar at a subsequent moment.
Optionally, the kinematic parameters of the radar-equipped device when the radar detects the first object to be recognized include at least one of the device's position information, acceleration information, velocity information, and movement direction at that moment.
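For illustration, the prediction step can be sketched as a constant-velocity extrapolation from the carrier's kinematic parameters; a fuller implementation might also use the acceleration and movement direction, and all names here are illustrative.

```python
import numpy as np

def predict_track(position, velocity, dt, steps):
    """Extrapolate an unmatched object's trajectory from the position
    and velocity recorded when the radar detected it, returning the
    predicted positions at the next `steps` detection moments."""
    p = np.asarray(position, dtype=float)
    v = np.asarray(velocity, dtype=float)
    return [p + v * dt * k for k in range(1, steps + 1)]
```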
Further, considering that the point cloud information detected by the radar at the target moment may contain the point clouds of multiple targets and noise points belonging to no target, the detected point cloud information also needs to be classified, i.e., points belonging to the same target are grouped into one class. Density clustering can be performed on the detected point cloud information to divide it into at least one point cloud subset, each subset corresponding to one object to be recognized.
Further, to determine which target object the point cloud information detected by the radar at the target moment belongs to, density clustering can be performed on the point cloud information corresponding to the multiple detection points, grouping data points belonging to the same target object into the same cluster.
In practical applications, owing to the radar's scanning characteristics, the point cloud of detection points is distributed radially outward with the radar center as the origin; the closer a detection point is to the radar, the denser the corresponding points, i.e., the point-to-point distribution in the point cloud becomes more dispersed as the distance between the detection points and the radar increases. On this basis, the neighborhood size used when density-clustering the point cloud information (i.e., the neighborhood size of data points in the point cloud) can be determined from the distance between the radar and the detection points. The neighborhood size of the point cloud information corresponding to the multiple detection points can therefore be determined from the radar-to-point distances and used as the neighborhood size for density clustering. Determining the neighborhood size in this way takes into account the fact that the point-to-point distribution disperses with increasing distance from the radar, which helps improve the adaptivity of density clustering and hence its accuracy.
The distance between the radar and a detection point can be obtained from the difference between the detection signal and the echo signal; the way of obtaining it differs with the detection signal the radar transmits. For example, if the detection signal is a pulse signal, the radar-to-point distance can be calculated from the time difference between the transmitted detection signal and the received echo signal, i.e., by the time-of-flight method. Optionally, given the propagation speed of the detection and echo signals in the atmosphere, the distance can be calculated from that time difference and the propagation speed. Optionally, the detection signal may be an electromagnetic wave signal, such as a microwave signal or a laser signal, but is not limited thereto.
For another example, if the detection signal is a continuous-wave signal, the radar-to-point distance can be calculated from the frequency difference between the transmitted detection signal and the received echo signal. Optionally, the continuous wave is a frequency-modulated continuous wave (FMCW), where the modulation may be triangular-wave, sawtooth-wave, coded, or noise frequency modulation, etc., but is not limited thereto.
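The two ranging relations can be sketched as follows; both assume propagation at approximately the speed of light, and the FMCW form assumes a linear (e.g. sawtooth) sweep.

```python
C = 299_792_458.0  # propagation speed, approximated as c (m/s)

def range_time_of_flight(dt_seconds):
    """Pulse radar: distance from the time difference between the
    transmitted detection signal and the received echo."""
    return C * dt_seconds / 2.0            # halve the round trip

def range_fmcw(beat_freq_hz, sweep_bandwidth_hz, sweep_time_s):
    """FMCW radar: distance from the frequency difference (beat
    frequency) between transmitted and received signals."""
    slope = sweep_bandwidth_hz / sweep_time_s   # sweep slope in Hz/s
    return C * beat_freq_hz / (2.0 * slope)
```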
Further, the neighborhood size used for density clustering of the point cloud information, i.e., the neighborhood size of data points in the point cloud, can be determined from the radar-to-point distances. Based on the determined neighborhood size, density clustering can then be performed on the point cloud information corresponding to the multiple detection points to obtain the point cloud information corresponding to at least one object to be recognized.
The embodiments of this application do not limit the specific implementation of density clustering of the point cloud information corresponding to the multiple detection points. Optionally, the Density-Based Spatial Clustering of Applications with Noise (DBSCAN) algorithm can be used, yielding at least one point cloud subset, each subset corresponding to one object to be recognized. Optionally, for a first data point, it is judged whether the number of data points within the first data point's neighborhood is greater than or equal to a known quantity threshold; if yes, the first data point and all data points density-reachable from it are clustered into one cluster to obtain the point cloud subset to which the first data point belongs, where the first data point is any data point in the point cloud information corresponding to the multiple detection points that has not yet been clustered. Correspondingly, if the number of data points within the first data point's neighborhood is less than the known quantity threshold, the first data point can be determined to be a noise point.
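A minimal sketch of distance-adaptive density clustering: since scikit-learn's DBSCAN takes a single eps, one simple workaround (an assumption of this sketch, not a method prescribed above) is to rescale each point by its own neighborhood size before clustering with eps = 1.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_point_cloud(points, base_eps=0.3, scale=0.02, min_pts=5):
    """points: (N, 3) detection-point coordinates with the radar at the
    origin. The neighborhood grows with the radar-to-point distance,
    reflecting the dispersal of the point cloud with range; label -1
    marks noise points."""
    pts = np.asarray(points, dtype=float)
    ranges = np.linalg.norm(pts, axis=1)         # radar-to-point distances
    eps_per_point = base_eps + scale * ranges    # per-point neighborhood size
    scaled = pts / eps_per_point[:, None]        # map into unit-eps space
    return DBSCAN(eps=1.0, min_samples=min_pts).fit_predict(scaled)
```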
To make the above data processing easier to understand, an exemplary description is given below with reference to the specific embodiment shown in FIG. 2. As shown in FIG. 2, the data processing flow mainly includes the following steps (an illustrative sketch tying them together is given after the list):
201. Acquire the point cloud information of the objects to be recognized detected by the radar at the target moment.
202. From that point cloud information, calculate each object's reference observation center, reference geometric features, and first energy information at the target moment.
203. Acquire the target object's first target trajectory, weighted geometric features, and weighted energy information at the historical moment.
The target object's first target trajectory at the historical moment may be its target observation center at the historical moment, which in some embodiments may also be called the radar's weighted observation center at the historical moment.
204. Calculate the target object's gate radius r at the historical moment from its weighted geometric features at the historical moment.
205. Calculate the distance d between the reference observation center of the object to be recognized at the target moment and the target object's first target trajectory.
206. Judge whether the distance d is less than or equal to the target object's gate radius r at the historical moment. If yes, determine that the object to be recognized is the target object and go to step 207; if no, determine that the object to be recognized is not the target object.
207. Calculate the observation weighting factor from the object's first energy information, reference observation center at the target moment, and reference geometric features.
208. Calculate the target object's target observation center at the target moment based on the observation weighting factor and the object's reference observation center at the target moment.
209. Fuse the target object's target observation center at the target moment with the first target trajectory to generate the target object's second target trajectory.
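Tying steps 201 through 209 together, the glue below reuses the sketches given earlier (first_spatial_and_energy, is_target, observation_weighting_factor, target_observation_center, kalman_fuse); the track dictionary layout is illustrative only.

```python
def process_frame(objects, track):
    """objects: list of (points, energies) pairs, one per object to be
    recognized at the target moment; track: the target object's state
    at the historical moment with keys 'point', 'geom', 'energy',
    'radius', 'state', 'cov'."""
    matched = []
    for points, energies in objects:                        # steps 201-202
        center, geom, p_m = first_spatial_and_energy(points, energies)
        if is_target(center, track['point'], track['radius']):  # 205-206
            q = observation_weighting_factor(                   # step 207
                center, geom, p_m, track['point'],
                track['geom'], track['energy'], track['radius'])
            matched.append((q, center))
    if not matched:
        return track
    weights, centers = zip(*matched)
    z = target_observation_center(list(centers), list(weights))  # step 208
    track['state'], track['cov'] = kalman_fuse(                  # step 209
        track['state'], track['cov'], z)
    return track
```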
It should be noted that the steps of the method provided by the above embodiments may all be executed by the same device, or the method may be executed by different devices. For example, steps 101 and 102 may both be executed by device A; or step 101 may be executed by device A and step 102 by device B; and so on.
In addition, some of the flows described in the above embodiments and drawings contain operations that appear in a specific order, but it should be clearly understood that these operations may be executed out of the order in which they appear herein or in parallel. Sequence numbers such as 201 and 202 are merely used to distinguish different operations and do not themselves represent any execution order. Moreover, these flows may include more or fewer operations, which may be executed sequentially or in parallel.
Correspondingly, an embodiment of this application further provides a computer-readable storage medium storing computer instructions which, when executed by one or more processors, cause the one or more processors to perform actions including: acquiring first spatial information and first energy information of a target object detected by a radar at a target moment; calculating the target object's target observation center at the target moment from the first energy information and the first spatial information; and fusing the target observation center with the target object's first target trajectory at a historical moment to generate the target object's second target trajectory; wherein the first target trajectory is determined from the target object's second spatial information and second energy information at the historical moment, and the historical moment precedes the target moment.
Optionally, when the computer instructions are executed by one or more processors, they cause the one or more processors to execute the relevant steps in FIG. 1, FIG. 2, and the optional implementations above; for details, refer to the relevant content of the above embodiments, which is not repeated here.
FIG. 3 shows a data processing apparatus provided by an embodiment of this application. As shown in FIG. 3, the apparatus includes a first acquisition module 30a, a calculation module 30b, and a first fusion module 30c.
In this embodiment, the first acquisition module 30a is configured to acquire first spatial information and first energy information of a target object detected by a radar at a target moment. The calculation module 30b is configured to calculate a target observation center of the target object at the target moment according to the first energy information and the first spatial information. The first fusion module 30c is configured to fuse the target observation center with a first target trajectory of the target object at a historical moment to generate a second target trajectory of the target object. The first target trajectory is determined according to second spatial information and second energy information of the target object at the historical moment; the historical moment precedes the target moment.
Optionally, the historical moment is the detection moment closest to the target moment.
In some embodiments, when acquiring the first spatial information of the target object detected by the radar at the target moment, the first acquisition module 30a is specifically configured to calculate the target object's reference observation center and reference geometric features at the target moment from its point cloud information at the target moment, as the first spatial information of the target object.
Further, the point cloud information of the target object at the target moment contains multiple position coordinates of the target object. Correspondingly, when calculating the reference observation center at the target moment, the first acquisition module 30a is specifically configured to calculate the average of the multiple position coordinates as the reference observation center at the target moment.
Optionally, when calculating the reference geometric features at the target moment, the first acquisition module 30a is specifically configured to: determine a cuboid from the maxima and minima of the multiple position coordinates and use its side lengths as the reference geometric features; or determine an ellipsoid from those maxima and minima and use the lengths of its central axes as the reference geometric features; or determine a sphere from those maxima and minima and use its radius as the reference geometric features of the target object at the target moment.
In other embodiments, the point cloud information of the target object at the target moment contains multiple energy values of the target object at the target moment. Correspondingly, when acquiring the first energy information of the target object detected by the radar, the first acquisition module 30a is specifically configured to calculate the average of the multiple energy values as the first energy information.
In still other embodiments, when calculating the target observation center of the target object, the calculation module 30b is specifically configured to: calculate an observation weighting factor according to the first energy information, the reference observation center, and the reference geometric features; and calculate the target observation center based on the observation weighting factor and the reference observation center.
Further, when calculating the observation weighting factor, the calculation module 30b is specifically configured to: calculate the ratio of the reference geometric features to the target object's weighted geometric features at the historical moment as the geometric feature weighting factor; calculate the ratio between the first energy information and the target object's weighted energy information at the historical moment as the energy weighting factor; and calculate the observation weighting factor according to the distance weighting factor, the geometric feature weighting factor, and the energy weighting factor; the weighted geometric features and weighted energy information at the historical moment are both calculated from the second spatial information and the second energy information.
Optionally, the calculation module 30b is further configured to calculate the target object's gate radius at the historical moment according to its weighted geometric features at the historical moment.
Optionally, before calculating the target observation center at the target moment, the calculation module 30b is further configured to: calculate the reference observation centers, at the target moment, of at least one object to be recognized detected by the radar, according to the point cloud information corresponding to the at least one object at the target moment; and recognize the target object from the at least one object to be recognized according to the first target trajectory and those reference observation centers.
Further, when recognizing the target object from the at least one object to be recognized, the calculation module 30b is specifically configured to: for a first object to be recognized, judge whether the distance between its reference observation center at the target moment and the first target trajectory is less than or equal to the target object's gate radius at the historical moment; if so, determine that the first object to be recognized is the target object; the first object to be recognized is any one of the at least one object to be recognized.
Optionally, before calculating the reference observation centers of the at least one object to be recognized, the calculation module 30b is further configured to: acquire the distances between the radar and the multiple detection points it detects at the target moment, and the point cloud information corresponding to those detection points; determine the neighborhood size of that point cloud information according to the distances; and, based on the neighborhood size, perform density clustering on the point cloud information corresponding to the multiple detection points to obtain the point cloud information corresponding to the at least one object to be recognized at the target moment.
In other embodiments, if multiple of the at least one object to be recognized are determined to be the target object, then when calculating the observation weighting factor the calculation module 30b is specifically configured to: for a first target object, calculate the reference observation weighting factors of the multiple objects determined to be the target object, according to each object's first energy information, reference observation center, and reference geometric features; and normalize the reference observation weighting factor of the first target object using the reference observation weighting factors of the multiple objects, to obtain the observation weighting factor of the first target object; the first target object is any one of the multiple objects.
Further, the data processing apparatus also includes a second acquisition module 30d, a prediction module 30e, and a second fusion module 30f. Optionally, if the first object to be recognized is neither the target object nor any of the other objects determined at the historical moment, the second acquisition module 30d is configured to acquire the kinematic parameters of the radar-equipped device at the moment the radar detects the first object to be recognized. Correspondingly, the prediction module 30e is configured to predict the trajectory information of the first object at subsequent moments according to those kinematic parameters. The second fusion module 30f is configured to perform trajectory fusion between an unknown object and the first object according to the predicted trajectory information and the reference observation center of the unknown object detected by the radar at a subsequent moment.
Optionally, the kinematic parameters of the radar-equipped device when the radar detects the first object to be recognized include at least one of the device's position information, acceleration information, velocity information, and movement direction at that moment.
Further, in some embodiments, the data processing apparatus may also include a guidance module 30g, configured to guide the radar-equipped device to track the target object according to the second target trajectory.
In another embodiment, the data processing apparatus may also include a restoration module 30h, configured to restore the working environment of the radar-equipped device according to the second target trajectory and the target object's other target trajectories before the second target trajectory.
The data processing apparatus provided in this embodiment can calculate the target observation center of a target object from the target object's spatial information and energy information, and fuse that observation center with the target object's target trajectory at a historical moment to generate a new target trajectory. Combining the spatial information and energy information of the target object to determine its target trajectory helps improve the accuracy of the trajectory, which in turn helps improve the accuracy of subsequent work that uses it; for example, during target tracking it helps improve the accuracy of locating the target object and hence of the tracking itself.
FIG. 4 is a schematic structural diagram of a radar provided by an embodiment of this application. As shown in FIG. 4, the radar includes a memory 40a and a processor 40b. In this embodiment, the memory 40a is configured to store a computer program and a first target trajectory of a target object at a historical moment; the first target trajectory is determined according to second spatial information and second energy information of the target object at the historical moment.
In this embodiment, the processor is coupled to the memory and configured to execute the computer program to: acquire first spatial information and first energy information of the target object detected by the radar at a target moment, the historical moment preceding the target moment; calculate the target object's target observation center at the target moment according to the first energy information and the first spatial information; and fuse the target observation center with the first target trajectory to generate a second target trajectory of the target object.
Optionally, the historical moment is the detection moment closest to the target moment.
In some embodiments, when acquiring the first spatial information of the target object detected by the radar at the target moment, the processor 40b is specifically configured to calculate the target object's reference observation center and reference geometric features at the target moment from its point cloud information at the target moment, as the first spatial information.
Further, the point cloud information of the target object at the target moment contains multiple position coordinates of the target object. Correspondingly, when calculating the reference observation center at the target moment, the processor 40b is specifically configured to calculate the average of the multiple position coordinates as the reference observation center.
Optionally, when calculating the reference geometric features at the target moment, the processor 40b is specifically configured to: determine a cuboid from the maxima and minima of the multiple position coordinates and use its side lengths as the reference geometric features; or determine an ellipsoid from those maxima and minima and use the lengths of its central axes as the reference geometric features; or determine a sphere from those maxima and minima and use its radius as the reference geometric features of the target object at the target moment.
In other embodiments, the point cloud information contains multiple energy values of the target object at the target moment, and when acquiring the first energy information the processor 40b is specifically configured to calculate the average of the multiple energy values as the first energy information.
In still other embodiments, when calculating the target observation center of the target object, the processor 40b is specifically configured to: calculate an observation weighting factor according to the first energy information, the reference observation center, and the reference geometric features; and calculate the target observation center based on the observation weighting factor and the reference observation center.
Further, when calculating the observation weighting factor, the processor 40b is specifically configured to: calculate the ratio of the reference geometric features to the target object's weighted geometric features at the historical moment as the geometric feature weighting factor; calculate the ratio between the first energy information and the target object's weighted energy information at the historical moment as the energy weighting factor; and calculate the observation weighting factor according to the distance weighting factor, the geometric feature weighting factor, and the energy weighting factor; the weighted geometric features and weighted energy information at the historical moment are both calculated from the second spatial information and the second energy information.
Optionally, the processor 40b is further configured to calculate the target object's gate radius at the historical moment according to its weighted geometric features at the historical moment.
Optionally, before calculating the target observation center at the target moment, the processor 40b is further configured to: calculate the reference observation centers, at the target moment, of at least one object to be recognized detected by the radar, according to the point cloud information corresponding to the at least one object at the target moment; and recognize the target object from the at least one object to be recognized according to the first target trajectory and those reference observation centers.
Further, when recognizing the target object from the at least one object to be recognized, the processor 40b is specifically configured to: for a first object to be recognized, judge whether the distance between its reference observation center at the target moment and the first target trajectory is less than or equal to the target object's gate radius at the historical moment; if so, determine that the first object is the target object; the first object to be recognized is any one of the at least one object to be recognized.
Optionally, before calculating the reference observation centers of the at least one object to be recognized, the processor 40b is further configured to: acquire the distances between the radar and the multiple detection points it detects at the target moment and the corresponding point cloud information; determine the neighborhood size of that point cloud information from the distances; and perform density clustering based on the neighborhood size to obtain the point cloud information corresponding to the at least one object at the target moment.
In other embodiments, if multiple of the at least one object to be recognized are determined to be the target object, then when calculating the observation weighting factor the processor 40b is specifically configured to: for a first target object, calculate the reference observation weighting factors of the multiple objects from each object's first energy information, reference observation center, and reference geometric features; and normalize the first target object's reference observation weighting factor using those of the multiple objects, obtaining its observation weighting factor; the first target object is any one of the multiple objects.
Further, if the first object to be recognized is neither the target object nor any of the other objects determined at the historical moment, the processor 40b is further configured to: acquire the kinematic parameters of the radar-equipped device at the moment the radar detects the first object; predict the first object's trajectory information at subsequent moments from those parameters; and perform trajectory fusion between an unknown object and the first object according to the predicted trajectory information and the reference observation center of the unknown object detected at a subsequent moment.
Optionally, the kinematic parameters of the radar-equipped device when the radar detects the first object to be recognized include at least one of the device's position information, acceleration information, velocity information, and movement direction at that moment.
Further, in some embodiments, the processor 40b is also configured to: guide the radar-equipped device to track the target object according to the second target trajectory; or restore the working environment of the radar-equipped device according to the second target trajectory and the target object's other target trajectories before it.
In some optional implementations, as shown in FIG. 4, the radar may further include optional components such as a communication component 40c, a horizontal angle detection device 40d, an electrical scanning angle measurement device 40e, and a power supply component 40f. FIG. 4 schematically shows only some components, which neither means the radar must include all components shown in FIG. 4 nor that it can include only those components.
In this embodiment, the communication component 40c is used to transmit the detection signal and receive the echo signal reflected back when the detection signal reaches a detection point. Optionally, as shown in FIG. 4, the communication component 40c may include a transmitter 40c1, a receiver 40c2, an antenna 40c3, and so on, but is not limited thereto. The functions and implementation forms of the transmitter 40c1, receiver 40c2, and antenna 40c3 are common knowledge in the art and are not repeated here.
In this embodiment, the horizontal angle detection device 40d can measure the radar's rotational position; the electrical scanning angle measurement device 40e can measure the deviation angle of the target relative to the axial direction of the transmitter 40c1. For the arrangement of the horizontal angle detection device 40d, refer to the relevant content of the above embodiments, which is not repeated here. Optionally, the horizontal angle detection device 40d includes a photoelectric sensor 40d1 and a grating disk 40d2, whose arrangement, implementation form, and working principle are common knowledge in the art and are not repeated here.
The radar provided in this embodiment can calculate the target observation center of a target object from the target object's spatial information and energy information, and fuse that observation center with the target object's target trajectory at a historical moment to generate a new target trajectory. Combining the spatial information and energy information of the target object to determine its target trajectory helps improve the accuracy of the trajectory, which in turn helps improve the accuracy of subsequent work that uses it; for example, during target tracking it helps improve the accuracy of locating the target object and hence of the tracking itself.
FIG. 5 is a schematic structural diagram of a detection device provided by an embodiment of this application. As shown in FIG. 5, the detection device includes a memory 50a and a processor 50b, and is also equipped with a radar 50c.
In this embodiment, the radar 50c is used to acquire first spatial information and first energy information of a target object detected at a target moment.
The memory 50a stores a computer program and a first target trajectory of the target object at a historical moment; the first target trajectory is determined according to second spatial information and second energy information of the target object at the historical moment; the historical moment precedes the target moment.
The processor 50b is coupled to the memory 50a and configured to execute the computer program to: calculate the target object's target observation center at the target moment according to the first energy information and the first spatial information; and fuse the target observation center with the first target trajectory to generate a second target trajectory of the target object.
Optionally, the historical moment is the detection moment closest to the target moment.
In some embodiments, when acquiring the first spatial information of the target object detected at the target moment, the radar 50c is specifically configured to calculate the target object's reference observation center and reference geometric features at the target moment from its point cloud information at the target moment, as the first spatial information.
Further, the point cloud information of the target object at the target moment contains multiple position coordinates of the target object. Correspondingly, when calculating the reference observation center, the radar 50c is specifically configured to calculate the average of the multiple position coordinates as the reference observation center at the target moment.
Optionally, when calculating the reference geometric features at the target moment, the radar 50c is specifically configured to: determine a cuboid from the maxima and minima of the multiple position coordinates and use its side lengths as the reference geometric features; or determine an ellipsoid from those maxima and minima and use the lengths of its central axes as the reference geometric features; or determine a sphere from those maxima and minima and use its radius as the reference geometric features of the target object at the target moment.
In other embodiments, the point cloud information contains multiple energy values of the target object at the target moment, and when acquiring the first energy information the radar 50c is specifically configured to calculate the average of the multiple energy values as the first energy information.
In still other embodiments, when calculating the target observation center of the target object, the processor 50b is specifically configured to: calculate an observation weighting factor according to the first energy information, the reference observation center, and the reference geometric features; and calculate the target observation center based on the observation weighting factor and the reference observation center.
Further, when calculating the observation weighting factor, the processor 50b is specifically configured to: calculate the ratio of the reference geometric features to the target object's weighted geometric features at the historical moment as the geometric feature weighting factor; calculate the ratio between the first energy information and the target object's weighted energy information at the historical moment as the energy weighting factor; and calculate the observation weighting factor according to the distance weighting factor, the geometric feature weighting factor, and the energy weighting factor; the weighted geometric features and weighted energy information at the historical moment are both calculated from the second spatial information and the second energy information.
Optionally, the processor 50b is further configured to calculate the target object's gate radius at the historical moment according to its weighted geometric features at the historical moment.
Optionally, before calculating the target observation center at the target moment, the processor 50b is further configured to: calculate the reference observation centers, at the target moment, of at least one object to be recognized detected by the radar, according to the point cloud information corresponding to the at least one object at the target moment; and recognize the target object from the at least one object to be recognized according to the first target trajectory and those reference observation centers.
Further, when recognizing the target object from the at least one object to be recognized, the processor 50b is specifically configured to: for a first object to be recognized, judge whether the distance between its reference observation center at the target moment and the first target trajectory is less than or equal to the target object's gate radius at the historical moment; if so, determine that the first object is the target object; the first object to be recognized is any one of the at least one object to be recognized.
Optionally, before calculating the reference observation centers of the at least one object to be recognized, the processor 50b is further configured to: acquire the distances between the radar and the multiple detection points it detects at the target moment and the corresponding point cloud information; determine the neighborhood size of that point cloud information from the distances; and perform density clustering based on the neighborhood size to obtain the point cloud information corresponding to the at least one object at the target moment.
In other embodiments, if multiple of the at least one object to be recognized are determined to be the target object, then when calculating the observation weighting factor the processor 50b is specifically configured to: for a first target object, calculate the reference observation weighting factors of the multiple objects from each object's first energy information, reference observation center, and reference geometric features; and normalize the first target object's reference observation weighting factor using those of the multiple objects, obtaining its observation weighting factor; the first target object is any one of the multiple objects.
Further, if the first object to be recognized is neither the target object nor any of the other objects determined at the historical moment, the processor 50b is further configured to: acquire the kinematic parameters of the radar-equipped device at the moment the radar detects the first object; predict the first object's trajectory information at subsequent moments from those parameters; and perform trajectory fusion between an unknown object and the first object according to the predicted trajectory information and the reference observation center of the unknown object detected at a subsequent moment.
Optionally, the kinematic parameters of the radar-equipped device when the radar detects the first object to be recognized include at least one of the device's position information, acceleration information, velocity information, and movement direction at that moment.
Further, in some embodiments, the processor 50b is also configured to: guide the radar-equipped device to track the target object according to the second target trajectory; or restore the working environment of the radar-equipped device according to the second target trajectory and the target object's other target trajectories before it.
Optionally, the detection device is an unmanned aerial vehicle, an unmanned vehicle, a robot, or a ship, etc., but is not limited thereto.
In some optional implementations, as shown in FIG. 5, the detection device may further include optional components such as a power supply component 50d, a communication component 50f, a driving component 50g, a display component 50h, an audio component 50i, or one or more sensors 50j. Depending on the implementation form of the detection device, the other components it includes differ; those components are common knowledge in the field to which the detection device belongs and are not repeated here. FIG. 5 schematically shows only some components, which neither means the detection device must include all components shown in FIG. 5 nor that it can include only those components.
The detection device provided by the embodiments of this application includes a radar and a processor. Cooperating with each other, they can calculate the target observation center of a target object from the target object's spatial information and energy information, and fuse that observation center with the target object's target trajectory at a historical moment to generate a new target trajectory. Combining the spatial information and energy information of the target object to determine its target trajectory helps improve the accuracy of the trajectory, which in turn helps improve the accuracy of subsequent work that uses it; for example, during target tracking it helps improve the accuracy of locating the target object and hence of the tracking itself.
FIG. 6 is a schematic structural diagram of a mobile device provided by an embodiment of this application. As shown in FIG. 6, the mobile device is equipped with a radar 60a. The radar 60a is used to: acquire first spatial information and first energy information of a target object detected at a target moment; calculate the target object's target observation center at the target moment according to the first energy information and the first spatial information; and fuse the target observation center with the target object's first target trajectory at a historical moment to generate a second target trajectory of the target object; the first target trajectory is determined according to second spatial information and second energy information of the target object at the historical moment; the historical moment precedes the target moment.
For the specific implementation of how the radar 60a acquires the target object's first spatial information and first energy information, calculates the target observation center at the target moment, and fuses it with the first target trajectory at the historical moment, as well as for the specific form and structure of the radar 60a, refer to the relevant content of the above embodiments, which is not repeated here.
In some embodiments, the mobile device further includes a processor 60b. Optionally, the radar 60a may provide the target object's second target trajectory and the other target trajectories before it to the processor 60b. Correspondingly, the processor 60b can guide the mobile device to track the target object according to the second target trajectory, or restore the motion environment of the mobile device according to the second target trajectory and the target object's other trajectories before it.
In some optional implementations, as shown in FIG. 6, the mobile device may further include optional components such as a memory 60c, a power supply component 60d, a communication component 60f, a driving component 60g, a display component 60h, an audio component 60i, or one or more sensors 60j. Depending on the implementation form of the mobile device, the other components it includes differ; those components are common knowledge in the field to which the mobile device belongs and are not repeated here. FIG. 6 schematically shows only some components, which neither means the mobile device must include all components shown in FIG. 6 nor that it can include only those components.
The mobile device provided in this embodiment is equipped with a radar, which can calculate the target observation center of a target object from the target object's spatial information and energy information, and fuse that observation center with the target object's target trajectory at a historical moment to generate a new target trajectory. Combining the spatial information and energy information of the target object to determine its target trajectory helps improve the accuracy of the trajectory, which in turn helps improve the accuracy of subsequent work that uses it; for example, during target tracking it helps improve the accuracy of locating the target object and hence of the tracking itself.
In the embodiments of this application, the memory is used to store a computer program and can be configured to store various other data to support operations on the device where it is located. The processor can execute the computer program stored in the memory to realize the corresponding control logic. The memory can be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, or a magnetic or optical disk.
In the embodiments of this application, the processor may be any hardware processing device capable of executing the logic of the above method. Optionally, the processor may be a central processing unit (CPU), a graphics processing unit (GPU), or a microcontroller unit (MCU); it may also be a programmable device such as a field-programmable gate array (FPGA), a programmable array logic device (PAL), a general array logic device (GAL), or a complex programmable logic device (CPLD); or an advanced RISC machine (ARM) processor or a system on chip (SoC), etc., but is not limited thereto.
In the embodiments of this application, the communication component can also be configured to facilitate wired or wireless communication between its device and other devices. The device where it is located can access wireless networks based on communication standards, such as WiFi, 2G or 3G, 4G, 5G, or a combination thereof. In an exemplary embodiment, the communication component receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component may also be implemented based on near field communication (NFC), radio frequency identification (RFID), Infrared Data Association (IrDA), ultra-wideband (UWB), Bluetooth (BT), or other technologies.
In the embodiments of this application, the display component may include a liquid crystal display (LCD) and a touch panel (TP). If the display component includes a touch panel, it may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the panel; the touch sensor may sense not only the boundary of a touch or slide action but also the duration and pressure associated with it.
In this embodiment, the power supply component is configured to provide power to the various components of the device where it is located. It may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device where the power supply component is located.
In this embodiment, the audio component may be configured to output and/or input audio signals. For example, the audio component includes a microphone (MIC); when the device where the audio component is located is in an operating mode such as a call mode, a recording mode, or a voice recognition mode, the microphone is configured to receive external audio signals, which may be further stored in the memory or sent via the communication component. In some embodiments, the audio component also includes a speaker for outputting audio signals. For example, for a device with a voice interaction function, voice interaction with the user can be realized through the audio component.
It should be noted that descriptions such as "first" and "second" herein are used to distinguish different messages, devices, modules, etc.; they neither represent an order nor restrict "first" and "second" to different types.
Those skilled in the art should understand that the embodiments of the present invention may be provided as a method, a system, or a computer program product. Therefore, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions can be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor, or other programmable data processing device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for realizing the functions specified in one or more flows of a flowchart and/or one or more blocks of a block diagram.
These computer program instructions can also be stored in a computer-readable memory that can direct a computer or other programmable data processing device to work in a specific manner, such that the instructions stored in that memory produce an article of manufacture including an instruction apparatus that realizes the functions specified in one or more flows of a flowchart and/or one or more blocks of a block diagram.
These computer program instructions can also be loaded onto a computer or other programmable data processing device, such that a series of operational steps are executed on the computer or other programmable device to produce computer-implemented processing, so that the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of a flowchart and/or one or more blocks of a block diagram.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, a network interface, and memory.
The memory may include non-persistent memory in computer-readable media, random access memory (RAM), and/or non-volatile memory such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include persistent and non-persistent, removable and non-removable media, and information storage can be realized by any method or technology. The information can be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprise" and "include" or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device that includes a list of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or device that includes the element.
The above descriptions are merely embodiments of this application and are not intended to limit it. Various modifications and changes may be made to this application by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of this application shall be included within the scope of the claims of this application.

Claims (23)

  1. A data processing method, characterized by comprising:
    acquiring first spatial information and first energy information of a target object detected by a radar at a target moment;
    calculating a target observation center of the target object at the target moment according to the first energy information and the first spatial information;
    fusing the target observation center with a first target trajectory of the target object at a historical moment to generate a second target trajectory of the target object;
    wherein the first target trajectory is determined according to second spatial information and second energy information of the target object at the historical moment; the historical moment precedes the target moment.
  2. The method according to claim 1, wherein the acquiring first spatial information of a target object detected by a radar at a target moment comprises:
    calculating a reference observation center and reference geometric features of the target object at the target moment according to point cloud information of the target object at the target moment, as the first spatial information of the target object.
  3. The method according to claim 2, wherein the point cloud information of the target object at the target moment comprises multiple position coordinates of the target object; and the calculating a reference observation center of the target object at the target moment according to the point cloud information of the target object comprises:
    calculating the average of the multiple position coordinates as the reference observation center of the target object at the target moment.
  4. The method according to claim 2, wherein the point cloud information of the target object at the target moment comprises multiple position coordinates of the target object at the target moment; and the calculating reference geometric features of the target object at the target moment according to the point cloud information of the target object comprises:
    determining a cuboid according to the maxima and minima of the multiple position coordinates, and using the side lengths of the cuboid as the reference geometric features of the target object at the target moment;
    or,
    determining an ellipsoid according to the maxima and minima of the multiple position coordinates, and using the lengths of the central axes of the ellipsoid as the reference geometric features of the target object at the target moment;
    or,
    determining a sphere according to the maxima and minima of the multiple position coordinates, and using the radius of the sphere as the reference geometric features of the target object at the target moment.
  5. The method according to claim 2, wherein the point cloud information of the target object at the target moment comprises multiple energy values of the target object at the target moment; and the acquiring first energy information of the target object detected by the radar comprises:
    calculating the average of the multiple energy values as the first energy information.
  6. The method according to claim 2, wherein the calculating a target observation center of the target object at the target moment according to the first energy information and the first spatial information comprises:
    calculating an observation weighting factor according to the first energy information, the reference observation center, and the reference geometric features;
    calculating the target observation center of the target object at the target moment based on the observation weighting factor and the reference observation center.
  7. The method according to claim 6, wherein the calculating an observation weighting factor according to the first energy information, the reference observation center, and the reference geometric features comprises:
    calculating the ratio of the distance between the reference observation center and the first target trajectory to the gate radius of the target object at the historical moment, as a distance weighting factor;
    calculating the ratio of the reference geometric features to the weighted geometric features of the target object at the historical moment, as a geometric feature weighting factor;
    calculating the ratio between the first energy information and the weighted energy information of the target object at the historical moment, as an energy weighting factor;
    calculating the observation weighting factor according to the distance weighting factor, the geometric feature weighting factor, and the energy weighting factor;
    wherein the weighted geometric features and the weighted energy information of the target object at the historical moment are both calculated according to the second spatial information and the second energy information.
  8. The method according to claim 7, further comprising:
    calculating the gate radius of the target object at the historical moment according to the weighted geometric features of the target object at the historical moment.
  9. The method according to claim 6, before calculating the target observation center of the target object at the target moment according to the first energy information and the first spatial information, further comprising:
    calculating reference observation centers, at the target moment, of at least one object to be recognized detected by the radar, according to point cloud information corresponding to the at least one object to be recognized at the target moment;
    recognizing the target object from the at least one object to be recognized according to the first target trajectory and the reference observation centers of the at least one object to be recognized at the target moment.
  10. The method according to claim 9, before calculating the reference observation centers of the at least one object to be recognized detected by the radar according to the point cloud information corresponding to the at least one object to be recognized at the target moment, further comprising:
    acquiring the distances between the radar and multiple detection points detected by the radar at the target moment, and the point cloud information corresponding to the multiple detection points;
    determining the neighborhood size of the point cloud information corresponding to the multiple detection points according to the distances between the radar and the multiple detection points at the target moment;
    performing density clustering on the point cloud information corresponding to the multiple detection points based on the neighborhood size, to obtain the point cloud information corresponding to the at least one object to be recognized at the target moment.
  11. The method according to claim 9, wherein the recognizing the target object from the at least one object to be recognized according to the first target trajectory and the reference observation centers of the at least one object to be recognized at the target moment comprises:
    for a first object to be recognized, judging whether the distance between the reference observation center of the first object to be recognized at the target moment and the first target trajectory is less than or equal to the gate radius of the target object at the historical moment;
    if the judgment result is yes, determining that the first object to be recognized is the target object;
    wherein the first object to be recognized is any one of the at least one object to be recognized.
  12. The method according to claim 11, wherein, if multiple objects among the at least one object to be recognized are determined to be the target object, the calculating an observation weighting factor according to the first energy information, the reference observation center, and the reference geometric features comprises:
    for a first target object, calculating the reference observation weighting factors of the multiple objects determined to be the target object among the at least one object to be recognized, according to each of those objects' first energy information, reference observation center, and reference geometric features;
    normalizing the reference observation weighting factor of the first target object using the reference observation weighting factors of the multiple objects, to obtain the observation weighting factor of the first target object;
    wherein the first target object is any one of the multiple objects.
  13. The method according to claim 11, further comprising:
    if the first object to be recognized is neither the target object nor any of the other objects determined at the historical moment, acquiring the kinematic parameters of the device equipped with the radar at the moment the radar detects the first object to be recognized;
    predicting trajectory information of the first object to be recognized at subsequent moments according to the kinematic parameters of the device equipped with the radar at the moment the radar detects the first object to be recognized;
    performing trajectory fusion between an unknown object and the first object to be recognized according to the predicted trajectory information of the first object to be recognized at subsequent moments and the reference observation center of the unknown object detected by the radar at a subsequent moment.
  14. The method according to claim 13, wherein the kinematic parameters of the device equipped with the radar at the moment the radar detects the first object to be recognized comprise at least one of the device's position information, acceleration information, velocity information, and movement direction at that moment.
  15. The method according to any one of claims 1-14, wherein the historical moment is the detection moment closest to the target moment.
  16. The method according to any one of claims 1-14, further comprising performing at least one of the following operations:
    guiding the device equipped with the radar to track the target object according to the second target trajectory;
    or,
    restoring the working environment of the device equipped with the radar according to the second target trajectory and the target object's other target trajectories before the second target trajectory.
  17. A data processing apparatus, characterized by comprising a first acquisition module, a calculation module, and a first fusion module;
    wherein the first acquisition module is configured to acquire first spatial information and first energy information of a target object detected by a radar at a target moment;
    the calculation module is configured to calculate a target observation center of the target object at the target moment according to the first energy information and the first spatial information;
    the first fusion module is configured to fuse the target observation center with a first target trajectory of the target object at a historical moment to generate a second target trajectory of the target object;
    wherein the first target trajectory is determined according to second spatial information and second energy information of the target object at the historical moment; the historical moment precedes the target moment.
  18. A radar, characterized by comprising a memory and a processor; wherein the memory is configured to store a computer program and a first target trajectory of a target object at a historical moment; the first target trajectory is determined according to second spatial information and second energy information of the target object at the historical moment;
    the processor is coupled to the memory and configured to execute the computer program to:
    acquire first spatial information and first energy information of the target object detected by the radar at a target moment; wherein the historical moment precedes the target moment;
    calculate a target observation center of the target object at the target moment according to the first energy information and the first spatial information;
    fuse the target observation center with the first target trajectory to generate a second target trajectory of the target object.
  19. A detection device, characterized in that it is equipped with a radar and comprises a memory and a processor; wherein the radar is configured to acquire first spatial information and first energy information of a target object detected at a target moment;
    the memory is configured to store a computer program and a first target trajectory of the target object at a historical moment; the first target trajectory is determined according to second spatial information and second energy information of the target object at the historical moment; the historical moment precedes the target moment;
    the processor is coupled to the memory and configured to execute the computer program to: calculate a target observation center of the target object at the target moment according to the first energy information and the first spatial information; and fuse the target observation center with the first target trajectory to generate a second target trajectory of the target object.
  20. A mobile device, characterized in that it is equipped with a radar; the radar is configured to: acquire first spatial information and first energy information of a target object detected at the target moment; calculate a target observation center of the target object at the target moment according to the first energy information and the first spatial information; and fuse the target observation center with a first target trajectory of the target object at a historical moment to generate a second target trajectory of the target object; wherein the first target trajectory is determined according to second spatial information and second energy information of the target object at the historical moment; the historical moment precedes the target moment.
  21. The mobile device according to claim 20, characterized by further comprising a memory and a processor; wherein the memory is configured to store a computer program;
    the radar is further configured to provide the second target trajectory to the processor;
    the processor is coupled to the memory and configured to execute the computer program to: guide the mobile device to track the target object according to the second target trajectory; or restore the motion environment of the mobile device according to the second target trajectory and the target object's other trajectories before the second target trajectory.
  22. The device according to claim 20 or 21, wherein the mobile device is an unmanned aerial vehicle, an unmanned vehicle, a robot, or a ship.
  23. A computer-readable storage medium storing computer instructions, characterized in that, when executed by one or more processors, the computer instructions cause the one or more processors to perform actions including:
    acquiring first spatial information and first energy information of a target object detected by a radar at a target moment;
    calculating a target observation center of the target object at the target moment according to the first energy information and the first spatial information;
    fusing the target observation center with a first target trajectory of the target object at a historical moment to generate a second target trajectory of the target object;
    wherein the first target trajectory is determined according to second spatial information and second energy information of the target object at the historical moment; the historical moment precedes the target moment.
PCT/CN2019/115814 2019-11-05 2019-11-05 Data processing method and apparatus, radar, device, and storage medium WO2021087777A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2019/115814 WO2021087777A1 (zh) 2019-11-05 2019-11-05 Data processing method and apparatus, radar, device, and storage medium
CN201980040012.8A CN112334946A (zh) 2019-11-05 2019-11-05 Data processing method and apparatus, radar, device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/115814 WO2021087777A1 (zh) 2019-11-05 2019-11-05 Data processing method and apparatus, radar, device, and storage medium

Publications (1)

Publication Number Publication Date
WO2021087777A1 true WO2021087777A1 (zh) 2021-05-14

Family

ID=74319384

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/115814 WO2021087777A1 (zh) 2019-11-05 2019-11-05 数据处理方法、装置、雷达、设备及存储介质

Country Status (2)

Country Link
CN (1) CN112334946A (zh)
WO (1) WO2021087777A1 (zh)



Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107709928A (zh) * 2015-04-10 2018-02-16 European Atomic Energy Community, represented by the European Commission Method and device for real-time mapping and localization
CN108549084A (zh) * 2018-01-30 2018-09-18 Xi'an Jiaotong University Target detection and pose estimation method based on sparse 2D lidar
CN109581312A (zh) * 2018-11-22 2019-04-05 Kunshan Innovation Institute of Xidian University Multi-target clustering method for high-resolution millimeter-wave radar
CN110210389A (zh) * 2019-05-31 2019-09-06 Southeast University Multi-target recognition and tracking method for road traffic scenes

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
GAO, DI: "Research on Target Detection and Tracking Algorithm for UAV Obstacle Avoidance Radar", HARBIN INSTITUTE OF TECHNOLOGY MASTER'S DEGREE THESIS, 31 December 2017 (2017-12-31), pages 1 - 86, XP055810312 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113208566A (zh) * 2021-05-17 2021-08-06 Data processing method and apparatus, electronic device, and storage medium
CN113208566B (zh) * 2021-05-17 2023-06-23 Data processing method and apparatus, electronic device, and storage medium
CN113325383A (zh) * 2021-06-17 2021-08-31 Adaptive vehicle-mounted millimeter-wave radar clustering algorithm and device based on grids and DBSCAN
CN113325383B (zh) * 2021-06-17 2023-07-04 Adaptive vehicle-mounted millimeter-wave radar clustering algorithm and device based on grids and DBSCAN
CN113504796A (zh) * 2021-09-07 2021-10-15 UAV trajectory processing method and apparatus, electronic device, and storage medium

Also Published As

Publication number Publication date
CN112334946A (zh) 2021-02-05

Similar Documents

Publication Publication Date Title
US11987250B2 (en) Data fusion method and related device
US11461915B2 (en) Object size estimation using camera map and/or radar information
WO2021087777A1 (zh) 数据处理方法、装置、雷达、设备及存储介质
WO2020147485A1 (zh) 一种信息处理方法、系统、设备和计算机存储介质
KR102032070B1 (ko) 깊이 맵 샘플링을 위한 시스템 및 방법
WO2018222889A1 (en) Collision prediction system
CN111932943B (zh) 动态目标的检测方法、装置、存储介质及路基监测设备
US11587445B2 (en) System and method for fusing asynchronous sensor tracks in a track fusion application
CN112171675B (zh) 一种移动机器人的避障方法、装置、机器人及存储介质
EP3979156A1 (en) Work analysis system, work analysis device, and work analysis program
WO2021087760A1 (zh) 目标检测方法、雷达、设备及存储介质
JP7418476B2 (ja) 運転可能な領域情報を決定するための方法及び装置
Mendez et al. Automatic label creation framework for fmcw radar images using camera data
US20230273308A1 (en) Sensor based object detection
US11280899B2 (en) Target recognition from SAR data using range profiles and a long short-term memory (LSTM) network
CN115131756A (zh) 一种目标检测方法及装置
CN114330726A (zh) 追踪定位方法、装置、电子设备及存储介质
CN115331214A (zh) 用于目标检测的感知方法及系统
Ronecker et al. Dynamic Occupancy Grids for Object Detection: A Radar-Centric Approach
WO2023193567A1 (zh) 机器人的移动控制方法和装置、存储介质及电子装置
US20230215277A1 (en) System and method for fusing asynchronous sensor tracks in a track fusion application
CN112749504B (zh) 仿真扫描点的获取方法、装置、电子设备及存储介质
KR102649303B1 (ko) 전자 장치 및 그의 다중로봇 탐사영역 관리 방법
KR102669876B1 (ko) 국지적 분해능을 조정하는 레이더 데이터 처리 장치 및 방법
Danylova et al. AUTOMATED NAVIGATION FOR UNMANNED GROUND VEHICLES IN LOGISTICS.

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19951540

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19951540

Country of ref document: EP

Kind code of ref document: A1