CN112836628A - Method and device for processing adhesion moving points - Google Patents


Info

Publication number
CN112836628A
CN112836628A (application CN202110136517.3A)
Authority
CN
China
Prior art keywords
target
point cloud
detection
detection target
point
Prior art date
Legal status
Granted
Application number
CN202110136517.3A
Other languages
Chinese (zh)
Other versions
CN112836628B (en)
Inventor
蒲晓波
廖瑞军
陈富
潘米样
陈建桦
邹万里
Current Assignee
Enno Electronics Co ltd
Original Assignee
Enno Electronics Co ltd
Priority date
Filing date
Publication date
Application filed by Enno Electronics Co ltd
Priority to CN202110136517.3A
Publication of CN112836628A
Application granted
Publication of CN112836628B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/35 - Categorising the entire scene, e.g. birthday party or wedding scene
    • G06V20/36 - Indoor scenes
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/20 - Analysis of motion
    • G06T7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/70 - Determining position or orientation of objects or cameras
    • G06T7/73 - Determining position or orientation of objects or cameras using feature-based methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/20 - Image preprocessing
    • G06V10/25 - Determination of region of interest [ROI] or a volume of interest [VOI]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10028 - Range image; Depth image; 3D point clouds
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10032 - Satellite or aerial image; Remote sensing
    • G06T2207/10044 - Radar image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 - Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07 - Target detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention provides a method and a device for processing adhesion moving points. The method includes: acquiring current frame point cloud data and historical frame point cloud data obtained by millimeter-wave radar detection, and determining the detection targets currently present in a detection area. When there are at least two detection targets, target adhesion may occur, so at least one adhesion moving point can be determined from the moving points corresponding to the current frame point cloud data according to the detection targets currently present in the detection area. For each adhesion moving point, the first detection target and the second detection target with the smallest distances to that point are determined, and the point is then processed according to the first distance between the point and the first detection target, the second distance between the point and the second detection target, and the vector angle formed at the point by the first and second detection targets. Processing of the adhesion moving points is thereby achieved.

Description

Method and device for processing adhesion moving points
Technical Field
The invention relates to the technical field of computers, and in particular to a method and a device for processing adhesion moving points.
Background
Indoor moving-target detection is an important basic perception technology in smart cities: by detecting the movement tracks of indoor moving targets, intelligent control of related smart devices can be realized.
In the related art, millimeter-wave radar may be used to detect moving targets indoors. During detection, the radar outputs point cloud data for the detection area; the moving points corresponding to the point cloud data are assigned to the detection targets, and the assigned moving points are used to derive each target's movement track.
However, when multiple detection targets in the detection area are close to each other, an intersection region of moving points arises between them. The moving points in this intersection region are adhesion moving points, and how to process them is a problem that urgently needs to be solved.
Disclosure of Invention
The embodiments of the invention provide a method and a device for processing adhesion moving points, so that adhesion moving points can be handled correctly.
In a first aspect, an embodiment of the present invention provides a method for processing adhesion moving points, including:
acquiring current frame point cloud data, corresponding to a plurality of active points, obtained by a millimeter wave radar detecting a detection area;
determining a detection target currently existing in the detection area according to the current frame point cloud data and stored historical frame point cloud data; the number of the detection targets is at least two;
determining at least one adhesion moving point from the plurality of moving points according to a detection target currently existing in the detection area;
for each adhesion activity point, executing:
determining a first detection target and a second detection target which have the minimum distance with the adhesion moving point from the detection targets currently existing in the detection area;
determining a first distance between the adhesion activity point and the first detection target, and determining a second distance between the adhesion activity point and the second detection target;
determining a vector included angle formed by the adhesion moving point and the first detection target and the second detection target;
and processing the adhesion moving point according to the first distance, the second distance and the vector included angle.
Preferably, the determining a detection target currently existing in the detection area according to the current frame point cloud data and the stored historical frame point cloud data includes:
determining, according to the historical frame point cloud data, the detection targets that existed in the detection area before the current frame point cloud data was generated, and taking these as the detection targets currently existing in the detection area;
determining whether a newly added moving target exists in the detection area or not according to the current frame point cloud data and the historical frame point cloud data, and if so, adding the newly added moving target serving as a detection target to the detection target existing in the detection area;
determining whether a newly added static target exists in the detection area or not according to the current frame point cloud data and the historical frame point cloud data, and if so, adding the newly added static target serving as a detection target to the detection target existing in the detection area;
determining whether a reduced moving target exists in the detection area or not according to the current frame point cloud data and the historical frame point cloud data, and if so, deleting the reduced moving target from a detection target existing in the detection area;
and determining whether a reduced static target exists in the detection area or not according to the current frame point cloud data and the historical frame point cloud data, and if so, deleting the reduced static target from the detection target existing in the detection area.
Preferably, the current frame point cloud data includes attribute information of each active point;
the determining whether a newly added active target currently exists in the detection area includes:
clustering the plurality of active points into at least one point cloud cluster according to the attribute information of each active point;
if a point cloud cluster meeting the moving target generation condition exists in the at least one point cloud cluster, determining that a newly added moving target currently exists in the detection area, and designating the point cloud cluster meeting the moving target generation condition as the newly added moving target; if not, determining that no newly added moving target currently exists in the detection area;
and/or,
the determining whether a newly added stationary target currently exists in the detection area includes:
determining whether a point cloud cluster corresponding to a moving target included in the historical frame point cloud data of the frame adjacent to the current frame has disappeared from the plurality of active points corresponding to the current frame point cloud data; if so, determining that a newly added static target currently exists in the detection area, and determining the moving target corresponding to the disappeared point cloud cluster as the newly added static target; otherwise, determining that no newly added static target currently exists in the detection area;
and/or,
the determining whether there are currently reduced moving objects within the detection region comprises:
determining whether a point cloud cluster corresponding to a moving target included in the historical frame point cloud data of the frame adjacent to the current frame has disappeared from the plurality of active points corresponding to the current frame point cloud data; if so, determining that a reduced moving target currently exists in the detection area, and determining the moving target corresponding to the disappeared point cloud cluster as the reduced moving target; otherwise, determining that no reduced moving target currently exists in the detection area;
and/or,
the determining whether there are currently reduced stationary objects within the detection area comprises:
determining whether the stationary duration of a static target reaches a set first duration according to the current frame point cloud data and the historical frame point cloud data; if so, determining that a reduced static target currently exists in the detection area, and determining the static target whose stationary duration reaches the set first duration as the reduced static target; otherwise, determining that no reduced static target currently exists in the detection area.
Preferably, the activity goal generating condition includes:
the duration of the point cloud cluster reaches a set second duration, and the number of the active points in the point cloud cluster is not less than a preset number threshold; and/or the distance between the point cloud cluster and the moving target closest to the point cloud cluster is not less than the set first length threshold.
Preferably, the current frame point cloud data includes position information of each active point;
the determining at least one adhesion activity point from the plurality of activity points comprises:
determining the position information of each detection target according to the position information of the active point corresponding to each detection target;
for each of the plurality of active points, performing:
calculating the distance between the active point and each detection target according to the position information of the active point and the position information of each detection target;
calculating the difference between the two smallest of these distances;
and if the distance difference is smaller than a set second length threshold, determining the movable point as an adhesion movable point.
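A minimal sketch of this selection rule, assuming plain 2-D Euclidean distances and an illustrative 0.3 m second length threshold (the function name and the threshold value are not taken from the patent):

```python
import math

def find_adhesion_points(points, targets, second_length_threshold=0.3):
    """Flag active points whose two smallest target distances are nearly equal.

    `points` and `targets` are lists of (x, y) positions; the 0.3 m default
    threshold is illustrative, not a value from the patent.
    """
    adhesion = []
    for p in points:
        dists = sorted(math.dist(p, t) for t in targets)
        # A point is "adhesive" when its two nearest targets are almost
        # equidistant, i.e. the point lies in their intersection region.
        if len(dists) >= 2 and (dists[1] - dists[0]) < second_length_threshold:
            adhesion.append(p)
    return adhesion
```

In the patent the distances are the covariance-weighted distances of the first formula below; Euclidean distance is used here only to keep the sketch self-contained.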
Preferably, the current frame point cloud data includes: the distance, azimuth angle and Doppler value from each active point to the millimeter wave radar;
the calculating the distance between the active point and each detection target respectively comprises:
for each detection target, performing:
determining the motion speed and the motion acceleration of the detection target according to the current frame point cloud data and the historical frame point cloud data;
generating a three-dimensional covariance matrix according to the position information, the movement speed and the movement acceleration of the detection target;
determining a plurality of target active points corresponding to the detection target, and determining the distance, the azimuth angle and the Doppler value between the detection target and the millimeter wave radar according to the distance, the azimuth angle and the Doppler value between each target active point and the millimeter wave radar in the current frame point cloud data;
calculating the distance D between the active point and the detection target according to the following first formula;
D=d[0]*(d[0]*S[0]+d[1]*S[3]+d[2]*S[6])+d[1]*(d[0]*S[1]+d[1]*S[4]+d[2]*S[7])+d[2]*(d[0]*S[2]+d[1]*S[5]+d[2]*S[8]);
where d[0] represents the difference between the distance from the active point to the millimeter wave radar and the distance from the detection target to the millimeter wave radar; d[1] represents the difference between the azimuth angle of the active point relative to the millimeter wave radar and that of the detection target; d[2] represents the difference between the Doppler value of the active point and that of the detection target; and S[0] to S[8] represent the elements of the inverse of the three-dimensional covariance matrix.
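The first formula expands the quadratic form D = dᵀ·S·d, i.e. a squared Mahalanobis-style distance in (range, azimuth, Doppler) space. A minimal sketch, with the NumPy formulation and all names being assumptions:

```python
import numpy as np

def target_point_distance(point_rad, target_rad, cov):
    """Quadratic-form distance D = d^T * S * d from the first formula.

    `point_rad` and `target_rad` are (range, azimuth, Doppler) triples
    measured relative to the radar; `cov` is the 3x3 covariance matrix built
    from the target's position, speed and acceleration.  S[0]..S[8] in the
    patent are the elements of inv(cov).
    """
    d = np.asarray(point_rad, dtype=float) - np.asarray(target_rad, dtype=float)
    S = np.linalg.inv(np.asarray(cov, dtype=float))
    return float(d @ S @ d)
```

Because a covariance matrix is symmetric, its inverse is too, so this row-major reading of S[0]..S[8] yields the same value as the column-wise grouping written out in the first formula.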
Preferably, the processing the adhesion activity point according to the first distance, the second distance and the vector included angle includes:
judging whether the included angle of the vector is smaller than a set angle threshold value or not;
if the vector included angle is smaller than the set angle threshold, further judging the difference between the first distance and the second distance: if the difference is greater than 0, allocating the adhesion activity point to the detection target closest to it; if the difference is equal to 0, calculating the dynamic speeds of the first detection target and the second detection target, and allocating the adhesion activity point to whichever of the two has the greater dynamic speed;
and if the vector included angle is not smaller than the set angle threshold value, marking the adhesion activity point as an invalid activity point, and discarding the adhesion activity point marked as the invalid activity point.
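A sketch of this decision rule in Python; the 90-degree angle threshold and all names are illustrative assumptions, and plain 2-D Euclidean distances stand in for the covariance-weighted first and second distances:

```python
import math

def process_adhesion_point(point, t1, t2, v1, v2, angle_threshold_deg=90.0):
    """Return the target (t1 or t2) the adhesion point is assigned to,
    or None when the point is marked invalid and discarded.

    v1/v2 are the targets' dynamic speeds, used to break the tie when the
    point is equidistant from both targets.
    """
    d1, d2 = math.dist(point, t1), math.dist(point, t2)
    # Vector angle at the adhesion point between the directions to t1 and t2.
    u = (t1[0] - point[0], t1[1] - point[1])
    w = (t2[0] - point[0], t2[1] - point[1])
    cos_a = (u[0] * w[0] + u[1] * w[1]) / (math.hypot(*u) * math.hypot(*w))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
    if angle >= angle_threshold_deg:
        return None                      # invalid point: discard
    if d1 != d2:
        return t1 if d1 < d2 else t2     # assign to the nearer target
    return t1 if v1 >= v2 else t2        # tie: assign to the faster target
```

Intuitively, a large angle means the point sits between the two targets rather than clearly on one side, so it carries no reliable ownership information and is dropped.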
Preferably, the calculating the dynamic speeds corresponding to the first detection target and the second detection target respectively includes:
for each of the first detection target and the second detection target, performing:
sampling the current frame point cloud data and the historical frame point cloud data, and sampling two frames of point cloud data each time, wherein the time stamp distance between the two frames of point cloud data is T;
determining the position information of the detection target in each frame of point cloud data according to two frames of point cloud data obtained by each sampling;
and calculating the dynamic speed of the detection target by using the following second formula:
V_t = (1/n) * Σ_{m=1}^{n} [ √((x_new − x_old)² + (y_new − y_old)²) / T ]
where V_t represents the dynamic speed of the detection target, n represents the number of sampling operations, (x_new, y_new) represents the position of the detection target in the frame of point cloud data closest to the current timestamp in the m-th sampling, and (x_old, y_old) represents the position of the detection target in the frame of point cloud data farthest from the current timestamp in the m-th sampling.
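A sketch of the second formula: the straight-line displacement of each sampled frame pair is divided by the sampling interval T and averaged over the n samplings (the function and argument names are assumptions):

```python
import math

def dynamic_speed(samples, T):
    """Average speed over n sampled frame pairs (the second formula).

    `samples` is a list of ((x_old, y_old), (x_new, y_new)) position pairs,
    one per sampling; the two frames of each pair are T seconds apart.
    """
    n = len(samples)
    # Sum of per-sample speeds (displacement / T), divided by n.
    return sum(math.dist(old, new) for old, new in samples) / (n * T)
```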
In a second aspect, an embodiment of the present invention further provides an adhesion moving point processing apparatus, including:
the point cloud data acquisition unit is used for acquiring current frame point cloud data corresponding to the plurality of active points, which are obtained by detecting a detection area by the millimeter wave radar;
a detection target determining unit, configured to determine a detection target currently existing in the detection area according to the current frame point cloud data and according to stored historical frame point cloud data; the number of the detection targets is at least two;
an adhesion activity point determining unit, configured to determine at least one adhesion activity point from the multiple activity points according to a detection target currently existing in the detection area;
an adhesion activity point processing unit configured to, for each adhesion activity point, execute: determining a first detection target and a second detection target which have the minimum distance with the adhesion moving point from the detection targets currently existing in the detection area; determining a first distance between the adhesion activity point and the first detection target, and determining a second distance between the adhesion activity point and the second detection target; determining a vector included angle formed by the adhesion moving point and the first detection target and the second detection target; and processing the adhesion moving point according to the first distance, the second distance and the vector included angle.
In a third aspect, an embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a computer, it causes the computer to execute any one of the methods described above.
Embodiments of the invention provide a method and a device for processing adhesion moving points. The detection targets currently present in a detection area can be determined from the current frame point cloud data and the historical frame point cloud data obtained by millimeter-wave radar detection. When there are at least two detection targets, target adhesion may occur, so at least one adhesion moving point can be determined from the moving points corresponding to the current frame point cloud data according to the detection targets currently present in the detection area. For each adhesion moving point, the first detection target and the second detection target with the smallest distances to that point are determined, and the point is processed according to the first distance between it and the first detection target, the second distance between it and the second detection target, and the vector angle formed at the point by the first and second detection targets. Processing of the adhesion moving points is thereby achieved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a flow chart of a method for processing adhesion moving points according to an embodiment of the present invention;
fig. 2 is a flowchart of a method for determining an adhesion activity point according to an embodiment of the present invention;
FIG. 3 is a schematic diagram illustrating a relationship between a moving point and a millimeter-wave radar according to an embodiment of the present invention;
FIG. 4 is a schematic diagram illustrating a positional relationship between an adhesion moving point and two nearest detection targets according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of an adhesion moving point processing apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer and more complete, the technical solutions in the embodiments of the present invention will be described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention, and based on the embodiments of the present invention, all other embodiments obtained by a person of ordinary skill in the art without creative efforts belong to the scope of the present invention.
Referring to fig. 1, an embodiment of the invention provides a method for processing an adhesion moving point, including:
step 101: and acquiring current frame point cloud data corresponding to the plurality of active points, which is obtained by detecting a detection area by the millimeter wave radar.
Step 102: determining a detection target currently existing in the detection area according to the current frame point cloud data and stored historical frame point cloud data; wherein, the number of the detection targets is at least two.
Step 103: determining at least one adhesion moving point from the plurality of moving points according to the detection targets currently existing in the detection area.
Step 104: for each adhesion activity point, executing: determining a first detection target and a second detection target which have the minimum distance with the adhesion moving point from the detection targets currently existing in the detection area; determining a first distance between the adhesion activity point and the first detection target, and determining a second distance between the adhesion activity point and the second detection target; determining a vector included angle formed by the adhesion moving point and the first detection target and the second detection target; and processing the adhesion moving point according to the first distance, the second distance and the vector included angle.
In the embodiment shown in fig. 1, the detection targets currently present in the detection area can be determined from the current frame point cloud data and the historical frame point cloud data obtained by millimeter-wave radar detection. When there are at least two detection targets, target adhesion may occur, so at least one adhesion moving point can be determined from the moving points corresponding to the current frame point cloud data according to the detection targets currently present in the detection area. For each adhesion moving point, the first detection target and the second detection target with the smallest distances to that point are determined, and the point is processed according to the first distance between it and the first detection target, the second distance between it and the second detection target, and the vector angle formed at the point by those two targets, thereby realizing the processing of the adhesion moving points.
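The four steps can be sketched end to end as follows, under heavy simplifying assumptions: 2-D Euclidean distances replace the covariance-weighted distance, the thresholds are illustrative, and the target bookkeeping of step 102 is taken as given:

```python
import math

def process_frame(points, targets, stick_thresh=0.3, angle_thresh_deg=90.0):
    """Return {adhesion point: assigned target or None (discarded)}.

    `points` are the frame's active points and `targets` the current
    detection targets, both as (x, y) tuples.  Points that are not adhesion
    points are left out of the result.
    """
    assignments = {}
    for p in points:
        by_dist = sorted(targets, key=lambda t: math.dist(p, t))
        if len(by_dist) < 2:
            continue
        t1, t2 = by_dist[0], by_dist[1]
        d1, d2 = math.dist(p, t1), math.dist(p, t2)
        if d2 - d1 >= stick_thresh:
            continue                     # not an adhesion point (step 103)
        # Step 104: angle test at the adhesion point.
        u = (t1[0] - p[0], t1[1] - p[1])
        w = (t2[0] - p[0], t2[1] - p[1])
        denom = math.hypot(*u) * math.hypot(*w) or 1.0
        cos_a = (u[0] * w[0] + u[1] * w[1]) / denom
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
        assignments[p] = t1 if angle < angle_thresh_deg else None
    return assignments
```

A point midway between two targets is flagged as adhesive and, with its 180-degree view of the two targets, is discarded; a point slightly off-axis would be assigned to its nearer target instead.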
The implementation of each step is described below.
In step 101, current frame point cloud data corresponding to a plurality of active points, which is obtained by detecting a detection area by a millimeter wave radar, is obtained.
A millimeter-wave radar can be arranged in the detection area to detect the moving targets there; each time a frame of point cloud data is detected, it is output as the current frame point cloud data.
The current frame point cloud data output by the millimeter wave radar corresponds to a plurality of active points, and includes at least: the number of active points and, for each active point, its position information, speed and signal-to-noise ratio.
In step 102, determining a detection target currently existing in the detection area according to current frame point cloud data and stored historical frame point cloud data; wherein, the number of the detection targets is at least two.
When detecting moving targets in the detection area, one purpose is to obtain each moving target's motion track. Since obtaining a motion track requires both the current frame point cloud data and historical frame point cloud data, every received frame of point cloud data is stored for later use.
When multiple moving targets in the detection area are close to each other, an intersection region of active points may exist between them. In addition, when a moving target passes by a stationary target, the moving target may capture the stationary target's active points; that is, the active points that the millimeter wave radar associates with the moving target may include points that do not belong to it. Therefore, both moving targets and stationary targets in the detection area belong to the detection targets described in the embodiments of the present invention.
In an embodiment of the present invention, because a detection target may be moving, stationary, or leaving the detection area, the detection targets in the detection area change in the following situations:
Situation one: a moving target is newly added;
Situation two: a static target is newly added;
Situation three: a moving target is reduced;
Situation four: a static target is reduced.
Therefore, after the current frame point cloud data is obtained, the detection targets that existed in the detection area before the current frame point cloud data was generated can be determined from the historical frame data and taken as the detection targets currently existing in the detection area; these detection targets are then checked against the above situations, and the detection targets currently existing in the detection area are determined anew. The several situations are described separately below.
Situation one: a moving target is newly added.
In this situation, it is necessary to determine, according to the current frame point cloud data and the historical frame point cloud data, whether a newly added moving target currently exists in the detection area; if so, the newly added moving target is added, as a detection target, to the detection targets currently existing in the detection area. A newly added moving target may be generated when a target enters the detection area from outside it.
Whether a newly added moving target currently exists in the detection area can be determined at least in the following way: clustering the plurality of active points into at least one point cloud cluster according to the attribute information of each active point (the current frame point cloud data includes the attribute information of each active point); if a point cloud cluster meeting the moving target generation condition exists among them, determining that a newly added moving target currently exists in the detection area, and designating that point cloud cluster as the newly added moving target; if not, determining that no newly added moving target currently exists in the detection area.
The attribute information may include position information, velocity, and signal-to-noise ratio, among others. Through the attribute information, a plurality of active points can be clustered into at least one point cloud cluster, and if the point cloud cluster obtained through clustering meets the active target generation condition, the point cloud cluster meeting the condition can be determined as a newly added active target.
In one embodiment of the present invention, the activity goal generating condition may include: the duration of the point cloud cluster reaches a set second duration (for example, the second duration is 5s), and the number of the active points included in the point cloud cluster is not less than a preset number threshold (for example, the number threshold is 100); and/or the distance between the cloud cluster of point clouds and the moving target closest to the cloud cluster of point clouds is not less than a set first length threshold (for example, the first length threshold is 0.5 m).
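A sketch of the generation-condition check using the example values quoted above (5 s, 100 points, 0.5 m); treating the "and/or" as a conjunction of both clauses is an assumption, one of several readings the claim allows:

```python
def meets_generation_condition(cluster_duration_s, num_points,
                               dist_to_nearest_target_m,
                               second_duration_s=5.0, count_threshold=100,
                               first_length_threshold_m=0.5):
    """Check whether a point cloud cluster qualifies as a new moving target.

    Defaults echo the example values in the text; all names are illustrative.
    """
    # Clause 1: the cluster has persisted long enough and is dense enough.
    persistent = (cluster_duration_s >= second_duration_s
                  and num_points >= count_threshold)
    # Clause 2: the cluster is far enough from the nearest existing target.
    separated = dist_to_nearest_target_m >= first_length_threshold_m
    return persistent and separated
```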
Situation two: a static target is newly added.
In this situation, it is necessary to determine, according to the current frame point cloud data and the historical frame point cloud data, whether a newly added static target currently exists in the detection area; if so, the newly added static target is added, as a detection target, to the detection targets currently existing in the detection area. A newly added static target is generated when a moving target already in the detection area changes from a moving state to a stationary state; therefore, if a newly added static target exists, a moving target has been reduced, and the detection targets can be further updated according to situation three.
Whether a newly added stationary target currently exists in the detection area can be determined in at least the following way: determine whether a point cloud cluster corresponding to a moving target in the historical frame adjacent to the current frame has disappeared from the plurality of active points corresponding to the current frame point cloud data; if so, determine that a newly added stationary target currently exists in the detection area, and determine the moving target corresponding to the disappeared point cloud cluster as the newly added stationary target; otherwise, determine that no newly added stationary target currently exists in the detection area.
Case three: a moving target is reduced.
In this case, it is necessary to determine, according to the current frame point cloud data and the historical frame point cloud data, whether a reduced moving target currently exists in the detection area; if so, the reduced moving target is deleted from the detection targets currently existing in the detection area.
Whether a reduced moving target currently exists in the detection area can be determined in at least the following way: determine whether a point cloud cluster corresponding to a moving target in the historical frame adjacent to the current frame has disappeared from the plurality of active points corresponding to the current frame point cloud data; if so, determine that a reduced moving target currently exists in the detection area, and determine the moving target corresponding to the disappeared point cloud cluster as the reduced moving target; otherwise, determine that no reduced moving target currently exists in the detection area.
Case four: a stationary target is reduced.
In the fourth case, it is necessary to determine whether there are reduced stationary objects currently in the detection area according to the current frame point cloud data and the historical frame point cloud data, and if so, delete the reduced stationary objects from the detection objects currently in the detection area.
Whether a reduced stationary target currently exists in the detection area can be determined in at least the following way: determine, according to the current frame point cloud data and the historical frame point cloud data, whether the stationary duration of a stationary target has reached a set first duration; if so, determine that a reduced stationary target currently exists in the detection area, and determine the stationary target whose stationary duration has reached the first duration as the reduced stationary target; otherwise, determine that no reduced stationary target currently exists in the detection area.
In both case three and case four, the number of moving targets in the detection area decreases; this can occur because a moving target changed from the moving state to the stationary state, or because it left the detection area. In one embodiment of the present invention, if the point cloud cluster corresponding to a moving target disappears from the current frame point cloud data, it is preferentially determined that the moving target changed from the moving state to the stationary state, that is, the moving target is converted into a stationary target; if the stationary duration of that stationary target then reaches the set first duration, it indicates that the moving target converted into the stationary target has left the detection area.
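The four update cases can be sketched as one bookkeeping step over the target lists. The function and data structures below are hypothetical illustrations, not the patent's implementation; note how a vanished cluster is preferentially converted to a stationary target, and only a long-stationary target is finally removed.

```python
def update_detection_targets(moving, stationary, new_moving, vanished_moving,
                             stationary_timeout, first_duration_s):
    """moving/stationary: dicts id -> target state.
    new_moving: newly generated moving targets (case one).
    vanished_moving: ids whose point cloud cluster disappeared (cases two/three).
    stationary_timeout: id -> stationary duration in seconds (case four)."""
    # Case one: add newly generated moving targets.
    moving.update(new_moving)
    # Cases two and three: a vanished cluster is preferentially treated as a
    # moving target that became stationary (moving target reduced,
    # stationary target newly added).
    for tid in vanished_moving:
        if tid in moving:
            stationary[tid] = moving.pop(tid)
    # Case four: a stationary target still for longer than the first duration
    # is considered to have left the detection area and is deleted.
    for tid, still_s in list(stationary_timeout.items()):
        if still_s >= first_duration_s and tid in stationary:
            del stationary[tid]
    return moving, stationary
```

The union of the two returned dicts is the set of detection targets currently existing in the detection area.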
In the embodiment of the present invention, the above four cases are evaluated and the detection targets in the detection area are updated according to the results, so that the detection targets currently existing in the detection area can be obtained accurately, which in turn improves the accuracy of the subsequent determination of adhesion moving points.
In step 103, at least one adhesion moving point is determined from the plurality of moving points according to a detection target currently existing in the detection area.
Wherein, the current frame point cloud data can comprise the position information of each active point. The position information may be a coordinate position in a two-dimensional coordinate system constructed for the detection region.
Referring to fig. 2, at least one adhesion moving point may be determined from the plurality of active points in at least the following way, which may include the following steps:
step 201: and determining the position information of each detection target according to the position information of the active point corresponding to each detection target.
Since each detection target is composed of a plurality of active points, the position information of a detection target can be determined from the position information of the active points that compose it.
When the detection target is a stationary target, its position information may be taken to be the position information the corresponding moving target had at the moment it changed from the moving state to the stationary state.
In one embodiment of the present invention, the position information of each detection target can be determined at least by one of the following ways:
For each detection target, the position information of the detection target can be calculated according to the following formulas:

X = X_R + (1/k) × Σ_{g=1..k} l_g × sin θ_g

Y = Y_R + (1/k) × Σ_{g=1..k} l_g × cos θ_g

wherein (X, Y) represents the coordinate position of the detection target in the two-dimensional coordinate system, (X_R, Y_R) represents the coordinate position of the millimeter wave radar in the two-dimensional coordinate system, k represents the total number of active points included in the detection target, l_g represents the distance between the g-th active point in the detection target and the millimeter wave radar, and θ_g represents the azimuth angle between the g-th active point in the detection target and the millimeter wave radar.
The distance and azimuth angle between each active point and the millimeter wave radar, as well as the coordinate position of the millimeter wave radar, can be obtained directly from the current frame point cloud data; see fig. 3 for an illustration of these quantities.
The position information of the detection target may also be determined in other ways, for example by taking the position of any one of the plurality of active points corresponding to the detection target as the position of the detection target, or by randomly removing several active points from the plurality of active points corresponding to the detection target and taking the center position of the remaining active points as the position of the detection target.
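A minimal sketch of the centroid-style position calculation described above, averaging the radar-relative polar offsets of the target's active points. The trigonometric convention (x grows with sin θ, y with cos θ) is an assumption of this sketch; the actual convention depends on the radar's coordinate setup.

```python
import math

def target_position(radar_xy, polar_points):
    """radar_xy: (X_R, Y_R), radar position in the two-dimensional system;
    polar_points: list of (l_g, theta_g_rad) pairs, one per active point
    of the detection target. Returns the target position (X, Y)."""
    xr, yr = radar_xy
    k = len(polar_points)
    # average the per-point offsets from the radar position
    x = xr + sum(l * math.sin(th) for l, th in polar_points) / k
    y = yr + sum(l * math.cos(th) for l, th in polar_points) / k
    return x, y
```

With a single active point, the result is simply that point's Cartesian position relative to the radar.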
Step 202: for each of the plurality of active points, performing:
step 2021: and calculating the distance between the active point and each detection target according to the position information of the active point and the position information of each detection target.
Step 2022: the distance difference between the smallest two distances among the distances is calculated.
Step 2023: and if the distance difference is smaller than a set second length threshold, determining the movable point as an adhesion movable point.
In step 2021, the current frame point cloud data may include the distance, azimuth angle and Doppler value from each active point to the millimeter wave radar. The distance between an active point and a detection target can be calculated from these point cloud data; in one embodiment of the present invention, it can be obtained in at least the following way:
Step A1: determine the movement velocity (X′, Y′) and the movement acceleration (X″, Y″) of the detection target according to the current frame point cloud data and the historical frame point cloud data.
The movement velocity of the detection target can be determined from the movement velocities of the plurality of active points corresponding to it; similarly, its movement acceleration can be determined from the movement accelerations of those active points. As for the velocity and acceleration of an individual active point, the change of the active point's position over a time period can be determined from the current frame point cloud data and the historical frame point cloud data, from which the velocity and acceleration of the active point follow.
Step A2: generate a three-dimensional covariance matrix from the position information (X, Y), the movement velocity (X′, Y′) and the movement acceleration (X″, Y″) of the detection target. The three-dimensional covariance matrix is a 3 × 3 matrix:

C = | c11 c12 c13 |
    | c21 c22 c23 |
    | c31 c32 c33 |
step A3: and determining a plurality of target active points corresponding to the detection target, and determining the distance, the azimuth angle and the Doppler value between the detection target and the millimeter wave radar according to the distance, the azimuth angle and the Doppler value between each target active point and the millimeter wave radar in the current frame point cloud data.
The average distance, the average azimuth angle and the average doppler value from the multiple target active points corresponding to the detection target to the millimeter wave radar can be used as the distance, the azimuth angle and the doppler value between the detection target and the millimeter wave radar.
Step A4: calculate the distance D between the active point and the detection target according to the following first formula.
D=d[0]*(d[0]*S[0]+d[1]*S[3]+d[2]*S[6])+d[1]*(d[0]*S[1]+d[1]*S[4]+d[2]*S[7])+d[2]*(d[0]*S[2]+d[1]*S[5]+d[2]*S[8]);
wherein d[0] represents the difference between the distance from the active point to the millimeter wave radar and the distance from the detection target to the millimeter wave radar; d[1] represents the difference between the azimuth angle from the active point to the millimeter wave radar and the azimuth angle from the detection target to the millimeter wave radar; d[2] represents the difference between the Doppler value from the active point to the millimeter wave radar and the Doppler value from the detection target to the millimeter wave radar; and S[0] to S[8] represent the elements of the inverse matrix of the three-dimensional covariance matrix.
The inverse of the three-dimensional covariance matrix may be:
S⁻¹ = | S[0] S[1] S[2] |
      | S[3] S[4] S[5] |
      | S[6] S[7] S[8] |
Because the position of a detection target in the moving state changes frequently and is not a fixed point, the three-dimensional characteristics formed by its position information, movement velocity and movement acceleration express its state of motion well; calculating the distance between the active point and the detection target in this way therefore yields a more accurate result.
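The first formula is a quadratic form d · S⁻¹ · d over the (distance, azimuth, Doppler) differences, and can be transcribed directly, with S[0] to S[8] taken as the row-major elements of the inverse covariance matrix:

```python
def adhesion_distance(d, S):
    """First formula of the text: d = [range_diff, azimuth_diff, doppler_diff];
    S = the nine row-major elements S[0]..S[8] of the inverse of the
    three-dimensional covariance matrix. Returns the distance D."""
    return (d[0] * (d[0] * S[0] + d[1] * S[3] + d[2] * S[6])
            + d[1] * (d[0] * S[1] + d[1] * S[4] + d[2] * S[7])
            + d[2] * (d[0] * S[2] + d[1] * S[5] + d[2] * S[8]))
```

With the identity matrix as S⁻¹ this reduces to the squared Euclidean norm of d, which is a convenient sanity check.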
It should be noted that the above manner of calculating the distance between the active point and the detection target is a preferred manner; the distance may also be determined in other ways, for example as the planar Euclidean distance calculated by the formula

D′ = √((X − X₁)² + (Y − Y₁)²)

wherein (X₁, Y₁) is the coordinate position of the active point.
In steps 2022 and 2023, suppose the two smallest distances among those calculated in step 2021 are L1 and L2. If the difference between L1 and L2 is smaller than the set second length threshold (for example, 0.2 m), the detection target corresponding to L1 and the detection target corresponding to L2 are very close to each other and target adhesion may occur between them, so the active point is an adhesion moving point.
In this embodiment, if the distance difference calculated for the active point is greater than or equal to the second length threshold, it indicates that the distance between the two detection targets corresponding to the distance difference is relatively long, and there is no target adhesion condition, so that the active point is not an adhesion active point, and the active point may be directly allocated to the detection target closest to the active point.
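Steps 2021 to 2023 can be sketched as follows for one active point, given its precomputed distances to each detection target. The 0.2 m threshold is the example value from the text; the return convention is an assumption of this sketch.

```python
SECOND_LENGTH_THRESHOLD = 0.2  # set second length threshold (example value, 0.2 m)

def classify_point(distances_to_targets):
    """distances_to_targets: dict target_id -> distance for one active point.
    Returns ('adhesion', None) if the two smallest distances differ by less
    than the threshold, else ('assigned', nearest_target_id)."""
    ranked = sorted(distances_to_targets.items(), key=lambda kv: kv[1])
    if len(ranked) >= 2 and (ranked[1][1] - ranked[0][1]) < SECOND_LENGTH_THRESHOLD:
        return "adhesion", None          # step 2023: adhesion moving point
    return "assigned", ranked[0][0]      # otherwise assign to nearest target
```

Points classified as "assigned" go straight to their nearest target; "adhesion" points are handled by the angle-and-distance procedure of step 104.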
In step 104, for each adhesion activity point, the following steps are performed:
step 1041: and determining a first detection target and a second detection target which have the minimum distance with the adhesion moving point from the detection targets currently existing in the detection area.
Since the adhesion moving point most probably belongs to the detection target at the smallest distance from it, that detection target, for example the first detection target, must be determined. However, because adhesion may occur, it is also necessary to determine, among the currently existing detection targets other than the first detection target, the detection target closest to the adhesion moving point, for example the second detection target.
The distance between the adhesion moving point and each detection target can be obtained through the calculation in step 103.
Step 1042: determining a first distance between the adhesion activity point and the first detection target, and determining a second distance between the adhesion activity point and the second detection target.
Step 1043: and determining a vector included angle formed by the adhesion moving point and the first detection target and the second detection target.
Please refer to fig. 4, a schematic diagram of the positional relationship between the adhesion moving point P and the first detection target A and the second detection target B; the vector included angle formed by the adhesion moving point P with the first detection target A and the second detection target B is θ_S.
In one embodiment of the present invention, the vector included angle θ_S can be calculated by at least the following formulas:

θ_S = acos(CV) × 180/π

CV = PV / (V_t1 × V_t2)

PV = (Δx_t1 × Δx_t2) + (Δy_t1 × Δy_t2)

wherein:

Δx_t1 = x_a − x_p

Δx_t2 = x_b − x_p

Δy_t1 = y_a − y_p

Δy_t2 = y_b − y_p

V_t1 = √(Δx_t1² + Δy_t1²)

V_t2 = √(Δx_t2² + Δy_t2²)

where (x_p, y_p) is the coordinate position of the adhesion moving point in the two-dimensional coordinate system, (x_a, y_a) is the coordinate position of the first detection target in the two-dimensional coordinate system, and (x_b, y_b) is the coordinate position of the second detection target in the two-dimensional coordinate system.
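The angle formulas transcribe directly into code. The clamping of CV to [−1, 1] is an added numerical guard against floating-point rounding, not part of the formulas themselves.

```python
import math

def vector_angle_deg(p, a, b):
    """p: adhesion moving point (x_p, y_p); a, b: positions of the first and
    second detection targets. Returns the vector included angle theta_S in degrees."""
    dx1, dy1 = a[0] - p[0], a[1] - p[1]   # vector P -> A
    dx2, dy2 = b[0] - p[0], b[1] - p[1]   # vector P -> B
    vt1 = math.hypot(dx1, dy1)            # V_t1
    vt2 = math.hypot(dx2, dy2)            # V_t2
    cv = (dx1 * dx2 + dy1 * dy2) / (vt1 * vt2)   # CV = PV / (V_t1 * V_t2)
    cv = max(-1.0, min(1.0, cv))          # guard against rounding outside [-1, 1]
    return math.acos(cv) * 180.0 / math.pi
```

An angle near 180° means the point lies between the two targets, which is the situation step B1 treats as an invalid (discarded) point.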
Step 1044: and processing the adhesion moving point according to the first distance, the second distance and the vector included angle.
In this step 1044, the adhesion movement points may be treated as follows:
B1: judge whether the vector included angle is smaller than the set angle threshold; if so, proceed to the next judgment; if not, mark the adhesion moving point as an invalid moving point and discard the adhesion moving point marked as invalid.
If the vector included angle is not smaller than the set angle threshold, the adhesion moving point lies between the first detection target and the second detection target; in that case partial areas of the two detection targets may overlap, and the adhesion moving point is an active point in the overlapping area. Assigning this point to either detection target would affect the accuracy of the detection result for the other, so the adhesion moving point can be marked as an invalid moving point and discarded, improving the detection accuracy of both the first detection target and the second detection target.
B2: further judge the distance difference between the first distance and the second distance. If the difference is greater than 0, assign the adhesion moving point to the detection target closest to it; if the difference is equal to 0, calculate the dynamic speeds corresponding to the first detection target and the second detection target respectively, and assign the adhesion moving point to whichever of the two detection targets has the larger dynamic speed.
As shown in step 2023, the distance difference between the first distance and the second distance of an adhesion moving point is smaller than the second length threshold, and for the difference L between the first distance and the second distance there are the following two cases:

Case a: L > 0.

In case a, the first distance and the second distance are not equal, so the adhesion moving point is closer to whichever detection target corresponds to the smaller distance and can be directly assigned to that detection target.

Case b: L = 0.

In case b, the first distance and the second distance are equal, indicating that the adhesion moving point lies on the central axis between the first detection target and the second detection target. In this case, to determine which detection target the adhesion moving point belongs to, the dynamic speed of each detection target can be used.
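Steps B1 and B2 can be sketched as a single decision function. The angle threshold value is an assumption of this sketch, since the text does not fix it; the tie-break for equal dynamic speeds is likewise assumed.

```python
ANGLE_THRESHOLD_DEG = 150.0  # assumed example value for the set angle threshold

def process_adhesion_point(angle_deg, d1, d2, v1, v2):
    """angle_deg: vector included angle theta_S; d1/d2: first/second distance;
    v1/v2: dynamic speeds of the first/second detection target.
    Returns 'discard', 'first' or 'second'."""
    if angle_deg >= ANGLE_THRESHOLD_DEG:
        return "discard"                              # B1: invalid moving point
    if d1 != d2:
        return "first" if d1 < d2 else "second"       # B2, case a: nearest target
    return "first" if v1 >= v2 else "second"          # B2, case b: faster target
```

The faster target wins the tie because a moving target sheds and acquires boundary points more readily than a slow or stationary one.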
In one embodiment of the present invention, the dynamic speed of a detection target may be calculated in at least the following way, performed for each of the first detection target and the second detection target:
Sample the current frame point cloud data and the historical frame point cloud data, taking two frames of point cloud data each time, with a timestamp gap of T between the two frames (for example, T = 1 s). To calculate the dynamic speed more accurately, the current frame point cloud data can be used as a sampled frame. Taking the frames corresponding to the point cloud data acquired by the millimeter wave radar as F1, F2, F3, …, F100, F101, where F1 is the current frame point cloud data and the rest are historical frame point cloud data, the first sampling may take F1 and F5, the second sampling F3 and F7, the third sampling F5 and F9, and so on; the timestamp difference between F1 and F5, between F3 and F7, and between F5 and F9 is T in each case.
Determining the position information of the detection target in each frame of point cloud data according to two frames of point cloud data obtained by each sampling;
and calculating the dynamic speed of the detection target by using the following second formula:
V_t = (1/n) × Σ_{m=1..n} ( √((x_new − x_old)² + (y_new − y_old)²) / T )

wherein V_t represents the dynamic speed of the detection target, n represents the number of samplings, (x_new, y_new) represents the position information of the detection target in the frame of point cloud data closest to the current timestamp in the m-th sampling, and (x_old, y_old) represents the position information of the detection target in the frame of point cloud data farthest from the current timestamp in the m-th sampling.
It should be noted that the above calculation method of the dynamic speed of the detection target is a preferred method, and besides this method, other methods may be used for calculation, which is not limited in the embodiment of the present invention.
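Under the stated definitions, the second formula averages displacement-over-T across the n samplings; a sketch, with the pairing of old/new positions per sampling assumed as input:

```python
import math

def dynamic_speed(sample_pairs, T):
    """sample_pairs: list of ((x_old, y_old), (x_new, y_new)), one pair per
    sampling; T: timestamp gap between the two frames of each sampling, in
    seconds. Returns the dynamic speed V_t."""
    n = len(sample_pairs)
    total = 0.0
    for (xo, yo), (xn, yn) in sample_pairs:
        # displacement of the detection target over one gap of length T
        total += math.hypot(xn - xo, yn - yo) / T
    return total / n
```

Averaging over several overlapping samplings smooths frame-to-frame position noise compared to using a single pair of frames.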
In the above embodiment, when there are at least two detection targets, target adhesion may occur. Therefore, at least one adhesion moving point can be determined from the plurality of active points corresponding to the current frame point cloud data according to the detection targets currently existing in the detection area. For each adhesion moving point, the first detection target and the second detection target at the smallest distance from it are determined, and the adhesion moving point is processed according to the first distance between the adhesion moving point and the first detection target, the second distance between the adhesion moving point and the second detection target, and the vector included angle formed by the adhesion moving point with the first detection target and the second detection target, thereby realizing the processing of adhesion moving points.
Referring to fig. 5, an embodiment of the present invention further provides an adhesion moving point processing apparatus, including:
a point cloud data obtaining unit 501, configured to obtain current frame point cloud data corresponding to multiple active points, where the current frame point cloud data is obtained by detecting a detection area by the millimeter wave radar;
a detection target determining unit 502, configured to determine a detection target currently existing in the detection area according to the current frame point cloud data and according to stored historical frame point cloud data; the number of the detection targets is at least two;
an adhesion activity point determining unit 503, configured to determine at least one adhesion activity point from the multiple activity points according to a detection target currently existing in the detection area;
an adhesion activity point processing unit 504 configured to, for each adhesion activity point, perform: determining a first detection target and a second detection target which have the minimum distance with the adhesion moving point from the detection targets currently existing in the detection area; determining a first distance between the adhesion activity point and the first detection target, and determining a second distance between the adhesion activity point and the second detection target; determining a vector included angle formed by the adhesion moving point and the first detection target and the second detection target; and processing the adhesion moving point according to the first distance, the second distance and the vector included angle.
In an embodiment of the present invention, the detection target determining unit is specifically configured to perform the following steps:
determining a detection target existing before the current frame point cloud data is generated in the detection area according to the historical frame point cloud data; determining a detection target existing before the current frame point cloud data in the determined detection area is generated as a detection target existing in the detection area at present;
determining whether a newly added moving target exists in the detection area or not according to the current frame point cloud data and the historical frame point cloud data, and if so, adding the newly added moving target serving as a detection target to the detection target existing in the detection area;
determining whether a newly added static target exists in the detection area or not according to the current frame point cloud data and the historical frame point cloud data, and if so, adding the newly added static target serving as a detection target to the detection target existing in the detection area;
determining whether a reduced moving target exists in the detection area or not according to the current frame point cloud data and the historical frame point cloud data, and if so, deleting the reduced moving target from a detection target existing in the detection area;
and determining whether a reduced static target exists in the detection area or not according to the current frame point cloud data and the historical frame point cloud data, and if so, deleting the reduced static target from the detection target existing in the detection area.
In one embodiment of the invention, the current frame point cloud data comprises attribute information of each active point;
the detection target determining unit, when determining whether there is a newly added active target in the detection area, specifically includes:
clustering the plurality of active points into at least one point cloud cluster according to the attribute information of each active point;
if the point cloud cluster meeting the moving target generation condition exists in the at least one point cloud cluster, determining that a newly added moving target exists in the detection area, and distributing the point cloud cluster meeting the moving target generation condition as the newly added moving target; if not, determining that no newly added moving target exists in the detection area;
the detection target determining unit, when determining whether there is a newly added stationary target in the detection area, specifically includes:
determining whether a point cloud cluster corresponding to a moving target in the historical frame adjacent to the current frame has disappeared from the plurality of active points corresponding to the current frame point cloud data; if so, determining that a newly added stationary target currently exists in the detection area, and determining the moving target corresponding to the disappeared point cloud cluster as the newly added stationary target; otherwise, determining that no newly added stationary target currently exists in the detection area;
the detection target determining unit, when determining whether there is a reduced moving target currently in the detection region, specifically includes:
determining whether a point cloud cluster corresponding to a moving target in the historical frame adjacent to the current frame has disappeared from the plurality of active points corresponding to the current frame point cloud data; if so, determining that a reduced moving target currently exists in the detection area, and determining the moving target corresponding to the disappeared point cloud cluster as the reduced moving target; otherwise, determining that no reduced moving target currently exists in the detection area;
the detection target determining unit, when determining whether there is a reduced stationary target currently in the detection region, specifically includes:
determining, according to the current frame point cloud data and the historical frame point cloud data, whether the stationary duration of a stationary target has reached a set first duration; if so, determining that a reduced stationary target currently exists in the detection area, and determining the stationary target whose stationary duration has reached the first duration as the reduced stationary target; otherwise, determining that no reduced stationary target currently exists in the detection area.
In one embodiment of the present invention, the activity target generation condition includes:
the duration of the point cloud cluster reaches a set second duration, and the number of the active points in the point cloud cluster is not less than a preset number threshold; and/or the distance between the point cloud cluster and the moving target closest to the point cloud cluster is not less than the set first length threshold.
In one embodiment of the invention, the current frame point cloud data comprises position information of each active point;
the adhesion activity point determining unit is specifically configured to perform the following operations:
determining the position information of each detection target according to the position information of the active point corresponding to each detection target;
for each of the plurality of active points, performing:
calculating the distance between the active point and each detection target according to the position information of the active point and the position information of each detection target;
calculating the distance difference of the minimum two distances in each distance;
and if the distance difference is smaller than a set second length threshold, determining the movable point as an adhesion movable point.
In one embodiment of the present invention, the current frame point cloud data includes: the distance, azimuth angle and Doppler value from each active point to the millimeter wave radar;
the adhesion activity point determining unit is specifically configured to, when calculating the distance corresponding to the activity point and each detection target, perform the following operations:
for each detection target, performing:
determining the motion speed and the motion acceleration of the detection target according to the current frame point cloud data and the historical frame point cloud data;
generating a three-dimensional covariance matrix according to the position information, the movement speed and the movement acceleration of the detection target;
determining a plurality of target active points corresponding to the detection target, and determining the distance, the azimuth angle and the Doppler value between the detection target and the millimeter wave radar according to the distance, the azimuth angle and the Doppler value between each target active point and the millimeter wave radar in the current frame point cloud data;
calculating the distance D between the active point and the detection target according to the following first formula;
D=d[0]*(d[0]*S[0]+d[1]*S[3]+d[2]*S[6])+d[1]*(d[0]*S[1]+d[1]*S[4]+d[2]*S[7])+d[2]*(d[0]*S[2]+d[1]*S[5]+d[2]*S[8]);
wherein d[0] represents the difference between the distance from the active point to the millimeter wave radar and the distance from the detection target to the millimeter wave radar; d[1] represents the difference between the azimuth angle from the active point to the millimeter wave radar and the azimuth angle from the detection target to the millimeter wave radar; d[2] represents the difference between the Doppler value from the active point to the millimeter wave radar and the Doppler value from the detection target to the millimeter wave radar; and S[0] to S[8] represent the elements of the inverse matrix of the three-dimensional covariance matrix.
In an embodiment of the present invention, when processing the adhesion moving point according to the first distance, the second distance and the vector included angle, the adhesion moving point processing unit is specifically configured to perform the following operations:
judging whether the included angle of the vector is smaller than a set angle threshold value or not;
if the vector included angle is smaller than the set angle threshold, further judging the distance difference value between the first distance and the second distance, and if the distance difference value between the first distance and the second distance is larger than 0, allocating the adhesion activity point to the detection target closest to the adhesion activity point; if the distance difference between the first distance and the second distance is equal to 0, calculating the dynamic speeds corresponding to the first detection target and the second detection target respectively, and allocating the adhesion activity point to the detection target with the maximum dynamic speed in the first detection target and the second detection target;
and if the vector included angle is not smaller than the set angle threshold value, marking the adhesion activity point as an invalid activity point, and discarding the adhesion activity point marked as the invalid activity point.
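The decision logic above can be sketched as follows. This is an illustrative rendering under the stated rules; the function signature, the use of the absolute distance difference, and the tie-break on speed equality are assumptions, not text from the patent:

```python
def resolve_sticking_point(angle, dist1, dist2, speed1, speed2, angle_threshold):
    """Decide which detection target a sticking (adhesion) active point belongs to.

    angle:            vector included angle formed at the point by the two
                      nearest detection targets
    dist1, dist2:     first and second distances to those targets
    speed1, speed2:   dynamic speeds of the two targets (used only on a tie)
    Returns 1 or 2 (index of the target the point is allocated to),
    or None if the point is marked invalid and discarded.
    """
    if angle >= angle_threshold:
        return None                       # mark as invalid activity point, discard
    if abs(dist1 - dist2) > 0:
        return 1 if dist1 < dist2 else 2  # allocate to the closest target
    # Distances are equal: allocate to the target with the larger dynamic speed.
    return 1 if speed1 >= speed2 else 2
```

A point lying almost on the line between two targets (small included angle) is thus resolved by distance first and by dynamic speed only when the distances tie, while a point seen at a wide angle from both targets is treated as noise.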
In an embodiment of the present invention, when the adhesion moving point processing unit calculates the dynamic speeds corresponding to the first detection target and the second detection target, the following operations are specifically performed:
for each of the first detection target and the second detection target, performing:
sampling the current frame point cloud data and the historical frame point cloud data, and sampling two frames of point cloud data each time, wherein the time stamp distance between the two frames of point cloud data is T;
determining the position information of the detection target in each frame of point cloud data according to two frames of point cloud data obtained by each sampling;
and calculating the dynamic speed of the detection target by using the following second formula:
V_t = (1/n) · Σ_{m=1}^{n} [ √((x_new − x_old)² + (y_new − y_old)²) / T ]
wherein V_t represents the dynamic speed of the detection target; n represents the number of samplings; (x_new, y_new) represents the position information of the detection target in the frame of point cloud data closest to the current timestamp in the m-th sampling; and (x_old, y_old) represents the position information of the detection target in the frame of point cloud data farthest from the current timestamp in the m-th sampling.
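The second formula averages the displacement speed over the n sampled frame pairs. A minimal sketch, with an assumed input layout (each sample is a pair of (x, y) positions T apart in timestamp):

```python
import math

def dynamic_speed(samples, T):
    """Average dynamic speed of a detection target over n sampled frame pairs.

    samples: list of ((x_old, y_old), (x_new, y_new)) position pairs, one per
             sampling; the two frames of each pair are T apart in timestamp.
    T:       timestamp distance between the two frames of each sampling.
    """
    n = len(samples)
    # Per-sample speed = displacement / T; the result is the mean over n samples.
    return sum(
        math.hypot(x_new - x_old, y_new - y_old) / T
        for (x_old, y_old), (x_new, y_new) in samples
    ) / n
```

For example, a single sample in which the target moves from (0, 0) to (3, 4) over T = 1 yields a dynamic speed of 5.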
The configuration illustrated in the embodiments of the present specification is not intended to specifically limit the adhesion moving point processing apparatus. In other embodiments of the present specification, the adhesion moving point processing apparatus may include more or fewer components than shown, combine certain components, split certain components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Since the apparatus is based on the same concept as the method embodiments of the present specification, the details of the information interaction and execution processes between the units in the apparatus may refer to the description in the method embodiments, and are not repeated here.
Embodiments of the present invention also provide a computer-readable storage medium, on which a computer program is stored, which, when executed in a computer, causes the computer to perform the method of any one of the above embodiments.
It is noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another entity or action, without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Those of ordinary skill in the art will understand that: all or part of the steps for realizing the method embodiments can be completed by hardware related to program instructions, the program can be stored in a computer readable storage medium, and the program executes the steps comprising the method embodiments when executed; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
Finally, it is to be noted that: the above description is only a preferred embodiment of the present invention, and is only used to illustrate the technical solutions of the present invention, and not to limit the protection scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the present invention.

Claims (10)

1. A method for treating adhesion moving points is characterized by comprising the following steps:
acquiring current frame point cloud data corresponding to a plurality of active points, which are obtained by detecting a detection area by the millimeter wave radar;
determining a detection target currently existing in the detection area according to the current frame point cloud data and stored historical frame point cloud data; the number of the detection targets is at least two;
determining at least one adhesion moving point from the plurality of moving points according to a detection target currently existing in the detection area;
for each adhesion activity point, executing:
determining a first detection target and a second detection target which have the minimum distance with the adhesion moving point from the detection targets currently existing in the detection area;
determining a first distance between the adhesion activity point and the first detection target, and determining a second distance between the adhesion activity point and the second detection target;
determining a vector included angle formed by the adhesion moving point and the first detection target and the second detection target;
and processing the adhesion moving point according to the first distance, the second distance and the vector included angle.
2. The method of claim 1,
the determining a detection target currently existing in the detection area according to the current frame point cloud data and the stored historical frame point cloud data comprises:
determining a detection target existing before the current frame point cloud data is generated in the detection area according to the historical frame point cloud data; determining a detection target existing before the current frame point cloud data in the determined detection area is generated as a detection target existing in the detection area at present;
determining whether a newly added moving target exists in the detection area or not according to the current frame point cloud data and the historical frame point cloud data, and if so, adding the newly added moving target serving as a detection target to the detection target existing in the detection area;
determining whether a newly added static target exists in the detection area or not according to the current frame point cloud data and the historical frame point cloud data, and if so, adding the newly added static target serving as a detection target to the detection target existing in the detection area;
determining whether a reduced moving target exists in the detection area or not according to the current frame point cloud data and the historical frame point cloud data, and if so, deleting the reduced moving target from a detection target existing in the detection area;
and determining whether a reduced static target exists in the detection area or not according to the current frame point cloud data and the historical frame point cloud data, and if so, deleting the reduced static target from the detection target existing in the detection area.
3. The method of claim 2,
the current frame point cloud data comprises attribute information of each active point;
the determining whether a newly added active target currently exists in the detection area includes:
clustering the plurality of active points into at least one point cloud cluster according to the attribute information of each active point;
if a point cloud cluster meeting the moving target generation condition exists in the at least one point cloud cluster, determining that a newly added moving target currently exists in the detection area, and determining the point cloud cluster meeting the moving target generation condition as the newly added moving target; if not, determining that no newly added moving target currently exists in the detection area;
and/or,
the determining whether a newly added stationary target currently exists in the detection area includes:
determining whether a point cloud cluster corresponding to a moving target included in the frame of the historical frame point cloud data adjacent to the current frame has disappeared from the plurality of active points corresponding to the current frame point cloud data; if so, determining that a newly added static target currently exists in the detection area, and determining the moving target corresponding to the disappeared point cloud cluster as the newly added static target; otherwise, determining that no newly added static target currently exists in the detection area;
and/or,
the determining whether there are currently reduced moving objects within the detection region comprises:
determining whether a point cloud cluster corresponding to a moving target included in the frame of the historical frame point cloud data adjacent to the current frame has disappeared from the plurality of active points corresponding to the current frame point cloud data; if so, determining that a reduced moving target currently exists in the detection area, and determining the moving target corresponding to the disappeared point cloud cluster as the reduced moving target; otherwise, determining that no reduced moving target currently exists in the detection area;
and/or,
the determining whether there are currently reduced stationary objects within the detection area comprises:
determining whether the static time length of a static target reaches a set first time length according to the current frame point cloud data and the historical frame point cloud data; if so, determining that a reduced static target currently exists in the detection area, and determining the static target whose static time length reaches the set first time length as the reduced static target; otherwise, determining that no reduced static target currently exists in the detection area.
4. The method of claim 3, wherein the moving target generation condition comprises:
the duration of the point cloud cluster reaches a set second duration, and the number of active points in the point cloud cluster is not less than a preset number threshold; and/or the distance between the point cloud cluster and the moving target closest to the point cloud cluster is not less than a set first length threshold.
5. The method of claim 1,
the current frame point cloud data comprises position information of each active point;
the determining at least one adhesion activity point from the plurality of activity points comprises:
determining the position information of each detection target according to the position information of the active point corresponding to each detection target;
for each of the plurality of active points, performing:
calculating the distance between the active point and each detection target according to the position information of the active point and the position information of each detection target;
calculating the distance difference of the minimum two distances in each distance;
and if the distance difference is smaller than a set second length threshold, determining the movable point as an adhesion movable point.
6. The method of claim 5,
the current frame point cloud data comprises: the distance, azimuth angle and Doppler value from each active point to the millimeter wave radar;
the calculating the distance between the active point and each detection target respectively comprises:
for each detection target, performing:
determining the motion speed and the motion acceleration of the detection target according to the current frame point cloud data and the historical frame point cloud data;
generating a three-dimensional covariance matrix according to the position information, the movement speed and the movement acceleration of the detection target;
determining a plurality of target active points corresponding to the detection target, and determining the distance, the azimuth angle and the Doppler value between the detection target and the millimeter wave radar according to the distance, the azimuth angle and the Doppler value between each target active point and the millimeter wave radar in the current frame point cloud data;
calculating the distance D between the active point and the detection target according to the following first formula;
D=d[0]*(d[0]*S[0]+d[1]*S[3]+d[2]*S[6])+d[1]*(d[0]*S[1]+d[1]*S[4]+d[2]*S[7])+d[2]*(d[0]*S[2]+d[1]*S[5]+d[2]*S[8]);
wherein d[0] represents the difference between the distance from the active point to the millimeter wave radar and the distance from the detection target to the millimeter wave radar; d[1] represents the difference between the azimuth angle from the active point to the millimeter wave radar and the azimuth angle from the detection target to the millimeter wave radar; d[2] represents the difference between the Doppler value from the active point to the millimeter wave radar and the Doppler value from the detection target to the millimeter wave radar; and S[0] to S[8] respectively represent the elements of the inverse matrix of the three-dimensional covariance matrix.
7. The method according to any one of claims 1-6, wherein said processing the adhesion activity point according to the first distance, the second distance and the vector angle comprises:
judging whether the vector included angle is smaller than the set angle threshold;
if the vector included angle is smaller than the set angle threshold, further judging the distance difference value between the first distance and the second distance, and if the distance difference value between the first distance and the second distance is larger than 0, allocating the adhesion activity point to the detection target closest to the adhesion activity point; if the distance difference between the first distance and the second distance is equal to 0, calculating the dynamic speeds corresponding to the first detection target and the second detection target respectively, and allocating the adhesion activity point to the detection target with the maximum dynamic speed in the first detection target and the second detection target;
and if the vector included angle is not smaller than the set angle threshold value, marking the adhesion activity point as an invalid activity point, and discarding the adhesion activity point marked as the invalid activity point.
8. The method of claim 7, wherein the calculating the dynamic speed corresponding to each of the first detection target and the second detection target comprises:
for each of the first detection target and the second detection target, performing:
sampling the current frame point cloud data and the historical frame point cloud data, and sampling two frames of point cloud data each time, wherein the time stamp distance between the two frames of point cloud data is T;
determining the position information of the detection target in each frame of point cloud data according to two frames of point cloud data obtained by each sampling;
and calculating the dynamic speed of the detection target by using the following second formula:
V_t = (1/n) · Σ_{m=1}^{n} [ √((x_new − x_old)² + (y_new − y_old)²) / T ]
wherein V_t represents the dynamic speed of the detection target; n represents the number of samplings; (x_new, y_new) represents the position information of the detection target in the frame of point cloud data closest to the current timestamp in the m-th sampling; and (x_old, y_old) represents the position information of the detection target in the frame of point cloud data farthest from the current timestamp in the m-th sampling.
9. An adhesion moving point processing apparatus, comprising:
the point cloud data acquisition unit is used for acquiring current frame point cloud data corresponding to the plurality of active points, which are obtained by detecting a detection area by the millimeter wave radar;
a detection target determining unit, configured to determine a detection target currently existing in the detection area according to the current frame point cloud data and according to stored historical frame point cloud data; the number of the detection targets is at least two;
an adhesion activity point determining unit, configured to determine at least one adhesion activity point from the multiple activity points according to a detection target currently existing in the detection area;
an adhesion activity point processing unit configured to, for each adhesion activity point, execute: determining a first detection target and a second detection target which have the minimum distance with the adhesion moving point from the detection targets currently existing in the detection area; determining a first distance between the adhesion activity point and the first detection target, and determining a second distance between the adhesion activity point and the second detection target; determining a vector included angle formed by the adhesion moving point and the first detection target and the second detection target; and processing the adhesion moving point according to the first distance, the second distance and the vector included angle.
10. A computer-readable storage medium, on which a computer program is stored which, when executed in a computer, causes the computer to carry out the method of any one of claims 1-8.
CN202110136517.3A 2021-02-01 2021-02-01 Method and device for processing adhesion moving points Active CN112836628B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110136517.3A CN112836628B (en) 2021-02-01 2021-02-01 Method and device for processing adhesion moving points

Publications (2)

Publication Number Publication Date
CN112836628A true CN112836628A (en) 2021-05-25
CN112836628B CN112836628B (en) 2022-12-27

Family

ID=75931364

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110136517.3A Active CN112836628B (en) 2021-02-01 2021-02-01 Method and device for processing adhesion moving points

Country Status (1)

Country Link
CN (1) CN112836628B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110992356A (en) * 2019-12-17 2020-04-10 深圳辰视智能科技有限公司 Target object detection method and device and computer equipment
CN111123255A (en) * 2019-12-13 2020-05-08 意诺科技有限公司 Method, device and system for positioning moving target
CN111199555A (en) * 2019-12-13 2020-05-26 意诺科技有限公司 Millimeter wave radar target identification method
US10816993B1 (en) * 2019-11-23 2020-10-27 Ha Q Tran Smart vehicle
CN111856507A (en) * 2020-07-28 2020-10-30 上海木木聚枞机器人科技有限公司 Environment sensing implementation method, intelligent mobile device and storage medium
CN112147635A (en) * 2020-09-25 2020-12-29 北京亮道智能汽车技术有限公司 Detection system, method and device
CN112154356A (en) * 2019-09-27 2020-12-29 深圳市大疆创新科技有限公司 Point cloud data processing method and device, laser radar and movable platform
US20200408892A1 (en) * 2019-06-27 2020-12-31 Samsung Electronics Co., Ltd. Radar data processing device and local range resolving power adjusting method

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
KUN QIAN et al.: "3D Point Cloud Generation with Millimeter-Wave Radar", ACM *
YANG WENXIU: "Research on Ground Target Recognition Methods Based on Dual-Optical-Wedge Lidar", China Master's Theses Full-text Database, Information Science and Technology *
CHEN MING: "Research on Key Technologies of Moving Target Detection Based on Fusion of Laser and Visual Information", China Master's Theses Full-text Database, Engineering Science and Technology II *
MA JINQUAN et al.: "Research and Experiments on a Point Cloud Data Segmentation Method Based on Spatial Similarity Clustering", Geomatics World *

Similar Documents

Publication Publication Date Title
CN112526513B (en) Millimeter wave radar environment map construction method and device based on clustering algorithm
CN110361727A (en) A kind of millimetre-wave radar multi-object tracking method
CN110286389B (en) Grid management method for obstacle identification
CN106971401B (en) Multi-target tracking device and method
US11880985B2 (en) Tracking multiple objects in a video stream using occlusion-aware single-object tracking
CN111798487A (en) Target tracking method, device and computer readable storage medium
CN111929653B (en) Target detection and tracking method and system based on unmanned ship navigation radar
CN116432060A (en) Target self-adaptive clustering method, device, equipment and storage medium based on radar
CN114690174A (en) Target tracking method and device based on millimeter wave radar and laser radar
CN112836628B (en) Method and device for processing adhesion moving points
CN113537077A (en) Label multi-Bernoulli video multi-target tracking method based on feature pool optimization
KR102361816B1 (en) Method for detecting target and readable medium
CN115966084A (en) Holographic intersection millimeter wave radar data processing method and device and computer equipment
CN113219425B (en) Test method and system for radar target detection performance
US6658365B2 (en) Collision avoidance system, position detector, collision avoidance method, position detecting method and program for realizing the same
CN111768442B (en) Track initiation method and system based on hierarchical clustering and logic method
CN114895274A (en) Guardrail identification method
CN115439484B (en) Detection method and device based on 4D point cloud, storage medium and processor
CN112654883B (en) Radar target clustering method and device
CN113281736B (en) Radar maneuvering intersection target tracking method based on multi-hypothesis singer model
CN112363131B (en) Processing method and device for vehicle-mounted millimeter wave radar data and computer storage medium
JP5609095B2 (en) Target correlation processing apparatus, target correlation processing method and program
CN112363131A (en) Processing method and device for vehicle-mounted millimeter wave radar data and computer storage medium
CN117473259A (en) Radar game countermeasure simulation method based on multi-source radio frequency signal dynamic reconstruction
CN117388818A (en) Point cloud density enhancement method and system based on track negative feedback

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant