CN115049696A - Personnel monitoring method and device based on radar data - Google Patents

Personnel monitoring method and device based on radar data

Info

Publication number
CN115049696A
Authority
CN
China
Prior art keywords
point cloud
target
cloud data
tracking
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110251613.2A
Other languages
Chinese (zh)
Inventor
贾槐真
张小东
李芝
蒋小颖
徐翘楚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Milli Technology Co ltd
Jinmao Green Building Technology Co Ltd
Original Assignee
Beijing Milli Technology Co ltd
Jinmao Green Building Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Milli Technology Co ltd, Jinmao Green Building Technology Co Ltd filed Critical Beijing Milli Technology Co ltd
Priority to CN202110251613.2A
Publication of CN115049696A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G06T7/66 Analysis of geometric attributes of image moments or centre of gravity
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30196 Human being; Person

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Geometry (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The embodiment of the invention provides a personnel monitoring method and device based on radar data. The scheme specifically acquires raw point cloud data output by a radar in a monitored scene, processes the raw point cloud data based on a preset tracking algorithm to obtain target information of the tracked targets in the scene, and outputs the target information. With this scheme, the target information of the tracked targets in the scene can be effectively output, meeting the need of users to implement energy-saving control, environmental-protection control or intelligent security according to the target information representing the personnel in the scene.

Description

Personnel monitoring method and device based on radar data
Technical Field
The invention relates to the technical field of artificial intelligence, in particular to a personnel monitoring method and a personnel monitoring device based on radar data.
Background
With the development of science, technology and the economy, more and more places require intelligent management, such as home environments, working environments and buildings. Through intelligent management, responsive control measures can be taken according to the number of people entering a place and by further determining factors such as their activity patterns and specific needs, so as to achieve energy-saving control, environmental-protection control, intelligent security and the like for the place. The prerequisite for this is the ability to detect and identify the people entering a particular place.
Disclosure of Invention
In view of the above, the present invention provides a method and an apparatus for monitoring people based on radar data, so as to detect people in a specific location.
In order to solve the above problems, the present invention discloses a personnel monitoring method based on radar data, which is characterized in that the personnel monitoring method comprises the steps of:
acquiring original point cloud data output by a radar in a monitored scene;
processing the original point cloud data based on a preset tracking algorithm to obtain target information of a tracking target in the scene;
and outputting the target information.
Optionally, the raw point cloud data includes distance information, Doppler information, and azimuth information of each point therein.
Optionally, the processing the original point cloud data based on a preset tracking algorithm to obtain target information of a tracking target in the scene includes:
screening the original point cloud data, and removing noise points in the original point cloud data to obtain processed point cloud data;
predicting the centroid position of a tracking object based on the point cloud data at the previous moment, and determining a tracking target according to the centroid position;
associating the point cloud data at the current moment with the tracking target based on a preset distance function;
clustering the point cloud data which are not associated;
and updating each tracking target to obtain the target information.
Optionally, after the step of associating the point cloud data at the current time with the tracking target based on the preset distance function, the method further includes the steps of:
and if the point cloud data of the existing tracking target does not appear in a new frame, defining the existing tracking target as a static target, and recording the target information of the static target.
Optionally, the clustering process is performed on the point cloud data that is not associated, and includes the steps of:
if the point cloud data which are not associated appear at the entrance position of a preset boundary, obtaining a new tracking target through clustering;
and if the point cloud data which is not associated appears in the preset boundary, the point cloud data is associated with the static target.
There is also provided a personnel monitoring device based on radar data, the personnel monitoring device comprising:
the data acquisition module is configured to acquire original point cloud data output by the radar in the monitored scene;
the data calculation module is configured to process the original point cloud data based on a preset tracking algorithm to obtain target information of a tracking target in the scene;
an information output module configured to output the target information.
Optionally, the raw point cloud data includes distance information, Doppler information, and azimuth information of each point therein.
Optionally, the data calculation module includes:
the point cloud screening unit is configured to screen the original point cloud data, remove noise points in the original point cloud data and obtain point cloud data;
a target prediction unit configured to predict a centroid position of a tracking object based on the point cloud data at the previous time, and determine a tracking target according to the centroid position;
a point cloud associating unit configured to associate the point cloud data at the current time with the tracking target based on a preset distance function;
a point cloud allocation unit configured to perform clustering processing on the point cloud data that is not associated;
and the target updating unit is configured to update each tracking target to obtain the target information.
Optionally, the data calculation module further includes:
and the dynamic and static conversion unit is configured to define the existing tracking target as a static target and record the target information of the static target if the point cloud data of the existing tracking target does not appear in a new frame after the point cloud association unit associates the point cloud data at the current moment with the tracking target based on a preset distance function.
Optionally, the point cloud allocating unit includes:
a first assigning subunit configured to obtain a new tracking target through clustering if the point cloud data not associated appears at an entrance position of a preset boundary;
a second sub-unit configured to associate the point cloud data that is not associated with the stationary object if it occurs within the preset boundary.
According to the above technical scheme, the invention provides a personnel monitoring method and device based on radar data. The method specifically acquires raw point cloud data output by a radar in a monitored scene, processes the raw point cloud data based on a preset tracking algorithm to obtain target information of the tracked targets in the scene, and outputs the target information. With this scheme, the target information of the tracked targets in the scene can be effectively output, meeting the need of users to implement energy-saving control, environmental-protection control or intelligent security according to the target information representing the personnel in the scene.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and that those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of a personnel monitoring method based on radar data according to an embodiment of the present application;
FIG. 2 is a flowchart illustrating a method for calculating a tracking target and target information thereof according to an embodiment of the present disclosure;
FIG. 3 is a block diagram of a personnel monitoring device based on radar data according to an embodiment of the present application;
FIG. 4 is a block diagram of another personnel monitoring device based on radar data according to an embodiment of the present application;
Fig. 5 is a block diagram of another personnel monitoring device based on radar data according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example one
Fig. 1 is a flowchart of a personnel monitoring method based on radar data according to an embodiment of the present application.
The personnel monitoring method of this embodiment is applied to an enclosed place, or a place with defined boundaries, such as a room, a courtyard or even the interior of an automobile. Personnel are monitored through radar data generated by a radar arranged in the place, so as to obtain target information, that is, information about the personnel entering the place. The radar may be a millimeter wave radar, a laser radar or a terahertz radar.
Referring to fig. 1, the personnel monitoring method provided in this embodiment includes the following steps:
and S1, acquiring original point cloud data output by the radar in the monitored scene.
The original point cloud data is output by detecting the internal environment of the monitored scene through a millimeter wave radar, a laser radar or a terahertz radar. This embodiment is described by taking Texas Instruments (TI) millimeter wave radar as an example. The millimeter wave radar can process received original FMCW signals into point cloud data, and each point in the point cloud comprises a distance, an azimuth angle, a pitch angle, Doppler, a signal-to-noise ratio and a noise value.
For example, when two persons enter the above-mentioned place, the millimeter wave radar generates original point cloud data representing the respective persons.
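As an illustration only, the following minimal Python sketch shows one way such a per-point record could be represented; the field and method names are assumptions made for this sketch and are not part of the radar's actual output format.

```python
import math
from dataclasses import dataclass


@dataclass
class RadarPoint:
    """One point of the raw point cloud output by the millimeter wave radar."""
    range_m: float        # distance to the reflector, in meters
    azimuth_rad: float    # horizontal angle of arrival, in radians
    elevation_rad: float  # pitch (vertical) angle, in radians
    doppler_mps: float    # radial velocity from the Doppler measurement, m/s
    snr_db: float         # signal-to-noise ratio of the detection
    noise_db: float       # estimated noise floor for the detection

    def to_xy(self) -> tuple[float, float]:
        """Project the polar measurement onto the horizontal (ground) plane."""
        ground_range = self.range_m * math.cos(self.elevation_rad)
        return (ground_range * math.sin(self.azimuth_rad),
                ground_range * math.cos(self.azimuth_rad))
```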
And S2, processing according to the original point cloud data to obtain the target information of the tracking target.
Specifically, the original point cloud data is processed based on a preset tracking algorithm to obtain target information of a tracking target in the scene.
And S3, outputting the target information of the tracking target.
That is, the target information is output to the user or the system which needs the target information based on the request of the user or the need of the system. And obtaining the information of the personnel in the scene so as to realize energy-saving control, environmental protection control or intelligent security for the places.
According to the above technical scheme, this embodiment provides a personnel monitoring method based on radar data. Specifically, the method acquires raw point cloud data output by a radar in a monitored scene, processes the raw point cloud data based on a preset tracking algorithm to obtain target information of the tracked targets in the scene, and outputs the target information. With this scheme, the target information of the tracked targets in the scene can be effectively output, meeting the need of users to implement energy-saving control, environmental-protection control or intelligent security according to the target information representing the personnel in the scene.
In this embodiment, the following scheme is specifically adopted to realize the calculation of the target information of the tracking target, as shown in fig. 2.
S201, screening the original point cloud data.
Through the screening of the original point cloud data, a portion of the noise points is removed using the position information of the points, yielding the corresponding point cloud data. Points that do not meet the conditions may be marked as "out-of-bounds points" and do not participate in the subsequent "predict", "associate" and "assign" steps.
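A minimal sketch of this screening step is given below, assuming the perception boundary is an axis-aligned rectangle in the radar's ground plane; the boundary representation and the `RadarPoint.to_xy` helper from the earlier sketch are assumptions for illustration.

```python
def screen_points(points, boundary):
    """Split raw points into in-bounds points and "out-of-bounds points".

    `boundary` is assumed to be (x_min, x_max, y_min, y_max) in the radar's
    ground-plane coordinates; out-of-bounds points are kept but excluded from
    the predict / associate / assign steps.
    """
    x_min, x_max, y_min, y_max = boundary
    in_bounds, out_of_bounds = [], []
    for p in points:
        x, y = p.to_xy()
        if x_min <= x <= x_max and y_min <= y <= y_max:
            in_bounds.append(p)
        else:
            out_of_bounds.append(p)   # marked as boundary clutter / noise
    return in_bounds, out_of_bounds
```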
And S202, performing prediction processing on the tracking target.
The centroid position of the tracked object is predicted based on the point cloud data at the previous moment, and the tracking target is determined according to the centroid position. Specifically, in the prediction step, the centroid position of the tracked object at time n is estimated from its state and covariance matrix at time n-1, and the tracking target is then determined based on the centroid position.
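The sketch below illustrates a prediction step of this kind under the assumption of a constant-velocity Kalman model with a 2D position/velocity state; the patent does not specify the exact state vector or motion model, so these are assumptions made for illustration.

```python
import numpy as np


def kalman_predict(x_prev, P_prev, dt, q=0.1):
    """Predict the track state at time n from its state at time n-1.

    x_prev: state vector [x, y, vx, vy] at time n-1 (centroid position + velocity)
    P_prev: state covariance at time n-1
    Returns the prior (predicted) state and covariance at time n.
    """
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)   # constant-velocity transition
    Q = q * np.eye(4)                            # simple process-noise model
    x_prior = F @ x_prev
    P_prior = F @ P_prev @ F.T + Q
    return x_prior, P_prior
```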
And S203, associating the point cloud data with the tracking target.
When this step is executed, a distance function from every point j in the current frame to the track i of each of the two trackable targets is defined and calculated as the amount of innovation the measurement point brings to the track (the specific distance function is given by the formula in the original figure). A threshold G is then defined to represent the maximum acceptable distance from a measurement point to each trackable unit; if the distance is less than the threshold G, the likelihood function of point j with respect to track i is calculated, and the track i that maximizes the log-likelihood is taken as the tracking unit associated with measurement point j.
If the 2 users in the area keep moving, in theory the points in the point cloud can all be associated with the 2 existing trackable units. After this matching is completed, the process goes directly to the update step: each tracking unit is updated based on the set of its associated points, and the innovation, innovation covariance, Kalman gain, posterior state vector and posterior error covariance are updated, ready for the calculation of the next frame.
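The exact distance function appears only as a formula image in the original filing; the sketch below therefore uses a Mahalanobis-type distance with a gate threshold G and a log-likelihood score, a common choice for this kind of point-to-track association rather than the patent's exact definition, and the track fields ("x", "C") are assumed names.

```python
import numpy as np


def associate_points(points_xy, tracks, gate_G):
    """Associate each measured point with the track that maximizes its likelihood.

    points_xy: list of 2D measurements for the current frame
    tracks: list of dicts with prior centroid "x" (2D) and innovation covariance "C"
    gate_G: maximum acceptable squared distance for a point to enter a track's gate
    Returns a dict mapping point index -> track index (unassociated points are omitted).
    """
    assignment = {}
    for j, z in enumerate(points_xy):
        best_track, best_score = None, None
        for i, trk in enumerate(tracks):
            innov = np.asarray(z) - trk["x"]
            d2 = float(innov @ np.linalg.inv(trk["C"]) @ innov)  # Mahalanobis-type distance
            if d2 > gate_G:                                      # outside the gate: skip
                continue
            # log-likelihood of point j under track i (up to a constant)
            score = -0.5 * (d2 + np.log(np.linalg.det(trk["C"])))
            if best_score is None or score > best_score:
                best_track, best_score = i, score
        if best_track is not None:
            assignment[j] = best_track
    return assignment
```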
And S204, clustering the point cloud data which are not associated.
If there are points in the point cloud that are not associated, they enter the "assignment" module, in which the points are clustered according to their closeness in the coordinate system. Each cluster serves as a candidate for a new tracking unit; if a candidate passes checks on conditions such as the number of points, signal-to-noise ratio, radial velocity and maximum distance, it is marked as a new tracking target.
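The patent does not name a particular clustering algorithm; the sketch below uses DBSCAN purely for illustration, and the acceptance thresholds on point count, SNR and radial velocity are placeholder values, not figures from the patent.

```python
import numpy as np
from sklearn.cluster import DBSCAN


def allocate_new_targets(unassoc_points, eps=0.5, min_points=5,
                         min_snr_db=10.0, min_radial_speed=0.1):
    """Cluster unassociated points and keep clusters that look like a person.

    Each cluster is a candidate tracking unit; it becomes a new tracking target
    only if it passes simple checks on point count, SNR and radial velocity.
    """
    if not unassoc_points:
        return []
    xy = np.array([p.to_xy() for p in unassoc_points])
    labels = DBSCAN(eps=eps, min_samples=min_points).fit_predict(xy)

    new_targets = []
    for label in set(labels) - {-1}:                 # -1 marks DBSCAN noise
        members = [p for p, l in zip(unassoc_points, labels) if l == label]
        mean_snr = np.mean([p.snr_db for p in members])
        max_speed = max(abs(p.doppler_mps) for p in members)
        if len(members) >= min_points and mean_snr >= min_snr_db \
                and max_speed >= min_radial_speed:
            centroid = xy[labels == label].mean(axis=0)
            new_targets.append({"centroid": centroid, "points": members})
    return new_targets
```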
And S205, updating each tracking target to obtain the target information.
Each tracking unit is updated based on the set of its associated points. In this step, the innovation, innovation covariance, Kalman gain, posterior state vector, posterior error covariance and the like are calculated, the predicted prior information of the target is corrected, and preparation is made for the calculation of the next frame. Each tracking target is then queried to generate its target information.
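A minimal sketch of the update step for a single tracking unit, continuing the constant-velocity model assumed above; it computes the innovation, innovation covariance, Kalman gain, posterior state vector and posterior error covariance named in the text, with the measurement taken as the 2D centroid of the associated points.

```python
import numpy as np


def kalman_update(x_prior, P_prior, z, R):
    """Correct the predicted track state with the centroid z of its associated points.

    x_prior, P_prior: prior state and covariance from the prediction step
    z: measured 2D centroid of the points associated with this track
    R: measurement-noise covariance (2x2)
    """
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)   # position-only observation model
    innovation = z - H @ x_prior                # innovation
    S = H @ P_prior @ H.T + R                   # innovation covariance
    K = P_prior @ H.T @ np.linalg.inv(S)        # Kalman gain
    x_post = x_prior + K @ innovation           # posterior state vector
    P_post = (np.eye(4) - K @ H) @ P_prior      # posterior error covariance
    return x_post, P_post
```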
The target information of the tracking target can be obtained through the scheme, so that a data base is provided for outputting the target information according to the requirements of the user.
In addition, in the specific embodiment of the present application, between steps S203 and S204, the following steps are further included:
s2031: if the point cloud data of the existing tracking target does not appear in the new frame, the existing tracking target is defined as a static tracking target, and the target information of the static tracking target is recorded.
Because the point cloud data generated by the millimeter wave radar only depicts dynamic targets in the place, if a tracked target remains relatively static during point cloud tracking, the radar loses its tracking information, causing the problem of discontinuous target tracking. To address this problem, the invention requires the user to set a personalized perception boundary before initialization to demarcate the boundary of the place. For point cloud data that disappears inside the boundary, the tracking algorithm considers that the target has changed from dynamic to static: the tracking target is marked as a static target, the last frame of information tracked by the radar is recorded as the target information of the static target, and, like a dynamic target, it continues to participate in the next round of iteration.
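The bookkeeping for this dynamic-to-static conversion might look like the following sketch; the track dictionary fields and the rectangular boundary are assumptions made for illustration.

```python
def mark_static_targets(tracks, associated_track_ids, boundary):
    """Mark tracks whose points vanished inside the perception boundary as static.

    A track that received no points this frame but whose last known centroid is
    still inside the user-defined boundary is converted to a static target, and
    its last frame of information is retained for later re-association.
    """
    x_min, x_max, y_min, y_max = boundary
    for trk in tracks:
        if trk["id"] in associated_track_ids:
            trk["static"] = False
            continue
        x, y = trk["centroid"]
        if x_min <= x <= x_max and y_min <= y <= y_max:
            trk["static"] = True       # disappeared inside the boundary: now static
            # keep the last tracked frame as the static target's record
            trk["static_info"] = {"centroid": trk["centroid"], "state": trk.get("state")}
    return tracks
```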
Based on the above conversion of the dynamic and static targets, the specific content of step S204 is:
First, if unassociated point cloud data appears at the entrance position of the preset boundary, a new tracking target is obtained through clustering. For example, if another person enters the place, the points in the point cloud data of the newly entered person do not satisfy the association condition with the two previously existing tracking targets; they are treated as unassociated points and assigned to a new tracking target. By checking the position where the new target appears, this person is determined to have entered from outside the area and is not matched with the static targets inside the area, and the person count is increased by 1 to 3 persons.
Considering the influence of radar measurement errors, when a static target in the place starts moving again, the generated point cloud data may not be successfully associated with the static target, which would create a new tracking target in the area and lead to inaccurate counting.
To solve this problem, only new tracked targets appearing near the area entrance are regarded as people newly entering the area and go through the "assignment" process as new tracking units, while other targets newly appearing inside the area are associated with the existing static targets through the Hungarian matching algorithm. After a successful association, the original static target is deleted and the newly appearing tracking target continues to be updated.
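The Hungarian matching between reappearing targets and the recorded static targets could be sketched as follows using `scipy.optimize.linear_sum_assignment`; the Euclidean-distance cost and the `max_dist` threshold are illustrative assumptions, not values from the patent.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment


def match_new_to_static(new_targets, static_targets, max_dist=1.0):
    """Re-associate targets that reappear inside the area with existing static targets.

    Cost is the Euclidean distance between each new target's centroid and each
    static target's recorded position. Pairs whose cost exceeds `max_dist` are
    left unmatched; on a match, the static target is deleted and its identity
    is carried by the reappearing track.
    """
    if not new_targets or not static_targets:
        return []
    cost = np.zeros((len(new_targets), len(static_targets)))
    for i, nt in enumerate(new_targets):
        for j, st in enumerate(static_targets):
            cost[i, j] = np.linalg.norm(np.asarray(nt["centroid"]) -
                                        np.asarray(st["centroid"]))
    rows, cols = linear_sum_assignment(cost)      # Hungarian algorithm
    return [(int(i), int(j)) for i, j in zip(rows, cols) if cost[i, j] <= max_dist]
```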
Through the operation, the misjudgment of the tracking target in the site can be avoided.
Example two
Fig. 3 is a block diagram of a personnel monitoring device based on radar data according to an embodiment of the present application.
The personnel monitoring device of this embodiment is applied to an enclosed place or a place with defined boundaries, such as a room, a courtyard or even the interior of an automobile. Personnel are monitored through radar data generated by a radar arranged in the place, so as to obtain target information, that is, information about the personnel entering the place. The radar may be a millimeter wave radar, a laser radar or a terahertz radar.
Referring to fig. 3, the personnel monitoring device provided by this embodiment includes a data acquisition module 10, a data calculation module 20 and an information output module 30.
The data acquisition module is used for acquiring original point cloud data output by the radar in the monitored scene.
The original point cloud data is output by detecting the internal environment of the monitored scene through a millimeter wave radar, a laser radar or a terahertz radar. This embodiment is described by taking Texas Instruments (TI) millimeter wave radar as an example. The millimeter wave radar can process received original FMCW signals into point cloud data, and each point in the point cloud comprises a distance, an azimuth angle, a pitch angle, Doppler, a signal-to-noise ratio and a noise value.
For example, when two persons enter the above-mentioned place, the millimeter wave radar generates original point cloud data representing the respective persons.
And the data calculation module is used for processing according to the original point cloud data to obtain target information of the tracking target.
Specifically, the original point cloud data is processed based on a preset tracking algorithm to obtain target information of a tracking target in the scene.
The information output module is used for outputting the target information of the tracking target.
That is, the target information is output to the user or the system requiring the target information based on the request of the user or the need of the system. And obtaining the information of the personnel in the scene so as to realize energy-saving control, environmental protection control or intelligent security for the places.
According to the above technical scheme, this embodiment provides a personnel monitoring device based on radar data. Specifically, the device acquires raw point cloud data output by a radar in a monitored scene, processes the raw point cloud data based on a preset tracking algorithm to obtain target information of the tracked targets in the scene, and outputs the target information. With this scheme, the target information of the tracked targets in the scene can be effectively output, meeting the need of users to implement energy-saving control, environmental-protection control or intelligent security according to the target information representing the personnel in the scene.
In the present embodiment, the data calculation module includes a point cloud screening unit 21, a target prediction unit 22, a point cloud association unit 23, a point cloud allocation unit 24, and a target update unit 25, as shown in fig. 4.
The point cloud screening unit is used for screening the original point cloud data.
Through the screening of the original point cloud data, a portion of the noise points is removed using the position information of the points, yielding the corresponding point cloud data. Points that do not meet the conditions may be marked as "out-of-bounds points" and do not participate in the subsequent "predict", "associate" and "assign" steps.
The target prediction unit is used for performing prediction processing on the tracking target.
The centroid position of the tracked object is predicted based on the point cloud data at the previous moment, and the tracking target is determined according to the centroid position. Specifically, in the prediction step, the centroid position of the tracked object at time n is estimated from its state and covariance matrix at time n-1, and the tracking target is then determined based on the centroid position.
The point cloud association unit is used for associating the point cloud data with the tracking target.
When this step is executed, a distance function from every point j in the current frame to the track i of each of the two trackable targets is defined and calculated as the amount of innovation the measurement point brings to the track (the specific distance function is given by the formula in the original figure). A threshold G is then defined to represent the maximum acceptable distance from a measurement point to each trackable unit; if the distance is less than the threshold G, the likelihood function of point j with respect to track i is calculated, and the track i that maximizes the log-likelihood is taken as the tracking unit associated with measurement point j.
If the 2 users in the area keep moving, in theory the points in the point cloud can all be associated with the 2 existing trackable units. After this matching is completed, the process goes directly to the update step: each tracking unit is updated based on the set of its associated points, and the innovation, innovation covariance, Kalman gain, posterior state vector and posterior error covariance are updated, ready for the calculation of the next frame.
The point cloud distribution unit is used for clustering the point cloud data which are not associated.
If there are points in the point cloud that are not associated, they enter the "assignment" module, in which the points are clustered according to their closeness in the coordinate system. Each cluster serves as a candidate for a new tracking unit; if a candidate passes checks on conditions such as the number of points, signal-to-noise ratio, radial velocity and maximum distance, it is marked as a new tracking target.
The target updating unit is used to update each tracking target to obtain the target information.
Each tracking unit is updated based on the set of its associated points. In this step, the innovation, innovation covariance, Kalman gain, posterior state vector, posterior error covariance and the like are calculated, the predicted prior information of the target is corrected, and preparation is made for the calculation of the next frame. Each tracking target is then queried to generate its target information.
Through the scheme, the target information of the tracked target can be obtained, so that a data basis is provided for outputting the target information according to the requirements of the user subsequently.
In addition, the data calculation module further includes a dynamic/static conversion unit 26, as shown in fig. 5.
And the dynamic and static conversion unit is used for defining the existing tracking target as a static tracking target and recording the target information of the static tracking target if the point cloud data of the existing tracking target does not appear in a new frame.
Because the point cloud data generated by the millimeter wave radar only depicts dynamic targets in the place, if a tracked target remains relatively static during point cloud tracking, the radar loses its tracking information, causing the problem of discontinuous target tracking. To address this problem, the invention requires the user to set a personalized perception boundary before initialization to demarcate the boundary of the place. For point cloud data that disappears inside the boundary, the tracking algorithm considers that the target has changed from dynamic to static: the tracking target is marked as a static target, the last frame of information tracked by the radar is recorded as the target information of the static target, and, like a dynamic target, it continues to participate in the next round of iteration.
Based on the conversion of the dynamic and static targets, the point cloud distribution unit comprises a first distribution subunit and a second distribution subunit.
The first assigning subunit is used to obtain a new tracking target through clustering if unassociated point cloud data appears at the entrance position of the preset boundary. For example, if another person enters the place, the points in the point cloud data of the newly entered person do not satisfy the association condition with the two previously existing tracking targets; they are treated as unassociated points and assigned to a new tracking target. By checking the position where the new target appears, this person is determined to have entered from outside the area and is not matched with the static targets inside the area, and the person count is increased by 1 to 3 persons.
Considering the influence of radar measurement errors, when a static target in the place starts moving again, the generated point cloud data may not be successfully associated with the static target, which would create a new tracking target in the area and lead to inaccurate counting.
The second assigning subunit is used to solve this problem: only new tracked targets appearing near the area entrance are regarded as people newly entering the area and go through the "assignment" process as new tracking units, while other targets newly appearing inside the area are associated with the existing static targets through the Hungarian matching algorithm. After a successful association, the original static target is deleted and the newly appearing tracking target continues to be updated.
Through the operation, misjudgment of the tracking target in the site can be avoided.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in the process, method, article, or terminal equipment comprising the element.
The technical solutions provided by the present invention are described in detail above, and the principle and the implementation of the present invention are explained in this document by applying specific examples, and the description of the above examples is only used to help understanding the method and the core idea of the present invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present invention.

Claims (10)

1. A personnel monitoring method based on radar data, characterized in that the personnel monitoring method comprises the steps of:
acquiring original point cloud data output by a radar in a monitored scene;
processing the original point cloud data based on a preset tracking algorithm to obtain target information of a tracking target in the scene;
and outputting the target information.
2. The people monitoring method of claim 1, wherein the raw point cloud data comprises range information, Doppler information, and azimuth information for each point therein.
3. The personnel monitoring method as claimed in claim 1, wherein said processing said original point cloud data based on a predetermined tracking algorithm to obtain target information of a tracked target within said scene comprises the steps of:
screening the original point cloud data, and removing noise points in the original point cloud data to obtain processed point cloud data;
predicting the centroid position of a tracking object based on the point cloud data at the previous moment, and determining a tracking target according to the centroid position;
associating the point cloud data at the current moment with the tracking target based on a preset distance function;
clustering the point cloud data which are not associated;
and updating each tracking target to obtain the target information.
4. The personnel monitoring method as claimed in claim 3, wherein after said step of associating the point cloud data of the current time with said tracking target based on a preset distance function, further comprising the steps of:
and if the point cloud data of the existing tracking target does not appear in a new frame, defining the existing tracking target as a static target, and recording the target information of the static target.
5. The people monitoring method as claimed in claim 4, wherein the clustering process of the point cloud data not associated comprises the steps of:
if the point cloud data which are not associated appear at the entrance position of a preset boundary, obtaining a new tracking target through clustering;
and if the point cloud data which is not associated appears in the preset boundary, associating the point cloud data with the static target.
6. A personnel monitoring device based on radar data, the personnel monitoring device comprising:
the data acquisition module is configured to acquire original point cloud data output by a radar in a monitored scene;
the data calculation module is configured to process the original point cloud data based on a preset tracking algorithm to obtain target information of a tracking target in the scene;
an information output module configured to output the target information.
7. The people monitoring device of claim 6, wherein the raw point cloud data comprises range information, Doppler information, and azimuth information for each point therein.
8. The people monitoring device according to claim 6, wherein the data calculation module comprises:
the point cloud screening unit is configured to screen the original point cloud data, remove noise points in the original point cloud data and obtain point cloud data;
a target prediction unit configured to predict a centroid position of a tracking object based on the point cloud data at the previous time, and determine a tracking target according to the centroid position;
a point cloud associating unit configured to associate the point cloud data at the current time with the tracking target based on a preset distance function;
a point cloud allocation unit configured to perform clustering processing on the point cloud data that is not associated;
and the target updating unit is configured to update each tracking target to obtain the target information.
9. The people monitoring device of claim 8, wherein the data calculation module further comprises:
and the dynamic and static conversion unit is configured to define the existing tracking target as a static target and record the target information of the static target if the point cloud data of the existing tracking target does not appear in a new frame after the point cloud association unit associates the point cloud data at the current moment with the tracking target based on a preset distance function.
10. The people monitoring device of claim 9, wherein the point cloud allocation unit comprises:
a first assigning subunit configured to obtain a new tracking target through clustering if the point cloud data not associated appears at an entrance position of a preset boundary;
a second assigning subunit configured to associate the unassociated point cloud data with the static target if it appears within the preset boundary.
CN202110251613.2A 2021-03-08 2021-03-08 Personnel monitoring method and device based on radar data Pending CN115049696A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110251613.2A CN115049696A (en) 2021-03-08 2021-03-08 Personnel monitoring method and device based on radar data


Publications (1)

Publication Number Publication Date
CN115049696A (en) 2022-09-13

Family

ID=83156331

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110251613.2A Pending CN115049696A (en) 2021-03-08 2021-03-08 Personnel monitoring method and device based on radar data

Country Status (1)

Country Link
CN (1) CN115049696A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110780305A (en) * 2019-10-18 2020-02-11 华南理工大学 Track cone bucket detection and target point tracking method based on multi-line laser radar
CN110927712A (en) * 2019-10-28 2020-03-27 珠海格力电器股份有限公司 Tracking method and device
US20200193619A1 (en) * 2018-12-13 2020-06-18 Axis Ab Method and device for tracking an object
CN111340854A (en) * 2019-12-19 2020-06-26 南京理工大学 Mobile robot target tracking method based on ICamshift algorithm
CN112102409A (en) * 2020-09-21 2020-12-18 杭州海康威视数字技术股份有限公司 Target detection method, device, equipment and storage medium
CN112102370A (en) * 2020-09-22 2020-12-18 珠海格力电器股份有限公司 Target tracking method and device, storage medium and electronic device


Similar Documents

Publication Publication Date Title
US10706285B2 (en) Automatic ship tracking method and system based on deep learning network and mean shift
CN111127513B (en) Multi-target tracking method
CN112526513B (en) Millimeter wave radar environment map construction method and device based on clustering algorithm
CN112308881B (en) Ship multi-target tracking method based on remote sensing image
CN108919177B (en) Positioning map construction method based on virtual information source estimation and track correction
CN110927712B (en) Tracking method and device
CN110412378B (en) Target object detection method and device
CN115542308B (en) Indoor personnel detection method, device, equipment and medium based on millimeter wave radar
CN116027324A (en) Fall detection method and device based on millimeter wave radar and millimeter wave radar equipment
CN111279215A (en) Target detection method and device, track management method and device and unmanned aerial vehicle
CN112102370A (en) Target tracking method and device, storage medium and electronic device
US11754704B2 (en) Synthetic-aperture-radar image processing device and image processing method
CN114216434A (en) Target confirmation method, system, equipment and storage medium for mobile measurement and control station
CN115049696A (en) Personnel monitoring method and device based on radar data
CN112884801A (en) High altitude parabolic detection method, device, equipment and storage medium
Hlinomaz et al. A multi-rate multiple model track-before-detect particle filter
CN114967751A (en) Aircraft track tracking method, device, equipment and storage medium
CN114596702B (en) Traffic state prediction model construction method and traffic state prediction method
KR102288938B1 (en) Method and apparatus for detecting targets having different radar cross-sections
CN113221709B (en) Method and device for identifying user motion and water heater
CN111812670B (en) Single photon laser radar space transformation noise judgment and filtering method and device
Yan et al. An efficient extended target detection method based on region growing and contour tracking algorithm
CN113311405A (en) Regional people counting method and device, computer equipment and storage medium
Arróspide et al. Multiple object tracking using an automatic variable-dimension particle filter
CN112738714A (en) Floor recognition method for building, electronic device and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination