CN112102370B - Target tracking method and device, storage medium and electronic device - Google Patents
Target tracking method and device, storage medium and electronic device
- Publication number
- CN112102370B (application CN202011001634.0A)
- Authority
- CN
- China
- Prior art keywords
- target
- radar
- point cloud
- cloud data
- area
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/277—Analysis of motion involving stochastic approaches, e.g. using Kalman filters
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/02—Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
- G01S13/06—Systems determining position data of a target
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/66—Radar-tracking systems; Analogous systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/23—Clustering techniques
- G06F18/232—Non-hierarchical techniques
- G06F18/2321—Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20076—Probabilistic image processing
Abstract
The application discloses a target tracking method and device, a storage medium and an electronic device. The method includes: acquiring point cloud data collected by a radar device; acquiring scene information input for the point cloud data; and performing target tracking according to the point cloud data and the scene information. This addresses the technical problem of low accuracy when tracking targets by radar.
Description
Technical Field
The present application relates to the field of target tracking, and in particular, to a target tracking method and apparatus, a storage medium, and an electronic apparatus.
Background
Radar technology is currently relatively advanced: as an electromagnetic-wave detection device, radar has many excellent characteristics, and with the accelerating development of science, technology, and artificial intelligence, breakthroughs in radar technology have come rapidly. The technology has gradually moved from its initial military use into daily-life applications, and is now used in many places such as airport security scanners, mechanical vibration measurement, and human vital-sign detection. Among radars, millimeter-wave radar has high sensitivity, strong penetration, and requires no contact; it is widely applied in many fields, most notably the automotive field at present. Its use in household and human-body recognition is comparatively lacking (in particular, accurate tracking cannot yet be performed); this is the next stage of radar development, so that the technology truly benefits human life.
In view of the above problems, no effective solution has been proposed at present.
Disclosure of Invention
The embodiment of the application provides a target tracking method and device, a storage medium and an electronic device, which are used for at least solving the technical problem of low target tracking accuracy of a radar.
According to an aspect of an embodiment of the present application, there is provided a target tracking method, including: acquiring point cloud data collected by a radar device; acquiring scene information input for the point cloud data; and performing target tracking according to the point cloud data and the scene information.
According to another aspect of the embodiments of the present application, there is also provided a target tracking apparatus, including: a first acquisition unit configured to acquire point cloud data collected by a radar device; a second acquisition unit configured to acquire scene information input for the point cloud data; and a tracking unit configured to perform target tracking according to the point cloud data and the scene information.
According to another aspect of the embodiments of the present application, there is also provided a storage medium including a stored program that, when run, performs the above method.
According to another aspect of the embodiments of the present application, there is also provided an electronic device including a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor executing the method described above by the computer program.
In the embodiments of the present application, when the millimeter-wave radar is used, the boundary and range to be detected are actively given to the radar, which then detects and tracks accurately according to the transmitted boundary data. This approach of tracking targets in an area by combining the millimeter-wave radar with user scene information solves the technical problem of low target-tracking accuracy by radar.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute a limitation on the application. In the drawings:
FIG. 1 is a flow chart of an alternative target tracking method according to an embodiment of the application;
FIG. 2 is a schematic diagram of an alternative target tracking scheme according to an embodiment of the application;
FIG. 3 is a schematic diagram of an alternative target tracking scheme according to an embodiment of the application;
FIG. 4 is a schematic diagram of an alternative target tracking device according to an embodiment of the application;
and Fig. 5 is a structural block diagram of a terminal according to an embodiment of the present application.
Detailed Description
In order that those skilled in the art may better understand the present application, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the accompanying drawings. The described embodiments are obviously only some, not all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art based on these embodiments without inventive effort shall fall within the scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
According to an aspect of the embodiments of the present application, an embodiment of a target tracking method is provided. The scheme offers a target tracking solution based on combining the millimeter-wave radar with user scene information; it addresses the problem that the millimeter-wave radar sometimes tracks inaccurately, and better realizes tracking, monitoring, and people counting in scenes containing people. FIG. 1 is a flow chart of an alternative target tracking method according to an embodiment of the application; as shown in FIG. 1, the method may include the following steps:
Step S1, acquiring point cloud data collected by a radar device.
Step S2, acquiring scene information input for the point cloud data.
Optionally, acquiring the scene information input for the point cloud data includes: when the radar device is located on the roof, the scene information includes the height of the radar above the ground, the length and width of the area to be detected, and the position of the entrance/exit of the detection area; when the radar device is located in a corner, the scene information includes the positions of the four walls of the detected area relative to the radar and the position of the entrance/exit of the detected area; when the radar device is located on the side of a wall, the scene information includes the positions, relative to the radar, of the three walls other than the wall on which the radar is located, and the position of the entrance/exit of the detected area.
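The per-mode scene parameters above could be captured in small configuration objects; the sketch below is a hypothetical Python shape for them (all field names and units are illustrative assumptions, not part of the patent):

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class RoofScene:
    radar_height_m: float             # height of the radar above the ground
    area_length_m: float              # length of the area to be detected
    area_width_m: float               # width of the area to be detected
    entrance_xy: Tuple[float, float]  # entrance/exit position on the floor plane

@dataclass
class CornerScene:
    # positions of the four walls of the detected area relative to the radar
    wall_positions: Tuple[float, float, float, float]
    entrance_xy: Tuple[float, float]

@dataclass
class WallSideScene:
    # only three walls are needed: the radar sits on the fourth
    wall_positions: Tuple[float, float, float]
    entrance_xy: Tuple[float, float]
```

An app or applet (as described later for boundary entry) would populate one of these and transmit it to the radar.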
Step S3, performing target tracking according to the point cloud data and the scene information.
Optionally, performing target tracking according to the point cloud data and the scene information includes: clustering each frame of point cloud data with a clustering algorithm to detect targets; tracking the targets in each frame of point cloud data in real time with an extended Kalman filter; and, during tracking, releasing any target that has left the radar detection area.
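A minimal runnable sketch of this three-stage per-frame loop (detect, update, release) follows. The one-point-per-detection "clustering" and the greedy nearest-neighbour association are naive stand-ins for illustration only, not the clustering or Kalman filtering the patent actually uses:

```python
import math

def process_frame(points, tracks, area):
    """One frame of the detect -> update -> release loop (illustrative)."""
    # 1) "cluster": naively treat every point as its own detection
    detections = points
    # 2) update each track with its nearest detection (greedy association,
    #    standing in for the extended-Kalman predict/update step)
    for tr in tracks:
        if detections:
            nearest = min(detections, key=lambda p: math.dist(p, tr["pos"]))
            tr["pos"] = nearest
    # 3) release tracks whose position has left the detection area
    xmin, xmax, ymin, ymax = area
    return [tr for tr in tracks
            if xmin <= tr["pos"][0] <= xmax and ymin <= tr["pos"][1] <= ymax]
```

A track that drifts outside the rectangular `area` is simply dropped here; the scene-aware release rule described later refines this.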
Clustering each frame of point cloud data to detect targets includes: performing clustering detection on each frame of point cloud data by region, where the regions include an entrance area and an indoor area.
Optionally, after clustering detection is performed by region on each frame of point cloud data: if a newly formed target is located in the entrance area, it is confirmed as a target to be tracked; if the new target is located in the indoor area and an old target exists there, the new target is confirmed as the target to be tracked and the old indoor target is released; if the new target is located in the indoor area and no old target exists there, the new target is not taken as a target to be tracked.
Optionally, after clustering detection is performed on each frame of point cloud data, if a new target appears in the indoor area repeatedly within a target period of time and no old target exists in the indoor area, the new target is confirmed as the target to be tracked.
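The confirmation rules in the two preceding paragraphs can be sketched as one decision function. The region labels, the repeat counter, and the threshold of 10 repeated indoor appearances (taken from the detailed embodiment later in this description) are illustrative assumptions:

```python
ENTRANCE, INDOOR = "entrance", "indoor"

def handle_new_target(region, indoor_tracks, indoor_count, repeat_limit=10):
    """Decide whether a newly formed target becomes a tracked target.

    Returns (confirmed, tracks_to_release, new_indoor_count).
    """
    if region == ENTRANCE:
        return True, [], 0                 # rule a: confirm at the entrance
    if indoor_tracks:                      # rule b: replace the old indoor target
        return True, list(indoor_tracks), 0
    indoor_count += 1                      # rule c: count repeated indoor appearances
    if indoor_count > repeat_limit:
        return True, [], 0                 # recurred often enough: confirm it
    return False, [], indoor_count
```

The caller would keep `indoor_count` between frames and reset it whenever a target is confirmed.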
Optionally, during tracking, if a target leaves the boundary through the entrance area, it is released; if it does not leave through the entrance area, it is not released even if it is lost.
Through the above steps, when the millimeter-wave radar is used, the boundary and range to be detected are actively given to the radar, which then detects and tracks accurately according to the incoming boundary data. Tracking targets in an area by combining the millimeter-wave radar with user scene information solves the technical problem of low target-tracking accuracy by radar.
Regarding the selection and determination of the boundary, an interactive program such as an app or applet can be provided, through which the user intuitively enters the area information to be detected by the radar (this can also be completed automatically by a machine). The data is transmitted to the millimeter-wave radar through the app or a similar interactive channel; the boundary data is then displayed on the user's phone interface so that the user can judge whether the boundary information is correct and, if so, confirm that the setup succeeded.
As an alternative example, the technical solution of the present application is further described in detail below in connection with the specific embodiments shown in fig. 2 and 3.
This scheme is an upper-layer application built on the point cloud data obtained from millimeter-wave radar detection. By combining the radar with the user-supplied scene information of the detected area, it tracks human targets in the scene, realizing tracking and people counting in a specific scene with improved accuracy and precision.
Step 1, acquiring scene information.
Radar installation falls into three modes: 1) roof mode, i.e., the radar device is on the roof and detects vertically downward; 2) corner mode, i.e., the radar is placed in a corner to detect and collect data; 3) side mode, i.e., the radar is placed on the side of a wall for detection.
Three modes are distinguished because, on the one hand, the principle by which the radar detects or collects data differs between them and, on the other hand, the scene information differs: 1) roof mode: the height of the radar above the ground, the length and width of the area to be detected, and the position of the entrance/exit area; 2) corner mode: the positions of the four walls of the area to be detected relative to the radar, and the position of the entrance/exit area; 3) side mode: the positions, relative to the radar, of the three walls other than the wall on which the radar sits, and the position of the entrance/exit area.
Step 2, millimeter-wave radar point cloud data.
This scheme is an upper-layer application based on the point cloud data obtained by millimeter-wave radar detection. The radar obtains point cloud data by signal processing of the received echo signals; this point cloud is the raw data processed by the scheme, and each point contains: 1) the distance of the target point from the radar; 2) the azimuth angle of the target point relative to the radar; 3) the elevation angle of the target point relative to the radar (detected only by three-dimensional millimeter-wave radar; two-dimensional millimeter-wave radar does not provide it, and this scheme covers both kinds); 4) the velocity of the target point relative to the radar; 5) the signal-to-noise ratio of the target point.
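Before clustering, each point's range/azimuth/elevation is typically converted to Cartesian coordinates. A hedged sketch, assuming a conventional axis layout (y forward along boresight, x to the side, z up); these conventions are an assumption, and a 2-D radar would simply pass elevation = 0:

```python
import math

def to_cartesian(rng, azimuth_rad, elevation_rad=0.0):
    """Convert a radar point (range, azimuth, elevation) to (x, y, z)."""
    x = rng * math.cos(elevation_rad) * math.sin(azimuth_rad)
    y = rng * math.cos(elevation_rad) * math.cos(azimuth_rad)
    z = rng * math.sin(elevation_rad)
    return x, y, z
```

The velocity and signal-to-noise fields would be carried along unchanged for use in association and filtering.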
Step 3, target tracking based on the point cloud data.
The target tracking process based on the point cloud data is as follows:
Step 3.1, clustering each frame of point cloud data with a clustering algorithm to realize target detection, i.e., forming new targets. The clustering algorithm used here is DBSCAN, but the scheme is not limited to it.
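A minimal from-scratch DBSCAN over a frame's 2-D points might look as follows; `eps` and `min_pts` are illustrative values, and a real deployment would tune them to the radar's resolution (or use a library implementation such as scikit-learn's `sklearn.cluster.DBSCAN`):

```python
import math

def dbscan(points, eps=0.5, min_pts=3):
    """Label each point with a cluster id, or -1 for noise."""
    labels = [None] * len(points)          # None = unvisited, -1 = noise
    cluster = -1
    neigh = lambda i: [j for j in range(len(points))
                       if math.dist(points[i], points[j]) <= eps]
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neigh(i)
        if len(seeds) < min_pts:
            labels[i] = -1                 # noise (may become a border point later)
            continue
        cluster += 1                       # start a new cluster from this core point
        labels[i] = cluster
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster        # former noise becomes a border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            more = neigh(j)
            if len(more) >= min_pts:       # expand only through core points
                queue.extend(more)
    return labels
```

Each resulting cluster corresponds to one candidate target (a "new target" in the text above).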
Step 3.2, tracking new targets with an extended Kalman filter, tracking the targets in each frame of point cloud data in real time.
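The full extended Kalman filter involves matrix algebra; as a compact stand-in with the same predict/correct cycle, here is a fixed-gain constant-velocity alpha-beta filter on a single coordinate. The gains, the time step, and the 1-D state are simplifying assumptions for illustration, not the patent's actual filter:

```python
def alpha_beta_track(measurements, dt=0.1, alpha=0.85, beta=0.3):
    """Smooth a sequence of 1-D position measurements, one per frame."""
    x, v = measurements[0], 0.0          # initial position and velocity
    estimates = [x]
    for z in measurements[1:]:
        x_pred = x + v * dt              # predict (constant-velocity model)
        residual = z - x_pred            # innovation: measurement minus prediction
        x = x_pred + alpha * residual    # correct the position estimate
        v = v + (beta / dt) * residual   # correct the velocity estimate
        estimates.append(x)
    return estimates
```

A per-target instance of such a filter (in 2-D or 3-D, with full covariance in the EKF case) is updated once per frame with the associated cluster centroid.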
Step 3.3, target release management: during tracking, some targets may leave the radar detection area and need to be released.
Step 4, defects of target tracking based on point cloud data. Radar detection defects: 1) only moving objects are detected; static objects cannot be accurately detected; 2) reflections off walls produce false targets. Defects of tracking based on the radar-detected point cloud: 1) without a boundary, the algorithm does not know when a target has gone out of bounds and when to release it, so the release mechanism goes wrong in practical application; 2) false targets are easily formed.
Step 5, combining the millimeter-wave radar with the scene information.
Using the prior knowledge in the scene information to constrain the algorithm improves its practical applicability.
The scene information includes the boundary of the area to be detected and the entrance on that boundary. The tracking algorithm combined with the scene information proceeds as follows:
Step 5.1, target formation: each frame of point cloud data is clustered with the DBSCAN algorithm to realize target detection. Clustering detection is performed separately in two regions, the entrance area and the indoor area, with the following rule constraints: a. if a newly formed target is in the entrance area, it is confirmed as a target to be tracked; otherwise it is not processed; b. if the new target is in the indoor area and another (old) target exists there, the new target is confirmed as the target to be tracked and the old indoor target is released (deleted); if no other target is in the indoor area, the new target is not confirmed; c. if case b recurs, i.e., a new target is formed in the indoor area more than 10 consecutive times within a certain period of time, then the new target is confirmed as the target to be tracked as in case a; otherwise it is not processed.
Step 5.2, tracking new targets with an extended Kalman filter, tracking the targets in each frame of point cloud data in real time.
Step 5.3, during tracking, if a target leaves the boundary through the entrance area, it is released; if it does not leave through the entrance area, it is not released even if it is lost (when a target is lost, the actual person may form a new target, and when that new target is formed the old target is released according to the rules above, so that the number of tracked targets stays consistent with the actual number). This overcomes the defect that the radar's count of tracked targets is inaccurate.
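The release rule of step 5.3 reduces to a small predicate. The rectangular entrance region and the `in_area` flag (whether the track's current estimate is still inside the detection boundary) are illustrative assumptions about how the geometry would be represented:

```python
def should_release(track_pos, in_area, entrance_rect):
    """Release a track only if it left the boundary via the entrance region."""
    xmin, xmax, ymin, ymax = entrance_rect
    at_entrance = xmin <= track_pos[0] <= xmax and ymin <= track_pos[1] <= ymax
    # a lost track away from the entrance is kept, matching the rule above
    return (not in_area) and at_entrance
```

Tracks lost elsewhere are retained until the new-target rules of step 5.1 replace them.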
This scheme provides a target tracking method based on combining millimeter-wave radar with scene information. By combining the point cloud data detected by the radar with the scene information, it mainly solves the unstable and inaccurate tracking that occurs when millimeter-wave radar is used alone, and can be applied to appliances whose switches need to be controlled, such as air conditioners and wall-mounted boilers, switching the device on and off according to the user's needs.
It should be noted that, for simplicity of description, the foregoing method embodiments are described as a series of acts; however, those skilled in the art will understand that the present application is not limited by the order of acts described, since some steps may be performed in other orders or concurrently. Further, the embodiments described in the specification are all preferred embodiments, and the acts and modules involved are not necessarily required by the present application.
From the description of the above embodiments, it is clear to those skilled in the art that the methods of the above embodiments may be implemented by software plus a necessary general-purpose hardware platform, or by hardware, though in many cases the former is preferred. Based on this understanding, the technical solution of the present application, or the part contributing over the prior art, may be embodied as a software product stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) that includes instructions causing a terminal device (a mobile phone, computer, server, network device, etc.) to perform the methods of the embodiments of the present application.
According to another aspect of the embodiments of the present application, there is also provided a target tracking apparatus for implementing the above target tracking method. FIG. 4 is a schematic diagram of an alternative target tracking apparatus according to an embodiment of the application; as shown in FIG. 4, the apparatus may include:
a first obtaining unit 401, configured to obtain point cloud data collected by a radar device;

a second obtaining unit 403, configured to obtain scene information input for the point cloud data; and

a tracking unit 405, configured to perform target tracking according to the point cloud data and the scene information.
It should be noted that, the first obtaining unit 401 in this embodiment may be used to perform step S1 in the embodiment of the present application, the second obtaining unit 403 in this embodiment may be used to perform step S2 in the embodiment of the present application, and the tracking unit 405 in this embodiment may be used to perform step S3 in the embodiment of the present application.
Through the above modules, when the millimeter-wave radar is used, the boundary and range to be detected are actively given to the radar, which then detects and tracks accurately according to the incoming boundary data. Tracking targets in an area by combining the millimeter-wave radar with user scene information solves the technical problem of low target-tracking accuracy by radar.
Optionally, the first obtaining unit is further configured such that: when the radar device is located on the roof, the scene information includes the height of the radar above the ground, the length and width of the area to be detected, and the position of the entrance/exit of the detection area; when the radar device is located in a corner, the scene information includes the positions of the four walls of the detected area relative to the radar and the position of the entrance/exit of the detected area; when the radar device is located on the side of a wall, the scene information includes the positions, relative to the radar, of the three walls other than the wall on which the radar is located, and the position of the entrance/exit of the detected area.
Optionally, the tracking unit is further configured to: cluster each frame of point cloud data with a clustering algorithm to detect targets; track the targets in each frame of point cloud data in real time with an extended Kalman filter; and, during tracking, release any target that has left the radar detection area.
Optionally, the tracking unit is further configured to perform clustering detection on each frame of point cloud data by region, where the regions include an entrance area and an indoor area.
Optionally, the tracking unit is further configured such that, after clustering detection is performed by region on each frame of point cloud data: if a newly formed target is located in the entrance area, it is confirmed as a target to be tracked; if the new target is located in the indoor area and an old target exists there, the new target is confirmed as the target to be tracked and the old indoor target is released; if the new target is located in the indoor area and no old target exists there, the new target is not taken as a target to be tracked.
Optionally, the tracking unit is further configured such that, after clustering detection is performed on each frame of point cloud data, if a new target appears in the indoor area repeatedly within a target period of time and no old target exists in the indoor area, the new target is confirmed as the target to be tracked.
It should be noted that the above modules implement the same examples and application scenarios as the corresponding steps, but are not limited to the disclosure of the above embodiments. The above modules may run as part of the apparatus in a corresponding hardware environment (which includes a network environment), and may be implemented in software or in hardware.
According to another aspect of the embodiments of the present application, there is also provided a server or terminal for implementing the above target tracking method.
Fig. 5 is a structural block diagram of a terminal according to an embodiment of the present application. As shown in fig. 5, the terminal may include: one or more processors 201 (only one is shown), a memory 203, and a transmission device 205; as shown in fig. 5, the terminal may also include an input/output device 207.
The memory 203 may be used to store software programs and modules, such as program instructions/modules corresponding to the method and apparatus for tracking an object in the embodiment of the present application, and the processor 201 executes the software programs and modules stored in the memory 203, thereby performing various functional applications and data processing, that is, implementing the method for tracking an object. Memory 203 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 203 may further comprise memory remotely located relative to the processor 201, which may be connected to the terminal via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 205 is used for receiving or transmitting data via a network, and may also be used for data transmission between the processor and the memory. Specific examples of the network described above may include wired networks and wireless networks. In one example, the transmission device 205 includes a network adapter (Network Interface Controller, NIC) that may be connected to other network devices and routers via a network cable to communicate with the internet or a local area network. In one example, the transmission device 205 is a Radio Frequency (RF) module, which is used to communicate with the internet wirelessly.
In particular, the memory 203 is used to store an application program.
The processor 201 may call the application program stored in the memory 203 through the transmission device 205 to perform the following steps:
acquiring point cloud data collected by a radar device;
acquiring scene information input for the point cloud data;
and performing target tracking according to the point cloud data and the scene information.
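The three steps above can be sketched as a minimal pipeline. This is an illustrative sketch, not the patent's implementation: `cluster_frame` is a toy grid-based stand-in for the clustering step, and nearest-neighbor association stands in for the extended Kalman filter tracking described later; all names are hypothetical.

```python
from dataclasses import dataclass


@dataclass
class Track:
    track_id: int
    position: tuple  # latest (x, y) position estimate


def cluster_frame(points, cell=1.0):
    """Toy clustering stand-in: average all points falling in the same grid cell."""
    cells = {}
    for x, y in points:
        cells.setdefault((int(x // cell), int(y // cell)), []).append((x, y))
    return [
        (sum(p[0] for p in pts) / len(pts), sum(p[1] for p in pts) / len(pts))
        for pts in cells.values()
    ]


def run_pipeline(frames, max_dist=1.5):
    """Cluster each frame, then associate each detection with the nearest live
    track; unmatched detections start new tracks, unmatched tracks are released."""
    tracks, next_id = [], 0
    for points in frames:
        live = list(tracks)
        tracks = []
        for det in cluster_frame(points):
            def d2(t):
                return (t.position[0] - det[0]) ** 2 + (t.position[1] - det[1]) ** 2
            best = min(live, key=d2, default=None)
            if best is not None and d2(best) <= max_dist ** 2:
                best.position = det      # update the matched track in place
                live.remove(best)
                tracks.append(best)
            else:
                tracks.append(Track(next_id, det))  # new target enters the scene
                next_id += 1
    return tracks
```

A target that moves a short distance between frames keeps its track identity, which is the behavior the later claims refine with scene-dependent rules.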
The processor 201 is further configured to perform the steps of:
in a case where the radar apparatus is mounted on the ceiling, the scene information includes: the height of the radar above the ground, the length and width of the area to be detected, and the entry and exit positions of the detection area;
in a case where the radar apparatus is mounted in a corner, the scene information includes: the positions of the four walls of the detected area relative to the radar, and the entry and exit positions of the detected area;
in a case where the radar apparatus is mounted on a wall, the scene information includes: the positions, relative to the radar, of the three walls of the detected area other than the wall on which the radar is mounted, and the entry and exit positions of the detected area.
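As a rough illustration, the three mounting cases above could be captured as configuration types like the following. The field names and the use of signed offsets for wall positions are my own assumptions for the sketch, not the patent's data format.

```python
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]  # (x, y) in the radar's coordinate frame


@dataclass
class CeilingScene:
    """Radar mounted on the ceiling of the room."""
    radar_height_m: float              # height of the radar above the ground
    area_length_m: float               # length of the area to be detected
    area_width_m: float                # width of the area to be detected
    entry_exit_positions: List[Point]  # where targets enter/exit the area


@dataclass
class CornerScene:
    """Radar mounted in a corner of the room."""
    wall_offsets_m: List[float]        # positions of the four walls relative to the radar
    entry_exit_positions: List[Point]


@dataclass
class WallScene:
    """Radar mounted on one wall of the room."""
    other_wall_offsets_m: List[float]  # the three walls other than the radar's own
    entry_exit_positions: List[Point]


def scene_info_fields(scene) -> List[str]:
    """List the scene-information fields a given mounting case requires."""
    return list(scene.__dataclass_fields__)
```

Modeling the three cases as distinct types makes the user-input step explicit: each mounting position demands a different set of parameters before tracking can start.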
Optionally, for specific examples in this embodiment, reference may be made to the examples described in the foregoing embodiments, which are not repeated here.
It will be appreciated by those skilled in the art that the structure shown in Fig. 5 is only illustrative. The terminal may be a smartphone (such as an Android or iOS phone), a tablet computer, a palmtop computer, a mobile internet device (MID), a PAD, or the like. Fig. 5 does not limit the structure of the electronic device. For example, the terminal may include more or fewer components than shown in Fig. 5 (e.g., network interfaces, display devices), or have a configuration different from that shown in Fig. 5.
Those of ordinary skill in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing the relevant hardware of a terminal device. The program may be stored in a computer-readable storage medium, and the storage medium may include: a flash disk, read-only memory (ROM), random access memory (RAM), a magnetic disk, an optical disk, and the like.
The embodiments of the present application also provide a storage medium. Optionally, in this embodiment, the above storage medium may be used to store the program code for executing the target tracking method.
Optionally, in this embodiment, the storage medium may be located on at least one of the plurality of network devices in the network shown in the above embodiments.
Optionally, in this embodiment, the storage medium is configured to store program code for performing the following steps:
acquiring point cloud data collected by a radar device;
acquiring scene information input for the point cloud data;
and performing target tracking according to the point cloud data and the scene information.
Optionally, the storage medium is further configured to store program code for performing the following steps:
in a case where the radar apparatus is mounted on the ceiling, the scene information includes: the height of the radar above the ground, the length and width of the area to be detected, and the entry and exit positions of the detection area;
in a case where the radar apparatus is mounted in a corner, the scene information includes: the positions of the four walls of the detected area relative to the radar, and the entry and exit positions of the detected area;
in a case where the radar apparatus is mounted on a wall, the scene information includes: the positions, relative to the radar, of the three walls of the detected area other than the wall on which the radar is mounted, and the entry and exit positions of the detected area.
Optionally, for specific examples in this embodiment, reference may be made to the examples described in the foregoing embodiments, which are not repeated here.
Optionally, in this embodiment, the storage medium may include, but is not limited to: a USB flash drive, read-only memory (ROM), random access memory (RAM), a removable hard disk, a magnetic disk, an optical disk, or various other media capable of storing program code.
The foregoing embodiment numbers of the present application are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
The integrated units in the above embodiments, if implemented in the form of software functional units and sold or used as independent products, may be stored in the above computer-readable storage medium. Based on this understanding, the technical solution of the present application — in essence, the part contributing to the prior art, or all or part of the technical solution — may be embodied in the form of a software product stored in a storage medium, including several instructions for causing one or more computer devices (which may be personal computers, servers, network devices, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application.
In the foregoing embodiments of the present application, each embodiment is described with its own emphasis; for any part not detailed in one embodiment, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided by the present application, it should be understood that the disclosed client may be implemented in other ways. The apparatus embodiments described above are merely exemplary; for example, the division into units is merely a logical functional division, and there may be other divisions in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be through some interfaces, units, or modules, and may be electrical or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, each unit may exist physically on its own, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
The foregoing is merely a preferred embodiment of the present application. It should be noted that those skilled in the art may make various modifications and improvements without departing from the principles of the present application, and such modifications and improvements shall also fall within the scope of protection of the present application.
Claims (6)
1. A method of tracking a target, comprising:
acquiring point cloud data collected by a radar device;
acquiring scene information input for the point cloud data;
and performing target tracking according to the point cloud data and the scene information;
wherein acquiring the scene information input for the point cloud data comprises:
in a case where the radar apparatus is mounted on the ceiling, the scene information includes: the height of the radar above the ground, the length and width of the area to be detected, and the entry and exit positions of the detection area;
in a case where the radar apparatus is mounted in a corner, the scene information includes: the positions of the four walls of the detected area relative to the radar, and the entry and exit positions of the detected area;
in a case where the radar apparatus is mounted on a wall, the scene information includes: the positions, relative to the radar, of the three walls of the detected area other than the wall on which the radar is mounted, and the entry and exit positions of the detected area;
wherein performing target tracking according to the point cloud data and the scene information comprises:
clustering each frame of point cloud data with a clustering algorithm to detect targets;
tracking the targets in each frame of point cloud data in real time with an extended Kalman filter algorithm;
and, during target tracking, releasing any target that has left the radar detection area;
wherein clustering each frame of point cloud data with a clustering algorithm to detect targets comprises:
performing clustering detection on each frame of point cloud data by partitioned areas, wherein the partitioned areas include an entrance area and an indoor area;
and after the partitioned clustering detection is performed on each frame of point cloud data, the method further comprises:
in a case where a newly formed target is located in the entrance area, confirming the new target as a target to be tracked;
in a case where a newly formed target is located in the indoor area and an old target exists in the indoor area, confirming the new target as a target to be tracked and releasing the old target in the indoor area;
in a case where a newly formed target is located in the indoor area and no old target exists in the indoor area, not taking the new target as a target to be tracked.
2. The method of claim 1, wherein after clustering the point cloud data partitions of each frame using a clustering algorithm, the method further comprises:
if new targets are repeatedly generated in the indoor area within a target time period and no old target exists in the indoor area, confirming the new target as a target to be tracked.
3. The method of claim 1, wherein releasing the target exiting the radar detection area during target tracking comprises:
during target tracking, if a target leaves the boundary through the entrance area, the target is released; if a target does not leave through the entrance area, it is not released even if it is lost.
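The confirmation and release rules of claims 1 through 3 amount to a small decision procedure. The sketch below is an interpretation under stated assumptions: the function names are hypothetical, and "repeatedly generated" (claim 2) is modeled as a simple appearance counter with an arbitrary threshold.

```python
def confirm_new_target(area, indoor_has_old_target,
                       repeat_count=0, repeat_threshold=3):
    """Decide (track_new, release_old) for a newly clustered target.

    Implements the area rules of claims 1 and 2:
    - entrance area: always track the new target;
    - indoor area with an old target present: track the new one, release the old;
    - indoor area with no old target: only track after repeated appearances.
    """
    if area == "entrance":
        return True, False
    if area == "indoor":
        if indoor_has_old_target:
            return True, True
        return repeat_count >= repeat_threshold, False
    raise ValueError(f"unknown area: {area!r}")


def should_release(left_boundary_via_entrance, target_lost):
    """Release rule of claim 3: only leaving through the entrance area releases
    a target; a target that is merely lost elsewhere is deliberately kept."""
    return left_boundary_via_entrance  # target_lost alone never releases
```

Keeping a lost-but-not-exited target alive matches the physical constraint that a person inside a room can only leave through the entrance, so a dropped detection indoors is treated as occlusion rather than departure.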
4. A target tracking apparatus, applying the target tracking method according to any one of claims 1 to 3, comprising:
a first acquisition unit, configured to acquire point cloud data collected by a radar device;
a second acquisition unit, configured to acquire scene information input for the point cloud data;
and a tracking unit, configured to perform target tracking according to the point cloud data and the scene information.
5. A storage medium comprising a stored program, wherein the program, when run, performs the method of any one of claims 1 to 3.
6. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor performs the method of any of the preceding claims 1 to 3 by means of the computer program.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011001634.0A CN112102370B (en) | 2020-09-22 | 2020-09-22 | Target tracking method and device, storage medium and electronic device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011001634.0A CN112102370B (en) | 2020-09-22 | 2020-09-22 | Target tracking method and device, storage medium and electronic device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112102370A CN112102370A (en) | 2020-12-18 |
CN112102370B true CN112102370B (en) | 2024-09-06 |
Family
ID=73755806
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011001634.0A Active CN112102370B (en) | 2020-09-22 | 2020-09-22 | Target tracking method and device, storage medium and electronic device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112102370B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112946618B (en) * | 2021-01-26 | 2023-02-17 | 北京清雷科技有限公司 | Indoor personnel positioning method, device and system and household appliance |
CN115049696A (en) * | 2021-03-08 | 2022-09-13 | 北京金茂绿建科技有限公司 | Personnel monitoring method and device based on radar data |
CN114442081A (en) * | 2021-12-21 | 2022-05-06 | 珠海格力电器股份有限公司 | Personnel detection method, device, storage medium and electronic equipment |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110609281A (en) * | 2019-08-23 | 2019-12-24 | 珠海格力电器股份有限公司 | Region detection method and device |
CN111080679A (en) * | 2020-01-02 | 2020-04-28 | 东南大学 | Method for dynamically tracking and positioning indoor personnel in large-scale place |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3667557B1 (en) * | 2018-12-13 | 2021-06-16 | Axis AB | Method and device for tracking an object |
CN111578949B (en) * | 2020-07-03 | 2023-07-25 | 筑石科技(湖州)有限公司 | Indoor positioning method and device, storage medium and electronic device |
-
2020
- 2020-09-22 CN CN202011001634.0A patent/CN112102370B/en active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110609281A (en) * | 2019-08-23 | 2019-12-24 | 珠海格力电器股份有限公司 | Region detection method and device |
CN111080679A (en) * | 2020-01-02 | 2020-04-28 | 东南大学 | Method for dynamically tracking and positioning indoor personnel in large-scale place |
Also Published As
Publication number | Publication date |
---|---|
CN112102370A (en) | 2020-12-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112102370B (en) | Target tracking method and device, storage medium and electronic device | |
JP6606170B2 (en) | Mobile terminal positioning based on electromagnetic signals | |
CN110363076B (en) | Personnel information association method and device and terminal equipment | |
CN110488264A (en) | Personnel's detection method, device, electronic equipment and storage medium | |
CN106461786B (en) | Indoor global positioning system | |
CN110933632B (en) | Terminal indoor positioning method and system | |
DK2928243T3 (en) | PROCEDURE FOR INDOOR POSITIONING OF WIRELESS LOCAL DEVICES (WLAN) | |
US11879967B2 (en) | Radar for tracking or generating radar images of passive objects | |
CN109348428A (en) | A kind of fingerprint base fast construction method of bluetooth indoor locating system | |
GB2538510B (en) | Interoperating sensing devices and mobile devices | |
CN114137505B (en) | Target detection method and device based on wireless radar | |
CN107402374A (en) | A kind of localization method, server and alignment system | |
CN110765823A (en) | Target identification method and device | |
Poston et al. | A framework for occupancy tracking in a building via structural dynamics sensing of footstep vibrations | |
CN113447959A (en) | GNSS deception jamming detection method based on Doppler frequency and related device | |
CN111220959B (en) | Indoor multipath false target identification method and device, electronic equipment and storage medium | |
Tekir et al. | Signal preprocessing routines for the detection and classification of human micro‐Doppler radar signatures | |
CN113344954A (en) | Boundary detection method and device, computer equipment, storage medium and sensor | |
He et al. | WiFi based indoor localization with adaptive motion model using smartphone motion sensors | |
CN111654843B (en) | Method and system for automatically updating fingerprint database, wifi positioning method and system | |
US10012729B2 (en) | Tracking subjects using ranging sensors | |
Waqar et al. | A range error reduction technique for positioning applications in sports | |
CN116996760A (en) | Video data processing method and device, computer readable medium and electronic equipment | |
CN115204221A (en) | Method and device for detecting physiological parameters and storage medium | |
US10942267B2 (en) | Video object processing |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |