CN115330841A - Method, apparatus, device and medium for detecting projectile based on radar map - Google Patents


Publication number
CN115330841A
Authority: CN (China)
Prior art keywords: target; radar image; track; detected; position information
Legal status: Pending (assumed by Google; not a legal conclusion)
Application number: CN202211056912.1A
Other languages: Chinese (zh)
Inventors: 张军 (Zhang Jun), 顾超 (Gu Chao), 许孝勇 (Xu Xiaoyong), 陶征 (Tao Zheng), 章庆 (Zhang Qing), 朱大安 (Zhu Da'an), 仇世豪 (Qiu Shihao), 王长冬 (Wang Changdong), 张辉 (Zhang Hui)
Current assignee: Nanjing Hurys Intelligent Technology Co Ltd (listed assignees may be inaccurate)
Original assignee: Nanjing Hurys Intelligent Technology Co Ltd
Application filed by Nanjing Hurys Intelligent Technology Co Ltd
Priority application: CN202211056912.1A
Publication: CN115330841A
Legal status: Pending

Classifications

    • G06T 7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G01S 13/89: Radar or analogous systems specially adapted for mapping or imaging
    • G06T 5/30: Erosion or dilatation, e.g. thinning
    • G06T 5/70: Denoising; smoothing
    • G06T 7/13: Edge detection
    • G06T 7/194: Segmentation involving foreground-background segmentation
    • G06T 7/215: Motion-based segmentation
    • G06T 7/254: Analysis of motion involving subtraction of images
    • G06T 2207/10044: Radar image
    • G06T 2207/20224: Image subtraction

Abstract

The invention discloses a radar-map-based projectile detection method, apparatus, device and medium. The method includes the following steps: acquiring a current radar image of a radar detection area; performing track tracking on a target to be detected in the current radar image according to the current radar image and a historical radar image, to obtain a current track tracking result of the target to be detected; and performing projectile detection on the target to be detected according to the current track tracking result and a historical track tracking result of the target, where the historical track tracking result is determined according to the historical radar image. The technical scheme of the application solves the problem of how to rapidly detect a projectile from a radar map and achieves accurate tracking and monitoring of projectiles.

Description

Method, apparatus, device and medium for detecting projectile based on radar map
Technical Field
The invention relates to the technical field of image detection, in particular to a method, a device, equipment and a medium for detecting a projectile based on a radar map.
Background
Digital traffic is an important field of digital economic development: it promotes the deep integration of advanced information technology with the transportation field and drives the intelligent, digital and information-based development of the traffic industry. Problems caused by traffic anomalies and objects thrown or spilled on roads are receiving increasing attention. For example, in traffic application scenarios such as urban roads, tunnel roads, expressways, railways and water transportation, the presence of projectiles easily causes a series of traffic accidents, seriously affects traffic capacity and raises serious safety problems.
How to detect traffic incidents accurately and quickly, and to identify and handle thrown objects in time, has become an important topic in the field of intelligent traffic security.
In the related scheme for detecting projectiles on a road, a scanning radar is used to generate a panoramic radar map in which projectiles are detected. However, many image factors on the road may interfere with projectile detection in the radar map, making it difficult to detect quickly and accurately whether a projectile is present, and possibly causing misjudgment. Therefore, detecting projectiles and identifying traffic events quickly and accurately is vital for improving traffic efficiency and guaranteeing safety.
Disclosure of Invention
The invention provides a method, a device, equipment and a medium for detecting a projectile based on a radar map, and aims to solve the problem of how to quickly detect the projectile through the radar map.
According to an aspect of the invention, there is provided a radar map-based projectile detection method, which may include:
acquiring a current radar image of a radar detection area;
performing track tracking on a target to be detected in the current radar image according to the current radar image and the historical radar image to obtain a current track tracking result of the target to be detected;
carrying out projectile detection on the target to be detected according to the current track tracking result and the historical track tracking result of the target to be detected; and determining the historical track tracking result according to the historical radar image.
According to another aspect of the present invention, there is provided a radar map based projectile detection apparatus, the apparatus may comprise:
the image determining module is used for acquiring a current radar image of a radar detection area;
the track tracking module is used for carrying out track tracking on the target to be detected in the current radar image according to the current radar image and the historical radar image to obtain a current track tracking result of the target to be detected;
the detection module is used for carrying out projectile detection on the target to be detected according to the current track tracking result and the historical track tracking result of the target to be detected; and determining the historical track tracking result according to the historical radar image.
According to another aspect of the present invention, there is provided an electronic apparatus including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform a radar map based projectile detection method as described in any one of the embodiments of the invention.
According to another aspect of the present invention, there is provided a computer readable storage medium having stored thereon computer instructions for causing a processor to execute a method for radar map based projectile detection according to any one of the embodiments of the present invention.
According to the technical scheme of the embodiment of the invention, the current radar image and the historical radar image of the radar detection area are used for tracking the track of the target to be detected in the current radar image to obtain the current track tracking result of the target to be detected, and then the projectile is detected on the target to be detected according to the current track tracking result and the historical track tracking result of the target to be detected, so that the problem of how to quickly detect the projectile through the radar image is solved, and the accurate tracking and monitoring of the projectile is realized.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present invention, nor do they necessarily limit the scope of the invention. Other features of the present invention will become apparent from the following description.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
FIG. 1 is a flow chart of a radar map based method for detecting a projectile in accordance with an embodiment of the present invention;
FIG. 2 is a flowchart of a radar map based projectile detection method according to a second embodiment of the present invention;
FIG. 3 is a flowchart of a radar map based projectile detection method according to a third embodiment of the present invention;
fig. 4 is a schematic structural diagram of a projectile detection device based on a radar chart according to a fourth embodiment of the invention;
fig. 5 is a schematic structural diagram of an electronic device implementing the radar-map-based projectile detection method according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," "object," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Moreover, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example one
Fig. 1 is a flowchart of a radar-map-based projectile detection method according to an embodiment of the present invention, where this embodiment is applicable to performing track tracking on a target to be detected in a current radar image according to an acquired current radar image and an acquired historical radar image to obtain a current track tracking result of the target to be detected, and then determining whether the target to be detected is a projectile according to the current track tracking result and the historical track tracking result. As shown in fig. 1, the method includes:
and S110, acquiring a current radar image of the radar detection area.
The current radar image may be a radar image obtained by scanning a radar detection area with a radar at the current time. The accurate acquisition of the current radar image is beneficial to subsequent detection and judgment of the target to be detected in the current radar image.
And S120, performing track tracking on the target to be detected in the current radar image according to the current radar image and the historical radar image to obtain a current track tracking result of the target to be detected.
The current track tracking result includes current track position information, current track speed and corresponding track identification information. The track identification information uniquely identifies the object corresponding to a target area; in the track tracking algorithm, identification can be performed according to the object features in the target area, so as to avoid projectile-detection errors caused by mis-identifying an object. The current track position information can represent the position coordinate information of the current object.
In a feasible embodiment, performing track tracking on the target to be detected in the current radar image according to the current radar image and the historical radar image to obtain a current track tracking result of the target to be detected, which may include the following steps A1-A3:
a1, separating the background and the foreground in the current radar image to obtain a target radar image with the background removed.
And A2, determining the target position information of the target to be detected in the foreground according to the target radar image.
And A3, performing track tracking on the target position information of the target to be detected according to the historical radar image of the radar detection area to obtain a current track tracking result of the target to be detected.
And the foreground of the target radar image corresponds to at least one target area to be detected.
The background may be, among other things, immovable objects that are always present in the radar detection area, such as road surfaces, buildings, etc. The foreground may be a movable object, such as a vehicle, a projectile, etc., present in the radar detection area. The target position information may be information in the target radar image to characterize the position of the target to be detected.
According to the technical scheme, the current radar image is obtained, the background is separated from the foreground of the current radar image, the target radar image with the background removed is obtained, the target position information of the target to be detected in the foreground is determined according to the target radar image, and finally, the track tracking is carried out on the target position information of the target to be detected according to the historical radar image in the radar detection area, so that the accurate current track tracking result of the target to be detected is obtained, and whether the target to be detected is a sprinkled object or not is accurately judged subsequently.
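As a hedged illustration, steps A1 to A3 can be sketched end to end on toy data. All function names, the gray threshold, and the single-centroid position extraction are assumptions for illustration, not the patent's actual implementation:

```python
def separate_foreground(frame, background, t=50):
    # A1: threshold the absolute difference against a background estimate
    return [[255 if abs(p - b) >= t else 0 for p, b in zip(r, br)]
            for r, br in zip(frame, background)]

def target_positions(binary):
    # A2: centroid of all foreground pixels (a real system would label blobs)
    pts = [(i, j) for i, r in enumerate(binary)
           for j, v in enumerate(r) if v == 255]
    if not pts:
        return []
    n = len(pts)
    return [(sum(i for i, _ in pts) / n, sum(j for _, j in pts) / n)]

def update_tracks(tracks, positions):
    # A3: naive association, appending each new position to one track list
    tracks.extend(positions)
    return tracks

background = [[10, 10, 10] for _ in range(3)]
frame = [[10, 10, 10], [10, 200, 10], [10, 10, 10]]  # bright target at centre
track = update_tracks([], target_positions(separate_foreground(frame, background)))
# track -> [(1.0, 1.0)]
```

A production pipeline would label connected foreground regions, track multiple targets, and associate them by feature similarity rather than a single global centroid.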
In a possible embodiment, the separating the background and the foreground in the current radar image to obtain the target radar image with the background removed may include the following steps B1 to B3:
step B1, determining a preset number of previous radar images acquired in a radar detection area before the current radar image is acquired; the last radar image comprises a radar image acquired in a time adjacent to the current radar image or a radar image acquired by closing a radar detection area.
And B2, performing image accumulation averaging on the previous radar images in the preset number to obtain an accumulation average image corresponding to the previous radar image.
And B3, separating the background and the foreground in the current radar image according to the accumulated average image to obtain the target radar image with the background removed.
The preset number may be the number of radar images, determined according to actual requirements, that need to be acquired by scanning the radar detection area before the current radar image is acquired. The previous radar images may be the set of all radar images obtained by the radar scanning the detection area in the period immediately before the current radar image is acquired; alternatively, the radar detection area may be closed to traffic, so that the radar can scan the area and acquire images without interference factors such as vehicles. The radar is a microwave radar, such as a millimeter-wave radar.
Specifically, the radar scans the detection area to acquire the preset number $N$ of previous radar images, each denoted $F_i$, and image accumulation averaging over them yields the accumulated average image of the previous radar images:

$$\bar{F} = \frac{1}{N}\sum_{i=1}^{N} F_i$$

Then the current radar image acquired by scanning the radar detection area at the current moment is obtained, and the background and foreground in the current radar image are separated according to the accumulated average image of the previous radar images, to obtain the target radar image with the background removed.
According to the technical scheme, the accumulated average image of the last radar image is more accurate by carrying out image accumulated average on the last radar image in the preset number, so that the background and the foreground in the current radar image are separated according to the accumulated average image of the last radar image, the accurate target radar image with the background removed is obtained, and the current track tracking result of the current radar image is favorably and more accurately obtained subsequently.
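The image accumulation averaging of steps B1 and B2 can be sketched on toy grayscale frames represented as lists of lists (the function name and the tiny frames are illustrative assumptions):

```python
def accumulate_average(frames):
    # Pixel-wise mean over a preset number of previous radar frames
    n = len(frames)
    rows, cols = len(frames[0]), len(frames[0][0])
    return [[sum(f[i][j] for f in frames) / n for j in range(cols)]
            for i in range(rows)]

frames = [
    [[10, 20], [30, 40]],
    [[20, 30], [40, 50]],
    [[30, 40], [50, 60]],
]
avg = accumulate_average(frames)  # -> [[20.0, 30.0], [40.0, 50.0]]
```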
In a possible embodiment, the method for separating the background from the foreground in the current radar image according to the accumulated average image to obtain the target radar image with the background removed may include the following steps C1-C2:
and step C1, performing image difference processing on the accumulated average image of the current radar image and the previous radar image to obtain an image after image difference processing.
And step C2, performing binarization processing on the image after the image difference processing, and separating the background and the foreground in the current radar image to obtain a target radar image with the background removed.
The image difference processing may be a difference operation between two similar images. Binarization means that each pixel of the image takes only one of two gray values, 0 or 255, representing black and white respectively. It can be performed by the following formula:

$$f'_{ij} = \begin{cases} 255, & f_{ij} \ge T \\ 0, & f_{ij} < T \end{cases}$$

where $f'_{ij}$ is the gray value of the corresponding pixel of the radar image after binarization, $f_{ij}$ is the gray value of the corresponding pixel in the radar image, and $T$ is a preset gray threshold, namely the critical value at which a pixel's gray value is converted to 0 or 255. When the gray value of a pixel is greater than or equal to the preset threshold, it is converted to 255; otherwise it is converted to 0.
Specifically, after the image accumulation averaging over the preset number of previous radar images yields the accumulated average image $\bar{F}$, the current radar image $F$ is obtained, and image difference processing between the current radar image and the accumulated average image gives the difference image $F_\Delta$:

$$F_\Delta = \left|\, F - \bar{F} \,\right|$$

where $F$ is the gray-scale image of the current radar image.

Then the difference image $F_\Delta$ is binarized, i.e. each pixel value of $F_\Delta$ is converted to 0 or 255; pixels with value 0 are taken as background and pixels with value 255 as foreground. The background and foreground in the current radar image can thus be separated according to the binarized difference image, giving the target radar image with the background removed.
In this technical scheme, image difference processing between the current radar image and the accumulated average image of the previous radar images yields the difference image; binarizing the difference image separates the background and foreground in the current radar image and yields the target radar image with the background removed. Accurately separating the background and foreground makes the background-removed target radar image more accurate, so that the current track tracking result of the current radar image can subsequently be obtained more accurately.
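Steps C1 and C2 can be sketched together: difference the current frame against the accumulated average, then binarize with the preset threshold T (the threshold value and function name are illustrative assumptions):

```python
T = 60  # preset gray threshold (assumed value)

def difference_and_binarize(current, avg, t=T):
    # C1: pixel-wise absolute difference; C2: threshold to 0/255
    out = []
    for crow, arow in zip(current, avg):
        out.append([255 if abs(c - a) >= t else 0
                    for c, a in zip(crow, arow)])
    return out

avg = [[50, 50, 50]]
current = [[50, 180, 55]]        # middle pixel is a foreground object
binary = difference_and_binarize(current, avg)  # -> [[0, 255, 0]]
```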
S130, carrying out projectile detection on the target to be detected according to the current track tracking result and the historical track tracking result of the target to be detected; and determining the historical track tracking result according to the historical radar image.
The historical track tracking result at least comprises track starting position information, historical track position information, track continuous non-target associated frame number, motion stopping frame number and corresponding track identification information. The historical track tracking result can be used for representing information related to the historical motion trail of the object in the target area, and can be obtained by analyzing and processing the historical radar image.
The track start position information may be the coordinate at which the object first appears in the radar detection area, indicating the starting point of its motion. The historical track position information represents the coordinates of the object during its historical motion, and may be the coordinates of the object in the frame immediately preceding the current radar image. The continuous non-target-associated frame number of a track may be the number of consecutive radar image frames in which the target object corresponding to the track identification information does not appear. The motion-stop frame number represents the number of frames during which the target object's position has stopped changing in the course of track tracking, and can be used to judge how long the object has remained stationary.
Specifically, according to a current track tracking result determined by a current radar image and a historical track tracking result determined by a historical radar image, track identification information in the current track tracking result is matched with track identification information in the historical track tracking result, the historical track tracking result is updated according to the matching result, finally, the motion state change condition of an object in each target area is determined according to track starting position information, historical track position information, continuous track non-target associated frame number, motion stop frame number and corresponding track identification information in the updated historical track tracking result, and then the object in the target area is subjected to object throwing detection according to the motion state change condition.
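As a hedged sketch, the track-record fields described above can be held in a small data structure, together with a toy rule that flags a track as a projectile candidate once its motion-stop frame count reaches a limit. The field names, the movement tolerance `eps` and the `stop_limit` value are assumptions, not the patent's definitions:

```python
from dataclasses import dataclass, field

@dataclass
class Track:
    track_id: int                                 # track identification information
    start_pos: tuple                              # track start position information
    history: list = field(default_factory=list)   # historical track positions
    missed_frames: int = 0                        # continuous non-target-associated frames
    stopped_frames: int = 0                       # motion-stop frame count

def update(track, pos, eps=0.5, stop_limit=3):
    # If the new position barely moves, count a stopped frame; a track that
    # stays still long enough becomes a projectile candidate (toy rule).
    if track.history and abs(pos[0] - track.history[-1][0]) < eps \
            and abs(pos[1] - track.history[-1][1]) < eps:
        track.stopped_frames += 1
    else:
        track.stopped_frames = 0
    track.history.append(pos)
    return track.stopped_frames >= stop_limit  # projectile candidate?

t = Track(track_id=1, start_pos=(0.0, 0.0), history=[(0.0, 0.0)])
flags = [update(t, (5.0, 5.0)), update(t, (5.0, 5.0)),
         update(t, (5.0, 5.0)), update(t, (5.0, 5.0))]
# flags -> [False, False, False, True]
```

The object moves once and then stays still; after three motionless frames the toy rule reports a candidate projectile.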
According to the technical scheme of the embodiment of the invention, the track tracking is carried out on the target to be detected in the current radar image through the obtained current radar image and historical radar image of the radar detection area to obtain the current track tracking result of the target to be detected, and then the object to be detected is detected according to the current track tracking result and historical track tracking result of the target to be detected, so that the problem of how to quickly detect the object to be detected through the radar image is solved, and the accurate tracking and monitoring of the object to be detected is realized.
Example two
Fig. 2 is a flowchart of a radar-map-based projectile detection method according to a second embodiment of the present invention. This embodiment details how the target position information of the target to be detected in the foreground is determined from the target radar image, and how track tracking is performed on that position information according to the historical radar images of the radar detection area, so as to obtain the current track tracking result of the target to be detected. As shown in fig. 2, the method includes:
s210, obtaining a current radar image of a radar detection area, and separating a background from a foreground in the current radar image to obtain a target radar image with the background removed.
S220, performing edge detection on the target radar image to obtain an edge detection image of the target radar image.
Specifically, a target radar image is obtained, edge detection is performed on the target radar image, real and potential edges in the target radar image are distinguished, and then an accurate edge detection image of the target radar image is obtained.
In a possible embodiment, performing edge detection on the target radar image to obtain an edge detection map of the target radar image may include the following steps D1-D2:
step D1, performing morphological processing on the target radar image to obtain a processed radar image; a target area to be detected in the foreground of the target radar image is divided into different sub-areas after the foreground and the background are separated.
And D2, performing Gaussian smoothing on the processed radar image, and performing edge detection on the processed radar image after Gaussian smoothing to obtain an edge detection image of the target radar image.
Specifically, the target radar image is acquired. In this process the target area to be detected may be divided into different sub-areas, so morphological processing is performed on the target radar image to obtain the processed radar image. Gaussian smoothing is then performed on the processed radar image to eliminate noise caused by radar detection, and finally edge detection is performed on the Gaussian-smoothed radar image to obtain the edge detection map of the target radar image.
In this technical scheme, morphological processing of the target radar image yields a processed radar image that represents the target area to be detected more accurately. Gaussian smoothing of the processed radar image further eliminates some small noise points caused by radar detection, enhancing the accuracy of the image. Finally, edge detection on the Gaussian-smoothed radar image yields the edge detection map of the target radar image, which helps subsequently obtain accurate target position information of the target to be detected.
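Gaussian smoothing before edge detection (step D2) can be sketched with the standard 3x3 kernel built from 1-2-1 weights; this is an illustrative sketch, not the patent's implementation, and border pixels are simply left unchanged:

```python
K = [[1, 2, 1], [2, 4, 2], [1, 2, 1]]  # weights sum to 16

def gaussian_smooth(img):
    # Convolve interior pixels with the 3x3 Gaussian approximation
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]           # borders left unchanged
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            s = sum(K[a][b] * img[i - 1 + a][j - 1 + b]
                    for a in range(3) for b in range(3))
            out[i][j] = s // 16
    return out

img = [[0, 0, 0], [0, 255, 0], [0, 0, 0]]   # single noisy bright pixel
smooth = gaussian_smooth(img)
# the isolated spike 255 is attenuated to 255*4//16 = 63
```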
In a possible embodiment, the morphological processing of the target radar image to obtain a processed radar image may include the following steps E1 to E2:
e1, performing morphological dilation operation on the target radar image to obtain a dilated radar image; morphological dilation operations are used to eliminate internal voids and/or neighboring region voids between different sub-regions to which the foreground corresponds.
And E2, performing morphological corrosion operation on the expanded radar image to obtain a corroded radar image, and using the corroded radar image as the processed radar image.
Specifically, in the process of forming the target radar image, the target area to be detected may be divided into different sub-areas. To eliminate internal holes and/or gaps between adjacent sub-areas corresponding to the target area to be detected, a morphological dilation operation is performed on the target radar image to obtain the dilated radar image. Because a region grows after dilation, a morphological erosion operation is then performed on the dilated radar image to obtain the eroded radar image, restoring the region to its pre-dilation area, so that the processed radar image represents the target area to be detected more accurately.
According to the technical scheme, the expanded radar image is obtained by performing morphological expansion operation on the target radar image, internal holes and/or adjacent area gaps between different sub-areas corresponding to the foreground are eliminated, the radar image subjected to corrosion is obtained by performing morphological corrosion operation on the expanded radar image and is used as the processed radar image, so that the area of the image area is recovered to the area before expansion, the target area to be detected can be more accurately represented, and the subsequent analysis of the target area to be detected is facilitated.
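Steps E1 and E2 together amount to a morphological closing: dilation fills gaps and holes, and the subsequent erosion restores the region's original extent. A minimal sketch on binary images as lists of 0/255 values, assuming a 3x3 square structuring element:

```python
def apply_3x3(img, op):
    # Apply `op` (max or min) over each pixel's 3x3 neighbourhood
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            vals = [img[a][b]
                    for a in range(max(0, i - 1), min(h, i + 2))
                    for b in range(max(0, j - 1), min(w, j + 2))]
            out[i][j] = op(vals)
    return out

def dilate(img):  # E1: a pixel becomes foreground if any neighbour is
    return apply_3x3(img, max)

def erode(img):   # E2: a pixel stays foreground only if all neighbours are
    return apply_3x3(img, min)

img = [[255, 0, 255]]          # one region split by a one-pixel gap
closed = erode(dilate(img))    # gap filled: [[255, 255, 255]]
```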
And S230, determining target position information of the target to be detected corresponding to the foreground of the target radar image according to the edge detection image of the target radar image.
Specifically, a current radar image is obtained, the foreground and the background in the current radar image are separated to obtain a target radar image with the background removed, edge detection is performed on the target radar image to obtain an edge detection image, and the target position information of the target to be detected corresponding to the foreground of the target radar image is obtained according to the edge detection image, thereby accurately acquiring the target position information of the target to be detected.
In a possible embodiment, determining the target position information of the target to be detected corresponding to the foreground of the target radar image according to the edge detection map of the target radar image may include the following steps F1 to F3:
Step F1, extracting the outer boundary inflection points of the edge detection image to obtain outer boundary inflection point position information of the target area to be detected in the foreground of the target radar image.
Step F2, determining the central point position information of the target area to be detected according to the outer boundary inflection point position information.
Step F3, clustering the central point position information of the target areas to be detected to obtain clustered central point position information, which is used as the target position information of the target to be detected corresponding to the target area to be detected; wherein at least one target area to be detected corresponds to one target to be detected.
The target area to be detected may be the area occupied by a target to be detected in the foreground of the target radar image. The central point position information may be the coordinate information of the target to be detected corresponding to a target area to be detected in the foreground of the target radar image, expressed in a Cartesian coordinate system. A Cartesian coordinate system is an orthogonal coordinate system in mathematics, a general name for rectangular and oblique coordinate systems; here it is an orthogonal coordinate system whose origin is the center of the target radar image.
Specifically, each target area to be detected in the foreground can be accurately determined from the target radar image, and the position of the corresponding target to be detected is described by the central point position information of the target area to be detected. A target area to be detected comprises a plurality of pixel point coordinates; the geometric center pixel coordinate is obtained from these pixel point coordinates and then converted into the corresponding Cartesian coordinate, which serves as the central point position information X of the target area to be detected. In addition, even after the processed radar image is obtained by performing morphological processing on the target radar image, the same target area to be detected may still be divided into several areas, i.e. several target areas to be detected may represent the same object. Therefore, the central point position information needs to be clustered: a new central point coordinate is computed from the central point coordinates of the target areas to be detected that represent the same target, and the clustered central point position information is used as the target position information of the target area to be detected. The set of central point coordinates is accordingly updated to X', i.e. the clustered target position information of each target area to be detected.
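The clustering step described above can be sketched as follows. The patent does not name a specific clustering algorithm, so a simple distance-threshold grouping with a hypothetical merge radius is used purely for illustration: centre points closer than the merge radius are assumed to belong to the same target and are replaced by their mean.

```python
import math

def cluster_centers(points, merge_radius):
    """Group (x, y) centre points whose distance to an existing group is
    below merge_radius, and return each group's mean coordinate as the
    clustered centre set X'."""
    clusters = []  # each cluster is a list of (x, y) points
    for p in points:
        placed = False
        for cl in clusters:
            if any(math.dist(p, q) < merge_radius for q in cl):
                cl.append(p)
                placed = True
                break
        if not placed:
            clusters.append([p])
    return [(sum(x for x, _ in cl) / len(cl),
             sum(y for _, y in cl) / len(cl)) for cl in clusters]
```

Two centre points 0.4 m apart would be merged into one target, while a centre point 10 m away would remain a separate target.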
According to the technical scheme, the geometric center position information of the target area to be detected is determined from the outer boundary inflection point position information of the target area to be detected in the foreground of the target radar image, extracted from the edge detection image of the target radar image; the target position information of the target to be detected is then accurately acquired from this geometric center position information, which avoids an inaccurate current track tracking result caused by errors in the target position information of the target to be detected.
In a possible embodiment, determining the position information of the center point of the target area to be detected according to the position information of the inflection point of the outer boundary may include the following steps G1 to G2:
Step G1, determining the geometric central point position information of the target area to be detected according to the outer boundary inflection point position information.
Step G2, determining the actual central point position information of the target area to be detected corresponding to its geometric central point position information according to a predetermined correspondence between pixel points in the radar image and the actual area, and taking the actual central point position information as the central point position information of the target area to be detected.
Specifically, the outer boundary inflection points corresponding to each target region to be detected are extracted to obtain the outer boundary inflection point position information. Any method in the prior art may be adopted for extracting the outer boundary inflection points, which is not limited in the embodiment of the present invention. The finally determined outer boundary inflection point position information may be expressed as follows:

D_i = {(r^i_1, c^i_1), (r^i_2, c^i_2), ..., (r^i_m, c^i_m)}

where D_i is the set of outer boundary inflection point coordinates of the i-th target region, and (r^i_m, c^i_m) are the row and column pixel coordinates of the m-th inflection point on the outer boundary of the i-th target region.
The geometric central point position information of each target region to be detected is calculated from its outer boundary inflection point position information. For example, the maximum and minimum of the row pixel coordinates of the m inflection points on the outer boundary of the i-th target region to be detected are summed, and one half of the sum is taken as the geometric central point row pixel coordinate r_i; in the same way, the maximum and minimum of the column pixel coordinates of the m inflection points on the outer boundary of the i-th target region to be detected are summed, and one half of the sum is taken as the geometric central point column pixel coordinate c_i. That is, the geometric central point position information of the target region can be expressed as:

r_i = (max_k r^i_k + min_k r^i_k) / 2
c_i = (max_k c^i_k + min_k c^i_k) / 2
Because one pixel point in the radar image corresponds to a square area with side length Δ meters in the actual scene, the geometric central point position information of the target area to be detected is converted according to Δ into the actual central point position information in the Cartesian coordinate system, i.e. the central point position information of the target area to be detected. The central point coordinates can be expressed as:

x_i = (c_i − (Q + 1)/2) · Δ
y_i = ((P + 1)/2 − r_i) · Δ

In the formulas, Δ is the side length of the actual area corresponding to one pixel point, and P and Q denote that the target radar image is an image with P rows and Q columns, where P and Q are odd (odd values make the central pixel of the image exactly the origin of the Cartesian coordinate system).
The central point coordinates corresponding to all the target areas to be detected are collected as:

X = {(x_1, y_1), (x_2, y_2), ..., (x_n, y_n)}

Thus, the central point position information of all the target areas to be detected is obtained.
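Steps G1 and G2 can be sketched as follows, assuming 1-indexed pixel coordinates so that the centre pixel ((P+1)/2, (Q+1)/2) of a P-row, Q-column image (P, Q odd) maps to the Cartesian origin; the axis orientation (columns grow toward +x, rows grow toward −y) is an illustrative assumption consistent with the formulas above.

```python
def center_point(corners, P, Q, delta):
    """corners: list of (row, col) outer-boundary inflection points of one
    target region. Returns its actual centre point (x, y) in metres."""
    rows = [r for r, _ in corners]
    cols = [c for _, c in corners]
    # Step G1: geometric centre = midpoint of extreme row/column coordinates.
    r_i = (max(rows) + min(rows)) / 2.0
    c_i = (max(cols) + min(cols)) / 2.0
    # Step G2: pixel -> Cartesian metres, centre pixel as origin,
    # delta metres per pixel side length.
    x = (c_i - (Q + 1) / 2.0) * delta
    y = ((P + 1) / 2.0 - r_i) * delta
    return x, y
```

For a 5x5 image with delta = 1 m, a region whose inflection points are symmetric about the centre pixel (3, 3) yields the origin (0, 0).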
According to the technical scheme, the geometric central point position information of the target area to be detected is determined through the outer boundary inflection point position information, the actual central point position information of the target area to be detected, corresponding to the geometric central point position information of the target area to be detected, is determined according to the corresponding relation between the pixel point in the radar image and the actual area, and is used as the central point position information of the target area to be detected, and the central point position information is accurately acquired.
S240, performing track tracking on the target position information of the target to be detected according to the historical radar image of the radar detection area to obtain a current track tracking result of the target to be detected.
Specifically, the historical radar images of the radar detection area are obtained by scanning the radar detection area with the radar, and track tracking is performed on the target position information of each target area to be detected according to the historical radar images to obtain the current track tracking result of the target to be detected. The output set of the current track tracking result can be expressed as:

T = {(id_1, x_1, y_1, vx_1, vy_1), (id_2, x_2, y_2, vx_2, vy_2), ..., (id_n, x_n, y_n, vx_n, vy_n)}

In the set, id_n is the track identification information corresponding to the target position of the n-th target area to be detected, (x_n, y_n) is its current track position information, and (vx_n, vy_n) is its current track speed. Optionally, any track tracking algorithm in the prior art may be used, which is not limited in the embodiment of the present invention.
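One entry of the tracking output set T, together with the richer historical record introduced later (track starting position, continuous non-target associated frame number phi, motion stop frame number tau), can be sketched as a single record. The field names are illustrative; only their meanings come from the text.

```python
from dataclasses import dataclass

@dataclass
class TrackState:
    track_id: int             # id_n: track identification information
    x: float                  # (x_n, y_n): track position information
    y: float
    vx: float                 # (vx_n, vy_n): track speed components
    vy: float
    start_x: float = 0.0      # track starting position information
    start_y: float = 0.0
    no_assoc_frames: int = 0  # phi: continuous non-target associated frames
    stop_frames: int = 0      # tau: motion stop frame count
```

A newly created historical record would initialise phi and tau to zero, matching the initialisation described in the next embodiment.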
And S250, carrying out projectile detection on the target to be detected according to the current track tracking result and the historical track tracking result of the target to be detected.
According to the technical scheme, a current radar image of the radar detection area is obtained; the background and the foreground in the current radar image are separated to obtain a target radar image with the background removed; edge detection is performed on the target radar image to obtain an edge detection image, eliminating internal voids, gaps between adjacent areas, and radar-induced noise, so that an accurate target area to be detected is obtained in the foreground; outer boundary inflection point position information is extracted from the edge detection image of the target radar image to determine the central point position information of the target area to be detected, which is used as the target position information of the target to be detected, ensuring the accuracy of the target position information; track tracking is performed on the target position information of the target to be detected according to the historical radar images of the radar detection area to obtain an accurate current track tracking result of the target to be detected; finally, projectile detection is performed on the target to be detected according to the current track tracking result and the historical track tracking result of the target to be detected. This solves the problem of how to quickly detect a projectile through radar images and realizes accurate tracking and monitoring of the projectile.
EXAMPLE III
Fig. 3 is a flowchart of a method for detecting a projectile based on a radar chart according to a third embodiment of the present invention; on the basis of the above embodiments, this embodiment details the step of performing projectile detection on the target to be detected according to the current track tracking result and the historical track tracking result of the target to be detected. As shown in fig. 3, the method includes:
S310, obtaining a current radar image of a radar detection area, and performing track tracking on a target to be detected in the current radar image according to the current radar image and the historical radar images to obtain a current track tracking result of the target to be detected.
S320, determining a matching result of the track identification information in the current track tracking result and the track identification information in the historical track tracking result.
The matching result may refer to that the track identification information in the current track tracking result is successfully or unsuccessfully matched with the track identification information in the historical track tracking result. If the matching is successful, the track identification information in the current track tracking result exists in the track identification information in the historical track tracking result; if the matching is not successful, the track identification information in the current track tracking result does not exist in the track identification information in the historical track tracking result, or the track identification information in the historical track tracking result does not exist in the track identification information in the current track tracking result.
Specifically, the track identification information in the current track tracking result and that in the historical track tracking result are extracted, and the historical track tracking result is updated according to the comparison result, thereby realizing accurate judgment of the projectile.
S330, if the matching is successful, determining the track distance difference between the current track position information and the historical track position information in the historical track tracking result which is successfully matched, and whether the current track speed meets a first preset condition; the first preset condition is that the track distance difference is smaller than a first distance threshold value, and the current track speed is smaller than a speed threshold value.
The first preset condition is used for judging whether the motion state change condition of the target to be detected in the target area to be detected is a stop state, namely whether the track position and the track speed change. If the target to be detected in the target area is in a stop state, the track position and the track speed of the target to be detected in the target area are not changed or are slightly changed, and the historical track position information and the historical track speed in the historical track tracking result are not required to be updated; if the target to be detected in the target area to be detected is not in the stop state, it indicates that the track position and the track speed of the target to be detected in the target area to be detected have obvious changes, and the historical track position information and the track speed in the historical track tracking result need to be updated. The first distance threshold may refer to a maximum track distance difference satisfying a first preset condition. The speed threshold may be a maximum track speed that satisfies a first preset condition. Both the first distance threshold and the speed threshold can be set according to actual requirements, for example, the first distance threshold is 1, and the speed threshold is 0.5.
Specifically, track identification information in the historical track tracking result is compared with track identification information in the current track tracking result, if the track identification information in the historical track tracking result has the track identification information in the current track tracking result, matching is successful, and then the track distance difference between the current track position information and the historical track position information in the historical track tracking result which is successfully matched and the current track speed need to be judged, and the historical track tracking result is updated in time according to the judged result.
In a possible embodiment, after determining a matching result between the track identification information in the current track tracking result and the track identification information in the historical track tracking result, the method further includes:
if the track identification information in the current track tracking result is not successfully matched in the historical track tracking result, establishing a historical track tracking result according to the current track tracking result which is not successfully matched, updating the current track position information in the current track tracking result as historical track position information, and initializing the number of continuous non-target associated frames and the number of motion stopping frames of the track;
and performing projectile detection on the target to be detected according to the updated historical track tracking result.
Specifically, the track identification information in the historical track tracking result is compared with the track identification information in the current track tracking result. If some track identification information in the current track tracking result does not exist in the historical track tracking result, i.e. the matching is not successful, it indicates that the object in the target area appears in the radar detection area for the first time, i.e. it is tracked for the first time. A historical track tracking result corresponding to the object therefore needs to be created from the current track tracking result: the current track position information is recorded as both the track starting position information and the historical track position information (the track starting position information remains unchanged in subsequent records, while the historical track position information is continuously updated), and the continuous non-target associated frame number phi and the motion stop frame number tau of the track are initialized, i.e. phi = 0 and tau = 0.
According to the technical scheme, because the track identification information in the historical track tracking result does not exist in the track identification information in the current track tracking result, the historical track tracking result needs to be created according to the current track tracking result which is not successfully matched, so that the historical track tracking result contains abundant motion information of the target to be detected corresponding to the target area to be detected, and the detection of the projectile is more accurate.
In a possible embodiment, after determining a matching result between the track identification information in the current track tracking result and the track identification information in the historical track tracking result, the method further includes:
if the track identification information in the historical track tracking result is not successfully matched in the current track tracking result, updating the continuous non-target associated frame number of the track in the historical track tracking result which is not successfully matched, and keeping the motion stopping frame number and the historical track position information unchanged;
and carrying out projectile detection on the target to be detected according to the updated historical track tracking result.
Specifically, the track identification information in the historical track tracking result is compared with that in the current track tracking result. If some track identification information in the historical track tracking result does not exist in the current track tracking result, i.e. the matching is not successful, the current track tracking result is not associated with that historical track tracking result, indicating that the object corresponding to the target area has left the target area. The continuous non-target associated frame number of the track in the unmatched historical track tracking result therefore needs to be updated, i.e. phi = phi + 1, and the presence of the object corresponding to the target area is then monitored according to this counter; the motion stop frame number and the historical track position information remain unchanged.
According to the technical scheme, because some track identification information in the historical track tracking result does not exist in the current track tracking result, the continuous non-target associated frame number of the track in the unmatched historical track tracking result needs to be updated; the actual situation of the target to be detected is then judged according to this counter, so that the target to be detected corresponding to the target area is detected accurately and projectile detection errors caused by misjudgment are avoided.
And S340, if so, updating and counting the motion stop frame number in the successfully matched historical track tracking result, and keeping the historical track position information and the track continuous non-target associated frame number unchanged.
Specifically, after the track identification information in the historical track tracking result is compared with that in the current track tracking result and the match is found to be successful, the track distance difference between the current track position information and the historical track position information in the successfully matched historical track tracking result, together with the current track speed, must be checked against the first preset condition. If the track distance difference is smaller than the first distance threshold and the current track speed is smaller than the speed threshold, the track position and speed of the target to be detected corresponding to the target area have not changed between the two frames, i.e. the target to be detected has stopped moving; therefore the motion stop frame number in the successfully matched historical track tracking result is updated and counted as tau = tau + 1. Because the speed and track position information have not changed, and the track identification information in the historical track tracking result is associated with that in the current track tracking result, the historical track position information and the continuous non-target associated frame number of the track remain unchanged.
In one possible embodiment, after determining the track distance difference between the current track position information and the historical track position information in the historical track tracking result which is successfully matched, and whether the current track speed meets a first preset condition, the method further includes:
if not, updating the historical track position information in the successfully matched historical track tracking result according to the current track position information, and keeping the motion stop frame number and the track continuous non-target associated frame number unchanged;
and carrying out projectile detection on the target to be detected according to the updated historical track tracking result.
Specifically, when the track identification information in the historical track tracking result is successfully matched with that in the current track tracking result, but the track distance difference between the current track position information and the historical track position information in the successfully matched historical track tracking result, together with the current track speed, does not meet the first preset condition, the current object is still moving; therefore the historical track position information in the successfully matched historical track tracking result needs to be updated according to the current track position information, while the motion stop frame number and the continuous non-target associated frame number of the track remain unchanged.
According to the technical scheme, after the track distance difference between the current track position information and the historical track position information in the historical track tracking result which is successfully matched and the current track speed do not meet the first preset condition are confirmed, the historical track position information in the historical track tracking result which is successfully matched is updated in time according to the current track position information, and the accuracy of the historical track tracking result is ensured.
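The update rule for a successfully matched track described above (first preset condition) can be sketched as follows. The dict-based record, the scalar speed taken as the Euclidean norm of (vx, vy), and the threshold values are illustrative assumptions; the text fixes only the logic.

```python
import math

def update_matched_track(hist, cur, dist_threshold=1.0, speed_threshold=0.5):
    """hist: historical record {"x", "y", "tau"}; cur: current result
    {"x", "y", "vx", "vy"}. Mutates and returns hist."""
    dd = math.dist((cur["x"], cur["y"]), (hist["x"], hist["y"]))
    speed = math.hypot(cur["vx"], cur["vy"])
    if dd < dist_threshold and speed < speed_threshold:
        # First preset condition met: the target has stopped, so count a
        # motion-stop frame; historical position stays unchanged.
        hist["tau"] += 1
    else:
        # Target still moving: refresh the historical track position only;
        # tau and the no-association counter stay unchanged.
        hist["x"], hist["y"] = cur["x"], cur["y"]
    return hist
```

A near-stationary update increments tau; a large displacement instead overwrites the historical position, matching the two branches described above.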
And S350, carrying out projectile detection on the target to be detected according to the updated historical track tracking result.
Specifically, when the historical track tracking result is updated, the historical track tracking result at this time can accurately describe the motion state information of the object corresponding to each target area, so that the object to be detected in the target area to be detected corresponding to the target position can be subjected to projectile detection according to the motion state information shown by the updated historical track tracking result.
In a possible embodiment, the method for detecting the projectile of the target to be detected according to the updated historical track tracking result may include the following steps H1 to H3:
Step H1, if the continuous non-target associated frame number and the motion stop frame number of the track in the updated historical track tracking result meet a second preset condition, determining whether the distance between the track starting position information and the historical track position information is smaller than a second distance threshold; if so, determining that the target to be detected is a projectile.
Step H2, if the distance between the track starting position information and the historical track position information is larger than the second distance threshold, determining that the target to be detected is a vehicle that has moved to a stop.
Step H3, if the continuous non-target associated frame number and the motion stop frame number of the track in the updated historical track tracking result do not meet the second preset condition, deleting the updated historical track tracking result.
The second preset condition may include that the number of consecutive non-target correlation frames phi of the track is less than or equal to the maximum threshold psi of non-correlation frames, and the number of motion stop frames tau is greater than the threshold omega of static frames. The second preset condition is used for judging whether the target to be detected in the target area to be detected is still in the target area to be detected and whether the motion state change condition of the target to be detected in the target area to be detected is a stop state. If the target to be detected in the target area to be detected is in the target area to be detected and the target to be detected in the target area to be detected is in a stop state, carrying out projectile detection on the target to be detected in the target area to be detected according to the updated historical track tracking result; if the target to be detected in the target area to be detected is not in the target area to be detected, the updated historical track tracking result is invalid; and if the target to be detected in the target area to be detected is in the target area to be detected but the target to be detected in the target area to be detected is not in the stop state, not performing the detection of the projectile on the target to be detected in the target area to be detected.
Specifically, the updated historical track tracking result is extracted, and whether the number of continuous non-target associated frames and the number of motion stopping frames of the track in the updated historical track tracking result meet a second preset condition is judged.
If phi is larger than psi, the target tracked by the track has not appeared in the radar detection area for a long time and no longer needs to be detected; the historical track tracking result is regarded as invalid, and the updated historical track tracking result is deleted.

If phi is less than or equal to psi and tau is less than omega, the target tracked by the track has not yet stopped moving, and the historical track tracking result needs to be updated continuously.

If phi is less than or equal to psi and tau is greater than omega, the target tracked by the track has stopped moving, and the distance between the track starting position information and the historical track position information needs to be determined to decide whether the target is a projectile, specifically according to whether that distance is smaller than a second distance threshold epsilon. Given the track starting position information (x_0, y_0) and the historical track position information (x, y), the distance between them is

d = sqrt((x − x_0)^2 + (y − y_0)^2)

If d < epsilon, the object tracked by the track stops moving at a small distance from where it first appeared, which conforms to the throwing motion of a projectile, and the target in the target area corresponding to the target position is determined to be a projectile.

If d ≥ epsilon, the object tracked by the track stops moving at a large distance from where it first appeared, which does not conform to the throwing motion of a projectile; because the radar detection area is on a highway or a tunnel highway, whose scene constraints mean that the only moving objects other than projectiles are vehicles, the target in the target area corresponding to the target position is determined to be a vehicle that has moved to a stop.
According to the technical scheme, by judging whether the continuous non-target associated frame number and the motion stop frame number of the track in the updated historical track tracking result meet the second preset condition, the validity of the historical track tracking result can be accurately determined; if an object is detected to have stopped, determining whether the distance between the track starting position information and the historical track position information is smaller than the second distance threshold accurately judges whether the target in the target area corresponding to the target position is a projectile, thereby realizing accurate and fast projectile detection.
Example four
Fig. 4 is a schematic structural diagram of a radar map-based projectile detection device according to the fourth embodiment of the present invention. As shown in fig. 4, the apparatus includes:
an image determining module 410, configured to acquire a current radar image of a radar detection area;
a track tracking module 420, configured to perform track tracking on the target to be detected in the current radar image according to the current radar image and the historical radar image, so as to obtain a current track tracking result of the target to be detected;
a detection module 430, configured to perform projectile detection on the target to be detected according to the current track tracking result and the historical track tracking result of the target to be detected; the historical track tracking result is determined according to the historical radar image.
Optionally, the current track tracking result includes current track position information, current track speed, and corresponding track identification information, and the historical track tracking result at least includes track starting position information, historical track position information, track continuous non-target associated frame number, motion stop frame number, and corresponding track identification information.
Optionally, the detection module includes a first matching unit, and is specifically configured to:
determining a matching result of the track identification information in the current track tracking result and the track identification information in the historical track tracking result;
if the matching is successful, determining the track distance difference between the current track position information and the historical track position information in the successfully matched historical track tracking result, and whether the current track speed meets a first preset condition; the first preset condition is that the track distance difference is smaller than a first distance threshold, and the current track speed is smaller than a speed threshold;
if yes, updating and counting the motion stopping frame number in the successfully matched historical track tracking result, and keeping the historical track position information and the continuous target-free associated frame number unchanged;
and performing projectile detection on the target in the target area corresponding to the center point position according to the updated historical track tracking result.
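The update rule this unit describes, count a stop frame when the matched track has barely moved and is slow, otherwise refresh its position, can be sketched roughly as follows. The threshold values and the dictionary layout of a track record are assumptions made for illustration.

```python
import math

FIRST_DIST_THRESHOLD = 1.0  # first distance threshold; assumed value
SPEED_THRESHOLD = 0.5       # speed threshold; assumed value


def update_matched_track(history, current):
    """Update one successfully matched history record in place.

    history: dict with "pos" (x, y), "stop_frames", "no_assoc_frames"
    current: dict with "pos" (x, y) and "speed"
    If the first preset condition holds (small displacement, low speed),
    the motion stop frame number is counted and the position is kept;
    otherwise the historical position is refreshed and counters are kept.
    """
    hx, hy = history["pos"]
    cx, cy = current["pos"]
    dist = math.hypot(cx - hx, cy - hy)
    if dist < FIRST_DIST_THRESHOLD and current["speed"] < SPEED_THRESHOLD:
        history["stop_frames"] += 1          # target effectively stationary
    else:
        history["pos"] = current["pos"]      # target still moving
    return history
```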
Optionally, the first matching unit includes a determining unit, and is specifically configured to:
if not, updating the historical track position information in the successfully matched historical track tracking result according to the current track position information, and keeping the motion stop frame number and the track continuous non-target associated frame number unchanged;
and performing projectile detection on the target to be detected according to the updated historical track tracking result.
Optionally, the detection module includes a second matching unit, specifically configured to:
if the track identification information in the current track tracking result is not successfully matched in the historical track tracking result, establishing a historical track tracking result according to the current track tracking result which is not successfully matched, updating the current track position information in the current track tracking result as historical track position information, and initializing the number of continuous non-target associated frames and the number of motion stopping frames of the track;
and performing projectile detection on the target to be detected according to the updated historical track tracking result.
Optionally, the detection module includes a third matching unit, and is specifically configured to:
if the track identification information in the historical track tracking result is not successfully matched in the current track tracking result, updating the continuous target-free associated frame number of the track in the historical track tracking result which is not successfully matched, and keeping the motion stopping frame number and the historical track position information unchanged;
and performing projectile detection on the target to be detected according to the updated historical track tracking result.
Optionally, the detection module includes a first detection unit, and is specifically configured to:
if the track continuous non-target associated frame number and the motion stop frame number in the updated historical track tracking result meet a second preset condition, determining whether the distance between the track initial position information and the historical track position information is smaller than a second distance threshold, and if so, determining that the target to be detected is a projectile;
and if the track continuous non-target associated frame number and the motion stop frame number in the updated historical track tracking result do not meet a second preset condition, deleting the updated historical track tracking result.
The radar detection area is configured on an expressway or a tunnel road.
Optionally, the detection module includes a second detection unit, and is specifically configured to:
determining whether the distance between the track start position information and the historical track position information is smaller than the second distance threshold, and if not, determining that the target to be detected is a vehicle that has moved to a stop.
Optionally, the track tracking module is specifically configured to:
separating the background and the foreground in the current radar image to obtain a target radar image with the background removed;
determining target position information of a target to be detected in the foreground according to the target radar image;
performing track tracking on the target position information of the target to be detected according to the historical radar image of the radar detection area to obtain a current track tracking result of the target to be detected;
and the foreground of the target radar image corresponds to at least one target area to be detected.
Optionally, the track tracking module includes an information determining unit, and is specifically configured to:
performing edge detection on the target radar image to obtain an edge detection graph of the target radar image;
and determining target position information of the target to be detected corresponding to the foreground of the target radar image according to the edge detection image of the target radar image.
Optionally, the information determining unit is specifically configured to:
performing outer boundary inflection point extraction on the edge detection image to obtain outer boundary inflection point position information of a target area to be detected in the foreground of the target radar image;
determining the position information of the central point of the target area to be detected according to the inflection point position information of the outer boundary;
clustering the central point position information of the target area to be detected to obtain clustered central point position information which is used as target position information of a target to be detected corresponding to the target area to be detected; wherein at least one target area to be detected corresponds to one target to be detected.
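A minimal sketch of the clustering step that merges center points of nearby sub-areas into one target. The patent does not name a clustering algorithm, so a simple greedy distance-based merge is used here as a stand-in, with an assumed merging radius.

```python
import math


def cluster_centers(points, radius=1.5):
    """Greedy single-pass clustering of region center points.

    A point closer than `radius` to an existing cluster center is merged
    into it (the center becomes the running mean); otherwise it starts a
    new cluster. `radius` is an assumed merging distance.
    Returns one clustered center per target to be detected.
    """
    clusters = []  # each entry: [sum_x, sum_y, count]
    for x, y in points:
        for c in clusters:
            cx, cy = c[0] / c[2], c[1] / c[2]
            if math.hypot(x - cx, y - cy) < radius:
                c[0] += x
                c[1] += y
                c[2] += 1
                break
        else:
            clusters.append([x, y, 1])
    return [(c[0] / c[2], c[1] / c[2]) for c in clusters]
```

This reflects the "at least one target area corresponds to one target" relation: several adjacent sub-areas collapse to a single clustered center.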
Optionally, the information determining unit includes a central point location information determining unit, and is specifically configured to:
determining the position information of the geometric center point of the target area to be detected according to the position information of the inflection point of the outer boundary;
and determining the actual central point position information of the target area to be detected corresponding to the geometric central point position information of the target area to be detected according to the predetermined corresponding relation between the pixel point and the actual area in the radar image, and taking the actual central point position information as the central point position information of the target area to be detected.
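The pixel-to-actual correspondence can be illustrated with a simple linear mapping. The scale factors and origin below stand in for the patent's predetermined correspondence between pixel points and actual areas; they are assumptions, since the calibration itself is radar-specific.

```python
def pixel_to_actual(px, py, scale_x, scale_y, origin=(0.0, 0.0)):
    """Map a geometric center point in pixel space to real-world coordinates.

    scale_x, scale_y: metres per pixel along each axis (assumed calibration).
    origin: real-world position of pixel (0, 0).
    """
    ox, oy = origin
    return (ox + px * scale_x, oy + py * scale_y)
```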
Optionally, the information determining unit includes an image detecting unit, and is specifically configured to:
performing morphological processing on the target radar image to obtain a processed radar image; a target area to be detected in the foreground of the target radar image is divided into different sub-areas after the foreground and the background are separated;
and performing Gaussian smoothing on the processed radar image, and performing edge detection on the processed radar image after Gaussian smoothing to obtain an edge detection image of the target radar image.
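A rough sketch of the smoothing-plus-edge-detection step. The patent names neither the smoothing kernel nor the edge operator, so a separable [1, 2, 1]/4 kernel (a small Gaussian approximation) and a gradient-magnitude map are used here as stand-ins; the threshold is an assumed value.

```python
import numpy as np


def edge_detection_map(img, threshold=0.1):
    """Smooth a radar image, then return a binary edge detection map.

    Smoothing: separable [1, 2, 1]/4 kernel applied along rows, then
    columns (a crude Gaussian). Edges: gradient magnitude above an
    assumed threshold.
    """
    img = img.astype(float)
    k = np.array([1.0, 2.0, 1.0]) / 4.0
    smoothed = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    smoothed = np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, smoothed)
    gy, gx = np.gradient(smoothed)
    magnitude = np.hypot(gx, gy)
    return (magnitude > threshold).astype(np.uint8)
```

A production implementation would more likely use a dedicated operator such as Canny; this sketch only shows the smooth-then-differentiate structure the unit describes.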
Optionally, the image detection unit includes a morphology processing unit, and is specifically configured to:
performing morphological dilation operation on the target radar image to obtain a dilated radar image; morphological dilation operation is used for eliminating internal holes and/or adjacent area gaps between different sub-areas corresponding to the foreground;
and performing morphological corrosion operation on the expanded radar image to obtain a corroded radar image, and using the corroded radar image as the processed radar image.
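The dilation-then-erosion sequence this unit describes is a morphological closing. A plain-NumPy sketch follows; the 3×3 structuring element is an assumption, and the functions are illustrative rather than the patent's implementation.

```python
import numpy as np


def binary_dilate(img):
    """3x3 dilation: a pixel becomes 1 if itself or any 8-neighbour is 1."""
    p = np.pad(img, 1)
    out = np.zeros_like(img)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out |= p[1 + dy : 1 + dy + img.shape[0], 1 + dx : 1 + dx + img.shape[1]]
    return out


def binary_erode(img):
    """3x3 erosion: a pixel stays 1 only if itself and all 8-neighbours are 1."""
    p = np.pad(img, 1)
    out = np.ones_like(img)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out &= p[1 + dy : 1 + dy + img.shape[0], 1 + dx : 1 + dx + img.shape[1]]
    return out


def morphological_close(img):
    """Dilation followed by erosion: fills internal holes and gaps between
    adjacent sub-areas of a foreground target, as the embodiment describes."""
    return binary_erode(binary_dilate(img))
```

Closing fills a one-pixel hole inside a foreground region while leaving the region's overall extent roughly unchanged, which is exactly why the embodiment dilates before eroding.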
Optionally, the track tracking module includes an image determining unit, and is specifically configured to:
determining a preset number of previous radar images acquired in the radar detection area before the current radar image is acquired; the previous radar images include radar images acquired immediately before the current radar image, or radar images acquired while the radar detection area was closed;
performing image accumulation averaging on a preset number of previous radar images to obtain an accumulation average image corresponding to the previous radar image;
and separating the background and the foreground in the current radar image according to the accumulated average image to obtain a target radar image with the background removed.
Optionally, the image determining unit includes an image processing unit, and is specifically configured to:
performing image difference processing on the accumulated average image of the current radar image and the previous radar image to obtain an image subjected to image difference processing;
and performing binarization processing on the image after the image difference processing, and separating the background and the foreground in the current radar image to obtain a target radar image with the background removed.
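The accumulated-average background subtraction these units describe can be sketched as follows; the binarization threshold is an assumed value, and 1 marks foreground (candidate targets) while 0 marks background.

```python
import numpy as np


def remove_background(current, previous_frames, thresh=0.2):
    """Separate background and foreground in the current radar image.

    previous_frames: a preset number of earlier radar images; their
    accumulated average serves as the background estimate. The absolute
    image difference with the current frame is binarized with an assumed
    threshold to yield the background-removed target radar image.
    """
    background = np.mean(np.stack(previous_frames), axis=0)  # accumulated average
    diff = np.abs(current.astype(float) - background)        # image difference
    return (diff > thresh).astype(np.uint8)                  # binarization
```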
The radar map-based projectile detection device provided by the embodiment of the invention can execute the radar map-based projectile detection method provided by any embodiment of the invention, and has corresponding functional modules and beneficial effects of the execution method.
According to the technical scheme, the data acquisition, storage, use, processing and the like meet relevant regulations of national laws and regulations and do not violate the good custom of the public order.
Example five
The present disclosure also provides an electronic device, a readable storage medium, and a computer program product according to embodiments of the present disclosure.
Fig. 5 shows a schematic structural diagram of an electronic device that may be used to implement the radar map-based projectile detection method of an embodiment of the present invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the invention described and/or claimed herein.
As shown in fig. 5, the electronic device 10 includes at least one processor 11, and a memory communicatively connected to the at least one processor 11, such as a Read Only Memory (ROM) 12, a Random Access Memory (RAM) 13, and the like, wherein the memory stores a computer program executable by the at least one processor, and the processor 11 can perform various suitable actions and processes according to the computer program stored in the Read Only Memory (ROM) 12 or the computer program loaded from a storage unit 18 into the Random Access Memory (RAM) 13. In the RAM 13, various programs and data necessary for the operation of the electronic apparatus 10 may also be stored. The processor 11, the ROM 12, and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to the bus 14.
A number of components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, or the like; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 11 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, or the like. The processor 11 performs the various methods and processes described above, such as a radar map-based projectile detection method.
In some embodiments, the radar map-based projectile detection method may be implemented as a computer program tangibly embodied in a computer-readable storage medium, such as storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. When the computer program is loaded into RAM 13 and executed by processor 11, one or more steps of the method described above based on radar map detection of a projectile may be performed. Alternatively, in other embodiments, the processor 11 may be configured to perform the radar map-based projectile detection method by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described here above may be implemented in digital electronic circuitry, integrated circuitry, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on a Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
Computer programs for implementing the methods of the present invention can be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be performed. A computer program can execute entirely on a machine, partly on a machine, as a stand-alone software package partly on a machine and partly on a remote machine, or entirely on a remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. A computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user may provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), blockchain networks, and the Internet.
The computing system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or a cloud host, which is a host product in a cloud computing service system that overcomes the defects of high management difficulty and weak service expansibility found in traditional physical hosts and VPS services.
It should be understood that the various forms of flow shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present invention may be executed in parallel, sequentially, or in different orders; no limitation is imposed herein as long as the desired results of the technical solution of the present invention can be achieved.
The above-described embodiments should not be construed as limiting the scope of the invention. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (18)

1. A method for detecting a projectile based on a radar map, comprising:
acquiring a current radar image of a radar detection area;
performing track tracking on a target to be detected in the current radar image according to the current radar image and the historical radar image to obtain a current track tracking result of the target to be detected;
carrying out projectile detection on the target to be detected according to the current track tracking result and the historical track tracking result of the target to be detected; and determining the historical track tracking result according to the historical radar image.
2. The method according to claim 1, wherein the current track tracking result comprises current track position information, current track speed and corresponding track identification information, and the historical track tracking result at least comprises track starting position information, historical track position information, track continuous non-target associated frame number, motion stop frame number and corresponding track identification information;
correspondingly, the detecting the projectile of the target to be detected according to the current track tracking result and the historical track tracking result of the target to be detected comprises the following steps:
determining a matching result of the track identification information in the current track tracking result and the track identification information in the historical track tracking result;
if the matching is successful, determining the track distance difference between the current track position information and the historical track position information in the successfully matched historical track tracking result, and whether the current track speed meets a first preset condition; the first preset condition is that the track distance difference is smaller than a first distance threshold, and the current track speed is smaller than a speed threshold;
if yes, updating and counting the motion stopping frame number in the successfully matched historical track tracking result, and keeping the historical track position information and the continuous target-free associated frame number unchanged;
and performing projectile detection on the target to be detected according to the updated historical track tracking result.
3. The method according to claim 2, wherein after determining whether the current track position information and the track distance difference of the historical track position information in the successfully matched historical track tracking result and the current track speed satisfy a first preset condition, the method further comprises:
if not, updating the historical track position information in the successfully matched historical track tracking result according to the current track position information, and keeping the motion stop frame number and the track continuous non-target associated frame number unchanged;
and performing projectile detection on the target to be detected according to the updated historical track tracking result.
4. The method according to claim 2, wherein after determining a matching result of the track identification information in the current track tracking result and the track identification information in the historical track tracking result, the method further comprises:
if the track identification information in the current track tracking result is not successfully matched in the historical track tracking result, establishing a historical track tracking result according to the current track tracking result which is not successfully matched, updating the current track position information in the current track tracking result as historical track position information, and initializing the number of continuous non-target associated frames and the number of motion stopping frames of the track;
and performing projectile detection on the target to be detected according to the updated historical track tracking result.
5. The method according to claim 2, wherein after determining a matching result of the track identification information in the current track tracking result and the track identification information in the historical track tracking result, the method further comprises:
if the track identification information in the historical track tracking result is not successfully matched in the current track tracking result, updating the continuous non-target associated frame number of the track in the historical track tracking result which is not successfully matched, and keeping the motion stopping frame number and the historical track position information unchanged;
and performing projectile detection on the target to be detected according to the updated historical track tracking result.
6. The method according to any one of claims 2 to 5, wherein the detecting the object to be detected for the projectile according to the updated historical track tracking result comprises:
if the track continuous non-target associated frame number and the motion stop frame number in the updated historical track tracking result meet a second preset condition, determining whether the distance between the track initial position information and the historical track position information is smaller than a second distance threshold, and if so, determining that the target to be detected is a projectile;
and if the track continuous non-target associated frame number and the motion stop frame number in the updated historical track tracking result do not meet a second preset condition, deleting the updated historical track tracking result.
7. The method of claim 6, wherein the radar detection area is configured on a highway or a tunnel road;
correspondingly, determining whether the distance between the track start position information and the historical track position information is less than a second distance threshold includes:
if not, determining that the target to be detected is a vehicle moving to a stop.
8. The method according to claim 1, wherein performing track tracking on the target to be detected in the current radar image according to the current radar image and the historical radar image to obtain a current track tracking result of the target to be detected, and the method comprises:
separating the background and the foreground in the current radar image to obtain a target radar image with the background removed;
determining target position information of a target to be detected in the foreground according to the target radar image;
performing track tracking on the target position information of the target to be detected according to the historical radar image of the radar detection area to obtain a current track tracking result of the target to be detected;
and the foreground of the target radar image corresponds to at least one target area to be detected.
9. The method according to claim 8, wherein the determining target position information of the target to be detected in the foreground according to the target radar image comprises:
performing edge detection on the target radar image to obtain an edge detection graph of the target radar image;
and determining the target position information of the target to be detected corresponding to the foreground of the target radar image according to the edge detection image of the target radar image.
10. The method according to claim 9, wherein determining target position information of the target to be detected corresponding to the foreground of the target radar image according to the edge detection map of the target radar image comprises:
performing outer boundary inflection point extraction on the edge detection graph to obtain outer boundary inflection point position information of a target area to be detected in the foreground of the target radar image;
determining the central point position information of the target area to be detected according to the inflection point position information of the outer boundary;
clustering the central point position information of the target area to be detected to obtain clustered central point position information serving as target position information of a target to be detected corresponding to the target area to be detected; wherein at least one target area to be detected corresponds to one target to be detected.
11. The method according to claim 10, wherein determining the position information of the center point of the target region to be detected according to the position information of the inflection point of the outer boundary comprises:
determining the position information of the geometric center point of the target area to be detected according to the position information of the inflection point of the outer boundary;
and determining the actual central point position information of the target area to be detected corresponding to the geometric central point position information of the target area to be detected according to the predetermined corresponding relation between the pixel point and the actual area in the radar image, and taking the actual central point position information as the central point position information of the target area to be detected.
12. The method of claim 9, wherein performing edge detection on the target radar image results in an edge detection map for the target radar image, comprising:
performing morphological processing on the target radar image to obtain a processed radar image; a target area to be detected in the foreground of the target radar image is divided into different sub-areas after the foreground and the background are separated;
and performing Gaussian smoothing on the processed radar image, and performing edge detection on the processed radar image after Gaussian smoothing to obtain an edge detection image of the target radar image.
13. The method of claim 12, wherein morphologically processing the target radar image to obtain a processed radar image comprises:
performing morphological dilation operation on the target radar image to obtain a dilated radar image; morphological dilation operation is used for eliminating internal holes and/or adjacent area gaps between different sub-areas corresponding to the foreground;
and performing morphological corrosion operation on the expanded radar image to obtain a corroded radar image, and using the corroded radar image as the processed radar image.
14. The method of claim 8, wherein separating the background from the foreground in the current radar image to obtain a background-removed target radar image comprises:
determining a preset number of previous radar images acquired in a radar detection area before the current radar image is acquired; the last radar image comprises a radar image acquired in the adjacent time before the current radar image is acquired or a radar image acquired by closing a radar detection area;
performing image accumulation averaging on a preset number of previous radar images to obtain an accumulation average image corresponding to the previous radar image;
and separating the background and the foreground in the current radar image according to the accumulated average image to obtain a target radar image with the background removed.
15. The method of claim 14, wherein separating the background from the foreground in the current radar image according to the accumulated average image to obtain a background-removed target radar image comprises:
performing image difference processing on the current radar image and the accumulated average image of the previous radar images to obtain a difference image;
and performing binarization processing on the difference image to separate the background and the foreground in the current radar image, thereby obtaining a target radar image with the background removed.
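Claims 14 and 15 together describe mean-background subtraction: average the previous frames, difference the current frame against that average, and binarize. A minimal sketch, assuming a fixed illustrative threshold (the claims do not specify a threshold value or method):

```python
import numpy as np

def remove_background(current, prev_frames, thresh=0.5):
    """Background removal per claims 14-15: accumulated average of the
    previous frames, image difference, then binarization."""
    background = np.mean(np.stack(prev_frames, axis=0), axis=0)  # accumulated average image
    diff = np.abs(current - background)                          # image difference processing
    return (diff > thresh).astype(np.uint8)                      # binarization -> foreground mask
```

A target that appears only in the current frame survives the subtraction, while anything present in all previous frames is absorbed into the background model and removed.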
16. A projectile detection device based on radar maps, comprising:
the image determining module is used for acquiring a current radar image of a radar detection area;
the track tracking module is used for carrying out track tracking on the target to be detected in the current radar image according to the current radar image and the historical radar image to obtain a current track tracking result of the target to be detected;
the detection module is used for carrying out projectile detection on the target to be detected according to the current track tracking result and a historical track tracking result of the target to be detected; wherein the historical track tracking result is determined according to the historical radar image.
17. An electronic device, characterized in that the electronic device comprises:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the radar map based projectile detection method of any one of claims 1-15.
18. A computer-readable storage medium having stored thereon computer instructions for causing a processor to execute the radar map-based projectile detection method of any one of claims 1-15.
CN202211056912.1A 2022-08-31 2022-08-31 Method, apparatus, device and medium for detecting projectile based on radar map Pending CN115330841A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211056912.1A CN115330841A (en) 2022-08-31 2022-08-31 Method, apparatus, device and medium for detecting projectile based on radar map

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211056912.1A CN115330841A (en) 2022-08-31 2022-08-31 Method, apparatus, device and medium for detecting projectile based on radar map

Publications (1)

Publication Number Publication Date
CN115330841A true CN115330841A (en) 2022-11-11

Family

ID=83927835

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211056912.1A Pending CN115330841A (en) 2022-08-31 2022-08-31 Method, apparatus, device and medium for detecting projectile based on radar map

Country Status (1)

Country Link
CN (1) CN115330841A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117111019A (en) * 2023-10-25 2023-11-24 深圳市先创数字技术有限公司 Target tracking and monitoring method and system based on radar detection
CN117111019B (en) * 2023-10-25 2024-01-09 深圳市先创数字技术有限公司 Target tracking and monitoring method and system based on radar detection

Similar Documents

Publication Publication Date Title
CN112669349A (en) Passenger flow statistical method, electronic equipment and storage medium
WO2013053159A1 (en) Method and device for tracking vehicle
CN110798805B (en) Data processing method and device based on GPS track and storage medium
CN111402293A (en) Vehicle tracking method and device for intelligent traffic
CN115330841A (en) Method, apparatus, device and medium for detecting projectile based on radar map
CN115049954A (en) Target identification method, device, electronic equipment and medium
CN112883236B (en) Map updating method and device, electronic equipment and storage medium
CN113255580A (en) Method and device for identifying sprinkled objects and vehicle sprinkling and leaking
CN115436900A (en) Target detection method, device, equipment and medium based on radar map
CN115641359B (en) Method, device, electronic equipment and medium for determining movement track of object
CN116990768A (en) Predicted track processing method and device, electronic equipment and readable medium
CN112507957B (en) Vehicle association method and device, road side equipment and cloud control platform
CN115953434A (en) Track matching method and device, electronic equipment and storage medium
CN115376106A (en) Vehicle type identification method, device, equipment and medium based on radar map
CN115526837A (en) Abnormal driving detection method and device, electronic equipment and medium
CN115359026A (en) Special vehicle traveling method and device based on microwave radar, electronic equipment and medium
CN109063675B (en) Traffic density calculation method, system, terminal and computer readable storage medium
CN115359030A (en) Ground foreign matter detection method, device, equipment and medium based on radar map
CN114581890B (en) Method and device for determining lane line, electronic equipment and storage medium
CN116258769B (en) Positioning verification method and device, electronic equipment and storage medium
CN116662788B (en) Vehicle track processing method, device, equipment and storage medium
CN115424442A (en) Radar map-based vehicle driving event detection method, device, equipment and medium
CN117853971A (en) Method, device, equipment and storage medium for detecting sprinkled object
CN117372477A (en) Target tracking matching method, device, equipment and medium
Wang et al. Efficient feature aided multi-object tracking in video surveillance

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination