CN112052878B - Method, device and storage medium for shielding identification of radar - Google Patents

Method, device and storage medium for shielding identification of radar

Info

Publication number
CN112052878B
CN112052878B (Application CN202010803000.0A)
Authority
CN
China
Prior art keywords
radar
image
pixel
parameter
value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010803000.0A
Other languages
Chinese (zh)
Other versions
CN112052878A (en)
Inventor
李冬冬
李乾坤
卢维
殷俊
王凯
汪巧斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd
Priority to CN202010803000.0A
Publication of CN112052878A
Application granted
Publication of CN112052878B
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/411Identification of targets based on measurements of radar reflectivity
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/417Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section involving the use of neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods


Abstract

The invention discloses a method, a device and a storage medium for shielding and identifying radar, which are used for solving the technical problem that the radar in the prior art cannot work normally due to shielding, and the method comprises the following steps: constructing a first image from multiple frames of radar data within a set duration; the method comprises the steps that a first image adopts a preset color mode, one pixel in the first image corresponds to one sub-area in a radar coverage area, the sizes of all the sub-areas are the same, and the values of a plurality of first parameters corresponding to the color value of one pixel are respectively determined by the values of a plurality of second parameters of all measuring points in the sub-area corresponding to the pixel in multi-frame radar data; identifying the first image by using the trained deep learning network model to determine whether the radar is shielded; the trained deep learning network model is obtained by training images with known occlusion types; and if the identification result is that the radar is shielded, sending out alarm information to prompt a user to clear a shielding object shielding the radar.

Description

Method, device and storage medium for shielding identification of radar
Technical Field
The invention relates to the field of security protection, in particular to a method, a device and a storage medium for shielding identification of radar.
Background
With the rapid development of security technology and the growing public attention to security, security products of all kinds have emerged in an endless stream, and the field of security applications continues to expand.
Area monitoring based on millimeter-wave radar has been a research hotspot over the past five years. Traditional security terminals rely mainly on visible-light cameras, which cannot work at night; infrared cameras can compensate for this drawback, but they clearly increase cost and operational difficulty. Optical sensors are also affected by weather, and their monitoring performance is unsatisfactory in fog, rain or snow. A millimeter-wave radar actively emits electromagnetic waves and receives return signals of the same frequency; it has a very high detection probability for moving objects and for objects with a large Radar Cross-Section (RCS), and a lower (but non-zero) detection probability for stationary objects. Moreover, a millimeter-wave radar can work around the clock and is little affected by weather. Hence there is currently a strong market demand for monitoring products based on millimeter-wave radar.
A millimeter-wave radar monitors many kinds of targets; it should extract the targets the user cares about and terminate or filter out uninteresting or false targets as early as possible. One purpose of target-trajectory classification is precisely this screening. For example, in a park a light (force 3) wind may occasionally blow, and the swaying of trees forms a target track that moves slowly within a small range; its target type is neither person, vehicle nor animal, so this type of target need not be reported, or a track-termination method should be invoked to delete the track as soon as possible. If a small dog runs through the park, its track should likewise be terminated promptly, since it is not a target of interest to the user (the targets of interest are people and vehicles). If, however, the track is formed by a pedestrian, the radar outputs the pedestrian's track information to a camera, and the camera photographs or records the pedestrian according to the spatial position information the radar provides.
A radar may fail for various reasons, one of the most common being occlusion by a shelter, for example a metal plate placed in front of the radar, or a cart parked directly in front of it. Such situations prevent the radar from working normally and from tracking targets correctly.
Disclosure of Invention
The invention provides a method, a device and a storage medium for shielding and identifying a radar, which are used for solving the technical problem that the radar in the prior art cannot work normally due to shielding.
In order to solve the above technical problems, a technical solution of a method for identifying shielding of a radar according to an embodiment of the present invention is as follows:
constructing a first image from multiple frames of radar data within a set duration; the first image adopts a preset color mode, one pixel in the first image corresponds to one sub-area in a radar coverage area, the sizes of all the sub-areas are the same, and the values of a plurality of first parameters corresponding to the color value of one pixel are respectively determined by the values of a plurality of second parameters of all measuring points in the sub-area corresponding to the pixel in the multi-frame radar data;
identifying the first image by using a trained deep learning network model to determine whether the radar is occluded; the trained deep learning network model is obtained by training images with known occlusion types;
and if the radar is shielded as a result of the identification, sending alarm information, wherein the alarm information is used for prompting a user to clear a shielding object shielding the radar.
One possible implementation manner constructs multiple frames of radar data within a set time period into a first image, including:
acquiring a measurement graph corresponding to each frame of radar data, and rasterizing the measurement graph, wherein one grid corresponds to one subarea;
performing sum operation on each second parameter value corresponding to all the measuring points in the same subarea in the multi-frame radar data to obtain a sum value of each second parameter corresponding to the subarea; the second parameters comprise the number of measuring points, radar reflection area and radial speed;
respectively carrying out normalization treatment on the sum value of each second parameter corresponding to the subarea, and then rounding to obtain a quantized value corresponding to the second parameter value; wherein, the value range of the quantized value is the value range of the first parameter corresponding to the second parameter;
and taking the quantized value of the second parameter corresponding to each grid as the first parameter value of the corresponding pixel in the first image to obtain the first image.
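As an illustration only (not the patented implementation), the construction steps above can be sketched in plain Python. The grid extent, cell size, and all function and variable names below are our assumptions; the three channels follow the second parameters named above (point count, logarithm of radar reflection area, radial velocity), and each channel is independently normalized to the 0-255 range of an RGB-style first parameter:

```python
import math

def build_first_image(frames, x_range=(-50.0, 50.0), y_range=(0.0, 100.0),
                      cell=10.0):
    """Accumulate multi-frame radar points into a rasterized grid and
    quantize each channel to 0-255. Each frame is a list of
    (x, y, rcs, radial_speed) tuples; grid extent and cell size are
    illustrative assumptions."""
    nx = int((x_range[1] - x_range[0]) / cell)
    ny = int((y_range[1] - y_range[0]) / cell)
    # three channels per cell: point count, summed log-RCS, summed radial speed
    grid = [[[0.0, 0.0, 0.0] for _ in range(nx)] for _ in range(ny)]
    for frame in frames:                              # sum over all frames
        for x, y, rcs, speed in frame:
            ix = int((x - x_range[0]) / cell)
            iy = int((y - y_range[0]) / cell)
            if 0 <= ix < nx and 0 <= iy < ny:
                grid[iy][ix][0] += 1                  # number of points
                grid[iy][ix][1] += math.log(rcs)      # log before summing
                grid[iy][ix][2] += speed              # radial speed
    # normalize each channel to 0-255 (an RGB-style first parameter) and round
    image = [[[0, 0, 0] for _ in range(nx)] for _ in range(ny)]
    for c in range(3):
        vals = [grid[iy][ix][c] for iy in range(ny) for ix in range(nx)]
        lo, hi = min(vals), max(vals)
        span = (hi - lo) or 1.0
        for iy in range(ny):
            for ix in range(nx):
                image[iy][ix][c] = round(255 * (grid[iy][ix][c] - lo) / span)
    return image
```

Per-channel min-max scaling is only one plausible normalization; the patent does not fix the exact normalization formula.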
In one possible implementation manner, before performing a sum operation on each of the second parameter values corresponding to all the measurement points located in the same sub-area in the multiple frames of radar data, the method further includes:
If the second parameter is the radar reflection area, carrying out logarithmic operation on the radar reflection area corresponding to each measuring point to obtain a new radar reflection area; wherein the new radar reflection area is used for the sum operation of the radar reflection area as the second parameter.
A possible implementation manner, obtaining the trained deep learning network model includes:
acquiring a plurality of second images; the second image adopts the preset color mode, one pixel in the second image corresponds to one subarea, the values of a plurality of first parameters corresponding to the color value of one pixel are respectively determined by the values of a plurality of second parameters of all measuring points in the subarea corresponding to the pixel in multi-frame sample radar data, and the sample radar data are respectively obtained by measuring the condition that the radar is shielded and the condition that the radar is not shielded;
randomly dividing the plurality of second images into three parts, namely a training set, a verification set and a test set;
training the deep learning network model by using the training set, verifying the trained deep learning network model by using the verification set, testing the verified deep learning network model by using the test set, and stopping training until the accuracy rate of the identification of the tested deep learning network model reaches a set threshold value to obtain the trained deep learning network model.
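The train/verify/test procedure with a stop threshold can be sketched schematically as follows; `train_step` and `evaluate` are hypothetical placeholders for whatever framework performs the actual training and accuracy measurement:

```python
def train_until_accurate(model, train_step, evaluate, train_set, val_set,
                         test_set, target_accuracy, max_epochs=100):
    """Repeat train -> verify -> test until the tested accuracy reaches
    the set threshold (or the epoch budget runs out)."""
    test_acc = 0.0
    for epoch in range(max_epochs):
        train_step(model, train_set)          # train on the training set
        val_acc = evaluate(model, val_set)    # verify on the validation set
        test_acc = evaluate(model, test_set)  # test on the test set
        if test_acc >= target_accuracy:       # stop once the threshold is met
            return model, epoch + 1, test_acc
    return model, max_epochs, test_acc
```

In a fuller version the validation accuracy `val_acc` would also drive model selection or early stopping; here it is computed only to mirror the three-stage procedure described above.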
A possible implementation manner, the deep learning network model includes:
3 input channels, 6 convolutional layers, 2 fully-connected layers, and 2 output channels;
the 3 input channels respectively input the values of the three first parameters corresponding to each pixel, the convolution layer is used for extracting the characteristics input from the input channels, the full connection layer is used for classifying the characteristics to determine whether the radar is shielded or not, and the output channel is used for outputting a classification result.
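A possible PyTorch sketch of such a network is shown below. The description fixes only the layer counts (3 input channels, 6 convolutional layers, 2 fully-connected layers, 2 output channels); the kernel sizes, channel widths, pooling schedule, and the 64x64 input resolution are our assumptions:

```python
import torch
import torch.nn as nn

class OcclusionNet(nn.Module):
    """3 input channels, 6 conv layers, 2 fully-connected layers,
    2 outputs (occluded / not occluded). Kernel sizes, channel widths
    and the 64x64 input resolution are assumptions; only the
    3 / 6 / 2 / 2 layer counts come from the description."""
    def __init__(self):
        super().__init__()
        chans = [3, 16, 32, 32, 64, 64, 64]
        layers = []
        for i in range(6):                        # 6 convolutional layers
            layers += [nn.Conv2d(chans[i], chans[i + 1], 3, padding=1),
                       nn.ReLU()]
            if i % 2 == 1:                        # halve resolution every 2 convs
                layers.append(nn.MaxPool2d(2))
        self.features = nn.Sequential(*layers)    # feature extraction
        self.classifier = nn.Sequential(          # 2 fully-connected layers
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, 128), nn.ReLU(),
            nn.Linear(128, 2))                    # 2 output channels
    def forward(self, x):
        return self.classifier(self.features(x))
```

With a 64x64 input, three pooling steps reduce the spatial size to 8x8, hence the 64 * 8 * 8 flattened dimension.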
In one possible implementation, the numbers of second images in the training set, the verification set and the test set are in the ratio 3:1:1.
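A random 3:1:1 split can be done, for example, as follows (the function name and seed handling are ours):

```python
import random

def split_3_1_1(images, seed=0):
    """Randomly divide the sample images into training, validation and
    test sets in a 3:1:1 ratio."""
    items = list(images)
    random.Random(seed).shuffle(items)   # random division, reproducible seed
    n_train = 3 * len(items) // 5        # 3 parts of 5
    n_val = len(items) // 5              # 1 part of 5
    return (items[:n_train],
            items[n_train:n_train + n_val],
            items[n_train + n_val:])
```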
A possible implementation manner, obtaining the sample radar data includes:
the radar, in different usage scenarios, transmits radar waves in different transmission frequency bands toward different known occlusion types, and the echo data are acquired;
and taking the echo data as the sample radar data.
A possible implementation, the known occlusion type, comprises:
metal shielding type, plastic shielding type, no shielding type.
A possible implementation, the preset color mode includes:
RGB color mode, HSB color mode, Lab color mode.
In a second aspect, an embodiment of the present invention provides an apparatus for identifying occlusion of a radar, including:
the construction unit is used for constructing a first image from multiple frames of radar data in a set duration; the first image adopts a preset color mode, one pixel in the first image corresponds to one sub-area in a radar coverage area, the sizes of all the sub-areas are the same, and the values of a plurality of first parameters corresponding to the color value of one pixel are respectively determined by the values of a plurality of second parameters of all measuring points in the sub-area corresponding to the pixel in the multi-frame radar data;
the identification unit is used for identifying the first image by using the trained deep learning network model so as to determine whether the radar is shielded; the trained deep learning network model is obtained by training images with known occlusion types;
and the processing unit is used for sending out alarm information if the radar is shielded as a recognition result, and the alarm information is used for prompting a user to clear a shielding object shielding the radar.
In one possible embodiment, the construction unit is specifically configured to:
Acquiring a measurement graph corresponding to each frame of radar data, and rasterizing the measurement graph, wherein one grid corresponds to one subarea;
performing sum operation on each second parameter value corresponding to all the measuring points in the same subarea in the multi-frame radar data to obtain a sum value of each second parameter corresponding to the subarea; the second parameters comprise the number of measuring points, radar reflection area and radial speed;
respectively carrying out normalization treatment on the sum value of each second parameter corresponding to the subarea, and then rounding to obtain a quantized value corresponding to the second parameter value; wherein, the value range of the quantized value is the value range of the first parameter corresponding to the second parameter;
and taking the quantized value of the second parameter corresponding to each grid as the first parameter value of the corresponding pixel in the first image to obtain the first image.
In a possible embodiment, the construction unit is further configured to:
if the second parameter is the radar reflection area, carrying out logarithmic operation on the radar reflection area corresponding to each measuring point to obtain a new radar reflection area; wherein the new radar reflection area is used for the sum operation of the radar reflection area as the second parameter.
In a possible embodiment, the apparatus further comprises a training unit, the training unit being configured to:
acquiring a plurality of second images; the second image adopts the preset color mode, one pixel in the second image corresponds to one subarea, the values of a plurality of first parameters corresponding to the color value of one pixel are respectively determined by the values of a plurality of second parameters of all measuring points in the subarea corresponding to the pixel in multi-frame sample radar data, and the sample radar data are respectively obtained by measuring the condition that the radar is shielded and the condition that the radar is not shielded;
randomly dividing the plurality of second images into three parts, namely a training set, a verification set and a test set;
training the deep learning network model by using the training set, verifying the trained deep learning network model by using the verification set, testing the verified deep learning network model by using the test set, and stopping training until the accuracy rate of the identification of the tested deep learning network model reaches a set threshold value to obtain the trained deep learning network model.
A possible implementation manner, the deep learning network model includes:
3 input channels, 6 convolutional layers, 2 fully-connected layers, and 2 output channels;
the 3 input channels respectively input the values of the three first parameters corresponding to each pixel, the convolution layer is used for extracting the characteristics input from the input channels, the full connection layer is used for classifying the characteristics to determine whether the radar is shielded or not, and the output channel is used for outputting a classification result.
In one possible implementation, the numbers of second images in the training set, the verification set and the test set are in the ratio 3:1:1.
In a possible embodiment, the construction unit is further configured to:
the radar, in different usage scenarios, transmits radar waves in different transmission frequency bands toward different known occlusion types, and the echo data are acquired;
and taking the echo data as the sample radar data.
A possible implementation, the known occlusion type, comprises:
metal shielding type, plastic shielding type, no shielding type.
A possible implementation, the preset color mode includes:
RGB color mode, HSB color mode, Lab color mode.
In a third aspect, an embodiment of the present invention further provides an apparatus for identifying occlusion of a radar, including:
at least one processor, and
a memory coupled to the at least one processor;
wherein the memory stores instructions executable by the at least one processor, the at least one processor performing the method of the first aspect described above by executing the instructions stored by the memory.
In a fourth aspect, an embodiment of the present invention further provides a readable storage medium, including:
the memory device is used for storing the data,
the memory is configured to store instructions that, when executed by the processor, cause an apparatus comprising the readable storage medium to perform the method as described in the first aspect above.
Through the technical scheme in the one or more embodiments of the present invention, the embodiments of the present invention have at least the following technical effects:
In the embodiments provided by the invention, a first image is constructed from multiple frames of radar data within a set duration; the first image is identified by a trained deep learning network model to determine whether the radar is occluded; and if the identification result is that the radar is occluded, alarm information is sent to prompt the user to clear the object occluding the radar. The first image adopts a preset color mode; one pixel in the first image corresponds to one sub-area of the radar coverage area; all sub-areas are the same size; and the values of the several first parameters corresponding to a pixel's color value are respectively determined by the values of several second parameters of all measurement points, in the multi-frame radar data, that lie in the sub-area corresponding to that pixel. The trained deep learning network model is obtained by training on images of known occlusion types. Because the multi-frame radar data are converted into a first image in the preset color mode and that image is identified by the trained deep learning model, occlusion of the radar is detected and the user is alerted promptly, so the occluding object can be cleared in time; the occlusion-induced failure of the radar (i.e., its abnormal state) is thus discovered in time, and the radar can be restored to its normal working state in time.
Drawings
FIG. 1 is a flowchart of a method for identifying occlusion of a radar according to an embodiment of the present invention;
FIG. 2 is a diagram of radar measurement corresponding to a frame of radar data according to an embodiment of the present invention;
FIG. 3 is a diagram of radar measurements after rasterization provided by an embodiment of the present invention;
FIG. 4 is a schematic diagram of distribution density of radar reflection areas without logarithmic operation according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of distribution density of radar reflection areas after logarithmic operation according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of a radar shielding recognition device according to an embodiment of the present invention.
Detailed Description
The embodiment of the invention provides a method, a device and a storage medium for shielding and identifying a radar, which are used for solving the technical problem that the radar in the prior art cannot work normally due to shielding.
The technical scheme in the embodiment of the application aims to solve the technical problems, and the overall thought is as follows:
the shielding identification method of the radar comprises the following steps: constructing a first image from multiple frames of radar data within a set duration; the method comprises the steps that a first image adopts a preset color mode, one pixel in the first image corresponds to one sub-area in a radar coverage area, the sizes of all the sub-areas are the same, and the values of a plurality of first parameters corresponding to the color value of one pixel are respectively determined by the values of a plurality of second parameters of all measuring points in the sub-area corresponding to the pixel in multi-frame radar data; identifying the first image by using the trained deep learning network model to determine whether the radar is shielded; the trained deep learning network model is obtained by training images with known occlusion types; if the radar is shielded as a result of the identification, alarm information is sent out, and the alarm information is used for prompting a user to clear a shielding object shielding the radar.
In this scheme, the multiple frames of radar data within the set duration are constructed into a first image; the first image is identified by the trained deep learning network model to determine whether the radar is occluded; and if the identification result is that the radar is occluded, alarm information is sent to prompt the user to clear the object occluding the radar. The first image adopts a preset color mode; one pixel in the first image corresponds to one sub-area of the radar coverage area; all sub-areas are the same size; and the values of the several first parameters corresponding to a pixel's color value are respectively determined by the values of several second parameters of all measurement points, in the multi-frame radar data, that lie in the sub-area corresponding to that pixel. The trained deep learning network model is obtained by training on images of known occlusion types. Because the multi-frame radar data are converted into a first image in the preset color mode and that image is identified by the trained deep learning model, occlusion of the radar is detected and the user is alerted promptly, so the occluding object can be cleared in time; the occlusion-induced failure of the radar (i.e., its abnormal state) is thus discovered in time, and the radar can be restored to its normal working state in time.
In order to better understand the above technical solutions, the following detailed description of the technical solutions of the present invention is made by using the accompanying drawings and specific embodiments, and it should be understood that the specific features of the embodiments and the embodiments of the present invention are detailed descriptions of the technical solutions of the present invention, and not limiting the technical solutions of the present invention, and the technical features of the embodiments and the embodiments of the present invention may be combined with each other without conflict.
Referring to fig. 1, an embodiment of the present invention provides a method for identifying shielding of a radar, and the processing procedure of the method is as follows.
Step 101: constructing a first image from multiple frames of radar data within a set duration; the first image adopts a preset color mode, one pixel in the first image corresponds to one sub-area in the radar coverage area, the sizes of all the sub-areas are the same, and the values of a plurality of first parameters corresponding to the color value of one pixel are respectively determined by the values of a plurality of second parameters of all measuring points in the sub-area corresponding to the pixel in the multi-frame radar data.
The measurement point is the target that the radar measures within its coverage area.
Taking a millimeter-wave radar as an example, the measurement information obtainable for a target (measurement point) includes: distance, angle, radial speed, and radar cross-section (RCS). The measurement period (i.e., the signal transmit/receive cycle) can be set as needed; it is generally set to 0.1 s or 0.05 s, i.e., a working frequency of 10 Hz or 20 Hz.
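For illustration, a measurement point and the period-to-frequency relation can be expressed as follows (field and function names are ours, not from the patent):

```python
from dataclasses import dataclass

@dataclass
class Measurement:
    """One radar measurement point, holding the quantities listed above.
    Field names are ours, not the patent's."""
    distance_m: float
    angle_deg: float
    radial_speed_mps: float
    rcs: float  # radar cross-section

def working_frequency_hz(period_s):
    """A measurement period of 0.1 s or 0.05 s corresponds to a working
    frequency of 10 Hz or 20 Hz."""
    return 1.0 / period_s
```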
Please refer to table 1 for an example of part of one frame of radar data. This frame contains 49 measurement points in total, and the measurement time is 2000-1-1:03:3:40.
TABLE 1
In table 1, a zero radial velocity indicates that the measurement point is stationary (e.g., measurement points 0-3 are stationary targets); a negative radial velocity indicates that the point is approaching the radar (e.g., measurement points 4-13 are moving targets approaching the radar); and a positive radial velocity indicates that the point is moving away from the radar (e.g., measurement point 14 is a moving target receding from the radar).
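The sign convention above can be captured in a small helper (hypothetical, for illustration only):

```python
def motion_state(radial_speed):
    """Classify a measurement point by the sign of its radial speed,
    following the convention described for Table 1."""
    if radial_speed == 0:
        return "stationary"
    return "approaching" if radial_speed < 0 else "receding"
```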
After acquiring multiple frames of radar data for a set period of time similar to that of table 1, they can be constructed into a first image, which can be achieved as follows:
and acquiring a measurement graph corresponding to each frame of radar data, and rasterizing the measurement graph, wherein one grid corresponds to one sub-area.
Fig. 2 is a diagram showing radar measurement corresponding to a frame of radar data according to an embodiment of the present invention. That is, a frame of radar data similar to that of table 1 is plotted on a single map to obtain a corresponding radar measurement map.
In fig. 2, the center position of the abscissa axis is indicated by 0, representing the position of the radar, the units of the abscissa axis and the ordinate axis are meters (m), the black solid dots in fig. 2 represent stationary targets, the open circles represent moving targets approaching the radar, and the gray solid circles represent moving targets moving away from the radar.
Fig. 3 is a diagram of radar measurement after rasterization according to an embodiment of the present invention. The area within a dashed box in fig. 3 is a sub-area (as shown by the gray grid line coverage area in fig. 3).
Then, performing sum operation on each second parameter value corresponding to all measuring points in the same subarea in multi-frame radar data to obtain a sum value of each second parameter corresponding to the subarea; the second parameters comprise the number of measuring points, radar reflection area and radial speed.
If the second parameter is the radar reflection area, carrying out logarithmic operation on the radar reflection area corresponding to each measuring point to obtain a new radar reflection area; wherein the new radar reflection area is used for the sum operation of the radar reflection area as the second parameter.
For example, suppose the multi-frame radar data consists of 2 frames. In one and the same sub-area, the first frame contains one measurement point (point count 1, radar reflection area 1.330, radial velocity 0), and the second frame contains two measurement points (point count 2, radar reflection areas 1.410 and 1.803, radial velocities -4.547 and -7.276). The sums of the second parameters over all measurement points in this sub-area are then: point count 3, radial velocity -11.823.
For the sum operation result of the radar reflection areas, the logarithmic operation must first be performed on the radar reflection area corresponding to each measuring point, and the logarithm results (i.e., the new radar reflection areas) are then summed.
Assuming the logarithm is taken to base e, the sum operation result of the radar reflection areas in the sub-area is: ln 1.330 + ln 1.410 + ln 1.803 ≈ 1.218.
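The worked example above can be re-computed in a few lines. The natural-log transform of the reflection areas before summation follows the description; the tuple layout of a measuring point is an assumption for illustration.

```python
# Re-computation of the example: two frames of measuring points in one
# sub-area, each point given as (radar reflection area, radial velocity).
# The number of points is counted, radial velocities are summed directly,
# and the reflection areas are summed after a natural-log transform.
import math

frame1 = [(1.330, 0.0)]
frame2 = [(1.410, -4.547), (1.803, -7.276)]
points = frame1 + frame2

num_points = len(points)                               # sum of point counts: 3
radial_velocity_sum = sum(v for _, v in points)        # -11.823
log_rcs_sum = sum(math.log(rcs) for rcs, _ in points)  # approx. 1.218
```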
In practical applications, the base of the logarithm may be chosen freely as needed.
Fig. 4 is a schematic diagram of distribution density of radar reflection areas without logarithmic operation according to an embodiment of the present invention, and fig. 5 is a schematic diagram of distribution density of radar reflection areas with logarithmic operation according to an embodiment of the present invention.
As can be seen from figs. 4 and 5, the distribution of radar reflection areas without the logarithmic operation is unbalanced, whereas the distribution after the logarithmic operation is relatively balanced, so that the gray values derived from the radar reflection area can be distributed evenly in the first image.
After the sum value of each second parameter in each sub-area is calculated, the sum value of each second parameter corresponding to the sub-area is normalized and then rounded to obtain a quantized value corresponding to that second parameter; the value range of the quantized value is the value range of the first parameter corresponding to the second parameter. The quantized values of the second parameters corresponding to each grid are then taken as the first parameter values of the corresponding pixel in the first image, thereby obtaining the first image.
The preset color mode adopted by the first image may be: the RGB color mode, the HSB color mode, or the Lab color mode.
If the preset color mode is the RGB color mode, since the value range of R, G, and B in the RGB color mode is 0-255, the sum of each second parameter corresponding to each sub-area is normalized and then rounded: the sum may be normalized directly to the range 0-255 and rounded, or normalized to the range 0-1, multiplied by 255, and then rounded; this is not particularly limited.
If the preset color mode is the HSB color mode, the value range of H in the HSB color mode is 0-360 and the value ranges of S and B are 0-100. Accordingly, when the sum of each second parameter corresponding to each sub-area is normalized and then rounded, the sum of one of the second parameters may be normalized to the range 0-360 and rounded (or normalized to 0-1, multiplied by 360, and then rounded), and the sums of the remaining two second parameters are normalized to the range 0-100 and rounded (or normalized to 0-1, multiplied by 100, and then rounded).
If the preset color mode is the Lab color mode, the value range of L in the Lab color mode is 0-100 and the value ranges of a and b are -128 to 127. Accordingly, when the sum of each second parameter corresponding to each sub-area is normalized and then rounded, the sum of one of the second parameters may be normalized to the range 0-100 and rounded (or normalized to 0-1, multiplied by 100, and then rounded), and the sums of the remaining two second parameters are normalized to the range -128 to 127 and then rounded.
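A sketch of the normalize-then-round quantization, assuming the RGB color mode (channel range 0-255) and a min-max normalization over all sub-areas; the specific normalization method is not fixed by the patent, so min-max is an assumption, and the sample sums are hypothetical.

```python
# Quantization sketch for one second parameter under the RGB color mode:
# the per-sub-area sums are min-max normalized to 0-1, scaled to the
# channel's value range (0-255), and rounded, giving one channel value
# per pixel of the first image.

def quantize(sums, lo=0, hi=255):
    """Normalize a list of per-sub-area sums to [lo, hi] and round."""
    mn, mx = min(sums), max(sums)
    if mx == mn:                      # all sub-areas equal: map to lo
        return [lo for _ in sums]
    return [round(lo + (s - mn) / (mx - mn) * (hi - lo)) for s in sums]

point_counts = [0, 3, 12, 7]          # hypothetical per-sub-area sums
channel = quantize(point_counts)      # one channel value per pixel
```

For the HSB or Lab modes, the same function would be called with the corresponding channel ranges (e.g. `quantize(sums, 0, 360)` for H, or `quantize(sums, -128, 127)` for a and b).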
Since normalization is a well-known technique, it is not described further here.
If the preset color mode adopted by the first image is a CMYK color mode, the value of one parameter in the CMYK mode can be set to be a constant value, and other parameters respectively correspond to the second parameters.
After calculating the first parameter of the corresponding pixel in the first image for each sub-area in the above manner, the first image can be obtained, and then step 102 can be executed.
Step 102: identifying the first image by using the trained deep learning network model to determine whether the radar is shielded; the trained deep learning network model is obtained by training images with known occlusion types.
Known occlusion types include: metal shielding type, plastic shielding type, no shielding type.
The trained deep learning network model is obtained by the following method:
acquiring a plurality of second images; the second image adopts a preset color mode, one pixel in the second image corresponds to one sub-area, the values of a plurality of first parameters corresponding to the color value of one pixel are respectively determined by the values of a plurality of second parameters of all measuring points in the sub-area corresponding to the pixel in the multi-frame sample radar data, and the sample radar data are respectively obtained by measuring the condition that the radar is shielded and the condition that the radar is not shielded.
Sample radar data may be obtained by:
the radar transmits radar waves, using different transmission frequency bands and in different usage scenes, toward obstructions of different known shielding types, and echo data are acquired; the echo data are taken as the sample radar data.
The transmission frequency bands used by the radar may include 24 GHz (24G), 60 GHz (60G), and 77 GHz (77G).
The use scene comprises: open scenes (i.e., scenes without occlusions, such as open lawns), shrub forest scenes, traffic intersection scenes (usually more vehicles in this scene).
For example, in an open scene, the radar respectively uses the emission frequency bands of 24G, 60G and 77G to emit radar waves of a metal shielding type, a plastic shielding type and a non-shielding type, and echo data is obtained as sample radar data.
And respectively transmitting radar waves to a metal shielding type, a plastic shielding type and a non-shielding type by using the radar in 24G, 60G and 77G transmission frequency bands under the shrub scene, and acquiring echo data as sample radar data.
And under the traffic intersection scene, the radar respectively uses the emission frequency bands of 24G, 60G and 77G to emit radar waves for the metal shielding type, the plastic shielding type and the non-shielding type, and echo data is obtained as sample radar data.
Since the manner of converting the multi-frame sample radar data into the corresponding second image is the same as the manner of converting the multi-frame radar data into the first image, the description thereof will not be repeated here.
After the plurality of second images are acquired, they are randomly divided into three parts: a training set, a verification set, and a test set. For example, the numbers of second images in the training set, verification set, and test set may be allocated in a 3:1:1 ratio. Since it is known at acquisition time whether each frame of sample radar data was collected with the radar occluded, each second image can be labeled as occluded or not when it is generated.
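The random 3:1:1 split described above can be sketched as follows; the seed and the integer arithmetic used for the split sizes are illustrative choices.

```python
# Randomly split the labeled second images into training, verification,
# and test sets in a 3:1:1 ratio, as described above.
import random

def split_3_1_1(images, seed=0):
    shuffled = list(images)
    random.Random(seed).shuffle(shuffled)   # fixed seed for reproducibility
    n = len(shuffled)
    n_train = n * 3 // 5                    # 3 parts out of 5
    n_val = n // 5                          # 1 part out of 5
    train = shuffled[:n_train]
    val = shuffled[n_train:n_train + n_val]
    test = shuffled[n_train + n_val:]       # remainder, roughly 1 part
    return train, val, test

train, val, test = split_3_1_1(range(100))
# with 100 images: 60 training, 20 verification, 20 test
```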
The deep learning network model is then trained with the training set, the trained model is verified with the verification set, and the verified model is tested with the test set; training stops when the recognition accuracy of the tested deep learning network model reaches a set threshold, yielding the trained deep learning network model.
The deep learning network model includes: 3 input channels, 6 convolutional layers, 2 fully-connected layers, and 2 output channels. The 3 input channels respectively receive the values of the three first parameters corresponding to each pixel; the convolutional layers extract features from the input; the fully-connected layers classify the features to determine whether the radar is occluded; and the output channels output the classification result.
It should be noted that the number of convolution layers and full connection layers included in the deep learning network model may be other numbers according to actual needs, and should not be construed as being limited to 6 convolution layers and 2 full connection layers.
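Since only the layer counts (3 input channels, 6 convolutional layers, 2 fully-connected layers, 2 output channels) are specified, and not kernel sizes, strides, or channel widths, the sketch below merely propagates feature-map shapes through one assumed configuration (3x3 kernels, stride 2, padding 1) to show how the input size of the first fully-connected layer would be derived.

```python
# Shape propagation through 6 assumed convolutional layers. Kernel size,
# stride, padding, channel widths, and the 128x128 input resolution are
# all illustrative assumptions, not values from the patent.

def conv_out(size, kernel=3, stride=2, padding=1):
    """Standard output-size formula for a 2D convolution."""
    return (size + 2 * padding - kernel) // stride + 1

def feature_sizes(h, w, channels=(16, 32, 64, 64, 128, 128)):
    """Shapes after each of the 6 assumed conv layers, for an HxWx3 input."""
    shapes = []
    for c in channels:
        h, w = conv_out(h), conv_out(w)
        shapes.append((c, h, w))
    return shapes

shapes = feature_sizes(128, 128)
c, h, w = shapes[-1]
flat = c * h * w   # flattened size feeding the first fully-connected layer
```

The two fully-connected layers would then map `flat` features down to the 2 output channels.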
After obtaining the trained deep learning network model, the first image may be identified using the trained deep learning network model to determine whether the radar is occluded, after which step 103 may be performed.
Step 103: if the radar is shielded as a result of the identification, alarm information is sent out, and the alarm information is used for prompting a user to clear a shielding object shielding the radar.
The alarm information may be, for example, a voice message, a text message, or an indicator light.
Because a blocked radar cannot work normally, a worker who receives the alarm information can locate the blocking object in time and clear it, so that the radar can resume normal operation.
Because occlusion of the radar can be discovered automatically in this manner, without manual intervention, the maintenance workload for the radar is further reduced.
Based on the same inventive concept, an embodiment of the present invention provides a device for occlusion recognition of a radar. For the specific implementation of the radar occlusion recognition method performed by the device, refer to the description in the method embodiments; details are not repeated here. Referring to fig. 6, the device includes:
A construction unit 601, configured to construct a first image from multiple frames of radar data within a set duration; the first image adopts a preset color mode, one pixel in the first image corresponds to one sub-area in a radar coverage area, the sizes of all the sub-areas are the same, and the values of a plurality of first parameters corresponding to the color value of one pixel are respectively determined by the values of a plurality of second parameters of all measuring points in the sub-area corresponding to the pixel in the multi-frame radar data;
an identifying unit 602, configured to identify the first image with a trained deep learning network model, so as to determine whether the radar is occluded; the trained deep learning network model is obtained by training images with known occlusion types;
and the processing unit 603 is configured to send out alarm information if the radar is blocked as a result of the identification, where the alarm information is used to prompt a user to clear a blocking object that blocks the radar.
In one possible embodiment, the construction unit 601 is specifically configured to:
acquiring a measurement graph corresponding to each frame of radar data, and rasterizing the measurement graph, wherein one grid corresponds to one subarea;
Performing sum operation on each second parameter value corresponding to all the measuring points in the same subarea in the multi-frame radar data to obtain a sum value of each second parameter corresponding to the subarea; the second parameters comprise the number of measuring points, radar reflection area and radial speed;
respectively carrying out normalization treatment on the sum value of each second parameter corresponding to the subarea, and then rounding to obtain a quantized value corresponding to the second parameter value; wherein, the value range of the quantized value is the value range of the first parameter corresponding to the second parameter;
and taking the quantized value of the second parameter corresponding to each grid as the first parameter value of the corresponding pixel in the first image to obtain the first image.
In a possible embodiment, the construction unit 601 is further configured to:
if the second parameter is the radar reflection area, carrying out logarithmic operation on the radar reflection area corresponding to each measuring point to obtain a new radar reflection area; wherein the new radar reflection area is used for the sum operation of the radar reflection area as the second parameter.
In a possible implementation manner, the apparatus further includes a training unit 604, where the training unit 604 is configured to:
Acquiring a plurality of second images; the second image adopts the preset color mode, one pixel in the second image corresponds to one subarea, the values of a plurality of first parameters corresponding to the color value of one pixel are respectively determined by the values of a plurality of second parameters of all measuring points in the subarea corresponding to the pixel in multi-frame sample radar data, and the sample radar data are respectively obtained by measuring the condition that the radar is shielded and the condition that the radar is not shielded;
randomly dividing the plurality of second images into three parts, namely a training set, a verification set and a test set;
training the deep learning network model by using the training set, verifying the trained deep learning network model by using the verification set, testing the verified deep learning network model by using the test set, and stopping training until the accuracy rate of the identification of the tested deep learning network model reaches a set threshold value to obtain the trained deep learning network model.
A possible implementation manner, the deep learning network model includes:
3 input channels, 6 convolutional layers and 2 fully-connected layers, 2 output channels;
The 3 input channels respectively input the values of the three first parameters corresponding to each pixel, the convolution layer is used for extracting the characteristics input from the input channels, the full connection layer is used for classifying the characteristics to determine whether the radar is shielded or not, and the output channel is used for outputting a classification result.
In one possible implementation, the ratio of the numbers of second images in the training set, the verification set, and the test set is 3:1:1.
In a possible embodiment, the construction unit 601 is further configured to:
the radar transmits radar waves to different known shielding types under different use scenes by using different transmission frequency bands, and echo data are obtained;
and taking the echo data as the sample radar data.
A possible implementation, the known occlusion type, comprises:
metal shielding type, plastic shielding type, no shielding type.
A possible implementation, the preset color mode includes:
RGB color mode, HSB color mode, lab color mode.
Based on the same inventive concept, an embodiment of the present invention provides a device for identifying occlusion of a radar, including: at least one processor, and
A memory coupled to the at least one processor;
wherein the memory stores instructions executable by the at least one processor, the at least one processor executing the occlusion recognition method of radar as described above by executing the instructions stored by the memory.
Based on the same inventive concept, an embodiment of the present invention also provides a readable storage medium, comprising a memory, the memory being configured to store instructions that, when executed by a processor, cause an apparatus comprising the readable storage medium to perform the radar occlusion recognition method described above.
It will be appreciated by those skilled in the art that embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the invention may take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (11)

1. A method of occlusion recognition for a radar, comprising:
constructing a first image from multiple frames of radar data within a set duration; the first image adopts a preset color mode, one pixel in the first image corresponds to one sub-area in a radar coverage area, the sizes of all the sub-areas are the same, and the values of a plurality of first parameters corresponding to the color value of one pixel are respectively determined by the values of a plurality of second parameters of all measuring points in the sub-area corresponding to the pixel in the multi-frame radar data;
identifying the first image by using a trained deep learning network model to determine whether the radar is occluded; the trained deep learning network model is obtained by training images with known occlusion types;
if the radar is shielded as a result of the identification, sending alarm information, wherein the alarm information is used for prompting a user to clear a shielding object shielding the radar;
the method comprises the steps of constructing a first image from multiple frames of radar data in a set duration, and the method comprises the following steps:
acquiring a measurement graph corresponding to each frame of radar data, and rasterizing the measurement graph, wherein one grid corresponds to one subarea;
Performing sum operation on each second parameter value corresponding to all the measuring points in the same subarea in the multi-frame radar data to obtain a sum value of each second parameter corresponding to the subarea; the second parameters comprise the number of measuring points, radar reflection area and radial speed;
respectively carrying out normalization treatment on the sum value of each second parameter corresponding to the subarea, and then rounding to obtain a quantized value corresponding to the second parameter value; wherein, the value range of the quantized value is the value range of the first parameter corresponding to the second parameter;
and taking the quantized value of the second parameter corresponding to each grid as the first parameter value of the corresponding pixel in the first image to obtain the first image.
2. The method of claim 1, wherein prior to performing a sum operation on each of the second parameter values corresponding to all of the measurement points located within the same sub-region in the plurality of frames of radar data, further comprising:
if the second parameter is the radar reflection area, carrying out logarithmic operation on the radar reflection area corresponding to each measuring point to obtain a new radar reflection area; wherein the new radar reflection area is used for the sum operation of the radar reflection area as the second parameter.
3. The method of claim 1, wherein obtaining the trained deep learning network model comprises:
acquiring a plurality of second images; the second image adopts the preset color mode, one pixel in the second image corresponds to one subarea, the values of a plurality of first parameters corresponding to the color value of one pixel are respectively determined by the values of a plurality of second parameters of all measuring points in the subarea corresponding to the pixel in multi-frame sample radar data, and the sample radar data are respectively obtained by measuring the condition that the radar is shielded and the condition that the radar is not shielded;
randomly dividing the plurality of second images into three parts, namely a training set, a verification set and a test set;
training the deep learning network model by using the training set, verifying the trained deep learning network model by using the verification set, testing the verified deep learning network model by using the test set, and stopping training until the accuracy rate of the identification of the tested deep learning network model reaches a set threshold value to obtain the trained deep learning network model.
4. The method of claim 3, wherein the deep learning network model comprises:
3 input channels, 6 convolutional layers and 2 fully-connected layers, 2 output channels;
the 3 input channels respectively input the values of the three first parameters corresponding to each pixel, the convolution layer is used for extracting the characteristics input from the input channels, the full connection layer is used for classifying the characteristics to determine whether the radar is shielded or not, and the output channel is used for outputting a classification result.
5. The method of claim 3, wherein the number of second images in the training set, validation set, and test set is a 3:1:1 ratio.
6. A method according to claim 3, wherein obtaining the sample radar data comprises:
the radar transmits radar waves to different known shielding types under different use scenes by using different transmission frequency bands, and echo data are obtained;
and taking the echo data as the sample radar data.
7. The method of claim 1, wherein the known occlusion type comprises:
metal shielding type, plastic shielding type, no shielding type.
8. The method of any one of claims 1-7, wherein the pre-set color pattern comprises:
RGB color mode, HSB color mode, lab color mode.
9. An apparatus for occlusion recognition of a radar, comprising:
the construction unit is used for constructing a first image from multiple frames of radar data in a set duration; the first image adopts a preset color mode, one pixel in the first image corresponds to one sub-area in a radar coverage area, the sizes of all the sub-areas are the same, and the values of a plurality of first parameters corresponding to the color value of one pixel are respectively determined by the values of a plurality of second parameters of all measuring points in the sub-area corresponding to the pixel in the multi-frame radar data;
the identification unit is used for identifying the first image by using the trained deep learning network model so as to determine whether the radar is shielded; the trained deep learning network model is obtained by training images with known occlusion types;
the processing unit is used for sending alarm information if the radar is shielded as a result of the identification, and the alarm information is used for prompting a user to clear a shielding object shielding the radar;
wherein the constructing a first image from multiple frames of radar data within a set duration comprises:
acquiring a measurement graph corresponding to each frame of radar data, and rasterizing the measurement graph, wherein one grid corresponds to one subarea;
performing sum operation on each second parameter value corresponding to all the measuring points in the same subarea in the multi-frame radar data to obtain a sum value of each second parameter corresponding to the subarea; the second parameters comprise the number of measuring points, radar reflection area and radial speed;
respectively carrying out normalization treatment on the sum value of each second parameter corresponding to the subarea, and then rounding to obtain a quantized value corresponding to the second parameter value; wherein, the value range of the quantized value is the value range of the first parameter corresponding to the second parameter;
and taking the quantized value of the second parameter corresponding to each grid as the first parameter value of the corresponding pixel in the first image to obtain the first image.
10. An apparatus for occlusion recognition of a radar, comprising:
at least one processor, and
A memory coupled to the at least one processor;
wherein the memory stores instructions executable by the at least one processor, the at least one processor performing the method of any of claims 1-8 by executing the instructions stored by the memory.
11. A computer-readable storage medium comprising a memory,
the memory is configured to store instructions that, when executed by a processor, cause an apparatus comprising the readable storage medium to perform the method of any of claims 1-8.
CN202010803000.0A 2020-08-11 2020-08-11 Method, device and storage medium for shielding identification of radar Active CN112052878B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010803000.0A CN112052878B (en) 2020-08-11 2020-08-11 Method, device and storage medium for shielding identification of radar

Publications (2)

Publication Number Publication Date
CN112052878A CN112052878A (en) 2020-12-08
CN112052878B true CN112052878B (en) 2024-04-16

Family

ID=73601420


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113009449B (en) * 2021-03-10 2022-08-26 森思泰克河北科技有限公司 Radar shielding state identification method and device and terminal equipment
CN113095240B (en) * 2021-04-16 2023-08-29 青岛海尔电冰箱有限公司 Method for identifying article information in refrigerator, refrigerator and computer storage medium
CN115542296B (en) * 2021-06-29 2024-03-08 苏州一径科技有限公司 Dirty spot and dirty detection method of laser radar and electronic device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN208537711U (en) * 2018-06-21 2019-02-22 北京汽车股份有限公司 Radar sensor failure monitor processing unit
CN109490889A (en) * 2017-09-12 2019-03-19 比亚迪股份有限公司 Trailer-mounted radar and judge the method, apparatus whether trailer-mounted radar is blocked
CN110441753A (en) * 2019-09-19 2019-11-12 森思泰克河北科技有限公司 Radar occlusion detection method and radar
CN110717480A (en) * 2019-10-25 2020-01-21 中国人民解放军国防科技大学 Synthetic aperture radar shielding target identification method based on random erasure image fusion

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11353577B2 (en) * 2018-09-28 2022-06-07 Zoox, Inc. Radar spatial estimation
US11280883B2 (en) * 2019-01-25 2022-03-22 Veoneer Us, Inc. Apparatus and method for detecting radar sensor blockage using machine learning




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant