CN112052878A - Radar shielding identification method and device and storage medium

Info

Publication number: CN112052878A (application CN202010803000.0A; granted as CN112052878B)
Authority: CN (China)
Original language: Chinese (zh)
Prior art keywords: radar, image, pixel, deep learning, network model
Inventors: 李冬冬, 李乾坤, 卢维, 殷俊, 王凯, 汪巧斌
Applicant / assignee: Zhejiang Dahua Technology Co Ltd
Filed: 2020-08-11 (priority date 2020-08-11); published: 2020-12-08; granted: 2024-04-16
Legal status: Granted, Active

Classifications

    • G06F18/214 — Pattern recognition; generating training patterns; bootstrap methods, e.g. bagging or boosting
    • G01S7/411 — Radar systems; target characterisation using analysis of echo signal; identification of targets based on measurements of radar reflectivity
    • G01S7/417 — Radar systems; target characterisation using analysis of echo signal, involving the use of neural networks
    • G06F18/24 — Pattern recognition; classification techniques
    • G06N3/045 — Neural networks; combinations of networks
    • G06N3/08 — Neural networks; learning methods


Abstract

The invention discloses a radar occlusion identification method, device, and storage medium for solving the prior-art problem that a radar blocked by an obstruction cannot work normally. The method comprises: constructing a first image from multiple frames of radar data collected within a set time length, where the first image uses a preset color mode, each pixel of the first image corresponds to one sub-region of the radar coverage area, all sub-regions are the same size, and the values of the several first parameters that make up a pixel's color value are determined by the values of several second parameters of all measurement points, across the multi-frame radar data, that lie in the sub-region corresponding to that pixel; identifying the first image with a trained deep learning network model to determine whether the radar is occluded, the trained model being obtained by training on images of known occlusion types; and, if the identification result is that the radar is occluded, sending alarm information prompting a user to clear the obstruction blocking the radar.

Description

Radar shielding identification method and device and storage medium
Technical Field
The invention relates to the field of security, and in particular to a radar occlusion identification method, device, and storage medium.
Background
With the rapid development of security technology and the public's growing attention to security, new security products keep emerging and the field of security applications keeps expanding.
Area monitoring based on millimeter-wave radar has been a major research hotspot over the past five years. The traditional security terminal is mainly the visible-light camera, which cannot work at night; an infrared camera can complement a visible-light camera, but this undoubtedly increases cost and operational difficulty. Optical sensors are also affected by the weather, and their monitoring performance is unsatisfactory in heavy fog, rain, or snow. A millimeter-wave radar, by contrast, actively transmits electromagnetic waves and receives return signals at the same frequency; it has a very high detection probability for moving objects and for objects with a large radar cross-section (RCS), and a low (but non-zero) detection probability for stationary objects. Moreover, a millimeter-wave radar works around the clock and is little affected by weather. Monitoring products based on millimeter-wave radar are therefore in strong demand in the current market.
A millimeter-wave radar monitors targets of many kinds; the targets the user is interested in must be extracted, and uninteresting or false targets should be terminated/filtered out as early as possible. One purpose of target-trajectory classification is precisely this filtering. For example, in a park an occasional level-3 wind makes the trees sway, forming a low-speed target track that moves within a small range; its target type is neither person, vehicle, nor animal, so it need not be reported, and the track-termination procedure should be invoked as soon as possible to delete the track. If a small dog passes through the park, its track should likewise be terminated in time, since it is not a target the user cares about (the user's targets of interest being people or vehicles). If, however, the track is formed by a pedestrian, the radar outputs the pedestrian's track information to a camera, and the camera photographs or records video according to the spatial track positions provided by the radar.
A radar may enter a failure state for various reasons, one of the most common being occlusion: for example, a metal plate placed in front of the radar, or a large vehicle parked directly in front of it. Such situations prevent the radar from working normally, so target tracking can no longer be performed.
Disclosure of Invention
The invention provides a radar occlusion identification method, device, and storage medium to solve the prior-art problem that a radar blocked by an obstruction cannot work normally.
In a first aspect, to solve the above technical problem, an embodiment of the present invention provides a radar occlusion identification method whose technical solution is as follows:
constructing a first image from multiple frames of radar data collected within a set time length, where the first image uses a preset color mode, each pixel of the first image corresponds to one sub-region of the radar coverage area, all sub-regions are the same size, and the values of the several first parameters that make up a pixel's color value are determined, respectively, by the values of several second parameters of all measurement points, across the multi-frame radar data, that lie in the sub-region corresponding to that pixel;
identifying the first image with a trained deep learning network model to determine whether the radar is occluded, the trained deep learning network model being obtained by training on images of known occlusion types;
and, if the identification result is that the radar is occluded, sending alarm information prompting a user to clear the obstruction blocking the radar.
In one possible implementation, constructing the first image from the multiple frames of radar data within the set time length comprises:
acquiring a measurement map corresponding to each frame of radar data and rasterizing the measurement map, where one grid cell corresponds to one sub-region;
summing, over the multi-frame radar data, each second-parameter value of all measurement points located in the same sub-region to obtain a sum of each second parameter for that sub-region, the second parameters comprising the number of measurement points, the radar reflection area, and the radial velocity;
normalizing each second-parameter sum of the sub-region and then rounding it to obtain a quantized value of that second parameter, the range of the quantized value being the value range of the first parameter corresponding to that second parameter;
and taking the quantized second-parameter values of each grid cell as the first-parameter values of the corresponding pixel to obtain the first image.
In one possible implementation, before summing each second-parameter value of all measurement points located in the same sub-region over the multi-frame radar data, the method further comprises:
if a second parameter is the radar reflection area, taking the logarithm of the radar reflection area of each measurement point to obtain a new radar reflection area, the new radar reflection areas being the quantities summed for that second parameter.
In one possible implementation, the trained deep learning network model is obtained as follows:
acquiring a plurality of second images, where each second image uses the preset color mode, one pixel of a second image corresponds to one sub-region, and the values of the several first parameters making up a pixel's color value are determined, respectively, by the values of the several second parameters of all measurement points, across multi-frame sample radar data, that lie in the sub-region corresponding to that pixel, the sample radar data being measured both with the radar occluded and with the radar unoccluded;
randomly dividing the plurality of second images into a training set, a validation set, and a test set;
and training the deep learning network model on the training set, validating the trained model on the validation set, and testing the validated model on the test set, stopping training once the recognition accuracy of the tested model reaches a set threshold, thereby obtaining the trained deep learning network model.
In one possible implementation, the deep learning network model comprises:
3 input channels, 6 convolutional layers, 2 fully connected layers, and 2 output channels;
the 3 input channels respectively receiving the values of the three first parameters of each pixel, the convolutional layers extracting features from the input, the fully connected layers classifying the features to determine whether the radar is occluded, and the output channels outputting the classification result.
In one possible implementation, the numbers of second images in the training set, validation set, and test set are in the ratio 3:1:1.
In one possible implementation, the sample radar data are obtained as follows:
the radar transmits radar waves in different transmission frequency bands, in different usage scenarios, toward obstructions of different known occlusion types to obtain echo data;
and the echo data are taken as the sample radar data.
In one possible implementation, the known occlusion types comprise:
the metal occlusion type, the plastic occlusion type, and the non-occlusion type.
In one possible implementation, the preset color mode comprises:
the RGB color mode, the HSB color mode, or the Lab color mode.
In a second aspect, an embodiment of the present invention provides a radar occlusion identification apparatus, comprising:
a construction unit, configured to construct a first image from multiple frames of radar data within a set time length, where the first image uses a preset color mode, each pixel of the first image corresponds to one sub-region of the radar coverage area, all sub-regions are the same size, and the values of the several first parameters that make up a pixel's color value are determined, respectively, by the values of several second parameters of all measurement points, across the multi-frame radar data, that lie in the sub-region corresponding to that pixel;
an identification unit, configured to identify the first image with a trained deep learning network model to determine whether the radar is occluded, the trained deep learning network model being obtained by training on images of known occlusion types;
and a processing unit, configured to send alarm information if the identification result is that the radar is occluded, the alarm information prompting a user to clear the obstruction blocking the radar.
In one possible implementation, the construction unit is specifically configured to:
acquire a measurement map corresponding to each frame of radar data and rasterize the measurement map, where one grid cell corresponds to one sub-region;
sum, over the multi-frame radar data, each second-parameter value of all measurement points located in the same sub-region to obtain a sum of each second parameter for that sub-region, the second parameters comprising the number of measurement points, the radar reflection area, and the radial velocity;
normalize each second-parameter sum of the sub-region and then round it to obtain a quantized value of that second parameter, the range of the quantized value being the value range of the first parameter corresponding to that second parameter;
and take the quantized second-parameter values of each grid cell as the first-parameter values of the corresponding pixel to obtain the first image.
In one possible implementation, the construction unit is further configured to:
if a second parameter is the radar reflection area, take the logarithm of the radar reflection area of each measurement point to obtain a new radar reflection area, the new radar reflection areas being the quantities summed for that second parameter.
In one possible implementation, the apparatus further comprises a training unit configured to:
acquire a plurality of second images, where each second image uses the preset color mode, one pixel of a second image corresponds to one sub-region, and the values of the several first parameters making up a pixel's color value are determined, respectively, by the values of the several second parameters of all measurement points in the corresponding sub-region across multi-frame sample radar data, the sample radar data being measured both with the radar occluded and with the radar unoccluded;
randomly divide the plurality of second images into a training set, a validation set, and a test set;
and train the deep learning network model on the training set, validate the trained model on the validation set, and test the validated model on the test set, stopping training once the recognition accuracy of the tested model reaches a set threshold, thereby obtaining the trained deep learning network model.
In one possible implementation, the deep learning network model comprises:
3 input channels, 6 convolutional layers, 2 fully connected layers, and 2 output channels;
the 3 input channels respectively receiving the values of the three first parameters of each pixel, the convolutional layers extracting features from the input, the fully connected layers classifying the features to determine whether the radar is occluded, and the output channels outputting the classification result.
In one possible implementation, the numbers of second images in the training set, validation set, and test set are in the ratio 3:1:1.
In one possible implementation, the construction unit is further configured to:
transmit, via the radar, radar waves in different transmission frequency bands, in different usage scenarios, toward obstructions of different known occlusion types to obtain echo data;
and take the echo data as the sample radar data.
In one possible implementation, the known occlusion types comprise:
the metal occlusion type, the plastic occlusion type, and the non-occlusion type.
In one possible implementation, the preset color mode comprises:
the RGB color mode, the HSB color mode, or the Lab color mode.
In a third aspect, an embodiment of the present invention further provides a radar occlusion identification device, comprising:
at least one processor, and
a memory connected to the at least one processor;
wherein the memory stores instructions executable by the at least one processor, and the at least one processor performs the method of the first aspect by executing the instructions stored in the memory.
In a fourth aspect, an embodiment of the present invention further provides a readable storage medium, comprising:
a memory for storing instructions which, when executed by a processor, cause an apparatus comprising the readable storage medium to perform the method of the first aspect.
The technical solutions in one or more of the above embodiments of the present invention achieve at least the following technical effects:
In the embodiments provided by the invention, a first image is constructed from multiple frames of radar data within a set time length; the first image is identified with a trained deep learning network model to determine whether the radar is occluded; and, if the identification result is that the radar is occluded, alarm information is sent prompting a user to clear the obstruction blocking the radar. The first image uses a preset color mode, each pixel of the first image corresponds to one sub-region of the radar coverage area, all sub-regions are the same size, and the values of the several first parameters making up a pixel's color value are determined, respectively, by the values of several second parameters of all measurement points in the corresponding sub-region across the multi-frame radar data; the trained deep learning network model is obtained by training on images of known occlusion types. Because the multi-frame radar data are converted into a first image in a preset color mode and that image is identified by the trained deep learning model, a radar that has failed because of occlusion (i.e., is in an abnormal state) is discovered in time, the alarm information prompts the user, the user can promptly remove the obstruction, and the radar can promptly return to its normal working state.
Drawings
Fig. 1 is a flowchart of a radar occlusion identification method according to an embodiment of the present invention;
Fig. 2 is a radar measurement map corresponding to one frame of radar data according to an embodiment of the present invention;
Fig. 3 is a rasterized radar measurement map according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of the distribution density of radar reflection areas without the logarithm operation according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of the distribution density of radar reflection areas after the logarithm operation according to an embodiment of the present invention;
Fig. 6 is a schematic structural diagram of a radar occlusion identification apparatus according to an embodiment of the present invention.
Detailed Description
The embodiments of the invention provide a radar occlusion identification method, device, and storage medium to solve the prior-art problem that a radar blocked by an obstruction cannot work normally.
To solve this technical problem, the general idea of the embodiments of the present application is as follows:
A radar occlusion identification method is provided, comprising: constructing a first image from multiple frames of radar data within a set time length, where the first image uses a preset color mode, each pixel of the first image corresponds to one sub-region of the radar coverage area, all sub-regions are the same size, and the values of the several first parameters making up a pixel's color value are determined by the values of several second parameters of all measurement points in the corresponding sub-region across the multi-frame radar data; identifying the first image with a trained deep learning network model to determine whether the radar is occluded, the trained model being obtained by training on images of known occlusion types; and, if the identification result is that the radar is occluded, sending alarm information prompting a user to clear the obstruction blocking the radar.
In this scheme, because the multi-frame radar data are converted into a first image in a preset color mode and that image is identified by the trained deep learning model, an occlusion-induced failure (i.e., an abnormal state) of the radar is discovered in time: alarm information is sent when the radar is occluded, the user is prompted and can promptly remove the obstruction, and the radar can promptly return to its normal working state.
To better understand the above technical solutions, they are described in detail below with reference to the accompanying drawings and specific embodiments. It should be understood that the specific features in the embodiments and examples of the present invention are detailed illustrations of the technical solutions, not limitations of them, and that the technical features in the embodiments and examples may be combined with one another provided there is no conflict.
Referring to fig. 1, an embodiment of the invention provides a radar occlusion identification method comprising the following processing steps.
Step 101: construct a first image from multiple frames of radar data within a set time length. The first image uses a preset color mode, each pixel of the first image corresponds to one sub-region of the radar coverage area, all sub-regions are the same size, and the values of the several first parameters making up a pixel's color value are determined, respectively, by the values of several second parameters of all measurement points in the corresponding sub-region across the multi-frame radar data.
A measurement point is an object measured by the radar within its coverage area.
Taking a millimeter-wave radar as an example, the measurement information obtainable for a target (measurement point) includes: distance, angle, radial velocity (radial speed), and radar reflection area (RCS). The measurement period (i.e., the signal transmit/receive period) of a millimeter-wave radar can be set as required and is generally 0.1 s or 0.05 s, i.e., an operating frequency of 10 Hz or 20 Hz.
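As an illustration only (not part of the patent), one frame of such radar output can be modelled as a list of measurement points carrying exactly these four quantities. The following Python sketch fixes the hypothetical names used by the later examples; the field and type names are editorial assumptions:
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Measurement:
        distance_m: float         # radial distance to the target, in meters
        angle_deg: float          # azimuth angle, in degrees
        radial_speed_mps: float   # < 0 approaching, > 0 receding, 0 stationary
        rcs: float                # radar reflection area (RCS), assumed positive

    # One frame = all measurement points returned in one measurement period
    # (e.g. 0.1 s at a 10 Hz operating frequency).
    Frame = List[Measurement]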
Table 1 shows an example of part of the data in one frame of radar data. The frame contains 49 measurement points in total, and the measurement time is 2000-01-01 03:03:40.
TABLE 1
[Table 1 appears in the original only as an image (per-point distance, angle, radial velocity, and radar reflection area for the 49 measurement points); it is not reproduced here.]
In Table 1, a radial velocity of zero indicates that the measurement point is stationary; measurement points 0 to 3 are stationary targets. A negative radial velocity indicates that the measurement point is approaching the radar; measurement points 4 to 13 are moving targets approaching the radar. A positive radial velocity indicates that the measurement point is moving away from the radar, as with the moving target at measurement point 14.
After multiple frames of radar data similar to Table 1 have been acquired within the set time length, they can be constructed into a first image as follows.
First, the measurement map corresponding to each frame of radar data is acquired and rasterized, with one grid cell corresponding to one sub-region.
Fig. 2 is a radar measurement map corresponding to one frame of radar data according to an embodiment of the present invention; that is, one frame of radar data similar to Table 1 is plotted on a graph to obtain the corresponding radar measurement map.
In Fig. 2, the 0 at the center of the abscissa axis marks the position of the radar, and the units of both axes are meters (m). The black solid points represent stationary targets, the open circles represent moving targets approaching the radar, and the gray solid circles represent moving targets moving away from the radar.
Fig. 3 is the rasterized radar measurement map according to an embodiment of the present invention. The area within a dashed box in Fig. 3 is one sub-region (shown by the gray grid-line coverage).
Then, for each sub-region, each second-parameter value of all measurement points located in that sub-region across the multi-frame radar data is summed to obtain the sum of each second parameter for the sub-region; the second parameters comprise the number of measurement points, the radar reflection area, and the radial velocity.
If the second parameter is the radar reflection area, the logarithm of the radar reflection area of each measurement point is taken first to obtain a new radar reflection area; the new radar reflection areas are the quantities summed for this second parameter.
For example, assume the multi-frame radar data consist of 2 frames and that, in the same sub-region, the first frame contains one measurement point with a radar reflection area of 1.330 and a radial velocity of 0, while the second frame contains two measurement points with radar reflection areas of 1.410 and 1.803 and radial velocities of -4.547 and -7.276. Summing each second parameter over all measurement points in this sub-region gives a point-count sum of 3 and a radial-velocity sum of -11.823.
The sum for the radar reflection area is obtained by first taking the logarithm of each measurement point's radar reflection area and then summing the results (the new radar reflection areas).
Taking e as the base of the logarithm, the radar-reflection-area sum for this sub-region is ln 1.330 + ln 1.410 + ln 1.803 = 1.218.
In practical use, the base of the logarithm may be chosen freely as needed.
Fig. 4 is a schematic diagram of the distribution density of radar reflection areas without the logarithm operation, and Fig. 5 is the corresponding diagram after the logarithm operation, both according to embodiments of the present invention.
As Figs. 4 and 5 show, the distribution of radar reflection areas is unbalanced without the logarithm operation and much more balanced with it, so that the gray values derived from the radar reflection areas are evenly distributed in the first image.
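A minimal sketch of this rasterize-and-sum step, reusing the hypothetical Measurement structure introduced above. The grid geometry and the polar-to-Cartesian convention (azimuth measured from the radar's boresight) are editorial assumptions; only the per-cell summing, with the logarithm applied to each radar reflection area before summing, follows the patent text:
    import math
    from collections import defaultdict
    from typing import Dict, List, Tuple

    def accumulate_cells(frames: List[Frame], cell_m: float,
                         x_min: float, y_min: float) -> Dict[Tuple[int, int], List[float]]:
        """For each grid cell, sum the three second parameters over all
        measurement points of all frames falling into that cell:
        [number of points, sum of ln(RCS), sum of radial velocity]."""
        sums: Dict[Tuple[int, int], List[float]] = defaultdict(lambda: [0.0, 0.0, 0.0])
        for frame in frames:
            for m in frame:
                # assumed convention: azimuth measured from the y (boresight) axis
                x = m.distance_m * math.sin(math.radians(m.angle_deg))
                y = m.distance_m * math.cos(math.radians(m.angle_deg))
                cell = (int((x - x_min) // cell_m), int((y - y_min) // cell_m))
                s = sums[cell]
                s[0] += 1.0                   # number of measurement points
                s[1] += math.log(m.rcs)       # logarithm first, then sum
                s[2] += m.radial_speed_mps    # radial velocity
        return sums
With the two-frame example above (radar reflection areas 1.330, 1.410, 1.803 and radial velocities 0, -4.547, -7.276 in one cell), this returns exactly the sums 3, 1.218, and -11.823 for that cell.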
After the sum of each second parameter in each sub-region has been computed, each sum can be normalized and then rounded to obtain the quantized value of that second parameter, the range of the quantized value being the value range of the first parameter corresponding to that second parameter; the quantized second-parameter values of each grid cell are then taken as the first-parameter values of the corresponding pixel, yielding the first image.
The preset color mode of the first image may be the RGB color mode, the HSB color mode, or the Lab color mode.
If the preset color mode is the RGB color mode, since R, G, and B each range over 0 to 255, each second-parameter sum of a sub-region may be normalized directly into the range 0 to 255 and then rounded, or normalized into the range 0 to 1, multiplied by 255, and then rounded; this is not specifically limited.
If the preset color mode is the HSB color mode, H ranges over 0 to 360 and S and B over 0 to 100%. The sum of one of the second parameters may therefore be normalized into 0 to 360 and rounded (or normalized into 0 to 1, multiplied by 360, and rounded), and the sums of the remaining two second parameters normalized into 0 to 100 and rounded (or normalized into 0 to 1, multiplied by 100, and rounded).
If the preset color mode is the Lab color mode, L ranges over 0 to 100 and A and B over -128 to 127. The sum of one of the second parameters may therefore be normalized into 0 to 100 and rounded (or normalized into 0 to 1, multiplied by 100, and rounded), and the sums of the remaining two second parameters normalized into -128 to 127 and rounded.
Since numerical normalization is prior art, it is not described further here.
If the first image uses the CMYK color mode, one parameter of CMYK may be set to a constant value and the remaining parameters made to correspond to the second parameters respectively.
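As an illustrative sketch of the quantization step, assuming the RGB case and per-channel min-max normalization (one of the options described above; the patent does not fix the normalization), the per-cell sums can be turned into an 8-bit image as follows:
    import numpy as np

    def sums_to_rgb(sums, n_cols: int, n_rows: int) -> np.ndarray:
        """Map the three per-cell sums onto the R, G, B channels of one
        pixel per grid cell: normalize each channel to 0..1 over the whole
        grid, multiply by 255, and round."""
        raw = np.zeros((n_rows, n_cols, 3), dtype=np.float64)
        for (cx, cy), s in sums.items():
            if 0 <= cx < n_cols and 0 <= cy < n_rows:
                raw[cy, cx, :] = s
        img = np.zeros((n_rows, n_cols, 3), dtype=np.uint8)
        for ch in range(3):
            lo, hi = raw[..., ch].min(), raw[..., ch].max()
            if hi > lo:  # guard against a constant (e.g. empty) channel
                img[..., ch] = np.rint((raw[..., ch] - lo) / (hi - lo) * 255).astype(np.uint8)
        return img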
After the first-parameter values of the pixel corresponding to each sub-region have been computed in the above manner, the first image is obtained, and step 102 can be performed.
Step 102: identify the first image with the trained deep learning network model to determine whether the radar is occluded; the trained deep learning network model is obtained by training on images of known occlusion types.
The known occlusion types include the metal occlusion type, the plastic occlusion type, and the non-occlusion type.
The trained deep learning network model is obtained as follows.
A plurality of second images are acquired. Each second image uses the preset color mode, one pixel of a second image corresponds to one sub-region, and the values of the several first parameters making up a pixel's color value are determined, respectively, by the values of the several second parameters of all measurement points in the corresponding sub-region across multi-frame sample radar data; the sample radar data are measured both with the radar occluded and with the radar unoccluded.
The sample radar data may be obtained as follows:
the radar transmits radar waves in different transmission frequency bands, in different usage scenarios, toward obstructions of different known occlusion types, and the resulting echo data are taken as the sample radar data.
The transmission frequency bands used by the radar may include 24 GHz, 60 GHz, and 77 GHz.
The usage scenarios include: an open scene (i.e., a scene without obstructions, such as an open lawn), a shrub scene, and a traffic-intersection scene (which usually contains many vehicles).
For example, in each of these scenes (open, shrub, and traffic intersection), the radar transmits radar waves in the 24 GHz, 60 GHz, and 77 GHz transmission frequency bands toward the metal occlusion type, the plastic occlusion type, and the non-occlusion type in turn, and the echo data are collected as sample radar data.
Since multi-frame sample radar data are converted into the corresponding second images in the same way as the first image, the details are not repeated here.
After the plurality of second images are obtained, they are randomly divided into a training set, a validation set, and a test set; for example, the numbers of second images in the training, validation, and test sets may be assigned in the ratio 3:1:1. Since it is known at acquisition time whether each frame of sample radar data was measured with the radar occluded, each second image can be labelled with whether the radar is occluded.
The deep learning network model is then trained on the training set, the trained model is validated on the validation set, and the validated model is tested on the test set; training stops once the recognition accuracy of the tested model reaches a set threshold, yielding the trained deep learning network model.
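A sketch of the 3:1:1 split and the stopping criterion. The train_one_epoch/test_accuracy callables and the 0.95 threshold are illustrative placeholders, not APIs or values from the patent:
    import random
    from typing import Callable, Sequence, Tuple

    def split_3_1_1(images: Sequence, seed: int = 0) -> Tuple[list, list, list]:
        """Randomly divide the labelled second images into training,
        validation and test sets in the ratio 3:1:1."""
        items = list(images)
        random.Random(seed).shuffle(items)
        n = len(items)
        n_train, n_val = 3 * n // 5, n // 5
        return (items[:n_train],
                items[n_train:n_train + n_val],
                items[n_train + n_val:])

    def train_until_accurate(train_one_epoch: Callable[[], None],
                             test_accuracy: Callable[[], float],
                             threshold: float = 0.95,
                             max_epochs: int = 1000) -> None:
        """Keep training until the recognition accuracy measured on the
        test set reaches the set threshold."""
        for _ in range(max_epochs):
            train_one_epoch()
            if test_accuracy() >= threshold:
                break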
The deep learning network model comprises 3 input channels, 6 convolutional layers, 2 fully connected layers, and 2 output channels. The 3 input channels respectively receive the values of the three first parameters of each pixel, the convolutional layers extract features from the input, the fully connected layers classify the features to determine whether the radar is occluded, and the output channels output the classification result.
It should be noted that the numbers of convolutional layers and fully connected layers in the deep learning network model may be chosen differently according to actual needs; the model is not to be understood as limited to 6 convolutional layers and 2 fully connected layers.
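A minimal PyTorch sketch of such a network. The patent fixes only the topology just described (3 input channels, 6 convolutional layers, 2 fully connected layers, 2 output channels); the channel widths, kernel size, pooling placement, and 64x64 input resolution below are editorial assumptions:
    import torch
    import torch.nn as nn

    class OcclusionNet(nn.Module):
        """3 input channels -> 6 convolutional layers -> 2 fully connected
        layers -> 2 output channels (occluded / not occluded)."""
        def __init__(self, in_size: int = 64):
            super().__init__()
            widths = [3, 16, 32, 32, 64, 64, 64]  # assumed channel widths
            layers = []
            for i in range(6):
                layers += [nn.Conv2d(widths[i], widths[i + 1], 3, padding=1),
                           nn.ReLU(inplace=True)]
                if i % 2 == 1:
                    layers.append(nn.MaxPool2d(2))  # halve the resolution three times
            self.features = nn.Sequential(*layers)
            flat = widths[-1] * (in_size // 8) ** 2
            self.classifier = nn.Sequential(
                nn.Flatten(),
                nn.Linear(flat, 128), nn.ReLU(inplace=True),  # fully connected layer 1
                nn.Linear(128, 2))                            # fully connected layer 2 -> 2 outputs

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.classifier(self.features(x))

    # Example: a batch containing one 3-channel first image
    # logits = OcclusionNet()(torch.randn(1, 3, 64, 64))  # shape (1, 2)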
After the trained deep learning network model has been obtained, it can be used to identify the first image and determine whether the radar is occluded; step 103 can then be performed.
Step 103: if the identification result is that the radar is occluded, send alarm information prompting a user to clear the obstruction blocking the radar.
The alarm information may be, for example, a voice message, a text message, or an indicator light.
Because an occluded radar cannot work normally, staff who receive the alarm information can discover the obstruction in time and remove it, letting the radar resume normal operation.
Because the radar can thus discover by itself whether it is occluded, no manual inspection is needed, which further reduces the maintenance workload for the radar.
Based on the same inventive concept, an embodiment of the present invention provides a radar occlusion identification apparatus. For the specific way the apparatus carries out the radar occlusion identification method, refer to the description of the method embodiment; repeated content is omitted. Referring to fig. 6, the apparatus comprises:
a construction unit 601, configured to construct a first image from multiple frames of radar data within a set time length, where the first image uses a preset color mode, each pixel of the first image corresponds to one sub-region of the radar coverage area, all sub-regions are the same size, and the values of the several first parameters that make up a pixel's color value are determined, respectively, by the values of several second parameters of all measurement points in the corresponding sub-region across the multi-frame radar data;
an identification unit 602, configured to identify the first image with the trained deep learning network model to determine whether the radar is occluded, the trained deep learning network model being obtained by training on images of known occlusion types;
and a processing unit 603, configured to send alarm information if the identification result is that the radar is occluded, the alarm information prompting a user to clear the obstruction blocking the radar.
In one possible implementation, the construction unit 601 is specifically configured to:
acquire a measurement map corresponding to each frame of radar data and rasterize the measurement map, where one grid cell corresponds to one sub-region;
sum, over the multi-frame radar data, each second-parameter value of all measurement points located in the same sub-region to obtain a sum of each second parameter for that sub-region, the second parameters comprising the number of measurement points, the radar reflection area, and the radial velocity;
normalize each second-parameter sum of the sub-region and then round it to obtain a quantized value of that second parameter, the range of the quantized value being the value range of the first parameter corresponding to that second parameter;
and take the quantized second-parameter values of each grid cell as the first-parameter values of the corresponding pixel to obtain the first image.
In one possible implementation, the construction unit 601 is further configured to:
if a second parameter is the radar reflection area, take the logarithm of the radar reflection area of each measurement point to obtain a new radar reflection area, the new radar reflection areas being the quantities summed for that second parameter.
In one possible implementation, the apparatus further comprises a training unit 604 configured to:
acquire a plurality of second images, where each second image uses the preset color mode, one pixel of a second image corresponds to one sub-region, and the values of the several first parameters making up a pixel's color value are determined, respectively, by the values of the several second parameters of all measurement points in the corresponding sub-region across multi-frame sample radar data, the sample radar data being measured both with the radar occluded and with the radar unoccluded;
randomly divide the plurality of second images into a training set, a validation set, and a test set;
and train the deep learning network model on the training set, validate the trained model on the validation set, and test the validated model on the test set, stopping training once the recognition accuracy of the tested model reaches a set threshold, thereby obtaining the trained deep learning network model.
In one possible implementation, the deep learning network model comprises:
3 input channels, 6 convolutional layers, 2 fully connected layers, and 2 output channels;
the 3 input channels respectively receiving the values of the three first parameters of each pixel, the convolutional layers extracting features from the input, the fully connected layers classifying the features to determine whether the radar is occluded, and the output channels outputting the classification result.
In one possible implementation, the numbers of second images in the training set, validation set, and test set are in the ratio 3:1:1.
In one possible implementation, the construction unit 601 is further configured to:
transmit, via the radar, radar waves in different transmission frequency bands, in different usage scenarios, toward obstructions of different known occlusion types to obtain echo data;
and take the echo data as the sample radar data.
In one possible implementation, the known occlusion types comprise:
the metal occlusion type, the plastic occlusion type, and the non-occlusion type.
In one possible implementation, the preset color mode comprises:
the RGB color mode, the HSB color mode, or the Lab color mode.
Based on the same inventive concept, an embodiment of the present invention provides a radar occlusion identification device, comprising:
at least one processor, and
a memory connected to the at least one processor;
wherein the memory stores instructions executable by the at least one processor, and the at least one processor performs the radar occlusion identification method described above by executing the instructions stored in the memory.
Based on the same inventive concept, an embodiment of the present invention further provides a readable storage medium, comprising:
a memory for storing instructions which, when executed by a processor, cause an apparatus comprising the readable storage medium to perform the radar occlusion identification method described above.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (12)

1. A radar occlusion identification method, comprising:
constructing a first image from multiple frames of radar data within a set time length, wherein the first image uses a preset color mode, one pixel of the first image corresponds to one sub-region of the radar coverage area, all sub-regions are the same size, and the values of a plurality of first parameters making up the color value of a pixel are determined, respectively, by the values of a plurality of second parameters of all measurement points, across the multi-frame radar data, that lie in the sub-region corresponding to that pixel;
identifying the first image with a trained deep learning network model to determine whether the radar is occluded, the trained deep learning network model being obtained by training on images of known occlusion types;
and, if the identification result is that the radar is occluded, sending alarm information, the alarm information prompting a user to clear the obstruction blocking the radar.
2. The method of claim 1, wherein constructing the first image from the multiple frames of radar data within the set time length comprises:
acquiring a measurement map corresponding to each frame of radar data and rasterizing the measurement map, wherein one grid cell corresponds to one sub-region;
summing, over the multi-frame radar data, each second-parameter value of all measurement points located in the same sub-region to obtain a sum of each second parameter for that sub-region, the second parameters comprising the number of measurement points, the radar reflection area, and the radial velocity;
normalizing each second-parameter sum of the sub-region and then rounding it to obtain a quantized value of that second parameter, the range of the quantized value being the value range of the first parameter corresponding to that second parameter;
and taking the quantized second-parameter values of each grid cell as the first-parameter values of the corresponding pixel to obtain the first image.
3. The method of claim 2, wherein before summing each second-parameter value of all measurement points located in the same sub-region over the multi-frame radar data, the method further comprises:
if a second parameter is the radar reflection area, taking the logarithm of the radar reflection area of each measurement point to obtain a new radar reflection area, the new radar reflection areas being the quantities summed for that second parameter.
4. The method of claim 1, wherein obtaining the trained deep learning network model comprises:
acquiring a plurality of second images, wherein each second image uses the preset color mode, one pixel of a second image corresponds to one sub-region, and the values of the plurality of first parameters making up the color value of a pixel are determined, respectively, by the values of the plurality of second parameters of all measurement points, across multi-frame sample radar data, that lie in the sub-region corresponding to that pixel, the sample radar data being measured both with the radar occluded and with the radar unoccluded;
randomly dividing the plurality of second images into a training set, a validation set, and a test set;
and training the deep learning network model on the training set, validating the trained model on the validation set, and testing the validated model on the test set, stopping training once the recognition accuracy of the tested model reaches a set threshold, thereby obtaining the trained deep learning network model.
5. The method of claim 4, wherein the deep learning network model comprises:
3 input channels, 6 convolutional layers, 2 fully connected layers, and 2 output channels;
the 3 input channels respectively receiving the values of the three first parameters of each pixel, the convolutional layers extracting features from the input, the fully connected layers classifying the features to determine whether the radar is occluded, and the output channels outputting the classification result.
6. The method of claim 4, wherein the numbers of second images in the training set, validation set, and test set are in the ratio 3:1:1.
7. The method of claim 4, wherein obtaining the sample radar data comprises:
transmitting, by the radar, radar waves in different transmission frequency bands, in different usage scenarios, toward obstructions of different known occlusion types to obtain echo data;
and taking the echo data as the sample radar data.
8. The method of claim 1, wherein the known occlusion types comprise:
the metal occlusion type, the plastic occlusion type, and the non-occlusion type.
9. The method of any of claims 1-8, wherein the preset color mode comprises:
the RGB color mode, the HSB color mode, or the Lab color mode.
10. A radar occlusion identification apparatus, comprising:
a construction unit, configured to construct a first image from multiple frames of radar data within a set time length, wherein the first image uses a preset color mode, one pixel of the first image corresponds to one sub-region of the radar coverage area, all sub-regions are the same size, and the values of a plurality of first parameters making up the color value of a pixel are determined, respectively, by the values of a plurality of second parameters of all measurement points, across the multi-frame radar data, that lie in the sub-region corresponding to that pixel;
an identification unit, configured to identify the first image with a trained deep learning network model to determine whether the radar is occluded, the trained deep learning network model being obtained by training on images of known occlusion types;
and a processing unit, configured to send alarm information if the identification result is that the radar is occluded, the alarm information prompting a user to clear the obstruction blocking the radar.
11. A radar occlusion identification device, comprising:
at least one processor, and
a memory connected to the at least one processor;
wherein the memory stores instructions executable by the at least one processor, and the at least one processor performs the method of any of claims 1-9 by executing the instructions stored in the memory.
12. A readable storage medium, comprising a memory,
the memory being for storing instructions which, when executed by a processor, cause an apparatus comprising the readable storage medium to perform the method of any of claims 1-9.
Priority application: CN202010803000.0A, filed 2020-08-11 (priority date 2020-08-11) — "Method, device and storage medium for shielding identification of radar".
Publications: CN112052878A, published 2020-12-08; CN112052878B (granted), published 2024-04-16. Status: Active.
Family ID: 73601420. Country: CN (China).



Legal Events

PB01 — Publication
SE01 — Entry into force of request for substantive examination
GR01 — Patent grant