CN113625235B - Radar view limited scene recognition method, storage medium and vehicle-mounted device - Google Patents


Info

Publication number
CN113625235B
CN113625235B (application CN202110732269.9A)
Authority
CN
China
Prior art keywords
radar
limited scene
view
amplitude
spectrum information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110732269.9A
Other languages
Chinese (zh)
Other versions
CN113625235A (en)
Inventor
陈丽
罗贤平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huizhou Desay SV Intelligent Transport Technology Research Institute Co Ltd
Original Assignee
Huizhou Desay SV Intelligent Transport Technology Research Institute Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huizhou Desay SV Intelligent Transport Technology Research Institute Co Ltd filed Critical Huizhou Desay SV Intelligent Transport Technology Research Institute Co Ltd
Priority to CN202110732269.9A priority Critical patent/CN113625235B/en
Publication of CN113625235A publication Critical patent/CN113625235A/en
Application granted granted Critical
Publication of CN113625235B publication Critical patent/CN113625235B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/40Means for monitoring or calibrating
    • G01S7/4052Means for monitoring or calibrating by simulation of echoes

Abstract

The invention relates to a radar field-of-view limited scene recognition method, which comprises recognizing a full field-of-view limited scene through the following steps: obtaining target detection spectrum information based on radar echo; calculating an amplitude dip point H of the detection spectrum information of absolutely stationary targets according to the speed information of the mobile device and the parameter indexes of the radar; calculating the amplitude distribution difference within preset distance segments before and after the amplitude dip point H; and when the amplitude distribution difference is greater than or equal to a preset threshold, judging that the current radar environment is a full field-of-view limited scene. The invention also provides a storage medium and a vehicle-mounted device. The radar field-of-view limited scene recognition method is suitable for radars mounted on mobile equipment, fully considers the possible occurrence of full field-of-view limited scenes, effectively enables the radar to self-detect field-of-view limited conditions, improves the accuracy of the radar system's self-diagnosis function, and meets the real-time requirements of radar diagnostic applications.

Description

Radar view limited scene recognition method, storage medium and vehicle-mounted device
Technical Field
The invention relates to a radar field-of-view limited scene recognition method, a storage medium and vehicle-mounted equipment.
Background
Millimeter wave radars are widely used across many industries, and most of their application scenarios are outdoors, where the radar is easily covered or blocked by mud and similar deposits, limiting the radar Field of View (FOV) and affecting normal operation. Occlusion self-detection is therefore one of the important functions of radar self-diagnosis. In the automotive field, for example, on the electromagnetic wave transmission path of a vehicle-mounted radar, the second surface directly in front of the radar (such as a radome, bumper, or vehicle logo) may be directly covered with foreign matter such as mud, ice, or snow. Such direct foreign-matter coverage blocks the radar's electromagnetic waves from propagating normally into the environment, so the radar cannot perceive the environment around the vehicle body; its performance is severely degraded or its function fails entirely. The radar therefore needs an occlusion self-detection function.
In general, the radar's occlusion self-detection function judges whether the radar is covered by foreign matter from characteristic features of the radar echo in the time or frequency domain. During actual driving, however, "difficult" scenes that degrade occlusion-detection accuracy are easily encountered, such as a radar full field-of-view limited scene (i.e., a full-FOV limited scene). A full-FOV limited scene is one in which objects enclose the radar at close range over all angles of its full FOV area. Fig. 1 and Fig. 2 illustrate two typical full field-of-view limited scenes. The enclosing objects may be a combination of strongly attenuating and strongly reflecting objects, or a single strongly reflecting object. The direct effect of a full-FOV limited scene on radar detection is that electromagnetic wave transmission is blocked, so long-distance targets cannot be detected. Taking the automotive field as an example, typical full-FOV limited scenes include a parking space surrounded by walls and by vehicles close around the radar, or congestion at traffic lights. When the radar is in a full-FOV limited scene, even though the second surface is not directly covered by foreign matter, the blocked electromagnetic wave transmission makes the signal characteristics beyond a certain distance almost identical to those observed when the second surface is directly covered, producing a "pseudo" occlusion signature.
The Chinese patent application with publication number CN112485770A, published on March 12, 2021, discloses a method for determining a full-FOV limited scene of a millimeter wave radar. However, that method is computationally somewhat complex, leaving room for improvement.
Disclosure of Invention
The invention aims to provide an effective and easy-to-implement radar field-of-view limited scene recognition method.
A radar view-limited scene recognition method applied to a radar on a mobile device, the radar view-limited scene recognition method comprising a step S10 of recognizing a full-view limited scene, the step S10 comprising:
s101, obtaining target detection spectrum information based on radar echo;
s102, extracting detection spectrum information of an absolute stationary target in the target detection spectrum information according to the moving speed of the mobile equipment and the parameter index of the radar;
s103, calculating an amplitude dip point H of the detection spectrum information of the absolute stationary target;
s104, calculating amplitude distribution differences in a preset distance section before and after the amplitude dip point H; and
S105, when the amplitude distribution difference is greater than or equal to a preset threshold, judging that the current radar environment is a full field-of-view limited scene; otherwise, judging that it is an ordinary non-field-of-view-limited scene.
After the radar starts, it first judges whether it is in an occluded state; if so, step S10 is executed. When the result of step S10 is that the current radar environment is an ordinary non-field-of-view-limited scene, an occlusion alarm signal is output; when the result is a full field-of-view limited scene, detection is complete.
As another implementation, after the radar starts, step S10 is executed first; when the current radar environment is an ordinary non-field-of-view-limited scene, the radar occlusion detection function is executed, otherwise it is cancelled.
As one implementation, a two-dimensional Fourier transform is applied to the echo of each radar channel, and the per-channel Fourier transform results are accumulated coherently or incoherently to obtain the target detection spectrum information; the spectrum position index corresponding to absolutely stationary environmental targets is then calculated from the movement speed and the radar's speed detection resolution, and the detection spectrum information of the absolutely stationary targets is extracted from the target detection spectrum information according to that index.
As one implementation, the maximum of the amplitude differences between all adjacent peaks and troughs over the full distance range of the stationary-target detection spectrum is found, and the midpoint of the distance values corresponding to the peak and trough at that maximum is taken as the amplitude dip point H.
In one embodiment, after the amplitude dip point H is obtained, the mean amplitudes within preset distance segments before and after H on the stationary-target detection spectrum are computed and denoted AmpMean_b and AmpMean_a respectively, and their difference AmpMeanDiff is taken as the amplitude distribution difference.
As another embodiment, after the amplitude dip point H is obtained, the probability density functions of the amplitude values within preset distance segments before and after H on the stationary-target detection spectrum are computed, and the difference between the two probability density functions is taken as the amplitude distribution difference.
As another embodiment, after the amplitude dip point H is obtained, the sums of the amplitude values within preset distance segments before and after H on the stationary-target detection spectrum are computed and denoted AmpSum_b and AmpSum_a respectively, and their difference AmpSumDiff is taken as the amplitude distribution difference.
The invention also provides a storage medium comprising instructions for implementing the radar field-of-view limited scene recognition method.
The invention also provides a vehicle-mounted device comprising a processor and the above storage medium; the vehicle-mounted device invokes the instructions of the storage medium through the processor to implement the radar field-of-view limited scene recognition method.
The radar field-of-view limited scene recognition method of the invention is well suited to radars mounted on mobile equipment, in particular millimeter wave radars. It fully considers the possible occurrence of full field-of-view limited scenes, effectively enables the radar to self-detect field-of-view limited conditions, offers strong robustness with low computational requirements, improves the accuracy of the radar system's self-diagnosis function, and meets the real-time requirements of radar diagnostic applications.
Drawings
Fig. 1 is a typical radar full-field limited scene.
Fig. 2 is another exemplary radar full-field limited scene.
FIG. 3 is a flow chart of a method for radar field limited scene recognition in an embodiment.
Fig. 4 is a comparison of the detected spectral information distribution of an absolute stationary object of a full view limited scene and a normal view limited scene.
Fig. 5 is a schematic diagram of a radar full-field limited scene detection result.
FIG. 6 is a radar occlusion detection flow based on the radar field limited scene recognition method in an extended embodiment.
Fig. 7 is a radar shielding detection flow based on the radar view limited scene recognition method in another extended embodiment.
Detailed Description
The radar field limited scene recognition method, the storage medium and the vehicle-mounted device of the invention are described in further detail below with reference to specific embodiments and drawings.
The main purpose of the radar field-of-view limited scene recognition method is to distinguish why the radar's field of view is limited: a full field-of-view (FOV) limited scene, or ordinary occlusion caused by foreign matter such as mud, ice, or snow directly covering the second surface. This allows alarm signals to be issued correctly and improves the accuracy of the radar system's self-diagnosis function.
The full field-of-view limited scene recognition method can be applied to radars on mobile equipment, such as vehicle-mounted radars, in particular millimeter wave radars. For example, it can be applied to occlusion detection for vehicle rear-corner radars and forward-looking radars, and also to trunk-door opening warnings: when the radar detects a full-FOV limited scene, that is, large obstacles surround the radar (vehicle body) at close range, the driver can be warned that opening the trunk poses a collision risk. The general idea of the method can also be extended to other radars such as ultrasonic and laser radars.
Recognition of the full field-of-view limited scene is based on the characteristics of the electromagnetic wave transmission path in such a scene. In a full-FOV limited scene, after the electromagnetic waves across the radar's full FOV propagate a short distance, they are attenuated by surrounding walls and objects or reflected by vehicles, generating multipath signals. The radar's electromagnetic waves therefore cannot propagate to longer distances, meaning long-distance targets cannot be detected, and at long range no clutter enters the receiver. The result is strong echo and clutter at short range and essentially only system noise (no target echo or clutter) at long range, which appears in the frequency domain as an amplitude dip at a certain distance point. For non-full-FOV-limited scenes (ordinary field-of-view limited scenes), no such obvious amplitude dip occurs in the frequency domain: whether or not the radar's second surface is covered by foreign matter, targets, clutter, or system noise are distributed over the full distance range. In other words, the electromagnetic signal characteristics in a full-FOV limited scene differ from those in a conventional (not full-FOV limited) scene.
According to the electromagnetic wave performance difference between the full FOV-limited scene and the normal radar view-limited scene, the present embodiment provides a method for identifying the full FOV-limited scene (defined as step S10, please refer to fig. 3) by taking millimeter wave radar as an example, which includes the following steps S101 to S105.
S101, obtaining target detection spectrum (Detection Spectrum) information based on radar echo. Specifically, a two-dimensional Fourier transform is applied to the echo of each radar channel, and the Fourier transform results of all channels are then accumulated incoherently to obtain the target detection spectrum information (this spectrum simultaneously represents the speed and distance of targets). The target detection spectrum information contains the spectra of both absolutely stationary and non-stationary targets in the environment, relative to a geodetic coordinate system. The two-dimensional Fourier transform improves the target Signal-to-Noise Ratio (SNR), which improves algorithm performance.
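The processing in S101 can be sketched as follows. This is an illustrative reconstruction, not the patented implementation; the array layout (channels by chirps by samples) and the function name are assumptions.

```python
import numpy as np

def detection_spectrum(channel_echoes):
    """Build the target detection spectrum from raw channel echoes.

    channel_echoes: complex array, shape (n_channels, n_chirps, n_samples),
    i.e. slow time (Doppler) by fast time (range) per receive channel.
    Returns a (n_chirps, n_samples) magnitude map indexed by
    (Doppler bin, range bin).
    """
    rd = np.fft.fft(channel_echoes, axis=2)   # range FFT over fast time
    rd = np.fft.fft(rd, axis=1)               # Doppler FFT over slow time
    # Incoherent accumulation: sum magnitudes across channels, which
    # raises target SNR without requiring channel phase alignment.
    return np.abs(rd).sum(axis=0)
```

Coherent accumulation (summing the complex spectra before taking the magnitude) would be a one-line change, matching the alternative named later in the text.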
S102, according to the movement speed information of the mobile device carrying the radar (in this embodiment, the speed of a motor vehicle) and a parameter index of the radar system (namely the speed detection resolution), calculating the Doppler-dimension index of the spectrum information corresponding to absolutely stationary targets (relative to the geodetic coordinate system, such as buildings, railings, and parked vehicles) in the target detection spectrum, and extracting the detection spectrum information of the absolutely stationary targets from the target detection spectrum information according to that Doppler-dimension index.
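A minimal sketch of the index computation in S102 follows. The sign convention and modular wrap-around are assumptions; a real radar must also account for the angle-dependent radial component of the ego speed across the FOV.

```python
def stationary_doppler_index(ego_speed, vel_resolution, n_doppler_bins):
    """Doppler bin holding absolutely stationary targets.

    For a forward-looking radar a stationary scatterer has relative
    radial speed -ego_speed, which maps to a (possibly negative)
    Doppler bin offset; wrap it into the FFT index range 0..n-1.
    """
    bin_offset = round(-ego_speed / vel_resolution)
    return bin_offset % n_doppler_bins
```

The full-distance-range row of the detection spectrum at this index is then the "detection spectrum information of the absolutely stationary target".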
S103, calculating the amplitude dip point H of the stationary-target detection spectrum information. The amplitude dip point H essentially represents the distance corresponding to the farthest environmental echo and clutter the millimeter wave radar can receive. The calculation method is flexible; one option is to find the maximum of the amplitude differences between all adjacent peaks and troughs over the full distance range of the stationary-target detection spectrum, and then take the midpoint of the distance values corresponding to the peak and trough at that maximum as the amplitude dip point H. Note that in a full-FOV limited scene the dip point is genuinely distinct, whereas in a conventional field-of-view limited scene that is not full-FOV limited, distant targets remain continuously detectable and no amplitude dip occurs, so the point found by the dip-point algorithm is a randomly jumping pseudo dip point. Fig. 4 compares the stationary-target detection spectrum distributions of a full field-of-view limited scene and an ordinary field-of-view limited scene.
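One possible reading of the dip-point search in S103, operating on the stationary-target amplitude profile over range bins, is sketched below; the extrema conventions and tie-breaking are assumptions.

```python
def amplitude_dip_point(profile):
    """Estimate the amplitude dip point H of a stationary-target range
    profile: find the adjacent peak/trough pair with the largest
    amplitude drop and return the midpoint of their range bins."""
    # Collect local extrema (peaks and troughs) over the full range.
    extrema = []
    for i in range(1, len(profile) - 1):
        if profile[i] >= profile[i - 1] and profile[i] > profile[i + 1]:
            extrema.append(i)                  # local peak
        elif profile[i] <= profile[i - 1] and profile[i] < profile[i + 1]:
            extrema.append(i)                  # local trough
    best_drop, h = 0.0, None
    for a, b in zip(extrema, extrema[1:]):
        drop = abs(profile[a] - profile[b])
        if drop > best_drop:
            # Midpoint of the peak/trough bins at the maximum difference
            best_drop, h = drop, (a + b) // 2
    return h, best_drop
```

In an ordinary scene the returned point is only a small, randomly placed pseudo dip, which is exactly what the later threshold test exploits.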
S104, calculating the amplitude distribution difference within preset distance segments before and after the amplitude dip point H, i.e., extracting the statistical difference between the amplitude distributions of targets, clutter, or system noise in the radar's short-range and long-range segments on either side of the dip point. The amplitude distribution difference can be computed by one of the following three methods: (1) compute the mean amplitudes within preset distance segments before and after H on the stationary-target detection spectrum, denoted AmpMean_b and AmpMean_a respectively, and take their difference AmpMeanDiff as the amplitude distribution difference; (2) compute the sums of the amplitude values within preset distance segments before and after H, denoted AmpSum_b and AmpSum_a respectively, and take their difference AmpSumDiff as the amplitude distribution difference; (3) compute the probability density functions of the amplitude values within preset distance segments before and after H, and take the difference between the two probability density functions as the amplitude distribution difference.
S105, when the amplitude distribution difference is greater than or equal to a preset threshold, judging that the current radar environment is a full field-of-view limited scene; otherwise, judging that it is an ordinary non-field-of-view-limited scene. The preset threshold is obtained by offline statistics: within the radar's detectable range, the difference of the mean amplitudes in the designated distance segments before and after the amplitude dip point is measured in ordinary field-of-view limited scenes with no foreign matter covering the second surface on the millimeter wave radar's propagation path, and the same statistic is measured in full-FOV limited scenes at different enclosure distances. This enables the millimeter wave radar to adaptively distinguish genuine occlusion caused by direct foreign-matter coverage from the "pseudo occlusion" caused by a full field-of-view limited scene. Fig. 5 shows a radar full field-of-view limited scene detection result.
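The offline threshold selection described above might be sketched as follows; the margin parameter and the assumption that the two populations separate cleanly are illustrative choices, not from the patent.

```python
def calibrate_threshold(normal_diffs, limited_diffs, margin=0.5):
    """Place the preset threshold between the amplitude-distribution
    differences measured offline in ordinary scenes (normal_diffs) and
    in full-FOV limited scenes at various enclosure distances
    (limited_diffs)."""
    upper_normal = max(normal_diffs)
    lower_limited = min(limited_diffs)
    if upper_normal >= lower_limited:
        raise ValueError("populations overlap; collect more data")
    # Interpolate across the gap; margin=0.5 splits it evenly.
    return upper_normal + margin * (lower_limited - upper_normal)
```

At runtime, comparing AmpMeanDiff against this threshold yields the S105 decision directly.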
In an extended embodiment, please refer to fig. 6, when the method for identifying a full-view limited scene in the above embodiment is applied to a radar occlusion detection function to perform occlusion detection, the method may include the following steps:
after the radar starts, radar occlusion detection is executed. If the radar is judged to be in an occluded state, step S10 is executed; when the result of step S10 is that the current radar environment is an ordinary non-field-of-view-limited occlusion, an alarm signal that the radar is occluded is output, and when the result is a full field-of-view limited scene, the radar is not occluded and detection is complete.
In another extended embodiment, please refer to fig. 7, the method for identifying a full-view limited scene in the above embodiment may include the following steps when applied to a radar shielding detection function:
after the radar is started, the result of executing the step S10 is output to a radar shielding detection flow, when the result of the step S10 is that the current radar is in a common non-visual field limited scene, the shielding detection function is canceled, and otherwise, the shielding detection function is executed.
In summary, the radar field-of-view limited scene recognition method of the invention takes the target detection spectrum (Detection Spectrum) corresponding to the radar echo as input. According to the vehicle speed information and the relevant parameter indexes of the radar system, it calculates the Doppler-dimension index of absolutely stationary targets in the environment (buildings, railings, parked vehicles, and the like) on the target detection spectrum, extracts the full-distance-range data at that index as the "detection spectrum information of the absolutely stationary target", extracts from it the distribution differences of targets and clutter between the short-range and long-range segments, and judges from the degree of difference whether the radar is currently in a full-FOV limited scene.
The radar field-of-view limited scene recognition method of the present invention is suitably applied to a radar loaded on a mobile device. The method fully considers the possible situation of the full-view limited scene, effectively realizes the self-detection of the radar on the situation of the self-view limited scene, has strong robustness and low calculation force requirement, improves the accuracy of the self-diagnosis function of the radar system, and meets the real-time performance of the radar diagnosis application.
In other embodiments, instead of directly differencing the mean amplitudes of the designated distance segments before and after the amplitude dip point to obtain the amplitude distribution difference AmpMeanDiff, other calculations on those segment means may be used; for example, taking the system noise as a reference value, computing the ratio of each segment's mean amplitude to the reference, and then differencing the two ratios.
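That noise-referenced variant could look like the following sketch; the noise-floor argument and all names are assumptions for illustration.

```python
def amp_ratio_diff(profile, h, seg_len, noise_floor):
    """Variant of AmpMeanDiff: express each segment mean as a ratio to
    a system-noise reference before differencing, which normalizes out
    absolute gain differences between radar units."""
    before = profile[max(0, h - seg_len):h]
    after = profile[h + 1:h + 1 + seg_len]
    ratio = lambda seg: (sum(seg) / len(seg)) / noise_floor
    return ratio(before) - ratio(after)
```

Because both segments are scaled by the same reference, the preset threshold calibrated on one radar transfers more readily to others.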
In other embodiments, in step S101, the target detection spectrum information is obtained by performing incoherent accumulation on the fourier transform result of each receiving channel signal, or alternatively, the target detection spectrum information is obtained by performing coherent accumulation on the fourier transform result of each receiving channel signal.
In practical applications, the above-mentioned method is stored in the form of instructions in a storage medium that is installed in a vehicle-mounted device or other electronic device provided with a processor by which the instructions in the storage medium can be invoked to implement the above-mentioned radar field of view limited scene recognition method.
While the invention has been described in conjunction with the specific embodiments above, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art in light of the foregoing description. Accordingly, all such alternatives, modifications, and variations are included within the spirit and scope of the following claims.

Claims (9)

1. A radar view-limited scene recognition method applied to a radar on a mobile device, wherein the radar view-limited scene recognition method comprises a step S10 of recognizing a full-view limited scene, and the step S10 comprises:
s101, obtaining target detection spectrum information based on radar echo;
s102, extracting detection spectrum information of an absolute stationary target in the target detection spectrum information according to the moving speed of the mobile equipment and the parameter index of the radar;
s103, calculating an amplitude dip point H of the detection spectrum information of the absolute stationary target;
s104, calculating amplitude distribution differences in a preset distance section before and after the amplitude dip point H; and
s105, when the amplitude distribution difference is larger than or equal to a preset threshold value, judging that the current radar environment is a full-view limited scene, otherwise, judging that the current radar environment is a common non-view limited scene;
and searching for the maximum of the amplitude differences between all adjacent peaks and troughs over the full distance range of the stationary-target detection spectrum information, and taking the midpoint of the distance values corresponding to the peak and trough at that maximum as the amplitude dip point H.
2. The radar field-of-view limited scene recognition method according to claim 1, wherein after the radar starts, it is judged whether the radar is in an occluded state; if so, the step S10 is executed, and when the result of the step S10 is that the current radar environment is an ordinary non-field-of-view-limited scene, an occlusion alarm signal is output, while when the result is a full field-of-view limited scene, the radar is not occluded and detection is complete.
3. The method according to claim 1, wherein after the radar is started, the step S10 is executed first, and when the current radar is in a normal non-view limited scene, the radar shielding detection function is executed, otherwise, the radar shielding detection function is canceled.
4. The radar field-of-view limited scene recognition method according to any one of claims 1 to 3, wherein the target detection spectrum information is obtained by applying a two-dimensional Fourier transform to the echo of each radar channel and then coherently or incoherently accumulating the per-channel Fourier transform results; a spectrum position index corresponding to absolutely stationary environmental targets is then calculated from the movement speed and the radar's speed detection resolution, and the detection spectrum information of the absolutely stationary targets is extracted from the target detection spectrum information according to that index.
5. The radar field-of-view limited scene recognition method according to any one of claims 1 to 3, wherein after the amplitude dip point H is obtained, the mean amplitudes within preset distance segments before and after H on the stationary-target detection spectrum information are computed, denoted AmpMean_b and AmpMean_a respectively, and their difference AmpMeanDiff is obtained as the amplitude distribution difference.
6. The radar field-of-view limited scene recognition method according to any one of claims 1 to 3, wherein after the amplitude dip point H is obtained, the probability density functions of the amplitude values within preset distance segments before and after H on the stationary-target detection spectrum information are computed, and the difference between the two probability density functions is obtained as the amplitude distribution difference.
7. The radar field-of-view limited scene recognition method according to any one of claims 1 to 3, wherein after the amplitude dip point H is obtained, the sums of the amplitude values within preset distance segments before and after H on the stationary-target detection spectrum information are computed, denoted AmpSum_b and AmpSum_a respectively, and their difference AmpSumDiff is obtained as the amplitude distribution difference.
8. A storage medium comprising instructions for implementing the radar field limited scene identification method according to any one of claims 1 to 3.
9. An in-vehicle apparatus including a processor and the storage medium of claim 8, the in-vehicle apparatus invoking instructions of the storage medium via the processor to implement the radar field of view limited scene recognition method of any one of claims 1 to 3.
CN202110732269.9A 2021-06-30 2021-06-30 Radar view limited scene recognition method, storage medium and vehicle-mounted device Active CN113625235B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110732269.9A CN113625235B (en) 2021-06-30 2021-06-30 Radar view limited scene recognition method, storage medium and vehicle-mounted device


Publications (2)

Publication Number Publication Date
CN113625235A (en) 2021-11-09
CN113625235B (en) 2024-03-29

Family

ID=78378649

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110732269.9A Active CN113625235B (en) 2021-06-30 2021-06-30 Radar view limited scene recognition method, storage medium and vehicle-mounted device

Country Status (1)

Country Link
CN (1) CN113625235B (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6469659B1 (en) * 2001-05-03 2002-10-22 Delphi Technologies, Inc. Apparatus and method for detecting radar obstruction
JP2016148515A (en) * 2015-02-10 2016-08-18 株式会社豊田中央研究所 Radar device
CN107783097A (en) * 2016-08-25 2018-03-09 大连楼兰科技股份有限公司 Target is matched and data processing platform (DPP)
CN109490889A (en) * 2017-09-12 2019-03-19 比亚迪股份有限公司 Trailer-mounted radar and judge the method, apparatus whether trailer-mounted radar is blocked
CN109709527A (en) * 2019-01-14 2019-05-03 上海海洋大学 The Gauss wave crest method of Gauss Decomposition in a kind of Full wave shape laser-measured height echo signal
CN110441753A (en) * 2019-09-19 2019-11-12 森思泰克河北科技有限公司 Radar occlusion detection method and radar
CN111580051A (en) * 2020-04-08 2020-08-25 惠州市德赛西威智能交通技术研究院有限公司 Vehicle-mounted millimeter wave radar shielding detection method based on amplitude change rate analysis
CN111722195A (en) * 2020-06-29 2020-09-29 上海蛮酷科技有限公司 Radar occlusion detection method and computer storage medium
CN112162250A (en) * 2020-08-28 2021-01-01 惠州市德赛西威智能交通技术研究院有限公司 Radar occlusion detection method and system based on full FOV restricted identification
WO2021000313A1 (en) * 2019-07-04 2021-01-07 深圳市大疆创新科技有限公司 Methods of using lateral millimeter wave radar to detect lateral stationary object and measure moving speed
CN112485770A (en) * 2020-12-02 2021-03-12 惠州市德赛西威智能交通技术研究院有限公司 Millimeter wave radar full FOV restricted scene recognition method, storage medium and vehicle-mounted equipment
CN113009449A (en) * 2021-03-10 2021-06-22 森思泰克河北科技有限公司 Radar shielding state identification method and device and terminal equipment

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3915742B2 (en) * 2003-06-20 2007-05-16 株式会社デンソー Vehicle object recognition device
US10162046B2 (en) * 2016-03-17 2018-12-25 Valeo Radar Systems, Inc. System and method for detecting blockage in an automotive radar

Also Published As

Publication number Publication date
CN113625235A (en) 2021-11-09

Similar Documents

Publication Publication Date Title
US7489265B2 (en) Vehicle sensor system and process
CN108089165B (en) Method for detecting blindness in a radar sensor for a motor vehicle
EP1326087A2 (en) Apparatus and method for radar data processing
CN112526521B (en) Multi-target tracking method for automobile millimeter wave anti-collision radar
CN111624560B (en) Method for detecting shielding state of vehicle-mounted millimeter wave radar based on target identification
CN115575922B (en) Moving target detection method and device based on vehicle-mounted FMCW millimeter wave radar
CN108535709B (en) Road clutter suppression
US6943727B2 (en) Length measurement with radar
US20210190907A1 (en) Electronic device, method for controlling electronic device, and electronic device control program
EP4343369A2 (en) Interference signal detection method and apparatus, and integrated circuit, radio device and terminal
US20090135050A1 (en) Automotive radar system
CN112763994B (en) Vehicle-mounted radar shielding detection method, storage medium and vehicle-mounted equipment
CN116660847A (en) Interference signal detection method and device
CN112485770A (en) Millimeter wave radar full FOV restricted scene recognition method, storage medium and vehicle-mounted equipment
CN112986945B (en) Radar target identification method, device, equipment and storage medium
CN116324490A (en) Method for characterizing an object in the surroundings of a motor vehicle
CN111175715B (en) Auxiliary driving system and method capable of inhibiting radar close-range harmonic waves
CN112578371B (en) Signal processing method and device
CN113625235B (en) Radar view limited scene recognition method, storage medium and vehicle-mounted device
CN111175714B (en) Auxiliary driving method capable of suppressing radar close-range harmonic wave and storage medium
KR100875564B1 (en) Near Object Detection System
CN111190154B (en) Auxiliary driving system and method capable of inhibiting radar close-range harmonic waves
CN111175717B (en) Auxiliary driving method capable of inhibiting radar close-range harmonic wave and scene application
CN116209914A (en) Method and computing device for detecting road users in a vehicle environment by detecting interference based on radar sensor measurements
CN111190155B (en) Auxiliary driving system capable of inhibiting radar close-range harmonic wave

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant