CN111505623A - Method and system for detecting obstacle in driving process of unmanned vehicle and vehicle - Google Patents


Info

Publication number
CN111505623A
Authority
CN
China
Prior art keywords
target
sensor
obstacle
type
original
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010333455.0A
Other languages
Chinese (zh)
Other versions
CN111505623B (en)
Inventor
陈白帆
刘飞
宋宝军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Central South University
Original Assignee
Central South University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Central South University
Priority to CN202010333455.0A
Publication of CN111505623A
Application granted
Publication of CN111505623B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867 Combination of radar systems with cameras
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 Road transport of goods or passengers
    • Y02T10/10 Internal combustion engine [ICE] based vehicles
    • Y02T10/40 Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a method, a system, and a vehicle for detecting obstacles while an unmanned vehicle is driving. A radar coordinate system and a camera coordinate system are unified into a vehicle-body coordinate system; original message data of a first sensor and a second sensor are read and parsed to obtain original obstacle targets in the vehicle-body coordinate system; whether the original obstacle targets detected by the first sensor and the second sensor are the same obstacle is judged; and the targets judged to be the same obstacle are fused to obtain the final obstacle-target information. By fusing the data of the first and second sensors, the invention solves the problems that existing methods cannot meet real-time requirements, have low detection efficiency and high cost, and are easily influenced by the environment, and it improves detection reliability and detection efficiency.

Description

Method and system for detecting obstacle in driving process of unmanned vehicle and vehicle
Technical Field
The invention relates to the field of target perception of unmanned vehicles, in particular to a method, a system and a vehicle for detecting obstacles in the driving process of an unmanned vehicle.
Background
With the surge of interest in artificial intelligence, unmanned driving, an important branch of it, is also developing rapidly. An autonomous vehicle comprises perception, decision, control, and other modules. Environment perception involves a variety of sensors and is an important guarantee of the safety of an autonomous vehicle, and the target-detection task is the most critical part of the perception task. Target detection is the task of providing information about the various obstacles, such as vehicles, in an autonomous-driving scene. To better characterize the different degrees of vehicle autonomy, the Society of Automotive Engineers divides automated driving into levels. As the automation level rises, the vehicle's requirements for target detection become ever higher, so studying target detection for automated driving is important. This places correspondingly stringent demands on the perception system of the unmanned vehicle, and improving perception performance is essential for the further development of unmanned-driving technology.
At present, the environment-perception module of an unmanned system relies on a variety of sensors, of which the camera, the millimeter-wave radar, and the lidar are the three most important and most widely used types. Compared with other sensors, vision sensors are cheaper and technically less demanding; they provide rich information, the targets they yield are easy to classify, and short-range measurements are accurate. Their drawback is sensitivity to lighting and weather conditions, and complex computation is required to ensure accuracy. A millimeter-wave radar detects objects by emitting millimeter-wave signals and analyzing the frequency shift of the reflection; it operates stably under different environmental conditions and measures the speed of obstacles at long range with high precision. The millimeter-wave radars launched by Delphi, Continental, and Bosch are stable and reliable and are therefore used in research at many universities and enterprises. Their drawbacks are that the detected objects are difficult to classify, spatial resolution is limited, and false alarms and missed detections sometimes occur. Lidar offers accurate 3D perception, insensitivity to lighting changes, and rich information, and can perceive a target in 3D very accurately using traditional feature-extraction methods or advanced point-cloud deep-learning networks; its drawbacks are sensitivity to weather such as rain and fog and a high price. Vision alone cannot meet the real-time requirement, while radar alone yields a low detection rate.
Therefore, a truly reliable system for recognizing obstacles in front of the vehicle is difficult to achieve with any single sensor alone; multiple sensors must be fused to cope with complex road-traffic environments.
Disclosure of Invention
The technical problem to be solved by the invention is to overcome the defects of the prior art: the invention provides a method, a system, and a vehicle for detecting obstacles while an unmanned vehicle is driving, so that obstacle targets are detected effectively during driving.
In order to solve the above technical problem, the technical scheme adopted by the invention is as follows: a method for detecting obstacles during the driving of an unmanned vehicle comprises the following steps:
1) unifying a radar coordinate system and a camera coordinate system into a vehicle body coordinate system;
2) reading and analyzing original message data of a first sensor and a second sensor to obtain an original obstacle target under the vehicle body coordinate system;
3) judging whether the original obstacle targets detected by the first sensor and the second sensor are the same obstacle or not;
4) fusing the targets judged to be the same obstacle to obtain the final obstacle-target information;
wherein, the first sensor and the second sensor are different kinds of sensors.
The invention fuses the data of two sensors, solving the problems that detection with a single type of sensor in the prior art is not reliable enough and struggles to cope with complex road-traffic environments; it improves real-time performance and detection efficiency and is less susceptible to environmental influence.
In step 2), the original message data of the first sensor and the second sensor are read in an online or offline mode. The real-time online mode can be used for unmanned-vehicle operation and debugging, and the offline mode can be used for demonstrations and experiments.
In order to further improve the resistance of the first and second sensors to white noise and reduce the probability of false targets, the following operation is performed between step 2) and step 3) of the invention: the original obstacle targets are screened to eliminate false targets and white noise.
The life-cycle method has the advantages that a relatively simple algorithm can be used with high reliability; it screens out effective targets, adapts to various measurement conditions, and improves the robustness of sensor target recognition. Specifically, screening the original obstacle targets proceeds as follows:
A) when an original obstacle target is first decided to be an effective target in a certain period, a life threshold is set for that effective target. Too small a life threshold cannot effectively eliminate the influence of transient data interference and loss, while too large a life threshold delays the switching of effective targets, so the life threshold is generally 5-8 frames (one frame being one sampling period); here it is set to 6 frames, and the effective target is tracked and measured;
B) starting from that period, if the target newly detected by the first and second sensors is inconsistent with the effective target N consecutive times, and from the (N+1)-th period onward the newly detected target is again consistent with the effective target, the original effective target is kept unchanged, where N is less than the life threshold; if the number of consecutive inconsistencies between the newly detected target and the effective target exceeds the life threshold, the original effective target is abandoned, the target newly detected by the first and second sensors is taken as the new effective target, and the new effective target is tracked and measured;
C) steps A) and B) are executed cyclically to obtain the screened effective-target output sets of the first sensor and the second sensor respectively, providing the input data for step 3).
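The life-cycle screening in steps A)-C) can be sketched as a small persistence filter. This is an illustrative sketch, not the patent's code: the target representation (any comparable identifier) and the switch condition (reading "exceeds" as strictly greater than the 6-frame threshold) are assumptions.

```python
from dataclasses import dataclass

LIFE_THRESHOLD = 6  # frames; the patent suggests 5-8 and uses 6

@dataclass
class LifeCycleFilter:
    """Keeps the current effective target; a newly detected target only
    replaces it after disagreeing for more than LIFE_THRESHOLD frames."""
    effective: object = None   # current effective target (None until first decision)
    mismatch_count: int = 0    # consecutive frames of disagreement

    def update(self, detected):
        if self.effective is None:
            self.effective = detected          # first decision: adopt the target
            self.mismatch_count = 0
        elif detected == self.effective:
            self.mismatch_count = 0            # consistent again: keep target, reset counter
        else:
            self.mismatch_count += 1
            if self.mismatch_count > LIFE_THRESHOLD:
                self.effective = detected      # persistent new target: switch to it
                self.mismatch_count = 0
        return self.effective
```

A short mismatch burst (fewer frames than the threshold) therefore leaves the effective target unchanged, matching step B).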
In step 3), determining whether the original obstacle targets detected by the first sensor and the second sensor are the same obstacle proceeds as follows:
1) the obstacle target detected by the first sensor is marked as an R-type target, and the obstacle target detected by the second sensor is marked as a V-type target; the position coordinates of the R-type obstacle target are (x_R, y_R) and its speed is V_R; the position coordinates of the V-type obstacle target are (x_V, y_V) and its speed is V_V;
2) the relationship between the weighted value of the distance and speed information of the R-type and V-type targets and a threshold is judged: when the weighted value W' is greater than or equal to the threshold W, the targets are judged not to be the same obstacle; when W' is less than W, they are judged to be the same obstacle. The threshold is W = λ·0.1·L_R + (1−λ)·ρ·V_R and the weighted value is W' = λ·ΔL + (1−λ)·ΔV, where λ is the weight, ρ is the speed-error coefficient, L_R is the measured distance of the R-type target, and

ΔL = √((x_R − x_V)² + (y_R − y_V)²)

ΔV = |V_R − V_V|.
This judging process fully exploits the sensors' high accuracy in detecting position information, considers the position and speed information of the target comprehensively, and accounts for the fact that the error grows as the detected obstacle speed increases; a corresponding speed-error coefficient is therefore set, giving the judging method high robustness.
λ is a weight used to weight the distance difference and the speed difference of the two types of targets; since both types of sensors measure position more accurately than speed, the invention takes λ = 0.7.
ρ is the speed-error coefficient; a corresponding value can be set according to the speed so as to avoid the error growth caused by increasing speed. The invention sets the speed-error coefficient as follows:
when V_R < 40 km/h: ρ = 0.1;
when 40 km/h ≤ V_R < 80 km/h: ρ = 0.09;
when 80 km/h ≤ V_R < 120 km/h: ρ = 0.07;
when V_R ≥ 120 km/h: ρ = 0.05.
If the first sensor detects obstacles more accurately than the second sensor, the threshold is set with the first sensor as the standard; to further improve detection accuracy, the threshold is therefore set as follows:
when V_R < 40 km/h: W = 0.07·L_R + 0.03·V_R;
when 40 km/h ≤ V_R < 80 km/h: W = 0.07·L_R + 0.027·V_R;
when 80 km/h ≤ V_R < 120 km/h: W = 0.07·L_R + 0.021·V_R;
when V_R ≥ 120 km/h: W = 0.07·L_R + 0.015·V_R
(i.e., W = λ·0.1·L_R + (1−λ)·ρ·V_R with λ = 0.7 and the corresponding ρ substituted).
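The same-obstacle decision can be sketched as follows. This is an illustrative sketch under stated assumptions: the threshold term is read as λ·0.1·L_R with L_R the Euclidean distance of the R-type target, and the dict-based target representation is invented for the example.

```python
import math

def speed_error_coeff(v_r):
    """Speed-error coefficient rho, chosen by the R-type target's speed band (km/h)."""
    if v_r < 40:
        return 0.1
    if v_r < 80:
        return 0.09
    if v_r < 120:
        return 0.07
    return 0.05

def same_obstacle(r, v, lam=0.7):
    """True when the radar (R-type) and camera (V-type) targets are judged to be
    the same obstacle, i.e. the weighted value W' is below the threshold W.
    r, v: dicts with keys x, y (metres) and speed (km/h)."""
    rho = speed_error_coeff(r["speed"])
    l_r = math.hypot(r["x"], r["y"])                  # assumed: distance of the R target
    W = lam * 0.1 * l_r + (1 - lam) * rho * r["speed"]
    dL = math.hypot(r["x"] - v["x"], r["y"] - v["y"])  # position difference
    dV = abs(r["speed"] - v["speed"])                  # speed difference
    W_prime = lam * dL + (1 - lam) * dV
    return W_prime < W
```

A nearby pair with similar speeds associates, while a distant pair does not, which is the intended gating behaviour.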
In step 4), the targets judged to be the same obstacle are fused by Kalman filtering. Classical Kalman filtering is adopted; the algorithm has good robustness and real-time performance, and compared with fusion algorithms such as neural networks and D-S evidence theory it is lighter and more widely applicable. Preferably, fusing the targets judged to be the same obstacle with the Kalman filter proceeds as follows:
A) Initialization: after the first valid measurement from the camera is received, the system state x̂ is initialized as

x̂_0 = [x_V, y_V, V_V-X, V_V-Y]^T

where x_V is the lateral coordinate of the V-type target; y_V is the longitudinal coordinate of the V-type target; V_V-X is the lateral velocity of the V-type target; V_V-Y is the longitudinal velocity of the V-type target; α is the Euclidean distance of the V-type target; |V_V| is the absolute value of the V-type target's speed; φ is the orientation angle of the V-type target.
B) The position and speed information of the V-type target initialized at time k−1 is substituted into the state-prediction formula to obtain the predicted value at time k:

x̂_k⁻ = F·x̂_{k−1}

P_k indicates the degree of uncertainty of the system state. The invention sets the initial covariance P_0 to a fixed initial matrix (given in the original formula image), and the covariance is updated by:

P_k⁻ = F·P_{k−1}·F^T + Q

where P_k⁻ is the error covariance of the predicted state x̂_k⁻; P_{k−1} is the covariance of x̂_{k−1}; x̂_k⁻ is the system-state predicted value for Kalman-filter fusion at time k; F is the state-transition matrix, i.e., the model of the target's state transition; Q is the inherent noise interference of the state-transition model;
The specific updating steps are as follows: from the initial covariance P_0, the covariance P_1⁻ of x̂_1⁻ at time 1 is calculated; from the covariance P_1⁻ of x̂_1⁻, the covariance P_1 of x̂_1 at time 1 is calculated; from the covariance P_1 of x̂_1, the covariance P_2⁻ of x̂_2⁻ at time 2 is calculated; and so on, until from the covariance P_{k−1} of x̂_{k−1} at time k−1, the error covariance P_k⁻ of x̂_k⁻ at time k is calculated, and from the error covariance P_k⁻ of x̂_k⁻ at time k, the error covariance P_k of x̂_k at time k is obtained.
From the predicted state x̂_k⁻ and its error covariance P_k⁻, the Kalman gain at time k is calculated:

K_k = P_k⁻·H^T·(H·P_k⁻·H^T + R)^(−1)

From the predicted value x̂_k⁻ at time k, the sensor observation z_k, and the Kalman gain K_k, the prediction estimate of the system state can be updated:

x̂_k = x̂_k⁻ + K_k·(z_k − H·x̂_k⁻)

and the error covariance is updated as P_k = (I − K_k·H)·P_k⁻, where K_k is the Kalman gain; H is the conversion matrix from the state variables to the observation variables; x̂_k is the optimal state corrected by the measurement; R is the measurement-noise covariance; z_k is the observation of the first sensor or the second sensor;
C) Substituting the V-type target's observation z_k at time k into the updated prediction-estimate formula of step B) yields the optimal system state x̂_k. Substituting the optimal system state x̂_k at time k into the system-state prediction of step B) gives the system-state predicted value at time k+1; the R-type target judged to be the same obstacle is then substituted as the observation z_{k+1} at time k+1 and the prediction estimate is updated, yielding the fused effective obstacle information x̂_{k+1}.
D) Steps B) and C) are cycled in time order; the measurement-corrected optimal state x̂ obtained at each step forms the fused obstacle-target output set.
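Steps A)-D) amount to a standard predict/update Kalman cycle applied alternately to camera and radar observations of the same target. The sketch below assumes a constant-velocity model with state [x, y, vx, vy]; the sampling period dt and the Q, R, and P_0 values are placeholders, not the patent's figures.

```python
import numpy as np

dt = 0.05  # assumed sampling period (s)

# Constant-velocity state-transition model for state [x, y, vx, vy]
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
H = np.eye(4)           # assumed: both sensors observe position and velocity directly
Q = np.eye(4) * 1e-3    # assumed process-noise covariance
R = np.eye(4) * 1e-1    # assumed measurement-noise covariance

def kalman_fuse(x, P, z):
    """One predict/update cycle: predict from the previous fused state,
    then correct with the next sensor observation z (camera or radar)."""
    x_pred = F @ x                                            # state prediction
    P_pred = F @ P @ F.T + Q                                  # prediction error covariance
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)    # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)                     # measurement-corrected state
    P_new = (np.eye(4) - K @ H) @ P_pred                      # updated covariance
    return x_new, P_new
```

In use, x is initialized from the first valid camera measurement (step A), then camera and radar observations of the associated target are fed in alternately (steps B-D).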
Correspondingly, the invention also provides an obstacle-detection system for an unmanned vehicle during driving, comprising: a coordinate-conversion module for unifying the first-sensor coordinate system and the second-sensor coordinate system into a vehicle-body coordinate system; and a data-analysis module for reading and parsing the original message data of the first sensor and the second sensor to obtain original obstacle targets in the vehicle-body coordinate system;
the judging module is used for judging whether the original obstacle targets detected by the first sensor and the second sensor are the same obstacle or not;
and the fusion module is used for fusing the targets judged to be the same obstacle to obtain the final obstacle target information.
The system fuses the first-sensor and second-sensor data, solving the problems that detection with a single type of sensor in the prior art is not reliable enough and struggles to cope with complex road-traffic environments; it improves real-time performance and detection efficiency and is less susceptible to environmental influence.
In order to further improve the detection precision, the system of the invention further comprises a screening module, wherein the screening module is used for screening the original obstacle target output by the data analysis module, eliminating false targets and white noise, and transmitting the original obstacle target after eliminating the false targets and the white noise to the judgment module.
The screening module screens the original obstacle target by using an effective life cycle method.
The judging module comprises:
a marking unit, for marking the obstacle target detected by the first sensor as an R-type target and the obstacle target detected by the second sensor as a V-type target; the position coordinates of the R-type obstacle target are (x_R, y_R) and its speed is V_R; the position coordinates of the V-type obstacle target are (x_V, y_V) and its speed is V_V;
a comparison unit, for judging the relationship between the weighted value of the distance and speed information of the R-type and V-type targets and the threshold: when the weighted value W' is greater than or equal to the threshold W, the targets are judged not to be the same obstacle; when W' is less than W, they are judged to be the same obstacle, where the threshold is W = λ·0.1·L_R + (1−λ)·ρ·V_R, the weighted value is W' = λ·ΔL + (1−λ)·ΔV, λ is the weight, ρ is the speed-error coefficient, and

ΔL = √((x_R − x_V)² + (y_R − y_V)²)

ΔV = |V_R − V_V|;

preferably, the speed-error coefficient is set as follows:
when V_R < 40 km/h: ρ = 0.1;
when 40 km/h ≤ V_R < 80 km/h: ρ = 0.09;
when 80 km/h ≤ V_R < 120 km/h: ρ = 0.07;
when V_R ≥ 120 km/h: ρ = 0.05.
Preferably, the threshold is specifically set as follows:
when V_R < 40 km/h: W = 0.07·L_R + 0.03·V_R;
when 40 km/h ≤ V_R < 80 km/h: W = 0.07·L_R + 0.027·V_R;
when 80 km/h ≤ V_R < 120 km/h: W = 0.07·L_R + 0.021·V_R;
when V_R ≥ 120 km/h: W = 0.07·L_R + 0.015·V_R
(i.e., W = λ·0.1·L_R + (1−λ)·ρ·V_R with λ = 0.7 and the corresponding ρ substituted).
As a further inventive concept, the invention also provides an unmanned vehicle comprising a processor for performing the steps of the above method.
According to the final obstacle-target information, the processor can control the unmanned vehicle to avoid the detected obstacles, ensuring the driving safety of the unmanned vehicle.
The processor is in communication with the display module for facilitating visual display of the obstacle target.
The processor of the invention communicates with a data-acquisition module; the data-acquisition module comprises at least one first sensor and at least one second sensor mounted on the front of the vehicle body; the first sensor and the second sensor are heterogeneous sensors.
In order to save cost and for ease of use, the first sensor of the invention is a radar and the second sensor is a camera. Compared with the prior art, the beneficial effects of the invention are:
1) the invention integrates two types of sensor data, solves the problems that the existing method can not meet the real-time requirement, has low detection efficiency, high cost and is easily influenced by the environment, and improves the detection reliability and the detection efficiency;
2) the method can avoid white noise interference when the target of the unmanned vehicle is detected in the driving process, eliminate false targets and further improve the detection reliability and detection precision;
3) two analysis modes, online and offline, can be selected, giving a wide application range, and effective obstacle targets on the road can be detected accurately;
4) the invention can intuitively display the coordinate position information of the effective obstacle target and provide the coordinate position information to a decision module (processor) of the unmanned automobile, and has wider application prospect.
Drawings
FIG. 1 is a flowchart of a method according to an embodiment of the present invention.
FIG. 2 is a diagram of millimeter wave radar and camera coordinate transformation in accordance with an embodiment of the present invention;
FIG. 3 is a flow chart of target Kalman filtering data fusion according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating an exemplary target detection result according to an embodiment of the present invention;
FIG. 5 is a block diagram of the system architecture of the present invention;
FIG. 6 is a block diagram of a determining module according to the present invention.
Detailed Description
This embodiment takes a 12-meter intelligent-driving passenger bus adopting the scheme as an example. As shown in FIG. 1, the method comprises the following steps: S1, the coordinates of the millimeter-wave radar and the camera are unified into the vehicle-body coordinate system. As shown in FIG. 2, the positions of the millimeter-wave radar and the camera relative to the vehicle body are fixed on the intelligent-driving bus. The detection angle of the millimeter-wave radar in the Z direction is generally only ±5°; mounting the radar too high enlarges the blind zone, while mounting it too low directs the radar beam at the ground, whose reflections introduce clutter interference and affect the radar's judgment. Therefore, the mounting height of the millimeter-wave radar (i.e., the distance from the ground to the center of the radar module) is generally recommended to be between 500 mm (fully loaded) and 800 mm (unloaded). Here the millimeter-wave radar is mounted at the center of the front of the vehicle body 0.75 m above the ground, and, to avoid occlusion of the camera's view, the camera is mounted above the millimeter-wave radar 1.2 m above the ground. From the translation and rotation relationship between the camera coordinate system and the millimeter-wave radar coordinate system, the coordinate-conversion formula between them can be obtained:
(the conversion formulas, given in the original formula images, combine a rotation through the inclination angle θ with a translation)

where x_c, y_c, z_c are the coordinates in the camera's own coordinate system;
x_rc, y_rc, z_rc are the coordinates in the millimeter-wave radar's own coordinate system;
θ is the inclination angle between the camera and the millimeter-wave radar;
h is the rotation coefficient of the camera and the millimeter-wave radar.
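As a rough illustration of the radar-to-camera conversion described above (the patent's exact matrices are in its formula images and are not reproduced here), the following sketch applies a rotation through the inclination angle θ plus a vertical translation equal to the mounting-height difference of the embodiment (1.2 m − 0.75 m = 0.45 m). The axis conventions and the rotation axis are assumptions made for the example.

```python
import math

def radar_to_camera(x_rc, y_rc, z_rc, theta_deg=0.0, dz=0.45):
    """Illustrative rigid-body transform from the millimeter-wave radar frame
    to the camera frame: rotate by the inclination angle theta about the
    lateral axis, then translate down by the mounting-height difference dz."""
    t = math.radians(theta_deg)
    x_c = x_rc                                       # lateral axis unchanged
    y_c = y_rc * math.cos(t) - z_rc * math.sin(t)    # rotated forward axis
    z_c = y_rc * math.sin(t) + z_rc * math.cos(t) - dz  # rotated vertical axis, shifted
    return x_c, y_c, z_c
```

With θ = 0 the transform reduces to a pure vertical offset, which matches the embodiment's mounting geometry.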
and S2, obtaining an original obstacle target under the same coordinate system (vehicle body coordinate system) by reading and analyzing original message data of the millimeter wave radar and the camera.
In order to facilitate debugging and interface demonstration, the invention provides two data analysis modes of online and offline:
In the offline mode, original message data of the millimeter-wave radar are read and parsed from a selectable path; the offline files parsed in this embodiment come from data collected during real-time operation of the intelligent bus and are stored as CSV files. Referring to FIG. 4, the user can select the offline files of original millimeter-wave radar and camera message data to parse at the lower right corner of the interface. Because an ARS408-21 millimeter-wave radar manufactured by the German company Continental is adopted and the camera is an IFVS-400 manufactured by a Shanghai intelligent-vehicle company, the original data are read and parsed strictly according to the message-protocol manuals of the corresponding models.
In the real-time online mode, the corresponding original message data can be read and parsed from the CAN-USB interfaces of the running millimeter-wave radar and camera.
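The offline parsing path can be sketched as below. The column names (frame, id, x, y, speed) are invented for illustration; a real parser must follow the corresponding sensor's message-protocol manual (e.g. the ARS408-21 CAN specification).

```python
import csv
import io

def parse_offline_messages(csv_text):
    """Offline-mode sketch: read logged message rows from CSV text and return
    a list of obstacle-target records in a uniform dict format."""
    reader = csv.DictReader(io.StringIO(csv_text))
    targets = []
    for row in reader:
        targets.append({
            "frame": int(row["frame"]),   # sampling period index
            "id": int(row["id"]),         # target identifier
            "x": float(row["x"]),         # lateral position (m)
            "y": float(row["y"]),         # longitudinal position (m)
            "speed": float(row["speed"]), # target speed (km/h)
        })
    return targets
```

The same record format can then feed the screening and association stages regardless of whether the data arrived online or from a file.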
S3, the original obstacle targets are screened using the effective life-cycle method to eliminate false targets and white noise. The specific steps of the effective life-cycle method are as follows:
Step 1: when the millimeter-wave radar or camera processing module parses an obstacle target from the original data for 6 consecutive frames, the target is determined to be an effective target. A life threshold is set for the effective target (detection over 6 consecutive frames), and the effective target is tracked and measured.
Step 2: if, starting from a certain period, the newly detected target is inconsistent with the effective target (i.e., is not the same target) N consecutive times and becomes consistent with it again in the (N+1)-th and subsequent detection periods, where N is less than the life threshold (i.e., the mismatch lasts only 1-5 frames, N < 6), then the N consecutive inconsistencies were caused by interference under conditions such as vehicle bumping or swaying, missed detection, or a false target, and the inconsistency is temporary. The target-inconsistency handling in this case keeps the original effective target unchanged; to ensure continuity of the output of the original effective target's running-state information over the intervening N periods, no switch to a new target is required.
And step three, if the number of times the newly detected target is consecutively inconsistent with the effective target (i.e. is not the same target) from a certain frame exceeds the life threshold (6 consecutive frames of detection), this indicates that the effective target is no longer the most dangerous target and that the new target is more threatening. The inconsistency handling then performs a target switch: the original effective target is discarded, the newly detected target becomes the new effective target, its measurement information is used as the initial state of the effective target, and the new effective target is tracked and measured.
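The three steps of the effective life cycle method can be sketched roughly as follows. The 6-frame confirmation and the life threshold follow the text; the same_target association predicate and the class interface are assumptions for illustration.

```python
LIFE_THRESHOLD = 6  # frames, per the text

class LifeCycleFilter:
    """Hypothetical sketch of the effective life cycle screening in S3."""

    def __init__(self):
        self.candidate = None       # target awaiting 6-frame confirmation
        self.candidate_hits = 0
        self.effective = None       # confirmed effective target
        self.miss_count = 0         # consecutive inconsistent frames

    def update(self, detection, same_target):
        """Feed one frame's detection; return the current effective target."""
        if self.effective is None:
            # Step one: promote a target seen in 6 consecutive frames.
            if detection is not None and (
                self.candidate is None or same_target(detection, self.candidate)
            ):
                self.candidate = detection
                self.candidate_hits += 1
            else:
                self.candidate = detection
                self.candidate_hits = 1 if detection is not None else 0
            if self.candidate_hits >= LIFE_THRESHOLD:
                self.effective = self.candidate
                self.miss_count = 0
            return self.effective

        if detection is not None and same_target(detection, self.effective):
            # Step two: consistent again after N < 6 misses - keep the target.
            self.effective = detection
            self.miss_count = 0
        else:
            self.miss_count += 1
            if self.miss_count >= LIFE_THRESHOLD:
                # Step three: misses exceed the life threshold - switch targets.
                self.effective = detection
                self.candidate = detection
                self.candidate_hits = 1 if detection is not None else 0
                self.miss_count = 0
        return self.effective
```

A false target that flickers for fewer than 6 frames never becomes effective, while a confirmed target survives short dropouts, which is the intent of the two thresholds.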
S4, judging whether the effective targets detected by the millimeter wave radar and the camera are the same obstacle. The detailed steps are as follows:
Step one: an effective target point detected by the millimeter wave radar is recorded as an R-type target, and an effective target point detected by the camera is recorded as a V-type target. The position coordinates of the R-type obstacle target are (x_R, y_R) and its speed is V_R; the position coordinates of the V-type obstacle target are (x_V, y_V) and its speed is V_V.
Step two, the threshold is set as W = λ·0.1·L_R + (1−λ)·ρ·V_R, wherein:

λ is a weight used to weight the distance difference and the speed difference of the two target types; considering that both types of sensor measure position more accurately than speed, λ is taken as 0.7;

ΔL = sqrt((x_R − x_V)² + (y_R − y_V)²) is the distance difference between the effective targets detected by the millimeter wave radar and the camera;

ΔV = |V_R − V_V| is the speed difference between the effective targets detected by the millimeter wave radar and the camera;

ρ is a speed error coefficient; a corresponding coefficient can be set according to the magnitude of the speed, so that the error does not grow as the speed increases. The speed error coefficient of the invention is specifically set as follows:

when V_R < 40 km/h: ρ = 0.1;
when 40 km/h ≤ V_R < 80 km/h: ρ = 0.09;
when 80 km/h ≤ V_R < 120 km/h: ρ = 0.07;
when V_R ≥ 120 km/h: ρ = 0.05.
Considering that the millimeter wave radar detects obstacles more accurately than the camera, the threshold is set with the millimeter wave radar as the reference (with λ = 0.7):

when V_R < 40 km/h: W = 0.07·L_R + 0.03·V_R;
when 40 km/h ≤ V_R < 80 km/h: W = 0.07·L_R + 0.027·V_R;
when 80 km/h ≤ V_R < 120 km/h: W = 0.07·L_R + 0.021·V_R;
when V_R ≥ 120 km/h: W = 0.07·L_R + 0.015·V_R.
And step three, the distance information and the speed information of the R-type and V-type targets are substituted into the weighted-value formula W' = λ·ΔL + (1−λ)·ΔV. If the weighted value W' is greater than or equal to the threshold W, the R-type and V-type targets are judged not to be the same obstacle; if W' is smaller than the threshold, they are judged to be the same obstacle.
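The same-obstacle test of steps one to three can be sketched as the following hypothetical Python routine. The formulas W' = λ·ΔL + (1−λ)·ΔV and W = λ·0.1·L_R + (1−λ)·ρ·V_R, the value λ = 0.7 and the piecewise ρ follow the text; reading L_R as the range of the radar target is an assumption.

```python
import math

LAMBDA = 0.7  # weight, per the text

def speed_error_coeff(v_r):
    """Piecewise speed error coefficient rho from the text (km/h)."""
    if v_r < 40:
        return 0.10
    if v_r < 80:
        return 0.09
    if v_r < 120:
        return 0.07
    return 0.05

def same_obstacle(radar, camera):
    """radar/camera: (x, y, v) with positions in metres, speeds in km/h.

    Returns True when the weighted value W' falls below the threshold W,
    i.e. the two detections are judged to be the same obstacle.
    """
    x_r, y_r, v_r = radar
    x_v, y_v, v_v = camera
    delta_l = math.hypot(x_r - x_v, y_r - y_v)   # distance difference dL
    delta_v = abs(v_r - v_v)                      # speed difference dV
    w_prime = LAMBDA * delta_l + (1 - LAMBDA) * delta_v
    l_r = math.hypot(x_r, y_r)                    # radar range (assumed L_R)
    threshold = LAMBDA * 0.1 * l_r + (1 - LAMBDA) * speed_error_coeff(v_r) * v_r
    return w_prime < threshold
```

For example, a radar target at (20, 0) m doing 50 km/h and a camera target at (20.3, 0.2) m doing 52 km/h give W' ≈ 0.85 against W = 2.75, so they are associated as one obstacle.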
And S5, performing data fusion on the targets judged to be the same obstacle by using Kalman filtering.
The specific implementation steps are as follows:
A) Initialization: after the first valid measurement value of the camera is received, the system state X̂_0 is initialized from the V-type target measurement as

X̂_0 = [x_V, y_V, V_(V-X), V_(V-Y)]^T

wherein: x_V is the lateral coordinate of the V-type target; y_V is the longitudinal coordinate of the V-type target; V_(V-X) is the lateral speed of the V-type target; V_(V-Y) is the longitudinal speed of the V-type target; α = sqrt(x_V² + y_V²) is the Euclidean distance of the V-type target; sqrt(V_(V-X)² + V_(V-Y)²) is the absolute value of the V-type target speed; φ is the orientation angle of the V-type target.
B) The position and speed information of the V-type target initialized at time k−1 is substituted into the state prediction formula to obtain the predicted value at time k:

X̂_k⁻ = F·X̂_(k−1)

P_k indicates the degree of uncertainty of the system state X̂_k. The invention initializes the covariance P_0 to preset values, and the covariance is updated by the following equation:

P_k⁻ = F·P_(k−1)·F^T + Q

wherein P_k⁻ is the error covariance of the predicted state X̂_k⁻, and P_(k−1) is the covariance of X̂_(k−1); X̂_k⁻ is the state prediction value of the Kalman filtering fusion system at time k; F is the state transition matrix, a model of the target state transition, which for a linear model is a constant matrix in the embodiment of the invention; Q is the inherent noise interference of the target state conversion model. Because the obstacle targets have already been screened and the inherent white noise of the model filtered out, Q is taken as zero in this embodiment.
The specific updating steps are as follows: from the initialized covariance P_0, the error covariance P_1⁻ of X̂_1⁻ at time 1 is calculated; from P_1⁻, the covariance P_1 of X̂_1 is calculated; from P_1, the error covariance P_2⁻ of X̂_2⁻ at time 2 is calculated; and so on, until the covariance P_(k−1) of X̂_(k−1) at time k−1 yields the error covariance P_k⁻ of X̂_k⁻ at time k, from which the error covariance P_k of X̂_k at time k is obtained by calculation.
From the predicted state X̂_k⁻ and its error covariance P_k⁻, the Kalman gain at time k is calculated:

K_k = P_k⁻·H^T·(H·P_k⁻·H^T + R)^(−1)

From the predicted value X̂_k⁻ at time k, the sensor observation z_k and the Kalman gain K_k, the predictive estimate of the system state can be updated:

X̂_k = X̂_k⁻ + K_k·(z_k − H·X̂_k⁻)

wherein K_k is the Kalman gain; H is the conversion matrix from the state variables to the observation variables; X̂_k is the optimal state corrected by the measurement; R is the measurement noise covariance; z_k is the observation of the first sensor or the second sensor.
C) The V-type target is substituted as the observation z_k at time k, and the optimal system state X̂_k is obtained from the updated predictive-estimation formula of step B). The optimal system state X̂_k at time k is then substituted into the predictive estimation of the system state in step B) to obtain the predicted system state at time k+1; the R-type target judged to be the same obstacle is substituted as the observation z_(k+1) at time k+1 and the predictive estimate is updated, yielding the fused effective obstacle information X̂_(k+1).
D) Steps B) and C) are cycled in time order, and the measurement-corrected optimal state X̂ of each cycle constitutes the fused obstacle target output set.
The control decision module of the intelligent driving passenger car can then execute corresponding driving behaviors, such as lane changing or parking, according to the fused, reliable obstacle targets.
As shown in fig. 5, corresponding to the method of the foregoing embodiment, the system of this embodiment includes:
the coordinate conversion module is used for unifying a radar coordinate system and a camera coordinate system into a vehicle body coordinate system;
the data analysis module is used for reading and analyzing original message data of the radar and the camera to obtain an original obstacle target under the vehicle body coordinate system;
the judging module is used for judging whether the original obstacle targets of the radar and the camera are the same obstacle;
the fusion module is used for fusing the targets judged to be the same obstacle to obtain final obstacle target information;
the screening module is used for screening the original obstacle targets output by the data analysis module, eliminating false targets and white noise, and transmitting the screened original obstacle targets to the judging module; preferably, the screening module screens the original obstacle targets using the effective life cycle method.
As shown in fig. 6, the determining module of this embodiment includes:
the marking unit is used for marking the obstacle target detected by the radar as an R-type target and the obstacle target detected by the camera as a V-type target; the position coordinates of the R-type obstacle target are (x_R, y_R) and its speed is V_R; the position coordinates of the V-type obstacle target are (x_V, y_V) and its speed is V_V;
the comparison unit is used for judging the relationship between the weighted value W' of the distance information and the speed information of the R-type and V-type targets and the threshold W: if W' ≥ W, the targets are judged not to be the same obstacle; if W' < W, they are judged to be the same obstacle; wherein the threshold W = λ·0.1·L_R + (1−λ)·ρ·V_R, W' = λ·ΔL + (1−λ)·ΔV is the weighted value of the distance information and the speed information of the R-type and V-type targets, λ is the weight, ρ is the speed error coefficient, ΔL = sqrt((x_R − x_V)² + (y_R − y_V)²), and ΔV = |V_R − V_V|. In the comparison unit, the speed error coefficient is specifically set as follows:

when V_R < 40 km/h: ρ = 0.1;
when 40 km/h ≤ V_R < 80 km/h: ρ = 0.09;
when 80 km/h ≤ V_R < 120 km/h: ρ = 0.07;
when V_R ≥ 120 km/h: ρ = 0.05.
The threshold is specifically set as follows:

when V_R < 40 km/h: W = 0.07·L_R + 0.03·V_R;
when 40 km/h ≤ V_R < 80 km/h: W = 0.07·L_R + 0.027·V_R;
when 80 km/h ≤ V_R < 120 km/h: W = 0.07·L_R + 0.021·V_R;
when V_R ≥ 120 km/h: W = 0.07·L_R + 0.015·V_R.
This embodiment also provides an unmanned vehicle comprising a processor for performing the steps of the method of an embodiment of the invention. The processor controls the unmanned vehicle to avoid the obstacle indicated by the final obstacle target information. In order to display the obstacle target information visually, the processor of this embodiment communicates with a display module; the display module may be a terminal device, such as a mobile terminal.

Claims (10)

1. A method for detecting an obstacle during the driving of an unmanned vehicle is characterized by comprising the following steps:
1) unifying a first sensor coordinate system and a second sensor coordinate system into a vehicle body coordinate system;
2) reading and analyzing original message data of a first sensor and a second sensor to obtain an original obstacle target under the vehicle body coordinate system;
3) judging whether the original obstacle targets detected by the first sensor and the second sensor are the same obstacle or not;
4) fusing the targets judged to be the same barrier to obtain final barrier target information;
wherein, the first sensor and the second sensor are different kinds of sensors.
2. The method for detecting the obstacle during the driving of the unmanned vehicle as claimed in claim 1, wherein in step 2), the raw message data of the first sensor and the second sensor are read in an online or offline manner.
3. The method for detecting the obstacle during the driving of the unmanned vehicle according to claim 1, wherein between the step 2) and the step 3), the following operations are further performed: and screening the original obstacle target to eliminate false targets and white noise.
4. The method for detecting the obstacle during the driving process of the unmanned vehicle according to claim 3, wherein the specific implementation process for screening the original obstacle target comprises the following steps:
A) when an original barrier target is decided as an effective target for the first time in a certain period, setting a life threshold value for the effective target, and carrying out tracking measurement on the effective target;
B) starting from the period, if the target newly detected by the first sensor and the second sensor is inconsistent with the corresponding effective target continuously for N times, and the target newly detected in the period from the (N + 1) th time to the later time is consistent with the effective target again, keeping the original effective target unchanged; if the number of times of continuous inconsistency between the newly detected target and the effective target from the period exceeds a life threshold, abandoning the original effective target, taking the newly detected target of the first sensor and the second sensor as a new effective target, and carrying out tracking measurement on the new effective target; wherein N is less than a life threshold;
C) and circularly executing the steps A) and B) to respectively obtain effective target output sets screened by the first sensor and the second sensor.
5. The method for detecting the obstacle during the driving process of the unmanned vehicle according to any one of claims 1 to 4, wherein the step 3) of determining whether the original obstacle targets detected by the first sensor and the second sensor are the same obstacle comprises:
1) marking an obstacle target detected by the first sensor as an R-type target, and an obstacle target detected by the second sensor as a V-type target; the position coordinates of the R-type obstacle target are (x_R, y_R) and its speed is V_R; the position coordinates of the V-type obstacle target are (x_V, y_V) and its speed is V_V;
2) judging the relationship between the weighted value W' of the distance information and the speed information of the R-type and V-type targets and the threshold W: if W' ≥ W, the targets are judged not to be the same obstacle; if W' < W, they are judged to be the same obstacle; wherein the threshold W = λ·0.1·L_R + (1−λ)·ρ·V_R, W' = λ·ΔL + (1−λ)·ΔV is the weighted value of the distance information and the speed information of the R-type and V-type targets, λ is the weight, ρ is the speed error coefficient, ΔL = sqrt((x_R − x_V)² + (y_R − y_V)²), and ΔV = |V_R − V_V|; preferably, λ is 0.7.
6. The method according to claim 5, wherein the threshold is specifically set as follows:

when V_R < 40 km/h, ρ = 0.1: W = 0.07·L_R + 0.03·V_R;
when 40 km/h ≤ V_R < 80 km/h, ρ = 0.09: W = 0.07·L_R + 0.027·V_R;
when 80 km/h ≤ V_R < 120 km/h, ρ = 0.07: W = 0.07·L_R + 0.021·V_R;
when V_R ≥ 120 km/h, ρ = 0.05: W = 0.07·L_R + 0.015·V_R.
7. the method for detecting the obstacle during the travel of the unmanned vehicle according to claim 5, wherein in the step 4), the targets determined to be the same obstacle are fused by using a Kalman filtering method; preferably, the specific implementation process of fusing the targets judged as the same obstacle by using the kalman filtering method includes:
A) after receiving the first valid measurement from the second sensor, initializing the system state X̂_0 from the V-type target measurement, wherein: x_V is the lateral coordinate of the V-type target; y_V is the longitudinal coordinate of the V-type target; V_(V-X) is the lateral speed of the V-type target; V_(V-Y) is the longitudinal speed of the V-type target; α is the Euclidean distance of the V-type target; sqrt(V_(V-X)² + V_(V-Y)²) is the absolute value of the V-type target speed; φ is the orientation angle of the V-type target;
B) substituting the position information and the speed information of the V-type target initialized at time k−1 into the state prediction formula to obtain the state prediction value X̂_k⁻ of the Kalman filtering fusion system at time k:

X̂_k⁻ = F·X̂_(k−1);
C) updating the predictive estimate of the Kalman filtering fusion system state using:

X̂_k = X̂_k⁻ + K_k·(z_k − H·X̂_k⁻)

wherein K_k is the Kalman gain, K_k = P_k⁻·H^T·(H·P_k⁻·H^T + R)^(−1); H is the conversion matrix from the state variables to the observation variables, and superscript T denotes matrix transposition; X̂_k is the optimal state corrected by the measurement; R is the measurement noise covariance; z_k is the observation of the first sensor or the second sensor at time k; P_k is the covariance of X̂_k and represents the uncertainty of the state; P_k⁻ = F·P_(k−1)·F^T + Q, wherein P_k⁻ is the error covariance of X̂_k⁻ and P_(k−1) is the covariance of X̂_(k−1); F is the state transition matrix, a model of the target state transition; Q is the inherent noise interference of the target state conversion model;
D) substituting the V-type target as the observation z_k at time k and obtaining the optimal system state X̂_k according to the Kalman filtering fusion system state prediction-value estimation formula of step B); substituting the optimal system state X̂_k at time k into the predictive-estimation formula of the Kalman filtering fusion system state of step C) to obtain the predicted system state at time k+1; substituting the R-type target judged to be the same obstacle as the observation z_(k+1) at time k+1 and further updating the predictive estimate of the Kalman filtering fusion system state to obtain the fused effective obstacle information X̂_(k+1);

E) cycling steps B) to D) in time order, the measurement-corrected optimal state X̂ of each cycle being the fused obstacle target output set.
8. An obstacle detection system during travel of an unmanned vehicle, comprising:
the coordinate conversion module is used for unifying the first sensor coordinate system and the second sensor coordinate system into a vehicle body coordinate system;
the data analysis module is used for reading and analyzing original message data of the first sensor and the second sensor to obtain an original obstacle target under the vehicle body coordinate system;
the judging module is used for judging whether the original obstacle targets detected by the first sensor and the second sensor are the same obstacle or not;
the fusion module is used for fusing the targets judged to be the same barrier to obtain final barrier target information; preferably, the system further comprises a screening module, wherein the screening module is used for screening the original obstacle target output by the data analysis module, eliminating false targets and white noise, and transmitting the original obstacle target after eliminating the false targets and the white noise to the judgment module; preferably, the screening module screens the original obstacle target by using an effective life cycle method;
wherein, the first sensor and the second sensor are different kinds of sensors.
9. The system of claim 8, wherein the determining module comprises:
the marking unit is used for marking the obstacle target detected by the first sensor as an R-type target and the obstacle target detected by the second sensor as a V-type target; the position coordinates of the R-type obstacle target are (x_R, y_R) and its speed is V_R; the position coordinates of the V-type obstacle target are (x_V, y_V) and its speed is V_V;
the comparison unit is used for judging the relationship between the weighted value W' of the distance information and the speed information of the R-type and V-type targets and the threshold W: if W' ≥ W, the targets are judged not to be the same obstacle; if W' < W, they are judged to be the same obstacle; wherein the threshold W = λ·0.1·L_R + (1−λ)·ρ·V_R, W' = λ·ΔL + (1−λ)·ΔV is the weighted value of the distance information and the speed information of the R-type and V-type targets, λ is the weight, ρ is the speed error coefficient, ΔL = sqrt((x_R − x_V)² + (y_R − y_V)²), and ΔV = |V_R − V_V|; preferably, the threshold is specifically set as follows:

when V_R < 40 km/h, ρ = 0.1: W = 0.07·L_R + 0.03·V_R;
when 40 km/h ≤ V_R < 80 km/h, ρ = 0.09: W = 0.07·L_R + 0.027·V_R;
when 80 km/h ≤ V_R < 120 km/h, ρ = 0.07: W = 0.07·L_R + 0.021·V_R;
when V_R ≥ 120 km/h, ρ = 0.05: W = 0.07·L_R + 0.015·V_R.
10. An unmanned vehicle comprising a processor for performing the method steps of any one of claims 1 to 7; preferably, the processor controls the unmanned vehicle to avoid the obstacle indicated by the final obstacle target information; preferably, the processor is in communication with a display module; preferably, the processor is in communication with a data acquisition module, the data acquisition module comprising at least one first sensor and at least one second sensor arranged at the front of the vehicle body, the first sensor and the second sensor being heterogeneous sensors.
CN202010333455.0A 2020-04-24 2020-04-24 Method and system for detecting obstacle in driving process of unmanned vehicle and vehicle Active CN111505623B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010333455.0A CN111505623B (en) 2020-04-24 2020-04-24 Method and system for detecting obstacle in driving process of unmanned vehicle and vehicle


Publications (2)

Publication Number Publication Date
CN111505623A true CN111505623A (en) 2020-08-07
CN111505623B CN111505623B (en) 2023-04-18

Family

ID=71874777

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010333455.0A Active CN111505623B (en) 2020-04-24 2020-04-24 Method and system for detecting obstacle in driving process of unmanned vehicle and vehicle

Country Status (1)

Country Link
CN (1) CN111505623B (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112462368A (en) * 2020-11-25 2021-03-09 中国第一汽车股份有限公司 Obstacle detection method and device, vehicle and storage medium
CN112924960A (en) * 2021-01-29 2021-06-08 重庆长安汽车股份有限公司 Target size real-time detection method, system, vehicle and storage medium
CN113296088A (en) * 2021-05-11 2021-08-24 雄狮汽车科技(南京)有限公司 Dynamic target tracking method and device for vehicle and vehicle
CN114148301A (en) * 2021-12-20 2022-03-08 岚图汽车科技有限公司 Vehicle brake control method, device, equipment and readable storage medium
CN117369507A (en) * 2023-10-31 2024-01-09 河海大学 Unmanned aerial vehicle dynamic path planning method of self-adaptive particle swarm algorithm
CN117932548A (en) * 2024-03-25 2024-04-26 西北工业大学 Confidence-based flight obstacle recognition method based on multi-source heterogeneous sensor fusion

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11348606A (en) * 1998-06-08 1999-12-21 Nissan Motor Co Ltd Vehicular driving force control device
CN106909148A (en) * 2017-03-10 2017-06-30 南京沃杨机械科技有限公司 Based on the unmanned air navigation aid of agricultural machinery that farm environment is perceived
CN107632308A (en) * 2017-08-24 2018-01-26 吉林大学 A kind of vehicle front barrier profile testing method based on recurrence superposition algorithm
WO2018154367A1 (en) * 2017-02-23 2018-08-30 Kpit Technologies Limited System and method for target track management of an autonomous vehicle
CN109886308A (en) * 2019-01-25 2019-06-14 中国汽车技术研究中心有限公司 One kind being based on the other dual sensor data fusion method of target level and device
CN110850413A (en) * 2019-11-26 2020-02-28 奇瑞汽车股份有限公司 Method and system for detecting front obstacle of automobile
CA3050426A1 (en) * 2018-10-12 2020-04-12 Aurora Flight Sciences Corporation Adaptive sense and avoid system


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
刘志强;王盈盈;倪捷;刘恒;: "前方车辆障碍物检测方法的研究" *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112462368A (en) * 2020-11-25 2021-03-09 中国第一汽车股份有限公司 Obstacle detection method and device, vehicle and storage medium
CN112462368B (en) * 2020-11-25 2022-07-12 中国第一汽车股份有限公司 Obstacle detection method and device, vehicle and storage medium
CN112924960A (en) * 2021-01-29 2021-06-08 重庆长安汽车股份有限公司 Target size real-time detection method, system, vehicle and storage medium
CN112924960B (en) * 2021-01-29 2023-07-18 重庆长安汽车股份有限公司 Target size real-time detection method, system, vehicle and storage medium
CN113296088A (en) * 2021-05-11 2021-08-24 雄狮汽车科技(南京)有限公司 Dynamic target tracking method and device for vehicle and vehicle
CN114148301A (en) * 2021-12-20 2022-03-08 岚图汽车科技有限公司 Vehicle brake control method, device, equipment and readable storage medium
CN117369507A (en) * 2023-10-31 2024-01-09 河海大学 Unmanned aerial vehicle dynamic path planning method of self-adaptive particle swarm algorithm
CN117932548A (en) * 2024-03-25 2024-04-26 西北工业大学 Confidence-based flight obstacle recognition method based on multi-source heterogeneous sensor fusion
CN117932548B (en) * 2024-03-25 2024-06-04 西北工业大学 Confidence-based flight obstacle recognition method based on multi-source heterogeneous sensor fusion

Also Published As

Publication number Publication date
CN111505623B (en) 2023-04-18

Similar Documents

Publication Publication Date Title
CN111505623B (en) Method and system for detecting obstacle in driving process of unmanned vehicle and vehicle
CN110850403B (en) Multi-sensor decision-level fused intelligent ship water surface target feeling knowledge identification method
US10489664B2 (en) Image processing device, device control system, and computer-readable storage medium
CN102806913B (en) Novel lane line deviation detection method and device
CN110929796B (en) Multi-source sensor-based decision layer data fusion method and system and storage medium
Jiménez et al. Improving the obstacle detection and identification algorithms of a laserscanner-based collision avoidance system
CN112154455B (en) Data processing method, equipment and movable platform
EP3851870A1 (en) Method for determining position data and/or motion data of a vehicle
US11538241B2 (en) Position estimating device
CN112668602A (en) Method, device and machine-readable storage medium for determining a quality level of a data set of a sensor
CN112285714A (en) Obstacle speed fusion method and device based on multiple sensors
CN111797701B (en) Road obstacle sensing method and system for vehicle multi-sensor fusion system
CN115993597A (en) Visual radar perception fusion method and terminal equipment
CN117173666A (en) Automatic driving target identification method and system for unstructured road
CN114298142A (en) Multi-source heterogeneous sensor information fusion method and device for camera and millimeter wave radar
JP7499140B2 (en) Object Recognition Device
CN113792598A (en) Vehicle-mounted camera-based vehicle collision prediction system and method
CN111959515A (en) Forward target selection method, device and system based on visual detection
CN116022163A (en) Automatic driving vehicle scanning matching and radar attitude estimator based on super local subgraph
Domhof et al. Multi-sensor object tracking performance limits by the cramer-rao lower bound
Nguyen et al. Optimized grid-based environment perception in advanced driver assistance systems
CN114730495A (en) Method for operating an environment detection device with grid-based evaluation and with fusion, and environment detection device
CN113359147B (en) Method and device for judging motion states of vehicle and target object
RU2798739C1 (en) Method for tracking objects at the recognition stage for unmanned vehicles
KR102643361B1 (en) Adaptive intervention determination method of vehicle accident avoidance system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant