CN110329259B - Vehicle automatic following system and method based on multi-sensor fusion - Google Patents
- Publication number
- CN110329259B (application CN201910594270.2A)
- Authority
- CN
- China
- Prior art keywords
- vehicle
- module
- radar
- relative distance
- relative
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Images
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/14—Adaptive cruise control
- B60W30/16—Control of distance between vehicles, e.g. keeping a distance to preceding vehicle
- B60W30/165—Automatically following the path of a preceding lead vehicle, e.g. "electronic tow-bar"
- B60W2520/00—Input parameters relating to overall vehicle dynamics
- B60W2520/10—Longitudinal speed
- B60W2554/00—Input parameters relating to objects
- B60W2554/80—Spatial relation or speed relative to objects
- B60W2555/00—Input parameters relating to exterior conditions, not covered by groups B60W2552/00, B60W2554/00
- B60W2555/20—Ambient conditions, e.g. wind or rain
Landscapes
- Engineering & Computer Science (AREA)
- Automation & Control Theory (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Traffic Control Systems (AREA)
- Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
Abstract
The invention provides a vehicle automatic following system based on multi-sensor fusion and a corresponding method. The own vehicle obtains state information of itself and of the preceding vehicle through an on-board millimeter wave radar, ultrasonic radar, camera, and vehicle-to-vehicle communication system. Different weights are then assigned to the sensor data according to the weather and network conditions to obtain the final relative speed and relative distance between the two vehicles, a weighted time to collision is calculated, and the following distance is maintained accordingly under the different conditions until the own vehicle reaches its destination. By fusing multiple sensors and classifying the network conditions and the working conditions of the sensors, the invention assigns weights to the data obtained under different conditions, so that the data are acquired more accurately and in real time.
Description
Technical Field
The invention belongs to the technical field of automobiles, and particularly relates to a vehicle automatic following system based on multi-sensor fusion and a method thereof.
Background
Most existing ACC following systems rely solely on a millimeter wave radar to obtain the relative distance and relative speed from the own vehicle to the preceding vehicle. The detection performance of the millimeter wave radar is affected by weather, which makes the measured distance inaccurate, and a short-range detection blind zone also exists, so traffic accidents easily occur when the following distance is short. With the development of 5G technology, V2V (vehicle-to-vehicle communication) is also beginning to be applied to vehicle ACC systems, but because the technology is still immature, differences in network signal strength between areas cause communication lag between vehicles, so the acquired distance deviates from the actual distance.
Disclosure of Invention
In order to solve the technical problems, the invention provides a vehicle automatic following system based on multi-sensor fusion and a method thereof.
The invention adopts the following technical scheme:
a multi-sensor fusion based vehicle auto-following system, comprising:
the environment information sensing unit comprises a plurality of sensors and is used for acquiring the relative distance and the relative speed from the vehicle to the front vehicle and acquiring external environment factors influencing data of the plurality of sensors;
the decision control unit is used for receiving data of the sensors, carrying out weighted average on the obtained relative distance and relative speed according to external environmental factors, calculating collision time and judging whether the collision time exceeds a preset early warning time threshold value or not;
if the collision time exceeds a threshold value, controlling the self-vehicle to accelerate; and if the collision time does not exceed the threshold value, controlling the self vehicle to brake and decelerate.
Preferably, the environment information sensing unit includes:
the radar detection module is used for acquiring the relative distance and the relative speed from the vehicle to the front vehicle through radar waves;
the camera module is used for acquiring the relative distance and the relative speed from the vehicle to the front vehicle through video acquisition;
the V2V communication system module is used for carrying out real-time communication between the self vehicle and the front vehicle, the front vehicle transmits position information and speed information to the self vehicle in real time, and the self vehicle obtains the relative distance and the relative speed from the self vehicle to the front vehicle through calculation according to the obtained information;
the weather detection and judgment module is used for judging the rain and fog weather conditions affecting the radar detection module and the camera module data;
and the network judgment module is used for judging the network transmission quality of the real-time communication between the self vehicle and the front vehicle.
Preferably, the radar detection module comprises a millimeter wave radar and an ultrasonic radar, and the millimeter wave radar is installed in the middle of a front bumper of the vehicle and used for detecting an area 3-150 meters away from the front of the vehicle; the ultrasonic radars are symmetrically arranged on two sides of a front bumper and are used for detecting an area with a distance of 20-400 cm from the front of the bumper; the radar detection module obtains the relative distance and the relative speed from the vehicle to the front vehicle through mutual compensation of the millimeter wave radar and the ultrasonic radar.
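A minimal sketch of this mutual compensation is given below, assuming a simple range-based selection rule; the patent states only that the two radars compensate each other, so the rule and the function name fuse_radar are illustrative assumptions.

```python
from typing import Optional, Tuple

# A reading is (relative_distance_m, relative_speed_mps), or None when no target is detected.
Reading = Optional[Tuple[float, float]]

def fuse_radar(mmwave: Reading, ultrasonic: Reading) -> Reading:
    """Combine millimeter-wave (valid roughly 3-150 m) and ultrasonic (roughly 0.2-4 m) readings."""
    if ultrasonic is not None and ultrasonic[0] < 3.0:
        # Target sits inside the millimeter-wave short-range blind zone: trust the ultrasonic radar.
        return ultrasonic
    if mmwave is not None and 3.0 <= mmwave[0] <= 150.0:
        # Normal working range of the millimeter-wave radar.
        return mmwave
    # Fall back to whichever sensor still reports a target.
    return ultrasonic if ultrasonic is not None else mmwave
```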
Preferably, the decision control unit includes:
the central ECU controller is used for receiving the relative distance information and relative speed information of each module; determining the proportion of the data obtained by the radar and the camera according to the weather conditions judged by the weather detection and judgment module, determining the proportion of the data obtained by the V2V communication system module according to the network transmission quality judged by the network judgment module, carrying out a weighted average of the obtained relative distances and relative speeds, and calculating the collision time; judging whether the collision time is greater than a threshold value, if so, judging that there is no collision danger and sending a signal to the accelerator actuating mechanism; if not, judging that a certain collision danger exists and sending a signal to the brake actuating mechanism;
the accelerator executing mechanism is used for receiving the collision-free danger signal and executing the operation of accelerating the self vehicle;
and the brake actuating mechanism is used for receiving a certain collision danger signal and executing the operation of braking and decelerating the self vehicle.
The invention also discloses a vehicle automatic following method based on multi-sensor fusion, which comprises the following steps:
s1, acquiring the relative distance and relative speed from the vehicle to the front vehicle through a plurality of sensors, and acquiring external environmental factors influencing data of the plurality of sensors;
s2, carrying out weighted average on the obtained relative distance and relative speed according to external environment factors, and calculating collision time according to the obtained weighted relative distance and weighted relative speed;
s3, judging whether the collision time exceeds a preset early warning time threshold value, wherein if the collision time exceeds the threshold value, sending a self-vehicle acceleration signal; if the collision time does not exceed the threshold value, sending a braking and decelerating signal of the vehicle;
and S4, repeating the steps S1, S2 and S3 until the vehicle safely reaches the destination.
Preferably, the step S1 includes:
obtaining the relative distance D_l and relative speed V_l from the vehicle to the front vehicle through the radar detection module;
obtaining the relative distance D_c and relative speed V_c from the vehicle to the front vehicle through the camera module;
calculating the relative distance D_t and relative speed V_t from the vehicle to the front vehicle through the V2V communication system module;
Judging the rain and fog weather conditions affecting the radar detection module and the camera module data through a weather detection judgment module;
judging the network transmission quality of real-time communication between the self vehicle and the front vehicle through a network judging module;
and the relative distance information and the relative speed information of each module are transmitted to the central ECU controller through a CAN communication mode.
Preferably, the radar detection module adopts a mutual compensation mode of a millimeter wave radar and an ultrasonic radar: the millimeter wave radar detects an area with a distance of 3-150 meters from the front of the vehicle, the ultrasonic radar detects an area with a distance of 20-400 centimeters from the front of the vehicle, and the relative distance D_l and relative speed V_l from the vehicle to the front vehicle are obtained through the mutual compensation of the millimeter wave radar and the ultrasonic radar.
Preferably, the step S2 includes:
the central ECU controller determines the proportion of the relative distance and relative speed obtained by the radar detection module and the camera module according to the judgment of the weather detection and judgment module on the intensity of rain and fog;
the central ECU controller determines the proportion of the relative distance and the relative speed obtained by the V2V communication system module according to the judgment of the network judgment module on the network transmission quality;
carrying out weighted average on the obtained relative distance and relative speed according to the weight occupied by each detection module;
and calculating the time to collision TTC according to the obtained weighted relative distance and the weighted relative speed.
The step S2 specifically includes:
Grading the fog environment into no fog, light fog, medium fog and heavy fog, where the weight of the influence of fog on the data obtained by the radar detection module is recorded as:
W_l1 = {no fog, light fog, medium fog, heavy fog} = {W_a1, W_a2, W_a3, W_a4};  (1)
The weight of the influence of fog on the data obtained by the camera module is recorded as:
W_c1 = {no fog, light fog, medium fog, heavy fog} = {W_b1, W_b2, W_b3, W_b4};  (2)
Classifying the rain environment into no rain, light rain, medium rain and heavy rain, where the weight of the influence of rain on the data obtained by the radar detection module is recorded as:
W_l2 = {no rain, light rain, medium rain, heavy rain} = {W_d1, W_d2, W_d3, W_d4};  (3)
The weight of the influence of rain on the data obtained by the camera module is recorded as:
W_c2 = {no rain, light rain, medium rain, heavy rain} = {W_e1, W_e2, W_e3, W_e4};  (4)
Classifying the communication quality into very good, general, poor and interrupted, where the weight of the influence of the communication quality on the data obtained by the V2V communication system module is recorded as:
W_t = {very good, general, poor, interrupted} = {W_f1, W_f2, W_f3, W_f4}.  (5)
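For illustration only, the graded weights defined in (1)-(5) could be stored as lookup tables; the numeric values below are placeholders and are not disclosed in the patent.

```python
# Placeholder values for illustration; the patent defines only the grading, not the numbers.
W_l1 = {"no fog": 1.0, "light fog": 0.8, "medium fog": 0.5, "heavy fog": 0.2}       # fog -> radar weight
W_c1 = {"no fog": 1.0, "light fog": 0.6, "medium fog": 0.3, "heavy fog": 0.1}       # fog -> camera weight
W_l2 = {"no rain": 1.0, "light rain": 0.8, "medium rain": 0.5, "heavy rain": 0.3}   # rain -> radar weight
W_c2 = {"no rain": 1.0, "light rain": 0.7, "medium rain": 0.4, "heavy rain": 0.2}   # rain -> camera weight
W_t  = {"very good": 1.0, "general": 0.7, "poor": 0.3, "interrupted": 0.0}          # link quality -> V2V weight
```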
According to the weights of each detection module determined by the weather judgment module and the network communication quality module, the obtained relative distances and relative speeds are weighted and averaged to obtain the final relative distance D_f and relative speed V_f:
wherein i = 1, 2, 3, 4.
The time to collision TTC is then calculated from the final weighted relative distance and relative speed:
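The weighted-average and collision-time formulas referenced above are not reproduced in this text; a plausible form, consistent with the weights and symbols defined in (1)-(5) but an assumption rather than the patent's exact equations, is:

D_f = (W_l1i·W_l2i·D_l + W_c1i·W_c2i·D_c + W_ti·D_t) / (W_l1i·W_l2i + W_c1i·W_c2i + W_ti)

V_f = (W_l1i·W_l2i·V_l + W_c1i·W_c2i·V_c + W_ti·V_t) / (W_l1i·W_l2i + W_c1i·W_c2i + W_ti)

TTC = D_f / V_f, with V_f taken as the speed at which the own vehicle closes on the preceding vehicle.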
Preferably, the step S3 further includes a preset first vehicle speed threshold and a preset second vehicle speed threshold, and the early warning time threshold includes a first time threshold, a second time threshold and a third time threshold;
judging whether the current vehicle speed exceeds a first vehicle speed threshold, and if so, judging whether the collision time TTC exceeds a first time threshold; if the current speed of the vehicle does not exceed the first speed threshold, judging whether the current speed of the vehicle exceeds a second speed threshold, if so, judging whether the Time To Collision (TTC) exceeds a second time threshold, and if not, judging whether the Time To Collision (TTC) exceeds a third time threshold;
if the collision time exceeds a threshold value, the central ECU controller sends an instruction to the accelerator execution mechanism, and the accelerator execution mechanism executes the acceleration operation of the automobile; if the collision time does not exceed the threshold value, the central ECU controller sends an instruction to the brake executing mechanism, and the brake executing mechanism executes the braking and decelerating operation of the automobile.
The invention has the beneficial effects that:
(1) according to the invention, the millimeter wave radar and the ultrasonic radar are fused, so that the short-range blind zone of the millimeter wave radar is compensated and the radar data acquisition system is complete and accurate, without a blind zone;
(2) the data acquired by the Internet of Vehicles system is fused with the data of the traditional sensors, and the subsystems complement each other's advantages under different working conditions, so that normal operation of the intelligent vehicle's ACC following system is ensured even under unfavorable conditions;
(3) by classifying network conditions and sensor working conditions and assigning weights to the data obtained under different conditions, the acquired data are more accurate and more up to date.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings:
FIG. 1 is a block diagram of the overall construction of the system of the present invention;
FIG. 2 is a schematic view of the mounting location of various sensors of the present invention on a vehicle;
FIG. 3 is a flow chart of the operation of the system of the present invention.
Detailed Description
The invention discloses a vehicle automatic following system based on multi-sensor fusion, which mainly comprises an environmental information perception part and a decision control part, wherein the environmental information perception part mainly comprises a radar detection module, a camera module, a V2V communication system module, a weather detection judgment module and a network judgment module; the decision control part mainly comprises a vehicle ECU and an actuating mechanism for an accelerator and a brake system.
The present invention will be described in further detail with reference to the following embodiments and accompanying fig. 1, 2 and 3.
Step one, acquiring relative distance and relative speed by each sensor
The radar detection module uses an algorithm in which the millimeter wave radar 4 and the left and right ultrasonic radars 3 and 5 compensate each other to obtain the relative distance and relative speed. The detection area of the millimeter wave radar is S2 and its detection distance D2 is 3-150 meters, with a blind zone in short-range detection; the detection area of the ultrasonic radar is S1 and the distance D1 it can detect accurately is 20-400 centimeters. Therefore, in the radar ranging system in which the millimeter wave radar and the ultrasonic radar compensate each other, the millimeter wave radar 4 is installed in the middle of the front bumper of the vehicle, 50-80 centimeters above the ground, and the left ultrasonic radar 3 and the right ultrasonic radar 5 are symmetrically arranged on the two sides of the front bumper. The relative distance between the own vehicle and the preceding vehicle obtained by the mutual compensation of the radars is D_l, and the relative speed is V_l.
The camera module adopts a CCD camera 2, installed in the middle of the front windshield of the own vehicle, which achieves full coverage of 0-100 meters; the relative distance between the own vehicle and the preceding vehicle obtained by the camera is D_c, and the relative speed is V_c.
The V2V communication system module is arranged on the roof. The own vehicle and the preceding vehicle communicate in real time, the preceding vehicle transmits its position and speed information to the own vehicle in real time, and from this information the own vehicle calculates the relative distance D_t and relative speed V_t between the two vehicles.
The relative distance and relative speed information obtained by each module is transmitted to the central ECU controller over the CAN bus to await further processing.
Step two, calculating the weights of the relative distance and relative speed obtained by each sensor
The radar and the camera module are mainly affected by the weather environment, such as rain, snow and fog, and different intensities of rain and fog affect the radar and the camera differently. The weather detection and judgment module therefore judges the external environment in which the vehicle is driving, and the proportion of the data obtained by the radar and the camera is determined according to the quality of that working environment. The V2V communication system module mainly depends on network transmission, and the speed of the network signal directly determines how accurate and up to date the transmitted data are, so the weight of the data obtained by the vehicle-to-vehicle communication module is determined by the network strength.
The radar and the camera are mainly affected by rain and fog. First, the fog environment is graded into no fog, light fog, medium fog and heavy fog, where the weight of the influence of fog on the data obtained by the radar is recorded as:
W_l1 = {no fog, light fog, medium fog, heavy fog} = {W_a1, W_a2, W_a3, W_a4};  (9)
The weight of the influence of fog on the data obtained by the camera module is recorded as:
W_c1 = {no fog, light fog, medium fog, heavy fog} = {W_b1, W_b2, W_b3, W_b4};  (10)
The rain environment is then classified into no rain, light rain, medium rain and heavy rain, where the weight of the influence of rain on the data obtained by the radar is recorded as:
W_l2 = {no rain, light rain, medium rain, heavy rain} = {W_d1, W_d2, W_d3, W_d4};  (11)
The weight of the influence of rain on the data obtained by the camera module is recorded as:
W_c2 = {no rain, light rain, medium rain, heavy rain} = {W_e1, W_e2, W_e3, W_e4};  (12)
The V2V communication module is mainly affected by the communication quality, which is classified into very good, general, poor and interrupted; the weight of the influence of the communication quality on the data obtained by the vehicle-to-vehicle communication module is recorded as:
W_t = {very good, general, poor, interrupted} = {W_f1, W_f2, W_f3, W_f4}.  (13)
According to the weights of each detection module determined by the weather judgment module and the network communication quality module, the obtained relative distances and relative speeds are weighted and averaged to obtain the final relative distance D_f and relative speed V_f:
wherein i = 1, 2, 3, 4.
The multi-sensor-fusion time to collision TTC is calculated from the final weighted relative distance and relative speed, and decision control is then performed:
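A minimal sketch of this fusion and TTC computation is given below, assuming a normalized weighted average as the combination rule; the function name and the example values are illustrative and not taken from the patent.

```python
def fuse_and_compute_ttc(Dl, Vl, Dc, Vc, Dt, Vt, w_radar, w_camera, w_v2v):
    """Fuse the radar (Dl, Vl), camera (Dc, Vc) and V2V (Dt, Vt) estimates and compute TTC.

    w_radar, w_camera and w_v2v play the roles of W_l1i*W_l2i, W_c1i*W_c2i and W_ti;
    the normalized weighted average below is an assumed combination rule.
    """
    total = w_radar + w_camera + w_v2v
    assert total > 0, "at least one information source must carry a non-zero weight"
    Df = (w_radar * Dl + w_camera * Dc + w_v2v * Dt) / total   # final relative distance D_f
    Vf = (w_radar * Vl + w_camera * Vc + w_v2v * Vt) / total   # final relative (closing) speed V_f
    ttc = Df / Vf if Vf > 0 else float("inf")                  # not closing -> no collision expected
    return Df, Vf, ttc

# Example with placeholder weights (light fog, no rain, good network assumed):
Df, Vf, ttc = fuse_and_compute_ttc(42.0, 2.0, 40.0, 1.8, 41.0, 1.9,
                                   w_radar=0.8, w_camera=0.6, w_v2v=0.7)
```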
The collision-time threshold should be neither too large nor too small: if the TTC threshold is set too large, the following distance becomes too large and traffic congestion may result; if it is set too small, the following distance becomes too close and traffic accidents easily occur in an emergency. The following strategy is therefore determined jointly by the own-vehicle speed and the TTC.
When the own-vehicle speed V is greater than 60 km/h, the collision-time threshold is set to 1.5 s. When the calculated TTC is greater than 1.5 s there is no collision danger, and the own vehicle may accelerate appropriately and reduce the inter-vehicle distance to improve traffic efficiency; when the TTC is less than 1.5 s, the own vehicle has a certain collision risk and should brake and decelerate appropriately.
When the own-vehicle speed is between 30 km/h and 60 km/h (30 km/h ≤ V < 60 km/h), the collision-time threshold is set to 1.2 s. When the calculated TTC is greater than 1.2 s there is no collision danger, and the own vehicle may accelerate appropriately and reduce the inter-vehicle distance to improve traffic efficiency; when the TTC is less than 1.2 s, the own vehicle has a certain collision risk and should brake and decelerate appropriately.
When the own-vehicle speed V is less than 30 km/h, the collision-time threshold is set to 1 s. When the calculated TTC is greater than 1 s there is no collision danger, and the own vehicle may accelerate appropriately and reduce the inter-vehicle distance to improve traffic efficiency; when the TTC is less than 1 s, the own vehicle has a certain collision risk and should brake and decelerate appropriately.
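The speed bands and TTC thresholds above can be summarized as a small decision sketch; the numerical thresholds come from the description, while the function name and the returned labels are illustrative.

```python
def following_decision(own_speed_kmh: float, ttc_s: float) -> str:
    """Return 'accelerate' or 'brake' from the own-vehicle speed and the fused TTC."""
    if own_speed_kmh > 60:
        threshold_s = 1.5
    elif own_speed_kmh >= 30:
        threshold_s = 1.2
    else:
        threshold_s = 1.0
    # TTC above the threshold: no collision danger, close the gap; otherwise brake.
    return "accelerate" if ttc_s > threshold_s else "brake"
```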
Steps one to three are repeated until the own vehicle safely reaches its destination.
Although the present invention has been described in detail with reference to the foregoing embodiments, it will be apparent to those skilled in the art that changes may be made in the embodiments and/or equivalents thereof without departing from the spirit and scope of the invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
Claims (4)
1. A vehicle automatic following system based on multi-sensor fusion is characterized by comprising:
the environment information sensing unit comprises a plurality of sensors and is used for acquiring the relative distance and the relative speed from the vehicle to the front vehicle and acquiring external environment factors influencing data of the plurality of sensors;
the decision control unit is used for receiving data of the sensors, carrying out weighted average on the obtained relative distance and relative speed according to external environmental factors, calculating collision time and judging whether the collision time exceeds a preset early warning time threshold value or not;
if the collision time exceeds a threshold value, controlling the self-vehicle to accelerate; if the collision time does not exceed the threshold value, controlling the self-vehicle to brake and decelerate;
the environment information sensing unit includes:
the radar detection module is used for acquiring the relative distance and the relative speed from the vehicle to the front vehicle through radar waves;
the camera module is used for acquiring the relative distance and the relative speed from the vehicle to the front vehicle through video acquisition;
the V2V communication system module is used for carrying out real-time communication between the self vehicle and the front vehicle, the front vehicle transmits position information and speed information to the self vehicle in real time, and the self vehicle obtains the relative distance and the relative speed from the self vehicle to the front vehicle through calculation according to the obtained information;
the weather detection and judgment module is used for judging the rain and fog weather conditions affecting the radar detection module and the camera module data;
the network judgment module is used for judging the network transmission quality of real-time communication between the self vehicle and the front vehicle;
the radar detection module comprises a millimeter wave radar and an ultrasonic radar, and the millimeter wave radar is arranged in the middle of a front bumper of the vehicle and is used for detecting an area with a distance of 3-150 meters from the front of the vehicle; the ultrasonic radars are symmetrically arranged on two sides of a front bumper and are used for detecting an area with a distance of 20-400 cm from the front of the bumper; the radar detection module obtains the relative distance and the relative speed from the vehicle to the front vehicle through mutual compensation of the millimeter wave radar and the ultrasonic radar;
the decision control unit comprises:
the central ECU controller is used for receiving the relative distance information and relative speed information of the radar detection module, the camera module and the V2V communication system module; determining the proportion of the data obtained by the radar and the camera according to the weather conditions judged by the weather detection and judgment module, determining the proportion of the data obtained by the V2V communication system module according to the network transmission quality judged by the network judgment module, carrying out a weighted average of the obtained relative distances and relative speeds, and calculating the collision time; judging whether the collision time is greater than a threshold value, if so, judging that there is no collision danger and sending a signal to the accelerator actuating mechanism; if not, judging that a certain collision danger exists and sending a signal to the brake actuating mechanism;
the accelerator executing mechanism is used for receiving the collision-free danger signal and executing the operation of accelerating the self vehicle;
and the brake actuating mechanism is used for receiving a certain collision danger signal and executing the operation of braking and decelerating the self vehicle.
2. A vehicle automatic following method based on multi-sensor fusion is characterized by comprising the following steps:
s1, acquiring the relative distance and relative speed from the vehicle to the front vehicle through a plurality of sensors, and acquiring external environmental factors influencing data of the plurality of sensors;
s2, carrying out weighted average on the obtained relative distance and relative speed according to external environment factors, and calculating collision time according to the obtained weighted relative distance and weighted relative speed;
s3, judging whether the collision time exceeds a preset early warning time threshold value, wherein if the collision time exceeds the threshold value, sending a self-vehicle acceleration signal; if the collision time does not exceed the threshold value, sending a braking and decelerating signal of the vehicle;
s4, repeating the steps S1, S2 and S3 until the vehicle safely reaches the destination;
the step S1 includes:
obtaining the relative distance and the relative speed from the vehicle to the front vehicle through a radar detection module;
obtaining the relative distance and the relative speed from the vehicle to the front vehicle through the camera module;
calculating the relative distance and the relative speed from the vehicle to the front vehicle through a V2V communication system module;
judging the rain and fog weather conditions affecting the radar detection module and the camera module data through a weather detection judgment module;
judging the network transmission quality of real-time communication between the self vehicle and the front vehicle through a network judging module;
the relative distance information and the relative speed information of the radar detection module, the camera module and the V2V communication system module are transmitted to the central ECU controller through a CAN communication mode;
the step S2 includes:
the central ECU controller determines the proportion of the relative distance and relative speed obtained by the radar detection module and the camera module according to the judgment of the weather detection and judgment module;
the central ECU controller determines the proportion of the relative distance and relative speed obtained by the V2V communication system module according to the network transmission quality judged by the network judgment module;
carrying out weighted average on the obtained relative distance and relative speed according to the weight occupied by each module;
and calculating the time to collision TTC according to the obtained weighted relative distance and the weighted relative speed.
3. The method for automatically following the vehicle based on the fusion of the multiple sensors as claimed in claim 2, wherein the radar detection module adopts a mutual compensation mode of a millimeter wave radar and an ultrasonic radar, the millimeter wave radar detects an area with a distance of 3-150 m from the front of the vehicle, the ultrasonic radar detects an area with a distance of 20-400 cm from the front of the vehicle, and the relative distance and the relative speed from the vehicle to the front vehicle are obtained through the mutual compensation of the millimeter wave radar and the ultrasonic radar.
4. The method for automatic following of vehicle based on multi-sensor fusion according to claim 3, wherein the step S3 further includes a preset first vehicle speed threshold and a preset second vehicle speed threshold, and the pre-warning time threshold includes a first time threshold, a second time threshold and a third time threshold;
judging whether the current vehicle speed exceeds a first vehicle speed threshold, and if so, judging whether the collision time exceeds a first time threshold; if the current vehicle speed does not exceed the first vehicle speed threshold, judging whether the current vehicle speed exceeds a second vehicle speed threshold, if so, judging whether the collision time exceeds a second time threshold, and if not, judging whether the collision time exceeds a third time threshold;
if the collision time exceeds a threshold value, the central ECU controller sends an instruction to the accelerator execution mechanism, and the accelerator execution mechanism executes the acceleration operation of the automobile; if the collision time does not exceed the threshold value, the central ECU controller sends an instruction to the brake executing mechanism, and the brake executing mechanism executes the braking and decelerating operation of the automobile.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910594270.2A CN110329259B (en) | 2019-07-03 | 2019-07-03 | Vehicle automatic following system and method based on multi-sensor fusion |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110329259A CN110329259A (en) | 2019-10-15 |
CN110329259B (en) | 2020-10-16
Family
ID=68143114
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910594270.2A Active CN110329259B (en) | 2019-07-03 | 2019-07-03 | Vehicle automatic following system and method based on multi-sensor fusion |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110329259B (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112837527B (en) * | 2019-11-22 | 2024-08-02 | 罗伯特·博世有限公司 | Target recognition system and method thereof |
US20210229681A1 (en) * | 2020-01-27 | 2021-07-29 | GM Global Technology Operations LLC | Realtime proactive object fusion for object tracking |
WO2021199320A1 (en) * | 2020-03-31 | 2021-10-07 | 本田技研工業株式会社 | Control device, saddled vehicle, control device operation method, and program |
CN114091562A (en) * | 2020-08-05 | 2022-02-25 | 北京万集科技股份有限公司 | Multi-sensing data fusion method, device, system, equipment and storage medium |
CN112141103A (en) * | 2020-08-31 | 2020-12-29 | 恒大新能源汽车投资控股集团有限公司 | Method and system for controlling vehicle to run along with front vehicle |
CN112660054A (en) * | 2021-01-05 | 2021-04-16 | 北京家人智能科技有限公司 | Method and device for triggering external airbags in grading manner and electronic equipment |
CN114274957B (en) * | 2021-12-13 | 2024-03-15 | 中国北方车辆研究所 | Vehicle self-adaptive cruise control method and system |
CN116061807A (en) * | 2023-03-06 | 2023-05-05 | 北京理工大学前沿技术研究院 | Blind area early warning method and system based on vehicle-road information fusion |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011219056A (en) * | 2010-04-14 | 2011-11-04 | Toyota Motor Corp | Travel control device |
CN106504561A (en) * | 2015-09-03 | 2017-03-15 | 罗伯特·博世有限公司 | Method for recognizing the object on parking area |
CN106379319A (en) * | 2016-10-13 | 2017-02-08 | 上汽大众汽车有限公司 | Automobile driving assistance system and control method |
CN107089231A (en) * | 2017-03-27 | 2017-08-25 | 中国第汽车股份有限公司 | It is a kind of automatic with car drive-control system and its method |
CN107202983A (en) * | 2017-05-19 | 2017-09-26 | 深圳佑驾创新科技有限公司 | The self-actuating brake method and system merged based on image recognition and millimetre-wave radar |
Also Published As
Publication number | Publication date |
---|---|
CN110329259A (en) | 2019-10-15 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||