CN112977393B - Automatic driving anti-collision avoiding device and method thereof - Google Patents

Automatic driving anti-collision avoiding device and method thereof

Info

Publication number
CN112977393B
CN112977393B CN202110436589.XA
Authority
CN
China
Prior art keywords
point
laser
camera
obstacle
control unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110436589.XA
Other languages
Chinese (zh)
Other versions
CN112977393A (en)
Inventor
周宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN202110436589.XA priority Critical patent/CN112977393B/en
Priority to PCT/CN2021/089577 priority patent/WO2022222159A1/en
Publication of CN112977393A publication Critical patent/CN112977393A/en
Application granted granted Critical
Publication of CN112977393B publication Critical patent/CN112977393B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/09 Taking automatic action to avoid collision, e.g. braking and steering
    • B60W10/00 Conjoint control of vehicle sub-units of different type or different function
    • B60W10/04 Conjoint control including control of propulsion units
    • B60W10/06 Conjoint control including control of combustion engines
    • B60W10/18 Conjoint control including control of braking systems
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
    • B60W40/02 Estimation of driving parameters related to ambient conditions
    • B60W40/10 Estimation of driving parameters related to vehicle motion
    • B60W40/105 Speed
    • B60W40/107 Longitudinal acceleration
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/143 Alarm means
    • B60W2420/00 Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40 Photo or light sensitive means, e.g. infrared sensors
    • B60W2420/403 Image sensing, e.g. optical camera
    • B60W2420/408
    • B60W2520/00 Input parameters relating to overall vehicle dynamics
    • B60W2520/06 Direction of travel
    • B60W2520/10 Longitudinal speed
    • B60W2520/105 Longitudinal acceleration
    • B60W2552/00 Input parameters relating to infrastructure
    • B60W2552/50 Barriers
    • B60W2710/00 Output or target parameters relating to a particular sub-unit
    • B60W2710/06 Combustion engines, Gas turbines
    • B60W2710/0605 Throttle position
    • B60W2710/18 Braking system

Abstract

The application discloses a low-cost automatic driving anti-collision avoidance device and method that use a camera, a millimeter-wave radar and a laser emitter as the main sensors. It relates to the technical field of automobile automatic driving and addresses the problem that existing systems using only a camera and a millimeter-wave radar as the main sensors cannot identify static obstacles. The automatic driving anti-collision avoidance device comprises: a laser emitter for emitting a laser beam in a specific direction; a camera for acquiring an image containing the light spot generated by the laser beam; an image processing module for processing the data transmitted by the camera and analyzing whether an obstacle exists ahead; a control unit that acquires real-time feedback data from the signal processing unit and analyzes whether a collision risk exists; and an anti-collision execution unit that executes the commands of the control unit and performs the automatic driving anti-collision avoidance operation.

Description

Automatic driving anti-collision avoiding device and method thereof
Technical Field
The present disclosure relates to an automatic driving anti-collision avoidance device and a method thereof, and more particularly to a device and method for detecting static obstacles for an automatically driven vehicle that uses a camera and a millimeter-wave radar as its main sensors.
Background
Currently, the mainstream sensors of vehicles with an autopilot function include the camera, the millimeter-wave radar and the laser radar (lidar). Lidar has not yet been deployed in volume because of its relatively high cost and limited stability, so existing automatic driving schemes mainly adopt the camera and the millimeter-wave radar as the main sensors. However, identifying static obstacles has always been difficult for systems that rely on "camera + millimeter-wave radar" as the primary sensors. In recent years several fatal accidents have occurred in which a vehicle using "camera + millimeter-wave radar" as the main sensors, while an L2-level automatic driving system was engaged (L2 is one of the levels of the current grading criteria for automatic driving: the driver remains responsible for monitoring the road while partial automation is provided), collided with a stationary vehicle positioned across the direction of travel. The cause of these accidents is that the camera and the millimeter-wave radar failed to recognize the static obstacle, so the system did not react.
Disclosure of Invention
The main object of the present disclosure is to provide a low-cost automatic driving anti-collision avoidance device, and a method thereof, based on "camera + millimeter-wave radar + laser emitter" as the main sensors.
The application discloses a low-cost automatic driving anti-collision avoidance device based on "camera + millimeter-wave radar + laser emitter" as the main sensors. The device comprises: a laser emitter for emitting a laser beam in a specific direction; a camera for acquiring an image containing the light spot generated by the laser beam; an image processing module for processing the data transmitted by the camera and analyzing whether an obstacle exists ahead; a control unit that acquires real-time feedback data from the signal processing unit and analyzes whether a collision risk exists; and an anti-collision execution unit that executes the commands of the control unit and performs the automatic driving anti-collision avoidance operation.
The camera uses a common camera chip, including a CCD image sensor or a CMOS image sensor; the laser emitter emits a beam that can be recognized by the camera, including but not limited to visible light.
As a further improvement of the application, the camera first judges whether an obstacle exists ahead; if not, the image processing module determines a safe position that the laser beam may irradiate, and the control unit operates the laser emitter to emit a laser beam in the specific direction so that it irradiates the safe position and forms a light spot.
As a further improvement of the present application, the positions that the image processing module treats as unsafe for laser irradiation include: 1) positions where the beam would directly or indirectly irradiate a person's face; 2) the surface of flammable or explosive substances; 3) reflective surfaces for which it cannot be determined whether the reflected laser is safe. The unsafe positions are not limited to the above.
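Purely as an illustration of how this screen could be expressed, the sketch below wraps the three checks in a single predicate; the helper calls (contains_face, on_flammable_surface, on_unverified_reflector) are hypothetical placeholders, not functions defined by this application.

```python
def is_safe_irradiation_point(point, scene) -> bool:
    """Return True if the candidate laser point passes the three checks
    listed above. The scene.* helpers are hypothetical perception calls."""
    return not (
        scene.contains_face(point)               # 1) no direct/indirect face hit
        or scene.on_flammable_surface(point)     # 2) not on flammable/explosive matter
        or scene.on_unverified_reflector(point)  # 3) not on an unverified reflector
    )
```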
As a further improvement of the application, the installation distance between the laser emitter and the camera in the direction perpendicular to the ground is more than 10 cm.
As a further improvement of the present application, laser emitters include, but are not limited to: a point cloud laser emitter, a rotatable laser emitter.
As a further development of the application, the laser emitter can emit laser beams of multiple light colors sequentially or simultaneously.
As a further improvement of the present application, the apparatus further comprises a motion state sensor, which determines the inclination, acceleration and steering state of the vehicle and transmits the data to the control unit to correct deviations of the laser emission direction.
As a further development of the application, the motion state sensor is integrated on the control unit motherboard.
The application also discloses a low-cost automatic driving anti-collision avoidance method that uses a camera, a millimeter-wave radar and a laser emitter as the main sensors, comprising the following steps: the signal processing unit receives data from the camera and the microwave radar mounted on the automatically driven vehicle and judges whether an obstacle exists ahead; if not, the image processing module determines a safe position that the laser beam may irradiate, and the control unit receives the safe-position signal and controls the laser emitter to emit a laser beam onto that position; the camera captures image data of the light spot formed by the laser beam and transmits it to the image processing module, which judges from the position of the laser spot whether an obstacle exists ahead; throughout this process, as soon as an obstacle ahead is detected, the control unit judges whether a potential safety hazard exists and controls the anti-collision execution unit to perform the automatic driving anti-collision avoidance operation.
The application thus provides a low-cost automatic driving anti-collision avoidance device and method that use a camera, a millimeter-wave radar and a laser emitter as the main sensors, compensating for the inability of systems that use only a camera and a millimeter-wave radar as the main sensors to identify static obstacles; the device has a simple structure, low cost and high application value.
Drawings
To describe the embodiments of the present application or the prior art more clearly, the drawings required in the description of the embodiments are briefly introduced below. It is apparent that the drawings described below are merely embodiments of the present application; a person of ordinary skill in the art can obtain further drawings from these drawings without inventive effort.
Fig. 1 is a schematic diagram of an autopilot collision avoidance apparatus according to an embodiment of the present application.
Fig. 2 is a schematic diagram of laser ranging for determining an obstacle according to an embodiment of the present application.
Fig. 3 illustrates how the vehicle provided in an embodiment of the present application corrects the laser beam direction when it is shifted because of a height difference (vehicle tilt).
Fig. 4 is a flowchart of an anti-collision avoidance process provided in an embodiment of the present application.
Detailed Description
The embodiments described herein will be described clearly and completely with reference to the accompanying drawings. The described embodiments are some, but not all, of the embodiments of the present application. Based on the embodiments in this application, all other embodiments obtained by a person of ordinary skill in the art without inventive effort fall within the scope of protection of the present application.
In the description of the present application, it should be noted that the directions or positional relationships indicated by the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc. are based on the directions or positional relationships shown in the drawings, are merely for convenience of description of the present application and to simplify the description, and do not indicate or imply that the system or element referred to must have a specific orientation, be configured and operated in a specific orientation, and thus should not be construed as limiting the present application. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In the description of the present application, it should be noted that, unless explicitly specified and limited otherwise, the terms "mounted", "connected" and "coupled" are to be construed broadly: a connection may, for example, be fixed, detachable or integral; it may be mechanical or electrical; it may be direct, indirect through an intermediate medium, or an internal communication between two elements. The specific meaning of these terms in this application will be understood by those of ordinary skill in the art according to the specific context.
Hereinafter, an embodiment of the autonomous collision avoidance device mounted on a vehicle will be described with reference to the drawings. The collision avoidance device according to the present embodiment detects the presence or absence of an obstacle ahead by emitting a laser beam in a specific direction. First, a schematic configuration of the collision avoidance device for a vehicle according to the present embodiment will be described with reference to fig. 1. It should be appreciated that the apparatus and methods of the present application may be used with any type of vehicle, including conventional vehicles, hybrid electric vehicles (HEVs), extended-range electric vehicles (EREVs), battery-electric vehicles (BEVs), motorcycles, passenger vehicles, sport utility vehicles (SUVs), cross-country vehicles, trucks, vans, buses, recreational vehicles (RVs), and the like. These are just a few of the possible applications, as the apparatus and methods described herein are not limited to the exemplary embodiments shown in figs. 1-4 and may be implemented in a variety of different ways.
In fig. 1, a vehicle (50) is provided with: a sensor unit (10), comprising a camera (101) for image acquisition, a microwave radar (102) for front obstacle detection, a motion state sensor (103) for recognizing the inclination, acceleration and steering state of the vehicle, and a laser emitter (104) for emitting the laser beam; a signal processing unit (20), comprising an image processing module (201) that receives camera image data for image processing, a radar processing module (202) for microwave radar data processing and control, a motion state processing module (203) for processing, converting and analyzing the motion state sensor data, and a laser control module (204) for controlling laser beam emission; a control unit (30) that receives and processes the data transmitted by the signal processing unit, controls the signal processing unit to perform further signal detection, and finally controls the anti-collision execution unit to perform the anti-collision avoidance action; and an anti-collision execution unit (40), comprising a vehicle speed regulator (401) that executes the control unit's command to reduce or close the fuel path and thereby control the vehicle speed, a braking device (402) that executes the control unit's command to engage the braking circuit and brake the vehicle, an instrument (403) that executes the control unit's command to display a front-obstacle alarm, and a brake lamp (404) that executes the control unit's command to warn following vehicles.
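As a reading aid for fig. 1 and the flow description that follows, the reference numerals and their roles can be collected in a simple lookup table; this is only a paraphrase of the paragraph above, not part of the patent.

```python
# Reference numerals of fig. 1 (paraphrased from the description above)
FIG1_COMPONENTS = {
    10: "sensor unit",
    101: "camera (image acquisition)",
    102: "microwave radar (front obstacle detection)",
    103: "motion state sensor (inclination, acceleration, steering)",
    104: "laser emitter (laser beam emission)",
    20: "signal processing unit",
    201: "image processing module",
    202: "radar processing module",
    203: "motion state processing module",
    204: "laser control module",
    30: "control unit",
    40: "anti-collision execution unit",
    401: "vehicle speed regulator",
    402: "braking device",
    403: "instrument (front-obstacle alarm display)",
    404: "brake lamp (warning to following vehicles)",
}
```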
The camera (101) uses a common camera chip, including but not limited to a CCD image sensor or a CMOS image sensor. It acquires image signals, can recognize the beam emitted by the laser emitter (104), can acquire image signals containing the light spot formed by that beam, and transmits the image signals to the image processing module (201) for computation and analysis.
The microwave radar (102) uses microwaves with a frequency of 300 MHz to 3000 GHz (wavelength of roughly 0.1 mm to 1 m) as its signal source. "Microwave" is the collective term for decimeter waves, centimeter waves, millimeter waves and sub-millimeter waves, so microwave radar also includes millimeter-wave radar. Microwaves are highly directional and travel at the speed of light; they are reflected as soon as they meet a vehicle and are then received by the radar, so the speed of the detected vehicle can be determined after a round trip lasting only a tiny fraction of a second. When a person or object moves within the sensing range of the microwaves, the sensor is triggered. Microwave radar is therefore widely used for detecting obstacles ahead in automatic driving, but its inability to identify static obstacles has always been a major difficulty for automatic driving.
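A quick λ = c/f check of the quoted band limits (plain physics, not text from the patent) shows why the wavelength range spans roughly 0.1 mm to 1 m:

```python
C = 299_792_458.0  # speed of light, m/s

def wavelength_m(freq_hz: float) -> float:
    """Free-space wavelength for a given frequency."""
    return C / freq_hz

print(wavelength_m(300e6))   # ~1.0 m at 300 MHz (lower band edge)
print(wavelength_m(3000e9))  # ~0.0001 m, i.e. 0.1 mm, at 3000 GHz (upper edge)
```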
The motion state sensor (103) comprises one or a combination of several of the following: a 3-axis accelerometer, a 3-axis gyroscope, a 3-axis magnetometer, an air-pressure sensor and an inclination sensor. It acquires the relevant signals and transmits them to the motion state processing module (203) for computation and analysis.
The laser emitter (104) emits a laser beam in a particular direction on command of the control unit (30). It includes, but is not limited to, a point-cloud laser emitter or a rotatable laser emitter, and it may emit laser beams of multiple light colors sequentially or simultaneously. The beam forms a light spot at the confirmed safe position, and the spot image data are used to judge whether an obstacle exists ahead.
The image processing module (201) receives the image data transmitted by the camera (101) and analyzes, by algorithm, whether an obstacle exists ahead. If none is found, it continues to analyze the image data of the laser spot: from whether the spot is offset, and by how much, it calculates whether an obstacle exists ahead and at what distance, and transmits the result to the control unit.
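At pixel level, the decision reduces to comparing the observed spot row with the row where the spot would appear if it landed at point C; the tolerance below is an assumed value, not one specified by the application.

```python
def spot_indicates_obstacle(measured_row_px: float,
                            expected_row_px: float,
                            tol_px: float = 3.0) -> bool:
    """Flag an obstacle when the laser spot is offset from its expected
    image row by more than an assumed pixel tolerance."""
    return abs(measured_row_px - expected_row_px) > tol_px
```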
The radar processing module (202) receives the microwave radar (102) data, analyzes the forward obstacle, and transmits the analysis result to the control unit.
The motion state processing module (203) receives the motion state sensor data, computes the vehicle inclination, acceleration and steering state by fusing Kalman filtering, particle filtering and complementary filtering algorithms, and transmits the result to the control unit for correcting the laser beam emission angle.
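The application does not spell out the fusion equations; as one hedged illustration, a single complementary-filter step for the pitch (inclination) angle from gyroscope and accelerometer data might look as follows, with the sample period and blend gain being assumed values.

```python
import math

def fuse_pitch(prev_pitch_rad: float, gyro_rate_rad_s: float,
               accel_x: float, accel_z: float,
               dt: float = 0.01, alpha: float = 0.98) -> float:
    """One complementary-filter step for vehicle pitch (inclination).

    prev_pitch_rad  : previous fused pitch estimate (rad)
    gyro_rate_rad_s : pitch rate from the gyroscope (rad/s)
    accel_x, accel_z: longitudinal and vertical accelerometer readings (m/s^2)
    dt, alpha       : assumed sample period and blend gain
    """
    pitch_gyro = prev_pitch_rad + gyro_rate_rad_s * dt  # short term: integrate gyro
    pitch_accel = math.atan2(accel_x, accel_z)          # long term: gravity direction
    return alpha * pitch_gyro + (1.0 - alpha) * pitch_accel
```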
The laser control module (204) receives the control signal of the control unit and controls the laser transmitter to emit a laser beam in a specific direction.
The control unit (30) is equipped with a microcomputer, a wire-harness interface, and the like. The microcomputer has a CPU (central processing unit), ROM (read-only memory), RAM (random access memory), I/O, a CAN (Controller Area Network) communication device, etc. The control unit (30) is mainly used to connect to the signal processing unit, receive the analyzed sensing data, send commands that control the laser emitter, calculate and analyze whether the obstacle ahead poses a safety risk, and control the anti-collision execution unit (40) to perform the anti-collision avoidance action.
The vehicle speed regulator (401) is used for reducing or closing an oil path to control the engine to inhibit driving force so as to regulate the vehicle speed.
The brake device (402) may be part of any suitable vehicle brake system, including systems associated with disc brakes, drum brakes, electro-hydraulic brakes, electro-mechanical brakes, regenerative brakes, brake-by-wire, and the like.
An alarm indicator lamp is arranged on the instrument (403). When the obstacle ahead presents a safety risk, the indicator lamp issues a warning, prompting the driver to take further measures while the system automatically performs collision avoidance.
When it is judged that the obstacle ahead must be avoided, the braking device (402) is activated automatically and the brake lamp (404) is illuminated at the same time to give following vehicles an early warning.
Turning now to fig. 2, a schematic diagram of how the autopilot collision avoidance apparatus determines an obstacle by laser ranging is shown. As shown in fig. 2 (i), when the first image fed back by the camera (101) of the vehicle (50) indicates that there is no obstacle ahead, the image processing module (201) further determines whether point C is safe, i.e. whether a pedestrian or a flammable or explosive object is present there; if not, the laser emitter (104) emits a laser beam that irradiates the position of point C. The camera (101) then acquires a second image containing the laser spot. The spot appears at the preset imaging position of point C, as shown in fig. 2 (iii) (a), confirming for the second time that there is no obstacle ahead.
If the image processing module (201) misjudged the first feedback image and an unidentified obstacle (60) is actually present in front of the vehicle, then, by the principle of rectilinear light propagation, the laser beam (the line connecting point B of the laser emitter and point C) forms a light spot at point E on the obstacle (60). Consider the vertical cross-section X through point E, transverse to the driving direction of the vehicle; the imaging of point E on section X is shown in fig. 2 (iii) (b). By the same principle, the imaging point of the original point C on section X is the intersection D of line AC (the line connecting point C and the camera center A) with section X, as shown in fig. 2 (iii) (b). In the image acquired by the camera (101), shown in fig. 2 (iii) (b), point C and point D coincide at the same point, while point E lies below point C, offset by a distance x. Therefore, if the spot does not appear at the preset position of point C but is offset by a distance x, the image processing module (201) can conclude from the spot image that an obstacle exists ahead and that the first image judgment was erroneous.
If the obstacle is located farther from the vehicle (50), as at the position of obstacle (70) in fig. 2 (i), the laser beam (line BC) forms a spot F on the cross-section X' through obstacle (70). Similarly, in the image formed on the camera (101), point F and point G coincide, and the distance between point F (G) and point C (D) is x', as shown in fig. 2 (iii) (c). As can be seen from fig. 2 (ii), point F (G) lies below point C (D), and point E lies below point F (G); hence the farther the imaged spot lies from the predetermined imaging point C, the closer the obstacle is to the vehicle (50).
On the other hand, the distance between the obstacle and the vehicle (50) can be calculated from the distance of the imaging point E from the predetermined imaging point C. As shown in fig. 2 (ii), first, the actual offset x between the two points on section X can be obtained from the pixel position difference between point E and point C in the image. Let d1 be the distance between the obstacle and the vehicle (50), d the distance from the vehicle (50) to point C, and d2 the distance from H, the projection of point E onto the ground, to point C, so that d1 = d - d2. Let y be the height of the laser emitter center B above the ground and α the angle between laser beam BC and the ground; then d = y/tan α. With x the distance between points E and D on section X and β the angle between line AC (camera center A to point C) and the ground, d2 = x/(tan β - tan α). Substituting the two expressions into d1 = d - d2 gives the final result: the distance between the obstacle and the vehicle (50) is d1 = y/tan α - x/(tan β - tan α).
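A minimal numerical sketch of this ranging formula, with the figure's symbols kept as parameter names; the mounting values in the example call are assumptions, not values from the application:

```python
import math

def obstacle_distance(y: float, alpha: float, beta: float, x: float) -> float:
    """d1 = y/tan(alpha) - x/(tan(beta) - tan(alpha))

    y     : height of laser emitter center B above the ground (m)
    alpha : angle between laser beam BC and the ground (rad)
    beta  : angle between camera line AC and the ground (rad)
    x     : offset E-D measured on section X (m)
    """
    d = y / math.tan(alpha)                      # vehicle-to-C ground distance
    d2 = x / (math.tan(beta) - math.tan(alpha))  # H-to-C ground distance
    return d - d2                                # d1: vehicle-to-obstacle distance

# Assumed example: laser 0.5 m above ground with the beam at 2 deg to the ground,
# camera ray at 3 deg, spot offset of 5 cm on section X -> roughly 11.5 m ahead.
print(obstacle_distance(0.5, math.radians(2.0), math.radians(3.0), 0.05))
```

As the following paragraph notes, shrinking the vertical camera-to-laser separation shrinks β - α, so the same pixel error in x produces a much larger error in d1.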
From the formula d1 = y/tan α - x/(tan β - tan α), the calculation accuracy of d1 is determined by x and y. Here y is the installation height of the laser emitter above the ground: the larger y is, the higher the measurement accuracy. The offset x, in turn, depends on BA', where A' is the intersection of line AC with the vertical cross-section, through point B, transverse to the driving direction; BA' is related to the mounting distance between the camera and the laser emitter in the direction perpendicular to the ground. If BA' is too small, the measurement accuracy is reduced. The requirement is therefore that the installation distance between the laser emitter and the camera in the direction perpendicular to the ground be more than 10 cm.
In practical use there may be multiple laser beams; different beams may be emitted at different times and with different light colors, so the actual situation may be more complex than the case described above.
Still further, as shown in fig. 3, a more complex situation arises when the vehicle (50) drives onto a slope or stands on uneven ground, so that front and rear are at different heights: the laser emitter is tilted by an angle η relative to the horizontal, which directly changes the original laser emission angle γ. In this case the actual inclination of the vehicle must be determined by the motion state sensor (103); the inclination is converted and fed back to the control unit (30), which corrects the real-time laser emission angle to δ. In addition to vehicle inclination, the motion state sensor (103) can also determine the speed, acceleration and steering state of the vehicle, which are used to further correct the direction of the emitted laser beam or the reference position used in image processing.
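The application does not give the correction formula explicitly; under the simple assumption that a nose-up pitch η raises the beam so that its effective angle to the ground becomes γ - η, the expected no-obstacle spot position can be re-derived as sketched below.

```python
import math

def expected_ground_hit(y: float, gamma_rad: float, eta_rad: float) -> float:
    """Ground distance from the vehicle to the expected laser spot when the
    body pitches by eta (assumed sign convention: nose-up eta reduces the
    beam's angle to the ground). Illustrative model only."""
    effective_angle = gamma_rad - eta_rad
    if effective_angle <= 0.0:
        raise ValueError("beam no longer intersects the ground")
    return y / math.tan(effective_angle)
```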
Fig. 4 is a flowchart of the method according to an embodiment of the present application. It should be understood that, although the steps in the flowchart of fig. 4 are shown in the order indicated by the arrows, they are not necessarily performed strictly in that order. Unless explicitly stated herein, the order of execution is not strictly limited and the steps may be performed in other orders. Moreover, at least some of the steps may include multiple sub-steps or stages that are not necessarily performed at the same time but may be performed at different times, and their order of execution is not necessarily sequential; they may be performed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps. The collision avoidance processing flow provided in the present embodiment repeats the processing of fig. 4 once per predetermined control cycle.
First, the camera and the microwave radar collect data and transmit them to the signal processing unit (S101). If the signal processing unit detects an obstacle (S102: yes), the data are transmitted to the control unit; the control unit judges whether a potential safety hazard exists and controls the collision avoidance execution unit to perform the automatic driving collision avoidance operation (S109). If the signal processing unit does not detect an obstacle (S102: no), the image processing module determines a safe laser irradiation point (S103) and transmits its coordinates to the control unit. The control unit corrects the irradiation direction for that coordinate position using the motion state sensor data (S104) and transmits the corrected irradiation coordinates to the laser control module. The laser control module controls the laser emitter to emit the laser beam according to the corrected coordinates received from the control unit (S105). The camera acquires the laser spot image data a second time (S106) and transmits them to the image processing module. The image processing module extracts the laser spot from the image data (S107) and analyzes, from the deviation of the spot position, whether an obstacle exists ahead (S108). If the signal processing unit detects an obstacle (S108: yes), the data are transmitted to the control unit; the control unit judges whether a potential safety hazard exists and controls the collision avoidance execution unit to perform the automatic driving collision avoidance operation (S109). If the signal processing unit does not detect an obstacle (S108: no), this flow ends and the next cycle begins.
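For readability, the S101-S109 cycle can be summarized as a single control-loop function; the unit objects and their method names are illustrative assumptions that mirror the modules of fig. 1, not an API defined by the application.

```python
def collision_avoidance_cycle(sensors, signal_proc, control, executor):
    """One control cycle mirroring S101-S109 of fig. 4 (illustrative only)."""
    frame = sensors.camera.read()                                   # S101
    radar = sensors.radar.read()                                    # S101
    if signal_proc.detect_obstacle(frame, radar):                   # S102: yes
        if control.has_safety_hazard():
            executor.avoid_collision()                              # S109
        return
    target = signal_proc.find_safe_laser_point(frame)               # S103
    aim = control.correct_direction(target, sensors.motion.read())  # S104
    signal_proc.laser_ctrl.emit(aim)                                # S105
    spot_frame = sensors.camera.read()                              # S106
    spot = signal_proc.extract_laser_spot(spot_frame)               # S107
    if signal_proc.spot_indicates_obstacle(spot):                   # S108: yes
        if control.has_safety_hazard():
            executor.avoid_collision()                              # S109
```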
In summary, the embodiments of the application provide a low-cost automatic driving anti-collision avoidance device and method that use a camera, a millimeter-wave radar and a laser emitter as the main sensors; they compensate for the inability of systems that use only a camera and a millimeter-wave radar as the main sensors to identify static obstacles, and the device has a simple structure, low cost and high application value.
The present disclosure is described in terms of the embodiments, but it should be understood that the disclosure is not limited to those embodiments and constructions. The present disclosure also covers various modifications and equivalent variations. In addition, various combinations and modes, as well as other combinations or modes including only one element, more than one element, or fewer elements, also fall within the scope and spirit of the present disclosure.

Claims (7)

1. An automatic drive collision avoidance device, the device comprising:
a sensor unit (10) comprising: a laser emitter (104) for emitting a laser beam in a specific direction, and a camera (101) for acquiring images containing the light spot generated by the laser beam; wherein the specific direction is such that the angle between the laser beam and the ground is α, and the camera (101) acquires images of the area ahead in the driving direction of the vehicle;
a signal processing unit (20) comprising: the image processing module (201) acquires data transmitted by the camera (101) to perform calculation processing and analyze whether an obstacle exists in front;
the control unit (30) acquires real-time feedback data of the signal processing unit and analyzes whether collision risk exists or not;
an anti-collision execution unit (40) that executes control of the control unit and performs an automatic driving anti-collision avoidance operation;
characterized in that,
when the image processing module (201) judges that there is no obstacle ahead, it further judges whether point C is a safe position; if no pedestrian is present there, point C is regarded as a safe position; point C is the point where the line from the center point A of the camera (101), making an angle β with the ground, meets the ground, so that the angle between the line connecting the center point A of the camera (101) with point C and the ground is β;
the control unit (30) is further configured, when the image processing module (201) judges that there is no obstacle ahead and that point C is a safe position, to control the laser emitter (104) to emit a laser beam in the specific direction toward point C and to control the camera (101) to continue capturing images;
if an unidentified obstacle (60) is actually present in front of the vehicle, the laser beam forms a light spot at point E on the obstacle (60); a vertical cross-section X is taken through point E, transverse to the driving direction of the vehicle, and the imaging point of point C on section X is the intersection point D of line AC with section X, line AC being the line connecting point C with the camera center point A; in the image acquired by the camera (101), the imaging point of point C coincides with the intersection point D, while point E lies below the imaging point of point C, producing an offset of distance x; the image processing module (201) therefore judges from the spot image of point E that an obstacle exists ahead and that the first image judgment was erroneous;
the distance between the obstacle (60) and the vehicle (50) is: d1 = y/tan α - x/(tan β - tan α), where y is the installation height of the laser probe of the laser emitter (104) above the ground, and x is the distance between points E and D on section X, obtained by calculating the pixel position difference between the imaging points of E and C in the image, i.e. the actual offset of the two points on section X.
2. The apparatus according to claim 1, wherein: the installation distance between the laser transmitter and the camera in the ground vertical direction is more than 10cm.
3. The apparatus according to claim 1, wherein: the laser transmitter is a point cloud laser transmitter or a rotatable laser transmitter.
4. The apparatus according to claim 1, wherein: the laser generator emits laser beams of a plurality of light colors sequentially or simultaneously.
5. The apparatus according to claim 1, wherein: the sensor unit further includes: the motion state sensor (103) judges the vehicle inclination, acceleration and steering states and transmits to the control unit to correct the deviation of the laser emission direction.
6. The apparatus according to claim 5, wherein: the motion state sensor is integrated on the control unit main board.
7. A method for autopilot collision avoidance of an apparatus according to any one of claims 1 to 6, wherein the method of collision avoidance comprises the steps of:
a) The signal processing unit receives data from a camera and a microwave radar which are arranged on the automatic driving vehicle and judges whether an obstacle exists in front;
b) If not, the image processing module determines a safe position that the laser beam may irradiate, and the control unit receives the safe-position signal and controls the laser emitter to emit a laser beam onto that safe position;
c) The camera captures the image data while the laser irradiates and transmits them to the image processing module, and the image processing module judges whether an obstacle exists ahead according to the position of the laser spot;
d) As long as the obstacle in front is detected, the control unit judges whether the potential safety hazard exists or not, and controls the anti-collision executing unit to perform automatic driving anti-collision avoiding operation.
CN202110436589.XA 2021-04-22 2021-04-22 Automatic driving anti-collision avoiding device and method thereof Active CN112977393B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110436589.XA CN112977393B (en) 2021-04-22 2021-04-22 Automatic driving anti-collision avoiding device and method thereof
PCT/CN2021/089577 WO2022222159A1 (en) 2021-04-22 2021-04-25 Autonomous driving collision avoidance apparatus and method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110436589.XA CN112977393B (en) 2021-04-22 2021-04-22 Automatic driving anti-collision avoiding device and method thereof

Publications (2)

Publication Number Publication Date
CN112977393A CN112977393A (en) 2021-06-18
CN112977393B true CN112977393B (en) 2024-01-12

Family

ID=76339763

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110436589.XA Active CN112977393B (en) 2021-04-22 2021-04-22 Automatic driving anti-collision avoiding device and method thereof

Country Status (2)

Country Link
CN (1) CN112977393B (en)
WO (1) WO2022222159A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116567375B (en) * 2023-04-24 2024-02-02 禾多科技(北京)有限公司 Vehicle-mounted front-view camera all-in-one machine, vehicle and vehicle speed control method

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06290398A (en) * 1993-03-31 1994-10-18 Mazda Motor Corp Obstacle detector
JPH09197045A (en) * 1996-01-24 1997-07-31 Nissan Motor Co Ltd Radar device for vehicles
US5832407A (en) * 1996-04-12 1998-11-03 Mitsubishi Denki Kabushiki Kaisha Vehicle surrounding monitoring device
JP2006258457A (en) * 2005-03-15 2006-09-28 Omron Corp Laser scanning apparatus
JP2006349694A (en) * 2006-08-10 2006-12-28 Omron Corp Object detection device and method
JP2010018080A (en) * 2008-07-09 2010-01-28 Mazda Motor Corp Vehicular driving support device
CN104729459A (en) * 2015-04-10 2015-06-24 武汉工程大学 Driving ranging and collision warning unit based on ARM-Linux system
CN105292085A (en) * 2015-11-02 2016-02-03 清华大学苏州汽车研究院(吴江) Vehicle forward collision avoidance system based on infrared laser aid
CN106291520A (en) * 2016-07-14 2017-01-04 江苏大学 A kind of DAS (Driver Assistant System) based on coded laser light and binocular vision and method
CN108995584A (en) * 2018-07-27 2018-12-14 合肥市智信汽车科技有限公司 To anti-collision system after a kind of vehicle based on infrared laser auxiliary
CN211696386U (en) * 2020-03-16 2020-10-16 河南灵狗电子科技有限公司 Novel on-vehicle millimeter wave radar of combination formula anticollision

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2901112B2 (en) * 1991-09-19 1999-06-07 矢崎総業株式会社 Vehicle periphery monitoring device
KR101395089B1 (en) * 2010-10-01 2014-05-16 안동대학교 산학협력단 System and method for detecting obstacle applying to vehicle
CN202903176U (en) * 2012-09-20 2013-04-24 孙斌 Visual range finder of laser auxiliary machine
US20210072398A1 (en) * 2018-06-14 2021-03-11 Sony Corporation Information processing apparatus, information processing method, and ranging system

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06290398A (en) * 1993-03-31 1994-10-18 Mazda Motor Corp Obstacle detector
JPH09197045A (en) * 1996-01-24 1997-07-31 Nissan Motor Co Ltd Radar device for vehicles
US5832407A (en) * 1996-04-12 1998-11-03 Mitsubishi Denki Kabushiki Kaisha Vehicle surrounding monitoring device
JP2006258457A (en) * 2005-03-15 2006-09-28 Omron Corp Laser scanning apparatus
JP2006349694A (en) * 2006-08-10 2006-12-28 Omron Corp Object detection device and method
JP2010018080A (en) * 2008-07-09 2010-01-28 Mazda Motor Corp Vehicular driving support device
CN104729459A (en) * 2015-04-10 2015-06-24 武汉工程大学 Driving ranging and collision warning unit based on ARM-Linux system
CN105292085A (en) * 2015-11-02 2016-02-03 清华大学苏州汽车研究院(吴江) Vehicle forward collision avoidance system based on infrared laser aid
CN106291520A (en) * 2016-07-14 2017-01-04 江苏大学 A kind of DAS (Driver Assistant System) based on coded laser light and binocular vision and method
CN108995584A (en) * 2018-07-27 2018-12-14 合肥市智信汽车科技有限公司 To anti-collision system after a kind of vehicle based on infrared laser auxiliary
CN211696386U (en) * 2020-03-16 2020-10-16 河南灵狗电子科技有限公司 Novel on-vehicle millimeter wave radar of combination formula anticollision

Also Published As

Publication number Publication date
CN112977393A (en) 2021-06-18
WO2022222159A1 (en) 2022-10-27

Similar Documents

Publication Publication Date Title
AU2019239572B2 (en) Method and device for monitoring and/or detecting a sensor system of a vehicle
US6943726B2 (en) Device for searching a parking space
US6166628A (en) Arrangement and method for detecting objects from a motor vehicle
JP6479130B1 (en) Vehicle travel support device
US11250276B2 (en) Object height determination for automated vehicle steering control system
CN104995070A (en) Method and beam sensor module for determining the condition of the road ahead in a vehicle
CN101162507A (en) Method for recognizing vehicle type of moving care
CN111108410B (en) Method for monitoring the surroundings of a motor vehicle, sensor controller, driver assistance system and motor vehicle
JPH0765294A (en) Preventive safety device for vehicle
AU2017337223A1 (en) A method for determining the presence of a trailer
US10885784B2 (en) Driving support device and control method of driving support device
KR102013224B1 (en) Autonomous Emergencyy Braking System and Controlling Method Thereof
JP2009042177A (en) Object detecting device
CN112977393B (en) Automatic driving anti-collision avoiding device and method thereof
JP3841047B2 (en) Vehicle distance control device
US20230034560A1 (en) Method for tracking a remote target vehicle in an area surrounding a motor vehicle by means of a collision detection device
JP3465384B2 (en) Vehicle obstacle detection device and approach warning / avoidance device
KR20170087368A (en) Blind spot detection method and blind spot detection device
CN113022593B (en) Obstacle processing method and device and traveling equipment
CN113486836B (en) Automatic driving control method for low-pass obstacle
KR101127702B1 (en) Integrated Side/Rear Safety System for Automobile
US20220308233A1 (en) Driver assistance system and operation method thereof
EP4125073A1 (en) Device and method for assisting a commercial vehicle during overtaking another vehicle
KR20230129076A (en) Method and apparatus for preventing collision with overloaded vehicle
CN114111606B (en) System and method for measuring vehicle parameters

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant