WO2022222159A1 - Automatic driving collision avoidance device and method - Google Patents

Automatic driving collision avoidance device and method

Info

Publication number
WO2022222159A1
Authority
WO
WIPO (PCT)
Prior art keywords
laser
camera
collision avoidance
control unit
obstacle
Prior art date
Application number
PCT/CN2021/089577
Other languages
English (en)
French (fr)
Inventor
周宇
Original Assignee
周宇
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 周宇 filed Critical 周宇
Publication of WO2022222159A1 publication Critical patent/WO2022222159A1/zh

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/09 Taking automatic action to avoid collision, e.g. braking and steering
    • B60W10/00 Conjoint control of vehicle sub-units of different type or different function
    • B60W10/04 Conjoint control of vehicle sub-units of different type or different function including control of propulsion units
    • B60W10/06 Conjoint control of vehicle sub-units of different type or different function including control of propulsion units including control of combustion engines
    • B60W10/18 Conjoint control of vehicle sub-units of different type or different function including control of braking systems
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
    • B60W40/10 Estimation or calculation of non-directly measurable driving parameters related to vehicle motion
    • B60W40/105 Speed
    • B60W40/107 Longitudinal acceleration
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/143 Alarm means
    • B60W2420/00 Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40 Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403 Image sensing, e.g. optical camera
    • B60W2420/408 Radar; Laser, e.g. lidar
    • B60W2520/00 Input parameters relating to overall vehicle dynamics
    • B60W2520/06 Direction of travel
    • B60W2520/10 Longitudinal speed
    • B60W2520/105 Longitudinal acceleration
    • B60W2552/00 Input parameters relating to infrastructure
    • B60W2552/50 Barriers
    • B60W2710/00 Output or target parameters relating to a particular sub-unit
    • B60W2710/06 Combustion engines, Gas turbines
    • B60W2710/0605 Throttle position
    • B60W2710/18 Braking system

Definitions

  • The present application relates to a collision avoidance device and method for automatic driving, and in particular to a device and method by which an autonomous vehicle using "camera + millimeter-wave radar" as its main sensors detects static obstacles.
  • The mainstream sensors currently used in vehicles equipped with autonomous driving functions are cameras, millimeter-wave radar, and lidar. Lidar has not been deployed at scale because of its high cost and limited stability.
  • Existing autonomous driving solutions mainly use "camera + millimeter-wave radar" as the main sensor.
  • The main purpose of the present disclosure is to provide a low-cost automatic driving collision avoidance device and method based on "camera + millimeter-wave radar + laser transmitter" as the main sensors.
  • The present application discloses a low-cost automatic driving collision avoidance device based on "camera + millimeter-wave radar + laser transmitter" as the main sensors.
  • The automatic driving collision avoidance device includes: a laser transmitter, which emits a laser beam in a specific direction; a camera, which captures images containing the spot produced by the laser beam; an image processing module, which processes the data transmitted by the camera and analyzes whether there is an obstacle ahead; a control unit, which receives real-time feedback from the signal processing unit and analyzes whether there is a collision risk; and an execution unit, which carries out the control unit's commands to perform the automatic driving collision avoidance operation.
  • The camera uses an ordinary camera chip, including but not limited to a CCD or CMOS image sensor; the beam emitted by the laser transmitter can be recognized by the camera and includes, but is not limited to, visible light.
  • The camera first determines whether there is an obstacle ahead. If not, the image processing module determines a safe position that the laser beam may irradiate, and the control unit operates the laser transmitter to emit a beam in the corresponding direction so that it forms a spot at the safe position.
  • The unsafe positions that the image processing unit can rule out for laser irradiation include: 1) positions where the beam would directly or indirectly strike a face; 2) the surfaces of flammable or explosive substances; 3) reflective surfaces for which it cannot be determined whether the reflected laser is safe. The unsafe positions are not limited to these.
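The exclusion rules above can be sketched as a simple classifier. This is an illustrative sketch, not the patent's implementation; the hazard class and function names are hypothetical:

```python
from enum import Enum, auto

class Hazard(Enum):
    """Unsafe-irradiation classes listed in the text (hypothetical names)."""
    FACE = auto()               # beam would directly or indirectly strike a face
    FLAMMABLE = auto()          # flammable or explosive surface
    UNKNOWN_REFLECTOR = auto()  # reflective surface with unverifiable reflection path

def is_safe_irradiation_point(detected_hazards: set) -> bool:
    # A candidate spot is safe only if image analysis found none of the
    # listed hazard classes at that position.
    return not detected_hazards

print(is_safe_irradiation_point(set()))                       # True
print(is_safe_irradiation_point({Hazard.UNKNOWN_REFLECTOR}))  # False
```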
  • The laser transmitter and the camera are installed more than 10 cm apart in the direction perpendicular to the ground.
  • laser transmitters include but are not limited to: point cloud laser transmitters, and rotatable laser transmitters.
  • the laser generator can emit laser beams of multiple light colors sequentially or simultaneously.
  • The device further includes a motion state sensor, which determines the vehicle's tilt, acceleration, and steering state and transmits them to the control unit to correct deviations in the laser emission direction.
  • the motion state sensor is integrated on the main board of the control unit.
  • The present application also discloses a low-cost automatic driving collision avoidance method based on "camera + millimeter-wave radar + laser transmitter" as the main sensors, comprising the following steps: the signal processing unit receives data from the camera and the microwave radar installed on the autonomous vehicle and determines whether there is an obstacle ahead; if not, the image processing module determines a safe position that the laser beam may irradiate, and the control unit receives the safe-position signal and controls the laser transmitter to emit a beam onto that position; the camera captures the image of the spot formed by the laser beam and transmits it to the image processing module, which judges again from the spot's position whether there is an obstacle ahead; throughout this process, whenever an obstacle is detected ahead, the control unit judges whether it presents a safety hazard and controls the collision avoidance execution unit to perform the automatic driving collision avoidance operation.
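The two-stage decision in these steps can be summarized as a small pure function. This is a hedged sketch; reducing each stage to a boolean flag and the names used are illustrative assumptions, not the patent's API:

```python
def cycle_decision(first_pass_obstacle: bool,
                   spot_offset_obstacle: bool,
                   safety_hazard: bool) -> str:
    """One control cycle: camera + radar first pass, then the laser-spot
    second pass; any detected obstacle judged to be a safety hazard
    triggers the collision avoidance execution unit."""
    if first_pass_obstacle or spot_offset_obstacle:
        return "avoid" if safety_hazard else "monitor"
    return "continue"  # no obstacle found by either stage

print(cycle_decision(False, True, True))    # avoid
print(cycle_decision(False, False, False))  # continue
```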
  • This application provides a low-cost automatic driving collision avoidance device and method based on "camera + millimeter-wave radar + laser transmitter" as the main sensors, which compensates for the inability of "camera + millimeter-wave radar" alone to identify static obstacles.
  • The structure is simple, the cost is low, and the device has high application value.
  • FIG. 1 is a schematic diagram of an automatic driving collision avoidance device provided by an embodiment of the present application.
  • FIG. 2 is a schematic diagram of a laser ranging to determine an obstacle according to an embodiment of the present application.
  • FIG. 3 is a situation diagram of correction when the direction of the laser rays is shifted due to the vehicle encountering a height difference according to an embodiment of the present application.
  • FIG. 4 is a flow chart of collision avoidance processing provided by an embodiment of the present application.
  • The terms "installed", "connected", and "coupled" should be understood broadly: a connection may be fixed, detachable, or integral; mechanical or electrical; direct, or indirect through an intermediate medium; or the internal communication of two elements.
  • the collision avoidance device according to the present embodiment detects whether there is an obstacle ahead by emitting a laser beam in a specific direction.
  • the schematic configuration of the vehicle collision avoidance device according to the present embodiment will be described with reference to FIG. 1 .
  • The apparatus and methods of the present application may be used in any type of vehicle, including conventional vehicles, hybrid electric vehicles (HEVs), extended-range electric vehicles (EREVs), pure electric vehicles (BEVs), motorcycles, passenger cars, sport utility vehicles (SUVs), crossovers, trucks, vans, buses, recreational vehicles (RVs), etc.
  • The vehicle (50) is provided with: a sensor unit (10), comprising a camera (101) for image acquisition, a microwave radar (102) for front obstacle detection, a motion state sensor (103) for recognizing the vehicle's tilt, acceleration, and steering state, and a laser transmitter (104) for emitting the laser beam; and a signal processing unit (20), comprising an image processing module (201) that receives camera image data for image processing, a radar processing module (202) for microwave radar data processing and control, a motion state processing module (203) that processes, converts, and analyzes the motion state sensor data, and a laser control module (204) that controls laser beam emission;
  • the control unit (30), which receives and processes the data transmitted from the signal processing unit, controls the signal processing unit to perform further signal detection, and finally controls the collision avoidance execution unit to carry out the collision avoidance action;
  • and the collision avoidance execution unit (40), which includes a vehicle speed controller (401) that executes the control unit's signal to reduce or cut off the fuel supply and so control the vehicle speed, and a braking device (402) that executes the control unit's signal to apply the brakes.
  • The camera (101) uses an ordinary camera chip, including but not limited to a CCD or CMOS image sensor. It collects image signals, can recognize the beam emitted by the laser transmitter (104), and can capture the spot formed by that beam; the image signal of the spot is sent to the image processing module (201) for calculation and analysis.
  • The microwave radar (102) uses microwaves, with frequencies between 300 MHz and 3000 GHz and wavelengths between 0.1 mm and 1 m, as its signal source.
  • Microwave is the collective name for decimeter, centimeter, millimeter, submillimeter, and meter waves, so microwave radar also includes millimeter-wave radar.
  • Microwaves are highly directional and travel at the speed of light.
  • When a microwave meets a vehicle it is immediately reflected back and received by the radar, from which the speed of the measured vehicle is obtained.
  • Microwave radar is therefore widely used for detecting obstacles ahead in autonomous driving, but its inability to identify static obstacles has always been a major problem.
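The radar speed measurement described here rests on the Doppler shift, which also suggests why stationary obstacles are hard to separate from the static background: they produce no shift. A sketch of the standard relation f_shift = 2·v·f_emit/c; the 77 GHz carrier and 10 kHz shift are illustrative assumptions, not values from the patent:

```python
C = 299_792_458.0  # speed of light, m/s

def doppler_speed(f_emit_hz: float, f_shift_hz: float) -> float:
    # Classic CW-radar approximation for v << c:
    #   f_shift = 2 * v * f_emit / c   =>   v = f_shift * c / (2 * f_emit)
    return f_shift_hz * C / (2.0 * f_emit_hz)

# A 10 kHz shift on a 77 GHz automotive radar corresponds to ~19.5 m/s;
# a stationary obstacle produces zero shift and is easily lost in clutter.
print(round(doppler_speed(77e9, 10e3), 2))  # 19.47
print(doppler_speed(77e9, 0.0))             # 0.0
```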
  • The motion state sensor (103) includes a 3-axis accelerometer, a 3-axis gyroscope, a 3-axis magnetometer, an air pressure sensor, and an inclination sensor, any one of which, or a combination of which, may be used.
  • the relevant signals can be collected and sent to the motion state processing module (203) for calculation and analysis.
  • The laser transmitter (104) emits a laser beam in a specific direction according to instructions from the control unit (30); it includes, but is not limited to, point cloud laser transmitters and rotatable laser transmitters, and it can emit laser beams of multiple colors sequentially or simultaneously. A spot is formed at the confirmed safe position, and the spot image data are used to determine whether there is an obstacle ahead.
  • The image processing module (201) receives the image data transmitted from the camera (101) and analyzes algorithmically whether there is an obstacle ahead; if not, it continues with the laser spot image data, computes the presence and distance of any obstacle ahead from whether the spot is offset and by how much, and sends the result to the control unit.
  • the radar processing module (202) receives the microwave radar (102) data for analyzing the obstacles ahead, and transmits the analysis result to the control unit.
  • The motion state processing module (203) receives the motion state sensor data, computes the vehicle's tilt, acceleration, and steering state by sensor fusion using algorithms such as Kalman filtering, particle filtering, and complementary filtering, and transmits the results to the control unit for correcting the laser beam emission angle.
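Of the fusion algorithms named, the complementary filter is the simplest to sketch for tilt estimation. The gain and names below are illustrative assumptions, not values from the patent:

```python
def complementary_filter(angle_prev: float, gyro_rate: float,
                         accel_angle: float, dt: float,
                         alpha: float = 0.98) -> float:
    """One update step: trust the integrated gyro rate in the short term
    and the accelerometer-derived tilt angle in the long term."""
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Stationary vehicle (no rotation): the estimate converges toward the
# accelerometer tilt reading (5 degrees here) over successive updates.
angle = 0.0
for _ in range(200):
    angle = complementary_filter(angle, gyro_rate=0.0, accel_angle=5.0, dt=0.01)
print(round(angle, 1))  # 4.9, converging to 5.0
```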
  • the laser control module (204) receives the control signal from the control unit, and controls the laser transmitter to emit a laser beam in a specific direction.
  • The control unit (30) is equipped with a microcomputer, wiring-harness interfaces, and the like.
  • The microcomputer has the well-known configuration of a CPU (Central Processing Unit), ROM (Read-Only Memory), RAM (Random Access Memory), I/O, a CAN (Controller Area Network) communication device, and so on.
  • the control unit (30) is mainly used to connect the signal processing unit, receive sensor signal analysis data and send data to control the laser transmitter, calculate and analyze whether the obstacle ahead is safe, and control the collision avoidance execution unit (40) to perform collision avoidance action.
  • The vehicle speed controller (401) reduces or cuts off the fuel supply to restrain the engine's driving force and thereby regulate the vehicle speed.
  • The braking device (402) may be part of any suitable vehicle braking system, including systems associated with disc brakes, drum brakes, electro-hydraulic braking, electro-mechanical braking, regenerative braking, brake-by-wire, etc.
  • An alarm indicator light is provided on the instrument (403); when the obstacle ahead presents a safety risk, the alarm light issues a warning while collision is avoided automatically, prompting the driver to take over and take further measures.
  • When it is determined that the obstacle ahead must be avoided, the braking device (402) is activated automatically and the brake light (404) is turned on at the same time to give following vehicles early warning so that they can take avoiding action.
  • Fig. 2 shows the principle by which the automatic driving collision avoidance device judges obstacles by laser ranging.
  • The image processing module (201) then determines whether point C is safe, i.e. whether pedestrians or flammable or explosive materials are present; if not, the laser transmitter (104) emits a laser beam to irradiate the position of point C.
  • The camera (101) then captures the image formed by the laser spot a second time. If the spot appears at the expected imaging position of point C, as shown in Figure 2(III)(a), it is confirmed for the second time that there is no obstacle ahead.
  • If the image processing module (201) misjudged the first feedback image and there is actually an unrecognized obstacle (60) in front of the vehicle, then, by the principle of rectilinear propagation of light, the laser beam (the line connecting the laser transmitter's center point B with point C) forms a spot at point E on the obstacle (60); a vertical section X through point E is taken perpendicular to the vehicle's direction of travel.
  • The image of point E on section X is shown in Figure 2(III)(b).
  • The imaging point of the original point C on section X is the intersection D of line AC (the line connecting point C with the camera's center point A) and section X, as shown in Figure 2(III)(b).
  • In the image captured by the camera (101), shown in Figure 2(III)(b), points C and D coincide as the same point, while point E lies below point C, offset by the distance x. Therefore, if the spot does not appear at the predetermined position of point C but is offset by x, the image processing module (201) can judge from the spot image that there is an obstacle ahead and that the first image judgment was mistaken.
  • Similarly, if a nearer unrecognized obstacle (70) is present, the laser beam (the line connecting the laser transmitter's center point B with point C) forms a spot F on the section X' through the obstacle (70).
  • Points F and G are the same point, and the distance between point F(G) and point C(D) is x', as shown in Fig. 2(III)(c). It can be seen from Figure 2(II) that point F(G) lies below point C(D) and point E lies below point F(G); hence the farther the imaging point is from the predetermined imaging point C, the closer the obstacle is to the vehicle (50).
  • The offset x between the two points on section X can be obtained by calculating the pixel position difference between point E and point C in the image.
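The pixel-difference measurement can be sketched as follows. This is illustrative only: real spot extraction would involve image processing on the camera frame, and the coordinate convention (image y grows downward) and tolerance are assumptions:

```python
def spot_offset_px(expected_c, observed):
    """Vertical pixel offset between the predicted spot position (point C)
    and the detected spot center; a downward shift means the beam was
    intercepted before reaching the intended ground point."""
    return observed[1] - expected_c[1]

def obstacle_in_path(expected_c, observed, tol_px: int = 2) -> bool:
    # Within tolerance: the spot landed where predicted, so the path is clear.
    return abs(spot_offset_px(expected_c, observed)) > tol_px

print(obstacle_in_path((320, 240), (320, 241)))  # False: spot at point C
print(obstacle_in_path((320, 240), (320, 265)))  # True: 25 px offset
```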
  • From d1 = y/tan α − x/(tan β − tan α) it follows that the calculation accuracy of d1 is determined by x and y.
  • y is the installation height of the laser emitter above the ground; the larger the installation height y, the higher the measurement accuracy.
  • x is in turn related to BA', where A' is the intersection of line AC with the vertical section through point B perpendicular to the vehicle's forward direction; BA' depends on the vertical installation distance between the camera and the laser transmitter. If BA' is too small, the measurement accuracy decreases; hence the requirement that the laser transmitter and the camera be installed more than 10 cm apart in the direction perpendicular to the ground.
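Reading the distance relation in the text as d1 = y/tan α − x/(tan β − tan α) (a reconstruction; the angle symbols and sign convention are assumptions consistent with the surrounding geometry), it can be sketched as:

```python
import math

def obstacle_distance(y: float, x: float, alpha: float, beta: float) -> float:
    """d1 = y/tan(alpha) - x/(tan(beta) - tan(alpha)); angles in radians.
    y: laser installation height above ground; x: spot offset on section X;
    alpha: depression angle of the laser beam; beta: angle to the shifted spot."""
    return y / math.tan(alpha) - x / (math.tan(beta) - math.tan(alpha))

# With no offset (x = 0) the distance is simply the ground-intersection
# range y/tan(alpha); a positive offset pulls the obstacle closer.
alpha, beta = math.atan(0.1), math.atan(0.2)
print(round(obstacle_distance(1.0, 0.0, alpha, beta), 2))   # 10.0: range to point C
print(round(obstacle_distance(1.0, 0.05, alpha, beta), 2))  # 9.5: obstacle ahead of C
```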
  • The above description covers only the case of a single laser beam.
  • Multiple beams may be used, their emission times may differ, and the laser colors of different beams may also differ; the actual situation is then more complicated than the case described above.
  • FIG. 4 is a schematic flowchart of a method according to an embodiment of the present application. It should be understood that, although the steps in the flowchart of FIG. 4 are displayed in sequence according to the arrows, these steps are not necessarily executed in the sequence indicated by the arrows. Unless explicitly stated herein, the execution of these steps is not strictly limited to the order and may be performed in other orders. Moreover, at least some of the steps in the figure may include multiple sub-steps or multiple stages. These sub-steps or stages are not necessarily executed at the same time, but may be executed at different times, and the execution order is not necessarily sequential. Instead, it may be performed in turn or alternately with other steps or at least a portion of sub-steps or stages of other steps. In the processing flow of the collision avoidance device provided in this embodiment, the processing shown in FIG. 4 is repeatedly executed for each predetermined control cycle.
  • The camera and microwave radar collect data and transmit it to the signal processing unit (S101). If the signal processing unit detects an obstacle (S102: YES), the data are transmitted to the control unit, which judges whether there is a potential safety hazard and controls the collision avoidance execution unit to perform the automatic driving collision avoidance operation (S109). If the signal processing unit does not detect an obstacle (S102: NO), the image processing module confirms the safe laser irradiation point (S103) and transmits its coordinates to the control unit. The control unit corrects the irradiation direction according to the coordinate position combined with the motion state sensor data (S104) and transmits the corrected irradiation coordinates to the laser control module.
  • The laser control module controls the laser transmitter to emit a search laser beam according to the corrected coordinates sent from the control unit (S105).
  • the camera collects the laser spot image data for the second time (S106), and transmits it to the image processing module.
  • the image processing module extracts the laser spot detection point according to the image data ( S107 ), and analyzes whether there is an obstacle ahead according to the position shift of the spot ( S108 ). If the signal processing unit detects an obstacle (S108: YES), the data is transmitted to the control unit.
  • The control unit judges whether there is a potential safety hazard and controls the collision avoidance execution unit to perform the automatic driving collision avoidance operation (S109). If the signal processing unit does not detect an obstacle (S108: NO), this pass ends and operation returns to the start of the next cycle.
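The S101 to S109 cycle above can be summarized as a step sequencer. This is a sketch under the assumption that each detection stage reduces to a boolean result; the function name is hypothetical:

```python
def control_cycle_steps(obstacle_first_pass: bool, obstacle_by_spot: bool):
    """Ordered step labels (per Fig. 4) executed in one control cycle."""
    steps = ["S101", "S102"]            # acquire data, first obstacle check
    if obstacle_first_pass:
        return steps + ["S109"]         # hazard check and avoidance action
    steps += ["S103", "S104", "S105",   # safe spot, direction correction, emit
              "S106", "S107", "S108"]   # capture spot, extract, offset check
    if obstacle_by_spot:
        steps.append("S109")
    return steps                        # else the cycle ends and restarts

print(control_cycle_steps(True, False))       # ['S101', 'S102', 'S109']
print(control_cycle_steps(False, False)[-1])  # S108
```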
  • The embodiments of the present application provide a low-cost automatic driving collision avoidance device and method based on "camera + millimeter-wave radar + laser transmitter" as the main sensors, which compensates for the inability of "camera + millimeter-wave radar" alone to identify static obstacles.

Landscapes

  • Engineering & Computer Science (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Human Computer Interaction (AREA)
  • Traffic Control Systems (AREA)

Abstract

An automatic driving collision avoidance device and method, comprising: a sensor unit (10), including a laser transmitter (104) for emitting a laser beam in a specific direction and a camera (101) for capturing images containing the spot produced by the laser beam; a signal processing unit (20), including an image processing module (201) that processes the data transmitted by the camera (101) and analyzes whether there is an obstacle ahead; a control unit (30), which receives real-time feedback from the signal processing unit (20) and analyzes whether there is a collision risk; and a collision avoidance execution unit (40), which carries out the control unit's (30) commands to perform the automatic driving collision avoidance operation. The device and method use "camera (101) + millimeter-wave radar (102) + laser transmitter (104)" as the main sensors, compensating for the inability of "camera + millimeter-wave radar" alone to identify static obstacles; moreover, the structure is simple and the cost is low, giving the approach high application value.

Description

Automatic Driving Collision Avoidance Device and Method

Technical Field
The present application relates to an automatic driving collision avoidance device and method, and in particular to a collision avoidance device and method for detecting static obstacles with an autonomous vehicle that uses "camera + millimeter-wave radar" as its main sensors.
Background Art
The mainstream sensors currently fitted to vehicles with autonomous driving functions are cameras, millimeter-wave radar, and lidar. Lidar has not been deployed at scale because of its high cost and limited stability. Existing autonomous driving solutions mainly use "camera + millimeter-wave radar" as the main sensors.
Technical Problem
However, identifying static obstacles with "camera + millimeter-wave radar" as the main sensors has always been difficult. In recent years, several fatal accidents have occurred in which autonomous vehicles using "camera + millimeter-wave radar" as the main sensors, driving with the L2 system engaged (L2 is one level in the current grading standard for autonomous driving: the driver monitors the road while partial automation is provided), struck stationary vehicles perpendicularly. The cause of these accidents was that "camera + millimeter-wave radar" failed to identify the stationary obstacle, so the system ultimately did not react.
Technical Solution
The main object of the present disclosure is to provide a low-cost automatic driving collision avoidance device and method based on "camera + millimeter-wave radar + laser transmitter" as the main sensors.
The present application discloses a low-cost automatic driving collision avoidance device based on "camera + millimeter-wave radar + laser transmitter" as the main sensors. The device comprises: a laser transmitter for emitting a laser beam in a specified direction; a camera for capturing an image containing the spot produced by the beam; an image processing module that receives and processes the camera data to analyze whether an obstacle lies ahead; a control unit that receives real-time feedback data from the signal processing unit and analyzes whether a collision risk exists; and a collision avoidance execution unit that executes the control of the control unit to perform automatic driving collision avoidance operations.
The camera uses an ordinary imaging chip, including but not limited to CCD and CMOS image sensors; the beam emitted by the laser transmitter, including but not limited to visible light, can be recognized by this camera.
As a further refinement of the present application, the camera first judges whether an obstacle lies ahead; if not, the image processing module determines a safe position that the laser beam may illuminate, and the control unit operates the laser transmitter to emit a beam in the corresponding direction, forming a spot at the safe position.
As a further refinement, the unsafe positions that the image processing unit excludes from illumination include: 1) positions where the beam would directly or indirectly strike a person's face; 2) surfaces of flammable or explosive substances; 3) reflective surfaces for which the safety of the reflected beam cannot be judged; but unsafe positions are not limited to these.
As a further refinement, the laser transmitter and the camera are mounted more than 10 cm apart in the direction perpendicular to the ground.
As a further refinement, the laser transmitter includes but is not limited to: point-cloud laser transmitters and rotatable laser transmitters.
As a further refinement, the laser transmitter can emit laser beams of several colors, sequentially or simultaneously.
As a further refinement, the device further comprises: a motion state sensor that determines the vehicle's tilt, acceleration, and steering state and reports them to the control unit to correct deviations in the laser emission direction.
As a further refinement, the motion state sensor is integrated on the control unit mainboard.
The present application also discloses a low-cost automatic driving collision avoidance method based on "camera + millimeter-wave radar + laser transmitter" as the main sensors, comprising the following steps: the signal processing unit receives data from the camera and microwave radar mounted on the autonomous vehicle and judges whether an obstacle lies ahead; if not, the image processing module determines a safe position that the laser beam may illuminate, and the control unit receives the safe-position signal and directs the laser transmitter to emit a beam illuminating it; the camera captures image data of the resulting laser spot and passes it to the image processing module, which judges again, from the position of the laser spot, whether an obstacle lies ahead; throughout this process, whenever an obstacle is detected ahead, the control unit judges whether a safety hazard exists and directs the collision avoidance execution unit to perform automatic driving collision avoidance operations.
Beneficial Effects
The present application provides a low-cost automatic driving collision avoidance device and method based on "camera + millimeter-wave radar + laser transmitter" as the main sensors, remedying the inability of a "camera + millimeter-wave radar" sensor suite to recognize static obstacles, with a simple structure, low cost, and high practical value.
Brief Description of the Drawings
To explain the technical solutions of the embodiments of the present application or of the prior art more clearly, the drawings needed for their description are briefly introduced below. The drawings described below are evidently only some embodiments of the present application; a person of ordinary skill in the art can derive other drawings from them without inventive effort.
Fig. 1 is a schematic diagram of the automatic driving collision avoidance device provided by an embodiment of the present application.
Fig. 2 illustrates the principle of judging obstacles by laser ranging, provided by an embodiment of the present application.
Fig. 3 illustrates the correction applied when the laser beam direction deviates because the vehicle encounters a height difference, provided by an embodiment of the present application.
Fig. 4 is a flowchart of the collision avoidance processing provided by an embodiment of the present application.
Description of Embodiments
The technical solutions of the present application are described below clearly and completely with reference to the drawings. The embodiments described are evidently only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art without inventive effort, based on the embodiments herein, fall within the scope of protection of the present application.
In the description of the present application it should be noted that orientation or position terms such as "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", and "outer" are based on the orientations shown in the drawings, serve only to simplify the description, and do not indicate or imply that the referenced system or element must have a particular orientation or be constructed and operated in one; they must therefore not be read as limiting the application. Likewise, the terms "first", "second", and "third" are used for description only and must not be read as indicating or implying relative importance.
It should also be noted that, unless expressly specified and limited otherwise, the terms "mounted", "connected", and "coupled" are to be understood broadly: the connection may be fixed, detachable, or integral; mechanical or electrical; direct, or indirect via an intermediary, or internal between two elements. A person of ordinary skill in the art can understand the specific meaning of these terms in the present application according to context.
Embodiments are described below, with reference to the drawings, as an automatic driving collision avoidance device mounted on a vehicle. The collision avoidance device of this embodiment detects whether an obstacle lies ahead by emitting a laser beam in a specified direction. First, the schematic structure of the device is described using Fig. 1. It should be understood that the device and method of the present application are applicable to any type of vehicle, including conventional vehicles, hybrid electric vehicles (HEV), extended-range electric vehicles (EREV), battery electric vehicles (BEV), motorcycles, passenger cars, sport utility vehicles (SUV), crossovers, trucks, vans, buses, and recreational vehicles (RV). These are only some of the possible applications, since the device and method described herein are not limited to the exemplary embodiments shown in Figs. 1-4 and can be implemented in many different ways.
In Fig. 1 the vehicle (50) comprises: a sensor unit (10), including a camera (101) for image acquisition, a microwave radar (102) for detecting obstacles ahead, a motion state sensor (103) for recognizing the vehicle's tilt, acceleration, and steering state, and a laser transmitter (104) for emitting the laser beam; a signal processing unit (20), including an image processing module (201) that receives the camera image data for image processing, a radar processing module (202) for microwave radar data processing and control, a motion state processing module (203) that processes, converts, and analyzes the motion state sensor data, and a laser control module (204) that controls emission of the laser beam; a control unit (30) that receives and processes the data forwarded by the signal processing unit, directs the signal processing unit to perform further signal detection, and finally directs the collision avoidance execution unit to carry out avoidance actions; and a collision avoidance execution unit (40), including a speed regulator (401) that, on a control unit signal, reduces or closes the fuel line to control vehicle speed, a braking device (402) that, on a control unit signal, engages the brake circuit to brake the vehicle, an instrument panel (403) that, on a control unit signal, displays an obstacle-ahead warning, and brake lights (404) that, on a control unit signal, alert following vehicles.
The camera (101) uses an ordinary imaging chip, including but not limited to CCD and CMOS image sensors. It acquires image signals, can recognize the beam emitted by the laser transmitter (104), captures image signals containing the spot formed by that beam, and forwards them to the image processing module (201) for computation and analysis.
The microwave radar (102) uses microwaves with frequencies between 300 MHz and 3000 GHz and wavelengths between 0.1 mm and 10 m as its signal source. "Microwave" is the collective term for decimeter, centimeter, millimeter, submillimeter, and meter waves, so microwave radar also includes millimeter-wave radar. Microwaves are highly directional and travel at the speed of light; a wave striking a vehicle is immediately reflected back and received by the radar speed meter, and this round trip takes only a few hundred-thousandths of a second, after which the measured vehicle speed is displayed. A person or object moving within the microwave sensing range triggers the sensor. Microwave radar is therefore widely used in autonomous driving to detect obstacles ahead, but its inability to recognize static obstacles has long been a major difficulty for autonomous driving.
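The round-trip timing described above can be put in numbers. A minimal sketch, assuming a simple two-sample speed estimate (function names are my own, not from the application):

```python
C = 299_792_458.0  # speed of light in m/s

def echo_distance(round_trip_s: float) -> float:
    """Target distance from the radar round-trip time (out and back)."""
    return C * round_trip_s / 2.0

def approach_speed(d1_m: float, d2_m: float, dt_s: float) -> float:
    """Radial closing speed from two distance samples taken dt_s apart."""
    return (d1_m - d2_m) / dt_s

# A 1-microsecond round trip corresponds to roughly 150 m of range,
# consistent with the "hundred-thousandths of a second" figure above.
```

This also illustrates why a stationary obstacle is hard for such radar: with no relative motion there is no Doppler shift and no change between samples, so clutter filtering tends to discard the return.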
The motion state sensor (103) comprises one or more of: a 3-axis accelerometer, a 3-axis gyroscope, a 3-axis magnetometer, a barometric pressure sensor, and a tilt sensor. It collects the relevant signals and forwards them to the motion state processing module (203) for computation and analysis.
The laser transmitter (104) emits a laser beam in a specified direction on command of the control unit (30), and includes but is not limited to: point-cloud laser transmitters and rotatable laser transmitters. It can emit beams of several colors, sequentially or simultaneously. The spot it forms at the confirmed safe position yields the image data used to judge whether an obstacle lies ahead.
The image processing module (201) receives the image data forwarded by the camera (101) and analyzes algorithmically whether an obstacle lies ahead; if none is found, it continues by analyzing the laser spot image data, computing from the presence and magnitude of any spot offset whether an obstacle lies ahead and at what distance, and forwards the result to the control unit.
The radar processing module (202) receives the microwave radar (102) data, analyzes obstacles ahead, and forwards the result to the control unit.
The motion state processing module (203) receives the motion state sensor data and, through fusion algorithms including Kalman filtering, particle filtering, and complementary filtering, computes the vehicle's tilt, acceleration, and steering state, forwarding the data to the control unit for correcting the laser emission angle.
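As one concrete example of the fusion mentioned, a single complementary-filter update for the pitch angle might look like the sketch below. The application names the filter family but not its details, so the gain and sign conventions here are assumptions:

```python
import math

def complementary_pitch(prev_pitch, gyro_rate, dt, ax, az, k=0.98):
    """One complementary-filter step: blend the integrated gyro rate
    (smooth but drifting) with the accelerometer-derived pitch
    (noisy but drift-free). Angles in radians, accelerations in m/s^2."""
    gyro_pitch = prev_pitch + gyro_rate * dt   # integrate angular rate
    accel_pitch = math.atan2(ax, az)           # gravity-referenced pitch
    return k * gyro_pitch + (1.0 - k) * accel_pitch
```

The high gyro weight `k` keeps the estimate responsive while the small accelerometer term slowly pulls out gyro drift; a Kalman or particle filter would replace the fixed gain with one derived from the noise models.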
The laser control module (204) receives control signals from the control unit and directs the laser transmitter to emit a beam in the specified direction.
The control unit (30) carries a microcomputer, wiring harness interfaces, and the like. The microcomputer has a well-known structure comprising a CPU (Central Processing Unit), ROM (Read Only Memory), RAM (Random Access Memory), I/O, and a CAN (Controller Area Network) communication device. The control unit (30) mainly connects to the signal processing unit, receives and analyzes the sensed data, sends commands controlling the laser transmitter, computes whether the obstacle ahead is safe, and directs the collision avoidance execution unit (40) to perform avoidance actions.
The speed regulator (401) controls vehicle speed by reducing or closing the fuel line to suppress the engine's driving force.
The braking device (402) may be part of any suitable vehicle braking system, including systems associated with disc brakes, drum brakes, electro-hydraulic braking, electromechanical braking, regenerative braking, brake-by-wire, and the like.
The instrument panel (403) is fitted with a warning indicator; when the obstacle ahead presents a safety risk, the indicator issues a warning and, while automatic avoidance proceeds, prompts the driver to take over and take further measures.
When it is judged that the obstacle ahead must be avoided, the brake lights (404) are lit at the same moment the braking device (402) is automatically engaged, warning following vehicles to take evasive action.
Turning now to Fig. 2, the principle by which the device judges obstacles by laser ranging is shown. As Fig. 2(I) shows, when the first image returned by the camera (101) of vehicle (50) indicates no obstacle ahead, the image processing module (201) further judges whether point C is safe, i.e. whether pedestrians or flammable or explosive materials are present; if not, the laser transmitter (104) emits a beam to illuminate point C. The camera (101) then captures a second image of the formed laser spot. If the spot appears at the expected image position of C, as shown in Fig. 2(III)(a), the absence of an obstacle ahead is confirmed a second time.
If the image processing module (201) misjudged the first image and an unrecognized obstacle (60) actually stands ahead of the vehicle, then by the rectilinear propagation of light the laser beam (the line from the transmitter center B to C) forms a spot E on the obstacle (60). Take a vertical cross-section X through E, perpendicular to the direction of travel; the image of E on X is shown in Fig. 2(III)(b). Again by rectilinear propagation, the original point C images on X at point D, the intersection of line AC (from C to the camera center A) with X, as shown in Fig. 2(III)(b). In the camera (101) image of Fig. 2(III)(b), C and D coincide, and E lies below C, offset by a distance x. Thus if the spot does not appear at the expected position of C but is offset by a distance x, the image processing module (201) can conclude from the spot image that an obstacle lies ahead and that the first image judgment was a misjudgment.
If the obstacle stands farther from the vehicle (50), at the position of obstacle (70) in Fig. 2(I), the laser beam (line BC) forms a spot F on the cross-section X' of obstacle (70). Likewise, in the image formed on the camera (101), F and G coincide, and the distance between F(G) and C(D) is x', as shown in Fig. 2(III)(c). From Fig. 2(II): F(G) lies below C(D), and E lies below F(G); hence the farther the imaged spot lies from the expected point C, the closer the obstacle is to the vehicle (50).
Moreover, the distance between the obstacle and the vehicle (50) can be computed from the distance between the imaged spot E and the expected point C. As in Fig. 2(II): first, from the pixel offset between E and C in the image, the actual offset x of the two points on cross-section X is obtained. Let d1 be the obstacle-to-vehicle distance, d the distance from the vehicle (50) to C, and d2 the distance from C to H, the projection of E onto the ground, so that d1 = d - d2. With y the height of the transmitter center B above the ground and α the angle between beam BC and the ground, d = y ÷ tanα; with x the distance between E and D on cross-section X and β the angle between line AC (camera center A to C) and the ground, d2 = x ÷ (tanβ - tanα). Substituting the two expressions into d1 = d - d2 finally gives the obstacle-to-vehicle distance d1 = y/tanα - x/(tanβ - tanα).
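The closing formula d1 = y/tanα - x/(tanβ - tanα) can be checked numerically. A minimal sketch, with variable names following the text and angles in radians:

```python
import math

def obstacle_distance(y, alpha, beta, x):
    """d1 = y/tan(alpha) - x/(tan(beta) - tan(alpha)), where
    y     = height of the laser transmitter centre B above the ground,
    alpha = angle of beam BC to the ground,
    beta  = angle of the camera-to-C line AC to the ground,
    x     = spot offset measured on cross-section X."""
    return y / math.tan(alpha) - x / (math.tan(beta) - math.tan(alpha))

# With x = 0 (the spot lands exactly at C) this reduces to d = y/tan(alpha),
# the full distance to the ground point C, i.e. no obstacle in between.
```

Note also that the error in d1 per unit error in x is 1/(tanβ - tanα): the larger the angular separation between the beam and the camera sight line, the smaller the range error per pixel of spot offset.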
From the formula d1 = y/tanα - x/(tanβ - tanα) it follows that the accuracy of d1 is determined by x and y. Since y is the mounting height of the laser head above the ground, a larger mounting height y yields higher measurement accuracy. x in turn depends on BA', where A' is the intersection of line AC with the vertical cross-section through B perpendicular to the direction of travel, and BA' depends on the mounting separation of the camera and the laser transmitter in the direction perpendicular to the ground. If BA' is too small, measurement accuracy degrades; hence the requirement that the laser transmitter and camera be mounted more than 10 cm apart in the direction perpendicular to the ground.
The above description uses a single laser ray for clarity of explanation. In actual use the laser may comprise multiple beams, the different beams may be emitted at different times, and the different beams may have different colors, so real situations are more complex than the example above.
Further, as in Fig. 3, consider a more complex situation: when the vehicle (50) meets a slope or uneven ground, the laser transmitter deviates from the horizontal by an angle η, which directly changes the original emission angle γ. The motion state sensor (103) is then used to determine the vehicle's actual tilt angle, which, after conversion, is fed back to the control unit (30) to correct the real-time emission angle δ. Beyond tilt, the motion state sensor (103) can also determine the vehicle's speed, acceleration, and steering state, further correcting the beam direction or the reference points used in image processing.
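The tilt correction can be sketched as follows. The sign convention for η (nose-up pitch positive) and the simple additive compensation are my assumptions; the application only states that the measured tilt is converted and fed back:

```python
import math

def corrected_emission_angle(y, d, eta):
    """Emission angle, commanded in the tilted vehicle frame, that keeps
    the beam aimed at the same ground point d metres ahead from emitter
    height y when the body pitches by eta radians (nose-up positive).
    Returns the corrected real-time angle (the text's delta)."""
    gamma = math.atan2(y, d)  # original downward angle gamma to the target
    return gamma - eta        # subtract body pitch to stay on target
```

With η = 0 the command equals the original angle γ; a nose-up pitch reduces the commanded downward angle so the beam still lands at the intended safe point.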
Fig. 4 is a flowchart of the method of one embodiment of the present application. It should be understood that although the steps in the flowchart of Fig. 4 are displayed sequentially in the order indicated by the arrows, they are not necessarily executed in that order. Unless expressly stated herein, their execution is not strictly ordered, and they may be executed in other orders. Moreover, at least some of the steps may comprise sub-steps or stages that are not necessarily completed at the same moment or executed consecutively; they may be executed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps. The collision avoidance processing flow of this embodiment repeats the processing of Fig. 4 every prescribed control cycle.
First, the camera and microwave radar collect data and forward it to the signal processing unit (S101). If the signal processing unit detects an obstacle (S102: YES), the data is passed to the control unit, which judges whether a safety hazard exists and directs the collision avoidance execution unit to perform automatic driving collision avoidance operations (S109). If no obstacle is detected (S102: NO), the image processing module confirms a safe laser illumination point (S103) and forwards its coordinates to the control unit. The control unit corrects the illumination direction from the coordinates combined with the motion state sensor data (S104) and forwards the corrected coordinates to the laser control module, which directs the laser transmitter to fire at them (S105). The camera captures the laser spot image data a second time (S106) and forwards it to the image processing module, which extracts the laser spot detection point (S107) and analyzes, from the positional offset of the spot, whether an obstacle lies ahead (S108). If an obstacle is detected (S108: YES), the data is passed to the control unit, which judges whether a safety hazard exists and directs the collision avoidance execution unit to act (S109). If no obstacle is detected (S108: NO), this cycle ends and processing returns to the start of the next cycle.
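One control cycle of Fig. 4 can be sketched as below. All component interfaces (`capture`, `scan`, `avoid`, and so on) are hypothetical stand-ins for the units described above, not names from the application:

```python
def collision_avoidance_cycle(camera, radar, image_proc, controller, laser):
    """One pass of the S101-S109 loop; returns True if avoidance was triggered."""
    frame, echoes = camera.capture(), radar.scan()    # S101: collect sensor data
    if image_proc.detect_obstacle(frame, echoes):     # S102: first obstacle check
        controller.avoid()                            # S109: hazard judged, avoid
        return True
    point = image_proc.safe_spot(frame)               # S103: pick a safe laser point
    aim = controller.correct_direction(point)         # S104: motion-state correction
    laser.fire(aim)                                   # S105: emit the beam
    spot_frame = camera.capture()                     # S106: second image capture
    spot = image_proc.extract_spot(spot_frame)        # S107: locate the laser spot
    if image_proc.spot_shifted(spot):                 # S108: offset implies obstacle
        controller.avoid()                            # S109: hazard judged, avoid
        return True
    return False                                      # no hazard; next cycle
```

The early return after the first detection mirrors the flowchart: the laser second check (S103 to S108) only runs when the camera-plus-radar pass reports no obstacle.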
Industrial Applicability
In summary, the embodiments of the present application provide a low-cost automatic driving collision avoidance device and method based on "camera + millimeter-wave radar + laser transmitter" as the main sensors, remedying the inability of a "camera + millimeter-wave radar" sensor suite to recognize static obstacles, with a simple structure, low cost, and high practical value.
The present disclosure has been described with reference to embodiments, but it should be understood that the disclosure is not limited to those embodiments and structures. The present disclosure also covers various modifications and variations within an equivalent scope. In addition, various combinations and forms, as well as other combinations and forms containing only one element, more elements, or fewer elements, also fall within the scope and spirit of the present disclosure.

Claims (9)

  1. An automatic driving collision avoidance device, the device comprising:
      a sensor unit (10), including: a laser transmitter (104) for emitting a laser beam in a specified direction, and a camera (101) for capturing an image containing the spot produced by the laser beam;
      a signal processing unit (20), including: an image processing module (201) that receives and processes the data transmitted by the camera to analyze whether an obstacle lies ahead;
      a control unit (30) that receives real-time feedback data from the signal processing unit and analyzes whether a collision risk exists;
      a collision avoidance execution unit (40) that executes the control of the control unit to perform automatic driving collision avoidance operations.
  2. The device of claim 1, wherein the camera first judges whether an obstacle lies ahead; if not, the image processing module determines a safe position that the laser beam may illuminate, and the control unit operates the laser transmitter to emit a beam in the corresponding direction, forming a spot at the safe position.
  3. The device of claim 2, wherein the unsafe positions that the image processing unit excludes from illumination include: 1) positions where the beam would directly or indirectly strike a person's face; 2) surfaces of flammable or explosive substances; 3) reflective surfaces for which the safety of the reflected beam cannot be judged; but unsafe positions are not limited to these.
  4. The device of claim 1, wherein the laser transmitter and the camera are mounted more than 10 cm apart in the direction perpendicular to the ground.
  5. The device of claim 1, wherein the laser transmitter includes but is not limited to: point-cloud laser transmitters and rotatable laser transmitters.
  6. The device of claim 1, wherein the laser transmitter can emit laser beams of several colors, sequentially or simultaneously.
  7. The device of claim 1, wherein the sensor unit further includes: a motion state sensor (103) that determines the vehicle's tilt, acceleration, and steering state and reports them to the control unit to correct deviations in the laser emission direction.
  8. The device of claim 7, wherein the motion state sensor is integrated on the control unit mainboard.
  9. A method for automatic driving collision avoidance, the collision avoidance method comprising the steps of:
      a) a signal processing unit receives data from a camera and a microwave radar mounted on the autonomous vehicle and judges whether an obstacle lies ahead;
      b) if not, an image processing module determines a safe position that the laser beam may illuminate, and a control unit receives the safe-position signal and directs the laser transmitter to emit a laser beam illuminating the safe position;
      c) the camera captures image data under laser illumination and passes it to the image processing module, which judges again, from the position of the laser spot, whether an obstacle lies ahead;
      d) throughout this process, whenever an obstacle is detected ahead, the control unit judges whether a safety hazard exists and directs a collision avoidance execution unit to perform automatic driving collision avoidance operations.
PCT/CN2021/089577 2021-04-22 2021-04-25 Automatic driving collision avoidance device and method WO2022222159A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110436589.XA CN112977393B (zh) 2021-04-22 2021-04-22 Automatic driving collision avoidance device and method
CN202110436589.X 2021-04-22

Publications (1)

Publication Number Publication Date
WO2022222159A1 true WO2022222159A1 (zh) 2022-10-27

Family

ID=76339763

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/089577 WO2022222159A1 (zh) 2021-04-22 2021-04-25 Automatic driving collision avoidance device and method

Country Status (2)

Country Link
CN (1) CN112977393B (zh)
WO (1) WO2022222159A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116567375A (zh) * 2023-04-24 2023-08-08 HoloMatic Technology (Beijing) Co., Ltd. Vehicle-mounted front-view camera integrated unit, vehicle, and vehicle speed control method

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115257615A (zh) * 2022-09-05 2022-11-01 Sichuan Tuxin Zhikong New Energy Technology Co., Ltd. Integrated lidar collision avoidance device for electric vehicles

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6285778B1 (en) * 1991-09-19 2001-09-04 Yazaki Corporation Vehicle surroundings monitor with obstacle avoidance lighting
CN202903176U (zh) * 2012-09-20 2013-04-24 Sun Bin Laser-assisted machine vision ranging device
CN104729459A (zh) * 2015-04-10 2015-06-24 Wuhan Institute of Technology ARM-Linux-based driving ranging and collision warning device
CN105292085A (zh) * 2015-11-02 2016-02-03 Suzhou Automotive Research Institute, Tsinghua University (Wujiang) Infrared-laser-assisted vehicle forward collision avoidance system
CN106291520A (zh) * 2016-07-14 2017-01-04 Jiangsu University Driver assistance system and method based on coded laser and binocular vision
CN108995584A (zh) * 2018-07-27 2018-12-14 Hefei Zhixin Automotive Technology Co., Ltd. Infrared-laser-assisted vehicle rearward collision avoidance system
CN211696386U (zh) * 2020-03-16 2020-10-16 Henan Linggou Electronic Technology Co., Ltd. Combined anti-collision vehicle-mounted millimeter-wave radar

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06290398A (ja) * 1993-03-31 1994-10-18 Mazda Motor Corp Obstacle detection device
JPH09197045A (ja) * 1996-01-24 1997-07-31 Nissan Motor Co Ltd Vehicle radar device
JP3140961B2 (ja) * 1996-04-12 2001-03-05 Mitsubishi Electric Corp Vehicle surroundings monitoring device
JP2006258457A (ja) * 2005-03-15 2006-09-28 Omron Corp Laser scanning device
JP4345783B2 (ja) * 2006-08-10 2009-10-14 Omron Corp Object detection device and method
JP2010018080A (ja) * 2008-07-09 2010-01-28 Mazda Motor Corp Vehicle driving assistance device
KR101395089B1 (ko) * 2010-10-01 2014-05-16 Andong National University Industry-Academic Cooperation Foundation Obstacle detection system and method
WO2019239566A1 (ja) * 2018-06-14 2019-12-19 Sony Corp Information processing device, information processing method, and ranging system

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116567375A (zh) * 2023-04-24 2023-08-08 HoloMatic Technology (Beijing) Co., Ltd. Vehicle-mounted front-view camera integrated unit, vehicle, and vehicle speed control method
CN116567375B (zh) * 2023-04-24 2024-02-02 HoloMatic Technology (Beijing) Co., Ltd. Vehicle-mounted front-view camera integrated unit, vehicle, and vehicle speed control method

Also Published As

Publication number Publication date
CN112977393B (zh) 2024-01-12
CN112977393A (zh) 2021-06-18

Similar Documents

Publication Publication Date Title
CN208101973U (zh) Intelligent driving assistance collision avoidance system
KR101996419B1 (ko) Sensor-fusion-based pedestrian detection and pedestrian collision avoidance device and method
US8320626B2 (en) Image processing apparatus
CN111708016B (zh) Vehicle forward collision warning method fusing millimeter-wave radar and lidar
EP1338477A2 (en) Obstacle detection device for vehicle and method thereof
CN108407715A (zh) Automobile safety warning system and method
JP6479130B1 (ja) Vehicle travel support device
US11755022B2 (en) Vehicle control device
WO2022222159A1 (zh) Automatic driving collision avoidance device and method
US10059343B2 (en) Apparatus and method for use in a vehicle
CN110673599A (zh) Sensor-network-based environment perception system for autonomous vehicles
US20170174224A1 (en) Apparatus and method for use in a vehicle
CN216374520U (zh) Driver assistance system for a vehicle, and vehicle
CN112793507A (zh) Blind-zone warning and braking system sensing a vehicle's right-turn motion characteristics via inertial devices
WO2015121260A1 (en) Apparatus and method for use in a vehicle
JPH1159355A (ja) Inter-vehicle distance warning device
US12084052B2 (en) System and method of predicting and displaying a side blind zone entry alert
CN113573965A (zh) Method for determining a moisture-induced accident risk for a vehicle
CN113022593B (zh) Obstacle handling method and apparatus, and traveling device
KR101127702B1 (ko) Integrated side/rear safety system for automobiles
CN113734207B (zh) Vehicle safety protection system and method, and vehicle
CN114690163A (zh) Vehicle recognition device, vehicle control system, vehicle recognition method, and storage medium
CN217085550U (zh) Intelligent connected automobile safety data monitoring device and system
US11951981B2 (en) Systems and methods for detecting turn indicator light signals
US20220308233A1 (en) Driver assistance system and operation method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21937387

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21937387

Country of ref document: EP

Kind code of ref document: A1