WO2023236738A1 - Vehicle automatic driving control method and apparatus, and computer storage medium - Google Patents

Vehicle automatic driving control method and apparatus, and computer storage medium

Info

Publication number
WO2023236738A1
WO2023236738A1 (application PCT/CN2023/094733, CN2023094733W)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
driving
target vehicle
information
target
Prior art date
Application number
PCT/CN2023/094733
Other languages
English (en)
French (fr)
Inventor
熊健
Original Assignee
宁波路特斯机器人有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 宁波路特斯机器人有限公司
Publication of WO2023236738A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/18 Propelling the vehicle
    • B60W30/182 Selecting between different operative modes, e.g. comfort and performance modes
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02 Estimation or calculation of such parameters related to ambient conditions
    • B60W40/06 Road conditions
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/0098 Details of control systems ensuring comfort, safety or stability not otherwise provided for
    • B60W60/00 Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001 Planning or execution of driving tasks
    • B60W60/0015 Planning or execution of driving tasks specially adapted for safety
    • B60W60/0016 Planning or execution of driving tasks specially adapted for safety of the vehicle or its occupants
    • B60W2552/00 Input parameters relating to infrastructure
    • B60W2555/00 Input parameters relating to exterior conditions, not covered by groups B60W2552/00, B60W2554/00
    • B60W2555/20 Ambient conditions, e.g. wind or rain

Definitions

  • the present application relates to the field of vehicle driving technology, and in particular to a vehicle automatic driving control method, device and computer storage medium.
  • the end-to-end automatic driving function means that, after a destination is set, the vehicle drives automatically from the starting position to the end position; the end-to-end automatic driving function fuses information from high-precision maps, satellite positioning, front-facing cameras, front radar, a 360° surround-view system, corner radar, lidar and other devices, plans a preset autonomous driving route, and at the same time links to the body systems to control the vehicle so that it keeps driving within a safe and stable range.
  • during fully automatic driving, the driver does not need to perform any action, which reduces the driver's workload.
  • this application provides a vehicle automatic driving control method, including:
  • when the driving scene recognition result indicates that the target vehicle is in a sub-extreme driving scene, the target vehicle is controlled to enter a safe automatic driving mode; the adverse driving factors present in the sub-extreme driving scene are smaller than those in an extreme driving scene and greater than those in a normal driving scene.
  • this application also provides a vehicle automatic driving control device, which includes:
  • the acquisition module is used to obtain the vehicle perception information of the target vehicle
  • the target vehicle is controlled to enter the safe automatic driving mode.
  • obtaining the vehicle perception information of the target vehicle includes:
  • Perception distance fusion is performed based on the radar perception distance and the image perception distance to determine the vehicle perception distance.
  • controlling the target vehicle to enter a safe autonomous driving mode includes:
  • the target vehicle is controlled to enter the safe automatic driving mode; the target object is an object that adversely affects automatic driving.
  • determining the lateral control moment of the target vehicle based on the positional relationship between the target vehicle and the target object includes:
  • the lateral control moment of the target vehicle is determined based on the real-time adhesion coefficient.
  • the position relationship represents the distance between the target object and the target vehicle
  • Determining the lateral control moment of the target vehicle based on the positional relationship between the target vehicle and the target object includes:
  • the lateral control torque is kept unchanged.
  • the position relationship represents the distance between the target object and the target vehicle
  • Determining the lateral control moment of the target vehicle based on the positional relationship between the target vehicle and the target object includes:
  • the lateral control torque is reduced.
  • the vehicle sensing information includes driving wind information
  • controlling the target vehicle to enter a safe autonomous driving mode includes:
  • when the driving wind information is within a preset wind level range, the target vehicle is controlled to enter the safe automatic driving mode.
  • the method further includes:
  • the traveling direction of the target vehicle is controlled to remain unchanged.
  • this application also provides an intelligent identification device.
  • the device includes a processor and a memory; at least one instruction or at least one program is stored in the memory, and the at least one instruction or the at least one program is loaded and executed by the processor to implement the above vehicle automatic driving control method.
  • the present application also provides a computer storage medium in which at least one instruction or at least one program is stored, and the at least one instruction or the at least one program is loaded and executed by a processor to implement the above vehicle automatic driving control method.
  • the vehicle automatic driving control method of this application controls the target vehicle to enter a safe automatic driving mode when the driving scene recognition result indicates that the target vehicle is in a sub-extreme driving scene; by handling sub-extreme scenes with a preset safe automatic driving mode, it improves the safety of autonomous driving in sub-extreme scenes, and it also prevents the vehicle from exiting the autonomous driving mode in sub-extreme scenes, improving the applicability of autonomous driving across driving scenes.
  • Figure 1 is a schematic flowchart of a vehicle automatic driving control method provided by an embodiment of the present application
  • Figure 2 is a schematic flowchart of obtaining vehicle sensing information in a vehicle automatic driving control method provided by an embodiment of the present application
  • Figure 3 is a schematic flowchart of lateral torque control in a vehicle automatic driving control method provided by an embodiment of the present application
  • Figure 4 is a schematic flowchart of safe automatic driving in a vehicle automatic driving control method provided by an embodiment of the present application
  • Figure 5 is a schematic structural diagram of a vehicle automatic driving control device provided by an embodiment of the present application.
  • Figure 6 is a hardware structure block diagram of a vehicle automatic driving control method provided by an embodiment of the present application.
  • references herein to "one embodiment" or "an embodiment" refer to a particular feature, structure, or characteristic that may be included in at least one implementation of the present application.
  • the orientations or positional relationships indicated by the terms "upper", "lower", "left", "right", "top", "bottom", etc. are based on the orientations or positional relationships shown in the drawings; they are only for convenience in describing the present application and simplifying the description, and do not indicate or imply that the devices or elements referred to must have a specific orientation or be constructed and operated in a specific orientation, so they should not be construed as limiting the application.
  • the terms "first" and "second" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the number of the technical features indicated; thus, features defined as "first" and "second" may explicitly or implicitly include one or more of those features. Furthermore, the terms "first", "second", etc. are used to distinguish similar objects and are not necessarily used to describe a specific order or sequence; it is to be understood that the data so used are interchangeable under appropriate circumstances, so that the embodiments of the application described herein can be practiced in sequences other than those illustrated or described herein.
  • a vehicle automatic driving control method provided by an embodiment of the present application is introduced.
  • the vehicle automatic driving control can be applied to an in-vehicle controller or a cloud vehicle controller; the method includes:
  • the driving scene recognition is to identify the current driving scene of autonomous driving;
  • the driving scene recognition result is the driving scene in which the current autonomous driving is located;
  • the driving scene recognition result falls into extreme driving scenes, sub-extreme driving scenes and normal driving scenes, and different driving scenes correspond to different autonomous driving strategies; extreme driving scenes include violent wind, torrential rain, blizzards, sandstorms and severely iced roads; sub-extreme driving scenes include strong wind, heavy rain, heavy snow, moderate dust and standing water on the road; normal driving scenes include driving scenes with clear weather and no standing water on the ground.
  • when the driving scene recognition result indicates that the target vehicle is in a sub-extreme driving scene, the target vehicle is controlled to enter a safe automatic driving mode, so that sub-extreme scenes are handled by the preset safe automatic driving mode, which improves the safety of autonomous driving in sub-extreme scenes; it also prevents the vehicle from exiting the autonomous driving mode in sub-extreme scenes, improving the applicability of autonomous driving across driving scenes.
  • the vehicle sensing information includes the vehicle sensing distance; the vehicle sensing distance refers to the limit distance that the vehicle can sense; S105 includes:
  • when the vehicle sensing distance is within 50 to 200 m, the target vehicle may be in a sub-extreme driving scene that impairs driving visibility, such as heavy fog, heavy snow, a sandstorm, or rain and mist; because the target vehicle's visibility is reduced, perception of vehicles or obstacles far ahead is impaired, so after S105 the above method further includes:
  • reducing the autonomous driving speed of the target vehicle; the autonomous driving speed refers to the driving speed of the target vehicle in autonomous driving mode; by reducing the autonomous driving speed, the target vehicle can brake and decelerate to avoid a collision when it perceives a vehicle or obstacle ahead, which improves the driving safety of the target vehicle in sub-extreme driving scenes while ensuring its normal autonomous driving in such scenes.
  • the autonomous driving speed of the target vehicle can be reduced based on a correspondence between the vehicle sensing distance and the autonomous driving speed, i.e. the vehicle sensing distance determines the autonomous driving speed of the target vehicle; for example, when the vehicle sensing distance is 200 m, the autonomous driving speed of the target vehicle may be 120 km/h, and under full braking the target vehicle can still avoid colliding with a vehicle or obstacle 200 m ahead; when the vehicle sensing distance is 100 m, the autonomous driving speed may be 100 km/h, and under full braking the target vehicle can still avoid a collision with a vehicle or obstacle 100 m ahead; when the vehicle sensing distance is 50 m, the autonomous driving speed may be 90 km/h, and under full braking the target vehicle can still avoid a collision with a vehicle or obstacle 50 m ahead. Reducing the autonomous driving speed based on this correspondence not only improves the driving safety of the target vehicle but also keeps the average autonomous driving speed as high as possible, reducing the time it takes to reach the destination.
  • the correspondence between the vehicle sensing distance and the autonomous driving speed can also be a proportional relationship: as the vehicle sensing distance varies over the preset range, the autonomous driving speed drops by 25% to 0% relative to the driving speed in a normal driving scene; for example, when the vehicle sensing distance is 100 meters, the autonomous driving speed drops by 17% relative to the normal-scene driving speed, and when the vehicle sensing distance is 50 meters, it drops by 25%.
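As a non-limiting illustration (not part of the original disclosure; all names are hypothetical), the correspondence described above can be sketched as a piecewise-linear interpolation over the example values (200 m at 120 km/h, 100 m at 100 km/h, 50 m at 90 km/h):

```python
# Known (sensing distance in m, target speed in km/h) pairs from the
# embodiment above, sorted by distance. Names are illustrative.
DISTANCE_SPEED_TABLE = [(50.0, 90.0), (100.0, 100.0), (200.0, 120.0)]

def target_speed_kmh(perception_distance_m: float) -> float:
    """Interpolate the autonomous driving speed for a fused perception distance."""
    pts = DISTANCE_SPEED_TABLE
    if perception_distance_m <= pts[0][0]:
        return pts[0][1]       # clamp at the 50 m sub-extreme floor
    if perception_distance_m >= pts[-1][0]:
        return pts[-1][1]      # at or above 200 m: normal-scene speed
    # Linear interpolation between the two bracketing table entries.
    for (d0, v0), (d1, v1) in zip(pts, pts[1:]):
        if d0 <= perception_distance_m <= d1:
            t = (perception_distance_m - d0) / (d1 - d0)
            return v0 + t * (v1 - v0)
```

With a 120 km/h normal-scene speed, this table yields roughly the 17% drop at 100 m and the 25% drop at 50 m stated above.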
  • the various radar collection information includes the radar collection information corresponding to the various radars, and the information confidence corresponding to the various radars;
  • the image collection information refers to information about the surrounding scene collected by the camera on the target vehicle;
  • radar collection information refers to the information collected by the radar on the target vehicle; among them, the radar on the target vehicle includes millimeter wave radar, ultrasonic radar, laser radar and angle radar, etc.
  • the information collected by radar is a kind of radar collection information; information confidence refers to the credibility of various radar output results.
  • S203: perform information fusion based on the radar collection information corresponding to each of the multiple radars and the information confidence corresponding to each of the multiple radars, to determine the radar sensing distance; information fusion refers to fusing the output results of the multiple radars, and the radar sensing distance refers to the distance that the radar system of the target vehicle can sense.
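The patent does not specify the fusion rule used in S203; one plausible, purely illustrative sketch (function and variable names are hypothetical) is a confidence-weighted average of the per-radar sensing distances:

```python
def fuse_radar_distances(readings):
    """Fuse per-radar sensing distances by confidence-weighted average.

    readings: list of (sensing_distance_m, confidence) pairs, one per radar,
    with confidence in [0, 1]. Returns the fused radar sensing distance.
    """
    total_conf = sum(conf for _, conf in readings)
    if total_conf == 0:
        raise ValueError("no confident radar readings to fuse")
    return sum(dist * conf for dist, conf in readings) / total_conf

# e.g. millimetre-wave radar, lidar and corner radar each contribute an estimate:
fused = fuse_radar_distances([(180.0, 0.9), (150.0, 0.6), (120.0, 0.5)])
```

A weighted mean downweights radars whose output is less credible in the current scene (e.g. lidar in heavy fog), which is consistent with the role the information confidence plays in the text.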
  • the corresponding radar sensing ranges of multiple radars can be obtained; specifically, radar detects objects around the target vehicle through electromagnetic waves of different frequencies.
  • the radar is used to detect objects around the target vehicle.
  • objects are detected by calculating the difference between the emitted and received radar waves, that is, by the Doppler effect; if the target vehicle is in a sub-extreme driving scene that impairs driving visibility, such as heavy fog, heavy snow, a sandstorm, or rain and mist, the radar waves emitted by a radar are partially absorbed, so fewer radar waves are received. If all detected targets are within 50 meters, or an object tracked by the radar can no longer be detected once it is more than 50 meters from the target vehicle, the radar sensing distance of that radar is taken to be 50 meters.
  • the target object includes objects that reduce the adhesion coefficient of the target vehicle, such as water and snow; when the target object is located in the current lane of the target vehicle, controlling the target vehicle to change lanes includes:
  • determining the lateral control torque of the target vehicle based on the positional relationship between the target vehicle and the target object; the positional relationship refers to the relative positions of the target vehicle and the perceived object, and in this case the target object is located in front of the target vehicle; the lateral control torque is the torque used to control the target vehicle when changing lanes and turning, and the smaller the lateral control torque, the smaller the probability that the target vehicle skids.
  • the method of obtaining the object that reduces the adhesion coefficient of the target vehicle includes:
  • keeping the lateral control torque unchanged when the distance between the target vehicle and the target object is greater than the preset distance allows the lane change to be completed in advance, so that the target vehicle can quickly enter a lane free of adhesion-reducing objects such as water or snow, which improves the intelligence of autonomous driving and its safety.
  • reducing the lateral control torque when the distance between the target object and the target vehicle is less than or equal to the preset distance lowers the possibility that the target vehicle skids during lane changes or turns after entering a lane containing water, snow or other adhesion-reducing objects, which improves vehicle stability and therefore the safety of autonomous driving.
  • determining the lateral control torque of the target vehicle also includes:
  • the real-time wheel speed information refers to the current rotation speed of the wheels of the target vehicle
  • the real-time driving torque information refers to the torque currently driving the target vehicle to drive automatically.
  • S303 Perform force analysis based on real-time wheel speed information and real-time driving torque information to obtain real-time wheel force information; force analysis means that the driving force of the tire can be obtained after calculation based on real-time wheel speed information and real-time driving torque information.
  • the driving force is equal to the ground friction; specifically, if the real-time driving torque is x N·m and the tire circumference is about y m, the force at the wheel is x/y N, so the ground friction is x/y N, from which the ground adhesion coefficient is obtained.
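A minimal sketch of this estimate (not from the patent; names are hypothetical). One common formulation divides the drive torque by the effective wheel radius to get the longitudinal force at the contact patch, used here in place of the tire-circumference approximation above; assuming the driving force balances ground friction, the adhesion coefficient follows from the preset vehicle weight:

```python
G = 9.81  # gravitational acceleration, m/s^2

def estimate_adhesion_coefficient(drive_torque_nm: float,
                                  wheel_radius_m: float,
                                  vehicle_mass_kg: float) -> float:
    """Estimate the real-time ground adhesion coefficient from drive torque."""
    ground_friction_n = drive_torque_nm / wheel_radius_m  # F = T / r
    return ground_friction_n / (vehicle_mass_kg * G)      # mu = F / (m * g)
```

This is a steady-state sketch; it holds only when the wheels are not slipping and the vehicle is not accelerating strongly, which matches the text's assumption that driving force equals ground friction.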
  • S307. Determine the lateral control moment of the target vehicle based on the real-time adhesion coefficient.
  • the lateral control torque of the target vehicle can be controlled based on the real-time adhesion coefficient and its ratio of decrease relative to the adhesion coefficient in a normal driving scene; by monitoring the real-time adhesion coefficient, it is ensured that during autonomous driving the target vehicle can be promptly controlled in accordance with changes in the adhesion coefficient, improving the intelligence of autonomous driving and its safety.
  • in normal driving scenes, the ground adhesion coefficient is approximately 0.7 to 1, and in extreme driving scenes it is 0.3 or below.
  • when the adhesion coefficient is 0.3 to 0.7, the vehicle is in a sub-extreme driving scene.
  • the lateral control torque decreases by 40% to 0% with the ground adhesion coefficient; specifically, when the ground adhesion coefficient drops to 0.5, the lateral control torque decreases by 20% relative to the lateral control torque in the normal driving scene.
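The derating rule above (no reduction at an adhesion coefficient of 0.7, a 40% reduction at 0.3, and 20% at 0.5) is consistent with linear interpolation; a hypothetical sketch, with illustrative names:

```python
MU_NORMAL, MU_EXTREME = 0.7, 0.3  # adhesion-coefficient bounds from the text
MAX_REDUCTION = 0.40              # 40% lateral-torque reduction at mu = 0.3

def lateral_torque_scale(mu: float) -> float:
    """Factor applied to the normal-scene lateral control torque for a given mu."""
    if mu >= MU_NORMAL:
        return 1.0                      # normal scene: torque unchanged
    if mu <= MU_EXTREME:
        return 1.0 - MAX_REDUCTION      # extreme scene: full 40% reduction
    # Sub-extreme scene: interpolate linearly between the two bounds.
    reduction = MAX_REDUCTION * (MU_NORMAL - mu) / (MU_NORMAL - MU_EXTREME)
    return 1.0 - reduction
```

At mu = 0.5 this returns 0.8, i.e. the 20% reduction stated above.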
  • the vehicle sensing information includes driving wind information, which refers to the wind level in the driving scene of the target vehicle and can be obtained through the vehicle sensing system or from a weather database;
  • S105 includes:
  • when the driving wind information is within the preset wind level range, the target vehicle is controlled to enter the safe automatic driving mode; the preset wind level range means a wind level greater than that of normal driving scenes and smaller than that of extreme driving scenes; specifically, the wind level in normal driving scenes is 0 to 5 and in extreme driving scenes is 9 to 17, so the preset wind level range is 6 to 8.
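The wind-level thresholds above (0 to 5 normal, 6 to 8 sub-extreme, 9 to 17 extreme) can be sketched as a simple classifier; the function name and return labels are illustrative, not from the patent:

```python
def classify_by_wind(beaufort_level: int) -> str:
    """Classify the driving scene from the wind level (Beaufort scale, 0-17)."""
    if not 0 <= beaufort_level <= 17:
        raise ValueError("wind level must be in 0..17")
    if beaufort_level <= 5:
        return "normal"
    if beaufort_level <= 8:
        return "sub-extreme"   # triggers the safe automatic driving mode
    return "extreme"           # issue a takeover request / exit autonomous mode
```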
  • a takeover request instruction is issued; the takeover request instruction refers to an instruction requesting the user to take over control of the vehicle.
  • the acquisition module 1001 is used to acquire the vehicle sensing information of the target vehicle
  • the safe driving module 3001 is used to control the target vehicle to enter a safe automatic driving mode when the driving scene recognition result indicates that the target vehicle is in a sub-extreme driving scene; the adverse driving factors in the sub-extreme driving scene are smaller than those in an extreme driving scene and greater than those in a normal driving scene.
  • Vehicle sensing information includes vehicle sensing distance, sensing objects and driving wind information;
  • the safe driving module includes:
  • the first safety unit is used to control the target vehicle to enter a safe automatic driving mode when the vehicle's sensing distance is within the preset sensing range.
  • the second safety unit is used to control the target vehicle to enter a safe autonomous driving mode when the sensing object is the target object.
  • the lane changing module is used to control the target vehicle to change lanes when the target object is located in the current lane of the target vehicle.
  • the wind direction acquisition module is used to obtain wind direction information;
  • the driving direction control module is used to keep the driving direction of the target vehicle unchanged based on the wind direction information.
  • An information comprehensive acquisition unit is used to acquire image collection information and various radar collection information;
  • the various radar collection information includes radar collection information corresponding to various radars, and information confidence levels corresponding to various radars;
  • the lane changing module includes:
  • the torque reduction subunit is used to reduce the lateral control torque when the distance between the target object and the target vehicle is less than or equal to the preset distance.
  • the electronic device 900 may vary greatly due to different configurations or performance, and may include one or more central processing units (CPU) 910 (the processor 910 may include, but is not limited to, a processing device such as a microprocessor (MCU) or a programmable logic device (FPGA)), a memory 930 for storing data, and one or more storage media 920 (for example, one or more mass storage devices) for storing application programs 923 or data 922. The memory 930 and the storage medium 920 may be short-term storage or persistent storage.
  • the program stored in the storage medium 920 may include one or more modules, and each module may include a series of instruction operations in the electronic device.
  • the central processor 910 may be configured to communicate with the storage medium 920 and execute a series of instruction operations in the storage medium 920 on the electronic device 900 .
  • the electronic device 900 may also include one or more power supplies 960, one or more wired or wireless network interfaces 950, one or more input and output interfaces 940, and/or one or more operating systems 921, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
  • Traffic Control Systems (AREA)

Abstract

A vehicle automatic driving control method and apparatus, and a computer storage medium. The control method includes: acquiring vehicle perception information of a target vehicle; performing driving scene recognition based on the vehicle perception information to obtain a driving scene recognition result; and, when the driving scene recognition result indicates that the target vehicle is in a sub-extreme driving scene, controlling the target vehicle to enter a safe automatic driving mode. The adverse driving factors present in the sub-extreme driving scene are smaller than those in an extreme driving scene and greater than those in a normal driving scene.

Description

Vehicle automatic driving control method and apparatus, and computer storage medium
Technical field
This application relates to the field of vehicle driving technology, and in particular to a vehicle automatic driving control method and apparatus, and a computer storage medium.
Background art
The end-to-end automatic driving function means that, after a destination is set, the vehicle drives automatically from the starting position to the destination; the end-to-end automatic driving function fuses information from high-precision maps, satellite positioning, front-facing cameras, front radar, a 360° surround-view system, corner radar, lidar and other devices to plan a preset autonomous driving route, and at the same time links to the body systems to keep the vehicle driving within a safe and stable range; during fully automatic driving, the driver does not need to perform any operation, which reduces the driver's workload.
The perception module of the end-to-end automatic driving function has powerful perception capabilities and can accurately identify things around the vehicle, including the type of object, its motion state, and its position relative to the vehicle; it can also automatically identify the current autonomous driving environment. In the prior art, however, when the vehicle is in an extreme scene it requests the driver to take over and exits autonomous driving, where extreme scenes include torrential rain, blizzards, sandstorms and severely iced roads. In actual driving, most driving scenes encountered are sub-extreme scenes, which include heavy rain, heavy snow, moderate dust and standing water on the road; the prior art provides no autonomous driving method for sub-extreme scenes.
Summary of the invention
In view of the above problems in the prior art, the purpose of this application is to handle sub-extreme scenes through a preset safe automatic driving mode, thereby improving the safety of autonomous driving of vehicles in sub-extreme scenes and the applicability of autonomous driving across driving scenes.
To solve the above problems, this application provides a vehicle automatic driving control method, including:
acquiring vehicle perception information of a target vehicle;
performing driving scene recognition based on the vehicle perception information to obtain a driving scene recognition result;
when the driving scene recognition result indicates that the target vehicle is in a sub-extreme driving scene, controlling the target vehicle to enter a safe automatic driving mode, where the adverse driving factors present in the sub-extreme driving scene are smaller than those in an extreme driving scene and greater than those in a normal driving scene.
In another aspect, this application also provides a vehicle automatic driving control apparatus, including:
an acquisition module, configured to acquire vehicle perception information of a target vehicle;
a scene recognition module, configured to perform driving scene recognition based on the vehicle perception information to obtain a driving scene recognition result;
a safe driving module, configured to control the target vehicle to enter a safe automatic driving mode when the driving scene recognition result indicates that the target vehicle is in a sub-extreme driving scene, where the adverse driving factors present in the sub-extreme driving scene are smaller than those in an extreme driving scene and greater than those in a normal driving scene.
In an embodiment of this application, the vehicle perception information includes a vehicle perception distance;
controlling the target vehicle to enter the safe automatic driving mode when the driving scene recognition result indicates that the target vehicle is in a sub-extreme driving scene includes:
controlling the target vehicle to enter the safe automatic driving mode when the vehicle perception distance is within a preset perception range.
In an embodiment of this application, after the target vehicle is controlled to enter the safe automatic driving mode when the vehicle perception distance is within the preset perception range, the method further includes:
reducing the autonomous driving speed of the target vehicle.
In an embodiment of this application, acquiring the vehicle perception information of the target vehicle includes:
acquiring image collection information and multiple kinds of radar collection information, where the multiple kinds of radar collection information include the radar collection information corresponding to each of multiple radars and the information confidence corresponding to each of the multiple radars;
performing information fusion based on the radar collection information corresponding to each of the multiple radars and the information confidence corresponding to each of the multiple radars, to determine a radar perception distance;
performing pixel recognition on the image collection information to determine an image perception distance;
performing perception distance fusion based on the radar perception distance and the image perception distance to determine the vehicle perception distance.
In an embodiment of this application, the vehicle perception information includes a perceived object;
controlling the target vehicle to enter the safe automatic driving mode when the driving scene recognition result indicates that the target vehicle is in a sub-extreme driving scene includes:
controlling the target vehicle to enter the safe automatic driving mode when the perceived object is a target object, where the target object is an object that adversely affects autonomous driving.
In an embodiment of this application, after the target vehicle is controlled to enter the safe automatic driving mode when the perceived object is the target object, the method further includes:
controlling the target vehicle to change lanes when the target object is located in the current lane of the target vehicle.
In an embodiment of this application, the target object includes an object that reduces the adhesion coefficient of the target vehicle, and controlling the target vehicle to change lanes when the target object is located in the current lane of the target vehicle includes:
determining a lateral control torque of the target vehicle based on the positional relationship between the target vehicle and the target object;
controlling the target vehicle to change lanes based on the lateral control torque.
In an embodiment of this application, determining the lateral control torque of the target vehicle based on the positional relationship between the target vehicle and the target object includes:
acquiring real-time wheel speed information and real-time driving torque information of the target vehicle;
performing force analysis based on the real-time wheel speed information and the real-time driving torque information to obtain real-time wheel force information;
performing adhesion coefficient analysis based on the real-time force information and a preset vehicle weight to obtain a real-time adhesion coefficient;
determining the lateral control torque of the target vehicle based on the real-time adhesion coefficient.
In an embodiment of this application, the positional relationship represents the distance between the target object and the target vehicle;
determining the lateral control torque of the target vehicle based on the positional relationship between the target vehicle and the target object includes:
keeping the lateral control torque unchanged when the distance between the target object and the target vehicle is greater than a preset distance.
In another embodiment of this application, the positional relationship represents the distance between the target object and the target vehicle;
determining the lateral control torque of the target vehicle based on the positional relationship between the target vehicle and the target object includes:
reducing the lateral control torque when the distance between the target object and the target vehicle is less than or equal to the preset distance.
In an embodiment of this application, the vehicle perception information includes driving wind information;
controlling the target vehicle to enter the safe automatic driving mode when the driving scene recognition result indicates that the target vehicle is in a sub-extreme driving scene includes:
controlling the target vehicle to enter the safe automatic driving mode when the driving wind information is within a preset wind level range.
In an embodiment of this application, after the target vehicle is controlled to enter the safe automatic driving mode when the driving wind information is within the preset wind level range, the method further includes:
reducing the autonomous driving speed of the target vehicle;
acquiring wind direction information;
keeping the driving direction of the target vehicle unchanged based on the wind direction information.
In another aspect, this application also provides an intelligent recognition device, which includes a processor and a memory, where at least one instruction or at least one program is stored in the memory, and the at least one instruction or the at least one program is loaded and executed by the processor to implement the above vehicle automatic driving control method.
In another aspect, this application also provides a computer storage medium, in which at least one instruction or at least one program is stored, and the at least one instruction or the at least one program is loaded and executed by a processor to implement the above vehicle automatic driving control method.
Owing to the above technical solutions, the vehicle automatic driving control method of this application has the following beneficial effects:
The vehicle automatic driving control method of this application controls the target vehicle to enter a safe automatic driving mode when the driving scene recognition result indicates that the target vehicle is in a sub-extreme driving scene, thereby handling sub-extreme scenes with the preset safe automatic driving mode and improving the safety of autonomous driving of vehicles in sub-extreme scenes; it also prevents the vehicle from exiting the autonomous driving mode in sub-extreme scenes, improving the applicability of autonomous driving across driving scenes.
附图说明
为了更清楚地说明本申请的技术方案,下面将对实施例或现有技术描述中所需要使用的附图作简单的介绍。显而易见地,下面描述中的附图仅仅是本申请的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其它附图。
图1是本申请实施例提供的一种车辆自动驾驶控制方法的流程示意图;
图2是本申请实施例提供的一种车辆自动驾驶控制方法中获取车辆感知信息的流程示意图;
图3是本申请实施例提供的一种车辆自动驾驶控制方法中横向力矩控制的流程示意图;
图4是本申请实施例提供的一种车辆自动驾驶控制方法中安全自动驾驶的流程示意图;
图5是本申请实施例提供的一种车辆自动驾驶控制装置的结构示意图;
图6是本申请实施例提供的一种车辆自动驾驶控制方法的硬件结构框图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例仅仅是本申请一部分实施例,而不是全部的实施例。基于本申请中的实施例,本领域普通技术人员在没有做出创造性劳动的前提下所获得的所有其他实施例,都属于本申请保护的范围。
此处所称的“一个实施例”或“实施例”是指可包含于本申请至少一个实现方式中的特定特征、结构或特性。在本申请的描述中,需要理解的是,术语“上”、“下”、“左”、“右”、“顶”、“底”等指示的方位或位置关系为基于附图所示的方位或位置关系,仅是为了便于描述本申请和简化描述,而不是指示或暗示所指的装置或元件必须具有特定的方位、以特定的方位构造和操作,因此不能理解为对本申请的限制。此外,术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征。而且,术语“第一”、“第二”等是用于区别类似的对象,而不必用于描述特定的顺序或先后次序。应该理解这样使用的数据在适当情况下可以互换,以便这里描述的本申请的实施例能够以除了在这里图示或描述的那些以外的顺序实施。
结合图1,介绍本申请实施例提供的一种车辆自动驾驶控制方法,车辆自动驾驶控制可以应用于车内控制器,也可以应用于云端车辆控制器;该方法包括:
S101、获取目标车辆的车辆感知信息;车辆感知信息是指车辆基于感知模块对车辆周围的环境、周围的物体进行感知后的信息;具体的,车辆感知信息包括车辆感知距离、感知对象和行驶风力信息等信息。
S103、基于车辆感知信息进行驾驶场景识别,得到驾驶场景识别结果;驾驶场景识别即是对当前自动驾驶的驾驶场景进行识别;驾驶场景识别结果即当前自动驾驶所处的驾驶场景;驾驶场景识别结果包括极端驾驶场景、次极端驾驶场景和正常驾驶场景,不同驾驶场景对应不同的自动驾驶策略;极端驾驶场景包括狂风、暴雨、暴雪、沙尘暴和路面严重结冰等场景;次极端驾驶场景包括大风、大雨、大雪、中度沙尘和路面积水等场景;正常驾驶场景包括天气晴朗和地面无积水的驾驶场景。
S105、在驾驶场景识别结果表征目标车辆处于次极端驾驶场景的情况下,控制目标车辆进入安全自动驾驶模式;次极端驾驶场景中存在的驾驶不利因素小于极端驾驶场景中的驾驶不利因素,大于正常驾驶场景中的驾驶不利因素;安全自动驾驶模式是指在次极端驾驶场景下控制目标车辆安全自动驾驶的模式;不利因素是指对于车辆安全驾驶具有负面影响的因素,例如:路面附着系数低,车辆容易打滑;空气中存在迷雾或雨雾,车辆视野较差。
在本申请实施例中,通过在驾驶场景识别结果表征目标车辆处于次极端驾驶场景的情况下,控制目标车辆进入安全自动驾驶模式,从而通过预设安全自动驾驶模式应对次极端场景,提高了车辆在次极端场景自动驾驶的安全性;也避免了车辆在次极端场景退出自动驾驶模式,提高了自动驾驶的驾驶场景适用性。
在本申请实施例中,车辆感知信息包括车辆感知距离;车辆感知距离是指车辆能够感知的极限距离;S105包括:
在车辆感知距离在预设感知范围内的情况下,控制目标车辆进入安全自动驾驶模式;预设感知范围是指车辆感知距离小于正常驾驶场景下的车辆感知距离,且大于极端驾驶场景下的车辆感知距离的范围;例如,在车辆感知距离大于200m的情况下,目标车辆处于正常驾驶场景;车辆感知距离小于50m的情况下,目标车辆处于极端驾驶场景;则预设感知范围为50~200m,即在车辆感知距离在50~200m内的情况下,目标车辆处于次极端驾驶场景。
在本申请实施例中,在车辆感知距离在50~200m内的情况下,目标车辆可能处于大雾、大雪、沙尘暴和雨雾等影响驾驶视野的次极端驾驶场景,由于目标车辆的驾驶视野减小,导致前方远距离的车辆或障碍物的感知出现障碍,因此S105之后,上述方法还包括:
降低目标车辆的自动驾驶车速;自动驾驶车速是指目标车辆处于自动驾驶模式下的行驶车速;通过降低目标车辆的自动驾驶车速,在车辆感知到前方存在车辆或障碍物的情况下,可以通过制动减速以避免碰撞,从而提高目标车辆在次极端驾驶场景下的驾驶安全性;同时保障车辆在次极端场景下的正常自动驾驶。
在本申请具体实施例中,可以是基于车辆感知距离与自动驾驶车速的对应关系,降低目标车辆的自动驾驶车速;即车辆感知距离决定目标车辆的自动驾驶车速;例如,在车辆感知距离在200m的情况下,目标车辆的自动驾驶车速可以是120km/h,此时在全力制动的情况下,依然可以避免目标车辆与前方200m的车辆或障碍物相撞;在车辆感知距离在100m的情况下,目标车辆的自动驾驶车速可以是100km/h,此时在全力制动的情况下,依然可以避免目标车辆与前方100m的车辆或障碍物相撞;在车辆感知距离在50m的情况下,目标车辆的自动驾驶车速可以是90km/h,此时在全力制动的情况下,依然可以避免目标车辆与前方50m的车辆或障碍物相撞;基于车辆感知距离与自动驾驶车速的对应关系,降低目标车辆的自动驾驶车速,既提高了目标车辆的驾驶安全性,又提高了自动驾驶的平均车速,从而降低自动驾驶到达目的地的时长。
在本申请具体实施例中,车辆感知距离与自动驾驶车速的对应关系可以是车辆感知距离每下降5m,自动驾驶车速降低1km/h;也可以是车辆感知距离每下降5m,自动驾驶车速降低1.5km/h;具体数值在此不做过多限定,具体数值需要基于目标车辆的制动压力进行计算获取。
在本申请另一具体实施例中,车辆感知距离与自动驾驶车速的对应关系还可以是比例关系,例如,在车辆感知距离在50~200m内的情况下,自动驾驶车速相比正常驾驶场景中的驾驶速度下降0%~25%,具体的,在车辆感知距离为100米的情况下,自动驾驶车速相比正常驾驶场景中的驾驶速度下降17%;在车辆感知距离为50米的情况下,自动驾驶车速相比正常驾驶场景中的驾驶速度下降25%。
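上述“感知距离-车速”对应关系可以用如下Python草图示意;其中200m、50m的阈值与线性插值方式取自本说明书示例,函数名与正常车速默认值为本示意所作的假设,并非本申请限定的实现:

```python
def autonomous_speed(perceived_distance_m, normal_speed_kmh=120.0):
    """根据车辆感知距离确定自动驾驶车速(示意性草图).

    假设: 感知距离不小于200m视为正常驾驶场景, 车速不降低;
    感知距离在50~200m内时, 车速按25%~0%线性降低;
    感知距离小于50m视为极端驾驶场景, 由上层逻辑请求驾驶员接管.
    """
    if perceived_distance_m >= 200.0:
        return normal_speed_kmh
    if perceived_distance_m < 50.0:
        raise ValueError("极端驾驶场景: 应请求驾驶员接管并退出自动驾驶")
    # 感知距离50m时降低25%, 200m时降低0%, 之间线性插值
    reduction = 0.25 * (200.0 - perceived_distance_m) / 150.0
    return normal_speed_kmh * (1.0 - reduction)
```

例如,正常车速为120km/h时,感知距离100m约对应100km/h、感知距离50m对应90km/h,与说明书中的示例一致。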
在本申请实施例中,参考附图2,S101包括:
S201、获取图像采集信息和多种雷达采集信息;多种雷达采集信息包括多种雷达各自对应的雷达采集信息,以及多种雷达各自对应的信息置信度;图像采集信息是指目标车辆上的摄像头采集到的周围场景的信息;雷达采集信息是指目标车辆上的雷达采集到的信息;其中,目标车辆上的雷达包括毫米波雷达、超声波雷达、激光雷达和角雷达等多种雷达,一种雷达采集的信息即为一种雷达采集信息;信息置信度是指各种雷达输出结果的可信度。
S203、基于多种雷达各自对应的雷达采集信息,以及多种雷达各自对应的信息置信度进行信息融合,确定雷达感知距离;信息融合是指对多种雷达采集信息输出的结果进行融合,雷达感知距离是指目标车辆中雷达系统能够感知到的距离。
S205、对图像采集信息进行像素点识别,确定图像感知距离;像素点识别是指对采集到的图像上的像素点进行分析识别。
S207、基于雷达感知距离和图像感知距离进行感知距离融合,确定车辆感知距离。感知距离融合是指将雷达感知距离和图像感知距离进行融合的过程;具体的,可以是取雷达感知距离和图像感知距离的交集,也可以是取雷达感知距离和图像感知距离的并集;基于雷达感知距离和图像感知距离进行感知距离融合,确定车辆感知距离,可以确保车辆感知距离的准确性,从而提高自动驾驶的智能性。
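S207中的感知距离融合可以概括为如下草图;取交集(较小值)或并集(较大值)的策略选择以及函数命名均为示例性假设:

```python
def fuse_perception_distance(radar_distance_m, image_distance_m, conservative=True):
    """融合雷达感知距离与图像感知距离, 确定车辆感知距离(示意).

    conservative=True 时取两者交集(较小值), 偏向安全;
    conservative=False 时取两者并集(较大值), 偏向感知范围.
    """
    if conservative:
        return min(radar_distance_m, image_distance_m)
    return max(radar_distance_m, image_distance_m)
```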
在本申请实施例中,基于多种雷达各自对应的雷达采集信息,可以得到多种雷达各自对应的雷达感知距离;具体的,雷达是通过不同频率电磁波对目标车辆周围的物体进行探测,利用雷达发出和接收的雷达波的差值进行计算探测,即多普勒效应;若目标车辆处于大雾、大雪、沙尘暴和雨雾等影响驾驶视野的次极端驾驶场景,空气中的水汽或尘粒会吸收雷达发射出的雷达波,从而导致雷达接收的雷达波数量变少;在雷达探测到的探测目标均处于50米以内,或雷达追踪的物体在距离目标车辆50米后便无法继续探测的情况下,则认为雷达的雷达感知距离为50米。
在本申请实施例中,可以是选取置信度高的雷达输出结果作为雷达感知距离,也可以是基于置信度对多种雷达输出结果进行加权计算后得到雷达感知距离。
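上述两种雷达融合策略——选取置信度最高的输出,或按置信度加权计算——可以用如下草图示意;数据结构与函数名均为假设:

```python
def fuse_radar_distances(radar_readings):
    """基于置信度对多种雷达的感知距离加权融合(示意).

    radar_readings: [(感知距离/m, 置信度0~1), ...],
    例如毫米波雷达、超声波雷达、激光雷达和角雷达各自的输出.
    """
    total_weight = sum(conf for _, conf in radar_readings)
    if total_weight == 0:
        raise ValueError("所有雷达置信度为0, 无法融合")
    return sum(dist * conf for dist, conf in radar_readings) / total_weight


def best_radar_distance(radar_readings):
    """选取置信度最高的雷达输出作为雷达感知距离(示意)."""
    return max(radar_readings, key=lambda reading: reading[1])[0]
```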
在本申请实施例中,摄像头通过光学元器件获取识别目标的像素点,具体的识别目标可以是车尾灯,在识别出对称的车尾灯的情况下,可以等同于识别出对应车辆。
在本申请实施例中,可以采用200万像素的摄像头对识别目标的轮廓或部分进行识别,以确定图像感知距离;其中,识别目标距离目标车辆约为200m,具体的,在识别出识别目标的像素点为1000个以上的情况下,可以确定图像感知距离大于200m;在识别目标的像素点在250~1000的情况下,可以确定图像感知距离在50~200m的范围内;在识别目标的像素点少于250个的情况下,则确定图像感知距离在50m以内;其中,造成像素点减少的原因可以是大雾、大雪、沙尘暴和雨雾等影响驾驶视野的次极端驾驶场景。
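基于像素点数量估计图像感知距离区间的逻辑可以概括为如下草图;其中的像素点阈值取自本说明书示例(假设以200万像素摄像头识别约200m处的目标),函数名为假设:

```python
def image_distance_range(pixel_count):
    """根据识别目标的像素点数量估计图像感知距离区间(示意)."""
    if pixel_count >= 1000:
        return "大于200m"  # 视野良好, 对应正常驾驶场景
    if pixel_count >= 250:
        return "50~200m"  # 视野受影响, 对应次极端驾驶场景
    return "50m以内"  # 视野严重受阻, 对应极端驾驶场景
```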
在本申请实施例中,车辆感知信息包括感知对象;感知对象是指目标车辆周围的对象;S105包括:
在感知对象为目标对象的情况下,控制目标车辆进入安全自动驾驶模式;目标对象为对自动驾驶产生不利影响的对象;例如,地面积水和路面石块等。
在本申请实施例中,在感知对象为目标对象的情况下,上述方法还包括:
在目标对象位于目标车辆所处的当前车道的情况下,控制目标车辆变道;具体的,可以是车辆前方具有石块、积水或积雪,可以提前控制目标车辆变道至没有石块或积水的车道;若车辆正处于积水或积雪的车道上,可以将车辆变换至没有积水或积雪的车道上。
在本申请实施例中,目标对象包括降低目标车辆附着系数的对象,降低目标车辆附着系数的对象包括积水和积雪;在目标对象位于目标车辆所处的当前车道的情况下,控制目标车辆变道包括:
1、基于目标车辆与目标对象的位置关系,确定目标车辆的横向控制力矩;位置关系是指目标车辆与感知对象的位置情况,此时目标对象必然位于目标车辆前方;横向控制力矩是用于控制目标车辆变道转向时的力矩,横向控制力矩越小,目标车辆打滑的概率越小。
2、基于横向控制力矩控制目标车辆变道;通过目标车辆与目标对象的位置关系确定使目标车辆变道的横向控制力矩,进而在保障自动驾驶快速行进的过程中,提高自动驾驶的安全性。
在本申请实施例中,获取降低目标车辆附着系数的对象的方法包括:
获取图像采集信息;
基于图像采集信息的区域色差分析感知对象位置。
在本申请实施例中,位置关系表征目标对象与目标车辆的距离;基于目标车辆与目标对象的位置关系,确定目标车辆的横向控制力矩包括:
在目标对象与目标车辆的距离大于预设距离的情况下,保持横向控制力矩不变;通过在目标车辆与目标对象的距离大于预设距离的情况下,保持横向控制力矩不变,从而可以实现提前变道,从而快速使目标车辆提前进入不存在积水或积雪等降低目标车辆附着系数对象的车道上,提高了自动驾驶的智能性,以及提高了自动驾驶的安全性。
在本申请另一实施例中,在目标对象与目标车辆的距离小于或等于预设距离的情况下,降低横向控制力矩;通过在目标车辆与目标对象的距离小于或等于预设距离的情况下,降低横向控制力矩,从而在目标车辆进入积水或积雪等降低目标车辆附着系数对象的车道后,降低目标车辆在变道或拐弯过程中的打滑可能性,提高车辆稳定性,从而提高自动驾驶的安全性。
在本申请实施例中,若目标车辆已经处于积水或积雪的车道,且其他车道也都有积水或积雪,则降低横向控制力矩。
在本申请实施例中,参考附图3,基于目标车辆与目标对象的位置关系,确定目标车辆的横向控制力矩还包括:
S301、获取目标车辆的实时轮速信息和实时驱动力矩信息;实时轮速信息是指目标车辆的车轮当前的转速,实时驱动力矩信息是指当前驱动目标车辆自动行驶的力矩。
S303、基于实时轮速信息和实时驱动力矩信息进行受力分析,得到车轮的实时受力信息;受力分析是指基于实时轮速信息和实时驱动力矩信息进行计算后可以得到轮胎的驱动力,驱动力等于地面摩擦力;具体的,实时驱动力矩为x N·m,车轮半径约为y m,那么车轮受力为x/y N,即可以得到地面摩擦力为x/y N,进而得到地面附着系数。
S305、基于实时受力信息和预设车辆重量进行附着系数分析,得到实时附着系数;附着系数分析即基于实时受力信息和预设车辆重量进行计算获得附着系数的过程。
S307、基于实时附着系数确定目标车辆的横向控制力矩。具体的,可以是基于实时附着系数与正常驾驶场景下的附着系数的下降比例控制目标车辆的横向控制力矩;通过对实时附着系数的监测,保障了自动驾驶过程中的目标车辆可以快速基于变化的附着系数做出对应控制,从而提高了自动驾驶的智能性,以及提高了自动驾驶的安全性。
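S301至S305中的受力分析与附着系数分析可以用如下草图示意;其中假设车轮纯滚动、驱动力全部由地面摩擦提供(地面摩擦力等于驱动力矩除以车轮半径),函数名与参数均为示例性假设:

```python
G = 9.8  # 重力加速度, m/s^2


def estimate_adhesion_coefficient(drive_torque_nm, wheel_radius_m, vehicle_mass_kg):
    """基于实时驱动力矩和预设车辆重量估计地面附着系数(示意).

    地面摩擦力 F = 驱动力矩 / 车轮半径;
    附着系数 μ ≈ F / (预设车辆重量 m * g).
    实际实现中还需结合实时轮速信息判断车轮是否打滑.
    """
    friction_force_n = drive_torque_nm / wheel_radius_m
    return friction_force_n / (vehicle_mass_kg * G)
```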
在本申请具体实施例中,在正常驾驶场景中,地面附着系数约为0.7~1;极端驾驶场景中,地面附着系数低于0.3;在附着系数为0.3~0.7的情况下,车辆处于次极端驾驶场景中,横向控制力矩会随地面附着系数的降低而下降0%~40%;具体的,在地面附着系数降低至0.5的情况下,横向控制力矩相对于正常驾驶场景中的横向控制力矩下降20%。
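上述横向控制力矩随附着系数下降的比例关系可以用如下草图示意;0.7、0.3的阈值与40%~0%的线性降低取自本说明书示例,函数名为假设:

```python
def lateral_torque(normal_torque_nm, adhesion_coefficient):
    """基于实时附着系数确定目标车辆的横向控制力矩(示意).

    附着系数不低于0.7时保持正常力矩;
    在0.3~0.7内, 力矩按40%~0%线性降低;
    低于0.3视为极端驾驶场景, 由上层逻辑请求接管.
    """
    if adhesion_coefficient >= 0.7:
        return normal_torque_nm
    if adhesion_coefficient < 0.3:
        raise ValueError("极端驾驶场景: 应请求驾驶员接管")
    # 附着系数0.5时力矩降低20%, 与说明书示例一致
    reduction = 0.4 * (0.7 - adhesion_coefficient) / (0.7 - 0.3)
    return normal_torque_nm * (1.0 - reduction)
```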
在本申请实施例中,车辆感知信息包括行驶风力信息,行驶风力信息是指目标车辆驾驶场景中的风力等级,可以通过车辆感知系统获取,也可以从天气数据库中获取;S105包括:
在行驶风力信息在预设风力级别范围内的情况下,控制目标车辆进入安全自动驾驶模式;预设风力级别范围是指风力级别大于正常驾驶场景下的风力级别,且小于极端驾驶场景下的风力级别;具体的,正常驾驶场景下的风力等级为0~5级;极端驾驶场景下为9~17级;预设风力级别范围是指6~8级。
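按风力等级识别驾驶场景的划分(正常0~5级、次极端6~8级、极端9~17级)可以概括为如下草图;返回的场景标识为本示意假设的枚举值:

```python
def classify_wind_scene(wind_level):
    """根据行驶风力信息(风力等级)识别驾驶场景(示意)."""
    if wind_level <= 5:
        return "normal"  # 正常驾驶场景
    if wind_level <= 8:
        return "sub_extreme"  # 次极端驾驶场景: 进入安全自动驾驶模式
    return "extreme"  # 极端驾驶场景: 请求驾驶员接管
```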
在行驶风力信息在预设风力级别范围内的情况下,控制目标车辆进入安全自动驾驶模式之后,参考附图4,上述方法还包括:
S401、降低目标车辆的自动驾驶车速;通过降低目标车辆的自动驾驶车速,以提高目标车辆在强风中的行驶稳定性,从而提高自动驾驶的安全性。
S403、获取风力方向信息;
S405、基于风力方向信息,控制目标车辆的行驶方向不变;具体的,在风力方向信息显示强风从东南吹往西北方向,且目标车辆由南向北行驶的情况下,适当增加车辆左后部分的控制力矩,从而控制目标车辆的行驶方向不变;基于风力方向信息,控制目标车辆行驶方向不变,从而提高了自动驾驶的智能性,以及提高了自动驾驶的安全性。
在本申请实施例中,控制方法还包括:
在驾驶场景识别结果表征目标车辆处于极端驾驶场景的情况下,发出请求接管指令;请求接管指令是指请求用户接管车辆控制权的指令。
退出自动驾驶模式。
在本申请实施例中,通过在极端驾驶场景下请求用户接管并退出自动驾驶模式,从而提高了车辆处于极端场景中的安全性。
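综合上述各实施例,按驾驶场景识别结果分发控制策略的整体逻辑可以概括为如下草图;其中的场景标识与返回的动作列表均为示例性假设:

```python
def handle_scene(scene):
    """根据驾驶场景识别结果分发自动驾驶控制策略(示意).

    scene: "normal" / "sub_extreme" / "extreme".
    """
    if scene == "extreme":
        # 极端驾驶场景: 请求用户接管并退出自动驾驶模式
        return ["发出请求接管指令", "退出自动驾驶模式"]
    if scene == "sub_extreme":
        # 次极端驾驶场景: 进入预设的安全自动驾驶模式
        return ["进入安全自动驾驶模式"]
    return ["保持正常自动驾驶"]
```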
结合附图5,介绍本申请实施例提供的一种车辆自动驾驶控制装置,该装置包括:
获取模块1001,用于获取目标车辆的车辆感知信息;
场景识别模块2001,用于基于车辆感知信息进行驾驶场景识别,得到驾驶场景识别结果;
安全驾驶模块3001,用于在驾驶场景识别结果表征目标车辆处于次极端驾驶场景的情况下,控制目标车辆进入安全自动驾驶模式;次极端驾驶场景中存在的驾驶不利因素小于极端驾驶场景中的驾驶不利因素,大于正常驾驶场景中的驾驶不利因素。
车辆感知信息包括车辆感知距离、感知对象和行驶风力信息;安全驾驶模块包括:
第一安全单元,用于在车辆感知距离在预设感知范围内的情况下,控制目标车辆进入安全自动驾驶模式。
第二安全单元,用于在感知对象为目标对象的情况下,控制目标车辆进入安全自动驾驶模式。
第三安全单元,用于在行驶风力信息在预设风力级别范围内的情况下,控制目标车辆进入安全自动驾驶模式。
在本申请实施例中,车辆自动驾驶控制装置还包括:
降速模块,用于降低目标车辆的自动驾驶车速。
变道模块,用于在目标对象位于目标车辆所处的当前车道的情况下,控制目标车辆变道。
风力方向获取模块,用于获取风力方向信息;
行驶方向控制模块,用于基于风力方向信息,控制目标车辆的行驶方向不变。
在本申请实施例中,获取模块包括:
信息综合获取单元,用于获取图像采集信息和多种雷达采集信息;多种雷达采集信息包括多种雷达各自对应的雷达采集信息,以及多种雷达各自对应的信息置信度;
雷达信息融合单元,用于基于多种雷达各自对应的雷达采集信息,以及多种雷达各自对应的信息置信度进行信息融合,确定雷达感知距离;
像素点识别单元,用于对图像采集信息进行像素点识别,确定图像感知距离;
感知距离融合单元,用于基于雷达感知距离和图像感知距离进行感知距离融合,确定车辆感知距离。
在本申请实施例中,变道模块包括:
横向力矩控制单元,用于基于目标车辆与目标对象的位置关系,确定目标车辆的横向控制力矩。
在本申请实施例中,横向力矩控制单元包括:
信息获取子单元,用于获取目标车辆的实时轮速信息和实时驱动力矩信息;
受力分析子单元,用于基于实时轮速信息和实时驱动力矩信息进行受力分析,得到车轮的实时受力信息;
附着系数分析子单元,用于基于实时受力信息和预设车辆重量进行附着系数分析,得到实时附着系数;
力矩控制子单元,用于基于实时附着系数确定目标车辆的横向控制力矩。
在本申请实施例中,位置关系表征目标对象与目标车辆的距离,横向力矩控制单元包括:
力矩保持子单元,用于在目标对象与目标车辆的距离大于预设距离的情况下,保持横向控制力矩不变。
力矩降低子单元,用于在目标对象与目标车辆的距离小于或等于预设距离的情况下,降低横向控制力矩。
本申请实施例还提供一种智能识别设备,该智能识别设备包括处理器和存储器,存储器中存储有至少一条指令或至少一段程序,至少一条指令或至少一段程序由处理器加载并执行以实现如上述的车辆自动驾驶控制方法。
存储器可用于存储软件程序以及模块,处理器通过运行存储在存储器的软件程序以及模块,从而执行各种功能应用以及数据处理。存储器可主要包括存储程序区和存储数据区,其中,存储程序区可存储操作系统、功能所需的应用程序等;存储数据区可存储根据设备的使用所创建的数据等。此外,存储器可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个硬盘存储器件、闪存器件或其他非易失性固态存储器件。相应的,存储器还可以包括存储器控制器,以提供处理器对存储器的访问。
本申请实施例所提供的方法实施例可以在移动终端、计算机终端、服务器或者类似的运算装置等电子设备中执行。图6是本申请实施例提供的一种车辆自动驾驶控制方法的硬件结构框图。如图6所示,该电子设备900可因配置或性能不同而产生比较大的差异,可以包括一个或一个以上中央处理器(Central Processing Units,CPU)910(处理器910可以包括但不限于微处理器MCU或可编程逻辑器件FPGA等的处理装置)、用于存储数据的存储器930,一个或一个以上存储应用程序923或数据922的存储介质920(例如一个或一个以上海量存储设备)。其中,存储器930和存储介质920可以是短暂存储或持久存储。存储在存储介质920的程序可以包括一个或一个以上模块,每个模块可以包括对电子设备中的一系列指令操作。更进一步地,中央处理器910可以设置为与存储介质920通信,在电子设备900上执行存储介质920中的一系列指令操作。电子设备900还可以包括一个或一个以上电源960,一个或一个以上有线或无线网络接口950,一个或一个以上输入输出接口940,和/或,一个或一个以上操作系统921,例如Windows ServerTM,Mac OS XTM,UnixTM,LinuxTM,FreeBSDTM等等。
输入输出接口940可以用于经由一个网络接收或者发送数据。上述的网络具体实例可包括电子设备900的通信供应商提供的无线网络。在一个实例中,输入输出接口940包括一个网络适配器(Network Interface Controller,NIC),其可通过基站与其他网络设备相连从而可与互联网进行通讯。在一个实例中,输入输出接口940可以为射频(Radio Frequency,RF)模块,其用于通过无线方式与互联网进行通讯。
本领域普通技术人员可以理解,图6所示的结构仅为示意,其并不对上述电子装置的结构造成限定。例如,电子设备900还可包括比图6中所示更多或者更少的组件,或者具有与图6所示不同的配置。
本申请的实施例还提供一种存储介质,该存储介质中存储有至少一条指令或至少一段程序,至少一条指令或至少一段程序由处理器加载并执行以实现如上述的车辆自动驾驶控制方法。
上述说明已经充分揭露了本申请的具体实施方式。需要指出的是,熟悉该领域的技术人员对本申请的具体实施方式所做的任何改动均不脱离本申请的权利要求书的范围。相应地,本申请的权利要求的范围也并不仅仅局限于前述具体实施方式。

Claims (15)

  1. 一种车辆自动驾驶控制方法,其特征在于,包括:
    获取目标车辆的车辆感知信息;
    基于所述车辆感知信息进行驾驶场景识别,得到驾驶场景识别结果;
    在所述驾驶场景识别结果表征所述目标车辆处于次极端驾驶场景的情况下,控制所述目标车辆进入安全自动驾驶模式;所述次极端驾驶场景中存在的驾驶不利因素小于极端驾驶场景中的驾驶不利因素,大于正常驾驶场景中的驾驶不利因素。
  2. 根据权利要求1所述的一种车辆自动驾驶控制方法,其特征在于,所述车辆感知信息包括车辆感知距离;
    所述在所述驾驶场景识别结果表征所述目标车辆处于次极端驾驶场景的情况下,控制所述目标车辆进入安全自动驾驶模式包括:
    在所述车辆感知距离在预设感知范围内的情况下,控制所述目标车辆进入所述安全自动驾驶模式。
  3. 根据权利要求2所述的一种车辆自动驾驶控制方法,其特征在于,所述在所述车辆感知距离在预设感知范围内的情况下,控制所述目标车辆进入所述安全自动驾驶模式之后,所述方法还包括:
    降低所述目标车辆的自动驾驶车速。
  4. 根据权利要求2所述的一种车辆自动驾驶控制方法,其特征在于,所述获取目标车辆的车辆感知信息包括:
    获取图像采集信息和多种雷达采集信息;所述多种雷达采集信息包括多种雷达各自对应的雷达采集信息,以及所述多种雷达各自对应的信息置信度;
    基于所述多种雷达各自对应的雷达采集信息,以及所述多种雷达各自对应的信息置信度进行信息融合,确定雷达感知距离;
    对所述图像采集信息进行像素点识别,确定图像感知距离;
    基于所述雷达感知距离和所述图像感知距离进行感知距离融合,确定所述车辆感知距离。
  5. 根据权利要求1所述的一种车辆自动驾驶控制方法,其特征在于,所述车辆感知信息包括感知对象;
    所述在所述驾驶场景识别结果表征所述目标车辆处于次极端驾驶场景的情况下,控制所述目标车辆进入安全自动驾驶模式包括:
    在所述感知对象为目标对象的情况下,控制所述目标车辆进入所述安全自动驾驶模式;所述目标对象为对自动驾驶产生不利影响的对象。
  6. 根据权利要求5所述的一种车辆自动驾驶控制方法,其特征在于,所述在所述感知对象为目标对象的情况下,控制所述目标车辆进入所述安全自动驾驶模式之后,所述方法还包括:
    在所述目标对象位于所述目标车辆所处的当前车道的情况下,控制所述目标车辆变道。
  7. 根据权利要求6所述的一种车辆自动驾驶控制方法,其特征在于,所述目标对象包括降低所述目标车辆附着系数的对象,所述在所述目标对象位于所述目标车辆所处的当前车道的情况下,控制所述目标车辆变道包括:
    基于所述目标车辆与所述目标对象的位置关系,确定所述目标车辆的横向控制力矩;
    基于所述横向控制力矩控制所述目标车辆变道。
  8. 根据权利要求7所述的一种车辆自动驾驶控制方法,其特征在于,所述基于所述目标车辆与所述目标对象的位置关系,确定所述目标车辆的横向控制力矩包括:
    获取所述目标车辆的实时轮速信息和实时驱动力矩信息;
    基于所述实时轮速信息和所述实时驱动力矩信息进行受力分析,得到车轮的实时受力信息;
    基于所述实时受力信息和预设车辆重量进行附着系数分析,得到所述实时附着系数;
    基于所述实时附着系数确定所述目标车辆的横向控制力矩。
  9. 根据权利要求7所述的一种车辆自动驾驶控制方法,其特征在于,所述位置关系表征所述目标对象与所述目标车辆的距离;
    所述基于所述目标车辆与所述目标对象的位置关系,确定所述目标车辆的横向控制力矩包括:
    在所述目标对象与所述目标车辆的距离大于预设距离的情况下,保持所述横向控制力矩不变。
  10. 根据权利要求7所述的一种车辆自动驾驶控制方法,其特征在于,所述位置关系表征所述目标对象与所述目标车辆的距离;
    所述基于所述目标车辆与所述目标对象的位置关系,确定所述目标车辆的横向控制力矩包括:
    在所述目标对象与所述目标车辆的距离小于或等于预设距离的情况下,降低所述横向控制力矩。
  11. 根据权利要求1所述的一种车辆自动驾驶控制方法,其特征在于,所述车辆感知信息包括行驶风力信息;
    所述在所述驾驶场景识别结果表征所述目标车辆处于次极端驾驶场景的情况下,控制所述目标车辆进入安全自动驾驶模式包括:
    在所述行驶风力信息在预设风力级别范围内的情况下,控制所述目标车辆进入所述安全自动驾驶模式。
  12. 根据权利要求11所述的一种车辆自动驾驶控制方法,其特征在于,所述在所述行驶风力信息在预设风力级别范围内的情况下,控制所述目标车辆进入所述安全自动驾驶模式之后,所述方法还包括:
    降低所述目标车辆的自动驾驶车速;
    获取风力方向信息;
    基于所述风力方向信息,控制所述目标车辆的行驶方向不变。
  13. 一种车辆自动驾驶控制装置,其特征在于,包括:
    获取模块,用于获取目标车辆的车辆感知信息;
    场景识别模块,用于基于所述车辆感知信息进行驾驶场景识别,得到驾驶场景识别结果;
    安全驾驶模块,用于在所述驾驶场景识别结果表征所述目标车辆处于次极端驾驶场景的情况下,控制所述目标车辆进入安全自动驾驶模式;所述次极端驾驶场景中存在的驾驶不利因素小于极端驾驶场景中的驾驶不利因素,大于正常驾驶场景中的驾驶不利因素。
  14. 一种智能识别设备,其特征在于,所述设备包括处理器和存储器,所述存储器中存储有至少一条指令或至少一段程序,所述至少一条指令或所述至少一段程序由所述处理器加载并执行以实现如权利要求1-12任一所述的车辆自动驾驶控制方法。
  15. 一种计算机存储介质,其特征在于,所述存储介质中存储有至少一条指令、至少一段程序、代码集或指令集,所述至少一条指令、至少一段程序、代码集或指令集由处理器加载并执行以实现如权利要求1-12任一所述的车辆自动驾驶控制方法。
PCT/CN2023/094733 2022-06-10 2023-05-17 一种车辆自动驾驶控制方法、装置和计算机存储介质 WO2023236738A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210657661.6A CN117246335A (zh) 2022-06-10 2022-06-10 一种车辆自动驾驶控制方法、装置和计算机存储介质
CN202210657661.6 2022-06-10

Publications (1)

Publication Number Publication Date
WO2023236738A1 true WO2023236738A1 (zh) 2023-12-14

Family

ID=89117525

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/094733 WO2023236738A1 (zh) 2022-06-10 2023-05-17 一种车辆自动驾驶控制方法、装置和计算机存储介质

Country Status (2)

Country Link
CN (1) CN117246335A (zh)
WO (1) WO2023236738A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102107466B1 (ko) * 2018-12-14 2020-05-07 국민대학교산학협력단 자율주행차량 기반의 주행제어장치 및 방법
CN113276882A (zh) * 2021-04-22 2021-08-20 清华大学苏州汽车研究院(相城) 自动驾驶车辆控制方法、控制系统及目标速度的计算方法
CN113291293A (zh) * 2021-04-25 2021-08-24 宁波均联智行科技股份有限公司 基于车身稳定控制驾驶模式的方法及系统
CN113348119A (zh) * 2020-04-02 2021-09-03 华为技术有限公司 一种车辆盲区识别方法、自动驾驶辅助系统以及包括该系统的智能驾驶车辆
CN114407894A (zh) * 2020-09-25 2022-04-29 阿波罗智能技术(北京)有限公司 车辆控制方法、装置、电子设备及存储介质
CN114475597A (zh) * 2022-02-28 2022-05-13 东风汽车集团股份有限公司 一种自动驾驶车辆跟车距离控制方法及控制系统

Also Published As

Publication number Publication date
CN117246335A (zh) 2023-12-19

Similar Documents

Publication Publication Date Title
CN108375775B (zh) 车载探测设备及其参数的调整方法、介质、探测系统
EP2921362B1 (en) Vehicle, vehicle system and method for increasing safety and/or comfort during autonomous driving
US9469307B2 (en) Vehicle, vehicle system and method for increasing safety and/or comfort during autonomous driving
US10006779B2 (en) Transmission necessity determination apparatus and route planning system
WO2022156276A1 (zh) 一种目标检测方法及装置
US11402842B2 (en) Method to define safe drivable area for automated driving system
WO2019218861A1 (zh) 一种行车道路的估计方法以及行车道路估计系统
US20150153184A1 (en) System and method for dynamically focusing vehicle sensors
CN103874931B (zh) 用于求取车辆的环境中的对象的位置的方法和设备
US11167751B2 (en) Fail-operational architecture with functional safety monitors for automated driving system
CN112347206A (zh) 地图更新方法、装置及存储介质
US20190293435A1 (en) Host vehicle position estimation device
US11631257B2 (en) Surroundings recognition device, and surroundings recognition method
US11016489B2 (en) Method to dynamically determine vehicle effective sensor coverage for autonomous driving application
CN114442101B (zh) 基于成像毫米波雷达的车辆导航方法、装置、设备及介质
CN111781933A (zh) 一种基于边缘计算和空间智能的高速自动驾驶车辆实现系统及方法
US11814081B2 (en) Control system, control method, vehicle, and computer-readable storage medium
JP2021169235A (ja) 車両の走行支援装置
JP2018116404A (ja) 車両制御システム
US20200386897A1 (en) Method for the Satellite-Supported Determination of a Position of a Vehicle
CN114724398A (zh) 基于自动驾驶的预约泊车方法、系统及可读存储介质
US11790781B2 (en) Method and system for detecting slip accident using mobile device
WO2022133939A1 (zh) 驾驶控制方法、装置、汽车及计算机可读存储介质
WO2023236738A1 (zh) 一种车辆自动驾驶控制方法、装置和计算机存储介质
WO2022160127A1 (zh) 控制方法和装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23818903

Country of ref document: EP

Kind code of ref document: A1