CN112512887B - Driving decision selection method and device - Google Patents


Info

Publication number
CN112512887B
CN112512887B (application CN202080004262.9A)
Authority
CN
China
Prior art keywords
range
vehicle
distance
confidence
interval
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202080004262.9A
Other languages
Chinese (zh)
Other versions
CN112512887A (en)
Inventor
李帅君
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Publication of CN112512887A publication Critical patent/CN112512887A/en
Application granted granted Critical
Publication of CN112512887B publication Critical patent/CN112512887B/en
Legal status: Active

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00: Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/08: Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095: Predicting travel path or likelihood of collision
    • B60W30/0956: Predicting travel path or likelihood of collision, the prediction being responsive to traffic or environmental parameters
    • B60W30/09: Taking automatic action to avoid collision, e.g. braking and steering
    • B60W60/00: Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001: Planning or execution of driving tasks
    • B60W60/0015: Planning or execution of driving tasks specially adapted for safety

Abstract

Embodiments of this application provide a driving decision selection method and device in the fields of intelligent vehicles, connected vehicles, and autonomous driving. They expand the perception range available when selecting a driving decision and planning a driving path for a vehicle, improving the vehicle's driving safety and stability and the user experience. The method includes: acquiring perception information collected by a sensor configured on a vehicle; selecting, from at least one range interval, a first range interval matching the perception information, where the at least one range interval is obtained by dividing the detection range of the sensor based on the sensor's output information and each range interval corresponds to at least one driving decision; and selecting a driving decision for the vehicle from the at least one driving decision corresponding to the first range interval in combination with the vehicle's speed, and controlling the vehicle to drive according to that decision.

Description

Driving decision selection method and device
Technical Field
The application relates to the field of automobiles, in particular to a driving decision selection method and device.
Background
Decision-making and planning are important components of autonomous driving technology, covering tasks such as global path planning, behavior decision-making, and local path planning. Currently, an upstream perception module outputs position information about the environment and key target objects, and a decision and planning module generates control targets for the ego vehicle, such as a reference path or reference trajectory, which are output to the downstream control stage to complete closed-loop control of the vehicle.
However, driving routes are usually planned based on deterministic perception information, that is, perception information whose confidence exceeds a certain threshold, which limits the usable perception range.
Disclosure of Invention
Embodiments of this application provide a driving decision selection method and device, which expand the perception range available when selecting a driving decision and planning a driving path for a vehicle, improving the vehicle's driving safety and stability and the user experience.
In view of the above, in a first aspect, this application provides a driving decision selection method, including: first, acquiring perception information collected by a sensor configured on a vehicle; then, selecting a first range interval from at least one range interval according to the perception information, where the at least one range interval is obtained by dividing the detection range of the sensor based on the sensor's output information, the output information includes at least one of a distance result for a detected target object, a confidence corresponding to the distance result, or the relationship between the distance result and its confidence, and each range interval corresponds to at least one driving decision; and finally, selecting a driving decision for the vehicle from the at least one driving decision corresponding to the first range interval in combination with the vehicle's speed, and controlling the vehicle to drive according to that decision.
In an embodiment of this application, before the driving decision is selected, the detection range of the sensor is divided into at least one range interval, each with a corresponding distance range. When selecting the driving decision, the at least one driving decision corresponding to each range interval, such as accelerating, decelerating, maintaining speed, or changing lanes, can be used to choose the vehicle's driving decision, and the vehicle is controlled accordingly. Generally, the farther an object is from the sensor, the lower the confidence of its detection. Therefore, in contrast to planning a driving route using only high-confidence perception information, the method provided here partitions the sensor's detection range so that the resulting range intervals cover it entirely. Even if the distance in the perception information is long or its confidence is low, it can still be used to select a driving decision, so the vehicle can avoid distant obstacles in advance. This enlarges the perception range available when determining the driving decision, improves driving safety, and improves the user experience. Avoiding distant obstacles early also allows decisions such as acceleration or deceleration to be made in advance, preventing harsh acceleration or braking caused by reacting only at short range, so the vehicle drives more smoothly.
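As a concrete illustration of the steps above, the following sketch partitions a detection range into range intervals and matches a detected distance to one of them. The interval bounds, decision names, and the speed threshold are all illustrative assumptions, not values from the patent.

```python
from dataclasses import dataclass

@dataclass
class RangeInterval:
    """One slice of the sensor's detection range (bounds in metres)."""
    d_min: float
    d_max: float
    decisions: tuple  # candidate driving decisions for this interval

# Illustrative partition of an assumed 200 m detection range.
INTERVALS = [
    RangeInterval(0.0, 50.0, ("brake", "decelerate")),
    RangeInterval(50.0, 120.0, ("decelerate", "keep_speed", "change_lane")),
    RangeInterval(120.0, 200.0, ("keep_speed", "accelerate")),
]

def match_interval(distance: float) -> RangeInterval:
    """Select the range interval whose distance range contains `distance`."""
    for interval in INTERVALS:
        if interval.d_min <= distance < interval.d_max:
            return interval
    raise ValueError(f"{distance} m is outside the sensor's detection range")

def select_decision(distance: float, ego_speed: float) -> str:
    """Pick one decision from the matched interval, refined by ego speed.
    Illustrative rule: at high speed prefer the most cautious decision."""
    interval = match_interval(distance)
    return interval.decisions[0] if ego_speed > 20.0 else interval.decisions[-1]
```

The refinement rule stands in for "combining the vehicle speed of the vehicle"; the patent does not specify the rule itself.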
In a possible implementation, each range interval corresponds to a distance range, and the perception information includes a first distance between an obstacle detected by the sensor and the vehicle. Selecting the first range interval matching the perception information may include: matching the first distance against the distance ranges of the at least one range interval, determining that the first distance falls within the distance range of the first range interval, and thereby screening out the first range interval.
In this embodiment, each range interval has corresponding driving decisions, so the range interval matching the distance between the vehicle and an object can be selected according to that distance, and the vehicle's driving decision can be chosen from the one or more decisions of that interval. The driving decision can thus be selected, and the driving path planned, based on a larger perception range.
In a possible embodiment, each range interval corresponds to a confidence range, the confidence ranges of the at least one range interval cover the confidences of information detected by the sensor within its detection range, and the perception information further includes a first confidence indicating the accuracy of the first distance. Selecting the first range interval matching the perception information may include: matching the first confidence against the confidence ranges of the at least one range interval, determining that the first confidence falls within the confidence range of the first range interval, and thereby screening out the first range interval.
In this embodiment, the range interval matching the perception information may be selected based on the confidence it includes, and the vehicle's driving decision chosen from that interval's driving decisions. A driving decision can thus be selected even in low-confidence scenarios. Selecting a decision based on a longer distance, or a distance with lower confidence, lets the vehicle handle distant obstacles in advance, for example decelerating or changing lanes early, improving driving safety.
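Confidence-based matching can be sketched the same way; the confidence bins and decision names below are illustrative assumptions, not values from the patent.

```python
# Illustrative confidence bins: higher confidence corresponds to nearer,
# more reliably detected obstacles; low-confidence bins remain usable.
CONFIDENCE_INTERVALS = [
    (0.8, 1.01, ("keep_speed",)),             # near, certain detections
    (0.5, 0.8, ("decelerate", "keep_speed")),
    (0.1, 0.5, ("decelerate",)),              # far, uncertain: ease off early
]

def match_by_confidence(first_confidence: float) -> tuple:
    """Select the decisions of the range interval whose confidence range
    contains the first confidence reported with the perception information."""
    for c_min, c_max, decisions in CONFIDENCE_INTERVALS:
        if c_min <= first_confidence < c_max:
            return decisions
    raise ValueError("confidence outside the sensor's detectable range")
```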
In one possible embodiment, the first confidence included in the perception information is obtained from the detection range of the sensor and the distance between the sensor and the obstacle. Since the sensor is usually mounted on the vehicle, the distance between the sensor and the obstacle is approximately the distance between the vehicle and the obstacle. Generally, the farther the obstacle is from the vehicle, the lower the first confidence, and the closer it is, the higher the first confidence.
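One simple way to realize this dependence is a linear fall-off clamped to [0, 1]. This formula is purely an assumed model for illustration; the patent only states the monotonic trend, not a specific function.

```python
def first_confidence(distance: float, detection_range: float) -> float:
    """Assumed model: confidence decreases linearly with the distance
    between the sensor and the obstacle, clamped to [0, 1]."""
    if distance < 0 or detection_range <= 0:
        raise ValueError("distance must be >= 0 and detection range > 0")
    return max(0.0, 1.0 - distance / detection_range)
```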
In one possible embodiment, before obtaining the perception information, the driving decision selection method provided by the present application further includes: and dividing the detection range of the sensor to obtain at least one distance range, wherein the at least one distance range corresponds to the at least one range interval one to one.
Therefore, in this embodiment, the detection range of the sensor may be divided to obtain the distance range corresponding to each range interval, and the distance ranges of the at least one range interval together cover the detection range of the sensor, which is equivalent to enlarging the perception range available when selecting a driving decision.

In a possible embodiment, before the perception information is obtained, dividing the detection range of the sensor to obtain at least one distance range may include: dividing the confidence range detectable by the sensor within its detection range to obtain at least one confidence range, where the confidence ranges of the at least one range interval cover the confidences of information detected by the sensor in its detection range; and then calculating the distance range corresponding to each confidence range according to the relationship between the distance result and its confidence, obtaining at least one distance range.
In this embodiment, the confidence range detectable by the sensor is divided into at least one confidence range, each with a corresponding distance range, and each range interval is provided with at least one driving decision. When a driving decision is selected, it can therefore be chosen according to the distance and confidence in the perception information, and the vehicle's driving path planned accordingly. Even if the distance in the perception information is long or its confidence low, the path can be planned in advance, which is equivalent to enlarging the usable perception range, improving the vehicle's safety and stability, and improving the user experience. Compared with planning the driving route using only perception information whose confidence exceeds a fixed threshold, the range intervals here are planned in advance, so subsequent path planning can use a larger perception range, select a suitable driving decision for an obstacle ahead early, and thereby plan a safer and more stable route quickly.
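The confidence-first division described above can be sketched as follows, assuming the distance-confidence relationship is invertible; the linear model in the example is purely illustrative.

```python
def confidence_bins_to_distance_ranges(conf_edges, conf_to_dist):
    """Divide the detectable confidence range at `conf_edges` (listed from
    highest to lowest) and map each confidence bin to a distance range via
    `conf_to_dist`, the inverse of the distance-confidence relationship."""
    ranges = []
    for c_hi, c_lo in zip(conf_edges, conf_edges[1:]):
        # Higher confidence corresponds to the nearer distance bound.
        ranges.append((conf_to_dist(c_hi), conf_to_dist(c_lo)))
    return ranges

# Example with an assumed linear model c(d) = 1 - d/200, so d(c) = 200*(1 - c).
dist_ranges = confidence_bins_to_distance_ranges(
    [1.0, 0.75, 0.5, 0.0], lambda c: 200.0 * (1.0 - c))
# dist_ranges == [(0.0, 50.0), (50.0, 100.0), (100.0, 200.0)]
```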
In a possible implementation, at least one confidence range in one-to-one correspondence with the at least one distance range is determined according to the relationship between the distance result and its confidence, and the at least one confidence range is used to screen, from the at least one range interval, the one matching the perception information.
In a possible implementation, before the perception information is obtained, the driving decision selection method provided by this application may further include: dividing the detection range of the sensor to obtain at least one distance range, so that the at least one distance range covers the detection range of the sensor; and then obtaining, according to the at least one distance range and the relationship between the distance result and its confidence, at least one confidence range in one-to-one correspondence with the distance ranges, where each range interval corresponds to one confidence range and one distance range.
In this embodiment, the distance ranges can be divided first, and the confidence range corresponding to each distance range then obtained according to the relationship between the distance result and its confidence, so that each range interval has a corresponding distance range and confidence range. A driving decision can subsequently be selected according to either distance or confidence, which enlarges the perception range available when selecting the decision, improves the vehicle's safety and stability, and improves the user experience.
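The distance-first variant, mirroring the sketch for the confidence-first division, might look like this; the edge values and the linear relationship are illustrative assumptions.

```python
def distance_bins_to_confidence_ranges(dist_edges, dist_to_conf):
    """Divide the detection range at `dist_edges` (ascending) and attach to
    each distance bin its confidence range via the statistically obtained
    distance-to-confidence relationship `dist_to_conf`."""
    intervals = []
    for d_lo, d_hi in zip(dist_edges, dist_edges[1:]):
        intervals.append({
            "distance": (d_lo, d_hi),
            # Confidence decreases with distance, so the far bound gives
            # the lower confidence limit.
            "confidence": (dist_to_conf(d_hi), dist_to_conf(d_lo)),
        })
    return intervals
```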
In a possible implementation, if the at least one range interval is obtained by dividing the detection range according to the distance results output by the sensor, the first distance included in the perception information may be matched directly against the distance range of each range interval, and the matching range interval screened out.
In another possible implementation, if the at least one range interval is obtained by dividing according to the confidences corresponding to the distance results output by the sensor, with each range interval corresponding to one confidence range, the first confidence included in the perception information may be matched against the confidence range of each range interval to screen out the matching range interval.
In another possible implementation, if the at least one range interval is divided according to distance results while the perception information includes the first confidence and the relationship between the distance result and its confidence, the first distance corresponding to the first confidence may be calculated from that relationship, and then matched against the distance range of each range interval to obtain the matching range interval.
In one possible embodiment, the driving decision selection method provided by this application may further include: acquiring historical distance information collected by the sensor and its corresponding confidences; and obtaining, from the historical distance information and corresponding confidences, the relationship between the distance result and its confidence.
In this embodiment, the relationship between the distance result in the sensor's output information and its corresponding confidence can be derived statistically from the historical distance information and confidences collected by the sensor, enabling the subsequent division of range intervals and the determination of each interval's confidence range and distance range.
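One way to obtain such a relationship from logged data is an ordinary least-squares fit. The linear form is an assumption for illustration; the patent does not prescribe a model.

```python
def fit_distance_confidence(history):
    """Least-squares fit of confidence = a*distance + b from historical
    (distance, confidence) samples collected by the sensor."""
    n = len(history)
    sx = sum(d for d, _ in history)
    sy = sum(c for _, c in history)
    sxx = sum(d * d for d, _ in history)
    sxy = sum(d * c for d, c in history)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Samples lying on c = 1 - d/200 recover the slope and intercept.
a, b = fit_distance_confidence([(0.0, 1.0), (100.0, 0.5), (200.0, 0.0)])
```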
In one possible implementation, when selecting the vehicle's driving decision, the relative speed of the vehicle and the obstacle can be calculated from the vehicle's speed and the obstacle's speed, where the obstacle's speed can be computed from perception information collected over a period of time; the vehicle's driving decision is then screened from the at least one driving decision of the first range interval in combination with the relative speed.
In this embodiment, combining the relative speed between the vehicle and the obstacle when selecting the driving decision allows the decision to be chosen more accurately, further improving driving safety.
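The relative-speed refinement might be sketched as below. Estimating the obstacle's motion from timed distance samples follows the idea stated above, while the 5 m/s threshold and decision names are illustrative assumptions.

```python
def relative_speed(distance_samples):
    """Relative closing speed of ego vehicle and obstacle, estimated from
    (timestamp, distance) perception samples collected over a short window.
    Positive means the gap is shrinking."""
    (t0, d0), (t1, d1) = distance_samples[0], distance_samples[-1]
    return (d0 - d1) / (t1 - t0)

def pick_decision(rel_speed, decisions):
    """Illustrative rule: take the most cautious decision (first in the
    tuple) when closing on the obstacle quickly."""
    return decisions[0] if rel_speed > 5.0 else decisions[-1]
```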
In one possible embodiment, the at least one driving decision for each range interval is determined according to the application scenario, which may include, but is not limited to, automatic cruising, car following, or automatic parking.
In a second aspect, the present application provides a driving decision selection device having the function of implementing the driving decision selection method of the first aspect. The function can be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the function described above.
In a third aspect, the present application provides a driving decision selection method, including: acquiring at least one range interval, where each range interval corresponds to a confidence range and a distance range, the distance ranges of the at least one range interval cover the detection range of the sensor, and the confidence ranges cover the confidences of information detected by the sensor within that detection range; and then setting at least one driving decision corresponding to each range interval, where each range interval and its corresponding driving decisions are used to select the vehicle's driving decision, which in turn is used to generate the vehicle's driving path.
In this embodiment, the detection range of the sensor is divided into one or more distance ranges, each with a corresponding confidence range, so that one range interval corresponds to one distance range and one confidence range, and each range interval corresponds to at least one driving decision. When selecting a driving decision, matching can be performed by confidence or by distance, the decisions of each range interval are set according to fixed rules, such as accelerating, decelerating, maintaining speed, or changing lanes, and the vehicle is controlled accordingly. Generally, the farther an object is from the sensor, the lower the detection confidence. With the method provided here, a perceived longer distance can still be used to select a driving decision, that is, a decision can be selected for a more distant object, so the vehicle can avoid distant obstacles in advance, improving driving safety and the user experience. Avoiding distant obstacles early also allows decisions such as acceleration or deceleration to be made in advance, making the driving process smoother.
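Since each range interval here carries both a distance range and a confidence range, matching can use whichever quantity the perception information provides. A sketch with an illustrative interval table (bounds and decision names assumed, not from the patent):

```python
DUAL_INTERVALS = [
    {"distance": (0.0, 50.0), "confidence": (0.75, 1.01),
     "decisions": ("keep_speed",)},
    {"distance": (50.0, 200.0), "confidence": (0.0, 0.75),
     "decisions": ("decelerate", "change_lane")},
]

def match_either(intervals, distance=None, confidence=None):
    """Match a range interval by distance or by confidence, whichever is
    available in the perception information."""
    for iv in intervals:
        d_lo, d_hi = iv["distance"]
        c_lo, c_hi = iv["confidence"]
        if distance is not None and d_lo <= distance < d_hi:
            return iv
        if confidence is not None and c_lo <= confidence < c_hi:
            return iv
    return None  # outside the sensor's detection range
```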
In a possible implementation, obtaining the at least one range interval may include: first dividing the detection range of the sensor to obtain at least one distance range, and then calculating the confidence range corresponding to each distance range according to the relationship between the distance result in the sensor's output information and its confidence, where one range interval corresponds to one confidence range and one distance range.
Therefore, in this embodiment, the detection range of the sensor may be divided by distance to obtain one or more distance ranges, and the at least one range interval obtained from the confidence range corresponding to each distance range. Each range interval thus has a corresponding distance range and confidence range, so a driving decision can subsequently be selected according to a longer distance or a lower confidence, which enlarges the available perception range, improves the vehicle's safety and stability, and improves the user experience.
In a possible implementation, obtaining at least one range interval may include: the method comprises the steps of dividing the range of the confidence degrees which can be detected by a sensor to obtain at least one confidence degree range, and calculating a range interval corresponding to each distance range corresponding to each confidence degree range according to the relationship between the distance result included in the output information of the sensor and the confidence degree corresponding to the distance result, wherein the range interval corresponds to one confidence degree range and one distance range.
Therefore, in this embodiment, the division may be performed based on confidence to obtain one or more confidence ranges. Since confidence and distance generally have a corresponding relationship, each confidence range has a corresponding distance range, and together a confidence range and a distance range constitute one of the range intervals obtained by dividing the sensor's perception range.
In one possible embodiment, after the at least one driving decision corresponding to each range interval is set, the method further includes: acquiring perception information collected by a sensor arranged on the vehicle, which may include information about an obstacle, such as a first distance between the vehicle and the obstacle; selecting, from the at least one range interval, the first range interval matching the perception information; selecting the vehicle's driving decision from the at least one driving decision of the first range interval in combination with the vehicle's speed; and controlling the vehicle to drive according to that decision.
Therefore, in this embodiment, the vehicle's driving decision may be selected based on the at least one driving decision corresponding to each of the divided range intervals. Compared with determining the decision using only high-confidence perception information, the distance ranges of the at least one range interval provided here cover the sensor's detection range, so even low-confidence perception information can be used. A driving decision can be selected according to a longer distance or lower confidence, the driving path planned with a larger perception range, more distant objects perceived, and a more accurate and safer driving path planned.
In a possible embodiment, the perception information includes the distance between the obstacle and the vehicle, and selecting the vehicle's driving decision according to the perception information and the driving decisions of each range interval may include: determining that the distance between the obstacle and the vehicle is within the distance range of the first range interval; and then selecting the vehicle's driving decision from the at least one driving decision of the first range interval in combination with the vehicle's speed.
Thus, in this embodiment, each range interval has corresponding driving decisions, so the matching range interval can be selected according to the distance of the object detected by the sensor, and the vehicle's driving decision chosen from that interval's one or more decisions, allowing the decision to be selected based on a larger perception range.
In one possible embodiment, the perception information may include a confidence; selecting the vehicle's driving decision according to the perception information and the driving decisions of each range interval may further include: if the confidence included in the perception information is within the confidence range of the first range interval, selecting the vehicle's driving decision from the at least one driving decision of the first range interval in combination with the vehicle's speed.
Therefore, in this embodiment, the corresponding range interval can be selected based on the confidence included in the perception information, and the vehicle's driving decision chosen from it.
In one possible embodiment, the relative speed of the vehicle with respect to the obstacle may be calculated from the speed of the vehicle and the speed of the obstacle, and the vehicle's driving decision then selected from the at least one driving decision of the first range interval in combination with that relative speed.
Therefore, in the embodiment of the application, the driving decision of the vehicle can be more accurately selected by combining the speed of the vehicle and/or the relative speed of the vehicle and the obstacle, and the driving safety of the vehicle is further improved.
In one possible implementation, the confidence included in the perception information is related to the distance between the sensor and the obstacle. Generally, the confidence of information perceived by the sensor decreases as that distance increases.
In one possible implementation, obtaining the relationship between the distance result included in the sensor's output information and its corresponding confidence may include: acquiring historical distance information collected by the sensor and its corresponding confidences; and deriving, from that historical distance information and the corresponding confidences, the relationship between the distance result and its confidence.
Therefore, in this embodiment, the relationship between the distance result in the sensor's output information and its corresponding confidence can be derived statistically from the historical distance information and confidences collected by the sensor, enabling the subsequent division of range intervals.
In a fourth aspect, the present application provides a driving decision selection device having the function of implementing the driving decision selection method of the third aspect. The function can be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the function described above.
In a fifth aspect, an embodiment of the present application provides a driving decision selection device having the function of implementing the driving decision selection method according to the first aspect or the third aspect. The function can be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the function described above.
In a sixth aspect, an embodiment of the present application provides a driving decision selection device, including a processor and a memory, wherein the processor and the memory are interconnected by a line, and the processor calls the program code in the memory to execute the processing-related functions of the driving decision selection method according to any one of the first aspect or the third aspect.
In a seventh aspect, an embodiment of the present application provides a driving decision selection device, which may also be referred to as a digital processing chip, or simply a chip. The chip includes a processing unit and a communication interface; the processing unit obtains program instructions through the communication interface, and when the program instructions are executed by the processing unit, the processing unit executes the processing-related functions of any one of the first aspect, any optional implementation of the first aspect, the third aspect, or any optional implementation of the third aspect.
Alternatively, the aforementioned driving decision selection device may be a chip or a vehicle, etc.
In an eighth aspect, an embodiment of the present application provides a computer-readable storage medium, which includes instructions that, when executed on a computer, cause the computer to perform the method in the first aspect, any of the optional embodiments of the first aspect, the third aspect, or any of the optional embodiments of the third aspect.
In a ninth aspect, embodiments of the present application provide a computer program product comprising instructions, which when run on a computer, cause the computer to perform the method of the first aspect, any of the optional embodiments of the first aspect, the third aspect, or any of the optional embodiments of the third aspect.
Drawings
FIG. 1 is a schematic structural diagram of a vehicle according to an embodiment of the present disclosure;
fig. 2 is a schematic flow chart of a driving decision selection method according to an embodiment of the present disclosure;
fig. 3A is a schematic view of a scenario for planning a driving route in an embodiment of the present application;
fig. 3B is a schematic view of another scenario for planning a driving route in the embodiment of the present application;
fig. 3C is a schematic view of another scenario for planning a driving route in the embodiment of the present application;
FIG. 4A is a schematic illustration of a range interval in an embodiment of the present application;
FIG. 4B is a schematic illustration of another range interval in an embodiment of the present application;
FIG. 5 is a schematic illustration of another range interval in an embodiment of the present application;
FIG. 6A is a schematic view of a parking scenario in an embodiment of the present application;
FIG. 6B is a schematic view of a cabin in another parking scenario in the embodiment of the present application;
FIG. 6C is a schematic diagram of a display interface in another parking scenario according to an embodiment of the present application;
FIG. 6D is a schematic illustration of another parking scenario in an embodiment of the present application;
fig. 7 is a schematic structural diagram of a driving decision selection device according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of another driving decision selection device according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of another driving decision selection device according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a chip according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The driving decision selection method provided by the embodiment of the application can be applied to various path-planning scenarios. For example, the present application may be applied to a scenario of selecting a driving decision for a vehicle, or the driving decision selection method provided by the present application may be performed by a vehicle. In addition, the present application may also be applied to scenarios of path planning for various robots, such as freight robots, detection robots, sweeping robots, or other types of robots. Taking a freight robot as an example of such an application scenario: when a freight robot performs transportation, a path needs to be planned for it so that the transportation can be completed safely and stably.
Embodiments of the present application are described below with reference to the accompanying drawings. As can be known to those skilled in the art, with the development of technology and the emergence of new scenarios, the technical solution provided in the embodiments of the present application is also applicable to similar technical problems.
In order to facilitate understanding of the present solution, the embodiment of the present application first describes the structure of a vehicle provided by the present application with reference to fig. 1. Referring to fig. 1, fig. 1 is a schematic structural diagram of a vehicle according to an embodiment of the present disclosure, where the vehicle 100 may be configured in an automatic driving mode. For example, the vehicle 100 may control itself while in the autonomous driving mode, and may determine, through human operation, the current state of the vehicle and its surroundings, determine whether there is an obstacle in the surroundings, and control the vehicle 100 based on information about the obstacle. The vehicle 100 may also be placed into operation without human interaction while in the autonomous driving mode.
The vehicle 100 may include various subsystems such as a travel system 102, a sensor system 104, a control system 106, one or more peripherals 108, as well as a power supply 110, a computer system 112, and a user interface 116. Alternatively, vehicle 100 may include more or fewer subsystems, and each subsystem may include multiple components. In addition, each of the sub-systems and components of the vehicle 100 may be interconnected by wire or wirelessly.
The travel system 102 may include components that provide powered motion to the vehicle 100. In one embodiment, the travel system 102 may include an engine 118, an energy source 119, a transmission 120, and wheels/tires 121.
The engine 118 may be an internal combustion engine, an electric motor, an air compression engine, or other types of engine combinations, such as a hybrid engine composed of a gasoline engine and an electric motor, and a hybrid engine composed of an internal combustion engine and an air compression engine. The engine 118 converts the energy source 119 into mechanical energy. Examples of energy sources 119 include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electrical power. The energy source 119 may also provide energy to other systems of the vehicle 100. The transmission 120 may transmit mechanical power from the engine 118 to the wheels 121. The transmission 120 may include a gearbox, a differential, and a drive shaft. In one embodiment, the transmission 120 may also include other devices, such as a clutch. Wherein the drive shaft may comprise one or more shafts that may be coupled to one or more wheels 121.
The sensor system 104 may include a number of sensors that sense information about the environment surrounding the vehicle 100. For example, the sensor system 104 may include a positioning system 122 (which may be a global positioning GPS system, a compass system, or other positioning system), an Inertial Measurement Unit (IMU) 124, a radar 126, a laser range finder 128, and a camera 130. The sensor system 104 may also include sensors of internal systems of the monitored vehicle 100 (e.g., an in-vehicle air quality monitor, a fuel gauge, an oil temperature gauge, etc.). The sensing data from one or more of these sensors can be used to detect the object and its corresponding characteristics (position, shape, orientation, velocity, etc.). Such detection and identification is a critical function of the safe operation of the autonomous vehicle 100. The sensor mentioned in the following embodiments of the present application may be a radar 126, a laser range finder 128, a camera 130, or the like.
The positioning system 122 may be used, among other things, to estimate the geographic location of the vehicle 100. The IMU 124 is used to sense position and orientation changes of the vehicle 100 based on inertial acceleration. In one embodiment, IMU 124 may be a combination of an accelerometer and a gyroscope. The radar 126 may utilize radio signals to sense objects within the surrounding environment of the vehicle 100, which may be embodied as millimeter wave radar or lidar. In some embodiments, in addition to sensing objects, radar 126 may also be used to sense the speed and/or heading of an object. The laser rangefinder 128 may use laser light to sense objects in the environment in which the vehicle 100 is located. In some embodiments, the laser rangefinder 128 may include one or more laser sources, laser scanners, and one or more detectors, among other system components. The camera 130 may be used to capture multiple images of the surrounding environment of the vehicle 100. The camera 130 may be a still camera or a video camera.
The control system 106 controls the operation of the vehicle 100 and its components. The control system 106 may include various components, including a steering system 132, a throttle 134, a braking unit 136, a computer vision system 140, a route control system 142, and an obstacle avoidance system 144.
The steering system 132 is operable to adjust the heading of the vehicle 100; for example, in one embodiment it may be a steering wheel system. The throttle 134 is used to control the operating speed of the engine 118 and thus the speed of the vehicle 100. The brake unit 136 is used to control the deceleration of the vehicle 100, and may use friction to slow the wheels 121. In other embodiments, the brake unit 136 may convert the kinetic energy of the wheels 121 into an electric current, or may take other forms to slow the rotational speed of the wheels 121 and thereby control the speed of the vehicle 100. The computer vision system 140 may process and analyze images captured by the camera 130 to identify objects and/or features in the environment surrounding the vehicle 100. The objects and/or features may include traffic signals, road boundaries, and obstacles. The computer vision system 140 may use object recognition algorithms, Structure from Motion (SfM) algorithms, video tracking, and other computer vision techniques. In some embodiments, the computer vision system 140 may be used to map the environment, track objects, estimate the speed of objects, and so forth. The route control system 142 is used to plan a travel route and a travel speed for the vehicle 100. In some embodiments, the route control system 142 may include a lateral planning module 1421 and a longitudinal planning module 1422, which plan a travel route and a travel speed, respectively, for the vehicle 100 in conjunction with data from the obstacle avoidance system 144, the GPS 122, and one or more predetermined maps. The obstacle avoidance system 144 is used to identify, evaluate, and avoid or otherwise negotiate obstacles in the environment of the vehicle 100, which may include actual obstacles as well as virtual moving objects that may collide with the vehicle 100.
In one example, the control system 106 may additionally or alternatively include components other than those shown and described. Or may reduce some of the components shown above.
The vehicle 100 interacts with external sensors, other vehicles, other computer systems, or users through the peripherals 108. The peripheral devices 108 may include a wireless communication system 146, an in-vehicle computer 148, a microphone 150, and/or speakers 152. In some embodiments, the peripheral devices 108 provide a means for a user of the vehicle 100 to interact with the user interface 116. For example, the onboard computer 148 may provide information to a user of the vehicle 100, and the user interface 116 may also operate the in-vehicle computer 148 to receive user input; the in-vehicle computer 148 may be operated via a touch screen. In other cases, the peripheral devices 108 may provide a means for the vehicle 100 to communicate with other devices located within the vehicle. For example, the microphone 150 may receive audio (e.g., voice commands or other audio input) from a user of the vehicle 100. Similarly, the speaker 152 may output audio to a user of the vehicle 100. The wireless communication system 146 may communicate wirelessly with one or more devices, either directly or via a communication network. For example, the wireless communication system 146 may use 3G cellular communication, such as code division multiple access (CDMA), EVDO, or global system for mobile communications (GSM)/general packet radio service (GPRS); 4G cellular communication, such as LTE; or fifth-generation (5G) mobile communication. The wireless communication system 146 may also communicate using a wireless local area network (WLAN). In some embodiments, the wireless communication system 146 may utilize an infrared link, Bluetooth, or ZigBee to communicate directly with a device.
The wireless communication system 146 may also use other wireless protocols, such as various vehicle communication systems; for example, it may include one or more dedicated short range communications (DSRC) devices, which may carry public and/or private data communications between vehicles and/or roadside stations.
The power supply 110 may provide power to various components of the vehicle 100. In one embodiment, power source 110 may be a rechargeable lithium ion or lead acid battery. One or more battery packs of such batteries may be configured as a power source to provide power to various components of the vehicle 100. In some embodiments, the power source 110 and the energy source 119 may be implemented together, such as in some all-electric vehicles.
Some or all of the functionality of the vehicle 100 is controlled by the computer system 112. The computer system 112 may include at least one processor 113, the processor 113 executing instructions 115 stored in a non-transitory computer readable medium, such as the memory 114. The computer system 112 may also be a plurality of computing devices that control individual components or subsystems of the vehicle 100 in a distributed manner. The processor 113 may be any conventional processor, such as a commercially available Central Processing Unit (CPU). Alternatively, the processor 113 may be a dedicated device such as an Application Specific Integrated Circuit (ASIC) or other hardware-based processor. Although fig. 1 functionally illustrates a processor, memory, and other components of the computer system 112 in the same block, those skilled in the art will appreciate that the processor, or memory, may actually comprise multiple processors, or memories, that are not stored within the same physical housing. For example, the memory 114 may be a hard drive or other storage medium located in a different enclosure than the computer system 112. Thus, references to processor 113 or memory 114 are to be understood as including references to a collection of processors or memories that may or may not operate in parallel. Rather than using a single processor to perform the steps described herein, some components, such as the steering component and the retarding component, may each have their own processor that performs only computations related to the component-specific functions.
In various aspects described herein, the processor 113 may be located remotely from the vehicle 100 and in wireless communication with the vehicle 100. In other aspects, some of the processes described herein are executed on a processor 113 disposed within the vehicle 100 while others are executed by the remote processor 113, including taking the steps necessary to execute a single maneuver.
In some embodiments, the memory 114 may include instructions 115 (e.g., program logic), and the instructions 115 may be executed by the processor 113 to perform various functions of the vehicle 100, including those described above. The memory 114 may also contain additional instructions, including instructions to send data to, receive data from, interact with, and/or control one or more of the travel system 102, the sensor system 104, the control system 106, and the peripheral devices 108. In addition to instructions 115, memory 114 may also store data such as road maps, route information, the location, direction, speed of the vehicle, and other such vehicle data, among other information. Such information may be used by the vehicle 100 and the computer system 112 during operation of the vehicle 100 in autonomous, semi-autonomous, and/or manual modes. A user interface 116 for providing information to and receiving information from a user of the vehicle 100. Optionally, the user interface 116 may include one or more input/output devices within the collection of peripheral devices 108, such as a wireless communication system 146, an in-vehicle computer 148, a microphone 150, or a speaker 152, among others.
The computer system 112 may control the functions of the vehicle 100 based on inputs received from various subsystems (e.g., the travel system 102, the sensor system 104, and the control system 106) and from the user interface 116. For example, the computer system 112 may communicate with other systems or components within the vehicle 100 using a can bus, such as the computer system 112 may utilize input from the control system 106 to control the steering system 132 to avoid obstacles detected by the sensor system 104 and the obstacle avoidance system 144. In some embodiments, the computer system 112 is operable to provide control over many aspects of the vehicle 100 and its subsystems.
Alternatively, one or more of these components described above may be mounted or associated separately from the vehicle 100. For example, the memory 114 may exist partially or completely separate from the vehicle 100. The above components may be communicatively coupled together in a wired and/or wireless manner.
Optionally, the above components are only an example; in an actual application, components in the above modules may be added or deleted according to actual needs, and fig. 1 should not be construed as limiting the embodiment of the present application. The driving decision selection method provided by the present application may be executed by the computer system 112, the radar 126, the laser range finder 128, or a peripheral device such as the vehicle-mounted computer 148 or another vehicle-mounted terminal. For example, the driving decision selection method provided by the present application may be executed by the on-board computer 148: the on-board computer 148 may select a driving decision and plan a driving path for the vehicle, generate a control command according to the driving path, and send the control command to the computer system 112, which then controls the steering system 132, the throttle 134, the braking unit 136, the computer vision system 140, the route control system 142, or the obstacle avoidance system 144, etc. in the control system 106 of the vehicle, thereby implementing automatic driving of the vehicle.
The vehicle 100 may be a car, a truck, a motorcycle, a bus, a boat, an airplane, a helicopter, a lawn mower, a recreational vehicle, a playground vehicle, construction equipment, a cart, a golf cart, a train, a trolley, etc.; the embodiment of the present application is not particularly limited in this respect.
Decision and planning are important component modules in automatic driving technology. At present, as shown in the following figures, the mainstream technical framework of an automatic driving system generally includes an upstream perception module that outputs position information of the environment and of key targets, and a decision and planning module that generates a control target for the vehicle, such as a reference path or a reference trajectory, and outputs it to the downstream control link to complete closed-loop control. The decision and planning process usually includes estimating the motion trajectory of the vehicle itself, and may also include estimating the motion trajectories of other targets. The decision and planning module and the control module generally realize control of the vehicle on the premise of being safe, stable, rapid, and accurate.
Generally, in the process of planning the driving path of the vehicle, the information collected by the sensor may be processed to obtain perception information, including information about objects around the vehicle, such as the position and size of an object within the perception range of the vehicle, or the distance between an object and the vehicle; the driving path of the vehicle is then planned according to the perception information. The perception information may generally include deterministic perception information or probabilistic perception information. Deterministic perception information is perception information whose confidence is greater than a threshold (for example, perception information with a confidence greater than 95%), while probabilistic perception information includes both the perceived content and its confidence.
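The split between deterministic and probabilistic perception information described above can be sketched as follows; the 95% threshold comes from the example in the text, while the `(info, confidence)` data layout is a hypothetical simplification:

```python
def classify_perception(items, threshold=0.95):
    """Split perception results into deterministic and probabilistic sets.

    Each item is an (info, confidence) pair; a confidence strictly above
    the threshold counts as deterministic, per the 95% example.
    """
    deterministic = [it for it in items if it[1] > threshold]
    probabilistic = [it for it in items if it[1] <= threshold]
    return deterministic, probabilistic
```

A nearby car detected with 99% confidence would land in the deterministic set, while a distant pedestrian at 60% confidence stays probabilistic and carries its confidence along for later decision making.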
Generally, the confidence of the perception information of an obstacle is related to the distance between the sensor and the obstacle, the environment, the size of the obstacle, and so on. For example, the farther the sensor is from the obstacle, the lower the confidence of the perception information; the closer the sensor is to the obstacle, the higher the confidence. Therefore, if only deterministic perception information is used, the planned driving path may be unstable because the perception range corresponding to deterministic perception information is limited, and distant obstacles cannot be planned for in advance, which reduces the user experience.
Therefore, the application provides a driving decision selection method, which is used for providing a farther perception range for planning a driving path and improving user experience. The following describes the driving decision selection method provided in the present application in detail.
Referring to fig. 2, a flow chart of a driving decision selection method provided in the present application is shown as follows.
201. At least one range interval is obtained.
The at least one range section is obtained by dividing the detection range of the sensor based on output information of the sensor. The output information of the sensor may include, but is not limited to, one or more of a distance result for a target object (hereinafter referred to as an obstacle for ease of understanding), a confidence corresponding to the distance result, or the relationship between the distance result and the corresponding confidence. The distance result may be the distance between the obstacle and the vehicle as monitored by the sensor, and the confidence may indicate the accuracy of that distance. The distance range corresponding to the at least one range section covers the detection range of the sensor. For example, if the detection range of the sensor covers a circle with a radius of 200 meters centered on the sensor, the detection range includes a plurality of distance ranges, such as 0-50 meters as one distance range, 50-150 meters as another, and 150-200 meters as a third. In the present application, the boundary value between two range sections may be assigned to either the preceding range section or the following range section; this is not limited here.
Optionally, each range interval also corresponds to a confidence range. Generally, each range section corresponds to both a confidence range and a distance range, and the mapping between them may be given by the relationship between the distance result included in the output information of the sensor and the confidence corresponding to that distance result, where the confidence indicates the accuracy of the distance result detected by the sensor. It is understood that the confidence range corresponding to the at least one range interval covers the confidence of the information detected by the sensor within the detection range. Typically, the sensor is provided in the vehicle, so the distance between the vehicle and the object is the distance between the sensor and the object. For example, if the confidence detectable by the sensor within the detection range lies in the range 0-100%, that range may be divided into a plurality of confidence ranges, for example 0-80% as one confidence range, 80%-95% as another, and more than 95% as a third.
The sensor may be a sensor disposed inside the vehicle, such as the sensor in the sensor system 104, or may also be a sensor disposed outside the vehicle, such as a sensor connected to the vehicle and disposed on a surface of the vehicle body, and may be specifically adjusted according to an actual application scenario.
Specifically, when the range interval is divided, the range interval may be divided in a plurality of ways, and several possible dividing ways are exemplarily described below.
Method one, dividing based on confidence
In one possible implementation, the partitioning may be based on confidence. One or more confidence ranges are derived for a range of confidence levels for objects detected within the detection range of the sensor. I.e. the confidence range or ranges cover the confidence of the information detected by the sensor in the detection range. Then, based on the relationship between the distance result included in the output information of the sensor and the confidence degree corresponding to the distance result, the distance range corresponding to each confidence degree range is calculated. Each range interval has a corresponding confidence degree range and a corresponding distance range, and at least one distance range corresponding to at least one range interval covers the detection range of the sensor.
Illustratively, as shown in Table 1, 95%-100% may be divided into range interval 1, 85%-95% into range interval 2, and so on, and each range section has a corresponding distance range. In general, due to sensor errors, the distance ranges corresponding to adjacent range intervals may not align exactly. For example, the confidence range corresponding to 80 meters may be 93%-96%; thus, there may be partial distance overlap between range interval 1 and range interval 2, and subsequently, when selecting a driving decision, the driving decision for the vehicle may be selected based on the confidence ranges, which do not overlap.
Interval          | Confidence range | Distance range (meters)
Range interval 1  | (95%, 100%]      | (0, 90]
Range interval 2  | (85%, 95%]       | (70, 120]
Range interval 3  | (60%, 85%]       | (110, 170]
Range interval 4  | (40%, 65%]       | (150, 220]
Range interval 5  | [0, 40%]         | [210, +∞)
TABLE 1
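A minimal sketch of using Table 1 follows. The bounds are copied verbatim from the table; since the distance ranges overlap, the lookup keys on the confidence ranges, and where the table's confidence bounds touch, the first match from the top wins. The function name is illustrative:

```python
# Table 1 encoded as (name, (conf_lo, conf_hi], (dist_lo, dist_hi]).
INF = float("inf")
TABLE_1 = [
    ("range interval 1", (0.95, 1.00), (0, 90)),
    ("range interval 2", (0.85, 0.95), (70, 120)),
    ("range interval 3", (0.60, 0.85), (110, 170)),
    ("range interval 4", (0.40, 0.65), (150, 220)),
    ("range interval 5", (0.00, 0.40), (210, INF)),
]


def interval_for_confidence(confidence):
    """Match a perceived confidence to a range interval.

    The distance ranges in Table 1 overlap, so this lookup uses the
    confidence ranges instead, as the text suggests for overlapping
    cases; the first matching row (top to bottom) is returned.
    """
    for name, (lo, hi), _dist_range in TABLE_1:
        if lo < confidence <= hi:
            return name
    return "range interval 5"  # confidence 0 sits at the closed bottom bound
```

For instance, a confidence of 0.97 maps to range interval 1, and 0.62 maps to range interval 3 because row 3 is checked before row 4.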
The distance between the vehicle and the object monitored by the sensor, and the corresponding confidence, can be obtained through a preset perception algorithm, whose output may also be called the perception model output. The relationship between distance and confidence may be derived by collecting statistics on the distances between the vehicle and objects detected by the sensor, together with the corresponding confidences. For ease of understanding, the relationship between the distance result included in the output information of the sensor and the confidence corresponding to the distance result is hereinafter referred to simply as the relationship between distance and confidence. The confidence is the accuracy of the information about the object detected by the sensor; the detected information may include the size, position, moving direction, or speed of the object, or its distance from the vehicle. It will be appreciated that the confidence may accordingly be expressed as the accuracy of such information as detected by the sensor.
Specifically, the relationship between the distance and the confidence may be a linear relationship, an exponential relationship, a logarithmic relationship, or a sequence of numbers, and the like, and may be determined specifically according to an actual application scenario, which is not limited in this application.
In some possible scenarios, the same sensor may detect different kinds of information about an object, such as its size, direction, and position, and the confidence of a sensor for different types of objects at the same distance may also differ. For example, if the sensor is a camera, the limitation of the camera's focus may leave part of the acquired image unclear, so the confidences for objects at different positions may differ. Therefore, the same sensor may have multiple distance-confidence relationships for different data types, such as one relationship between distance and the confidence of the object's size, and another between distance and the confidence of the object's direction; these may be adjusted according to the actual application scenario. For ease of understanding, the following embodiments of the present application use the relationship between confidence and distance for one type of data merely as an example, without limitation.
In a possible implementation manner, historical distance information collected by a sensor and the corresponding confidence may be acquired, and then the relationship between the distance result included in the output information of the sensor and the confidence corresponding to the distance result is calculated from them. For example, information about a large number of obstacles collected by the sensor may be gathered and used as the input of a preset perception algorithm, which outputs the distance between the vehicle and each detected object together with the corresponding confidence. The relationship between distance and confidence is then fitted; it may be expressed, for example, as y = ax, where y is the confidence, x is the distance, and a is a constant.
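Assuming the linear example y = ax, the constant a can be fitted to historical (distance, confidence) pairs with the ordinary least-squares closed form a = Σxy / Σx². This is only an illustration of the fitting step, not the application's prescribed method:

```python
def fit_linear_confidence(distances, confidences):
    """Least-squares fit of the through-origin model y = a * x, where
    x is a detected distance and y its confidence, mirroring the y = ax
    example in the text. Closed form: a = sum(x*y) / sum(x*x)."""
    sxy = sum(x * y for x, y in zip(distances, confidences))
    sxx = sum(x * x for x in distances)
    return sxy / sxx
```

In practice the fitted form need not be linear; the same least-squares idea applies after transforming the data for exponential or logarithmic relationships.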
Further, the perception model may be obtained by training on information collected by a large number of sensors. Illustratively, the perception model may be one or more of a target detection neural network, a semantic segmentation neural network, a convolutional neural network, or a purpose-built network. For example, the target detection neural network may be a deep neural network for 2D target detection, such as an evolved network based on a regional convolutional neural network (RCNN), or a deep neural network for 3D target detection, such as a neural network based on forward propagation (FP), or an evolved network based on a segmentation network (SegNet), etc.
In one possible embodiment, the relationship between the distance from the vehicle to the object detected by the sensor and the confidence level may be updated in real time using the information collected by the sensor and its corresponding confidence. For example, while the vehicle is in motion, data collected by the sensors at different locations may be saved and compared, and the relationship between distance and confidence updated accordingly.
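The real-time update described here might be sketched by keeping a sliding window of recent (distance, confidence) samples and refitting the linear relation as new samples arrive; the class name, window size, and linear model are assumptions for illustration:

```python
import numpy as np
from collections import deque

class DistanceConfidenceModel:
    """Maintains a running fit of confidence = a * distance + b."""

    def __init__(self, window=500):
        self.samples = deque(maxlen=window)  # (distance, confidence) pairs
        self.a, self.b = 0.0, 1.0            # initial placeholder relation

    def add_sample(self, distance, confidence):
        # Store the new sample and refit over the current window.
        self.samples.append((distance, confidence))
        if len(self.samples) >= 2:
            d, c = zip(*self.samples)
            self.a, self.b = np.polyfit(d, c, deg=1)

    def confidence_at(self, distance):
        return self.a * distance + self.b

# Feed in synthetic samples lying exactly on confidence = 1.0 - 0.002 * d.
model = DistanceConfidenceModel(window=500)
for d in range(0, 200, 10):
    model.add_sample(float(d), 1.0 - 0.002 * d)
```

The bounded deque means old samples automatically drop out, so the relation tracks recent sensor behavior (e.g., after baseline drift) rather than the whole history.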
Method two, division based on distance
In another possible implementation, the detection range of the sensor may be divided based on the distance to obtain one or more distance ranges.
Optionally, the confidence level range corresponding to each distance range may also be calculated based on a relationship between the distance result included in the output information of the sensor and the confidence level corresponding to the distance result. The confidence degree range corresponding to each distance range is used for subsequently screening the range interval matched with the perception information.
For example, as shown in Table 2, the distance range of 0-80 meters may be divided into range interval 1, 80-120 meters into range interval 2, 120-160 meters into range interval 3, and so on. Generally, due to sensor errors, the confidence range corresponding to each range interval may differ, and adjacent confidence ranges may partially overlap; for example, in Table 2 the confidence ranges of range interval 2 and range interval 3 overlap between 85% and 87%. Subsequently, when a driving decision is selected, the perception information may be matched based on the distance ranges, where no such overlap exists.
Interval            Distance range (meters)    Confidence range
Range interval 1    (0, 80]                    (98%, 100%]
Range interval 2    (80, 120]                  (85%, 98%]
Range interval 3    (120, 160]                 (62%, 87%]
Range interval 4    (160, 200]                 (40%, 65%]
Range interval 5    (200, +∞)                  [0, 45%]

TABLE 2
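Table 2 can be represented as a simple lookup structure; the following sketch (illustrative only, using the half-open distance bounds from the table) returns the range interval whose distance range contains a given distance:

```python
# Table 2 as a list of (name, low, high) with half-open (low, high] bounds.
RANGE_INTERVALS = [
    ("Range interval 1", 0.0, 80.0),
    ("Range interval 2", 80.0, 120.0),
    ("Range interval 3", 120.0, 160.0),
    ("Range interval 4", 160.0, 200.0),
    ("Range interval 5", 200.0, float("inf")),
]

def match_interval(distance):
    """Return the name of the interval whose distance range contains `distance`."""
    for name, low, high in RANGE_INTERVALS:
        if low < distance <= high:
            return name
    return None
```

Because the bounds are half-open, a boundary distance such as exactly 80 m falls unambiguously into range interval 1.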
Of course, in some possible scenarios, the range intervals may also be divided by distance and confidence together, that is, the foregoing first manner and second manner may be combined to obtain one or more range intervals; this may be adjusted according to the actual application scenario and is not limited herein.
Therefore, the detection range of the sensor may be divided according to the confidence and/or the distance to obtain one or more range intervals, the range intervals may be used to select a driving decision, and a driving path may then be planned based on that decision. Compared with planning the driving path using only perception information whose confidence is higher than a threshold, the driving decision selection method provided by the present application enlarges the perception range used when selecting a driving decision: the driving decision of the vehicle can be selected earlier, the vehicle can accelerate or decelerate in advance, control of the vehicle becomes more stable, the driving safety of the vehicle is improved, and the user experience is improved. The method can be understood as introducing probabilistic perception information to represent the uncertainty of the perception information, so that driving decisions can subsequently be selected according to this uncertain perception information, enlarging the perception range used in planning the driving path.
In some possible scenarios, a plurality of sensors may exist. When dividing range intervals, the detection range of each sensor may be divided separately to obtain a set of range intervals for each sensor; alternatively, the distances corresponding to the same confidence across the sensors may be combined by a weighted operation before division, or the confidences of the sensors at the same distance may be combined by a weighted operation before division, and so on, adjusted according to the actual application scenario. Therefore, in the embodiment of the present application, different division manners can be used for different sensors, and in the process of selecting the driving decision of the vehicle, multiple sensors can work cooperatively, further improving the driving safety of the vehicle.
For example, one sensor may perceive a vehicle or pedestrian with high confidence within 70 m and with lower confidence (e.g., between 50% and 80%) in the 70-150 m range, while a visual perception sensor may perceive a vehicle or pedestrian with higher confidence within 80 m and with lower confidence in the 80-150 m range. Generally, sensing the objects in the environment with multiple sensors can improve detection accuracy, avoid false detections or missed detections, and improve the accuracy of the detected information.
202. At least one driving decision for each of the at least one range interval is set.
The driving decisions may include acceleration, deceleration, constant-speed cruising, vehicle following, maintaining the vehicle speed, lane changing, and the like. The driving decisions set for different range intervals may be the same or different. For example, the driving decisions corresponding to range interval 1 may include constant-speed cruising, decelerating to follow, changing lanes, or accelerating, and may further include decisions such as controlling the vehicle speed to be within a first preset range or controlling the speed of the vehicle relative to the obstacle to be within a second preset range.
In some possible scenarios, the driving decisions for the vehicle may be set in advance or selected by the user. For example, different decisions may be preset for different distances: the range of 0-20 meters may be set to deceleration or maintaining speed (or a corresponding vehicle speed range or relative vehicle speed range), the range of 40-80 meters may be set to acceleration, deceleration, or lane changing (or a corresponding vehicle speed range or relative vehicle speed range), and more than 80 meters may be set to acceleration, cruising, deceleration, or lane changing (or a corresponding vehicle speed range or relative vehicle speed range). For another example, a driving decision may be selected by the user: acceleration and deceleration may be set as candidate decisions, and decisions such as cruise control or lane changing may be selected by the user through the interactive interface.
It should be noted that after at least one driving decision is set for each of the at least one range interval, the driving path of the vehicle may be planned, before or during driving, using the divided range intervals and their corresponding driving decisions, and the vehicle may be controlled to drive according to that path. Before or during driving, the divided range intervals or the driving decisions corresponding to them may be updated. For example, the sensor may degrade in the field, causing errors or baseline drift; the range intervals may therefore be updated in real time according to the information acquired by the sensor during use, to keep the divided range intervals matched to the accuracy actually achieved by the sensor.
Therefore, in the embodiment of the present application, different range intervals have corresponding driving decisions, so that the subsequent planning of the driving path can be performed using the selected driving decision. This provides a segmented driving-planning scheme in which perception information with different confidence degrees can be processed based on different range intervals, enlarging the perception range available when planning the driving path.
Typically, the driving decision corresponding to each range interval is determined according to the application scenario, which may include, but is not limited to, automatic cruising, vehicle following, automatic parking, and the like. For example, in a high-speed automatic cruise scenario, the vehicle speed is high and the distance ranges corresponding to the range intervals may be 0-100 m, 100-200 m, 200-300 m, and so on; the driving decisions corresponding to 0-100 m may include deceleration or lane changing, those corresponding to 100-200 m may include maintaining the vehicle speed or lane changing, and those corresponding to 200-300 m may include acceleration, lane changing, and the like. In a low-speed following scenario, the decision corresponding to 0-100 m may be deceleration, and the driving decision corresponding to distances above 100 m may be acceleration. Of course, different scenarios may be adapted to by dividing different distance ranges and setting a corresponding driving decision for each range. For example, in a low-speed following scenario the distance ranges may differ from those of high-speed automatic cruising: the range intervals may be 0-50 m, 50-100 m, and 100-150 m, where 0-50 m corresponds to decelerating or changing lanes, 50-100 m corresponds to accelerating or maintaining the vehicle speed, 100-150 m corresponds to accelerating or maintaining the vehicle speed, and distances above 150 m correspond to accelerating or changing lanes.
For example, in an automatic parking scenario, since the distance between the vehicle and an obstacle is short, the divided distance ranges may differ: the range intervals may include 0-1 m, 1-2 m, 2-5 m, and the like, where the driving decision corresponding to 0-1 m is deceleration, that corresponding to 1-2 m is maintaining the vehicle speed, and that corresponding to 2-5 m is slow acceleration.
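The scenario-dependent tables described above might be organized as follows; the dictionary layout and function names are illustrative, while the distance bounds and decisions are the example values from the text:

```python
# Scenario-keyed decision tables; intervals are half-open (low, high] in meters.
DECISION_TABLES = {
    "high_speed_cruise": [
        ((0, 100), ["decelerate", "change lane"]),
        ((100, 200), ["maintain speed", "change lane"]),
        ((200, 300), ["accelerate", "change lane"]),
    ],
    "low_speed_following": [
        ((0, 50), ["decelerate", "change lane"]),
        ((50, 100), ["accelerate", "maintain speed"]),
        ((100, 150), ["accelerate", "maintain speed"]),
        ((150, float("inf")), ["accelerate", "change lane"]),
    ],
    "auto_parking": [
        ((0, 1), ["decelerate"]),
        ((1, 2), ["maintain speed"]),
        ((2, 5), ["slow acceleration"]),
    ],
}

def decisions_for(scenario, distance):
    """Candidate driving decisions for a distance to the obstacle in a scenario."""
    for (low, high), decisions in DECISION_TABLES[scenario]:
        if low < distance <= high:
            return decisions
    return []
```

Switching scenarios then amounts to selecting a different table, without changing the matching logic.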
In addition, in different environments the driving decision corresponding to each range interval may differ. For example, the driving decisions differ between a smoky environment and a smoke-free one, and between a rainy or foggy environment and a clear one; they may be adjusted according to the actual application scenario, which is not limited in this application.
It should be noted that steps 201 and 202 are optional. They may also be implemented separately, or executed by another device, after which the divided range intervals are configured in a sensor, or sent to a vehicle-mounted terminal or another device that controls the vehicle. For example, before a sensor leaves the factory, a storage medium may be provided in it and written with command code for executing steps 201 and 202, or with the detailed information of the divided range intervals, etc.
Generally, in practical applications the division of range intervals need not be performed each time a driving decision is selected; this can be adjusted according to the actual application scenario. In some scenarios, one or more range intervals may be divided in advance, so that the range intervals do not need to be divided again each time a driving decision is selected or a driving path is planned.
203. And acquiring the perception information acquired by the sensor.
The vehicle is provided with sensors, and the perception information collected by them may be acquired while the vehicle is driving or before driving starts. For example, a sensor acquires information such as an image of an obstacle, a laser point cloud, or an electromagnetic echo point cloud, and that information is then input to a preset perception algorithm or perception model, which outputs the perception information.
Specifically, the perception information may include, but is not limited to, one or more of the following: speed of the vehicle, speed of the vehicle relative to the obstacle, position of the obstacle, direction of the obstacle, size of the obstacle, and the like. The obstacle may be a vehicle, a pedestrian, a road, a traffic light, a traffic sign, etc. within the detection range of the sensor.
In general, the information collected by different sensors may differ. For example, if the sensor includes a camera, the information acquired may include the pixel values of an image; if it includes a lidar, the information may include a 3D laser point cloud; if it includes a millimeter-wave radar, the information may include a point cloud formed from electromagnetic echoes. The specific types of sensors may be adjusted according to the actual application, and the information collected by different types of sensors may be the same or different; this is merely an illustrative example and not a limitation.
Different sensors may correspond to different perception models, and may also correspond to the same perception model, that is, the perception model may process different types of input data to obtain corresponding perception information. The perceptual model may be trained from a large amount of historical data relating to the aforementioned sensors.
Optionally, a confidence level may further be included in the perception information to indicate the accuracy of the information it contains. For example, if the perception information includes the speed of the vehicle relative to the obstacle, the confidence level can indicate the accuracy of that relative speed. Then, when the driving decision of the vehicle is subsequently selected and the driving path is planned, the confidence can be taken into account to obtain a more accurate driving path.
204. A first range interval matching the perceptual information is selected from the at least one range interval.
After the perception information is obtained through the perception algorithm, a first range interval matched with the perception information can be selected from at least one range interval. The at least one range interval may be at least one range interval obtained by dividing in the foregoing step 202, where each range interval corresponds to a distance range, and optionally, each range interval also corresponds to a confidence level range, which is specifically referred to the description in the foregoing step 202, and is not described herein again.
In a possible implementation, the perception information may include a first distance between the vehicle and the obstacle, and step 204 may specifically include: comparing the first distance with the distance range of each range interval, and screening out the first range interval, i.e., the interval whose distance range contains the first distance. The perception information may include the distance between the vehicle and an obstacle in the traveling direction of the vehicle, such as an obstacle in the same lane or ahead of the vehicle, and the range interval matching that distance may be selected as the first range interval.
For example, if the distance between the vehicle and the obstacle included in the perception information is 90 meters, the distance range of range interval 1 is 50-150 meters, and the distance range of range interval 2 is 0-50 meters, then comparing 90 meters with each interval shows that it falls within the 50-150 meter range of range interval 1, so the matching range interval is determined to be range interval 1.
In one possible embodiment, a first confidence level may be included in the perception information, and the first confidence level is used for indicating the accuracy of the distance between the vehicle and the obstacle included in the perception information. Step 204 may specifically include: the first confidence level may be matched with the confidence level range included in each range interval, and a first range interval is screened out, where the first confidence level is in the confidence level range included in the first range interval.
For example, if the first confidence is 95%, range interval 1 includes a confidence range of 94%-96%, and range interval 2 includes a confidence range of 96%-100%, then 95% falls within the confidence range of range interval 1; that is, the range interval matched with the first confidence is determined to be range interval 1.
Of course, in a possible embodiment, the aforementioned at least one range section may be divided by combining confidence and distance, the first distance between the vehicle and the obstacle and the first confidence of the first distance may be included in the perception information, and the obstacle is located in the driving direction of the vehicle. A first range interval may be screened from the at least one range interval in conjunction with a first distance of the vehicle and the obstacle and a first confidence in the distance.
For example, if the distance between the vehicle and the obstacle is 90 meters, the confidence is 95%, the distance range corresponding to the first range interval is 50-150 meters, and the confidence is 94% -96%, that is, the distance included in the sensing information is within the distance range of the first range interval, and the confidence included in the sensing information is within the confidence range corresponding to the first range interval, it is determined that the selected range interval is the first range interval, the first range interval is the distance range of 50-150 meters, and the confidence range is 94% -96%.
For another example, in one scenario, priority levels for the confidence and the distance may be set in advance. When the confidence matches range interval 1 but the distance matches range interval 2: if the distance has the higher priority, the range interval corresponding to the obstacle is confirmed to be range interval 2; if the confidence has the higher priority, it is confirmed to be range interval 1.
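A minimal sketch of this matching logic, including the distance/confidence priority rule, might look as follows (interval bounds, names, and the default priority are illustrative):

```python
# Illustrative intervals with both a distance range and a confidence range.
INTERVALS = [
    {"name": "interval 1", "dist": (50.0, 150.0), "conf": (0.94, 0.96)},
    {"name": "interval 2", "dist": (0.0, 50.0), "conf": (0.96, 1.00)},
]

def match_by_distance(distance):
    for iv in INTERVALS:
        low, high = iv["dist"]
        if low < distance <= high:
            return iv["name"]
    return None

def match_by_confidence(confidence):
    for iv in INTERVALS:
        low, high = iv["conf"]
        if low <= confidence <= high:
            return iv["name"]
    return None

def match(distance, confidence, priority="distance"):
    """Resolve the matched interval; `priority` breaks ties when they disagree."""
    by_d = match_by_distance(distance)
    by_c = match_by_confidence(confidence)
    if by_d == by_c:
        return by_d
    return by_d if priority == "distance" else by_c
```

When distance and confidence agree, the priority setting is irrelevant; it only decides which criterion wins when they point at different intervals.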
In addition, in some embodiments, only the detection range of the sensor may be divided by distance, so that each range interval has only a distance range. In this scenario, the perception information may include a confidence together with the relationship between distance and confidence; the distance corresponding to that confidence is calculated from the relationship, and this distance is then matched against the distance range of each range interval to screen out the first range interval matching the perception information.
In one possible scenario, the sensors may detect multiple pieces of information, so multiple pieces of perception information may be obtained, and the information and confidence included in each may differ. In this scenario, the piece of perception information with the highest confidence may be selected as the final perception information and the first range interval screened out according to it; alternatively, the multiple pieces of perception information may be combined by a weighted operation in which higher-confidence information receives a higher weight, yielding the final perception information. For example, if the perception model outputs perception information for 3 sensors, including 3 distances to the obstacle, the 3 distances may be weighted, with higher-confidence distances receiving higher weights, to obtain a weighted distance, and the first range interval may be selected according to that weighted distance.
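The confidence-weighted fusion of several sensors' distance estimates might be sketched as follows (the function name and sample values are illustrative):

```python
def fuse_distances(estimates):
    """Confidence-weighted mean of (distance, confidence) pairs from sensors."""
    total_conf = sum(conf for _, conf in estimates)
    if total_conf == 0:
        raise ValueError("all confidences are zero")
    return sum(dist * conf for dist, conf in estimates) / total_conf

# Three sensors report slightly different distances to the same obstacle;
# the higher-confidence estimates dominate the fused result.
fused = fuse_distances([(100.0, 0.9), (104.0, 0.6), (96.0, 0.3)])
```

The fused distance can then be matched against the range intervals exactly as a single-sensor distance would be.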
205. The driving decision of the vehicle is selected from the at least one driving decision of the first range interval in combination with the vehicle speed of the vehicle.
Specifically, the driving decision of the vehicle may be selected according to a preset rule in combination with the vehicle speed. For example, a highest and a lowest vehicle speed may be set for each range interval. If the distance between the vehicle and the obstacle is within the distance range of the first range interval, the vehicle speed is below the highest speed and above the lowest speed of that interval, and the driving decisions of the first range interval include acceleration, following, and deceleration, then acceleration or following may be selected as the driving decision; if the vehicle speed is below the lowest speed, acceleration may be selected as the driving decision, so as to keep the vehicle speed between the highest and lowest speeds of the first range interval.
Alternatively, a highest and a lowest relative vehicle speed may be set for each range interval. The speed of the vehicle relative to the obstacle is calculated from the vehicle speed and the speed of the obstacle included in the perception information, and a driving decision is then selected from the at least one driving decision of the first range interval based on that relative speed. For example, suppose the driving decisions of the first range interval include acceleration, following, and deceleration, and the distance between the vehicle and the obstacle is within the distance range of the first range interval: if the relative speed is below the highest relative speed and above the lowest relative speed of the interval, acceleration or following may be selected as the driving decision; if the relative speed is below the lowest relative speed, acceleration is taken as the driving decision; and if the relative speed is above the highest relative speed, deceleration is taken as the driving decision.
In one possible embodiment, the driving decision of the vehicle may be selected from at least one driving decision corresponding to the first range section in combination with the speed of the vehicle and the relative speed of the vehicle and the obstacle. For example, a highest vehicle speed, a lowest vehicle speed, a highest relative vehicle speed, and a lowest relative vehicle speed may be set for each range section, and a travel decision of the vehicle may be selected based on the speed of the vehicle and the relative speed between the vehicle and the obstacle such that the vehicle speed of the vehicle is maintained between the highest vehicle speed and the lowest vehicle speed, and the relative speed between the vehicle and the obstacle is maintained between the highest relative vehicle speed and the lowest relative vehicle speed.
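A rule-based selection combining per-interval speed limits, as described above, might be sketched like this (the threshold values and decision names are illustrative):

```python
def select_decision(vehicle_speed, relative_speed, interval):
    """Pick a decision so both the vehicle speed and the relative speed
    stay inside the interval's preset bounds."""
    if relative_speed > interval["max_rel"]:
        return "decelerate"  # closing on the obstacle too fast
    if relative_speed < interval["min_rel"]:
        return "accelerate"  # falling behind the obstacle
    if vehicle_speed < interval["min_speed"]:
        return "accelerate"
    if vehicle_speed > interval["max_speed"]:
        return "decelerate"
    return "maintain speed"

# Illustrative bounds for one range interval (speeds in km/h).
interval = {"min_speed": 20.0, "max_speed": 60.0, "min_rel": -5.0, "max_rel": 5.0}
```

Checking the relative speed before the absolute speed reflects the safety-first ordering in the text: keeping a safe gap to the obstacle takes precedence over staying in the preferred speed band.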
For example, consider a range interval whose distance range is 10-20 meters and whose corresponding driving decisions include decelerating and maintaining the vehicle speed. The speed of the vehicle is 30 km/h and its speed relative to the preceding vehicle is 5 km/h. To improve driving safety, the control target is to keep the relative speed at 0 km/h, so the driving decision is to decelerate until the relative speed of the vehicle with respect to the preceding vehicle is 0 km/h. Alternatively, the control target may be a vehicle speed of 20 km/h, in which case the driving decision is to decelerate until the vehicle speed is reduced to 20 km/h or less.
In addition, in some scenarios, perception information whose confidence is above a threshold may be treated as deterministic perception information, and the driving decision of the vehicle may be selected based on it, that is, using a planning algorithm designed for deterministic perception information; this improves the compatibility of the driving decision selection method provided by the present application with existing path-planning approaches.
In some scenarios, in addition to the speed of the vehicle or the relative speed of the vehicle and the obstacle, other data may be combined to select the driving decision, such as vehicles traveling in a direction different from that of the vehicle, the driving behavior of obstacle vehicles around the vehicle (such as lane changing or overtaking), or changes in the driving environment. For example, even if the driving decision corresponding to the distance range of more than 100 m is acceleration, if the weather changes suddenly and rain or fog appears, the vehicle speed may instead be kept so that the vehicle runs safely.
206. And controlling the vehicle to run according to the running decision of the vehicle.
After the driving decision of the vehicle is selected, the vehicle may be controlled to execute the driving decision. For example, if the driving decision of the vehicle is acceleration, the vehicle is controlled to accelerate, if the driving decision of the vehicle is deceleration, the vehicle is controlled to decelerate, and if the driving decision of the vehicle is to keep the vehicle speed, the vehicle speed is controlled to be unchanged; and if the driving decision of the vehicle is lane change, generating a path for the vehicle to change lanes, and controlling the vehicle to drive according to the path.
Specifically, the driving path of the vehicle can be planned by combining the vehicle speed of the vehicle and the driving decision of the vehicle, and the vehicle is controlled to drive according to the driving path. For example, if the travel decision of the vehicle is deceleration, the travel route of the vehicle in the process of executing the deceleration decision of the vehicle may be generated, and the vehicle may be controlled to travel according to the travel route by controlling a steering system, a brake system, and the like of the vehicle.
In general, if the driving decision of the vehicle is to accelerate, decelerate, or maintain a vehicle speed, etc., the driving path of the vehicle may include a route parallel to a lane or close to a lane, and if the driving decision of the vehicle is to change lanes, the driving path of the vehicle may include a curve in which the vehicle drives from a current lane to an adjacent lane.
More specifically, when the driving path includes a curve, the driving path may specifically include information such as a driving curve, a steering angle, a steering radius, or a vehicle speed of the vehicle.
To facilitate understanding, the method provided by the present application is illustrated with a specific scenario. In a vehicle with an automatic driving function, the sensing range of a sensor in the vehicle is divided into a plurality of range intervals, each including a distance range and a corresponding confidence range: range interval 1 includes the distance range [0, 10] meters and the confidence range [98%, 100%]; range interval 2 includes the distance range (10, 50] meters and the confidence range [97%, 98%); range interval 3 includes the distance range (50, 100] meters and the confidence range [95%, 97%); and range interval 4 includes the distance range (100, +∞) meters and the confidence range [0%, 95%). For each range interval, corresponding driving decisions are set in advance: the decisions for [0, 10] meters include deceleration; those for (10, 50] meters include deceleration and maintaining the vehicle speed; those for (50, 100] meters include maintaining the vehicle speed and changing lanes; and those for (100, +∞) meters include maintaining the vehicle speed, acceleration, and changing lanes. In this automatic driving scenario, as shown in FIG. 3A, if an obstacle 301 is detected ahead of the vehicle on its way, at a distance of 126 m from the vehicle, that distance falls within range interval 4, whose corresponding driving decisions include maintaining the vehicle speed, acceleration, and changing lanes. The driving decision can then be selected according to the user's demand: for example, if the user's demand is fast arrival, changing lanes or accelerating is selected; if the user needs smooth driving, maintaining the vehicle speed is selected.
For example, if the user's demand is fast arrival and the distance between the vehicle and the obstacle 301 gradually decreases, the driving decision of the vehicle is selected as a lane change, considering both the safety of the vehicle and the user's demand for fast arrival. A lane-change driving path 302 is then generated based on the current vehicle speed, the speed of the obstacle, and the road conditions, and the vehicle is controlled to drive along the path 302. For ease of understanding, a plan view of the driving path corresponding to the lane change decision is shown in FIG. 3B: after detecting the obstacle 301 in front of the vehicle, i.e., in its driving direction, and selecting lane changing as the driving decision, the driving path 302 is generated and the vehicle is controlled to drive along it. The manner of generating the driving path may include a speed-time (ST) graph algorithm, an SLT algorithm, and the like. As shown in FIG. 3C, a position 3031 where the vehicle will be located after changing lanes is selected, two curves 3021 and 3022 from the vehicle to the position 3031 are planned such that the curves 3021 and 3022 are tangent, the steering radii of the vehicle, i.e., the radius r1 of the curve 3021 and the radius r2 of the curve 3022, are calculated based on the current vehicle speed, the curves 3021 and 3022 are smoothed to obtain the driving path 302, and the vehicle is controlled to travel along the driving path 302 according to the steering radii.
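The two-tangent-arc geometry in FIG. 3C can be illustrated with a small calculation. If each of the two equal arcs covers half of the longitudinal distance L and half of the lateral offset d, basic circle geometry (a circle tangent to the lane direction at the start and passing through the midpoint (L/2, d/2)) gives radius r = (L² + d²) / (4d). This is a sketch under the equal-radius assumption, not the patent's exact planner:

```python
def lane_change_radius(longitudinal, lateral):
    """Radius of each of the two tangent arcs of an S-curve lane change.

    Assumes both arcs have equal radii; derived from the circle through
    (0, 0), tangent to the lane direction there, that passes through
    (longitudinal / 2, lateral / 2).
    """
    L, d = float(longitudinal), float(lateral)
    if d <= 0:
        raise ValueError("lateral offset must be positive")
    return (L * L + d * d) / (4.0 * d)

# Example: a lane change over 30 m longitudinal with a 3.5 m lateral offset.
r = lane_change_radius(30.0, 3.5)
```

As expected, a shorter lane change (smaller L at the same lateral offset) yields a tighter steering radius, which is why the radius must be chosen based on the current vehicle speed.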
Further, if the driving decision selection method provided by the present application is executed by a vehicle-mounted terminal, a vehicle-mounted computer or another external device, then after the driving path is planned, it may be transmitted to the control system 106 shown in FIG. 1. The control system 106 generates control instructions through proportional-derivative (PD) control, proportional-integral-derivative (PID) control or the like, according to information such as the driving curve and vehicle speed included in the driving path, so as to control the steering system 132, the accelerator 134, the brake unit 136 and the like, thereby controlling the vehicle to drive along the driving path.
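A PID speed loop of the kind the control system 106 could use is sketched below. The gains and the sample usage are illustrative assumptions, not values from the patent.

```python
class PID:
    """Minimal PID speed-tracking controller sketch: positive output maps to
    throttle, negative output to braking (gains are illustrative)."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def step(self, target_speed, current_speed, dt):
        error = target_speed - current_speed
        self.integral += error * dt
        # No derivative term on the very first sample.
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = PID(kp=0.5, ki=0.05, kd=0.1)
u = pid.step(target_speed=27.8, current_speed=25.0, dt=0.1)  # ~100 km/h target, m/s
print(round(u, 3))
```

Dropping the integral and derivative gains to zero reduces this to plain proportional control; keeping only kp and kd gives the PD variant also mentioned above.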
Therefore, before a driving decision is selected, at least one range interval is divided, each of which has a corresponding confidence range and distance range. A mapping relationship exists between the values in the confidence range and the values in the distance range, namely the relationship between the distance result output by the sensor and the confidence corresponding to that distance result; this mapping relationship may be preset or may be updated in real time according to information collected by the sensor. When selecting a driving decision, the at least one driving decision of each range interval may be used to select the driving decision of the vehicle, such as accelerating, decelerating, holding the vehicle speed or changing lanes. Generally, the farther away an object detected by the sensor is, the lower the confidence. Compared with planning a driving path using only perception information whose confidence is higher than a threshold, the driving decision selection method provided by the present application can also use perception information whose confidence is not higher than the threshold, which is equivalent to enlarging the perception range used when planning the driving path. Therefore, in the method provided by the present application, an obstacle perceived at a greater distance can be used when selecting the driving decision, so that the vehicle can avoid the obstacle in advance, improving both the driving safety of the vehicle and the user experience.
Moreover, since an obstacle farther from the vehicle is avoided in advance, decisions such as acceleration or deceleration can also be made in advance, making the driving process of the vehicle smoother and further improving the user experience.
Furthermore, for ease of understanding, the following describes, with reference to a specific application scenario, the multiple range intervals divided by the driving decision selection method and the manner of selecting a driving decision.
Illustratively, as shown in FIG. 4A, the sensing range of the sensor provided on the vehicle 401 may be divided into four intervals: the distance range of interval 1 is 0-20 meters (m), that of interval 2 is 20-65 m, that of interval 3 is 65-150 m, and that of interval 4 is 150-250 m. A distance on a boundary may be assigned to either of the two adjacent intervals; for example, the 20 m distance may be assigned to interval 1 or interval 2, and this may be adjusted according to the actual scenario.
Then, corresponding driving decisions are set for each interval. For example, for the automatic driving mode, the driving decisions set for interval 1 include braking and following; those for interval 2 include following, changing lanes and decelerating; those for interval 3 include accelerating, following, changing lanes and decelerating; and those for interval 4 include accelerating.
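The per-interval decision table above, combined with the earlier remark that the user's demand (fast arrival versus smooth ride) picks among the candidates, can be sketched as follows. The preference orderings are an assumption for illustration; the patent only states that user demand influences the choice.

```python
# Driving decisions per interval from the example above (automatic driving mode).
ZONE_DECISIONS = {
    1: ["brake", "follow"],
    2: ["follow", "change_lane", "decelerate"],
    3: ["accelerate", "follow", "change_lane", "decelerate"],
    4: ["accelerate"],
}

# Hypothetical preference orderings (not specified by the patent): a user who
# wants a fast arrival favors accelerating and lane changes; a user who wants
# a steady ride favors following and decelerating.
PREFERENCE_ORDER = {
    "fast": ["accelerate", "change_lane", "follow", "decelerate", "brake"],
    "steady": ["follow", "decelerate", "change_lane", "brake", "accelerate"],
}

def select_decision(zone, preference):
    """Pick the highest-preference decision available in the given interval."""
    for decision in PREFERENCE_ORDER[preference]:
        if decision in ZONE_DECISIONS[zone]:
            return decision
    raise ValueError("no decision available")

print(select_decision(3, "fast"))    # accelerate
print(select_decision(3, "steady"))  # follow
```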
In the sensing areas close to the vehicle, such as interval 1 and interval 2, the confidence of the information detected by the sensor is generally high; these can be understood as the range intervals corresponding to deterministic perception information. In the sensing areas far from the vehicle, such as interval 3 and interval 4, the confidence of the information detected by the sensor is generally low, that is, the accuracy of the detected information is low; these can be understood as the range intervals corresponding to probabilistic perception information. The range intervals divided in the embodiment of the present application therefore add the distance ranges of interval 3 and interval 4, and use them to select the driving decision of the vehicle and plan its driving path, which increases the perception range available when planning the driving path.
Therefore, the scheme provided by the present application is equivalent to increasing the perception range used when planning the driving path. For example, within 100 m, visual perception can reliably detect an obstacle, so this is a deterministic perception range; in the 100-200 m interval, the success rate of detecting a vehicle decreases and the confidence drops for a series of reasons, such as the increased distance or the smaller appearance of the object in the image. In the scheme provided by the present application, the 100-200 m interval is also divided into areas, and the driving decision of the vehicle is selected and its driving path planned based on this division, which is equivalent to increasing the perception range used when planning the driving path.
Referring to FIG. 4B, for example, the obstacle 402 is a preceding vehicle, Vr is the relative speed between the vehicle and the preceding vehicle, and Ve is the speed of the vehicle itself. The preceding vehicle may be moving, i.e., its speed is not 0, or stationary, i.e., its speed is 0; it may also be replaced by other obstacles, such as traffic lights, roadblocks, traffic cones and the like.
When selecting a driving decision, if the distance between the obstacle 402 and the vehicle 401 is within the distance range of interval 1, interval 1 can be understood as an emergency braking interval: the vehicle is much closer to the obstacle 402 than the safe distance, and the vehicle speed must be reduced so that a safer distance is restored. In this case, the control target of the vehicle may be to stop, or to keep the speeds within the range 0 ≤ Vr ≤ 30 and 0 ≤ Ve ≤ 60, so as to avoid a rear-end collision with the preceding vehicle. The safe distance may be a preset value, or a value calculated by a set algorithm, for example from the current speed, the braking distance or the speed of the preceding vehicle; it can be understood as the distance that ensures the vehicle can travel safely and avoid a collision.
If the distance between the preceding vehicle 402 and the vehicle 401 is within the distance range of interval 2, the vehicle is in the normal braking interval: the distance to the preceding vehicle is below the safe distance, but only by a small margin. The control target of the vehicle 401 may be set to 30 ≤ Vr ≤ 60 and 60 ≤ Ve ≤ 100, and the vehicle speed may be reduced appropriately so that it falls within the range of the control target.
If the distance between the preceding vehicle 402 and the vehicle 401 is within the distance range of interval 3, the vehicle is in the comfortable speed-regulation interval, and the distance to the preceding vehicle is not less than the safe distance. In this scenario, the vehicle speed may be adjusted according to the actual situation, and the control target may be set to 60 ≤ Vr ≤ 100 and 100 ≤ Ve ≤ 125, so that the vehicle can maintain fast and safe driving.
If the distance between the obstacle 402 and the vehicle 401 is within the distance range of interval 4, the vehicle is in the pre-judgment interval; the distance to the preceding vehicle is not less than the safe distance, and the vehicle is in a safe position. In this scenario, the vehicle speed may be adjusted according to the actual situation, and the control target may be set to 100 ≤ Vr ≤ 150 and 125 ≤ Ve ≤ 150, so that the vehicle can maintain fast and safe driving. Furthermore, an obstacle whose distance from the vehicle is within this range can be avoided in advance, further improving the driving safety of the vehicle.
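The per-interval control targets above can be encoded as a small table, with the ego speed clamped into the target band of whichever interval the obstacle currently falls in. This is a sketch of the rule described in the text; the clamping policy itself is an illustrative assumption.

```python
# Control targets (km/h) per interval, from the example above:
# (Vr_min, Vr_max, Ve_min, Ve_max).
CONTROL_TARGETS = {
    1: (0, 30, 0, 60),
    2: (30, 60, 60, 100),
    3: (60, 100, 100, 125),
    4: (100, 150, 125, 150),
}

def speed_command(zone, ve):
    """Clamp the ego speed Ve into the target band of the current interval
    and return the adjusted target speed (km/h)."""
    _, _, ve_min, ve_max = CONTROL_TARGETS[zone]
    return max(ve_min, min(ve, ve_max))

print(speed_command(2, 110))  # too fast for interval 2 -> decelerate to 100
print(speed_command(3, 110))  # already within the interval-3 band -> hold 110
```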
Of course, instead of selecting a control target with reference to Vr and Ve, a driving decision to accelerate or decelerate may be selected directly according to the distance between the vehicle and the obstacle. For example, if the vehicle is too close to the obstacle, it is controlled to decelerate so as to increase the distance; if the vehicle is too far from the obstacle, it is controlled to hold its speed or accelerate so as to maintain or reduce the distance. This may be adjusted according to the actual application scenario; the above is merely an exemplary illustration and is not a limitation.
For ease of understanding, among the four intervals shown in FIG. 4A, interval 1 and interval 2 are the range intervals corresponding to deterministic perception information, that is, perception information whose confidence is higher than a threshold, while interval 3 and interval 4 are the range intervals corresponding to perception information whose confidence is not higher than the threshold. When the distance between the vehicle and the obstacle is confirmed, according to the confidence or the distance carried in the perception information, to be in interval 1 or interval 2, the existing manner of planning a driving path from selected perception information can be used, which improves the compatibility of the path-planning manner in the driving decision selection method provided by the present application. When the distance is confirmed to be in interval 3 or interval 4, the driving decision of the vehicle can be selected according to the speed of the vehicle and/or the relative speed between the vehicle and the obstacle, so that an obstacle at a greater distance can be responded to in advance, for example by avoiding it or decelerating early, improving the driving safety of the vehicle, making the driving process smoother, and improving the user experience.
For ease of understanding, the process of selecting a driving decision of the vehicle is explained in more detail below.
Taking the selection of a control target with reference to Vr and Ve as an example: after it is confirmed that the distance between the vehicle and the obstacle is within the distance range of a certain range interval, the speed control target at the boundary of that range interval may be set and then used as the speed control target of the vehicle.
For example, as shown in FIG. 5, taking one of the range intervals, interval i, as an example: the obstacle 402 is sensed at a distance within interval i, that is, the obstacle is present in the range of interval i sensed by the vehicle. The obstacle is taken as a preceding vehicle here as an example.
If the relative distance between the vehicle and the preceding vehicle keeps decreasing toward interval i-1, Vr1 = 60 km/h and Ve1 = 100 km/h are taken as the boundary vehicle speeds, and deceleration motion planning is performed with these boundary speeds as the control targets to generate the driving path of the vehicle; the vehicle may continue to drive in the current lane, for example keeping straight. When the relative distance decreases to the boundary of interval i, the control targets are Vr1 = 60 km/h and Ve1 = 100 km/h: if the vehicle speed is greater than 100 km/h and the relative speed to the preceding vehicle is greater than 60 km/h, the vehicle speed is reduced, that is, the driving decision is deceleration, so that the vehicle speed is not greater than 100 km/h and the relative speed is not greater than 60 km/h.
If the relative distance between the vehicle and the preceding vehicle increases toward interval i+1, the moving track and vehicle speed of the vehicle are controlled with Vr2 = 100 km/h and Ve2 = 125 km/h as the boundary vehicle speeds. When the relative distance increases to the boundary of interval i, the control targets are a vehicle speed Ve2 of 125 km/h and a relative speed Vr2 of 100 km/h.
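The boundary-target rule of the two preceding paragraphs can be sketched compactly: a shrinking gap toward interval i-1 selects the inner-boundary speeds (Vr1, Ve1), a growing gap toward interval i+1 selects the outer-boundary speeds (Vr2, Ve2). The function shape and default values are illustrative, taken from the example figures above.

```python
def boundary_targets(gap_trend, inner=(60.0, 100.0), outer=(100.0, 125.0)):
    """Return the speed control targets (km/h) for interval i: inner-boundary
    speeds (Vr1, Ve1) when the gap to the preceding vehicle is shrinking,
    outer-boundary speeds (Vr2, Ve2) when it is growing."""
    if gap_trend == "shrinking":
        return {"Vr_max": inner[0], "Ve_max": inner[1]}
    if gap_trend == "growing":
        return {"Vr_max": outer[0], "Ve_max": outer[1]}
    raise ValueError("gap_trend must be 'shrinking' or 'growing'")

print(boundary_targets("shrinking"))  # {'Vr_max': 60.0, 'Ve_max': 100.0}
```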
Therefore, in the embodiment of the present application, the range intervals are divided according to the confidence and the distance of the object detected by the sensor, covering a larger range; compared with planning the driving path using only perception information with high confidence, the method provided by the present application is equivalent to increasing the perception range. A driving decision corresponding to each range interval is also set. The range interval corresponding to the relative distance between the vehicle and the obstacle can then be obtained, and the corresponding driving decision selected according to that interval, so that the vehicle can select a driving decision and plan a driving path for an obstacle at a greater distance. For example, the vehicle can decelerate in advance to avoid a collision, improving driving safety, and smoother acceleration or deceleration performed in advance improves the user experience.
In some scenarios, the driving decisions set for each range interval may also differ for different types of obstacles. For example, if the obstacle is a moving object, the driving decision can be made as described with respect to FIG. 4A. For another example, if the obstacle is a stationary object, then as the vehicle gradually approaches it, the driving decisions set for each range interval include only deceleration or lane changing, and not acceleration, so as to improve the driving safety of the vehicle.
In addition to planning the path of a vehicle while driving, the driving decision selection method provided by the present application can also be applied to parking scenarios. For example, a parking scenario may be as shown in FIG. 6A, where a vehicle 601 is located in a parking lot having one or more occupied parking spaces 602 and one or more available parking spaces 603. After automatic parking is initiated, a display screen in the dashboard of the vehicle may indicate that the vehicle is currently in the automatic parking mode, and the available parking spaces 603 may be displayed in the interactive display interface 1000 of the vehicle for the user to select, referring to FIG. 6B. The interface displayed by the interactive display interface 1000 may be as shown in FIG. 6C, where the user may select one parking space 6031 among the plurality of parking spaces 603. Subsequently, as shown in FIG. 6D, in the scenario of automatically parking into the parking space, the driving decision corresponding to the range interval in which the distance between the vehicle and an obstacle lies may be selected according to the distance and the confidence of the obstacle detected by the sensor, so as to plan a driving path 604 to the parking space 6031 for the vehicle 601. For the specific manner of planning the driving path, reference may be made to the description of FIG. 3C, which is not repeated here.
The foregoing describes in detail the flow of the driving decision selection method provided in the present application, and the following explains the apparatus provided in the present application with reference to the method embodiments corresponding to fig. 2 to 6D.
Referring to fig. 7, a schematic structural diagram of a driving decision selection device provided in the present application is shown. The driving decision selection device is configured to perform the steps of the method described above with reference to fig. 2-6D.
The driving decision selection means may include:
the sensing module 701 is used for acquiring sensing information acquired by a sensor configured on a vehicle;
a decision module 702, configured to select a first range interval matching the sensing information from at least one range interval, where the at least one range interval is obtained by dividing the detection range of the sensor based on the output information of the sensor, the output information includes at least one of a distance result of a detected target object, a confidence corresponding to the distance result, or the relationship between the distance result and its corresponding confidence, and each range interval in the at least one range interval corresponds to at least one driving decision;
the decision module 702 is further configured to select a driving decision of the vehicle from at least one driving decision corresponding to the first range interval according to the vehicle speed of the vehicle;
and the control module 703 is configured to control the vehicle to drive according to the driving decision of the vehicle.
In a possible implementation, the decision module 702 is specifically configured to select a first range section matching the first distance from the at least one range section, where the first distance is within a distance range included in the first range section.
In a possible implementation manner, each range interval further includes a confidence range, the confidence range corresponding to at least one range interval covers the confidence of the information detected by the sensor in the detection range, the sensing information further includes a first confidence, and the first confidence is used for indicating the accuracy of the first distance;
the decision module 702 is specifically configured to select the first range interval matching the first confidence from the at least one range interval.
In one possible embodiment, the driving decision selection device provided by the present application may further include: a dividing module 705, configured to divide a detection range of the sensor to obtain at least one distance range before the sensing module obtains sensing information, where the at least one distance range corresponds to the at least one range interval one to one.
In a possible implementation, the dividing module 705 is specifically configured to: before the sensing module 701 acquires the sensing information, acquire at least one confidence range, where the confidence ranges do not overlap one another and together cover the confidence of the information detected by the sensor within the detection range; and calculate, according to the at least one confidence range and the relationship between the distance result and its corresponding confidence, at least one distance range in one-to-one correspondence with the at least one confidence range, where each range interval corresponds to one confidence range and one distance range.
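Computing a distance range from a confidence range amounts to inverting the distance-to-confidence relation. The sketch below assumes an illustrative monotone-decreasing confidence curve (not the patent's actual relation) and inverts it by bisection; all names are hypothetical.

```python
def confidence(distance_m):
    """Illustrative monotone-decreasing confidence model: near-certain close
    to the sensor, falling linearly to zero at 250 m (an assumption)."""
    return max(0.0, 1.0 - distance_m / 250.0)

def distance_for_confidence(conf, lo=0.0, hi=250.0, tol=1e-6):
    """Invert the confidence curve by bisection; valid while the curve is
    strictly decreasing on [lo, hi]."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if confidence(mid) > conf:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# The confidence range [0.6, 0.8] maps to a distance range:
d_near = distance_for_confidence(0.8)
d_far = distance_for_confidence(0.6)
print(round(d_near), round(d_far))  # 50 100
```

Applying this to each confidence-range boundary yields the one-to-one distance ranges that the dividing module pairs with each range interval.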
In a possible implementation, the dividing module 705 is further configured to: determine, according to the relationship between the distance result and its corresponding confidence, at least one confidence range in one-to-one correspondence with the at least one distance range, where the at least one confidence range is used to screen the range interval matching the perception information from the at least one range interval.
In one possible embodiment, the sensing module 701 is further configured to: acquiring historical distance information and corresponding confidence of historical distance information acquired by a sensor; and acquiring the relation between the distance result and the confidence corresponding to the distance result according to the historical distance information and the corresponding confidence.
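One simple way to obtain the distance/confidence relation from historical sensor data is a least-squares line fit over logged (distance, confidence) pairs. The linear model form and the synthetic history below are assumptions for illustration; the patent does not prescribe a model.

```python
def fit_confidence_model(distances, confidences):
    """Least-squares line conf ≈ a*distance + b from logged (distance,
    confidence) pairs."""
    n = len(distances)
    mean_d = sum(distances) / n
    mean_c = sum(confidences) / n
    cov = sum((d - mean_d) * (c - mean_c) for d, c in zip(distances, confidences))
    var = sum((d - mean_d) ** 2 for d in distances)
    a = cov / var
    b = mean_c - a * mean_d
    return a, b

# Synthetic history: confidence drops roughly linearly with distance.
hist_d = [10, 50, 100, 150, 200]
hist_c = [0.99, 0.95, 0.90, 0.85, 0.80]
a, b = fit_confidence_model(hist_d, hist_c)
print(round(a * 120 + b, 3))  # predicted confidence at 120 m
```

The fitted relation can then be used both to label fresh distance results with a confidence and to recompute the range-interval boundaries as the sensor's behavior drifts.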
In a possible implementation, the decision module 702 is specifically configured to: calculating the relative speed of the vehicle and the obstacle according to the vehicle speed of the vehicle; and selecting the driving decision of the vehicle from at least one driving decision corresponding to the first range interval by combining the relative speed.
In one possible embodiment, the control module 703 may control the vehicle to travel according to the driving route by a vehicle actuator 704 of the vehicle. The vehicle actuator 704 may include one or more of the modules described above with respect to the travel system 102 or the control system 106 of FIG. 1.
In one possible embodiment, the at least one driving decision for each range interval is determined according to application scenarios including, but not limited to: automatic cruising, car following or automatic parking and the like.
Referring to fig. 8, a schematic structural diagram of another driving decision selection device provided in the present application is shown. The driving decision selection device is configured to perform the steps of the method described above with reference to fig. 2-6D.
A dividing module 801, configured to obtain at least one range interval, where each range interval in the at least one range interval has a corresponding confidence level range and a corresponding distance range, a distance range included in the at least one range interval covers a detection range of a sensor, and at least one confidence level range covers a confidence level of information detected by the sensor in the detection range;
the decision module 802 is configured to set at least one driving decision corresponding to each range interval, where each range interval and the at least one driving decision corresponding to each range interval are used to select a driving decision of a vehicle, and the driving decision of the vehicle is used to generate a driving path of the vehicle.
In a possible implementation, the dividing module 801 is specifically configured to:
the method comprises the steps of dividing a detection range of a sensor based on the distance between a vehicle and an object detected by the sensor to obtain at least one distance range, and calculating a confidence degree range corresponding to each distance degree range according to the relationship between a distance result included in output information of the sensor and a confidence degree corresponding to the distance result, wherein each range interval corresponds to one confidence degree range and one distance range, and each range interval comprises one confidence degree range and one distance range.
In a possible implementation, the dividing module 801 is specifically configured to:
the method comprises the steps of dividing a range of the confidence degrees which can be detected by a sensor to obtain at least one confidence degree range, calculating a distance range corresponding to each confidence degree range according to the relationship between a distance result included in output information of the sensor and the confidence degree corresponding to the distance result, wherein each range interval corresponds to one confidence degree range and one distance range.
In one possible embodiment, the driving decision selection device may further include: a control module 805 and a perception module 804;
the sensing module 804 is configured to obtain sensing information, where the sensing information includes information of an obstacle, such as a first distance between a vehicle and the obstacle;
and the control module 805 is configured to select a driving decision of the vehicle from at least one driving decision in the first range interval according to the sensing information, and control the vehicle to drive according to the driving decision.
In a possible implementation, the sensing information includes a distance between the obstacle and the vehicle, and the control module 805 is specifically configured to:
determine that the confidence included in the perception information is within the confidence range included in the first range interval, and select the driving decision of the vehicle from the at least one driving decision of the first range interval in combination with the vehicle speed of the vehicle.

In one possible implementation, the control module 805 is further configured to:
calculate, based on the speed of the vehicle, the relative speed of the vehicle with respect to the obstacle, and then select, in combination with the relative speed, the driving decision of the vehicle from the at least one driving decision of the first range interval.

In one possible embodiment, the perception information further comprises the speed of the vehicle and/or the relative speed between the vehicle and the obstacle;
the control module 805 is specifically configured to select a driving decision of the vehicle from the at least one driving decision of the first range interval in combination with the speed of the vehicle and/or the relative speed of the vehicle and the obstacle.
In one possible implementation, the confidence level included in the perception information is related to the distance of the sensor from the obstacle.
In a possible implementation, the driving decision selecting device may further include an obtaining module 803, specifically configured to:
acquiring historical distance information and corresponding confidence of historical distance information acquired by a sensor;
and acquiring the relation between the distance result included in the output information of the sensor and the confidence corresponding to the distance result according to the historical distance information and the corresponding confidence.
Optionally, the vehicle actuator 806 is similar to the vehicle actuator 704 described above and will not be described in detail herein.
Referring to fig. 9, a schematic structural diagram of another driving decision selection device provided in the present application is as follows.
The driving decision selection means may comprise a processor 901 and a memory 902. The processor 901 and the memory 902 are interconnected by a line. Wherein program instructions and data are stored in memory 902.
The memory 902 stores program instructions and data corresponding to the steps of fig. 2-6D described above.
The processor 901 is configured to execute the method steps performed by the driving decision selection device according to any one of the embodiments of fig. 2 to 6D.
Optionally, the driving decision selection device may further comprise a transceiver 903 for receiving or transmitting data.
Also provided in an embodiment of the present application is a computer-readable storage medium having stored therein a program for generating a vehicle travel speed, which when run on a computer causes the computer to perform the steps of the method as described in the embodiments shown in fig. 2-6D above.
Alternatively, the aforementioned driving decision selection means shown in fig. 9 is a chip.
The embodiment of the present application further provides a driving decision selecting device, which may also be referred to as a digital processing chip or a chip, where the chip includes a processing unit and a communication interface, the processing unit obtains a program instruction through the communication interface, the program instruction is executed by the processing unit, and the processing unit is configured to execute the method steps executed by the driving decision selecting device shown in any one of the foregoing embodiments in fig. 2 to 6D.
The embodiment of the application also provides a digital processing chip. The digital processing chip has integrated therein circuitry and one or more interfaces for implementing the processor 901 described above, or the functions of the processor 901. When integrated with memory, the digital processing chip may perform the method steps of any one or more of the preceding embodiments. When the digital processing chip is not integrated with the memory, the digital processing chip can be connected with the external memory through the communication interface. The digital processing chip implements the actions performed by the driving decision selection device in the above embodiments according to program codes stored in an external memory.
Embodiments of the present application also provide a computer program product which, when running on a computer, causes the computer to execute the steps performed by the driving decision selection device in the method described in the embodiments of FIGS. 2 to 6D.
The driving decision selection device provided by the embodiment of the application can be a chip, and the chip comprises: a processing unit, which may be for example a processor, and a communication unit, which may be for example an input/output interface, a pin or a circuit, etc. The processing unit may execute computer-executable instructions stored by the memory unit to cause the chip to perform the driving decision selection method described in the embodiments of fig. 2-6D above. Optionally, the storage unit is a storage unit in the chip, such as a register, a cache, and the like, and the storage unit may also be a storage unit located outside the chip in the wireless access device, such as a read-only memory (ROM) or another type of static storage device that can store static information and instructions, a Random Access Memory (RAM), and the like.
Specifically, the aforementioned processing unit or processor may be a central processing unit (CPU), a network processor (NPU), a graphics processing unit (GPU), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor or any conventional processor.
Referring to fig. 10, fig. 10 is a schematic structural diagram of a chip according to an embodiment of the present disclosure. The chip may be implemented as a neural network processor, NPU 100. The NPU 100 is mounted on a main CPU (host CPU) as a coprocessor, and the host CPU allocates tasks to it. The core part of the NPU is the arithmetic circuit 1003, and the controller 1004 controls the arithmetic circuit 1003 to extract matrix data from memory and perform multiplication.
In some implementations, the arithmetic circuit 1003 internally includes a plurality of processing elements (PEs). In some implementations, the arithmetic circuit 1003 is a two-dimensional systolic array. The arithmetic circuit 1003 may also be a one-dimensional systolic array or another electronic circuit capable of performing mathematical operations such as multiplication and addition. In some implementations, the arithmetic circuit 1003 is a general-purpose matrix processor.
For example, assume that there is an input matrix A, a weight matrix B, and an output matrix C. The arithmetic circuit fetches the data corresponding to the matrix B from the weight memory 1002 and buffers it in each PE of the arithmetic circuit. The arithmetic circuit takes the matrix A data from the input memory 1001, performs a matrix operation with the matrix B, and stores a partial result or the final result of the resulting matrix in the accumulator (accumulator) 1008.
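As a rough illustration of the accumulate-style matrix multiplication described above, the following sketch sums partial products tile by tile into an accumulator. The function name, tile size, and plain-Python list representation are illustrative assumptions for exposition, not part of the patent.

```python
def matmul_accumulate(a, b, tile=2):
    # a: m x k, b: k x n, both as nested lists of numbers.
    m, k, n = len(a), len(b), len(b[0])
    acc = [[0.0] * n for _ in range(m)]   # plays the role of the accumulator 1008
    for start in range(0, k, tile):       # each tile contributes one partial result
        end = min(start + tile, k)
        for i in range(m):
            for j in range(n):
                acc[i][j] += sum(a[i][p] * b[p][j] for p in range(start, end))
    return acc

a = [[1, 2], [3, 4]]
b = [[5, 6], [7, 8]]
print(matmul_accumulate(a, b))  # [[19.0, 22.0], [43.0, 50.0]]
```

Whether one tile or many are used, the accumulated result equals the full product A x B; only the order of partial summation changes.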
The unified memory 1006 is used for storing input data and output data. Weight data is transferred directly to the weight memory 1002 through the direct memory access controller (DMAC) 1005. Input data is likewise carried into the unified memory 1006 by the DMAC.
The bus interface unit (BIU) 1010 is used for the interaction of the AXI bus with the DMAC and the instruction fetch buffer (IFB) 1009. The BIU 1010 is also used by the instruction fetch buffer 1009 to acquire instructions from the external memory, and by the storage unit access controller 1005 to acquire the original data of the input matrix A or the weight matrix B from the external memory.
The DMAC is mainly used to transfer input data from the external memory (DDR) to the unified memory 1006, to transfer weight data into the weight memory 1002, or to transfer input data into the input memory 1001.
The vector calculation unit 1007 includes a plurality of operation processing units and, where necessary, further processes the output of the arithmetic circuit, for example by vector multiplication, vector addition, exponential operation, logarithmic operation, or magnitude comparison. It is mainly used for non-convolutional/fully-connected-layer computation in the neural network, such as batch normalization (batch normalization), pixel-level summation, and up-sampling of a feature plane.
In some implementations, the vector calculation unit 1007 can store the processed output vector in the unified memory 1006. For example, the vector calculation unit 1007 may apply a linear function and/or a nonlinear function to the output of the arithmetic circuit 1003, such as linear interpolation of the feature planes extracted by the convolutional layers, or, for another example, a nonlinear function applied to a vector of accumulated values to generate activation values. In some implementations, the vector calculation unit 1007 generates normalized values, pixel-level summed values, or both. In some implementations, the processed output vector can be used as an activation input to the arithmetic circuit 1003, for example for use in a subsequent layer of the neural network.
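A minimal sketch of the kind of post-processing the vector calculation unit 1007 is described as performing on accumulator output: optional normalization followed by a linear or nonlinear activation. The function name, parameters, and the specific normalization formula are illustrative assumptions; the patent does not specify this interface.

```python
import math

def vector_postprocess(acc_row, activation="relu", normalize=False):
    # acc_row: one row of accumulated values from the matrix unit.
    out = list(acc_row)
    if normalize:
        # Crude batch-norm-like step: zero mean, unit variance.
        mean = sum(out) / len(out)
        var = sum((x - mean) ** 2 for x in out) / len(out)
        out = [(x - mean) / math.sqrt(var + 1e-5) for x in out]
    if activation == "relu":
        out = [max(0.0, x) for x in out]
    elif activation == "sigmoid":
        out = [1.0 / (1.0 + math.exp(-x)) for x in out]
    return out  # may feed a subsequent layer as activation input

print(vector_postprocess([-1.0, 0.5, 2.0]))  # [0.0, 0.5, 2.0]
```

In hardware this step runs element-wise over the accumulator output, which is why it sits in a separate vector unit rather than in the systolic matrix array.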
The instruction fetch buffer 1009, connected to the controller 1004, is used for storing instructions used by the controller 1004.
The unified memory 1006, the input memory 1001, the weight memory 1002, and the instruction fetch buffer 1009 are all on-chip memories. The external memory is private to the NPU hardware architecture.
The operation of each layer in the recurrent neural network may be performed by the operation circuit 1003 or the vector calculation unit 1007.
Wherein any of the aforementioned processors may be a general purpose central processing unit, a microprocessor, an ASIC, or one or more integrated circuits configured to control the execution of the programs of the methods of fig. 2-6D.
It should be noted that the above-described apparatus embodiments are merely illustrative. The units described as separate parts may or may not be physically separate, and the parts shown as units may or may not be physical units; they may be located in one place or distributed across multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. In addition, in the drawings of the apparatus embodiments provided in the present application, the connection relationship between modules indicates that there is a communication connection between them, which may be implemented as one or more communication buses or signal lines.
Through the above description of the embodiments, those skilled in the art will clearly understand that the present application can be implemented by software plus the necessary general-purpose hardware, and certainly also by dedicated hardware, including application-specific integrated circuits, dedicated CPUs, dedicated memories, dedicated components, and the like. Generally, any function performed by a computer program can be implemented by corresponding hardware, and the specific hardware structures used to implement the same function may vary: analog circuits, digital circuits, or dedicated circuits. For the present application, however, a software implementation is in most cases preferable. Based on such an understanding, the technical solutions of the present application may be embodied in the form of a software product stored in a readable storage medium, such as a floppy disk, a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc, and include instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute the methods described in the embodiments of the present application.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, the embodiments may be implemented wholly or partially in the form of a computer program product.
The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are produced, in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired means (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless means (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that a computer can access, or a data storage device, such as a server or data center, integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., a solid state disk (SSD)), among others.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims of the present application and in the drawings described above, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It will be appreciated that the data so used may be interchanged under appropriate circumstances such that the embodiments described herein may be practiced otherwise than as specifically illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.

Claims (20)

1. A driving decision selection method, comprising:
acquiring perception information acquired by a sensor configured on a vehicle;
selecting a first range interval matched with the perception information from at least one range interval, wherein the at least one range interval is obtained by dividing the detection range of the sensor based on output information of the sensor, the output information of the sensor comprises at least one of a distance result for a detected target object, a confidence corresponding to the distance result, or a relationship between the distance result and the confidence corresponding to the distance result, and each range interval in the at least one range interval corresponds to at least one driving decision;
and selecting a driving decision of the vehicle from the at least one driving decision corresponding to the first range interval in combination with the vehicle speed of the vehicle, and controlling the vehicle to travel according to the driving decision of the vehicle.
2. The method according to claim 1, wherein each range section corresponds to a distance range, the perception information comprises a first distance, and the first distance is a distance between the vehicle and an obstacle,
the selecting a first range interval matched with the perception information from at least one range interval comprises:
selecting the first range interval matched with the first distance from the at least one range interval.
3. The method according to claim 1, wherein each range interval corresponds to a confidence range, and the perception information includes a first confidence, and the first confidence is used for indicating the accuracy of the distance between the vehicle and the obstacle included in the perception information;
the selecting a first range interval matched with the perception information from at least one range interval comprises:
selecting the first range interval matched with the first confidence degree from the at least one range interval, wherein the first confidence degree is in a confidence degree range included in the first range interval.
4. The method according to any of claims 1-3, wherein prior to obtaining the perception information, the method further comprises:
and dividing the detection range of the sensor to obtain at least one distance range, wherein the at least one distance range corresponds to the at least one range interval one to one.
5. The method of claim 4, wherein the dividing the detection range of the sensor into at least one distance range comprises:
obtaining at least one confidence range, wherein confidence ranges in the at least one confidence range do not overlap one another, and the at least one confidence range covers the confidence of information detected by the sensor within the detection range;
and dividing the detection range of the sensor into at least one distance range according to the at least one confidence range and the relationship between the distance result and the confidence corresponding to the distance result.
6. The method of claim 4, wherein after said dividing the detection range of the sensor into at least one distance range, the method further comprises:
and determining at least one confidence range corresponding one-to-one to the at least one distance range according to the relationship between the distance result and the confidence corresponding to the distance result, wherein the at least one confidence range is used for screening, from the at least one range interval, the range interval matched with the perception information.
7. The method according to any one of claims 1-3, further comprising:
acquiring historical distance information and corresponding confidence degree acquired by the sensor;
and acquiring the relation between the distance result and the confidence corresponding to the distance result according to the historical distance information and the corresponding confidence.
8. A method according to any one of claims 1-3, wherein said selecting a driving decision for the vehicle from at least one driving decision for the first range interval in combination with the vehicle speed of the vehicle comprises:
calculating the relative speed of the vehicle and an obstacle according to the vehicle speed of the vehicle;
and selecting a driving decision of the vehicle from at least one driving decision corresponding to the first range interval by combining the relative speed.
9. The method according to any one of claims 1-3, wherein the at least one driving decision for each range interval is determined according to an application scenario comprising: automatic cruising, car following or automatic parking.
10. A driving decision selection device, comprising:
the sensing module is used for acquiring sensing information acquired by a sensor configured on a vehicle;
a decision module, configured to select a first range interval that matches the perception information from at least one range interval, where the at least one range interval is obtained by dividing the detection range of the sensor based on output information of the sensor, the output information of the sensor comprises at least one of a distance result for a detected target object, a confidence corresponding to the distance result, or a relationship between the distance result and the confidence corresponding to the distance result, and each range interval in the at least one range interval corresponds to at least one driving decision;
the decision module is further configured to select a driving decision of the vehicle from at least one driving decision corresponding to the first range interval in combination with the vehicle speed of the vehicle;
and the control module is used for controlling the vehicle to run according to the running decision of the vehicle.
11. The apparatus according to claim 10, wherein each range section corresponds to a distance range, the perception information further includes a first distance, and the first distance is a distance between the vehicle and an obstacle;
the decision module is specifically configured to select the first range interval that matches the first distance from the at least one range interval, where the first distance is within a distance range included in the first range interval.
12. The apparatus according to claim 10, wherein each range interval corresponds to a confidence range, and the perception information includes a first confidence, and the first confidence is used to indicate an accuracy degree of a distance between the vehicle and an obstacle included in the perception information;
the decision module is specifically configured to select, based on the first confidence, the first range interval corresponding to the first confidence from the at least one range interval.
13. The apparatus according to any one of claims 10-12, further comprising:
the dividing module is used for dividing the detection range of the sensor to obtain at least one distance range before the perception module acquires the perception information, and the at least one distance range is in one-to-one correspondence with the at least one range interval.
14. The apparatus of claim 13, wherein each range bin comprises a confidence range, the apparatus further comprising: a partitioning module specifically configured to:
obtaining at least one confidence range, wherein confidence ranges in the at least one confidence range do not overlap one another, and the at least one confidence range covers the confidence of information detected by the sensor within the detection range;
and obtaining at least one distance range corresponding to the at least one confidence degree range one by one according to the at least one confidence degree range and the relationship between the distance result and the confidence degree corresponding to the distance result, wherein the at least one confidence degree range and the at least one distance range form the at least one range interval.
15. The apparatus according to claim 13, wherein the partitioning module is further configured to determine at least one confidence range corresponding one-to-one to the at least one distance range according to the relationship between the distance result and the confidence corresponding to the distance result, where the at least one confidence range is used to screen, from the at least one range interval, the range interval that matches the perception information.
16. The apparatus according to any one of claims 10-12, wherein the sensing module is further configured to:
acquiring historical distance information and corresponding confidence degree acquired by the sensor;
and acquiring the relation between the distance result and the confidence corresponding to the distance result according to the historical distance information and the corresponding confidence.
17. The apparatus according to any one of claims 10-12, wherein the decision module is specifically configured to:
calculating the relative speed of the vehicle and an obstacle according to the vehicle speed of the vehicle;
and selecting a driving decision of the vehicle from at least one driving decision corresponding to the first range interval by combining the relative speed.
18. The apparatus of any of claims 10-12, wherein the at least one driving decision for each range interval is determined according to an application scenario comprising: automatic cruising, car following or automatic parking.
19. A driving decision selection device comprising a processor;
the processor obtains program instructions through a communication interface, and the program instructions, when executed by the processor, implement the method of any one of claims 1-9; or,
a memory is coupled to the processor, the memory storing a program that, when executed by the processor, implements the method of any one of claims 1 to 9.
20. A computer-readable storage medium comprising a program which, when executed by a processing unit, performs the method of any of claims 1 to 9.
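The selection flow claimed above (match the perception information to a range interval obtained by dividing the sensor's detection range, then pick a driving decision in combination with vehicle speed) can be sketched as follows. All interval boundaries, confidence thresholds, and decision names are invented for illustration; the claims do not fix any concrete values.

```python
# Hypothetical partition of a sensor's detection range: each entry pairs a
# distance range (metres), a confidence range, and decisions keyed by
# whether the vehicle is closing on the obstacle.
RANGE_INTERVALS = [
    ((0.0, 30.0),   (0.9, 1.0), {"closing": "brake",  "steady": "follow"}),
    ((30.0, 80.0),  (0.6, 0.9), {"closing": "slow",   "steady": "cruise"}),
    ((80.0, 200.0), (0.0, 0.6), {"closing": "cruise", "steady": "cruise"}),
]

def select_decision(distance, confidence, ego_speed, obstacle_speed):
    # Step 1: select the first range interval matching the perception info.
    for (d_lo, d_hi), (c_lo, c_hi), decisions in RANGE_INTERVALS:
        if d_lo <= distance < d_hi and c_lo <= confidence <= c_hi:
            break
    else:
        return "fallback"  # no interval matched the perception information
    # Step 2: combine with vehicle speed, here via the relative speed.
    relative = ego_speed - obstacle_speed
    return decisions["closing" if relative > 0 else "steady"]

print(select_decision(distance=25.0, confidence=0.95,
                      ego_speed=20.0, obstacle_speed=10.0))  # brake
```

The point of the two-step structure is that interval matching is decided once per perception update, while the speed comparison only chooses among the decisions that interval already admits.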
CN202080004262.9A 2020-07-21 2020-07-21 Driving decision selection method and device Active CN112512887B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/103181 WO2022016351A1 (en) 2020-07-21 2020-07-21 Method and apparatus for selecting driving decision

Publications (2)

Publication Number Publication Date
CN112512887A CN112512887A (en) 2021-03-16
CN112512887B true CN112512887B (en) 2021-11-30

Family

ID=74953137

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080004262.9A Active CN112512887B (en) 2020-07-21 2020-07-21 Driving decision selection method and device

Country Status (2)

Country Link
CN (1) CN112512887B (en)
WO (1) WO2022016351A1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113799799A (en) * 2021-09-30 2021-12-17 中国第一汽车股份有限公司 Security compensation method and device, storage medium and electronic equipment
CN114655131B (en) * 2022-03-29 2023-10-13 东风汽车集团股份有限公司 Vehicle-mounted sensing sensor adjustment method, device, equipment and readable storage medium
CN114997359A (en) * 2022-05-17 2022-09-02 哈尔滨工业大学 Complete set of technical equipment for embankment dangerous case patrol based on bionic machine dog
CN114771576A (en) * 2022-05-19 2022-07-22 北京百度网讯科技有限公司 Behavior data processing method, control method of automatic driving vehicle and automatic driving vehicle
CN114872735B (en) * 2022-07-10 2022-10-04 成都工业职业技术学院 Neural network algorithm-based decision-making method and device for automatically-driven logistics vehicles
CN117196266B (en) * 2023-11-07 2024-01-23 成都工业职业技术学院 Unmanned shared automobile area scheduling method and device based on neural network

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008265706A (en) * 2007-04-25 2008-11-06 Nissan Motor Co Ltd Vehicle traveling control device and vehicle traveling control method
US9367065B2 (en) * 2013-01-25 2016-06-14 Google Inc. Modifying behavior of autonomous vehicles based on sensor blind spots and limitations
CN107067753B (en) * 2017-05-23 2020-01-07 东南大学 Automatic following driving method based on driving safety distance
CN108860148B (en) * 2018-06-13 2019-11-08 吉林大学 Self-adapting cruise control method based on driver's follow the bus characteristic Safety distance model
JP7027279B2 (en) * 2018-08-07 2022-03-01 本田技研工業株式会社 Vehicle control devices, vehicle control methods, and programs
CN110949383B (en) * 2018-09-26 2021-03-30 广州汽车集团股份有限公司 Control method and device for following driving of automatic driving vehicle
JP7065410B2 (en) * 2018-09-28 2022-05-12 パナソニックIpマネジメント株式会社 Empty parking space detection device and empty parking space detection method
CN110843779B (en) * 2019-10-16 2021-08-13 华为技术有限公司 Method and device for controlling vehicle running
CN111324115B (en) * 2020-01-23 2023-09-19 北京百度网讯科技有限公司 Obstacle position detection fusion method, obstacle position detection fusion device, electronic equipment and storage medium

Also Published As

Publication number Publication date
WO2022016351A1 (en) 2022-01-27
CN112512887A (en) 2021-03-16


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant