CN115993821A - Decision-making method, device and equipment for automatic driving vehicle and automatic driving vehicle - Google Patents
- Publication number
- CN115993821A (application CN202211567745.7A)
- Authority
- CN
- China
- Prior art keywords
- vehicle
- target obstacle
- decision
- obstacle
- driving behavior
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Traffic Control Systems (AREA)
Abstract
The disclosure provides a decision-making method, apparatus, and device for an autonomous vehicle, and an autonomous vehicle, relating to the field of computer technology and in particular to the field of autonomous driving. The implementation scheme is as follows: obtaining an obstacle detection result of the current frame, wherein the obstacle detection result comprises a first position of a target obstacle on the vehicle's driving path; acquiring a historical decision tag sequence of the target obstacle in response to the first distance between the target obstacle and the vehicle being less than a second threshold and the current frame not being the first frame in which the target obstacle is detected; and determining, based on the historical decision tag sequence, whether the target obstacle is used to determine the current driving behavior of the vehicle. This scheme can screen out low obstacles at close range, reduce frequent sudden braking, and ensure safe and smooth running of the vehicle.
Description
Technical Field
The present disclosure relates to the field of computer technology, in particular to the field of autonomous driving, and more particularly to a decision-making method and apparatus for an autonomous vehicle, an electronic device, a computer-readable storage medium, a computer program product, and an autonomous vehicle.
Background
Autonomous driving technology involves several aspects: environmental perception, behavioral decision-making, trajectory planning, and motion control. Relying on the cooperation of sensors, the vision computing system, and the positioning system, a vehicle with autonomous driving capability can operate automatically with little or no intervention from the driver. To ensure safe driving, the vehicle needs to detect obstacles in the surrounding environment and make driving decisions based on the detection results to determine subsequent driving behavior.
The approaches described in this section are not necessarily approaches that have been previously conceived or pursued. Unless otherwise indicated, it should not be assumed that any of the approaches described in this section qualify as prior art merely by virtue of their inclusion in this section. Similarly, the problems mentioned in this section should not be considered as having been recognized in any prior art unless otherwise indicated.
Disclosure of Invention
The present disclosure provides a decision making method and apparatus for an autonomous vehicle, an electronic device, a computer readable storage medium, a computer program product, and an autonomous vehicle.
According to an aspect of the present disclosure, there is provided a decision method of an autonomous vehicle, including: obtaining an obstacle detection result of a current frame, wherein the obstacle detection result comprises a first position of a target obstacle on a vehicle driving path, and the height of the target obstacle is smaller than a first threshold value; responsive to a first distance between the target obstacle and the vehicle being less than a second threshold and the current frame not being a first frame in which the target obstacle is detected, obtaining a historical decision tag sequence of the target obstacle, wherein the historical decision tag sequence includes at least one decision tag corresponding to at least one historical frame, respectively, each of the at least one decision tag indicating whether the target obstacle in the respective historical frame is used to determine driving behavior of the vehicle; and determining whether the target obstacle is used to determine the current driving behavior of the vehicle based on the historical decision tag sequence.
According to another aspect of the present disclosure, there is provided a decision making apparatus of an autonomous vehicle, comprising: a first acquisition module configured to acquire an obstacle detection result of a current frame, wherein the obstacle detection result includes a first position of a target obstacle on a vehicle travel path, a height of the target obstacle being less than a first threshold; a second acquisition module configured to acquire a historical decision tag sequence of the target obstacle in response to a first distance between the target obstacle and the vehicle being less than a second threshold and the current frame not being a first frame in which the target obstacle is detected, wherein the historical decision tag sequence includes at least one decision tag corresponding to at least one historical frame, respectively, each of the at least one decision tag indicating whether the target obstacle in the respective historical frame is used to determine a driving behavior of the vehicle; and a determination module configured to determine whether the target obstacle is used to determine a current driving behavior of the vehicle based on the historical decision tag sequence.
According to an aspect of the present disclosure, there is provided an electronic device including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the above-described decision method for an autonomous vehicle.
According to an aspect of the present disclosure, there is provided a non-transitory computer-readable storage medium storing computer instructions for causing a computer to execute the above-described decision method for an autonomous vehicle.
According to an aspect of the present disclosure, there is provided a computer program product comprising computer program instructions, wherein the computer program instructions, when executed by a processor, implement the above-described method of decision making for an autonomous vehicle.
According to an aspect of the present disclosure, there is provided an autonomous vehicle including the above-described electronic apparatus.
According to one or more embodiments of the present disclosure, safe and smooth running of the vehicle can be ensured.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the disclosure, nor is it intended to be used to limit the scope of the disclosure. Other features of the present disclosure will become apparent from the following specification.
Drawings
The accompanying drawings illustrate exemplary embodiments and, together with the description, serve to explain exemplary implementations of the embodiments. The illustrated embodiments are for exemplary purposes only and do not limit the scope of the claims. Throughout the drawings, identical reference numerals designate similar, but not necessarily identical, elements.
FIG. 1 illustrates a schematic diagram of an exemplary system in which various methods described herein may be implemented, in accordance with an embodiment of the present disclosure;
FIG. 2 illustrates a flow chart of a decision method of an autonomous vehicle according to an embodiment of the present disclosure;
FIG. 3 shows a block diagram of a decision making device of an autonomous vehicle according to an embodiment of the present disclosure; and
fig. 4 illustrates a block diagram of an exemplary electronic device that can be used to implement embodiments of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below in conjunction with the accompanying drawings, which include various details of the embodiments of the present disclosure to facilitate understanding, and should be considered as merely exemplary. Accordingly, one of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
In the present disclosure, the use of the terms "first," "second," and the like to describe various elements is not intended to limit the positional relationship, timing relationship, or importance relationship of the elements, unless otherwise indicated, and such terms are merely used to distinguish one element from another element. In some examples, a first element and a second element may refer to the same instance of the element, and in some cases, they may also refer to different instances based on the description of the context.
The terminology used in the description of the various illustrated examples in this disclosure is for the purpose of describing particular examples only and is not intended to be limiting. Unless the context clearly indicates otherwise, the elements may be one or more if the number of the elements is not specifically limited. Furthermore, the term "and/or" as used in this disclosure encompasses any and all possible combinations of the listed items.
In the field of autonomous driving, acquiring and processing information about the vehicle's surroundings and obstacles is the basis and precondition for realizing autonomous driving. The vehicle's perception system can detect obstacles in the surrounding environment and report the obstacle detection results to the decision system, so that the decision system can make reasonable driving decisions, such as braking, detouring, and the like, based on those results.
In the related art, when the perception system of a vehicle reports obstacle detection results to the decision system, it considers only whether an obstacle exists: as long as an obstacle is detected, it is reported to the decision system. This ignores the fact that some obstacles do not need to be avoided, or are false detections, causing the vehicle to brake suddenly or detour unreasonably and degrading the driving experience.
In view of the above problems, embodiments of the present disclosure provide a decision method for autonomous vehicles that can avoid unnecessary sudden braking and ensure safe and stable driving of the vehicle.
Embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings.
Fig. 1 illustrates a schematic diagram of an exemplary system 100 in which various methods and apparatus described herein may be implemented, in accordance with an embodiment of the present disclosure. Referring to fig. 1, the system 100 includes a motor vehicle 110, a server 120, and one or more communication networks 130 coupling the motor vehicle 110 to the server 120.
In an embodiment of the present disclosure, motor vehicle 110 may include electronics according to an embodiment of the present disclosure and/or be configured to perform a decision method of automatically driving a vehicle according to an embodiment of the present disclosure.
The server 120 may include one or more general-purpose computers, special-purpose server computers (e.g., PC (personal computer) servers, UNIX servers, midrange servers), blade servers, mainframe computers, server clusters, or any other suitable arrangement and/or combination. The server 120 may include one or more virtual machines running a virtual operating system, or other computing architectures involving virtualization (e.g., one or more flexible pools of logical storage devices that may be virtualized to maintain virtual storage devices of the server). In various embodiments, server 120 may run one or more services or software applications that provide the functionality described below.
The computing units in server 120 may run one or more operating systems including any of the operating systems described above as well as any commercially available server operating systems. Server 120 may also run any of a variety of additional server applications and/or middle tier applications, including HTTP servers, FTP servers, CGI servers, JAVA servers, database servers, etc.
In some implementations, server 120 may include one or more applications to analyze and consolidate data feeds and/or event updates received from motor vehicle 110. Server 120 may also include one or more applications to display data feeds and/or real-time events via one or more display devices of motor vehicle 110.
Network 130 may be any type of network known to those skilled in the art that may support data communications using any of a number of available protocols, including but not limited to TCP/IP, SNA, IPX, etc. By way of example only, the one or more networks 130 may be a satellite communications network, a Local Area Network (LAN), an ethernet-based network, a token ring, a Wide Area Network (WAN), the internet, a virtual network, a Virtual Private Network (VPN), an intranet, an extranet, a blockchain network, a Public Switched Telephone Network (PSTN), an infrared network, a wireless network (including, for example, bluetooth, wi-Fi), and/or any combination of these with other networks.
The system 100 may also include one or more databases 150. In some embodiments, these databases may be used to store data and other information. For example, one or more of the databases 150 may be used to store information such as audio files and video files. The databases 150 may reside in various locations. For example, a database used by the server 120 may be local to the server 120, or may be remote from the server 120 and communicate with it via a network-based or dedicated connection. The databases 150 may be of different types. In some embodiments, a database used by the server 120 may be a relational database. One or more of these databases may store, update, and retrieve data in response to commands.
In some embodiments, one or more of databases 150 may also be used by applications to store application data. The databases used by the application may be different types of databases, such as key value stores, object stores, or conventional stores supported by the file system.
The system 100 of fig. 1 may be configured and operated in various ways to enable application of the various methods and apparatus described in accordance with the present disclosure.
According to some embodiments, motor vehicle 110 also includes a perception system and a decision-making system. The sensing system includes a sensor 111, the sensor 111 including one or more of the following sensors: a vision camera, an infrared camera, an ultrasonic sensor, millimeter wave radar, and laser radar (LiDAR), and at least one processor that processes data acquired by the one or more sensors. The decision system may obtain data processed by the perception system and provide driving decisions for the vehicle based on the obtained data.
According to some embodiments, the method of decision making for automatically driving a vehicle of an embodiment of the present disclosure may be performed by a client (e.g., motor vehicle 110 shown in fig. 1) or by server 120 or another server (not shown in fig. 1). The obstacle information involved in the decision making method of the autonomous vehicle of the embodiments of the present disclosure may be obtained through the perception system of the motor vehicle 110, or may be obtained in other manners, without limitation.
According to an embodiment of the present disclosure, a decision method for an autonomous vehicle is provided. Fig. 2 shows a flow chart of a decision method 200 for an autonomous vehicle according to an embodiment of the present disclosure. The steps of method 200 are typically executed by an autonomous vehicle (e.g., motor vehicle 110 in fig. 1), but may also be executed by a server (e.g., server 120 shown in fig. 1, or another server not shown in fig. 1).
As shown in fig. 2, the method 200 includes steps S210-S230.
In step S210, an obstacle detection result of the current frame is acquired, wherein the obstacle detection result includes a first position of a target obstacle on a vehicle travel path, and a height of the target obstacle is less than a first threshold.
In step S220, in response to the first distance between the target obstacle and the vehicle being less than the second threshold and the current frame not being the first frame in which the target obstacle is detected, a historical decision tag sequence of the target obstacle is obtained, wherein the historical decision tag sequence includes at least one decision tag corresponding to at least one historical frame, respectively, each of the at least one decision tag indicating whether the target obstacle in the respective historical frame is used to determine a driving behavior of the vehicle.
In step S230, it is determined whether the target obstacle is used to determine the current driving behavior of the vehicle, based on the history decision tag sequence.
According to embodiments of the present disclosure, when a low obstacle (i.e., a target obstacle with a height less than the first threshold) is detected close to the vehicle (i.e., with a first distance to the vehicle less than the second threshold), instead of directly reporting it to the vehicle's decision system for making a driving decision, its impact on the current driving decision is determined in conjunction with its historical reporting record, and only then is it decided whether to report it to the decision system. In this way, low obstacles at close range can be screened out, frequent sudden braking caused by reporting every detected close-range obstacle to the decision system is avoided, and safe and smooth running of the vehicle is ensured.
The steps of method 200 are described in detail below.
In step S210, an obstacle detection result of the current frame is acquired, wherein the obstacle detection result includes a first position of a target obstacle on a vehicle travel path, and a height of the target obstacle is less than a first threshold.
An autonomous vehicle (e.g., vehicle 110 shown in fig. 1) includes a perception system, which includes a plurality of sensors (e.g., lidar, millimeter-wave radar, cameras, etc.), and a decision system. The lidar can be used to acquire point cloud data of an obstacle (including position coordinates and height information), and the cameras can be used to acquire image information of the obstacle. The perception system can fuse and process the obstacle data detected by the plurality of sensors (including point cloud data, image information, etc.) to obtain an obstacle detection result (including, for example, the position, size, and category of the obstacle), and output that result to the decision system, so that the decision system makes reasonable driving decisions (such as braking, detouring, driving over, overtaking, etc.) based on it.
According to some embodiments, the height of the target obstacle may be determined according to the point cloud data, for example, the height of the highest point in the point cloud data of the target obstacle may be determined as the height of the obstacle, and the difference in height between the highest point and the lowest point in the point cloud data of the target obstacle may also be determined as the height of the obstacle. If the height of the target obstacle is less than a first threshold (e.g., 20 cm), the obstacle may be a crushable obstacle (e.g., fallen leaves, plastic bags, etc.). However, not all obstacles with a height less than the first threshold are crushable, so further decisions need to be made based on more information (e.g., historical decision tag sequences of the obstacle, location information of the obstacle in the last frame, etc.). It will be appreciated that the first threshold may be set according to actual circumstances, such as according to the chassis height of the vehicle, and is not limited herein. The first position of the target obstacle can be acquired through a plurality of sensors of the sensing system, and the first position can be coordinate position information under the same coordinate system as the vehicle.
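The two height conventions described above can be sketched as follows. This is an illustrative sketch, not the disclosed implementation; the function names and the 20 cm example threshold are assumptions taken from the example value in the text.

```python
# Sketch of the two obstacle-height conventions described above.
# A point cloud is a list of (x, y, z) tuples in the vehicle coordinate frame.

FIRST_THRESHOLD_M = 0.20  # example first threshold (20 cm) from the text

def height_from_highest_point(points):
    """Height taken as the z-coordinate of the highest point in the cloud."""
    return max(z for _, _, z in points)

def height_from_extent(points):
    """Height taken as the difference between highest and lowest points."""
    zs = [z for _, _, z in points]
    return max(zs) - min(zs)

def is_low_obstacle(points, threshold=FIRST_THRESHOLD_M):
    """True if the obstacle qualifies as 'low' (height below first threshold)."""
    return height_from_highest_point(points) < threshold
```

Either convention may be plugged into `is_low_obstacle`; the patent leaves the choice open.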
According to some embodiments, the obstacle detection result further includes the category to which the target obstacle belongs, and in response to that category being a non-collidable category, the target obstacle is determined to be used for determining the current driving behavior of the vehicle. For non-collidable obstacles such as pedestrians, vehicles, and bicycles, collision must be absolutely avoided. Therefore, when the target obstacle belongs to a non-collidable category, there is no need to further judge, based on more information, whether it can be driven over; it is directly reported to the decision system for determining the current driving behavior of the vehicle, thereby ensuring the driving safety of the vehicle as well as the safety of other vehicles and pedestrians.
In step S220, in response to the first distance between the target obstacle and the vehicle being less than the second threshold and the current frame not being the first frame in which the target obstacle is detected, a historical decision tag sequence of the target obstacle is obtained, wherein the historical decision tag sequence includes at least one decision tag corresponding to at least one historical frame, respectively, each of the at least one decision tag indicating whether the target obstacle in the respective historical frame is used to determine a driving behavior of the vehicle.
It should be noted that, in embodiments of the present disclosure, saying that the target obstacle "is used to determine the driving behavior of the vehicle" means that the relevant information of the target obstacle is reported to the vehicle's decision system, so that the decision system determines the driving behavior based on that information. Similarly, saying that the target obstacle "is not used to determine the driving behavior of the vehicle" means that its information is not reported to the decision system, so that the decision system does not take the target obstacle into account when determining the driving behavior.
According to some embodiments, the first distance between the target obstacle and the vehicle may be calculated from the position coordinates of the vehicle and the position coordinates of the first position of the target obstacle. When the first distance is smaller than the second threshold (e.g., 40 m), the target obstacle is close to the vehicle. If the target obstacle is used to determine the current driving behavior, that is, if it is reported to the decision system, the driving behavior determined by the decision system based on such a close obstacle is likely to be sudden braking, resulting in unstable running of the vehicle and an uncomfortable experience for the driver or passengers. In this case, it is necessary to determine whether the target obstacle is being detected for the first time by the perception system, i.e., whether the current frame is the first frame in which the target obstacle is detected. Depending on the result, different conditions may be employed to decide whether the target obstacle is used to determine the current driving behavior of the vehicle.
It should be noted that, since the operating frequencies of the different sensors are different (for example, the operating frequency of the camera is 10Hz and the operating frequency of the lidar is 15 Hz), it is necessary to combine the data of the plurality of sensors to determine whether the current frame is the first frame in which the target obstacle is detected.
According to some embodiments, in case the first distance between the target obstacle and the vehicle is less than the second threshold and the current frame is not the first frame in which the target obstacle was detected (i.e. the obstacle has been detected before the current frame), it may be determined whether the target obstacle is used for determining the current driving behaviour of the vehicle in combination with the decision tag of at least one historical frame in the historical decision tag sequence.
According to some embodiments, in response to the first distance being less than the second threshold and the current frame being a first frame in which the target obstacle is detected, it is determined that the target obstacle is not used to determine the current driving behavior of the vehicle. Because the obstacle does not have history information (i.e., a history decision tag sequence) when the target obstacle is detected for the first time, the target obstacle which is detected for the first time and has a smaller distance (smaller than a second threshold value) from the vehicle can be directly filtered out, and is not reported to the decision system, i.e., the target obstacle is not used for determining the driving behavior of the vehicle, so that the vehicle is prevented from causing sudden braking for avoiding the target obstacle.
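The first-detection filtering rule above can be sketched as a small predicate. This is a hedged illustration: the function name, the planar-distance computation, and the 40 m default are assumptions for the sketch, not the disclosed implementation.

```python
import math

def should_filter_first_detection(obstacle_pos, vehicle_pos,
                                  is_first_frame, second_threshold_m=40.0):
    """Return True if a newly detected low obstacle should be filtered out
    (not reported to the decision system): it is within the second threshold
    of the vehicle and has no historical decision tags to support a stable
    decision, so reporting it would risk sudden braking."""
    dx = obstacle_pos[0] - vehicle_pos[0]
    dy = obstacle_pos[1] - vehicle_pos[1]
    first_distance = math.hypot(dx, dy)  # planar distance to the vehicle
    return first_distance < second_threshold_m and is_first_frame
```

An obstacle that fails this filter (already seen before, or far enough away) proceeds to the history-based checks described next.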
According to some embodiments, the second threshold is determined based on the speed of the vehicle. It will be appreciated that the faster the vehicle travels, the greater its braking distance, and the second threshold may be set equal to the braking distance. The braking distance (the second threshold) varies with the vehicle speed and can be calculated according to the following formula:

d = v² / (2a)

where v is the running speed of the vehicle and a is the deceleration of the vehicle when braking, a preset constant. Typically, to avoid sudden braking, the peak deceleration needs to be kept within 2 m/s², and the corresponding average deceleration is generally 1-1.5 m/s². Meanwhile, considering that the computed braking distance becomes very small when the vehicle speed is low (for example, 5 m/s), a minimum value (for example, 40 m) may be set for the second threshold. That is, at higher speeds the second threshold increases with the vehicle speed, while at lower speeds it may be set to a fixed value independent of the vehicle speed. It will be appreciated that the braking deceleration may be set according to actual needs and is not limited herein.
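The speed-dependent second threshold can be sketched directly from the braking-distance relation d = v²/(2a), with the low-speed floor applied. The default deceleration and minimum value below are the example figures from the text; the function name is an assumption.

```python
def second_threshold(speed_mps, decel_mps2=1.5, min_threshold_m=40.0):
    """Second threshold as braking distance d = v^2 / (2a), floored at a
    minimum value so the threshold does not collapse at low speeds.
    decel_mps2 is the preset average braking deceleration (the text cites
    1-1.5 m/s^2); min_threshold_m is the example 40 m floor."""
    braking_distance = speed_mps ** 2 / (2.0 * decel_mps2)
    return max(braking_distance, min_threshold_m)
```

At 5 m/s the raw braking distance is only about 8.3 m, so the 40 m floor applies; at 15 m/s the braking distance (75 m) dominates.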
In step S230, it is determined whether the target obstacle is used to determine the current driving behavior of the vehicle, based on the history decision tag sequence.
According to some embodiments, in response to the ratio of a first number to a second number being less than a third threshold, it is determined that the target obstacle is not used to determine the current driving behavior of the vehicle, where the first number is the number of decision tags with the value yes among the at least one decision tag, and the second number is the total number of the at least one decision tag. A decision tag with the value yes indicates that the target obstacle was used to determine the driving behavior of the vehicle in the corresponding historical frame, i.e., it was reported to the decision system in that frame. A decision tag with the value no indicates that the target obstacle was not used to determine the driving behavior in the corresponding historical frame, i.e., it was not reported to the decision system in that frame. A sliding-window method can also be used: select the decision tags of the most recent frames (for example, the most recent 10 frames) from the historical decision tag sequence, count the first number of decision tags with the value yes within the window, and compute the ratio of that number to the total number of decision tags in the window.
A ratio of the first number to the second number below the third threshold means that the target obstacle has been reported to the decision system infrequently: it was not stably reported in the historical frames, and the decision system likely gave it little consideration in historical driving decisions. If the target obstacle were reported to the decision system in the current frame, abrupt driving behavior, such as sudden braking, would likely result. Therefore, to ensure smooth running of the vehicle, the target obstacle may be filtered out and determined not to be used for the current driving behavior of the vehicle, i.e., not reported to the decision system.
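The sliding-window stability check described above can be sketched as follows. The window size of 10 frames matches the example in the text; the 0.5 third threshold and the function name are assumptions for illustration.

```python
def reported_stably(decision_tags, third_threshold=0.5, window=10):
    """Check whether the obstacle was reported to the decision system often
    enough over the most recent frames. decision_tags is a chronological
    sequence of booleans (True = reported in that historical frame).
    Returns False when the yes-ratio falls below the third threshold."""
    recent = decision_tags[-window:]  # sliding window over the newest frames
    if not recent:
        return False  # no history at all: treat as not stably reported
    yes_count = sum(1 for tag in recent if tag)  # the "first number"
    ratio = yes_count / len(recent)              # vs the "second number"
    return ratio >= third_threshold
```

An obstacle failing this check is filtered out of the current frame's report, per the paragraph above.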
According to some embodiments, in response to the first distance between the target obstacle and the vehicle being less than the second threshold and the current frame not being the first frame in which the target obstacle was detected, obtaining a second position of the target obstacle of the previous frame; wherein, based on the historical decision tag sequence, determining whether the target obstacle is used to determine the current driving behavior of the vehicle comprises: based on a second distance between the first location and the second location and the historical decision tag sequence, a determination is made as to whether the target obstacle is used to determine a current driving behavior of the vehicle.
In the case where the target obstacle is close to the vehicle and is not being detected for the first time in the current frame, it is also possible to determine whether the target obstacle is used to determine the current driving behavior based on the second distance between the obstacle's position in the current frame (the first position) and its position in the previous frame (the second position), together with whether the target obstacle was stably reported in the historical frames. The second distance between the first position and the second position is used to judge whether the position of the target obstacle has jumped; if the second distance is too large, a position jump has occurred and the obstacle is likely a false detection. To avoid sudden braking of the vehicle due to a falsely detected target obstacle, the obstacle is filtered out, and it is determined that the target obstacle is not used to determine the current driving behavior of the vehicle.
According to some embodiments, the target obstacle is determined not to be used to determine the current driving behavior of the vehicle in response to either of the following conditions being met: the second distance is greater than or equal to a fourth threshold; or the ratio of the first number to the second number is less than the third threshold, where the first number is the number of decision tags with the value yes among the at least one decision tag and the second number is the total number of the at least one decision tag. When the target obstacle either exhibits a position jump or has not been stably reported to the decision system in the historical frames, it is judged not to be used for determining the current driving behavior of the vehicle and is filtered out. This avoids sudden braking caused by falsely detected obstacles, or by obstacles that were not stably reported in the historical frames, and ensures smooth running of the vehicle.
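The combined check for a close, previously seen low obstacle can be sketched in one function. This is an illustrative sketch: the 2 m fourth threshold, the 0.5 third threshold, and all names are assumptions, since the patent does not fix concrete values.

```python
import math

def use_for_current_decision(first_pos, second_pos, decision_tags,
                             third_threshold=0.5, fourth_threshold_m=2.0):
    """Decide whether a close, previously detected low obstacle is used to
    determine the current driving behavior. Returns False (filter out) if
    its position jumped between frames OR it was not stably reported in the
    historical frames; returns True (report to the decision system) only
    when both checks pass."""
    # Second distance: position change between the previous and current frame.
    second_distance = math.hypot(first_pos[0] - second_pos[0],
                                 first_pos[1] - second_pos[1])
    if second_distance >= fourth_threshold_m:
        return False  # position jump: likely a false detection
    total = len(decision_tags)
    if total == 0:
        return False  # no history available
    yes_ratio = sum(decision_tags) / total
    if yes_ratio < third_threshold:
        return False  # not stably reported in the historical frames
    return True
```

Only an obstacle with a consistent position and a stable reporting history reaches the decision system in the current frame.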
According to some embodiments, in the case where the first distance between the target obstacle and the vehicle is less than the second threshold, further conditions may be combined to decide whether to report the target obstacle to the decision system: if the current frame is not the first frame in which the target obstacle is detected, the decision tags in the historical frames are used to judge whether the target obstacle has been stably reported to the decision system. If the target obstacle has been stably reported to the decision system and no position jump has occurred, the target obstacle is reported to the decision system in the current frame. Only when the above conditions are met is the target obstacle reported to the decision system; otherwise, the target obstacle is filtered out.
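The filtering conditions above can be sketched in a few lines of Python. This is an illustrative reading of the disclosed scheme, not the patented implementation: the function name, the Euclidean position-jump metric, and all threshold values are assumptions introduced for clarity.

```python
def should_report_obstacle(first_pos, second_pos, first_distance, decision_tags,
                           is_first_frame, second_threshold=5.0,
                           third_threshold=0.8, fourth_threshold=1.0):
    """Decide whether a low (short) obstacle is reported to the decision system.

    first_pos / second_pos: (x, y) positions in the current / previous frame.
    decision_tags: historical decision tags, True where the obstacle was used
        to determine driving behavior in a past frame.
    All threshold values here are illustrative, not those of the disclosure.
    """
    if first_distance >= second_threshold:
        return True  # this filtering only applies to close obstacles
    if is_first_frame:
        return False  # first detection at close range: filter out
    # Position-jump check: a large frame-to-frame displacement suggests
    # a false detection (assumed Euclidean metric).
    second_distance = ((first_pos[0] - second_pos[0]) ** 2 +
                       (first_pos[1] - second_pos[1]) ** 2) ** 0.5
    if second_distance >= fourth_threshold:
        return False
    # Stability check: the obstacle must have been used (tag == yes) in a
    # sufficient fraction of the historical frames.
    ratio = sum(decision_tags) / len(decision_tags)
    return ratio >= third_threshold
```

With these illustrative thresholds, a close obstacle first seen in the current frame, one whose position jumps by a meter or more between frames, or one used in fewer than 80% of its historical frames would all be filtered out rather than reported to the decision system.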
According to some embodiments, in response to determining that the target obstacle is not used to determine the current driving behavior of the vehicle, information of the target obstacle is reported to a cloud server. In this case, the information of the target obstacle is not reported to the decision system. In consideration of driving safety, the target obstacle is reported to the cloud server at the same time as it is filtered out, so that remote operators can drive the vehicle from the cloud and make timely driving decisions based on the information of the target obstacle to handle emergencies.
According to an embodiment of the present disclosure, a decision making apparatus for an autonomous vehicle is provided. Fig. 3 shows a block diagram of a decision making device 300 of an autonomous vehicle according to an embodiment of the present disclosure. As shown in fig. 3, the apparatus 300 includes a first acquisition module 310, a second acquisition module 320, and a determination module 330.
The first acquisition module 310 is configured to acquire an obstacle detection result of the current frame, wherein the obstacle detection result includes a first position of a target obstacle on a vehicle travel path, and a height of the target obstacle is less than a first threshold.
The second acquisition module 320 is configured to acquire a historical decision tag sequence of the target obstacle in response to the first distance between the target obstacle and the vehicle being less than a second threshold and the current frame not being a first frame in which the target obstacle is detected, wherein the historical decision tag sequence includes at least one decision tag corresponding to at least one historical frame, respectively, each of the at least one decision tag indicating whether the target obstacle in the respective historical frame is used to determine driving behavior of the vehicle.
The determination module 330 is configured to determine whether the target obstacle is used to determine the current driving behavior of the vehicle based on the historical decision tag sequence.
According to some embodiments, the determining module 330 includes: the determination unit is configured to determine that the target obstacle is not used to determine the current driving behavior of the vehicle in response to a ratio of the first number to the second number being less than a third threshold, wherein the first number is a number of decision tags having a value of yes among the at least one decision tag, and the second number is a total number of the at least one decision tag.
According to some embodiments, the apparatus 300 further comprises: a third acquisition module configured to acquire a second position of the target obstacle in a previous frame in response to the first distance between the target obstacle and the vehicle being less than the second threshold and the current frame not being the first frame in which the target obstacle was detected. The determination module 330 further includes a judging unit configured to determine, based on a second distance between the first position and the second position and the historical decision tag sequence, whether the target obstacle is used to determine the current driving behavior of the vehicle.
According to some embodiments, the judging unit comprises: a determination subunit configured to determine that the target obstacle is not used to determine the current driving behavior of the vehicle in response to either of the following conditions being satisfied: the second distance is greater than or equal to a fourth threshold; or the ratio of a first number to a second number is less than a third threshold, where the first number is the number of decision tags with the value being yes among the at least one decision tag, and the second number is the total number of the at least one decision tag.
According to some embodiments, the apparatus 300 further comprises: the first determination module is configured to determine that the target obstacle is not used to determine a current driving behavior of the vehicle in response to the first distance being less than a second threshold and the current frame being a first frame in which the target obstacle is detected.
According to some embodiments, the obstacle detection result includes a category to which the target obstacle belongs, and the apparatus 300 further includes: the second determination module is configured to determine, in response to the category being a non-collidable category, that the target obstacle is for determining a current driving behavior of the vehicle.
According to some embodiments, the apparatus 300 further comprises: the reporting module is configured to report information of the target obstacle to the cloud server in response to determining that the target obstacle is not used to determine a current driving behavior of the vehicle.
According to some embodiments, the second threshold is determined based on a speed of the vehicle.
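The disclosure specifies only that the second threshold (the distance below which this filtering applies) is determined based on the speed of the vehicle, without giving a formula. One plausible reading is a stopping-distance-style bound; the formula, function name, and parameter values below are assumptions for illustration only.

```python
def second_threshold_from_speed(speed_mps, reaction_time_s=0.5,
                                max_decel_mps2=4.0, margin_m=1.0):
    """Illustrative speed-dependent distance threshold (assumed formula):
    reaction distance + braking distance + a fixed safety margin."""
    reaction = speed_mps * reaction_time_s          # distance covered before braking
    braking = speed_mps ** 2 / (2.0 * max_decel_mps2)  # v^2 / (2a) braking distance
    return reaction + braking + margin_m
```

For example, at 10 m/s this assumed formula yields 5 m of reaction distance plus 12.5 m of braking distance plus a 1 m margin, so low obstacles within about 18.5 m of the vehicle would be subject to the history-based filtering.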
It should be appreciated that the various modules or units of the apparatus 300 shown in fig. 3 may correspond to the various steps of the method 200 described in fig. 2. Thus, the operations, features, and advantages described above for the method 200 are equally applicable to the apparatus 300 and the modules and units included therein. For brevity, certain operations, features, and advantages are not described in detail here.
Although specific functions are discussed above with reference to specific modules, it should be noted that the functions of the various modules discussed herein may be divided into multiple modules and/or at least some of the functions of the multiple modules may be combined into a single module.
It should also be understood that the various techniques described herein may be implemented by software, hardware elements, or program modules. The various modules described above with reference to fig. 3 may be implemented in hardware or in hardware combined with software and/or firmware. For example, the modules may be implemented as computer program code/instructions configured to be executed in one or more processors and stored in a computer-readable storage medium. Alternatively, these modules may be implemented as hardware logic/circuitry. For example, in some embodiments, one or more of the modules 310-330 may be implemented together in a System on Chip (SoC). The SoC may include an integrated circuit chip including one or more components of a processor (e.g., a central processing unit (Central Processing Unit, CPU), microcontroller, microprocessor, digital signal processor (Digital Signal Processor, DSP), etc.), memory, one or more communication interfaces, and/or other circuitry, and may optionally execute received program code and/or include embedded firmware to perform functions.
In the technical solution of the present disclosure, the acquisition, storage, and application of the user personal information involved all comply with the provisions of relevant laws and regulations and do not violate public order and good morals.
There is also provided, in accordance with an embodiment of the present disclosure, an electronic device including: at least one processor; and a memory communicatively coupled to the at least one processor, the memory storing instructions executable by the at least one processor to enable the at least one processor to perform the decision method for an autonomous vehicle of the embodiments of the present disclosure.
According to an embodiment of the present disclosure, there is also provided a non-transitory computer-readable storage medium storing computer instructions for causing a computer to execute the decision method of the autonomous vehicle of the embodiments of the present disclosure.
According to an embodiment of the present disclosure, there is also provided a computer program product comprising computer program instructions which, when executed by a processor, implement the method of decision making of an autonomous vehicle of an embodiment of the present disclosure.
According to an embodiment of the present disclosure, there is also provided an autonomous vehicle including the above-described electronic device.
Referring to fig. 4, a block diagram of an electronic device 400, which may be a server or a client of the present disclosure and is an example of a hardware device applicable to aspects of the present disclosure, will now be described. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other suitable computers. Electronic devices may also represent various forms of mobile apparatuses, such as personal digital assistants, cellular telephones, smartphones, wearable devices, and other similar computing apparatuses. The components shown here, their connections and relationships, and their functions are meant to be exemplary only, and are not meant to limit the implementations of the disclosure described and/or claimed herein.
As shown in fig. 4, the electronic device 400 includes a computing unit 401 that can perform various suitable actions and processes according to a computer program stored in a Read Only Memory (ROM) 402 or a computer program loaded from a storage unit 408 into a Random Access Memory (RAM) 403. The RAM 403 may also store various programs and data required for the operation of the electronic device 400. The computing unit 401, the ROM 402, and the RAM 403 are connected to each other by a bus 404. An input/output (I/O) interface 405 is also connected to the bus 404.
Various components in the electronic device 400 are connected to the I/O interface 405, including: an input unit 406, an output unit 407, a storage unit 408, and a communication unit 409. The input unit 406 may be any type of device capable of inputting information to the electronic device 400; it may receive input numeric or character information and generate key signal inputs related to user settings and/or function control of the electronic device, and may include, but is not limited to, a mouse, a keyboard, a touch screen, a trackpad, a trackball, a joystick, a microphone, and/or a remote control. The output unit 407 may be any type of device capable of presenting information and may include, but is not limited to, a display, speakers, video/audio output terminals, vibrators, and/or printers. The storage unit 408 may include, but is not limited to, magnetic disks and optical disks. The communication unit 409 allows the electronic device 400 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks, and may include, but is not limited to, modems, network cards, infrared communication devices, wireless communication transceivers, and/or chipsets, such as Bluetooth™ devices, 802.11 devices, Wi-Fi devices, WiMAX devices, cellular communication devices, and/or the like.
The computing unit 401 may be a variety of general purpose and/or special purpose processing components having processing and computing capabilities. Some examples of computing unit 401 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, etc. The computing unit 401 performs the various methods and processes described above, such as method 200. For example, in some embodiments, the method 200 may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as the storage unit 408. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 400 via the ROM 402 and/or the communication unit 409. One or more of the steps of the method 200 described above may be performed when a computer program is loaded into RAM 403 and executed by computing unit 401. Alternatively, in other embodiments, the computing unit 401 may be configured to perform the method 200 by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuit systems, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), application specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor that may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for carrying out methods of the present disclosure may be written in any combination of one or more programming languages. These program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowchart and/or block diagram to be implemented. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local area networks (LANs), wide area networks (WANs), the internet, and blockchain networks.
The computer system may include a client and a server. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, a server of a distributed system, or a server incorporating a blockchain.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps recited in the present disclosure may be performed in parallel, sequentially or in a different order, provided that the desired results of the disclosed aspects are achieved, and are not limited herein.
Although embodiments or examples of the present disclosure have been described with reference to the accompanying drawings, it is to be understood that the foregoing methods, systems, and apparatuses are merely illustrative embodiments or examples, and that the scope of the present disclosure is not limited by these embodiments or examples but only by the granted claims and their equivalents. Various elements of the embodiments or examples may be omitted or replaced with equivalents thereof. Furthermore, the steps may be performed in an order different from that described in the present disclosure. Further, the various elements of the embodiments or examples may be combined in various ways. Importantly, as technology evolves, many of the elements described herein may be replaced by equivalent elements that appear after the present disclosure.
Claims (20)
1. A decision-making method for an autonomous vehicle, comprising:
obtaining an obstacle detection result of a current frame, wherein the obstacle detection result comprises a first position of a target obstacle on a vehicle driving path, and a height of the target obstacle is smaller than a first threshold;
responsive to a first distance between the target obstacle and the vehicle being less than a second threshold and the current frame not being a first frame in which the target obstacle is detected, obtaining a historical decision tag sequence of the target obstacle, wherein the historical decision tag sequence includes at least one decision tag corresponding to at least one historical frame, respectively, each of the at least one decision tag indicating whether the target obstacle in the respective historical frame is used to determine driving behavior of the vehicle; and
determining, based on the historical decision tag sequence, whether the target obstacle is used to determine the current driving behavior of the vehicle.
2. The method of claim 1, wherein the determining whether the target obstacle is used to determine the current driving behavior of the vehicle based on the historical decision tag sequence comprises:
and in response to the ratio of the first number to the second number being less than a third threshold, determining that the target obstacle is not used for determining the current driving behavior of the vehicle, wherein the first number is the number of decision tags with the value being yes in the at least one decision tag, and the second number is the total number of the at least one decision tag.
3. The method of claim 1, further comprising:
responsive to a first distance between the target obstacle and the vehicle being less than a second threshold and the current frame not being a first frame in which the target obstacle was detected, acquiring a second position of the target obstacle of a previous frame;
wherein the determining whether the target obstacle is used to determine the current driving behavior of the vehicle based on the historical decision tag sequence comprises:
determining, based on a second distance between the first position and the second position and the historical decision tag sequence, whether the target obstacle is used to determine the current driving behavior of the vehicle.
4. The method of claim 3, wherein the determining whether the target obstacle is used to determine the current driving behavior of the vehicle based on a second distance between the first location and the second location and the historical decision tag sequence comprises:
determining that the target obstacle is not used to determine the current driving behavior of the vehicle in response to any of the following conditions being met:
the second distance is greater than or equal to a fourth threshold; or alternatively
the ratio of the first number to the second number is smaller than a third threshold, wherein the first number is the number of decision tags with the value being yes among the at least one decision tag, and the second number is the total number of the at least one decision tag.
5. The method of any of claims 1-4, further comprising:
in response to the first distance being less than the second threshold and the current frame being a first frame in which the target obstacle is detected, it is determined that the target obstacle is not used to determine the current driving behavior of the vehicle.
6. The method of any of claims 1-5, wherein the obstacle detection result further includes a category to which the target obstacle belongs, the method further comprising:
and in response to the category being a non-collidable category, determining that the target obstacle is for determining a current driving behavior of the vehicle.
7. The method of any of claims 1-6, further comprising:
and in response to determining that the target obstacle is not used for determining the current driving behavior of the vehicle, reporting information of the target obstacle to a cloud server.
8. The method of any of claims 1-7, wherein the second threshold is determined based on a speed of the vehicle.
9. A decision making device for an autonomous vehicle, comprising:
a first acquisition module configured to acquire an obstacle detection result of a current frame, wherein the obstacle detection result includes a first position of a target obstacle on a vehicle travel path, a height of the target obstacle being less than a first threshold;
A second acquisition module configured to acquire a historical decision tag sequence of the target obstacle in response to a first distance between the target obstacle and the vehicle being less than a second threshold and the current frame not being a first frame in which the target obstacle is detected, wherein the historical decision tag sequence includes at least one decision tag corresponding to at least one historical frame, respectively, each of the at least one decision tag indicating whether the target obstacle in the respective historical frame is used to determine a driving behavior of the vehicle; and
a determination module configured to determine whether the target obstacle is used to determine a current driving behavior of the vehicle based on the historical decision tag sequence.
10. The apparatus of claim 9, wherein the means for determining comprises:
and a determining unit configured to determine that the target obstacle is not used to determine the current driving behavior of the vehicle in response to a ratio of a first number to a second number being smaller than a third threshold, wherein the first number is the number of decision tags with the value being yes among the at least one decision tag, and the second number is the total number of the at least one decision tag.
11. The apparatus of claim 10, further comprising:
a third acquisition module configured to acquire a second position of the target obstacle of a previous frame in response to a first distance between the target obstacle and the vehicle being less than a second threshold and the current frame not being a first frame in which the target obstacle is detected;
wherein, the judging module further comprises:
a determination unit configured to determine whether the target obstacle is used to determine a current driving behavior of the vehicle based on a second distance between the first location and the second location and the historical decision tag sequence.
12. The apparatus of claim 11, wherein the determination unit comprises:
a determination subunit configured to determine that the target obstacle is not used to determine a current driving behavior of the vehicle in response to any one of the following conditions being satisfied:
the second distance is greater than or equal to a fourth threshold; or
the ratio of the first number to the second number is smaller than a third threshold, wherein the first number is the number of decision tags with the value being yes among the at least one decision tag, and the second number is the total number of the at least one decision tag.
13. The apparatus of any of claims 9-12, further comprising:
a first determination module configured to determine that the target obstacle is not used to determine a current driving behavior of the vehicle in response to the first distance being less than the second threshold and the current frame being a first frame in which the target obstacle is detected.
14. The apparatus of any of claims 9-13, wherein the obstacle detection result further includes a category to which the target obstacle belongs, the apparatus further comprising:
and a second determination module configured to determine that the target obstacle is used to determine a current driving behavior of the vehicle in response to the category being a non-collidable category.
15. The apparatus of any of claims 9-14, further comprising:
and the reporting module is configured to report the information of the target obstacle to a cloud server in response to determining that the target obstacle is not used for determining the current driving behavior of the vehicle.
16. The apparatus of any of claims 9-15, wherein the second threshold is determined based on a speed of the vehicle.
17. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-8.
18. A non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the method of any one of claims 1-8.
19. A computer program product comprising computer program instructions, wherein the computer program instructions, when executed by a processor, implement the method of any one of claims 1-8.
20. An autonomous vehicle comprising the electronic device of claim 17.
Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN202211567745.7A | 2022-12-07 | 2022-12-07 | Decision-making method, device and equipment for automatic driving vehicle and automatic driving vehicle
Publications (1)

Publication Number | Publication Date
---|---
CN115993821A | 2023-04-21
Family
ID=85993228
Legal Events

Code | Title
---|---
PB01 | Publication
SE01 | Entry into force of request for substantive examination