CN115610445A - Automatic driving method, automatic driving device, computer equipment, vehicle and storage medium - Google Patents

Automatic driving method, automatic driving device, computer equipment, vehicle and storage medium

Info

Publication number
CN115610445A
CN115610445A (application CN202211227078.8A)
Authority
CN
China
Prior art keywords
item
vehicle
information
longitudinal
driving
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211227078.8A
Other languages
Chinese (zh)
Inventor
朱斯浩
刘少山
彭夏鹏
王树志
俞波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Binli Information Technology Co Ltd
Original Assignee
Beijing Binli Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Binli Information Technology Co Ltd filed Critical Beijing Binli Information Technology Co Ltd
Priority to CN202211227078.8A priority Critical patent/CN115610445A/en
Publication of CN115610445A publication Critical patent/CN115610445A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001Planning or execution of driving tasks
    • B60W60/0015Planning or execution of driving tasks specially adapted for safety
    • B60W60/0016Planning or execution of driving tasks specially adapted for safety of the vehicle or its occupants
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00Input parameters relating to occupants
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2555/00Input parameters relating to exterior conditions, not covered by groups B60W2552/00, B60W2554/00
    • B60W2555/20Ambient conditions, e.g. wind or rain

Abstract

Provided are an automatic driving method, an automatic driving apparatus, a computer device, a vehicle, and a storage medium. The automatic driving method comprises the following steps: acquiring at least one item of running environment information, at least one item of running state information of the vehicle and at least one item of state information of passengers in the vehicle; analyzing each item of acquired information to obtain a corresponding analysis result; determining a longitudinal coefficient and a transverse coefficient according to the analysis result of each item of information; modifying the first amplitude limit for each of the at least one longitudinal driving parameter with the longitudinal coefficient; modifying the second amplitude limit for each of the at least one lateral driving parameter using the lateral coefficient; the vehicle is controlled to travel based on the at least one longitudinal travel parameter with the modified respective first amplitude limit and the at least one lateral travel parameter with the modified respective second amplitude limit.

Description

Automatic driving method, automatic driving device, computer equipment, vehicle and storage medium
Technical Field
The present disclosure relates to the field of vehicle technology, and in particular, to an automatic driving method, apparatus, computer device, vehicle, computer-readable storage medium, and computer program product.
Background
With the development of intelligent vehicle technology, more and more vehicles have an automatic auxiliary navigation driving function. In the case where the automatic assisted navigation driving function is turned on, the user may select an automatic driving mode among preset automatic driving modes. Accordingly, the controller of the vehicle may transmit a control command to a power system, a steering system, a braking system, etc. of the vehicle according to a preset algorithm according to the automatic driving mode selected by the user, thereby controlling the vehicle.
The approaches described in this section are not necessarily approaches that have been previously conceived or pursued. Unless otherwise indicated, it should not be assumed that any of the approaches described in this section qualify as prior art merely by virtue of their inclusion in this section. Similarly, unless otherwise indicated, the problems mentioned in this section should not be considered as having been acknowledged in any prior art.
Disclosure of Invention
The present disclosure provides an autonomous driving method, an autonomous driving apparatus, a computer device, a vehicle, a computer-readable storage medium, and a computer program product.
According to an aspect of the present disclosure, an automatic driving method is provided. The method comprises the following steps: acquiring at least one item of driving environment information, at least one item of driving state information of the host vehicle, and at least one item of state information of occupants in the vehicle; analyzing each item of the acquired driving environment information, host-vehicle driving state information, and in-vehicle occupant state information to obtain a corresponding analysis result; determining a longitudinal coefficient and a lateral coefficient according to the analysis result of each item of information; modifying a first amplitude limit of each of at least one longitudinal driving parameter using the longitudinal coefficient, each longitudinal driving parameter being used to control the longitudinal cruising of the host vehicle along its own lane within the range of the respective first amplitude limit; modifying a second amplitude limit of each of at least one lateral driving parameter using the lateral coefficient, each lateral driving parameter being used to control the lateral lane changing of the host vehicle within the range of the respective second amplitude limit; and controlling the host vehicle to travel based on the at least one longitudinal driving parameter with the modified respective first amplitude limit and the at least one lateral driving parameter with the modified respective second amplitude limit.
According to another aspect of the present disclosure, an autopilot device is provided. The device includes: an information acquisition module configured to acquire at least one item of driving environment information, at least one item of host-vehicle driving state information, and at least one item of in-vehicle occupant state information; an analysis module configured to analyze each item of the acquired driving environment information, host-vehicle driving state information, and in-vehicle occupant state information to obtain a corresponding analysis result; a coefficient determination module configured to determine a longitudinal coefficient and a lateral coefficient from the analysis result of each item of information; a longitudinal parameter modification module configured to modify a first amplitude limit of each of at least one longitudinal driving parameter using the longitudinal coefficient, each longitudinal driving parameter being used to control the host vehicle to cruise longitudinally along its own lane within the range of the respective first amplitude limit; a lateral parameter modification module configured to modify a second amplitude limit of each of at least one lateral driving parameter using the lateral coefficient, each lateral driving parameter being used to control the lateral lane changing of the host vehicle within the range of the respective second amplitude limit; and a control module configured to control the travel of the host vehicle based on the at least one longitudinal driving parameter with the modified respective first amplitude limit and the at least one lateral driving parameter with the modified respective second amplitude limit.
According to another aspect of the present disclosure, there is provided a computer apparatus including: at least one processor; and at least one memory having a computer program stored thereon, the computer program, when executed by the at least one processor, causes the at least one processor to perform the above described autopilot method.
According to yet another aspect of the present disclosure, there is provided a vehicle comprising an autopilot device as described above or a computer apparatus as described above.
According to yet another aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, causes the processor to execute the above-described autopilot method.
According to yet another aspect of the present disclosure, a computer program product is provided, comprising a computer program which, when executed by a processor, causes the processor to carry out the above described autopilot method.
According to the embodiment of the disclosure, the safety, the traffic efficiency and the user comfort of automatic driving can be improved.
These and other aspects of the disclosure will be apparent from and elucidated with reference to the embodiments described hereinafter.
Drawings
Further details, features and advantages of the disclosure are disclosed in the following description of exemplary embodiments, taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a schematic diagram illustrating an example system in which various methods described herein may be implemented, according to an example embodiment;
FIG. 2 is a flow chart illustrating an automated driving method according to an exemplary embodiment;
FIG. 3 is a flow chart illustrating a portion of the process in the autonomous driving method of FIG. 2 in accordance with an exemplary embodiment;
FIG. 4 is a flowchart illustrating a portion of the process in the autopilot method of FIG. 3 in accordance with an exemplary embodiment;
FIG. 5 is a flow chart illustrating a portion of the process in the autonomous driving method of FIG. 2 in accordance with an exemplary embodiment;
FIG. 6 is a flowchart illustrating a portion of the process in the automated driving method of FIG. 2 in accordance with an exemplary embodiment;
FIG. 7 is a schematic block diagram illustrating an autonomous driving apparatus according to an exemplary embodiment; and
FIG. 8 is a block diagram illustrating an exemplary computer device that can be used in exemplary embodiments.
Detailed Description
In the present disclosure, unless otherwise specified, the use of the terms "first", "second", etc. to describe various elements is not intended to limit the positional relationship, the timing relationship, or the importance relationship of the elements, and such terms are used only to distinguish one element from another. In some examples, a first element and a second element may refer to the same instance of the element, while in some cases they may refer to different instances based on the context of the description.
The terminology used in the description of the various described examples in this disclosure is for the purpose of describing those particular examples only and is not intended to be limiting. Unless the context clearly indicates otherwise, if the number of elements is not specifically limited, an element may be one or more than one. As used herein, the term "plurality" means two or more, and the term "based on" should be interpreted as "based, at least in part, on". Further, the terms "and/or" and "at least one of ..." encompass any and all possible combinations of the listed items.
In the related art, a vehicle may provide one or more preset automatic driving modes for the user to select from. For example, some users may select a desired driving mode among several preset "soft", "normal", or "aggressive" modes. After the user selects an autonomous driving mode, the vehicle may be driven according to the operating parameters corresponding to that mode. Generally, the number of preset automatic driving modes available for selection is small and may not meet the differentiated requirements of different users. In addition, in more complicated driving environments, a preset, fixed driving mode may fail to cope with changes in the driving environment or operating conditions, which threatens driving safety and also adversely affects driving comfort and traffic efficiency.
In view of this, embodiments of the present disclosure provide an autopilot method, an autopilot apparatus, a computer device, a vehicle, a computer-readable storage medium, and a computer program product.
Exemplary embodiments of the present disclosure are described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic diagram illustrating an example system 100 in which various methods described herein may be implemented, according to an example embodiment.
Referring to FIG. 1, the system 100 includes an in-vehicle system 110, a server 120, and a network 130 communicatively coupling the in-vehicle system 110 and the server 120.
The in-vehicle system 110 includes a display 114 and an application (APP) 112 that may be displayed via the display 114. The application 112 may be an application installed by default on the in-vehicle system 110 or downloaded and installed by the user 102, or an applet, i.e., a lightweight application. In some embodiments, the in-vehicle system 110 may include one or more processors and one or more memories (not shown) and be implemented as an in-vehicle computer. In some embodiments, the in-vehicle system 110 may include more or fewer display screens 114 (e.g., no display screen at all), and/or one or more speakers or other human-machine interaction devices. In some embodiments, the in-vehicle system 110 may not communicate with the server 120.
Server 120 may represent a single server, a cluster of multiple servers, a distributed system, or a cloud server providing an underlying cloud service (such as cloud database, cloud computing, cloud storage, cloud communications). It will be appreciated that although the server 120 is shown in FIG. 1 as communicating with only one in-vehicle system 110, the server 120 may provide background services for multiple in-vehicle systems simultaneously.
The network 130 allows wireless communication and information exchange between the vehicle and everything (V2X, where "X" may be another vehicle, a road, a pedestrian, the internet, etc.) according to agreed communication protocols and data interaction standards. Examples of the network 130 include a Local Area Network (LAN), a Wide Area Network (WAN), a Personal Area Network (PAN), and/or a combination of communication networks such as the Internet. The network 130 may be a wired or wireless network. In one example, the network 130 may be an in-vehicle network, an inter-vehicle network, and/or a vehicle-mounted mobile internet network.
FIG. 2 is a flowchart illustrating an autopilot method 200 according to an exemplary embodiment. The method 200 may be performed at an on-board system (e.g., the on-board system 110 shown in fig. 1), i.e., the subject executing the various steps of the method 200 may be the on-board system 110 shown in fig. 1. In some embodiments, method 200 may be performed at a server (e.g., server 120 shown in fig. 1). In some embodiments, method 200 may be performed by an in-vehicle system (e.g., in-vehicle system 110) in combination with a server (e.g., server 120). Hereinafter, each step of the method 200 is described in detail, taking the in-vehicle system 110 as the executing entity by way of example.
Referring to fig. 2, an autopilot method 200 includes:
step S210, acquiring at least one item of running environment information, at least one item of running state information of the vehicle and at least one item of state information of passengers in the vehicle;
step S220, analyzing each item of information in the acquired at least one item of running environment information, the acquired at least one item of vehicle running state information and the acquired at least one item of vehicle occupant state information to obtain a corresponding analysis result;
step S230, determining a longitudinal coefficient and a transverse coefficient according to the analysis result of each item of information;
step S240, modifying a first amplitude limit of each longitudinal running parameter in at least one longitudinal running parameter by using the longitudinal coefficient, wherein each longitudinal running parameter is used for controlling the host vehicle to cruise longitudinally along its own lane within the range of the corresponding first amplitude limit;
step S250, modifying a second amplitude limit of each transverse driving parameter in at least one transverse driving parameter by using the transverse coefficient, wherein each transverse driving parameter is used for controlling the transverse lane change of the vehicle within the range of the corresponding second amplitude limit; and
step S260 of controlling the vehicle to travel based on the at least one longitudinal travel parameter with the modified respective first amplitude limit and the at least one lateral travel parameter with the modified respective second amplitude limit.
In step S210, the driving environment information may include weather environment information, such as intensity of illuminance, rainfall, snow, hail, or haze; the running environment information may also include road static environment information such as road type (urban road, tunnel, highway, etc.), number of lanes, road gradient, road curvature, road flatness, etc.; the driving environment information may also include dynamic environment information such as the type and number of vehicles in a road, traffic flow, road construction information, road accident information, and the like. In addition, the vehicle running state information may include the vehicle-mounted battery remaining capacity of the vehicle, vehicle dynamics parameter setting information (e.g., air suspension height, preset driving pattern), the vehicle load, and information on whether the vehicle is malfunctioning. Further, the in-vehicle occupant status information may include the sex of the driver, the status information of the passenger, and the like.
For step S220, in one example, analyzing the driving environment information may include analyzing the illuminance information and determining whether the current driving environment is day or night, and whether it is currently sunny, cloudy, or overcast. In one example, analyzing the driving environment information may include analyzing traffic information and determining whether the current road is congested or clear. In one example, analyzing the driving environment information may include analyzing the surrounding vehicle types and determining whether there are a large number of large vehicles (e.g., trucks) around. In some examples, the driving environment information may be an image captured by a camera of the host vehicle, and the image may be subjected to image recognition to obtain a corresponding analysis result. In one example, analyzing the vehicle driving state information may include analyzing the remaining capacity of the vehicle-mounted battery and determining whether the amount of power is below a preset threshold. In one example, analyzing the in-vehicle occupant status information includes analyzing driver information and determining whether the driver is male or female.
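By way of illustration only, the following Python sketch shows what the per-item analysis of step S220 could look like for a few of the items mentioned above. The threshold values, category labels, and function names are assumptions made for this example and are not values taken from this disclosure.

# Illustrative sketch of the per-item analysis in step S220.
# Thresholds and labels are assumptions, not values from this disclosure.

def analyze_battery(state_of_charge: float, low_threshold: float = 0.20) -> str:
    """Classify the on-board battery's remaining capacity."""
    return "low" if state_of_charge < low_threshold else "sufficient"

def analyze_illuminance(lux: float) -> str:
    """Classify the lighting condition from an illuminance reading."""
    return "night" if lux < 10.0 else "day"

def analyze_surrounding_vehicles(vehicle_types: list[str]) -> str:
    """Flag a dense-truck environment from the recognized vehicle types."""
    trucks = sum(1 for t in vehicle_types if t == "truck")
    return "many_large_vehicles" if trucks >= 3 else "normal_traffic"

print(analyze_battery(0.15))                       # -> "low"
print(analyze_illuminance(4.0))                    # -> "night"
print(analyze_surrounding_vehicles(["car", "truck", "truck", "truck"]))  # -> "many_large_vehicles"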
Thus, based on the analysis results obtained by analyzing the running environment information, the own vehicle running state information, and the in-vehicle occupant state information, the influence of the above information on the driving behavior of the autonomous vehicle can be determined, respectively. For example, the driving behavior of an autonomous vehicle may tend to be conservative when the current lighting situation is determined to be at night; for example, the driving behavior of an autonomous vehicle may tend to be conservative when it is determined that there are a large number of large vehicles (e.g., trucks) around the host vehicle; for example, when it is determined that the vehicle-mounted battery remaining capacity is less than 20%, the driving behavior of the autonomous vehicle may also tend to be conservative in order to save electric energy.
Further, the present solution proposes to separate the influence of the above information on the driving behavior of the autonomous vehicle into a "longitudinal" influence and a "lateral" influence. The "longitudinal" influence is the influence on the driving behavior of the host vehicle when cruising longitudinally along its own lane, and the "lateral" influence is the influence on the driving behavior of the host vehicle when changing lanes laterally. That is, a longitudinal coefficient and a lateral coefficient are determined from the analysis result of each item of information; the longitudinal coefficient is then used to modify a first amplitude limit of the longitudinal driving parameters, and the lateral coefficient is used to modify a second amplitude limit of the lateral driving parameters. Modifying an amplitude limit may mean increasing the maximum threshold of a driving parameter or decreasing its minimum threshold. Accordingly, when the host vehicle is controlled to travel based on the longitudinal driving parameters with the modified first amplitude limits and the lateral driving parameters with the modified second amplitude limits, the automatic driving parameters can be controlled more finely than by simply applying a preset automatic driving mode. For example, when it is determined that a large number of large vehicles (e.g., trucks) are present around the host vehicle, the lateral lane-changing behavior of the host vehicle may be restricted to a large extent based on this factor, while its longitudinal cruising behavior is restricted only slightly, so that the host vehicle can pass through the area of dense large vehicles quickly and safely without frequent lane changes. For another example, when it is determined that the current environment is night, both the lateral lane-changing and longitudinal cruising behaviors of the host vehicle may be restricted to a large extent based on this factor, to ensure the driving safety of the host vehicle and of other vehicles on the road. Therefore, even when facing a complex driving environment and varying conditions of the occupants in the vehicle, the autonomous vehicle can adjust its driving strategy in time according to the environment and operating conditions, thereby further improving the safety, traffic efficiency, and user comfort of automatic driving.
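As one concrete, non-limiting reading of how a coefficient could modify an amplitude limit, the short Python sketch below scales the upper threshold of each driving parameter by the corresponding coefficient. The parameter names, base limits, numeric coefficients, and the scaling rule itself are assumptions made for illustration, not the patent's reference implementation.

# Illustrative sketch: a coefficient in (0, 1] narrows the amplitude limit of a
# driving parameter, making the corresponding behavior more conservative.
# Parameter names, numbers, and the scaling rule are assumptions.
from dataclasses import dataclass

@dataclass
class AmplitudeLimit:
    lower: float
    upper: float

    def scaled_by(self, coefficient: float) -> "AmplitudeLimit":
        # Only the upper threshold is scaled in this simplified sketch.
        return AmplitudeLimit(self.lower, self.upper * coefficient)

# Dense truck traffic: restrict lane changing strongly (small lateral
# coefficient) while restricting longitudinal cruising only slightly.
lateral_coefficient, longitudinal_coefficient = 0.5, 0.75

lane_change_accel_limit = AmplitudeLimit(0.0, 2.0).scaled_by(lateral_coefficient)   # m/s^2
cruise_accel_limit = AmplitudeLimit(0.0, 3.0).scaled_by(longitudinal_coefficient)   # m/s^2

print(lane_change_accel_limit)  # AmplitudeLimit(lower=0.0, upper=1.0)
print(cruise_accel_limit)       # AmplitudeLimit(lower=0.0, upper=2.25)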
FIG. 3 is a flowchart illustrating a portion of the process in the autopilot method 200 of FIG. 2 according to an exemplary embodiment.
According to some embodiments, referring to fig. 3, the acquiring of the at least one item of running environment information in step S210 may include:
step S311, at least one item of first running environment information within a first range from the vehicle is obtained by at least one vehicle-mounted sensor of the vehicle; and
step S312, at least one item of second running environment information within a second range from the vehicle is acquired from at least one of the operator, the surrounding vehicles or the road side equipment.
And the analyzing of each of the acquired at least one item of driving environment information in the above step S220 may include:
step S321, fusing each item of first running environment information in the at least one item of first running environment information with a corresponding item of second running environment information in the at least one item of second running environment information to obtain at least one item of fused information; and
step S322, analyzing at least one item of fusion information to obtain the corresponding analysis result.
The vehicle-mounted sensor may include one or more of a vehicle-mounted camera, a millimeter-wave radar, and a laser radar (lidar). In addition to acquiring the driving environment information through the on-board sensors of the host vehicle, the driving environment information may also be acquired from a surrounding vehicle or a roadside device by means of V2X technology. In addition, some operators may also provide information such as weather information, road congestion information, road gradient, road curvature, road flatness, and the like. For example, information on the gradient, curvature, and flatness of the road may be acquired from a high-precision map.
It will be appreciated that the first and second ranges described above may be the same range, for example, the first and second ranges may each be a range of 1000 meters from the host vehicle. The second range may also be greater than the first range, for example, the first range may be a range of 500 meters from the host vehicle, while the second range may be a range of 1000 meters from the host vehicle.
In an example, each item of first driving environment information and each item of second driving environment information may be input into a model trained in advance to obtain fusion information predicted by the model. The accuracy of the analysis result can be further improved by analyzing based on the fusion information.
Fig. 4 is a flowchart illustrating a portion of the process in the automated driving method of fig. 3 according to an exemplary embodiment. According to some embodiments, as shown in fig. 4, the step S321 may include:
for each item of first travel environment information:
step S401, comparing the item of first driving environment information with a corresponding item of second driving environment information; and
step S402, in response to the fact that the difference value between the item of first running environment information and the corresponding item of second running environment information is larger than a preset threshold value, taking the item of first running environment information as one item of fusion information.
Therefore, when the difference between the second driving environment information acquired from at least one of the operator, the surrounding vehicles, or the roadside equipment and the first driving environment information acquired by the host vehicle's on-board sensors is larger than the preset threshold, the first driving environment information is used as the item of fusion information, so as to avoid relying on second driving environment information that may be erroneous and thereby to ensure driving safety.
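A minimal Python sketch of the comparison rule in steps S401 and S402 is given below. Keeping the on-board reading when the two sources disagree by more than the threshold follows the text above; the else branch (averaging the two readings), the example item names, and the numeric threshold are assumptions added only for illustration.

# Sketch of the fusion rule of steps S401-S402. The averaging branch and the
# example values are assumptions; the disclosure only specifies that the
# on-board reading is kept when the two sources disagree too much.

def fuse_environment_info(first_info: dict[str, float],
                          second_info: dict[str, float],
                          threshold: float = 0.2) -> dict[str, float]:
    """first_info: items measured by the host vehicle's own sensors.
    second_info: corresponding items from an operator, surrounding
    vehicles, or roadside equipment."""
    fused = {}
    for item, onboard_value in first_info.items():
        external_value = second_info.get(item)
        if external_value is None or abs(onboard_value - external_value) > threshold:
            fused[item] = onboard_value   # distrust the external source
        else:
            fused[item] = 0.5 * (onboard_value + external_value)
    return fused

print(fuse_environment_info({"road_gradient": 0.03, "rainfall": 0.10},
                            {"road_gradient": 0.60, "rainfall": 0.12}))
# road_gradient deviates by more than 0.2, so the on-board 0.03 is kept;
# the two rainfall readings are close, so they are averaged.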
FIG. 5 is a flowchart illustrating a portion of the process in the autopilot method of FIG. 2 in accordance with an exemplary embodiment. According to some embodiments, as shown in fig. 5, the acquiring at least one item of in-vehicle occupant status information in step S210 may include: step S511 acquires an image of an occupant in the vehicle using the in-vehicle image sensor of the vehicle.
And in step S220, analyzing each of the acquired at least one item of in-vehicle occupant status information may include:
step S521, performing image recognition on the image to recognize a driver and a passenger in the image; and
step S522, judging whether the identified driver is in a fatigue driving state or a dangerous driving state; and/or
Step S523, judging whether the identified passengers include elderly people or children, or whether a passenger is in a sleeping state.
In step S511, the on-vehicle image sensor may be an image sensor in an occupant monitoring system for capturing an image in a vehicle cabin.
In the example of step S521, the person in the driver's seat in the image may be identified as the driver, and persons in the other seats may be identified as passengers.
In the example of step S522, the driver may be determined to be in a dangerous driving state when behaviors such as making a phone call, operating the center control screen, or eating are recognized. Whether the driver is in a fatigue driving state can be determined by recognizing the driver's facial expression.
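The following small Python sketch illustrates the kind of rule-based judgement described for step S522. The behavior labels, the fatigue score, and the threshold are assumptions; the perception model that would produce them is outside the scope of this example.

# Illustrative driver-state judgement for step S522. The labels produced by
# the (out-of-scope) perception model and the fatigue threshold are assumptions.

DANGEROUS_BEHAVIORS = {"making_a_call", "operating_center_screen", "eating"}

def classify_driver_state(detected_behaviors: set[str], fatigue_score: float) -> str:
    if detected_behaviors & DANGEROUS_BEHAVIORS:
        return "dangerous_driving"
    if fatigue_score > 0.7:   # e.g. derived from facial-expression recognition
        return "fatigue_driving"
    return "normal_driving"

print(classify_driver_state({"making_a_call"}, 0.2))  # -> "dangerous_driving"
print(classify_driver_state(set(), 0.85))             # -> "fatigue_driving"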
According to embodiments of the present disclosure, by analyzing the in-vehicle occupant state information, when fatigue driving or dangerous driving of the driver is detected, the behavior of the host vehicle in both lateral lane changing and longitudinal cruising can be restricted to a large extent based on this factor, to ensure the driving safety of the host vehicle and of other vehicles on the road. When the in-vehicle passengers are detected to be asleep, or to include elderly people or children, both the lateral lane-changing and longitudinal cruising behaviors of the host vehicle can likewise be restricted to a large extent based on this factor, so as to avoid aggressive driving and improve passenger comfort.
Fig. 6 is a flowchart illustrating a portion of the process in the automated driving method of fig. 2 according to an exemplary embodiment. According to some embodiments, as shown in fig. 6, step S230 may include:
step S631, determining factor values of each item of information according to the analysis result of each item of information;
step S632, acquiring the longitudinal weight and the transverse weight of each item of information;
step S633, determining the sum of the products of the factor value of each item of information and the longitudinal weight of that item of information as a longitudinal coefficient; and
step S634, determining the sum of the products of the factor value of each item of information and the transverse weight of that item of information as a transverse coefficient.
In the example of Table 1, the respective factor values C1 to C18, the longitudinal weights W1 to W18, and the transverse weights F1 to F18 of each item of information may be determined according to the analysis result of that item of information. It will be appreciated that the information listed in Table 1 is given by way of example; the information acquired in method 200 may be only part of the information in Table 1, or may include information not listed in Table 1.
Table 1
[Table 1 is provided as an image in the original publication and is not reproduced here. It lists example items of information together with their factor values C1-C18, longitudinal weights W1-W18, and transverse weights F1-F18.]
Then, the transverse coefficient X and the longitudinal coefficient Y can be determined by the following equations, respectively:
Transverse coefficient X = C1·F1 + C2·F2 + … + Cn·Fn    equation (1)
Longitudinal coefficient Y = C1·W1 + C2·W2 + … + Cn·Wn    equation (2)
In equations (1) and (2), n is a positive integer equal to the number of items of information.
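By way of a worked example of equations (1) and (2), the Python sketch below computes X and Y for three hypothetical items of information. The item names, factor values, and weights are placeholders for illustration and are not taken from Table 1.

# Worked example of equations (1) and (2). The three items and their factor
# values C_i, transverse weights F_i, and longitudinal weights W_i are
# placeholders, not values from Table 1.

items = {
    #                  C_i   F_i (transverse)  W_i (longitudinal)
    "night_time":     (0.6,  0.30,             0.20),
    "dense_trucks":   (0.8,  0.40,             0.10),
    "low_battery":    (0.5,  0.05,             0.30),
}

transverse_x = sum(c * f for c, f, _ in items.values())    # equation (1)
longitudinal_y = sum(c * w for c, _, w in items.values())  # equation (2)

print(round(transverse_x, 3), round(longitudinal_y, 3))    # 0.525 0.35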
According to some embodiments, the at least one longitudinal driving parameter may comprise at least one of: acceleration, jerk, distance to a following stop with a preceding vehicle, or a following time for a preceding vehicle.
According to some embodiments, the at least one lateral driving parameter may comprise at least one of: transverse avoidance acceleration, triggering lane change conditions, lane change time, collision time, or the latest lane change position in the navigation lane change process.
In the example of Table 2, examples of the longitudinal driving parameters and the lateral driving parameters are shown, together with example base values for each parameter; each base value may be determined empirically, for example as an average of a plurality of empirical values. Further, the lateral driving parameters modified with the lateral coefficient X, the longitudinal driving parameters modified with the longitudinal coefficient Y, and example allowable ranges of the driving parameters are also shown.
Table 2
[Table 2 is provided as an image in the original publication and is not reproduced here. For each longitudinal and lateral driving parameter it lists an example base value, the value modified by the longitudinal coefficient Y or the lateral coefficient X, and an example allowable range.]
Therefore, by modifying the driving parameters of these multiple dimensions with the lateral coefficient X and the longitudinal coefficient Y respectively, the control of the autonomous vehicle can be optimized, further improving the safety, traffic efficiency, and user comfort of automatic driving.
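As one possible way of applying the coefficients in the spirit of Table 2, the Python sketch below multiplies each base value by the relevant coefficient and clamps the result to the parameter's allowable range, reusing the coefficient values from the previous sketch. The base values, allowable ranges, and the multiply-then-clamp rule are illustrative assumptions; the real mapping would come from Table 2.

# Illustrative application of the coefficients to driving parameters.
# Base values, allowable ranges, and the multiply-then-clamp rule are assumptions.

def modified_parameter(base: float, coefficient: float,
                       allowed_min: float, allowed_max: float) -> float:
    return min(max(base * coefficient, allowed_min), allowed_max)

longitudinal_y = 0.35
transverse_x = 0.525

max_longitudinal_accel = modified_parameter(2.5, longitudinal_y, 0.5, 3.0)   # m/s^2
transverse_avoid_accel = modified_parameter(1.5, transverse_x, 0.3, 2.0)     # m/s^2

print(round(max_longitudinal_accel, 3), round(transverse_avoid_accel, 4))    # 0.875 0.7875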
It will be appreciated that, although the operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. For example, step S250 may be performed prior to step S240, or concurrently with step S240.
According to another aspect of the present disclosure, an autopilot device is provided. Fig. 7 is a schematic block diagram illustrating an autopilot device 700 according to an exemplary embodiment. As shown in fig. 7, the autopilot device 700 includes:
an information acquisition module 710 configured to acquire at least one item of running environment information, at least one item of vehicle running state information, and at least one item of in-vehicle occupant state information;
an analysis module 720 configured to analyze each of the obtained at least one item of driving environment information, the obtained at least one item of driving state information of the vehicle, and the obtained at least one item of state information of occupants in the vehicle to obtain a corresponding analysis result;
a coefficient determination module 730 configured to determine a longitudinal coefficient and a lateral coefficient according to an analysis result of each item of information;
a longitudinal parameter modification module 740 configured to modify a first amplitude limit of each of at least one longitudinal travel parameter using the longitudinal coefficient, each longitudinal travel parameter being used to control the host vehicle to cruise longitudinally along its own lane within the range of the respective first amplitude limit;
a lateral parameter modification module 750 configured to modify a second amplitude limit of each of the at least one lateral driving parameter using the lateral coefficient, each lateral driving parameter being used to control the lateral lane change of the host vehicle within the range of the respective second amplitude limit; and
a control module 760 configured to control the driving of the host vehicle based on the at least one longitudinal driving parameter having the modified respective first amplitude limit and the at least one lateral driving parameter having the modified respective second amplitude limit.
It should be understood that the various modules of the apparatus 700 shown in fig. 7 may correspond to the various steps in the method 200 described with reference to fig. 2. Thus, the operations, features and advantages described above with respect to method 200 are equally applicable to apparatus 700 and the modules included therein. Certain operations, features and advantages may not be described in detail herein for the sake of brevity.
Although specific functionality is discussed above with reference to particular modules, it should be noted that the functionality of the various modules discussed herein may be divided into multiple modules and/or at least some of the functionality of multiple modules may be combined into a single module. Performing an action by a particular module discussed herein includes the particular module itself performing the action, or alternatively the particular module invoking or otherwise accessing another component or module that performs the action (or performs the action in conjunction with the particular module). Thus, a particular module that performs an action can include the particular module that performs the action itself and/or another module that the particular module invokes or otherwise accesses that performs the action. For example, the longitudinal parameter modification module 740 and the lateral parameter modification module 750 described above may be combined into a single module in some embodiments.
It should also be appreciated that various techniques may be described herein in the general context of software, hardware elements, or program modules. The various modules described above with respect to fig. 7 may be implemented in hardware or in hardware in combination with software and/or firmware. For example, the modules may be implemented as computer program code/instructions configured to be executed in one or more processors and stored in a computer-readable storage medium. Alternatively, the modules may be implemented as hardware logic/circuitry. For example, in some embodiments, one or more of the information acquisition module 710, the analysis module 720, the coefficient determination module 730, the longitudinal parameter modification module 740, the lateral parameter modification module 750, and the control module 760 may be implemented together in a System on Chip (SoC). The SoC may include an integrated circuit chip that includes one or more components of a Processor (e.g., a Central Processing Unit (CPU), microcontroller, microprocessor, digital Signal Processor (DSP), etc.), memory, one or more communication interfaces, and/or other circuitry, and may optionally execute received program code and/or include embedded firmware to perform functions.
According to an aspect of the disclosure, a computer device is provided that includes at least one memory, at least one processor, and a computer program stored on the at least one memory. The at least one processor is configured to execute the computer program to implement the steps of any of the method embodiments described above.
According to an aspect of the present disclosure, there is provided a vehicle comprising an apparatus or a computer device as described above.
According to an aspect of the present disclosure, a non-transitory computer-readable storage medium is provided, having stored thereon a computer program which, when executed by a processor, implements the steps of any of the method embodiments described above.
According to an aspect of the present disclosure, a computer program product is provided, comprising a computer program which, when executed by a processor, performs the steps of any of the method embodiments described above.
Illustrative examples of such computer devices, non-transitory computer-readable storage media, and computer program products are described below in connection with fig. 8.
Fig. 8 illustrates an example configuration of a computer device 800 that may be used to implement the methods described herein. For example, the in-vehicle system 110 shown in fig. 1 may include an architecture similar to the computer device 800. The apparatus 700 described above may also be implemented in whole or at least in part by a computer device 800 or similar device or system.
The computer device 800 may include at least one processor 802, memory 804, communication interface(s) 806, display device 808, other input/output (I/O) devices 810, and one or more mass storage devices 812, which may be capable of communicating with each other, such as through a system bus 814 or other suitable connection.
Processor 802 may be a single processing unit or multiple processing units, all of which may include single or multiple computing units or multiple cores. The processor 802 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitry, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor 802 can be configured to retrieve and execute computer-readable instructions, such as program code for an operating system 816, program code for an application program 818, program code for other programs 820, and the like, stored in the memory 804, mass storage device 812, or other computer-readable medium.
Memory 804 and mass storage device 812 are examples of computer-readable storage media for storing instructions that are executed by processor 802 to implement the various functions described above. By way of example, the memory 804 may generally include both volatile and non-volatile memory (e.g., RAM, ROM, etc.). In addition, mass storage device 812 may generally include a hard disk drive, solid state drive, removable media, including external and removable drives, memory cards, flash memory, floppy disks, optical disks (e.g., CD, DVD), storage arrays, network attached storage, storage area networks, and the like. The memory 804 and mass storage device 812 may both be referred to collectively herein as memory or computer-readable storage medium, and may be non-transitory media capable of storing computer-readable, processor-executable program instructions as computer program code that may be executed by the processor 802 as a particular machine configured to implement the operations and functions described in the examples herein.
A number of programs may be stored on the mass storage device 812. These programs include an operating system 816, one or more application programs 818, other programs 820, and program data 822, and may be loaded into memory 804 for execution. Examples of such applications or program modules may include, for instance, computer program logic (e.g., computer program code or instructions) to implement the following components/functions: the application 112 (including the information acquisition module 710, the analysis module 720, the coefficient determination module 730, the longitudinal parameter modification module 740, the lateral parameter modification module 750, and the control module 760), the method 200, and/or further embodiments described herein.
Although illustrated in fig. 8 as being stored in memory 804 of computer device 800, modules 816, 818, 820, and 822, or portions thereof, may be implemented using any form of computer-readable media that is accessible by computer device 800. As used herein, "computer-readable media" includes at least two types of computer-readable media, namely computer-readable storage media and communication media.
Computer-readable storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. Computer-readable storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital Versatile Disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information for access by a computer device. In contrast, communication media may embody computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism. Computer-readable storage media, as defined herein, does not include communication media.
One or more communication interfaces 806 are used to exchange data with other devices, such as over a network or a direct connection. Such communication interfaces may be one or more of the following: any type of network interface (e.g., a network interface card (NIC)), a wired or wireless interface (such as an IEEE 802.11 wireless LAN (WLAN) interface), a Worldwide Interoperability for Microwave Access (Wi-MAX) interface, an Ethernet interface, a Universal Serial Bus (USB) interface, a cellular network interface, a Bluetooth™ interface, a Near Field Communication (NFC) interface, etc. The communication interface 806 may facilitate communication within a variety of networks and protocol types, including wired networks (e.g., LAN, cable, etc.) and wireless networks (e.g., WLAN, cellular, satellite, etc.), the Internet, and so forth. The communication interface 806 may also provide for communication with external storage devices (not shown), such as storage arrays, network attached storage, storage area networks, and the like.
In some examples, a display device 808, such as a monitor, may be included for displaying information and images to a user. Other I/O devices 810 may be devices that receive various inputs from a user and provide various outputs to the user, and may include touch input devices, gesture input devices, cameras, keyboards, remote controls, mice, printers, audio input/output devices, and so forth.
The techniques described herein may be supported by these various configurations of computer device 800 and are not limited to specific examples of the techniques described herein. For example, the functionality may also be implemented in whole or in part on a "cloud" using a distributed system. The cloud includes and/or represents a platform for resources. The platform abstracts underlying functionality of hardware (e.g., servers) and software resources of the cloud. The resources may include applications and/or data that may be used when performing computing processes on servers remote from the computer device 800. Resources may also include services provided over the internet and/or over a subscriber network such as a cellular or Wi-Fi network. The platform may abstract resources and functions to connect the computer device 800 with other computer devices. Thus, implementations of the functionality described herein may be distributed throughout the cloud. For example, the functionality may be implemented in part on the computer device 800 and in part by a platform that abstracts the functionality of the cloud.
While the disclosure has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative and exemplary rather than restrictive; the present disclosure is not limited to the disclosed embodiments. Variations of the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed subject matter, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps not listed, the indefinite article "a" or "an" does not exclude a plurality, the term "plurality" means two or more, and the term "based on" should be construed as "based at least in part on". The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
Some exemplary aspects of the disclosure will be described below.
Aspect 1, a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, causes the processor to perform a method according to an embodiment of the disclosure.
Aspect 2, a computer program product comprising a computer program which, when executed by a processor, causes the processor to perform a method according to an embodiment of the disclosure.

Claims (10)

1. An autonomous driving method comprising:
acquiring at least one item of running environment information, at least one item of running state information of the vehicle and at least one item of state information of passengers in the vehicle;
analyzing each item of information in the at least one item of driving environment information, the at least one item of driving state information of the vehicle and the at least one item of state information of passengers in the vehicle to obtain a corresponding analysis result;
determining a longitudinal coefficient and a transverse coefficient according to the analysis result of each item of information;
modifying a first amplitude limit of each longitudinal running parameter of at least one longitudinal running parameter by using the longitudinal coefficient, wherein each longitudinal running parameter is used for controlling the host vehicle to cruise longitudinally along its own lane within the range of the corresponding first amplitude limit;
modifying a second amplitude limit for each of at least one lateral driving parameter used to control the host vehicle lateral lane change within the range of the respective second amplitude limit using the lateral coefficient; and
controlling the vehicle to travel based on the at least one longitudinal travel parameter with the modified respective first amplitude limit and the at least one lateral travel parameter with the modified respective second amplitude limit.
2. The method of claim 1, wherein obtaining the at least one item of driving environment information comprises:
acquiring at least one item of first running environment information within a first range from the vehicle by using at least one vehicle-mounted sensor of the vehicle; and
acquiring at least one item of second running environment information within a second range from the host vehicle from at least one of the operator, the surrounding vehicle, or the roadside apparatus, and
wherein analyzing each of the acquired at least one item of traveling environment information includes:
fusing each item of first running environment information in the at least one item of first running environment information with a corresponding item of second running environment information in the at least one item of second running environment information to obtain at least one item of fused information; and
and analyzing the at least one item of fusion information to obtain the corresponding analysis result.
3. The method of claim 2, wherein fusing each of the at least one item of first travel environment information with a corresponding one of the at least one item of second travel environment information to obtain at least one fused information comprises:
for each item of first travel environment information:
comparing the item of first driving environment information with the corresponding item of second driving environment information; and
and in response to determining that the difference between the item of first running environment information and the corresponding item of second running environment information is greater than a preset threshold, taking the item of first running environment information as an item of fusion information.
4. The method of claim 1, wherein obtaining at least one item of in-vehicle occupant status information comprises: acquiring an image of an occupant in the vehicle using an on-vehicle image sensor of the vehicle, and
wherein analyzing each item of information in the acquired at least one item of in-vehicle occupant status information comprises:
performing image recognition on the image to identify a driver and a passenger in the image; and
judging whether the identified driver is in a fatigue driving state or a dangerous driving state; and/or
Determining whether the identified passengers include elderly people and children, or whether the passengers are in a sleeping state.
5. The method of any one of claims 1 to 4, wherein determining longitudinal coefficients and transverse coefficients from the analysis of each of the items of information comprises:
determining the factor value of each item of information according to the analysis result of each item of information;
acquiring the longitudinal weight and the transverse weight of each item of information;
determining the sum of the products of the factor value of each item of information and the longitudinal weight of that item of information as the longitudinal coefficient; and
determining the sum of the products of the factor value of each item of information and the transverse weight of that item of information as the transverse coefficient.
6. The method according to any one of claims 1 to 4, wherein the at least one longitudinal driving parameter comprises at least one of:
acceleration, jerk, the stopping distance kept from a preceding vehicle when following it to a stop, or the following time gap to a preceding vehicle.
7. The method according to any one of claims 1 to 4, wherein the at least one lateral driving parameter comprises at least one of:
transverse avoidance acceleration, the condition for triggering a lane change, the lane change duration, the time to collision, or the latest lane change position in a navigation lane change.
8. An autopilot device comprising:
an information acquisition module configured to acquire at least one item of running environment information, at least one item of vehicle running state information, and at least one item of in-vehicle occupant state information;
the analysis module is configured to analyze each item of the acquired at least one item of driving environment information, the acquired at least one item of driving state information of the vehicle and the acquired at least one item of state information of passengers in the vehicle to obtain a corresponding analysis result;
a coefficient determination module configured to determine a longitudinal coefficient and a transverse coefficient according to an analysis result of the each item of information;
a longitudinal parameter modification module configured to modify, using the longitudinal coefficient, a first amplitude limit of each of at least one longitudinal travel parameter, each longitudinal travel parameter being used for controlling the host vehicle to cruise longitudinally along its own lane within the range of the respective first amplitude limit;
a lateral parameter modification module configured to modify, using the lateral coefficient, a second amplitude limit of each of at least one lateral driving parameter, each lateral driving parameter being used for controlling the lateral lane change of the host vehicle within the range of the respective second amplitude limit; and
a control module configured to control the host vehicle to travel based on the at least one longitudinal travel parameter with the modified respective first amplitude limit and the at least one lateral travel parameter with the modified respective second amplitude limit.
9. A computer device, comprising:
at least one processor; and
at least one memory having a computer program stored thereon,
wherein the computer program, when executed by the at least one processor, causes the at least one processor to perform the method of any one of claims 1-7.
10. A vehicle comprising the apparatus of claim 8 or the computer device of claim 9.
CN202211227078.8A 2022-10-09 2022-10-09 Automatic driving method, automatic driving device, computer equipment, vehicle and storage medium Pending CN115610445A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211227078.8A CN115610445A (en) 2022-10-09 2022-10-09 Automatic driving method, automatic driving device, computer equipment, vehicle and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211227078.8A CN115610445A (en) 2022-10-09 2022-10-09 Automatic driving method, automatic driving device, computer equipment, vehicle and storage medium

Publications (1)

Publication Number Publication Date
CN115610445A true CN115610445A (en) 2023-01-17

Family

ID=84861255

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211227078.8A Pending CN115610445A (en) 2022-10-09 2022-10-09 Automatic driving method, automatic driving device, computer equipment, vehicle and storage medium

Country Status (1)

Country Link
CN (1) CN115610445A (en)

Similar Documents

Publication Publication Date Title
US11112793B2 (en) Mixed-mode driving of a vehicle having autonomous driving capabilities
CN110641472B (en) Safety monitoring system for autonomous vehicle based on neural network
US20230004157A1 (en) Mixed-mode driving of a vehicle having autonomous driving capabilities
US20190064805A1 (en) Mixed-mode driving of a vehicle having autonomous driving capabilities
US20190064800A1 (en) Mixed-mode driving of a vehicle having autonomous driving capabilities
WO2022016457A1 (en) Method and device for controlling switching of vehicle driving mode
CN113460042B (en) Vehicle driving behavior recognition method and recognition device
US20190064803A1 (en) Mixed-mode driving of a vehicle having autonomous driving capabilities
JPWO2018100619A1 (en) Vehicle control system, vehicle control method, and vehicle control program
US11926315B2 (en) Electronic apparatus for detecting risk factors around vehicle and method for controlling same
EP4316935A1 (en) Method and apparatus for obtaining lane change area
CN111464972A (en) Prioritized vehicle messaging
US20190064802A1 (en) Mixed-mode driving of a vehicle having autonomous driving capabilities
JP7140067B2 (en) VEHICLE CONTROL DEVICE, VEHICLE CONTROL METHOD, AND PROGRAM
WO2019046204A1 (en) Mixed-mode driving of a vehicle having autonomous driving capabilities
WO2022062825A1 (en) Vehicle control method, device, and vehicle
CN114450211A (en) Traffic control system, traffic control method, and control device
WO2022251766A1 (en) Trajectory consistency measurement for autonomous vehicle operation
CN114056346A (en) Automatic driving control method and device
CN115610445A (en) Automatic driving method, automatic driving device, computer equipment, vehicle and storage medium
WO2023028273A1 (en) System and method of adaptive distribution of autonomous driving computations
US20220388534A1 (en) Method and system for predicting behavior of actors in an environment of an autonomous vehicle
WO2023277012A1 (en) Information processing system and information processing device
WO2023176256A1 (en) Processing method, driving system, and processing program
WO2022259621A1 (en) Information processing device, information processing method, and computer program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination