CN111830949B - Automatic driving vehicle control method, device, computer equipment and storage medium - Google Patents

Automatic driving vehicle control method, device, computer equipment and storage medium

Info

Publication number
CN111830949B
CN111830949B CN201910237979.7A
Authority
CN
China
Prior art keywords
neural network
navigation
loss function
branch neural
branch
Prior art date
Legal status
Active
Application number
CN201910237979.7A
Other languages
Chinese (zh)
Other versions
CN111830949A (en)
Inventor
刘文如
裴锋
闫春香
王玉龙
闵欢
Current Assignee
Guangzhou Automobile Group Co Ltd
Original Assignee
Guangzhou Automobile Group Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Automobile Group Co Ltd filed Critical Guangzhou Automobile Group Co Ltd
Priority to CN201910237979.7A priority Critical patent/CN111830949B/en
Publication of CN111830949A publication Critical patent/CN111830949A/en
Application granted granted Critical
Publication of CN111830949B publication Critical patent/CN111830949B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0221 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T - CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 - Road transport of goods or passengers
    • Y02T10/10 - Internal combustion engine [ICE] based vehicles
    • Y02T10/40 - Engine management systems

Abstract

In the method, the navigation branch neural network is trained jointly with the intersection steering constraint neural network, and the joint training process assists and constrains the learning of the navigation branch neural network. As a result, even when a complex road scene is encountered, the navigation branch neural network can output a correct control instruction according to the current environment information, the running track of the automatic driving vehicle is controlled in a standardized manner, the driving safety of the vehicle is guaranteed, and the adaptability of the automatic driving vehicle to various complex scenes is greatly improved.

Description

Automatic driving vehicle control method, device, computer equipment and storage medium
Technical Field
The present disclosure relates to the field of automatic driving technology, and in particular, to an automatic driving vehicle control method and apparatus, a computer device, and a storage medium.
Background
In the field of automatic driving, end-to-end automatic driving refers to an automatic driving technology that uses a deep neural network to imitate and learn human driving behavior, so that a direct mapping from sensing information to control information is realized and a complete automatic driving system can be formed.
At present, the main body of an end-to-end automatic driving system is a neural network: the network receives images as input at its entrance, generates vehicle control instructions as output at its exit, and directly maps the input to the output. In this process, the neural network adaptively learns the internal connection between the input and the output, forming a tightly packaged, closed-loop unmanned driving system in which the learning process is difficult to directly control or intervene in. Moreover, the learning capacity of the neural network has limits: it can only handle simple problems and cannot adapt to complex scenes. For example, when turning at intersections, an unmanned vehicle faces a complex scene, and it cannot be guaranteed that the vehicle turns at an accurate angle at different intersections and passes through them smoothly.
Therefore, the existing end-to-end automatic driving system suffers from the technical problem that it cannot adapt to complex scenes.
Disclosure of Invention
Based on this, it is necessary to provide an automatic driving vehicle control method, apparatus, computer device and storage medium for solving the technical problem that the existing end-to-end automatic driving system cannot adapt to complex scenes.
In a first aspect, an embodiment of the present application provides a method for controlling an autonomous vehicle, the method including:
acquiring current environmental information around a vehicle;
inputting the current environment information into a preset navigation branch neural network to obtain a control instruction; the navigation branch neural network is a network model obtained by combined training with a preset intersection steering constraint neural network; the intersection steering constraint neural network is used for acquiring the relative distance between the vehicle and the road shoulder;
and controlling the vehicle to run according to the control instruction.
In one embodiment, the navigational branch neural network includes a visual feature extraction neural network and a plurality of sub-branch neural networks;
the step of inputting the current environment information into a preset navigation branch neural network to obtain a control instruction comprises the following steps:
extracting visual features from the environment information by using the visual feature extraction neural network;
And inputting the visual characteristics into a target sub-branch neural network to obtain the control instruction.
In one embodiment, the sub-branch neural network includes a left-turn branch neural network, a right-turn branch neural network, and a lane-keeping branch neural network, and the method includes, prior to the inputting the visual feature into the target sub-branch neural network:
obtaining current navigation information of a vehicle, wherein the navigation information comprises any one of a left turn instruction, a right turn instruction and a straight-going instruction;
if the navigation information is a left turn instruction, determining the left turn branch neural network as the target sub-branch neural network;
if the navigation information is a right turn instruction, determining the right turn branch neural network as the target sub-branch neural network;
and if the navigation information is a straight-going instruction, determining the lane keeping branch neural network as the target sub-branch neural network.
In one embodiment, the method further comprises:
respectively inputting environmental information around a plurality of groups of vehicles into an initial navigation branch neural network and an initial intersection steering constraint neural network;
the outputs of the initial navigation branch neural network and the initial intersection steering constraint neural network are input into a preset system loss function for joint training, and the value of the system loss function is obtained;
And adjusting parameters of the initial navigation branch neural network and the initial intersection steering constraint neural network according to the value of the system loss function until the value of the system loss function reaches a preset threshold value to obtain the navigation branch neural network and the intersection steering constraint neural network.
In one embodiment, the step of inputting the outputs of the initial navigation branch neural network and the initial intersection steering constraint neural network into a preset system loss function to perform joint training to obtain the value of the system loss function includes:
taking the output of the initial navigation branch neural network as the input of a preset first loss function, and taking the output of the initial intersection steering constraint neural network as the input of a preset second loss function to obtain the value of the first loss function and the value of the second loss function;
and acquiring the value of the system loss function according to the value of the first loss function, the weight of the first loss function, the value of the second loss function and the weight of the second loss function.
In one embodiment, the method further comprises:
and adjusting the weight of the first loss function and the weight of the second loss function according to the value of the system loss function.
In a second aspect, embodiments of the present application provide an autonomous vehicle control apparatus, the apparatus comprising:
the environment information acquisition module is used for acquiring current environment information around the vehicle;
the control instruction output module is used for inputting the current environment information into a preset navigation branch neural network to obtain a control instruction; the navigation branch neural network is a network model obtained by combined training with a preset intersection steering constraint neural network; the intersection steering constraint neural network is used for acquiring the relative distance between the vehicle and the road shoulder;
and the vehicle driving control module is used for controlling the vehicle to drive according to the control instruction.
In a third aspect, an embodiment of the present application provides a computer device, including a memory and a processor, where the memory stores a computer program, and the processor implements the steps of any embodiment of the first aspect when the processor executes the computer program.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of any of the embodiments of the first aspect described above.
According to the method, the device, the computer equipment and the storage medium for controlling the automatic driving vehicle, the computer equipment inputs the acquired current environmental information around the vehicle into the preset navigation branch neural network to obtain the control instruction, and the vehicle is controlled to run according to the control instruction.
Drawings
FIG. 1 is an application environment diagram of an autonomous vehicle control method according to one embodiment;
FIG. 2 is a flow chart of a method for controlling an autonomous vehicle according to one embodiment;
FIG. 3 is a flow chart of a method for controlling an autonomous vehicle according to one embodiment;
FIG. 4 is a flow chart of a method for controlling an autonomous vehicle according to one embodiment;
FIG. 5 is a flow chart of a method for controlling an autonomous vehicle according to one embodiment;
FIG. 6 is a flow chart of a method for controlling an autonomous vehicle according to one embodiment;
FIG. 7 is a block diagram of an autonomous vehicle control system according to one embodiment;
FIG. 8 is a block diagram of an autonomous vehicle control apparatus according to one embodiment;
FIG. 9 is a block diagram of an autonomous vehicle control apparatus according to one embodiment;
FIG. 10 is a block diagram of an autonomous vehicle control apparatus according to one embodiment;
FIG. 11 is a block diagram of an autonomous vehicle control apparatus according to one embodiment;
FIG. 12 is a block diagram of an autonomous vehicle control apparatus according to one embodiment;
fig. 13 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
The automatic driving vehicle control method can be applied to the application environment shown in fig. 1. The automatic driving system comprises a data acquisition device, computer equipment and a vehicle; the data acquisition device and the computer equipment can be arranged on the vehicle or outside the vehicle. The data acquisition device is used for collecting current environment information around the vehicle, and the computer equipment is used for controlling the driving of the vehicle according to the data collected by the data acquisition device. The computer equipment comprises a processor, a memory, a network interface and a database which are connected through a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, computer programs, and a database. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage medium. The database of the computer device is used for storing automatic driving vehicle control data. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement an automatic driving vehicle control method.
The embodiment of the application provides an automatic driving vehicle control method, an automatic driving vehicle control device, computer equipment and a storage medium, and aims to solve the technical problem that an existing end-to-end automatic driving system cannot adapt to a complex scene. The following will specifically describe the technical solutions of the present application and how the technical solutions of the present application solve the above technical problems by means of examples and with reference to the accompanying drawings. The following embodiments may be combined with each other, and the same or similar concepts or processes may not be described in detail in some embodiments. It should be noted that, in the method for controlling an automatic driving vehicle provided in the present application, the execution body in fig. 2 to 6 is a computer device, and the execution body may also be an automatic driving vehicle control device, where the device may be implemented in a manner of software, hardware or a combination of software and hardware to become part or all of the automatic driving vehicle control.
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present application more clear, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments.
In one embodiment, fig. 2 provides a method for controlling an automatic driving vehicle, and the embodiment relates to a specific process that a computer device obtains a control instruction by adopting a preset navigation branch neural network according to current environmental information around the vehicle, and controls the running of the vehicle according to the control instruction. As shown in fig. 2, the method includes:
s101, acquiring current environment information around the vehicle.
In this embodiment, the environmental information around the vehicle represents road condition information and obstacle information around the vehicle. The road condition information may include information such as the traffic flow on the left and right adjacent lanes and a fork intersection ahead, and the obstacle information may include information such as the size and position of roadside obstacles; the content included in the environmental information is not specifically limited in this embodiment. In practical application, the computer equipment may acquire the current environmental information around the vehicle by using a camera to collect environmental images around the vehicle, dividing the collected images into frame-by-frame image information, and then determining the specific environmental information according to the image information; the surrounding environmental information may also be obtained by other environment sensors such as radar, which is not limited in this embodiment.
S102, inputting the current environment information into a preset navigation branch neural network to obtain a control instruction; the navigation branch neural network is a network model obtained by combined training with a preset intersection steering constraint neural network; the intersection steering constraint neural network is used for acquiring the relative distance between the vehicle and the road shoulder.
In this step, the control instruction indicates an instruction for controlling the vehicle to run safely and normally; for example, the control instruction may be a steering wheel angle, an accelerator pedal position, a braking force or other control parameters. The corresponding control instruction can be obtained by inputting the current environmental information around the vehicle obtained by the computer device in step S101 into a preset navigation branch neural network. The navigation branch neural network is a network model obtained by joint training with the preset intersection steering constraint neural network, and the intersection steering constraint neural network is a network model for obtaining the relative distance between the vehicle and the road shoulder: it can take the environmental information around the vehicle as input for training and learning, and then output the relative distance between the vehicle and the road shoulder. The specific process by which the computer device jointly trains the navigation branch neural network with the intersection steering constraint neural network is described in detail in the following embodiments, and is not repeated here.
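For illustration of how such a constraint network might look, a minimal sketch in PyTorch is given below. The class name ShoulderDistanceNet, the layer sizes and the single-value regression head are assumptions made for the example and are not the architecture disclosed in this application; the application only requires that the network map the surrounding environment image to the relative distance between the vehicle and the road shoulder.

```python
import torch
import torch.nn as nn

class ShoulderDistanceNet(nn.Module):
    """Illustrative intersection steering constraint network: camera image -> vehicle-to-shoulder distance."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 24, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(24, 48, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(48, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),                      # pool to a fixed-size feature vector
        )
        self.regressor = nn.Sequential(
            nn.Flatten(), nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 1)
        )

    def forward(self, image):
        # image: (batch, 3, H, W); returns (batch, 1) predicted relative distance to the road shoulder
        return self.regressor(self.features(image))
```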
And S103, controlling the vehicle to run according to the control instruction.
Based on the control instruction obtained in step S102, the computer device controls the vehicle to run according to the control instruction, and, for example, taking the steering wheel angle, the accelerator pedal position and the braking force as examples of the control instruction, the computer device controls the degree of the steering wheel angle of the vehicle to be the same as the degree given in the control instruction, the accelerator pedal position to be the same as the position given in the control instruction and the braking force to be the same as the braking force given in the control instruction according to the control instruction. In this embodiment, although the intersection constraint neural network only assists the navigation branch neural network to learn when the network model is trained, and does not intervene when the navigation branch neural network outputs a control instruction according to current environmental information around the vehicle, the computer device refers to the relative distance between the vehicle and the road shoulder output by the intersection constraint neural network when controlling the vehicle to travel according to the control instruction output by the navigation branch neural network, so as to ensure the traveling safety and normalization of the vehicle.
According to the automatic driving vehicle control method, the computer equipment inputs the acquired current environmental information around the vehicle into the preset navigation branch neural network to obtain the control instruction, and the driving of the vehicle is controlled according to the control instruction.
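As a rough usage sketch of steps S101 to S103, the loop below assumes a trained navigation branch network nav_net that maps an image and a navigation command to a (steering wheel angle, accelerator pedal position, braking force) triple; get_camera_frame and apply_control are hypothetical stand-ins for the vehicle's sensor and actuator interfaces, not interfaces defined by this application.

```python
import torch

def get_camera_frame():
    # Hypothetical stand-in for the camera interface: one 3x224x224 RGB frame.
    return torch.zeros(3, 224, 224)

def apply_control(steer, throttle, brake):
    # Hypothetical stand-in for the vehicle actuator interface.
    print(f"steer={steer:.3f}, throttle={throttle:.3f}, brake={brake:.3f}")

def drive_step(nav_net, navigation_command):
    """One S101-S103 cycle: acquire environment information, infer a control instruction, actuate."""
    image = get_camera_frame()                                                      # S101
    with torch.no_grad():
        # navigation_command: branch index derived from the current navigation information (assumed 0/1/2 encoding)
        steer, throttle, brake = nav_net(image.unsqueeze(0), navigation_command)[0]  # S102
    apply_control(steer.item(), throttle.item(), brake.item())                       # S103
```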
The specific process by which the computer device inputs the current environment information into the preset navigation branch neural network to obtain the control instruction is described in the following embodiment. The navigation branch neural network includes a visual feature extraction neural network and a plurality of sub-branch neural networks. On the basis of the above embodiment, as shown in fig. 3, the step S102 includes:
s201, extracting the visual characteristics from the environment information by adopting the visual characteristic extraction neural network.
In this embodiment, based on the current environmental information around the vehicle obtained in step S101, the computer device extracts visual features from the environmental information by using the visual feature extraction neural network. The visual features represent information such as the points, edge lines, straight lines and curves of each object and space in the environmental information around the vehicle. Since the visual feature extraction neural network is a model trained in advance on multiple pieces of environmental information, in practical application, assuming that the environmental information obtained by the computer device is in the form of an image, the computer device directly inputs the image into the visual feature extraction network, and the obtained output result is the extracted visual features.
S202, inputting the visual characteristics into a target sub-branch neural network to obtain the control instruction.
It should be noted that, when the navigation branch neural network obtains the control instruction according to the current environmental information around the vehicle, it is divided into a plurality of sub-branch neural networks; that is, different road conditions adopt different sub-branch neural networks to output the control instruction, and the different sub-branch neural networks are network models trained in advance for different road conditions. Therefore, in this step, based on the visual features extracted in step S201, the computer device inputs the visual features into the target sub-branch neural network to obtain a control instruction suitable for the current situation.
According to the automatic driving vehicle control method provided by this embodiment, the computer equipment first extracts visual features from the environment information by using the visual feature extraction neural network, and then inputs the visual features into the target sub-branch neural network to obtain the control instruction. Because the target sub-branch neural network is the neural network suitable for the current road condition, the obtained control instruction is a standard control instruction; for different road conditions, different target sub-branch neural networks are selected, so that the intelligent adaptability of the automatic driving vehicle to various complex scenes is greatly improved.
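One possible realization of the shared visual feature extraction neural network with three sub-branch heads is sketched below in PyTorch. The class name NavigationBranchNet, the convolutional layer sizes and the 0/1/2 command encoding are illustrative assumptions rather than the disclosed design.

```python
import torch
import torch.nn as nn

class NavigationBranchNet(nn.Module):
    """Shared CNN visual features feeding one of three sub-branch heads (left turn / right turn / lane keeping)."""
    def __init__(self, feat_dim=128):
        super().__init__()
        self.extractor = nn.Sequential(                     # visual feature extraction neural network
            nn.Conv2d(3, 24, 5, stride=2), nn.ReLU(),
            nn.Conv2d(24, 48, 5, stride=2), nn.ReLU(),
            nn.Conv2d(48, 64, 3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, feat_dim), nn.ReLU(),
        )
        # One head per navigation instruction; each outputs (steering wheel angle, accelerator position, braking force).
        self.branches = nn.ModuleList(
            [nn.Sequential(nn.Linear(feat_dim, 64), nn.ReLU(), nn.Linear(64, 3)) for _ in range(3)]
        )

    def forward(self, image, command):
        # command: 0 = left turn, 1 = right turn, 2 = straight (lane keeping); assumed encoding
        features = self.extractor(image)
        return self.branches[command](features)
```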
In the above embodiment, the sub-branch neural networks include three types: a left-turn branch neural network, a right-turn branch neural network and a lane-keeping branch neural network; that is, different road conditions correspond to different target sub-branch neural networks. This means that before inputting the visual features into the target sub-branch neural network, the computer device needs to determine the target sub-branch neural network first. As shown in fig. 4, the method includes:
s301, acquiring current navigation information of a vehicle; the navigation information includes any one of a left turn instruction, a right turn instruction, and a straight-going instruction.
In this embodiment, the sub-branch neural networks are distinguished according to different road conditions, so the computer device needs to determine the current road condition first. Taking the selection of a driving track in an intersection scene as an example, the computer device obtains the current navigation information of the vehicle, where the navigation information includes any one of a left turn instruction, a right turn instruction and a straight-going instruction; the computer device can then determine, according to the navigation information, whether the driving track of the vehicle at the intersection should be a left turn, a right turn or going straight. The current navigation information of the vehicle may be obtained by the computer device from a navigation device of the vehicle, which is not limited in this embodiment.
And S302, if the navigation information is a left turn instruction, determining the left turn branch neural network as the target sub-branch neural network.
Taking the scene of the intersection as an example, in the step, the navigation information indicates that the current intersection needs to turn left, the computer equipment determines the left-turn branch neural network as a target sub-branch neural network, and inputs the visual features extracted in the step S201 into the left-turn branch neural network, so that the output control instruction is a steering wheel angle, an accelerator pedal position and a braking force for indicating the safe and standard left turn of the vehicle at the intersection.
And S303, if the navigation information is a right turn instruction, determining the right turn branch neural network as the target sub-branch neural network.
In this step, if the navigation information indicates that the current intersection needs to turn right, the computer device determines the right-turn branch neural network as a target sub-branch neural network, and inputs the visual features extracted in the step S201 into the right-turn branch neural network, so that the output control instruction is a steering wheel angle, an accelerator pedal position and a braking force for indicating that the vehicle turns right safely and normally at the intersection.
And S304, if the navigation information is a straight-going instruction, determining the lane keeping branch neural network as the target sub-branch neural network.
In this step, if the navigation information indicates that the current intersection needs to be passed straight through, the computer device determines the lane keeping branch neural network as the target sub-branch neural network, and inputs the visual features extracted in the step S201 into the lane keeping branch neural network, so that the output control instruction is a steering wheel angle, an accelerator pedal position and a braking force for indicating that the vehicle goes straight through the intersection safely and normally.
According to the automatic driving vehicle control method provided by this embodiment, the computer equipment determines the target sub-branch neural network according to the acquired navigation information, and inputs the visual features extracted from the current environment information into the target sub-branch neural network to obtain the corresponding control instruction. Because the target sub-branch neural network represents the neural network suitable for the current road condition, the obtained control instruction is a standard control instruction; for different road conditions, different target sub-branch neural networks are selected, so that the intelligent adaptability of the automatic driving vehicle to various complex scenes is greatly improved.
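The selection logic of steps S301 to S304 reduces to a simple mapping from the navigation instruction to the target sub-branch. A sketch is given below; the string values of the navigation instruction and the 0/1/2 branch indices follow the assumed encoding of the network sketch above and are hypothetical.

```python
def select_target_branch(navigation_command: str) -> int:
    """Map the current navigation instruction to a sub-branch index (assumed encoding)."""
    mapping = {
        "left_turn": 0,   # S302: left-turn branch neural network
        "right_turn": 1,  # S303: right-turn branch neural network
        "straight": 2,    # S304: lane-keeping branch neural network
    }
    if navigation_command not in mapping:
        raise ValueError(f"unknown navigation instruction: {navigation_command}")
    return mapping[navigation_command]
```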
In addition, the embodiment of the application also provides an automatic driving vehicle control method, which relates to a specific process of jointly training an initial navigation branch neural network and an initial intersection steering constraint neural network by computer equipment to obtain the navigation branch neural network and the intersection steering constraint neural network, as shown in fig. 5, and the method further comprises the following steps:
S401, inputting environmental information around a plurality of vehicles into an initial navigation branch neural network and an initial intersection steering constraint neural network respectively.
In this embodiment, the environmental information around the plurality of vehicles is pre-collected data, and in practical application, the computer device uses the environmental information around the plurality of vehicles as input data and inputs the input data into the initial navigation branch neural network and the initial intersection constraint neural network respectively, so as to train the initial navigation branch neural network and the initial intersection constraint neural network.
S402, inputting the output of the initial navigation branch neural network and the output of the initial intersection steering constraint neural network into a preset system loss function for joint training, and obtaining the value of the system loss function.
Based on the step S401, the computer device may obtain outputs of the initial navigation branch neural network and the initial intersection steering constraint neural network, and then input the outputs into a preset system loss function to obtain a value of the system loss function, so as to perform joint training on the initial navigation branch neural network and the initial intersection constraint neural network, where the value of the system loss function is used to evaluate a training result common to the initial navigation branch neural network and the initial intersection constraint neural network.
As shown in fig. 6, one implementation manner of the step S402 includes:
s501, taking the output of the initial navigation branch neural network as the input of a preset first loss function, and taking the output of the initial intersection steering constraint neural network as the input of a preset second loss function, so as to obtain the value of the first loss function and the value of the second loss function.
In this embodiment, the computer device uses the output of the initial navigation branch neural network as an input of a preset first loss function to obtain a value of the first loss function, and uses the output of the initial intersection steering constraint neural network as an input of a preset second loss function to obtain a value of the second loss function.
S502, acquiring the value of the system loss function according to the value of the first loss function, the weight of the first loss function, the value of the second loss function and the weight of the second loss function.
Based on the value of the first loss function and the value of the second loss function obtained in step S501, the computer device obtains the value of the system loss function according to the value of the first loss function, the weight of the first loss function, the value of the second loss function and the weight of the second loss function, where the two weights may be assigned manually based on experience. For example, let the system loss function be L, the first loss function be L1 and the second loss function be L2; then L = αL1 + βL2, where α and β represent the weights of L1 and L2 in the system loss function and α + β = 1. During the training process, the weights may be adjusted according to the training result: optionally, the computer device adjusts the weight of the first loss function and the weight of the second loss function in real time according to the value of the system loss function, so as to ensure that the system loss function reaches the preset threshold.
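Under the notation above, the system loss L = αL1 + βL2 with α + β = 1 can be computed as in the following sketch. Using mean squared error for both the first and second loss functions is an assumption, since this application does not fix the concrete form of L1 and L2.

```python
import torch.nn.functional as F

def system_loss(pred_control, true_control, pred_distance, true_distance, alpha=0.7):
    """L = alpha*L1 + beta*L2 with beta = 1 - alpha (weight values are illustrative)."""
    l1 = F.mse_loss(pred_control, true_control)     # first loss: navigation branch network output
    l2 = F.mse_loss(pred_distance, true_distance)   # second loss: vehicle-to-shoulder relative distance
    beta = 1.0 - alpha
    return alpha * l1 + beta * l2
```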
S403, adjusting parameters of the initial navigation branch neural network and the initial intersection steering constraint neural network according to the value of the system loss function until the value of the system loss function reaches a preset threshold value, and obtaining the navigation branch neural network and the intersection steering constraint neural network.
In this step, based on the values of the system loss function obtained in the step S402, the computer device adjusts parameters of the initial navigation branch neural network and the initial intersection constraint neural network, so as to reduce the values of the system loss function, until the values of the system loss function reach a preset threshold, which indicates that the initial navigation branch neural network and the initial intersection constraint neural network have been trained, and thus the navigation branch neural network and the intersection constraint neural network are obtained.
According to the automatic driving vehicle control method provided by this embodiment, the computer equipment inputs the environmental information around a plurality of vehicles as input data into the initial navigation branch neural network and the initial intersection steering constraint neural network, inputs the outputs of the two networks into the preset system loss function to obtain the value of the system loss function, and then adjusts the parameters of the two initial networks according to the value of the system loss function until the value of the system loss function reaches the preset threshold, thereby obtaining the trained navigation branch neural network and intersection steering constraint neural network. In this way, the initial navigation branch neural network and the initial intersection steering constraint neural network are trained jointly and their training results are evaluated uniformly by the system loss function, so that the intersection steering constraint neural network assists and constrains the learning of the navigation branch neural network. Even when a complex road scene is encountered, the navigation branch neural network can output correct control instructions, which greatly ensures the driving safety and normalization of the automatic driving vehicle.
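A minimal joint training loop in the spirit of steps S401 to S403 is sketched below, reusing the network and loss sketches above. The data loader format, the Adam optimizer, the learning rate and the stopping threshold are illustrative assumptions; batches are assumed to be grouped by navigation instruction so that a single branch index is passed per batch.

```python
import torch

def train_jointly(nav_net, constraint_net, loader, threshold=0.01, alpha=0.7, lr=1e-4, max_epochs=100):
    """S401-S403: jointly train the navigation branch and intersection steering constraint networks."""
    params = list(nav_net.parameters()) + list(constraint_net.parameters())
    optimizer = torch.optim.Adam(params, lr=lr)
    loss = torch.tensor(float("inf"))
    for _ in range(max_epochs):
        # loader is assumed to yield (image batch, branch index, control targets, shoulder-distance targets)
        for image, command, true_control, true_distance in loader:        # S401: pre-collected environment data
            pred_control = nav_net(image, command)                        # output of the navigation branch network
            pred_distance = constraint_net(image)                         # output of the constraint network
            loss = system_loss(pred_control, true_control,
                               pred_distance, true_distance, alpha=alpha) # S402: value of the system loss function
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()                                              # S403: adjust parameters of both networks
        if loss.item() <= threshold:                                      # stop once the preset threshold is reached
            break
    return nav_net, constraint_net
```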
To further assist understanding, as shown in fig. 7, an embodiment of the present application provides a block diagram of an automatic driving vehicle control system. In this system, the [1] camera is used as the visual input end to obtain the front image during driving. The image is taken as input and enters the [2] visual feature extraction neural network, which extracts the features of the input image from [1] by using a convolutional neural network; through the learning ability of the convolutional neural network on images, visual features useful in the driving process are extracted. The visual features extracted by [2] are shared by each sub-branch neural network. The [3] navigation information branch point provides navigation instructions, which comprise three kinds: left turn, right turn and straight going. After the three navigation instructions, three branch neural networks are respectively connected: the [4] left-turn branch neural network, the [5] right-turn branch neural network and the [6] lane-keeping branch neural network. When a branch neural network is activated, it outputs a set of control quantities: the left-turn branch neural network [4] corresponds to the control of [7], the right-turn branch neural network [5] corresponds to the control of [8], and the lane-keeping branch neural network [6] corresponds to the control of [9]. The control quantities in [7], [8] and [9] include steering wheel angle, accelerator pedal position, braking force and the like, so that control of the automatically driven vehicle can be achieved by executing these control quantities. In addition, the intersection steering constraint neural network takes the image from the [1] camera as input, learns by using the [11] neural network, and outputs [12] the relative distance between the vehicle and the road shoulder. In the training process of the system, the loss value of the [14] system loss function is reduced through the learning of the networks, so that the aim of jointly training the navigation branch neural network and the intersection steering constraint neural network is fulfilled.
It should be understood that, although the steps in the flowcharts of fig. 2-6 are shown in the order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the execution of these steps is not strictly limited to this order, and the steps may be executed in other orders. Moreover, at least some of the steps in fig. 2-6 may include multiple sub-steps or stages that are not necessarily performed at the same time but may be performed at different times, and these sub-steps or stages are not necessarily performed in sequence but may be performed in turn or alternately with at least a portion of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 8, there is provided an automatic driving vehicle control apparatus, the apparatus including: an environmental information acquisition module 10, a control instruction output module 11, and a control vehicle running module 12, wherein,
an environmental information acquisition module 10 for acquiring current environmental information around the vehicle;
the control instruction output module 11 is used for inputting the current environment information into a preset navigation branch neural network to obtain a control instruction; the navigation branch neural network is a network model obtained by combined training with a preset intersection steering constraint neural network; the intersection steering constraint neural network is used for acquiring the relative distance between the vehicle and the road shoulder;
A control vehicle travel module 12 for controlling travel of the vehicle in accordance with the control instruction.
The implementation principle and technical effects of the automatic driving vehicle control device provided in the above embodiment are similar to those of the above method embodiment, and are not described herein again.
In one embodiment, as shown in fig. 9, there is provided an automatic driving vehicle control apparatus, the control instruction output module 11 including: a visual characteristics extraction unit 111, and a control instruction output unit 112, wherein,
a visual feature extraction unit 111 for extracting visual features from the environmental information using the visual feature extraction neural network;
and a control instruction output unit 112, configured to input the visual characteristic into a target sub-branch neural network, and obtain the control instruction.
The implementation principle and technical effects of the automatic driving vehicle control device provided in the above embodiment are similar to those of the above method embodiment, and are not described herein again.
In one embodiment, as shown in fig. 10, there is provided an automatic driving vehicle control apparatus, the apparatus including: a navigation information acquisition unit 113, a first sub-branch determination unit 114, a second sub-branch determination unit 115, and a third sub-branch determination unit 116, wherein,
A navigation information obtaining unit 113 for obtaining current navigation information of the vehicle; the navigation information comprises any one of a left turn instruction, a right turn instruction and a straight-going instruction;
a first sub-branch determining unit 114, configured to determine the left-turn branch neural network as the target sub-branch neural network if the navigation information is a left-turn instruction;
a second sub-branch determining unit 115, configured to determine the right-turn branch neural network as the target sub-branch neural network if the navigation information is a right-turn instruction;
and a third sub-branch determining unit 116, configured to determine the lane keeping branch neural network as the target sub-branch neural network if the navigation information is a straight-ahead instruction.
The implementation principle and technical effects of the automatic driving vehicle control device provided in the above embodiment are similar to those of the above method embodiment, and are not described herein again.
In one embodiment, as shown in fig. 11, there is provided an autonomous vehicle control apparatus, the apparatus further comprising: an input module 13, a training module 14 and an adjustment module 15, wherein,
the input module 13 is used for respectively inputting the environmental information around a plurality of vehicles into the initial navigation branch neural network and the initial intersection steering constraint neural network;
The training module 14 is configured to input the outputs of the initial navigation branch neural network and the initial intersection steering constraint neural network into a preset system loss function for joint training, so as to obtain a value of the system loss function;
and the adjusting module 15 is configured to adjust parameters of the initial navigation branch neural network and the initial intersection steering constraint neural network according to the value of the system loss function until the value of the system loss function reaches a preset threshold value, thereby obtaining the navigation branch neural network and the intersection steering constraint neural network.
The implementation principle and technical effects of the automatic driving vehicle control device provided in the above embodiment are similar to those of the above method embodiment, and are not described herein again.
In one embodiment, as shown in fig. 12, there is provided an automatic driving vehicle control apparatus, the training module 14 includes: a first determination unit 141 and a second determination unit 142, wherein,
a first determining unit 141, configured to take an output of the initial navigation branch neural network as an input of a preset first loss function, and take an output of the initial intersection steering constraint neural network as an input of a preset second loss function, so as to obtain a value of the first loss function and a value of the second loss function;
A second determining unit 142, configured to obtain the value of the system loss function according to the value of the first loss function, the weight of the first loss function, the value of the second loss function, and the weight of the second loss function.
The implementation principle and technical effects of the automatic driving vehicle control device provided in the above embodiment are similar to those of the above method embodiment, and are not described herein again.
In one embodiment, an autonomous vehicle control apparatus is provided, the apparatus further for adjusting the weight of the first loss function and the weight of the second loss function according to the value of the system loss function.
The implementation principle and technical effects of the automatic driving vehicle control device provided in the above embodiment are similar to those of the above method embodiment, and are not described herein again.
The specific limitation regarding the automatic driving vehicle control apparatus may be referred to the limitation regarding the automatic driving vehicle control method hereinabove, and will not be described in detail herein. The respective modules in the above-described autonomous vehicle control apparatus may be implemented in whole or in part by software, hardware, and a combination thereof. The above modules may be embedded in hardware or may be independent of a processor in the computer device, or may be stored in software in a memory in the computer device, so that the processor may call and execute operations corresponding to the above modules.
In one embodiment, a computer device is provided, which may be a terminal, and the internal structure thereof may be as shown in fig. 13. The computer device includes a processor, a memory, a network interface, a display screen, and an input device connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement a method of autonomous vehicle control. The display screen of the computer equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, can also be keys, a track ball or a touch pad arranged on the shell of the computer equipment, and can also be an external keyboard, a touch pad or a mouse and the like.
It will be appreciated by those skilled in the art that the structures shown in FIG. 13 and described above are merely block diagrams of partial structures associated with aspects of the present application and are not intended to limit the computer devices to which aspects of the present application may be applied, and that a particular computer device may include more or fewer components than shown, or may combine certain components, or may have a different arrangement of components.
In one embodiment, a computer device is provided comprising a memory and a processor, the memory having stored therein a computer program, the processor when executing the computer program performing the steps of:
acquiring current environmental information around a vehicle;
inputting the current environment information into a preset navigation branch neural network to obtain a control instruction; the navigation branch neural network is a network model obtained by combined training with a preset intersection steering constraint neural network; the intersection steering constraint neural network is used for acquiring the relative distance between the vehicle and the road shoulder;
and controlling the vehicle to run according to the control instruction.
The computer device provided in the foregoing embodiments has similar implementation principles and technical effects to those of the foregoing method embodiments, and will not be described herein in detail.
In one embodiment, a computer readable storage medium is provided having a computer program stored thereon, which when executed by a processor, performs the steps of:
acquiring current environmental information around a vehicle;
inputting the current environment information into a preset navigation branch neural network to obtain a control instruction; the navigation branch neural network is a network model obtained by combined training with a preset intersection steering constraint neural network; the intersection steering constraint neural network is used for acquiring the relative distance between the vehicle and the road shoulder;
and controlling the vehicle to run according to the control instruction.
The foregoing embodiment provides a computer readable storage medium, which has similar principles and technical effects to those of the foregoing method embodiment, and will not be described herein.
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by way of a computer program stored on a non-transitory computer readable storage medium, which when executed, may comprise the steps of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the various embodiments provided herein may include non-volatile and/or volatile memory. The nonvolatile memory can include Read Only Memory (ROM), programmable ROM (PROM), electrically Programmable ROM (EPROM), electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double Data Rate SDRAM (DDRSDRAM), enhanced SDRAM (ESDRAM), synchronous Link DRAM (SLDRAM), memory bus direct RAM (RDRAM), direct memory bus dynamic RAM (DRDRAM), and memory bus dynamic RAM (RDRAM), among others.
The technical features of the above embodiments may be arbitrarily combined, and all possible combinations of the technical features in the above embodiments are not described for brevity of description, however, as long as there is no contradiction between the combinations of the technical features, they should be considered as the scope of the description.
The above examples merely represent a few embodiments of the present application, which are described in more detail and are not to be construed as limiting the scope of the invention. It should be noted that it would be apparent to those skilled in the art that various modifications and improvements could be made without departing from the spirit of the present application, which would be within the scope of the present application. Accordingly, the scope of protection of the present application is to be determined by the claims appended hereto.

Claims (9)

1. A method of controlling an autonomous vehicle, the method comprising:
acquiring current environmental information around a vehicle;
inputting the current environment information into a preset navigation branch neural network so that the navigation branch neural network determines a control instruction according to the current environment information and the current navigation information of the vehicle; the navigation branch neural network is a network model obtained by joint training with the intersection steering constraint neural network based on environmental information during training; the intersection steering constraint neural network is used for acquiring the relative distance between the vehicle and the road shoulder;
And controlling the vehicle to run according to the control instruction.
2. The method of claim 1, wherein the navigational branch neural network comprises a visual feature extraction neural network and a plurality of sub-branch neural networks;
the step of inputting the current environment information into a preset navigation branch neural network to obtain a control instruction comprises the following steps:
extracting visual features from the environment information by using the visual feature extraction neural network;
inputting the visual characteristics into a target sub-branch neural network to obtain the control instruction; the target sub-branch neural network is one of a plurality of sub-branch neural networks.
3. The method of claim 2, wherein the sub-branch neural network comprises a left-turn branch neural network, a right-turn branch neural network, and a lane-keeping branch neural network, and wherein the method comprises, prior to the inputting the visual characteristic into the target sub-branch neural network:
obtaining current navigation information of a vehicle, wherein the navigation information comprises any one of a left turn instruction, a right turn instruction and a straight-going instruction;
if the navigation information is a left turn instruction, determining the left turn branch neural network as the target sub-branch neural network;
If the navigation information is a right turn instruction, determining the right turn branch neural network as the target sub-branch neural network;
and if the navigation information is a straight-going instruction, determining the lane keeping branch neural network as the target sub-branch neural network.
4. A method according to any one of claims 1 to 3, further comprising:
respectively inputting environmental information around a plurality of groups of vehicles into an initial navigation branch neural network and an initial intersection steering constraint neural network;
the outputs of the initial navigation branch neural network and the initial intersection steering constraint neural network are input into a preset system loss function for joint training, and the value of the system loss function is obtained;
and adjusting parameters of the initial navigation branch neural network and the initial intersection steering constraint neural network according to the value of the system loss function until the value of the system loss function reaches a preset threshold value to obtain the navigation branch neural network and the intersection steering constraint neural network.
5. The method of claim 4, wherein the step of inputting the outputs of the initial navigation branch neural network and the initial intersection turn constraint neural network into a preset system loss function to perform joint training to obtain the value of the system loss function comprises:
Taking the output of the initial navigation branch neural network as the input of a preset first loss function, and taking the output of the initial intersection steering constraint neural network as the input of a preset second loss function to obtain the value of the first loss function and the value of the second loss function;
and acquiring the value of the system loss function according to the value and the weight of the first loss function and the value and the weight of the second loss function.
6. The method of claim 5, wherein the method further comprises:
and adjusting the weight of the first loss function and the weight of the second loss function according to the value of the system loss function.
7. An autonomous vehicle control apparatus, the apparatus comprising:
the environment information acquisition module is used for acquiring current environment information around the vehicle;
the control instruction output module is used for inputting the current environment information into a preset navigation branch neural network so that the navigation branch neural network can determine a control instruction according to the current environment information and the current navigation information of the vehicle; the navigation branch neural network is a network model obtained by joint training with the intersection steering constraint neural network based on environmental information during training; the intersection steering constraint neural network is used for acquiring the relative distance between the vehicle and the road shoulder;
And the vehicle driving control module is used for controlling the vehicle to drive according to the control instruction.
8. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of any one of claims 1 to 6 when executing the computer program.
9. A computer readable storage medium on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the steps of the method of any one of claims 1 to 6.
CN201910237979.7A 2019-03-27 2019-03-27 Automatic driving vehicle control method, device, computer equipment and storage medium Active CN111830949B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910237979.7A CN111830949B (en) 2019-03-27 2019-03-27 Automatic driving vehicle control method, device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111830949A CN111830949A (en) 2020-10-27
CN111830949B true CN111830949B (en) 2024-01-16

Family

ID=72914155

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910237979.7A Active CN111830949B (en) 2019-03-27 2019-03-27 Automatic driving vehicle control method, device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111830949B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113276863B (en) * 2021-07-01 2022-09-13 浙江吉利控股集团有限公司 Vehicle control method, apparatus, device, medium, and program product
CN113515812A (en) * 2021-07-09 2021-10-19 东软睿驰汽车技术(沈阳)有限公司 Automatic driving method, device, processing equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9989964B2 (en) * 2016-11-03 2018-06-05 Mitsubishi Electric Research Laboratories, Inc. System and method for controlling vehicle using neural network

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2009295270A1 (en) * 2008-09-19 2010-03-25 The University Of Sydney A method and system of data modelling
CN102529963A (en) * 2010-12-14 2012-07-04 上海摩西海洋工程有限公司 Computer-aided driving system
WO2016130719A2 (en) * 2015-02-10 2016-08-18 Amnon Shashua Sparse map for autonomous vehicle navigation
CN107438754A (en) * 2015-02-10 2017-12-05 御眼视觉技术有限公司 Sparse map for autonomous vehicle navigation
CN106355948A (en) * 2015-07-17 2017-01-25 本田技研工业株式会社 Turn predictions
CN107545232A (en) * 2016-06-24 2018-01-05 福特全球技术公司 Track detection system and method
WO2018055378A1 (en) * 2016-09-21 2018-03-29 Oxford University Innovation Limited Autonomous route determination
WO2018075325A1 (en) * 2016-10-17 2018-04-26 Uber Technologies, Inc. Neural network system for autonomous vehicle control
CN108459588A (en) * 2017-02-22 2018-08-28 腾讯科技(深圳)有限公司 Automatic driving method and device, and vehicle
CN106918342A (en) * 2017-03-10 2017-07-04 广州汽车集团股份有限公司 Automatic driving vehicle driving path positioning method and positioning system
CN107272687A (en) * 2017-06-29 2017-10-20 深圳市海梁科技有限公司 Driving behavior decision system for an autonomous public transit vehicle
CN108657189A (en) * 2018-06-22 2018-10-16 南京航空航天大学 Automatic driving steering for lane-change conditions based on BP neural network and safe distance, and control method thereof

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Brain-Inspired Cognitive Model With Attention for Self-Driving Cars; Shitao Chen et al.; IEEE Transactions on Cognitive and Developmental Systems, Vol. 11, No. 1; full text *
Research on Lane Information Perception and Incremental Modeling Technology Applied to Intelligent Vehicles; Yu Yong; China Master's Theses Full-text Database, Engineering Science and Technology II, No. 12; full text *

Also Published As

Publication number Publication date
CN111830949A (en) 2020-10-27

Similar Documents

Publication Publication Date Title
CN111142557B (en) Unmanned aerial vehicle path planning method and system, computer equipment and readable storage medium
EP3588226B1 (en) Method and arrangement for generating control commands for an autonomous road vehicle
CN113168708B (en) Lane line tracking method and device
US20200189597A1 (en) Reinforcement learning based approach for sae level-4 automated lane change
CN109765902B (en) Unmanned vehicle driving reference line processing method and device and vehicle
KR20200101517A (en) Method for autonomous cooperative driving based on vehicle-road infrastructure information fusion and apparatus for the same
CN109976334B (en) Vehicle lane changing method, device, equipment and storage medium
CN111923919B (en) Vehicle control method, vehicle control device, computer equipment and storage medium
WO2021036083A1 (en) Driver behavior model development method and device for automatic driving, and storage medium
DE102017204983B4 (en) Method for specifying a driving movement in a machine learning-based autopilot device of a motor vehicle and a control device, motor vehicle and training device for an autopilot device
CN111830949B (en) Automatic driving vehicle control method, device, computer equipment and storage medium
WO2022165614A1 (en) Path construction method and apparatus, terminal, and storage medium
WO2022052856A1 (en) Vehicle-based data processing method and apparatus, computer, and storage medium
CN114802234A (en) Road edge avoiding method and system in intelligent cruise
CN113383283A (en) Perception information processing method and device, computer equipment and storage medium
CN109492835A (en) Determination method, model training method and the relevant apparatus of vehicle control information
CN115230715A (en) Lane change prediction method and device, nonvolatile storage medium and computer equipment
CN114644014A (en) Intelligent driving method based on lane line and related equipment
CN109144052B (en) Navigation system for autonomous vehicle and method thereof
CN111811522B (en) Unmanned vehicle autonomous navigation method and device, computer equipment and storage medium
US20240133696A1 (en) Path construction method and apparatus, terminal, and storage medium
CN117128976B (en) Method and device for acquiring road center line, vehicle and storage medium
CN113954857B (en) Automatic driving control method and system, computer equipment and storage medium
DE102023100983A1 (en) AUTONOMOUS RACETRACK DRIVER COACH AND DEMONSTRATOR
CN115235497A (en) Path planning method and device, automobile and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant