CN110775059B - Automatic car following method based on artificial intelligence and related device - Google Patents


Info

Publication number: CN110775059B
Application number: CN201911007635.3A
Authority: CN (China)
Prior art keywords: vehicle, following, car, target, model
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to its accuracy)
Other languages: Chinese (zh)
Other versions: CN110775059A
Inventors: 由长喜, 王斌
Current assignee: Tencent Technology Shenzhen Co Ltd
Original assignee: Tencent Technology Shenzhen Co Ltd
Application filed by Tencent Technology Shenzhen Co Ltd; application published as CN110775059A, grant published as CN110775059B

Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B60 — VEHICLES IN GENERAL
    • B60W — CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 — Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/14 — Adaptive cruise control
    • B60W30/143 — Speed control
    • B60W2520/00 — Input parameters relating to overall vehicle dynamics
    • B60W2520/10 — Longitudinal speed
    • B60W2520/105 — Longitudinal acceleration

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
  • Traffic Control Systems (AREA)

Abstract

The embodiment of the application discloses an automatic car following method based on artificial intelligence. Correspondences between different car following models and car following scenes are established in advance; when the automatic car following mode of automatic driving is entered, the car following scene at the current moment is identified, and the target car following model corresponding to that scene is called according to the correspondences. The host-vehicle state data and preceding-vehicle state data required by the target car following model, together with the target distance between the host vehicle and the preceding vehicle, are obtained; a target control quantity is determined from the host-vehicle state data, the preceding-vehicle state data, the target distance and the target car following model; and the host vehicle is controlled with the target control quantity. Because an appropriate car following model is called for each car following scene to calculate the target control quantity, not all of the preceding vehicle's state data needs to be acquired. Dependence on the detected speed and acceleration of the preceding vehicle is thus greatly weakened, and the requirements on the sensors are reduced, so the method is suitable for many types of vehicles and improves the portability (transferability across vehicle models) of automatic car following in automatic driving.

Description

Automatic car following method based on artificial intelligence and related device
Technical Field
The application relates to the field of automatic driving, in particular to an automatic car following method based on artificial intelligence and a related device.
Background
With the rapid development of the vehicle industry and the continuous improvement of living standards, automobiles occupy a growing share of daily life. As more and more vehicles are on the road, a vehicle often needs to follow the vehicle ahead at a certain speed, so the driver must pay close attention to the distance to the preceding vehicle. Over time, the driver easily falls into a fatigued driving state, which can lead to traffic accidents. To reduce the driving burden on congested road sections, automatic car following systems have attracted wide attention.
Existing car following technology places strict requirements on the real-time performance and accuracy of sensors. Most existing systems are therefore developed on relatively expensive drive-by-wire chassis and sensing systems, are rarely deployed on mid- and low-end vehicles, and cannot guarantee good comfort and safety during automatic following when the vehicle's chassis is less capable.
Disclosure of Invention
In order to solve the above technical problems, the application provides an automatic car following method based on artificial intelligence and a related device. Dependence on the detected speed and acceleration of the preceding vehicle is greatly weakened and the requirements on the sensors are reduced, so the method is suitable for many types of vehicles, improving the portability of the automatic car following method.
The embodiment of the application discloses the following technical scheme:
in a first aspect, an embodiment of the present application provides an automatic car following method based on artificial intelligence, where a correspondence between a car following model and a car following scene is preset, and the method includes:
identifying a car following scene at the current moment;
calling a target car following model corresponding to the car following scene according to the corresponding relation;
obtaining vehicle state data and front vehicle state data required by the target vehicle following model and a target distance between the vehicle and the front vehicle;
determining a target control quantity according to the vehicle state data, the preceding vehicle state data, the target distance and the target vehicle following model;
and controlling the vehicle by using the target control quantity.
In a second aspect, an embodiment of the present application provides an automatic car following device based on artificial intelligence, which presets a corresponding relationship between a car following model and a car following scene, and the device includes an identification unit, a calling unit, an acquisition unit, a determination unit and a control unit:
the identification unit is used for identifying a car following scene at the current moment;
the calling unit is used for calling a target car following model corresponding to the car following scene according to the corresponding relation;
the acquisition unit is used for acquiring the vehicle state data and the front vehicle state data required by the target vehicle following model and the target distance between the vehicle and the front vehicle;
the determining unit is used for determining a target control quantity according to the vehicle state data, the preceding vehicle state data, the target distance and the target vehicle following model;
and the control unit is used for controlling the vehicle by using the target control quantity.
In a third aspect, an embodiment of the present application provides an apparatus for automatic following, where the apparatus includes a processor and a memory:
the memory is used for storing program codes and transmitting the program codes to the processor;
the processor is configured to perform the method of the first aspect according to instructions in the program code.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium for storing program code for executing the method of the first aspect.
According to the technical scheme, correspondences between different car following models and car following scenes are established in advance. When the automatic car following mode is entered, the car following scene at the current moment is identified, and the target car following model corresponding to that scene is called according to the correspondences. Different car following models may use different data; the host-vehicle state data, preceding-vehicle state data and target distance required by the target model are acquired, the target control quantity is determined from these inputs and the target model, and the host vehicle is controlled with it. Because each model requires different preceding-vehicle state data, not all of it needs to be acquired; even the preceding vehicle's acceleration may not be needed. Dependence on the detected speed and acceleration of the preceding vehicle is greatly weakened and the requirements on the sensors are reduced, so the method suits many types of vehicles and improves the portability of the automatic car following method.
Drawings
To more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from them without inventive effort.
Fig. 1 is a schematic view of an application scenario of an automatic car following method based on artificial intelligence according to an embodiment of the present application;
fig. 2 is a flowchart of an automatic car following method based on artificial intelligence according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of a car following control method according to an embodiment of the present disclosure;
fig. 4 is an exemplary diagram of a following scenario provided in an embodiment of the present application;
fig. 5 is a schematic diagram of calculating a maximum driving force of a rear wheel and a maximum braking force of the rear wheel according to an embodiment of the present application;
fig. 6 is a structural diagram of an automatic car following device based on artificial intelligence according to an embodiment of the present application;
fig. 7 is a block diagram of an apparatus for automatically following a vehicle according to an embodiment of the present disclosure;
fig. 8 is a block diagram of a server according to an embodiment of the present application.
Detailed Description
Embodiments of the present application are described below with reference to the accompanying drawings.
Conventional automatic car following methods, such as the Level 3 (L3 for short) system of the Audi A8 or the Autopilot system of Tesla, are deployed on chassis with good maneuverability and comfort, and are not easily popularized across more vehicle models.
In addition, common car following methods apply only to specific scenes. The Intelligent Driver Model (IDM), for example, is better suited to conventional, non-congested, non-emergency-stop conditions, and has difficulty treating a stable following scene and a vehicle cut-in scene differently.
In order to solve this technical problem, an embodiment of the present application provides an automatic car following method based on artificial intelligence. Different car following models are established in advance; when the automatic car following mode is entered, the car following scene at the current moment is identified, and the car following model corresponding to that scene is called. Dependence on the detected speed and acceleration of the preceding vehicle is greatly weakened and the requirements on the sensors are reduced, so the method suits many types of vehicles, improving the portability of the automatic car following method.
It should be emphasized that the automatic car following method provided in the embodiments of the present application is implemented based on Artificial Intelligence (AI). AI is a theory, method, technique and application system that uses a digital computer, or a machine controlled by a digital computer, to simulate, extend and expand human intelligence, perceive the environment, acquire knowledge, and use knowledge to obtain the best results. In other words, artificial intelligence is a comprehensive branch of computer science that attempts to understand the essence of intelligence and produce a new kind of intelligent machine that can react in a manner similar to human intelligence. Artificial intelligence studies the design principles and implementation methods of various intelligent machines, so that machines have the abilities of perception, reasoning and decision making.
Artificial intelligence is a comprehensive discipline covering a wide range of fields, including both hardware-level and software-level technologies. Basic artificial intelligence technologies generally include sensors, dedicated AI chips, cloud computing, distributed storage, big data processing, operation/interaction systems, mechatronics, and the like. AI software technologies mainly include computer vision, speech processing, natural language processing, machine learning/deep learning, automatic driving, and the like.
In the embodiment of the present application, the artificial intelligence technology mainly involved includes the above-mentioned automatic driving and the like.
For example, the method can relate to automatic driving, and the automatic driving technology generally comprises technologies of environment perception, behavior decision, path planning, motion control and the like, and has wide application prospects. The environment perception comprises a perception sensor, a high-precision map, positioning, speed perception and the like; the behavior decision comprises information acquisition, information preprocessing, decision making and the like; the path planning comprises global path planning, local path planning and the like; motion control includes acceleration, deceleration, turning, braking, and the like.
The method can be applied to a data processing device, which may be a terminal device, i.e., an electronic device with a communication function such as a smart phone, a computer, a Personal Digital Assistant (PDA), a tablet computer, a wearable device, or a vehicle-mounted device.
The automatic car following method provided by the embodiment of the application can be applied to unmanned vehicles of levels L2, L3 and L4, is applicable to both fuel and electric vehicles, tolerates a wide range of chassis, and can handle conventional following, cut-in/cut-out of vehicles on either side, frequent start-stop of the preceding vehicle, emergency braking and the like as comfortably as possible.
In order to facilitate understanding of the technical scheme of the present application, the following introduces an automatic car following method provided by the embodiment of the present application in combination with an actual application scenario.
Referring to fig. 1, fig. 1 is a schematic view of an application scenario of an automatic car following method according to an embodiment of the present application. The application scenario includes a host vehicle 101, a preceding vehicle 102, and a terminal device 103.
Here the host vehicle 101 refers to the currently controlled vehicle, i.e., the ego vehicle; the terminal device 103 may be located on the host vehicle 101. The preceding vehicle 102 (also called the leading vehicle) is the vehicle ahead that is in the same lane as the host vehicle 101 and closest to it.
The terminal device 103 may identify the car following scene at the current moment and call the target car following model corresponding to that scene according to the correspondences. The terminal device 103 acquires the host-vehicle state data required by the target car following model from the host vehicle 101, acquires the preceding-vehicle state data of the preceding vehicle 102 required by the model, and acquires the target distance between the host vehicle 101 and the preceding vehicle 102. The state data may include one or more of speed, acceleration, position information, and the like.
The terminal device 103 determines a target control amount from the own-vehicle state data, preceding-vehicle state data, target distance, and target following model, thereby controlling the own vehicle 101 with the target control amount.
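The flow just described (identify the scene, look up the corresponding model, gather the required state, compute the control quantity) can be sketched as a small dispatch loop. This is an illustrative sketch only; the registry, scene labels, function names, and state fields are assumptions, not the patent's implementation:

```python
# Illustrative sketch of the scene -> model dispatch described above.
# MODEL_REGISTRY stands in for the preset correspondence between
# car following scenes and car following models (an assumed structure).
MODEL_REGISTRY = {}

def identify_scene(perception):
    # Placeholder for step S201; real identification would use camera
    # images, V2V messages, and braking-distance conditions.
    return perception.get("scene", "constant_speed")

def follow_step(perception, own_state, lead_state, target_gap):
    scene = identify_scene(perception)   # identify scene (S201)
    model = MODEL_REGISTRY[scene]        # call target model (S202)
    # compute the target control quantity from the gathered state data
    return model(own_state, lead_state, target_gap)

# Minimal demo with a stand-in speed-feedback model:
MODEL_REGISTRY["constant_speed"] = (
    lambda own, lead, gap: 0.8 * (lead["v"] - own["v"])
)
u = follow_step({}, {"s": 0.0, "v": 20.0}, {"s": 35.0, "v": 22.0}, 30.0)
```

Here `u` comes out as 0.8 × (22 − 20) = 1.6 m/s², i.e., the host accelerates gently toward the lead's speed.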
In this way, an appropriate car following model can be called according to the car following scene to calculate the target control quantity. Different car following models require different preceding-vehicle state data, so not all of it needs to be acquired; even the preceding vehicle's acceleration may not be needed. Dependence on the detected speed and acceleration of the preceding vehicle is greatly weakened and the requirements on the sensors are reduced, so the method suits many types of vehicles and improves the portability of the automatic car following method.
Next, the automatic car following method based on artificial intelligence provided by the embodiment of the present application will be described in detail with reference to the accompanying drawings.
Referring to fig. 2, fig. 2 shows a flow chart of an automatic car following method, the method comprising:
s201, identifying a car following scene at the current moment.
In the embodiment of the application, a user can control the vehicle to switch to the automatic following mode, for example, the user can enter the automatic following mode through a mode switching function on the vehicle, so that the automatic following method provided by the embodiment of the application is utilized to automatically follow the vehicle.
Referring to fig. 3, fig. 3 is a schematic diagram of a car following control method. The host vehicle and the preceding vehicle can communicate with each other; state data of both can be acquired by sensors and transmitted to the vehicle controller, which determines a control quantity, such as an acceleration, target distance or speed, from that state data and controls the host vehicle with it. Of course, in some cases the control quantity used by the vehicle controller may also be input by the user, for example a user-specified target distance and target speed.
For this reason, in order to realize automatic following, a control quantity for controlling the host vehicle must be determined. In the embodiment of the present application, the host vehicle may follow the preceding vehicle under a variety of different conditions, that is, there are multiple following scenes: for example, a scene in which a vehicle ahead on one side cuts into the host lane, a scene in which the preceding vehicle brakes significantly, a scene of normal following, a scene of stable following at a stable distance from the preceding vehicle, and the like.
Different car following models have different functions and depend on different state data, so they suit different following scenes. To call the appropriate model accurately, the following scene at the current moment must first be identified.
It should be noted that the current following scene can be identified in several ways. One way is to determine whether a vehicle ahead on one side is cutting into the host lane; if so, to keep following comfortable during the cut-in, the host vehicle should be controlled to follow at as constant a speed as possible, so the current scene may be determined to be the constant-speed following scene. Whether a vehicle ahead on one side is cutting in may be determined by collecting images or video of the road ahead with an image sensor and recognizing the cut-in from them.
Another way is to determine whether the preceding vehicle is braking; if so, the following scene is the braking scene. This may be determined through communication: the preceding vehicle transmits its own driving state to the host vehicle, which judges from that state whether the preceding vehicle is braking.
A further way is to determine whether the braking distance of the model called at the current moment and the braking distance of the stable-distance following model satisfy a preset condition. If they do, the host vehicle and the preceding vehicle have reached stable-distance following, and the scene may be determined to be the stable-distance following scene.
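The three identification rules above amount to a small decision procedure. A hedged sketch follows; the cue names, the priority order among the rules, and the default scene are invented for illustration and are not specified in the text:

```python
def identify_following_scene(cut_in_detected: bool,
                             lead_braking: bool,
                             gap_stabilized: bool) -> str:
    """Map the three detection cues described above to a scene label.

    cut_in_detected: a vehicle ahead on one side cuts into the host lane
                     (e.g. recognized from camera images/video).
    lead_braking:    the preceding vehicle reports braking via communication.
    gap_stabilized:  the braking distances of the current model and the
                     stable-distance model satisfy the preset condition.
    """
    if lead_braking:
        return "parking"            # braking scene -> parking model
    if cut_in_detected:
        return "constant_speed"     # cut-in -> constant-speed following
    if gap_stabilized:
        return "stable_distance"    # stable-gap following scene
    return "constant_speed"         # assumed default; not stated in the text

assert identify_following_scene(False, True, False) == "parking"
```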
S202, calling a target car following model corresponding to the car following scene according to the corresponding relation.
In the embodiment of the application, the car following model suitable for different car following scenes is constructed in advance, and the corresponding relation between the car following scenes and the car following model is established. Next, the principle of constructing the following vehicle model will be described.
Referring to fig. 4, assume that A is the leading vehicle, B is the controlled vehicle (host vehicle), and the target distance between the two vehicles is $\Delta L$. Let $e_1$ denote the distance error between the actual distance from the host vehicle to the preceding vehicle and the target distance, i.e.

$$e_1 = Y_A - Y_B - \Delta L$$

where $Y_A$ is the position of the preceding vehicle and $Y_B$ is the position of the host vehicle. Let $e_2$ denote the speed error between the host vehicle and the preceding vehicle; the speed error is the derivative of the distance error, i.e.

$$e_2 = \dot e_1 = V_A - V_B$$

where $V_A$ is the speed of the preceding vehicle and $V_B$ is the speed of the host vehicle. Define the derivative of $e_2$ as

$$\dot e_2 = \dot V_A - \dot V_B = -\lambda_1 e_1 - \lambda_2 e_2$$

The constants $\lambda_1$ and $\lambda_2$ can be designed so that $e_1$ and $e_2$ converge to 0. The target acceleration may be limited by the application scenario and the dynamic constraints of the vehicle; for a following scene on an expressway, the road surface is generally good, the road curvature is not large, and the influence of the vehicle's dynamic constraints is not significant.
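The claim that $\lambda_1$ and $\lambda_2$ can be designed so that $e_1$ and $e_2$ converge to 0 can be checked from the linear error dynamics; the following short argument is a standard Routh-Hurwitz check, supplied here for clarity rather than taken from the patent text:

```latex
\dot e_1 = e_2, \qquad \dot e_2 = -\lambda_1 e_1 - \lambda_2 e_2
\;\Longrightarrow\;
\ddot e_1 + \lambda_2 \dot e_1 + \lambda_1 e_1 = 0 .
% Characteristic polynomial: s^2 + \lambda_2 s + \lambda_1 = 0.
% Both roots have negative real part iff \lambda_1 > 0 and \lambda_2 > 0,
% so e_1 and e_2 decay to 0 exponentially for any positive gains.
```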
Therefore, the calculation model of the desired acceleration, that is, the acceleration used to control the host vehicle, may be:

$$\dot V_B = \dot V_A + \lambda_1 e_1 + \lambda_2 e_2, \qquad f_{Rx} = m_B \dot V_B \in \big[\,\underline f_{Rx},\ \bar f_{Rx}\,\big] \tag{1}$$

where $\dot V_B$ is the acceleration of the host vehicle, $f_{Rx}$ is the rear-wheel driving (braking) force of the host vehicle, $\bar f_{Rx}$ is the maximum driving force of the rear wheels of the vehicle, $\underline f_{Rx}$ is the maximum braking force of the rear wheels of the vehicle, $m_B$ is the mass of the host vehicle, $\dot V_A$ is the derivative of the speed of the preceding vehicle, $e_1$ is the distance error between the actual distance of the host vehicle from the preceding vehicle and the target distance, $e_2$ is the speed error between the host vehicle and the preceding vehicle, and $\lambda_1$ and $\lambda_2$ are constants.
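A numeric sketch of the desired-acceleration model just described, applying the rear-wheel force limits as saturation. The clipping interpretation of the constraint, and all parameter values, are assumptions for illustration:

```python
def desired_acceleration(a_lead, e1, e2, lam1, lam2,
                         f_rx_min, f_rx_max, m_b):
    """Lead acceleration plus error feedback, saturated so the implied
    rear-wheel force m_B * a stays within [f_rx_min, f_rx_max]
    (the clipping reading of the constraint is an assumption)."""
    a = a_lead + lam1 * e1 + lam2 * e2
    return max(f_rx_min / m_b, min(f_rx_max / m_b, a))

# Gap 2 m wider than target (e1 = 2) and closing at 1 m/s (e2 = 1):
a_cmd = desired_acceleration(0.0, 2.0, 1.0, 0.5, 1.0,
                             -9000.0, 6000.0, 1500.0)  # 2.0 m/s^2
```

With the assumed 6000 N drive limit and a 1500 kg vehicle, commands above 4 m/s² would be clipped; the example command of 2 m/s² passes through unchanged.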
This acceleration may be used as the target control quantity for controlling the host vehicle. Of course, in some implementations the acceleration may be further used to calculate a rear-wheel driving (braking) torque $T_R$, and $T_R$ is then used as the target control quantity. $T_R$ is calculated as:

$$T_R = \Big(m_B R_W + \frac{I_W}{R_W}\Big)\,\dot V_B$$

where $T_R$ is the rear-wheel driving (braking) torque, $\dot V_B$ is the acceleration, $R_W$ is the radius of the wheel, $m_B$ is the mass of the host vehicle, and $I_W$ is the moment of inertia of the wheel.
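The acceleration-to-torque conversion can be sketched directly. Using the listed quantities ($T_R$, $R_W$, $m_B$, $I_W$), a standard wheel-dynamics form $T_R = (m_B R_W + I_W/R_W)\,a$ is assumed here, i.e., vehicle inertia reflected at the wheel plus the wheel's own rotational inertia:

```python
def rear_wheel_torque(accel, m_b, r_w, i_w):
    """Rear-wheel driving (braking) torque for a desired acceleration.
    Assumed form: T_R = (m_B * R_W + I_W / R_W) * a."""
    return (m_b * r_w + i_w / r_w) * accel

# e.g. 1500 kg vehicle, 0.3 m wheel radius, 1.2 kg*m^2 wheel inertia,
# 1 m/s^2 desired acceleration (all values illustrative):
t_r = rear_wheel_torque(1.0, 1500.0, 0.3, 1.2)  # 450 + 4 = 454 N*m
```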
It should be noted that the way $\bar f_{Rx}$ and $\underline f_{Rx}$ are calculated can be seen from fig. 5. First, the speed of the host vehicle is decomposed into $V_x$ and $V_y$ as shown in fig. 5, where the angle between the vehicle speed and $V_x$ is $\beta$. $\tan\beta$ is calculated and point A is determined according to $\tan\beta$; a straight line through point A, parallel to the arrow direction of the preceding vehicle's speed, intersects the friction circle at points B and C. Point C corresponds to the maximum driving force of the rear wheel and point B corresponds to the maximum braking force of the rear wheel. Accounting for longitudinal load transfer, the resulting formulas for $\bar f_{Rx}$ and $\underline f_{Rx}$ take the form:

$$\bar f_{Rx} = \frac{\bar\mu_r\, m g\, l_f}{(l_f + l_r) - \bar\mu_r h} \tag{2}$$

$$\underline f_{Rx} = -\frac{\bar\mu_r\, m g\, l_f}{(l_f + l_r) + \bar\mu_r h} \tag{3}$$

In formulas (2) and (3), $\bar f_{Rx}$ is the maximum driving force of the rear wheel, $\underline f_{Rx}$ is the maximum braking force of the rear wheel, $m$ is the mass, $g$ is the acceleration of gravity, $l_f$ is the distance from the center of gravity to the front axle, $l_r$ is the distance from the center of gravity to the rear axle, $\bar\mu_r$ is the comprehensive friction coefficient of the rear wheel (a function of the comprehensive slip rate $\bar s_r$), $h$ is the height of the center of gravity, and $\beta$ is the angle between the vehicle speed and $V_x$.
It follows that the target control quantity can be determined by the parameters $S_e, V_e, S_l, V_l, a_l$ and $\lambda_1, \lambda_2$. If the target control quantity for controlling the host vehicle is an acceleration, it can be represented by the following equation:

$$\mu = f(S_e, V_e, S_l, V_l, a_l;\ \lambda_1, \lambda_2) \tag{5}$$

where $\mu$ is the target control quantity (acceleration), $f(\cdot)$ denotes a function, $S_e$ is the position of the host vehicle, $V_e$ is the speed of the host vehicle, $S_l$ is the position of the preceding vehicle, $V_l$ is the speed of the preceding vehicle, $a_l$ is the acceleration of the preceding vehicle, and $\lambda_1$ and $\lambda_2$ are constants.
Car following models suitable for different following scenes are obtained by varying the target control quantity formula shown in formula (5). In this embodiment, the constructed following models may include one or more of a stable-distance following model, a constant-speed following model and a parking model.
The constant-speed following model is suitable for the constant-speed following scene. It is mainly intended to improve comfort, and requires the vehicle to be relatively insensitive to position feedback but relatively sensitive to speed feedback: when a vehicle ahead on one side cuts into the lane, the model must not respond excessively to the sudden change in the distance error, and this situation is handled by the constant-speed following model. By varying equation (5), the constant-speed following model can be given as:

$$a_e = \lambda_1\,(S_l - S_e - \Delta L) + \lambda_2\,(V_l - V_e) \tag{6}$$

with the position-feedback gain $\lambda_1$ kept small relative to the speed-feedback gain $\lambda_2$, where $a_e$ is the acceleration of the host vehicle, i.e., the target control quantity, $S_e$ is the position of the host vehicle, $V_e$ is the speed of the host vehicle, $S_l$ is the position of the preceding vehicle, $V_l$ is the speed of the preceding vehicle, and $\Delta L$ is the target distance.
Therefore, if it is determined that a vehicle ahead on one side is cutting into the host lane, that is, the following scene is the constant-speed following scene, the constant-speed following model corresponding to that scene may be called as the target following model, and the subsequent target control quantity is calculated.
The parking model is suitable for scenes in which the preceding vehicle brakes, for example when the preceding vehicle brakes noticeably while the constant-speed following model is in use. By varying equation (5), the parking model can be given as:

$$a_e = a_l + \lambda_1\,(S_l - S_e - \Delta L) + \lambda_2\,(V_l - V_e) \tag{7}$$

where $a_e$ is the acceleration of the host vehicle, i.e., the target control quantity, $S_e$ is the position of the host vehicle, $V_e$ is the speed of the host vehicle, $S_l$ is the position of the preceding vehicle, $V_l$ is the speed of the preceding vehicle, $\Delta L$ is the target distance, and $a_l$ is the acceleration of the preceding vehicle.
Therefore, if it is determined that the leading vehicle is braking, that is, the following scene is a braking scene, the parking model, which is sensitive to position feedback, may be called as the target following model to ensure timely and safe braking, and the subsequent target control quantity is calculated with it.
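A position-sensitive braking law in the spirit of the parking model can be sketched kinematically: pick the constant deceleration that uses up exactly the available gap as the closing speed reaches zero, with the leader's acceleration a_l added as a feed-forward term. The form below is an assumption for illustration, not the patent's actual formula:

```python
def parking_model(s_e, v_e, s_l, v_l, a_l, delta_l, eps=0.1):
    """Return a target acceleration a_e that brakes within the available gap.

    eps guards against division by a zero or negative braking distance.
    """
    gap = max((s_l - s_e) - delta_l, eps)  # usable braking distance
    closing_speed = v_e - v_l              # positive when closing on the leader
    if closing_speed <= 0.0:
        return a_l                         # not closing: just track the leader
    # Constant relative deceleration that zeroes the closing speed over the gap:
    # 0 = closing_speed**2 + 2 * a_rel * gap  =>  a_rel = -closing_speed**2 / (2 * gap)
    return a_l - closing_speed * closing_speed / (2.0 * gap)
```

Because the commanded deceleration grows as the gap shrinks, this law is strongly sensitive to position feedback, which is exactly the property the braking scene requires.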
The stable-distance following model is suited to stable-distance following scenes, in which the vehicle speed is adjusted according to changes in the following distance. When the sensed speed and acceleration of the leading vehicle are inaccurate, or are subject to non-negligible delay, the stable-distance following model can respond quickly to the distance error by braking or accelerating, thereby keeping the following distance stable. By varying equation (5) above, the stable-distance following model is given by the following formula:
[The stable-distance following-model formula appears as an image in the original; it expresses a_e in terms of S_e, V_e, S_l, V_l, ΔL and the time constant Δt.]
where a_e denotes the acceleration of the host vehicle, i.e., the target control quantity, S_e the position of the host vehicle, V_e the speed of the host vehicle, S_l the position of the leading vehicle, V_l the speed of the leading vehicle, ΔL the target distance, and Δt a settable time constant, namely the time allowed for adjusting the following distance to the target distance.
Because another following model, such as the constant-speed following model or the parking model, may have been in use before the stable-distance following model, the following scene at the current moment is determined to be the stable-distance following scene when the braking distance of the model called at the current moment and the braking distance of the stable-distance following model satisfy a preset condition; at that point the gap can be considered to have opened up and stable-distance following should be maintained. The stable-distance following model corresponding to this scene may then be called as the target following model to calculate the subsequent target control quantity.
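The role of the time constant Δt can be made concrete with a simple kinematic stand-in: assuming the leader holds its current speed, choose the constant acceleration that drives the gap to the target distance ΔL after Δt seconds. This derivation is an illustrative assumption, not the patent's formula:

```python
def stable_distance_following(s_e, v_e, s_l, v_l, delta_l, dt=3.0):
    """Return the constant acceleration a_e that makes the gap equal delta_l
    after dt seconds, assuming the leading vehicle keeps its current speed.

    Setting gap + (v_l - v_e) * dt - 0.5 * a_e * dt**2 = delta_l and solving
    for a_e gives the law below; dt plays the role of the settable Δt.
    """
    gap_error = (s_l - s_e) - delta_l  # excess gap beyond the target
    return 2.0 * (gap_error + (v_l - v_e) * dt) / (dt * dt)
```

Note that only the leader's position and speed appear; no leader acceleration is needed, consistent with the observation that some following models do not require the leading vehicle's acceleration.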
S203, obtaining the host-vehicle state data, the leading-vehicle state data and the target distance between the host vehicle and the leading vehicle that are required by the target following model.
As the formulas above show, different following models require different leading-vehicle state data to calculate the target control quantity. Not all of the leading-vehicle state data need to be acquired, and in some models even the leading vehicle's acceleration is unnecessary. This greatly weakens the dependence on detecting the leading vehicle's speed and acceleration and lowers the requirements on the sensors.
After the target vehicle following model is determined, vehicle state data and preceding vehicle state data required by the target vehicle following model and a target distance between the vehicle and the preceding vehicle can be acquired.
For example, if the target following model is the constant-speed following model, it can be seen from formula (6) that the host-vehicle state data it requires are the position and speed of the host vehicle, the leading-vehicle state data it requires are the position and speed of the leading vehicle, and the target distance is also required.
S204, determining the target control quantity according to the host-vehicle state data, the leading-vehicle state data, the target distance and the target following model.
The acquired host-vehicle state data, leading-vehicle state data and target distance are substituted into the target following model to determine the target control quantity. For example, if the target following model is the constant-speed following model, the position and speed of the host vehicle, the position and speed of the leading vehicle, and the target distance are substituted into formula (6), which corresponds to the constant-speed following model, to calculate the target control quantity (an acceleration).
S205, controlling the host vehicle by using the target control quantity.
The target control quantity is fed back to the vehicle controller so that the host vehicle is controlled with it; for example, the acceleration is used to adjust the host vehicle's speed and following distance until the target speed and target distance are reached.
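Taken together, steps S201 to S205 form a dispatch loop: identify the scene, look up the corresponding model in the preset correspondence, compute the acceleration, and feed it to the vehicle controller. A minimal sketch follows, in which the scene names, the detection flags and the priority order are illustrative assumptions:

```python
def identify_scene(cut_in_detected, leader_braking):
    """S201: identify the car-following scene at the current moment."""
    if leader_braking:
        return "braking"          # leader braking takes priority (assumption)
    if cut_in_detected:
        return "cut_in"
    return "stable_distance"

# Preset correspondence between car-following scenes and car-following models (S202).
MODEL_FOR_SCENE = {
    "cut_in": "constant_speed_model",
    "braking": "parking_model",
    "stable_distance": "stable_distance_model",
}

def target_model(cut_in_detected, leader_braking):
    """S201-S202: return the name of the target car-following model to call."""
    return MODEL_FOR_SCENE[identify_scene(cut_in_detected, leader_braking)]
```

The remaining steps S203 to S205 would then gather the state data the chosen model needs, evaluate it, and pass the resulting acceleration to the vehicle controller.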
According to the technical solution above, correspondences between different following models and following scenes are established in advance. When the automatic following mode is entered, the following scene at the current moment is identified, and the target following model corresponding to that scene is called according to the correspondence. Because different following models may use different data, the host-vehicle state data, leading-vehicle state data and target distance required by the target following model are acquired, the target control quantity is determined from these together with the target following model, and the host vehicle is controlled with it. Since a suitable following model is called for each following scene, and different models require different leading-vehicle state data, not all of the leading-vehicle state data, and in some cases not even the leading vehicle's acceleration, needs to be acquired. This greatly weakens the dependence on detecting the leading vehicle's speed and acceleration, lowers the sensor requirements, makes the method applicable to many vehicle types, and improves the portability of the automatic car following method.
It should be noted that, for a given scene, a specific model, for example the stable-distance following model or the constant-speed following model, may be replaced by a parameterized state-feedback controller or by an optimized Intelligent Driver Model (IDM); this embodiment does not limit the specific form of the model.
Next, the automatic car following method provided in the embodiment of the present application is described with reference to a specific application scenario. In this scenario, the user sits in vehicle B and wants vehicle B to automatically follow vehicle A ahead. The user clicks the automatic-following button in vehicle B, so that vehicle B enters the automatic following mode. The method can be implemented by the vehicle-mounted terminal, which identifies the following scene at the current moment and calls the target following model corresponding to that scene according to the correspondence. The vehicle-mounted terminal acquires the host-vehicle state data, the leading-vehicle state data and the target distance between the two vehicles required by the target following model, determines the target control quantity from these together with the target following model, and then feeds the target control quantity back to the vehicle controller to control the host vehicle.
Based on the automatic car following method based on artificial intelligence provided by the foregoing embodiment, an embodiment of the present application further provides an automatic car following apparatus, referring to fig. 6, a corresponding relationship between a car following model and a car following scene is preset, and the apparatus includes an identification unit 601, a calling unit 602, an obtaining unit 603, a determining unit 604, and a control unit 605:
the identification unit 601 is configured to identify a following scene at the current time;
the calling unit 602 is configured to call a target car following model corresponding to the car following scene according to the correspondence;
the acquiring unit 603 is configured to acquire vehicle state data and preceding vehicle state data required by the target vehicle following model, and a target distance between the vehicle and the preceding vehicle;
the determining unit 604 is configured to determine a target control amount according to the vehicle state data, the preceding vehicle state data, the target distance, and the target following model;
the control unit 605 is configured to control the host vehicle by using the target control amount.
In one possible implementation manner, the following model includes one or more of a stable vehicle distance following model, a constant speed following model and a parking model.
In a possible implementation manner, the identifying unit 601 is configured to:
determining whether a vehicle in front of the side cuts into the lane where the vehicle is located;
if so, the car following scene is a constant-speed car following scene;
the invoking unit 602 is configured to:
and calling the constant-speed car following model corresponding to the constant-speed car following scene as the target car following model.
In a possible implementation manner, the identifying unit 601 is configured to:
determining whether braking of the front vehicle occurs;
if so, the car following scene is a braking scene;
the invoking unit 602 is configured to:
and calling the parking model corresponding to the braking scene as the target car following model.
In a possible implementation manner, the identifying unit 601 is configured to:
if the braking distance of the called following model at the current moment and the braking distance of the stable following model meet preset conditions, determining that the following scene is a stable following scene;
the invoking unit 602 is configured to:
and calling the stable vehicle following model corresponding to the stable vehicle following scene as the target vehicle following model.
In one possible implementation, the target control amount is an acceleration.
The embodiment of the present application also provides a device for automatic car following, which is introduced below with reference to the accompanying drawings. Referring to fig. 7, an embodiment of the present application provides a device 700 for automatic car following. The device 700 may be a terminal device, which may be any intelligent terminal such as a mobile phone, a tablet computer, a Personal Digital Assistant (PDA), a Point of Sales (POS) terminal, or a vehicle-mounted computer. The following takes a mobile phone as an example:
fig. 7 is a block diagram illustrating a partial structure of a mobile phone related to a terminal device provided in an embodiment of the present application. Referring to fig. 7, the handset includes: radio Frequency (RF) circuit 710, memory 720, input unit 730, display unit 740, sensor 750, audio circuit 760, wireless fidelity (WiFi) module 770, processor 780, and power supply 790. Those skilled in the art will appreciate that the handset configuration shown in fig. 7 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile phone in detail with reference to fig. 7:
the RF circuit 710 may be used for receiving and transmitting signals during information transmission and reception or during a call, and in particular, receives downlink information of a base station and then processes the received downlink information to the processor 780; in addition, the data for designing uplink is transmitted to the base station. In general, the RF circuit 710 includes, but is not limited to, an antenna, at least one Amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 710 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Message Service (SMS), and the like.
The memory 720 may be used to store software programs and modules, and the processor 780 may execute various functional applications and data processing of the cellular phone by operating the software programs and modules stored in the memory 720. The memory 720 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 720 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The input unit 730 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the cellular phone. Specifically, the input unit 730 may include a touch panel 731 and other input devices 732. The touch panel 731, also referred to as a touch screen, can collect touch operations of a user (e.g. operations of the user on or near the touch panel 731 by using any suitable object or accessory such as a finger, a stylus, etc.) and drive the corresponding connection device according to a preset program. Alternatively, the touch panel 731 may include two portions of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts it to touch point coordinates, and sends the touch point coordinates to the processor 780, and can receive and execute commands from the processor 780. In addition, the touch panel 731 may be implemented by various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. The input unit 730 may include other input devices 732 in addition to the touch panel 731. In particular, other input devices 732 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 740 may be used to display information input by the user or information provided to the user and various menus of the mobile phone. The Display unit 740 may include a Display panel 741, and optionally, the Display panel 741 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. Further, the touch panel 731 can cover the display panel 741, and when the touch panel 731 detects a touch operation on or near the touch panel 731, the touch operation is transmitted to the processor 780 to determine the type of the touch event, and then the processor 780 provides a corresponding visual output on the display panel 741 according to the type of the touch event. Although the touch panel 731 and the display panel 741 are two independent components in fig. 7 to implement the input and output functions of the mobile phone, in some embodiments, the touch panel 731 and the display panel 741 may be integrated to implement the input and output functions of the mobile phone.
The handset may also include at least one sensor 750, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor that adjusts the brightness of the display panel 741 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 741 and/or a backlight when the mobile phone is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
Audio circuitry 760, speaker 761, and microphone 762 may provide an audio interface between a user and the mobile phone. The audio circuit 760 can transmit the electrical signal converted from received audio data to the speaker 761, which converts it into a sound signal for output; on the other hand, the microphone 762 converts a collected sound signal into an electrical signal, which is received by the audio circuit 760 and converted into audio data. The audio data are then output to the processor 780 for processing and subsequently transmitted, for example, to another mobile phone via the RF circuit 710, or output to the memory 720 for further processing.
WiFi belongs to short-distance wireless transmission technology, and the mobile phone can help a user to receive and send e-mails, browse webpages, access streaming media and the like through the WiFi module 770, and provides wireless broadband Internet access for the user. Although fig. 7 shows the WiFi module 770, it is understood that it does not belong to the essential constitution of the handset, and can be omitted entirely as needed within the scope not changing the essence of the invention.
The processor 780 is a control center of the mobile phone, connects various parts of the entire mobile phone by using various interfaces and lines, and performs various functions of the mobile phone and processes data by operating or executing software programs and/or modules stored in the memory 720 and calling data stored in the memory 720, thereby integrally monitoring the mobile phone. Optionally, processor 780 may include one or more processing units; preferably, the processor 780 may integrate an application processor, which primarily handles operating systems, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 780.
The handset also includes a power supply 790 (e.g., a battery) for powering the various components, which may preferably be logically coupled to the processor 780 via a power management system, so that the power management system may be used to manage charging, discharging, and power consumption.
Although not shown, the mobile phone may further include a camera, a bluetooth module, etc., which are not described herein.
In this embodiment, the processor 780 included in the terminal device further has the following functions:
identifying a car following scene at the current moment;
calling a target car following model corresponding to the car following scene according to the corresponding relation;
obtaining vehicle state data and front vehicle state data required by the target vehicle following model and a target distance between the vehicle and the front vehicle;
determining a target control quantity according to the vehicle state data, the preceding vehicle state data, the target distance and the target vehicle following model;
and controlling the vehicle by using the target control quantity.
Referring to fig. 8, fig. 8 is a structural diagram of a server 800 provided in this embodiment. The server 800 may vary considerably in configuration or performance and may include one or more Central Processing Units (CPUs) 822 (e.g., one or more processors), memory 832, and one or more storage media 830 (e.g., one or more mass storage devices) storing applications 842 or data 844. The memory 832 and the storage medium 830 may provide transient or persistent storage. The program stored in the storage medium 830 may include one or more modules (not shown), each of which may include a series of instruction operations for the server. Still further, the central processing unit 822 may be configured to communicate with the storage medium 830 and execute, on the server 800, the series of instruction operations stored in the storage medium 830.
The server 800 may also include one or more power supplies 826, one or more wired or wireless network interfaces 850, one or more input-output interfaces 858, and/or one or more operating systems 841, such as Windows Server, Mac OS XTM, UnixTM, LinuxTM, FreeBSDTM, and so forth.
The steps performed by the server in the above embodiments may be based on the server structure shown in fig. 8.
The terms "first," "second," "third," "fourth," and the like in the description of the application and the above-described figures, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be understood that in the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" for describing an association relationship of associated objects, indicating that there may be three relationships, e.g., "a and/or B" may indicate: only A, only B and both A and B are present, wherein A and B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of single item(s) or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", wherein a, b, c may be single or plural.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be substantially implemented or contributed to by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (8)

1. An automatic car following method based on artificial intelligence, characterized in that a correspondence between car following models and car following scenes is preset, the car following models comprising a stable-distance car following model, a constant-speed car following model and a parking model, and the method comprises:
identifying a car following scene at the current moment;
calling a target car following model corresponding to the car following scene according to the corresponding relation;
obtaining vehicle state data and front vehicle state data required by the target vehicle following model and a target distance between the vehicle and the front vehicle;
determining a target control quantity according to the vehicle state data, the preceding vehicle state data, the target distance and the target vehicle following model;
controlling the vehicle by using the target control quantity;
wherein the identifying the car following scene at the current moment comprises:
determining whether a vehicle in front of the side cuts into the lane where the vehicle is located;
if so, the car following scene is a constant-speed car following scene;
the calling of the target car following model corresponding to the car following scene according to the corresponding relation comprises the following steps:
and calling the constant-speed car following model corresponding to the constant-speed car following scene as the target car following model.
2. The method of claim 1, wherein the identifying the following scene at the current time comprises:
determining whether braking of the front vehicle occurs;
if so, the car following scene is a braking scene;
the calling of the target car following model corresponding to the car following scene according to the corresponding relation comprises the following steps:
and calling the parking model corresponding to the braking scene as the target car following model.
3. The method according to any one of claims 1-2, characterized in that the target control quantity is an acceleration.
4. An automatic car following method based on artificial intelligence, characterized in that a correspondence between car following models and car following scenes is preset, the car following models comprising a stable-distance car following model, a constant-speed car following model and a parking model, and the method comprises:
identifying a car following scene at the current moment;
calling a target car following model corresponding to the car following scene according to the corresponding relation;
obtaining vehicle state data and front vehicle state data required by the target vehicle following model and a target distance between the vehicle and the front vehicle;
determining a target control quantity according to the vehicle state data, the preceding vehicle state data, the target distance and the target vehicle following model;
controlling the vehicle by using the target control quantity;
wherein the identifying the car following scene at the current moment comprises:
if the braking distance of the called following model at the current moment and the braking distance of the stable following model meet preset conditions, determining that the following scene is a stable following scene;
the calling of the target car following model corresponding to the car following scene according to the corresponding relation comprises the following steps:
and calling the stable vehicle following model corresponding to the stable vehicle following scene as the target vehicle following model.
5. An automatic car following apparatus based on artificial intelligence, characterized in that a correspondence between car following models and car following scenes is preset, the car following models comprising a stable-distance car following model, a constant-speed car following model and a parking model, and the apparatus comprises an identification unit, a calling unit, an acquisition unit, a determining unit and a control unit:
the identification unit is used for determining whether a vehicle in front of the side cuts into the lane where the vehicle is located, and if so, the vehicle following scene is a constant-speed vehicle following scene;
the calling unit is used for calling the constant-speed car-following model corresponding to the constant-speed car-following scene as a target car-following model;
the acquisition unit is used for acquiring the vehicle state data and the front vehicle state data required by the target vehicle following model and the target distance between the vehicle and the front vehicle;
the determining unit is used for determining a target control quantity according to the vehicle state data, the preceding vehicle state data, the target distance and the target vehicle following model;
and the control unit is used for controlling the vehicle by using the target control quantity.
6. An automatic car following apparatus based on artificial intelligence, characterized in that a correspondence between car following models and car following scenes is preset, the car following models comprising a stable-distance car following model, a constant-speed car following model and a parking model, and the apparatus comprises an identification unit, a calling unit, an acquisition unit, a determining unit and a control unit:
the identification unit is used for determining that the following scene is a stable following scene if the braking distance of the called following model at the current moment and the braking distance of the stable following model meet preset conditions;
the calling unit is used for calling the stable vehicle following model corresponding to the stable vehicle following scene as a target vehicle following model;
the acquisition unit is used for acquiring the vehicle state data and the front vehicle state data required by the target vehicle following model and the target distance between the vehicle and the front vehicle;
the determining unit is used for determining a target control quantity according to the vehicle state data, the preceding vehicle state data, the target distance and the target vehicle following model;
and the control unit is used for controlling the vehicle by using the target control quantity.
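The braking-distance precondition in claim 6 can be sketched as follows. The constant-deceleration braking-distance formula d = v²/(2a) and the switching margin are plausible assumptions chosen for illustration; the patent does not disclose the concrete preset condition:

```python
def braking_distance(speed: float, decel: float) -> float:
    # Distance to stop from `speed` (m/s) at constant deceleration
    # `decel` (m/s^2): d = v^2 / (2a).
    return speed * speed / (2.0 * decel)

def switch_to_stable(current_model_brake_dist: float,
                     stable_model_brake_dist: float,
                     margin: float = 2.0) -> bool:
    # Recognition unit (sketch): the scene becomes "stable following" once
    # the stable-distance model could stop no later than the currently
    # called model plus a safety margin -- one possible preset condition.
    return stable_model_brake_dist <= current_model_brake_dist + margin

# Usage: at 15 m/s, a model braking at 4 m/s^2 stops in 28.125 m versus
# 37.5 m at 3 m/s^2, so the switch condition holds.
d_cur = braking_distance(15.0, 3.0)
d_stable = braking_distance(15.0, 4.0)
```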
7. An automatic car-following device, the device comprising a processor and a memory:
the memory is configured to store program code and to transmit the program code to the processor;
the processor is configured to perform the method of any one of claims 1-4 according to instructions in the program code.
8. A computer-readable storage medium, characterized in that the computer-readable storage medium is configured to store program code for performing the method of any one of claims 1-4.
CN201911007635.3A 2019-10-22 2019-10-22 Automatic car following method based on artificial intelligence and related device Active CN110775059B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911007635.3A CN110775059B (en) 2019-10-22 2019-10-22 Automatic car following method based on artificial intelligence and related device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911007635.3A CN110775059B (en) 2019-10-22 2019-10-22 Automatic car following method based on artificial intelligence and related device

Publications (2)

Publication Number Publication Date
CN110775059A CN110775059A (en) 2020-02-11
CN110775059B true CN110775059B (en) 2021-08-27

Family

ID=69384476

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911007635.3A Active CN110775059B (en) 2019-10-22 2019-10-22 Automatic car following method based on artificial intelligence and related device

Country Status (1)

Country Link
CN (1) CN110775059B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112598911A (en) * 2021-03-05 2021-04-02 华砺智行(武汉)科技有限公司 Safe car following method, system and device in V2X car networking environment
CN113570845A (en) * 2021-07-23 2021-10-29 东风汽车集团股份有限公司 Networked vehicle formation driving method and system
CN113928313B (en) * 2021-10-08 2023-04-07 南京航空航天大学 Intelligent vehicle following control method and system suitable for heterogeneous traffic
CN114248780A (en) * 2021-12-27 2022-03-29 江苏大学 IDM-LSTM combined following model establishing method considering driver style
CN114802240B (en) * 2022-06-24 2022-09-27 禾多科技(北京)有限公司 Vehicle speed control method, device, equipment and computer readable medium
CN115977041A (en) * 2022-12-30 2023-04-18 重庆交通大学 Adjustable short-distance fishway system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018168323A1 (en) * 2017-03-17 2018-09-20 Mazda Motor Corporation Driving assist control device
WO2018168322A1 (en) * 2017-03-17 2018-09-20 Mazda Motor Corporation Driving assist control device
CN109421711A (en) * 2017-08-28 2019-03-05 Tencent Technology (Beijing) Co., Ltd. Car-following speed control method, device, system, computer equipment and storage medium
CN110023165A (en) * 2016-11-29 2019-07-16 Mazda Motor Corporation Vehicle control device
JP2019156272A (en) * 2018-03-15 2019-09-19 Honda Motor Co., Ltd. Vehicle control device, vehicle control method, and program

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3666338B2 (en) * 2000-01-20 2005-06-29 Nissan Motor Co., Ltd. Vehicle travel control device
CN108287540B (en) * 2017-10-19 2020-05-08 Tencent Technology (Shenzhen) Co., Ltd. Vehicle control method, vehicle control device, vehicle and storage medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110023165A (en) * 2016-11-29 2019-07-16 Mazda Motor Corporation Vehicle control device
WO2018168323A1 (en) * 2017-03-17 2018-09-20 Mazda Motor Corporation Driving assist control device
WO2018168322A1 (en) * 2017-03-17 2018-09-20 Mazda Motor Corporation Driving assist control device
CN109421711A (en) * 2017-08-28 2019-03-05 Tencent Technology (Beijing) Co., Ltd. Car-following speed control method, device, system, computer equipment and storage medium
JP2019156272A (en) * 2018-03-15 2019-09-19 Honda Motor Co., Ltd. Vehicle control device, vehicle control method, and program

Also Published As

Publication number Publication date
CN110775059A (en) 2020-02-11

Similar Documents

Publication Publication Date Title
CN110775059B (en) Automatic car following method based on artificial intelligence and related device
CN112046503B (en) Vehicle control method based on artificial intelligence, related device and storage medium
JP7072763B2 (en) How to determine driving behavior, devices, equipment and storage media
EP3604064B1 (en) Method and terminal for carrying out driving control on vehicle
CN108447472B (en) Voice wake-up method and device
CN105788321B (en) Vehicle communication method, device and system
CN107749194B (en) Lane changing assisting method and mobile terminal
CN112258837B (en) Vehicle early warning method, related device, equipment and storage medium
CN110926484B (en) Vehicle position obtaining method and device and intelligent vehicle
CN109556612B (en) Navigation information processing method, device, server, terminal and storage medium
CN107826109B (en) Lane keeping method and apparatus
CN111739329B (en) Travel route generation method, travel route generation device, storage medium, and server
CN111539371B (en) Vehicle control method, device, equipment and storage medium
CN111885500A (en) Road condition reminding method and device based on narrowband Internet of things and storage medium
JP2009023562A (en) Advice providing system
CN113110487A (en) Vehicle simulation control method and device, electronic equipment and storage medium
CN113923775A (en) Method, device, equipment and storage medium for evaluating quality of positioning information
CN112256006B (en) Data processing method and device and electronic equipment
US20230022123A1 (en) Autonomous driving method and apparatus
CN107835304B (en) Method and device for controlling mobile terminal, mobile terminal and storage medium
CN110126829B (en) Torque filter coefficient determining method, vehicle-mounted terminal and vehicle
CN114291113A (en) Risk threshold determination method, device, equipment and storage medium
CN109855643B (en) Lane guiding method and navigation equipment
CN113299098A (en) Traffic intersection vehicle guiding method and device
CN110895743A (en) Task processing method and related device

Legal Events

Date Code Title Description
PB01 Publication
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40020984

Country of ref document: HK

SE01 Entry into force of request for substantive examination
GR01 Patent grant