CN117207975A - Intersection driving assisting driving control method and device, vehicle and readable storage medium - Google Patents


Info

Publication number
CN117207975A
CN117207975A (application CN202311233849.9A)
Authority
CN
China
Prior art keywords
target vehicle
lane
vehicle
center line
coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311233849.9A
Other languages
Chinese (zh)
Inventor
罗华平
洪吉发
吴方义
邓晶
罗浩
刘卫东
黄少堂
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangling Motors Corp Ltd
Original Assignee
Jiangling Motors Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangling Motors Corp Ltd filed Critical Jiangling Motors Corp Ltd
Priority to CN202311233849.9A priority Critical patent/CN117207975A/en
Publication of CN117207975A publication Critical patent/CN117207975A/en
Pending legal-status Critical Current

Landscapes

  • Steering Control In Accordance With Driving Conditions (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to the technical field of vehicle driving assistance, and in particular to an intersection driving assistance control method and device, a vehicle, and a readable storage medium. The method comprises: acquiring the current motion state information and road state information of the vehicle; establishing a scene coordinate system at the vehicle's current position according to that information, and determining the coordinate parameters of the vehicle and of each lane center line in the scene coordinate system; determining, from the vehicle and lane center line coordinate parameters, the transverse offset distance from the near end of the lane center line to the longitudinal axis of the vehicle coordinate system, the lane center line curve, and the pre-aiming point coordinate parameters; and determining the total steering wheel angle of the vehicle with a feedforward tracking algorithm and a feedback algorithm according to the transverse offset distance, the lane center line curve, and the pre-aiming point coordinate parameters. Even when facing multiple lanes and a possible turn or lane change, the method realizes a safe and reliable auxiliary driving function, improves the driver's experience, keeps the function continuously available, and avoids interference from zebra crossings and the like.

Description

Intersection driving assisting driving control method and device, vehicle and readable storage medium
Technical Field
The invention relates to the technical field of vehicle auxiliary driving, in particular to a method and a device for controlling intersection driving auxiliary driving, a vehicle and a readable storage medium.
Background
Currently, information technologies represented by high-precision sensors, high-performance chips, the Internet of Things, cloud computing, big data, and artificial intelligence are widely applied, accelerating the intelligent development of society. The field of automobile traffic is evolving in step: ADAS/AD (advanced driver-assistance systems / automated driving) is an important area of innovation for the automotive industry, and as ADAS evolves toward AD, various sub-functions have emerged one after another, such as ACC (adaptive cruise control), LKA (lane keeping assist), LDW (lane departure warning), and TJA/ICA (traffic jam assist / integrated cruise assist).
In the related art, for a hardware configuration consisting of a single mid-range radar sensor and a single forward camera, driving assistance is usually implemented through TJA/ICA, with the longitudinal function (ACC) and the lateral function (LKA) engaged at the same time. In a public road intersection scene, however, and particularly with multiple lanes, road conditions are complex and the vehicle may turn or change lanes, so the related art generally adopts one of two control strategies: (1) the TJA/ICA function exits directly and the driver intervenes and takes over; (2) the vehicle continues to drive forward for 1 to 2 seconds along a virtual lane line. With the first strategy the driving experience is poor and continuous assisted control of the vehicle cannot be realized. With the second strategy the auxiliary driving function easily exits while the vehicle is still crossing the intersection, and the vehicle may press or straddle a line when it reaches the opposite lane, so the safety of this strategy is low.
Disclosure of Invention
The present application aims to solve at least one of the technical problems existing in the prior art. Therefore, the application provides an intersection driving assistance control method and device, a vehicle, and a readable storage medium.
In a first aspect, an embodiment of the present application provides a method for controlling driving assistance in driving at an intersection, including:
continuously acquiring current motion state information of a target vehicle and road state information of the target vehicle, wherein the road state information comprises lane number information of a road, position information of a lane of the target vehicle and distance information of an intersection from the target vehicle;
according to the current motion state information and the road state information of the target vehicle, establishing a scene coordinate system of the current position of the target vehicle, and determining a target vehicle coordinate parameter and a lane center line coordinate parameter in the scene coordinate system;
according to the target vehicle coordinate parameters and the lane center line coordinate parameters, determining the transverse offset distance from the near end of the lane center line to the longitudinal axis of the target vehicle coordinate system, the lane center line curve and the pre-aiming point coordinate parameters;
and determining the total steering wheel angle of the target vehicle by adopting a feedforward tracking algorithm and a feedback algorithm according to the transverse offset distance, the lane center line curve and the coordinate parameters of the pre-aiming point.
According to some embodiments of the application, the establishing a scene coordinate system of the current position of the target vehicle according to the motion state information and the road state information of the target vehicle, and determining the target vehicle coordinate parameter and the lane center line coordinate parameter in the scene coordinate system further includes:
judging whether the target vehicle enters a lane crossing or not according to the motion state information and the road state information of the target vehicle;
if yes, determining and generating a virtual lane line according to the road state information;
and determining the duration time of the current virtual lane line and the preset effective length corresponding to the duration time based on a preset mapping table according to the motion state information of the target vehicle.
According to some embodiments of the application, the determining of the duration of the current virtual lane line and the preset effective length corresponding to that duration, based on a preset mapping table and the motion state information of the target vehicle, includes:
if the target vehicle does not recognize an opposite effective lane within the duration of the current virtual lane line, the auxiliary driving function exits;
if the target vehicle recognizes an opposite effective lane within the duration, acquiring the effective length of the opposite effective lane and the preset effective length corresponding to the duration; if the effective length is smaller than that preset effective length, the auxiliary driving function exits.
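The duration-and-length gate described above can be sketched as follows. The patent only states that the duration and the preset effective length are looked up by interpolation from ego speed, so the breakpoint values and function names below are illustrative placeholders, not from the source:

```python
import bisect

# Hypothetical speed -> duration / minimum-length mapping table.
SPEED_KPH = [0.0, 20.0, 40.0, 60.0]           # ego speed breakpoints
DURATION_S = [2.0, 1.8, 1.5, 1.2]             # how long the virtual lane line is kept
MIN_EFFECTIVE_LEN_M = [6.0, 8.0, 10.0, 12.0]  # preset effective length per duration

def interp(x, xs, ys):
    """Piecewise-linear interpolation with clamping at the table ends."""
    if x <= xs[0]:
        return ys[0]
    if x >= xs[-1]:
        return ys[-1]
    i = bisect.bisect_right(xs, x) - 1
    t = (x - xs[i]) / (xs[i + 1] - xs[i])
    return ys[i] + t * (ys[i + 1] - ys[i])

def should_exit_assist(speed_kph, opposite_lane_found, opposite_len_m):
    """True if the assist function must exit while driving on the virtual lane line."""
    min_len = interp(speed_kph, SPEED_KPH, MIN_EFFECTIVE_LEN_M)
    if not opposite_lane_found:
        return True               # no opposite lane recognized within the duration
    return opposite_len_m < min_len  # opposite lane too short (e.g. a zebra crossing)
```

With the placeholder table, at 30 km/h the required effective length interpolates to 9 m, so a recognized 5 m stripe (zebra-crossing-like) still forces an exit while a 20 m lane keeps the function active.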
According to some embodiments of the present application, the determining the lateral offset distance from the near end of the lane center line to the longitudinal axis of the target vehicle coordinate system, the lane center line curve, and the pre-aiming point coordinate parameters according to the target vehicle coordinate parameters and the lane center line coordinate parameters includes:
obtaining position parameter information of each lane center line and the target vehicle according to the target vehicle coordinate parameters and the lane center line coordinate parameters, wherein the position parameter information comprises the transverse offset distance from the near end of the lane line to the longitudinal axis of the vehicle coordinate system, the included angle between the lane line coordinate system and the vehicle coordinate system, and the lane line curvature;
determining the center line curve of each lane from the position parameter information based on the lane center line calculation formula y = C0 + C1·x + C2·x², wherein C0 is the transverse offset distance from the near end of the lane to the longitudinal axis of the vehicle coordinate system, C1 is the included angle between the lane coordinate system and the vehicle coordinate system, and C2 is the lane curvature.
According to some embodiments of the application, the determining the lateral offset distance from the near end of the lane center line to the longitudinal axis of the target vehicle coordinate system, the lane center line curve and the pre-aiming point coordinate parameters according to the target vehicle coordinate parameters and the lane center line coordinate parameters further includes:
determining the pre-aiming distance l_d of the pre-aiming point according to the coordinate parameters and the current motion state information of the target vehicle, wherein the pre-aiming distance is obtained by the formula l_d = k·V + l_0, in which k is a gain factor, V is the target vehicle speed, and l_0 is the base pre-aiming distance;
obtaining the pre-aiming point transverse distance y_p according to the pre-aiming distance l_d of the pre-aiming point and the lane center line calculation equation.
According to some embodiments of the present application, the determining of the total steering wheel angle of the target vehicle with a feedforward tracking algorithm and a feedback algorithm, according to the transverse offset distance, the lane center line curve, and the pre-aiming point coordinate parameters, includes:
obtaining the turning radius R and the front wheel steering angle δ of the target vehicle according to the coordinate parameters of the target vehicle and of the pre-aiming point, wherein the turning radius is obtained from the formula R = l_d/(2·sin α) and the front wheel steering angle from δ = arctan(2·L·y_e/l_d²), in which l_d is the pre-aiming distance, α is the included angle between the pre-aiming point and the forward direction of the vehicle, L is the vehicle wheelbase, and y_e is the transverse distance between the vehicle position and the pre-aiming point;
obtaining the theoretical steering wheel angle θ_s of the target vehicle from the front wheel steering angle, wherein θ_s = δ·i, in which δ is the target vehicle front wheel steering angle and i is the steering gear ratio.
According to some embodiments of the present application, the determining of the total steering wheel angle of the target vehicle with a feedforward tracking algorithm and a feedback algorithm, according to the transverse offset distance, the lane center line curve, and the pre-aiming point coordinate parameters, further includes:
obtaining the deviation steering wheel angles θ1 and θ2 from the PID controller of the target vehicle, which eliminates the lateral deviation e_y and the heading deviation e_φ, wherein θ1 = k_p·e_y + k_i·∫e_y dt + k_d·(de_y/dt) and θ2 = k_p·e_φ + k_i·∫e_φ dt + k_d·(de_φ/dt), in which k_p, k_i and k_d are PID controller coefficients, e_y is the lateral deviation, and e_φ is the heading deviation;
obtaining the total steering wheel angle θ of the target vehicle from the deviation steering wheel angles θ1 and θ2, wherein θ = A·θ_s + B·(θ1 + θ2), in which A and B are proportionality coefficients and θ_s is the theoretical steering wheel angle.
In a second aspect, an embodiment of the present application provides an intersection driving support driving control device, including:
the parameter acquisition module is configured to continuously acquire current motion state information of a target vehicle and road state information of the target vehicle;
the scene construction module is configured to establish a scene coordinate system at the current position of the target vehicle according to the current motion state information and road state information of the target vehicle obtained by the parameter acquisition module, and to determine the target vehicle coordinate parameters and the lane center line coordinate parameters in the scene coordinate system;
the first determining module is configured to determine the transverse offset distance from the near end of the lane center line to the longitudinal axis of the target vehicle coordinate system, the lane center line curve, and the pre-aiming point coordinate parameters according to the target vehicle coordinate parameters and the lane center line coordinate parameters obtained by the scene construction module;
the second determining module is configured to determine, according to the target vehicle coordinate parameters and the lane center line coordinate parameters obtained by the scene construction module, the duration of the current virtual lane line and the preset effective length corresponding to that duration;
the data processing module is configured to determine the total steering wheel angle of the target vehicle by adopting a feedforward tracking algorithm and a feedback algorithm, according to the transverse offset distance from the near end of the lane center line to the longitudinal axis of the target vehicle coordinate system, the lane center line curve, and the pre-aiming point coordinate parameters determined by the first determining module;
the judging module is configured to judge whether the auxiliary driving function of the target vehicle needs to be exited or not according to the duration time of the current virtual lane line and the preset effective length corresponding to the duration time determined by the second determining module;
and the control module is configured to determine the total steering wheel angle of the target vehicle according to the judging result of the judging module and the data processing of the data processing module, and execute the intersection driving assisting driving control strategy.
In a third aspect, an embodiment of the present application provides a vehicle including:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to:
implement the intersection driving assistance control method according to the embodiment of the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer readable storage medium, on which computer program instructions are stored, where the program instructions, when executed by a processor, implement the steps of an intersection driving assisting driving control method according to the embodiment of the first aspect.
Compared with the prior art, the technical scheme provided by the embodiment of the application at least comprises the following beneficial effects:
1) In the intersection driving assistance control method, a scene coordinate system at the current position of the target vehicle is established by collecting the target vehicle's current motion state information and road state information; the transverse offset distance from the near end of the lane center line to the longitudinal axis of the target vehicle coordinate system, the lane center line curve, and the pre-aiming point coordinate parameters are determined; the pre-aiming point coordinate parameters and the transverse offset distance are combined with the lane line curves to obtain the theoretical steering wheel angle θ_s; the deviation steering wheel angles θ1 and θ2 are then combined to eliminate the lateral deviation e_y and the heading deviation e_φ, obtaining the total steering wheel angle θ of the target vehicle; and control instructions are generated from θ to realize assisted driving control of the target vehicle. A safe and reliable auxiliary driving function is thus realized on complex multi-lane roads where turning and lane changes occur, improving the driver's experience. Meanwhile, for some special conditions, for example when the target vehicle does not recognize an opposite effective lane within the preset time for which the virtual lane line is maintained, the function exits with a reminder so that the driver takes over, ensuring driver safety;
2) Interpolation in a preset mapping table based on the target vehicle's own speed yields the time for which the virtual lane line is maintained, avoiding a direct function exit because no lane line exists when entering the intersection;
3) Based on the virtual lane line time, correlated interpolation over the preset virtual lane line time and the recognized effective length of the opposite lane determines whether the opposite lane is valid, ensuring that the function can continue and avoiding interference from zebra crossings and the like;
4) An effective lane line equation is obtained from the lane line equations, and a suitable lane is selected by comparing the absolute values of the transverse offset distances from the near ends of the lane lines to the longitudinal axis of the vehicle coordinate system;
5) Based on the target lane center line equation, the lateral motion of the target vehicle is controlled with a feedforward tracking algorithm and a feedback algorithm, so that the target vehicle enters the target lane smoothly and centered.
Additional aspects and advantages of the application will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the application.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings can be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of an intersection driving support driving control method according to an embodiment of the present application;
FIG. 2 is another flow chart of an intersection driving assistance driving control method according to an embodiment of the present application;
FIG. 3 is a flow chart of control of hardware of a corresponding automobile for the intersection driving assistance driving control method according to the embodiment of the application;
FIG. 4 is a diagram of an intersection scene corresponding to an intersection driving assistance driving control method according to an embodiment of the present application;
FIG. 5 is a geometric data diagram of vehicle steering control corresponding to an intersection driving assistance driving control method according to an embodiment of the present application;
fig. 6 is a block diagram of an intersection driving support driving control device according to an embodiment of the present application;
fig. 7 is a functional block diagram of a vehicle according to an embodiment of the present application.
Detailed Description
The following is a detailed description of embodiments of the application with reference to the accompanying drawings. It is to be understood that the specific embodiments described herein are merely illustrative of the application and do not limit it.
It is to be noted that all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs unless defined otherwise. The terminology used herein in the description of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. The term "and/or" as used herein includes any and all combinations of one or more of the associated listed items.
Example 1
Referring to fig. 1 and fig. 3 to 5, the present embodiment provides a driving support control method for an intersection, including:
step S100: continuously acquiring current motion state information of a target vehicle and road state information of the target vehicle, wherein the road state information comprises lane number information of a road, position information of a lane of the target vehicle and distance information of an intersection from the target vehicle;
In this step, the method of this embodiment is applicable to a vehicle with a driving assistance function. As shown in fig. 3, the vehicle includes an intelligent camera IPM, an electronic power steering system EPS, vehicle steering control VSC, a mid-range radar MRR, a vehicle body electronic stability system ESP, and an electronic control unit ECU. The target vehicle may be understood as the current (ego) vehicle; the name "target vehicle" is used only for convenience in describing the method steps of this embodiment, and the vehicle type of the target vehicle is not limited: it may be, for example, a passenger car, an SUV, or a commercial vehicle;
further, the target vehicle continuously senses its surroundings with the intelligent camera and the radar and, combined with the navigation and positioning systems, continuously acquires the target vehicle's current motion state information and road state information. It can be understood that the current motion state information of the target vehicle includes the target vehicle's speed, acceleration, and gear information, and can be acquired continuously through in-vehicle sensors;
meanwhile, the road state information can be acquired through the navigation and positioning systems and includes the lane number information of the road, the position information of the lane the target vehicle is in, and the distance from the target vehicle to the intersection. The road state information is acquired continuously, and the previously acquired data are updated after each acquisition. Beyond this information, the target vehicle can of course also acquire the motion states of other vehicles in the adjacent lanes and ahead of and behind its own lane; at an intersection, it can additionally obtain the traffic light state from the cloud and adjust the control of the vehicle accordingly.
Step S200: according to the current motion state information and the road state information of the target vehicle, establishing a scene coordinate system of the current position of the target vehicle, and determining a target vehicle coordinate parameter and a lane center line coordinate parameter in the scene coordinate system;
in this step, it can be understood that the scene coordinate system is a virtual coordinate system generated in the program; it contains the position and coordinate parameters of the target vehicle and the position and coordinate parameters of each lane center line. Note that as the target vehicle moves, the position and coordinate parameters of the target vehicle and of each lane center line ahead of the vehicle are also adjusted and refined, based on the data collected by the navigation and positioning systems;
In one embodiment, as shown in fig. 4, a target vehicle travels at a road intersection. It can be seen that, unlike a conventional intersection, the lane lines at the two ends of the intersection are not aligned. Facing such a special intersection, a conventional driver-assistance control method would have the TJA/ICA function exit directly and the driver take over, and automatic control could not carry the target vehicle all the way through; the method steps of this embodiment effectively solve this problem;
specifically, the target vehicle first collects the center line position information of each lane and its own current position information, obtains the transverse offset distance from the near end of each lane center line to the longitudinal axis of the target vehicle coordinate system, compares these transverse offset distances to select the one with the minimum absolute value, and thereby determines the lane the target vehicle should enter and that lane's position information;
in one case, it should be noted that with three lanes as in fig. 4, the transverse offset distance corresponding to each lane is compared and the minimum is taken; if even that minimum absolute transverse offset distance from the near end of the lane center line to the longitudinal axis of the ego vehicle coordinate system exceeds a threshold value, that is, the transverse offset distance is not smaller than 0.8 m, then, to prevent a safety accident caused by sharp steering of the target vehicle, the driver-assistance control function of the target vehicle exits and prompts the driver to take over.
Step S300: according to the target vehicle coordinate parameters and the lane center line coordinate parameters, determining the transverse offset distance from the near end of the lane center line to the longitudinal axis of the target vehicle coordinate system, the lane center line curve and the pre-aiming point coordinate parameters;
in this step, position parameter information of each lane center line and of the target vehicle is obtained according to the target vehicle coordinate parameters and the lane center line coordinate parameters, wherein the position parameter information comprises the transverse offset distance from the near end of the lane line to the longitudinal axis of the vehicle coordinate system, the included angle between the lane line coordinate system and the vehicle coordinate system, and the lane line curvature;
the center line curve of each lane is then determined from the position parameter information based on the lane center line calculation formula y = C0 + C1·x + C2·x², wherein C0 is the transverse offset distance from the near end of the lane to the longitudinal axis of the vehicle coordinate system, C1 is the included angle between the lane coordinate system and the vehicle coordinate system, and C2 is the lane curvature;
Illustratively, with continued reference to fig. 4, curves for the center lines of three lanes can be obtained from the four lane lines: the left lane center line curve y_l = C0_l + C1_l·x + C2_l·x²; the center lane center line curve y_c = C0_c + C1_c·x + C2_c·x²; and the right lane center line curve y_r = C0_r + C1_r·x + C2_r·x². It should be noted that the strategy of this assisted control method selects a suitable lane by |C0|, the absolute value of the transverse offset distance from the near end of the lane center line to the longitudinal axis of the ego vehicle coordinate system;
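The lane-selection step can be sketched minimally as follows, assuming the quadratic center line form y = C0 + C1·x + C2·x² and using 0.8 m as the exit threshold on |C0|; the function names and tuple layout are illustrative, not from the source:

```python
def centerline_y(x, c0, c1, c2):
    # Lateral position of a lane center line in the vehicle coordinate system:
    # y = C0 + C1*x + C2*x^2
    return c0 + c1 * x + c2 * x ** 2

def select_target_lane(lanes, c0_threshold=0.8):
    """Pick the lane whose near-end transverse offset |C0| is smallest.

    `lanes` is a list of (c0, c1, c2) tuples, one per detected center line.
    Returns the chosen tuple, or None if even the best lane is offset by at
    least `c0_threshold` metres (the assist exits and the driver takes over).
    """
    best = min(lanes, key=lambda lane: abs(lane[0]))
    if abs(best[0]) >= c0_threshold:
        return None
    return best
```

For three lanes roughly 3.5 m apart, the ego-aligned lane has the smallest |C0| and is selected; if all candidates exceed the threshold, `None` signals the exit-and-take-over branch.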
meanwhile, in the step, the method further comprises the step of determining the pretightening distance of the pretightening point according to the coordinate parameters and the current motion state information of the target vehicleWherein, pretightening distance +.>By the formula->Obtaining, wherein k is a gain factor, V is a target vehicle speed, ++>The distance is pre-aimed as a basis.
According to the pretightening distance of the pretightening pointAnd a lane center line calculation equation to obtain a pre-aiming point transverse distance +.>Specifically, the point +_is found in the lane centerline equation by the curve of the centerline of the lane>Satisfy->At this time
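The pre-aiming point lookup can be sketched numerically. The gain `k` and base distance below are illustrative values (the patent gives only the form l_d = k·V + base), and bisection is one simple way to locate the point of the center line at the pre-aiming distance from the vehicle:

```python
import math

def preview_distance(v_mps, k=1.0, base=5.0):
    # l_d = k*V + l_base; k and l_base are placeholder values for illustration.
    return k * v_mps + base

def preview_point(c0, c1, c2, l_d, x_hi=200.0, tol=1e-6):
    """Find (x_p, y_p) on y = C0 + C1*x + C2*x^2 with sqrt(x^2 + y^2) = l_d.

    For the gentle curves in this scenario the distance from the origin grows
    monotonically with x, so simple bisection on x suffices.
    """
    f = lambda x: math.hypot(x, c0 + c1 * x + c2 * x ** 2) - l_d
    lo, hi = 0.0, x_hi
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) < 0:
            lo = mid
        else:
            hi = mid
    x_p = 0.5 * (lo + hi)
    return x_p, c0 + c1 * x_p + c2 * x_p ** 2
```

On a straight, centered lane (C0 = C1 = C2 = 0) the pre-aiming point is simply l_d metres straight ahead, which is a quick sanity check for the solver.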
Step S400: and determining the total steering wheel angle of the target vehicle by adopting a feedforward tracking algorithm and a feedback algorithm according to the transverse offset distance, the lane center line curve and the coordinate parameters of the pre-aiming point.
In this step, the total steering wheel angle of the target vehicle is determined with a feedforward tracking algorithm and a feedback algorithm from the obtained transverse offset distance, lane center line curve, and pre-aiming point coordinate parameters;
The feedforward pure pursuit algorithm is based on a geometric principle: the turning radius is calculated from the relation between the pre-aiming point and the vehicle's position, and the front wheel angle is calculated from the vehicle's wheelbase and turning radius. It comprises obtaining the turning radius R and the front wheel steering angle δ of the target vehicle according to the coordinate parameters of the target vehicle and of the pre-aiming point, wherein the turning radius is obtained from the formula R = l_d/(2·sin α) and the front wheel steering angle from δ = arctan(2·L·y_e/l_d²), in which l_d is the pre-aiming distance, α is the included angle between the pre-aiming point and the forward direction of the vehicle, L is the vehicle wheelbase, and y_e is the transverse distance between the vehicle position and the pre-aiming point;
specifically, from the formula l_d = 2·R·sin α one derives R = l_d/(2·sin α); combining this with fig. 5, sin α = y_e/l_d, so R = l_d²/(2·y_e), where L is the vehicle wheelbase and y_e is the transverse distance between the vehicle position and the pre-aiming point, which finally rearranges to δ = arctan(L/R) = arctan(2·L·y_e/l_d²). From the front wheel angle δ of the target vehicle, the theoretical steering wheel angle θ_s is obtained as θ_s = δ·i, where δ is the target vehicle front wheel steering angle and i is the steering gear ratio.
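A sketch of the feedforward step, using the standard pure pursuit relations (turning radius from the preview geometry, front wheel angle from the wheelbase, theoretical steering wheel angle via a gear ratio); the function name and argument choices are assumptions for illustration:

```python
import math

def pure_pursuit_steering(y_e, l_d, wheelbase, gear_ratio):
    """Feedforward (pure pursuit) steering from pre-aiming-point geometry.

    y_e:        transverse distance of the pre-aiming point (m)
    l_d:        pre-aiming distance (m)
    wheelbase:  vehicle wheelbase L (m)
    gear_ratio: steering-wheel-to-front-wheel ratio i
    Returns (turning_radius, front_wheel_angle_rad, steering_wheel_angle_rad).
    """
    R = l_d ** 2 / (2.0 * y_e)                           # R = l_d^2 / (2*y_e)
    delta = math.atan(2.0 * wheelbase * y_e / l_d ** 2)  # front wheel angle
    theta = delta * gear_ratio                           # theoretical steering wheel angle
    return R, delta, theta
```

For example, a 1 m lateral offset at a 10 m pre-aiming distance gives a 50 m turning radius; with a 2.8 m wheelbase the front wheel angle is small, and the gear ratio scales it up to the steering wheel.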
Further, for the feedback algorithm in this step, a PID controller of the target vehicle is used to eliminate the lateral deviation e_d and the heading deviation e_φ, yielding the deviation steering wheel angles θ_d and θ_φ. Specifically, θ_d = K_p1·e_d + K_i1·∫e_d dt + K_d1·(de_d/dt) and θ_φ = K_p2·e_φ + K_i2·∫e_φ dt + K_d2·(de_φ/dt), where the K terms are PID controller coefficients, e_d is the lateral deviation and e_φ is the heading deviation;
From the deviation steering wheel angles θ_d and θ_φ, combined with the feedforward pure pursuit algorithm, the total steering wheel angle θ_total of the target vehicle is obtained as θ_total = θ_ff + A·θ_d + B·θ_φ, where A and B are proportionality coefficients and θ_ff is the theoretical steering wheel angle. It should be noted that the proportionality coefficient B is related to the curvature C2: the greater the curvature, the greater the value of B, so as to eliminate the predictable lateral and heading deviations, and vice versa.
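A minimal sketch of the feedback correction and the feedforward/feedback blend described above. The PID gains, the blend coefficients A and B, and the curvature-dependent law for B are illustrative assumptions, not values from the patent:

```python
class PIDController:
    """Discrete PID used to drive one deviation signal toward zero."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


def total_steering_angle(theta_ff, lat_pid, head_pid, e_d, e_phi,
                         curvature, a=1.0, b0=0.5, b_gain=10.0):
    """Blend: theta_total = theta_ff + A * theta_d + B * theta_phi,
    with B growing with lane curvature (illustrative law)."""
    theta_d = lat_pid.step(e_d)       # correction for lateral deviation
    theta_phi = head_pid.step(e_phi)  # correction for heading deviation
    b = b0 + b_gain * abs(curvature)  # larger curvature -> larger B
    return theta_ff + a * theta_d + b * theta_phi
```

With the integral and derivative gains set to zero, each correction reduces to a pure proportional term, which makes the blend easy to verify by hand.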
Example 2
Referring to fig. 2, the present embodiment further describes step S200 based on embodiment 1, including:
step S210: judging whether the target vehicle enters a lane crossing or not according to the motion state information and the road state information of the target vehicle;
in this step, the target vehicle continuously senses its surrounding environment through the intelligent camera and the radar and, in combination with the navigation system and the positioning system, continuously acquires its current motion state information and road state information, so as to judge whether the target vehicle has entered a lane intersection;
step S220: if yes, determining and generating a virtual lane line according to the road state information;
In this step, since there is no lane line at the intersection, a corresponding virtual lane line can be generated according to the road state information of the target vehicle to assist the target vehicle through the intersection, so as to ensure that the lateral motion of the target vehicle can be controlled and that it drives smoothly and centrally into the target lane;
step S230: and determining the duration time of the current virtual lane line and the preset effective length corresponding to the duration time based on a preset mapping table according to the motion state information of the target vehicle.
In this step, because the lane lines at an intersection are usually interrupted and of varying width, the current virtual lane line needs to be maintained for a certain time while the vehicle passes the intersection, so that the opposite lane can be identified by the camera;
table (1) is a duration map of target vehicle speed and current virtual lane line
Meanwhile, regarding the effective length of the opposite lane: an opposite lane of at least the required effective length must be identified within the duration of the virtual lane line in order to smoothly and continuously steer the vehicle into a suitable opposite lane. Table (2) is one form of the preset mapping table of the effective length that must be identified within the duration. The duration of the current virtual lane line is determined by a first table lookup using the acquired speed of the target vehicle; a second table lookup on that duration then gives the required effective length, and these two values are used to judge whether the assisted driving function of the target vehicle needs to be taken over by the driver;
Table (2) is the mapping table of the effective length that must be identified within the duration
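The two-stage table lookup described above (vehicle speed → virtual lane line duration, then duration → required effective length) can be sketched with linear interpolation. The table values below are made-up placeholders, since the patent's Tables (1) and (2) are not reproduced in this text:

```python
import bisect

def interp(table, x):
    """Piecewise-linear interpolation in a sorted (key, value) table."""
    keys = [k for k, _ in table]
    vals = [v for _, v in table]
    if x <= keys[0]:
        return vals[0]
    if x >= keys[-1]:
        return vals[-1]
    i = bisect.bisect_right(keys, x)
    t = (x - keys[i - 1]) / (keys[i] - keys[i - 1])
    return vals[i - 1] + t * (vals[i] - vals[i - 1])

# Placeholder Table (1): vehicle speed (km/h) -> virtual lane line duration (s)
SPEED_TO_DURATION = [(20, 6.0), (40, 4.0), (60, 3.0), (80, 2.5)]
# Placeholder Table (2): duration (s) -> required effective lane length (m)
DURATION_TO_LENGTH = [(2.5, 15.0), (3.0, 20.0), (4.0, 30.0), (6.0, 45.0)]

def lookup(speed_kmh):
    """First lookup: speed -> duration; second lookup: duration -> length."""
    duration = interp(SPEED_TO_DURATION, speed_kmh)
    required_length = interp(DURATION_TO_LENGTH, duration)
    return duration, required_length
```

If no opposite lane of at least `required_length` is identified within `duration`, the assisted driving function would exit and hand control back to the driver.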
Step S240: according to the duration time of the current virtual lane line, if the target vehicle does not recognize the opposite effective lane within the duration time, the auxiliary driving function exits;
in this step, according to the determined duration of the current virtual lane line, if the target vehicle does not identify an opposite effective lane within the duration, the assisted driving function exits and the driver is reminded to take over;
step S260: if the target vehicle identifies the opposite effective lane within the duration time, acquiring the effective length of the opposite effective lane and the preset effective length corresponding to the duration time, and if the effective length is smaller than the preset effective length corresponding to the duration time, the auxiliary driving function exits.
In this step, if the target vehicle identifies the opposite effective lane within the duration, the effective length of the opposite effective lane and the preset effective length corresponding to the duration are acquired; if the effective length is greater than or equal to the preset effective length corresponding to the duration, the assisted driving control function is executed on the target vehicle to control its lateral motion so that it drives smoothly and centrally into the opposite target lane;
In some embodiments, the method collects the current motion state information and road state information of the target vehicle, establishes a scene coordinate system for the current target vehicle, and determines the lateral offset distance from the near end of the lane center line to the longitudinal axis of the target vehicle coordinate system, the lane center line curve and the pre-aiming point coordinate parameters. Combining the lane center line curve, the pre-aiming point coordinate parameters and the lateral offset distance yields the theoretical steering wheel angle θ_ff, which is then combined with the deviation steering wheel angles θ_d and θ_φ, eliminating the lateral deviation e_d and the heading deviation e_φ, to obtain the total steering wheel angle θ_total of the target vehicle. From θ_total, a control instruction is generated to realize assisted driving control of the target vehicle, achieving a safe and reliable assisted driving function on complex multi-lane roads and during turning lane changes and improving the driver's experience. Meanwhile, in the special case where the target vehicle does not identify a valid opposite lane within the preset time for maintaining the virtual lane line, the function exits and reminds the driver to take over, thereby ensuring the driver's safety;
interpolation is performed in a preset mapping table based on the target vehicle's own speed to obtain the time for which the virtual lane line is maintained, avoiding the situation where the function exits directly because there is no lane line when entering an intersection;
based on the virtual lane line time, correlation interpolation is performed between the preset virtual lane line time and the identified effective length of the opposite lane to determine the required effective length of the opposite lane, ensuring that the function can continue and avoiding interference from zebra crossings and the like;
an effective lane line equation is obtained from the lane line equation, and a suitable lane is selected by comparing the absolute values of the lateral offset distances from the near end of each lane line to the longitudinal axis of the vehicle coordinate system;
based on a target lane center line equation, a feedforward tracking algorithm and a feedback algorithm are utilized to control the transverse operation of the target vehicle, so that the target vehicle smoothly and centrally enters the target lane.
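The lane-selection rule above can be sketched as a comparison of the |C0| terms (near-end lateral offsets) of the candidate lane center line equations; the candidate offset values below are illustrative, not from the patent:

```python
def select_target_lane(lane_c0_offsets):
    """Return the index of the candidate lane whose near-end lateral
    offset |C0| (distance from the lane line's near end to the vehicle
    coordinate system's longitudinal axis) is smallest."""
    return min(range(len(lane_c0_offsets)),
               key=lambda i: abs(lane_c0_offsets[i]))

# Illustrative C0 offsets (m) for three candidate opposite lanes
offsets = [3.4, -0.6, 4.1]
target = select_target_lane(offsets)  # picks the lane nearest the vehicle axis
```

The selected lane's center line equation then serves as the reference for the feedforward and feedback lateral control.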
Example 3
Referring to fig. 6, the present embodiment provides an intersection driving support driving control device, where the intersection driving support driving control device 200 includes:
a parameter obtaining module 210 configured to continuously obtain current motion state information of a target vehicle and road state information of the target vehicle;
the scene construction module 220 is configured to establish a scene coordinate system of the current position of the target vehicle according to the current motion state information and road state information of the target vehicle obtained by the parameter obtaining module, and to determine the target vehicle coordinate parameters and the lane center line coordinate parameters in the scene coordinate system;
The first determining module 230 is configured to determine a lateral offset distance from a near end of a lane center line to a longitudinal axis of the target vehicle coordinate system, a lane center line curve and a pre-aiming point coordinate parameter according to the target vehicle coordinate parameter and the lane center line coordinate parameter obtained by the scene building module;
the second determining module 240 is configured to obtain, according to the target vehicle coordinate parameter and the lane center line coordinate parameter obtained by the scene building module, a duration of the current virtual lane line and a preset effective length corresponding to the duration;
the data processing module 250 is configured to determine the total steering wheel angle of the target vehicle by adopting a feedforward tracking algorithm and a feedback algorithm, according to the lateral offset distance from the near end of the lane center line to the longitudinal axis of the target vehicle coordinate system, the lane center line curve and the pre-aiming point coordinate parameters determined by the first determining module;

the judging module 260 is configured to judge whether the assisted driving function of the target vehicle needs to exit, according to the duration of the current virtual lane line and the corresponding preset effective length determined by the second determining module;

and the control module 270 is configured to execute the intersection driving assistance control strategy according to the judging result of the judging module and the total steering wheel angle of the target vehicle determined by the data processing module.
Example 4
Referring to fig. 7, the present embodiment provides a vehicle 600 that may include various subsystems, such as an infotainment system 610, a perception system 620, a decision control system 630, a drive system 640, and a computing platform 650. Alternatively, vehicle 600 may include more or fewer subsystems, and each subsystem may include multiple components. In addition, each of the subsystems and components of vehicle 600 may be interconnected via wires or wirelessly.
In some embodiments, the infotainment system 610 may include a communication system 611, an entertainment system 612, and a navigation system 613.
The communication system 611 may comprise a wireless communication system, which may communicate wirelessly with one or more devices, either directly or via a communication network. For example, the wireless communication system may use 3G cellular communication such as CDMA, EVDO or GSM/GPRS, 4G cellular communication such as LTE, or 5G cellular communication. The wireless communication system may communicate with a wireless local area network (WLAN) using WiFi. In some embodiments, the wireless communication system may communicate directly with a device using an infrared link, Bluetooth or ZigBee, or may use other wireless protocols, such as various vehicle communication systems; for example, the wireless communication system may include one or more dedicated short-range communication (DSRC) devices, which may include public and/or private data communications between vehicles and/or roadside stations.
The entertainment system 612 may include a display device, a microphone and a speaker. Based on the entertainment system, a user may listen to broadcasts and play music in the vehicle; alternatively, a mobile phone may communicate with the vehicle and mirror its screen onto the display device. The display device may be touch-sensitive, and the user may operate it by touching the screen.
In some cases, the user's voice signal may be acquired through a microphone and certain controls of the vehicle 600 by the user may be implemented based on analysis of the user's voice signal, such as adjusting the temperature within the vehicle, etc. In other cases, music may be played to the user through sound.
The navigation system 613 may include a map service provided by a map provider to provide navigation of a travel route for the vehicle 600, and the navigation system 613 may be used with the global positioning system 621 and the inertial measurement unit 622 of the vehicle. The map service provided by the map provider may be a two-dimensional map or a high-precision map.
The perception system 620 may include several types of sensors that sense information about the environment surrounding the vehicle 600. For example, sensing system 620 may include a global positioning system 621 (which may be a GPS system, or may be a beidou system, or other positioning system), an inertial measurement unit (inertial measurement unit, IMU) 622, a lidar 623, a millimeter wave radar 624, an ultrasonic radar 625, and a camera 626. The sensing system 620 may also include sensors (e.g., in-vehicle air quality monitors, fuel gauges, oil temperature gauges, etc.) of the internal systems of the monitored vehicle 600. Sensor data from one or more of these sensors may be used to detect objects and their corresponding characteristics (location, shape, direction, speed, etc.). Such detection and identification is a critical function of the safe operation of the vehicle 600.
The global positioning system 621 is used to estimate the geographic location of the vehicle 600.
The inertial measurement unit 622 is configured to sense a change in the pose of the vehicle 600 based on inertial acceleration. In some embodiments, inertial measurement unit 622 may be a combination of an accelerometer and a gyroscope.
The lidar 623 uses a laser to sense objects in the environment in which the vehicle 600 is located. In some embodiments, lidar 623 may include one or more laser sources, a laser scanner, and one or more detectors, among other system components.
The millimeter-wave radar 624 utilizes radio signals to sense objects within the surrounding environment of the vehicle 600. In some embodiments, millimeter-wave radar 624 may be used to sense the speed and/or heading of an object in addition to sensing the object.
The ultrasonic radar 625 may utilize ultrasonic signals to sense objects around the vehicle 600.
The image pickup device 626 is used to capture image information of the surrounding environment of the vehicle 600. The image capturing device 626 may include a monocular camera, a binocular camera, a structured light camera, a panoramic camera, etc., and the image information acquired by the image capturing device 626 may include still images or video stream information.
The decision control system 630 includes a computing system 631 that makes analysis decisions based on information acquired by the perception system 620, and the decision control system 630 also includes a vehicle controller 632 that controls the powertrain of the vehicle 600, as well as a steering system 633, throttle 634, and braking system 635 for controlling the vehicle 600.
The computing system 631 may be operable to process and analyze the various information acquired by the perception system 620 in order to identify targets, objects, and/or features in the environment surrounding the vehicle 600. The targets may include pedestrians or animals and the objects and/or features may include traffic signals, road boundaries, and obstacles. The computing system 631 may use object recognition algorithms, in-motion restoration structure (Structure from Motion, SFM) algorithms, video tracking, and the like. In some embodiments, the computing system 631 may be used to map the environment, track objects, estimate the speed of objects, and so forth. The computing system 631 may analyze the acquired various information and derive control strategies for the vehicle.
The vehicle controller 632 may be configured to coordinate control of the power battery and the engine 641 of the vehicle to enhance the power performance of the vehicle 600.
Steering system 633 is operable to adjust the direction of travel of vehicle 600. For example, in one embodiment may be a steering wheel system.
Throttle 634 is used to control the operating speed of engine 641 and thereby the speed of vehicle 600.
The braking system 635 is used to control deceleration of the vehicle 600. The braking system 635 may use friction to slow the wheels 644. In some embodiments, the braking system 635 may convert kinetic energy of the wheels 644 into electrical current. The braking system 635 may take other forms to slow the rotational speed of the wheels 644 to control the speed of the vehicle 600.
The drive system 640 may include components that provide powered movement of the vehicle 600. In one embodiment, the drive system 640 may include an engine 641, an energy source 642, a transmission 643, and wheels 644. The engine 641 may be an internal combustion engine, an electric motor, an air compression engine, or other types of engine combinations, such as a hybrid engine of a gasoline engine and an electric motor, or a hybrid engine of an internal combustion engine and an air compression engine. The engine 641 converts the energy source 642 into mechanical energy.
Examples of energy sources 642 include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electricity. The energy source 642 may also provide energy to other systems of the vehicle 600.
The transmission 643 may transfer mechanical power from the engine 641 to wheels 644. The transmission 643 may include a gearbox, a differential, and a driveshaft. In one embodiment, the transmission 643 may also include other devices, such as a clutch. Wherein the drive shaft may include one or more axles that may be coupled to one or more wheels 644.
Some or all of the functions of the vehicle 600 are controlled by the computing platform 650. The computing platform 650 may include at least one processor 651, and the processor 651 may execute instructions 653 stored in a non-transitory computer-readable medium, such as memory 652. In some embodiments, computing platform 650 may also be a plurality of computing devices that control individual components or subsystems of vehicle 600 in a distributed manner.
The processor 651 may be any conventional processor, such as a commercially available CPU. Alternatively, the processor 651 may also include, for example, a graphics processor (Graphics Processing Unit, GPU), a field programmable gate array (Field Programmable Gate Array, FPGA), a system on chip (System on Chip, SOC), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), or a combination thereof. Although FIG. 7 functionally illustrates the processor, memory, and other elements of the computer in the same block, it will be understood by those of ordinary skill in the art that the processor, computer, or memory may in fact comprise multiple processors, computers, or memories that may or may not be stored within the same physical housing. For example, the memory may be a hard disk drive or other storage medium located in a housing different from that of the computer. Thus, a reference to a processor or computer will be understood to include a reference to a collection of processors, computers, or memories that may or may not operate in parallel. Rather than using a single processor to perform the steps described herein, some components, such as the steering component and the deceleration component, may each have their own processor that performs only calculations related to that component's specific function.
In the present disclosure, the processor 651 may perform the steps of the intersection driving support driving control method in the above-described embodiment.
In various aspects described herein, the processor 651 can be located remotely from and in wireless communication with the vehicle. In other aspects, some of the processes described herein are performed on a processor disposed within the vehicle and others are performed by a remote processor, including taking the necessary steps to perform a single maneuver.
In some embodiments, memory 652 may contain instructions 653 (e.g., program logic), which instructions 653 may be executed by processor 651 to perform various functions of vehicle 600. Memory 652 may also contain additional instructions, including instructions to send data to, receive data from, interact with, and/or control one or more of infotainment system 610, perception system 620, decision control system 630, drive system 640.
In addition to instructions 653, memory 652 may store data such as road maps, route information, vehicle location, direction, speed, and other such vehicle data, as well as other information. Such information may be used by the vehicle 600 and the computing platform 650 during operation of the vehicle 600 in autonomous, semi-autonomous, and/or manual modes.
The computing platform 650 may control the functions of the vehicle 600 based on inputs received from various subsystems (e.g., the drive system 640, the perception system 620, and the decision control system 630). For example, computing platform 650 may utilize input from decision control system 630 in order to control steering system 633 to avoid obstacles detected by perception system 620. In some embodiments, computing platform 650 is operable to provide control over many aspects of vehicle 600 and its subsystems.
Alternatively, one or more of these components may be mounted separately from or associated with vehicle 600. For example, the memory 652 may exist partially or completely separate from the vehicle 600. The above components may be communicatively coupled together in a wired and/or wireless manner.
Alternatively, the above components are only an example, and in practical applications, components in the above modules may be added or deleted according to actual needs, and fig. 7 should not be construed as limiting the embodiments of the present disclosure.
Alternatively, the vehicle 600 or a sensing and computing device associated with the vehicle 600 (e.g., computing system 631, computing platform 650) may predict the behavior of the identified object based on the characteristics of the identified object and the state of the surrounding environment (e.g., traffic, rain, ice on a road, etc.). Alternatively, each identified object depends on each other's behavior, so all of the identified objects can also be considered together to predict the behavior of a single identified object. The vehicle 600 is able to adjust its speed based on the predicted behavior of the identified object. In other words, the autonomous car is able to determine what steady state the vehicle will need to adjust to (e.g., accelerate, decelerate, or stop) based on the predicted behavior of the object. In this process, other factors may also be considered to determine the speed of the vehicle 600, such as the lateral position of the vehicle 600 in the road on which it is traveling, the curvature of the road, the proximity of static and dynamic objects, and so forth.
In addition to providing instructions to adjust the speed of the autonomous vehicle, the computing device may also provide instructions to modify the steering angle of the vehicle 600 so that the autonomous vehicle follows a given trajectory and/or maintains safe lateral and longitudinal distances from objects in the vicinity of the autonomous vehicle (e.g., vehicles in adjacent lanes on a roadway).
The vehicle 600 may be a pure electric vehicle of different models, and the embodiment of the present disclosure is not particularly limited.
In another exemplary embodiment, a computer program product is also provided, the computer program product comprising a computer program executable by a programmable apparatus, the computer program having code portions for performing the above-described intersection driving assistance driving control method when executed by the programmable apparatus.
Example 5
Based on the same inventive concept, the present disclosure also provides a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the steps of the intersection driving assistance driving control method provided by the above embodiment.
The terms "first", "second", "third" and the like in the description, the claims and the drawings are used to distinguish between different objects, not to describe a particular sequential order. Furthermore, the terms "comprising", "including" and "having", and any variations thereof, are intended to cover a non-exclusive inclusion: for example, a process, method, article or apparatus comprising a series of steps or elements may include steps or elements not listed, or other steps or elements inherent to such process, method, article or apparatus.
Only some, but not all, of the details relating to the application are shown in the accompanying drawings. Before discussing the exemplary embodiments in more detail, it should be mentioned that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart depicts operations (or steps) as a sequential process, many of the operations can be performed in parallel, concurrently, or at the same time. Furthermore, the order of the operations may be rearranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figures. The processes may correspond to methods, functions, procedures, subroutines, and the like.
As used in this specification, the terms "component," "module," "system," "unit," and the like are intended to refer to a computer-related entity, either hardware, firmware, a combination of hardware and software, or software in execution. For example, a unit may be, but is not limited to being, a process running on a processor, an object, an executable, a thread of execution, a program, and/or being distributed between two or more computers. Furthermore, these units may be implemented from a variety of computer-readable media having various data structures stored thereon. The units may communicate by way of local and/or remote processes such as in accordance with a signal having one or more data packets (e.g., second unit data from another unit interacting with a local system, distributed system, and/or across a network).
In the description of the present invention, it should be understood that the terms "center", "longitudinal", "lateral", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", "counterclockwise", "axial", "radial", "circumferential", etc. indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings are merely for convenience in describing the present invention and simplifying the description, and do not indicate or imply that the device or element being referred to must have a specific orientation, be configured and operated in a specific orientation, and therefore should not be construed as limiting the invention.
In the description of the present specification, reference to the terms "one embodiment," "some embodiments," "illustrative embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples.
It will be apparent that the described embodiments are only some, but not all, embodiments of the application. Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application for the embodiment. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly understand that the embodiments described herein may be combined with other embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
While embodiments of the present application have been shown and described, it will be understood by those of ordinary skill in the art that: many changes, modifications, substitutions and variations may be made to the embodiments without departing from the spirit and principles of the application, the scope of which is defined by the claims and their equivalents.

Claims (10)

1. An intersection driving assisting driving control method is characterized by comprising the following steps:
Continuously acquiring current motion state information of a target vehicle and road state information of the target vehicle, wherein the road state information comprises lane number information of a road, position information of a lane of the target vehicle and distance information of an intersection from the target vehicle;
according to the current motion state information and the road state information of the target vehicle, establishing a scene coordinate system of the current position of the target vehicle, and determining a target vehicle coordinate parameter and a lane center line coordinate parameter in the scene coordinate system;
according to the target vehicle coordinate parameters and the lane center line coordinate parameters, determining the transverse offset distance from the near end of the lane center line to the longitudinal axis of the target vehicle coordinate system, the lane center line curve and the pre-aiming point coordinate parameters;
and determining the total steering wheel angle of the target vehicle by adopting a feedforward tracking algorithm and a feedback algorithm according to the transverse offset distance, the lane center line curve and the coordinate parameters of the pre-aiming point.
2. The method for controlling driving assistance at an intersection according to claim 1, wherein the step of establishing a scene coordinate system of a current position of the target vehicle according to the motion state information and the road state information of the target vehicle, and determining a target vehicle coordinate parameter and a lane center line coordinate parameter in the scene coordinate system, further comprises:
Judging whether the target vehicle enters a lane crossing or not according to the motion state information and the road state information of the target vehicle;
if yes, determining and generating a virtual lane line according to the road state information;
and determining the duration of the current virtual lane line and the preset effective length corresponding to the duration based on a preset mapping table according to the motion state information of the target vehicle.
3. The method for controlling driving assistance at an intersection according to claim 2, wherein determining the duration of the current virtual lane line and the preset effective length corresponding to the duration based on the preset mapping table according to the motion state information of the target vehicle comprises:
according to the duration of the current virtual lane line, if the target vehicle does not recognize a valid lane on the opposite side of the intersection within the duration, the auxiliary driving function exits;
if the target vehicle recognizes a valid opposite lane within the duration, acquiring the effective length of that lane and the preset effective length corresponding to the duration, and if the effective length is smaller than the preset effective length corresponding to the duration, the auxiliary driving function exits.
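The exit conditions of claims 2–3 can be sketched as a single decision function. This is a minimal illustration only; the function and parameter names are hypothetical, not taken from the patent:

```python
def should_exit_assist(opposite_lane_length_m, elapsed_s, duration_s,
                       preset_valid_length_m):
    """Decide whether the driving-assistance function must exit while the
    vehicle crosses the intersection on a virtual lane line.

    opposite_lane_length_m is None while no valid opposite lane has been
    recognized yet; duration_s and preset_valid_length_m would come from
    the preset mapping table keyed by the vehicle's motion state.
    """
    if opposite_lane_length_m is None:
        # No valid opposite lane recognized: exit once the virtual lane
        # line's duration has fully elapsed.
        return elapsed_s >= duration_s
    # A lane was recognized: exit if it is shorter than the preset
    # effective length required for a safe handover.
    return opposite_lane_length_m < preset_valid_length_m
```

With a 3 s virtual-lane duration and a 20 m preset length, the function keeps the feature active while the timer runs, then exits either on timeout or on a too-short opposite lane.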
4. The method for controlling driving assistance at an intersection according to claim 1, wherein determining the lateral offset distance from the near end of the lane center line to the longitudinal axis of the target vehicle coordinate system, the lane center line curve, and the pre-aiming point coordinate parameters according to the target vehicle coordinate parameters and the lane center line coordinate parameters comprises:
obtaining position parameter information of each lane center line and a target vehicle according to the target vehicle coordinate parameters and the lane center line coordinate parameters, wherein the position parameter information comprises a transverse offset distance from the near end of a lane line to the longitudinal axis of the vehicle coordinate system, an included angle between the lane line coordinate system and the vehicle coordinate system and a lane line curvature;
based on the lane center line calculation formula y = C0 + C1·x + C2·x² according to the position parameter information, determining a center line curve of each lane, wherein C0 is the transverse offset distance from the near end of the lane line to the longitudinal axis of the vehicle coordinate system, C1 is the included angle between the lane line coordinate system and the vehicle coordinate system, and C2 is the lane line curvature.
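The lane center line polynomial above evaluates directly in the vehicle coordinate system. A one-line Python sketch (names are illustrative, not from the patent):

```python
def lane_center_y(x_m, c0, c1, c2):
    """Lateral position of the lane center line at longitudinal distance
    x_m in the vehicle coordinate system: y = C0 + C1*x + C2*x**2,
    where C0 is the lateral offset, C1 the heading angle term and C2
    the curvature term."""
    return c0 + c1 * x_m + c2 * x_m ** 2
```

At x = 0 the formula reduces to C0, i.e. the lateral offset at the near end of the lane line, which is consistent with claim 4.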
5. The method for controlling driving assistance at an intersection according to claim 1, wherein determining the lateral offset distance from the near end of the lane center line to the longitudinal axis of the target vehicle coordinate system, the lane center line curve, and the pre-aiming point coordinate parameters according to the target vehicle coordinate parameters and the lane center line coordinate parameters further comprises:
determining a pre-aiming distance d_pre of the pre-aiming point according to the target vehicle coordinate parameters and the current motion state information, wherein the pre-aiming distance d_pre is obtained by the formula d_pre = k·V + d_base, where k is a gain factor, V is the target vehicle speed, and d_base is the base pre-aiming distance;
according to the pre-aiming distance d_pre and the lane center line calculation equation, obtaining the pre-aiming point transverse distance y_pre = C0 + C1·d_pre + C2·d_pre².
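The speed-dependent pre-aiming distance and the resulting lateral offset can be sketched as follows; the gain k and base distance d_base are placeholder values, not values from the patent:

```python
def preview_distance(v_mps, k=1.5, d_base=5.0):
    """Pre-aiming distance d_pre = k*V + d_base: the look-ahead point
    moves further out as speed increases."""
    return k * v_mps + d_base

def preview_lateral_offset(v_mps, c0, c1, c2, k=1.5, d_base=5.0):
    """Lateral distance of the pre-aiming point, obtained by evaluating
    the lane center line polynomial at x = d_pre."""
    d_pre = preview_distance(v_mps, k, d_base)
    return c0 + c1 * d_pre + c2 * d_pre ** 2
```

At 10 m/s the sketch yields a 20 m look-ahead; on a purely curved lane (C0 = C1 = 0) the lateral offset is then C2·d_pre².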
6. The method for controlling driving assistance at an intersection according to claim 1, wherein determining the total steering wheel angle of the target vehicle by using a feedforward tracking algorithm and a feedback algorithm according to the lateral offset distance, the lane centerline curve and the coordinate parameter of the pre-aiming point comprises:
obtaining the turning radius R and the front wheel steering angle δ_f of the target vehicle according to the target vehicle coordinate parameters and the pre-aiming point coordinate parameters, wherein the turning radius R is obtained by the formula R = l_d/(2·sin α), and the front wheel steering angle δ_f by the formula δ_f = arctan(2·L·sin α/l_d) = arctan(2·L·e_y/l_d²), where l_d is the forward view (pre-aiming) distance, α is the included angle between the direction of the pre-aiming point and the forward direction of the vehicle, L is the vehicle wheelbase, and e_y is the transverse distance between the vehicle position and the pre-aiming point;
according to the front wheel steering angle δ_f of the target vehicle, obtaining the theoretical steering wheel angle θ_sw of the target vehicle, wherein θ_sw = δ_f·i, where δ_f is the target vehicle front wheel steering angle and i is the steering gear ratio.
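These formulas are the standard pure-pursuit geometry (using sin α = e_y/l_d). A minimal sketch, assuming a bicycle model; the function names and the example wheelbase/gear-ratio values are illustrative:

```python
import math

def turning_radius(lookahead_m, lateral_offset_m):
    """R = l_d / (2*sin(alpha)), with sin(alpha) = e_y / l_d."""
    sin_alpha = lateral_offset_m / lookahead_m
    return lookahead_m / (2.0 * sin_alpha)

def front_wheel_angle(lookahead_m, lateral_offset_m, wheelbase_m):
    """delta_f = arctan(2*L*e_y / l_d**2), the pure-pursuit front
    wheel steering angle for a bicycle model."""
    return math.atan2(2.0 * wheelbase_m * lateral_offset_m,
                      lookahead_m ** 2)

def steering_wheel_angle(front_wheel_angle_rad, gear_ratio):
    """Theoretical steering wheel angle theta_sw = delta_f * i."""
    return front_wheel_angle_rad * gear_ratio
```

A zero lateral offset yields a zero front wheel angle (straight driving), while e_y = 1 m at a 10 m look-ahead gives R = 50 m.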
7. The method for controlling driving assistance at an intersection according to claim 1, wherein the determining the total steering wheel angle of the target vehicle by using a feedforward tracking algorithm and a feedback algorithm according to the lateral offset distance, the lane centerline curve and the coordinate parameter of the pre-aiming point further comprises:
eliminating the lateral deviation e_y and the heading deviation e_φ with the PID controller of the target vehicle to obtain the deviation steering wheel angles θ_1 and θ_2, wherein θ_1 = Kp·e_y + Ki·∫e_y dt + Kd·(de_y/dt) and θ_2 = Kp·e_φ + Ki·∫e_φ dt + Kd·(de_φ/dt), where Kp, Ki and Kd are the PID controller coefficients, e_y is the lateral deviation and e_φ is the heading deviation;
according to the deviation steering wheel angles θ_1 and θ_2, obtaining the total steering wheel angle θ_total of the target vehicle, wherein θ_total = θ_sw + A·θ_1 + B·θ_2, where A and B are proportionality coefficients and θ_sw is the theoretical steering wheel angle.
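The feedback step and the feedforward/feedback blend can be sketched with a discrete-time PID; the class name, gains and weights below are placeholders, not values from the patent:

```python
class Pid:
    """Discrete PID controller: one instance per deviation channel
    (lateral deviation e_y, heading deviation e_phi)."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def step(self, error, dt):
        # Accumulate the integral term and difference the derivative term.
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None \
            else (error - self.prev_error) / dt
        self.prev_error = error
        return (self.kp * error + self.ki * self.integral
                + self.kd * derivative)

def total_steering_wheel_angle(theta_theory, theta_lat, theta_heading, a, b):
    """theta_total = theta_sw + A*theta_1 + B*theta_2: feedforward
    pure-pursuit angle plus weighted PID corrections."""
    return theta_theory + a * theta_lat + b * theta_heading
```

The feedforward term tracks the lane geometry while the two PID corrections drive the residual lateral and heading deviations toward zero.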
8. An intersection driving support driving control device, comprising:
the parameter acquisition module is configured to continuously acquire current motion state information of a target vehicle and road state information of the target vehicle;
the scene construction module is configured to establish a scene coordinate system of the current position of the target vehicle according to the current motion state information and the road state information of the target vehicle obtained by the parameter acquisition module, and to determine the target vehicle coordinate parameters and the lane center line coordinate parameters in the scene coordinate system;
the first determining module is configured to determine the transverse offset distance from the near end of the lane center line to the longitudinal axis of the target vehicle coordinate system, the lane center line curve and the pre-aiming point coordinate parameters according to the target vehicle coordinate parameters and the lane center line coordinate parameters obtained by the scene construction module;
the second determining module is configured to determine the duration of the current virtual lane line and the preset effective length corresponding to the duration according to the target vehicle coordinate parameters and the lane center line coordinate parameters obtained by the scene construction module;
the data processing module is configured to determine the total steering wheel angle of the target vehicle by adopting a feedforward tracking algorithm and a feedback algorithm according to the transverse offset distance, the lane center line curve and the pre-aiming point coordinate parameters determined by the first determining module;
the judging module is configured to judge whether the auxiliary driving function of the target vehicle needs to exit according to the duration of the current virtual lane line and the preset effective length corresponding to the duration determined by the second determining module;
and the control module is configured to execute the intersection driving assistance control strategy according to the judgment result of the judging module and the total steering wheel angle of the target vehicle determined by the data processing module.
9. A vehicle, characterized by comprising:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to:
a step of realizing the intersection driving support driving control method according to any one of claims 1 to 7.
10. A computer-readable storage medium having stored thereon computer program instructions, which when executed by a processor, implement the steps of an intersection driving assistance driving control method according to any one of claims 1 to 7.
CN202311233849.9A 2023-09-23 2023-09-23 Intersection driving assisting driving control method and device, vehicle and readable storage medium Pending CN117207975A (en)


Publications (1)

Publication Number Publication Date
CN117207975A true CN117207975A (en) 2023-12-12

Family

ID=89036811



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination