CN114932947B - Vehicle steering control system and method - Google Patents


Info

Publication number
CN114932947B
Authority
CN
China
Prior art keywords
information
controller
processing result
vehicle
driving state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210605590.5A
Other languages
Chinese (zh)
Other versions
CN114932947A
Inventor
姜廷龙
高尚
常秀岩
侯慧贤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
FAW Group Corp
Original Assignee
FAW Group Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by FAW Group Corp
Priority to CN202210605590.5A
Publication of CN114932947A
Application granted
Publication of CN114932947B
Status: Active


Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B62 — LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62D — MOTOR VEHICLES; TRAILERS
    • B62D6/00 — Arrangements for automatically controlling steering depending on driving conditions sensed and responded to, e.g. control circuits
    • B62D6/001 — Arrangements for automatically controlling steering depending on driving conditions sensed and responded to, the torque NOT being among the input parameters
    • B62D6/007 — Arrangements for automatically controlling steering depending on driving conditions sensed and responded to, adjustable by the driver, e.g. sport mode

Landscapes

  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)
  • Steering Control In Accordance With Driving Conditions (AREA)

Abstract

Embodiments of the invention disclose a vehicle steering control system and method. The system comprises: a TBox controller configured to acquire first driving state information, first environment information, second driving state information, second environment information, and a first processing result; a perception fusion controller configured to perform perception fusion processing on the second driving state information, the second environment information, the first driving state information, and the first environment information to obtain a second processing result; and an autopilot function controller configured to check the first processing result against the second processing result to obtain a verification result and to execute a steering control operation according to the verification result. The safety of the lane into which the target vehicle intends to steer is judged from the second processing result, avoiding the collision risk that arises when other vehicles or obstacles in the target lane cannot be perceived; checking the first processing result against the second processing result further improves the safety and reliability of vehicle steering control.

Description

Vehicle steering control system and method
Technical Field
The embodiment of the invention relates to the technical field of automatic driving, in particular to a vehicle steering control system and method.
Background
With the development of autonomous-driving technology, a vehicle equipped with an autonomous-driving function can steer automatically while driving.
However, as traffic conditions grow more complex, other vehicles or obstacles may be present ahead of or behind the lane into which the vehicle intends to steer during automatic steering control. Because the vehicle cannot perceive such vehicles or obstacles, automatic steering carries a certain risk of collision.
How to improve the safety and reliability of automatic steering control is therefore a technical problem that currently needs to be solved.
Disclosure of Invention
Embodiments of the invention provide a vehicle steering control system and method that address the collision risk caused by a vehicle's inability to perceive other vehicles or obstacles in the target steering lane, thereby improving the safety and reliability of automatic steering control.
According to one aspect of the present invention, there is provided a vehicle steering control system comprising: a telematics box (TBox) controller, a perception fusion controller, and an autopilot function controller;
the TBox controller is configured to acquire first driving state information, first environment information, second driving state information, second environment information, and a first processing result; the first driving state information is the driving state information of a target vehicle; the first environment information is the environment information of the target vehicle and comprises first environmental vehicles and first obstacles around the target vehicle; the second driving state information is the driving state information of a first environmental vehicle; the second environment information is the environment information of the first environmental vehicle and comprises second environmental vehicles and second obstacles around the first environmental vehicle; and the first processing result is the result of a safety determination made on the lane into which the first environmental vehicle intends to steer;
the perception fusion controller is connected to the TBox controller and is configured to perform perception fusion processing on the second driving state information, the second environment information, the first driving state information, and the first environment information to obtain a second processing result, the second processing result being the result of a safety determination made on the lane into which the target vehicle intends to steer;
the autopilot function controller is connected to the perception fusion controller and to the TBox controller, and is configured to check the first processing result against the second processing result to obtain a verification result and to execute a steering control operation according to the verification result.
According to another aspect of the present invention, there is provided a vehicle steering control method applied to a vehicle steering control system comprising a TBox controller, a perception fusion controller, and an autopilot function controller; the method comprises the following steps:
the TBox controller acquires first driving state information, first environment information, second driving state information, second environment information, and a first processing result; the first driving state information is the driving state information of a target vehicle; the first environment information is the environment information of the target vehicle and comprises first environmental vehicles and first obstacles around the target vehicle; the second driving state information is the driving state information of a first environmental vehicle; the second environment information is the environment information of the first environmental vehicle and comprises second environmental vehicles and second obstacles around the first environmental vehicle; and the first processing result is the result of a safety determination made on the lane into which the first environmental vehicle intends to steer;
the perception fusion controller performs perception fusion processing on the second driving state information, the second environment information, the first driving state information, and the first environment information to obtain a second processing result, the second processing result being the result of a safety determination made on the lane into which the target vehicle intends to steer;
and the autopilot function controller checks the first processing result against the second processing result to obtain a verification result, and executes a steering control operation according to the verification result.
In the technical scheme of the embodiments, the TBox controller acquires the first driving state information, first environment information, second driving state information, second environment information, and first processing result; the perception fusion controller, connected to the TBox controller, performs perception fusion processing on these inputs to obtain the second processing result; and the autopilot function controller, connected to both the perception fusion controller and the TBox controller, checks the first processing result against the second processing result to obtain a verification result and executes a steering control operation accordingly. Because the safety determination for the target vehicle's target lane (the second processing result) draws on both the target vehicle's own environment information and the environment information of the vehicles around it, the collision risk caused by a limited detection range, in which other vehicles or obstacles in the target lane go unperceived, can be avoided. Checking the first processing result of the first environmental vehicle against the second processing result of the target vehicle further verifies whether the target vehicle's safety determination for its target lane is reliable, improving the safety and reliability of automatic steering control.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the invention or to delineate the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
To illustrate the technical solutions of the embodiments more clearly, the drawings required for describing the embodiments are briefly introduced below. The drawings described below are obviously only some embodiments of the present invention; other drawings may be derived from them by a person skilled in the art without inventive effort.
Fig. 1 is a schematic structural diagram of a vehicle steering control system according to a first embodiment of the present invention;
Fig. 2 is a flowchart of a vehicle steering control method according to a second embodiment of the present invention;
Fig. 3 is a schematic structural diagram of a steering control system according to the second embodiment of the present invention;
Fig. 4 is a schematic diagram of the interaction between a TBox and a cloud server according to the second embodiment of the present invention;
Fig. 5 is a schematic implementation diagram of an automatic steering control method according to the second embodiment of the present invention;
Fig. 6 is a schematic implementation diagram of another automatic steering control method according to the second embodiment of the present invention;
Fig. 7 is a schematic implementation diagram of another automatic steering control method according to the second embodiment of the present invention;
Fig. 8 is a schematic implementation diagram of another automatic steering control method according to the second embodiment of the present invention;
Fig. 9 is a schematic implementation diagram of another automatic steering control method according to the second embodiment of the present invention.
Detailed Description
So that those skilled in the art may better understand the present invention, the technical solutions in the embodiments are described below clearly and completely with reference to the accompanying drawings. The described embodiments are obviously only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on these embodiments without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
With the development of automotive technology, domestic traffic conditions have become increasingly complex, especially when vehicles or pedestrians suddenly appear from a blind spot while driving. Most current mainstream vehicle models are not equipped with an emergency automatic steering function or an autonomous automatic steering function. On the limited set of models that are so equipped, the emergency automatic steering function gathers environment information about the lane to be entered through radar detection and then decides whether automatic steering should intervene; if a vehicle or obstacle is present ahead or behind but outside the radar's view, the risk cannot be perceived. In the autonomous-driving case, after the environment information is confirmed, the system starts automatic steering and lane changing when the driver toggles the control stalk.
In a conventional vehicle under a human-driving (i.e., manual driving) scenario, if the driver fails to notice an obstacle ahead, the vehicle must judge and actively intervene based on information identified by radar, performing emergency automatic steering to help the driver avoid an accident. During emergency automatic steering the vehicle may enter an adjacent lane; because radar detection leaves many visual and sensor blind spots, vehicles in those blind spots cannot be identified, and the vehicle can easily collide with a fast vehicle approaching from behind or with a vehicle in a blind spot, causing an accident. In an autonomous-driving scenario, the vehicle makes decisions based on radar-detected information and controls the steering system to enter the adjacent lane; the same blind-spot problem exists, and the vehicle is likewise prone to colliding with a fast vehicle approaching from behind or a vehicle in a blind spot during automatic steering.
Embodiments of the invention provide an automatic steering control strategy and method that, building on the environment-sensing capability of the autonomous-driving radar, add vehicle-position identification and the exchange of position information with surrounding vehicles. This enriches the environment-sensing information, helps the emergency automatic steering or autonomous automatic steering function detect vehicles in blind spots while deciding whether to steer, enables more reasonable decisions, reduces the probability of accidents, and improves the safety of both functions.
Example 1
Fig. 1 is a schematic structural diagram of a vehicle steering control system according to an embodiment of the present invention. This embodiment is applicable to vehicle steering control; the system may be implemented in software and/or hardware and may perform the method described in the embodiments of the invention. As shown in fig. 1, the vehicle steering control system includes: a TBox controller 110, a perception fusion controller 120, and an autopilot function controller 130;
the TBox controller 110 is configured to acquire first driving state information, first environment information, second driving state information, second environment information, and a first processing result; the first driving state information is the driving state information of a target vehicle; the first environment information is the environment information of the target vehicle and comprises first environmental vehicles and first obstacles around the target vehicle; the second driving state information is the driving state information of a first environmental vehicle; the second environment information is the environment information of the first environmental vehicle and comprises second environmental vehicles and second obstacles around the first environmental vehicle; and the first processing result is the result of a safety determination made on the lane into which the first environmental vehicle intends to steer;
the perception fusion controller 120 is connected to the TBox controller 110 and is configured to perform perception fusion processing on the second driving state information, the second environment information, the first driving state information, and the first environment information to obtain a second processing result, the second processing result being the result of a safety determination made on the lane into which the target vehicle intends to steer;
the autopilot function controller 130 is connected to the perception fusion controller 120 and to the TBox controller 110, and is configured to check the first processing result against the second processing result to obtain a verification result and to execute a steering control operation according to the verification result.
In the present embodiment, the TBox controller 110, the perception fusion controller 120, and the autopilot function controller 130 may each be an existing on-board controller configured in the vehicle to implement the corresponding function. The TBox controller 110 may be configured to acquire the first driving state information, first environment information, second driving state information, second environment information, first processing result, and the like; the data acquired by the TBox controller 110 is not particularly limited here.
The target vehicle may be understood as the vehicle currently being driven. The first driving state information is the driving state information of the target vehicle. Driving state information characterizes the driving state of a vehicle and may include, for example, vehicle position information and vehicle speed information; on this basis, the first driving state information can be understood as the target vehicle's current position and speed. The first environment information is the environment information of the target vehicle. Environment information characterizes the state of the environment around a vehicle and may include, for example, other traveling vehicles and obstacles around the vehicle (obstacles may be static or dynamic, such as pedestrians and bicycles). Accordingly, the first environment information may include a first environmental vehicle, i.e., one or more other traveling vehicles around the target vehicle, and a first obstacle, i.e., an obstacle around the target vehicle. It is understood that the first environment information may also describe the case in which there are no other traveling vehicles or obstacles around the target vehicle.
The second driving state information may be the driving state information of the first environmental vehicle and may accordingly include, for example, the first environmental vehicle's current position and speed. The second environment information may be the environment information of the first environmental vehicle and may accordingly include a second environmental vehicle and a second obstacle, i.e., the other traveling vehicles and obstacles around the first environmental vehicle. It is understood that the second environment information may also describe the case in which there are no other traveling vehicles or obstacles around the first environmental vehicle.
The first processing result may be understood as the result of a safety determination made on the lane into which the first environmental vehicle intends to steer. The lane to be steered into may be understood as the adjacent lane the first environmental vehicle is about to enter, such as the lane to the left or right of its current driving lane. The safety determination may be understood as determining, for example, whether fast-moving vehicles are present ahead of or behind the target lane and whether any factor would affect the driving safety of the first environmental vehicle when steering into that lane; it is not particularly limited here.
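The information categories defined above can be pictured as plain data structures. The following is a minimal sketch in Python; all type and field names are illustrative assumptions, since the patent does not define concrete data formats:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class DrivingState:
    """Driving state information: position and speed of one vehicle."""
    vehicle_id: str
    position: Tuple[float, float]  # (x, y) in a shared map frame, metres
    speed: float                   # metres per second

@dataclass
class EnvironmentInfo:
    """Environment information: vehicles and obstacles around one vehicle."""
    nearby_vehicles: List[str] = field(default_factory=list)            # ids of environmental vehicles
    obstacles: List[Tuple[float, float]] = field(default_factory=list)  # (x, y) of static/dynamic obstacles

@dataclass
class ProcessingResult:
    """Result of a safety determination for a vehicle's target steering lane."""
    vehicle_id: str
    target_lane: str          # e.g. "left" or "right" adjacent lane
    lane_is_safe: bool
    min_gap_m: Optional[float] = None  # smallest observed gap to another vehicle
```

On this reading, the TBox controller would ship one `DrivingState` and one `EnvironmentInfo` per vehicle, plus the `ProcessingResult` computed by each environmental vehicle.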
This embodiment does not particularly limit how the TBox controller 110 obtains the first driving state information, first environment information, second driving state information, second environment information, and first processing result. For example, the TBox controller 110 may derive the target vehicle's first driving state information (such as its position and speed) from high-precision map information, obtain the first environment information through a communication connection with the target vehicle's environment-information collection module, and obtain the first environmental vehicle's related information (the second driving state information, second environment information, and first processing result) through a communication connection with a cloud data collection module.
The perception fusion controller 120 may be connected to the TBox controller 110; the connection is not limited and may be, for example, a communication connection over a Controller Area Network (CAN) or a CAN with Flexible Data-Rate (CAN FD) network. The perception fusion controller 120 may be understood as a controller that performs perception fusion processing on received information. On this basis, the TBox controller 110 sends the acquired second driving state information, second environment information, first driving state information, and first environment information to the perception fusion controller 120, which performs perception fusion processing on them to obtain the second processing result, i.e., the result of a safety determination made on the lane into which the target vehicle intends to steer.
This embodiment does not particularly limit how the perception fusion controller 120 performs the perception fusion processing. For example, the perception fusion controller 120 may first determine from the first environment information whether other traveling vehicles (i.e., first environmental vehicles) and obstacles (i.e., first obstacles) exist around the target vehicle. If other vehicles travel ahead of and/or behind the target vehicle, their occlusion may create a detection blind spot for the target vehicle; in that case, the environment information of those vehicles (i.e., the second environment information) can be used to determine whether vehicles or obstacles are present in the blind spot. From the first and second driving state information, the controller can further determine the safe following distance and safe speed between the target vehicle and the other vehicles and whether a collision risk exists. On this basis, it can be determined whether the target vehicle can safely steer into its target lane without collision risk (i.e., the safety determination). The presence of other vehicles and obstacles around the target vehicle, the presence of detection blind spots, the presence of vehicles or obstacles within those blind spots, the safe distance to other vehicles, and whether the target vehicle faces a collision risk can all be regarded as parts of the second processing result.
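The blind-spot check, safe-gap check, and collision-risk check described above can be sketched as a single predicate. This is only an illustrative reading of the fusion step: the gap and time-to-collision thresholds, and the rule that any blind-spot report vetoes the lane, are assumptions, not the patent's actual algorithm:

```python
def assess_target_lane(gap_ahead, gap_behind, blind_spot_objects,
                       rear_closing_speed=0.0, min_gap=30.0, min_ttc=3.0):
    """Judge whether the target steering lane is safe to enter.

    gap_ahead / gap_behind: distances (m) to the nearest vehicles in the target
    lane as seen by the ego sensors; blind_spot_objects: number of vehicles or
    obstacles that surrounding vehicles report inside the ego blind spots;
    rear_closing_speed: speed (m/s) at which a rear vehicle is closing in.
    Thresholds are illustrative, not taken from the patent.
    """
    # Any object reported inside a blind spot immediately makes the lane unsafe.
    if blind_spot_objects > 0:
        return False
    # Both gaps must exceed a safe following distance.
    if gap_ahead < min_gap or gap_behind < min_gap:
        return False
    # Fast car closing from behind: check time-to-collision on the rear gap.
    if rear_closing_speed > 0 and gap_behind / rear_closing_speed < min_ttc:
        return False
    return True
```

The key point the patent stresses is the first branch: objects reported by surrounding vehicles cover exactly the region the ego radar cannot see.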
Correspondingly, the first processing result may be regarded as the result obtained when the first environmental vehicle's own perception fusion controller performs perception fusion processing on the data sent by the first environmental vehicle's TBox controller: its own driving state information (i.e., the second driving state information), its own environment information (i.e., the second environment information), and the driving state and environment information of the second environmental vehicles (i.e., the other traveling vehicles around it).
The autopilot function controller 130 may be a controller for controlling the vehicle's autonomous-driving functions. It is connected to the perception fusion controller 120 and the TBox controller 110, for example over a CAN or CAN FD network. The perception fusion controller 120 sends the second processing result to the autopilot function controller 130, and the TBox controller 110 sends the acquired first processing result to it; on this basis, the autopilot function controller 130 checks the received first processing result against the second processing result to obtain a verification result and executes a steering control operation according to that result. The verification result is the outcome of checking the two processing results against each other. The steering control operation is an operation controlling the vehicle's automatic steering; it may include, for example, generating steering request information so that the corresponding controller performs the steering operation, or, when steering is not possible, generating braking or driving request information so that the corresponding controller brakes or continues driving.
The first processing result is the safety determination for the first environmental vehicle's target lane, obtained from the perspective of the first environmental vehicle as the currently driven vehicle; correspondingly, the second processing result is the safety determination for the target vehicle's target lane, obtained from the target vehicle's perspective. The first processing result may therefore be converted to the target vehicle's perspective before the two results are checked against each other. How the conversion is performed is not particularly limited here; for example, the positions and speeds of the target vehicle and the first environmental vehicle may be determined from the first and second driving state information, and the first processing result, originally expressed in terms of the first environmental vehicle's position and speed, may be re-expressed in terms of the target vehicle's position and speed, yielding a converted first processing result.
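The conversion into the target vehicle's perspective can be sketched as follows, assuming for illustration that both vehicles report positions in one shared map frame and speeds along the lane direction; the patent does not fix a coordinate convention, and the function name is invented:

```python
def to_target_frame(obs_position, obs_speed, target_position, target_speed):
    """Re-express an observation made by an environmental vehicle in terms of the
    target vehicle: relative position and relative speed.

    Positions are (x, y) in a shared map frame; speeds are signed scalars along
    the lane direction.
    """
    rel_x = obs_position[0] - target_position[0]
    rel_y = obs_position[1] - target_position[1]
    rel_speed = obs_speed - target_speed  # > 0: object pulling away; < 0: closing
    return (rel_x, rel_y), rel_speed

# An environmental vehicle reports a blind-spot object at (120.0, 3.5) doing 25 m/s;
# the target vehicle is at (100.0, 0.0) doing 20 m/s.
rel_pos, rel_speed = to_target_frame((120.0, 3.5), 25.0, (100.0, 0.0), 20.0)
# rel_pos is (20.0, 3.5): 20 m ahead and one lane over; rel_speed is 5.0 m/s, pulling away.
```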
On this basis, the second processing result and the converted first processing result can both be regarded as processing results referenced to the vehicle position information and vehicle speed information of the target vehicle; a corresponding verification result can then be obtained by checking whether the second processing result and the converted first processing result are consistent, and the corresponding steering control operation is executed according to the verification result. For example, if the verification result is that the second processing result is inconsistent with the converted first processing result, this may indicate that the safety judgment of the lane to be steered of the target vehicle is possibly unreliable and that steering should not be performed, so as to avoid collision risk; in this case, the autopilot function controller 130 may generate corresponding braking or driving request information based on the second processing result to control the target vehicle to brake or to continue driving in the current lane, and so on. Correspondingly, if the verification result is that the second processing result is consistent with the converted first processing result, this may indicate that the lane to be steered of the target vehicle is judged to be safe, and the corresponding steering operation may be performed based on the second processing result; in this case, the autopilot function controller 130 may generate corresponding steering request information to control the target vehicle to steer automatically from the current driving lane into the corresponding lane to be steered.
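The conversion-and-consistency check described above can be sketched as follows. This is a minimal illustrative Python sketch, not the patent's implementation: the patent does not define concrete data structures, so the `DrivingState` class, the dictionary fields, and the frame-shift conversion are all assumptions made here for clarity.

```python
from dataclasses import dataclass


@dataclass
class DrivingState:
    """Assumed driving state: position (x, y in metres) and speed (m/s)."""
    x: float
    y: float
    speed: float


def convert_to_target_frame(first_result: dict, env_state: DrivingState,
                            target_state: DrivingState) -> dict:
    """Re-express the environmental vehicle's lane-safety result relative to
    the target vehicle by shifting obstacle positions into the target's frame."""
    dx, dy = env_state.x - target_state.x, env_state.y - target_state.y
    converted = dict(first_result)
    # Obstacle positions reported by the environmental vehicle are offset by
    # that vehicle's displacement from the target vehicle.
    converted["obstacle_positions"] = [
        (ox + dx, oy + dy) for ox, oy in first_result["obstacle_positions"]
    ]
    return converted


def verify(second_result: dict, converted_first: dict) -> bool:
    """Verification result: True when both processing results agree on safety."""
    return second_result["lane_safe"] == converted_first["lane_safe"]
```

A richer implementation might compare obstacle lists within a tolerance rather than a single boolean, but the single-flag comparison matches the consistent/inconsistent verification result the patent describes.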
The first embodiment provides a vehicle steering control system in which the TBox controller 110 is configured to obtain the first driving state information, the first environment information, the second driving state information, the second environment information, and the first processing result; the perception fusion controller 120, connected to the TBox controller 110, is configured to perform perception fusion processing according to the second driving state information, the second environment information, the first driving state information, and the first environment information to obtain a second processing result; and the autopilot function controller 130, connected to the perception fusion controller 120 and the TBox controller 110, respectively, is configured to verify the first processing result and the second processing result to obtain a verification result and to execute a steering control operation according to the verification result. The system judges the safety of the lane to be steered of the target vehicle through the second processing result, using both the environment information of the target vehicle and the environment information of the vehicles surrounding it, and can thus avoid the collision risk that arises when, because of a limited detection area, other vehicles or obstacles in the lane to be steered go undetected. Moreover, by verifying the first processing result of the first environmental vehicle against the second processing result of the target vehicle, it can further be judged whether the target vehicle's safety judgment of the lane to be steered is reliable, improving the safety and reliability of automatic steering control of the vehicle.
Optionally, the system further comprises: the cloud server;
the cloud server is connected with the TBox controller 110, and is configured to acquire and store the first driving state information and the first environment information, determine the second driving state information and the second environment information according to the first environment information, and send the second driving state information and the second environment information to the TBox controller 110;
the cloud server is further configured to obtain a first processing result, and send the first processing result to the TBox controller 110.
The cloud server may establish a network connection (such as a wireless network connection, which is not limited herein) with each vehicle (for example, with the TBox controller of each vehicle) to acquire the relevant information of each vehicle, thereby enabling information transfer and sharing among the vehicles. The information that the cloud server acquires from each vehicle is not particularly limited; the required information may be acquired according to the actual situation.
The cloud server is connected to the TBox controller 110 of the target vehicle, and may be configured to acquire and store the first driving state information and the first environment information, and to determine the second driving state information and the second environment information according to the first environment information, so as to send them to the TBox controller 110. How the second driving state information and the second environment information are determined from the first environment information is not particularly limited here; for example, the cloud server may determine the first environmental vehicle around the target vehicle from the first environment information and, since the cloud server is also connected to the first environmental vehicle through the network, obtain that vehicle's driving state information and environment information (i.e., the second driving state information and the second environment information).
The cloud server may be further configured to obtain the first processing result and send it to the TBox controller 110. Correspondingly to the target vehicle, the TBox controller of the first environmental vehicle may obtain the first processing result from the perception fusion controller of the first environmental vehicle (how that perception fusion controller obtains the first processing result is not specifically limited here; for example, it may do so in the same manner as the perception fusion controller 120 of the target vehicle determines the second processing result in the above embodiment). On this basis, the cloud server can acquire the first processing result of the first environmental vehicle through that vehicle's TBox controller.
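The cloud server's role as an information relay can be illustrated with a minimal in-memory stand-in. This is purely a sketch of the data flow described above; the class name, record layout, and radius-based neighbour query are assumptions introduced here, not anything specified by the patent (which leaves the cloud server's internals open).

```python
class CloudServer:
    """Illustrative relay: each vehicle uploads its state, environment info,
    and (optionally) processing result; vehicles within `radius` metres of a
    querying vehicle are treated as its environmental vehicles."""

    def __init__(self, radius: float = 50.0):
        self.radius = radius
        self.records = {}  # vehicle_id -> {"state": ..., "env": ..., "result": ...}

    def upload(self, vid: str, state: dict, env: dict, result=None) -> None:
        """Store a vehicle's driving state, environment info, and result."""
        self.records[vid] = {"state": state, "env": env, "result": result}

    def neighbours(self, vid: str) -> dict:
        """Return the records of all other vehicles within `radius` of `vid`,
        i.e. the shared second driving state / environment information."""
        sx, sy = self.records[vid]["state"]["x"], self.records[vid]["state"]["y"]
        out = {}
        for other, rec in self.records.items():
            if other == vid:
                continue
            ox, oy = rec["state"]["x"], rec["state"]["y"]
            if ((ox - sx) ** 2 + (oy - sy) ** 2) ** 0.5 <= self.radius:
                out[other] = rec
        return out
```

In the patent's terms, `upload` corresponds to a TBox sending first driving state and environment information, and `neighbours` to the server returning the second driving state information, second environment information, and first processing results of nearby environmental vehicles.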
Optionally, the system further comprises: a steering controller, a braking controller, and a driving controller;
the steering controller is connected with the autopilot function controller 130 and is used for receiving steering request information of the autopilot function controller 130;
the brake controller is connected with the autopilot function controller 130 and is used for receiving brake request information of the autopilot function controller 130;
the driving controller is connected to the autopilot function controller 130, and is configured to receive driving request information of the autopilot function controller 130.
The system may further comprise a steering controller, a braking controller, and a driving controller, which may be disposed on the target vehicle. The steering controller may be understood as a controller for steering control of the target vehicle, the brake controller as a controller for brake control of the target vehicle, and the drive controller as a controller for driving control of the target vehicle.
The steering controller is connected to the autopilot function controller 130 and may be used to receive steering request information from the autopilot function controller 130. Steering request information may be understood as information characterizing a request for the target vehicle to steer. After receiving the steering request information, the steering controller may control the target vehicle to steer accordingly; how this is done is not particularly limited herein.
The brake controller is connected to the autopilot function controller 130 and may be configured to receive brake request information from the autopilot function controller 130. Brake request information may be understood as information characterizing a request for the target vehicle to brake. After receiving the brake request information, the brake controller may control the target vehicle to brake accordingly; how this is done is not particularly limited herein.
The driving controller is connected to the autopilot function controller 130 and may be used to receive driving request information from the autopilot function controller 130. Driving request information may be understood as information characterizing a request for the target vehicle to drive. After receiving the driving request information, the driving controller may control the target vehicle to drive accordingly; how this is done is not particularly limited herein.
Optionally, the system further comprises: an environmental detector;
the environment detector is connected to the sensing fusion controller 120, and is configured to detect the first environment information and send the first environment information to the sensing fusion controller 120.
The environment detector may be understood as a device for detecting environmental information around the target vehicle. The environment detector may include a millimeter wave radar and/or a camera, etc., which are not limited herein, and may be disposed on the target vehicle.
The environmental detector is connected to the sensing fusion controller 120 and can be used to detect the first environmental information of the target vehicle and send the first environmental information to the sensing fusion controller 120.
Optionally, the environment detector comprises a millimeter wave radar and/or a camera.
Optionally, performing a sensing fusion process according to the second driving state information, the second environment information, the first driving state information and the first environment information to obtain a second processing result, where the second processing result includes: judging whether a target vehicle has a detection blind area or not according to the first environmental information; if the target vehicle does not have a detection blind area, judging whether a first obstacle exists in the drivable area according to the first environment information to obtain a first judgment result, and determining a second processing result based on the first judgment result and the first driving state information; if the target vehicle has a detection blind area, judging whether a second environment vehicle and/or a second obstacle exists in the detection blind area according to the second environment information to obtain a second judgment result, and determining a second processing result based on the second judgment result, the first driving state information and the second driving state information.
The process of performing the perception fusion processing according to the second driving state information, the second environment information, the first driving state information, and the first environment information to obtain the second processing result may be as follows. First, whether the target vehicle has a detection blind area is judged according to the first environment information; a detection blind area may be understood as an area in which the environment detector of the target vehicle cannot detect the corresponding environment information because of occlusion by other running vehicles in front of and/or behind the target vehicle.
If the target vehicle has no detection blind area, it can be concluded that no other running vehicles exist within a certain range in front of and behind the target vehicle, and whether a first obstacle exists in the drivable area of the target vehicle can be determined according to the first environment information, obtaining a first judgment result. A drivable area may be understood as the road area within which the target vehicle can travel, such as the current lane and adjacent lanes of the target vehicle; the first judgment result may be understood as the judgment of whether a first obstacle exists in the drivable area of the target vehicle. On this basis, the second processing result is determined based on the first judgment result and the first driving state information. If the first judgment result is that a first obstacle exists, it may be determined, from the position information of the first obstacle (e.g., in front of the target vehicle, to the front right of the target vehicle, etc.) and its speed information (the corresponding moving speed if it is a dynamic obstacle, or zero if it is a static obstacle), together with the vehicle position information and vehicle speed information of the target vehicle, whether the distance between the target vehicle and the first obstacle is within the set safety range, whether the target vehicle needs to steer, and so on, thereby obtaining the corresponding second processing result. If the first judgment result is that no first obstacle exists, the second processing result may be that no first obstacle is present and the vehicle may continue traveling along the predetermined route.
If the target vehicle has a detection blind area, this indicates that other running vehicles exist in front of and/or behind the target vehicle; whether a second environmental vehicle and/or a second obstacle exists in the detection blind area can then be judged according to the second environment information, obtaining a second judgment result. The second judgment result may be understood as the judgment of whether a second environmental vehicle and/or a second obstacle exists in the detection blind area. On this basis, the second processing result may be determined from the second judgment result, the first driving state information, and the second driving state information. If the second judgment result is that a second environmental vehicle and/or a second obstacle exists in the detection blind area, it may be determined, from the position and speed information of the second obstacle, the vehicle position and speed information of the second environmental vehicle, and the vehicle position and speed information of the target vehicle, whether the distances between the target vehicle and the second obstacle and second environmental vehicle are within the set safety range, whether the lane to be steered of the target vehicle can be entered safely, and whether a collision risk exists, thereby obtaining the corresponding second processing result. If the second judgment result is that neither a second environmental vehicle nor a second obstacle exists in the detection blind area, the second processing result may be that no other running vehicle or obstacle is present in the lane to be steered and the target vehicle can safely enter it.
Optionally, verifying the first processing result and the second processing result to obtain a verification result includes: acquiring first driving state information and second driving state information, and converting the first processing result into a first processing result of the target vehicle angle according to the first driving state information and the second driving state information; and checking whether the second processing result is consistent with the converted first processing result to obtain a corresponding checking result.
The autopilot function controller 130 is operable to verify the first and second processing results. Specifically, the first driving state information and the second driving state information may first be obtained from the TBox controller, and the vehicle position information and vehicle speed information of the target vehicle and of the first environmental vehicle determined from them. The first processing result of the first environmental vehicle is then converted, according to the first driving state information and the second driving state information, into a first processing result from the target vehicle's perspective; the converted first processing result and the second processing result can both be regarded as safety judgments of the target vehicle's lane to be steered made from the target vehicle's perspective. Finally, whether the second processing result is consistent with the converted first processing result is checked, obtaining the corresponding verification result. The verification result may be either consistent or inconsistent.
Optionally, executing the steering control operation according to the verification result includes: if the verification results are consistent, determining steering request information based on the second processing results so as to perform steering control; if the verification results are inconsistent, the steering control operation is not executed, and the braking request information and/or the driving request information are determined based on the second processing result.
If the verification result is consistent, this may indicate that the lane to be steered of the target vehicle is judged to be safe; the corresponding steering request information may then be determined based on the second processing result (or the converted first processing result, which is not limited herein) and sent to the steering controller to perform steering control. If the verification result is inconsistent, this may indicate that the information acquired by the target vehicle or the first environmental vehicle is inaccurate, for example because of network delay, so that the corresponding processing result is inaccurate and the lane to be steered may be dangerous. The steering control operation is therefore not performed; instead, the corresponding braking request information and/or driving request information may be determined based on the second processing result of the target vehicle and sent to the braking controller and/or driving controller, so as to control the target vehicle to brake or continue driving in the current lane, and so on.
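The decision branch just described maps cleanly to a small dispatch function. Again this is a hedged sketch: the request-dictionary shapes and field names (`lane_to_steer`, `brake_needed`, `decel`) are assumed here for illustration; the patent only specifies which controller receives which request.

```python
def decide_control(verification_consistent: bool, second_result: dict) -> dict:
    """Map the verification result to a control request for the steering,
    braking, or driving controller, following the branch described above."""
    if verification_consistent:
        # Consistent: the lane to be steered is judged safe; issue a
        # steering request based on the second processing result.
        return {"type": "steer", "target_lane": second_result["lane_to_steer"]}
    # Inconsistent: suppress steering and fall back to braking or
    # continuing in the current lane, per the second processing result.
    if second_result.get("brake_needed"):
        return {"type": "brake", "decel": second_result.get("decel", 3.0)}
    return {"type": "drive", "keep_lane": True}
```

The key safety property is that a steering request can only be emitted when both independently derived processing results agree; any disagreement degrades to the more conservative brake/keep-lane behavior.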
Example two
Fig. 2 is a flowchart of a vehicle steering control method according to a second embodiment of the present invention, where the method may be applied to a situation where steering control is performed on a vehicle, and the method may be performed by a vehicle steering control system according to an embodiment of the present invention, where the system may be implemented by software and/or hardware. Referring to fig. 1, a vehicle steering control system includes: TBox controller 110, awareness fusion controller 120, and autopilot function controller 130. It should be noted that technical details not described in detail in this embodiment may be found in any of the above embodiments.
As shown in fig. 2, the method comprises the steps of:
s210, the TBox controller acquires first driving state information, first environment information, second driving state information, second environment information and a first processing result.
In the present embodiment, the first driving state information is driving state information of the target vehicle, and the first driving state information may include vehicle position information and vehicle speed information of the target vehicle. The first environmental information is environmental information of the target vehicle, and the first environmental information may include a first environmental vehicle surrounding the target vehicle and a first obstacle. The second driving state information is driving state information of the first environmental vehicle, and the second driving state information may include vehicle position information and vehicle speed information of the first environmental vehicle. The second environmental information is environmental information of the first environmental vehicle, and the second environmental information may include a second environmental vehicle surrounding the first environmental vehicle and a second obstacle. The first processing result is a processing result of performing safety judgment on a lane to be steered of the first environmental vehicle.
The TBox controller is connected with the cloud server through a network, and can acquire first driving state information, first environment information, second driving state information, second environment information and a first processing result from the cloud server.
And S220, performing a sensing fusion process by the sensing fusion controller according to the second driving state information, the second environment information, the first driving state information and the first environment information to obtain a second processing result.
In the present embodiment, the second processing result may be a processing result of making a security determination on the lane to be steered of the target vehicle.
The sensing fusion controller is connected with the TBox controller, and can acquire required first driving state information, first environment information, second driving state information, second environment information and a first processing result from the TBox controller. On the basis, the perception fusion controller can carry out perception fusion processing according to the first driving state information, the first environment information, the second driving state information, the second environment information and the first processing result to obtain a corresponding second processing result.
Specifically, judging whether a target vehicle has a detection blind area or not according to the first environmental information; if the target vehicle does not have a detection blind area, judging whether a first obstacle exists in the drivable area according to the first environment information to obtain a first judgment result, and determining a second processing result based on the first judgment result and the first driving state information; if the target vehicle has a detection blind area, judging whether a second environment vehicle and/or a second obstacle exists in the detection blind area according to the second environment information to obtain a second judgment result, and determining a second processing result based on the second judgment result, the first driving state information and the second driving state information.
And S230, the automatic driving function controller verifies the first processing result and the second processing result to obtain a verification result, and executes steering control operation according to the verification result.
In this embodiment, the autopilot function controller may be connected to the TBox controller to obtain the first processing result from the TBox controller. The autopilot function controller may be coupled to the sensory fusion controller to obtain a second processing result from the sensory fusion controller. On the basis, the automatic driving function controller can verify the first processing result and the second processing result to obtain corresponding verification results, and execute corresponding steering control operation according to the verification results.
Specifically, the automatic driving function controller acquires first driving state information and second driving state information, and converts a first processing result into a first processing result of the target vehicle angle according to the first driving state information and the second driving state information; and checking whether the second processing result is consistent with the converted first processing result to obtain a corresponding checking result.
If the verification results are consistent, determining steering request information based on the second processing results so as to perform steering control; and if the verification results are inconsistent, not executing steering control operation, and determining brake request information and/or drive request information based on the second processing result.
The steering controller is connected with the automatic driving function controller for receiving steering request information of the automatic driving function controller and executing corresponding steering control operation. The brake controller is connected with the automatic driving function controller to receive the brake request information of the automatic driving function controller and execute corresponding brake control operation. The driving controller is connected with the automatic driving function controller for receiving driving request information of the automatic driving function controller and executing corresponding driving control operation.
The second embodiment provides a vehicle steering control method. First, the TBox controller obtains the first driving state information, the first environment information, the second driving state information, the second environment information, and the first processing result; the perception fusion controller then performs perception fusion processing according to the second driving state information, the second environment information, the first driving state information, and the first environment information to obtain a second processing result; finally, the autopilot function controller verifies the first processing result and the second processing result to obtain a verification result, and executes a steering control operation according to the verification result. The method judges the safety of the lane to be steered of the target vehicle through the second processing result, using both the environment information of the target vehicle and the environment information of the vehicles surrounding it, and can thus avoid the collision risk that arises when, because of a limited detection area, other vehicles or obstacles in the lane to be steered go undetected. Moreover, by verifying the first processing result of the first environmental vehicle against the second processing result of the target vehicle, it can further be judged whether the target vehicle's safety judgment of the lane to be steered is reliable, improving the safety and reliability of automatic steering control of the vehicle.
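The three steps S210–S230 can be viewed as one control cycle in which the three controllers hand results to one another. The sketch below represents each controller as an injected callable; this decomposition and the dictionary keys are illustrative assumptions, not the patent's interfaces.

```python
def vehicle_steering_cycle(tbox_fetch, perceive_fuse, verify_and_act):
    """One cycle of the method: S210 (TBox acquisition), S220 (perception
    fusion), S230 (verification and steering control). Each stage is an
    injected callable standing in for the corresponding controller."""
    # S210: the TBox controller gathers shared and local information.
    info = tbox_fetch()
    # S220: the perception fusion controller produces the second processing
    # result from the two driving states and two environment inputs.
    second = perceive_fuse(info["second_state"], info["second_env"],
                           info["first_state"], info["first_env"])
    # S230: the autopilot function controller verifies the first processing
    # result against the second and decides the control action.
    return verify_and_act(info["first_result"], second)
```

Usage with trivial stand-ins shows the data flow:

```python
info = {"first_state": {}, "first_env": {}, "second_state": {},
        "second_env": {}, "first_result": "safe"}
action = vehicle_steering_cycle(
    lambda: info,
    lambda s2, e2, s1, e1: "safe",
    lambda first, second: "steer" if first == second else "brake")
# action is "steer" when the two results agree, "brake" otherwise.
```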
The present invention is exemplified below.
The invention provides an automatic steering control method which can be applied to a steering control system. Fig. 3 is a schematic structural diagram of a steering control system according to a second embodiment of the present invention. As shown in fig. 3, the system includes a cloud server 101, a millimeter wave radar 102, a camera 103, a host vehicle TBox104 (i.e., TBox controller), a sense fusion controller 105, an autopilot function controller 106, a steering system 107 (i.e., steering controller), a braking system 108 (i.e., braking controller), and a drive system 109 (i.e., drive controller).
The host vehicle (i.e., the target vehicle) TBox104 interacts with the cloud server 101 through the network. The host vehicle TBox104 uploads to the cloud server 101 data such as the vehicle position information and vehicle speed information obtained from the high-precision map (i.e., the first driving state information) and the environment information detected by the millimeter wave radar and camera (i.e., the first environment information). At the same time, the host vehicle TBox104 receives from the cloud server 101 the position information and vehicle speed information of the environmental vehicles (i.e., the first environmental vehicles; this is the second driving state information) and their environment information detected by millimeter wave radar and camera (i.e., the second environment information), so that the driving state information and environment information of the surrounding vehicles can be shared. The host vehicle TBox104 may further obtain the first processing result of an environmental vehicle transmitted by the cloud server 101.
The cloud server 101 is configured to collect and send vehicle position, vehicle speed information, and environmental information (millimeter wave radar and camera detection), and help to realize information transfer and sharing between vehicles.
Millimeter wave radar 102 and camera 103 (i.e., an environmental detector) communicate detected environmental information (i.e., first environmental information) to perceptual fusion controller 105.
The perception fusion controller 105 receives, via the host vehicle TBox104, the vehicle position, vehicle speed information, and environment information (millimeter wave radar and camera detection) of the environmental vehicles forwarded by the cloud server 101, as well as the host vehicle environment information (i.e., the first environment information) detected by the millimeter wave radar 102 and the camera 103. It performs perception fusion processing on the surrounding vehicle information transmitted by the millimeter wave radar 102, the camera 103, and the TBox104, and finally sends the processing result (i.e., the second processing result) to the autopilot function controller 106 for decision-making.
The autopilot function controller 106 receives the information processing result of the perception fusion controller 105 (i.e., the second processing result); the host vehicle TBox104 transmits to it the vehicle position information and vehicle speed information of the host vehicle (i.e., the first driving state information), the position information and vehicle speed information of the environmental vehicle (i.e., the second driving state information), and the first processing result of the environmental vehicle from the cloud server 101. The autopilot function controller 106 converts the first processing result based on the first driving state information and the second driving state information to obtain a converted first processing result, checks it against the information processing result of the perception fusion controller 105, and then decides the vehicle control based on the verification result.
The steering system 107 receives the steering request control of the autopilot function controller (i.e., receives the steering request information of the autopilot function controller).
The brake system 108 receives the brake request control of the autopilot function controller (i.e., receives the brake request information of the autopilot function controller).
The drive system 109 receives the driving request control of the autopilot function controller (i.e., receives the driving request information of the autopilot function controller).
The cloud server 101 and the host vehicle TBox104 are communicated through a wireless network.
The host vehicle TBox104, the sensing fusion controller 105, the automatic driving function controller 106, the steering system 107, the braking system 108 and the driving system 109 CAN communicate with each other through CAN or CANFD.
Fig. 4 is a schematic diagram illustrating an implementation of interaction between a TBox and a cloud server according to a second embodiment of the present invention. As shown in fig. 4, the environmental vehicles around the host vehicle may also be called auxiliary vehicles (such as auxiliary vehicle 1, auxiliary vehicle 2 and auxiliary vehicle 3). Each vehicle is equipped with a TBox and can exchange information with the cloud server, realizing information transfer and sharing between the vehicles.
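The information relay of fig. 4 can be sketched as follows. The `CloudServer` and `VehicleReport` classes and their fields are assumptions introduced for illustration; the patent does not specify a message format or protocol, only that each vehicle's TBox uploads its own information and receives that of the others.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class VehicleReport:
    vehicle_id: str
    position: tuple          # (x, y) in a shared map frame
    speed: float             # m/s
    environment: list        # detected objects from the vehicle's own sensors
    processing_result: bool  # safety judgment for this vehicle's target lane

class CloudServer:
    """Minimal model of the relay in fig. 4: every vehicle's TBox uploads
    its report, and the server returns the reports of the other (auxiliary)
    vehicles so each host can see beyond its own sensor range."""

    def __init__(self) -> None:
        self.reports: Dict[str, VehicleReport] = {}

    def upload(self, report: VehicleReport) -> None:
        # Latest report per vehicle overwrites the previous one.
        self.reports[report.vehicle_id] = report

    def neighbours(self, host_id: str) -> List[VehicleReport]:
        # Everything the server knows, except the host's own report.
        return [r for vid, r in self.reports.items() if vid != host_id]
```

A host vehicle would upload its own report and then query `neighbours` to obtain the second driving state information and second environment information of the auxiliary vehicles.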
Fig. 5 is a schematic implementation diagram of an automatic steering control method according to a second embodiment of the present invention. As shown in fig. 5, when the perception fusion controller 105 detects a front obstacle (such as the pedestrian ahead of the host vehicle in the figure) via the millimeter wave radar 102 and the camera 103, the emergency automatic steering function needs to intervene. The perception fusion controller 105 determines, from the second driving state information, the second environment information, the first driving state information and the first environment information, that the host vehicle has no detection blind area and no obstacle to the front right, and likewise no detection blind area and no obstacle to the rear right. The automatic driving function controller 106 then checks the first processing result against the second processing result; the check is consistent, no vehicle is near the host vehicle, and the lane to be entered (i.e., the lane to be steered into) is safe, so the automatic driving function controller 106 controls the steering system 107 to complete the emergency automatic steering action and avoid an accident.
Fig. 6 is a schematic diagram illustrating an implementation of another automatic steering control method according to a second embodiment of the present invention. As shown in fig. 6, when the perception fusion controller 105 detects a front obstacle (such as auxiliary vehicle 1 in fig. 6) via the millimeter wave radar 102 and the camera 103, or the vehicle prepares to change lanes and overtake, the perception fusion controller 105 determines that a detection blind area exists to the front right of the host vehicle (due to occlusion by auxiliary vehicle 1) and determines, from the environment information of auxiliary vehicle 1 fed back via the host vehicle TBox 104, that no obstacle exists in that blind area. It likewise determines that a detection blind area exists to the rear right and, from the environment information of auxiliary vehicle 2 fed back via the host vehicle TBox 104, that no obstacle exists in that blind area. The automatic driving function controller 106 then verifies that there is no fast approaching vehicle in the lane to be entered (i.e., the lane the target vehicle is to steer into) and that the lane is safe, and controls the steering system 107 to complete the automatic steering action, realizing obstacle avoidance or a lane-change overtaking.
Fig. 7 is a schematic diagram illustrating an implementation of another automatic steering control method according to a second embodiment of the present invention. As shown in fig. 7, when the perception fusion controller 105 detects a front obstacle via the millimeter wave radar 102 and the camera 103, or the vehicle prepares to change lanes and overtake, the perception fusion controller 105 determines that a detection blind area exists to the front right of the host vehicle and determines, from the environment information of auxiliary vehicle 1 fed back via the host vehicle TBox 104, that an obstacle (such as the pedestrian in fig. 7) exists in that blind area. The lane to be entered (i.e., the lane the target vehicle is to steer into) is therefore judged dangerous; to avoid an accident, the automatic steering function is prohibited from starting, and the vehicle continues driving in its own lane or brakes, decelerates and stops.
Fig. 8 is a schematic diagram illustrating an implementation of another automatic steering control method according to a second embodiment of the present invention. As shown in fig. 8, when the perception fusion controller 105 detects a front obstacle via the millimeter wave radar 102 and the camera 103, or the vehicle prepares to change lanes and overtake, the perception fusion controller 105 determines that a detection blind area exists to the front right of the host vehicle (due to occlusion by auxiliary vehicle 1) and determines, from the environment information of auxiliary vehicle 1 fed back via the host vehicle TBox 104, that no obstacle exists in that blind area. It also determines that a detection blind area exists to the rear right (due to occlusion by auxiliary vehicle 2) and determines, from the environment information of auxiliary vehicle 2 fed back via the host vehicle TBox 104, that an obstacle (such as auxiliary vehicle 3) exists in that blind area. The automatic driving function controller 106 performs verification and a safety judgment based on the position and speed information of auxiliary vehicle 3 fed back by the perception fusion controller 105 and the host vehicle TBox 104, and judges that a fast vehicle (auxiliary vehicle 3) is approaching from behind in the lane to be entered (i.e., the lane the target vehicle is to steer into), so entering that lane is dangerous. The automatic steering function is prohibited from starting to avoid an accident, and the vehicle controls the braking system 108 and the driving system 109 to continue driving in its own lane or to brake, decelerate and stop.
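The safety judgment for a fast vehicle approaching from behind (fig. 8) can be sketched as a time-gap check. The function name `lane_entry_safe` and the 2 s threshold are assumptions introduced for illustration; the patent describes the judgment qualitatively and specifies no numeric criterion.

```python
def lane_entry_safe(gap_m: float, host_speed: float, rear_speed: float,
                    min_time_gap_s: float = 2.0) -> bool:
    """Judge whether the host may enter the target lane given a vehicle
    approaching from behind in that lane.

    gap_m:      longitudinal gap to the rear vehicle (m)
    host_speed: host vehicle speed (m/s)
    rear_speed: rear vehicle speed (m/s)
    """
    closing_speed = rear_speed - host_speed
    if closing_speed <= 0:
        # The rear vehicle is not gaining on the host: entry is safe.
        return True
    # Unsafe when the gap would close faster than the illustrative threshold.
    return gap_m / closing_speed >= min_time_gap_s
```

In the fig. 8 scenario, a small gap to a much faster auxiliary vehicle 3 yields an unsafe judgment, so the automatic steering function is inhibited.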
Fig. 9 is a schematic diagram illustrating an implementation of another automatic steering control method according to a second embodiment of the present invention. As shown in fig. 9, the specific implementation process of the method is as follows:
Step 310, judging whether emergency steering is needed because an obstacle appears ahead of the host vehicle, or whether automatic driving requires a lane-change steering; if yes, executing step 320, otherwise executing step 380.
Step 320, the perception fusion controller judges, from the environment information detected by the host vehicle's millimeter wave radar and camera, whether the area ahead of the lane to be entered is free of detection blind areas and obstacles; if yes, step 330 is executed, otherwise step 360 is executed.
Step 330, the perception fusion controller judges, from the environment information detected by the host vehicle's millimeter wave radar and camera, whether the area behind the lane to be entered is free of detection blind areas; if yes, step 340 is executed, otherwise step 370 is executed.
Step 340, the automatic driving function controller checks, from the environmental vehicle position and speed information provided by the host vehicle TBox, whether the lane to be entered is safe and whether the automatic steering condition is satisfied (i.e., whether the first processing result and the second processing result are consistent); if yes, step 350 is executed, otherwise step 380 is executed.
Step 350, the automatic driving function controller starts the automatic steering function and controls the steering system to steer automatically.
Step 360, the perception fusion controller calls the related information of the environmental vehicles (i.e., the second environment information and the second driving state information) through the host vehicle TBox and judges whether the area ahead of the lane to be entered is unobstructed; if yes, step 330 is executed, otherwise step 380 is executed.
Step 370, the perception fusion controller calls the related information of the environmental vehicles (i.e., the second environment information and the second driving state information) through the host vehicle TBox and judges whether the area behind the lane to be entered is unobstructed; if yes, step 340 is executed, otherwise step 380 is executed.
Step 380, the automatic driving function controller (or the driver) controls the braking system and the driving system so that the vehicle continues driving or brakes to a stop within the host lane.
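The flow of steps 310 to 380 can be sketched as a single decision function. The boolean inputs stand in for the sensor, TBox and verification judgments of the respective steps; the parameter names and the two return values are assumptions for illustration only.

```python
def steering_decision(need_steer: bool,
                      front_clear_onboard: bool,
                      rear_blind_onboard: bool,
                      front_clear_v2x: bool,
                      rear_clear_v2x: bool,
                      results_consistent: bool) -> str:
    """Sketch of the fig. 9 flow (steps 310-380)."""
    if not need_steer:                  # step 310: no steering demand
        return "stay_in_lane"          # step 380
    if not front_clear_onboard:        # step 320: blind area/obstacle ahead
        if not front_clear_v2x:        # step 360: TBox check of the front
            return "stay_in_lane"      # step 380
    if rear_blind_onboard:             # step 330: blind area behind
        if not rear_clear_v2x:         # step 370: TBox check of the rear
            return "stay_in_lane"      # step 380
    if not results_consistent:         # step 340: verification fails
        return "stay_in_lane"          # step 380
    return "auto_steer"                # step 350
```

Note how the V2X checks (steps 360 and 370) only matter when the on-board sensors report a blind area, mirroring the blind-area fallback described in the embodiments.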
It should be appreciated that the steps in the flows shown above may be reordered, added or deleted. For example, the steps described in the present invention may be performed in parallel, sequentially, or in a different order, as long as the desired results of the technical solution of the present invention are achieved; no limitation is imposed herein.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (9)

1. A vehicle steering control system, the system comprising: a TBox controller, a perception fusion controller and an automatic driving function controller;
the TBox controller is used for acquiring first driving state information, first environment information, second driving state information, second environment information and a first processing result; the first driving state information is driving state information of a target vehicle, the first environment information is environment information of the target vehicle, the first environment information comprises first environmental vehicles and first obstacles around the target vehicle, the second driving state information is driving state information of the first environmental vehicle, the second environment information is environment information of the first environmental vehicle, the second environment information comprises second environmental vehicles and second obstacles around the first environmental vehicle, and the first processing result is a processing result of carrying out a safety judgment on the lane to be steered of the first environmental vehicle;
the perception fusion controller is connected with the TBox controller and is used for performing perception fusion processing according to the second driving state information, the second environment information, the first driving state information and the first environment information to obtain a second processing result, wherein the second processing result is a processing result of carrying out a safety judgment on the lane to be steered of the target vehicle;
The automatic driving function controller is respectively connected with the perception fusion controller and the TBox controller, and is used for verifying the first processing result and the second processing result to obtain a verification result and executing steering control operation according to the verification result;
the performing perception fusion processing according to the second driving state information, the second environment information, the first driving state information and the first environment information to obtain a second processing result comprises:
judging whether the target vehicle has a detection blind area or not according to the first environmental information;
if the target vehicle does not have a detection blind area, judging whether the first obstacle exists in the drivable area according to the first environment information to obtain a first judgment result, and determining the second processing result based on the first judgment result and the first driving state information;
if the target vehicle has a detection blind area, judging whether the second environment vehicle and/or the second obstacle exist in the detection blind area according to the second environment information to obtain a second judgment result, and determining the second processing result based on the second judgment result, the first driving state information and the second driving state information.
2. The system of claim 1, wherein the system further comprises: the cloud server;
the cloud server is connected with the TBox controller, and is used for acquiring and storing the first driving state information and the first environment information, determining the second driving state information and the second environment information according to the first environment information, and sending the second driving state information and the second environment information to the TBox controller;
the cloud server is further configured to obtain the first processing result, and send the first processing result to the TBox controller.
3. The system of claim 1, wherein the system further comprises: a steering controller, a braking controller, and a driving controller;
the steering controller is connected with the automatic driving function controller and is used for receiving steering request information of the automatic driving function controller;
the brake controller is connected with the automatic driving function controller and is used for receiving brake request information of the automatic driving function controller;
the driving controller is connected with the automatic driving function controller and is used for receiving driving request information of the automatic driving function controller.
4. The system of claim 1, wherein the system further comprises: an environmental detector;
the environment detector is connected with the sensing fusion controller and is used for detecting the first environment information and sending the first environment information to the sensing fusion controller.
5. The system of claim 4, wherein the environmental detector comprises a millimeter wave radar and/or a camera.
6. The system of claim 1, wherein the verifying the first processing result and the second processing result results in a verification result comprises:
acquiring the first driving state information and the second driving state information, and converting the first processing result into a first processing result of the target vehicle angle according to the first driving state information and the second driving state information;
and checking whether the second processing result is consistent with the converted first processing result to obtain a corresponding checking result.
7. The system of claim 1, wherein the performing a steering control operation based on the verification result comprises:
if the verification results are consistent, determining steering request information based on the second processing results so as to perform steering control;
And if the verification results are inconsistent, not executing steering control operation, and determining brake request information and/or drive request information based on the second processing result.
8. A vehicle steering control method, characterized by being applied to a vehicle steering control system, the vehicle steering control system comprising: a TBox controller, a perception fusion controller and an automatic driving function controller; the method comprises the following steps:
the TBox controller acquires first driving state information, first environment information, second driving state information, second environment information and a first processing result; the first driving state information is driving state information of a target vehicle, the first environment information is environment information of the target vehicle, the first environment information comprises first environmental vehicles and first obstacles around the target vehicle, the second driving state information is driving state information of the first environmental vehicle, the second environment information is environment information of the first environmental vehicle, the second environment information comprises second environmental vehicles and second obstacles around the first environmental vehicle, and the first processing result is a processing result of carrying out a safety judgment on the lane to be steered of the first environmental vehicle;
The perception fusion controller carries out perception fusion processing according to the second driving state information, the second environment information, the first driving state information and the first environment information to obtain a second processing result, wherein the second processing result is a processing result for carrying out safety judgment on a lane to be steered of the target vehicle;
the automatic driving function controller verifies the first processing result and the second processing result to obtain a verification result, and executes steering control operation according to the verification result;
the performing perception fusion processing according to the second driving state information, the second environment information, the first driving state information and the first environment information to obtain a second processing result comprises:
judging whether the target vehicle has a detection blind area or not according to the first environmental information;
if the target vehicle does not have a detection blind area, judging whether the first obstacle exists in the drivable area according to the first environment information to obtain a first judgment result, and determining the second processing result based on the first judgment result and the first driving state information;
If the target vehicle has a detection blind area, judging whether the second environment vehicle and/or the second obstacle exist in the detection blind area according to the second environment information to obtain a second judgment result, and determining the second processing result based on the second judgment result, the first driving state information and the second driving state information.
9. The method as recited in claim 8, further comprising:
the cloud server acquires and stores the first driving state information and the first environment information, determines the second driving state information and the second environment information according to the first environment information, and sends the second driving state information and the second environment information to the TBox controller;
and the cloud server acquires the first processing result and sends the first processing result to the TBox controller.
CN202210605590.5A 2022-05-30 2022-05-30 Vehicle steering control system and method Active CN114932947B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210605590.5A CN114932947B (en) 2022-05-30 2022-05-30 Vehicle steering control system and method

Publications (2)

Publication Number Publication Date
CN114932947A (en) 2022-08-23
CN114932947B (en) 2023-04-28

Family

ID=82866176


Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107264524B (en) * 2017-05-17 2020-01-24 吉利汽车研究院(宁波)有限公司 Intelligent lane changing auxiliary system and intelligent lane changing auxiliary method based on data fusion
KR102070605B1 (en) * 2017-10-27 2020-03-02 주식회사 만도 Autonomous emergency braking system and method by predicting circumstances surrounding vehicle
CN107839691B (en) * 2017-10-31 2020-03-24 奇瑞汽车股份有限公司 Vehicle control method and device
CN108639048B (en) * 2018-05-15 2020-03-03 智车优行科技(北京)有限公司 Automobile lane change assisting method and system and automobile
KR102506879B1 (en) * 2018-11-23 2023-03-08 현대자동차주식회사 Appartus and method for controlling autonomous driving of vehicle
CN109774712A (en) * 2018-12-17 2019-05-21 北京汽车集团有限公司 The method, apparatus and vehicle of vehicle control
CN112406820B (en) * 2020-11-12 2022-03-18 岚图汽车科技有限公司 Multi-lane enhanced automatic emergency braking system control method
CN112677976B (en) * 2020-12-28 2022-05-13 广州小鹏自动驾驶科技有限公司 Vehicle driving method, device, vehicle and storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant