CN114537393A - Vehicle control method, device, equipment and storage medium

Vehicle control method, device, equipment and storage medium

Info

Publication number
CN114537393A
CN114537393A
Authority
CN
China
Prior art keywords
vehicle
virtual
target feature
lane
guiding
Prior art date
Legal status
Pending
Application number
CN202210436406.9A
Other languages
Chinese (zh)
Inventor
王超
孙雁宇
王里
Current Assignee
Beijing Zhuxian Technology Co Ltd
Original Assignee
Beijing Zhuxian Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Zhuxian Technology Co Ltd filed Critical Beijing Zhuxian Technology Co Ltd
Priority to CN202210436406.9A priority Critical patent/CN114537393A/en
Publication of CN114537393A publication Critical patent/CN114537393A/en
Pending legal-status Critical Current

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 - Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/14 - Adaptive cruise control
    • B60W30/16 - Control of distance between vehicles, e.g. keeping a distance to preceding vehicle
    • B60W30/165 - Automatically following the path of a preceding lead vehicle, e.g. "electronic tow-bar"
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/0098 - Details of control systems ensuring comfort, safety or stability not otherwise provided for
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00 - Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001 - Planning or execution of driving tasks
    • B60W60/0015 - Planning or execution of driving tasks specially adapted for safety
    • B60W60/0017 - Planning or execution of driving tasks specially adapted for safety of other traffic participants
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0001 - Details of the control system
    • B60W2050/0043 - Signal treatments, identification of variables or parameters, parameter estimation or state estimation

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Traffic Control Systems (AREA)

Abstract

The embodiments of the present application provide a vehicle control method, device, equipment and storage medium. The method can be applied to scenes that contain lane lines, such as urban or highway traffic, and comprises the following steps: in response to lane line abnormality information, acquiring a real-time position of a target feature of a second vehicle, wherein the lane line abnormality information indicates that a first vehicle cannot continuously acquire the continuous lane lines on both sides, and the second vehicle and the first vehicle belong to the same vehicle formation; generating a virtual guide line for guiding the first vehicle to travel according to the real-time position of the target feature, the target feature being a body feature of the second vehicle; and controlling the first vehicle to follow the second vehicle according to the virtual guide line. When lane line detection is abnormal within a vehicle formation, lateral position control of the vehicle is maintained, thereby ensuring the driving safety of the following vehicle.

Description

Vehicle control method, device, equipment and storage medium
Technical Field
The embodiments of the present application relate to the technical field of automatic driving, and in particular to a vehicle control method, device, equipment and storage medium.
Background
With the emergence of diversified traffic demands, automatic driving technology is being applied more and more widely. Cooperative automatic driving refers to a formation state in which multiple vehicles follow one another at very small distances, supported by automatic driving technology and Internet-of-Vehicles technology. Normally, a following vehicle in a cooperative automatic driving fleet performs automatic driving control according to the driving parameters of the preceding vehicle and/or the pilot vehicle, so as to ensure its safe driving during autonomous following.
Disclosure of Invention
The embodiments of the present application provide a vehicle control method, device, equipment and storage medium, aiming to solve the problem of lateral control failure when a following vehicle cannot identify the lane lines.
In a first aspect, the present application provides a vehicle control method comprising:
in response to lane line abnormality information, acquiring a real-time position of a target feature of a second vehicle, wherein the lane line abnormality information is used for indicating that a first vehicle cannot continuously acquire the continuous lane lines on both sides, and the second vehicle and the first vehicle belong to the same vehicle formation;
generating a virtual guide line for guiding the first vehicle to travel according to the real-time position of the target feature, wherein the target feature is a body feature of the second vehicle;
and controlling the first vehicle to follow the second vehicle according to the virtual guide line.
In this way, the first vehicle generates a virtual guide line for guiding its travel according to the trajectory of the second vehicle, so that its lateral position remains effectively controlled in a scene where the lane lines cannot be effectively identified. This solves the control failure caused by unrecognizable lane lines and improves the controllability and safety of the autonomous vehicle.
Optionally, generating a virtual guide line for guiding the first vehicle to travel based on the position of the target feature comprises: generating a virtual lane line for guiding the first vehicle to follow the second vehicle according to the position of the target feature and a preset width; and/or generating a virtual center line for guiding the first vehicle to follow the second vehicle based on the position of the target feature.
In this way, by generating different types of virtual guide lines, the safety requirements of different application scenarios are effectively met, and the controllability and safety of the autonomous vehicle when the lane lines cannot be identified are ensured to the greatest extent.
Optionally, generating a virtual lane line for guiding the first vehicle to follow the second vehicle according to the position of the target feature and the preset width includes: acquiring the center position of the tail of the second vehicle according to the position of the target feature; and generating virtual lane lines on both sides of the lane, for guiding the first vehicle to follow the second vehicle, according to the center position of the tail of the second vehicle and the preset width, wherein the midpoint between the two virtual lane lines is the center position of the tail of the second vehicle and the distance between the two virtual lane lines is the preset width.
In this way, by generating virtual lane lines of different widths, a virtual lane line of suitable width can be selected for the autonomous vehicle under different road conditions, improving its controllability and safety under those conditions.
Optionally, generating a virtual lane line for guiding the first vehicle to follow the second vehicle according to the position of the target feature and the preset width includes: acquiring width information of the second vehicle according to the position of the target feature, the width information including the positions of both sides of the second vehicle; and generating virtual lane lines on both sides of the lane, for guiding the first vehicle to follow the second vehicle, according to the positions of both sides of the second vehicle and a preset length value, wherein the virtual lane line on one side is obtained by extending outward from one side of the second vehicle by the preset length value, and the virtual lane line on the other side is obtained by extending outward from the other side of the second vehicle by the preset length value.
In this way, virtual lane lines are generated from different types of target features, so that the virtual lane lines are associated with different features of the second vehicle. The sensors on the first vehicle acquire the target feature suited to the characteristics of a specific scene and derive the width from it, so that the autonomous vehicle can effectively generate virtual lane lines in different scenes, ensuring controllability and safety when the lane lines cannot be identified.
Optionally, generating a virtual centerline for guiding the first vehicle to follow the second vehicle based on the location of the target feature comprises: acquiring the central position of the second vehicle according to the position of the target feature; a virtual center line for guiding the first vehicle to follow the second vehicle is generated based on the center position of the second vehicle.
By collecting the center position of the second vehicle to generate the virtual center line, the lateral position of the first vehicle is effectively controlled based on the travel trajectory of the second vehicle, which further improves the lateral control and safety of the first vehicle under complex road or vehicle conditions.
Optionally, acquiring the position of the target feature includes at least one of the following: acquiring the width of the tail of the second vehicle; acquiring the contour width of the second vehicle; or acquiring the width between the two rear lamps of the second vehicle.
Optionally, when the first vehicle and the second vehicle belong to the same fleet running in formation, the first vehicle being a following vehicle and the second vehicle being a pilot vehicle, the method further comprises: identifying a first license plate of the vehicle in front of the first vehicle, and if the identified license plate differs from a pre-recorded second license plate of the second vehicle, determining that the vehicle to which the first license plate belongs is a non-formation vehicle; and transmitting notification information to the second vehicle, the notification information indicating any one of the following: that the second vehicle should decelerate and wait for the first vehicle; that the first vehicle should be set as the pilot vehicle; or that the vehicle formation of the second vehicle and the first vehicle should be disbanded.
By taking corresponding adjustment measures when a non-formation vehicle cuts into the formation, the flexibility of formation adjustment for the autonomous vehicles is effectively ensured, and the safety of each autonomous vehicle is improved.
In a second aspect, the present application provides a vehicle control apparatus comprising:
the acquisition module is used for acquiring the position of a target feature in response to lane line abnormality information, the lane line abnormality information being used for indicating that a first vehicle cannot acquire lane line information, and the target feature comprising a vehicle body feature of a second vehicle;
a generating module for generating a virtual guide line for guiding a first vehicle to travel according to a position of the target feature;
and the indicating module is used for controlling the first vehicle to follow the second vehicle to run according to the virtual guide line.
Optionally, the generating module is specifically configured to generate a virtual lane line for guiding the first vehicle to travel along with the second vehicle according to the position of the target feature and a preset width; and/or generating a virtual centerline for guiding the first vehicle to follow the second vehicle based on the location of the target feature.
Optionally, the generating module is specifically configured to obtain a center position of the second vehicle tail according to the position of the target feature; generating virtual lane lines for guiding the first vehicle to follow two sides of a lane where the second vehicle runs according to the central position of the tail of the second vehicle and the preset width; the center positions of the virtual lane lines on the two sides are the center positions of the tail of the second vehicle, and the distance between the virtual lane lines on the two sides is a preset width.
Optionally, the generating module is specifically configured to obtain width information of the second vehicle according to the position of the target feature, where the width information includes positions of two sides of the second vehicle; and generating virtual lane lines on two sides of a lane for guiding the first vehicle to follow the second vehicle according to the positions of the two sides of the second vehicle and the preset length value, wherein the virtual lane line on one side is obtained by extending the preset length value outwards according to one side of the second vehicle, and the virtual lane line on the other side is obtained by extending the preset length value outwards according to the other side of the second vehicle.
Optionally, the generating module is specifically configured to obtain a center position of the second vehicle according to the position of the target feature; a virtual center line for guiding the first vehicle to follow the second vehicle is generated based on the center position of the second vehicle.
Optionally, the obtaining module is specifically configured to obtain at least one of the following: acquiring the width of the tail part of a second vehicle; or acquiring the contour width of the second vehicle; or, a width between two rear lamps of the second vehicle is acquired.
Optionally, the indicating module is further configured to, when the first vehicle and the second vehicle belong to the same fleet running in formation, the first vehicle being a following vehicle and the second vehicle being a pilot vehicle, identify a first license plate of the vehicle in front of the first vehicle, and if the identified license plate differs from a pre-recorded second license plate of the second vehicle, determine that the vehicle to which the first license plate belongs is a non-formation vehicle; and transmit notification information to the second vehicle, the notification information indicating any one of the following: that the second vehicle should decelerate and wait for the first vehicle; that the first vehicle should be set as the pilot vehicle; or that the vehicle formation of the second vehicle and the first vehicle should be disbanded.
In a third aspect, the present application also provides a control apparatus, including:
at least one processor;
and a memory communicatively coupled to the at least one processor;
the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to cause the control device to perform the vehicle control method according to any one of the embodiments of the first aspect of the present application.
In a fourth aspect, the present application further provides a computer-readable storage medium having stored thereon computer-executable instructions for implementing the vehicle control method according to any one of the first aspect of the present application when executed by a processor.
In a fifth aspect, the present disclosure also provides a computer program product comprising computer executable instructions for implementing the vehicle control method according to any embodiment corresponding to the first aspect of the present disclosure when executed by a processor.
According to the vehicle control method, device, equipment and storage medium provided by the present application, the position of the target feature of the second vehicle is acquired in response to the lane line abnormality information; a virtual guide line for guiding the first vehicle to travel is generated according to the position of the target feature; and the first vehicle is then controlled to follow the second vehicle according to the virtual guide line. In this way, the first vehicle can generate a virtual guide line for guiding its travel according to the trajectory of the second vehicle, so that it keeps following the second vehicle in a scene where the lane lines cannot be effectively identified, effectively reducing following-vehicle control failures and improving the controllability and safety of the following vehicle.
Drawings
Fig. 1 is an application scenario diagram of a vehicle control method according to an embodiment of the present application;
FIG. 2 is a flow chart of a vehicle control method provided in an embodiment of the present application;
FIG. 3a is a flow chart of a vehicle control method provided in another embodiment of the present application;
fig. 3b is a schematic view of a scene for determining a virtual lane line according to an embodiment of the present application;
fig. 3c is a schematic view of a scene for determining a virtual lane line according to another embodiment of the present application;
FIG. 3d is a schematic view of a scenario for determining a virtual centerline according to an embodiment of the present application;
FIG. 3e is a schematic diagram of a vehicle center position and a tail center position provided by an embodiment of the present application;
FIG. 4a is a flow chart of a vehicle control method according to yet another embodiment of the present application;
FIG. 4b is a schematic diagram of a second vehicle decelerating to wait for the first vehicle according to an embodiment of the present disclosure;
fig. 4c is a schematic view of a scenario in which a first vehicle is set as a pilot vehicle according to an embodiment of the present application;
FIG. 4d is a diagram illustrating an application scenario for disbanding a vehicle formation according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a vehicle control device according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a control device according to an embodiment of the present application.
Detailed Description
When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present application.
The several embodiments described below may be combined with each other, and the same or similar concepts or processes may not be described in detail again in some of them. The embodiments of the present application will be described below with reference to the accompanying drawings.
In cooperative automatic driving, supported by automatic driving technology and Internet-of-Vehicles technology, vehicles follow one another in formation at very small distances. The leading vehicle may be driven by an automatic driving system or controlled manually by a driver, while a following vehicle controls its own automatic driving state according to the driving parameters of the preceding vehicle and/or the pilot vehicle together with the lane lines on both of its sides, so as to ensure safe driving during autonomous following. However, if the following vehicle cannot identify the lane lines, its lateral position within the lane cannot be effectively determined, i.e., effective lateral control cannot be applied, and following-vehicle control fails.
To solve the above problem, an embodiment of the present application provides a vehicle control method that, when lane line identification is abnormal, generates a virtual guide line for guiding the following vehicle to travel based on a target feature of the preceding vehicle, and controls the lateral position of the following vehicle based on the virtual guide line, thereby ensuring the driving safety of the following vehicle when the lane lines cannot be effectively identified.
Fig. 1 is an application scenario diagram of a vehicle control method according to an embodiment of the present application. As shown in fig. 1, in the vehicle control flow, the following vehicle 100 identifies the forward direction by identifying the position of the leading vehicle 110, identifies its own lateral position by identifying the lane line 120, and if the lane line 120 cannot be identified, it is necessary to generate a virtual guide line 130 based on the position of the leading vehicle 110 to control its own lateral position.
It should be noted that the scenario shown in fig. 1 illustrates only one following vehicle, one pilot vehicle, and one lane line as an example; the embodiments of the present application are not limited thereto, i.e., the numbers of following vehicles, pilot vehicles, and lane lines may be arbitrary.
The following describes the vehicle control method provided by the present application in detail by way of specific embodiments. It should be noted that the following specific embodiments may be combined with each other, and the same or similar concepts or processes may not be described in detail in some embodiments.
Fig. 2 is a flowchart of a vehicle control method according to an embodiment of the present application. As shown in fig. 2, including but not limited to the following steps:
step S201, responding to the lane line abnormal information, and acquiring the position of the target feature.
The lane line abnormal information is used for indicating that the first vehicle cannot continuously acquire the lane line information, and the target feature comprises a vehicle body feature of the second vehicle.
Specifically, in the embodiment of the present application, the first vehicle is used to indicate a following vehicle, and the second vehicle may be a pilot vehicle or a following vehicle located in front of the first vehicle.
The lane line abnormality information is notification information issued by the control device of the first vehicle when the first vehicle cannot continuously acquire the continuous lane line information (including image information) on both of its sides.
Specifically, the situations in which the first vehicle cannot continuously acquire images of the continuous lane lines on both sides may include, but are not limited to, any of the following:
the lane lines are blurred and difficult to identify, for example due to lack of maintenance; at least one of the lane lines on the two sides has a long breakpoint or discontinuity, so that it cannot be identified; the vehicle turns from a lane with lane lines onto a road without them (e.g., from an urban road onto a rural road without lane lines); the lane lines are occluded and cannot be captured, for example because other vehicles keep driving on the line or repeatedly attempt to cut into the queue; or the image sensor on the first vehicle used for capturing the lane lines is abnormal.
Alternatively, the lane markings are typically captured by image sensors at the front or both sides of the vehicle.
Further, the target feature represents a contour feature of the second vehicle or a specific recognizable structure, such as the wheels, license plate, or lamps on the rear of the vehicle. It should not be a feature that is generally difficult to recognize or that differs greatly between vehicles, such as poster decorations or structures on the side of the vehicle.
Optionally, the target feature may also be acquired by an image sensor, or may also be acquired by a radar, a laser sensor, or other devices, where the sensor includes a front-mounted sensor and a rear-mounted sensor, which is not limited in this application.
Alternatively, the real-time position is generally expressed in a coordinate system established at the center point or the middle of the front end of the first vehicle itself; the coordinates or relative position of the second vehicle with respect to the first vehicle are then determined in that coordinate system.
Further, acquiring the real-time position includes processing the specific position data collected by the acquisition devices of the first vehicle, either in the control device of the first vehicle or by sending the data to a cloud server for processing.
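As a minimal illustrative sketch (not part of the patent text), the relative position described above can be expressed in a coordinate frame attached to the first vehicle. The frame convention, function names, and data shapes below are assumptions chosen for the example only.

```python
from dataclasses import dataclass
import math

@dataclass
class EgoFramePoint:
    """Position of a detected feature in the first vehicle's coordinate frame.

    x points forward along the first vehicle's heading, y points to its left,
    both in meters, with the origin at the first vehicle's center point.
    """
    x: float
    y: float

def to_ego_frame(feature_xy, ego_xy, ego_heading_rad):
    """Convert a feature position from a shared world frame into the ego frame.

    feature_xy, ego_xy: (x, y) tuples in the same world frame.
    ego_heading_rad: heading of the first vehicle in that world frame.
    """
    dx = feature_xy[0] - ego_xy[0]
    dy = feature_xy[1] - ego_xy[1]
    cos_h, sin_h = math.cos(ego_heading_rad), math.sin(ego_heading_rad)
    # Rotate the displacement into the vehicle-aligned axes.
    return EgoFramePoint(x=cos_h * dx + sin_h * dy,
                         y=-sin_h * dx + cos_h * dy)

# Example: leader tail center 12 m ahead and 0.3 m to the left of the follower.
print(to_ego_frame((12.0, 0.3), (0.0, 0.0), 0.0))
```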
And S202, generating a virtual guide line for guiding the first vehicle to run according to the position of the target feature.
Wherein the target feature is a body feature of the second vehicle.
Specifically, the virtual guide line is a continuous virtual line used to guide and assist the lateral control of the vehicle. The virtual guide line may be used to guide and control the heading of the vehicle, or only to adjust its lateral position.
In some embodiments, the virtual guide line may be a single dashed line, and the first vehicle needs to keep its center on the virtual guide line, or keep the lateral distance between its center and the virtual guide line smaller than a predetermined value, thereby achieving control of its lateral position.
In some embodiments, the virtual guide line may also consist of two dashed lines located on both sides of the first vehicle, and the first vehicle implements its lateral position control by ensuring that it always stays between the two lines.
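The two guidance modes above amount to simple lateral checks. A minimal sketch, assuming a single lateral coordinate y in the ego frame; all names are hypothetical.

```python
def lateral_ok_single_line(center_y, guide_line_y, max_offset):
    """One virtual guide line: keep the vehicle center within max_offset of it."""
    return abs(center_y - guide_line_y) <= max_offset

def lateral_ok_two_lines(center_y, half_width, left_line_y, right_line_y):
    """Two virtual guide lines: keep the whole vehicle body between them.

    Assumes y grows to the left, so left_line_y > right_line_y.
    """
    return (center_y + half_width <= left_line_y and
            center_y - half_width >= right_line_y)
```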
In some embodiments, the act of generating the virtual guide line is performed by the pilot vehicle (or the second vehicle). It may also be performed, at the request of the pilot vehicle (or second vehicle) or of the following vehicle (or first vehicle), by a cloud server that manages the vehicle formation; the server generates the virtual guide line and issues it to the pilot vehicle, which then distributes it to the following vehicles. It may also be performed by RSUs (Road Side Units) on both sides of the road at the request of any vehicle in the formation; after generating the virtual guide line, the RSU sends it to the requesting vehicle, pilot vehicle, or cloud server, and the recipient distributes it to the whole formation. In a specific implementation, different modes can be selected dynamically according to the current usage of on-board computing power and cloud server computing power.
In some embodiments, the position of the target feature may be acquired by the first vehicle and sent to the cloud server, which generates the virtual guide line and sends the result back to the first vehicle; alternatively, the control device of the first vehicle performs both the acquisition of the target feature position and the generation of the virtual guide line.
And S203, controlling the first vehicle to follow the second vehicle to run according to the virtual guide line.
Specifically, the lateral position of the first vehicle is ensured while it follows the second vehicle by always keeping the first vehicle between the virtual guide lines (when there are two of them) or keeping its distance to the virtual guide line smaller than a set value (when there is only one).
When the control action is performed directly by the first vehicle, it is typically performed directly by the control device of the first vehicle controlling the movement device of the first vehicle directly.
When the control action is determined by the cloud server, a corresponding instruction is sent by the cloud server to the first vehicle, and the control device of the first vehicle controls the movement device to adjust the lateral position based on the instruction.
Further, if there are other following vehicles of the same formation behind the first vehicle, the determined virtual guide line may be sent directly to them by the first vehicle or the cloud server, so that they complete their lateral control based on the already generated virtual guide line without regenerating it, which improves processing efficiency and saves computing resources.
According to the vehicle control method provided by this embodiment, the real-time position of the target feature of the second vehicle is acquired in response to the lane line abnormality information; a virtual guide line for guiding the first vehicle to travel is generated according to that real-time position; and the first vehicle is then controlled to follow the second vehicle according to the virtual guide line. In this way, the first vehicle can generate a virtual guide line for guiding its travel according to the trajectory of the second vehicle, keep following the second vehicle in a scene where the lane lines cannot be effectively identified, effectively avoid following-control failure, and improve the controllability and safety of the following vehicle.
Fig. 3a is a flowchart of a vehicle control method according to another embodiment of the present application. The vehicle control method provided by this embodiment can be implemented in several ways; some possible implementations are given below with reference to fig. 3a to 3d. In fig. 3b to 3d, 310 is the first vehicle, 320 is the second vehicle, 330 is the virtual guide line, 340 is the actual lane line, and the bold solid lines represent the first vehicle identifying the target feature of the second vehicle.
First, fig. 3b is a schematic view of a scene for determining a virtual lane line according to an embodiment of the present application. With reference to fig. 3a and 3b, the vehicle control method includes:
and S301, responding to the lane line abnormal information, and acquiring the position of the target feature of the second vehicle.
The lane line abnormality information is used for indicating that the first vehicle cannot continuously acquire the continuous lane lines on the two sides.
Specifically, when acquiring the target feature of the second vehicle, the first vehicle may acquire only one target feature according to a pre-configured identification rule (for example, only the lamps or the body width of the second vehicle); or it may identify several target features at the same time (for example, the rear wheels and the lamps of the vehicle), combining the different target features to effectively ensure the accuracy and reliability of the identification result.
In some embodiments, acquiring the target feature includes the following different cases:
acquiring the width of the tail of the second vehicle; acquiring the contour width of the second vehicle; or acquiring the real-time positions of the two rear lamps of the second vehicle.
Specifically, the tail width is obtained by the image sensor from the positions of both sides of the tail of the second vehicle. When the tail of the second vehicle is relatively flat (i.e., the left and right sides of the tail are vertical surfaces rather than curved surfaces), the tail width measurement is relatively accurate, so the accuracy of the determined real-time position of the target feature of the second vehicle can be effectively ensured.
The contour width is also obtained by the image sensor: the whole contour of the second vehicle is captured, and the contour width is determined from the positions of both sides of that contour. The contour width may be greater than the tail width (for example, when the shape of the second vehicle narrows from the middle toward the tail) or equal to it. When the sides of the tail are not flat, determining the real-time position of the target feature from the contour width is preferable, because the contour of the second vehicle differs greatly from the background, so the accuracy of the result can be effectively ensured.
The real-time positions of the rear lamps are determined from the positions of their center points. When driving at night, the color difference between the vehicle contour and the background is small, while a lit rear lamp is highly recognizable, so using the real-time positions of the rear lamps ensures identification accuracy during night driving.
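For illustration only, a sketch of falling back across the three target features described above (tail width, contour width, rear-lamp positions) to obtain a lateral center estimate of the vehicle ahead; the ordering and the input format are assumptions, not specified by the patent.

```python
def estimate_leader_center_y(tail_edges=None, contour_edges=None, lamp_centers=None):
    """Return the lateral center of the leading vehicle from whichever feature
    is available, trying the tail edges first, then the contour, then the lamps.

    Each argument is a (left_y, right_y) pair in the ego frame, or None if that
    feature could not be detected in the current frame.
    """
    for pair in (tail_edges, contour_edges, lamp_centers):
        if pair is not None:
            left_y, right_y = pair
            return 0.5 * (left_y + right_y)
    return None  # nothing detected; the caller keeps the previous estimate

# Daytime: tail edges available.  Night: perhaps only the lit rear lamps.
assert estimate_leader_center_y(tail_edges=(1.0, -1.5)) == -0.25
assert estimate_leader_center_y(lamp_centers=(0.5, -1.0)) == -0.25
```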
And S302, acquiring the central position of the tail of the second vehicle according to the position of the target feature.
Specifically, after the real-time position of the target feature is acquired, a virtual lane line for guiding the first vehicle to follow the second vehicle may be generated according to the real-time position of the target feature and the preset width. In this case, the virtual lane lines are two continuous, mutually parallel lines that simulate the actual lane lines.
In addition to the tail width, the center position of the tail of the second vehicle can be further determined from the positions of both sides of the tail; in this case the center position is calculated from the positions of the two side surfaces rather than being photographed directly.
Further, the tail center position can also be determined from the rear portion of the second vehicle that the image sensor can capture from the viewpoint of the first vehicle (i.e., from behind the second vehicle), for example the portion from the chassis and exhaust pipe at the bottom up to the trunk roof at the top. The center position can be obtained by computing the center point of the captured image of this rear portion, or it can be taken directly, based on a preset configuration, as the position of a specific mark on the rear portion (e.g., the license plate or the logo of the second vehicle).
And step S303, generating virtual lane lines on both sides of the lane, for guiding the first vehicle to follow the second vehicle, according to the center position of the tail of the second vehicle and the preset width.
The midpoint between the two virtual lane lines is the center position of the tail of the second vehicle, and the distance between the two virtual lane lines is the preset width.
Specifically, the preset width may be the lane width detected in real time before the abnormality was identified, or a preset width corresponding to the current road type (for example, the standard width of a national road).
For example, if the road narrows locally just before the abnormality is detected (for example, the lane lines first narrow and then disappear because of a painting error while the actual lane does not change), a driver in the second vehicle or pilot vehicle can still keep that vehicle under continuous, stable control; in this case, taking the width detected in real time before the abnormality as the preset width would not match the actual situation, so the width corresponding to the current road type should be used as the preset width instead.
In some embodiments, the preset width may be 0 (or no preset width is configured). In this case the virtual lane lines may also be generated directly from the position of the target feature of the second vehicle; for example, if the target feature is the rear lamps of the second vehicle, the width between the two rear lamps is taken as the width between the virtual lane lines.
In some embodiments, the preset width may also be the distance from each side of the vehicle to the lane line on the same side. In this case, the sum of the preset width (or twice the preset width, when the preset width is the distance from the vehicle to a single lane line) and the width of the first vehicle itself is the width between the virtual lane lines.
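A minimal sketch of the placement rule of steps S302 and S303, assuming a single lateral coordinate y in the ego frame; the function name and the example values are illustrative only.

```python
def lane_lines_from_tail_center(tail_center_y, preset_width):
    """Place two virtual lane lines symmetrically about the leader's tail center.

    The midpoint of the two lines is tail_center_y and their spacing is
    preset_width.  Returns (left_line_y, right_line_y).
    """
    half = preset_width / 2.0
    return tail_center_y + half, tail_center_y - half

# Example: tail center 0.2 m left of the ego axis, 3.5 m preset lane width.
left_y, right_y = lane_lines_from_tail_center(0.2, 3.5)   # -> (1.95, -1.55)
```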
And step S308, controlling the first vehicle to follow the second vehicle to run according to the virtual guide line.
Specifically, this step is the same as step S203 in the embodiment shown in fig. 2, and is not described here again.
In this application scenario, the virtual guide lines generated by steps S302 and S303 guide the first vehicle in controlling its lateral position. The generated virtual guide lines have the same positions as the original lane lines, so they can guide the first vehicle to drive normally in place of the lane lines that can no longer be effectively detected. Virtual guide lines generated in this way can be widely applied in any environment where lane line detection is abnormal, fully ensuring the safety of the autonomous vehicle.
Next, fig. 3c is a schematic view of a scene for determining a virtual lane line according to another embodiment of the present application. With reference to fig. 3a and 3c, the vehicle control method includes:
and S301, responding to the lane line abnormal information, and acquiring the real-time position of the target feature of the second vehicle.
Specifically, the content of this step is the same as that of step S201 in the embodiment shown in fig. 2, and is not described herein again.
And step S304, acquiring width information of the second vehicle according to the position of the target feature.
Wherein the width information includes positions of both sides of the second vehicle.
Specifically, the width information of the second vehicle may be determined by combining an image captured by the image sensor with preset second vehicle structure information, so as to ensure the accuracy of the identification.
For example, when driving on a muddy road, a large amount of mud may stick to the tail of the second vehicle. From the viewpoint of the image sensor of the first vehicle, the color difference between the tail of the second vehicle and the ground background is then small, so the feature points on both sides of the second vehicle, and hence its width, cannot be determined accurately; in this case the width of the second vehicle can be determined with the help of the preset structural information of the second vehicle. In another case, if the vehicle in front of the first vehicle is not the second vehicle of the same formation, determining the width from the preset structure of the second vehicle is meaningless; the feature points on both sides of the vehicle in front are then obtained directly by the image sensor, and its width information is derived from them.
And S305, generating virtual lane lines on two sides of a lane for guiding the first vehicle to follow the second vehicle to run according to the positions of the two sides of the second vehicle and the preset length value.
The virtual lane line on one side is obtained by extending a preset length value outwards according to one side of the second vehicle, and the virtual lane line on the other side is obtained by extending the preset length value outwards according to the other side of the second vehicle.
Specifically, the preset length value may be different from the actual lane width to meet the requirements of different road conditions.
Further, the preset length value may be smaller than the actual lane width, so that the first vehicle can only adjust its lateral position within a smaller range. This is suitable for busy lanes: reducing the lateral movement of the first vehicle avoids scraping against vehicles in adjacent lanes and improves the safety of automatic driving.
In some embodiments, the preset length value may be 0. In this case the first vehicle is required to control its lateral position strictly according to the trajectory of the second vehicle, which is suitable for lanes with a complex surface, for example with gravel or potholes that might damage the vehicle chassis.
The preset length value can also be larger than the actual lane width. In this case the first vehicle can partially cross the lane line as needed (i.e., "drive on the line"), giving it more room for lateral adjustment and therefore better adaptability to complex lane surfaces. For example, if the lane surface has potholes or manhole covers, the increased lateral adjustment space makes it easier to roll over or steer around them while lateral control is still maintained, thereby better ensuring the safety of automatic driving.
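A minimal sketch of the alternative rule of steps S304 and S305, where each virtual lane line is obtained by extending the corresponding side of the leading vehicle outward by the preset length value; names and example values are illustrative assumptions.

```python
def lane_lines_from_vehicle_sides(left_side_y, right_side_y, preset_extension):
    """Extend each side of the leading vehicle outward by preset_extension.

    A zero or small extension forces the follower to track the leader's path
    tightly; an extension larger than the remaining lane margin allows the
    follower to straddle the original lane line when the surface requires it.
    Returns (left_line_y, right_line_y).
    """
    return left_side_y + preset_extension, right_side_y - preset_extension

# Example: a 2.5 m wide leader centered on the ego axis, 0.5 m extension.
left_y, right_y = lane_lines_from_vehicle_sides(1.25, -1.25, 0.5)   # -> (1.75, -1.75)
```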
And step S308, controlling the first vehicle to follow the second vehicle to run according to the virtual guide line.
Specifically, this step is the same as step S203 in the embodiment shown in fig. 2, and is not described here again.
In this application scenario, the virtual guide lines generated by steps S304 and S305 guide the first vehicle in controlling its lateral position, and an appropriate preset length value can be chosen according to how busy the specific lane is and the condition of its surface. Compared with virtual lane lines generated from the original lane lines, this better improves the ability of the first vehicle to adjust its lateral position in a complex environment, and thus better ensures the safety of the autonomous vehicle under complex road conditions.
Fig. 3d is a schematic view of a scene for determining a virtual center line according to an embodiment of the present application, and with reference to fig. 3a and 3d, the vehicle control method includes:
and S301, responding to the lane line abnormal information, and acquiring the real-time position of the target feature of the second vehicle.
Specifically, the content of this step is the same as that of step S201 in the embodiment shown in fig. 2, and is not described herein again.
And S306, acquiring the central position of the second vehicle according to the position of the target feature.
Specifically, the center position of the second vehicle may also be determined in real time from a feature located in the middle of the rear of the vehicle, such as the license plate at the rear of the second vehicle.
At this time, the center position of the second vehicle can be determined by the position of the center of the license plate actually shot by the image sensor.
The center position of the second vehicle, i.e., the center of the whole contour of the second vehicle that the image sensor can capture from the viewpoint of the first vehicle (i.e., from behind the second vehicle; for an ordinary sedan, the whole rear contour from the wheels at the bottom to the roof at the top), can be determined by computing the center point of the captured contour image. Fig. 3e is a schematic diagram of the vehicle center position and the tail center position, where point A is the vehicle center position and point B is the tail center position; depending on the choice made, a virtual guide line may be generated based on either point.
Step S307 is to generate a virtual center line for guiding the first vehicle to travel following the second vehicle, based on the center position of the second vehicle.
Specifically, when a virtual guide line is generated based on the center position of the second vehicle, a single center guide line, i.e., a virtual center line, is generally generated. One end of the virtual center line is attached to the center of the second vehicle and the other end to the center of the first vehicle, and the shape of the guide line is determined based on the travel plan and the motion states of the two vehicles (including their real-time speeds, the angle between their directions of travel, the rate of change of that angle, and so on). When the second vehicle and the first vehicle are traveling on a straight lane, the virtual center line may be a straight line (though it may also be a curve if the travel plan indicates that the formation will need to change lanes); when they are traveling on a curved, non-straight section, the virtual center line is generally a curve.
When only the virtual center line is used to guide and control the lateral position of the first vehicle, the center of the first vehicle may be required to remain on the line at all times (e.g., when road conditions are good), or the center of the first vehicle may be allowed to deviate from the virtual center line within a set range (e.g., a lateral offset of no more than 0.5 meters).
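As an illustrative sketch of the centerline-following check; the polyline representation and the interpolation are assumptions, not taken from the patent.

```python
import bisect

class VirtualCenterline:
    """Polyline through successive center positions of the leading vehicle.

    Points are (x, y) pairs in a common frame and are assumed, for this
    sketch, to be ordered by increasing x along the direction of travel.
    """

    def __init__(self, leader_centers):
        self.points = sorted(leader_centers)

    def lateral_offset(self, ego_x, ego_y):
        """Interpolate the centerline at ego_x and return the lateral offset."""
        xs = [p[0] for p in self.points]
        i = bisect.bisect_left(xs, ego_x)
        i = min(max(i, 1), len(self.points) - 1)
        (x0, y0), (x1, y1) = self.points[i - 1], self.points[i]
        t = (ego_x - x0) / (x1 - x0) if x1 != x0 else 0.0
        return ego_y - (y0 + t * (y1 - y0))

line = VirtualCenterline([(0.0, 0.0), (10.0, 0.3), (20.0, 0.9)])
offset = line.lateral_offset(5.0, 0.0)   # about -0.15 m
assert abs(offset) <= 0.5                # within the tolerated band mentioned above
```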
And step S308, controlling the first vehicle to follow the second vehicle to run according to the virtual guide line.
Specifically, this step is the same as step S203 in the embodiment shown in fig. 2, and is not described here again.
In this application scenario, the virtual guide line generated by steps S306 and S307 guides the first vehicle in controlling its lateral position. Because the generated virtual guide line is determined directly from the travel trajectory of the second vehicle, the trajectory of the first vehicle can be matched to that of the second vehicle to the greatest extent, and adjusting the lateral position of the first vehicle neither affects the consistency with the trajectory of the second vehicle nor is constrained by the original lane lines.
In some embodiments, in addition to obtaining the real-time location of the target feature of the second vehicle, the manner in which the virtual guideline is established based on other features may include:
in a first mode (not shown), a real-time position of the second vehicle is determined in response to lane line abnormality information, and a virtual guide line is established based on the real-time position of the second vehicle.
Specifically, at the moment when the second vehicle receives the lane line abnormality information, the second vehicle usually keeps to the middle of the lane (for example, when the abnormality is caused by a forward sensor fault, such as a damaged or skewed front camera), so a virtual guide line can be established based on feature data such as the center position, body width, and lamp positions of the second vehicle. The established virtual guide line may likewise be a virtual center line, virtual lane lines, etc.; for the specific method, refer to the steps in the embodiments of fig. 3b to 3d.
In a second mode (not shown), in response to lane line abnormality information, a real-time position of a following vehicle behind the second vehicle is determined, and a virtual guide line is established based on the real-time position of the following vehicle.
Specifically, if the forward sensor of the second vehicle is abnormal while its rearward sensor is normal and the real-time position of the following vehicle behind the second vehicle can be obtained normally, the virtual guide line can be established based on the real-time position of the target feature on that following vehicle. For the method of establishing a virtual guide line from the real-time position of the following vehicle's target feature, refer to the steps in the embodiments of fig. 3b to 3d, which are not repeated here.
According to the vehicle control method provided by this embodiment, real-time positions of different types of target features of the second vehicle are acquired, and different types of virtual guide lines are generated based on the corresponding target features, preset widths, preset length values, and so on, so that the first vehicle is controlled to adjust its lateral position based on the virtual guide line. By acquiring different types of target features and generating different types of virtual guide lines, the requirements for controlling the lateral position of the following vehicle under different lane conditions are effectively met; the following vehicle thus maintains effective control of its lateral position under all these conditions, which ensures its safety when lane line detection is abnormal.
Fig. 4a is a flowchart of a vehicle control method according to still another embodiment of the present application. As shown in fig. 4a, the vehicle control method provided by the present embodiment includes the steps of:
and S401, responding to the lane line abnormal information, and acquiring the position of the target feature of the second vehicle.
The lane line abnormality information is used for indicating that the first vehicle cannot continuously acquire the continuous lane lines on the two sides.
Specifically, the content of this step is the same as that of step S201 in the embodiment shown in fig. 2, and is not described here again.
And S402, identifying a first license plate of a vehicle in front of the first vehicle, and if the identified license plate is different from a second license plate of a second vehicle recorded in advance, determining that the vehicle to which the first license plate belongs is a non-formation vehicle.
The first vehicle and the second vehicle belong to a fleet which runs in the same formation, the first vehicle is a following vehicle, and the second vehicle is a pilot vehicle.
Specifically, when traffic in the lane is busy, a non-formation vehicle may cut in between the second vehicle and the first vehicle. If the virtual guide line continued to be generated based on the vehicle ahead of the first vehicle in that situation, there would be a potential safety hazard (because the driving safety of the cut-in vehicle cannot be ensured). It is therefore necessary, when acquiring the target feature of the second vehicle, to simultaneously confirm whether the vehicle ahead is indeed a formation vehicle.
For example, when the license plate of the front vehicle (namely, the first license plate) is identified, the front vehicle is checked with a second license plate in the preset formation vehicle, so as to ensure that the front vehicle is the formation vehicle.
When the second license plate is different from the first license plate, the front vehicle is determined to be a non-formation vehicle, and at the moment, further processing is needed to ensure the safety of the first vehicle and the formation vehicle behind the first vehicle.
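A minimal sketch of the plate check in step S402; the roster source and all names are assumptions.

```python
def classify_front_vehicle(detected_plate, formation_plates):
    """Decide whether the vehicle directly ahead belongs to the formation.

    detected_plate: plate string recognized by the follower's front camera,
    or None when recognition failed in the current frame.
    formation_plates: set of plates of all registered formation members, e.g.
    exchanged over V2V when the formation was created (an assumption here).
    """
    if detected_plate is None:
        return "unknown"        # keep the previous decision until recognition recovers
    return "formation" if detected_plate in formation_plates else "cut_in"

assert classify_front_vehicle("A12345", {"A12345", "B67890"}) == "formation"
assert classify_front_vehicle("C00001", {"A12345", "B67890"}) == "cut_in"
```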
And step S403, sending notification information to the second vehicle, wherein the notification information is used for indicating the second vehicle to decelerate so as to wait for the first vehicle.
Specifically, a non-formation vehicle that actively cuts into the queue usually has a high demand for speed. Fig. 4b is a schematic view of a scenario in which the second vehicle decelerates to wait for the first vehicle according to an embodiment of the present application (in fig. 4b to 4d, 410 is the first vehicle, 420 is the second vehicle, 430 is the actual lane line, 440 is the non-formation vehicle, 450 is a formation vehicle behind the second vehicle, and the solid arrows indicate the directions of movement of the second vehicle and the non-formation vehicle). As shown in fig. 4b, by having the second vehicle decelerate, the cut-in non-formation vehicle will tend to change lanes again and leave the fleet, while the first vehicle catches up with the second vehicle and the normal following distance is restored, so that the fleet returns to its original state. If the first vehicle still has the lane line detection abnormality at this point, the method in the embodiment shown in fig. 2 or fig. 3 can be used to control its lateral movement based on the target feature of the second vehicle.
And S404, sending notification information to the second vehicle, wherein the notification information is used for indicating that the first vehicle is set as a pilot vehicle.
Specifically, fig. 4c is a schematic view of a scenario in which the first vehicle is set as the pilot vehicle according to an embodiment of the present application. As shown in fig. 4c, when there are many vehicles in the lane and decelerating the second vehicle would affect a large number of other vehicles (and the cut-in vehicle may be unable to change lanes and leave quickly because the adjacent lanes are occupied or the road has become a single lane), the fleet can be split. In this case, if there are other formation vehicles behind the first vehicle, the first vehicle can be set as a pilot vehicle, and the first vehicle and the formation vehicles behind it form a new fleet, so that driving efficiency is maintained.
Step S405: sending notification information to the second vehicle, where the notification information is used to disband the vehicle formation of the second vehicle and the first vehicle.
Specifically, fig. 4d is an application scenario diagram of disbanding a vehicle fleet provided in an embodiment of the present application. As shown in fig. 4d, when there are many vehicles in the lane and the formation contains only two vehicles, namely the second vehicle and the first vehicle, the fleet can also be disbanded directly so that each vehicle can adjust its own driving state.
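For illustration only, the following sketch maps the three handling options of steps S403 to S405 onto a single hypothetical selection routine; the traffic-density flag, the count of formation vehicles behind the first vehicle, and the branching logic are assumptions, since the embodiment only enumerates the options.

```python
from enum import Enum, auto

class Notification(Enum):
    DECELERATE_AND_WAIT = auto()  # step S403
    SET_FIRST_AS_PILOT = auto()   # step S404
    DISBAND_FORMATION = auto()    # step S405

def choose_notification(traffic_is_heavy: bool, followers_behind_first: int) -> Notification:
    """Hypothetical policy for picking one of the three notifications."""
    if not traffic_is_heavy:
        # Light traffic: decelerating the pilot lets the cut-in vehicle leave
        # and the first vehicle close back to the normal following distance.
        return Notification.DECELERATE_AND_WAIT
    if followers_behind_first > 0:
        # Heavy traffic but formation vehicles remain behind the first vehicle:
        # split the fleet and make the first vehicle the new pilot.
        return Notification.SET_FIRST_AS_PILOT
    # Heavy traffic and only the pilot and the first vehicle remain: disband.
    return Notification.DISBAND_FORMATION
```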
According to the vehicle control method provided by this embodiment of the application, when the real-time position of the target feature of the second vehicle is obtained, the first license plate of the vehicle ahead is identified. If the identified license plate differs from the pre-recorded second license plate of the second vehicle, the vehicle to which the first license plate belongs is determined to be a non-formation vehicle, and corresponding notification information is sent to the second vehicle according to the road conditions and the formation situation so that a corresponding handling strategy is adopted. By handling in time the situation in which a non-formation vehicle cuts into the fleet, the method avoids generating the virtual guide line from a non-formation vehicle ahead of the first vehicle, which could not guarantee the driving safety of the first vehicle. This maximizes the effectiveness of lateral position control of the following vehicle under different lane conditions, and effectively ensures the safety of the following vehicle when lane line detection is abnormal.
Fig. 5 is a schematic structural diagram of a vehicle control device according to an embodiment of the present application. As shown in fig. 5, the vehicle control device 500 includes: an acquisition module 510, a generation module 520, and an indication module 530. Wherein:
the acquisition module 510 is configured to obtain, in response to lane line abnormality information, a position of a target feature, where the lane line abnormality information is used to indicate that a first vehicle cannot obtain lane line information, and the target feature includes a vehicle body feature of a second vehicle;
a generating module 520, configured to generate a virtual guide line for guiding the first vehicle to travel according to the position of the target feature;
and an indicating module 530, configured to control the first vehicle to follow the second vehicle according to the virtual guide line.
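For illustration only, the sketch below shows one possible way to wire modules 510, 520 and 530 together in Python; the class and method names are assumptions and do not correspond to any interface defined in the embodiment.

```python
class VehicleControlDevice:
    """Sketch of device 500 combining the acquisition (510), generation (520) and indication (530) modules."""

    def __init__(self, acquisition_module, generation_module, indication_module):
        self.acquisition_module = acquisition_module  # module 510
        self.generation_module = generation_module    # module 520
        self.indication_module = indication_module    # module 530

    def on_lane_line_abnormality(self):
        # 510: obtain the position of the target feature of the second vehicle.
        position = self.acquisition_module.get_target_feature_position()
        # 520: generate the virtual guide line from that position.
        guide_line = self.generation_module.generate_virtual_guide_line(position)
        # 530: control the first vehicle to follow the second vehicle along the line.
        self.indication_module.follow_guide_line(guide_line)
```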
Optionally, the generating module 520 is specifically configured to generate a virtual lane line for guiding the first vehicle to follow the second vehicle according to the position of the target feature and a preset width; and/or to generate a virtual center line for guiding the first vehicle to follow the second vehicle based on the real-time position of the target feature.
Optionally, the generating module 520 is specifically configured to obtain a center position of the tail of the second vehicle according to the position of the target feature, and to generate, according to the center position of the tail of the second vehicle and the preset width, virtual lane lines on two sides of the lane for guiding the first vehicle to follow the second vehicle; the two virtual lane lines are centered on the center position of the tail of the second vehicle, and the distance between the virtual lane lines on the two sides is the preset width.
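For illustration only, a minimal sketch of this variant, assuming a vehicle coordinate frame with x pointing forward and y pointing left; the 3.5 m preset width is an example value, not one specified by the embodiment.

```python
def lane_lines_from_tail_center(tail_center, preset_width):
    """Return one point on each virtual lane line, centred on the tail centre of the second vehicle."""
    x, y = tail_center
    half = preset_width / 2.0
    left_point = (x, y + half)    # point on the left virtual lane line
    right_point = (x, y - half)   # point on the right virtual lane line
    return left_point, right_point

left_pt, right_pt = lane_lines_from_tail_center((20.0, 0.2), 3.5)
# -> left_pt = (20.0, 1.95), right_pt = (20.0, -1.55); their spacing equals the preset width
```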
Optionally, the generating module 520 is specifically configured to obtain width information of the second vehicle according to the position of the target feature, where the width information includes positions of two sides of the second vehicle, and to generate, according to the positions of the two sides of the second vehicle and a preset length value, virtual lane lines on two sides of the lane for guiding the first vehicle to follow the second vehicle, where the virtual lane line on one side is obtained by extending outwards by the preset length value from one side of the second vehicle, and the virtual lane line on the other side is obtained by extending outwards by the preset length value from the other side of the second vehicle.
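For illustration only, a minimal sketch of this second variant, assuming lateral coordinates in metres with positive values to the left; the 0.5 m preset length value is an example, not a value specified by the embodiment.

```python
def lane_lines_from_vehicle_sides(left_side_y, right_side_y, preset_length):
    """Extend each virtual lane line outward from the corresponding side of the second vehicle."""
    left_line_y = left_side_y + preset_length    # extend outward from the left side
    right_line_y = right_side_y - preset_length  # extend outward from the right side
    return left_line_y, right_line_y

left_y, right_y = lane_lines_from_vehicle_sides(1.25, -1.25, 0.5)
# -> left_y = 1.75, right_y = -1.75
```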
Optionally, the generating module 520 is specifically configured to obtain a center position of the second vehicle according to the position of the target feature, and to generate a virtual center line for guiding the first vehicle to follow the second vehicle according to the center position of the second vehicle.
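For illustration only, the sketch below fits a virtual center line through successive center positions of the second vehicle; the polynomial fit and the use of NumPy are assumptions, since the embodiment only requires that the line be generated from the center position.

```python
import numpy as np

def fit_virtual_center_line(center_positions, degree=2):
    """Fit y = f(x) through successive centre positions of the second vehicle."""
    xs = np.array([p[0] for p in center_positions])
    ys = np.array([p[1] for p in center_positions])
    deg = min(degree, len(xs) - 1)            # avoid an under-determined fit
    return np.poly1d(np.polyfit(xs, ys, deg))

center_line = fit_virtual_center_line([(5.0, 0.10), (15.0, 0.15), (25.0, 0.20)])
lateral_offset_at_10m = center_line(10.0)     # lateral target for the first vehicle
```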
Optionally, the acquisition module 510 is specifically configured to acquire at least one of the following: the width of the tail of the second vehicle; the contour width of the second vehicle; or the width between the two rear lamps of the second vehicle.
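For illustration only, a hypothetical fallback order for choosing among the three width measurements; the dictionary keys are assumptions, since the embodiment only states that at least one of the measurements is acquired.

```python
from typing import Optional

def pick_second_vehicle_width(measurements: dict) -> Optional[float]:
    """Return the first available, non-zero width measurement of the second vehicle."""
    for key in ("tail_width", "contour_width", "rear_lamp_spacing"):
        value = measurements.get(key)
        if value:
            return float(value)
    return None

width = pick_second_vehicle_width({"contour_width": 2.5})  # -> 2.5
```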
Optionally, the indicating module 530 is further configured to, when the first vehicle and the second vehicle belong to the same platoon, the first vehicle being a following vehicle and the second vehicle being a pilot vehicle, identify a first license plate of the vehicle in front of the first vehicle and, if the identified license plate differs from a pre-recorded second license plate of the second vehicle, determine that the vehicle to which the first license plate belongs is a non-formation vehicle; and to send notification information to the second vehicle, the notification information indicating any one of the following: instructing the second vehicle to decelerate to wait for the first vehicle; indicating that the first vehicle is set as a pilot vehicle; or disbanding the vehicle formation of the second vehicle and the first vehicle.
In this embodiment, through the cooperation of the above modules, the vehicle control device can effectively control the lateral position of a formation vehicle when lane line detection is abnormal, thereby ensuring the safety of the autonomous vehicle.
Fig. 6 is a schematic structural diagram of a control device according to an embodiment of the present application, and as shown in fig. 6, the control device 600 includes: a memory 610 and a processor 620.
The memory 610 stores a computer program executable by the at least one processor 620. The computer program is executed by the at least one processor 620 to cause the control apparatus to implement the vehicle control method provided in any of the above embodiments.
Wherein the memory 610 and the processor 620 may be connected by a bus 630.
For related descriptions, reference may be made to the corresponding descriptions and effects of the method embodiments, which are not repeated herein.
An embodiment of the present application provides a computer-readable storage medium having a computer program stored thereon, the computer program being executed by a processor to implement the vehicle control method of any of the embodiments described above.
The computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
An embodiment of the present application provides a computer program product comprising computer-executable instructions which, when executed by a processor, implement the vehicle control method of any of the embodiments corresponding to fig. 2 to 3.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of modules is merely a division of logical functions, and an actual implementation may have another division, for example, a plurality of modules or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or modules, and may be in an electrical, mechanical or other form.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope of the disclosure being indicated by the following claims.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof.

Claims (10)

1. A vehicle control method, characterized by comprising:
in response to lane line abnormality information, acquiring a position of a target feature, wherein the lane line abnormality information is used for indicating that a first vehicle cannot acquire lane line information, and the target feature comprises a vehicle body feature of a second vehicle;
generating a virtual guide line for guiding the first vehicle to travel according to the position of the target feature;
and controlling the first vehicle to follow the second vehicle according to the virtual guide line.
2. The vehicle control method according to claim 1, wherein the generating a virtual guide line for guiding the first vehicle to travel according to the position of the target feature includes:
generating a virtual lane line for guiding the first vehicle to follow the second vehicle according to the position of the target feature and a preset width;
and/or generating a virtual center line for guiding the first vehicle to follow the second vehicle based on the position of the target feature.
3. The vehicle control method according to claim 2, wherein the generating a virtual lane line for guiding the first vehicle to travel following the second vehicle, based on the position of the target feature and a preset width, includes:
acquiring the central position of the tail of the second vehicle according to the position of the target feature;
generating, according to the central position of the tail of the second vehicle and a preset width, virtual lane lines on two sides of a lane for guiding the first vehicle to follow the second vehicle; wherein the center between the virtual lane lines on the two sides is the central position of the tail of the second vehicle, and the distance between the virtual lane lines on the two sides is the preset width.
4. The vehicle control method according to claim 2, wherein the generating a virtual lane line for guiding the first vehicle to travel following the second vehicle, based on the position of the target feature and a preset width, includes:
obtaining, according to the position of the target feature, width information of the second vehicle, wherein the width information comprises positions of two sides of the second vehicle;
and generating, according to the positions of the two sides of the second vehicle and a preset length value, virtual lane lines on two sides of a lane for guiding the first vehicle to follow the second vehicle, wherein the virtual lane line on one side is obtained by extending outwards by the preset length value from one side of the second vehicle, and the virtual lane line on the other side is obtained by extending outwards by the preset length value from the other side of the second vehicle.
5. The vehicle control method according to claim 2, wherein the generating a virtual center line for guiding the first vehicle to travel following the second vehicle based on the position of the target feature includes:
acquiring the central position of the second vehicle according to the position of the target feature;
generating a virtual center line for guiding the first vehicle to follow the second vehicle according to the center position of the second vehicle.
6. The vehicle control method according to claim 1, wherein the acquiring of the position of the target feature includes at least one of:
acquiring the tail width of the second vehicle; or
acquiring the contour width of the second vehicle; or
acquiring a width between two rear lamps of the second vehicle.
7. The vehicle control method according to any one of claims 1 to 6, wherein when the first vehicle and the second vehicle belong to a same platoon running in formation, and the first vehicle is a follower vehicle and the second vehicle is a pilot vehicle, the method further comprises:
identifying a first license plate of a vehicle in front of the first vehicle, and if the identified license plate is different from a pre-recorded second license plate of the second vehicle, determining that the vehicle to which the first license plate belongs is a non-formation vehicle;
transmitting notification information to the second vehicle, the notification information indicating any one of:
the notification information is used for indicating the second vehicle to decelerate to wait for the first vehicle; or
the notification information is used for indicating that the first vehicle is set as a pilot vehicle; or
the notification information is used for disbanding the vehicle formation of the second vehicle and the first vehicle.
8. A vehicle control apparatus characterized by comprising:
the acquisition module is used for obtaining, in response to lane line abnormality information, the position of a target feature, wherein the lane line abnormality information is used for indicating that a first vehicle cannot acquire lane line information, and the target feature comprises a vehicle body feature of a second vehicle;
a generating module, configured to generate a virtual guide line for guiding the first vehicle to travel according to the position of the target feature;
and the indicating module is used for controlling the first vehicle to follow the second vehicle according to the virtual guide line.
9. A control apparatus, characterized by comprising:
at least one processor;
and a memory communicatively coupled to the at least one processor;
wherein the memory stores instructions executable by the at least one processor to cause the control apparatus to perform the vehicle control method of any one of claims 1 to 7.
10. A computer-readable storage medium having stored therein computer-executable instructions for implementing the vehicle control method of any one of claims 1 to 7 when executed by a processor.
CN202210436406.9A 2022-04-25 2022-04-25 Vehicle control method, device, equipment and storage medium Pending CN114537393A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210436406.9A CN114537393A (en) 2022-04-25 2022-04-25 Vehicle control method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210436406.9A CN114537393A (en) 2022-04-25 2022-04-25 Vehicle control method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114537393A true CN114537393A (en) 2022-05-27

Family

ID=81667252

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210436406.9A Pending CN114537393A (en) 2022-04-25 2022-04-25 Vehicle control method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114537393A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180129854A1 (en) * 2016-11-09 2018-05-10 Samsung Electronics Co., Ltd. Method and apparatus for generating virtual driving lane for traveling vehicle
CN111267862A (en) * 2020-01-13 2020-06-12 清华大学 Method and system for constructing virtual lane line depending on following target
WO2021259000A1 (en) * 2020-06-24 2021-12-30 中国第一汽车股份有限公司 Method and apparatus for controlling vehicle following, vehicle, and storage medium
CN111738207A (en) * 2020-07-13 2020-10-02 腾讯科技(深圳)有限公司 Lane line detection method and device, electronic device and readable storage medium
CN112477847A (en) * 2020-12-11 2021-03-12 清华大学苏州汽车研究院(吴江) Traffic jam auxiliary control method and system

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115188178A (en) * 2022-07-07 2022-10-14 广西智能驾驶研究中心有限公司 Vehicle formation method, device, equipment and storage medium
CN115188178B (en) * 2022-07-07 2023-12-22 广西智能驾驶研究中心有限公司 Vehicle formation method, device, equipment and storage medium
CN115123218A (en) * 2022-09-02 2022-09-30 小米汽车科技有限公司 Vehicle detection method and device and electronic equipment thereof

Similar Documents

Publication Publication Date Title
JP5074365B2 (en) Camera device
CN114537393A (en) Vehicle control method, device, equipment and storage medium
US9718473B2 (en) Travel control device and travel control method
CN110045736B (en) Bend obstacle avoiding method based on unmanned aerial vehicle
CN110979401B (en) Method and device for preventing collision of cooperative formation trains
EP3451311B1 (en) Method and device for parking assistance
US11338812B2 (en) Vehicle control device
JP5202741B2 (en) Branch entry judgment device
CN110203197B (en) Lane recognition and lane keeping method and terminal equipment
CN109035863B (en) Forced lane-changing driving method for vehicle
CN111267862B (en) Method and system for constructing virtual lane line depending on following target
US20230242119A1 (en) Method and Device for the Automated Driving Mode of a Vehicle, and Vehicle
CN113619578A (en) Vehicle anti-collision method, anti-collision system and computer readable storage medium
JP2019191882A (en) Platoon travel controller
JP2020069969A (en) Vehicle control system
CN114822083B (en) Intelligent vehicle formation auxiliary control system
CN107221181A (en) Method, vehicle termination, roadway segment and the commander's equipment of vehicle access car networking
CN113895462B (en) Method, device, computing equipment and storage medium for predicting lane change of vehicle
CN115775463A (en) Navigation method for automatically driving automobile
CN111231954B (en) Control method for automatic driving
CN116229758A (en) Lane changing guiding method, road side equipment, system and storage medium
CN111376904B (en) Automatic car following method and device
CN202641689U (en) Vehicle outside monitoring device and driving control device with vehicle outside monitoring device
CN114537392A (en) Vehicle control method, device, equipment and storage medium
CN114937372B (en) Vehicle positioning system, positioning method, vehicle and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination