CN112703506B - Lane line detection method and device

Lane line detection method and device

Info

Publication number
CN112703506B
Authority
CN
China
Prior art keywords
vehicle
lane
target
probability
surrounding vehicle
Prior art date
Legal status
Active
Application number
CN202080005027.3A
Other languages
Chinese (zh)
Other versions
CN112703506A
Inventor
徐建锋
郭姣
张晓洪
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Publication of CN112703506A publication Critical patent/CN112703506A/en
Application granted granted Critical
Publication of CN112703506B publication Critical patent/CN112703506B/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/50: Context or environment of the image
    • G06V 20/56: Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/588: Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/22: Matching criteria, e.g. proximity measures
    • G06F 18/24: Classification techniques
    • G06F 18/241: Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F 18/2415: Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • G06F 18/25: Fusion techniques
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/04: Architecture, e.g. interconnection topology
    • G06N 3/045: Combinations of networks
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/20: Image preprocessing
    • G06V 10/25: Determination of region of interest [ROI] or a volume of interest [VOI]

Abstract

A lane line detection method and a lane line detection apparatus relate to the field of automatic driving. Surrounding vehicles are screened according to their lane change probabilities, and the lane line of the lane in which the host vehicle is located is determined according to the motion trajectory of the surrounding vehicle with the highest association probability, so as to improve the accuracy of lane line detection and ensure vehicle safety. The method includes: determining the variation trend of the distance between a surrounding vehicle and the centerline of a virtual lane according to the surrounding vehicle's motion trajectory; analyzing this variation trend together with the surrounding vehicle's lane change probability at the previous moment to obtain its lane change probability at the current moment; determining the association probability between the surrounding vehicle and the virtual lane centerline at the current moment according to the surrounding vehicle's lane change probability at the current moment, its association probability with the virtual lane centerline at the previous moment, its position relative to the virtual lane centerline at the current moment, and a preset transfer coefficient; and determining the lane line of the lane in which the host vehicle is located according to the motion trajectory of a target vehicle and a preset width.

Description

Lane line detection method and device
Technical Field
The application relates to the field of automatic driving, in particular to a lane line detection method and device.
Background
A lane line is a road element that guides the path planning of an autonomous vehicle, helping to ensure safety, comfort, and intelligent behavior during automatic driving. Typically, an autonomous vehicle detects lane lines on the road in real time using vision, and then performs autonomous path planning according to the detected lane lines.
In the prior art, the real-time positions of vehicles around the host vehicle (i.e., surrounding vehicles) are usually fitted and interpolated to determine their motion trajectories. A surrounding vehicle's trajectory is then translated by the lateral distance between that vehicle and the host vehicle, the translated trajectories are averaged to obtain a lane line, the extent of the lane line is further bounded using obstacle information around the host vehicle, and the lane range of the host vehicle's lane, i.e., the lane line of the lane in which the host vehicle is located, is output. Alternatively, after a surrounding vehicle's motion trajectory is determined by fitting and interpolating its real-time position, that trajectory is taken as the centerline of the lane in which the surrounding vehicle is located; this centerline is translated and averaged using the host vehicle's motion trajectory and a preset lane width to obtain the centerline of the host vehicle's lane, and the lane range of the host vehicle's lane is finally determined by translating this centerline by the preset lane width. The lane lines detected by these two prior-art approaches may deviate from the actual lane, affecting vehicle safety. Moreover, neither approach can distinguish whether surrounding vehicles are changing lanes, which may compromise safety when the vehicle passes through an intersection; they are therefore unsuitable for more complicated intersection scenarios, such as two lanes at the exit and three lanes at the entrance. In another prior art, taking a Tesla autonomous vehicle as an example, multiple kinds of observation data acquired by sensors are fused to obtain the lane line of the host vehicle's lane. Because too many kinds of observation data are used to determine the lane line, the lane line may be unstable, which affects the safety of the vehicle when passing through an intersection.
Disclosure of Invention
The present application provides a lane line detection method and apparatus, which identify the lane change intention of a surrounding vehicle according to its lane change probability so as to screen surrounding vehicles, and then, according to the association probability between each surrounding vehicle and the centerline of the host vehicle's virtual lane, determine the lane line of the lane in which the host vehicle is located from the motion trajectory of an appropriate surrounding vehicle and a preset width, thereby improving the accuracy of detecting the lane line of the host vehicle's lane and ensuring vehicle safety.
To achieve the above objective, the following technical solutions are provided:
In a first aspect, the present application provides a lane line detection method for use in the field of automatic driving, including: determining the variation trend of the distance between a surrounding vehicle and the centerline of a virtual lane according to the surrounding vehicle's motion trajectory, where a surrounding vehicle is a vehicle whose distance from the host vehicle is less than a preset distance. Then, the variation trend of the distance between the surrounding vehicle and the virtual lane centerline and the surrounding vehicle's lane change probability at the previous moment are analyzed to obtain its lane change probability at the current moment, where the lane change probability is the probability that a vehicle changes the lane in which it is located. Next, the association probability between the surrounding vehicle and the virtual lane centerline at the current moment is determined according to the surrounding vehicle's lane change probability at the current moment, its association probability with the virtual lane centerline at the previous moment, its position relative to the virtual lane centerline at the current moment, and a preset transfer coefficient. The preset transfer coefficient is a conversion coefficient used when converting the association probability between the surrounding vehicle and the virtual lane centerline at the previous moment into that at the current moment, and the association probability characterizes the distance between the surrounding vehicle and the virtual lane centerline and the similarity between the surrounding vehicle's motion trajectory and the virtual lane centerline. Finally, the lane line of the lane in which the host vehicle is located is determined according to the motion trajectory of a target vehicle and a preset width, where the target vehicle is the surrounding vehicle with the highest association probability with the virtual lane.
In this process, the method first determines the variation trend of the distance between a surrounding vehicle and the virtual lane centerline according to the surrounding vehicle's motion trajectory, and then determines the surrounding vehicle's lane change probability at the current moment from this variation trend and its lane change probability at the previous moment, thereby identifying the surrounding vehicle's lane change intention at the current moment. Second, the method determines the association probability between the surrounding vehicle and the virtual lane centerline at the current moment according to the surrounding vehicle's lane change probability at the current moment, its association probability with the virtual lane centerline at the previous moment, its position relative to the virtual lane centerline at the current moment, and a preset transfer coefficient. Finally, the lane line of the lane in which the host vehicle is located is determined according to the motion trajectory of the surrounding vehicle with the highest association probability at the current moment and a preset width, which improves the accuracy of detecting the lane line of the host vehicle's lane and ensures vehicle safety.
In one possible implementation manner, the virtual lane is parallel to the line from the tail to the head of the host vehicle (i.e., aligned with the vehicle's heading), the width of the virtual lane is the preset width, and the host vehicle is located on the centerline of the virtual lane.
In one possible implementation manner, the motion trajectory of a surrounding vehicle is a polyline connecting a plurality of position points, where the longitudinal distance between adjacent position points equals a preset distance interval, and the plurality of position points include the position point corresponding to each surrounding vehicle's current position.
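As an illustration of how such a trajectory polyline can be built, the following is a minimal Python sketch that resamples raw track points at a fixed longitudinal spacing. The linear interpolation, the (x, y) coordinate convention, and the 2.0 m default interval are assumptions made for illustration, not values specified by this application.

```python
def resample_by_longitudinal_distance(points, interval=2.0):
    """points: list of (x, y) tuples with x the longitudinal coordinate,
    in ascending order. Returns a polyline whose points are spaced every
    `interval` meters in the longitudinal direction."""
    if len(points) < 2:
        return list(points)
    resampled = [points[0]]
    x_next = points[0][0] + interval
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if x1 <= x0:                # skip non-increasing segments
            continue
        while x_next <= x1:         # linearly interpolate within this segment
            t = (x_next - x0) / (x1 - x0)
            resampled.append((x_next, y0 + t * (y1 - y0)))
            x_next += interval
    return resampled
```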
In one possible implementation manner, analyzing the variation trend of the distance between the surrounding vehicle and the virtual lane centerline and the surrounding vehicle's lane change probability at the previous moment to obtain its lane change probability at the current moment includes: analyzing the variation trend of the distance between the surrounding vehicle and the virtual lane centerline to obtain a first observation probability of the surrounding vehicle at the current moment, where the first observation probability is a predicted value of the vehicle's lane change probability; and then performing a weighted summation of the surrounding vehicle's first observation probability at the current moment and its lane change probability at the previous moment to determine its lane change probability at the current moment.
In this process, the variation trend of the distance between the surrounding vehicle and the virtual lane centerline is analyzed to obtain a predicted value of the surrounding vehicle's lane change probability at the current moment, i.e., its first observation probability at the current moment; this first observation probability and the lane change probability at the previous moment are then summed with weights to obtain the lane change probability at the current moment. Because the lane change probability at the previous moment is considered in addition to the first observation probability at the current moment, the resulting lane change probability at the current moment is more accurate, and the surrounding vehicle's lane change intention is identified more accurately.
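The weighted-summation update described above can be sketched as follows. The logistic mapping from the distance trend to the first observation probability and the weight value w_obs are illustrative assumptions, not reference values from this application.

```python
import math

def first_observation_prob(distance_trend):
    """distance_trend: signed rate of change (m/s) of the lateral distance
    between the surrounding vehicle and the virtual lane centerline; a
    faster sustained drift maps to a higher predicted lane-change
    probability (assumed logistic mapping)."""
    return 1.0 / (1.0 + math.exp(-distance_trend))  # squash into (0, 1)

def update_lane_change_prob(distance_trend, prev_prob, w_obs=0.6):
    """Weighted summation of the current observation and the lane-change
    probability at the previous moment."""
    obs = first_observation_prob(distance_trend)
    return w_obs * obs + (1.0 - w_obs) * prev_prob
```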
In one possible implementation manner, the lane change probability of a surrounding vehicle before it enters a preset area is a preset lane change probability, where the preset area is an intersection.
In one possible implementation manner, determining the association probability between the surrounding vehicle and the virtual lane centerline at the current moment according to the surrounding vehicle's lane change probability at the current moment, its association probability with the virtual lane centerline at the previous moment, its position relative to the virtual lane centerline at the current moment, and a preset transfer coefficient includes: first determining a first target vehicle according to a preset lane change probability threshold and the surrounding vehicle's lane change probability at the current moment; then determining, according to the first target vehicle's position relative to the virtual lane centerline at the current moment, the lateral distance between the first target vehicle and the virtual lane centerline and the angle between the first target vehicle's heading and the virtual lane centerline; then performing a weighted summation of this angle and the lateral distance to determine a second observation probability of the first target vehicle at the current moment, i.e., a predicted value of the association probability between the vehicle and the virtual lane centerline; and finally normalizing the first target vehicle's second observation probability at the current moment, the preset transfer coefficient, and the association probability between the first target vehicle and the virtual lane centerline at the previous moment to determine the association probability between the first target vehicle and the virtual lane centerline at the current moment.
In this process, the surrounding vehicle's lane change intention at the current moment can be identified according to the preset lane change probability threshold and its lane change probability at the current moment, and surrounding vehicles are screened accordingly to determine the first target vehicle, which improves the accuracy of the association probability determined between the surrounding vehicle and the virtual lane centerline at the current moment. In addition, when determining the association probability between the first target vehicle and the virtual lane centerline at the current moment, the association probability at the previous moment is also considered, which further improves the accuracy of the current association probability, improves the accuracy of determining the lane line of the host vehicle's lane, and ensures vehicle safety.
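A minimal sketch of this association update follows. The exponential score mappings, the equal weights, the scale constants, and the use of a single scalar transfer coefficient are illustrative assumptions rather than the reference formulation of this application.

```python
import math

def second_observation_prob(lateral_dist, heading_angle,
                            w_dist=0.5, w_angle=0.5,
                            dist_scale=1.75, angle_scale=math.radians(15)):
    """Weighted summation of a distance score and a heading-angle score;
    smaller deviations from the virtual lane centerline yield a higher
    predicted association probability."""
    s_dist = math.exp(-abs(lateral_dist) / dist_scale)
    s_angle = math.exp(-abs(heading_angle) / angle_scale)
    return w_dist * s_dist + w_angle * s_angle

def update_association_probs(obs, prev, transfer_coef=0.9):
    """obs / prev: {vehicle_id: probability}. Combines each first target
    vehicle's observation with its transferred prior, then normalizes so
    the association probabilities over all candidates sum to 1."""
    unnorm = {vid: obs[vid] * transfer_coef * prev[vid] for vid in obs}
    total = sum(unnorm.values())
    return {vid: p / total for vid, p in unnorm.items()} if total else dict(prev)
```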
In one possible implementation manner, if there are multiple surrounding vehicles, determining the first target vehicle according to the preset lane change probability threshold and the surrounding vehicles' lane change probabilities at the current moment includes: if the lane change probabilities of all the surrounding vehicles at the current moment are greater than or equal to the preset lane change probability threshold, determining that all the surrounding vehicles are first target vehicles; and if, among the surrounding vehicles, some have a lane change probability at the current moment greater than or equal to the preset lane change probability threshold while others fall below it, removing the surrounding vehicles whose lane change probability is greater than or equal to the threshold and determining the surrounding vehicles whose lane change probability is below the threshold as the first target vehicles.
In this process, if the lane change probabilities of all surrounding vehicles at the current moment are greater than or equal to the preset lane change probability threshold, i.e., all surrounding vehicles intend to change lanes, it can be concluded that in the current scenario every vehicle must change lanes to pass, and all surrounding vehicles are determined as first target vehicles. If some surrounding vehicles have a lane change probability at the current moment greater than or equal to the threshold while others fall below it, it can be concluded that in the current scenario a vehicle may either change lanes or not; the vehicles with lane change intention are then removed and the first target vehicles are determined, which reduces the number of vehicles whose association probability must be computed and improves the efficiency of determining the lane line of the host vehicle's lane.
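The screening rule just described can be sketched as below; the 0.7 threshold is an assumed placeholder, not a value from this application.

```python
def select_first_target_vehicles(lane_change_probs, threshold=0.7):
    """lane_change_probs: {vehicle_id: lane-change probability at the
    current moment}. Returns the ids of the first target vehicles."""
    staying = [vid for vid, p in lane_change_probs.items() if p < threshold]
    # If every surrounding vehicle meets or exceeds the threshold, all of
    # them must change lanes to pass, so all remain candidates; otherwise
    # drop the lane-changing vehicles and keep those staying in their lane.
    return list(lane_change_probs) if not staying else staying
```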
In one possible implementation, the method further includes: first determining that the lane in which the host vehicle is located after leaving a preset area is a target lane, where the preset area is an intersection; then determining a target point of the host vehicle on the target lane according to the host vehicle's speed and heading at a first position point, where the first position point is the position point corresponding to the host vehicle's position before it leaves the preset area. If the target point on the target lane lies on the centerline of the target lane, the line connecting the target point and the first position point is determined as the centerline of the lane in which the host vehicle is located before leaving the preset area; if the target point does not lie on the centerline of the target lane, the line connecting the foot of the perpendicular from the target point onto the target lane's centerline and the first position point is determined as the centerline of the lane in which the host vehicle is located before leaving the preset area.
In this process, the centerline of the lane in which the host vehicle is located before it leaves the preset area, i.e., the intersection, can be determined according to the centerline of the target lane and the host vehicle's speed and heading at the first position point. A smoother lane line for the host vehicle's lane before leaving the preset area can then be obtained from this centerline and the preset width, which reduces the host vehicle's lateral oscillation when leaving the preset area and improves driving safety.
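To make the projection step concrete, here is a minimal 2-D sketch. It assumes points are (x, y) tuples in a common planar frame and that the target lane's centerline is given by two distinct points; these representations and the tolerance are illustrative assumptions.

```python
import math

def perpendicular_foot(p, a, b):
    """Foot of the perpendicular from point p onto the line through a and b
    (a and b must be distinct)."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    abx, aby = bx - ax, by - ay
    t = ((px - ax) * abx + (py - ay) * aby) / (abx * abx + aby * aby)
    return (ax + t * abx, ay + t * aby)

def centerline_before_exit(first_point, target_point, line_a, line_b, tol=1e-6):
    """Segment taken as the centerline of the host vehicle's lane before it
    leaves the intersection: first_point -> target_point when the target
    point already lies on the target lane's centerline (through line_a and
    line_b), otherwise first_point -> the target point's perpendicular foot."""
    foot = perpendicular_foot(target_point, line_a, line_b)
    on_centerline = math.hypot(foot[0] - target_point[0],
                               foot[1] - target_point[1]) < tol
    return (first_point, target_point if on_centerline else foot)
```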
In a second aspect, the present application provides a lane line detection apparatus for use in the field of automatic driving, the apparatus comprising a determining unit and an analyzing unit. The determining unit is configured to determine the variation trend of the distance between a surrounding vehicle and the centerline of a virtual lane according to the surrounding vehicle's motion trajectory, where a surrounding vehicle is a vehicle whose distance from the host vehicle is less than a preset distance. The analyzing unit is configured to analyze the variation trend of the distance between the surrounding vehicle and the virtual lane centerline and the surrounding vehicle's lane change probability at the previous moment to obtain its lane change probability at the current moment, where the lane change probability is the probability that a vehicle changes the lane in which it is located. The determining unit is further configured to determine the association probability between the surrounding vehicle and the virtual lane centerline at the current moment according to the surrounding vehicle's lane change probability at the current moment, its association probability with the virtual lane centerline at the previous moment, its position relative to the virtual lane centerline at the current moment, and a preset transfer coefficient. The preset transfer coefficient is a conversion coefficient used when converting the association probability between the surrounding vehicle and the virtual lane centerline at the previous moment into that at the current moment, and the association probability characterizes the distance between the surrounding vehicle and the virtual lane centerline and the similarity between the surrounding vehicle's motion trajectory and the virtual lane centerline. The determining unit is further configured to determine the lane line of the lane in which the host vehicle is located according to the motion trajectory of a target vehicle and a preset width, where the target vehicle is the surrounding vehicle with the highest association probability with the virtual lane centerline at the current moment.
In one possible implementation manner, the virtual lane is parallel to the line from the tail to the head of the host vehicle, the width of the virtual lane is the preset width, and the host vehicle is located on the centerline of the virtual lane.
In one possible implementation manner, the motion trajectory of a surrounding vehicle is a polyline connecting a plurality of position points, where the longitudinal distance between adjacent position points equals a preset distance interval, and the plurality of position points include the position point corresponding to each surrounding vehicle's current position.
In one possible implementation manner, the analyzing unit is configured to analyze the variation trend of the distance between the surrounding vehicle and the virtual lane centerline to obtain a first observation probability of the surrounding vehicle at the current moment, i.e., a predicted value of the vehicle's lane change probability. The analyzing unit is further configured to perform a weighted summation of the surrounding vehicle's first observation probability at the current moment and its lane change probability at the previous moment to determine its lane change probability at the current moment.
In one possible implementation manner, the lane change probability of a surrounding vehicle before it enters a preset area is a preset lane change probability, where the preset area is an intersection.
In one possible implementation manner, the determining unit is configured to determine a first target vehicle according to a preset lane change probability threshold and the surrounding vehicle's lane change probability at the current moment. The determining unit is further configured to determine, according to the first target vehicle's position relative to the virtual lane centerline at the current moment, the lateral distance between the first target vehicle and the virtual lane centerline and the angle between the first target vehicle's heading and the virtual lane centerline. The determining unit is further configured to perform a weighted summation of this angle and the lateral distance to determine a second observation probability of the first target vehicle at the current moment, i.e., a predicted value of the association probability between the vehicle and the virtual lane centerline. Finally, the determining unit is further configured to normalize the first target vehicle's second observation probability at the current moment, the preset transfer coefficient, and the association probability between the first target vehicle and the virtual lane centerline at the previous moment to determine the association probability between the first target vehicle and the virtual lane centerline at the current moment.
In one possible implementation manner, if there are multiple surrounding vehicles, the determining unit is configured to determine the first target vehicle according to the preset lane change probability threshold and the surrounding vehicles' lane change probabilities at the current moment, including: if the lane change probabilities of all the surrounding vehicles at the current moment are greater than or equal to the preset lane change probability threshold, determining that all the surrounding vehicles are first target vehicles; and if, among the surrounding vehicles, some have a lane change probability at the current moment greater than or equal to the threshold while others fall below it, removing the surrounding vehicles whose lane change probability is greater than or equal to the threshold and determining the surrounding vehicles whose lane change probability is below the threshold as the first target vehicles.
In one possible implementation manner, the determining unit is further configured to determine that the lane in which the host vehicle is located after leaving a preset area is a target lane, where the preset area is an intersection. The determining unit is further configured to determine a target point of the host vehicle on the target lane according to the host vehicle's speed and heading at a first position point, where the first position point is the position point corresponding to the host vehicle's position before it leaves the preset area. The determining unit is further configured to, if the target point on the target lane lies on the centerline of the target lane, determine the line connecting the target point and the first position point as the centerline of the lane in which the host vehicle is located before leaving the preset area; and, if the target point does not lie on the centerline of the target lane, determine the line connecting the foot of the perpendicular from the target point onto the target lane's centerline and the first position point as that centerline.
In a third aspect, the present application provides a lane line detection apparatus, including a processor and a memory. The memory is configured to store computer program instructions, and the processor executes the computer program instructions to cause the lane line detection apparatus to perform the lane line detection method according to the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium comprising computer instructions which, when executed by a processor, cause a lane line detection apparatus to perform the lane line detection method according to the first aspect.
In a fifth aspect, the present application provides a computer program product, which, when run on a processor, causes a lane line detection apparatus to perform the lane line detection method according to the first aspect.
Drawings
FIG. 1 is a first schematic structural diagram of an autonomous vehicle according to an embodiment of the present application;
FIG. 2 is a second schematic structural diagram of an autonomous vehicle according to an embodiment of the present application;
FIG. 3 is a schematic structural diagram of a computer system according to an embodiment of the present application;
FIG. 4 is a schematic structural diagram of a chip system according to an embodiment of the present application;
FIG. 5 is a first schematic application diagram of a cloud-side command autonomous driving vehicle according to an embodiment of the present application;
FIG. 6 is a second schematic application diagram of a cloud-side command autonomous driving vehicle according to an embodiment of the present application;
FIG. 7 is a block diagram of a computer program product according to an embodiment of the present application;
FIG. 8 is a schematic flowchart of a lane line detection method according to an embodiment of the present application;
FIG. 9(a) is a schematic view of a virtual lane according to an embodiment of the present application;
FIG. 9(b) is a schematic view of a surrounding vehicle according to an embodiment of the present application;
FIG. 10 is a schematic diagram of a preset distance interval according to an embodiment of the present application;
FIG. 11 is a schematic view of a position of a surrounding vehicle;
FIG. 12 is a schematic diagram of a Markov probability model according to an embodiment of the present application;
FIG. 13 is a first schematic view of a motion trajectory of a host vehicle according to an embodiment of the present application;
FIG. 14 is a second schematic view of a motion trajectory of a host vehicle according to an embodiment of the present application;
FIG. 15 is a schematic diagram of a preset lane change probability and a preset association probability according to an embodiment of the present application;
FIG. 16(a) is a third schematic view of a motion trajectory of a host vehicle according to an embodiment of the present application;
FIG. 16(b) is a fourth schematic view of a motion trajectory of a host vehicle according to an embodiment of the present application;
FIG. 17 is a first schematic view of a lane line detection apparatus according to an embodiment of the present application;
FIG. 18 is a second schematic view of a lane line detection apparatus according to an embodiment of the present application.
Detailed Description
The embodiments of the present application provide a lane line detection method, which is applied to a vehicle, or to other devices (such as a cloud server or a mobile terminal) having a function of controlling a vehicle. The vehicle may be an autonomous vehicle having partial or full automatic driving functionality; that is, with reference to the classification standard of the Society of Automotive Engineers (SAE), the automatic driving level of the vehicle may be no automation (L0), driver assistance (L1), partial automation (L2), conditional automation (L3), high automation (L4), or full automation (L5). The vehicle or other device may implement the lane line detection method provided by the embodiments of the present application through its components (including hardware and software). First, a virtual lane is determined that is parallel to the host vehicle's head-to-tail line and whose width is a preset width, with the host vehicle located on the virtual lane's centerline. Then, the variation trend of the distance between a surrounding vehicle (a vehicle whose distance from the host vehicle is less than a preset distance) and the virtual lane centerline is determined according to the surrounding vehicle's motion trajectory, and this variation trend is analyzed together with the surrounding vehicle's lane change probability at the previous moment to determine its lane change probability at the current moment. Next, the association probability between the surrounding vehicle and the virtual lane centerline at the current moment is determined according to the surrounding vehicle's lane change probability at the current moment, its association probability with the virtual lane centerline at the previous moment, its position relative to the virtual lane centerline at the current moment, and a preset transfer coefficient, and the target vehicle is determined according to these association probabilities. Finally, the lane line of the lane in which the host vehicle is located is determined according to the motion trajectory of the target vehicle, i.e., the surrounding vehicle with the highest association probability with the virtual lane centerline at the current moment, and the preset width, so as to improve the accuracy of determining the lane line of the host vehicle's lane and ensure vehicle safety.
Fig. 1 is a schematic structural diagram of an autonomous vehicle according to an embodiment of the present application. In one embodiment, the vehicle 100 is configured in a fully or partially autonomous driving mode. For example, when the vehicle 100 is in the partial automatic driving mode (i.e., L2), or in L3, L4, or L5, the vehicle 100 may determine the lane change probability of a surrounding vehicle at the current moment and the association probability between the surrounding vehicle and the virtual lane centerline at the current moment according to the surrounding vehicle's motion trajectory, its lane change probability at the previous moment, its position relative to the virtual lane centerline at the current moment, its association probability with the virtual lane centerline at the previous moment, and a preset transfer coefficient, and further determine the lane line of the lane in which the host vehicle is located.
Vehicle 100 may include various subsystems such as a travel system 110, a sensor system 120, a control system 130, one or more peripherals 140, as well as a power supply 150, a computer system 160, and a user interface 170. Alternatively, vehicle 100 may include more or fewer subsystems, and each subsystem may include multiple elements. In addition, each of the sub-systems and elements of the vehicle 100 may be interconnected by wire or wirelessly.
The travel system 110 may include components that provide powered motion to the vehicle 100. In one embodiment, the travel system 110 may include an engine 111, a transmission 112, an energy source 113, and wheels 114. The engine 111 may be an internal combustion engine, an electric motor, an air compression engine, or other types of engine combinations, such as a hybrid engine consisting of a gasoline engine and an electric motor, a hybrid engine consisting of an internal combustion engine and an air compression engine. The engine 111 converts the energy source 113 into mechanical energy.
Examples of energy sources 113 include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electrical power. The energy source 113 may also provide energy to other systems of the vehicle 100.
The transmission 112 may transmit mechanical power from the engine 111 to the wheels 114. The transmission 112 may include a gearbox, a differential, and a drive shaft. In one embodiment, the transmission 112 may also include other devices, such as a clutch. Wherein the drive shaft may comprise one or more shafts that may be coupled to one or more wheels 114.
The sensor system 120 may include several sensors that sense information about the environment surrounding the vehicle 100. For example, the sensor system 120 may include a positioning system 121 (which may be a Global Positioning System (GPS), a BeiDou system, or another positioning system), an inertial measurement unit (IMU) 122, a radar 123, a lidar 124, and a camera 125. The sensor system 120 may also include sensors that monitor internal systems of the vehicle 100 (e.g., an in-vehicle air quality monitor, a fuel gauge, an oil temperature gauge, etc.). Sensor data collected by one or more of these sensors can be used to detect objects and their corresponding characteristics (position, shape, orientation, speed, etc.); such detection and identification is critical to the safe operation of the vehicle 100 in autonomous driving.
The positioning system 121 may be used to estimate the geographic location of the vehicle 100. The IMU 122 is used to sense position and orientation changes of the vehicle 100 based on inertial acceleration. In one embodiment, the IMU 122 may be a combination of an accelerometer and a gyroscope.
The radar 123 may utilize radio signals to sense objects within the surrounding environment of the vehicle 100. In some embodiments, in addition to sensing objects, radar 123 may also be used to sense the speed and/or heading of an object.
Lidar 124 may utilize a laser to sense objects in the environment in which vehicle 100 is located. In some embodiments, lidar 124 may include one or more laser sources, laser scanners, and one or more detectors, among other system components.
The camera 125 may be used to capture multiple images of the surroundings of the vehicle 100, as well as multiple images within the vehicle cabin. The camera 125 may be a still camera or a video camera.
The control system 130 may control the operation of the vehicle 100 and its components. Control system 130 may include various elements including a steering system 131, a throttle 132, a braking unit 133, a computer vision system 134, a route control system 135, and an obstacle avoidance system 136.
The steering system 131 is operable to adjust the heading of the vehicle 100. For example, in one embodiment, it may be a steering wheel system.
The throttle 132 is used to control the operating speed of the engine 111, and thus the speed of the vehicle 100.
The brake unit 133 is used to decelerate the vehicle 100. The brake unit 133 may use friction to slow the wheels 114. In other embodiments, the brake unit 133 may convert the kinetic energy of the wheels 114 into an electric current. The brake unit 133 may also take other forms to slow the rotational speed of the wheels 114 and thereby control the speed of the vehicle 100.
The computer vision system 134 may process and analyze the images captured by the camera 125 to identify objects and/or features in the environment surrounding the vehicle 100 as well as the limb and facial features of the driver in the vehicle cabin. The objects and/or features may include traffic signals, road conditions, and obstacles; the driver's limb and facial features include the driver's behavior, line of sight, expression, and the like. The computer vision system 134 may use object recognition algorithms, structure from motion (SFM) algorithms, video tracking, and other computer vision techniques. In some embodiments, the computer vision system 134 may also be used to map the environment, track objects, estimate the speed of objects, determine driver behavior, perform face recognition, and so forth.
The route control system 135 is used to determine a travel route of the vehicle 100. In some embodiments, route control system 135 may combine data from sensors, positioning system 121, and one or more predetermined maps to determine a travel route for vehicle 100.
Obstacle avoidance system 136 is used to identify, assess, and avoid or otherwise negotiate potential obstacles in the environment of vehicle 100.
Of course, in one example, the control system 130 may additionally or alternatively include components other than those shown and described, or some of the components shown above may be omitted.
Vehicle 100 interacts with external sensors, other vehicles, other computer systems, or users through peripherals 140. The peripheral devices 140 may include a wireless communication system 141, an in-vehicle computer 142, a microphone 143, and/or a speaker 144.
In some embodiments, the peripheral devices 140 provide a means for a user of the vehicle 100 to interact with the user interface 170. For example, the in-vehicle computer 142 may provide information to a user of the vehicle 100 and receive user input through the user interface 170. The in-vehicle computer 142 may be operated through a touch screen. In other cases, the peripheral devices 140 may provide a means for the vehicle 100 to communicate with other devices located within the vehicle. For example, the microphone 143 may receive audio (e.g., voice commands or other audio input) from a user of the vehicle 100. Similarly, the speaker 144 may output audio to a user of the vehicle 100.
The wireless communication system 141 may communicate wirelessly with one or more devices, either directly or via a communication network. For example, the wireless communication system 141 may use 3G cellular communication such as CDMA, EV-DO, or GSM/GPRS; 4G cellular communication such as LTE; or 5G cellular communication. The wireless communication system 141 may communicate with a wireless local area network (WLAN) using WiFi. In some embodiments, the wireless communication system 141 may communicate directly with devices using an infrared link, Bluetooth, or ZigBee, or may use other wireless protocols such as various vehicle communication systems; for example, the wireless communication system 141 may include one or more dedicated short range communications (DSRC) devices.
The power supply 150 may provide power to various components of the vehicle 100. In one embodiment, power source 150 may be a rechargeable lithium ion or lead acid battery. One or more battery packs of such batteries may be configured as a power source to provide power to various components of the vehicle 100. In some embodiments, the power source 150 and the energy source 113 may be implemented together, such as in some all-electric vehicles.
Some or all of the functions of vehicle 100 are controlled by computer system 160. The computer system 160 may include at least one processor 161, the processor 161 executing instructions 1621 stored in a non-transitory computer readable medium, such as the data storage device 162. The computer system 160 may also be a plurality of computing devices that control individual components or subsystems of the vehicle 100 in a distributed manner.
Processor 161 may be any conventional processor, such as a commercially available central processing unit (CPU). Alternatively, the processor may be a dedicated device such as an application-specific integrated circuit (ASIC) or another hardware-based processor. Although fig. 1 functionally illustrates the processor, memory, and other elements within the same physical housing, those skilled in the art will appreciate that the processor, computer system, or memory may actually comprise multiple processors, computer systems, or memories that may or may not be housed within the same physical enclosure. For example, the memory may be a hard drive or another storage medium located in a different physical enclosure. Thus, references to a processor or computer system are to be understood as including references to a collection of processors, computer systems, or memories that may or may not operate in parallel. Rather than using a single processor to perform the steps described herein, some components, such as the steering component and the deceleration component, may each have their own processor that performs only the computations related to that component's specific function.
In various aspects described herein, the processor may be located remotely from the vehicle and in wireless communication with the vehicle. In other aspects, some of the processes described herein are executed on a processor disposed within the vehicle and others are executed by a remote processor, including taking the steps necessary to perform a single maneuver.
In some embodiments, the data storage device 162 may include instructions 1621 (e.g., program logic), which instructions 1621 may be executed by the processor 161 to perform various functions of the vehicle 100, including those described above. The data storage device 162 may also contain additional instructions, including instructions to send data to, receive data from, interact with, and/or control one or more of the travel system 110, the sensor system 120, the control system 130, and the peripheral devices 140.
In addition to the instructions 1621, the data storage device 162 may also store data such as road maps, route information, and the vehicle's position, direction, speed, and other vehicle data, as well as other information. Such information may be used by the vehicle 100 and the computer system 160 during operation of the vehicle 100 in autonomous, semi-autonomous, and/or manual modes.
For example, in an embodiment of the present application, the data storage device 162 may obtain driver information and surrounding environment information from the sensor system 120 or other components of the vehicle 100. The environment information may be, for example, the road type of the road on which the vehicle is currently located, the current weather, or the current time; the driver information may be the driver's identity, driving experience, current physical condition, and so on. The data storage device 162 may also store state information of the vehicle itself, as well as state information of surrounding vehicles or devices whose distance from the vehicle is less than a preset distance. The state information includes, but is not limited to, the speed, acceleration, heading angle, position, and motion trajectory of a vehicle. For example, the vehicle obtains its own speed, the speeds of other vehicles, and the motion trajectories of other vehicles based on the speed and distance measurement functions of the radar 123 and the lidar 124. In this way, the processor 161 can process the information acquired from the data storage device 162 according to a preset algorithm to determine, among other things, the lane change probability of each other vehicle at the current moment and its association probability with the virtual lane centerline at the current moment, and determine the centerline of the lane in which the host vehicle is located according to the motion trajectory of the vehicle with the highest association probability. The lane line of the host vehicle's lane is then determined from this centerline and the preset width, an automatic driving strategy suitable for the host vehicle is determined according to the lane line, and the host vehicle is controlled to drive automatically so that it passes through the intersection smoothly.
The user interface 170 is used to provide information to, or receive information from, a user of the vehicle 100. Optionally, the user interface 170 may include an interface for interacting and exchanging information with the user through one or more input/output devices in the set of peripheral devices 140, which may be, for example, one or more of the wireless communication system 141, the in-vehicle computer 142, the microphone 143, and the speaker 144.
The computer system 160 may control the functions of the vehicle 100 based on inputs received from various subsystems (e.g., the travel system 110, the sensor system 120, and the control system 130) and from the user interface 170. For example, computer system 160 may utilize input from control system 130 to control steering system 131 to avoid obstacles detected by sensor system 120 and obstacle avoidance system 136. In some embodiments, the computer system 160 is operable to provide control over many aspects of the vehicle 100 and its subsystems.
Alternatively, one or more of these components described above may be mounted or associated separately from the vehicle 100. For example, the data storage device 162 may exist partially or completely separate from the vehicle 100. The above components may be communicatively coupled together in a wired and/or wireless manner.
Optionally, the above components are only an example; in practical applications, components in the above modules may be added or removed according to actual needs, and fig. 1 should not be construed as limiting the embodiments of the present application.
An autonomous vehicle traveling on a road, such as the vehicle 100 above, may identify its surrounding environment, its own position within that environment, and state information such as the speed and orientation of surrounding vehicles, in order to determine the surrounding vehicle most similar to the current state of the vehicle 100. In some examples, each vehicle in the surrounding environment whose distance from the vehicle 100 does not exceed the preset distance may be considered independently, and the most suitable following target of the vehicle 100 may be determined based on the centerline of the virtual lane of the vehicle 100 and each surrounding vehicle's lane change probability at the current moment, or its association probability with the centerline of the virtual lane of the vehicle 100 at the current moment, and so on, so as to determine the motion trajectory of the vehicle 100 over the coming period of time and the lane line of the lane in which the vehicle 100 is located, thereby supporting automatic driving.
Optionally, the autonomous vehicle 100, or a computing device associated with it (e.g., the computer system 160, the computer vision system 134, or the data storage device 162 of fig. 1), may determine the lane line of the lane in which the host vehicle is located and the host vehicle's motion trajectory for a period of time based on the states of vehicles in the surrounding environment (e.g., motion trajectory, position, speed, etc.) and the centerline of the host vehicle's virtual lane. Other factors may also be considered in this process, such as the motion trajectory of the vehicle 100 when it previously traversed this segment of road, and so on.
In addition to providing lane lines of the lane in which the host vehicle is located, the computer device may also provide instructions to adjust the speed and steering angle of the vehicle 100 to cause the autonomous vehicle to follow a given trajectory and/or maintain a safe lateral and longitudinal distance from objects in the vicinity of the autonomous vehicle (e.g., cars in adjacent lanes on the road).
The vehicle 100 may be a car, a truck, a motorcycle, a bus, a boat, an airplane, a helicopter, a lawn mower, a recreational vehicle, an amusement park vehicle, construction equipment, a tram, a golf cart, a train, a trolley, or the like; the embodiments of the present application are not particularly limited in this regard.
In other embodiments of the present application, the autonomous vehicle may further include a hardware structure and/or a software module, and the functions described above are implemented in the form of a hardware structure, a software module, or a hardware structure plus a software module. Whether any of the above-described functions is implemented as a hardware structure, a software module, or a hardware structure plus a software module depends upon the particular application and design constraints imposed on the technical solution.
Referring to fig. 2, for example, the following modules may be included in a vehicle:
the sensing module 201 is configured to acquire data such as video stream data and laser point cloud of a radar through a roadside sensor, a vehicle-mounted sensor, and the like, and process the data such as original video stream data and laser point cloud of the radar collected by the sensor to obtain information such as a position of an obstacle and a position of a lane line in a surrounding environment of the host vehicle (for example, positions of a vehicle, a pedestrian, a traffic sign, a traffic light, and positions of a sidewalk and a stop line). Among them, the roadside sensor and the vehicle-mounted sensor may be a laser radar, a millimeter wave radar, a vision sensor, or the like. The sensing module 201 is further configured to send the determined positions of obstacles and lane lines in the surrounding environment of the host vehicle to the fusion module 202, the positioning module 203, the road structure recognition module 204, and the prediction module 205 according to data collected by all or a certain type or a certain sensor.
The fusion module 202 is configured to acquire information such as positions of obstacles and positions of lane lines in the surrounding environment of the host vehicle from the sensing module 201, analyze the information, fuse the obstacles and the lane lines in the surrounding environment of the host vehicle, and determine information of the same obstacle (e.g., the same vehicle) or the same lane line (e.g., the same parking space boundary), so as to further determine information such as a type, a position, a size, a speed, and the like of each obstacle and a type, a position, a shape, and the like of each lane line. The fusion module 202 is further configured to analyze the information acquired from the sensing module 201, and determine information such as an area where the vehicle can pass through, a location where the vehicle cannot pass through, and the like. The fusion module 202 is further configured to send information such as the type, position, size, speed, and the like of each obstacle, the type, position, shape, and the like of each lane line, the area where the host vehicle can pass, and the area where the host vehicle cannot pass, to the positioning module 203, the road structure recognition module 204, and the prediction module 205.
The positioning module 203 is configured to position the vehicle according to the information received from the sensing module 201 and the fusion module 202 and the map information acquired from the map interface, and determine a location of the vehicle (for example, longitude and latitude of the vehicle). The positioning module 203 is further configured to send the location of the vehicle to the sensing module 201, the fusion module 202, the road structure recognition module 204, the prediction module 205, and the like.
The road structure recognition module 204 is configured to determine information such as the lane lines, the traffic flow (that is, other vehicles in the surrounding environment of the host vehicle), other obstacles around the host vehicle, and the position of the host vehicle according to the information acquired from the sensing module 201, the fusion module 202, and the positioning module 203, and to further derive a local road model of the environment surrounding the host vehicle's position and the confidence of that model. The road structure recognition module 204 is further configured to send the determined local road model and its confidence to the prediction module 205 and the planning control module 206.
The prediction module 205 is configured to infer, according to the information acquired from the sensing module 201, the fusion module 202, the positioning module 203, and the road structure recognition module 204, the intentions, motion trajectories, and the like of vehicles or pedestrians in the surrounding environment of the host vehicle. The prediction module 205 is further configured to send the predicted intentions or motion trajectories to the planning control module 206.
The planning control module 206 is configured to determine the road structure and the intentions and motion trajectories of obstacles (for example, vehicles in the surrounding environment of the host vehicle) according to the information obtained from the sensing module 201, the fusion module 202, the positioning module 203, the road structure recognition module 204, and the prediction module 205. The planning control module 206 is further configured to plan the motion trajectory and driving path of the host vehicle accordingly, generate a driving strategy suitable for the vehicle, output action instructions or control quantities corresponding to that strategy, and control the vehicle to drive automatically according to them.
Vehicle communication module 207 (not shown in fig. 2): used for information interaction between the host vehicle and other vehicles.
Storage component 208 (not shown in fig. 2): stores the executable code of the foregoing modules; running this executable code implements some or all of the method flows of the embodiments of this application.
In one possible implementation of the embodiments of the present application, as shown in fig. 3, the computer system 160 shown in fig. 1 includes a processor 301 coupled to a system bus 302; the processor 301 may be one or more processors, each of which may include one or more processor cores. A display adapter 303 may drive a display 324, which is coupled to the system bus 302. The system bus 302 is coupled via a bus bridge 304 to an input/output (I/O) bus 305, and an I/O interface 306 is coupled to the I/O bus 305. The I/O interface 306 communicates with various I/O devices, such as an input device 307 (e.g., keyboard, mouse, or touch screen), a media tray 308 (e.g., CD-ROM or multimedia interface), a transceiver 309 (which can send and/or receive radio communication signals), a camera 310 (which can capture still and motion digital video images), and an external Universal Serial Bus (USB) port 311. Optionally, the interface connected to the I/O interface 306 may be a USB interface.
The processor 301 may be any conventional processor, including a Reduced Instruction Set Computing (RISC) processor, a Complex Instruction Set Computing (CISC) processor, or a combination thereof. Alternatively, the processor 301 may also be a dedicated device such as an Application Specific Integrated Circuit (ASIC). Alternatively, the processor 301 may also be a neural network processor or a combination of a neural network processor and the conventional processor described above.
Alternatively, in various embodiments described herein, the computer system 160 may be located remotely from the autonomous vehicle and communicate wirelessly with the autonomous vehicle 100. In other aspects, some of the processes described herein may be provided for execution on a processor within an autonomous vehicle, and others may be executed by a remote processor, including taking the actions necessary to perform a single maneuver.
Computer system 160 may communicate with a software deploying server (deploying server) 313 via a network interface 312. Alternatively, the network interface 312 may be a hardware network interface, such as a network card. The network 314 may be an external network, such as the internet, or an internal network, such as an Ethernet or a virtual private network (VPN); optionally, the network 314 may also be a wireless network, such as a WiFi network or a cellular network.
Hard drive interface 315 is coupled to system bus 302. Hard drive interface 315 is coupled to hard drive 316. A system memory 317 is coupled to system bus 302. Data running in system memory 317 may include an Operating System (OS)318 and application programs 319 of computer system 160.
The Operating System (OS) 318 includes, but is not limited to, a shell 320 and a kernel 321. The shell 320 is an interface between the user and the kernel 321 of the operating system 318, and is the outermost layer of the operating system 318. The shell manages the interaction between the user and the operating system 318: it awaits user input, interprets the user input to the operating system 318, and processes the output results of the operating system 318.
The kernel 321 is the part of the operating system 318 that manages memory, files, peripherals, and system resources, interacting directly with the hardware. The kernel of the operating system 318 typically runs processes and provides inter-process communication, CPU time slice management, interrupt handling, memory management, I/O management, and the like.
The application programs 319 include autopilot-related programs 323 such as programs for managing the interaction of an autonomous vehicle with on-road obstacles, programs for controlling the travel route or speed of an autonomous vehicle, programs for controlling the interaction of an autonomous vehicle with other on-road vehicles/autonomous vehicles, and the like. Application 319 also resides on the system of the deploying server 313. In one embodiment, computer system 160 may download application 319 from deploying server 313 when application 319 needs to be executed.
As another example, the application 319 may be an application that controls the vehicle to determine a driving strategy based on the lane lines of the vehicle and a conventional control module as described above. The processor 301 of the computer system 160 invokes the application 319 to obtain the driving strategy.
The sensor 322 is associated with the computer system 160 and is used to detect the environment surrounding the computer system 160. For example, the sensor 322 may detect animals, cars, obstacles, and/or crosswalks. Further, the sensor 322 may also detect the environment surrounding such objects: for example, other animals present around an animal, the weather conditions, and the brightness of the surrounding environment. Optionally, if the computer system 160 is located on an autonomous vehicle, the sensor 322 may be at least one of a camera, an infrared sensor, a chemical detector, a microphone, and the like.
In other embodiments of the present application, the lane line detection method according to the embodiments of the present application may also be executed by a chip system. Fig. 4 is a structural diagram of a chip system according to an embodiment of the present application.
A neural-Network Processing Unit (NPU) 40 is mounted as a coprocessor on a main CPU (host CPU), and the host CPU allocates tasks to the NPU 40. The core of the NPU 40 is an arithmetic circuit 403. Illustratively, the arithmetic circuit 403 is controlled by the controller 404 such that the arithmetic circuit 403 performs a multiplication operation using the matrix data extracted from the memory.
In some implementations, the arithmetic circuit 403 includes a plurality of processing units (PEs) therein. In some implementations, the operational circuitry 403 is a two-dimensional systolic array. Alternatively, the arithmetic circuit 403 may be a one-dimensional systolic array, or other electronic circuit capable of performing mathematical operations such as multiplication and addition. In some implementations, the arithmetic circuitry 403 is a general-purpose matrix processor.
For example, assume that there is an input matrix A, a weight matrix B, and an output matrix C. The operation circuit 403 obtains the data corresponding to the weight matrix B from the weight memory 402 and buffers the data on each PE in the operation circuit 403. The operation circuit 403 further obtains data corresponding to the input matrix a from the input memory 401, further performs a matrix operation according to the input matrix a and the weight matrix B, and stores a partial result or a final result of the matrix operation in an accumulator (accumulator) 408.
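The accumulate-as-you-stream flow just described can be illustrated with a minimal NumPy sketch. This is not the NPU's actual microcode: the weight matrix B is assumed to be held stationary on the PEs while the input matrix A is consumed one inner-dimension step at a time, and partial products are summed into an accumulator that plays the role of accumulator 408. All function and variable names are illustrative.

import numpy as np

def systolic_matmul(A: np.ndarray, B: np.ndarray) -> np.ndarray:
    m, k = A.shape
    k2, n = B.shape
    assert k == k2, "inner dimensions must match"
    C = np.zeros((m, n))                 # plays the role of accumulator 408
    for step in range(k):                # one "pulse" of the systolic array
        # rank-1 update: column `step` of A against row `step` of B
        C += np.outer(A[:, step], B[step, :])
    return C

A = np.arange(6, dtype=float).reshape(2, 3)   # input matrix from memory 401
B = np.ones((3, 4))                           # weight matrix from memory 402
assert np.allclose(systolic_matmul(A, B), A @ B)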
For another example, the operation circuit 403 may be configured to implement a feature extraction model (e.g., a convolutional neural network model), input image data into the convolutional neural network model, and obtain features of the image through operation of the model. The image features are then output to a classifier, which outputs a classification probability of the object in the image.
The unified memory 406 is used to store input data as well as output data. Weight data in the external memory is sent directly to the weight memory 402 through a direct memory access controller (DMAC) 405. Input data in the external memory may be carried into the unified memory 406 or into the input memory 401 through the DMAC.
A bus interface unit (BIU) 410 is configured for interaction between the advanced extensible interface (AXI) bus, the DMAC, and the instruction fetch buffer 409. It also enables the instruction fetch buffer 409 to fetch instructions from the external memory, and enables the direct memory access controller 405 to fetch the raw data of the input matrix A or the weight matrix B from the external memory.
The DMAC is mainly used to transfer input data in the external memory (DDR) to the unified memory 406, or transfer weight data to the weight memory 402, or transfer input data to the input memory 401.
For example, in the embodiment of the present application, when the DQN model is used to calculate data such as a target type, target shape information, and target tracking information corresponding to a target, the input data may be input data of the DQN model, that is, information such as laser radar point cloud data of a target object (for example, another vehicle interacting with the vehicle) in the surrounding environment of the vehicle. The output data is output data of the DQN model, that is, data such as a target type, target shape information, and target tracking information of a target in the vehicle surrounding environment.
The vector calculation unit 407 may include a plurality of arithmetic processing units. If needed, it further processes the output of the arithmetic circuit 403, performing vector multiplication, vector addition, exponential operations, logarithmic operations, magnitude comparison, and the like. The vector calculation unit 407 is mainly used for non-convolution/FC-layer computation in the neural network, such as pooling, batch normalization, and local response normalization.
In some implementations, the vector calculation unit 407 stores the processed output vector to the unified memory 406. For example, the vector calculation unit 407 may apply a non-linear function to the output of the arithmetic circuit 403, such as a vector of accumulated values, to generate the activation value. In some implementations, the vector calculation unit 407 generates normalized values, combined values, or both. In some implementations, the processed output vector can also be used as an activation input to the arithmetic circuitry 403, for example for use in subsequent layers in a neural network.
The controller 404 is connected to an instruction fetch buffer 409, and instructions used by the controller 404 may be stored in the instruction fetch buffer 409.
As a possible implementation, the unified memory 406, the input memory 401, the weight memory 402 and the instruction fetch memory 409 are all On-Chip memories. The external memory is private to the NPU 40 hardware architecture.
With reference to fig. 1-3, the host CPU and NPU 40 cooperate to implement the corresponding algorithm for the desired functionality of vehicle 100 of fig. 1, the corresponding algorithm for the desired functionality of the vehicle of fig. 2, and the corresponding algorithm for the desired functionality of computer system 160 of fig. 3.
In other embodiments of the present application, computer system 160 may also receive information from, or transfer information to, other computer systems. Alternatively, sensor data collected from the sensor system 120 of the vehicle 100 may be transferred to another computer, where it is processed. As shown in fig. 5, data from computer system 160 may be transmitted via a network to cloud-side computer system 510 for further processing. The network and intermediate nodes may comprise various configurations and protocols, including the internet, world wide web, intranets, virtual private networks, wide area networks, local area networks, private networks using proprietary communication protocols of one or more companies, ethernet, WiFi, and HTTP, as well as various combinations of the foregoing. Such communication may be performed by any device capable of communicating data to and from other computers, such as modems and wireless interfaces.
In one example, the computer system 510 may include a server 520 having multiple computers, such as a load balancing server farm. The server 520 exchanges information with various nodes of the network in order to receive, process, and transmit data from the computer system 160. The computer system 510 may have a configuration similar to that of computer system 160, with a processor 530, memory 540, instructions 550, and data 560.
In one example, the data 560 of the server 520 may include weather-related information. For example, the server 520 may receive, monitor, store, update, and transmit various information related to target objects in the surrounding environment. This information may include target category, target shape information, and target tracking information, for example, in the form of reports, radar information, forecasts, and the like.
Referring to fig. 6, an example of autonomous vehicle and cloud service center (cloud server) interaction is shown. The cloud service center may receive information (such as data collected by vehicle sensors or other information) from autonomous vehicles 613, 612 within its environment 600 via a network 611, such as a wireless communication network.
The cloud service center 620 runs the stored programs related to controlling the automatic driving of the automobile to control the automatic driving vehicles 613 and 612 according to the received data. The programs related to controlling the automatic driving of the automobile can be as follows: a program for managing the interaction of the autonomous vehicle with obstacles on the road, or a program for controlling the route or speed of the autonomous vehicle, or a program for controlling the interaction of the autonomous vehicle with other autonomous vehicles on the road.
For example, the cloud service center 620 may provide portions of a map to the vehicles 613, 612 via the network 611. In other examples, operations may be divided among different locations. For example, multiple cloud service centers may receive, validate, combine, and/or send information reports. Information reports and/or sensor data may also be sent between vehicles in some examples. Other configurations are also possible.
In some examples, the cloud service center 620 sends suggested solutions to the autonomous vehicles regarding possible driving conditions within the environment (e.g., informing them of an obstacle ahead and how to bypass it). For example, the cloud service center 620 may assist a vehicle in determining how to travel when facing a particular obstacle within the environment, sending a response that indicates how the vehicle should travel in the given scenario. For example, the cloud service center 620 may confirm the presence of a temporary stop sign ahead on the road based on the collected sensor data, and determine that a lane is closed due to construction based on, for example, a "lane closed" sign and sensor data from construction vehicles. Accordingly, the cloud service center 620 sends a suggested operating mode for the vehicle to pass the obstacle (e.g., instructing the vehicle to change lanes onto another road). When the cloud service center 620 observes the video stream within its operating environment 600 and has confirmed that an autonomous vehicle can safely and successfully traverse an obstacle, the operational steps used by that vehicle may be added to the driving information map. This information may then be sent to other vehicles in the area that may encounter the same obstacle, so as to assist them not only in recognizing the closed lane but also in knowing how to pass it.
In some embodiments, the disclosed methods may be implemented as computer program instructions encoded on a computer-readable storage medium or other non-transitory medium or article of manufacture in a machine-readable format. Fig. 7 schematically illustrates a conceptual partial view of an example computer program product, arranged in accordance with at least some embodiments presented herein, comprising a computer program for executing a computer process on a computing device. In one embodiment, the example computer program product 700 is provided using a signal bearing medium 701. The signal bearing medium 701 may include one or more program instructions 702 that, when executed by one or more processors, may provide all or part of the functionality described above with respect to fig. 1-7, or all or part of the functionality described in subsequent embodiments. For example, referring to the embodiment shown in fig. 8, one or more features of S801-S805 may be undertaken by one or more instructions associated with the signal bearing medium 701. The program instructions 702 in fig. 7 likewise describe example instructions.
In some examples, signal bearing medium 701 may include a computer readable medium 703, such as, but not limited to, a hard disk drive, a Compact Disc (CD), a Digital Video Disc (DVD), a digital tape, a memory, a read-only memory (ROM), a Random Access Memory (RAM), or the like. In some embodiments, the signal bearing medium 701 may comprise a computer recordable medium 704 such as, but not limited to, a memory, a read/write (R/W) CD, a R/W DVD, and the like. In some implementations, the signal bearing medium 701 may include a communication medium 705, such as, but not limited to, a digital and/or analog communication medium (e.g., a fiber optic cable, a waveguide, a wired communications link, a wireless communication link, etc.). Thus, for example, signal bearing medium 701 may be communicated by a wireless form of communication medium 705 (e.g., a wireless communication medium conforming to the IEEE 802.11 standard or other transmission protocol). The one or more program instructions 702 may be, for example, computer-executable instructions or logic-implementing instructions. In some examples, a computing device such as described with respect to fig. 1-7 may be configured to provide various operations, functions, or actions in response to program instructions 702 communicated to the computing device by one or more of a computer readable medium 703, and/or a computer recordable medium 704, and/or a communications medium 705. It should be understood that the arrangements described herein are for illustrative purposes only. Thus, those skilled in the art will appreciate that other arrangements and other elements (e.g., machines, interfaces, functions, orders, and groupings of functions, etc.) can be used instead, and that some elements may be omitted altogether depending upon the desired results. In addition, many of the described elements are functional entities that may be implemented as discrete or distributed components or in conjunction with other components, in any suitable combination and location.
In order to improve the accuracy of determining the lane line of the lane where the host vehicle is located, and ensure the safety of the vehicle, so that the vehicle can safely pass through an intersection scene, the embodiment of the present application provides a lane line detection method, where an execution subject of the method is a vehicle with an automatic driving function, or another device with a function of controlling the vehicle, or a processor in a vehicle with an automatic driving function or another device with a function of controlling the vehicle, such as the processor 161, the processor 301, and the processor 530 mentioned in fig. 1 to 7. As shown in fig. 8, the lane line detection method of the vehicle includes steps S801 to S805:
S801, determining the lane lines of the virtual lane according to the position of the host vehicle.
Optionally, the position of the host vehicle is taken as a point on the center line of the virtual lane, the center line is parallel to the line connecting the head and the tail of the host vehicle, and the virtual lane lines of the lane in which the host vehicle is located are constructed with a preset width. The preset width is the lane width set by the vehicle or the user.
For example, as shown in fig. 9 (a), a and c are the lane lines of the virtual lane constructed according to the position of the host vehicle, b is the center line of the virtual lane, and the distance d between a and c equals the preset width. The arrow beside the host vehicle in fig. 9 (a) indicates the direction of the vehicle head; a, b, and c are all parallel to the line connecting the head and tail of the host vehicle, and the host vehicle is located on the center line b of the virtual lane.
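The construction of S801 can be sketched in a few lines of Python under simplifying assumptions: the host pose is given as (x, y, heading) in a planar frame, and the virtual lane is straight. The sampling length and step are illustrative assumptions, as the embodiment does not fix them.

import math

def virtual_lane(x, y, heading, preset_width, length=50.0, step=5.0):
    """Return sampled points of lane lines a, c and center line b."""
    ux, uy = math.cos(heading), math.sin(heading)     # unit vector along b
    nx, ny = -uy, ux                                  # left normal of b
    half = preset_width / 2.0
    n = int(length / step) + 1
    b = [(x + i * step * ux, y + i * step * uy) for i in range(n)]
    a = [(px + half * nx, py + half * ny) for px, py in b]   # left line
    c = [(px - half * nx, py - half * ny) for px, py in b]   # right line
    return a, b, c

a, b, c = virtual_lane(0.0, 0.0, math.pi / 2, preset_width=3.5)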
S802, determining the variation trend of the distance between a surrounding vehicle and the center line of the virtual lane according to the motion trajectory of the surrounding vehicle.
A surrounding vehicle is a vehicle whose distance from the host vehicle is less than a preset distance, where the preset distance is determined by the vehicle or by a user. As the position of the host vehicle or the current time changes, the number and positions of the vehicles within the preset distance may change.
In one possible implementation, according to its position relative to the host vehicle, a surrounding vehicle may be classified as a front vehicle, a left front vehicle, a right front vehicle, a left side vehicle, a right side vehicle, a rear vehicle, a left rear vehicle, or a right rear vehicle of the host vehicle.
For example, taking the virtual lane of the host vehicle shown in fig. 9 (a) as an example, the virtual lane and the region within the preset distance of the host vehicle are divided into 9 regions centered on the host vehicle, as shown in fig. 9 (b): region A, region B, region C, region D, region E, region F, region G, region H, and region I. The host vehicle is located in region I, and the vehicles located in regions A to H are, respectively, the left front, front, right front, left side, right side, left rear, rear, and right rear vehicles of the host vehicle. That is, vehicle A1 is the left front vehicle, vehicle B1 the front vehicle, vehicle C1 the right front vehicle, vehicle D1 the left side vehicle, vehicle E1 the right side vehicle, vehicle F1 the left rear vehicle, vehicle G1 the rear vehicle, and vehicle H1 the right rear vehicle of the host vehicle.
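The 9-region partition can be illustrated with the following sketch, where positions are expressed in a host-centered frame (x forward, y to the left). The numeric thresholds are assumptions for illustration only, since the embodiment does not prescribe them.

def classify_region(dx, dy, x_thr=2.5, y_thr=1.75):
    """dx, dy: position of a surrounding vehicle in the host frame."""
    lon = "front" if dx > x_thr else ("rear" if dx < -x_thr else "mid")
    lat = "left" if dy > y_thr else ("right" if dy < -y_thr else "center")
    table = {
        ("front", "left"): "A (left front)",   ("front", "center"): "B (front)",
        ("front", "right"): "C (right front)", ("mid", "left"): "D (left side)",
        ("mid", "right"): "E (right side)",    ("rear", "left"): "F (left rear)",
        ("rear", "center"): "G (rear)",        ("rear", "right"): "H (right rear)",
        ("mid", "center"): "I (host region)",
    }
    return table[(lon, lat)]

assert classify_region(10.0, 2.0) == "A (left front)"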
Optionally, the surrounding vehicles within the preset distance of the host vehicle and the motion trajectory of each surrounding vehicle are determined according to the information detected by the sensors. For each surrounding vehicle, there are a plurality of position points on its motion trajectory, and the motion trajectory is formed by a smooth line connecting those position points.
In a possible implementation, the plurality of position points on the motion trajectory of a surrounding vehicle include the position point corresponding to that vehicle's current position.
In a possible implementation, the plurality of position points on the motion trajectory of a surrounding vehicle are determined randomly, or at preset time intervals, or such that the longitudinal distance between adjacent position points equals a preset distance interval.
For example, suppose the longitudinal distance between the position points on each motion trajectory equals the preset distance interval. As shown in fig. 10, the surrounding vehicles within the preset distance of the host vehicle include vehicle a, vehicle b, and vehicle c. The motion trajectory of vehicle a includes 3 position points, with a1 and a2 the longitudinal distances between adjacent points; the motion trajectory of vehicle b includes 4 position points, with b1, b2, and b3 the longitudinal distances between adjacent points; and the motion trajectory of vehicle c includes 5 position points, with c1, c2, c3, and c4 the longitudinal distances between adjacent points. The distance d is the preset distance interval, where a1 = a2 = b1 = b2 = b3 = c1 = c2 = c3 = c4 = d. The following examples all assume that the longitudinal distance between the position points on a surrounding vehicle's motion trajectory equals the preset distance interval.
In one possible implementation, the preset distance interval is determined by the user or the vehicle itself.
In a possible implementation manner, a value range of the preset distance interval is determined by a user or a vehicle, and a value of a longitudinal distance between a plurality of position points on a motion track of the vehicle is within the value range of the preset distance interval.
Illustratively, taking the value range of the preset distance interval as [8, 16], if the motion trajectory of vehicle a includes 3 position points with longitudinal distances a1 and a2 between them, then 8 ≤ a1 ≤ 16 and 8 ≤ a2 ≤ 16.
In one possible implementation, for each surrounding vehicle, a plurality of position points are determined first, and the smooth curve through those points is taken as the vehicle's motion trajectory. Alternatively, the motion trajectory may be determined first, a plurality of position points on it selected, and the smooth curve connecting those points then taken as the motion trajectory.
Optionally, after the motion trajectory of a surrounding vehicle is determined, the lateral distances between the surrounding vehicle and the center line of the virtual lane at the trajectory's position points are determined and recorded; this sequence of distances is the variation trend of the distance between the surrounding vehicle and the center line of the virtual lane.
Illustratively, vehicle 1 has 3 position points a, b, and c on its motion trajectory, and its lateral distances from the center line of the virtual lane at these points are 5, 7, and 6, respectively; that is, the variation trend of the distance between vehicle 1 and the center line of the virtual lane is 5, 7, 6.
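As a sketch of S802, assume the center line b is a straight segment from a point p0 along a unit direction u (as in the construction after S801); the lateral distance at each position point is then the perpendicular offset from b. Names and the straight-line assumption are illustrative.

def lateral_distance_trend(trajectory, p0, u):
    """trajectory: [(x, y), ...] position points of one surrounding vehicle;
    p0, u: a point on center line b and its unit direction vector."""
    nx, ny = -u[1], u[0]                       # normal of the center line
    trend = []
    for (x, y) in trajectory:
        dx, dy = x - p0[0], y - p0[1]
        trend.append(abs(dx * nx + dy * ny))   # unsigned lateral offset
    return trend

# the three position points of "vehicle 1" in the example above:
trend = lateral_distance_trend([(5, 0), (7, 10), (6, 20)], (0, 0), (0, 1))
assert trend == [5.0, 7.0, 6.0]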
If there is no surrounding vehicle within the preset distance at the current time, the host vehicle may hand driving authority to the user, so that the user can drive safely according to driving experience and the surrounding environment. If the surrounding vehicles within the preset distance at the current time include at least one left side, right side, left rear, or right rear vehicle, the host vehicle decelerates so that those vehicles become left front, right front, or front vehicles, and then obtains the motion trajectories of the left front, right front, and front vehicles at that moment.
S803, analyzing the variation trend of the distance between the surrounding vehicle and the center line of the virtual lane, together with the lane change probability of the surrounding vehicle at the previous time, to determine the lane change probability of the surrounding vehicle at the current time.
Optionally, for each surrounding vehicle, the variation trend of its distance from the center line of the virtual lane and its lane change probability at the previous time are analyzed to determine its lane change probability at the current time.
Optionally, the variation trend of the distance between the surrounding vehicle and the center line of the virtual lane is analyzed to obtain a first observation probability of the surrounding vehicle at the current time, that is, a predicted value of its lane change probability at the current time. The first observation probability at the current time and the lane change probability at the previous time are then weighted and summed to obtain the lane change probability of the surrounding vehicle at the current time. The weights of the first observation probability and of the lane change probability at the previous time are preset by the vehicle or the user. All surrounding vehicles are traversed to obtain the lane change probability of each.
Illustratively, the variation trend of the distance between the surrounding vehicle and the center line of the virtual lane is input into a sigmoid function, which outputs the first observation probability of the surrounding vehicle at the current time: P_Observation = sigmoid(ΔclosestL), where P_Observation denotes the first observation probability at the current time and ΔclosestL denotes the variation trend of the distance between the surrounding vehicle and the center line of the virtual lane. The lane change probability is then obtained from the formula P_LaneChange = α·P_LaneChange′ + β·P_Observation, where P_LaneChange denotes the lane change probability of the surrounding vehicle at the current time, P_LaneChange′ the lane change probability at the previous time, α the preset weight of the lane change probability at the previous time, and β the preset weight of the first observation probability. α and β are determined by the vehicle itself or preset by the user according to need.
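The two formulas can be transcribed directly into a short sketch. The concrete form of ΔclosestL (taken here as the last lateral distance minus the first) and the values of α and β are assumptions, since the embodiment leaves them to the vehicle or the user.

import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lane_change_probability(distance_trend, p_prev, alpha=0.5, beta=0.5):
    # ΔclosestL: assumed here to be the net change in lateral distance
    delta_closest_l = distance_trend[-1] - distance_trend[0]
    p_observation = sigmoid(delta_closest_l)        # P_Observation
    return alpha * p_prev + beta * p_observation    # P_LaneChange

p_now = lane_change_probability([5.0, 7.0, 6.0], p_prev=0.2)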
In one possible implementation, the lane change probability of a surrounding vehicle when it enters the preset area is a preset lane change probability, which is determined by the vehicle or preset by the user according to need.
S804, determining the association probability between the surrounding vehicle and the center line of the virtual lane at the current time according to the lane change probability of the surrounding vehicle at the current time, the association probability between the surrounding vehicle and the center line of the virtual lane at the previous time, the position of the surrounding vehicle relative to the center line of the virtual lane at the current time, and a preset transfer coefficient.
Optionally, the association probability between each surrounding vehicle and the center line of the virtual lane at the current time is determined as above, traversing every surrounding vehicle. The position of a surrounding vehicle relative to the center line of the virtual lane at the current time comprises the lateral distance between the vehicle and the center line and the angle between the vehicle's heading and the center line. The preset transfer coefficient is the conversion coefficient used when converting the association probability at the previous time into the association probability at the current time, and is determined by the vehicle itself or by the user according to need. The association probability characterizes the distance between the surrounding vehicle and the center line of the virtual lane and the similarity between the vehicle's motion trajectory and that center line.
For example, take the virtual lane shown in fig. 9 (a), as illustrated in fig. 11. In fig. 11, vehicle 1 is a surrounding vehicle of the host vehicle, d denotes the lateral distance between vehicle 1 and the center line of the virtual lane of the host vehicle, θ denotes the angle between the heading of vehicle 1 and that center line, and the direction indicated by the single arrow is the heading of vehicle 1.
Optionally, the first target vehicles are determined according to the lane change probability of the surrounding vehicles at the current time and a preset lane change probability threshold.
In a possible implementation, when the preset area is an intersection with construction or the like ahead, or when the number of lanes at the exit differs from that at the entrance, vehicles passing through the preset area need to change lanes to pass the intersection safely and smoothly. Therefore, if there are multiple surrounding vehicles, determining the first target vehicles requires considering whether each surrounding vehicle needs to change lanes at the current time, that is, whether its lane change probability at the current time is greater than or equal to the preset lane change probability threshold. If the lane change probabilities of all surrounding vehicles at the current time are greater than or equal to the threshold, all of them are determined to be first target vehicles. If some surrounding vehicles have lane change probabilities greater than or equal to the threshold while others are below it, the surrounding vehicles below the threshold are determined to be the first target vehicles. Optionally, the surrounding vehicles whose lane change probability at the current time is greater than or equal to the threshold are discarded, which reduces the number of vehicles whose association probability with the center line of the virtual lane must be computed at the current time and improves the efficiency of detecting the lane lines of the lane in which the host vehicle is located.
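The selection rule of the preceding paragraph amounts to the following sketch, in which surrounding vehicles are assumed to be represented as dictionaries and the threshold value is illustrative.

def first_target_vehicles(surrounding, threshold=0.7):
    """surrounding: [{'id': ..., 'p_lane_change': ...}, ...]"""
    below = [v for v in surrounding if v["p_lane_change"] < threshold]
    # if every vehicle meets or exceeds the threshold, keep them all;
    # otherwise keep only those below it
    return below if below else list(surrounding)

targets = first_target_vehicles(
    [{"id": "a", "p_lane_change": 0.9}, {"id": "b", "p_lane_change": 0.3}]
)
# -> only vehicle "b" remains a candidate to follow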
Optionally, after the first target vehicles are determined, for each first target vehicle, the lateral distance between the vehicle and the center line of the virtual lane and the angle between the vehicle's heading and the center line are determined according to the vehicle's position relative to the virtual lane at the current time. Then, for each first target vehicle, the angle and the lateral distance are weighted and summed to determine a second observation probability at the current time, which is a predicted value of the association probability between the first target vehicle and the center line of the virtual lane.
For example, taking fig. 11 as an example, the distance between the first target vehicle, i.e., vehicle 1, and the center line of the virtual lane is d, and the angle between the heading of vehicle 1 and the center line is θ. d and θ are weighted and summed via the formula dist = δ × fabs(d) + ε × θ, where fabs(d) denotes the absolute value of d, δ denotes the preset weight of d, and ε denotes the preset weight of θ; δ and ε are determined by the vehicle or by the user according to need. Then, dist of vehicle 1 at the different position points (including the point corresponding to the current position) is input into a sigmoid function to obtain the second observation probability P′ of vehicle 1 at the current time, that is, P′ = sigmoid(dist).
In one possible implementation, to obtain the association probability at the current time more accurately, the association probability of each first target vehicle with the center line of the virtual lane may be determined using the Markov probability model shown in fig. 12, taking into account the association probability at the previous time. That is, for each first target vehicle, the second observation probability at the current time, the preset transfer coefficient, and the association probability with the center line of the virtual lane at the previous time are combined and normalized, yielding the association probability of the first target vehicle with the center line of the virtual lane at the current time.
For example, suppose vehicle 1 is a first target vehicle. Its association probability with the center line of the virtual lane at the previous time, its second observation probability at the current time, and the preset transfer coefficient are combined via the formula P = P′ × P″ × P‴ and normalized, yielding the association probability P of vehicle 1 with the center line of the virtual lane at the current time. Here P′ denotes the second observation probability of vehicle 1 at the current time, P″ the preset transfer coefficient, and P‴ the association probability of vehicle 1 with the center line of the virtual lane at the previous time.
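Putting the formulas of S804 together gives the following sketch. δ, ε, and the preset transfer coefficient P″ are assumed values, P′ = sigmoid(dist) is transcribed as stated in the text, and normalization is performed over all first target vehicles.

import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def association_probabilities(candidates, delta=0.5, eps=0.5, p_transfer=1.0):
    """candidates: [{'d': lateral distance, 'theta': heading angle,
                     'p_assoc_prev': association prob. at previous time}]"""
    raw = []
    for c in candidates:
        dist = delta * math.fabs(c["d"]) + eps * c["theta"]
        p_obs2 = sigmoid(dist)                       # second observation prob.
        raw.append(p_obs2 * p_transfer * c["p_assoc_prev"])
    total = sum(raw)
    return [r / total for r in raw]                  # normalized over candidates

probs = association_probabilities(
    [{"d": 0.3, "theta": 0.05, "p_assoc_prev": 0.6},
     {"d": 2.0, "theta": 0.40, "p_assoc_prev": 0.4}]
)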
In a possible implementation, when a first target vehicle enters the preset area, that is, before the intersection, its association probability with the center line of the virtual lane is a preset association probability, determined by the vehicle itself or preset by the user according to need.
S805, determining the lane lines of the lane in which the host vehicle is located according to the motion trajectory of the target vehicle and the preset width.
The target vehicle is the first target vehicle with the maximum association probability with the center line of the virtual lane at the current time; it is the most suitable object for the host vehicle to follow among the surrounding vehicles, and the host vehicle can drive along its motion trajectory. Since the association probability characterizes both the similarity between a surrounding vehicle's trajectory and the center line of the virtual lane and the distance between them, the greater the association probability of a first target vehicle at the current time, the greater that similarity and the smaller that distance.
Optionally, the motion trajectory of the target vehicle is translated to obtain the motion trajectory of the host vehicle. During the motion of the host vehicle, the target vehicle may change as the surrounding vehicles, their lane change probabilities, and their association probabilities with the center line of the virtual lane change.
For example, as shown in fig. 13, when the host vehicle is at position point a1, vehicle 2 is determined to be the target vehicle through steps S801-S804, and curve b, i.e., the curve between position points b1 and b2 on the motion trajectory of vehicle 2, is translated in the direction of arrow 1, yielding curve c as the motion trajectory of the host vehicle over a period of time. After the host vehicle has traveled to position point a2, the target vehicle is re-determined to be vehicle 3 through steps S801-S804; curve d, i.e., the curve between position points d2 and d3 on the motion trajectory of vehicle 3, is translated in the direction of arrow 2, yielding curve e as the motion trajectory of the host vehicle.
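The translation step can be sketched as follows, assuming the offset vector runs from the target vehicle's current position point to the host vehicle's current position point; this concrete construction of the offset is an assumption.

def translate_trajectory(target_traj, target_point, host_point):
    """target_traj: [(x, y), ...]; shift every point by the same offset."""
    ox = host_point[0] - target_point[0]
    oy = host_point[1] - target_point[1]
    return [(x + ox, y + oy) for (x, y) in target_traj]

host_traj = translate_trajectory([(1, 10), (1.2, 20)], (1, 10), (0, 0))
# -> [(0, 0), (0.2, 10.0)]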
It should be noted that, after this step, it must also be determined whether the motion trajectory of the host vehicle meets the user's needs. If the trajectory does not change lanes but the host vehicle needs to change lanes to reach its destination, the host vehicle may hand driving authority to the user, so that the user drives according to driving experience and the surrounding environment to pass safely and reach the destination. The motion trajectory of the host vehicle determined in this step covers a period of time from the current time (for example, time t) to a later time (for example, from time t to time t+2).
Optionally, the movement track of the target vehicle is used as a center line of a lane where the vehicle is located, the preset width is used as a lane width, and the lane where the vehicle is located and a lane line of the lane are determined.
In a possible implementation, the motion trajectory of the target vehicle is translated, the translated trajectory, i.e., the motion trajectory of the host vehicle over a period of time, is taken as the center line of the lane in which the host vehicle is located, and the preset width is taken as the lane width, thereby determining the lane and its lane lines.
For example, taking the motion trajectory of the host vehicle obtained by translating the motion trajectory of the target vehicle in fig. 13, i.e., curve c and curve e, as shown in fig. 14: with curve c and curve e as the center lines of the lane in which the host vehicle is located and the preset width f as the lane width, the lane of the host vehicle and its lane lines g and h can be determined.
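Given the translated trajectory as the center line and the preset width f, the lane lines g and h can be generated by lateral offsetting, as in the following sketch; the finite-difference estimate of the local direction is an illustrative numerical choice, not the embodiment's prescribed method.

import math

def lane_lines_from_center(center, preset_width):
    half = preset_width / 2.0
    left, right = [], []
    for i, (x, y) in enumerate(center):
        x2, y2 = center[min(i + 1, len(center) - 1)]
        x1, y1 = center[max(i - 1, 0)]
        norm = math.hypot(x2 - x1, y2 - y1) or 1.0
        nx, ny = -(y2 - y1) / norm, (x2 - x1) / norm   # left normal
        left.append((x + half * nx, y + half * ny))
        right.append((x - half * nx, y - half * ny))
    return left, right

g, h = lane_lines_from_center([(0, 0), (0.2, 10.0), (0.1, 20.0)], 3.5)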
Through this process, the present application first determines the variation trend of the distance between each surrounding vehicle and the center line of the virtual lane according to the vehicle's motion trajectory, and then determines the vehicle's lane change probability at the current time from that trend and from its lane change probability at the previous time, thereby identifying the vehicle's lane change intention. Second, the association probability between each surrounding vehicle and the center line of the virtual lane at the current time is determined from the lane change probability at the current time, the association probability at the previous time, the vehicle's position relative to the center line at the current time, and the preset transfer coefficient. Finally, the lane lines of the lane in which the host vehicle is located are determined from the motion trajectory of the surrounding vehicle with the maximum association probability at the current time and the preset width, which improves the accuracy of detecting the lane lines of the host vehicle's lane and ensures the safety of the vehicle.
In one possible implementation, if the area before the intersection includes multiple lanes, the preset lane change probabilities of vehicles at the initial position points of those lanes may be the same or different. For example, take 3 lanes before the preset area, i.e., the intersection, as shown in fig. 15. In lane 1, the lane change probability of a vehicle at initial position point a before entering the intersection is the preset lane change probability a1, and its association probability with the center line of the virtual lane at point a is the preset association probability a2. In lane 2, the corresponding values at initial position point b are the preset lane change probability b1 and the preset association probability b2. In lane 3, the corresponding values at initial position point c are the preset lane change probability c1 and the preset association probability c2. The values of a1, b1, and c1 may be the same or different, as may the values of a2, b2, and c2. According to the position of the host vehicle, the lane lines of its lane are then determined through steps S801-S805 of this method embodiment.
In one possible implementation manner, the historical motion track before the vehicle enters the intersection and the motion track when the vehicle just enters the intersection determined according to the above steps S801-S805 are smoothed, so that the own vehicle can safely and stably enter the intersection area.
In one possible implementation manner, the motion trajectories of the own vehicle determined according to different target vehicles in the above steps S801 to S805 are smoothed, so that the own vehicle can safely and smoothly pass through the intersection.
In one possible implementation manner, the lane where the self-vehicle is located after leaving the intersection is determined as the target lane according to the motion track of the self-vehicle in the intersection. Then, a target point of the vehicle on the target lane is determined according to the speed of the vehicle at the first position point (namely, the position point where the vehicle is located before leaving the intersection) and the head direction. If the target point of the self-vehicle on the target lane is positioned on the central line of the target lane, determining a smooth connecting line between the target point of the self-vehicle on the target lane and the first position point, wherein the smooth connecting line is a motion track of the self-vehicle before the self-vehicle leaves the intersection; and otherwise, namely the target point of the self-vehicle on the target lane is not positioned on the central line of the target lane, determining a smooth connecting line between the perpendicular point of the target point on the central line of the target lane and the first position point as a motion track before the self-vehicle leaves the intersection. And finally, determining the lane where the self-vehicle leaves the intersection and the smooth lane line of the lane where the self-vehicle leaves the intersection by taking the motion track of the self-vehicle before leaving the intersection as the center line of the lane where the self-vehicle leaves the intersection and taking the preset width as the width of the lane where the self-vehicle leaves the intersection.
In one possible implementation manner, one or more position points are determined between a starting point of the self-vehicle on a target point of the self-vehicle on the target lane or a central line of the target lane and the first position point according to the head direction and the speed of the self-vehicle at the first position point, and a smooth connecting line between the first position point, the starting point of the self-vehicle on the target lane or the central line of the target lane and the one or more position points is determined as a motion track of the self-vehicle when the self-vehicle leaves the intersection.
Exemplarily, as shown in fig. 16 (a), a denotes the first position point corresponding to the position of the host vehicle before leaving the intersection, and b1 denotes the target point of the host vehicle on the target lane determined according to the host vehicle's heading and speed. b1 lies on the center line of the target lane. A position point d between a and b1 is determined according to the center line of the target lane, with d on the extension of that center line, and a position point c between a and b1 is determined according to the heading and speed of the host vehicle. Points a, c, d, and b1 are connected by a Bezier curve to obtain curve e (the dotted line), the motion trajectory of the host vehicle when leaving the intersection. The longitudinal distance between b1 and a is f, that between d and b1 is f/4, and that between c and a is f/4.
Exemplarily, as shown in fig. 16 (b), a denotes the first position point corresponding to the position of the host vehicle before leaving the intersection, and b2 denotes the target point of the host vehicle on the target lane determined according to the host vehicle's heading and speed. b2 does not lie on the center line of the target lane; a perpendicular to the center line of the target lane is drawn through b2, and its intersection with the center line is determined as b3, that is, b3 is the foot of the perpendicular from b2 onto the center line of the target lane. A position point d1 between a and b3 is determined according to the center line of the target lane, with d1 on the extension of that center line, and a position point c1 between a and b3 is determined according to the heading and speed of the host vehicle. Points a, c1, d1, and b3 are connected by a Bezier curve to obtain curve e (the dotted line), the motion trajectory of the host vehicle when leaving the intersection. The longitudinal distance between b3 and a is f, that between d1 and b3 is f/4, and that between c1 and a is f/4.
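Both exit trajectories reduce to evaluating a cubic Bezier curve whose four control points are the first position point, the two intermediate points at f/4, and the target point (or its perpendicular foot). Treating these points directly as Bezier control points is an assumption; the embodiment only states that they are connected by a Bezier curve.

def cubic_bezier(p0, p1, p2, p3, n=20):
    pts = []
    for i in range(n + 1):
        t = i / n
        s = 1.0 - t
        x = s**3*p0[0] + 3*s*s*t*p1[0] + 3*s*t*t*p2[0] + t**3*p3[0]
        y = s**3*p0[1] + 3*s*s*t*p1[1] + 3*s*t*t*p2[1] + t**3*p3[1]
        pts.append((x, y))
    return pts

# a at the first position point, c at f/4 ahead of a,
# d at f/4 before the target point b1:
f = 20.0
curve_e = cubic_bezier((0, 0), (0.5, f / 4), (1.0, 3 * f / 4), (1.0, f))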
In a possible implementation, the connecting line through a position point and several other points, such as the target point, is fitted with a Bezier curve, yielding a smooth curve.
It should be noted that through the above process, the present application can determine the smooth movement track of the vehicle at the entrance/exit and in the intersection area, thereby determining the smooth lane line of the lane where the vehicle is located when the vehicle is driving at the entrance/exit and in the intersection area, so that the vehicle can safely and smoothly pass through the intersection, reducing the lateral shake of the vehicle when passing through the intersection, and improving the riding experience of the vehicle passengers when driving at the intersection.
In a possible implementation, to further improve the efficiency of detecting the lane lines of the lane in which the host vehicle is located, after the surrounding vehicles of the host vehicle are acquired, some of them are filtered out first, so that the surrounding vehicles include only at least one of the front, left front, or right front vehicles of the host vehicle. This reduces the influence of vehicles in other lanes on determining the host vehicle's lane and improves the efficiency and accuracy of detecting its lane lines. Alternatively, after the first target vehicles are determined in step S804, they are further screened so that they include only at least one of the front, right front, or left front vehicles of the host vehicle, with the same effect.
In a possible implementation manner, in order to further improve the efficiency and accuracy of detecting the lane line of the lane where the own vehicle is located and to reduce the influence of vehicles on other lanes on determining the lane of the own vehicle, after the surrounding vehicles of the own vehicle are acquired, some of them are first removed so that only the front vehicle remains. Alternatively, after the first target surrounding vehicle is determined in step S804, the first target surrounding vehicles are further screened so that only the vehicle directly ahead of the own vehicle remains.
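As an illustration of this screening step (the patent does not prescribe an implementation), the sketch below keeps only vehicles ahead of the own vehicle, optionally restricted to the directly-front one; the 2-D geometry convention and the default lane width are assumptions:

```python
# Illustrative pre-filter only. Assumes 2-D positions and a unit heading
# vector; lane_width is an assumed default, not a value from the patent.
import numpy as np

def keep_front_vehicles(positions, host_pos, host_heading,
                        lane_width=3.5, front_only=False):
    """Keep vehicles longitudinally ahead of the own vehicle.

    With front_only=True, keep only vehicles within half a lane width of
    the host's longitudinal axis, i.e. the directly-front vehicle(s).
    """
    h = np.asarray(host_heading, float)
    h = h / np.linalg.norm(h)
    normal = np.array([-h[1], h[0]])          # left-pointing unit normal
    kept = []
    for p in positions:
        rel = np.asarray(p, float) - np.asarray(host_pos, float)
        ahead = rel @ h > 0.0                 # in front of the host
        lateral = abs(rel @ normal)           # lateral offset magnitude
        if ahead and (not front_only or lateral <= lane_width / 2.0):
            kept.append(p)
    return kept
```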
When the embodiment of the application is applied to driving automation levels above L2, the motion track and lane line of the vehicle passing through the intersection can be recorded, so that the vehicle can safely pass through the intersection by following the historical motion track or lane line recorded in the vehicle or in the device controlling the vehicle, without repeating the detection process at the intersection.
The embodiment of the present application may divide the lane line detection device into functional modules according to the above method example. In the case where each functional module is divided corresponding to each function, fig. 17 shows a schematic diagram of a possible structure of the lane line detection device related to the above embodiment, which includes a determining unit 1701 and an analyzing unit 1702. Of course, the lane line detection apparatus may include other modules, or fewer modules.
The determining unit 1701 is configured to determine a variation trend of the distance between the surrounding vehicle and the center line of the virtual lane according to the motion trajectory of the surrounding vehicle. The distance between the surrounding vehicle and the host vehicle is smaller than a preset distance, the virtual lane is parallel to the line connecting the head and the tail of the host vehicle, the width of the virtual lane is a preset width, and the host vehicle is located on the center line of the virtual lane.
In a possible implementation manner, the motion trajectory of the surrounding vehicle is a connection line of a plurality of position points, where the longitudinal distance between adjacent position points equals a preset distance interval, and the plurality of position points include the position point corresponding to the current position of each surrounding vehicle.
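As a hedged reading of how the variation trend might be computed from these position points (the patent gives no formula here), the sketch below returns the signed lateral offsets of the surrounding vehicle's track points to the virtual lane's center line; their first differences form the trend. The 2-D geometry convention is an assumption.

```python
# Hedged sketch: the "variation trend" is read here as the first differences
# of the signed lateral offsets of the surrounding vehicle's position points
# relative to the virtual lane's center line.
import numpy as np

def lateral_offsets(track_points, host_pos, host_heading):
    """Signed distance of each track point to the virtual lane center line
    (the line through the host vehicle along its head-to-tail axis)."""
    h = np.asarray(host_heading, float)
    h = h / np.linalg.norm(h)
    normal = np.array([-h[1], h[0]])              # left-pointing unit normal
    pts = np.asarray(track_points, float) - np.asarray(host_pos, float)
    return pts @ normal                           # positive = left of line

def distance_trend(offsets):
    """First differences; a sustained nonzero sign indicates lateral drift."""
    return np.diff(offsets)
```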
An analyzing unit 1702, configured to analyze a variation trend of a distance between the surrounding vehicle and a center line of the virtual lane and a lane change probability of the surrounding vehicle at a previous time to obtain a lane change probability of the surrounding vehicle at a current time, where the lane change probability is a probability of the vehicle changing a lane where the vehicle is located.
Optionally, the analyzing unit 1702 is configured to analyze the variation trend of the distance between the surrounding vehicle and the center line of the virtual lane to obtain a first observation probability of the surrounding vehicle at the current time, where the first observation probability is a predicted value of the lane change probability of the vehicle. The analyzing unit 1702 is further configured to perform weighted summation on the first observation probability of the surrounding vehicle at the current time and the lane change probability of the surrounding vehicle at the previous time, so as to determine the lane change probability of the surrounding vehicle at the current time.
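A minimal sketch of this weighted summation follows; the weight w is an assumed value, not one fixed by the patent:

```python
# Minimal sketch of the weighted summation; the weight w is an assumption.
def update_lane_change_prob(first_obs_prob, prev_prob, w=0.6):
    """Lane change probability at the current time:
    w * first observation probability + (1 - w) * previous probability."""
    p = w * first_obs_prob + (1.0 - w) * prev_prob
    return min(max(p, 0.0), 1.0)  # keep the result a valid probability
```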
The determining unit 1701 is further configured to determine the association probability between the surrounding vehicle and the center line of the virtual lane at the current time according to the lane change probability of the surrounding vehicle at the current time, the association probability between the surrounding vehicle and the center line of the virtual lane at the previous time, the position of the surrounding vehicle at the current time relative to the center line of the virtual lane, and a preset transfer coefficient.
The preset transfer coefficient is the conversion coefficient used when the association probability between the surrounding vehicle and the center line of the virtual lane at the previous time is converted into the association probability at the current time. The association probability represents the distance between the surrounding vehicle and the center line of the virtual lane and the similarity between the motion track of the surrounding vehicle and the center line of the virtual lane.
In one possible implementation manner, the lane change probability before the surrounding vehicle enters a preset area is a preset lane change probability, where the preset area is the intersection.
Optionally, the determining unit 1701 is configured to determine the first target surrounding vehicle according to a preset lane change probability threshold and the lane change probability of the surrounding vehicle at the current time. The determining unit 1701 is further configured to determine, according to the position of the first target surrounding vehicle at the current time relative to the center line of the virtual lane, the lateral distance between the first target surrounding vehicle and the center line of the virtual lane and the included angle between the head orientation of the first target surrounding vehicle and the center line of the virtual lane. The determining unit 1701 is further configured to perform weighted summation on the included angle and the lateral distance, and determine a second observation probability of the first target surrounding vehicle at the current time, where the second observation probability is a predicted value of the association probability between the vehicle and the center line of the virtual lane. Finally, the determining unit 1701 is further configured to perform normalization processing on the second observation probability of the first target surrounding vehicle at the current time, the preset transfer coefficient, and the association probability between the first target surrounding vehicle and the center line of the virtual lane at the previous time, to determine the association probability between the first target surrounding vehicle and the center line of the virtual lane at the current time.
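The sketch below gives one plausible reading of these two steps; the normalization constants, the equal weights, the mapping from weighted sum to probability, and normalizing across all candidate vehicles are all assumptions, not values fixed by the patent:

```python
# Sketch under stated assumptions: max_dist/max_angle, the equal weights,
# and the normalization over candidates are illustrative choices.
import numpy as np

def second_observation_prob(lateral_dist, angle, w_dist=0.5, w_angle=0.5,
                            max_dist=3.5, max_angle=0.5):
    """Predicted association probability from the weighted sum of the
    lateral distance and the heading angle (smaller -> more associated)."""
    d = min(abs(lateral_dist) / max_dist, 1.0)
    a = min(abs(angle) / max_angle, 1.0)
    return 1.0 - (w_dist * d + w_angle * a)

def update_association_probs(second_obs, transfer_coef, prev_assoc):
    """Combine per-vehicle quantities and normalize so the association
    probabilities over all first target surrounding vehicles sum to 1."""
    raw = np.asarray(second_obs) * transfer_coef * np.asarray(prev_assoc)
    return raw / (raw.sum() + 1e-12)
```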
In a possible implementation manner, if there are multiple surrounding vehicles, the determining unit 1701 is further configured to determine that all the surrounding vehicles are first target surrounding vehicles if the lane change probabilities of the surrounding vehicles at the current time are all greater than or equal to a preset lane change probability threshold. The determining unit 1701 is further configured to, if among the surrounding vehicles there are both vehicles whose lane change probability at the current time is greater than or equal to the preset lane change probability threshold and vehicles whose lane change probability is less than the threshold, delete the former and determine the vehicles whose lane change probability at the current time is less than the preset lane change probability threshold as the first target surrounding vehicles.
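A short illustrative sketch of this selection rule:

```python
# Illustrative only: mirrors the selection rule described above.
def select_first_targets(vehicles, lane_change_probs, threshold):
    """Prefer vehicles unlikely to change lanes; if every vehicle meets or
    exceeds the threshold, keep them all as first target vehicles."""
    kept = [v for v, p in zip(vehicles, lane_change_probs) if p < threshold]
    return kept if kept else list(vehicles)
```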
The determining unit 1701 is further configured to determine the lane line of the lane where the host vehicle is located according to the motion trajectory of the target vehicle and the preset width. The target vehicle is the surrounding vehicle with the highest association probability with the center line of the virtual lane at the current moment.
Optionally, the determining unit 1701 is further configured to determine that the lane where the host vehicle is located after leaving a preset area is the target lane, where the preset area is an intersection. The determining unit 1701 is further configured to determine the target point of the host vehicle on the target lane according to the speed of the host vehicle at a first position point and the heading of the host vehicle, where the first position point is the position point corresponding to the position of the host vehicle before leaving the preset area. If the target point of the host vehicle on the target lane is located on the center line of the target lane, the determining unit 1701 determines the line connecting the target point and the first position point as the center line of the lane where the host vehicle is located before leaving the preset area. If the target point is not located on the center line of the target lane, the determining unit 1701 determines the line connecting the perpendicular foot of the target point on the center line of the target lane and the first position point as the center line of the lane where the host vehicle is located before leaving the preset area.
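A minimal geometric sketch of the perpendicular-foot case follows, assuming the target lane's center line is given by a point q on the line and a unit direction vector u (names are illustrative, not from the patent):

```python
# Minimal geometric sketch; q is a point on the target lane center line and
# u is its unit direction vector (both assumed inputs).
import numpy as np

def foot_of_perpendicular(p, q, u):
    """Project point p onto the line through q with unit direction u."""
    p, q, u = (np.asarray(x, float) for x in (p, q, u))
    return q + ((p - q) @ u) * u

def pre_exit_center_line(first_point, target_point, q, u, tol=1e-6):
    """Endpoints of the center line of the lane before leaving the area:
    from the first position point to the target point if it lies on the
    target lane's center line, otherwise to its perpendicular foot."""
    b = np.asarray(target_point, float)
    foot = foot_of_perpendicular(b, q, u)
    end = b if np.linalg.norm(foot - b) < tol else foot
    return np.asarray(first_point, float), end
```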
Referring to fig. 18, the present application further provides a lane line detection apparatus including a memory 1801, a processor 1802, a communication interface 1803, and a bus 1804. The processor 1802 is configured to control and manage the actions of the apparatus, and/or to perform other processes of the techniques described herein. The communication interface 1803 is used to support communication between the apparatus and other network entities. The memory 1801 is used to store the program code and data of the apparatus.
The processor 1802 (which may also be described as a controller) may implement or execute the various illustrative logical blocks, unit modules, and circuits described in connection with the disclosure herein. The processor or controller may be a central processing unit, a general-purpose processor, a digital signal processor, an application-specific integrated circuit, a field-programmable gate array or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. The processor 1802 may also be a combination implementing computing functions, for example, a combination of one or more microprocessors, or a combination of a DSP and a microprocessor.
Communication interface 1803 may be a transceiver circuit.
The memory 1801 may include volatile memory, such as random access memory; the memory may also include non-volatile memory, such as read-only memory, flash memory, a hard disk, or a solid state disk; the memory may also comprise a combination of memories of the kind described above.
The bus 1804 may be an extended industry standard architecture (EISA) bus or the like. The bus 1804 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in fig. 18, but this does not mean there is only one bus or only one type of bus.
For the specific working process of the server or the apparatus described above, reference may be made to the corresponding process in the foregoing method embodiment, which is not described herein again.
The present embodiment provides a computer-readable storage medium storing one or more programs, where the one or more programs include instructions, which when executed by a computer, cause the computer to perform the lane line detection method described in steps S801 to S805 of the above embodiment.
The embodiment of the present application further provides a computer program product containing instructions, which when executed on a computer, causes the computer to execute the lane line detection method executed in steps S801 to S805 of the foregoing embodiment.
The embodiment of the application provides a lane line detection device, which comprises a processor and a memory, wherein the memory is configured to store computer program instructions, and the processor is configured to execute the computer program instructions, so that the lane line detection device performs the lane line detection method of steps S801 to S805 of the above embodiment.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical functional division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another device, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, that is, may be located in one place, or may be distributed in a plurality of different places. In the application process, part or all of the units can be selected according to actual needs to achieve the purpose of the scheme of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application, or portions thereof, which substantially contribute to the prior art, may be embodied in the form of a software product, where the computer software product is stored in a storage medium and includes several instructions for enabling a device (which may be a personal computer, a server, a network device, a single chip or a chip, etc.) or a processor (processor) to execute all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a read-only memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application.

Claims (19)

1. A lane line detection method, comprising:
determining the change trend of the distance between a surrounding vehicle and the center line of a virtual lane according to the motion track of the surrounding vehicle, wherein the distance between the surrounding vehicle and the own vehicle is less than a preset distance;
analyzing the change trend of the distance between the surrounding vehicle and the center line of the virtual lane and the lane change probability of the surrounding vehicle at the previous moment to obtain the lane change probability of the surrounding vehicle at the current moment, wherein the lane change probability is the probability of the vehicle changing the lane where it is located;
determining the association probability between the surrounding vehicle and the center line of the virtual lane at the current moment according to the lane change probability of the surrounding vehicle at the current moment, the association probability between the surrounding vehicle and the center line of the virtual lane at the previous moment, the position of the surrounding vehicle relative to the center line of the virtual lane at the current moment, and a preset transfer coefficient, which comprises: determining a first target surrounding vehicle according to a preset lane change probability threshold and the lane change probability of the surrounding vehicle at the current moment; determining a second observation probability of the first target surrounding vehicle at the current moment, wherein the second observation probability represents a predicted value of the association probability between the first target surrounding vehicle and the center line of the virtual lane; and normalizing the second observation probability of the first target surrounding vehicle at the current moment, the preset transfer coefficient, and the association probability between the first target surrounding vehicle and the center line of the virtual lane at the previous moment, to determine the association probability between the first target surrounding vehicle and the center line of the virtual lane at the current moment;
wherein the preset transfer coefficient is the conversion coefficient used when the association probability between the surrounding vehicle and the center line of the virtual lane at the previous moment is converted into the association probability between the surrounding vehicle and the center line of the virtual lane at the current moment, and the association probability is used for representing the distance between the surrounding vehicle and the center line of the virtual lane and the similarity between the motion track of the surrounding vehicle and the center line of the virtual lane;
and determining a lane line of the lane where the own vehicle is located according to the motion track of a target vehicle and a preset width, wherein the target vehicle is the vehicle, among the surrounding vehicles, with the highest association probability with the center line of the virtual lane at the current moment.
2. The lane line detection method according to claim 1, wherein the virtual lane is parallel to the line connecting the head and the tail of the own vehicle, the width of the virtual lane is the preset width, and the own vehicle is located on the center line of the virtual lane.
3. The lane line detection method according to claim 1 or 2, wherein the motion track of the surrounding vehicle is the connection line of a plurality of position points, the longitudinal distance between adjacent position points among the plurality of position points is equal to a preset distance interval, and the plurality of position points include the position point corresponding to the current position of the surrounding vehicle.
4. The method according to claim 1, wherein the analyzing the change trend of the distance between the surrounding vehicle and the center line of the virtual lane and the lane change probability of the surrounding vehicle at the previous moment to obtain the lane change probability of the surrounding vehicle at the current moment comprises:
analyzing the change trend of the distance between the surrounding vehicle and the center line of the virtual lane to obtain a first observation probability of the surrounding vehicle at the current moment, wherein the first observation probability represents a predicted value of the lane change probability of the vehicle;
and performing weighted summation on the first observation probability of the surrounding vehicle at the current moment and the lane change probability of the surrounding vehicle at the previous moment to determine the lane change probability of the surrounding vehicle at the current moment.
5. The lane line detection method according to claim 1, wherein the lane change probability before the surrounding vehicle enters a preset area is a preset lane change probability, and the preset area is an intersection.
6. The method of claim 1, wherein the determining the second observation probability of the first target surrounding vehicle at the current moment comprises:
determining, according to the position of the first target surrounding vehicle relative to the center line of the virtual lane at the current moment, the lateral distance between the first target surrounding vehicle and the center line of the virtual lane and the included angle between the head orientation of the first target surrounding vehicle and the center line of the virtual lane;
and performing weighted summation on the included angle and the lateral distance between the first target surrounding vehicle and the center line of the virtual lane to determine the second observation probability of the first target surrounding vehicle at the current moment, wherein the second observation probability represents a predicted value of the association probability between the first target surrounding vehicle and the center line of the virtual lane.
7. The lane line detection method according to claim 6, wherein, if there are multiple surrounding vehicles, the determining a first target surrounding vehicle according to a preset lane change probability threshold and the lane change probability of the surrounding vehicle at the current moment comprises:
if the lane change probabilities of the multiple surrounding vehicles at the current moment are all greater than or equal to the preset lane change probability threshold, determining that the multiple surrounding vehicles are all first target surrounding vehicles;
if, among the multiple surrounding vehicles, there are both surrounding vehicles whose lane change probability at the current moment is greater than or equal to the preset lane change probability threshold and surrounding vehicles whose lane change probability at the current moment is less than the preset lane change probability threshold, deleting the former and determining the surrounding vehicles whose lane change probability at the current moment is less than the preset lane change probability threshold as the first target surrounding vehicles.
8. The lane line detection method according to claim 1, further comprising:
determining that the lane where the own vehicle is located after leaving a preset area is a target lane, wherein the preset area is an intersection;
determining a target point of the own vehicle on the target lane according to the speed of the own vehicle at a first position point and the heading of the own vehicle, wherein the first position point is the position point corresponding to the position of the own vehicle before it leaves the preset area;
if the target point of the own vehicle on the target lane is located on the center line of the target lane, determining the line connecting the target point and the first position point as the center line of the lane where the own vehicle is located before leaving the preset area;
and if the target point of the own vehicle on the target lane is not located on the center line of the target lane, determining the line connecting the perpendicular foot of the target point on the center line of the target lane and the first position point as the center line of the lane where the own vehicle is located before leaving the preset area.
9. A lane line detection apparatus, characterized in that the apparatus comprises:
the determining unit is used for determining the change trend of the distance between a surrounding vehicle and the center line of a virtual lane according to the motion track of the surrounding vehicle, wherein the distance between the surrounding vehicle and the own vehicle is less than a preset distance;
the analyzing unit is used for analyzing the change trend of the distance between the surrounding vehicle and the center line of the virtual lane and the lane change probability of the surrounding vehicle at the previous moment to obtain the lane change probability of the surrounding vehicle at the current moment, wherein the lane change probability is the probability of the vehicle changing the lane where it is located;
the determining unit is further used for: determining a first target surrounding vehicle according to a preset lane change probability threshold and the lane change probability of the surrounding vehicle at the current moment; determining a second observation probability of the first target surrounding vehicle at the current moment, wherein the second observation probability represents a predicted value of the association probability between the first target surrounding vehicle and the center line of the virtual lane; and normalizing the second observation probability of the first target surrounding vehicle at the current moment, a preset transfer coefficient, and the association probability between the first target surrounding vehicle and the center line of the virtual lane at the previous moment, to determine the association probability between the first target surrounding vehicle and the center line of the virtual lane at the current moment;
wherein the preset transfer coefficient is the conversion coefficient used when the association probability between the surrounding vehicle and the center line of the virtual lane at the previous moment is converted into the association probability between the surrounding vehicle and the center line of the virtual lane at the current moment, and the association probability is used for representing the distance between the surrounding vehicle and the center line of the virtual lane and the similarity between the motion track of the surrounding vehicle and the center line of the virtual lane;
the determining unit is further used for determining the lane line of the lane where the own vehicle is located according to the motion track of a target vehicle and a preset width, wherein the target vehicle is the vehicle, among the surrounding vehicles, with the highest association probability with the center line of the virtual lane at the current moment.
10. The lane line detection device according to claim 9, wherein the virtual lane is parallel to the line connecting the head and the tail of the own vehicle, the width of the virtual lane is the preset width, and the own vehicle is located on the center line of the virtual lane.
11. The lane line detection device according to claim 9 or 10, wherein the motion track of the surrounding vehicle is the connection line of a plurality of position points, the longitudinal distance between adjacent position points among the plurality of position points is equal to a preset distance interval, and the plurality of position points include the position point corresponding to the current position of the surrounding vehicle.
12. The lane line detection apparatus according to claim 9,
the analyzing unit is used for analyzing the change trend of the distance between the surrounding vehicle and the center line of the virtual lane to obtain a first observation probability of the surrounding vehicle at the current moment, wherein the first observation probability represents a predicted value of the lane change probability of the vehicle;
the analyzing unit is further used for performing weighted summation on the first observation probability of the surrounding vehicle at the current moment and the lane change probability of the surrounding vehicle at the previous moment to determine the lane change probability of the surrounding vehicle at the current moment.
13. The lane line detection device according to claim 9, wherein the lane change probability before the surrounding vehicle enters a preset area is a preset lane change probability, and the preset area is an intersection.
14. The lane line detection apparatus according to claim 9,
the determining unit is further used for determining, according to the position of the first target surrounding vehicle relative to the center line of the virtual lane at the current moment, the lateral distance between the first target surrounding vehicle and the center line of the virtual lane and the included angle between the head orientation of the first target surrounding vehicle and the center line of the virtual lane;
the determining unit is further used for performing weighted summation on the included angle and the lateral distance between the first target surrounding vehicle and the center line of the virtual lane to determine the second observation probability of the first target surrounding vehicle at the current moment, wherein the second observation probability represents a predicted value of the association probability between the first target surrounding vehicle and the center line of the virtual lane.
15. The device according to claim 14, wherein, if there are multiple surrounding vehicles, the determining unit is used for determining a first target surrounding vehicle according to a preset lane change probability threshold and the lane change probability of the surrounding vehicle at the current moment, which comprises:
if the lane change probabilities of the multiple surrounding vehicles at the current moment are all greater than or equal to the preset lane change probability threshold, determining that the multiple surrounding vehicles are all first target surrounding vehicles;
if, among the multiple surrounding vehicles, there are both surrounding vehicles whose lane change probability at the current moment is greater than or equal to the preset lane change probability threshold and surrounding vehicles whose lane change probability at the current moment is less than the preset lane change probability threshold, deleting the former and determining the surrounding vehicles whose lane change probability at the current moment is less than the preset lane change probability threshold as the first target surrounding vehicles.
16. The lane line detection apparatus according to claim 9,
the determining unit is further used for determining that the lane where the own vehicle is located after leaving a preset area is a target lane, wherein the preset area is an intersection;
the determining unit is further used for determining a target point of the own vehicle on the target lane according to the speed of the own vehicle at a first position point and the heading of the own vehicle, wherein the first position point is the position point corresponding to the position of the own vehicle before it leaves the preset area;
the determining unit is further used for, if the target point of the own vehicle on the target lane is located on the center line of the target lane, determining the line connecting the target point and the first position point as the center line of the lane where the own vehicle is located before leaving the preset area;
the determining unit is further used for, if the target point of the own vehicle on the target lane is not located on the center line of the target lane, determining the line connecting the perpendicular foot of the target point on the center line of the target lane and the first position point as the center line of the lane where the own vehicle is located before leaving the preset area.
17. A lane line detection apparatus, characterized in that the apparatus comprises: a processor and a memory; wherein the memory is configured to store computer program instructions that are executed by the processor to cause the lane line detection apparatus to perform the lane line detection method of any one of claims 1 to 8.
18. A computer-readable storage medium comprising computer instructions that, when executed by a processor, cause a lane line detection apparatus to perform the lane line detection method according to any one of claims 1 to 8.
19. A computer program product, characterized in that it, when run on a processor, causes the lane line detection apparatus to carry out the lane line detection method according to any one of claims 1 to 8.
CN202080005027.3A 2020-04-22 2020-04-22 Lane line detection method and device Active CN112703506B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/086196 WO2021212379A1 (en) 2020-04-22 2020-04-22 Lane line detection method and apparatus

Publications (2)

Publication Number Publication Date
CN112703506A CN112703506A (en) 2021-04-23
CN112703506B true CN112703506B (en) 2022-04-08

Family

ID=75514822

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080005027.3A Active CN112703506B (en) 2020-04-22 2020-04-22 Lane line detection method and device

Country Status (2)

Country Link
CN (1) CN112703506B (en)
WO (1) WO2021212379A1 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20220077002A (en) * 2020-12-01 2022-06-08 현대자동차주식회사 Apparatus and method for controlling vehicle
CN115265564A (en) * 2021-04-30 2022-11-01 华为技术有限公司 Lane line marking method and device
CN113276853B (en) * 2021-05-21 2022-09-09 武汉光庭信息技术股份有限公司 LKA control method and system in failure scene
CN113895439B (en) * 2021-11-02 2022-08-12 东南大学 Automatic driving lane change behavior decision method based on probability fusion of vehicle-mounted multisource sensors
CN113804214B (en) * 2021-11-19 2022-05-03 智道网联科技(北京)有限公司 Vehicle positioning method and device, electronic equipment and computer readable storage medium
CN115206053B (en) * 2022-06-02 2023-10-03 河南越秀尉许高速公路有限公司 Expressway construction area early warning method with low false alarm rate
CN115507874B (en) * 2022-06-09 2024-03-01 广东省智能网联汽车创新中心有限公司 Lane matching method and device based on V2X
CN115311635B (en) * 2022-07-26 2023-08-01 阿波罗智能技术(北京)有限公司 Lane line processing method, device, equipment and storage medium
CN116110216B (en) * 2022-10-21 2024-04-12 中国第一汽车股份有限公司 Vehicle line crossing time determining method and device, storage medium and electronic device

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140074356A1 (en) * 2011-05-20 2014-03-13 Honda Motor Co., Ltd. Lane change assist system
US20140257686A1 (en) * 2013-03-05 2014-09-11 GM Global Technology Operations LLC Vehicle lane determination
JP5900454B2 (en) * 2013-10-09 2016-04-06 トヨタ自動車株式会社 Vehicle lane guidance system and vehicle lane guidance method
CN105631217B (en) * 2015-12-30 2018-12-21 苏州安智汽车零部件有限公司 Front effective target selection method based on the adaptive virtual lane of this vehicle
CN105809130B (en) * 2016-03-08 2020-03-10 武汉大学 Vehicle travelable area calculation method based on binocular depth perception
KR102368604B1 (en) * 2017-07-03 2022-03-02 현대자동차주식회사 Ecu, autonomous vehicle including the ecu, and method of controlling lane change for the same
CN107919027B (en) * 2017-10-24 2020-04-28 北京汽车集团有限公司 Method, device and system for predicting lane change of vehicle
CN110146100B (en) * 2018-02-13 2021-08-13 华为技术有限公司 Trajectory prediction method, apparatus and storage medium
CN110614993B (en) * 2018-12-29 2020-10-30 长城汽车股份有限公司 Lane changing method and system of automatic driving vehicle and vehicle
CN109733390B (en) * 2018-12-29 2021-07-20 江苏大学 Self-adaptive lane change early warning method based on driver characteristics
CN109855639B (en) * 2019-01-15 2022-05-27 天津大学 Unmanned driving trajectory planning method based on obstacle prediction and MPC algorithm
CN110550030B (en) * 2019-09-09 2021-01-12 深圳一清创新科技有限公司 Lane changing control method and device for unmanned vehicle, computer equipment and storage medium
CN110610271B (en) * 2019-09-17 2022-05-13 北京理工大学 Multi-vehicle track prediction method based on long and short memory network

Also Published As

Publication number Publication date
WO2021212379A1 (en) 2021-10-28
CN112703506A (en) 2021-04-23

Similar Documents

Publication Publication Date Title
CN112703506B (en) Lane line detection method and device
CN109901574B (en) Automatic driving method and device
WO2022001773A1 (en) Trajectory prediction method and apparatus
CN110379193B (en) Behavior planning method and behavior planning device for automatic driving vehicle
CN110371132B (en) Driver takeover evaluation method and device
CN112230642B (en) Road travelable area reasoning method and device
WO2021102955A1 (en) Path planning method for vehicle and path planning apparatus for vehicle
CN113168708B (en) Lane line tracking method and device
CN110532846B (en) Automatic channel changing method, device and storage medium
CN113460042A (en) Vehicle driving behavior recognition method and recognition device
CN113632033B (en) Vehicle control method and device
CN113631452B (en) Lane change area acquisition method and device
CN113835421A (en) Method and device for training driving behavior decision model
CN114693540A (en) Image processing method and device and intelligent automobile
US20230048680A1 (en) Method and apparatus for passing through barrier gate crossbar by vehicle
CN113498529A (en) Target tracking method and device
WO2022017307A1 (en) Autonomous driving scenario generation method, apparatus and system
CN112810603A (en) Positioning method and related product
WO2022178858A1 (en) Vehicle driving intention prediction method and apparatus, terminal and storage medium
CN115039095A (en) Target tracking method and target tracking device
CN113799794B (en) Method and device for planning longitudinal movement parameters of vehicle
CN113022573B (en) Road structure detection method and device
CN113859265A (en) Reminding method and device in driving process
CN114261404A (en) Automatic driving method and related device
CN114327842A (en) Multitask deployment method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant