CN116977954A - Lane positioning method, device, equipment and storage medium - Google Patents


Info

Publication number
CN116977954A
CN116977954A
Authority
CN
China
Prior art keywords
information
probability distribution
lane
vehicle
positioning
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211574961.4A
Other languages
Chinese (zh)
Inventor
Yan Wei (闫伟)
Chu Chao (储超)
Wang Yueming (王月明)
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202211574961.4A priority Critical patent/CN116977954A/en
Publication of CN116977954A publication Critical patent/CN116977954A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588 Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/22 Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)

Abstract

The application provides a lane positioning method, apparatus, device, and storage medium, which can be applied to the technical fields of intelligent driving and vehicle-mounted scenarios. The lane positioning method includes the following steps: obtaining lane change detection information of a vehicle at a first time; obtaining probability distribution prediction information of the vehicle over a plurality of lanes at the first time according to the lane change detection information and first probability distribution information of the vehicle over the plurality of lanes at a second time; obtaining probability distribution measurement information of the vehicle over the plurality of lanes at the first time according to positioning information of the vehicle at the first time; and performing Kalman filtering on the probability distribution prediction information and the probability distribution measurement information to determine second probability distribution information of the vehicle over the plurality of lanes at the first time. Embodiments of the application can help improve lane positioning accuracy.

Description

Lane positioning method, device, equipment and storage medium
Technical Field
The embodiments of the application relate to the technical field of intelligent driving, and in particular to a lane positioning method, apparatus, device, and storage medium.
Background
With the continuous development of intelligent driving technology, the requirements on vehicle positioning accuracy keep rising, and lane-level positioning has become a necessity for high-accuracy positioning. High-accuracy positioning generally refers to positioning with decimeter-level or centimeter-level precision or better, and can provide a higher-precision positioning result for a vehicle.
In the related art, common lane-level positioning schemes generally include lane-level positioning based on visual sensors, lane-level positioning based on multi-sensor fusion, lane-level positioning based on lane change detection, and the like. A multi-sensor fusion positioning scheme fuses related positioning means including satellite positioning, wireless communication signal positioning, sensor positioning, and so on, thereby obtaining a better fused positioning result than any single positioning scheme. Here, the sensors include, but are not limited to, visual sensors, inertial measurement unit (Inertial Measurement Unit, IMU) sensors, vehicle speed sensors, and the like.
However, none of the above lane positioning methods can be guaranteed to remain error-free over a long period of time, and the positioning accuracy and integrity of each method cannot be ensured. Based on this, how to further improve lane positioning accuracy is a problem that urgently needs to be solved.
Disclosure of Invention
The application provides a lane positioning method, apparatus, device, and storage medium, which can further improve lane positioning accuracy.
In a first aspect, an embodiment of the present application provides a method for positioning a lane, including:
obtaining lane change detection information of a vehicle at a first time;
obtaining probability distribution prediction information of the vehicle over a plurality of lanes at the first time according to the lane change detection information and first probability distribution information of the vehicle over the plurality of lanes at a second time, where the second time includes a time before the first time;
obtaining probability distribution measurement information of the vehicle over the plurality of lanes at the first time according to positioning information of the vehicle at the first time; and
performing Kalman filtering on the probability distribution prediction information and the probability distribution measurement information to determine second probability distribution information of the vehicle over the plurality of lanes at the first time.
In a second aspect, an embodiment of the present application provides a lane positioning apparatus, including:
an acquisition unit, configured to obtain lane change detection information of a vehicle at a first time;
a prediction unit, configured to obtain probability distribution prediction information of the vehicle over a plurality of lanes at the first time according to the lane change detection information and first probability distribution information of the vehicle over the plurality of lanes at a second time, where the second time includes a time before the first time;
a measurement unit, configured to obtain probability distribution measurement information of the vehicle over the plurality of lanes at the first time according to positioning information of the vehicle at the first time; and
a processing unit, configured to perform Kalman filtering on the probability distribution prediction information and the probability distribution measurement information to determine second probability distribution information of the vehicle over the plurality of lanes at the first time.
In a third aspect, an embodiment of the present application provides an electronic device, including: a processor and a memory for storing a computer program, the processor being for invoking and running the computer program stored in the memory for performing the method as in the first aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium comprising instructions which, when run on a computer, cause the computer to perform a method as in the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program product comprising computer program instructions for causing a computer to perform the method as in the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program for causing a computer to perform the method as in the first aspect.
Through the above scheme, embodiments of the application can fuse the probability distribution prediction information of the vehicle over multiple lanes, obtained based on the lane change detection information of the vehicle, with the probability distribution measurement information of the vehicle over the multiple lanes, obtained based on the positioning information of the vehicle, to obtain the probability distribution information of the vehicle over the multiple lanes. Because the lane positioning result determined from the lane change detection information is fused with the observed lane positioning result, the final positioning result does not strongly depend on any single type of positioning result, so lane positioning accuracy can be improved.
For example, when the visual perception information is abnormal, there is necessarily an error in the visual positioning result, and the error gradually affects the fused positioning result. In the embodiments of the application, abnormal fused positioning results can be filtered out according to the lane positioning result determined from the lane change detection information, so as to obtain a positioning result with higher precision.
Drawings
Fig. 1 is a schematic diagram of an application scenario according to an embodiment of the present application;
FIG. 2 is a schematic flow chart of a method of lane positioning according to an embodiment of the present application;
FIG. 3 is a schematic flow chart of another method of lane positioning according to an embodiment of the present application;
FIG. 4 is a specific example of determining a confidence level of positioning information according to an embodiment of the present application;
FIG. 5 is a schematic flow chart of another method of lane positioning according to an embodiment of the present application;
FIG. 6 is a schematic block diagram of an apparatus for lane positioning according to an embodiment of the present application;
fig. 7 is a schematic block diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application.
The scheme provided by the application can be applied to the field of intelligent transportation. An Intelligent Traffic System (ITS), also called an Intelligent Transportation System, effectively and comprehensively applies advanced science and technology (information technology, computer technology, data communication technology, sensor technology, electronic control technology, automatic control theory, operations research, artificial intelligence, etc.) to transportation, service control, and vehicle manufacturing, and strengthens the connections among vehicles, roads, and users, thereby forming an integrated transportation system that ensures safety, improves efficiency, improves the environment, and saves energy.
The intelligent vehicle-road cooperative system, vehicle-road cooperative system for short, is one development direction of the intelligent traffic system (ITS). The vehicle-road cooperative system adopts technologies such as advanced wireless communication and the new-generation Internet, carries out omnidirectional, dynamic, real-time vehicle-vehicle and vehicle-road information interaction, and develops active safety control and cooperative road management on the basis of full-time-space dynamic traffic information acquisition and fusion, fully realizing effective cooperation among people, vehicles, and roads, ensuring traffic safety, and improving traffic efficiency, thereby forming a safe, efficient, and environmentally friendly road traffic system. The lane positioning method provided by embodiments of the application can provide technical support for traffic safety and vehicle-road cooperation based on lane-level positioning of vehicles.
The scheme provided by the application can be applied to the technical fields of intelligent vehicle control, automatic driving, and advanced driver assistance. Embodiments of the application can obtain accurate lane-level positioning based on imprecise positioning results, such as satellite positioning results, visual positioning results, or fused positioning results, combined with the lane change detection information of the vehicle.
In the map navigation field, lane-level positioning of a vehicle is very important and is of great significance for determining the lateral position of the vehicle and formulating a navigation strategy. In addition, lane-level path planning and motion control can be performed based on the lane-level positioning result. Lane-level navigation can restore the real road scene, provide fine-grained lane-level action guidance when a lane change is required, reduce the user's difficulty in understanding the navigation, and improve driving safety.
Lane-level navigation may be applied to L2 and L3 autonomous driving scenarios.
Fig. 1 shows a schematic diagram of an application scenario according to an embodiment of the present application.
As shown in fig. 1, the application scenario includes a server and a terminal device. The terminal equipment can comprise a vehicle-mounted terminal and a user terminal. It should be understood that this is merely an exemplary illustration, and is not intended to limit the application scenarios of the embodiments of the present application.
The in-vehicle terminal may include a car computer or an in-vehicle Unit (OBU), etc. The vehicle-mounted terminal may also be an application program (APP) on the terminal, an APP on the intelligent rearview mirror, an APP or applet on the mobile phone, etc., which are not limited herein.
User equipment (UE) may be a wireless or wired terminal device, where a wireless terminal device refers to a device with a wireless transceiver function. The UE may be a mobile phone, a tablet (Pad), a computer with a wireless transceiver function, a virtual reality (VR) user device, an augmented reality (AR) user device, an intelligent voice interaction device, a vehicle-mounted terminal, an aircraft, and the like, which are not limited herein.
The client can be deployed on the terminal device, where the client may run on the terminal device in the form of a browser or a standalone APP; the specific presentation form of the client is not limited herein.
The server may be an independent physical server, a server cluster or distributed system formed by multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDNs (Content Delivery Networks), big data, and artificial intelligence platforms. A server may also serve as a node of a blockchain.
The server may be one or more. Where the servers are multiple, there are at least two servers for providing different services and/or there are at least two servers for providing the same service, such as in a load balancing manner, as embodiments of the application are not limited in this respect.
The terminal device and the server may be directly or indirectly connected through wired or wireless communication, which is not limited in the present application. The present application does not limit the number of servers or terminal devices. The scheme provided by the application can be independently completed by the terminal equipment, can be independently completed by the server, and can be completed by the cooperation of the terminal equipment and the server, and the application is not limited to the scheme.
For example, taking a terminal device as a smart phone, a mobile phone navigation application is installed on the smart phone. The server may send map data to the smartphone. The smart phone combines the real-time positioning data and the map data to determine a real-time lane-level positioning result. Based on the above, the real-time lane positioning result can be displayed on the mobile phone navigation application.
For example, taking a terminal device as an in-vehicle device, an in-vehicle navigation application is installed on the in-vehicle device. The server may transmit map data to the in-vehicle apparatus. The vehicle-mounted equipment can determine a real-time lane-level positioning result by combining the real-time positioning data and the map data. Based on this, the real-time lane positioning result can be displayed on the in-vehicle navigation application.
In the following, related terms related to the embodiments of the present application are described.
High-definition (HD) map: compared with an ordinary electronic navigation map, an HD map contains information such as high-precision coordinates, slope, curvature, heading, and elevation. For example, the map layers of an HD map contain a number of road attribute elements with centimeter-level accuracy, including but not limited to lane line equations/line point coordinates, lane types, lane topology information, lane speed limits, lane line types, pole coordinates, guidepost positions, camera/traffic light positions, etc. During driving, the vehicle can navigate accurately according to the information in the HD map.
Global Navigation Satellite System (GNSS): a general term for satellite navigation systems, including global, regional, and augmentation satellite navigation systems, such as the Global Positioning System (GPS), Russia's GLONASS, Europe's Galileo, and the BeiDou Navigation Satellite System, as well as related augmentation systems such as the Wide Area Augmentation System (WAAS), the European Geostationary Navigation Overlay Service (EGNOS), and the Multi-functional Satellite Augmentation System (MSAS); it also covers other satellite navigation systems under construction or to be built. GNSS is a multi-system, multi-level, multi-mode complex combined system.
Satellite positioning: techniques for positioning using satellites (e.g., GNSS). A position fix is obtained by receiving signals from multiple satellites simultaneously. However, satellite signals fluctuate when crossing the ionosphere and troposphere, which introduces errors, so the positioning accuracy is on the order of meters. A ground-based augmentation station can calculate satellite positioning errors using Real-Time Kinematic (RTK) carrier-phase differential techniques and apply further position corrections to improve positioning accuracy. For example, the accuracy of satellite positioning can be enhanced from the meter level to the centimeter level.
Sensor positioning: refers to a technique for locating using information collected by sensors, such as vision sensors, vehicle speed sensors, inertial measurement unit (Inertial Measurement Unit, IMU) sensors. Including visual positioning, inertial positioning, etc.
Visual positioning: refers to a technique for locating using information collected by a vision sensor. The visual positioning can recognize and sense the attribute elements of the roads in the high-precision map positioning map layer through the visual sensor, and the vehicle position information is calculated by means of a visual algorithm. The visual positioning can highly multiplex the sensors such as the high-precision map and the camera, and the like, without additional hardware deployment, and has obvious advantages in cost.
Inertial positioning: a technique for positioning using information collected by IMU sensors. Inertial positioning measures the angular velocity and acceleration of the vehicle through the IMU sensor and uses Newton's laws of motion to automatically reckon the instantaneous velocity and position of the carrier. It does not depend on external information, does not radiate energy outward, is immune to interference, and has good concealment.
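The velocity-and-position integration just described can be sketched as a single Euler step (a minimal illustration of the idea; a real inertial navigation system also integrates angular rate and corrects for gravity, bias, and drift, none of which the patent details):

```python
def inertial_step(vx: float, vy: float, px: float, py: float,
                  ax: float, ay: float, dt: float):
    """One Euler integration step of inertial positioning (sketch):
    accumulate measured acceleration into velocity, then velocity
    into position."""
    vx, vy = vx + ax * dt, vy + ay * dt   # dv = a * dt
    px, py = px + vx * dt, py + vy * dt   # dp = v * dt
    return vx, vy, px, py

# Constant 1 m/s^2 forward acceleration for one 1 s step from rest.
vx, vy, px, py = inertial_step(0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0)
```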
Multisource fusion localization: the method is a technology for fusing a plurality of positioning technologies based on an information fusion strategy, and can fuse relevant positioning means including satellite positioning, various sensor positioning and the like, so that a better fusion positioning result is obtained compared with a single positioning scheme. High-precision positioning can be realized through multisource fusion positioning. The multi-source fusion localization may also be referred to as multi-sensor fusion localization or fusion localization, without limitation.
Lane change detection: detecting whether the vehicle is changing lanes. Specifically, if the main body of the vehicle (e.g., a preset proportion of the vehicle's footprint) is detected to cross the lane line, it can be determined that the vehicle has changed lanes; alternatively, it may be determined that the vehicle has changed lanes when the vehicle center is detected to cross the lane line, i.e., when the positional relationship between the vehicle center and the lane line is detected to change. The lane change detection result may include information on whether the vehicle has changed lanes and the confidence of that information. Based on the lane change detection result, lane-level positioning of the vehicle can be achieved.
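The center-crossing criterion above can be illustrated with the signed lateral offset of the vehicle center relative to the same lane line (the variable names and the zero-crossing test are an assumption for illustration; the patent gives no formula):

```python
def center_crossed_lane_line(prev_offset: float, curr_offset: float) -> bool:
    """Return True if the vehicle center crossed the lane line between
    two consecutive frames, detected as a sign change of the signed
    lateral offset (left of the line positive, right negative)."""
    return prev_offset * curr_offset < 0.0

# Center moves from 0.4 m left of the line to 0.1 m right of it.
crossed = center_crossed_lane_line(0.4, -0.1)
```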
In the related art, common lane-level positioning schemes are generally lane-level positioning based on visual sensors, lane-level positioning schemes based on multi-sensor fusion, lane-level positioning based on lane change detection, and the like. However, it is impossible to ensure that any of the above-mentioned lane positioning methods does not make errors for a long period of time, and the positioning accuracy and integrity of each lane positioning method cannot be ensured.
For example, if the positioning observations based on the visual sensor contain no errors or omissions, the visual positioning algorithm can theoretically obtain a high-precision result and assist fused positioning well. However, when the visual positioning observations have only limited precision and recall, once the visual observation is wrong over a long period, visual positioning will work abnormally over that period, and its positioning accuracy and integrity cannot be guaranteed. The precision of multi-sensor fused positioning built on visual positioning is then also greatly affected.
In view of the above, the embodiments of the present application provide a method, apparatus, device, and storage medium for positioning a lane, which can help to improve the accuracy of positioning the lane.
Specifically, lane change detection information of a vehicle at a first time can be obtained; probability distribution prediction information of the vehicle over a plurality of lanes at the first time is then obtained according to the lane change detection information and first probability distribution information of the vehicle over the plurality of lanes at a second time; probability distribution measurement information of the vehicle over the plurality of lanes at the first time is obtained according to the positioning information of the vehicle at the first time; and finally, Kalman filtering is performed on the probability distribution prediction information and the probability distribution measurement information to determine second probability distribution information of the vehicle over the plurality of lanes at the first time.
Therefore, the embodiment of the application can fuse the probability distribution prediction information of the vehicle in the multiple lanes, which is obtained based on the lane change detection information of the vehicle, and the probability distribution measurement information of the vehicle in the multiple lanes, which is obtained based on the positioning information of the vehicle, so as to obtain the probability distribution information of the vehicle in the multiple lanes. The lane positioning result determined based on the lane change detection information and the observed lane positioning result are fused, so that the final positioning result is not strongly dependent on a certain type of positioning result, and the lane positioning accuracy can be improved.
Wherein the second time includes a time before the first time. For example, when the first time is the current time, the second time may be the previous time, which is not limited by the present application.
In some embodiments, the positioning information may include at least one of visual positioning information, fused positioning information, satellite positioning information, and inertial positioning information.
Taking the case where the positioning information includes visual positioning information or fused positioning information: when the information collected by the visual sensor (i.e., the visual perception information) is abnormal, the visual positioning result or fused positioning result necessarily contains an error. Embodiments of the application can obtain probability distribution prediction information of the vehicle over multiple lanes in combination with the lane change detection result of the vehicle, and obtain probability distribution measurement information of the current vehicle over the multiple lanes according to the visual positioning result or fused positioning result. Then, Kalman filtering (Kalman Filter) is performed on the probability distribution prediction information and the probability distribution measurement information, so that unreasonable visual or fused positioning results can be filtered out. That is, embodiments of the application can use the lane change detection result to perform secondary lane-number positioning after the visual or fused positioning result and reject unreasonable results. Therefore, embodiments of the application can improve the precision of visual or fused positioning.
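A per-lane scalar Kalman-style update gives a feel for how the prediction (from lane change detection) and the measurement (from visual or fused positioning) might be combined; the variance values and the renormalization step are illustrative assumptions, since the text here does not specify the filter's exact form:

```python
def fuse_lane_distributions(pred, pred_var, meas, meas_var):
    """Kalman-style fusion of two probability distributions over lanes:
    the gain weighs the measurement against the prediction, and the
    result is renormalized so it stays a probability distribution."""
    gain = pred_var / (pred_var + meas_var)               # Kalman gain K
    fused = [p + gain * (m - p) for p, m in zip(pred, meas)]  # x + K*innovation
    total = sum(fused)
    return [f / total for f in fused]

# Prediction trusts lane 2 (index 1); a noisier measurement leans to lane 3.
fused = fuse_lane_distributions([0.10, 0.70, 0.20], 0.2,
                                [0.10, 0.40, 0.50], 0.8)
```

With the measurement variance set higher than the prediction variance, the gain is small and the fused distribution still favors lane 2, which is how an unreasonable one-off observation gets suppressed.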
The following describes the technical scheme of the embodiments of the present application in detail through some embodiments. The following embodiments may be combined with each other, and some embodiments may not be repeated for the same or similar concepts or processes.
Fig. 2 is a schematic flow chart of a lane positioning method 200 according to an embodiment of the present application. The execution subject of the method 200 may be any electronic device with data processing capability, implemented, for example, as a terminal device such as the lane-level positioning module of an in-vehicle terminal or user terminal; the application is not limited in this regard.
As shown in fig. 2, method 200 may include steps 210 through 240.
And 210, acquiring lane change detection information of the vehicle at the first time.
Specifically, the lane change detection information is used to indicate whether the vehicle has changed lanes, and may include, for example, the lane change decision and the confidence of that decision. For example, the lane change detection information may indicate that a lane change has occurred with a confidence of 50%; or it may indicate that no lane change has occurred with a confidence of 80%.
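As a data structure, the lane change detection information described above might look like the following (the field names, and the direction field, are illustrative assumptions rather than the patent's definitions):

```python
from dataclasses import dataclass

@dataclass
class LaneChangeDetection:
    changed: bool       # whether a lane change was detected
    confidence: float   # confidence of the changed / not-changed decision
    direction: int = 0  # -1 left, +1 right, 0 none (assumed extension)

# "A lane change has occurred, with confidence 50%."
info = LaneChangeDetection(changed=True, confidence=0.5, direction=-1)
```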
In some embodiments, lane change detection information may be obtained by lane change detection of the vehicle based on at least one of the visual perception information and the vehicle heading information.
The visual perception information refers to images collected by a visual sensor (such as a camera) mounted on the vehicle for the road on which the vehicle is traveling, and the recognition results obtained by using an artificial intelligence model to recognize at least one of lane lines, vehicles, pedestrians, road markings, and the like in the images.
The vehicle heading information refers to information such as the distance and direction of the vehicle's movement relative to its position at the current moment, and can be used to calculate the position of the vehicle at the next moment. The vehicle heading information can be obtained from information collected by the vehicle speed sensor or the IMU sensor of the vehicle, and the application is not limited in this respect.
As one possible implementation, a visual lane line equation may be obtained from the visual perception information, and lane change detection is then performed on the vehicle based on double-line crossing or single-line crossing of the visual lane line equation, so as to obtain the lane change detection information. As an example, in a scenario where the vehicle is traveling normally, lane change detection may be performed using only the lane line equation obtained from the visual perception information.
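For the single-line case, a common practice (assumed here; the patent does not spell out the formula) is to track the constant term c0 of a visual lane line equation y = c0 + c1*x + c2*x^2 + ... in the vehicle frame: c0 is the lateral offset of the tracked line at the vehicle, and a crossing shows up as a sign flip of c0 with a jump of roughly a lane width when the tracker re-associates:

```python
def single_line_lane_change(prev_c0: float, curr_c0: float,
                            jump_threshold_m: float = 2.0) -> bool:
    """Detect a lane change from the lateral offset c0 of one tracked
    lane line: a sign flip combined with a jump larger than the
    threshold (threshold value is an illustrative assumption)."""
    return (prev_c0 * curr_c0 < 0.0
            and abs(curr_c0 - prev_c0) > jump_threshold_m)
```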
As another possible implementation, whether the vehicle changes lanes may be computed from the vehicle heading information: based on a Dead Reckoning (DR) algorithm, the next position is reckoned from the vehicle position at the current moment and the vehicle heading information of the vehicle's movement (such as the distance and azimuth of the movement). As an example, since the visual perception information may fail to recognize lane lines in a traffic-jam scenario, lane change detection may then be performed using the vehicle heading information to obtain the lane change detection information.
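One dead-reckoning step, advancing the position by the traveled distance along the measured azimuth, can be sketched as follows (a minimal illustration; the patent does not give the DR formulation, and the angle convention here is an assumption):

```python
import math

def dead_reckon(x: float, y: float, azimuth_rad: float, distance_m: float):
    """Advance the current position by distance_m along azimuth_rad
    (angle measured from the +x axis, an illustrative convention)."""
    return (x + distance_m * math.cos(azimuth_rad),
            y + distance_m * math.sin(azimuth_rad))

# Drive 10 m along the +x axis from the origin.
nx, ny = dead_reckon(0.0, 0.0, 0.0, 10.0)
```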
In some embodiments of the present application, each lane change of the vehicle may be accurately detected in the above manner to obtain the lane change detection information. Optionally, the lane change detection result in the embodiments of the application can achieve both high accuracy and high recall, so that the reliability of the lane change detection result is ensured. The recall rate is the probability that a sample which is actually positive is predicted as positive. A high recall rate means that more false detections are possible, but by trying to find every actual lane change, the potential safety hazard of a missed detection is avoided.
In some embodiments, the visual positioning results may also be obtained based on at least one of visual perception information and vehicle heading information, optionally in combination with lane-level map data, such as HD maps. Optionally, the fusion positioning result may be obtained according to at least one of the visual positioning result, the satellite positioning and the inertial positioning. Wherein the visual positioning result or the fusion positioning result comprises a lane number.
In some embodiments, when the positioning information (such as the visual positioning result and/or the fusion positioning result) of the vehicle meets the first condition, it may be determined to start acquiring the probability distribution prediction information and the probability distribution measurement information of the vehicle in multiple lanes, that is, start positioning the secondary lane number of the vehicle. The obtained visual positioning result, fusion positioning result or lane change detection result can be regarded as the first lane number positioning of the vehicle.
Illustratively, the first condition may specifically include at least one of:
1) The confidence of the positioning information of the vehicle is larger than a preset threshold value within a preset duration;
2) Lane-level map data corresponding to positioning information of the vehicle satisfies that the number of lanes is 1 or 2;
3) Lane-level map data corresponding to positioning information of a vehicle satisfies that no guide belt exists;
4) The lane-level map data corresponding to the positioning information of the vehicle satisfies that the vehicle is not at the intersection position.
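The first condition above can be sketched as a simple check. The patent phrases the four items as alternatives ("at least one of"); the sketch below combines all four to illustrate a conservative starting criterion, and every name and threshold here is an assumption.

```python
def secondary_positioning_enabled(confidence_history, threshold,
                                  lane_count, has_guide_belt, at_intersection):
    """Decide whether to start the secondary lane-number positioning.

    confidence_history: positioning confidences sampled over the preset
    duration. The conjunction of all four items is an illustrative choice;
    the patent allows any subset of them.
    """
    confident = all(c > threshold for c in confidence_history)   # item 1)
    simple_map = (lane_count in (1, 2)       # item 2): 1 or 2 lanes
                  and not has_guide_belt     # item 3): no guide belt
                  and not at_intersection)   # item 4): not at an intersection
    return confident and simple_map
```

This mirrors the idea that the secondary positioning should start only from a trusted position on a "simple" road section.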
Specifically, the secondary lane number positioning of the vehicle is performed to cope with the problem of inaccurate lane number positioning in the positioning information (i.e., the first lane number positioning), such as the visual positioning result or the fusion positioning result. Therefore, the limitation of the first condition described above ensures that the secondary positioning of the lane number starts from a trusted position.
The condition in 1) above, that the confidence of the positioning information of the vehicle is greater than a preset threshold value within a preset duration, means that the vehicle maintains a high-confidence visual positioning result for a long time, that is, the positioning information of the vehicle satisfies a strict condition over a period of time. Conditions 2)-4) constrain the lane-level map data corresponding to the positioning information of the vehicle, such as the HD map corresponding to the current position of the vehicle, so that the lane-level map data is relatively "simple": for example, the number of lanes is small, there are no complex lane lines such as guide belts, and there is no intersection, i.e., the vehicle is currently on a road section where errors are unlikely.
It should be noted that the starting condition for the secondary positioning of the lane number is closely related to the reason why the lane positioning scheme is provided in the embodiments of the present application. If the accuracy of the fused positioning were high enough, or its integrity good enough, the necessity of a secondary lane positioning would be greatly reduced. In the absence of errors and omissions in the visual observation, the visual positioning algorithm can theoretically obtain a high-precision positioning result and assist the fusion positioning algorithm well. However, when the accuracy of the visual positioning result is not high and it only achieves a certain recall rate, once the visual observation is in error for a long time, the visual positioning operates abnormally for a long time, so that the accuracy and integrity of the visual positioning cannot be ensured, which greatly affects the fusion positioning. That is, the secondary positioning of the lane number in the embodiments of the application can intercept the problem of abnormal operation of the visual positioning module caused by inaccurate visual observation. On this basis, when the visual positioning result or the fusion positioning result meets the first condition, it can be determined to start the secondary positioning of the lane number, thereby improving the accuracy and integrity of the visual positioning or the fusion positioning.
In some embodiments, when the positioning information of the vehicle meets the first condition, that is, when it is determined to start the secondary lane number positioning of the vehicle, the current state of the vehicle, i.e., the probability distribution information of the vehicle currently located in the multiple lanes, may be used as the initial probability distribution information to initialize the Kalman filtering system. For example, the initial probability distribution information may be determined based on the visual positioning information or the fusion positioning information of the current vehicle.
Illustratively, the system state quantity of the Kalman filtering system includes a state variable X and a covariance matrix P. Wherein X represents probability distribution values of vehicles in a plurality of different lanes, and P represents accuracy of the probability distribution values X. Accordingly, the initial probability distribution information may include an initial probability distribution value and a covariance matrix corresponding to the initial probability distribution value.
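The initialization of the state variable X and covariance matrix P can be sketched as below. The diagonal initial variance is an assumed placeholder; the patent only states that the initial probabilities come from the visual or fused positioning result.

```python
def init_lane_filter(init_probs, init_var=0.1):
    """Initialise the Kalman filter state.

    X: per-lane probability distribution values (from visual/fused positioning).
    P: covariance of X, here an assumed diagonal matrix with variance init_var.
    """
    X = list(init_probs)
    n = len(X)
    P = [[init_var if i == j else 0.0 for j in range(n)] for i in range(n)]
    return X, P

# Example: vehicle most likely in the first of three lanes.
X, P = init_lane_filter([0.5, 0.3, 0.2])
```

Before this initialization, as noted below, the positioning information can simply be output directly.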
In some embodiments, positioning information, such as visual positioning results or fusion positioning results, may be output directly prior to initialization of the Kalman filtering system without secondary lane number positioning of the vehicle.
After the Kalman filtering system is initialized, probability distribution prediction information and probability distribution measurement information of the vehicle in a plurality of lanes can be obtained according to the lane change detection result of the vehicle, and a Kalman filtering method is adopted to position the vehicle for a secondary lane number. In particular, see the description of steps 220 through 240 below.
220, acquiring probability distribution prediction information of the vehicle located in the multiple lanes at a first time according to the lane change detection information and first probability distribution information of the vehicle located in the multiple lanes at a second time, wherein the second time is a time before the first time.
Illustratively, the first time may be the current time and the second time may be the last time, which is not limited by the present application.
For example, the first probability distribution information may include a first probability distribution value and a covariance matrix of the first probability distribution value; the probability distribution prediction information may include a probability distribution prediction value and a covariance matrix of the probability distribution prediction value, which is not limited in the present application.
Specifically, a state equation can be constructed based on the Kalman filtering principle according to the lane change detection information and the first probability distribution information of the vehicles in the multiple lanes at the second time, so as to predict and obtain the probability distribution prediction information of the vehicles in the multiple lanes at the first time. The state equation is used for predicting the state at the current moment according to the state information at the previous moment and the system noise.
By way of example, the state equation may be represented by the following equation (1):
X(k|k-1)=AX(k-1|k-1)+B U(k)+ W(k) (1)
wherein X(k|k-1) is the state predicted at time k from time (k-1), X(k-1|k-1) is the optimal state estimate at time (k-1), A and B are system parameters, U(k) is the control quantity of the state at time k, and W(k) is the process noise at time k. The covariance matrix of W(k) is Q. Optionally, if there is no control quantity, U(k) may be 0.
Optionally, the covariance matrix of X(k|k-1) can also be determined, as shown in the following formula (2):
P(k|k-1)=AP(k-1|k-1) A'+Q (2)
wherein P (k|k-1) is the covariance corresponding to X (k|k-1), P (k-1|k-1) is the covariance corresponding to X (k-1|k-1), A' represents the transpose matrix of A, and Q is the covariance of system noise.
In some embodiments, prediction of the system state can be achieved based on the above formulas (1) and (2), and the probability distribution predicted values of the vehicle in the plurality of lanes and the covariance matrix of the probability distribution predicted values are obtained.
As a specific example, suppose the vehicle is near the right side of the first lane, and lane change detection is performed several times. Assume the probability distribution value of the vehicle in three lanes is initially [0.5, 0.3, 0.2]. First, a lane change detection result indicating a rightward lane change with a probability of 50% is detected (at this point the vehicle has approached the right lane line), i.e., the vehicle has a tendency to change lanes to the right. Lane number prediction can then be performed according to the lane change detection result, and a state equation can be constructed to perform a state transition on the state variable X; the probability distribution value of the vehicle in the three lanes becomes [0.4, 0.35, 0.25]. Then, a lane change detection result indicating a rightward lane change with a probability of 90% is detected (at this point the vehicle has mostly crossed the right lane line); the state equation can continue to be constructed according to the lane change detection result, the state transition is applied to the state variable X again, and the probability distribution value of the vehicle in the three lanes becomes [0.3, 0.4, 0.3]. Lane change detection may continue to be performed on the vehicle until the peak of the state variable X has shifted from the first lane to the second lane. Optionally, the covariance matrix P of each state variable X may be determined together with each state variable X.
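The prediction step of equations (1) and (2), driven by a lane-change-based state transition, can be sketched as follows. The construction of A, moving a fraction of each lane's probability mass one lane in the lane-change direction, is an assumption for illustration; the patent does not give the exact form of A, and the resulting numbers differ from those in the worked example above.

```python
def matmul(A, B):
    """Plain-Python matrix product."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def transition_matrix(n, p_change, direction):
    """Assumed A: shift a fraction p_change of each lane's probability one
    lane in the detected direction (+1 = right, -1 = left)."""
    A = [[0.0] * n for _ in range(n)]
    for i in range(n):
        A[i][i] = 1.0 - p_change
        j = i + direction
        if 0 <= j < n:
            A[j][i] += p_change
        else:
            A[i][i] += p_change  # probability mass stays at the boundary lane
    return A

def predict(X, P, A, Q):
    """Equations (1)-(2): X(k|k-1) = A X(k-1|k-1); P(k|k-1) = A P A' + Q."""
    Xp = [row[0] for row in matmul(A, [[x] for x in X])]
    APA = matmul(matmul(A, P), transpose(A))
    n = len(X)
    Pp = [[APA[i][j] + Q[i][j] for j in range(n)] for i in range(n)]
    return Xp, Pp

X = [0.5, 0.3, 0.2]
P = [[0.1 if i == j else 0.0 for j in range(3)] for i in range(3)]
Q = [[0.01 if i == j else 0.0 for j in range(3)] for i in range(3)]
A = transition_matrix(3, 0.5, +1)   # rightward lane change detected at 50 %
X1, P1 = predict(X, P, A, Q)        # probability mass shifts toward the right
```

Repeating `predict` with a higher lane-change probability continues to shift the peak of X rightward, as in the example above.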
In some embodiments, after the Kalman filtering system is initialized, when the secondary lane positioning is performed for the first time, the probability distribution prediction information of the vehicle in the multiple lanes at the current moment can be determined according to the lane change detection information and the initial probability distribution information of the vehicle in the multiple lanes.
Optionally, the probability distribution prediction information, such as the state variable X and the covariance matrix P, may be modified according to the topological relation of the lanes in the lane-level map data. Optionally, the covariance matrix Q of the system noise may also be modified.
As an example, when the total number of lanes changes and a topology other than 1 to 1 occurs in the lane where the own vehicle is located, the matrices X and P may be modified; when a 1-to-many topology occurs in the lane where the own vehicle is located, both matrices X, P and Q may be modified.
As an example, when a lane change occurs in a vehicle, the lane number in which the own vehicle is located changes, a modification to the matrix X, P is required. Optionally, the matrix Q may be additionally adjusted in combination with the accuracy (or confidence) of the current lane change.
It should be noted that, the corresponding matrix may be adjusted in combination with a specific road scene where the vehicle is located, and the specific adjustment mode is not limited by the present application. For example, when the width of the lane where the host vehicle is located is too wide or too narrow, additional adjustments may be made to the matrix Q, which is not limiting in the present application.
And 230, acquiring probability distribution measurement information of the first time vehicle in a plurality of lanes according to the positioning information of the first time vehicle.
In some embodiments, the positioning information may include at least one of visual positioning information, fused positioning information, satellite positioning information, and inertial positioning information. The positioning information may include a lane number where the current vehicle is located and a confidence that the vehicle is located in that lane number.
For example, the probability distribution measurement information for a vehicle located in a plurality of lanes may include probability distribution measurements and a covariance matrix of the probability distribution measurements.
In some embodiments, probability distribution measurement information of the vehicle located in the multiple lanes at the first time, that is, the observed value at the current moment, may be obtained according to the positioning information of the vehicle. By way of example, the observation at the present time may be expressed as the following formula (3):
Z(k)=H X(k)+V(k) (3)
wherein Z (k) represents a measured value at the moment k, X (k) represents a system state at the moment k, H is a measurement system parameter, V (k) represents noise in a measurement process, and a covariance matrix of V (k) is R.
As a specific example, when the positioning information indicates that the vehicle is currently in the third lane from the left and the confidence is 20%, it may be determined that the probability distribution measurement value of the current vehicle in the multiple lanes is [0.3, 0.3, 0.4]. Optionally, when determining the probability distribution measurement value, a covariance matrix of the probability distribution measurement value may also be determined. Optionally, the covariance matrix R of the observation noise may also be determined according to the confidence of the positioning information.
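One way to turn a lane-number observation into a measurement vector Z is sketched below. The mapping is an assumption: the observed lane receives a probability mass equal to an assumed weight, and the remainder is spread uniformly over the other lanes, which reproduces the [0.3, 0.3, 0.4] example when the weight is 0.4. The patent does not specify how the confidence maps to the distribution.

```python
def measurement_from_positioning(lane_idx, lane_weight, n_lanes):
    """Build the measurement vector Z over n_lanes (assumed construction):
    the observed lane gets lane_weight, the remaining mass is spread
    uniformly over the other lanes."""
    rest = (1.0 - lane_weight) / (n_lanes - 1)
    return [lane_weight if i == lane_idx else rest for i in range(n_lanes)]

# Third lane from the left (index 2) with an assumed weight of 0.4.
Z = measurement_from_positioning(2, 0.4, 3)
```

The covariance R of the observation noise would then shrink as the confidence of the positioning information grows.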
And 240, carrying out Kalman filtering processing on the probability distribution prediction information and the probability distribution measurement information, and determining second probability distribution information of the vehicle in a plurality of lanes at the first time.
Specifically, the probability distribution prediction information may be used as the predicted value, the probability distribution measurement information may be used as the measured value, and the predicted value and the measured value may be combined according to the Kalman filtering principle to obtain the optimal estimate of the current state, that is, the second probability distribution information.
For example, the optimal estimate of the current state may be shown in equation (4) as follows:
X(k|k)= X(k|k-1)+Kg(k) (Z(k)-H X(k|k-1)) (4)
wherein X(k|k) represents the optimal estimate of the current state, and Kg(k) is the Kalman gain at time k, which may be expressed as the following formula (5):
Kg(k)= P(k|k-1) H' / (H P(k|k-1) H' + R) (5)
where H' represents the transposed matrix of H.
In some embodiments, in order to make the kalman filter continuously run, the covariance matrix corresponding to the state X (k|k) at the k moment may be updated, which may be specifically shown in the following formula (6):
P(k|k)=(I-Kg(k) H)P(k|k-1) (6)
where I is the identity matrix (for a single-measurement model, I = 1). When the system enters time (k+1), P(k|k) takes the place of P(k-1|k-1) in the above formulas, so that the optimal estimate of the system at time (k+1) can be determined.
In some embodiments, as shown in fig. 3, the second probability distribution information of the current vehicle in multiple lanes may also be determined by applying a corresponding filtering policy for different application scenarios as follows steps 241 to 243.
241, determining a confidence level of the positioning information according to lane-level map data corresponding to the current positioning information of the vehicle.
The lane-level map data may be, for example, an HD map. Specifically, the confidence of the positioning information may be classified by comprehensively considering the information such as the number of lanes, the width of the lanes, the curvature of the road, and the like in the lane-level map data corresponding to the current positioning information according to the lane-level map data.
As a specific example, from the HD map, it is possible to obtain: the vehicle is positioned on a 'subsidiary road' of a '2 lane', the road width 'meets the standard', the road 'is not curved', the lane line 'is not virtual and actually exists', and a '1 broken line' exists on the road, wherein the content in the quotation marks in the description is judged according to the HD map, and the information can be used as the judgment condition of the confidence degree classification of the positioning information.
In some embodiments, the confidence of the positioning information may be classified into three levels, namely a first level, a second level and a third level, according to the lane-level map data corresponding to the current positioning information. The second level is higher than the third level, and the third level is higher than the first level; from low to high, the three levels are the first level (low), the third level (medium) and the second level (high).
Optionally, when the confidence of the positioning information is smaller than a preset value, the confidence level corresponding to the positioning information is determined to be the first level. That is, a positioning result whose original confidence is low can be directly classified into the lowest confidence level, namely the first level.
For a positioning result with a higher original confidence, the scene can be judged according to the lane-level map data, and the corresponding confidence level can be further subdivided. For example, a vehicle can observe the middle lane line with a high probability in a 2-lane scene, but can hardly observe the middle lane line in a scene with 3 or more lanes; likewise, the observation accuracy of the lane line equation is lower when the vehicle is on a curve with large curvature than on a straight lane. Accordingly, the confidence level of the positioning information may be further subdivided based on at least one of the number of lanes, the road type, the road width, the road curvature, and whether the observed lane lines correspond to the lane-level map data.
In some embodiments, in determining the confidence level of the positioning information according to the lane-level map data, the conditions of different degrees of strictness may be constructed by stacking a plurality of conditions, thereby achieving further subdivision of the confidence level of the positioning information.
As one possible implementation, when the confidence level of the current positioning information is greater than or equal to the preset value and the lane-level map data meets at least one of the following second conditions, the confidence level is determined to be the second level, that is, the highest confidence level:
the number of lanes is less than 3;
the type/color of the visually observed lane line completely corresponds to the lane-level map data;
the visually observed broken line completely corresponds to the lane-level map data;
the curvature radius of the lane is larger than a preset first value;
the predetermined range around the vehicle is free of obstacles.
The preset first value is a large value, i.e., a large radius of curvature means the road is nearly straight. The preset range may be a suitable range within which the visual perception information of the vehicle can be obtained reliably. As a specific example, the preset first value may be set to 5000 m and the preset range may be set to 10 m.
As another possible implementation, when the confidence level of the current positioning information is greater than or equal to the preset value, and the lane-level map data satisfies at least one of the following third conditions, the confidence level is determined to be a third level, that is, a medium confidence level:
The number of lanes is 3 or more and 4 or less;
the visually observed lane line type/color portion corresponds to lane-level map data;
the curvature radius of the lane is larger than a preset second value and smaller than a preset first value; wherein the preset second value is smaller than the preset first value.
Illustratively, "partly corresponds" may mean that more than 70% of the visually observed lane line types/colors correspond to the lane-level map data. The preset second value is smaller than the preset first value, and the corresponding road has a certain degree of curvature. As a specific example, the preset second value may be set to 2000 m.
Fig. 4 shows a specific example of determining the confidence level of the positioning information. Specifically, after the confidence of the positioning information is obtained, the confidence level of the positioning information can be classified by combining it with the scene-based classification result of the HD data. As shown in fig. 4, three categories of conditions may be obtained according to the confidence of the positioning information and the scene-based classification result of the HD data: strict conditions, relatively strict conditions, and no conditions. The strict conditions correspond to a positioning result with low recall and extremely high accuracy; the relatively strict conditions correspond to a positioning result with low recall and high accuracy; no conditions corresponds to a positioning result with high recall.
As a possible implementation, referring to fig. 4, positioning information with low original confidence (for example, less than a preset value) may be classified into an unconditional category, which corresponds to a lowest level of confidence, for example, level 3.
As another possible implementation, the positioning information with a high original confidence (for example, greater than the preset value) may be further subdivided by using the scene-based classification result of the HD data, comprehensively considering one or more of the number of lanes, the lane type, the road width, the curvature, and the like. With continued reference to fig. 4, for example, the confidence level may be determined to be level 1 for a scene in which the HD data satisfies at least one of the above-described second conditions, and level 2 for a scene in which the HD data satisfies at least one of the above-described third conditions.
The above-described examples illustrate examples of determining the confidence level of the positioning information from the lane-level map data, and the present application is not limited thereto. For example, the above-mentioned determination conditions or preset values may also be adjusted according to the accumulation of experience of the real vehicle test, which is within the scope of the embodiments of the present application.
242, determining a filtering strategy according to at least one of the confidence level and the confidence of the lane change detection information.
In some embodiments, a first threshold (a high threshold) and a second threshold (a low threshold) may be set, and the magnitude relationship between the confidence of the lane change detection information and the first and second thresholds is determined, so as to classify the confidence of the lane change detection information. For example, the confidence of the lane change detection information may be determined to be high when it is greater than or equal to the first threshold; medium when it is less than the first threshold and greater than or equal to the second threshold; and low when it is less than the second threshold.
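The two-threshold classification just described can be sketched in a few lines; the concrete threshold values here are assumptions, since the patent does not specify them.

```python
def classify_lane_change_confidence(conf, high_thr=0.8, low_thr=0.4):
    """Classify the lane change detection confidence against the first
    (high) and second (low) thresholds; threshold values are illustrative."""
    if conf >= high_thr:
        return "high"
    if conf >= low_thr:
        return "medium"
    return "low"
```

The resulting level feeds directly into the filtering-strategy selection of step 242.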
For example, the lane change detection method is prone to failure in the following scenario, where the corresponding lane change detection information may correspond to a lower confidence (i.e., may correspond to a low level confidence):
1) The road is complex, for example, a scene with larger curvature such as lane expansion, lanes with gradually increased lane width, roundabout and the like occurs;
2) Road wear, resulting in a scene where the lane lines of the road are not visible;
3) The visual perception information error causes the visual lane change failure and other scenes.
Further, the filtering strategy may be determined by combining the confidence level of the positioning information and the confidence level of the lane change detection result and the lane change detection information (i.e., the magnitude relation between the confidence of the lane change detection information and the first and second thresholds).
It should be noted that, by classifying the confidence of the positioning information and the confidence of the lane change detection information, it is possible to identify, at low recall, positioning results and lane change detection results that are almost certainly correct. These two types of results can then be used as important correction quantities in the Kalman filtering, and robust processing, such as initialization, parameter correction or gross error rejection, can be performed on the probability distribution prediction information or the probability distribution measurement information during filtering.
For example, the filtering strategy may be determined as follows, corresponding to the three confidence levels of the positioning information and the confidence level of the lane change detection information described above:
1) If the confidence level is the first level and the confidence of the lane change detection information is greater than or equal to the first threshold (i.e., the confidence of the lane change detection information is high), performing gross error rejection on the probability distribution measurement information, and determining the second probability distribution information according to the probability distribution prediction information;
2) If the confidence level is the second level and the confidence of the lane change detection information is smaller than the second threshold (i.e., the confidence of the lane change detection information is low), performing gross error rejection on the probability distribution prediction information, and determining the second probability distribution information according to the probability distribution measurement information;
3) If the confidence level is the third level or the second level and the confidence level of the lane change detection information is greater than or equal to a second threshold (i.e. the confidence level of the lane change detection information is the high level or the medium level), filtering the probability distribution prediction information and the probability distribution measurement information to obtain second probability distribution information;
4) And initializing a filtering system if the confidence level is the first level and the confidence of the lane change detection information is smaller than the second threshold (namely, the confidence of the lane change detection information is the low level).
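Rules 1)-4) above can be sketched as a small dispatch function. The level and strategy names are assumptions introduced for illustration; combinations not covered by the four rules fall back to an assumed default.

```python
def select_filter_strategy(pos_level, lc_conf_level):
    """Map (positioning confidence level, lane-change confidence level) to a
    filtering strategy per rules 1)-4). Positioning levels, from low to high:
    'first' < 'third' < 'second'."""
    if pos_level == "first" and lc_conf_level == "high":
        return "reject_measurement_use_prediction"   # rule 1)
    if pos_level == "second" and lc_conf_level == "low":
        return "reject_prediction_use_measurement"   # rule 2)
    if pos_level in ("second", "third") and lc_conf_level in ("high", "medium"):
        return "kalman_fuse"                         # rule 3)
    if pos_level == "first" and lc_conf_level == "low":
        return "reinitialize"                        # rule 4)
    return "kalman_fuse"  # assumed default for combinations the rules omit
```

The selected strategy then drives step 243: normal fusion, one-sided gross error rejection, or reinitialization of the filtering system.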
For 1), the confidence level of the positioning information is low while the confidence of the lane change detection information is high, indicating that the error of the positioning information is large. In this case, gross error rejection can be performed on the probability distribution measurement information, and the second probability distribution information is determined according to the probability distribution prediction information; that is, the probability distribution of the vehicle in the multiple lanes is predicted normally according to the lane change detection information, and the current lane number is determined from the probability distribution prediction information.
For 2), the confidence level of the positioning information is high while the confidence of the lane change detection information is low, indicating that the error of the lane change detection information is large. In this case, gross error rejection can be performed on the probability distribution prediction information, and the second probability distribution information is determined according to the probability distribution measurement information; that is, the current lane number is determined normally according to the positioning information.
For 3), the confidence of the positioning information and the confidence of the lane change detection information are both high and therefore reliable; filtering processing can be performed on the predicted probability distribution prediction information and the measured probability distribution measurement information to obtain the current lane number.
Optionally, the filtering parameters may be further adjusted according to at least one of the lane-level map data and the confidence of the lane change detection information, where the filtering parameters include at least one of the covariance matrix P of the probability distribution predicted value, the covariance matrix Q of the noise of the probability distribution predicted value, and the covariance matrix R of the noise of the probability distribution measured value. The probability distribution prediction information and the probability distribution measurement information may then be subjected to filtering processing according to the adjusted filtering parameters, e.g., based on the above formulas (1) to (6), to obtain the second probability distribution information.
It should be noted that the filtering parameters may be adjusted in combination with a specific road scene where the vehicle is located, and the specific adjustment mode is not limited in the present application.
For 4), the confidence of the positioning information and the confidence of the lane change detection information are both low, and their errors are large; therefore it can be determined to reinitialize the filtering system, and the current lane number is not output. As an example, the filtering system may be initialized when the vehicle again satisfies the first condition described above, that is, when the initial probability distribution information of the vehicle located in the multiple lanes is determined.
243, filtering the probability distribution prediction information and the probability distribution measurement information according to a filtering strategy to obtain second probability distribution information.
Therefore, the embodiment of the application can fuse the probability distribution prediction information of the vehicle over the plurality of lanes, obtained from the lane change detection information, with the probability distribution measurement information of the vehicle over the plurality of lanes, obtained from the positioning information, to obtain the probability distribution information of the vehicle over the plurality of lanes. Because the lane positioning result derived from the lane change detection information is fused with the observed lane positioning result, the final positioning result does not depend strongly on any single type of positioning result, which improves lane positioning accuracy.
For example, when the visual perception information is abnormal, the visual positioning result necessarily contains an error, and that error gradually propagates into the fusion positioning result. In the embodiment of the application, an abnormal fusion positioning result can be filtered out using the lane positioning result determined from the lane change detection information, so as to obtain a positioning result with higher precision.
Fig. 5 shows a schematic flow chart of another lane positioning method 400 provided by an embodiment of the application. In the method 400, the positioning information is described taking the fusion positioning result as an example. The execution subject of the method 400 may be any electronic device with data processing capabilities, which may be implemented, for example, as a terminal device, such as a lane-level positioning module of an in-vehicle terminal or a user terminal.
It should be understood that fig. 5 illustrates steps or operations of a method of lane positioning, but these steps or operations are merely examples, and that other operations or variations of the operations in fig. 5 may also be performed by embodiments of the present application. Furthermore, the various steps in fig. 5 may be performed in a different order than presented in fig. 5, and it is possible that not all of the operations of fig. 5 are to be performed.
And 401, acquiring HD data. Here, HD (high-definition map) data is a specific example of lane-level map data.
402, visual perception information is acquired.
Specifically, an image acquired by a visual sensor (such as a camera) mounted on the vehicle for a road on which the vehicle is traveling may be acquired, and at least one of a lane line, a vehicle, a pedestrian, a road sign, and the like in the image may be identified by using an artificial intelligence model, and the obtained identification result may be used as the visual perception information.
And 403, obtaining a fusion positioning result.
For example, visual positioning results may be obtained from visual perception information and HD data. Then, the visual positioning result, the inertial positioning result, the satellite positioning result and the like can be fused to obtain a fused positioning result.
404, obtaining a lane change detection result.
The lane change detection result can be obtained by a double-line lane change detection method based on a visual lane line equation, a single-line lane change detection method based on a visual lane line equation or a DR lane change detection method based on vehicle heading information, which is not limited in the application.
For example, in a conventional scenario, double lane change detection, or single lane change detection, may be performed using only visually perceived lane-line equations. For another example, in a traffic jam scenario, the visual perception may not observe the lane line, and at this time, lane change detection may be performed using the vehicle heading information.
405, it is determined whether to start the twice positioning of the lane number.
Specifically, it may be determined whether the visual positioning result or the fusion positioning result meets the first condition; if so, it is determined to start the secondary positioning of the lane number, that is, step 406 is executed next; if not, it is determined that the secondary positioning of the lane number is not started.
Here, the first condition ensures that the secondary positioning of the lane number starts from a high-confidence position (a position whose visual positioning/fusion positioning result maintains high confidence for a long time, or a position corresponding to relatively simple HD data). In particular, the first condition may be described with reference to step 210 of fig. 2.
Alternatively, in the case where it is determined that the secondary positioning of the lane number is not started, step 411 may be executed to output the lane number result. The lane number result may be determined directly from the fused location result.
406, lane number secondary positioning.
Specifically, the lane number secondary positioning algorithm in the embodiment of the application is essentially a Kalman filter. As an example, the motion of the vehicle can be predicted from the topological relation of the HD data and the lane change information to determine the probability that the vehicle is currently in each lane; the latest fusion positioning result is then taken as the observed quantity; and the lane-change-based vehicle prediction result is corrected according to the latest fusion positioning result to obtain the final lane number of the vehicle.
The lane number secondary positioning algorithm can be implemented by the following steps 4061 to 4063.
4061, initializing the Kalman filtering system.
Kalman filtering requires initializing the state variable X and the covariance matrix P of X. Specifically, the relevant information of the high-confidence position can be obtained to construct an initial state variable X and covariance matrix P. When the positioning information of the vehicle satisfies the first condition, the position corresponding to the positioning information may be determined to be a high-confidence position. At this time, the probability distribution of the vehicle over the plurality of lanes may be taken as the initial state variable X, and the covariance matrix of that probability distribution as the covariance matrix P.
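Step 4061 can be sketched as follows. The function name `init_filter`, the peak probability `p0`, and the diagonal initial uncertainty are hypothetical choices for illustration; the patent only requires that X and P be built from a high-confidence position.

```python
import numpy as np

def init_filter(num_lanes, current_lane, p0=0.8):
    """Build the initial state X (lane probabilities) and covariance P
    from a high-confidence position (illustrative values)."""
    # Put most of the probability mass on the current lane and spread
    # the remainder uniformly over the other lanes.
    x = np.full(num_lanes, (1.0 - p0) / max(num_lanes - 1, 1))
    x[current_lane] = p0
    # Illustrative initial uncertainty for each lane probability.
    P = np.diag(np.full(num_lanes, 0.1))
    return x, P
```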
In some embodiments, steps 4062 and 4063 may continue to be performed when the initialization is successful.
Alternatively, when the initialization fails, step 411 may be performed to output a lane number result. The lane number result may be determined directly from the fused location result.
4062, predicting: that is, predicting the probability distribution prediction information of the vehicle over the plurality of lanes in the current state according to the lane change detection result and the probability distribution information of the vehicle over the plurality of lanes in the previous state; this may include, for example, a probability distribution predicted value X and the covariance matrix P of the predicted value. Optionally, the covariance matrix Q of the system noise may also be determined. In particular, the prediction process may be described with reference to step 220 of fig. 2.
4063, updating, i.e. obtaining probability distribution measurement information of the vehicle in a plurality of lanes, such as probability distribution measurement value Z, and covariance matrix R of noise in the measurement process, according to the fusion positioning result of the vehicle. Specifically, the implementation of Kalman filtering update depends on the fusion positioning result. In particular, the update process may be described with reference to step 230 of FIG. 2.
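Steps 4062 and 4063 follow the standard Kalman predict/update equations. The sketch below is a generic linear Kalman filter over the lane-probability state, with an added clip-and-renormalize step so the state remains a probability distribution; the transition matrix F (built elsewhere from the HD topology and the lane change result) and the observation matrix H are assumed inputs, not defined by the patent text here.

```python
import numpy as np

def predict(x_prev, P_prev, F, Q):
    """Step 4062: propagate the lane-probability state with the
    lane-change transition matrix F and process noise Q."""
    x_pred = F @ x_prev
    P_pred = F @ P_prev @ F.T + Q
    return x_pred, P_pred

def update(x_pred, P_pred, z, H, R):
    """Step 4063: correct the prediction with the probability
    distribution measurement z from the fusion positioning result."""
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x = x_pred + K @ (z - H @ x_pred)
    P = (np.eye(len(x_pred)) - K @ H) @ P_pred
    # Keep the state a valid probability distribution.
    x = np.clip(x, 0.0, None)
    return x / x.sum(), P
```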
407a, acquiring HD data. Specifically, according to the positioning information updated in step 4063, corresponding local HD data is acquired.
407b, confidence ranking of the positioning results.
Specifically, a positioning result with higher original confidence can be further subdivided using scene judgment on the HD data, while a positioning result with lower original confidence can be directly classified into the lowest confidence level. In particular, the confidence grading of the positioning results may be described with reference to step 241 in fig. 3 and to fig. 4.
408, confidence grading of the lane change detection result.
Specifically, the confidence of the lane change detection information may be graded according to the magnitude relation between the confidence of the lane change detection result and preset thresholds (for example, a preset first threshold and a preset second threshold). In particular, the confidence grading of the lane change detection result may be described with reference to step 241 in fig. 3 and to fig. 4.
409, time window processing.
Considering that the actual scene is complex and mis-grading may occur during the confidence grading of the positioning information, a time window is added for processing: the filtering strategy is determined jointly by the confidence grading results of the positioning result and the lane change result within a time window, trading some hysteresis in the filtering process for the accuracy of the filtering correction.
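One simple way to realize the time window is a majority vote over the last few confidence grades. The class below is an illustrative sketch; the window length of 5 is an assumed value, not taken from the patent.

```python
from collections import Counter, deque

class ConfidenceWindow:
    """Smooth confidence grades with a sliding-window majority vote,
    trading a little lag for robustness to isolated mis-gradings."""
    def __init__(self, size=5):
        self.buf = deque(maxlen=size)

    def push(self, grade):
        """Add the latest grade and return the majority grade so far."""
        self.buf.append(grade)
        return Counter(self.buf).most_common(1)[0][0]
```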
410, filtering processing. Specifically, the filtering process may be performed according to the determined filtering strategy. For example, the filtering process may include the following steps 4101 to 4105.
4101, initializing.
Specifically, if the confidence level of the positioning result is a first level (i.e., low level) and the confidence level of the lane change detection information is smaller than the second threshold (i.e., the confidence level of the lane change detection information is low level), the filtering system is initialized.
4102, filtering parameter correction.
Specifically, the filtering parameter may be adjusted according to at least one of the confidence level of the lane-level map data and the lane change detection information, and the filtering parameter includes at least one of a covariance matrix P of the probability distribution predicted value, a covariance matrix Q of the noise of the probability distribution predicted value, and a covariance matrix R of the noise of the probability distribution measured value.
4103, predictive gross error culling.
Specifically, if the confidence level is the second level (i.e. the high level) and the confidence of the lane change detection information is smaller than the second threshold (i.e. the confidence of the lane change detection information is low), gross-error rejection is performed on the probability distribution prediction information, and the second probability distribution information is determined according to the probability distribution measurement information.
4104, observing a gross error rejection.
Specifically, if the confidence level is the first level (i.e. the low level) and the confidence of the lane change detection information is greater than or equal to the first threshold (i.e. the confidence of the lane change detection information is high), gross-error rejection is performed on the probability distribution measurement information, and the second probability distribution information is determined according to the probability distribution prediction information.
4105, filtering calculation.
And if the confidence level is the third level or the second level and the confidence of the lane change detection information is greater than or equal to the second threshold (i.e. the confidence of the lane change detection information is high or medium), the probability distribution prediction information and the probability distribution measurement information are filtered to obtain the second probability distribution information.
411, outputting a lane number result.
In some embodiments, the lane number output may be based on the highest probability in the second probability distribution information.
In some embodiments, the lane number result may also be determined directly from the fused location result.
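The highest-probability output of step 411 reduces to an argmax over the second probability distribution information; a minimal sketch:

```python
def output_lane_number(probs):
    """Return the index of the most probable lane (step 411)."""
    return max(range(len(probs)), key=lambda i: probs[i])
```

For example, for probabilities [0.1, 0.7, 0.2] over three lanes, this returns lane index 1.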
Therefore, the lane positioning result determined based on the lane change detection information and the observed lane positioning result can be fused, so that the fusion positioning result does not depend strongly on the visual positioning result obtained from the visual perception information. For example, when the visual perception information is abnormal, the visual positioning result necessarily contains an error, and that error gradually propagates into the fusion positioning result. In the embodiment of the application, an abnormal fusion positioning result can be filtered out using the lane positioning result determined from the lane change detection information, so as to obtain a positioning result with higher precision.
The specific embodiments of the present application have been described in detail above with reference to the accompanying drawings, but the present application is not limited to the specific details of the above embodiments, and various simple modifications can be made to the technical solution of the present application within the scope of the technical concept of the present application, and all the simple modifications belong to the protection scope of the present application. For example, the specific features described in the above embodiments may be combined in any suitable manner, and in order to avoid unnecessary repetition, various possible combinations are not described further. As another example, any combination of the various embodiments of the present application may be made without departing from the spirit of the present application, which should also be regarded as the disclosure of the present application.
It should be further understood that, in the various method embodiments of the present application, the sequence numbers of the foregoing processes do not mean the order of execution, and the order of execution of the processes should be determined by the functions and internal logic of the processes, and should not constitute any limitation on the implementation process of the embodiments of the present application. It is to be understood that the numbers may be interchanged where appropriate such that the described embodiments of the application may be practiced otherwise than as shown or described.
The method embodiments of the present application are described in detail above and the embodiments of the apparatus of the present application are described in detail below in conjunction with fig. 6 and 7.
Fig. 6 is a schematic block diagram of a lane positioning apparatus 10 according to an embodiment of the present application. As shown in fig. 6, the apparatus 10 may include an acquisition unit 11, a prediction unit 12, a measurement unit 13, and a processing unit 14.
An acquisition unit 11 for acquiring lane change detection information of the vehicle at a first time;
a prediction unit 12, configured to obtain probability distribution prediction information of the vehicle being located in a plurality of lanes at a first time according to the lane change detection information and first probability distribution information of the vehicle being located in a plurality of lanes at a second time, where the second time includes a time before the first time;
A measurement unit 13, configured to obtain probability distribution measurement information of the vehicle located in a plurality of lanes at the first time according to the positioning information of the vehicle at the first time;
a processing unit 14, configured to perform Kalman filtering on the probability distribution prediction information and the probability distribution measurement information, and determine second probability distribution information of the vehicle in the plurality of lanes at the first time.
In some embodiments, the processing unit 14 is specifically configured to:
determining the confidence level of the positioning information according to lane-level map data corresponding to the current positioning information of the vehicle;
determining a filtering strategy according to at least one of the confidence level and the confidence of the lane change detection information;
and carrying out Kalman filtering processing on the probability distribution prediction information and the probability distribution measurement information according to the filtering strategy to obtain the second probability distribution information.
In some embodiments, the filtering strategy comprises one of:
if the confidence level is the first level and the confidence of the lane change detection information is greater than or equal to a first threshold, performing gross-error rejection on the probability distribution measurement information, and determining the second probability distribution information according to the probability distribution prediction information;
if the confidence level is a second level and the confidence of the lane change detection information is smaller than a second threshold, performing gross-error rejection on the probability distribution prediction information, and determining the second probability distribution information according to the probability distribution measurement information;
if the confidence level is the third level or the second level and the confidence level of the lane change detection information is greater than or equal to the second threshold value, filtering the probability distribution prediction information and the probability distribution measurement information to obtain second probability distribution information;
initializing a filtering system if the confidence level is the first level and the confidence of the lane change detection information is smaller than the second threshold;
wherein the second level is higher than the third level, the third level is higher than the first level, and the first threshold is greater than the second threshold.
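The four-branch strategy above can be expressed as a small dispatch function. The sketch below is an assumption-laden illustration: the numeric level codes (1 = first/low, 3 = third/medium, 2 = second/high) and the strategy names are invented for readability, and the branch order follows the listing above.

```python
def choose_strategy(level, lc_conf, thr1, thr2):
    """Map confidence grades to a filtering strategy (thr1 > thr2)."""
    if level == 1 and lc_conf >= thr1:
        return "reject_measurement"   # keep the prediction
    if level == 2 and lc_conf < thr2:
        return "reject_prediction"    # keep the measurement
    if level in (2, 3) and lc_conf >= thr2:
        return "kalman_filter"        # fuse both
    if level == 1 and lc_conf < thr2:
        return "reinitialize"         # restart the filtering system
    return "kalman_filter"            # fallback for uncovered combinations
```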
In some embodiments, the processing unit 14 is specifically configured to:
and under the condition that the confidence coefficient of the current positioning information is smaller than a preset value, determining the confidence coefficient level as the first level.
In some embodiments, the processing unit 14 is specifically configured to:
when the confidence level of the current positioning information is larger than or equal to a preset value and the lane-level map data meets at least one of the following second conditions, determining that the confidence level is the second level:
The number of lanes is less than 3;
the type/color of the visually observed lane line completely corresponds to the lane-level map data;
the visually observed broken line completely corresponds to the lane-level map data;
the curvature radius of the lane is larger than a preset first value;
the predetermined range around the vehicle is free of obstacles.
In some embodiments, the processing unit 14 is specifically configured to:
when the confidence level of the current positioning information is greater than or equal to a preset value and the lane-level map data meets at least one of the following third conditions, determining that the confidence level is the third level:
the number of lanes is 3 or more and 4 or less;
visually observed lane line type/color portions correspond to the lane-level map data;
the curvature radius of the lane is larger than a preset second value and smaller than a preset first value; wherein the preset second value is smaller than the preset first value.
In some embodiments, the processing unit 14 is specifically configured to:
according to at least one of the lane-level map data and the confidence of the lane change detection information, adjusting a filtering parameter; the filtering parameter comprises at least one of a covariance matrix of a probability distribution predicted value, a covariance matrix of noise of the probability distribution predicted value and a covariance matrix of noise of a probability distribution measured value;
And carrying out filtering processing on the probability distribution prediction information and the probability distribution measurement information according to the adjusted filtering parameters to obtain the second probability distribution information.
In some embodiments, the method further comprises an initialization unit for:
when the first condition is satisfied, initial probability distribution information that the vehicle is located in a plurality of lanes is determined.
In some embodiments, the first condition includes at least one of:
the confidence of the positioning information of the vehicle is larger than a preset threshold value in a preset duration;
the lane-level map data corresponding to the positioning information of the vehicle satisfies that the number of lanes is 1 or 2;
the lane-level map data corresponding to the positioning information of the vehicle meets the condition that no guide belt exists;
the lane-level map data corresponding to the positioning information of the vehicle satisfies that the vehicle is not at the intersection position.
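A conservative check of the first condition might require all of the sub-conditions at once; note the listing above says "at least one of", so the all-of variant below is stricter than the patent requires. The function name, the history-based confidence check, and the default threshold are illustrative assumptions.

```python
def first_condition_met(conf_history, num_lanes, has_guide_belt,
                        at_intersection, conf_thr=0.9):
    """Strict variant: every sub-condition of the first condition holds."""
    return (all(c > conf_thr for c in conf_history)  # sustained confidence
            and num_lanes <= 2                       # 1 or 2 lanes
            and not has_guide_belt                   # no guide belt
            and not at_intersection)                 # not at an intersection
```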
In some embodiments, the obtaining unit 11 is specifically configured to:
and carrying out lane change detection on the vehicle according to at least one of visual perception information and vehicle course information to acquire lane change detection information.
In some embodiments, the lane change detection information includes lane change information and a confidence of the lane change information.
In some embodiments, the positioning information includes at least one of visual positioning information, fused positioning information, satellite positioning information, and inertial positioning information.
It should be understood that apparatus embodiments and method embodiments may correspond with each other and that similar descriptions may refer to the method embodiments. To avoid repetition, no further description is provided here. Specifically, the apparatus 10 shown in fig. 6 may correspond to an apparatus for performing the method 200 according to the embodiment of the present application, and the foregoing and other operations and/or functions of each module in the apparatus 10 are respectively for implementing the corresponding flow of the method 200, which is not described herein for brevity.
The lane positioning apparatus of the embodiment of the present application is described above in terms of functional modules with reference to the accompanying drawings. It should be understood that the functional modules may be implemented in hardware, by instructions in software, or by a combination of hardware and software modules. Specifically, each step of the method embodiments in the embodiments of the present application may be implemented by an integrated logic circuit of hardware in a processor and/or by instructions in software form, and the steps of the methods disclosed in connection with the embodiments of the present application may be directly performed by a hardware decoding processor or by a combination of hardware and software modules in the decoding processor. Alternatively, the software modules may be located in a storage medium well-established in the art, such as random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, or registers. The storage medium is located in a memory, and the processor reads the information in the memory and, in combination with hardware, performs the steps in the above method embodiments.
Fig. 7 is a schematic block diagram of an electronic device 20 provided by an embodiment of the present application.
As shown in fig. 7, the electronic device 20 may include:
a memory 21 and a processor 22, the memory 21 being adapted to store a computer program and to transfer the program code to the processor 22. In other words, the processor 22 may call and run a computer program from the memory 21 to implement the method in an embodiment of the present application.
For example, the processor 22 may be configured to perform the corresponding steps in the methods 200 or 400 described above in accordance with instructions in the computer program.
In some embodiments of the present application, the processor 22 may include, but is not limited to:
a general purpose processor, digital signal processor (Digital Signal Processor, DSP), application specific integrated circuit (Application Specific Integrated Circuit, ASIC), field programmable gate array (Field Programmable Gate Array, FPGA) or other programmable logic device, discrete gate or transistor logic device, discrete hardware components, or the like.
In some embodiments of the present application, the memory 21 includes, but is not limited to:
volatile memory and/or nonvolatile memory. The nonvolatile Memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable EPROM (EEPROM), or a flash Memory. The volatile memory may be random access memory (Random Access Memory, RAM) which acts as an external cache. By way of example, and not limitation, many forms of RAM are available, such as Static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double Data Rate SDRAM (Double Data Rate SDRAM), enhanced SDRAM (ESDRAM), synchronous Link DRAM (SLDRAM), and Direct memory bus RAM (DR RAM).
In some embodiments of the application, the computer program may be split into one or more modules that are stored in the memory 21 and executed by the processor 22 to perform the methods provided by the application. The one or more modules may be a series of computer program instruction segments capable of performing the specified functions, which are intended to describe the execution of the computer program in the electronic device 20.
Optionally, as shown in fig. 7, the electronic device 20 may further include:
a transceiver 23, the transceiver 23 being connectable to the processor 22 or the memory 21.
The processor 22 may control the transceiver 23 to communicate with other devices, and in particular, may send information or data to other devices or receive information or data sent by other devices. The transceiver 23 may include a transmitter and a receiver. The transceiver 23 may further include antennas, the number of which may be one or more.
It will be appreciated that the various components in the electronic device 20 are connected by a bus system that includes, in addition to a data bus, a power bus, a control bus, and a status signal bus.
The present application also provides a computer storage medium having stored thereon a computer program which, when executed by a computer, enables the computer to perform the method of the above-described method embodiments. Alternatively, embodiments of the present application also provide a computer program product comprising instructions which, when executed by a computer, cause the computer to perform the method of the method embodiments described above.
When implemented in software, the above embodiments may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the flows or functions according to embodiments of the application are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another by wired (e.g., coaxial cable, fiber optic, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) means. The computer readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a digital video disc (DVD)), or a semiconductor medium (e.g., a solid state disk (SSD)), or the like.
It should be understood that in embodiments of the present application, "B corresponding to a" means that B is associated with a. In one implementation, B may be determined from a. It should also be understood that determining B from a does not mean determining B from a alone, but may also determine B from a and/or other information.
In the description of the present application, unless otherwise indicated, "at least one" means one or more, and "a plurality" means two or more. In addition, "and/or" describes an association relationship of the association object, and indicates that there may be three relationships, for example, a and/or B may indicate: a alone, a and B together, and B alone, wherein a, B may be singular or plural. The character "/" generally indicates that the context-dependent object is an "or" relationship. "at least one of" or the like means any combination of these items, including any combination of single item(s) or plural items(s). For example, at least one (one) of a, b, or c may represent: a, b, c, a-b, a-c, b-c, or a-b-c, wherein a, b, c may be single or plural.
It should be further understood that the description of the first, second, etc. in the embodiments of the present application is for illustration and distinction of descriptive objects, and is not intended to represent any limitation on the number of devices in the embodiments of the present application, nor is it intended to constitute any limitation on the embodiments of the present application.
It should also be appreciated that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the application. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or server that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed or inherent to such process, method, article, or apparatus, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It will be appreciated that in the specific implementation of the present application, when the above embodiments of the present application are applied to specific products or technologies and relate to data related to user information and the like, user permission or consent needs to be obtained, and the collection, use and processing of the related data needs to comply with the relevant laws and regulations and standards.
Those of ordinary skill in the art will appreciate that the various illustrative modules and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the several embodiments provided by the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative: the division of the modules is merely a logical function division, and other divisions are possible in actual implementation; for example, multiple modules or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections via some interfaces, devices, or modules, and may be electrical, mechanical, or in other forms.
The modules illustrated as separate components may or may not be physically separate, and components shown as modules may or may not be physical modules, i.e., may be located in one place, or may be distributed over a plurality of network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. For example, functional modules in various embodiments of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module.

Claims (16)

1. A method of lane positioning, comprising:
obtaining lane change detection information of a vehicle at a first time;
acquiring probability distribution prediction information of the vehicle being located in the plurality of lanes at the first time according to the lane change detection information and first probability distribution information of the vehicle being located in the plurality of lanes at a second time, wherein the second time is a time before the first time;
acquiring probability distribution measurement information of the vehicle in a plurality of lanes at the first time according to the positioning information of the vehicle at the first time;
and performing Kalman filtering processing on the probability distribution prediction information and the probability distribution measurement information to determine second probability distribution information of the vehicle being located in the plurality of lanes at the first time.
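The predict-measure-update loop of claim 1 over a discrete per-lane distribution can be sketched as follows. This is a minimal illustration, not the patented implementation: the lane-change transition probabilities, the Gaussian lane-centre likelihood, and the scalar blending gain are all assumptions made for the example.

```python
import math

def predict_lane_probs(prev_probs, p_left, p_right):
    """Prediction step: propagate the previous per-lane distribution
    through a simple lane-change transition model."""
    n = len(prev_probs)
    p_stay = 1.0 - p_left - p_right
    pred = [0.0] * n
    for i, p in enumerate(prev_probs):
        pred[i] += p * p_stay
        if i > 0:
            pred[i - 1] += p * p_left    # vehicle moved one lane left
        if i < n - 1:
            pred[i + 1] += p * p_right   # vehicle moved one lane right
    total = sum(pred)
    return [x / total for x in pred]

def measure_lane_probs(lateral_offset_m, lane_width_m, n_lanes, sigma_m):
    """Measurement step: score each lane centre against the lateral
    positioning fix with a Gaussian likelihood, then normalise."""
    centres = [(i + 0.5) * lane_width_m for i in range(n_lanes)]
    w = [math.exp(-0.5 * ((lateral_offset_m - c) / sigma_m) ** 2)
         for c in centres]
    total = sum(w)
    return [x / total for x in w]

def fuse(pred, meas, gain):
    """Kalman-style update: blend prediction and measurement with a
    scalar gain in [0, 1] and renormalise."""
    post = [(1.0 - gain) * p + gain * m for p, m in zip(pred, meas)]
    total = sum(post)
    return [x / total for x in post]
```

A full Kalman filter would also carry covariances for the predicted and measured values; those are the parameters that claim 7 adjusts.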
2. The method of claim 1, wherein the performing a kalman filter process on the probability distribution prediction information and the probability distribution measurement information to determine second probability distribution information that the vehicle is located in a plurality of lanes at the first time comprises:
determining the confidence level of the positioning information according to lane-level map data corresponding to the current positioning information of the vehicle;
determining a filtering strategy according to at least one of the confidence level and a confidence of the lane change detection information;
and performing Kalman filtering processing on the probability distribution prediction information and the probability distribution measurement information according to the filtering strategy to obtain the second probability distribution information.
3. The method of claim 2, wherein the filtering strategy comprises one of:
if the confidence level is a first level and the confidence of the lane change detection information is greater than or equal to a first threshold, performing gross-error rejection on the probability distribution measurement information, and determining the second probability distribution information according to the probability distribution prediction information;
if the confidence level is a second level and the confidence of the lane change detection information is smaller than a second threshold, performing gross-error rejection on the probability distribution prediction information, and determining the second probability distribution information according to the probability distribution measurement information;
if the confidence level is a third level or the second level, and the confidence of the lane change detection information is greater than or equal to the second threshold, performing filtering processing on the probability distribution prediction information and the probability distribution measurement information to obtain the second probability distribution information;
initializing a filtering system if the confidence level is the first level and the confidence of the lane change detection information is smaller than the second threshold;
wherein the second level is higher than the third level, the third level is higher than the first level, and the first threshold is greater than the second threshold.
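The four branches of claim 3 amount to a small decision table. A hedged sketch follows; the numeric level encoding, the strategy names, and the fallback for combinations the claim leaves open are all assumptions for illustration.

```python
def choose_filter_strategy(conf_level, lc_conf, t1, t2):
    """Map (positioning confidence level, lane-change confidence) to one
    of the four strategies of claim 3. Levels are encoded as 1 (first,
    lowest), 3 (third, middle), 2 (second, highest); t1 > t2."""
    assert t1 > t2, "first threshold must exceed second threshold"
    if conf_level == 1 and lc_conf >= t1:
        # weak positioning, strong lane-change cue: trust the prediction
        return "reject_measurement_use_prediction"
    if conf_level == 2 and lc_conf < t2:
        # strong positioning, weak lane-change cue: trust the measurement
        return "reject_prediction_use_measurement"
    if conf_level in (2, 3) and lc_conf >= t2:
        # both sources usable: fuse them in the filter
        return "filter_both"
    if conf_level == 1 and lc_conf < t2:
        # neither source trustworthy: restart the filter
        return "reinitialise_filter"
    return "filter_both"  # combinations the claim leaves open (assumed default)
```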
4. The method of claim 3, wherein determining the confidence level of the positioning information based on lane-level map data corresponding to the current positioning information of the vehicle comprises:
determining that the confidence level is the first level when the confidence of the current positioning information is smaller than a preset value.
5. The method of claim 3, wherein determining the confidence level of the positioning information based on lane-level map data corresponding to the current positioning information of the vehicle comprises:
determining that the confidence level is the second level when the confidence of the current positioning information is greater than or equal to a preset value and the lane-level map data satisfies at least one of the following second conditions:
the number of lanes is less than 3;
the type/color of the visually observed lane lines completely corresponds to the lane-level map data;
the visually observed dashed lines completely correspond to the lane-level map data;
the radius of curvature of the lane is greater than a preset first value;
there is no obstacle within a predetermined range around the vehicle.
6. The method of claim 3, wherein determining the confidence level of the positioning information based on lane-level map data corresponding to the current positioning information of the vehicle comprises:
determining that the confidence level is the third level when the confidence of the current positioning information is greater than or equal to a preset value and the lane-level map data satisfies at least one of the following third conditions:
the number of lanes is at least 3 and at most 4;
the type/color of the visually observed lane lines partially corresponds to the lane-level map data;
the radius of curvature of the lane is greater than a preset second value and smaller than the preset first value, wherein the preset second value is smaller than the preset first value.
7. The method of claim 3, wherein the performing filtering processing on the probability distribution prediction information and the probability distribution measurement information to obtain the second probability distribution information comprises:
adjusting a filtering parameter according to at least one of the lane-level map data and the confidence of the lane change detection information, wherein the filtering parameter comprises at least one of a covariance matrix of a probability distribution predicted value, a covariance matrix of noise of the probability distribution predicted value, and a covariance matrix of noise of a probability distribution measured value;
and performing filtering processing on the probability distribution prediction information and the probability distribution measurement information according to the adjusted filtering parameter to obtain the second probability distribution information.
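Claim 7's parameter adjustment can be illustrated by scaling the three covariance matrices with simple confidence-driven factors. The linear scaling law below is an assumption made for the example; the claim does not specify one.

```python
def adjust_filter_params(P, Q, R, map_conf, lc_conf):
    """Scale the predicted-value covariance P, the prediction-noise
    covariance Q, and the measurement-noise covariance R (nested lists)
    using confidences in [0, 1]: a less trustworthy map inflates R,
    while a less trustworthy lane-change cue inflates P and Q."""
    def scale(m, k):
        return [[x * k for x in row] for row in m]
    P_adj = scale(P, 2.0 - lc_conf)   # factor 1 at full confidence, 2 at none
    Q_adj = scale(Q, 2.0 - lc_conf)
    R_adj = scale(R, 2.0 - map_conf)
    return P_adj, Q_adj, R_adj
```

Inflating a noise covariance makes the filter weight that source less in the update, which is the qualitative effect the claim's adjustment is after.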
8. The method according to any one of claims 1-7, wherein before the acquiring, according to the lane change detection information and the first probability distribution information of the vehicle being located in the plurality of lanes at the second time, probability distribution prediction information of the vehicle being located in the plurality of lanes at the first time, the method further comprises:
determining, when a first condition is satisfied, initial probability distribution information of the vehicle being located in the plurality of lanes.
9. The method of claim 8, wherein the first condition comprises at least one of:
the confidence of the positioning information of the vehicle is greater than a preset threshold within a preset duration;
the lane-level map data corresponding to the positioning information of the vehicle indicates that the number of lanes is 1 or 2;
the lane-level map data corresponding to the positioning information of the vehicle indicates that no guide strip is present;
the lane-level map data corresponding to the positioning information of the vehicle indicates that the vehicle is not at an intersection.
10. The method according to any one of claims 1-7, wherein the obtaining lane change detection information of the vehicle at the first time includes:
performing lane change detection on the vehicle according to at least one of visual perception information and vehicle heading information to acquire the lane change detection information.
11. The method of any of claims 1-7, wherein the lane change detection information comprises information indicating whether a lane change has occurred and a confidence of that information.
12. The method of any of claims 1-7, wherein the positioning information comprises at least one of visual positioning information, fused positioning information, satellite positioning information, and inertial positioning information.
13. A lane positioning apparatus, comprising:
an acquisition unit, configured to obtain lane change detection information of a vehicle at a first time;
a prediction unit, configured to acquire probability distribution prediction information of the vehicle being located in a plurality of lanes at the first time according to the lane change detection information and first probability distribution information of the vehicle being located in the plurality of lanes at a second time, wherein the second time is a time before the first time;
a measurement unit, configured to acquire probability distribution measurement information of the vehicle being located in the plurality of lanes at the first time according to positioning information of the vehicle at the first time;
and a processing unit, configured to perform Kalman filtering processing on the probability distribution prediction information and the probability distribution measurement information and determine second probability distribution information of the vehicle being located in the plurality of lanes at the first time.
14. An electronic device, comprising:
a processor and a memory having instructions stored therein which, when executed by the processor, cause the processor to perform the method of any of claims 1 to 12.
15. A computer readable storage medium comprising instructions which, when run on a computer, cause the computer to perform the method of any of claims 1 to 12.
16. A computer program product comprising computer programs/instructions which, when run on a computer, cause the computer to perform the method of any of claims 1 to 12.
CN202211574961.4A 2022-12-08 2022-12-08 Lane positioning method, device, equipment and storage medium Pending CN116977954A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211574961.4A CN116977954A (en) 2022-12-08 2022-12-08 Lane positioning method, device, equipment and storage medium


Publications (1)

Publication Number Publication Date
CN116977954A 2023-10-31

Family

ID=88473714

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211574961.4A Pending CN116977954A (en) 2022-12-08 2022-12-08 Lane positioning method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116977954A (en)

Similar Documents

Publication Publication Date Title
US11543834B2 (en) Positioning system based on geofencing framework
US11235777B2 (en) Vehicle path prediction and target classification for autonomous vehicle operation
US10591608B2 (en) Positioning quality filter for the V2X technologies
US20200255027A1 (en) Method for planning trajectory of vehicle
CN106546977B (en) Vehicle radar sensing and localization
US11125566B2 (en) Method and apparatus for determining a vehicle ego-position
CN108068792A (en) For the automatic collaboration Driving control of autonomous vehicle
CN111429716A (en) Method for determining position of own vehicle
WO2016182964A1 (en) Adaptive positioning system
Miller et al. Map‐aided localization in sparse global positioning system environments using vision and particle filtering
CN114174137A (en) Source lateral offset of ADAS or AD features
FR3058214A1 (en) METHOD FOR PRODUCING A NAVIGATION AUTONOMY CARD FOR A VEHICLE
Williams et al. A qualitative analysis of vehicle positioning requirements for connected vehicle applications
Nam et al. CNVPS: Cooperative neighboring vehicle positioning system based on vehicle-to-vehicle communication
CN113063425A (en) Vehicle positioning method and device, electronic equipment and storage medium
CN114274972A (en) Scene recognition in an autonomous driving environment
Wang et al. UGV‐UAV robust cooperative positioning algorithm with object detection
Tsogas et al. Using digital maps to enhance lane keeping support systems
CN116977954A (en) Lane positioning method, device, equipment and storage medium
Toledo-Moreo et al. Positioning and digital maps
Farrell et al. Lane-Level Localization and Map Matching for Advanced Connected and Automated Vehicle (CAV) Applications
US20230242099A1 (en) Method for Vehicle Driving Assistance within Delimited Area
CN116931005B (en) V2X-assisted vehicle high-precision positioning method and device and storage medium
Abdellattif Multi-sensor fusion of automotive radar and onboard motion sensors for seamless land vehicle positioning in challenging environments
CN113534214B (en) Vehicle positioning method and device

Legal Events

Date Code Title Description
PB01 Publication