CN110979346A - Method, device and equipment for determining lane where vehicle is located

Method, device and equipment for determining lane where vehicle is located

Info

Publication number
CN110979346A
CN110979346A
Authority
CN
China
Prior art keywords
vehicle
information
lane lines
lane
lanes
Prior art date
Legal status
Granted
Application number
CN201911206663.8A
Other languages
Chinese (zh)
Other versions
CN110979346B (en)
Inventor
杨鹏斌
王俊
杨洋
Current Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201911206663.8A
Publication of CN110979346A
Application granted
Publication of CN110979346B
Legal status: Active

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00: Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0062: Adapting control system settings
    • B60W2050/0075: Automatic parameter input, automatic initialising or calibrating means
    • B60W2556/00: Input parameters relating to data
    • B60W2556/45: External transmission of data to or from the vehicle
    • B60W2556/50: External transmission of data to or from the vehicle of positioning data, e.g. GPS [Global Positioning System] data
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30: Map- or contour-matching

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)

Abstract

The application discloses a method, a device and equipment for determining the lane in which a vehicle is located, relating to the technical field of intelligent driving and in particular to vehicle positioning. The disclosed scheme is as follows: acquire rough position information of the vehicle at the current moment and perception data of the vehicle at the current moment, where the perception data includes information of n perceived lane lines collected by a collecting device in the vehicle; acquire, according to the position information, information of m+1 real lane lines of the road where the vehicle is located from a positioning map; obtain, according to the information of the n perceived lane lines and the information of the m+1 real lane lines, the confidences that the vehicle is located in each of the m lanes at the current moment, and from these confidences determine the lane in which the vehicle is located at the current moment. The method thus only needs perception data for lane lines, which reduces the requirements on the perception data as well as the implementation difficulty of the scheme and the required computing power.

Description

Method, device and equipment for determining lane where vehicle is located
Technical Field
The application relates to the technical field of intelligent driving, in particular to a method, a device and equipment for determining a lane where a vehicle is located.
Background
In recent years, intelligent driving technology has developed rapidly. In an intelligent driving scenario, positioning a vehicle requires determining the specific lane in which the vehicle is located. However, current GPS-based positioning methods obtain only the approximate position of the vehicle and cannot position it accurately at the lane level.
Currently, when determining the lane where a vehicle is located, a particle-filter-based method is generally adopted. Specifically, sensing data from a plurality of sensors are fused by particle filtering to directly obtain an accurate vehicle position, and the corresponding lane is then determined in a map according to that position.
However, this scheme places high demands on the sensing data: multiple sensors are needed to collect sensing data for various road signs on the road, the implementation is complex, and the required computing power is high.
Disclosure of Invention
The application provides a method, a device and equipment for determining a lane where a vehicle is located, which can reduce requirements on perception data and computing capacity.
In a first aspect, the present application provides a method for determining the lane in which a vehicle is located, including: acquiring position information of the vehicle at the current moment and perception data of the vehicle at the current moment, wherein the perception data includes information of n perceived lane lines collected by a collecting device in the vehicle; acquiring, according to the position information, information of m+1 real lane lines of the road where the vehicle is located from a positioning map, wherein the road comprises m lanes and m+1 >= n; obtaining confidences that the vehicle is located in each of the m lanes at the current moment according to the information of the n perceived lane lines and the information of the m+1 real lane lines; and determining the lane in which the vehicle is located at the current moment according to those confidences.
In the method and the device, the lane where the vehicle is located is determined by using the position information of the vehicle, the information of the perceived lane lines collected by the collecting device in the vehicle, and the information of the real lane lines in the positioning map. Only perception data for lane lines need to be obtained, rather than perception data for various road signs simultaneously, which reduces the requirements on the perception data; and because lane line detection algorithms are mature, performing lane-level positioning with perceived lane line information also reduces the implementation difficulty of the scheme and the required computing power. Positioning the lane where the vehicle is located also bounds the subsequent precise positioning within the lane, greatly reducing the search space of precise positioning and hence its difficulty. Moreover, even if precise positioning within the lane fails, a boundary is still defined for the vehicle's position, which improves the safety of intelligent driving.
In a possible implementation, obtaining the confidences that the vehicle is located in each of the m lanes at the current moment according to the information of the n perceived lane lines and the information of the m+1 real lane lines includes: acquiring a first positioning result according to the information of the m+1 real lane lines, the first positioning result indicating the posterior probabilities that the vehicle is located in each of the m lanes without depending on the perception data; acquiring a second positioning result according to the information of the n perceived lane lines and the information of the m+1 real lane lines, the second positioning result indicating the posterior probabilities that the vehicle is located in each of the m lanes depending on the perception data; and obtaining the confidences that the vehicle is located in each of the m lanes at the current moment according to the first positioning result and the second positioning result.
In this implementation, positioning results are determined separately for the two cases of depending on and not depending on the perception data, and the results of the two cases are fused to obtain the confidences that the vehicle is located in each of the m lanes, which improves the accuracy of the positioning result.
In a possible implementation, obtaining the second positioning result according to the information of the n perceived lane lines and the information of the m+1 real lane lines includes: acquiring at least one type of observation information according to the information of the n perceived lane lines; and, for each type of observation information, acquiring a second positioning result based on that observation information according to the observation information and the information of the m+1 real lane lines. Obtaining the confidences that the vehicle is located in each of the m lanes at the current moment according to the first positioning result and the second positioning result then includes: obtaining those confidences according to the first positioning result and the second positioning results corresponding to the at least one type of observation information.
In this implementation, the positioning results corresponding to the various observations are computed separately to obtain the confidence for each lane, which gives the scheme good extensibility.
In one possible implementation, the at least one type of observation information includes: the distance between the vehicle and the leftmost perceived lane line and the distance between the vehicle and the rightmost perceived lane line. Obtaining a second positioning result based on this observation information according to the observation information and the information of the m+1 real lane lines includes: acquiring the road width of the road where the vehicle is located according to the information of the m+1 real lane lines; and obtaining the posterior probabilities that the vehicle is located in each of the m lanes according to the distance between the vehicle and the leftmost perceived lane line, the distance between the vehicle and the rightmost perceived lane line, and the road width.
In one possible implementation, the at least one type of observation information includes: the number of perceived lane lines on the left side of the vehicle and the number of perceived lane lines on the right side of the vehicle. Acquiring a second positioning result based on this observation information according to the observation information and the information of the m+1 real lane lines includes: acquiring the road width of the road where the vehicle is located according to the information of the m+1 real lane lines; and acquiring the posterior probabilities that the vehicle is located in each of the m lanes according to the number of perceived lane lines on the left side of the vehicle, the number of perceived lane lines on the right side of the vehicle, and the road width.
In one possible implementation, the at least one type of observation information includes: the distance between the vehicle and a road boundary. Obtaining a second positioning result based on this observation information according to the observation information and the information of the m+1 real lane lines includes: acquiring the distances between the vehicle and each of the m+1 real lane lines according to the information of the m+1 real lane lines; and acquiring the posterior probabilities that the vehicle is located in each of the m lanes according to the distance between the vehicle and the road boundary and the distances between the vehicle and the m+1 real lane lines.
In one possible implementation, the at least one type of observation information includes: the types of the n perceived lane lines. Obtaining a second positioning result based on this observation information according to the observation information and the information of the m+1 real lane lines includes: acquiring the types of the m+1 real lane lines according to the information of the m+1 real lane lines; and obtaining the posterior probabilities that the vehicle is located in each of the m lanes according to the types of the n perceived lane lines and the types of the m+1 real lane lines.
In a possible implementation, obtaining the first positioning result according to the information of the m+1 real lane lines includes: acquiring the distances between the vehicle and each of the m+1 real lane lines according to the information of the m+1 real lane lines; and acquiring the posterior probabilities that the vehicle is located in each of the m lanes according to those distances.
In a possible implementation, obtaining the confidences that the vehicle is located in each of the m lanes at the current moment according to the information of the n perceived lane lines and the information of the m+1 real lane lines further includes: obtaining the confidences that the vehicle was located in each of the m lanes at the previous moment; and correcting, according to those confidences, the confidences that the vehicle is located in each of the m lanes at the current moment, to obtain corrected confidences for the current moment.
In this implementation, the positioning result of the previous moment is used to correct the positioning result of the current moment, which improves the accuracy of the positioning result.
In a second aspect, the present application provides an apparatus for determining the lane in which a vehicle is located, comprising: a first acquisition module, configured to acquire position information of the vehicle at the current moment and perception data of the vehicle at the current moment, the perception data including information of n perceived lane lines collected by a collecting device in the vehicle; a second acquisition module, configured to acquire, according to the position information, information of m+1 real lane lines of the road where the vehicle is located from a positioning map, the road comprising m lanes, where m+1 >= n; and a determining module, configured to obtain confidences that the vehicle is located in each of the m lanes at the current moment according to the information of the n perceived lane lines and the information of the m+1 real lane lines, and further configured to determine the lane in which the vehicle is located at the current moment according to those confidences.
In a possible implementation, the determining module is specifically configured to: acquire a first positioning result according to the information of the m+1 real lane lines, the first positioning result indicating the posterior probabilities that the vehicle is located in each of the m lanes without depending on the perception data; acquire a second positioning result according to the information of the n perceived lane lines and the information of the m+1 real lane lines, the second positioning result indicating the posterior probabilities that the vehicle is located in each of the m lanes depending on the perception data; and obtain the confidences that the vehicle is located in each of the m lanes at the current moment according to the first positioning result and the second positioning result.
In a possible implementation, the determining module is specifically configured to: acquire at least one type of observation information according to the information of the n perceived lane lines; for each type of observation information, acquire a second positioning result based on that observation information according to the observation information and the information of the m+1 real lane lines; and obtain the confidences that the vehicle is located in each of the m lanes at the current moment according to the first positioning result and the second positioning results corresponding to the at least one type of observation information.
In one possible implementation, the at least one type of observation information includes: the distance between the vehicle and the leftmost perceived lane line and the distance between the vehicle and the rightmost perceived lane line; the determining module is specifically configured to: acquire the road width of the road where the vehicle is located according to the information of the m+1 real lane lines; and acquire the posterior probabilities that the vehicle is located in each of the m lanes according to the distance between the vehicle and the leftmost perceived lane line, the distance between the vehicle and the rightmost perceived lane line, and the road width.
In one possible implementation, the at least one type of observation information includes: the number of perceived lane lines on the left side of the vehicle and the number of perceived lane lines on the right side of the vehicle; the determining module is specifically configured to: acquire the road width of the road where the vehicle is located according to the information of the m+1 real lane lines; and acquire the posterior probabilities that the vehicle is located in each of the m lanes according to the number of perceived lane lines on the left side of the vehicle, the number of perceived lane lines on the right side of the vehicle, and the road width.
In one possible implementation, the at least one type of observation information includes: the distance between the vehicle and a road boundary; the determining module is specifically configured to: acquire the distances between the vehicle and each of the m+1 real lane lines according to the information of the m+1 real lane lines; and acquire the posterior probabilities that the vehicle is located in each of the m lanes according to the distance between the vehicle and the road boundary and the distances between the vehicle and the m+1 real lane lines.
In one possible implementation, the at least one type of observation information includes: the types of the n perceived lane lines; the determining module is specifically configured to: acquire the types of the m+1 real lane lines according to the information of the m+1 real lane lines; and obtain the posterior probabilities that the vehicle is located in each of the m lanes according to the types of the n perceived lane lines and the types of the m+1 real lane lines.
In a possible implementation, the determining module is specifically configured to: acquire the distances between the vehicle and each of the m+1 real lane lines according to the information of the m+1 real lane lines; and acquire the posterior probabilities that the vehicle is located in each of the m lanes according to those distances.
In a possible implementation, the determining module is further configured to: obtain the confidences that the vehicle was located in each of the m lanes at the previous moment; and correct, according to those confidences, the confidences that the vehicle is located in each of the m lanes at the current moment, to obtain corrected confidences for the current moment.
In a third aspect, the present application provides an electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of the first aspects.
In a fourth aspect, the present application provides a non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of any of the first aspects.
The method, the device and the equipment for determining the lane in which the vehicle is located operate as follows: acquire position information of the vehicle at the current moment and perception data of the vehicle at the current moment, the perception data including information of n perceived lane lines collected by a collecting device in the vehicle; acquire, according to the position information, information of m+1 real lane lines of the road where the vehicle is located from a positioning map, the road comprising m lanes, where m+1 >= n; obtain, according to the information of the n perceived lane lines and the information of the m+1 real lane lines, confidences that the vehicle is located in each of the m lanes at the current moment; and determine the lane in which the vehicle is located at the current moment according to those confidences. Only perception data for lane lines are needed, rather than perception data for various road signs simultaneously, which reduces the requirements on the perception data; and because lane line detection algorithms are mature, lane-level positioning using perceived lane line information also reduces the implementation difficulty of the scheme and the required computing power.
Other effects of the above alternatives will be described below with reference to specific embodiments.
Drawings
The drawings are included to provide a better understanding of the present solution and are not intended to limit the present application. Wherein:
FIG. 1 is a schematic diagram of an application scenario in which the present application is applicable;
FIG. 2 is a schematic flow chart diagram illustrating a method for determining a lane in which a vehicle is located according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a real lane line and a perceived lane line provided in one embodiment of the present application;
FIG. 4 is a flowchart illustrating a method for determining a lane of a vehicle according to another embodiment of the present application;
FIG. 5 is a schematic structural diagram of an apparatus for determining a lane in which a vehicle is located according to an embodiment of the present application;
FIG. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The following description of the exemplary embodiments of the present application, taken in conjunction with the accompanying drawings, includes various details of the embodiments of the application for the purpose of understanding, which are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the application. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
Fig. 1 is a schematic view of an application scenario applicable to the present application. As shown in fig. 1, the vehicle 10 travels on, or is parked on, a road on which lane lines 20 are drawn. The vehicle may be an ordinary vehicle or an intelligent driving vehicle.
The vehicle 10 may be provided with a positioning device 11, such as a Global Positioning System (GPS) positioning device or an Inertial Measurement Unit (IMU). The positioning device can detect the approximate position of the vehicle. Taking GPS as an example, its plane positioning error is within 10-20 meters, and the positioning result is affected by many factors such as weather, satellite ephemeris error and ionospheric refraction, so the error fluctuates considerably. The accuracy of the GPS positioning result is therefore low, and the lane where the vehicle is located cannot be determined from it.
With the development of intelligent driving technology, the precision required for vehicle self-positioning keeps increasing. Generally speaking, an intelligent driving vehicle must keep its positioning error at the centimeter level; this is a necessary condition for safe intelligent driving. If the positioning error is too large, and especially if the vehicle is positioned in the wrong lane, the consequences can be catastrophic and driving safety is threatened. Accurate lane-level positioning is therefore of great significance to intelligent driving.
Currently, when determining the lane where a vehicle is located, a particle-filter-based method is generally adopted. Specifically, sensing data from a plurality of sensors are fused by particle filtering to directly obtain an accurate vehicle position, and the corresponding lane is then determined in a map according to that position. For example, assume the vehicle is distributed over a certain range with a certain probability. Within this range, particles are drawn according to the probability distribution to represent possible vehicle positions, and the position of each particle after Δt is predicted from the vehicle's motion. Every Δt the vehicle also senses, with its sensors, the distances of road signs relative to itself; the road-sign positions expected around each particle are compared with those sensed, and the closer they agree, the more likely that particle represents the true vehicle position. The actual position of the vehicle is confirmed iteratively through this predict-and-update cycle. However, this scheme places high demands on the sensing data: multiple sensors are needed to collect sensing data for various road signs on the road, the implementation is complex, and the required computing power is high.
In addition, with the development of deep learning, methods have appeared that position the lane where the vehicle is located based on deep learning. Specifically, lane line sensing data collected by the vehicle and the corresponding map data are input into a deep learning model, which directly outputs the number of the lane where the vehicle is located. The method is simple to implement, but a massive amount of training data must be prepared in advance, and the scenes covered by the training data directly limit what the model can learn. Moreover, generalization in actual use is hard to guarantee: once a problem occurs, or a scene never seen in the training set is encountered, training data for the corresponding scene must be collected and the model retrained. Likewise, if a new type of sensing data is added, the model must be retrained, so the extensibility is poor.
To address at least one of the above problems, the present application provides a method for determining the lane where a vehicle is located, which uses the approximate position information of the vehicle, the information of the perceived lane lines collected by a collecting device in the vehicle, and the information of the real lane lines in a positioning map. Only perception data for lane lines need to be obtained, rather than perception data for various road signs simultaneously, which reduces the requirements on the perception data; and because lane line detection algorithms are mature, lane-level positioning using perceived lane line information also reduces the implementation difficulty of the scheme and the required computing power.
The lane where the vehicle is located is positioned, the boundary is defined for further accurate positioning in the lane, the search space during accurate positioning is greatly reduced, and the difficulty in further performing accurate positioning in the lane is reduced. Meanwhile, a boundary is still defined for the position of the vehicle even under the condition that accurate positioning fails in the lane, and the safety of intelligent driving is improved.
The technical solution of the present application is described in detail below with reference to several specific embodiments. Several of the following embodiments may be combined with each other and the description of the same or similar content may not be repeated in some embodiments.
Fig. 2 is a schematic flowchart of a method for determining a lane in which a vehicle is located according to an embodiment of the present application. As shown in fig. 2, the method of the present embodiment includes:
s201: the method comprises the steps of obtaining position information of a vehicle at the current moment and perception data of the vehicle at the current moment, wherein the perception data comprise information of n perception lane lines collected by a collecting device in the vehicle.
The method of the present embodiment may be performed by an apparatus for determining a lane in which the vehicle is located, which may be in the form of software and/or hardware. The device can be arranged in vehicle-mounted equipment of a vehicle, and can also be arranged in electronic equipment independent of the vehicle.
Positioning devices, such as GPS, IMU, etc., may be provided in the vehicle. The device for determining the lane where the vehicle is located can be in communication connection with the positioning device, and after the positioning device collects the position information of the vehicle, the position information is sent to the device for determining the lane where the vehicle is located, so that the device for determining the lane where the vehicle is located can obtain the position information of the vehicle at the current moment.
It can be understood that the positioning accuracy of the positioning device is low, taking a GPS positioning device as an example, the plane positioning error of the GPS itself is within 10-20 meters, and the error fluctuation of the GPS positioning result is large due to the influence of many factors such as weather, satellite ephemeris error, ionosphere refraction and the like. Therefore, in the present embodiment, the position information of the vehicle acquired by the positioning device is actually an approximate position of the vehicle (i.e., a positioning result with a large error), and the lane where the vehicle is located cannot be directly specified from the position information.
The vehicle may also be provided with an acquisition device, including but not limited to: an image acquisition device, an infrared acquisition device, a sensor, etc. The acquisition device in this embodiment refers to a device for sensing the lane lines on the road; for example, lane line information may be collected by a vehicle-mounted camera. The device for determining the lane where the vehicle is located can be communicatively connected with the acquisition device, and after the acquisition device collects the sensing data, it transmits the data to the device for determining the lane where the vehicle is located, which thereby acquires the sensing data.
In this embodiment, the sensing data includes information of n perceived lane lines. A perceived lane line is a lane line perceived by the acquisition device. Taking a vehicle-mounted camera as the acquisition device, the lane lines the camera manages to capture are called perceived lane lines, and their number can differ from the number of real lane lines on the road. For example, a three-lane road has 4 lane lines; as the vehicle travels on it, the camera may capture only 2, 3, or all 4 of those lane lines. That is, the n perceived lane lines may be a subset of the real lane lines on the road.
In this embodiment, the information of a perceived lane line includes, but is not limited to: the position information of the perceived lane line, the type attribute of the perceived lane line, and so on. Taking collection by a vehicle-mounted camera as an example, each of the n perceived lane lines is typically delivered as the coefficients of a cubic polynomial in the vehicle body coordinate system, i.e., one set of cubic polynomial coefficients per lane line. Since each cubic polynomial is expressed in the body coordinate system, the distance between a perceived lane line and the vehicle can be read directly from its coefficients. For convenience of description, this distance is referred to below as the intercept of the perceived lane line. Type attributes of perceived lane lines include, but are not limited to: solid, dashed, white, yellow, etc.
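As an illustration only, the following minimal Python sketch shows how the intercept of a perceived lane line follows from such cubic coefficients; the tuple layout (c0, c1, c2, c3) for y = c0 + c1·x + c2·x^2 + c3·x^3 in the body frame (x forward, y lateral) is an assumption, not the patent's actual data format.

def lane_line_intercept(coeffs):
    # Lateral offset of a perceived lane line at the vehicle (x = 0):
    # the cubic evaluated at x = 0 is simply the constant term c0.
    c0, _c1, _c2, _c3 = coeffs
    return c0

# Two perceived lines, roughly 1.7 m to the left and 1.8 m to the right:
perceived = [(-1.7, 0.01, 0.0, 0.0), (1.8, 0.01, 0.0, 0.0)]
intercepts = [lane_line_intercept(c) for c in perceived]  # [-1.7, 1.8]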
S202: according to the position information, obtaining the information of m +1 real lane lines of the road where the vehicle is located on a positioning map, wherein the road where the vehicle is located comprises m lanes; wherein m +1> -n.
The positioning map is a map in which road information is recorded. In this embodiment, the positioning map may be a high-precision map, i.e., a map defined with high precision and fine granularity. Compared with a traditional map, a high-precision map describes the whole road more accurately, clearly and comprehensively; its main characteristic is that it contains data such as lanes, lane lines, traffic facilities and pedestrian crossings on the road.
The present embodiment is based on a limited-space assumption, i.e., it is assumed that the vehicle travels only within the lanes of the road given by the high-precision map. Therefore, after the position information of the vehicle is determined, high-precision map data for a certain nearby area can be obtained according to that position. The obtained map data are mainly given in the form of lanes, each lane being bounded by lane lines, including but not limited to: left lane line, right lane line, guardrail, curb, etc.
In this embodiment, in order to distinguish the perceived lane line from the lane line in the high-precision map, the lane line in the perception data is referred to as a perceived lane line, and the lane line in the high-precision map is referred to as a real lane line.
Assume that m lanes are acquired from the high-precision map based on the position information of the vehicle. For each lane, a left lane line and a right lane line can be obtained from the map. Since a lane has only two lane lines, and adjacent lanes share a lane line (i.e., the right lane line of lane 0 is the same line as the left lane line of lane 1), m+1 lane lines are acquired in total.
Fig. 3 is a schematic diagram of real lane lines and perceived lane lines provided in an embodiment of the present application, with fig. 3(a) illustrating the real lane lines and fig. 3(b) the perceived lane lines. As shown in fig. 3, assume the road includes 3 lanes, i.e., m = 3, so 4 real lane lines are acquired from the high-precision map as in fig. 3(a). When the vehicle travels in the middle lane, the perceived lane lines it collects may be as in fig. 3(b): only the lane lines on both sides of its own lane are collected, so the number of perceived lane lines is 2, corresponding to the two middle real lane lines. It should be noted that fig. 3 is only one possible example. In general, when the road includes m+1 real lane lines, the number of lane lines perceived by a vehicle traveling on it is at most m+1, that is, n <= m+1.
In the information obtained from the high-precision map, the position data of a real lane line is given as a sequence of points (longitude and latitude). The coordinates of these points in the vehicle body coordinate system can therefore be solved from the position information of the vehicle, and each real lane line can be regressed with a cubic polynomial over its points, giving an expression for each real lane line in the body frame. The intercept d of each real lane line in the body frame (i.e., the distance from the vehicle to that real lane line) can then be read off directly. In this way, the lane-level positioning problem is abstracted into a one-dimensional positioning problem whose state space is discrete and finite, which greatly reduces its difficulty.
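A hedged sketch of this regression step, assuming the map points have already been transformed into body-frame (x, y) arrays; numpy.polyfit is used here merely as one convenient way to do the cubic fit described above.

import numpy as np

def real_lane_line_intercept(x, y):
    # polyfit returns highest-degree-first coefficients [c3, c2, c1, c0];
    # the constant term c0 is the line's value at x = 0, i.e. the
    # intercept d (the lateral distance from the vehicle to the line).
    coeffs = np.polyfit(x, y, deg=3)
    return coeffs[-1]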
The information of the real lane lines obtained from the high-precision map may further include type attributes of each real lane line, such as a solid line, a dotted line, a white line, a yellow line, and the like.
S203: and obtaining confidence degrees that the vehicle is respectively positioned in the m lanes at the current moment according to the information of the n sensed lane lines and the information of the m +1 real lane lines.
In this embodiment, the information of the m+1 real lane lines acquired from the high-precision map may be used as the prior information. Based on this prior, the confidences that the vehicle is located in each of the m lanes are determined according to the information of the n perceived lane lines.
In one possible implementation, it may be assumed in turn that the vehicle is located in the i-th lane (i = 0, 1, …, m-1). A series of expected observations for the vehicle being in the i-th lane (e.g., the distances between the vehicle and each real lane line, the number of real lane lines to its left and right, the distance between the vehicle and the road boundary) is derived from the information of the m+1 real lane lines; these expected observations are then compared with the information of the n perceived lane lines, and the confidence that the vehicle is in the i-th lane is determined by their degree of agreement.
In another possible embodiment, the positioning result of the vehicle (i.e., the posterior probabilities that the vehicle is located in each of the m lanes) can be determined separately for the two cases of depending on the perception data and not depending on it. The positioning results obtained in the two cases are then fused to obtain the confidences that the vehicle is located in each of the m lanes, which improves the accuracy of the positioning result.
Illustratively, a first positioning result is obtained from the information of the m+1 real lane lines, indicating the posterior probabilities that the vehicle is located in each of the m lanes without depending on the perception data. For example, assuming the position information of the vehicle follows a Gaussian distribution, the posterior probabilities that the vehicle is located in each of the m lanes can be determined using the Gaussian error function.
A second positioning result is acquired according to the information of the n perceived lane lines and the information of the m+1 real lane lines, indicating the posterior probabilities that the vehicle is located in each of the m lanes depending on the perception data.
It can be understood that the n perceived lane lines can be observed from different angles to obtain several kinds of observation information. For example, counting the perceived lane lines on each side of the vehicle gives: the number of perceived lane lines on the left of the vehicle and the number on its right. Observing the farthest perceived lane lines on each side gives: the distance between the vehicle and the leftmost perceived lane line and the distance between the vehicle and the rightmost perceived lane line. Observing the road boundary gives: the distance between the vehicle and the road boundary (the distance to the left boundary of the road, and/or to the right boundary). Observing the lane line types gives: the types of the n perceived lane lines (e.g., solid, dashed, white, yellow).
Further, for each kind of observation information, a second positioning result based on that observation information (i.e., the posterior probabilities that the vehicle is located in each of the m lanes given that observation) is obtained according to the observation information and the information of the m+1 real lane lines.
Confidences that the vehicle is located in each of the m lanes are then obtained from the first positioning result and the second positioning results corresponding to the various kinds of observation information.
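The patent does not spell out the fusion operator at this point. Purely as an assumption, the sketch below multiplies the per-lane probabilities of the first positioning result and each observation-based second positioning result and renormalizes, which is the usual independence-style choice.

def fuse_lane_posteriors(*posteriors):
    # Each argument is a list of m per-lane probabilities; multiply them
    # elementwise and renormalize to obtain fused confidences.
    m = len(posteriors[0])
    conf = [1.0] * m
    for p in posteriors:
        conf = [c * q for c, q in zip(conf, p)]
    total = sum(conf)
    return [c / total for c in conf] if total > 0 else [1.0 / m] * m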
S204: and determining the lane of the vehicle at the current moment according to the confidence degrees that the vehicle is respectively positioned in the m lanes at the current moment.
For example, after obtaining the confidence levels that the vehicle is located in m lanes, if the confidence level that the vehicle is located in the kth lane is the highest, and the highest confidence level is greater than or equal to a preset threshold, it is determined that the vehicle is located in the kth lane.
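A minimal sketch of this decision rule; the threshold value is illustrative only.

def decide_lane(confidences, threshold=0.6):
    # Index of the most confident lane, or None if no lane clears the bar.
    k = max(range(len(confidences)), key=lambda i: confidences[i])
    return k if confidences[k] >= threshold else None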
This embodiment provides a method for determining the lane in which a vehicle is located: acquire position information of the vehicle at the current moment and perception data of the vehicle at the current moment, the perception data including information of n perceived lane lines collected by a collecting device in the vehicle; acquire, according to the position information, information of m+1 real lane lines of the road where the vehicle is located from a positioning map, the road comprising m lanes, where m+1 >= n; obtain, from the information of the n perceived lane lines and the information of the m+1 real lane lines, confidences that the vehicle is located in each of the m lanes at the current moment; and determine the lane in which the vehicle is located at the current moment accordingly. The method thus only needs perception data for lane lines, not perception data for various road signs simultaneously, which reduces the requirements on the perception data; and because lane line detection algorithms are mature, lane-level positioning using perceived lane line information also reduces the implementation difficulty of the scheme and the required computing power.
Fig. 4 is a flowchart illustrating a method for determining a lane where a vehicle is located according to another embodiment of the present application, which is a further refinement of the embodiment of fig. 2. As shown in fig. 4, the method of this embodiment includes:
s401: the method comprises the steps of obtaining position information of a vehicle at the current moment and perception data of the vehicle at the current moment, wherein the perception data comprise information of n perception lane lines collected by a collecting device in the vehicle.
S402: according to the position information, obtaining the information of m +1 real lane lines of the road where the vehicle is located on a positioning map, wherein the road where the vehicle is located comprises m lanes; wherein m +1> -n.
The specific implementation of S401 and S402 in this embodiment is similar to S201 and S202 in fig. 2, and is not described here again.
In this embodiment, to facilitate subsequent calculation, the vehicle body coordinate system is translated along the road width direction so that the leftmost real lane line becomes the origin. That is, the intercept of each real lane line is processed as follows:

d_i = d_i − d_0    (1)
S403a to S403e described below are a further refinement of S203 in fig. 2: the perception data are observed from five angles to obtain five kinds of observation data. Each kind of observation data is then used, with the real lane line information as the prior, to obtain a positioning result based on that observation (i.e., the posterior probabilities that the vehicle is located in each of the m lanes). It is understood that, in practical applications, only one or several of S403a to S403e may be performed. Each is described in detail below.
S403 a: and observing based on the position information of the vehicle at the current moment to obtain the posterior probabilities that the vehicle is respectively positioned in the m lanes.
Specifically, obtain the distances between the vehicle and each of the m+1 real lane lines from the information of the m+1 real lane lines, and obtain from these distances the posterior probabilities that the vehicle is located in each of the m lanes.
In the present embodiment, S403a observes based only on the position information of the vehicle at the current time; that is, the calculation uses only the information of the m+1 real lane lines acquired from the high-precision map, without the vehicle's observation data. Illustratively, the distances between the vehicle and each of the m+1 real lane lines are obtained from the information of the m+1 real lane lines.
It is assumed that the vehicle position information acquired by the positioning device satisfies the Gaussian distribution N(μ, δ^2), where μ is the mean and δ^2 is the variance. For convenience of calculation, in this embodiment the origin of the body frame is translated along the vehicle width direction to the leftmost real lane line, i.e., the leftmost real lane line is used as the origin of the x-axis, and therefore μ = d_0. The probability that the vehicle is in the i-th lane at the current moment can then be obtained by formula (2):

P_i^pos = (1/2) · [erf((d_i^right − μ) / (√2 · δ)) − erf((d_i^left − μ) / (√2 · δ))]    (2)

where P_i^pos is the probability, obtained by observation based on the position information of the vehicle at the current moment, that the vehicle is in the i-th lane; erf is the Gaussian error function; d_i^left is the intercept of the left real lane line of the i-th lane; and d_i^right is the intercept of the right real lane line of the i-th lane.
In the application, the vehicle is assumed to travel only on the road given by the high-precision map. Therefore, after the probability of the vehicle being in each lane is obtained through formula (2), these probabilities can be normalized through the following formula (3), giving the probability of the vehicle being in each lane under this assumption.
P_i^pos = P_i^pos / Σ_{j=0}^{m−1} P_j^pos    (3)
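A sketch of formulas (2) and (3) as reconstructed above: the probability mass of the Gaussian position estimate N(μ, δ^2) between each lane's two real lane lines, renormalized over the m lanes. The function and parameter names are illustrative.

import math

def lane_probs_from_position(mu, delta, left, right):
    # left[i], right[i]: intercepts of lane i's left/right real lane lines
    # in the shifted frame whose origin is the leftmost real lane line.
    def cdf(x):  # Gaussian CDF expressed via the error function
        return 0.5 * (1.0 + math.erf((x - mu) / (math.sqrt(2.0) * delta)))
    probs = [cdf(r) - cdf(l) for l, r in zip(left, right)]  # formula (2)
    total = sum(probs)
    if total == 0.0:
        return [1.0 / len(probs)] * len(probs)
    return [p / total for p in probs]  # formula (3): on-road normalization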
S403 b: and observing based on the farthest distance between the vehicle and the left and right sides of the perception lane lines to obtain the posterior probability that the vehicle is respectively positioned in the m lanes.
Specifically, obtain, from the information of the n perceived lane lines, the distance between the vehicle and the leftmost perceived lane line and the distance between the vehicle and the rightmost perceived lane line; obtain the road width of the road where the vehicle is located from the information of the m+1 real lane lines; and obtain, from these two distances and the road width, the posterior probabilities that the vehicle is located in each of the m lanes.
In this embodiment, S403b observes the farthest distances from the vehicle to the lane lines on either side. Specifically, from the information of the n perceived lane lines, obtain the distance dis_max_left between the vehicle and the leftmost perceived lane line (i.e., the intercept of the leftmost perceived lane line) and the distance dis_max_right between the vehicle and the rightmost perceived lane line (i.e., the intercept of the rightmost perceived lane line).
From the information of the m+1 real lane lines acquired from the high-precision map, the intercepts of the left and right real lane lines of each lane are obtained, and the road width of the road where the vehicle is located is then determined from these intercepts according to formula (4).
width_road = d_{m−1}^right − d_0^left    (4)

where width_road is the road width, m is the number of lanes, d_i^left is the intercept of the left real lane line of the i-th lane, and d_i^right is the intercept of the right real lane line of the i-th lane.
Further, from the distance dis_max_left between the vehicle and the leftmost perceived lane line, the distance dis_max_right between the vehicle and the rightmost perceived lane line, and the road width width_road, the posterior probability that the vehicle is located in each lane can be obtained according to formula (5):

P_i^far = (1/2) · [erf((d_i^right − μ) / (√2 · δ)) − erf((d_i^left − μ) / (√2 · δ))],  with μ = (dis_left + (width_road − dis_right)) / 2    (5)

where P_i^far is the probability, obtained by observation based on the farthest perceived lane lines, that the vehicle is in the i-th lane; erf is the Gaussian error function; d_i^left is the intercept of the real lane line on the left side of the i-th lane; d_i^right is the intercept of the real lane line on the right side of the i-th lane; and dis_left = dis_max_left, dis_right = dis_max_right.
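A sketch of formula (5) as reconstructed above, reusing lane_probs_from_position from the S403a sketch. The midpoint position estimate μ is part of the reconstruction (the original expression is not recoverable from the text), and δ is an illustrative measurement noise.

def lane_probs_from_far_lines(dis_left, dis_right, width_road,
                              left, right, delta=0.5):
    # dis_left / dis_right are interpreted here as the vehicle's offsets
    # from the leftmost / rightmost perceived lane lines; averaging the
    # two implied positions gives a single lateral estimate.
    mu = 0.5 * (dis_left + (width_road - dis_right))
    return lane_probs_from_position(mu, delta, left, right)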
S403 c: and observing based on the number of the perception lane lines on the two sides of the vehicle to obtain the posterior probability that the vehicle is respectively positioned on the m lanes.
Specifically, obtain the number of perceived lane lines on the left of the vehicle and the number on its right from the information of the n perceived lane lines; obtain the road width of the road where the vehicle is located from the information of the m+1 real lane lines; and obtain, from the two counts and the road width, the posterior probabilities that the vehicle is located in each of the m lanes.
In this embodiment, S403c observes the number of perceived lane lines on each side of the vehicle. Since a lane has only two lane lines and adjacent lanes share a lane line, d_i^right = d_{i+1}^left (for example, the right lane line of lane 0 is the same line as the left lane line of lane 1). The lane in which the vehicle may be located can therefore be confirmed by counting the perceived lane lines on the left and right sides of the vehicle.
Specifically, based on the information of the n perceived lane lines, the n perceived lane lines can be divided, according to their intercepts in the body frame, into perceived lane lines on the left of the vehicle (intercept d < 0) and perceived lane lines on the right of the vehicle (intercept d > 0). The number of perceived lane lines on the left side of the vehicle and the number on the right side can then be determined.
The road width width_road of the road where the vehicle is located is obtained according to the information of the m +1 real lane lines, similarly to S403b; see formula (4) for details.
Suppose the number of observed perceived lane lines on the left side of the vehicle is n_left and the number on the right side is n_right. Treating the vehicle as sitting half a lane width from its nearest lane line, let:

dis_left = −(n_left − 1/2) · width_road / m
dis_right = (n_right − 1/2) · width_road / m
Substituting dis_left, dis_right, and width_road into formula (5) yields the posterior probability that the vehicle is in each lane.
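Under the same assumptions, S403c can be sketched by converting the two counts into equivalent extreme-line distances and reusing the S403b function above; the half-lane offset matches the reconstruction given before formula (5) is reapplied and is likewise an assumption.

```python
def lane_posterior_from_line_counts(n_left, n_right,
                                    d_left_real, d_right_real, sigma=0.5):
    """S403c sketch: posterior over the m lanes from the number of
    perceived lane lines on each side of the vehicle.

    Assumes the vehicle sits half a lane width from its nearest lane line,
    then reuses lane_posterior_from_extreme_lines (the S403b sketch).
    """
    m = len(d_left_real)
    width_road = sum(d_right_real[i] - d_left_real[i] for i in range(m))
    avg_lane_width = width_road / m

    dis_left = -(n_left - 0.5) * avg_lane_width   # negative: left of vehicle
    dis_right = (n_right - 0.5) * avg_lane_width

    return lane_posterior_from_extreme_lines(d_left_real, d_right_real,
                                             dis_left, dis_right, sigma)
```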
S403d: observing based on the distance between the vehicle and the road boundary to obtain the posterior probabilities that the vehicle is located in each of the m lanes.
Specifically, the distance between the vehicle and the road boundary is acquired according to the information of the n perceived lane lines; the distances between the vehicle and the m +1 real lane lines are acquired according to the information of the m +1 real lane lines; and the posterior probabilities that the vehicle is located in each of the m lanes are acquired according to the distance between the vehicle and the road boundary and the distances between the vehicle and the m +1 real lane lines.
In this embodiment, S403d makes an observation based on the distance between the vehicle and the road boundary (e.g., a guardrail or curb). If an observation of the road boundary can be obtained directly from the perception data, the lane where the vehicle is located can be determined from the intercept corresponding to the road boundary. The core of S403d is therefore how to obtain the road boundary observation from the perception data; in this embodiment, whether a road boundary has been observed is determined in the following two ways.
Mode 1: determining the road boundary according to the type of a perceived curve. If the type of a certain perceived curve is road boundary or guardrail, the position of the vehicle within the road can be determined according to the intercept of that curve, and the lane where the vehicle is located can then be determined.
Mode 2: determining the road boundary according to the shortest distance between perceived curves. In practice, vision-based sensors rarely label a barrier or road boundary as such with high confidence; instead, such curves are often reported as unknown or other types alongside conventional lane lines. Therefore, in this embodiment the road boundary may be identified by detecting whether the distance between two perceived lane lines is less than a preset threshold (for example, the distance between a road boundary and the nearest lane line is usually about 40 cm, so the preset threshold may be set to 50 cm). For example, if the perception data shows that the distance between the two leftmost perceived lane lines is smaller than the preset threshold, the leftmost perceived lane line is determined to be the road boundary, and the second perceived lane line from the left is the left lane line of the 0th lane.
The intercept corresponding to the road boundary can be determined by either of the two modes above and is denoted d_boundary. Let μ = d_boundary + Δd, where Δd is the preset distance from the road boundary to the nearest lane line, for example 50 cm, and substitute μ into formula (2) to obtain the posterior probability that the vehicle is in each lane.
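The two boundary-detection modes can be sketched as follows. The type labels ("road_boundary", "guardrail") and the (intercept, type) tuple layout are assumed data formats, since the original does not fix how the perception stack reports curves.

```python
def detect_road_boundary(perceived, threshold=0.5):
    """S403d sketch: return the intercept d_boundary of the road boundary,
    or None if no boundary observation can be extracted.

    perceived: list of (intercept, curve_type) tuples sorted left to right.
    threshold: preset gap threshold in meters (50 cm in the embodiment).
    """
    # Mode 1: the perception stack already labels the curve as a boundary.
    for intercept, curve_type in perceived:
        if curve_type in ("road_boundary", "guardrail"):
            return intercept

    # Mode 2: the two leftmost curves are closer than the threshold, so the
    # leftmost one is taken to be the boundary (boundary-to-lane-line gaps
    # are typically around 40 cm).
    if len(perceived) >= 2 and perceived[1][0] - perceived[0][0] < threshold:
        return perceived[0][0]
    return None
```

Symmetrically, the same check could be run on the two rightmost curves; the sketch shows only the left side, mirroring the example in the text.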
S403e: observing based on the types of the lane lines to obtain the posterior probabilities that the vehicle is located in each of the m lanes.
Specifically, the types of the n perceived lane lines are obtained according to the information of the n perceived lane lines; the types of the m +1 real lane lines are obtained according to the information of the m +1 real lane lines; and the posterior probabilities that the vehicle is located in each of the m lanes are obtained according to the types of the n perceived lane lines and the types of the m +1 real lane lines.
In this embodiment, S403e makes an observation based on the types of the lane lines. As described above, both the lane line data obtained from the high-precision map and the perception data include the type attribute of each lane line, so the lane where the vehicle is located can also be determined by type matching. When n lane lines are obtained from the perception data, the possible alignments are enumerated, starting from aligning the leftmost perceived lane line with the leftmost real lane line in the high-precision map and ending with aligning the rightmost perceived lane line with the rightmost real lane line; there are m − (n − 1) + 1 possible matching cases.
Assume that the probability that the type of a single perceived lane line is observed correctly is P_true. Then the probability of the kth possible matching case can be calculated by formula (6):

p_k = P_true^{c_k} · (1 − P_true)^{n − c_k}        (6)

wherein c_k is the number of perceived lane lines whose type agrees with the type of the real lane line they are aligned with in the kth matching case. Having obtained the probability p_k of the kth possible matching case, the probability that the vehicle is located in the (x_k + Δm)th lane can be determined, because under the kth matching case the leftmost perceived lane line is aligned with the x_kth real lane line, wherein Δm = n_left − 1 and n_left is the number of perceived lane lines located on the left side of the vehicle.
For the convenience of subsequent calculation, the probability of any lane not covered by a matching case is set to a minimum value P_min, e.g., P_min = 0.05. For example, if the perception data shows two perceived lane lines on the left side of the vehicle and one on the right side, the vehicle essentially cannot be located in the leftmost 0th lane (lane subscripts start from 0).
After the probability that the vehicle is in each lane is obtained, normalization can be completed using formula (3).
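The type-matching observation can be sketched as below, under the reconstruction of formula (6) given above. The values P_true = 0.9 and P_min = 0.05, and the choice of keeping the best matching probability per lane, are illustrative assumptions.

```python
def lane_posterior_from_line_types(perceived_types, real_types, n_left,
                                   p_true=0.9, p_min=0.05):
    """S403e sketch: posterior over the m lanes by matching the types of
    the n perceived lane lines against the m+1 real lane lines.

    perceived_types: types of the perceived lane lines, left to right.
    real_types: types of the real lane lines from the high-precision map.
    n_left: number of perceived lane lines on the left side of the vehicle.
    """
    n = len(perceived_types)
    m = len(real_types) - 1                 # number of lanes
    probs = [p_min] * m                     # lanes never reached keep P_min

    for k in range(len(real_types) - n + 1):   # m - (n - 1) + 1 alignments
        # Formula (6): product of per-line type-match probabilities.
        matches = sum(1 for j in range(n)
                      if perceived_types[j] == real_types[k + j])
        p_k = (p_true ** matches) * ((1 - p_true) ** (n - matches))
        lane = k + (n_left - 1)             # x_k + delta_m, delta_m = n_left - 1
        if 0 <= lane < m:
            probs[lane] = max(probs[lane], p_k)

    total = sum(probs)
    return [p / total for p in probs]       # normalization, formula (3)
```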
S404: fusing the posterior probabilities, acquired in S403a to S403e, that the vehicle is located in each of the m lanes, to obtain the confidences that the vehicle is located in each of the m lanes.
In this embodiment, the lane where the vehicle is most likely to be located at the current time must be determined by combining the posterior probabilities, acquired in S403a to S403e, that the vehicle is located in each of the m lanes. This embodiment also relies on a state-invariance assumption: the confidence of each lane is calculated using all the observation information from the same time t, and since all the observation information is obtained from the perception data of the same moment, it can reasonably be assumed that the position of the vehicle does not change at time t. Moreover, a vehicle has exactly two states with respect to a lane: it is either in the lane or not. Based on these assumptions, the confidence based on the current perception data, denoted p(x_i | z_{0:k}), can be calculated, indicating the confidence that the vehicle is in the ith lane at time t given the observations (z_0, …, z_k).
Optionally, in this embodiment, when the confidence that the vehicle is in each lane is calculated according to the positioning results obtained in S403a to S403e, a static binary Bayesian filter may be used, i.e., formula (7) is applied iteratively. Since all the observation information is obtained at the same time t and the calculated confidence of the lane where the vehicle is located is also for time t, the symbol t representing time is omitted from formula (7) for simplicity of expression.
l_k = l_{k−1} + ln[ p(x_i | z_k) / (1 − p(x_i | z_k)) ] − ln[ p(x_i) / (1 − p(x_i)) ]        (7)

wherein l is the log odds (the logarithm of the probability ratio), i.e., l_k = ln[ p(x_i | z_{0:k}) / (1 − p(x_i | z_{0:k})) ], with l_0 = ln[ p(x_i) / (1 − p(x_i)) ]; k takes the values 0, 1, 2, … in sequence over the pieces of observation information obtained in S403a to S403e; and p(x_i) is the prior probability. Taking a road with m lanes as an example, p(x_i) = ( d_right^i − d_left^i ) / width_road, wherein d_right^i is the intercept of the right lane line of the ith lane, d_left^i is the intercept of the left lane line of the ith lane, and width_road is the width of the road. For example, for a three-lane road with equal lane widths, p(x_i) can be equal to 1/3.
Having obtained l_k from formula (7), the confidence p(x_i | z_{0:k}) that the vehicle is in the ith lane at time t under the current pieces of observation information can then be solved by formula (8):

p(x_i | z_{0:k}) = 1 − 1 / (1 + e^{l_k})        (8)
In this embodiment, because static binary Bayesian filtering is performed on the positioning results corresponding to the multiple types of observation information (S403a to S403e) to obtain the confidence that the vehicle is located in each lane, the scheme of the present application is highly extensible. When a new type of observation information is added, formula (7) only needs to be applied once more on top of the current result for fusion, without affecting the existing results.
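A sketch of the fusion step, following formulas (7) and (8); the clamping of posteriors away from 0 and 1 is a numerical safeguard added here, not part of the original description.

```python
import math

def fuse_observations(posteriors_per_obs, priors):
    """S404 sketch: static binary Bayesian filter over one time instant.

    posteriors_per_obs: one per-lane posterior list per observation type
    (the outputs of S403a..S403e), all computed at the same time t.
    priors: per-lane prior p(x_i), e.g. lane width / road width.
    Returns the fused confidence p(x_i | z_0:k) for each lane.
    """
    eps = 1e-9
    confidences = []
    for i, prior in enumerate(priors):
        l = math.log(prior / (1.0 - prior))            # l_0, the prior log odds
        for posterior in posteriors_per_obs:
            p = min(max(posterior[i], eps), 1.0 - eps)
            # Formula (7): add the observation log odds, subtract the prior's.
            l += math.log(p / (1.0 - p)) - math.log(prior / (1.0 - prior))
        # Formula (8): back from log odds to a probability.
        confidences.append(1.0 - 1.0 / (1.0 + math.exp(l)))
    return confidences
```

Adding a new observation type then amounts to appending one more list to posteriors_per_obs, which is exactly the extensibility property noted above.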
S405: obtaining the confidences that the vehicle was located in each of the m lanes at the previous moment; and correcting the confidences that the vehicle is located in each of the m lanes at the current moment according to the confidences at the previous moment, to obtain the corrected confidences for the current moment.
In order to improve the robustness of the positioning result, in this embodiment the final positioning result is obtained recursively, borrowing the idea of histogram filtering. Conventional histogram filtering is shown in formulas (9) and (10):

p̄_{i,t} = Σ_j p(X_t = x_i | X_{t−1} = x_j) · p_{j,t−1}        (9)
p_{i,t} = η · p(z_t | X_t = x_i) · p̄_{i,t}        (10)

wherein p̄_{i,t} is the predicted confidence that the vehicle is in the ith lane at time t, p_{j,t−1} is the corrected confidence at time t−1, and η is a normalization factor.
In the application scenario of the present application, since the lane where the vehicle is located is essentially unchanged between consecutive frames relative to the acquisition frequency of the perception data (≥ 10 Hz), formula (9) may be simplified to formula (11): the predicted confidence p̄_{i,t} that the vehicle is located in the ith lane at time t is equal to the confidence p_{i,t−1} that the vehicle was located in the ith lane at time t−1. Namely:

p̄_{i,t} = p_{i,t−1}        (11)
Meanwhile, formula (10) can be converted by the Bayes formula into:

p(z_t | X_t = x_i) = p(X_t = x_i | z_t) · p(z_t) / p(X_t = x_i)        (12)

wherein p(X_t = x_i) is the prior probability, i.e.,

p(X_t = x_i) = ( d_right^i − d_left^i ) / width_road        (13)
in this embodiment, each time a frame of perceptual data is obtained, the confidence p (X) under the current perceptual data is calculated from the posterior probabilities obtained in steps S403a to S403e through the static binary bayesian filtering described in step S404t=xi|zt). Meanwhile, since normalization is also performed by equation (3), p (z)t) Can be incorporated into the normalization factor η so equation (12) can be written as equation (14).
p_{i,t} = η · p(X_t = x_i | z_t) · p_{i,t−1} / p(X_t = x_i)        (14)
As can be seen from formula (14), the confidence that the vehicle was located in the ith lane at time t−1 is used to correct the confidence that the vehicle is located in the ith lane at time t, yielding the corrected confidence that the vehicle is located in the ith lane at time t.
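A sketch of the temporal correction, assuming the reconstruction of formula (14) above together with the identity motion model of formula (11):

```python
def temporal_correction(conf_t, conf_prev, priors):
    """S405 sketch: correct the current-frame confidences with the
    previous-frame confidences, histogram-filter style.

    conf_t: confidences from the current frame (output of S404).
    conf_prev: corrected confidences from the previous time step.
    priors: per-lane prior p(x_i).
    """
    # Formula (14): multiply by the previous belief, divide by the prior.
    unnormalized = [c_t * c_prev / prior
                    for c_t, c_prev, prior in zip(conf_t, conf_prev, priors)]
    eta = sum(unnormalized) or 1.0          # normalization factor
    return [u / eta for u in unnormalized]
```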
S406: determining the lane where the vehicle is located at the current moment according to the corrected confidences that the vehicle is located in each of the m lanes at the current moment.
After the confidence p_{i,t} of the vehicle being in each lane at the current time t is obtained, if the confidence of the vehicle being in the ith lane is the highest and this highest probability exceeds a preset threshold p_threshold_max, i.e., p_{i_max,t} ≥ p_threshold_max, the vehicle can be considered to be in that lane.
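The final decision can be sketched as a thresholded argmax; the threshold value 0.6 is an assumed placeholder, since the embodiment does not state a numeric value for p_threshold_max.

```python
def decide_lane(confidences, p_threshold_max=0.6):
    """S406 sketch: pick the most confident lane, or return None when even
    the best confidence stays below the preset threshold."""
    best = max(range(len(confidences)), key=lambda i: confidences[i])
    return best if confidences[best] >= p_threshold_max else None
```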
According to the method, only perception data for lane lines needs to be obtained; perception data for various other road signs does not need to be obtained at the same time, which reduces the perception data requirement. Since lane line detection and calculation methods are mature, performing lane-level positioning of the vehicle from perceived lane line information reduces the implementation difficulty of the scheme and its demand on computing power. In addition, because static binary Bayesian filtering is performed on the positioning results corresponding to the multiple types of observation information to obtain the confidence that the vehicle is located in each lane, the scheme of the present application is highly extensible: when a new type of observation information is added, only the positioning result corresponding to the new observation information needs to be fused into the current result, without affecting the existing results.
Fig. 5 is a schematic structural diagram of an apparatus for determining a lane in which a vehicle is located according to an embodiment of the present application. As shown in fig. 5, the apparatus 500 of the present embodiment may include: a first obtaining module 501, a second obtaining module 502 and a determining module 503.
The first obtaining module 501 is configured to obtain position information of a vehicle at a current time and perception data of the vehicle at the current time, where the perception data includes information of n perception lane lines collected by a collecting device in the vehicle; the second obtaining module 502 is configured to obtain, on a positioning map according to the location information, information of m +1 real lane lines of the road where the vehicle is located, where the road where the vehicle is located includes m lanes, and m + 1 ≥ n; the determining module 503 is configured to obtain confidence levels that the vehicle is respectively located in the m lanes at the current time according to the information of the n perception lane lines and the information of the m +1 real lane lines; the determining module 503 is further configured to determine, according to the confidence levels that the vehicle is respectively located in the m lanes at the current time, the lane where the vehicle is located at the current time.
In a possible implementation manner, the determining module 503 is specifically configured to: obtain a first positioning result according to the information of the m +1 real lane lines, wherein the first positioning result is used for indicating the posterior probability that the vehicle is respectively positioned in the m lanes under the condition of not depending on the perception data; obtain a second positioning result according to the information of the n perception lane lines and the information of the m +1 real lane lines, wherein the second positioning result is used for indicating the posterior probability that the vehicle is respectively positioned in the m lanes under the condition of depending on the perception data; and obtain the confidence degrees that the vehicle is respectively positioned in the m lanes at the current moment according to the first positioning result and the second positioning result.
In a possible implementation manner, the determining module 503 is specifically configured to: acquiring at least one piece of observation information according to the information of the n sensing lane lines; for each type of observation information, acquiring a second positioning result based on the observation information according to the observation information and the information of the m +1 real lane lines; and obtaining confidence degrees that the vehicle is respectively positioned in the m lanes at the current moment according to the first positioning result and a second positioning result corresponding to the at least one type of observation information.
In one possible implementation, the at least one type of observation information includes: the distance between the vehicle and the leftmost perception lane line and the distance between the vehicle and the rightmost perception lane line; the determining module 503 is specifically configured to: acquire the road width of the road where the vehicle is located according to the information of the m +1 real lane lines; and acquire the posterior probabilities that the vehicle is respectively positioned in the m lanes according to the distance between the vehicle and the leftmost perception lane line, the distance between the vehicle and the rightmost perception lane line, and the road width.
In one possible implementation, the at least one type of observation information includes: the number of perception lane lines located on the left side of the vehicle and the number of perception lane lines located on the right side of the vehicle; the determining module 503 is specifically configured to: acquire the road width of the road where the vehicle is located according to the information of the m +1 real lane lines; and acquire the posterior probabilities that the vehicle is respectively positioned in the m lanes according to the number of perception lane lines located on the left side of the vehicle, the number of perception lane lines located on the right side of the vehicle, and the road width.
In one possible implementation, the at least one type of observation information includes: the distance between the vehicle and a road boundary; the determining module 503 is specifically configured to: acquire the distances between the vehicle and the m +1 real lane lines according to the information of the m +1 real lane lines; and acquire the posterior probabilities that the vehicle is respectively positioned in the m lanes according to the distance between the vehicle and the road boundary and the distances between the vehicle and the m +1 real lane lines.
In one possible implementation, the at least one type of observation information includes: the types of the n perception lane lines; the determining module 503 is specifically configured to: acquire the types of the m +1 real lane lines according to the information of the m +1 real lane lines; and obtain the posterior probabilities that the vehicle is respectively positioned in the m lanes according to the types of the n perception lane lines and the types of the m +1 real lane lines.
In a possible implementation manner, the determining module 503 is specifically configured to: acquire the distances between the vehicle and the m +1 real lane lines according to the information of the m +1 real lane lines; and acquire the posterior probabilities that the vehicle is respectively positioned in the m lanes according to the distances between the vehicle and the m +1 real lane lines.
In a possible implementation manner, the determining module 503 is further configured to: obtaining confidence coefficients that the vehicle is respectively positioned in the m lanes at the previous moment; and according to the confidence degrees that the vehicle is respectively positioned in the m lanes at the previous moment, correcting the confidence degrees that the vehicle is respectively positioned in the m lanes at the current moment to obtain the corrected confidence degrees that the vehicle is respectively positioned in the m lanes at the current moment.
The apparatus of this embodiment may be configured to implement the technical solution in any of the method embodiments, and the implementation principle and the technical effect are similar, which are not described herein again.
According to an embodiment of the present application, an electronic device and a readable storage medium are also provided.
Fig. 6 is a block diagram of an electronic device for determining the lane where a vehicle is located according to an embodiment of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic devices may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be examples only, and are not meant to limit implementations of the present application described and/or claimed herein.
As shown in fig. 6, the electronic apparatus includes: one or more processors 701, a memory 702, and interfaces for connecting the various components, including a high-speed interface and a low-speed interface. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions for execution within the electronic device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output device (such as a display device coupled to the interface). In other embodiments, multiple processors and/or multiple buses may be used, as desired, along with multiple memories. Also, multiple electronic devices may be connected, with each device providing some of the necessary operations (e.g., as a server array, a group of blade servers, or a multiprocessor system). In fig. 6, one processor 701 is taken as an example.
The memory 702 is a non-transitory computer readable storage medium as provided herein. Wherein the memory stores instructions executable by the at least one processor to cause the at least one processor to perform the method of determining a lane in which a vehicle is located as provided herein. The non-transitory computer readable storage medium of the present application stores computer instructions for causing a computer to perform the method of determining a lane in which a vehicle is located provided by the present application.
The memory 702, which is a non-transitory computer readable storage medium, may be used to store non-transitory software programs, non-transitory computer executable programs, and modules, such as program instructions/modules corresponding to the method for determining the lane where the vehicle is located in the embodiment of the present application (for example, the first obtaining module 501, the second obtaining module 502, and the determining module 503 shown in fig. 5). The processor 701 executes various functional applications and data processing of the server or the terminal device by running the non-transitory software programs, instructions and modules stored in the memory 702, that is, implements the method for determining the lane where the vehicle is located in the above-described method embodiment.
The memory 702 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created by use of the electronic device, and the like. Further, the memory 702 may include high speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory 702 may optionally include memory located remotely from the processor 701, which may be connected to the electronic device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device may further include: an input device 703 and an output device 704. The processor 701, the memory 702, the input device 703 and the output device 704 may be connected by a bus or other means, as exemplified by the bus connection in fig. 6.
The input device 703 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic apparatus, such as a touch screen, keypad, mouse, track pad, touch pad, pointer stick, one or more mouse buttons, track ball, joystick, or other input device. The output devices 704 may include a display device, auxiliary lighting devices (e.g., LEDs), and tactile feedback devices (e.g., vibrating motors), among others. The display device may include, but is not limited to, a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, and a plasma display. In some implementations, the display device can be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, application specific ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
These computing programs (also known as programs, software applications, or code) include machine instructions for a programmable processor, and may be implemented using high-level procedural and/or object-oriented programming languages, and/or assembly/machine languages. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present application may be executed in parallel, sequentially, or in different orders; this is not limited herein as long as the desired results of the technical solutions disclosed in the present application can be achieved.
The above-described embodiments should not be construed as limiting the scope of the present application. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (12)

1. A method of determining a lane in which a vehicle is located, comprising:
acquiring position information of a vehicle at the current moment and sensing data of the vehicle at the current moment, wherein the sensing data comprises information of n sensing lane lines acquired by an acquisition device in the vehicle;
according to the position information, obtaining the information of m +1 real lane lines of the road where the vehicle is located on a positioning map, wherein the road where the vehicle is located comprises m lanes; wherein m + 1 ≥ n;
obtaining confidence coefficients of the vehicle respectively positioned on the m lanes at the current moment according to the information of the n sensed lane lines and the information of the m +1 real lane lines;
and determining the lane of the vehicle at the current moment according to the confidence degrees that the vehicle is respectively positioned in the m lanes at the current moment.
2. The method according to claim 1, wherein the obtaining the confidence levels that the vehicle is respectively located in the m lanes at the current time according to the information of the n sensing lane lines and the information of the m +1 real lane lines comprises:
acquiring a first positioning result according to the information of the m +1 real lane lines, wherein the first positioning result is used for indicating the posterior probability that the vehicle is respectively positioned in the m lanes under the condition of not depending on the perception data;
acquiring a second positioning result according to the information of the n sensing lane lines and the information of the m +1 real lane lines, wherein the second positioning result is used for indicating the posterior probability that the vehicle is respectively positioned in the m lanes under the condition of depending on the sensing data;
and obtaining confidence degrees that the vehicle is respectively positioned in the m lanes at the current moment according to the first positioning result and the second positioning result.
3. The method according to claim 2, wherein the obtaining a second positioning result according to the information of the n sensing lane lines and the information of the m +1 real lane lines comprises:
acquiring at least one piece of observation information according to the information of the n sensing lane lines;
for each type of observation information, acquiring a second positioning result based on the observation information according to the observation information and the information of the m +1 real lane lines;
the obtaining confidence that the vehicle is respectively located in the m lanes at the current moment according to the first positioning result and the second positioning result includes:
and obtaining confidence degrees that the vehicle is respectively positioned in the m lanes at the current moment according to the first positioning result and a second positioning result corresponding to the at least one type of observation information.
4. The method of claim 3, wherein the at least one observation comprises: the distance between the vehicle and the leftmost sensing lane line and the distance between the vehicle and the rightmost sensing lane line; the obtaining of a second positioning result based on the observation information according to the observation information and the information of the m +1 real lane lines includes:
acquiring the road width of the road where the vehicle is located according to the information of the m +1 real lane lines;
and acquiring the posterior probabilities that the vehicle is respectively positioned in the m lanes according to the distance between the vehicle and the leftmost perception lane line, the distance between the vehicle and the rightmost perception lane line, and the road width.
5. The method of claim 3, wherein the at least one observation comprises: the number of perception lane lines located on the left side of the vehicle and the number of perception lane lines located on the right side of the vehicle; the obtaining of a second positioning result based on the observation information according to the observation information and the information of the m +1 real lane lines includes:
acquiring the road width of the road where the vehicle is located according to the information of the m +1 real lane lines;
and acquiring the posterior probabilities that the vehicle is respectively positioned in the m lanes according to the number of the perception lane lines positioned on the left side of the vehicle, the number of the perception lane lines positioned on the right side of the vehicle, and the road width.
6. The method of claim 3, wherein the at least one observation comprises: a distance between the vehicle and a road boundary; the obtaining of a second positioning result based on the observation information according to the observation information and the information of the m +1 real lane lines includes:
acquiring the distances between the vehicle and the m +1 real lane lines respectively according to the information of the m +1 real lane lines;
and acquiring the posterior probabilities that the vehicle is respectively positioned in the m lanes according to the distance between the vehicle and the road boundary and the distances between the vehicle and the m +1 real lane lines.
7. The method of claim 3, wherein the at least one observation comprises: the types of the n sensing lane lines; the obtaining of a second positioning result based on the observation information according to the observation information and the information of the m +1 real lane lines includes:
acquiring the types of the m +1 real lane lines according to the information of the m +1 real lane lines;
and obtaining the posterior probability that the vehicle is respectively positioned in the m lanes according to the types of the n sensing lane lines and the types of the m +1 real lane lines.
8. The method according to claim 2, wherein the obtaining a first positioning result according to the information of the m +1 real lane lines comprises:
acquiring the distances between the vehicle and the m +1 real lane lines respectively according to the information of the m +1 real lane lines;
and acquiring the posterior probabilities that the vehicle is respectively positioned in the m lanes according to the distances between the vehicle and the m +1 real lane lines.
9. The method according to any one of claims 1 to 8, wherein the obtaining the confidence that the vehicle is respectively located in the m lanes at the current time according to the information of the n sensing lane lines and the information of the m +1 real lane lines further comprises:
obtaining confidence coefficients that the vehicle is respectively positioned in the m lanes at the previous moment;
and according to the confidence degrees that the vehicle is respectively positioned in the m lanes at the previous moment, correcting the confidence degrees that the vehicle is respectively positioned in the m lanes at the current moment to obtain the corrected confidence degrees that the vehicle is respectively positioned in the m lanes at the current moment.
10. An apparatus for determining a lane in which a vehicle is located, comprising:
the system comprises a first acquisition module, a second acquisition module and a third acquisition module, wherein the first acquisition module is used for acquiring the position information of a vehicle at the current moment and the perception data of the vehicle at the current moment, and the perception data comprises the information of n perception lane lines acquired by an acquisition device in the vehicle;
the second acquisition module is used for acquiring the information of m +1 real lane lines of the road where the vehicle is located on a positioning map according to the position information, wherein the road where the vehicle is located comprises m lanes; wherein m + 1 ≥ n;
the determining module is used for acquiring confidence degrees that the vehicle is respectively positioned in the m lanes at the current moment according to the information of the n sensing lane lines and the information of the m +1 real lane lines;
the determining module is further configured to determine the lane where the vehicle is located at the current time according to the confidence levels that the vehicle is located in the m lanes at the current time.
11. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1 to 9.
12. A non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the method of any one of claims 1 to 9.
CN201911206663.8A 2019-11-29 2019-11-29 Method, device and equipment for determining lane where vehicle is located Active CN110979346B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911206663.8A CN110979346B (en) 2019-11-29 2019-11-29 Method, device and equipment for determining lane where vehicle is located

Publications (2)

Publication Number Publication Date
CN110979346A true CN110979346A (en) 2020-04-10
CN110979346B CN110979346B (en) 2021-08-31

Family

ID=70088766

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911206663.8A Active CN110979346B (en) 2019-11-29 2019-11-29 Method, device and equipment for determining lane where vehicle is located

Country Status (1)

Country Link
CN (1) CN110979346B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160229410A1 (en) * 2013-12-04 2016-08-11 Mobileye Vision Technologies Ltd. Systems and methods for navigating a vehicle among encroaching vehicles
EP3428900A1 (en) * 2017-07-12 2019-01-16 PSA Automobiles SA Method and system to assist with driving an autonomous vehicle on a roadway having pedestrian crossings
CN109115231A (en) * 2018-08-29 2019-01-01 东软睿驰汽车技术(沈阳)有限公司 A kind of vehicle positioning method, equipment and automatic driving vehicle
CN110146096A (en) * 2018-10-24 2019-08-20 北京初速度科技有限公司 A kind of vehicle positioning method and device based on image perception
CN109849922A (en) * 2018-12-25 2019-06-07 青岛中汽特种汽车有限公司 A method of the view-based access control model information for intelligent vehicle is merged with GIS information
CN110174113A (en) * 2019-04-28 2019-08-27 福瑞泰克智能系统有限公司 A kind of localization method, device and the terminal in vehicle driving lane

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113551680A (en) * 2020-04-23 2021-10-26 上汽通用汽车有限公司 Lane-level positioning system and method
CN111814114A (en) * 2020-06-30 2020-10-23 北京百度网讯科技有限公司 Method, device, electronic device, vehicle and storage medium for lane positioning verification
CN111814114B (en) * 2020-06-30 2023-12-08 阿波罗智能技术(北京)有限公司 Lane positioning verification method, lane positioning verification device, electronic device, vehicle and storage medium
CN114264310A (en) * 2020-09-14 2022-04-01 阿里巴巴集团控股有限公司 Positioning and navigation method, device, electronic equipment and computer storage medium
CN112595330A (en) * 2020-11-13 2021-04-02 禾多科技(北京)有限公司 Vehicle positioning method and device, electronic equipment and computer readable medium
CN112595330B (en) * 2020-11-13 2021-10-15 禾多科技(北京)有限公司 Vehicle positioning method and device, electronic equipment and computer readable medium
CN112415552A (en) * 2020-11-17 2021-02-26 北京百度网讯科技有限公司 Vehicle position determining method and device and electronic equipment
CN112328731A (en) * 2021-01-06 2021-02-05 禾多科技(北京)有限公司 Vehicle lane level positioning method and device, electronic equipment and computer readable medium
CN113232658B (en) * 2021-06-28 2022-06-28 驭势(上海)汽车科技有限公司 Vehicle positioning method and device, electronic equipment and storage medium
WO2023273780A1 (en) * 2021-06-28 2023-01-05 驭势(上海)汽车科技有限公司 Vehicle positioning method and apparatus, electronic device, and storage medium
CN113232658A (en) * 2021-06-28 2021-08-10 驭势(上海)汽车科技有限公司 Vehicle positioning method and device, electronic equipment and storage medium
CN113701781A (en) * 2021-09-30 2021-11-26 重庆长安汽车股份有限公司 Matching lane searching method based on high-precision map and visual lane line
CN113701781B (en) * 2021-09-30 2023-07-18 重庆长安汽车股份有限公司 Matching lane searching method based on high-precision map and visual lane lines
CN113942502A (en) * 2021-10-25 2022-01-18 中汽创智科技有限公司 Vehicle driving assisting method, device, equipment and storage medium
CN114179825A (en) * 2021-12-08 2022-03-15 北京百度网讯科技有限公司 Method for obtaining confidence of measurement value through multi-sensor fusion and automatic driving vehicle
CN114179825B (en) * 2021-12-08 2022-11-18 北京百度网讯科技有限公司 Method for obtaining confidence of measurement value through multi-sensor fusion and automatic driving vehicle
CN114216467A (en) * 2021-12-13 2022-03-22 禾多科技(北京)有限公司 Road positioning method, device, computer equipment and storage medium
WO2024037487A1 (en) * 2022-08-18 2024-02-22 中国第一汽车股份有限公司 Path correction method and apparatus applied to vehicle, and electronic device
CN116380088B (en) * 2023-06-05 2023-08-29 小米汽车科技有限公司 Vehicle positioning method and device, vehicle and storage medium
CN116380088A (en) * 2023-06-05 2023-07-04 小米汽车科技有限公司 Vehicle positioning method and device, vehicle and storage medium

Also Published As

Publication number Publication date
CN110979346B (en) 2021-08-31


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant