CN110806215A - Vehicle positioning method, device, equipment and storage medium

Info

Publication number: CN110806215A (granted publication: CN110806215B)
Application number: CN201911146739.2A (priority application: CN201911146739.2A)
Original language: Chinese (zh)
Inventors: 杨洋 (Yang Yang), 陈文龙 (Chen Wenlong), 王俊 (Wang Jun), 杨鹏斌 (Yang Pengbin)
Applicant and current assignee: Beijing Baidu Netcom Science and Technology Co Ltd
Legal status: Granted; active
Prior art keywords: pose, positioning, current, vehicle navigation, vehicle

Classifications

    • G - Physics
      • G01 - Measuring; testing
        • G01C - Measuring distances, levels or bearings; surveying; navigation; gyroscopic instruments; photogrammetry or videogrammetry
          • G01C 21/00 - Navigation; navigational instruments not provided for in groups G01C 1/00-G01C 19/00
            • G01C 21/26 - specially adapted for navigation in a road network
              • G01C 21/28 - with correlation of data from several navigational instruments
              • G01C 21/34 - Route searching; route guidance
                • G01C 21/3407 - specially adapted for specific applications
        • G01S - Radio direction-finding; radio navigation; determining distance or velocity by use of radio waves; locating or presence-detecting by use of the reflection or reradiation of radio waves; analogous arrangements using other waves
          • G01S 19/00 - Satellite radio beacon positioning systems; determining position, velocity or attitude using signals transmitted by such systems
            • G01S 19/38 - Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
              • G01S 19/39 - the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS, GLONASS or GALILEO
                • G01S 19/42 - Determining position
                  • G01S 19/45 - Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement


Abstract

The application discloses a vehicle positioning method, device, equipment and storage medium, which can be used in the field of automatic driving, in particular autonomous parking. The method is applied to an electronic device that is in communication connection with a vehicle. The vehicle is equipped with a plurality of perception sensors, including at least a vision sensor, used to determine a visual lane line and a vehicle vision positioning pose, and a GNSS sensor, used to determine a vehicle navigation positioning pose. The method comprises: acquiring a current visual lane line; judging, according to the current visual lane line, whether the current state of the vision sensor is a failure state; if the current state of the vision sensor is determined to be a failure state, acquiring the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose; and calculating the current vehicle fusion positioning pose from the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose.

Description

Vehicle positioning method, device, equipment and storage medium
Technical Field
The application relates to the technical field of data processing, and in particular to automatic driving technology.
Background
As artificial intelligence technology has matured, automatic driving technology has developed rapidly, and its requirements for lane-level positioning have become increasingly stringent. In lane-level positioning based on multi-sensor fusion, the vision sensor, matched against a high-precision map, provides the lane-level lateral position constraint of the vehicle, and therefore plays an important role in the technology.
However, the vision sensor can enter a failure state because of false detections or missed detections in scenes with blurred or missing lane lines. When the vision sensor is in a failure state, it cannot provide lane-level lateral position constraints for the vehicle.
In the prior art, when the vision sensor is in a failure state, the vehicle fusion positioning pose is not compensated; instead, the remaining non-vision sensors are used directly for fusion positioning. The resulting pose then only meets the positioning accuracy of the GNSS sensor rather than the lane-level requirement, which produces large positioning error fluctuations, causes pronounced jerkiness while the vehicle is driving, and significantly harms the safety of automatic driving and the riding experience.
Disclosure of Invention
The embodiments of the application provide a vehicle positioning method, device, equipment and storage medium to solve the prior-art problem that the lane-level positioning requirement cannot be met when the vision sensor fails, producing large positioning error fluctuations, pronounced jerkiness during driving, and a significant impact on the safety of automatic driving and the riding experience.
A first aspect of the embodiments of the application provides a vehicle positioning method. The method is applied to an electronic device in communication connection with a vehicle. The vehicle is equipped with a plurality of perception sensors, including at least a vision sensor, used to determine a visual lane line and a vehicle vision positioning pose, and a GNSS sensor, used to determine a vehicle navigation positioning pose. The method comprises:
acquiring a current visual lane line; judging whether the current state of the vision sensor is a failure state according to the current visual lane line; if the current state of the vision sensor is determined to be a failure state, acquiring the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose; and calculating the current vehicle fusion positioning pose from the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose.
In the embodiments of the application, once a GNSS sensor has completed positioning, its positioning under an open-sky scene is very stable: although an absolute positioning error exists, the relative error with respect to the vehicle fusion positioning pose over a short period is very small. This property is exploited to accurately determine the vehicle navigation offset pose corresponding to the current vehicle navigation positioning pose. The positioning error of that offset pose is small enough to meet the lane-level positioning requirement, so the current vehicle fusion positioning pose calculated from it also meets the lane-level requirement. The method thus effectively reduces positioning error fluctuations when the vision sensor fails, so that the vehicle does not exhibit pronounced jerkiness while driving, improving the safety of automatic driving and the riding experience.
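To make the data flow concrete, the following is a minimal sketch of this failure-compensation branch. All names (Pose2D, fuse_vision_and_gnss, current_fusion_pose) are our own illustrations, not from the patent, and the valid-branch fusion is a trivial placeholder for the real multi-sensor filter:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Pose2D:
    x: float      # meters, map frame
    y: float      # meters, map frame
    yaw: float    # radians

    def __sub__(self, other: "Pose2D") -> "Pose2D":
        # Component-wise difference; a full implementation would wrap yaw.
        return Pose2D(self.x - other.x, self.y - other.y, self.yaw - other.yaw)

def fuse_vision_and_gnss(vision_pose: Pose2D, gps_pose: Pose2D) -> Pose2D:
    # Placeholder for the real fusion filter: here it simply averages.
    return Pose2D((vision_pose.x + gps_pose.x) / 2,
                  (vision_pose.y + gps_pose.y) / 2,
                  (vision_pose.yaw + gps_pose.yaw) / 2)

def current_fusion_pose(vision_failed: bool,
                        gps_pose: Pose2D,
                        stored_bias: Pose2D,
                        vision_pose: Optional[Pose2D]) -> Pose2D:
    if vision_failed:
        # Failure branch: compensate the GNSS pose with the offset (bias)
        # saved in the most recent period when the vision sensor was valid.
        return gps_pose - stored_bias
    # Valid branch: fuse vision and GNSS as usual.
    return fuse_vision_and_gnss(vision_pose, gps_pose)
```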
Further, the method for determining whether the current state of the visual sensor is a failure state according to the current visual lane line includes:
calculating the matching degree of the current visual lane line and the corresponding high-precision map lane line; if the matching degree is smaller than a preset matching degree threshold value, determining that the current state of the visual sensor is a failure state; and if the matching degree is greater than or equal to the preset matching degree threshold value, determining that the current state of the visual sensor is an effective state.
In the embodiments of the application, because the high-precision map lane line is an accurate lane line, matching the current visual lane line acquired by the vision sensor against the corresponding high-precision map lane line makes it possible to accurately determine whether the vision sensor is currently in a scene with blurred or missing lane lines, and hence whether its current state is a failure state.
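The patent does not fix the matching metric, so the sketch below is one assumed realization: it scores the fraction of visual lane-line points lying close to the corresponding high-precision-map points, where visual_pts and map_pts are assumed to be N x 2 arrays sampled at matched positions:

```python
import numpy as np

def matching_degree(visual_pts: np.ndarray, map_pts: np.ndarray,
                    tol_m: float = 0.5) -> float:
    """Fraction of sampled visual lane-line points within tol_m meters
    of the corresponding high-precision-map points (assumed metric)."""
    dists = np.linalg.norm(visual_pts - map_pts, axis=1)
    return float(np.mean(dists <= tol_m))

def vision_failed(visual_pts: np.ndarray, map_pts: np.ndarray,
                  threshold: float = 0.90) -> bool:
    # Failure state if the matching degree is below the preset threshold;
    # 90% is one of the example values mentioned later in the text.
    return matching_degree(visual_pts, map_pts) < threshold
```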
Further, the method as described above, if it is determined that the current state of the vision sensor is the failure state, acquiring a current vehicle navigation positioning pose and a corresponding vehicle navigation offset pose, including:
if the current state of the vision sensor is determined to be a failure state, judging whether the failure time of the vision sensor is within a preset time period range; and if the failure time is determined to be within the preset time period range, acquiring the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose.
In the embodiments of the application, after the current state of the vision sensor is determined to be a failure state, it is judged whether the failure time of the vision sensor is within the preset time period range. If it is, the relative positioning error of the GNSS sensor can be assumed to still be very small, so the current vehicle fusion positioning pose can be calculated from the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose, and the result still meets the lane-level positioning requirement.
Further, the method as described above, the acquiring corresponding vehicle navigation offset poses, comprising:
acquiring a historical vehicle fusion positioning pose and a historical vehicle navigation positioning pose in a historical positioning period closest to the current positioning period; the state of the vision sensor in the historical positioning period closest to the current positioning period is an effective state; calculating historical vehicle navigation offset poses according to the historical vehicle fusion positioning poses and the historical vehicle navigation positioning poses; determining the historical vehicle navigation bias poses as the corresponding vehicle navigation bias poses.
In the embodiment of the application, the acquired corresponding vehicle navigation offset pose is calculated according to the historical vehicle fusion positioning pose and the historical vehicle navigation positioning pose in the historical positioning period closest to the current positioning period. And the state of the vision sensor is an active state in a historical positioning period closest to the current positioning period. Therefore, the positioning error of the corresponding vehicle navigation offset pose is very small, and the lane-level positioning requirement can be met, so that the calculated current vehicle fusion positioning pose can also meet the lane-level positioning requirement.
Further, the method as described above, the calculating historical vehicle navigation bias poses from the historical vehicle fusion positioning poses and the historical vehicle navigation positioning poses, comprising:
calculating a first difference value between the historical vehicle navigation positioning pose and the historical vehicle fusion positioning pose; determining the first difference value as a historical vehicle navigation bias pose.
In the embodiment of the application, the historical vehicle navigation offset pose is an offset pose between the vehicle navigation positioning pose and the vehicle fusion positioning pose in the historical positioning period. Therefore, a first difference value between the historical vehicle navigation positioning pose and the historical vehicle fusion positioning pose is calculated; and determining the first difference value as the historical vehicle navigation offset pose, so that the historical vehicle navigation offset pose can be accurately calculated.
Further, the method as described above, the calculating a current vehicle fusion localization pose from the current vehicle navigation localization pose and the corresponding vehicle navigation bias pose, comprising:
calculating a second difference value between the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose; and determining the second difference value as the current vehicle fusion positioning pose.
In the embodiment of the application, the corresponding vehicle navigation offset pose is an offset pose between the current vehicle navigation positioning pose and the current vehicle fusion positioning pose. Calculating a second difference value between the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose; and determining the second difference as the current vehicle fusion positioning pose, so that the current vehicle fusion positioning pose can be accurately calculated.
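In compact notation (ours, not the patent's), with \(p^{\mathrm{gps}}\) the vehicle navigation positioning pose, \(p^{\mathrm{fuse}}\) the vehicle fusion positioning pose, and \(b\) the vehicle navigation offset pose, the first and second differences read:

\[
b_K = p^{\mathrm{gps}}_K - p^{\mathrm{fuse}}_K,
\qquad
p^{\mathrm{fuse}}_{K+n} = p^{\mathrm{gps}}_{K+n} - b_K,
\]

which is justified because \(b_{K+n} \approx b_K\) as long as the failure remains within the preset time period range.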
Further, in the method as described above, if the state of the vision sensor is a valid state, the method further includes:
calculating the current vehicle fusion positioning pose according to the current vehicle vision positioning pose and the current vehicle navigation positioning pose; calculating the current vehicle navigation offset pose according to the current vehicle fusion positioning pose and the current vehicle navigation positioning pose; and determining the current vehicle navigation offset pose as the corresponding vehicle navigation offset pose to use when the state of the vision sensor is a failure state in a future positioning period, the future positioning period being a positioning period that has not yet occurred and lies within a preset time period range of the current positioning period.
In the embodiments of the application, when the state of the vision sensor is a valid state, in addition to calculating the current vehicle fusion positioning pose from the current vehicle vision positioning pose and the current vehicle navigation positioning pose, the current vehicle navigation offset pose is also calculated from the current vehicle fusion positioning pose and the current vehicle navigation positioning pose, and is stored as the vehicle navigation offset pose to use when the vision sensor fails in a future positioning period. Because the relative positioning error of the GNSS sensor over a short period is very small, the offset pose of any positioning period within the preset time period range of the current positioning period is approximately equal to the current offset pose, so the vehicle fusion positioning pose of such a future positioning period can be accurately calculated.
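A small bookkeeping sketch of this valid-state branch, reusing the illustrative Pose2D type from the earlier sketch (class and method names are our assumptions, not the patent's):

```python
import time
from typing import Optional

class BiasStore:
    """Remembers the navigation offset pose from the last valid cycle."""
    def __init__(self) -> None:
        self.bias: Optional[Pose2D] = None  # offset from last valid cycle
        self.valid_time: float = 0.0        # timestamp of that cycle

    def update_on_valid_cycle(self, fused: Pose2D, gps: Pose2D,
                              t: Optional[float] = None) -> None:
        # Current vehicle navigation offset pose = navigation - fusion.
        self.bias = gps - fused
        self.valid_time = time.time() if t is None else t
```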
A second aspect of the embodiments of the application provides a vehicle positioning device. The device is located in an electronic device in communication connection with a vehicle. The vehicle is equipped with a plurality of perception sensors, including at least a vision sensor, used to determine a visual lane line and a vehicle vision positioning pose, and a GNSS sensor, used to determine a vehicle navigation positioning pose. The device comprises:
the lane line acquisition module is used for acquiring a current visual lane line; the state judgment module is used for judging whether the current state of the vision sensor is a failure state or not according to the current vision lane line; the offset pose acquisition module is used for acquiring a current vehicle navigation positioning pose and a corresponding vehicle navigation offset pose if the current state of the vision sensor is determined to be a failure state; and the fusion pose calculation module is used for calculating the current vehicle fusion positioning pose according to the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose.
Further, in the apparatus described above, the state determination module is specifically configured to:
calculating the matching degree of the current visual lane line and the corresponding high-precision map lane line; if the matching degree is smaller than a preset matching degree threshold value, determining that the current state of the visual sensor is a failure state; and if the matching degree is greater than or equal to the preset matching degree threshold value, determining that the current state of the visual sensor is an effective state.
Further, in the apparatus described above, the offset pose acquisition module is specifically configured to:
if the current state of the vision sensor is determined to be a failure state, judge whether the failure time of the vision sensor is within a preset time period range; and if the failure time is determined to be within the preset time period range, acquire the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose.
Further, in the apparatus as described above, the offset pose acquisition module, when acquiring the corresponding vehicle navigation offset pose, is specifically configured to:
acquiring a historical vehicle fusion positioning pose and a historical vehicle navigation positioning pose in a historical positioning period closest to the current positioning period; the state of the vision sensor in the historical positioning period closest to the current positioning period is an effective state; calculating historical vehicle navigation offset poses according to the historical vehicle fusion positioning poses and the historical vehicle navigation positioning poses; determining the historical vehicle navigation bias poses as the corresponding vehicle navigation bias poses.
Further, in the apparatus as described above, the offset pose acquisition module, when calculating historical vehicle navigation offset poses according to the historical vehicle fusion positioning poses and the historical vehicle navigation positioning poses, is specifically configured to:
calculating a first difference value between the historical vehicle navigation positioning pose and the historical vehicle fusion positioning pose; determining the first difference value as a historical vehicle navigation bias pose.
Further, the apparatus as described above, the fusion pose calculation module is specifically configured to:
calculating a second difference value between the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose; and determining the second difference value as the current vehicle fusion positioning pose.
Further, the apparatus as described above further comprises an offset pose calculation module and an offset pose determination module:
the fusion pose calculation module is further used to calculate the current vehicle fusion positioning pose according to the current vehicle vision positioning pose and the current vehicle navigation positioning pose if the state of the vision sensor is a valid state; the offset pose calculation module is used to calculate the current vehicle navigation offset pose according to the current vehicle fusion positioning pose and the current vehicle navigation positioning pose; and the offset pose determination module is used to determine the current vehicle navigation offset pose as the corresponding vehicle navigation offset pose to use when the state of the vision sensor is a failure state in a future positioning period, the future positioning period being a positioning period that has not yet occurred and lies within a preset time period range of the current positioning period.
A third aspect of the embodiments of the application provides an electronic device, comprising: at least one processor; and a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of the first aspects.
A fourth aspect of embodiments of the present application provides a non-transitory computer-readable storage medium having stored thereon computer instructions for causing a computer to perform the method of any one of the first aspects.
A fifth aspect of embodiments of the present application provides a computer program comprising program code for performing the method according to the first aspect when the computer program is run by a computer.
Drawings
The drawings are included to provide a better understanding of the present solution and are not intended to limit the present application. Wherein:
FIG. 1 is a diagram of an application scenario of the vehicle positioning method according to an embodiment of the present application;
FIG. 2 is a schematic flow chart of a vehicle positioning method provided according to a first embodiment of the present application;
FIG. 3 is a schematic flow chart of a vehicle positioning method provided according to a second embodiment of the present application;
FIG. 4 is a schematic flow chart of step 202 in the vehicle positioning method according to the second embodiment of the present application;
FIG. 5 is a schematic diagram of an implementation of step 203 in the vehicle positioning method according to the second embodiment of the present application;
FIG. 6 is a schematic flow chart of step 207 in the vehicle positioning method according to the second embodiment of the present application;
FIG. 7 is a schematic flow chart of step 2072 in the vehicle positioning method according to the second embodiment of the present application;
FIG. 8 is a schematic flow chart of step 208 in the vehicle positioning method according to the second embodiment of the present application;
FIG. 9 is a schematic flow chart of a vehicle positioning method provided according to a third embodiment of the present application;
FIG. 10 is a schematic diagram of an implementation of the vehicle positioning method provided according to the third embodiment of the present application;
FIG. 11 is a signaling flow diagram of a vehicle positioning method provided according to a fourth embodiment of the present application;
FIG. 12 is a schematic structural diagram of a vehicle positioning device according to a fifth embodiment of the present application;
FIG. 13 is a schematic structural diagram of a vehicle positioning device according to a sixth embodiment of the present application;
FIG. 14 is a block diagram of an electronic device for implementing the vehicle positioning method of an embodiment of the present application.
Detailed Description
The following description of exemplary embodiments of the present application, taken in conjunction with the accompanying drawings, includes various details of the embodiments to aid understanding, and these details are to be considered exemplary only. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the present application. Likewise, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
To clearly understand the technical solution of the present application, the application scenario of the vehicle positioning method provided in the embodiments is described first. As shown in fig. 1, the scenario includes an electronic device and a vehicle in communication connection with each other. The vehicle is equipped with a plurality of perception sensors, including at least a vision sensor and a GNSS sensor. The vision sensor is used to determine a visual lane line and a vehicle vision positioning pose; the GNSS sensor is used to determine a vehicle navigation positioning pose. To calculate the vehicle fusion positioning pose more accurately, the sensors may further include IMU-type sensors, odometers, and the like.

The electronic device acquires the data collected by these sensors by communicating with them. Specifically, the electronic device first communicates with the vision sensor to obtain the current visual lane line determined by it, and then judges, according to the current visual lane line, whether the current state of the vision sensor is a failure state. If so, the vision sensor may be in a scene with blurred or missing lane lines, and the confidence of the determined current visual lane line and vehicle vision positioning pose is low. The vision sensor is therefore no longer used to provide the lane-level lateral position constraint for calculating the current vehicle fusion positioning pose. Instead, the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose are acquired, where the corresponding vehicle navigation offset pose is the offset between the current vehicle navigation positioning pose and the current vehicle fusion positioning pose, and the current vehicle fusion positioning pose is calculated from these two quantities.

Once a GNSS sensor has completed positioning, its positioning under an open-sky scene is very stable: although an absolute positioning error exists, the relative error with respect to the vehicle fusion positioning pose over a short period is very small. Exploiting this property, the corresponding vehicle navigation offset pose is calculated from the historical vehicle fusion positioning pose and the historical vehicle navigation positioning pose of the historical positioning period closest to the current positioning period, during which the state of the vision sensor was a valid state. The positioning error of the corresponding vehicle navigation offset pose is therefore very small and meets the lane-level positioning requirement, so the calculated current vehicle fusion positioning pose also meets the lane-level positioning requirement.
The method can effectively reduce positioning error fluctuations when the vision sensor fails, so that the vehicle does not exhibit pronounced jerkiness while driving, improving the safety of automatic driving and the riding experience.
It is to be understood that the electronic device may be an in-vehicle terminal integrated on the vehicle so as to be able to communicate with it. The communication mode may be Global System for Mobile Communications (GSM), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Time Division-Synchronous Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), or 5G. The communication mode between the vehicle and the electronic device may also be short-range wireless communication, such as ZigBee, Bluetooth Low Energy (BLE), or Wi-Fi via a mobile hotspot.
Embodiments of the present application will be described below in detail with reference to the accompanying drawings.
Example one
Fig. 2 is a schematic flow chart of a vehicle positioning method according to the first embodiment of the present application. As shown in fig. 2, the execution subject of this embodiment is a vehicle positioning device, which may be integrated in an electronic device. The vehicle positioning method provided by this embodiment includes the following steps.
Step 101, obtaining a current visual lane line.
In this embodiment, the electronic device is communicatively connected to the vision sensor, which is mounted on a vehicle traveling on a road. The vision sensor periodically positions the pose of the vehicle according to its positioning frequency and collects information about the vehicle's surroundings. The pose positioned by the vision sensor is the vehicle vision positioning pose, and if the collected surroundings information includes a lane line, that lane line is the visual lane line. The electronic device periodically receives the visual lane line and the vehicle vision positioning pose by communicating with the vision sensor; the visual lane line received in the current positioning period is the current visual lane line, and the vehicle vision positioning pose received in the current positioning period is the current vehicle vision positioning pose.
The vision sensor may be a laser scanner, a linear-array or charge-coupled-device camera (CCD camera), a digital camera, or another type of vision sensor, which is not limited in this embodiment.
And 102, judging whether the current state of the visual sensor is a failure state or not according to the current visual lane line.
In this embodiment, since the vision sensor cannot accurately detect the current visual lane line in scenes with blurred or missing lane lines, one optional implementation judges whether the current state of the vision sensor is a failure state according to whether the current visual lane line is complete and clear: if the completeness and clarity of the current visual lane line reach preset standards, the current state of the vision sensor is determined to be a valid state; otherwise, it is determined to be a failure state.
It can be understood that, as another optional implementation, judging whether the current state of the vision sensor is a failure state according to the current visual lane line may also be done as follows: match the current visual lane line against the corresponding high-precision map lane line to determine the confidence of the current visual lane line, and compare that confidence with a preset confidence threshold. If the confidence is greater than the preset threshold, the current visual lane line is credible and the current state of the vision sensor is determined to be a valid state; if the confidence is less than or equal to the preset threshold, the current visual lane line is not credible and the current state of the vision sensor is determined to be a failure state.
In this embodiment, if the current state of the vision sensor is the failure state, the vehicle vision positioning pose determined by the vision sensor cannot be adopted to provide lane-level lateral position constraint for determining the current vehicle fusion positioning pose. On the contrary, if the current state of the vision sensor is an effective state, the vehicle vision positioning pose determined by the vision sensor can be adopted to provide lane-level transverse position constraint for determining the current vehicle fusion positioning pose.
And 103, if the current state of the vision sensor is determined to be a failure state, acquiring the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose.
In this embodiment, if the current state of the vision sensor is a failure state, the current vehicle fusion positioning pose cannot be determined from the current vehicle vision positioning pose given by the vision sensor together with the current vehicle navigation positioning pose given by the GNSS sensor. Instead, the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose must be obtained, so that the current vehicle fusion positioning pose can be calculated from them.
And the vehicle navigation offset pose corresponding to the current vehicle navigation positioning pose is an offset pose between the current vehicle navigation positioning pose and the current vehicle fusion positioning pose.
In this embodiment, because the relative positioning error of the GNSS sensor over a short period is small and its positioning is stable, the historical vehicle navigation offset pose determined in the historical positioning period closest to the current positioning period, when the state of the vision sensor was a valid state, can be used as the vehicle navigation offset pose corresponding to the current vehicle navigation positioning pose.
The GNSS sensor may be a receiver for the Global Positioning System (GPS), the GLONASS navigation satellite system, the Galileo satellite navigation system, the BeiDou satellite navigation system, or the like.
And 104, calculating the current vehicle fusion positioning pose according to the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose.
In this embodiment, since the vehicle navigation offset pose corresponding to the current vehicle navigation positioning pose is an offset pose between the current vehicle navigation positioning pose and the current vehicle fusion positioning pose, the current vehicle fusion positioning pose can be determined by calculating a difference between the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose.
In the vehicle positioning method provided by this embodiment, the current visual lane line is obtained; whether the current state of the vision sensor is a failure state is judged according to the current visual lane line; if the current state of the vision sensor is determined to be a failure state, the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose are acquired; and the current vehicle fusion positioning pose is calculated from the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose. Once a GNSS sensor has completed positioning, its positioning under an open-sky scene is very stable: although an absolute positioning error exists, the relative error with respect to the vehicle fusion positioning pose over a short period is very small. Exploiting this property, the vehicle navigation offset pose corresponding to the current vehicle navigation positioning pose can be accurately determined; its positioning error is small enough to meet the lane-level positioning requirement, so the calculated current vehicle fusion positioning pose also meets the lane-level requirement. The method can effectively reduce positioning error fluctuations when the vision sensor fails, so that the vehicle does not exhibit pronounced jerkiness while driving, improving the safety of automatic driving and the riding experience.
Example two
Fig. 3 is a schematic flow chart of a vehicle positioning method according to the second embodiment of the present application. As shown in fig. 3, the method of this embodiment further details steps 102 to 104 of the method of the first embodiment and adds a step of calculating the current vehicle navigation offset pose when the current state of the vision sensor is determined to be a valid state. The vehicle positioning method provided by this embodiment includes the following steps.
Step 201, obtaining a current visual lane line.
In this embodiment, the implementation manner of step 201 is the same as that of step 101 in the embodiment shown in fig. 2 of this application, and is not described in detail here.
Step 202, judging whether the current state of the vision sensor is a failure state according to the current vision lane line, if not, executing step 203, otherwise, executing step 206.
Further, in this embodiment, as shown in fig. 4, step 202 includes the following steps:
step 2021, calculating the matching degree between the current visual lane line and the corresponding high-precision map lane line.
Further, in this embodiment, a high-precision map is stored in the electronic device, and the high-precision map includes lane line information. The high-precision map lane line corresponding to the current visual lane line can be determined according to the position information of the current visual lane line, and the corresponding high-precision map lane line can be obtained. And matching the current visual lane line with the corresponding high-precision map lane line to obtain the matching degree of the current visual lane line and the corresponding high-precision map lane line.
When matching the current visual lane line against the corresponding high-precision map lane line, both lane lines can be input into a matching algorithm, which calculates and outputs the matching degree between them.
Step 2022, if the matching degree is smaller than the preset matching degree threshold, determining that the current state of the visual sensor is a failure state.
Step 2023, if the matching degree is greater than or equal to the preset matching degree threshold, determining that the current state of the visual sensor is an effective state.
The preset matching degree threshold may be 90%, 95%, or other suitable values, which is not limited in this embodiment.
In this embodiment, since the high-precision map lane line is an accurate lane line, a matching degree below the preset threshold indicates that the confidence of the current visual lane line is low and that the vision sensor is very likely in a scene with blurred or missing lane lines, so the current state of the vision sensor is determined to be a failure state. If the matching degree is greater than or equal to the preset threshold, the confidence of the current visual lane line is high, the vision sensor is judged able to detect the lane line accurately, and its current state is determined to be a valid state.
In this embodiment, because the high-precision map lane line is an accurate lane line, whether the current vision sensor is in a scene such as a lane line blur or a lane line loss can be accurately determined by matching the current vision lane line acquired by the vision sensor with the corresponding high-precision map lane line, and then whether the current state of the vision sensor is in a failure state can be accurately determined.
And 203, calculating the fusion positioning pose of the current vehicle according to the vision positioning pose of the current vehicle and the navigation positioning pose of the current vehicle.
Further, as shown in fig. 5, if the state of the vision sensor is a valid state, the current vehicle fusion positioning pose is calculated from the current vehicle vision positioning pose and the current vehicle navigation positioning pose. As indicated by the circle in fig. 5, the noise of the GNSS sensor is Gaussian with consistent errors in all directions, so the current vehicle navigation positioning pose provides the longitudinal position constraint for the current fusion positioning pose. As indicated by the ellipse in fig. 5, the noise of the vision sensor has a small lateral error and a large longitudinal error, so the current vehicle vision positioning pose provides the lateral position constraint. The current vehicle fusion positioning pose is then obtained by further fusing these constraints with data determined by other sensors.
The other sensors include IMU-type sensors, odometers, and the like.
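As a toy illustration of these complementary constraints (our own simplification, far cruder than the real fusion filter): decompose both measurements in the lane frame, take the longitudinal component from GNSS and the lateral component from vision, and transform back:

```python
import numpy as np

def fuse_in_lane_frame(vision_xy: np.ndarray, gps_xy: np.ndarray,
                       lane_heading: float) -> np.ndarray:
    """Trust GNSS longitudinally and vision laterally (toy example)."""
    c, s = np.cos(lane_heading), np.sin(lane_heading)
    world_to_lane = np.array([[c, s],    # row 1: longitudinal axis
                              [-s, c]])  # row 2: lateral axis
    lon = (world_to_lane @ gps_xy)[0]     # longitudinal from GNSS
    lat = (world_to_lane @ vision_xy)[1]  # lateral from vision
    return world_to_lane.T @ np.array([lon, lat])  # back to world frame
```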
And 204, calculating the current vehicle navigation offset pose according to the current vehicle fusion positioning pose and the current vehicle navigation positioning pose.
And the current vehicle navigation offset pose is an offset pose between the vehicle navigation positioning pose and the vehicle fusion positioning pose in the current positioning period.
Therefore, in this embodiment, the difference between the current vehicle navigation positioning pose and the current vehicle fusion positioning pose is calculated, and the difference is determined as the current vehicle navigation offset pose.
Step 205, determining the current vehicle navigation offset pose as the corresponding vehicle navigation offset pose to use when the state of the vision sensor is a failure state in a future positioning period; the future positioning period is a positioning period that has not yet occurred and lies within a preset time period range of the current positioning period.
Further, in this embodiment, since the relative positioning error of the GNSS sensor over a short period is small, the vehicle navigation offset pose of any positioning period within the preset time period range of the current positioning period is approximately equal to the current vehicle navigation offset pose. The current vehicle navigation offset pose is therefore stored as the offset pose to use when the vision sensor fails in a future positioning period, so that the vehicle fusion positioning pose of that period can be calculated from its vehicle navigation positioning pose and the current vehicle navigation offset pose.
The preset time period range is the range over which the relative positioning error of the GNSS sensor remains small; it may be, for example, 20 minutes, 30 minutes, or another suitable value, which is not limited in this embodiment.
Step 206, determining whether the failure time of the vision sensor is within a preset time period range, if so, executing step 207, otherwise, executing step 209.
Further, in this embodiment, since the GNSS sensor has a small relative positioning error only over a short period, once the current state of the vision sensor is determined to be a failure state, it is judged whether the failure time of the vision sensor is within the preset time period range. If it is, the relative positioning error of the GNSS sensor can be assumed to still be small, and the current vehicle fusion positioning pose can be calculated from the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose. If it is not, the relative positioning error of the GNSS sensor can no longer be guaranteed to be small, and the current vehicle fusion positioning pose is calculated from the current vehicle navigation positioning pose alone.
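A sketch of this decision, reusing the illustrative BiasStore from the earlier sketch (names are our assumptions; the window value is one of the examples from the text, not a fixed constant):

```python
PRESET_WINDOW_S = 30 * 60  # e.g. 30 minutes; 20 minutes is also cited

def positioning_strategy(now_s: float, store: "BiasStore") -> str:
    """Pick the branch for the current cycle when vision has failed."""
    if store.bias is not None and (now_s - store.valid_time) <= PRESET_WINDOW_S:
        return "gnss_minus_stored_bias"   # steps 207-208: compensate with bias
    return "gnss_fused_with_non_vision"   # step 209: fuse GNSS with other sensors
```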
And step 207, acquiring the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose.
Further, in this embodiment, the electronic device communicates with the GNSS sensor, which is mounted on the vehicle traveling on the road. The GNSS sensor periodically positions the pose of the vehicle according to its positioning frequency; the pose it determines is the vehicle navigation positioning pose. The electronic device periodically receives the vehicle navigation positioning pose by communicating with the GNSS sensor, and the one received in the current positioning period is the current vehicle navigation positioning pose.
Further, in this embodiment, as shown in fig. 6, the acquiring a corresponding vehicle navigation offset pose in step 207 includes the following steps:
step 2071, acquiring a historical vehicle fusion positioning pose and a historical vehicle navigation positioning pose in a historical positioning period closest to the current positioning period; the state of the vision sensor in the historical positioning period closest to the current positioning period is the effective state.
Further, in this embodiment, in order to obtain the vehicle navigation offset pose corresponding to the current vehicle navigation positioning pose, it is necessary to obtain the historical vehicle fusion positioning pose and the historical vehicle navigation positioning pose when the state of the vision sensor in the historical positioning period closest to the current positioning period is the valid state, and calculate the historical vehicle navigation offset pose according to the historical vehicle fusion positioning pose and the historical vehicle navigation positioning pose. Therefore, the historical vehicle navigation offset pose is determined as the vehicle navigation offset pose corresponding to the current vehicle navigation positioning pose.
And 2072, calculating historical vehicle navigation offset poses according to the historical vehicle fusion positioning poses and the historical vehicle navigation positioning poses.
As an alternative embodiment, as shown in fig. 7, step 2072 comprises the steps of:
step 2072a, calculate a first difference between the historical vehicle navigation and positioning poses and the historical vehicle fusion and positioning poses.
Step 2072b, determine the first difference as the historical vehicle navigation offset pose.
Specifically, in this embodiment, because the vision sensor was in a valid state in the historical positioning period closest to the current positioning period, the historical vehicle fusion positioning pose was accurately calculated then and can now be acquired. The difference between the historical vehicle navigation positioning pose and the historical vehicle fusion positioning pose is calculated and denoted the first difference, which is determined as the historical vehicle navigation offset pose.
Since both the historical vehicle fusion positioning pose and the historical vehicle navigation positioning pose can be represented as vectors, the historical vehicle navigation offset pose can likewise be represented as a vector.
Step 2073, determine the historical vehicle navigation offset poses as corresponding vehicle navigation offset poses.
In this embodiment, the obtained corresponding vehicle navigation offset pose is calculated according to the historical vehicle fusion positioning pose and the historical vehicle navigation positioning pose in the historical positioning period closest to the current positioning period. And the state of the vision sensor is an active state in a historical positioning period closest to the current positioning period. Therefore, the positioning error of the corresponding vehicle navigation offset pose is very small, and the lane-level positioning requirement can be met, so that the calculated current vehicle fusion positioning pose can also meet the lane-level positioning requirement.
And 208, calculating the fusion positioning pose of the current vehicle according to the navigation positioning pose of the current vehicle and the corresponding navigation offset pose of the vehicle.
Further, in this embodiment, as shown in fig. 8, step 208 includes the following steps:
step 2081, calculating a second difference value between the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose.
And 2082, determining the second difference as the fusion positioning pose of the current vehicle.
Further, in this embodiment, the difference value between the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose is represented as a second difference value, and the second difference value is determined as the current vehicle fusion positioning pose.
In this embodiment, after the current state of the vision sensor is determined to be a failure state, it is judged whether the failure time of the vision sensor is within the preset time period range. If it is, the relative positioning error of the GNSS sensor can be assumed to still be very small, so the current vehicle fusion positioning pose can be calculated from the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose, and the result meets the lane-level positioning requirement.
And 209, calculating the fusion positioning pose of the current vehicle according to the navigation positioning pose of the current vehicle.
Further, in this embodiment, if the failure time of the vision sensor is not within the preset time period, it cannot be guaranteed that the relative positioning error of the GNSS sensor is still small, and the current vehicle navigation positioning pose is directly fused with data acquired by other non-vision sensors, so as to calculate the current vehicle fusion positioning pose.
Example three
Fig. 9 is a schematic flow chart of a vehicle positioning method according to the third embodiment of the present application, and fig. 10 is a schematic diagram of an implementation of that method. As shown in figs. 9 and 10, this embodiment is described with an example in which the state of the vision sensor in the Kth positioning period is a valid state and its state in the (K+1)th through (K+N)th positioning periods is a failure state, where the N positioning periods lie within the preset time period range. The vehicle positioning method provided by this embodiment includes the following steps:
step 301, setting the electronic device and the multiple sensing sensors to adopt the same time reference.
In this embodiment, in order to synchronize the data collected by the multiple perception sensors, all of the sensors use the same time reference: the timestamps attached to the collected data are taken from a common clock, and the electronic device uses that same reference.
For example, all devices may use the local time of the same city.
Step 302, a visual lane line of the kth positioning period is obtained.
In this embodiment, the positioning frequency of the vehicle fusion positioning pose is preset, which determines each positioning period. Because the positioning frequencies of the vision sensor and the GNSS sensor may differ, when acquiring the visual lane line of the Kth positioning period, the time corresponding to the Kth positioning period is determined first, the visual lane lines positioned by the vision sensor are obtained, and the visual lane line whose timestamp matches the Kth positioning period is selected as the visual lane line of the Kth positioning period.
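One simple way (ours, not the patent's) to realize this timestamp matching, assuming lane_lines is a list of (timestamp, lane_line) pairs already received from the vision sensor:

```python
from typing import Any, List, Optional, Tuple

def lane_line_for_cycle(cycle_time: float,
                        lane_lines: List[Tuple[float, Any]],
                        max_dt: float = 0.05) -> Optional[Any]:
    """Return the lane line whose timestamp is nearest to the cycle time,
    or None if the list is empty or nothing lies within max_dt seconds."""
    if not lane_lines:
        return None
    ts, line = min(lane_lines, key=lambda tl: abs(tl[0] - cycle_time))
    return line if abs(ts - cycle_time) <= max_dt else None
```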
And step 303, determining that the state of the vision sensor in the Kth positioning period is an effective state according to the vision lane line of the Kth positioning period.
In this embodiment, an implementation manner of determining that the state of the visual sensor in the kth positioning period is an effective state in step 303 is the same as an implementation manner of determining that the current state of the visual sensor is an effective state in step 202 in the embodiment shown in fig. 3 of the present application, and details are not repeated here.
Step 304, calculating a vehicle fusion positioning pose of the Kth positioning period according to the vehicle vision positioning pose and the vehicle navigation positioning pose of the Kth positioning period; and calculating the vehicle navigation offset pose of the Kth positioning period according to the vehicle fusion positioning pose and the vehicle navigation positioning pose of the Kth positioning period.
The vehicle fusion positioning pose of the Kth positioning period is denoted as $P_{fusion}^{K}$, and the vehicle navigation positioning pose of the Kth positioning period is denoted as $P_{gps}^{K}$. The vehicle navigation offset pose of the Kth positioning period is then expressed as shown in equation (1):

$$P_{bias}^{K} = P_{gps}^{K} - P_{fusion}^{K} \quad (1)$$

wherein $P_{bias}^{K}$ is the vehicle navigation offset pose of the Kth positioning period. It is a vector, as shown in fig. 10, and may also be expressed in coordinates as (gps_bias_x, gps_bias_y).
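As a concrete illustration of equation (1), the minimal sketch below computes the offset pose in the coordinate form (gps_bias_x, gps_bias_y); the 2-D (x, y) tuple representation of a pose and all names are assumptions made for illustration only.

```python
# Equation (1) as code: the navigation offset pose of period K is the
# vector difference between the navigation pose and the fused pose of
# that same period.

def compute_offset_pose(nav_pose, fused_pose):
    gps_bias_x = nav_pose[0] - fused_pose[0]
    gps_bias_y = nav_pose[1] - fused_pose[1]
    return (gps_bias_x, gps_bias_y)

# E.g. if GNSS reports (100.0, 50.0) while the fused pose is
# (99.2, 50.6), the offset cached for period K is (0.8, -0.6).
offset_k = compute_offset_pose((100.0, 50.0), (99.2, 50.6))
```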
Step 305, acquiring the visual lane line of the (K+1)th positioning period.
Step 306, determining, according to the visual lane line of the (K+1)th positioning period, that the state of the vision sensor in the (K+1)th positioning period is the failure state.
In this embodiment, the manner of determining in step 306 that the state of the vision sensor in the (K+1)th positioning period is the failure state is the same as the manner of determining in step 202 of the embodiment shown in fig. 3 that the current state of the vision sensor is the failure state, and details are not repeated here.
Step 307, calculating the vehicle fusion positioning pose of the (K+1)th positioning period according to the vehicle navigation positioning pose of the (K+1)th positioning period and the vehicle navigation offset pose of the Kth positioning period.
Further, in this embodiment, since the state of the vision sensor in the (K+1)th positioning period is the failure state, the historical positioning period closest to the (K+1)th positioning period is the Kth positioning period, and the state of the vision sensor in the Kth positioning period is the effective state. The vehicle navigation offset pose corresponding to the vehicle navigation positioning pose of the (K+1)th positioning period is therefore the vehicle navigation offset pose of the Kth positioning period, and calculating the vehicle fusion positioning pose of the (K+1)th positioning period from these two poses can be expressed as shown in equation (2):
$$P_{fusion}^{K+1} = P_{gps}^{K+1} - P_{bias}^{K} \quad (2)$$

wherein $P_{fusion}^{K+1}$ is the vehicle fusion positioning pose of the (K+1)th positioning period, $P_{gps}^{K+1}$ is the vehicle navigation positioning pose of the (K+1)th positioning period, and $P_{bias}^{K}$ is the vehicle navigation offset pose of the Kth positioning period, which may also be expressed in coordinates as (gps_bias_x, gps_bias_y).
Step 308, acquiring the visual lane line of the (K+N)th positioning period.
Step 309, determining, according to the visual lane line of the (K+N)th positioning period, that the state of the vision sensor in the (K+N)th positioning period is the failure state.
Step 310, calculating the vehicle fusion positioning pose of the (K+N)th positioning period according to the vehicle navigation positioning pose of the (K+N)th positioning period and the vehicle navigation offset pose of the Kth positioning period.
Here N takes the values 2, 3, 4, and so on in turn, until the next positioning period no longer falls within the preset time period range.
Calculating the vehicle fusion positioning pose of the (K+N)th positioning period according to the vehicle navigation positioning pose of the (K+N)th positioning period and the vehicle navigation offset pose of the Kth positioning period can be expressed as shown in equation (3):
$$P_{fusion}^{K+N} = P_{gps}^{K+N} - P_{bias}^{K} \quad (3)$$
as shown in fig. 10, the vehicle fusion positioning pose in the K +1 th positioning period is the difference between the vehicle navigation positioning pose in the K +1 th positioning period and the vehicle navigation offset pose in the K positioning period, and the vehicle fusion positioning pose in the K +2 th positioning period is the difference between the vehicle navigation positioning pose in the K +2 th positioning period and the vehicle navigation offset pose in the K positioning period. And the vehicle fusion positioning pose of the K +3 positioning period is the difference between the vehicle navigation positioning pose of the K +3 positioning period and the vehicle navigation offset pose of the K positioning period.
EXAMPLE IV
Fig. 11 is a signaling flowchart of a vehicle positioning method according to a fourth embodiment of the present application. As shown in fig. 11, the execution subject of the method in this embodiment is a vehicle positioning system, which includes: a vision sensor, a GNSS-type sensor, an electronic device, and a control center of the vehicle. The vehicle positioning method provided by this embodiment includes the following steps:
Step 401, the vision sensor determines the current visual lane line.
Further, in this embodiment, after the positioning frequency of the vehicle fusion positioning pose is set, the electronic device may send the current positioning period to the vision sensor, and the vision sensor determines the visual lane line matched with the current positioning period as the current visual lane line.
Step 402, the vision sensor sends the current vision lane line to the electronic device.
In this embodiment, the visual sensor communicates with the electronic device, and sends the current visual lane line to the electronic device.
Step 403, the electronic device judges, according to the current visual lane line, whether the current state of the vision sensor is the failure state; if not, step 404 is executed; otherwise, step 405 is executed.
Step 404, the electronic device calculates the current vehicle fusion positioning pose according to the current vehicle vision positioning pose and the current vehicle navigation positioning pose, calculates the current vehicle navigation offset pose according to the current vehicle fusion positioning pose and the current vehicle navigation positioning pose, and determines the current vehicle navigation offset pose as the vehicle navigation offset pose to be used in a future positioning period in which the state of the vision sensor is the failure state.
It is understood that after step 404 is performed, step 410 is performed.
Step 405, the electronic device judges whether the failure time of the vision sensor is within the preset time period range; if so, step 406 is executed; otherwise, step 409 is executed.
In this embodiment, the implementation manners of step 404 to step 405 are similar to those of step 202 to step 206 in the embodiment shown in fig. 3 of the present application, and are not described in detail here.
Step 406, the GNSS-type sensor determines the current vehicle navigation positioning pose.
Further, in this embodiment, the electronic device may also send the current positioning period to the GNSS-type sensor, and the GNSS-type sensor determines the vehicle navigation positioning pose matched with the current positioning period as the current vehicle navigation positioning pose.
Step 407, the GNSS-type sensor sends the current vehicle navigation positioning pose to the electronic device.
In this embodiment, the GNSS-type sensor communicates with the electronic device and sends the current vehicle navigation positioning pose to it.
Step 408, the electronic device acquires the vehicle navigation offset pose corresponding to the current vehicle navigation positioning pose, and calculates the current vehicle fusion positioning pose according to the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose.
In this embodiment, the implementation manner of step 408 is similar to the implementation manner of step 207 to step 208 in the embodiment shown in fig. 3 of this application, and is not described again.
It is understood that after step 408 is performed, step 410 is performed.
Step 409, the electronic device calculates the current vehicle fusion positioning pose according to the current vehicle navigation positioning pose.
In this embodiment, the implementation manner of step 409 is similar to that of step 209 in the embodiment shown in fig. 3 of this application, and is not described in detail here.
It is understood that after step 409 is performed, step 410 is performed.
Step 410, the electronic device sends the current vehicle fusion positioning pose to the control center of the vehicle.
Further, in this embodiment, whether the vision sensor is in the effective state or the failure state, the electronic device sends the current vehicle fusion positioning pose to the control center of the vehicle; that is, the current vehicle fusion positioning pose determined in step 404, step 408, or step 409 is sent to the control center of the vehicle.
Step 411, the control center of the vehicle controls the vehicle to travel according to the current vehicle fusion positioning pose.
Further, in this embodiment, the control center of the vehicle plans the vehicle path according to the vehicle fusion positioning pose, and controls the vehicle to travel according to the planned path.
In this embodiment, after the current vehicle fusion positioning pose is calculated, it is sent to the control center of the vehicle, and the control center controls the vehicle to travel according to it. Because the current vehicle fusion positioning pose meets the lane-level positioning requirement whether the vision sensor is in the failure state or the effective state, the path planned by the control center is more accurate, which further improves the safety of automatic driving.
EXAMPLE V
Fig. 12 is a schematic structural diagram of a vehicle positioning apparatus according to a fifth embodiment of the present application. As shown in fig. 12, the vehicle positioning apparatus of this embodiment is located in an electronic device that is in communication connection with a vehicle. The vehicle is equipped with multiple perception sensors, which at least include: a vision sensor for determining the visual lane line and the vehicle vision positioning pose, and a GNSS-type sensor for determining the vehicle navigation positioning pose. The vehicle positioning apparatus 1200 includes: a lane line acquisition module 1201, a state judgment module 1202, an offset pose acquisition module 1203, and a fusion pose calculation module 1204.
The lane line acquisition module 1201 is configured to acquire the current visual lane line. The state judgment module 1202 is configured to judge, according to the current visual lane line, whether the current state of the vision sensor is the failure state. The offset pose acquisition module 1203 is configured to acquire the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose if it is determined that the current state of the vision sensor is the failure state. The fusion pose calculation module 1204 is configured to calculate the current vehicle fusion positioning pose according to the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose.
The vehicle positioning apparatus provided in this embodiment may implement the technical solution of the method embodiment shown in fig. 2, and the implementation principle and technical effect thereof are similar to those of the method embodiment shown in fig. 2, and are not described in detail herein.
EXAMPLE VI
Fig. 13 is a schematic structural diagram of a vehicle positioning apparatus according to a sixth embodiment of the present application, and as shown in fig. 13, the vehicle positioning apparatus 1300 provided in this embodiment further includes, in addition to the vehicle positioning apparatus shown in fig. 12: an offset pose calculation module 1301 and an offset pose determination module 1302.
Further, the state determining module 1202 is specifically configured to:
calculating the matching degree of the current visual lane line and the corresponding high-precision map lane line; if the matching degree is smaller than a preset matching degree threshold value, determining that the current state of the visual sensor is a failure state; and if the matching degree is greater than or equal to the preset matching degree threshold value, determining that the current state of the visual sensor is an effective state.
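The patent does not fix a concrete matching-degree metric, so the sketch below assumes a simple one, the mean point-to-point distance between sampled points of the visual lane line and the map lane line mapped into (0, 1], purely to make the threshold test concrete; the metric, the names, and the threshold value of 0.8 are all assumptions.

```python
import math

# Assumed matching-degree metric: mean distance between corresponding
# sample points of the visual lane line and the map lane line, mapped
# into (0, 1] so that a perfect match scores 1.0.
def lane_match_degree(visual_pts, map_pts):
    dists = [math.dist(v, m) for v, m in zip(visual_pts, map_pts)]
    if not dists:
        return 0.0
    return 1.0 / (1.0 + sum(dists) / len(dists))

# Failure-state test of the state judgment module: below the preset
# threshold the vision sensor is treated as failed for this period.
def is_failure_state(visual_pts, map_pts, threshold=0.8):
    return lane_match_degree(visual_pts, map_pts) < threshold
```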
Further, the offset pose acquisition module 1203 is specifically configured to: if it is determined that the current state of the vision sensor is the failure state, judge whether the failure time of the vision sensor is within the preset time period range; and if the failure time is determined to be within the preset time period range, acquire the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose.
Further, the offset pose acquisition module 1203, when acquiring the corresponding vehicle navigation offset pose, is specifically configured to:
acquiring a historical vehicle fusion positioning pose and a historical vehicle navigation positioning pose in a historical positioning period closest to the current positioning period; the state of the visual sensor in a historical positioning period closest to the current positioning period is an effective state; calculating historical vehicle navigation offset poses according to the historical vehicle fusion positioning poses and the historical vehicle navigation positioning poses; and determining the historical vehicle navigation offset poses as corresponding vehicle navigation offset poses.
Further, the offset pose acquisition module 1203, when calculating the historical vehicle navigation offset poses according to the historical vehicle fusion positioning poses and the historical vehicle navigation positioning poses, is specifically configured to:
calculating a first difference value between the historical vehicle navigation positioning pose and the historical vehicle fusion positioning pose; and determining the first difference value as the historical vehicle navigation offset pose.
Further, the fusion pose calculation module 1204 is specifically configured to:
calculating a second difference value between the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose; and determining the second difference as the current vehicle fusion positioning pose.
Further, the fusion pose calculation module 1204 is further configured to calculate the current vehicle fusion positioning pose according to the current vehicle vision positioning pose and the current vehicle navigation positioning pose if the state of the vision sensor is the effective state. The offset pose calculation module 1301 is configured to calculate the current vehicle navigation offset pose according to the current vehicle fusion positioning pose and the current vehicle navigation positioning pose. The offset pose determination module 1302 is configured to determine the current vehicle navigation offset pose as the corresponding vehicle navigation offset pose for a future positioning period in which the state of the vision sensor is the failure state; a future positioning period is a positioning period that has not yet occurred and is within the preset time period range from the current positioning period.
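Putting the modules of fig. 12 and fig. 13 together, a single positioning period can be sketched as below. The class, the (x, y) pose representation, the fuse callback, and the 5-second window are all assumptions used for illustration; the patent specifies the behavior, not a concrete API.

```python
import time

class VehicleLocator:
    """Illustrative sketch of one positioning period across the modules."""

    def __init__(self, window_s=5.0):
        self.window_s = window_s      # preset time period range (assumed)
        self.offset = None            # offset pose of last effective period
        self.failed_since = None      # when the current failure began

    def update(self, vision_ok, vision_pose, nav_pose, fuse):
        now = time.monotonic()
        if vision_ok:                                # effective state
            self.failed_since = None
            fused = fuse(vision_pose, nav_pose)      # vision + GNSS fusion
            self.offset = (nav_pose[0] - fused[0],   # equation (1)
                           nav_pose[1] - fused[1])
            return fused
        if self.failed_since is None:                # failure just began
            self.failed_since = now
        in_window = (now - self.failed_since) <= self.window_s
        if in_window and self.offset is not None:    # failure inside window
            return (nav_pose[0] - self.offset[0],    # equation (2)
                    nav_pose[1] - self.offset[1])
        return nav_pose                              # fallback: navigation only
```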
The vehicle positioning apparatus provided in this embodiment may execute the technical solutions of the method embodiments shown in fig. 3,4, 6 to 9, and 11, and the implementation principle and technical effects thereof are similar to those of the method embodiments shown in fig. 3,4, 6 to 9, and 11, and are not described again.
According to an embodiment of the present application, an electronic device and a readable storage medium are also provided.
As shown in fig. 14, it is a block diagram of an electronic device for the vehicle positioning method according to an embodiment of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown here, their connections and relationships, and their functions are meant to be examples only and are not meant to limit the implementations of the present application described and/or claimed herein.
As shown in fig. 14, the electronic device includes: one or more processors 1401, a memory 1402, and interfaces for connecting the various components, including a high-speed interface and a low-speed interface. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions for execution within the electronic device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output apparatus (such as a display device coupled to the interface). In other embodiments, multiple processors and/or multiple buses may be used, along with multiple memories, if desired. Also, multiple electronic devices may be connected, with each device providing some of the necessary operations (e.g., as a server array, a group of blade servers, or a multi-processor system). Fig. 14 takes one processor 1401 as an example.
Memory 1402 is a non-transitory computer readable storage medium as provided herein. Wherein the memory stores instructions executable by the at least one processor to cause the at least one processor to perform the method of vehicle localization provided herein. The non-transitory computer readable storage medium of the present application stores computer instructions for causing a computer to perform the method of vehicle localization provided herein.
The memory 1402, as a non-transitory computer readable storage medium, may be used to store non-transitory software programs, non-transitory computer executable programs, and modules, such as the program instructions/modules corresponding to the vehicle positioning method in the embodiments of the present application (for example, the lane line acquisition module 1201, the state judgment module 1202, the offset pose acquisition module 1203, and the fusion pose calculation module 1204 shown in fig. 12). The processor 1401 executes various functional applications of the server and performs data processing, that is, implements the vehicle positioning method in the above method embodiments, by running the non-transitory software programs, instructions, and modules stored in the memory 1402.
The memory 1402 may include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the electronic device of fig. 14, and the like. Further, the memory 1402 may include high-speed random access memory, and may also include non-transitory memory, such as at least one disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, memory 1402 may optionally include memory located remotely from processor 1401, which may be connected to the electronic device of FIG. 14 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device of fig. 14 may further include: an input device 1403 and an output device 1404. The processor 1401, the memory 1402, the input device 1403, and the output device 1404 may be connected by a bus or other means, as exemplified by the bus connection in fig. 14.
The input device 1403 may receive input voice, numeric, or character information and generate key signal inputs related to user settings and function control of the electronic device of fig. 14, such as an input device like a touch screen, keypad, mouse, track pad, touch pad, pointer stick, one or more mouse buttons, track ball, joystick, etc. The output devices 1404 may include a voice playing device, a display device, auxiliary lighting devices (e.g., LEDs), and haptic feedback devices (e.g., vibrating motors), among others. The display device may include, but is not limited to, a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, and a plasma display. In some implementations, the display device can be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, application specific ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also known as programs, software applications, or code) include machine instructions for a programmable processor, and may be implemented using high-level procedural and/or object-oriented programming languages, and/or assembly/machine languages. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
According to the technical solutions of the embodiments of the present application, once a GNSS-type sensor has completed positioning, its positioning in an open-sky scene is very stable. Although an absolute positioning error exists, the short-term relative positioning error with respect to the vehicle fusion positioning pose is very small. By exploiting this small and stable short-term relative error, the vehicle navigation offset pose corresponding to the current vehicle navigation positioning pose can be determined accurately, the positioning error of the corresponding vehicle navigation offset pose is very small, and the calculated current vehicle fusion positioning pose can therefore meet the lane-level positioning requirement. The method effectively reduces the fluctuation of the positioning error when the vision sensor fails, so that the vehicle does not exhibit noticeable jerkiness while driving, which improves both the safety of automatic driving and the riding experience.
It should be understood that the various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present application may be executed in parallel, sequentially, or in different orders; this is not limited here as long as the desired results of the technical solutions disclosed in the present application can be achieved.
The above-described embodiments should not be construed as limiting the scope of the present application. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (16)

1. A vehicle positioning method is applied to an electronic device, the electronic device is in communication connection with a vehicle, the vehicle is provided with a plurality of perception sensors, and the perception sensors at least comprise: a vision sensor for determining a visual lane line and a vehicle vision positioning pose and a GNSS-like sensor for determining a vehicle navigation positioning pose, the method comprising:
acquiring a current visual lane line;
judging whether the current state of the vision sensor is a failure state or not according to the current vision lane line;
if the current state of the vision sensor is determined to be a failure state, acquiring a current vehicle navigation positioning pose and a corresponding vehicle navigation offset pose;
and calculating the current vehicle fusion positioning pose according to the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose.
2. The method of claim 1, wherein determining whether the current state of the visual sensor is a failure state based on the current visual lane line comprises:
calculating the matching degree of the current visual lane line and the corresponding high-precision map lane line;
if the matching degree is smaller than a preset matching degree threshold value, determining that the current state of the visual sensor is a failure state;
and if the matching degree is greater than or equal to the preset matching degree threshold value, determining that the current state of the visual sensor is an effective state.
3. The method of claim 1, wherein obtaining the current vehicle navigation positioning pose and the corresponding vehicle navigation biasing pose if the current state of the vision sensor is determined to be the failure state comprises:
if the current state of the visual sensor is determined to be a failure state, judging whether the failure time of the visual sensor is within a preset time period range;
and if the failure time is determined to be within the preset time period range, acquiring the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose.
4. The method of claim 1, wherein the obtaining corresponding vehicle navigation bias poses comprises:
acquiring a historical vehicle fusion positioning pose and a historical vehicle navigation positioning pose in a historical positioning period closest to the current positioning period; the state of the vision sensor in the historical positioning period closest to the current positioning period is an effective state;
calculating historical vehicle navigation offset poses according to the historical vehicle fusion positioning poses and the historical vehicle navigation positioning poses;
determining the historical vehicle navigation bias poses as the corresponding vehicle navigation bias poses.
5. The method of claim 4, wherein the calculating historical vehicle navigation bias poses from the historical vehicle fusion positioning poses and the historical vehicle navigation positioning poses comprises:
calculating a first difference value between the historical vehicle navigation positioning pose and the historical vehicle fusion positioning pose;
determining the first difference value as a historical vehicle navigation bias pose.
6. The method of claim 1, wherein calculating a current vehicle fusion position pose from the current vehicle navigation position pose and corresponding vehicle navigation bias poses comprises:
calculating a second difference value between the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose;
and determining the second difference value as the current vehicle fusion positioning pose.
7. The method of claim 1, wherein if the status of the vision sensor is active, further comprising:
calculating a current vehicle fusion positioning pose according to the current vehicle vision positioning pose and the current vehicle navigation positioning pose;
calculating a current vehicle navigation offset pose according to the current vehicle fusion positioning pose and the current vehicle navigation positioning pose;
determining the current vehicle navigation offset pose as the corresponding vehicle navigation offset pose when the state of the vision sensor is the failure state in a future positioning period; the future positioning period being a positioning period that has not yet occurred and is within the preset time period range from the current positioning period.
8. A vehicle positioning device is located in an electronic device, the electronic device is in communication connection with a vehicle, the vehicle is provided with a plurality of perception sensors, and the perception sensors at least comprise: the device comprises a vision sensor and a GNSS sensor, wherein the vision sensor is used for determining a vision lane line and a vehicle vision positioning pose, the GNSS sensor is used for determining a vehicle navigation positioning pose, and the device comprises:
the lane line acquisition module is used for acquiring a current visual lane line;
the state judgment module is used for judging whether the current state of the vision sensor is a failure state or not according to the current vision lane line;
the offset pose acquisition module is used for acquiring a current vehicle navigation positioning pose and a corresponding vehicle navigation offset pose if the current state of the vision sensor is determined to be a failure state;
and the fusion pose calculation module is used for calculating the current vehicle fusion positioning pose according to the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose.
9. The apparatus of claim 8, wherein the state determination module is specifically configured to:
calculating the matching degree of the current visual lane line and the corresponding high-precision map lane line;
if the matching degree is smaller than a preset matching degree threshold value, determining that the current state of the visual sensor is a failure state;
and if the matching degree is greater than or equal to the preset matching degree threshold value, determining that the current state of the visual sensor is an effective state.
10. The apparatus of claim 8, wherein the offset pose acquisition module is specifically configured to:
if the current state of the visual sensor is determined to be a failure state, judging whether the failure time of the visual sensor is within a preset time period range;
and if the failure time is determined to be within the preset time period range, acquiring the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose.
11. The apparatus of claim 8, wherein the offset pose acquisition module, when acquiring the corresponding vehicle navigation offset pose, is specifically configured to:
acquiring a historical vehicle fusion positioning pose and a historical vehicle navigation positioning pose in a historical positioning period closest to the current positioning period; the state of the vision sensor in the historical positioning period closest to the current positioning period is an effective state;
calculating historical vehicle navigation offset poses according to the historical vehicle fusion positioning poses and the historical vehicle navigation positioning poses;
determining the historical vehicle navigation bias poses as the corresponding vehicle navigation bias poses.
12. The apparatus of claim 11, wherein the bias pose acquisition module, when calculating historical vehicle navigation bias poses from the historical vehicle fusion positioning poses and the historical vehicle navigation positioning poses, is specifically configured to:
calculating a first difference value between the historical vehicle navigation positioning pose and the historical vehicle fusion positioning pose;
determining the first difference value as a historical vehicle navigation bias pose.
13. The apparatus according to claim 8, wherein the fusion pose calculation module is specifically configured to:
calculating a second difference value between the current vehicle navigation positioning pose and the corresponding vehicle navigation offset pose;
and determining the second difference value as the current vehicle fusion positioning pose.
14. The apparatus of claim 8, further comprising: the system comprises an offset pose calculation module and an offset pose determination module;
the fusion pose calculation module is further used for calculating the current vehicle fusion positioning pose according to the current vehicle vision positioning pose and the current vehicle navigation positioning pose if the state of the vision sensor is an effective state;
the offset pose calculation module is used for calculating the current vehicle navigation offset pose according to the current vehicle fusion positioning pose and the current vehicle navigation positioning pose;
the offset pose determination module is used for determining the current vehicle navigation offset pose as the corresponding vehicle navigation offset pose when the state of the vision sensor is the failure state in a future positioning period; the future positioning period is a positioning period that has not yet occurred and is within the preset time period range from the current positioning period.
15. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein the content of the first and second substances,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-7.
16. A non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of any one of claims 1-7.
CN201911146739.2A 2019-11-21 2019-11-21 Vehicle positioning method, device, equipment and storage medium Active CN110806215B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911146739.2A CN110806215B (en) 2019-11-21 2019-11-21 Vehicle positioning method, device, equipment and storage medium


Publications (2)

Publication Number Publication Date
CN110806215A true CN110806215A (en) 2020-02-18
CN110806215B CN110806215B (en) 2021-06-29

Family

ID=69491065

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911146739.2A Active CN110806215B (en) 2019-11-21 2019-11-21 Vehicle positioning method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110806215B (en)


Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150248588A1 (en) * 2014-03-03 2015-09-03 Denso Corporation Lane line recognition apparatus
DE102014215570A1 (en) * 2014-08-06 2016-02-11 Elektrobit Automotive Gmbh Car navigation system
CN106767853A (en) * 2016-12-30 2017-05-31 中国科学院合肥物质科学研究院 A kind of automatic driving vehicle high-precision locating method based on Multi-information acquisition
CN109085620A (en) * 2017-06-13 2018-12-25 百度在线网络技术(北京)有限公司 Automatic driving vehicle positions abnormal calibration method, apparatus, equipment and storage medium
CN108960183A (en) * 2018-07-19 2018-12-07 北京航空航天大学 A kind of bend target identification system and method based on Multi-sensor Fusion
CN110136199A (en) * 2018-11-13 2019-08-16 北京初速度科技有限公司 A kind of vehicle location based on camera, the method and apparatus for building figure
CN109927722A (en) * 2019-03-01 2019-06-25 武汉光庭科技有限公司 The method and system that the lane of view-based access control model and combined inertial nevigation is kept in automatic Pilot
CN109785667A (en) * 2019-03-11 2019-05-21 百度在线网络技术(北京)有限公司 Deviation recognition methods, device, equipment and storage medium
CN109989329A (en) * 2019-04-22 2019-07-09 河南城建学院 A kind of intelligent line-marking vehicle guided using unmanned plane
CN110293970A (en) * 2019-05-22 2019-10-01 重庆长安汽车股份有限公司 A kind of travel control method of autonomous driving vehicle, device and automobile

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111830544B (en) * 2020-07-02 2024-01-05 合肥移顺信息技术有限公司 Method, device, system and storage medium for vehicle positioning
CN111830544A (en) * 2020-07-02 2020-10-27 合肥移顺信息技术有限公司 Vehicle positioning method, device, system and storage medium
CN112577526A (en) * 2020-12-29 2021-03-30 武汉中海庭数据技术有限公司 Confidence calculation method and system for multi-sensor fusion positioning
CN112577526B (en) * 2020-12-29 2023-10-13 武汉中海庭数据技术有限公司 Confidence calculating method and system for multi-sensor fusion positioning
US11866064B2 (en) 2021-01-25 2024-01-09 Beijing Baidu Netcom Science Technology Co., Ltd. Method and apparatus for processing map data
CN113643440A (en) * 2021-07-06 2021-11-12 北京百度网讯科技有限公司 Positioning method, device, equipment and storage medium
WO2023035359A1 (en) * 2021-09-10 2023-03-16 中车株洲电力机车研究所有限公司 Combined tracking method and apparatus
CN113870316B (en) * 2021-10-19 2023-08-15 青岛德智汽车科技有限公司 Front vehicle path reconstruction method under GPS-free following scene
CN113870316A (en) * 2021-10-19 2021-12-31 青岛德智汽车科技有限公司 Front vehicle path reconstruction method under scene without GPS vehicle following
CN114001742A (en) * 2021-10-21 2022-02-01 广州小鹏自动驾驶科技有限公司 Vehicle positioning method and device, vehicle and readable storage medium
CN114001742B (en) * 2021-10-21 2024-06-04 广州小鹏自动驾驶科技有限公司 Vehicle positioning method, device, vehicle and readable storage medium
CN114111845A (en) * 2021-12-15 2022-03-01 安徽江淮汽车集团股份有限公司 Vehicle positioning calibration method based on ground identification
CN114114369A (en) * 2022-01-27 2022-03-01 智道网联科技(北京)有限公司 Autonomous vehicle positioning method and apparatus, electronic device, and storage medium
CN116481548A (en) * 2023-06-25 2023-07-25 蘑菇车联信息科技有限公司 Positioning method and device for automatic driving vehicle and electronic equipment
CN116481548B (en) * 2023-06-25 2023-10-03 蘑菇车联信息科技有限公司 Positioning method and device for automatic driving vehicle and electronic equipment

Also Published As

Publication number Publication date
CN110806215B (en) 2021-06-29

Similar Documents

Publication Publication Date Title
CN110806215B (en) Vehicle positioning method, device, equipment and storage medium
CN110979346B (en) Method, device and equipment for determining lane where vehicle is located
CN111220154A (en) Vehicle positioning method, device, equipment and medium
CN111649739B (en) Positioning method and device, automatic driving vehicle, electronic equipment and storage medium
CN110595494B (en) Map error determination method and device
CN105593877B (en) Object tracking is carried out based on the environmental map data dynamically built
CN111721289A (en) Vehicle positioning method, device, equipment, storage medium and vehicle
CN111679302A (en) Vehicle positioning method, device, electronic equipment and computer storage medium
CN107748569B (en) Motion control method and device for unmanned aerial vehicle and unmanned aerial vehicle system
US10976163B2 (en) Robust vision-inertial pedestrian tracking with heading auto-alignment
CN110617825B (en) Vehicle positioning method and device, electronic equipment and medium
CN110793544B (en) Method, device and equipment for calibrating parameters of roadside sensing sensor and storage medium
CN111274343A (en) Vehicle positioning method and device, electronic equipment and storage medium
CN110542436A (en) Evaluation method, device and equipment of vehicle positioning system and storage medium
CN111462029B (en) Visual point cloud and high-precision map fusion method and device and electronic equipment
CN111220164A (en) Positioning method, device, equipment and storage medium
CN111811521A (en) Positioning method and device, electronic equipment, vehicle-end equipment and automatic driving vehicle
CN111638528B (en) Positioning method, positioning device, electronic equipment and storage medium
CN111079079B (en) Data correction method, device, electronic equipment and computer readable storage medium
CN111611901A (en) Vehicle reverse running detection method, device, equipment and storage medium
CN111523471B (en) Method, device, equipment and storage medium for determining lane where vehicle is located
CN111784837A (en) High-precision map generation method and device
CN113887400B (en) Obstacle detection method, model training method and device and automatic driving vehicle
CN111721305B (en) Positioning method and apparatus, autonomous vehicle, electronic device, and storage medium
CN112147632A (en) Method, device, equipment and medium for testing vehicle-mounted laser radar perception algorithm

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant