CN112050806A - Positioning method and device for moving vehicle - Google Patents

Positioning method and device for moving vehicle

Info

Publication number
CN112050806A
CN112050806A (application CN201910488886.1A)
Authority
CN
China
Prior art keywords
information
image
current
pose information
imu
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910488886.1A
Other languages
Chinese (zh)
Other versions
CN112050806B (en)
Inventor
罗金辉
蔡娟
柴政
穆北鹏
Current Assignee
Beijing Momenta Technology Co Ltd
Original Assignee
Beijing Chusudu Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Chusudu Technology Co ltd filed Critical Beijing Chusudu Technology Co ltd
Priority to CN201910488886.1A priority Critical patent/CN112050806B/en
Publication of CN112050806A publication Critical patent/CN112050806A/en
Application granted granted Critical
Publication of CN112050806B publication Critical patent/CN112050806B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/3407 Route searching; Route guidance specially adapted for specific applications
    • G01C21/3415 Dynamic re-routing, e.g. recalculating the route when the user deviates from calculated route or after detecting real-time traffic data or accidents
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G01S19/45 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G01S19/47 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial

Abstract

Embodiments of the invention disclose a method and a device for positioning a moving vehicle, the method comprising: acquiring first position information of a feature point to be utilized in a current image, and current IMU data corresponding to the current image; determining a current reprojection error corresponding to the spatial point to be utilized, based on the first position information, on current depth information and second position information corresponding to the imaging point, in a reference image, of the spatial point to be utilized corresponding to the feature point to be utilized, and on reference pose information of the reference image and first estimated pose information; determining an IMU data measurement error corresponding to the spatial point to be utilized based on the current IMU data, the first estimated pose information and the previous pose information; and determining current pose information of the moving vehicle in the world coordinate system based on the current reprojection error, the IMU data measurement error, and first observed pose information and first observed velocity information of the IMU in the world coordinate system obtained by a global positioning system, thereby improving the accuracy of the positioning result of the moving vehicle in the world coordinate system.

Description

Positioning method and device for moving vehicle
Technical Field
The invention relates to the technical field of automatic driving, in particular to a method and a device for positioning a moving vehicle.
Background
A typical real-time vehicle positioning scheme is based on images acquired by an image acquisition device and on an inertial measurement unit (IMU). Using the imaging position information and depth information of the imaging points of spatial points in consecutive multi-frame images acquired by the image acquisition device, the pose information of the device when acquiring each frame, and the measurement data of the IMU, more accurate position and pose information of the vehicle is obtained in a reference coordinate system, where the reference coordinate system is a spatial coordinate system established from the pose of the image acquisition device when it acquired the first frame image.
However, outdoor application scenarios usually require the position information and pose information of the vehicle in the world coordinate system, so providing a method that can yield more accurate position and pose information of the vehicle in the world coordinate system has become an urgent problem to be solved.
Disclosure of Invention
The invention provides a method and a device for positioning a moving vehicle, which aim to improve the accuracy of a positioning result of the moving vehicle in a world coordinate system. The specific technical scheme is as follows:
in a first aspect, an embodiment of the present invention provides a method for positioning a mobile vehicle, the mobile vehicle being provided with an image acquisition device, an inertial measurement unit (IMU) and a global positioning system, the method comprising:
acquiring first position information of a detected feature point to be utilized in a current image acquired by image acquisition equipment and current IMU data corresponding to the current image acquired by the IMU;
determining a current reprojection error corresponding to the spatial point to be utilized, based on the first position information, on current depth information and second position information corresponding to the imaging point, in a reference image, of the spatial point to be utilized corresponding to the feature point to be utilized, and on reference pose information and first estimated pose information corresponding to the reference image; wherein the initial value of the first estimated pose information is pose information corresponding to the current image, estimated from previous pose information corresponding to the previous frame image of the current image, and the reference image is the image in which the imaging point of the spatial point to be utilized corresponding to the feature point to be utilized is observed for the first time;
determining an IMU data measurement error corresponding to the space point to be utilized based on the current IMU data, the first estimated pose information and the previous pose information;
and determining the current pose information of the mobile vehicle in the world coordinate system based on the current reprojection error, the IMU data measurement error and first observation pose information and first observation speed information of the IMU in the world coordinate system, which are obtained by the global positioning system.
Optionally, before the step of determining the current reprojection error corresponding to the spatial point to be utilized based on the first position information, the current depth information and the second position information corresponding to the imaging point of the spatial point to be utilized in the reference image corresponding to the feature point to be utilized, the reference pose information and the first estimated pose information corresponding to the reference image, the method further includes:
acquiring third position information of an imaging point of the space point to be utilized in each historical image corresponding to the feature point to be utilized and historical pose information corresponding to each historical image, wherein the historical images comprise images between the current image and the reference image;
the step of determining the current reprojection error corresponding to the spatial point to be utilized based on the first position information, on the current depth information and the second position information corresponding to the imaging point of the spatial point to be utilized in the reference image, and on the reference pose information and the first estimated pose information corresponding to the reference image comprises:
and determining a current re-projection error corresponding to the space point to be utilized based on the first position information, the current depth information, the second position information, the third position information, the reference pose information corresponding to the reference image, the first estimated pose information and the historical pose information.
Optionally, the step of determining a current reprojection error corresponding to the space point to be utilized based on the first position information, the current depth information, the second position information, the third position information, the reference pose information corresponding to the reference image, the first estimated pose information, and the historical pose information includes:
determining spatial position information of the spatial point to be utilized corresponding to the feature point to be utilized in the device coordinate system corresponding to the reference image, based on the current depth information and the second position information; wherein the device coordinate system corresponding to the reference image is the device coordinate system of the image acquisition device at the pose at which it acquired the reference image;
determining first projection position information of a projection point of the space point to be utilized in the current image based on the space position information, the reference pose information and the first estimated pose information;
for each historical image, determining second projection position information of the projection point of the space point to be utilized in the historical image based on the space position information, the reference pose information and the historical pose information corresponding to the historical image;
and determining the current re-projection error corresponding to the space point to be utilized based on the first projection position information, the first position information, the second projection position information and the third position information.
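The projection chain in the steps above can be sketched in Python. This is a minimal, hypothetical illustration rather than the patent's actual formulation: the pinhole intrinsics `K` and all function names are made up for this sketch, `R_ref`/`t_ref` and `R_cur`/`t_cur` stand in for the reference pose information and the first estimated pose information (as camera-to-world rotations and translations), and lens distortion and the historical-image terms are omitted.

```python
import numpy as np

# Hypothetical pinhole intrinsics; focal lengths and principal point are
# illustrative values, not taken from the patent.
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])

def back_project(pixel, depth, K):
    """Lift a pixel with known depth into the reference device frame."""
    u, v = pixel
    x = (u - K[0, 2]) / K[0, 0] * depth
    y = (v - K[1, 2]) / K[1, 1] * depth
    return np.array([x, y, depth])

def project(point_cam, K):
    """Project a 3-D point in a camera frame onto the image plane."""
    p = K @ point_cam
    return p[:2] / p[2]

def reprojection_error(obs_cur, pixel_ref, depth_ref,
                       R_ref, t_ref, R_cur, t_cur, K):
    """Residual between the feature observed in the current image and the
    projection of the spatial point recovered in the reference frame."""
    # Spatial position in the reference device frame (current depth +
    # second position information), then into world coordinates.
    P_world = R_ref @ back_project(pixel_ref, depth_ref, K) + t_ref
    # World point into the estimated current camera frame, then project.
    P_cur = R_cur.T @ (P_world - t_cur)
    return np.asarray(obs_cur) - project(P_cur, K)
```

With identity poses the residual is zero; in the patent's scheme the analogous residual would also be stacked over the historical images and minimized jointly with the IMU term.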
Optionally, the step of determining, based on the current IMU data, the first estimated pose information, and the previous pose information, an IMU data measurement error corresponding to the spatial point to be utilized includes:
obtaining historical IMU data corresponding to each historical image and reference IMU data corresponding to the reference image;
and determining the IMU data measurement error corresponding to the spatial point to be utilized based on the current IMU data corresponding to the current image, the first estimated pose information, the previous pose information, the reference IMU data corresponding to the reference image, the reference pose information, the pose information corresponding to the previous frame image of the reference image, and, for each historical image, the corresponding historical IMU data, the historical pose information, and the pose information corresponding to the previous frame image of that historical image.
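The idea of comparing an IMU-propagated state against the visually estimated state can be sketched as follows. This is a simplified, hypothetical illustration: rotation integration and bias terms, which a real IMU preintegration would include, are omitted, and all names are made up for this sketch.

```python
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])  # illustrative world-frame gravity

def imu_predict(p_prev, v_prev, accel_samples, dt):
    """Dead-reckon position and velocity between two image timestamps by
    integrating accelerometer samples (rotation and bias handling, which a
    real preintegration includes, are omitted in this sketch)."""
    p = np.array(p_prev, dtype=float)
    v = np.array(v_prev, dtype=float)
    for a in accel_samples:
        v = v + (np.asarray(a) + GRAVITY) * dt
        p = p + v * dt
    return p, v

def imu_measurement_error(p_est, v_est, p_prev, v_prev, accel_samples, dt):
    """Residual between the first-estimated state and the state predicted
    from the current IMU data and the previous pose information."""
    p_pred, v_pred = imu_predict(p_prev, v_prev, accel_samples, dt)
    return np.concatenate([np.asarray(p_est) - p_pred,
                           np.asarray(v_est) - v_pred])
```

For a stationary vehicle whose accelerometer exactly cancels gravity, the residual is zero; any disagreement between the visual estimate and the IMU prediction shows up in this six-dimensional vector.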
Optionally, the step of determining the current pose information of the mobile vehicle in the world coordinate system based on the current reprojection error, the IMU data measurement error, and first observation pose information and first observation speed information of the IMU in the world coordinate system, which are obtained by the global positioning system, includes:
determining first estimation pose information and first estimation speed information of the IMU under a world coordinate system based on the current reprojection error and the IMU data measurement error;
obtaining first observation pose information and first observation speed information under a world coordinate system of the IMU, which are obtained through the global positioning system;
determining a pose velocity error based on the first estimated velocity information and the first observed velocity information, and the first estimated pose information and the first observed pose information;
and determining the current pose information of the moving vehicle under the world coordinate system based on the pose speed error.
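A toy version of the pose-velocity error in the steps above, and of a correction derived from it, might look like the following. The fixed scalar weight `w_obs` is an assumption of this sketch; a real system would weight the estimate and the GNSS observation by their covariances.

```python
import numpy as np

def pose_velocity_error(p_est, v_est, p_obs, v_obs):
    """Stacked residual between the visual-inertial estimate of the IMU in
    the world frame and the GNSS observation of it."""
    return np.concatenate([np.asarray(p_est) - np.asarray(p_obs),
                           np.asarray(v_est) - np.asarray(v_obs)])

def fuse(p_est, v_est, p_obs, v_obs, w_obs=0.3):
    """Toy correction: pull the estimate toward the observation by a fixed
    weight (hypothetical; covariance weighting would be used in practice)."""
    err = pose_velocity_error(p_est, v_est, p_obs, v_obs)
    return (np.asarray(p_est, dtype=float) - w_obs * err[:3],
            np.asarray(v_est, dtype=float) - w_obs * err[3:])
```

When the estimate already agrees with the observation the error is zero and the state is left unchanged.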
Optionally, the step of obtaining first observation pose information and first observation speed information in the world coordinate system of the IMU obtained by the global positioning system includes:
obtaining second observation pose information and second observation speed information of the mobile vehicle under a world coordinate system, which are measured by the global positioning system;
and determining first observation pose information and first observation speed information of the IMU under the world coordinate system based on the rotation matrix between the mobile vehicle and the IMU, the second observation pose information and the second observation speed information.
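The conversion of the vehicle-body GNSS observation into the IMU's first observation pose and velocity might look like the sketch below. Since the claim mentions only a rotation matrix between the mobile vehicle and the IMU, the lever-arm translation is omitted here; all names are illustrative.

```python
import numpy as np

def observation_to_imu(R_w_vehicle, p_obs, v_obs, R_vehicle_imu):
    """Turn the second observation pose/velocity (vehicle body in the world
    frame) into the first observation pose/velocity of the IMU. With only a
    rotation extrinsic, as in the claim, position and velocity carry over
    while the orientation is composed with the vehicle-to-IMU rotation."""
    R_w_imu = R_w_vehicle @ R_vehicle_imu
    return R_w_imu, np.asarray(p_obs), np.asarray(v_obs)
```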
Optionally, the step of determining the current pose information of the mobile vehicle in the world coordinate system based on the current reprojection error, the IMU data measurement error, and the first observation pose information and the first observation speed information in the world coordinate system of the IMU obtained by the global positioning system includes:
determining current pose information of the IMU in a world coordinate system based on the current reprojection error, the IMU data measurement error and first observation pose information and first observation speed information of the IMU in the world coordinate system, which are obtained through the global positioning system;
determining current pose information of the mobile vehicle in the world coordinate system based on the rotation matrix between the mobile vehicle and the IMU and the current pose information of the IMU in the world coordinate system.
Optionally, after the step of determining the current pose information of the mobile vehicle in the world coordinate system based on the current reprojection error, the IMU data measurement error, and the first observation pose information and the first observation speed information in the world coordinate system of the IMU obtained by the global positioning system, the method further includes:
in the case where the global positioning system is detected to have failed, obtaining an image, acquired by the image acquisition device after the current image, as an image to be utilized;
obtaining IMU data, acquired by the IMU, corresponding to the image to be utilized;
and determining pose information of the moving vehicle in the world coordinate system based on the current pose information of the moving vehicle in the world coordinate system, the position information of the imaging point of the spatial point to be utilized in the image to be utilized corresponding to the feature point to be utilized, and the IMU data acquired by the IMU.
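The GNSS-outage fallback described above, which keeps propagating the last world-frame pose with subsequent IMU data, can be sketched as below. The visual correction from the images to be utilized is omitted, the kinematics are simplified, and all names are hypothetical.

```python
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])  # illustrative world-frame gravity

def propagate_without_gnss(p_world, v_world, accel_batches, dt):
    """After GNSS failure, keep estimating in the world frame by propagating
    the last GNSS-anchored pose with the IMU data gathered for each image to
    be utilized (the visual correction is omitted in this sketch)."""
    p = np.array(p_world, dtype=float)
    v = np.array(v_world, dtype=float)
    poses = []
    for batch in accel_batches:      # one batch of IMU samples per image
        for a in batch:
            v = v + (np.asarray(a) + GRAVITY) * dt
            p = p + v * dt
        poses.append(p.copy())
    return poses
```

Because the starting pose is already expressed in the world coordinate system, every propagated pose stays in that frame even while the GNSS observations are unavailable.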
In a second aspect, an embodiment of the present invention provides a positioning apparatus for a mobile vehicle, where the mobile vehicle is provided with an image capture device, an inertial measurement unit IMU, and a global positioning system, and the positioning apparatus includes:
a first obtaining module, configured to obtain first position information of a detected feature point to be utilized in a current image acquired by the image acquisition device, and current IMU data corresponding to the current image acquired by the IMU;
a first determining module, configured to determine a current reprojection error corresponding to the spatial point to be utilized based on the first position information, on current depth information and second position information corresponding to the imaging point, in a reference image, of the spatial point to be utilized corresponding to the feature point, and on reference pose information and first estimated pose information corresponding to the reference image; wherein the initial value of the first estimated pose information is pose information corresponding to the current image, estimated from previous pose information corresponding to the previous frame image of the current image, and the reference image is the image in which the imaging point of the spatial point to be utilized corresponding to the feature point to be utilized is observed for the first time;
a second determination module configured to determine an IMU data measurement error corresponding to the spatial point to be utilized based on the current IMU data, the first estimated pose information, and the previous pose information;
a third determination module configured to determine current pose information of the mobile vehicle in a world coordinate system based on the current reprojection error, the IMU data measurement error, and first observed pose information and first observed velocity information of the IMU in the world coordinate system obtained by the global positioning system.
Optionally, the apparatus further comprises: a second obtaining module configured to: obtain, before the current reprojection error corresponding to the spatial point to be utilized is determined based on the first position information and the reference pose information and first estimated pose information corresponding to the reference image, third position information of the imaging point of the spatial point to be utilized corresponding to the feature point to be utilized in each historical image and historical pose information corresponding to each historical image, wherein the historical images comprise images between the current image and the reference image;
the first determining module is specifically configured to:
and determining a current re-projection error corresponding to the space point to be utilized based on the first position information, the current depth information, the second position information, the third position information, the reference pose information corresponding to the reference image, the first estimated pose information and the historical pose information.
Optionally, the first determining module is specifically configured to:
determining spatial position information of the spatial point to be utilized corresponding to the feature point to be utilized in the device coordinate system corresponding to the reference image, based on the current depth information and the second position information; wherein the device coordinate system corresponding to the reference image is the device coordinate system of the image acquisition device at the pose at which it acquired the reference image;
determining first projection position information of a projection point of the space point to be utilized in the current image based on the space position information, the reference pose information and the first estimated pose information;
for each historical image, determining second projection position information of the projection point of the space point to be utilized in the historical image based on the space position information, the reference pose information and the historical pose information corresponding to the historical image;
and determining the current re-projection error corresponding to the space point to be utilized based on the first projection position information, the first position information, the second projection position information and the third position information.
Optionally, the second determining module is specifically configured to:
obtaining historical IMU data corresponding to each historical image and reference IMU data corresponding to the reference image;
and determining the IMU data measurement error corresponding to the spatial point to be utilized based on the current IMU data corresponding to the current image, the first estimated pose information, the previous pose information, the reference IMU data corresponding to the reference image, the reference pose information, the pose information corresponding to the previous frame image of the reference image, and, for each historical image, the corresponding historical IMU data, the historical pose information, and the pose information corresponding to the previous frame image of that historical image.
Optionally, the third determining module includes:
a first determination unit configured to determine first estimated pose information and first estimated velocity information of the IMU in a world coordinate system based on the current reprojection error and the IMU data measurement error;
an obtaining unit configured to obtain first observation pose information and first observation speed information in a world coordinate system of the IMU obtained by the global positioning system;
a second determination unit configured to determine a pose velocity error based on the first estimated velocity information and the first observed velocity information, and the first estimated pose information and the first observed pose information;
a third determination unit configured to determine current pose information of the moving vehicle in the world coordinate system based on the pose speed error.
Optionally, the obtaining unit is configured to obtain second observation pose information and second observation speed information of the mobile vehicle in a world coordinate system, which are measured by the global positioning system;
and determining first observation pose information and first observation speed information of the IMU under the world coordinate system based on the rotation matrix between the mobile vehicle and the IMU, the second observation pose information and the second observation speed information.
Optionally, the third determining module is specifically configured to determine current pose information of the IMU in a world coordinate system based on the current reprojection error, the IMU data measurement error, and first observation pose information and first observation speed information in the world coordinate system of the IMU obtained by the global positioning system;
determining current pose information of the mobile vehicle in the world coordinate system based on the rotation matrix between the mobile vehicle and the IMU and the current pose information of the IMU in the world coordinate system.
Optionally, the apparatus further comprises: a third obtaining module configured to, after determining the current pose information of the mobile vehicle in the world coordinate system based on the current reprojection error, the IMU data measurement error, and the first observation pose information and the first observation speed information in the world coordinate system of the IMU obtained by the global positioning system, obtain, as an image to be utilized, an image subsequent to the current image acquired by an image acquisition device in a case where the global positioning system is detected to be disabled;
a fourth obtaining module, configured to obtain IMU data, acquired by the IMU, corresponding to the image to be utilized;
a fourth determining module configured to determine pose information of the mobile vehicle in a world coordinate system based on current pose information of the mobile vehicle in the world coordinate system, position information of imaging points of the spatial points to be utilized in the image to be utilized corresponding to the feature points to be utilized, and IMU data acquired by the IMU.
As can be seen from the above, in the positioning method and apparatus for a moving vehicle provided by the embodiments of the invention, the moving vehicle is provided with an image acquisition device, an inertial measurement unit (IMU) and a global positioning system, and the method can: obtain first position information of a feature point to be utilized detected in a current image acquired by the image acquisition device, and current IMU data corresponding to the current image acquired by the IMU; determine a current reprojection error corresponding to the spatial point to be utilized based on the first position information, on the current depth information and second position information corresponding to the imaging point, in the reference image, of the spatial point to be utilized corresponding to the feature point, and on the reference pose information and first estimated pose information corresponding to the reference image, where the initial value of the first estimated pose information is pose information corresponding to the current image estimated from the previous pose information corresponding to the previous frame image, and the reference image is the image in which the imaging point of the spatial point to be utilized is observed for the first time; determine an IMU data measurement error corresponding to the spatial point to be utilized based on the current IMU data, the first estimated pose information and the previous pose information; and determine the current pose information of the moving vehicle in the world coordinate system based on the current reprojection error, the IMU data measurement error, and the first observed pose information and first observed velocity information of the IMU in the world coordinate system obtained by the global positioning system.
By applying the embodiments of the invention, the current reprojection error corresponding to the spatial point to be utilized is first determined from the position information of that spatial point in the current image and in the reference image, the reference pose information of the reference image, and the estimated first estimated pose information corresponding to the current image; the IMU data measurement error corresponding to the spatial point to be utilized is determined based on the current IMU data, the first estimated pose information and the previous pose information; and the current pose information of the IMU in the world coordinate system, and in turn that of the moving vehicle, is then further optimized by combining the first observed pose information and first observed velocity information of the IMU in the world coordinate system obtained by the global positioning system. In this way, the pose and velocity changes of the moving vehicle relative to its previous pose as determined from the positions of the spatial points in the images collected by the image acquisition device, the pose and velocity changes as determined from the IMU data collected by the IMU, and the current world-frame position and velocity measured by the global positioning system jointly determine the current pose information of the moving vehicle in the world coordinate system, improving the accuracy of the positioning result. Of course, not all of the advantages described above need to be achieved simultaneously by any one product or method embodying the invention.
The innovation points of the embodiment of the invention comprise:
1. The method can first determine the current reprojection error corresponding to the spatial point to be utilized from the position information of that spatial point in the current image and the corresponding reference image, the reference pose information of the reference image, and the estimated first estimated pose information corresponding to the current image, and determine the IMU data measurement error corresponding to the spatial point to be utilized from the current IMU data, the first estimated pose information and the previous pose information; the current pose information of the IMU in the world coordinate system, and in turn that of the moving vehicle, is then further optimized by combining the first observed pose information and first observed velocity information of the IMU in the world coordinate system obtained by the global positioning system. The vision-derived pose and velocity changes relative to the previous pose, the IMU-derived pose and velocity changes, and the world-frame position and velocity measured by the global positioning system thus jointly determine the current pose information of the moving vehicle in the world coordinate system, improving the accuracy of the positioning result.
2. A more accurate current reprojection error is established by combining the first position information of the feature point to be utilized in the current image, the current depth information and the second position information corresponding to the imaging point of the spatial point to be utilized in the reference image corresponding to the feature point to be utilized, the third position information of the spatial point to be utilized in each historical image, the first estimated pose information, the historical pose information and the reference pose information; more accurate current pose information of the moving vehicle in the world coordinate system is then determined by further combining the IMU data measurement error corresponding to the spatial point to be utilized and the position and speed of the moving vehicle in the world coordinate system measured by the global positioning system.
3. The current IMU data, the first estimated pose information, the previous pose information, the historical IMU data and the historical pose information corresponding to each historical image, and the pose information corresponding to the previous frame image are combined to jointly construct a more accurate IMU data measurement error corresponding to the spatial point to be utilized; more accurate current pose information of the moving vehicle in the world coordinate system is then determined by further combining the current reprojection error and the position and speed in the world coordinate system measured by the global positioning system.
4. Based on the current reprojection error and the IMU data measurement error, the first estimated pose information and the first estimated speed information of the IMU in the world coordinate system are determined; a pose-speed error is then constructed by further combining the first observation pose information and the first observation speed information of the IMU in the world coordinate system obtained by the global positioning system, so as to determine the current pose information of the IMU, and of the moving vehicle, in the world coordinate system.
5. In the positioning process, if the global positioning system is detected to have failed, the pose information of the moving vehicle in the world coordinate system can still be jointly estimated based on the current pose information, the images to be utilized acquired by the image acquisition device after the current image, and the IMU data corresponding to those images acquired by the IMU, so that failure to position the moving vehicle in the world coordinate system is avoided to a certain extent.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings required in the description of the embodiments or the prior art are briefly introduced below. It is to be understood that the drawings in the following description are merely exemplary of some embodiments of the invention; a person skilled in the art can obtain other drawings from these drawings without inventive effort.
Fig. 1 is a schematic flow chart of a positioning method for a mobile vehicle according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of a positioning method for a mobile vehicle according to an embodiment of the present invention;
FIG. 3 is a timing diagram of an image acquired by an image acquisition device, IMU data acquired by an IMU, and a positioning result obtained by a global positioning system in a world coordinate system;
fig. 4 is a schematic structural diagram of a positioning device of a mobile vehicle according to an embodiment of the present invention.
Detailed Description
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention. It is to be understood that the described embodiments are merely a few embodiments of the invention, and not all embodiments. All other embodiments, which can be obtained by a person skilled in the art without inventive effort based on the embodiments of the present invention, are within the scope of the present invention.
It is to be noted that the terms "comprises" and "comprising" and any variations thereof in the embodiments and drawings of the present invention are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may also include other steps or elements not expressly listed, or steps or elements inherent to such process, method, article, or apparatus.
The invention provides a method and a device for positioning a moving vehicle, which aim to improve the accuracy of a positioning result of the moving vehicle in a world coordinate system. The following provides a detailed description of embodiments of the invention.
Fig. 1 is a schematic flow chart of a positioning method for a mobile vehicle according to an embodiment of the present invention. The method may comprise the steps of:
It is understood that the positioning method for a moving vehicle provided by the embodiment of the present invention may be applied to any type of electronic device, and the electronic device may be a server or a terminal device. The electronic device may be an on-board device arranged on the moving vehicle, or may not be arranged on the moving vehicle; either is possible. In one case, when the electronic device is not installed on the moving vehicle, the moving vehicle is provided with an image acquisition device, an Inertial Measurement Unit (IMU), and a Global Positioning System (GPS). The image acquisition device can capture the surroundings of the position of the moving vehicle, including information on the ground and in the air of the environment in which the moving vehicle is located. The IMU may include an acceleration sensor, a gyroscope, and the like.
S101: and obtaining first position information of the feature points to be utilized detected from the current image acquired by the image acquisition equipment and current IMU data corresponding to the current image acquired by the IMU.
The image acquisition device can capture the environment of the moving vehicle in real time so as to acquire images containing information about that environment. The current image is the image acquired by the image acquisition device at the current moment, and the current IMU data corresponding to the current image is the IMU data acquired by the IMU between the moment at which the image acquisition device acquired the previous frame image of the current image and the current moment, that is, between the previous moment and the current moment.
The electronic device can detect the feature points to be utilized from the current image through a preset feature point detection algorithm, and determine the position information of the feature points to be utilized in the current image as the first position information. The preset feature point detection algorithm may be any type of feature point detection algorithm, such as a Scale-Invariant Feature Transform (SIFT) detection algorithm, a Speeded-Up Robust Features (SURF) detection algorithm, or a Harris corner point detection algorithm; the embodiment of the present invention does not limit the preset feature point detection algorithm.
A plurality of feature points may be detected in the current image, and one or more of these feature points may be used as feature points to be utilized. The spatial points corresponding to the feature points to be utilized may be spatial points on objects that are stationary in the road along which the moving vehicle travels, such as spatial points on traffic signs, on traffic sign lines, and on buildings and plants beside the road.
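As a hedged illustration of this step (not the embodiment's implementation; the function names and parameters are assumptions of the sketch), the feature point detection can be sketched with a minimal NumPy version of the Harris corner detection algorithm, one of the preset feature point detection algorithms named above:

```python
# Illustrative sketch only: a minimal Harris corner detector in NumPy,
# one of the preset feature point detection algorithms named above.
import numpy as np

def box3(a):
    """3x3 box-filter sum (zero-padded), used to accumulate the structure tensor."""
    p = np.pad(a, 1)
    h, w = a.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))

def detect_harris_corners(img, k=0.04, rel_thresh=0.1):
    """Return first position information (x, y) of feature points to be
    utilized, detected in the current image."""
    img = img.astype(float)
    Ix = np.zeros_like(img)
    Iy = np.zeros_like(img)
    Ix[:, 1:-1] = (img[:, 2:] - img[:, :-2]) / 2.0   # horizontal gradient
    Iy[1:-1, :] = (img[2:, :] - img[:-2, :]) / 2.0   # vertical gradient
    Sxx, Syy, Sxy = box3(Ix * Ix), box3(Iy * Iy), box3(Ix * Iy)
    # Harris corner response: det(M) - k * trace(M)^2
    R = (Sxx * Syy - Sxy ** 2) - k * (Sxx + Syy) ** 2
    ys, xs = np.nonzero(R > rel_thresh * R.max())
    return np.stack([xs, ys], axis=1)
```

In practice any detector named above (SIFT, SURF, Harris) may be substituted; only the resulting pixel coordinates, the first position information, matter for the subsequent steps.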
S102: and determining the current re-projection error corresponding to the space point to be utilized based on the first position information, the current depth information and the second position information corresponding to the imaging point of the space point to be utilized in the reference image corresponding to the characteristic point to be utilized, the reference pose information corresponding to the reference image and the first estimated pose information.
Wherein the initial value of the first estimated pose information is pose information corresponding to the current image estimated based on the previous pose information corresponding to the previous frame image of the current image; and the reference image is the image in which the imaging point of the spatial point to be utilized corresponding to the feature point to be utilized is observed for the first time, that is, the first image containing that imaging point.
In one case, the reference pose information corresponding to the reference image may be the pose information of the IMU corresponding to the reference image, comprising the position information and the attitude information of the IMU; correspondingly, the first estimated pose information is the pose information of the IMU corresponding to the current image; and correspondingly, the historical pose information corresponding to each historical image mentioned later is the historical pose information of the IMU corresponding to each historical image. The pose information of the IMU corresponding to a given image is the pose information of the IMU at the moment the image acquisition device acquired that image, where the given image may be the current image or a historical image.
In another case, the reference pose information corresponding to the reference image may be the pose information of the image acquisition device corresponding to the reference image, comprising the position information and the attitude information of the image acquisition device; correspondingly, the first estimated pose information is the pose information of the image acquisition device corresponding to the current image; and correspondingly, the historical pose information corresponding to each historical image mentioned later is the historical pose information of the image acquisition device corresponding to each historical image. The pose information of the image acquisition device corresponding to a given image is the pose information of the image acquisition device at the moment it acquired that image.
After obtaining the first position information of the feature point to be utilized in the current image, the electronic device may continue to obtain the current depth information and the second position information corresponding to the imaging point of the spatial point to be utilized in the reference image corresponding to the feature point to be utilized, as well as the reference pose information corresponding to the reference image, and estimate the first estimated pose information corresponding to the current image from the previous pose information corresponding to the previous frame image of the current image. Then, based on this information, the current reprojection error corresponding to the spatial point to be utilized is determined.
When the reference pose information corresponding to the reference image is the pose information of the IMU corresponding to the reference image, then in determining the current reprojection error corresponding to the spatial point to be utilized, the pose information of the image acquisition device when it acquired the reference image is determined as the first pose information, based on the rotation and translation relationship between the IMU and the image acquisition device and on the reference pose information. Likewise, the pose information of the image acquisition device when it acquires the current image is determined based on the rotation and translation relationship between the IMU and the image acquisition device and on the first estimated pose information.
Further, the process of determining the current reprojection error corresponding to the spatial point to be utilized based on the above information may be: determining the current reprojection error corresponding to the spatial point to be utilized based on the first position information, the current depth information, the second position information, the first pose information, and the pose information of the image acquisition device when it acquires the current image.
After the image acquisition device and the IMU are installed in the moving vehicle, the relative position between them no longer changes. An extrinsic parameter Tbc between the image acquisition device and the IMU can be calibrated by any type of calibration algorithm in the related art; Tbc represents the rotation and translation relationship between the IMU coordinate system and the device coordinate system of the image acquisition device, so that given the pose information of either the image acquisition device or the IMU, the pose information of the other can be obtained through Tbc.
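By way of illustration, the use of the extrinsic parameter Tbc can be sketched as follows, assuming (a representation choice of this sketch, not something prescribed by the embodiment) that poses and Tbc are expressed as 4x4 homogeneous transformation matrices and that Tbc maps device (camera) coordinates into IMU coordinates:

```python
# Illustrative sketch: converting between the IMU pose and the image
# acquisition device pose through the extrinsic parameter Tbc.
# T_wb: IMU pose in the world frame; T_bc: camera-to-IMU extrinsic.
import numpy as np

def camera_pose_from_imu_pose(T_wb, T_bc):
    """Pose of the image acquisition device in the world frame."""
    return T_wb @ T_bc

def imu_pose_from_camera_pose(T_wc, T_bc):
    """Recover the IMU pose in the world frame from the device pose."""
    return T_wc @ np.linalg.inv(T_bc)
```

Because the relative position of the two sensors no longer changes after installation, Tbc is calibrated once and reused at every moment.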
It can be understood that, while the moving vehicle travels, images containing the environment in which it is located may be acquired in real time by the image acquisition device, and a spatial point in the environment may be observed in multiple consecutive frames; that is, each of several frames acquired by the image acquisition device may include an imaging point corresponding to that spatial point. Tracking of the same spatial point across images can be achieved by a feature point tracking algorithm, which may include but is not limited to optical flow or descriptor matching. The position information of the imaging points of the same spatial point in the multiple frames constrains the relative pose change of the image acquisition device between the moments at which the different images were acquired; from this the relative pose change of the moving vehicle can be determined, and, combined with the initial pose information of the image acquisition device, the moving vehicle can be positioned. As shown in fig. 3, a spatial point may be observed multiple times by the image acquisition device, and one frame of image may include imaging points of a plurality of spatial points.
In one implementation, S102 may include the following steps: first, determining the projection position information of the projection point, in the current image, of the spatial point to be utilized corresponding to the feature point to be utilized; and then determining the current reprojection error corresponding to the spatial point to be utilized based on that projection position information and the position information of the feature point to be utilized in the current image, namely the first position information. Specifically, the method may include the following steps:
First, the pose information of the image acquisition device when it acquires the current image, that is, the pose information of the image acquisition device corresponding to the current image at the current moment, is estimated as the second pose information based on the pose information when the image acquisition device acquired the previous frame image. Next, the device position information of the spatial point to be utilized in the device coordinate system corresponding to the moment the image acquisition device acquired the reference image is determined as the first device position information, based on the current depth information and the second position information corresponding to the imaging point of the spatial point to be utilized in the reference image. Then, based on the second pose information, the first pose information and the first device position information, the device position information of the spatial point to be utilized in the device coordinate system corresponding to the moment the image acquisition device acquires the current image is determined as the second device position information, wherein that device coordinate system is the device coordinate system of the image acquisition device at the current moment. Finally, the spatial point to be utilized is projected into the current image based on the second device position information and the preset projection matrix of the image acquisition device, and the projection position information of its projection point in the current image is determined.
Then, the current reprojection error corresponding to the spatial point to be utilized is constructed based on the projection position information of the projection point of the spatial point to be utilized in the current image and the position information of the feature point to be utilized in the current image, namely the first position information, thereby constructing the first constraint for solving the pose information of the moving vehicle.
The preset projection matrix is the projection matrix that projects spatial points from the device coordinate system of the image acquisition device into the image coordinate system of the image acquisition device, and is determined by the parameters of the image acquisition device.
The second pose information may be initialized in either of two ways: the pose information of the image acquisition device when it acquired the previous frame image of the current image may be used directly as the initial value of the second pose information; or pose information estimated from the pose information when the previous frame image was acquired and the current IMU data may be used as the initial value. Either is possible. When the first estimated pose information is the pose information of the image acquisition device corresponding to the current image, the second pose information is the first estimated pose information.
In one case, the process of determining the projection position information of the projection point, in the current image, of the spatial point to be utilized corresponding to the feature point to be utilized can be expressed by the following formula (1):

h_f = π( K·( R_cb·( R_b1w·( R_wb2·( R_bc·f_c2 + p_bc ) + p_wb2 − p_wb1 ) ) + p_cb ) );    (1)

wherein h_f, which may be written as (x_i1, y_i1), represents the projection position information of the projection point of the spatial point to be utilized in the current image, i being any integer, and π(·) denotes the perspective projection π((x, y, z)^T) = (x/z, y/z)^T; f_c2 represents the device position information of the spatial point to be utilized, corresponding to the feature point to be utilized, in the device coordinate system at the moment the image acquisition device acquired the reference image, and can be expressed as

f_c2 = (1/λ_1)·K^(−1)·(u_i, v_i, 1)^T;

wherein (u_i, v_i) represents the second position information corresponding to the imaging point of the spatial point to be utilized in the reference image, and 1/λ_1 represents the current depth information corresponding to that imaging point; R_bc represents the attitude of the image acquisition device in the IMU coordinate system at the reference moment, i.e. the rotation matrix between the device coordinate system of the image acquisition device and the IMU coordinate system at the reference moment, wherein the reference moment is the moment at which the image acquisition device acquired the reference image, and the IMU coordinate system is the coordinate system established based on the IMU; p_bc represents the translation matrix translating the position of the image acquisition device to the position of the IMU at the reference moment, i.e. the translation relationship from the device coordinate system to the IMU coordinate system; both R_bc and p_bc can be obtained from the extrinsic parameter Tbc; R_wb2 represents the attitude of the IMU in the world coordinate system at the reference moment, and p_wb2 the position of the IMU in the world coordinate system at the reference moment; R_b1w represents the inverse of the attitude of the IMU in the world coordinate system at the current moment, and p_wb1 the position of the IMU in the world coordinate system at the current moment; R_cb represents the attitude of the IMU in the device coordinate system at the current moment, the inverse of R_bc; p_cb represents the translation matrix translating the position of the IMU at the current moment to the position of the image acquisition device, the reciprocal of p_bc; K denotes the internal reference matrix of the image acquisition device, which can be expressed as

K = [ fx 0 cx ; 0 fy cy ; 0 0 1 ];

wherein fx denotes the focal length along the horizontal axis of the image coordinate system of the image acquisition device, fy denotes the focal length along the vertical axis of the image coordinate system, and (cx, cy) denotes the position information of the image principal point in the image coordinate system, the image principal point being the intersection of the optical axis of the image acquisition device, which is perpendicular to the image plane, with the image plane. Since the positional relationship between the image acquisition device and the IMU is fixed, R_cb and p_cb can likewise be obtained from the extrinsic parameter Tbc.
Wherein R_wb2 can be calculated from the heading of the speed of the moving vehicle measured by the global positioning system at the reference moment, i.e. yaw_gps, and the mounting deviation angle between the moving vehicle and the IMU. Specifically, through the conversion relationship between Euler angles and rotation matrices, the attitude of the moving vehicle in the world coordinate system at the reference moment can be obtained from the yaw angle, namely

R_wv2 = [ cos(yaw_gps) −sin(yaw_gps) 0 ; sin(yaw_gps) cos(yaw_gps) 0 ; 0 0 1 ];

from the mounting deviation angle between the moving vehicle and the IMU, the attitude of the IMU in the coordinate system of the moving vehicle at the reference moment, R_vb, can be obtained; multiplying the two yields the attitude of the IMU in the world coordinate system at the reference moment, i.e. the rotation matrix from the IMU coordinate system to the world coordinate system:

R_wb2 = R_wv2 · R_vb.

It will be appreciated that the positional relationship between the moving vehicle and the IMU is fixed, so that at any moment the rotation matrix R_vb between the moving vehicle and the IMU does not change. For the same reason, the attitude of the IMU in the world coordinate system at the current moment, R_wb1, can be calculated from the heading of the speed of the moving vehicle measured by the global positioning system at the current moment, i.e. yaw_gps, and the mounting deviation angle between the moving vehicle and the IMU.
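The composition above can be sketched as follows, assuming in this illustration only that both the Euler-angle conversion of yaw_gps and the mounting deviation between the moving vehicle and the IMU reduce to pure yaw rotations (the function names are illustrative):

```python
# Illustrative sketch: composing the attitude of the IMU in the world frame
# from the GPS-measured heading and the fixed mounting deviation angle.
import numpy as np

def rot_z(yaw):
    """Rotation matrix about the vertical axis (Euler yaw to rotation matrix)."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def imu_attitude_in_world(yaw_gps, mounting_deviation):
    """R_wb = R_wv(yaw_gps) * R_vb(mounting deviation); R_vb is fixed."""
    R_wv = rot_z(yaw_gps)              # vehicle attitude in world frame
    R_vb = rot_z(mounting_deviation)   # IMU attitude in vehicle frame
    return R_wv @ R_vb
```

Because the mounting deviation does not change, R_vb is computed once and the product is re-evaluated only when a new yaw_gps measurement arrives.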
Subsequently, the current reprojection error corresponding to the spatial point to be utilized can be expressed by the following formula (2):

e1 = || z_f1 − h_f ||;    (2)

wherein z_f1 represents the first position information of the feature point to be utilized, which may be represented as (x_11, y_11), and e1 represents the current reprojection error corresponding to the spatial point to be utilized.
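Formulas (1) and (2) can be sketched together as follows; NumPy and all function and argument names are assumptions of this illustration, with the rotations R and translations p passed in explicitly (subscripts b2/b1 denote the IMU at the reference/current moment):

```python
# Illustrative sketch of formulas (1) and (2): project the spatial point to
# be utilized from the reference image into the current image and form the
# current reprojection error.
import numpy as np

def reproject_to_current_image(u, v, depth, K, R_bc, p_bc,
                               R_wb2, p_wb2, R_wb1, p_wb1):
    """h_f of formula (1). (u, v): second position information in the
    reference image; depth = 1/lambda_1, the current depth information."""
    # device position of the spatial point in the reference device frame
    f_c2 = depth * np.linalg.inv(K) @ np.array([u, v, 1.0])
    # reference device frame -> reference IMU frame -> world frame
    P_w = R_wb2 @ (R_bc @ f_c2 + p_bc) + p_wb2
    # world frame -> current IMU frame -> current device frame
    P_b1 = R_wb1.T @ (P_w - p_wb1)
    P_c1 = R_bc.T @ (P_b1 - p_bc)
    # perspective projection with the internal reference matrix K
    x = K @ P_c1
    return x[:2] / x[2]

def current_reprojection_error(z_f1, h_f):
    """Formula (2): e1 = ||z_f1 - h_f||."""
    return np.linalg.norm(np.asarray(z_f1) - np.asarray(h_f))
```

When the reference pose and the current pose coincide, the projection returns the original pixel and e1 vanishes, which is the sanity check used below.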
S103: and determining an IMU data measurement error corresponding to the space point to be utilized based on the current IMU data, the first estimated pose information and the previous pose information.
Wherein, if the first estimated pose information is the pose information of the IMU corresponding to the current image, and the previous pose information is the pose information of the IMU corresponding to the previous frame image of the current image, S103 may include: determining the IMU data measurement error corresponding to the spatial point to be utilized directly based on the current IMU data, the first estimated pose information and the previous pose information. If the first estimated pose information is the pose information of the image acquisition device corresponding to the current image, and the previous pose information is the pose information of the image acquisition device corresponding to the previous frame image of the current image, S103 may include: converting the first estimated pose information into the pose information of the IMU corresponding to the current image, converting the previous pose information into the pose information of the IMU corresponding to the previous frame image, and then determining the IMU data measurement error corresponding to the spatial point to be utilized based on the current IMU data and the two converted pose information.
The following description takes as an example the case in which the first estimated pose information is the pose information of the IMU corresponding to the current image, and the previous pose information is the pose information of the IMU corresponding to the previous frame image of the current image:
based on the current IMU data, the observed change amounts of the position, velocity, and attitude angle of the IMU between the previous time and the current time, that is, the change amounts of the position, velocity, and attitude angle of the IMU between the previous time and the current time, may be determined, where the previous time is: the image acquisition equipment acquires the moment of acquiring the previous frame image of the current image; furthermore, the variation of the position, the speed and the attitude angle of the moving vehicle between the current time and the previous time can be determined, namely the variation of the observed position, the speed and the attitude angle of the moving vehicle between the current time and the previous time; through the first estimated pose information and the previous pose information, the variation of the position, the speed and the attitude angle of the IMU from the previous moment to the current moment can be determined, namely the estimated variation of the position, the speed and the attitude angle of the IMU from the previous moment to the current moment, and further the variation of the position, the speed and the attitude angle of the moving vehicle from the current moment to the previous moment can be determined, namely the estimated variation of the position, the speed and the attitude angle of the moving vehicle from the current moment to the previous moment.
From the observed changes and the estimated changes in the position, speed and attitude angle of the IMU between the previous moment and the current moment, the IMU data measurement error corresponding to the spatial point to be utilized can be determined, thereby constructing the second constraint for solving the pose information of the moving vehicle. The change in the attitude angle of the IMU is the change in the attitude of the IMU.
The observed changes in the position, speed and attitude of the IMU between the previous moment and the current moment can be obtained by pre-integrating the current IMU data, and can be represented by the following formula (3):

α_21 = ∬_{t∈(t2,t1)} R_t^{b2}·( a_t − b_at − n_a ) dt²;
β_21 = ∫_{t∈(t2,t1)} R_t^{b2}·( a_t − b_at − n_a ) dt;
γ_21 = ∫_{t∈(t2,t1)} (1/2)·Ω( ω_t − b_ωt − n_ω )·γ_t dt;    (3)

wherein α_21 represents the observed change in the position of the IMU between the previous moment and the current moment, β_21 represents the observed change in the speed of the IMU between the previous moment and the current moment, and γ_21 represents the observed change in the attitude angle of the IMU between the previous moment and the current moment; a_t represents the measured value of the acceleration sensor at time t, t2 denotes the previous moment, t1 denotes the current moment, and b_at represents the zero offset of the acceleration sensor at time t; ω_t represents the measured value of the gyroscope at time t, n_a represents the preset noise of the acceleration sensor, b_ωt represents the zero offset of the gyroscope at time t, and n_ω represents the preset noise of the gyroscope; R_t^{b2} represents the attitude change of the IMU from time t to the previous moment, t ∈ (t2, t1); and

Ω(ω) = [ −⌊ω⌋_× ω ; −ω^T 0 ],  ⌊ω⌋_× = [ 0 −ω_zt ω_yt ; ω_zt 0 −ω_xt ; −ω_yt ω_xt 0 ];

wherein ω_xt, ω_yt and ω_zt represent the components of the measured value of the gyroscope at time t on the three axes of the coordinate system in which the gyroscope is located.
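A discrete sketch of the pre-integration in formula (3) follows; in this illustration the attitude is kept as a rotation matrix rather than the quaternion γ, the noise terms n_a and n_ω are dropped as when propagating the mean, and all names are assumptions of the sketch:

```python
# Illustrative sketch: discrete pre-integration of the current IMU data
# between the previous moment t2 and the current moment t1.
import numpy as np

def skew(w):
    """Cross-product (skew-symmetric) matrix of a 3-vector."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def preintegrate(imu_samples, dt, b_a, b_g):
    """imu_samples: list of (accelerometer, gyroscope) measurements; b_a, b_g:
    zero offsets. Returns (alpha, beta, R): observed changes in position,
    speed and attitude of the IMU (attitude as a rotation matrix here)."""
    alpha, beta, R = np.zeros(3), np.zeros(3), np.eye(3)
    for a_t, w_t in imu_samples:
        a = np.asarray(a_t, dtype=float) - b_a   # bias-corrected acceleration
        w = np.asarray(w_t, dtype=float) - b_g   # bias-corrected angular rate
        alpha = alpha + beta * dt + 0.5 * (R @ a) * dt ** 2
        beta = beta + (R @ a) * dt
        # first-order attitude update, the matrix counterpart of d(gamma)/dt
        R = R @ (np.eye(3) + skew(w) * dt)
    return alpha, beta, R
```

For constant acceleration a over a window of length T with a stationary attitude, the sketch reproduces the expected β = a·T and α = a·T²/2.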
The estimated variation of the position, the speed and the attitude angle of the IMU between the previous moment and the current moment can be directly calculated through the first estimated pose information, the previous pose information and the time difference between the current moment and the previous moment. In one case, the estimated variation of the position, the speed and the attitude angle of the IMU between the previous time and the current time can be calculated by the following equation (4):
h_imu21 = [ α̂_21; β̂_21; γ̂_21 ]; (4)

wherein:

α̂_21 = (q_wb2)⁻¹·(P1 − P2 − V2·Δt_21 + (1/2)·g·Δt_21²)
β̂_21 = (q_wb2)⁻¹·(V1 − V2 + g·Δt_21)
γ̂_21 = (q_wb2)⁻¹ ⊗ q_wb1

wherein h_imu21 represents the predicted change in the position, speed, and attitude angle of the IMU between the previous time and the current time; α̂_21 represents the predicted change in the position of the IMU between the previous time and the current time; β̂_21 represents the predicted change in the velocity of the IMU between the previous time and the current time; γ̂_21 represents the predicted change in the attitude angle of the IMU between the previous time and the current time; (q_wb2)⁻¹ represents the inverse of the attitude of the IMU corresponding to the previous time in the world coordinate system; q_wb1 represents the attitude of the IMU corresponding to the current time in the world coordinate system, and is the inverse of (q_wb1)⁻¹; P1 represents the position of the IMU corresponding to the current time, whose iteration initial value is the initial value of the position information in the first estimated pose information; P2 represents the position information, in the reference coordinate system, of the IMU corresponding to the previous time in the previous pose information; g represents the gravitational acceleration; Δt_21 represents the time difference between the previous time and the current time; V2 represents the velocity of the IMU at the previous time; V1 represents the velocity of the IMU at the current time, wherein V1 may be V2, or may be V2 plus the velocity change obtained by integrating the current IMU data.
Subsequently, the measurement error of the IMU data corresponding to the spatial point to be utilized may be determined, and may be represented by the following formula (5):

e2 = ||z_imu21 − h_imu21||; (5)

wherein z_imu21 represents the observed change in the position, speed, and attitude of the moving vehicle between the previous time and the current time, i.e., the α_21, β_21, and γ_21 represented in formula (3), and e2 represents the IMU data measurement error corresponding to the spatial point to be utilized.
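Formulas (4) and (5) can be sketched as follows in Python. This is an illustration under assumptions: rotation matrices stand in for the patent's quaternions, the gravity-vector convention follows the common visual-inertial formulation, and the attitude component of the residual is omitted for brevity.

```python
import numpy as np

GRAVITY = np.array([0.0, 0.0, 9.81])  # assumed gravity vector in the world frame

def predict_imu_change(R_wb2, P2, V2, R_wb1, P1, V1, dt):
    """Predicted change h_imu21 of formula (4), with rotation matrices
    in place of quaternions.

    R_wb2, P2, V2: attitude/position/velocity of the IMU at the previous time
    R_wb1, P1, V1: the same quantities at the current time
    """
    R_b2w = R_wb2.T                     # inverse of the previous attitude
    alpha_hat = R_b2w @ (P1 - P2 - V2 * dt + 0.5 * GRAVITY * dt * dt)
    beta_hat = R_b2w @ (V1 - V2 + GRAVITY * dt)
    gamma_hat = R_b2w @ R_wb1           # relative attitude change
    return alpha_hat, beta_hat, gamma_hat

def imu_residual(z_alpha, z_beta, h_alpha, h_beta):
    """Measurement error e2 = ||z_imu21 - h_imu21|| of formula (5),
    shown here for the position and velocity components only."""
    return np.linalg.norm(np.concatenate([z_alpha - h_alpha,
                                          z_beta - h_beta]))
```

For a stationary IMU over one second, the predicted position and velocity changes reduce to the pure gravity terms ½g·Δt² and g·Δt, and a residual built against identical observed values is zero.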
As shown in fig. 3, the data acquisition frequency of the IMU is greater than the image acquisition frequency of the image acquisition device, the current IMU data corresponding to the current image includes IMU data acquired by the IMU from the previous time to the current time, and after the current IMU data is integrated, the variation of the position, the speed, and the posture of the IMU from the previous time to the current time can be obtained.
S104: and determining the current pose information of the mobile vehicle in the world coordinate system based on the current reprojection error, the IMU data measurement error and first observation pose information and first observation speed information of the IMU in the world coordinate system, which are obtained by a global positioning system.
In this step, based on the current reprojection error and the IMU data measurement error, the electronic device may iterate the estimated pose information and speed information of the IMU, that is, the first estimated pose information and first estimated speed information mentioned subsequently, and, in combination with the first observation pose information and first observation speed information of the IMU in the world coordinate system obtained by the global positioning system at the current time, determine the measurement error of the position information and speed information of the IMU in the world coordinate system, that is, the pose-speed error mentioned subsequently, so as to construct a third constraint for solving the pose information of the moving vehicle.
The measurement error of the position information and speed information of the IMU in the world coordinate system can be expressed by the following formula (6):

e3 = ||z_gps1 − h_gps1||; (6)

wherein e3 represents the measurement error of the position information and speed information of the IMU in the world coordinate system, z_gps1 represents the first observation pose information and first observation speed information of the IMU in the world coordinate system obtained by the global positioning system, and h_gps1 represents the estimated pose information and speed information of the IMU in the world coordinate system.
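A minimal sketch of the third constraint of formula (6), under the assumption that pose information is reduced to a 3-D position (the patent's pose information also carries attitude; the function name is hypothetical):

```python
import numpy as np

def gps_residual(z_pose, z_vel, h_pose, h_vel):
    """e3 = ||z_gps1 - h_gps1||: gap between the GPS-observed and the
    estimated pose/speed of the IMU in the world coordinate system.
    Pose is reduced to a 3-D position here for simplicity."""
    z_gps1 = np.concatenate([z_pose, z_vel])  # observed pose + speed
    h_gps1 = np.concatenate([h_pose, h_vel])  # estimated pose + speed
    return float(np.linalg.norm(z_gps1 - h_gps1))
```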
Based on the three constraints, more accurate pose information and speed information of the IMU in the world coordinate system can be determined iteratively; further, based on the rotation matrix between the IMU and the moving vehicle and this more accurate pose information and speed information of the IMU in the world coordinate system, the current pose information and speed information of the moving vehicle in the world coordinate system are determined, realizing positioning of the moving vehicle in the world coordinate system.
As shown in fig. 3, the global positioning system may obtain observation pose information and observation speed information of the IMU in the world coordinate system between two frames of images acquired by the image acquisition equipment, and this information may be used as the observation pose information and observation speed information corresponding to the later of the two frames. That is, the observation pose information and observation speed information of the IMU in the world coordinate system obtained by the global positioning system between the time of acquiring the frame preceding the current image and the time of acquiring the current image are used as the observation pose information and observation speed information corresponding to the current image, i.e., corresponding to the current time.
In one case, one or more feature points to be utilized may be detected from the current image; if a plurality of feature points to be utilized can be detected from the current image, determining a current reprojection error and an IMU data measurement error corresponding to a space point to be utilized corresponding to each feature point to be utilized; further, determining pose information and speed information of the IMU in a reference coordinate system based on a current reprojection error and an IMU data measurement error corresponding to the to-be-utilized space point corresponding to each to-be-utilized characteristic point; and further, determining the current pose information of the moving vehicle in the world coordinate system based on the pose information and the speed information of the IMU in the reference coordinate system and the first observation pose information and the first observation speed information of the IMU in the world coordinate system, which are obtained through a global positioning system.
By applying the embodiment of the invention, the current reprojection error corresponding to the spatial point to be utilized is first determined based on the position information of the spatial point to be utilized, corresponding to the feature point to be utilized, in the current image and in the corresponding reference image, the reference pose information of the reference image, and the estimated first estimated pose information corresponding to the current image; the IMU data measurement error corresponding to the spatial point to be utilized is determined based on the current IMU data, the first estimated pose information, and the previous pose information; and then, in combination with the first observation pose information and first observation speed information of the IMU in the world coordinate system obtained by the global positioning system, the current pose information of the IMU in the world coordinate system is further optimized, and in turn the current pose information of the moving vehicle in the world coordinate system. In this way, the current pose information of the moving vehicle in the world coordinate system is determined jointly from the pose change of the moving vehicle relative to its previous pose determined from the positions of the spatial points to be utilized in the images collected by the image acquisition equipment, the pose change and speed change of the moving vehicle relative to its previous pose determined from the IMU data acquired by the IMU, and the current position and speed of the moving vehicle in the world coordinate system measured by the global positioning system, which improves the accuracy of the positioning result of the moving vehicle in the world coordinate system.
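The joint use of the three constraints can be illustrated with a deliberately simplified one-dimensional toy problem. This is an illustration only: the state is two scalar positions, the numbers are made up, and because the toy is linear a single least-squares solve replaces the iterative optimization the patent describes for the real, nonlinear problem.

```python
import numpy as np

def fuse_constraints(z_vis, z_imu, z_gps):
    """Jointly solve for x = (p_prev, p_cur) minimizing the squared sum of
    e1 = (p_cur - p_prev) - z_vis   (visual relative-motion constraint)
    e2 = (p_cur - p_prev) - z_imu   (IMU pre-integration constraint)
    e3 =  p_cur - z_gps             (GPS absolute-position constraint)."""
    J = np.array([[-1.0, 1.0],    # d(e1)/dx
                  [-1.0, 1.0],    # d(e2)/dx
                  [ 0.0, 1.0]])   # d(e3)/dx
    z = np.array([z_vis, z_imu, z_gps])
    x, *_ = np.linalg.lstsq(J, z, rcond=None)
    return x

p_prev, p_cur = fuse_constraints(1.0, 1.1, 2.0)
```

With these numbers the GPS constraint pins the current position at 2.0, while the visual and IMU constraints split the difference on the relative motion (p_cur − p_prev = 1.05): the absolute measurement anchors the solution in the world frame and the relative measurements shape the trajectory, mirroring the role of the three constraints above.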
In another embodiment of the present invention, in order to determine more accurate pose information of the moving vehicle in the world coordinate system, more accurate pose information of the IMU in the world coordinate system needs to be determined, and the estimated pose information of the IMU in the world coordinate system need not be determined based only on the related information of the current image and the related information of the reference image. Specifically, the estimated pose information of the IMU in the world coordinate system may be determined by combining the related information of the current image, the related information of the reference image, and the related information of the images between the reference image and the current image.
As shown in fig. 2, the method may include the steps of:
s201: and obtaining first position information of the feature points to be utilized detected from the current image acquired by the image acquisition equipment and current IMU data corresponding to the current image acquired by the IMU.
S202: and obtaining third position information of an imaging point of the space point to be utilized in each historical image corresponding to the feature point to be utilized and historical pose information corresponding to each historical image.
Wherein the history image comprises images between the current image and the reference image.
S203: and determining a current re-projection error corresponding to the space point to be utilized based on the first position information, the current depth information, the second position information, the third position information, the reference pose information corresponding to the reference image, the first estimated pose information and the historical pose information.
Wherein, the initial value of the first estimated pose information is: and estimating the pose information corresponding to the current image based on the previous pose information corresponding to the previous frame of image of the current image.
S204: and determining an IMU data measurement error corresponding to the space point to be utilized based on the current IMU data, the first estimated pose information and the previous pose information.
S205: and determining the current pose information of the mobile vehicle in the world coordinate system based on the current reprojection error, the IMU data measurement error and first observation pose information and first observation speed information under the world coordinate system of the IMU, which are obtained through a global positioning system.
The S201 is the same as S101 shown in fig. 1, the S204 is the same as S103 shown in fig. 1, and the S205 is the same as S104 shown in fig. 1, which are not repeated herein.
In this embodiment, the description takes as an example the case where the historical pose information of each historical image in the reference coordinate system is the pose information of the IMU corresponding to that historical image:
after the electronic device obtains the historical pose information of the IMU corresponding to each historical image, it needs to determine, based on the rotation relation and translation relation between the IMU and the image acquisition equipment and the reference pose information, the pose information of the image acquisition equipment corresponding to the reference pose information as the first pose information; and determine, based on the rotation relation and translation relation between the IMU and the image acquisition equipment and each piece of historical pose information, the pose information of the image acquisition equipment corresponding to each piece of historical pose information as the third pose information corresponding to each historical image. Then, the current reprojection error corresponding to the spatial point to be utilized is determined based on the first position information, the current depth information, the second position information, the third position information, the first pose information, and the third pose information.
Accordingly, in another embodiment of the present invention, the S203 may include:
determining, based on the current depth information and the second position information, spatial position information of the space point to be utilized corresponding to the feature point to be utilized in the equipment coordinate system corresponding to the reference image, wherein the equipment coordinate system corresponding to the reference image is the equipment coordinate system in which the image acquisition equipment is located at the pose of acquiring the reference image.
And determining first projection position information of a projection point of the space point to be utilized in the current image based on the space position information, the reference pose information and the first estimated pose information.
And for each historical image, determining second projection position information of the projection point of the space point to be utilized in the historical image based on the spatial position information, the reference pose information and the historical pose information corresponding to the historical image.
And determining the current re-projection error corresponding to the space point to be utilized based on the first projection position information, the first position information, the second projection position information and the third position information.
In this embodiment, the description takes as an example the case where the historical pose information corresponding to each historical image is the pose information of the IMU corresponding to that historical image:
the pose information of the image acquisition device when acquiring the reference image, namely the first pose information mentioned above, can be determined based on the rotational relationship and the translational relationship between the IMU and the image acquisition device and the reference pose information. And then, determining pose information of the image acquisition equipment when acquiring the current image, namely the second pose information, based on the rotation relation and the translation relation between the IMU and the image acquisition equipment and the first estimated pose information.
Based on the current depth information and the second position information, the spatial position information of the space point to be utilized corresponding to the feature point to be utilized in the equipment coordinate system corresponding to the reference image is determined, wherein the equipment coordinate system corresponding to the reference image is the equipment coordinate system in which the image acquisition equipment is located at the pose of acquiring the reference image.
Current equipment position information of the space point to be utilized in the equipment coordinate system corresponding to the current image is determined based on the first pose information, the second pose information, and the spatial position information, wherein the equipment coordinate system corresponding to the current image is the equipment coordinate system in which the image acquisition equipment is located at the pose of acquiring the current image. The space point to be utilized is then projected to the current image based on the current equipment position information and a preset projection matrix of the image acquisition equipment, and first projection position information of the projection point of the space point to be utilized in the current image is determined. The rotation relation and the translation relation between the IMU and the image acquisition equipment can be determined through the extrinsic parameter Tbc calibrated in advance.
Pose information of the image acquisition equipment when acquiring each historical image, namely the third pose information corresponding to each historical image, is determined based on the rotation relation and the translation relation between the IMU and the image acquisition equipment and each piece of historical pose information. Then, for each historical image, historical equipment position information of the space point to be utilized in the equipment coordinate system corresponding to that historical image is determined based on the spatial position information, the first pose information, and the third pose information corresponding to the historical image; the space point to be utilized is then projected to the historical image based on the historical equipment position information and the preset projection matrix of the image acquisition equipment, and second projection position information of the projection point of the space point to be utilized in the historical image is determined.
And then, determining the current re-projection error corresponding to the space point to be utilized based on the first projection position information, the first position information, the second projection position information corresponding to each historical image and the third position information.
Wherein the initial value of the first estimated pose information may be: the pose information of the IMU at the time when the image acquisition equipment acquired the frame preceding the current image, taken directly; or pose information estimated based on the IMU pose information at the time when the image acquisition equipment acquired the frame preceding the current image together with the current IMU data. Either is possible.
In one case, the process of determining the projection position information of the projection points of the spatial points to be utilized corresponding to the feature points to be utilized in the historical image and the current image can be expressed by the following formula (7):
h_fij = (x_ij, y_ij), obtained by perspective division from

(x_ij′, y_ij′, z_ij′)ᵀ = K·( R_cb·( (q_wbj)⁻¹·( q_wbi·( R_bc·f_ci + p_bc ) + P_i − P_j ) ) + p_cb ),
x_ij = x_ij′/z_ij′,  y_ij = y_ij′/z_ij′; (7)

wherein h_fij, (x_ij, y_ij) represents the projection position information of the projection point, in the j-th frame image, of the space point to be utilized corresponding to the feature point to be utilized, when that space point is expressed in the equipment coordinate system corresponding to the reference image; j may take values in [i+1, k], k is an integer greater than i, the k-th frame image represents the current image, and the j-th frame images comprise the current image and the historical images. The equipment position information f_ci of the space point to be utilized corresponding to the feature point to be utilized in the equipment coordinate system corresponding to the reference image may be expressed as:

f_ci = (1/λ_current)·K⁻¹·(u_i, v_i, 1)ᵀ

wherein (u_i, v_i) represents the position information corresponding to the imaging point of the space point to be utilized in the reference image, i.e. the second position information, and 1/λ_current represents the current depth information. In formula (7), R_bc represents the attitude of the image acquisition equipment corresponding to the reference image in the IMU coordinate system, i.e. the rotation relation between the equipment coordinate system corresponding to the reference image and the IMU coordinate system; p_bc represents the translation matrix that translates the position of the image acquisition equipment corresponding to the reference image to the position of the IMU, i.e. the translation relation between the equipment coordinate system corresponding to the reference image and the IMU coordinate system, and the rotation matrix and the translation matrix are obtained through the extrinsic parameter Tbc. q_wbi represents the attitude of the IMU corresponding to the reference image in the world coordinate system, i.e. the rotation relation between the IMU coordinate system corresponding to the reference image and the world coordinate system, wherein the IMU coordinate system corresponding to the reference image is the coordinate system of the pose of the IMU when the image acquisition equipment acquires the reference image. (q_wbj)⁻¹ represents the inverse of the attitude of the IMU corresponding to the j-th frame image in the world coordinate system, i.e. the rotation relation from the world coordinate system to the IMU coordinate system corresponding to the j-th frame image, wherein the IMU coordinate system corresponding to the j-th frame image is the coordinate system of the pose of the IMU when the image acquisition equipment acquires the j-th frame image. R_cb represents the attitude of the IMU corresponding to the j-th frame image in the equipment coordinate system, i.e. the rotation relation between the IMU coordinate system corresponding to the j-th frame image and the equipment coordinate system; p_cb represents the translation matrix that translates the position of the IMU corresponding to the j-th frame image to the position of the image acquisition equipment, i.e. the translation relation to the equipment coordinate system corresponding to the j-th frame image. P_i and P_j represent the positions of the IMU in the world coordinate system corresponding to the reference image and to the j-th frame image, respectively. K represents the intrinsic parameter matrix of the image acquisition equipment, which can be expressed as:

K = [ fx   0   cx
       0  fy   cy
       0   0    1 ]

wherein fx represents the focal length in the horizontal-axis direction of the image coordinate system of the image acquisition equipment, fy represents the focal length in the vertical-axis direction of the image coordinate system, and (cx, cy) represents the position information of the image principal point in the image coordinate system, wherein the image principal point is the intersection point of the optical axis of the image acquisition equipment, perpendicular to the image plane, with the image plane. Since the positional relationship between the image acquisition equipment and the IMU is fixed, R_bc and R_cb are inverses of each other, and both can be obtained through the extrinsic parameter Tbc.
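The pose chain of formula (7) can be sketched in Python as follows. This is an illustration under assumptions: rotation matrices replace the quaternions, the extrinsics are taken to satisfy R_cb = R_bcᵀ and p_cb = −R_bcᵀ·p_bc, and the function name is hypothetical.

```python
import numpy as np

def project_to_frame_j(u_i, v_i, lam, K, R_bc, p_bc,
                       R_wbi, p_wbi, R_wbj, p_wbj):
    """Project the space point behind pixel (u_i, v_i) of the reference
    image into frame j, following the pose chain of formula (7).

    lam:        inverse depth lambda; the current depth information is 1/lam
    K:          intrinsic parameter matrix of the image acquisition equipment
    R_bc, p_bc: camera-to-IMU extrinsic rotation/translation (from Tbc)
    R_wbi, p_wbi: IMU pose in the world frame for the reference image i
    R_wbj, p_wbj: IMU pose in the world frame for frame j
    """
    # 1. Back-project the pixel into the equipment frame of the reference image.
    f_ci = (1.0 / lam) * (np.linalg.inv(K) @ np.array([u_i, v_i, 1.0]))
    # 2. Chain: camera_i -> IMU_i -> world -> IMU_j -> camera_j.
    f_bi = R_bc @ f_ci + p_bc
    f_w = R_wbi @ f_bi + p_wbi
    f_bj = R_wbj.T @ (f_w - p_wbj)
    f_cj = R_bc.T @ (f_bj - p_bc)   # R_cb = R_bc^T, p_cb = -R_bc^T p_bc
    # 3. Perspective projection with the intrinsics.
    x = K[0, 0] * f_cj[0] / f_cj[2] + K[0, 2]
    y = K[1, 1] * f_cj[1] / f_cj[2] + K[1, 2]
    return x, y
```

As a sanity check, with all poses at identity and zero extrinsics (frame j coinciding with the reference frame), the point projects back to its original pixel.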
Subsequently, the current reprojection error corresponding to the spatial point to be utilized can be expressed by the following formula (8):
e4 = Σ_{j=i+1}^{k} ||z_fij − h_fij||; (8)

wherein z_fij represents the position information of the imaging point, in the j-th frame image, corresponding to the space point to be utilized corresponding to the feature point: when j takes k, z_fij is the first position information; when j takes values in [i+1, k−1], z_fij is the third position information corresponding to the j-th frame image; and e4 represents the current reprojection error corresponding to the space point to be utilized.
In another embodiment of the present invention, the S204 may include:
obtaining historical IMU data corresponding to each historical image, wherein the historical IMU data corresponding to each historical image comprises: the IMU data acquired by the IMU after the time at which the image acquisition equipment acquired the frame preceding the historical image and up to the time at which it acquired the historical image;
and determining an IMU data measurement error corresponding to the space point to be utilized based on the current IMU data corresponding to the current image, the first estimated pose information, the previous pose information, the reference IMU data corresponding to the reference image, the reference pose information, the pose information corresponding to the previous frame of image of the reference image, the historical IMU data corresponding to each historical image, the historical pose information and the historical pose information corresponding to the previous frame of image of the historical image.
The determining of the IMU data measurement error corresponding to the spatial point to be utilized based on the current IMU data corresponding to the current image, the first estimated pose information, the previous pose information, the reference IMU data corresponding to the reference image, the reference pose information, the pose information corresponding to the previous frame image of the reference image, the historical IMU data corresponding to each historical image, the historical pose information, and the historical pose information corresponding to the previous frame image of the historical image may specifically include:
determining a first IMU data measurement error corresponding to the current image based on the current IMU data corresponding to the current image, the first estimated pose information and the previous pose information;
determining a second IMU data measurement error corresponding to the reference image based on the reference IMU data corresponding to the reference image, the reference pose information and the pose information corresponding to the previous frame image of the reference image;
for each historical image, determining a third IMU data measurement error corresponding to the historical image based on the historical IMU data corresponding to the historical image, the historical pose information corresponding to the historical image and the historical pose information corresponding to the previous frame of image of the historical image;
and determining the IMU data measurement error corresponding to the space point to be utilized based on the first IMU data measurement error, the second IMU data measurement error and the third IMU data measurement error corresponding to each historical image.
In this embodiment, in order to improve the accuracy of the positioning result of the moving vehicle, in the process of determining the estimated pose information and speed information of the IMU, the IMU data measurement error corresponding to the spatial point to be utilized may be constructed by combining the IMU data corresponding to each target image, the pose information of the IMU corresponding to that target image, and the pose information of the IMU corresponding to the frame preceding that target image, wherein the target images comprise: the current image, the reference image, and the historical images. The current reprojection error corresponding to the spatial point to be utilized is constructed by combining the current depth information and second position information corresponding to the imaging point, in the reference image, of the space point to be utilized corresponding to the feature point to be utilized; the first position information of the feature point to be utilized; the third position information of the imaging points, in the historical images, of the space point to be utilized; and the pose information of the image acquisition equipment when acquiring the reference image, the historical images, and the current image. Further, the estimated pose information and speed information of the IMU are determined jointly from the constructed IMU data measurement error and current reprojection error corresponding to the spatial point to be utilized, so that the determined estimated pose information and speed information of the IMU have higher accuracy. Then, based on this more accurate estimated pose information and speed information of the IMU and the first observation pose information and first observation speed information of the IMU in the world coordinate system obtained by the global positioning system, more accurate current pose information and speed information of the IMU in the world coordinate system are determined, and based on these, more accurate current pose information and speed information of the moving vehicle in the world coordinate system are determined.
In this embodiment, the electronic device determines the predicted change in the position, speed, and attitude angle of the IMU between the times at which the image acquisition equipment acquires every two adjacent frames of images, which may specifically be represented by the following formula (9):

h_imu(j−1)j = [ α̂_(j−1)j; β̂_(j−1)j; γ̂_(j−1)j ]; (9)

wherein:

α̂_(j−1)j = (q_wb(j−1))⁻¹·(P_j − P_(j−1) − V_(j−1)·Δt_(j−1)j + (1/2)·g·Δt_(j−1)j²)
β̂_(j−1)j = (q_wb(j−1))⁻¹·(V_j − V_(j−1) + g·Δt_(j−1)j)
γ̂_(j−1)j = (q_wb(j−1))⁻¹ ⊗ q_wbj

wherein h_imu(j−1)j represents the predicted change in the position, speed, and attitude angle of the IMU between the time at which the image acquisition equipment acquires the (j−1)-th frame image and the time at which it acquires the j-th frame image, wherein the j-th frame images comprise the historical images and the current image, j may take values in [i+1, k], i is any integer, and k is an integer greater than i; α̂_(j−1)j represents the predicted change in the position of the IMU between those two times; β̂_(j−1)j represents the predicted change in the speed of the IMU between those two times; γ̂_(j−1)j represents the predicted change in the attitude angle of the IMU between those two times; (q_wb(j−1))⁻¹ represents the inverse of the attitude of the IMU in the world coordinate system corresponding to the time at which the image acquisition equipment acquires the (j−1)-th frame image, i.e. the rotation matrix from the world coordinate system corresponding to that time to the IMU coordinate system; q_wbj represents the attitude of the IMU in the world coordinate system corresponding to the time at which the image acquisition equipment acquires the j-th frame image, i.e. the rotation matrix from the IMU coordinate system corresponding to that time to the world coordinate system; P_j represents the position of the IMU corresponding to the time at which the image acquisition equipment acquires the j-th frame image, and when the j-th frame image is the current image, its iteration initial value is the initial value of the position information in the first estimated pose information; P_(j−1) represents the position information, in the reference coordinate system, of the IMU corresponding to the (j−1)-th frame image; g represents the gravitational acceleration; Δt_(j−1)j represents the time difference between the time at which the image acquisition equipment acquires the (j−1)-th frame image and the time at which it acquires the j-th frame image; V_(j−1) represents the velocity of the IMU at the time at which the image acquisition equipment acquires the (j−1)-th frame image; and V_j represents the velocity of the IMU at the time at which the image acquisition equipment acquires the j-th frame image.
Moreover, based on the current IMU data corresponding to the current image, the historical IMU data corresponding to each historical image, and the reference IMU data corresponding to the reference image, the electronic device may determine the observed variation of the position, velocity and attitude of the IMU between the moments at which each two adjacent frames of images are acquired by the image acquisition device. These observed variations can be obtained by pre-integration of the current IMU data, the historical IMU data and the reference IMU data, and may be represented by the following formula (10):

$$\alpha_{j-1,j}=\iint_{t\in[t_{j-1},\,t_j]} R_t^{b_{j-1}}\left(a_t-b_{a_t}-n_a\right)\,dt^2$$
$$\beta_{j-1,j}=\int_{t_{j-1}}^{t_j} R_t^{b_{j-1}}\left(a_t-b_{a_t}-n_a\right)\,dt \qquad (10)$$
$$\dot{\gamma}_t^{b_{j-1}}=\tfrac{1}{2}\,\Omega(\omega)\,\gamma_t^{b_{j-1}}$$

wherein $\alpha_{j-1,j}$ represents the observed variation of the IMU position between the moment at which the image acquisition device acquires the j-1 th frame image and the moment at which it acquires the j th frame image; $\beta_{j-1,j}$ represents the observed variation of the IMU velocity between those two moments; $\gamma_{j-1,j}$ represents the observed variation of the attitude angle between those two moments; $a_t$ represents the measured value of the acceleration sensor at time t; $t_{j-1}$ represents the moment at which the image acquisition device acquires the j-1 th frame image, and $t_j$ the moment at which it acquires the j th frame image; $b_{a_t}$ represents the zero offset of the acceleration sensor at time t; $\omega_t$ represents the measured value of the gyroscope at time t; $n_a$ represents the preset noise of the acceleration sensor; $b_{\omega_t}$ represents the zero offset of the gyroscope at time t; $n_\omega$ represents the preset noise of the gyroscope; $\gamma_t^{b_{j-1}}$ (with rotation-matrix form $R_t^{b_{j-1}}$) represents the attitude change of the IMU between time t and the moment at which the image acquisition device acquires the j-1 th frame image, with $t\in(t_{j-1},t_j)$; and

$$\Omega(\omega)=\begin{bmatrix}0&-\omega_{xt}&-\omega_{yt}&-\omega_{zt}\\ \omega_{xt}&0&\omega_{zt}&-\omega_{yt}\\ \omega_{yt}&-\omega_{zt}&0&\omega_{xt}\\ \omega_{zt}&\omega_{yt}&-\omega_{xt}&0\end{bmatrix},\qquad \omega=\omega_t-b_{\omega_t}-n_\omega$$

wherein $\omega_{xt}$ represents the component of the gyroscope measurement at time t on the horizontal axis of the coordinate system in which the gyroscope is located, $\omega_{yt}$ the component on the longitudinal axis, and $\omega_{zt}$ the component on the vertical axis.
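A minimal, hedged sketch of this pre-integration, assuming a simple Euler discretization and a rotation-matrix attitude in place of the quaternion form; the function names are illustrative and the noise terms $n_a$, $n_\omega$ are omitted, as in a mean propagation:

```python
import numpy as np

def skew(w):
    """Skew-symmetric matrix of a 3-vector."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def expm_so3(phi):
    """Rodrigues' formula: rotation matrix for a rotation vector phi."""
    theta = np.linalg.norm(phi)
    if theta < 1e-12:
        return np.eye(3) + skew(phi)
    K = skew(phi / theta)
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def preintegrate(acc, gyro, dt, b_a, b_w):
    """Euler-integrate the position increment alpha, velocity increment beta,
    and attitude change R between two image timestamps from bias-corrected
    accelerometer/gyroscope samples."""
    alpha = np.zeros(3)
    beta = np.zeros(3)
    R = np.eye(3)  # attitude change accumulated since frame j-1
    for a_t, w_t in zip(acc, gyro):
        a = np.asarray(a_t, float) - b_a
        w = np.asarray(w_t, float) - b_w
        alpha = alpha + beta * dt + 0.5 * (R @ a) * dt**2
        beta = beta + (R @ a) * dt
        R = R @ expm_so3(w * dt)
    return alpha, beta, R
```

For a constant 1 m/s² acceleration over 1 s with no rotation, the sketch recovers the expected increments: alpha = 0.5 m (position), beta = 1 m/s (velocity), and an identity attitude change.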
Subsequently, the measurement error of the IMU data corresponding to the spatial points to be utilized may be determined, and may be represented by the following formula (11):

$$e_5=\sum_j\big\|\,z_{imu_{j-1,j}}-h_{imu_{j-1,j}}\big\| \qquad (11)$$

wherein $z_{imu_{j-1,j}}$ represents the observed variation of the position, velocity and attitude of the IMU between the moment at which the image acquisition device acquires the j-1 th frame image and the moment at which it acquires the j th frame image, namely the $\alpha_{j-1,j}$, $\beta_{j-1,j}$ and $\gamma_{j-1,j}$ shown in formula (10); $h_{imu_{j-1,j}}$ represents the corresponding variation estimated from the state variables; and $e_5$ indicates the IMU data measurement error corresponding to the spatial points to be utilized.
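A hedged sketch of how such a residual norm could be accumulated over adjacent frame pairs (the function name is ours; each z/h entry stands for a stacked observed/estimated increment vector):

```python
import numpy as np

def imu_measurement_error(z_list, h_list):
    """Sum, over adjacent frame pairs, of the norm of the difference between
    the observed pre-integrated increments z and the increments h predicted
    from the estimated states."""
    return sum(np.linalg.norm(np.asarray(z, float) - np.asarray(h, float))
               for z, h in zip(z_list, h_list))
```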
In an implementation manner, when the positioning process of the mobile vehicle provided by the embodiment of the present invention is started, a reference coordinate system may be constructed in advance based on the pose of the image acquisition device when the first frame image is acquired. The direction of the vertical axis of the reference coordinate system may be parallel to the direction of the optical axis when the first frame image is acquired by the image acquisition device; the directions of the horizontal axis and the longitudinal axis of the reference coordinate system are not particularly limited. In one case, the direction of the horizontal axis of the reference coordinate system may be parallel to the direction of the horizontal axis of the image coordinate system when the first frame image is acquired by the image acquisition device, and the direction of the longitudinal axis of the reference coordinate system may be parallel to the direction of the longitudinal axis of that image coordinate system.
After the reference coordinate system is constructed, the initial position information, initial attitude information and speed information of the IMU in the reference coordinate system can be determined through the extrinsic parameter Tbc between the image acquisition device and the IMU, wherein the initial position information and initial attitude information constitute the initial pose information of the IMU.
After the positioning process of the mobile vehicle provided by the embodiment of the invention is started, and before the observation pose information and observation speed information of the IMU in the world coordinate system obtained by the global positioning system become available, an initial reprojection error can be constructed, in the manner described above, from the position information and depth information corresponding to the imaging points of the spatial points to be utilized in the images acquired by the image acquisition device, and the pose information of the image acquisition device in the reference coordinate system for each acquired image. An initial IMU data measurement error can likewise be constructed, in the manner described above, from the IMU data acquired by the IMU corresponding to the images acquired by the image acquisition device, and the pose information of the IMU in the reference coordinate system during those acquisitions. Then, based on the least squares principle, an initial objective function is constructed from the initial reprojection error and the initial IMU data measurement error, and the value of the independent variable in the initial objective function is continuously iteratively adjusted so that the function value of the initial objective function is minimized or a preset condition is satisfied. When the function value of the initial objective function is minimized or the preset condition is satisfied, the final value of the independent variable in the initial objective function is obtained, and based on this final value the pose information and speed information of the IMU in the reference coordinate system are determined. The initial objective function can be expressed as:
$Q_0=\arg\min\{e_{11}+e_{12}\}$; wherein $e_{11}$ represents the constructed initial reprojection error, $e_{12}$ represents the constructed initial IMU data measurement error, and $Q_0$ represents the function value of the initial objective function.
Wherein, the argument in the initial objective function can be expressed as:

$$X_0=\big[P_m,\;R_m,\;V_m,\;\lambda_1,\lambda_2,\dots,\lambda_n,\;b_{a_m},\;b_{\omega_m}\big]$$

$P_m$ indicates the position information of the IMU at time m in the reference coordinate system; $R_m$ represents the attitude information of the IMU at time m in the reference coordinate system; $P_m$ and $R_m$ form the pose information of the IMU at time m in the reference coordinate system; $V_m$ indicates the velocity information of the IMU at time m in the reference coordinate system; $\lambda_1,\lambda_2,\dots,\lambda_n$ represent the latest depth information, at time m, of each spatial point to be utilized in the device coordinate system corresponding to the reference image in which it is first observed, n being a positive integer; $b_{a_m}$ represents the zero offset of the acceleration sensor at time m; and $b_{\omega_m}$ represents the zero offset of the gyroscope at time m.
Subsequently, after the observation pose information and observation speed information of the IMU in the world coordinate system obtained by the global positioning system become available, the conversion relationship between the reference coordinate system and the world coordinate system can be determined directly based on the pose information of the IMU in the reference coordinate system iterated at time m and the observation pose information of the IMU in the world coordinate system obtained by the global positioning system at time m.
Specifically, it can be expressed by the formula:

$$R_r^w=R_m^w\,(R_m^r)^{-1},\qquad t_r^w=P_m^w-R_r^w\,P_m^r$$

wherein $P_m^r$ is the position information and $R_m^r$ the attitude information of the IMU iterated at time m in the reference coordinate system; $P_m^r$ and $R_m^r$ together are the pose information of the IMU iterated at time m in the reference coordinate system; $P_m^w$ and $R_m^w$ are the observation pose information of the IMU in the world coordinate system obtained by the global positioning system at time m; and $R_r^w$ and $t_r^w$ constitute the conversion relationship from the reference coordinate system to the world coordinate system.
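A hedged sketch of computing that rotation and translation between the two frames from a single matched pose pair (the function name is ours; the rotation is assumed orthonormal, so its inverse is its transpose):

```python
import numpy as np

def reference_to_world(R_r, p_r, R_w, p_w):
    """Solve for R_rw, t_rw such that
    R_w = R_rw @ R_r  and  p_w = R_rw @ p_r + t_rw,
    i.e. the transform taking reference-frame poses into the world frame."""
    R_rw = R_w @ R_r.T      # inverse of an orthonormal rotation is its transpose
    t_rw = p_w - R_rw @ p_r
    return R_rw, t_rw
```

The two defining equations can be verified directly by substituting the returned pair back in.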
In one case, the conversion relationship may include a rotation and translation relationship between the reference coordinate system and the world coordinate system, and may be determined by the pose information of the IMU iterated at time m in the reference coordinate system and the observation pose information of the IMU in the world coordinate system obtained by the global positioning system at time m. The initial objective function may be solved by methods such as Gauss-Newton or Levenberg-Marquardt.
After the conversion relationship between the reference coordinate system and the world coordinate system is determined, the obtained pose information and speed information of the IMU in the reference coordinate system are converted into the world coordinate system through this conversion relationship to obtain the estimated pose information and estimated speed information of the IMU; a pose speed error is then constructed based on the estimated pose information and estimated speed information of the IMU to obtain more accurate pose information of the IMU in the world coordinate system. That is, the pose and speed arguments $M_m$ in the initial objective function are each multiplied by the conversion relationship $T_r^w$ described above, so that the arguments are expressed in the world coordinate system rather than in the reference coordinate system, and the pose information and speed information of the IMU in the world coordinate system can be determined directly in subsequent iterations.
In another embodiment of the present invention, the above step 104 may include:
determining first estimation pose information and first estimation speed information of the IMU under a world coordinate system based on the current reprojection error and the IMU data measurement error; obtaining first observation pose information and first observation speed information under a world coordinate system of an IMU (inertial measurement Unit) obtained through a global positioning system; determining a pose velocity error based on the first estimated velocity information and the first observed velocity information, and the first estimated pose information and the first observed pose information; and determining the current pose information of the moving vehicle under the world coordinate system based on the pose speed error.
In this embodiment, a first objective function may be constructed based on the current reprojection error and the IMU data measurement error with reference to the least squares principle, and the value of the independent variable in the first objective function may be continuously iteratively adjusted so that the function value of the first objective function is minimized or a preset condition is satisfied. When the function value of the first objective function is minimized or the preset condition is satisfied, the final value of the independent variable in the first objective function is obtained, and based on this final value the estimated pose information and estimated speed information of the IMU are determined.
Wherein the first objective function can be expressed by the following equation (12):

$$Q_1=\arg\min\{e_4+e_5\} \qquad (12)$$

$Q_1$ represents the function value of the first objective function; the above equation (8) and equation (10) are substituted into equation (12) to obtain the final value of the argument in the first objective function when the function value of the first objective function is minimized or a predetermined condition is satisfied.
When the function value of the first objective function obtained by the electronic device reaches the minimum or meets the preset condition, the final value of the independent variable in the first objective function is obtained, and based on this value the position information, attitude information and speed information of the IMU at the current moment, i.e. the estimated pose information and estimated speed information of the IMU at the current moment, are determined.
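A hedged sketch of the iterative adjustment that drives such an objective function down, using a minimal Gauss-Newton loop on a toy linear residual (the function names and the toy problem are ours; a real solver over the full state would include robust losses and a quaternion-aware update):

```python
import numpy as np

def gauss_newton(residual_fn, jac_fn, x0, iters=20):
    """Minimal Gauss-Newton loop for argmin ||r(x)||^2: repeatedly linearize
    the residual and solve the normal equations (via least squares) for a
    step dx, until the step vanishes."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        r = residual_fn(x)
        J = jac_fn(x)
        dx = np.linalg.lstsq(J, -r, rcond=None)[0]
        x = x + dx
        if np.linalg.norm(dx) < 1e-12:
            break
    return x
```

On a linear residual $r(x)=Ax-b$ the loop converges in one step, which makes it easy to sanity-check.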
When the observation pose information and observation speed information of the IMU in the world coordinate system obtained by the global positioning system have not been obtained before the current moment, the independent variable in the first objective function is an independent variable in the reference coordinate system, and correspondingly, the estimated pose information and estimated speed information are the pose information and speed information of the IMU in the reference coordinate system. In this case, a conversion relationship between the reference coordinate system and the world coordinate system may be determined based on the estimated pose information and the first observation pose information obtained at the current moment; the estimated pose information and estimated speed information are converted into the world coordinate system based on this conversion relationship to obtain the first estimated pose information and first estimated speed information; a pose speed error is determined based on the first estimated speed information and the first observation speed information, and the first estimated pose information and the first observation pose information; and the current pose information of the IMU, and of the mobile vehicle, in the world coordinate system is then determined.
When the observation pose information and the observation speed information of the IMU in the world coordinate system, which are obtained by the global positioning system, are obtained before the current moment, the independent variable in the first objective function is the independent variable in the world coordinate system, and correspondingly, the estimation pose information and the estimation speed information are the pose information and the speed information of the IMU in the world coordinate system. And directly taking the estimated pose information and the estimated speed information as first estimated pose information and first estimated speed information to carry out subsequent processes.
The pose velocity error can be expressed by the above equation (6), and can also be expressed by the following equation (13):

$$e_6=\sum_j\big\|\,Z_{gps_j}-h_{gps_j}\big\| \qquad (13)$$

wherein $e_6$ indicates the pose velocity error; $h_{gps_j}$ represents the first estimated pose information and first estimated velocity information; and $Z_{gps_j}$ represents the first observation pose information and first observation velocity information.
A second objective function may then be constructed based on the current reprojection error, the IMU data measurement error and the pose velocity error; when the function value of the second objective function reaches the minimum or meets the preset condition, the final value of the independent variable in the second objective function is obtained, and based on this final value the current pose information and current speed information of the IMU in the world coordinate system are determined.
Wherein the second objective function can be expressed by the following equation (14):

$$Q_2=\arg\min\{e_4+e_5+e_6\} \qquad (14)$$

$Q_2$ represents the function value of the second objective function; the above formula (8), formula (10) and formula (13) are substituted into formula (14) to obtain the value of the argument in the second objective function when the function value of the second objective function is minimized or a predetermined condition is satisfied. Wherein the argument in the second objective function can be expressed as

$$X_1=\big[P_m,\;R_m,\;V_m,\;\lambda_1,\dots,\lambda_n,\;b_{a_m},\;b_{\omega_m},\;T_r^w\big]$$

wherein $T_r^w$ indicates the pre-established conversion relationship between the reference coordinate system and the world coordinate system.
The electronic equipment obtains a final value of an independent variable in the second objective function when the function value of the second objective function reaches the minimum or meets a preset condition, and determines to obtain the current pose information and the current speed information of the IMU in a world coordinate system based on the final value; and further, determining the current pose information of the mobile vehicle in the world coordinate system based on the rotation matrix between the mobile vehicle and the IMU and the current pose information of the IMU in the world coordinate system.
In another embodiment of the present invention, the step of obtaining first observation pose information and first observation speed information in a world coordinate system of the IMU obtained by the global positioning system includes:
obtaining second observation pose information and second observation speed information of the moving vehicle under a world coordinate system, which are measured by a global positioning system; and determining first observation pose information and first observation speed information of the IMU under a world coordinate system based on a rotation matrix between the mobile vehicle and the IMU, the second observation pose information and the second observation speed information.
In this embodiment, the global positioning system may measure, in real time or periodically, the observation pose information and observation speed information of the mobile vehicle in the world coordinate system, where the second observation speed information includes the speed value and speed direction of the mobile vehicle. Moreover, when the IMU is mounted on the mobile vehicle, the rotational relationship between the mobile vehicle and the IMU, i.e. the rotation matrix $R_v^b$ mentioned above, is determined. The electronic device can obtain the second observation pose information and second observation speed information of the mobile vehicle in the world coordinate system measured by the global positioning system at the current moment, and can rotate them based on this rotational relationship to obtain the first observation pose information and first observation speed information of the IMU in the world coordinate system.
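A hedged sketch of converting the vehicle's observed pose and speed into an observation for the IMU (the function name and the simplification of ignoring any lever arm between the GPS antenna and the IMU are ours):

```python
import numpy as np

def imu_observation_from_vehicle(R_vb, R_wv, p_wv, v_wv):
    """Turn the vehicle's GPS-observed pose/velocity in the world frame into
    an observation for the IMU, given the fixed vehicle-IMU rotation R_vb.
    Position and velocity pass through unchanged because the lever arm
    between the two sensors is ignored in this sketch."""
    R_wb = R_wv @ R_vb  # IMU attitude in the world frame
    return R_wb, np.array(p_wv, float), np.array(v_wv, float)
```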
In another embodiment of the present invention, the above step 104 may include: determining the current pose information of the IMU in the world coordinate system based on the current reprojection error, the IMU data measurement error, and the first observation pose information and first observation speed information of the IMU in the world coordinate system obtained by the global positioning system; and determining the current pose information of the mobile vehicle in the world coordinate system based on the rotation matrix between the mobile vehicle and the IMU and the current pose information of the IMU in the world coordinate system.
It can be understood that the current pose information and speed information of the IMU in the world coordinate system can be determined through the above constraints. When the IMU is mounted on the mobile vehicle, the rotational relationship between the mobile vehicle and the IMU, i.e. the rotation matrix $R_v^b$ mentioned above, is determined. Furthermore, based on this rotational relationship, the current pose information of the IMU in the world coordinate system can be correspondingly rotated to obtain the current pose information of the mobile vehicle in the world coordinate system, so that the mobile vehicle is positioned in the world coordinate system.
In another embodiment of the present invention, after the step of determining the current pose information of the mobile vehicle in the world coordinate system based on the current reprojection error, the IMU data measurement error, and the first observation pose information and the first observation velocity information in the world coordinate system of the IMU obtained by the global positioning system, the method may further include:
under the condition that the failure of the global positioning system is detected, obtaining an image behind the current image acquired by the image acquisition equipment as an image to be utilized; obtaining IMU data corresponding to the utilization image acquired after the IMU; and determining the pose information of the mobile vehicle in the world coordinate system based on the current pose information of the mobile vehicle in the world coordinate system, the position information of the imaging point of the space point to be utilized in the image to be utilized corresponding to the characteristic point to be utilized and the IMU data acquired by the IMU.
During the running of the mobile vehicle, situations in which the electronic device cannot receive the positioning information of the global positioning system are unavoidable, and the electronic device can determine that the global positioning system has failed when it cannot receive that positioning information. When a failure of the global positioning system is detected, the electronic device can continue to obtain an image subsequent to the current image acquired by the image acquisition device as an image to be utilized, obtain, as position information to be utilized, the position information of the imaging points, in the image to be utilized, of the detected spatial points to be utilized corresponding to the feature points to be utilized in that image, and obtain the IMU data corresponding to the image to be utilized acquired by the IMU. The latest reprojection error is constructed, using the position information to be utilized in combination with the latest depth information and second position information corresponding to the imaging points of the spatial points to be utilized in the reference image, the estimated pose information of the image acquisition device corresponding to the image to be utilized, the reference pose information, and the third position information and historical pose information of the spatial points to be utilized in the historical images; and the IMU data measurement error is constructed based on the IMU data corresponding to the image to be utilized and the IMU data preceding it. Combining the latest reprojection error and the IMU data measurement error, the variation of the IMU's pose relative to the pose information at the current moment is determined; the latest pose information of the IMU is determined based on the pose information at the current moment, and the latest pose information of the moving vehicle is determined based on the latest pose information of the IMU.
In one case, by retaining the reprojection error and IMU data measurement error terms of the above formula (14) and optimizing without the pose velocity error term, the latest pose information of the IMU in the world coordinate system can be obtained, and further, based on the latest pose information of the IMU in the world coordinate system, the latest pose information of the moving vehicle in the world coordinate system can be determined.
Corresponding to the above method embodiment, an embodiment of the present invention provides a positioning apparatus for a mobile vehicle, as shown in fig. 4, where the mobile vehicle is provided with an image capturing device, an inertial measurement unit IMU, and a global positioning system, and includes:
a first obtaining module 410, configured to obtain first position information of a detected feature point to be utilized in a current image acquired by an image acquisition device, and current IMU data corresponding to the current image acquired by the IMU;
a first determining module 420, configured to determine a current re-projection error corresponding to the spatial point to be utilized based on the first position information, current depth information and second position information corresponding to an imaging point of the spatial point to be utilized in a reference image corresponding to the feature point to be utilized, reference pose information corresponding to the reference image, and first estimated pose information, where an initial value of the first estimated pose information is: based on the previous pose information corresponding to the reference image of the current image, estimating the pose information corresponding to the current image, wherein the reference image is as follows: observing an image of an imaging point of the space point to be utilized corresponding to the characteristic point to be utilized for the first time;
a second determining module 430 configured to determine an IMU data measurement error corresponding to the spatial point to be utilized based on the current IMU data, the first estimated pose information, and the previous pose information;
a third determination module 440 configured to determine current pose information of the mobile vehicle in a world coordinate system based on the current reprojection error, the IMU data measurement error, and first observed pose information and first observed velocity information of the IMU in the world coordinate system obtained by the global positioning system.
By applying the embodiment of the invention, the current reprojection error corresponding to the spatial points to be utilized is first determined based on the position information, in the current image and the corresponding reference image, of the spatial points to be utilized corresponding to the feature points to be utilized, the reference pose information of the reference image, and the estimated first estimated pose information corresponding to the current image; the IMU data measurement error corresponding to the spatial points to be utilized is determined based on the current IMU data, the first estimated pose information and the previous pose information; and, in combination with the first observation pose information and first observation speed information of the IMU in the world coordinate system obtained by the global positioning system, the current pose information of the IMU in the world coordinate system, and in turn that of the mobile vehicle, is further optimized. In this way, the current pose information of the mobile vehicle in the world coordinate system is determined jointly from the pose change of the mobile vehicle relative to its previous pose determined from the positions of the spatial points to be utilized in the images collected by the image acquisition device, the pose and speed changes relative to the previous pose determined from the IMU data acquired by the IMU, and the current position and speed of the mobile vehicle in the world coordinate system measured by the global positioning system, so that the accuracy of the positioning result of the mobile vehicle in the world coordinate system is improved.
In another embodiment of the present invention, the apparatus further comprises: a second obtaining module (not shown in the figures) configured to: obtaining third position information of an imaging point of the space point to be utilized corresponding to the feature point to be utilized in each historical image and historical pose information corresponding to each historical image before determining a current re-projection error corresponding to the space point to be utilized based on the first position information, the reference pose information and the first estimated pose information corresponding to the reference image, wherein the historical images comprise images between the current image and the reference image;
the first determining module 420 is specifically configured to: and determining a current re-projection error corresponding to the space point to be utilized based on the first position information, the current depth information, the second position information, the third position information, the reference pose information corresponding to the reference image, the first estimated pose information and the historical pose information.
In another embodiment of the present invention, the first determining module 420 is specifically configured to: determining spatial position information of the to-be-utilized spatial point corresponding to the to-be-utilized feature point under an equipment coordinate system corresponding to the reference image based on the current depth information and the second position information, wherein the equipment coordinate system corresponding to the reference image is as follows: the image acquisition equipment is in an equipment coordinate system when acquiring the pose of the reference image; determining first projection position information of a projection point of the space point to be utilized in the current image based on the space position information, the reference pose information and the first estimated pose information; for each historical image, determining second projection position information of the projection point of the space point to be utilized in the historical image based on the space position information, the reference pose information and the historical pose information corresponding to the historical image; and determining the current re-projection error corresponding to the space point to be utilized based on the first projection position information, the first position information, the second projection position information and the third position information.
In another embodiment of the present invention, the second determining module 430 is specifically configured to:
obtaining historical IMU data corresponding to each historical image and reference IMU data corresponding to the reference image;
and determining an IMU data measurement error corresponding to the space point to be utilized based on the current IMU data corresponding to the current image, the first estimated pose information, the previous pose information, the reference IMU data corresponding to the reference image, the reference pose information, the pose information corresponding to the previous frame of image of the reference image, historical IMU data corresponding to each historical image, historical pose information and historical pose information corresponding to the previous frame of image of the historical image.
In another embodiment of the present invention, the third determining module 440 includes: a first determination unit configured to determine first estimated pose information and first estimated velocity information of the IMU in a world coordinate system based on the current reprojection error and the IMU data measurement error; an obtaining unit configured to obtain first observation pose information and first observation speed information in a world coordinate system of the IMU obtained by the global positioning system; a second determination unit configured to determine a pose velocity error based on the first estimated velocity information and the first observed velocity information, and the first estimated pose information and the first observed pose information; a third determination unit configured to determine current pose information of the moving vehicle in the world coordinate system based on the pose speed error.
Optionally, the obtaining unit is configured to obtain second observation pose information and second observation speed information of the mobile vehicle in a world coordinate system, which are measured by the global positioning system; and determining first observation pose information and first observation speed information of the IMU under the world coordinate system based on the rotation matrix between the mobile vehicle and the IMU, the second observation pose information and the second observation speed information.
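The observation conversion described above amounts to a frame change using the fixed vehicle-to-IMU extrinsics. In this sketch the lever arm between the two frames is ignored (the patent mentions only a rotation matrix), so position and world-frame velocity carry over directly; all names are assumptions:

```python
import numpy as np

def vehicle_to_imu_observation(R_world_vehicle, p_world_vehicle,
                               v_world_vehicle, R_vehicle_imu):
    """Map a GPS observation of the vehicle body to an IMU observation.

    R_vehicle_imu is the fixed extrinsic rotation between the vehicle body
    frame and the IMU frame (assumed known from calibration).
    """
    # Chain the rotations: world->vehicle->IMU.
    R_world_imu = np.asarray(R_world_vehicle, float) @ np.asarray(R_vehicle_imu, float)
    # Without a lever arm, translation and velocity are unchanged.
    return R_world_imu, np.asarray(p_world_vehicle, float), np.asarray(v_world_vehicle, float)
```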
In another embodiment of the present invention, the third determining module 440 is specifically configured to determine current pose information of the IMU in the world coordinate system based on the current reprojection error, the IMU data measurement error, and the first observation pose information and first observation speed information of the IMU in the world coordinate system obtained by the global positioning system; and determine current pose information of the moving vehicle in the world coordinate system based on the rotation matrix between the moving vehicle and the IMU and the current pose information of the IMU in the world coordinate system.
In another embodiment of the present invention, the apparatus further comprises: a third obtaining module (not shown in the figure) configured to, after the current pose information of the moving vehicle in the world coordinate system is determined based on the current reprojection error, the IMU data measurement error, and the first observation pose information and first observation speed information of the IMU in the world coordinate system obtained by the global positioning system, obtain, in a case where the global positioning system is detected to have failed, an image acquired by the image acquisition device after the current image as an image to be utilized; a fourth obtaining module (not shown in the figure) configured to obtain IMU data, acquired by the IMU, corresponding to the image to be utilized; and a fourth determining module (not shown in the figure) configured to determine pose information of the moving vehicle in the world coordinate system based on the current pose information of the moving vehicle in the world coordinate system, the position information of the imaging point of the to-be-utilized spatial point corresponding to the to-be-utilized feature point in the image to be utilized, and the IMU data acquired by the IMU.
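When the global positioning system fails, the modules above fall back to propagating the last GPS-aided pose using camera and IMU data only. A minimal inertial-only sketch of that dead-reckoning step (the visual constraint is omitted; the names and the gravity-compensated world-frame accelerations are assumptions):

```python
import numpy as np

def propagate_without_gps(p_last_good, v_last_good, accels_world, dt):
    """Dead-reckon from the last GPS-aided pose using IMU data only.

    p_last_good, v_last_good -- state at the last GPS-aided timestamp
    accels_world             -- gravity-compensated world-frame accelerations
    dt                       -- IMU sample period
    Returns the propagated position trajectory, one entry per sample.
    """
    p, v = np.asarray(p_last_good, float), np.asarray(v_last_good, float)
    trajectory = [p.copy()]
    for a in accels_world:               # Euler integration of the IMU samples
        v = v + np.asarray(a, float) * dt
        p = p + v * dt
        trajectory.append(p.copy())
    return trajectory
```

Because pure inertial propagation drifts, the full method also keeps the visual reprojection constraint in the loop, anchoring the drift to the last GPS-aided pose.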
The above device embodiment corresponds to the method embodiment and has the same technical effect; for details, refer to the description of the method embodiment, which is not repeated here. Those of ordinary skill in the art will understand that the figures are merely schematic representations of one embodiment, and that the blocks or flows in the figures are not necessarily required to practice the present invention. They will likewise understand that the modules in the devices of the embodiments may be distributed in the devices as described, or may be relocated, with corresponding changes, into one or more devices different from those of the embodiments; the modules of the above embodiments may be combined into one module, or further split into multiple sub-modules.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. A method for locating a moving vehicle, wherein the moving vehicle is provided with an image acquisition device, an inertial measurement unit (IMU), and a global positioning system, the method comprising:
acquiring first position information of a detected to-be-utilized feature point in a current image acquired by the image acquisition device, and current IMU data, acquired by the IMU, corresponding to the current image;
determining a current reprojection error corresponding to the to-be-utilized spatial point based on the first position information, current depth information and second position information corresponding to an imaging point, in a reference image, of the to-be-utilized spatial point corresponding to the to-be-utilized feature point, and reference pose information and first estimated pose information corresponding to the reference image, wherein an initial value of the first estimated pose information is pose information corresponding to the current image estimated based on previous pose information corresponding to the previous frame image of the current image, and the reference image is the image in which the imaging point of the to-be-utilized spatial point corresponding to the to-be-utilized feature point is observed for the first time;
determining an IMU data measurement error corresponding to the space point to be utilized based on the current IMU data, the first estimated pose information and the previous pose information;
and determining the current pose information of the mobile vehicle in the world coordinate system based on the current reprojection error, the IMU data measurement error and first observation pose information and first observation speed information of the IMU in the world coordinate system, which are obtained by the global positioning system.
2. The method of claim 1, wherein before the step of determining the current reprojection error corresponding to the to-be-utilized spatial point based on the first position information, the current depth information and the second position information corresponding to the imaging point of the to-be-utilized spatial point in the reference image corresponding to the to-be-utilized feature point, the reference pose information corresponding to the reference image, and the first estimated pose information, the method further comprises:
acquiring third position information of an imaging point of the space point to be utilized in each historical image corresponding to the feature point to be utilized and historical pose information corresponding to each historical image, wherein the historical images comprise images between the current image and the reference image;
the step of determining the current reprojection error corresponding to the to-be-utilized spatial point based on the first position information, the current depth information and the second position information corresponding to the imaging point of the to-be-utilized spatial point in the reference image, and the reference pose information and the first estimated pose information corresponding to the reference image comprises:
and determining a current re-projection error corresponding to the space point to be utilized based on the first position information, the current depth information, the second position information, the third position information, the reference pose information corresponding to the reference image, the first estimated pose information and the historical pose information.
3. The method of claim 2, wherein the step of determining the current reprojection error corresponding to the to-be-utilized spatial point based on the first position information, the current depth information, the second position information, the third position information, the reference pose information corresponding to the reference image, the first estimated pose information, and the historical pose information comprises:
determining spatial position information of the to-be-utilized spatial point corresponding to the to-be-utilized feature point in the device coordinate system corresponding to the reference image based on the current depth information and the second position information, wherein the device coordinate system corresponding to the reference image is the coordinate system of the image acquisition device at the pose at which the reference image was acquired;
determining first projection position information of a projection point of the space point to be utilized in the current image based on the space position information, the reference pose information and the first estimated pose information;
for each historical image, determining second projection position information of the projection point of the space point to be utilized in the historical image based on the space position information, the reference pose information and the historical pose information corresponding to the historical image;
and determining the current re-projection error corresponding to the space point to be utilized based on the first projection position information, the first position information, the second projection position information and the third position information.
4. The method of claim 2, wherein the step of determining IMU data measurement errors corresponding to the points in space to utilize based on the current IMU data, the first estimated pose information, and the previous pose information comprises:
obtaining historical IMU data corresponding to each historical image and reference IMU data corresponding to the reference image;
and determining an IMU data measurement error corresponding to the to-be-utilized spatial point based on the current IMU data corresponding to the current image, the first estimated pose information, the previous pose information, the reference IMU data and reference pose information corresponding to the reference image, the pose information corresponding to the previous frame image of the reference image, and, for each historical image, the historical IMU data, the historical pose information and the pose information corresponding to the previous frame image of that historical image.
5. The method of any of claims 1-4, wherein the step of determining the current pose information of the moving vehicle in the world coordinate system based on the current reprojection error, the IMU data measurement error, and first observed pose information and first observed velocity information of the IMU in the world coordinate system obtained by the global positioning system comprises:
determining first estimation pose information and first estimation speed information of the IMU under a world coordinate system based on the current reprojection error and the IMU data measurement error;
obtaining first observation pose information and first observation speed information under a world coordinate system of the IMU, which are obtained through the global positioning system;
determining a pose velocity error based on the first estimated velocity information and the first observed velocity information, and the first estimated pose information and the first observed pose information;
and determining the current pose information of the moving vehicle under the world coordinate system based on the pose speed error.
6. The method of claim 5, wherein the step of obtaining first observation pose information and first observation velocity information in the world coordinate system of the IMU obtained by the global positioning system comprises:
obtaining second observation pose information and second observation speed information of the mobile vehicle under a world coordinate system, which are measured by the global positioning system;
and determining first observation pose information and first observation speed information of the IMU under the world coordinate system based on the rotation matrix between the mobile vehicle and the IMU, the second observation pose information and the second observation speed information.
7. The method of any of claims 1-6, wherein the step of determining the current pose information of the moving vehicle in the world coordinate system based on the current reprojection error, the IMU data measurement error, and first observed pose information and first observed velocity information in the world coordinate system of the IMU obtained by the global positioning system comprises:
determining current pose information of the IMU in a world coordinate system based on the current reprojection error, the IMU data measurement error and first observation pose information and first observation speed information of the IMU in the world coordinate system, which are obtained through the global positioning system;
determining the current pose information of the moving vehicle in the world coordinate system based on a rotation matrix between the moving vehicle and the IMU and the current pose information of the IMU in the world coordinate system.
8. The method of any of claims 1-4, wherein after the step of determining the current pose information of the moving vehicle in the world coordinate system based on the current reprojection error, the IMU data measurement error, and first observed pose information and first observed velocity information in the world coordinate system of the IMU obtained by the global positioning system, the method further comprises:
under the condition that the global positioning system is detected to have failed, obtaining an image acquired by the image acquisition device after the current image as an image to be utilized;
obtaining IMU data, acquired by the IMU, corresponding to the image to be utilized;
and determining pose information of the moving vehicle in the world coordinate system based on the current pose information of the moving vehicle in the world coordinate system, the position information of the imaging point, in the image to be utilized, of the to-be-utilized spatial point corresponding to the to-be-utilized feature point, and the IMU data acquired by the IMU.
9. A positioning device of a moving vehicle, wherein the moving vehicle is provided with an image acquisition device, an inertial measurement unit (IMU) and a global positioning system, the device comprising:
the system comprises a first obtaining module, a second obtaining module and a third obtaining module, wherein the first obtaining module is configured to obtain first position information of a detected feature point to be utilized in a current image acquired by image acquisition equipment and current IMU data corresponding to the current image acquired by the IMU;
a first determining module configured to determine a current reprojection error corresponding to the to-be-utilized spatial point based on the first position information, current depth information and second position information corresponding to an imaging point, in a reference image, of the to-be-utilized spatial point corresponding to the to-be-utilized feature point, and reference pose information and first estimated pose information corresponding to the reference image, wherein an initial value of the first estimated pose information is pose information corresponding to the current image estimated based on previous pose information corresponding to the previous frame image of the current image, and the reference image is the image in which the imaging point of the to-be-utilized spatial point corresponding to the to-be-utilized feature point is observed for the first time;
a second determination module configured to determine an IMU data measurement error corresponding to the spatial point to be utilized based on the current IMU data, the first estimated pose information, and the previous pose information;
a third determination module configured to determine current pose information of the mobile vehicle in a world coordinate system based on the current reprojection error, the IMU data measurement error, and first observed pose information and first observed velocity information of the IMU in the world coordinate system obtained by the global positioning system.
10. The apparatus of claim 9, wherein the apparatus further comprises:
a second obtaining module configured to: before the current reprojection error corresponding to the to-be-utilized spatial point is determined based on the first position information, the current depth information and the second position information corresponding to the imaging point of the to-be-utilized spatial point in the reference image, and the reference pose information and the first estimated pose information corresponding to the reference image, obtain third position information of the imaging point of the to-be-utilized spatial point corresponding to the to-be-utilized feature point in each historical image and historical pose information corresponding to each historical image, wherein the historical images comprise images between the current image and the reference image;
the first determining module is specifically configured to:
and determining a current re-projection error corresponding to the space point to be utilized based on the first position information, the current depth information, the second position information, the third position information, the reference pose information corresponding to the reference image, the first estimated pose information and the historical pose information.
CN201910488886.1A 2019-06-06 2019-06-06 Positioning method and device for moving vehicle Active CN112050806B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910488886.1A CN112050806B (en) 2019-06-06 2019-06-06 Positioning method and device for moving vehicle

Publications (2)

Publication Number Publication Date
CN112050806A (en) 2020-12-08
CN112050806B (en) 2022-08-30

Family

ID=73608658

Country Status (1)

Country Link
CN (1) CN112050806B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113048989A (en) * 2021-04-06 2021-06-29 北京三快在线科技有限公司 Positioning method and positioning device of unmanned equipment
CN114111813A (en) * 2021-10-18 2022-03-01 阿波罗智能技术(北京)有限公司 High-precision map element updating method and device, electronic equipment and storage medium
CN114370872A (en) * 2022-01-14 2022-04-19 苏州挚途科技有限公司 Vehicle attitude determination method and vehicle
WO2022179047A1 (en) * 2021-02-26 2022-09-01 魔门塔(苏州)科技有限公司 State information estimation method and apparatus
WO2023016271A1 (en) * 2021-08-13 2023-02-16 北京迈格威科技有限公司 Attitude determining method, electronic device, and readable storage medium

Citations (7)

Publication number Priority date Publication date Assignee Title
EP2144038A2 (en) * 2008-07-10 2010-01-13 Lockheed Martin Corporation Inertial measurement using an imaging sensor and a digitized map
CN107246868A (en) * 2017-07-26 2017-10-13 上海舵敏智能科技有限公司 A kind of collaborative navigation alignment system and navigation locating method
CN107850899A (en) * 2015-05-23 2018-03-27 深圳市大疆创新科技有限公司 Merged using the sensor of inertial sensor and imaging sensor
CN107869989A (en) * 2017-11-06 2018-04-03 东北大学 A kind of localization method and system of the fusion of view-based access control model inertial navigation information
CN108051002A (en) * 2017-12-04 2018-05-18 上海文什数据科技有限公司 Transport vehicle space-location method and system based on inertia measurement auxiliary vision
CN108489482A (en) * 2018-02-13 2018-09-04 视辰信息科技(上海)有限公司 The realization method and system of vision inertia odometer
CN109143305A (en) * 2018-09-30 2019-01-04 百度在线网络技术(北京)有限公司 Automobile navigation method and device

Non-Patent Citations (3)

Title
JIE JIN et al.: "Localization Based on Semantic Map and Visual Inertial Odometry", 2018 24th International Conference on Pattern Recognition (ICPR) *
HU Yuezhi: "Research on High-Precision Visual Localization Technology for Intelligent Vehicles", China Masters' Theses Full-text Database *
GAO Yanyan: "Intelligent Vehicle Pose Estimation Based on Monocular Vision Geometry", China Masters' Theses Full-text Database *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220303

Address after: 100083 unit 501, block AB, Dongsheng building, No. 8, Zhongguancun East Road, Haidian District, Beijing

Applicant after: BEIJING MOMENTA TECHNOLOGY Co.,Ltd.

Address before: 100083 room 28, 4 / F, block a, Dongsheng building, 8 Zhongguancun East Road, Haidian District, Beijing

Applicant before: BEIJING CHUSUDU TECHNOLOGY Co.,Ltd.

GR01 Patent grant