CN112258577A - Method and system for evaluating vehicle-end monocular vision mapping measurement confidence - Google Patents

Method and system for evaluating vehicle-end monocular vision mapping measurement confidence

Info

Publication number
CN112258577A
CN112258577A (application CN202011158441.6A)
Authority
CN
China
Prior art keywords
covariance matrix
confidence
vehicle
target point
monocular vision
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011158441.6A
Other languages
Chinese (zh)
Other versions
CN112258577B (en)
Inventor
王小亮
吴凯
辛梓
贾腾龙
刘奋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Heading Data Intelligence Co Ltd
Original Assignee
Heading Data Intelligence Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Heading Data Intelligence Co Ltd filed Critical Heading Data Intelligence Co Ltd
Priority to CN202011158441.6A priority Critical patent/CN112258577B/en
Publication of CN112258577A publication Critical patent/CN112258577A/en
Application granted granted Critical
Publication of CN112258577B publication Critical patent/CN112258577B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • G06F17/16Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • G06F17/18Complex mathematical operations for evaluating statistical data, e.g. average values, frequency distributions, probability functions, regression analysis

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Analysis (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Optimization (AREA)
  • Computational Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Algebra (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Probability & Statistics with Applications (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Operations Research (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computing Systems (AREA)
  • Image Analysis (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to a method and a system for evaluating the confidence of vehicle-end monocular vision mapping measurement. The method comprises the following steps: initializing the error covariance matrix of each module of the vehicle-end monocular vision system, including the vehicle-end pose error covariance matrix Ω_G and the image feature point matching pixel error covariance matrix Ω_P; calculating the position covariance matrix of the target point after monocular vision mapping; and calculating the measurement confidence corresponding to a specified error radius from the position covariance matrix based on confidence ellipse theory. Calculating the confidence of the measurement result from a specified error confidence interval solves the problem of quantitatively evaluating the uncertainty of vehicle-end monocular mapping measurements, directly provides the user with the error distribution interval of the current measurement, supplies a reliability evaluation basis for vision measurement results in applications such as visual positioning, visual navigation and visual mapping, and in turn guides the improvement of vision mapping measurement accuracy.

Description

Method and system for evaluating vehicle-end monocular vision mapping measurement confidence
Technical Field
The invention relates to the field of computer vision positioning and mapping, and in particular to a method and system for evaluating the confidence of vehicle-end monocular vision mapping measurement.
Background
Monocular vision offers advantages such as low cost, easy installation and a simple calibration process, and in recent years has been applied on a large scale to visual positioning and mapping. Visual positioning and mapping recovers camera motion from the correlation between adjacent image frames and, combined with an initial position prior, estimates target positions and builds a point cloud map of the surrounding environment. It provides target spatial position information for applications such as visual obstacle avoidance and visual navigation, and the reliability of its measurement results provides a safety measure for upper-layer applications.
Currently, confidence evaluation of visual positioning measurement results mainly uses the measurement information matrix to compute an artificially defined algebraic value. This approach has two main problems: (1) the confidence is artificially defined and lacks physical meaning and justification; (2) the confidence result cannot directly characterize the error range of the current measurement, and therefore offers little guidance in practical applications.
Disclosure of Invention
To address the above technical problems in the prior art, the invention provides a method and system for evaluating the confidence of vehicle-end monocular vision mapping measurement.
The technical scheme for solving the technical problems is as follows: a method for evaluating confidence of monocular vision mapping measurement at a vehicle end comprises the following steps:
Step 1, initializing the error covariance matrix of each module of the vehicle-end monocular vision system, wherein the covariance matrices comprise the vehicle-end position and attitude error covariance matrix Ω_G and the image feature point matching pixel error covariance matrix Ω_P;
Step 2, calculating a position covariance matrix of the target point after monocular vision mapping;
Step 3, calculating the confidence corresponding to the specified error radius from the position covariance matrix based on confidence ellipse theory.
A vehicle-end monocular vision mapping measurement confidence evaluation system comprises: an initialization module, a position covariance matrix estimation module and a confidence calculation module;
The initialization module is used for initializing each error covariance matrix of the vehicle-end monocular vision system, wherein the covariance matrices comprise the vehicle-end position and attitude error covariance matrix Ω_G and the image feature point matching pixel error covariance matrix Ω_P;
The position covariance matrix estimation module is used for calculating a position covariance matrix of the target point after monocular vision mapping;
The confidence calculation module is used for calculating the confidence corresponding to the specified error radius from the position covariance matrix based on confidence ellipse theory.
The invention has the following beneficial effects: the invention provides a method for evaluating the confidence of vehicle-end monocular vision mapping measurement, which calculates the confidence of a measurement result from a specified error confidence interval and thereby solves the problem of quantitatively evaluating the uncertainty of vehicle-end monocular measurement results. It directly provides the user with the error uncertainty of the current measurement, supplies a reliability evaluation basis for vision measurement results in applications such as visual positioning, visual navigation and visual mapping, and in turn guides the improvement of vision measurement accuracy.
On the basis of the technical scheme, the invention can be further improved as follows.
Further, in step 1, a Monte Carlo statistical method is adopted to calculate each error covariance matrix of the vehicle-end monocular vision system.
Further, the step 2 comprises:
Step 201, in combination with the monocular vision mapping process and based on the ceres optimization library, calculating the position covariance matrices Ω_i (i = 1, ..., n) of the feature point set on the target point surface, wherein n is the number of feature points;
Step 202, calculating the position covariance matrix Ω_O of the target point:
[Formula image not reproduced: Ω_O expressed in terms of the feature-point position covariance matrices Ω_i, i = 1, ..., n.]
Further, the position covariance matrix describes the error of the target in the lateral and longitudinal directions of the monocular camera coordinate system, and is a 2 × 2 square matrix:
Ω_O = | Ω_11  Ω_12 |
      | Ω_21  Ω_22 |
Further, the step 2 further comprises: perfecting the boundary value of the position covariance matrix of the target point according to the sample data;
the process of calculating the boundary values of the position covariance matrix of the target point includes:
Step 203, initializing the set Ω = {Φ} of position covariance matrices of the current target points, and adding the covariance matrix of the first target point position, at which time Ω = {Ω_1};
Step 204, sequentially comparing the differences between the diagonal elements of the position covariance matrix of the new target point and those of the current target points, and adding the position covariance matrix of the new target point to the set Ω when the difference exceeds a set threshold;
Step 205, the estimated boundary value of the position covariance matrix of the target point is:
[Formula image not reproduced: estimated boundary value of the target-point position covariance matrix, computed over the N matrices in the set Ω.]
n represents the number of position covariance matrices of the target point.
Further, in step 3, the error radius is set to R according to the measurement accuracy requirement, and the corresponding confidence CIf is calculated from the relationship between the confidence and the confidence interval as: CIf = 1 - exp(-R²/(2λ));
[Formula image not reproduced: λ expressed in terms of Ω(1,1), Ω(2,2) and Ω(1,2) of the target-point position covariance matrix.]
Ω(1,1), Ω(2,2) and Ω(1,2) respectively represent the elements at the corresponding positions in the covariance matrix of the target position.
The beneficial effect of adopting the further scheme is that: the system covariance is calculated by adopting a Monte Carlo statistical method, and the accuracy of parameter estimation is improved.
Drawings
FIG. 1 is a flow chart of a method for evaluating confidence in a monocular vision mapping measurement at a vehicle end according to the present invention;
FIG. 2 is a flowchart of an embodiment of a method for evaluating confidence in a vehicle-end monocular vision mapping measurement according to the present invention;
FIG. 3 is a block diagram of an embodiment of a system for evaluating confidence in a monocular vision mapping measurement at a vehicle end according to the present invention;
FIG. 4 is a schematic physical structure diagram of an electronic device according to an embodiment of the present invention.
in the drawings, the components represented by the respective reference numerals are listed below:
101. initialization module; 102. position covariance matrix estimation module; 103. confidence calculation module; 201. processor; 202. communication interface; 203. memory; 204. communication bus.
Detailed Description
The principles and features of this invention are described below in conjunction with the following drawings, which are set forth by way of illustration only and are not intended to limit the scope of the invention.
Fig. 1 is a flowchart of a method for evaluating confidence in a vehicle-end monocular vision mapping measurement according to the present invention, and as shown in fig. 1, the method includes:
Step 1, initializing the error covariance matrix of each module of the vehicle-end monocular vision system. The monocular vision system comprises a plurality of modules, such as a vehicle-end position and attitude calculation module and an image feature point matching pixel calculation module, and the corresponding covariance matrices comprise the vehicle-end position and attitude error covariance matrix Ω_G and the image feature point matching pixel error covariance matrix Ω_P.
The position and attitude error covariance matrix is that of the GPS-provided position and attitude. During the visual mapping process, the above error covariance matrices remain unchanged.
Step 2, calculating the position covariance matrix of the target point after passing through the monocular vision system.
Step 3, calculating the confidence corresponding to the specified error radius from the position covariance matrix based on confidence ellipse theory.
The invention provides a method for evaluating the confidence of vehicle-end monocular vision mapping measurement, which calculates the confidence of a measurement result from a specified error confidence interval and thereby solves the problem of quantitatively evaluating the uncertainty of vehicle-end monocular measurement results. It directly provides the user with the error uncertainty of the current measurement, supplies a reliability evaluation basis for vision measurement results in applications such as visual positioning, visual navigation and visual mapping, and in turn guides the improvement of vision measurement accuracy.
Example 1
Embodiment 1 of the present invention is an embodiment of the method for evaluating the confidence of vehicle-end monocular vision mapping measurement. FIG. 2 is a flowchart of this embodiment; as can be seen from FIG. 2, the embodiment includes:
Step 1, initializing the error covariance matrix of each module of the vehicle-end monocular vision system, wherein the covariance matrices comprise the vehicle-end position and attitude error covariance matrix Ω_G and the image feature point matching pixel error covariance matrix Ω_P.
Preferably, in step 1, a Monte Carlo statistical method is adopted to calculate each error covariance matrix of the vehicle-end monocular vision system.
The system covariance is calculated by adopting a Monte Carlo statistical method, and the accuracy of parameter estimation is improved.
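For illustration, the following sketch (Python with NumPy) shows how a module's error covariance matrix can be estimated by Monte Carlo sampling; the error sampler, sample count and all numeric values are hypothetical assumptions, not taken from the patent.

```python
import numpy as np

def monte_carlo_covariance(sample_error_fn, num_samples=10000):
    """Estimate an error covariance matrix by Monte Carlo sampling.

    sample_error_fn: callable returning one error vector per call (e.g. a
    2-D pixel matching error or a 6-D pose error). The sampler is a
    placeholder for whatever error source the module exposes.
    """
    samples = np.array([sample_error_fn() for _ in range(num_samples)])
    # rowvar=False: each row is one observation, each column one variable
    return np.cov(samples, rowvar=False)

# Hypothetical example: pixel matching error assumed zero-mean Gaussian
rng = np.random.default_rng(0)
omega_P = monte_carlo_covariance(lambda: rng.normal(0.0, 0.8, size=2))
print(omega_P)  # approaches diag(0.64, 0.64) as num_samples grows
```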
Step 2, calculating the position covariance matrix of the target point after passing through the monocular vision system.
Preferably, step 2 comprises:
Step 201, in combination with the monocular vision mapping process, calculating the position covariance matrices Ω_i (i = 1, ..., n) of the feature point set on the surface of the target point based on the ceres optimization library, where n is the number of feature points.
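The patent obtains Ω_i with the ceres optimization library. As a simplified, language-agnostic illustration of the underlying idea only (an assumption, not the patent's actual ceres-based computation), the following sketch propagates the pose covariance Ω_G and the pixel matching covariance Ω_P through caller-supplied Jacobians of the mapping function; the Jacobian and covariance values are hypothetical.

```python
import numpy as np

def propagate_feature_covariance(jac_pose, jac_pixel, omega_G, omega_P):
    """First-order propagation of pose and pixel-matching errors to a
    feature point's 2x2 lateral/longitudinal position covariance:
        Omega_i = J_g @ Omega_G @ J_g.T + J_p @ Omega_P @ J_p.T
    The Jacobians come from linearizing the monocular mapping function and
    are supplied by the caller here.
    """
    return jac_pose @ omega_G @ jac_pose.T + jac_pixel @ omega_P @ jac_pixel.T

# Hypothetical Jacobians and module covariances
J_g = np.random.default_rng(1).normal(size=(2, 6)) * 0.1   # w.r.t. 6-DoF pose
J_p = np.array([[0.02, 0.0], [0.0, 0.05]])                 # w.r.t. pixel error
omega_G = np.diag([0.01, 0.01, 0.01, 1e-4, 1e-4, 1e-4])
omega_P = np.diag([0.64, 0.64])
print(propagate_feature_covariance(J_g, J_p, omega_G, omega_P))
```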
Step 202, calculating the position covariance matrix Ω_O of the target point:
[Formula image not reproduced: Ω_O expressed in terms of the feature-point position covariance matrices Ω_i, i = 1, ..., n.]
Specifically, the position covariance matrix is the error distribution of the target in the lateral and longitudinal directions of the monocular camera coordinate system, and is a 2 × 2 square matrix:
Ω_O = | Ω_11  Ω_12 |
      | Ω_21  Ω_22 |
Each target point has three-dimensional spatial position information, so its position covariance matrix is a 3 × 3 matrix. However, the error of the target point position in the height direction can be considered negligible, and mainly the errors in the lateral and longitudinal directions of the monocular camera coordinate system are considered, so the position covariance matrix is reduced to a 2 × 2 square matrix, expressed here as:
Ω_O = | Ω_11  Ω_12 |
      | Ω_21  Ω_22 |
The covariance matrix is symmetric, i.e. Ω_12 = Ω_21.
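A minimal sketch of step 202 follows. Since the formula image for Ω_O is not reproduced above, the sketch assumes Ω_O is taken as the mean of the feature-point covariances Ω_i; this averaging, and the sample covariance values, are assumptions for illustration only.

```python
import numpy as np

def target_point_covariance(feature_covs):
    """Combine per-feature-point 2x2 position covariances into one
    target-point covariance Omega_O.

    Assumption: Omega_O is the mean of the feature-point covariances,
    already reduced to the lateral/longitudinal plane (height ignored).
    """
    omega_o = np.mean(np.stack(feature_covs), axis=0)
    # Enforce symmetry (Omega_12 == Omega_21) against numerical noise
    return 0.5 * (omega_o + omega_o.T)

# Hypothetical feature-point covariances in the camera's lateral/longitudinal plane
covs = [np.array([[0.04, 0.01], [0.01, 0.09]]),
        np.array([[0.05, 0.00], [0.00, 0.08]])]
print(target_point_covariance(covs))
```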
Further, the calculation of the boundary value of the position covariance matrix of the target point is mainly based on a comparison method, and the process includes:
Step 203, initializing the set Ω = {Φ} of position covariance matrices of the current target points, and adding the covariance matrix of the first target point position, at which time Ω = {Ω_1}.
Step 204, sequentially comparing the differences between the diagonal elements of the position covariance matrix of the new target point and those of the current target points, and adding the position covariance matrix of the new target point to the set Ω when the difference exceeds a set threshold.
Step 205, the estimated boundary value of the position covariance matrix of the target point is:
[Formula image not reproduced: estimated boundary value of the target-point position covariance matrix, computed over the N matrices in the set Ω.]
n represents the number of position covariance matrices of the target point, and may be, for example, 30.
Step 3, calculating the confidence corresponding to the specified error radius from the position covariance matrix based on confidence ellipse theory.
Preferably, in step 3, the error radius is set to R according to the measurement accuracy requirement, and the corresponding confidence CIf is calculated from the relationship between the confidence and the confidence interval as: CIf = 1 - exp(-R²/(2λ)).
[Formula image not reproduced: λ expressed in terms of Ω(1,1), Ω(2,2) and Ω(1,2) of the target-point position covariance matrix.]
Ω (1,1), Ω (2,2), and Ω (1,2) respectively denote elements of corresponding positions in the covariance matrix of the target position.
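A minimal sketch of the confidence computation follows. Since the formula image defining λ is not reproduced, the sketch assumes λ is the larger eigenvalue of the 2 × 2 target-point covariance (a standard choice in confidence ellipse theory), computed from Ω(1,1), Ω(2,2) and Ω(1,2); the covariance values and error radius are hypothetical.

```python
import math
import numpy as np

def mapping_confidence(omega, radius):
    """Confidence CIf = 1 - exp(-R^2 / (2 * lambda)) for error radius R.

    Assumption: lambda is the larger eigenvalue of the 2x2 target-point
    position covariance, i.e. the squared semi-major-axis scale of the
    confidence ellipse, computed from Omega(1,1), Omega(2,2), Omega(1,2).
    """
    a, b, c = omega[0, 0], omega[1, 1], omega[0, 1]
    lam = 0.5 * (a + b) + math.sqrt((0.5 * (a - b)) ** 2 + c ** 2)
    return 1.0 - math.exp(-radius ** 2 / (2.0 * lam))

# Hypothetical target-point covariance (in metres^2) and a 0.5 m error radius
omega_o = np.array([[0.04, 0.01], [0.01, 0.09]])
print(round(mapping_confidence(omega_o, 0.5), 3))
```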
Example 2
Embodiment 2 of the present invention is an embodiment of the vehicle-end monocular vision mapping measurement confidence evaluation system. FIG. 3 is a structural block diagram of this embodiment; as can be seen from FIG. 3, the system includes: an initialization module 101, a position covariance matrix estimation module 102 and a confidence calculation module 103.
The initialization module 101 is configured to initialize each error covariance matrix of the vehicle-end monocular vision system, where the covariance matrices include the vehicle-end position and attitude error covariance matrix Ω_G and the image feature point matching pixel error covariance matrix Ω_P.
And the position covariance matrix estimation module 102 is configured to calculate a position covariance matrix of the target point after the monocular vision mapping.
The confidence calculation module 103 is configured to calculate the confidence corresponding to the specified error radius from the position covariance matrix based on confidence ellipse theory.
FIG. 4 is a schematic physical structure diagram of an electronic device according to an embodiment of the present invention. As shown in FIG. 4, the electronic device may include: a processor 201, a communication interface 202, a memory 203 and a communication bus 204, where the processor 201, the communication interface 202 and the memory 203 communicate with each other through the communication bus 204. The processor 201 may invoke a computer program stored in the memory 203 and operable on the processor 201 to execute the method for evaluating the confidence of vehicle-end monocular vision mapping measurement provided by the above embodiments, which for example includes: Step 1, initializing each error covariance matrix of the vehicle-end monocular vision system, where the covariance matrices include the vehicle-end position and attitude error covariance matrix Ω_G and the image feature point matching pixel error covariance matrix Ω_P; Step 2, calculating the position covariance matrix of the target point after monocular vision mapping; and Step 3, calculating the confidence corresponding to the specified error radius from the position covariance matrix based on confidence ellipse theory.
An embodiment of the present invention further provides a non-transitory computer-readable storage medium on which a computer program is stored. When executed by a processor, the computer program implements the method for evaluating the confidence of vehicle-end monocular vision mapping measurement provided by the above embodiments, which for example includes: Step 1, initializing each error covariance matrix of the vehicle-end monocular vision system, where the covariance matrices include the vehicle-end position and attitude error covariance matrix Ω_G and the image feature point matching pixel error covariance matrix Ω_P; Step 2, calculating the position covariance matrix of the target point after monocular vision mapping; and Step 3, calculating the confidence corresponding to the specified error radius from the position covariance matrix based on confidence ellipse theory.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (9)

1. A method for evaluating confidence of monocular vision mapping measurement at a vehicle end is characterized by comprising the following steps:
Step 1, initializing the covariance matrix of each module of the vehicle-end monocular vision system, wherein the covariance matrices comprise the vehicle-end position and attitude error covariance matrix Ω_G and the image feature point matching pixel error covariance matrix Ω_P;
Step 2, calculating a position covariance matrix of the target point after passing through the monocular vision system;
and 3, calculating the corresponding confidence coefficient under the specified error radius according to the position covariance matrix based on a confidence ellipse theory.
2. The method of claim 1, wherein the covariance matrix of each module of the vehicle-end monocular vision system is calculated in step 1 using a Monte Carlo statistical method.
3. The method according to claim 1, wherein the step 2 comprises:
Step 201, in combination with the monocular vision mapping process and based on the ceres optimization library, calculating the set of position covariance matrices Ω_i (i = 1, ..., n) of the feature point set on the target point surface, wherein n is the number of feature points;
Step 202, calculating the position covariance matrix Ω_O of the target point:
[Formula image not reproduced: Ω_O expressed in terms of the feature-point position covariance matrices Ω_i, i = 1, ..., n.]
4. The method of claim 1, wherein the position covariance matrix represents the error of the target in the lateral and longitudinal directions of the monocular camera coordinate system and is a 2 × 2 square matrix:
Ω_O = | Ω_11  Ω_12 |
      | Ω_21  Ω_22 |
5. The method of claim 1, wherein step 2 further comprises: perfecting the boundary value of the position covariance matrix of the target point according to the sample data;
the process of calculating the boundary values of the position covariance matrix of the target point includes:
Step 203, initializing the set Ω = {Φ} of position covariance matrices of the current target points, and adding the covariance matrix of the first target point position, at which time Ω = {Ω_1};
Step 204, sequentially comparing the differences between the diagonal elements of the position covariance matrix of the new target point and those of the current target points, and adding the position covariance matrix of the new target point to the set Ω of position covariance matrices when the difference exceeds a set threshold;
Step 205, the estimated boundary value of the position covariance matrix of the target point is:
[Formula image not reproduced: estimated boundary value of the target-point position covariance matrix, computed over the N matrices in the set Ω.]
n represents the number of position covariance matrices of the target point.
6. The method according to claim 5, wherein in step 3, the error radius is set to R according to the measurement accuracy requirement, and the corresponding confidence CIf is calculated from the relationship between the confidence and the confidence interval as: CIf = 1 - exp(-R²/(2λ));
[Formula image not reproduced: λ expressed in terms of Ω(1,1), Ω(2,2) and Ω(1,2) of the target-point position covariance matrix.]
Ω (1,1), Ω (2,2), and Ω (1,2) respectively denote elements of corresponding positions in the covariance matrix of the target position.
7. A vehicle-end monocular vision mapping measurement confidence evaluation system is characterized by comprising: the device comprises an initialization module, a position covariance matrix estimation module and a confidence coefficient calculation module;
The initialization module is used for initializing the error covariance matrix of each module of the vehicle-end monocular vision system, wherein the covariance matrices comprise the vehicle-end position and attitude error covariance matrix Ω_G and the image feature point matching pixel error covariance matrix Ω_P;
The position covariance matrix estimation module is used for calculating a position covariance matrix of the target point after monocular vision mapping;
and the confidence coefficient calculation module is used for calculating the corresponding confidence coefficient under the specified error radius.
8. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor when executing the program implements the steps of the method for vehicle-end monocular visual mapping measurement confidence assessment according to any one of claims 1 to 7.
9. A non-transitory computer readable storage medium, having stored thereon a computer program, wherein the computer program, when being executed by a processor, implements the steps of the method for vehicle end monocular vision mapping measurement confidence assessment according to any one of claims 1 to 7.
CN202011158441.6A 2020-10-26 2020-10-26 Method and system for evaluating confidence of monocular vision mapping measurement at vehicle end Active CN112258577B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011158441.6A CN112258577B (en) 2020-10-26 2020-10-26 Method and system for evaluating confidence of monocular vision mapping measurement at vehicle end

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011158441.6A CN112258577B (en) 2020-10-26 2020-10-26 Method and system for evaluating confidence of monocular vision mapping measurement at vehicle end

Publications (2)

Publication Number Publication Date
CN112258577A true CN112258577A (en) 2021-01-22
CN112258577B CN112258577B (en) 2022-06-17

Family

ID=74262454

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011158441.6A Active CN112258577B (en) 2020-10-26 2020-10-26 Method and system for evaluating confidence of monocular vision mapping measurement at vehicle end

Country Status (1)

Country Link
CN (1) CN112258577B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110074569A1 (en) * 2009-09-30 2011-03-31 Nayef Alsindi Method and Network for Determining Positions of Wireless Nodes While Minimizing Propagation of Positioning Errors
CN110617815A (en) * 2018-06-19 2019-12-27 上海汽车集团股份有限公司 Method and device for automatic driving monitoring alarm
CN109931940A (en) * 2019-01-22 2019-06-25 广东工业大学 A kind of robot localization method for evaluating confidence based on monocular vision
US20200240793A1 (en) * 2019-01-28 2020-07-30 Qfeeltech (Beijing) Co., Ltd. Methods, apparatus, and systems for localization and mapping
CN111310772A (en) * 2020-03-16 2020-06-19 上海交通大学 Point line feature selection method and system for binocular vision SLAM

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
MOSTAFA OSMAN, ET AL.: "A novel online approach for drift covariance estimation of odometries used in intelligent vehicle localization", Sensors 2019 *
敬泽 et al.: "Spatial target position measurement based on monocular vision", Transducer and Microsystem Technologies *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113514052A (en) * 2021-06-10 2021-10-19 西安因诺航空科技有限公司 Multi-machine cooperation high-precision active target positioning method and system

Also Published As

Publication number Publication date
CN112258577B (en) 2022-06-17

Similar Documents

Publication Publication Date Title
CN111208492B (en) Vehicle-mounted laser radar external parameter calibration method and device, computer equipment and storage medium
US10247556B2 (en) Method for processing feature measurements in vision-aided inertial navigation
CN108885791B (en) Ground detection method, related device and computer readable storage medium
US10754922B2 (en) Method and apparatus for sensor fusion
US9513130B1 (en) Variable environment high integrity registration transformation system and related method
US20160379365A1 (en) Camera calibration device, camera calibration method, and camera calibration program
CN112258577B (en) Method and system for evaluating confidence of monocular vision mapping measurement at vehicle end
CN114926549B (en) Three-dimensional point cloud processing method, device, equipment and storage medium
CN113740871A (en) Laser SLAM method, system equipment and storage medium in high dynamic environment
Sibley et al. A sliding window filter for incremental SLAM
CN113554712B (en) Registration method and device of automatic driving vehicle, electronic equipment and vehicle
CN114255274A (en) Vehicle positioning method, system, equipment and storage medium based on two-dimension code recognition
CN116883460A (en) Visual perception positioning method and device, electronic equipment and storage medium
CN113295159A (en) Positioning method and device for end cloud integration and computer readable storage medium
CN114355393A (en) Three-antenna attitude estimation method based on low-cost receiver
CN114812601A (en) State estimation method and device of visual inertial odometer and electronic equipment
CN109489658B (en) Moving target positioning method and device and terminal equipment
CN114488042A (en) Laser radar calibration method and device, electronic equipment and storage medium
CN112325770B (en) Method and system for evaluating confidence of relative precision of monocular vision measurement at vehicle end
CN112633043B (en) Lane line determining method and device, electronic equipment and storage medium
CN112330735B (en) Method and system for evaluating confidence of measurement accuracy of relative position of vehicle body
CN115311635B (en) Lane line processing method, device, equipment and storage medium
CN117433511B (en) Multi-sensor fusion positioning method
CN117058430B (en) Method, apparatus, electronic device and storage medium for field of view matching
CN114612544B (en) Image processing method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant