CN115790613A - Visual information assisted inertial/odometer integrated navigation method and device - Google Patents

Visual information assisted inertial/odometer integrated navigation method and device

Info

Publication number
CN115790613A
Authority
CN
China
Prior art keywords: navigation, matrix, map, attitude, vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211411406.XA
Other languages
Chinese (zh)
Inventor
李子月 (Li Ziyue)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Kunpeng Borui Technology Co.,Ltd.
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN202211411406.XA
Publication of CN115790613A
Legal status: Pending

Abstract

The invention discloses a visual information assisted inertial/odometer integrated navigation method and device. The method comprises the following steps: constructing a road attitude matrix according to a high-precision map and the current vehicle position; detecting and identifying the lane line with a machine vision algorithm based on a camera installed on the vehicle, thereby calculating the included angle β₀ between the lane line and the vehicle; computing a visual/map attitude matrix C_V/map according to the road attitude matrix and the included angle β₀ between the lane line and the vehicle; constructing an attitude combination filter model, taking the visual/map attitude matrix C_V/map and the inertial attitude matrix output by inertial navigation as observations, and estimating and correcting the errors of the inertial attitude matrix; acquiring the fusion attitude matrix output by the attitude combination filter model and decomposing the odometer speed into the navigation coordinate system; and constructing a filter model of the inertial/odometer integrated navigation system, and correcting the inertial navigation errors, device errors and odometer errors so as to perform navigation.

Description

Visual information assisted inertial/odometer integrated navigation method and device
Technical Field
The invention belongs to the technical field of multi-sensor data fusion, and particularly relates to a visual information-assisted inertial/odometer integrated navigation method and device.
Background
Among the many bottleneck technologies, high-precision, reliable and continuous navigation and positioning is one of the necessary conditions for stable operation of an autonomous vehicle. Satellite navigation and positioning based on real-time kinematic differential corrections is the most widely used technology in autonomous driving, but it is easily subject to signal interruption caused by environmental interference. In urban scenes in particular, environmental factors such as high-rise buildings, overpasses, tunnels and dense tree canopies cause occlusion, weak signals and multipath effects on the satellite signals, greatly reducing the data availability of the navigation system.
In the complex environments in which autonomous vehicles operate, it is difficult for any single positioning sensor to achieve accurate and reliable navigation and positioning. Being based on different measurement principles, positioning sensors such as RTK GNSS, inertial navigation equipment, odometers, lidar and vision each have their own advantages and disadvantages and are strongly complementary. To solve the problems of low reliability, poor continuity and insufficient robustness of the navigation system, integrated navigation technology that fuses the measurement data of multiple sensors has therefore become an important research direction.
In a satellite navigation denied environment, inertial/odometer integrated navigation is an extremely important navigation mode. It offers comprehensive information, strong autonomy, high reliability and stable performance, and the accuracy retention of the inertial/odometer integrated navigation system can be further improved by estimating and compensating the odometer installation errors and scale factor error as state variables. However, because the odometer provides only a single kind of measurement, the measurement information used by the integrated navigation depends on the attitude matrix output by the inertial navigation, so the observability of the state variables is low, the error compensation effect is weak, and it is difficult for the navigation system to maintain accuracy over long periods.
Disclosure of Invention
In view of the problems that the traditional inertial/odometer integrated navigation system has low observability of its state variables, poor error estimation performance and difficulty in maintaining accuracy over long periods in a satellite denied environment, the embodiments of the present application aim to provide a visual information assisted inertial/odometer integrated navigation method and device.
According to a first aspect of embodiments of the present application, there is provided a visual information assisted combined inertial/odometer navigation method, comprising:
constructing a road attitude matrix according to a high-precision map and the current vehicle position;
detecting and identifying the lane line with a machine vision algorithm based on a camera installed on the vehicle, thereby calculating the included angle β₀ between the lane line and the vehicle;
computing a visual/map attitude matrix C_V/map according to the road attitude matrix and the included angle β₀ between the lane line and the vehicle;
constructing an attitude combination filter model, taking the visual/map attitude matrix C_V/map and the inertial attitude matrix output by inertial navigation as observations, and estimating and correcting the errors of the inertial attitude matrix;
acquiring a fusion attitude matrix output by the attitude combination filter model, and decomposing the speed of the odometer into a navigation coordinate system;
and constructing a filter model of the inertial/odometer combined navigation system, and correcting inertial navigation errors, device errors and odometer errors so as to perform navigation.
Further, constructing the road attitude matrix according to the high-precision map and the current vehicle position comprises the following steps:
obtaining the direction angle φ, slope angle θ and cross slope angle γ of the road based on the current position and the high-precision map;
assuming the vehicle is a rigid body and the tires remain in good contact with the ground, approximating the road direction angle, slope angle and cross slope angle as the vehicle heading angle, pitch angle and roll angle, respectively, and constructing the road attitude matrix from these three angles:
[formula image]
Further, detecting and identifying the lane line with a machine vision algorithm based on a camera installed on the vehicle, thereby calculating the included angle β₀ between the lane line and the vehicle, comprises the following steps:
acquiring the lane line coordinates: establishing a binocular camera carrier coordinate system with the camera center as the origin, the direction straight ahead as the Y axis and the direction horizontally to the right as the X axis, detecting and identifying the lane line with a machine vision algorithm, and projecting the three-dimensional lane line coordinates A(x_i, y_i, z_i) in the binocular camera carrier coordinate system onto the O-XY plane to obtain the two-dimensional lane line coordinates A(x_i, y_i);
acquiring the lane line tangent equation: in the horizontal coordinate system, fitting the points A(x_i, y_i) with a quadratic function to obtain the lane line function curve, and differentiating this curve to obtain the tangent expression of the lane line;
acquiring the intersection point between the vehicle and the lane line: calculating the lane line starting point A_1(x_1, y_1) in the camera field of view from the camera mounting height h and the vertical field-of-view angle α, the vertical distance to the camera being Δy = h/tan α, and obtaining the lane line coordinate A_0(x_0, y_0) corresponding to the current vehicle position from the starting point A_1(x_1, y_1) and the vertical distance Δy, where y_0 = y_1 − Δy;
calculating the included angle β₀ between the lane line and the vehicle: calculating the slope k_c of the tangent to the quadratic curve y = f(x) at the point A_0(x_0, y_0), and approximating the angle between this tangent and the Y axis as the angle between the lane line and the current driving direction of the vehicle.
Further, the visual/map attitude matrix C_V/map is computed from the road attitude matrix and the included angle β₀ between the lane line and the vehicle according to the following formula:
[formula image]
Further, the attitude combination filter model includes an observation equation and a state equation.
The observation equation is as follows:
[observation equation formula image]
where δγ and δθ denote the observed roll-angle and pitch-angle differences between the inertial attitude matrix and the visual/map attitude matrix, the remaining observed quantity being the corresponding heading-angle difference; γ_IMU and θ_IMU denote the roll angle and pitch angle output by the inertial navigation, and γ_V/map and θ_V/map denote the roll angle and pitch angle obtained from the visual/map attitude matrix, each accompanied by the corresponding heading angle; and φ_e, φ_n, φ_u denote the misalignment angles in the east, north and up directions, respectively.
Selecting the attitude, position, velocity, gyroscope zero bias and accelerometer zero bias errors as the system state vector, the state equation is as follows:
[state equation formula images]
where v_E, v_N, v_U denote the east, north and up velocities; δv_E, δv_N denote the east and north velocity errors; ε_x, ε_y, ε_z denote the three gyroscope errors; ω_ie denotes the rotational angular velocity of the Earth; L, R_M and R_N denote the latitude, the meridian radius of curvature and the prime vertical radius of curvature of the Earth; the accelerometer bias terms denote the three accelerometer errors; and f_E, f_N, f_U denote the specific forces in the east, north and up directions.
Further, acquiring the fusion attitude matrix output by the attitude combination filter model and decomposing the odometer speed into the navigation coordinate system includes:
updating the state of the attitude fusion filter according to the data output frequency of the inertial navigation equipment;
performing the measurement update of the attitude fusion filter according to the data output frequency of the map attitude matrix, reading the attitude angle parameters output by the filter in real time, and constructing the fusion attitude matrix;
decomposing the vehicle speed data output by the odometer to obtain the velocity measurement in the navigation coordinate system, as given by formula (8):
[formula (8) image]
where the quantities appearing in formula (8) are: the odometer velocity measurement expressed in the navigation coordinate system; the fusion attitude matrix from the vehicle body coordinate system to the navigation coordinate system; the attitude matrix from the odometer coordinate system to the vehicle body coordinate system, composed of the odometer installation error parameters, namely the pitch angle δθ, the roll angle δγ and the heading angle installation error; and the speed measurement directly output by the odometer.
Further, constructing the inertial/odometer integrated navigation system filter model and correcting the inertial navigation errors, device errors and odometer errors comprises the following steps:
selecting the attitude, position, velocity, gyroscope zero bias, accelerometer zero bias and odometer errors as the system state variables of the inertial/odometer integrated navigation system:
[state vector formula image]
where δθ and δK_D denote the odometer pitch-angle installation error and scale factor error, and the remaining odometer state denotes the heading-angle installation error;
the state equation of the inertial/odometer integrated navigation system is as follows:
[state equation formula image]
where the specific expressions of (F_INS)_9×9 and (F_sg)_9×6 are given in the state equation of the attitude combination filter model;
the observation equation of the inertial/odometer integrated navigation system is as follows:
[observation equation formula image]
where the quantities appearing in the observation equation are: the inertial navigation velocity and velocity error in the navigation coordinate system; the odometer velocity and velocity error in the navigation coordinate system; the cross-product (skew-symmetric) matrix of the speed measurement directly output by the odometer; the identity matrix I; and the observation noise matrix V.
According to a second aspect of embodiments herein, there is provided a combined inertial/odometer navigation device assisted by visual information, comprising:
a construction module for constructing a road attitude matrix according to the high-precision map and the current vehicle position;
a first calculation module for detecting and identifying the lane line with a machine vision algorithm based on a camera installed on the vehicle, thereby calculating the included angle β₀ between the lane line and the vehicle;
a second calculation module for computing a visual/map attitude matrix C_V/map according to the road attitude matrix and the included angle β₀ between the lane line and the vehicle;
a correction module for constructing an attitude combination filter model, taking the visual/map attitude matrix C_V/map and the inertial attitude matrix output by inertial navigation as observations, and estimating and correcting the errors of the inertial attitude matrix;
the decomposition module is used for acquiring a fusion attitude matrix output by the attitude combination filter model and decomposing the speed of the odometer into a navigation coordinate system;
and a navigation module for constructing an inertial/odometer integrated navigation system filter model and correcting the inertial navigation errors, device errors and odometer errors so as to perform navigation.
According to a third aspect of embodiments herein, there is provided an electronic device comprising:
one or more processors;
a memory for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method described in the first aspect.
According to a fourth aspect of embodiments herein, there is provided a computer readable storage medium having stored thereon computer instructions which, when executed by a processor, carry out the steps of the method according to the first aspect.
The technical scheme provided by the embodiment of the application can have the following beneficial effects:
According to the above embodiments, by adopting multi-sensor (inertial navigation, camera and odometer) data fusion, constructing an attitude matrix based on vision and a high-precision map, and decomposing the odometer data with it, error decoupling between the odometer data and the attitude matrix in the inertial/odometer integrated navigation system is realized, the problem of rapid error divergence caused by insufficient observability is effectively solved, and the accuracy of the integrated navigation system is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
FIG. 1 is a schematic diagram illustrating a visually assisted inertial/odometer combined navigation method according to an exemplary embodiment.
FIG. 2 is a diagram illustrating a lane line and vehicle angle calculation schematic according to an exemplary embodiment.
FIG. 3 is a block diagram illustrating a visual information assisted combined inertial/odometer navigation device, according to an example embodiment.
FIG. 4 is a schematic diagram illustrating an electronic device in accordance with an example embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. The following description refers to the accompanying drawings in which the same numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. The word "if" as used herein may be interpreted as "when", "upon" or "in response to determining", depending on the context.
FIG. 1 is a flow chart illustrating a visual information assisted inertial/odometer integrated navigation method according to an exemplary embodiment. As shown in FIG. 1, the method may include the following steps:
step S11: constructing a road attitude matrix according to a high-precision map and the current vehicle position;
Step S12: based on a camera installed on the vehicle, the lane line is detected and identified by using a machine vision algorithm, so that an included angle beta between the lane line and the vehicle is calculated 0
Step S13: according to the road attitude matrix
Figure SMS_35
Angle beta between lane line and vehicle 0 Computing a visual/map pose matrix C V/map
Step S14: constructing a posture combination filter model, and combining a visual sense/map posture matrix C V/map And taking an inertial attitude matrix output by inertial navigation as an observed quantity, and estimating and correcting errors of the inertial attitude matrix;
step S15: acquiring a fusion attitude matrix output by the attitude combination filter model, and decomposing the speed of the odometer into a navigation coordinate system;
step S16: and constructing a filter model of the inertial/odometer combined navigation system, and correcting inertial navigation errors, device errors and odometer errors so as to perform navigation.
According to the above embodiments, by adopting multi-sensor (inertial navigation, camera and odometer) data fusion, constructing an attitude matrix based on vision and a high-precision map, and decomposing the odometer data with it, error decoupling between the odometer data and the attitude matrix in the inertial/odometer integrated navigation system is realized, the problem of rapid error divergence caused by insufficient observability is effectively solved, and the accuracy of the integrated navigation system is improved.
In the specific implementation of step S11, a road attitude matrix is constructed according to the high-precision map and the current vehicle position.
Specifically, compared with an ordinary map, the high-precision map provides map information with higher precision and richer content. Its position data can reach decimeter-level precision and also includes the slope, cross slope angle and direction angle parameters of the road; in terms of information types, besides the elements common to ordinary maps, it also contains dozens or even hundreds of element types such as street lamps, guardrails and traffic lights. This step may include the following sub-steps:
obtaining the direction angle φ, slope angle θ and cross slope angle γ of the road based on the current position and the high-precision map;
assuming the vehicle is a rigid body and the tires remain in good contact with the ground, approximating the road direction angle, slope angle and cross slope angle as the vehicle heading angle, pitch angle and roll angle, respectively, and constructing the road attitude matrix from these three angles:
[formula image]
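For illustration only, the following Python sketch shows one way such a road attitude matrix could be assembled from the three angles read from the high-precision map. The Z-Y-X (heading-pitch-roll) rotation order, the sign conventions and the function name are assumptions made for this sketch; the patent's exact matrix expression is given only in the original formula image.

```python
import numpy as np

def road_attitude_matrix(heading, pitch, roll):
    """Hypothetical sketch: direction-cosine matrix built from the road
    direction angle (used as heading), slope angle (used as pitch) and
    cross slope angle (used as roll), angles in radians. The Z-Y-X
    rotation order and signs are assumptions, not the patent's formula."""
    ch, sh = np.cos(heading), np.sin(heading)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[ch, -sh, 0.0], [sh, ch, 0.0], [0.0, 0.0, 1.0]])  # heading about z
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])  # pitch about y
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])  # roll about x
    return Rz @ Ry @ Rx
```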
In the specific implementation of step S12, the lane line is detected and identified by a machine vision algorithm based on a camera mounted on the vehicle, so as to calculate the included angle β₀ between the lane line and the vehicle.
In particular, this step may comprise the following sub-steps:
step S21: acquiring the lane line coordinates: establishing a binocular camera carrier coordinate system with the camera center as the origin, the direction straight ahead as the Y axis and the direction horizontally to the right as the X axis, detecting and identifying the lane line with a machine vision algorithm, and projecting the three-dimensional lane line coordinates A(x_i, y_i, z_i) in the binocular camera carrier coordinate system onto the O-XY plane to obtain the two-dimensional lane line coordinates A(x_i, y_i);
step S22: acquiring the lane line tangent equation: in the horizontal coordinate system, fitting the points A(x_i, y_i) with a quadratic function to obtain the lane line function curve, and differentiating this curve to obtain the tangent expression of the lane line;
step S23: acquiring the intersection point between the vehicle and the lane line: calculating the lane line starting point A_1(x_1, y_1) in the camera field of view from the camera mounting height h and the vertical field-of-view angle α, the vertical distance to the camera being Δy = h/tan α, and obtaining the lane line coordinate A_0(x_0, y_0) corresponding to the current vehicle position from the starting point A_1(x_1, y_1) and the vertical distance Δy, where y_0 = y_1 − Δy;
step S24: calculating the included angle β₀ between the lane line and the vehicle: calculating the slope k_c of the tangent to the quadratic curve y = f(x) at the point A_0(x_0, y_0), and approximating the angle between this tangent and the Y axis as the angle between the lane line and the current driving direction of the vehicle.
In one embodiment of steps S21-S24, as shown in FIG. 2, the camera is mounted horizontally and centered on the front windshield of the vehicle, with its coordinate axes parallel to the vehicle body coordinate system. The O-XY rectangular coordinate system is the projection of the camera coordinate system in the horizontal direction: the origin O is located at the center point of the camera, the Y axis points straight ahead of the vehicle, and the X axis is perpendicular to the Y axis and points to the right side of the vehicle body. y = f(x) is the fitted function curve of the lane line, A_1(x_1, y_1) is the intersection of the lower boundary of the camera image and the lane line, A_0(x_0, y_0) is the intersection of the extension of the lane line and the X axis, y = k_0·x + b_0 is the tangent equation at the point A_0(x_0, y_0), and the angle between this tangent and the Y axis is β₀.
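A minimal sketch of steps S21-S24 follows: fit a quadratic to the projected lane line points, extend the lane line back to the vehicle position using the camera height h and the vertical field-of-view angle α, and take the tangent angle relative to the Y axis as β₀. The function name, the choice of the nearest visible point as A_1 and the root selection are illustrative assumptions, not the patent's exact procedure.

```python
import numpy as np

def lane_vehicle_angle(lane_xy, h, alpha):
    """Illustrative sketch of steps S21-S24 (assumptions, not the patent's code).
    lane_xy: Nx2 array of lane line points projected onto the O-XY plane
             (X to the right, Y straight ahead of the camera).
    h: camera mounting height; alpha: vertical field-of-view angle (radians).
    Returns beta_0, the angle between the lane line tangent and the Y axis."""
    x, y = lane_xy[:, 0], lane_xy[:, 1]
    a, b, c = np.polyfit(x, y, 2)            # S22: quadratic fit y = a*x^2 + b*x + c
    x1, y1 = lane_xy[np.argmin(y)]           # A1: nearest visible lane line point
    dy = h / np.tan(alpha)                   # vertical distance from camera to A1
    y0 = y1 - dy                             # S23: extend back to the vehicle position
    roots = np.roots([a, b, c - y0])         # x-coordinates where the curve reaches y0
    roots = roots[np.isreal(roots)].real
    x0 = roots[np.argmin(np.abs(roots - x1))]
    k_c = 2.0 * a * x0 + b                   # S24: tangent slope dy/dx at A0
    # Angle between the tangent direction (1/k_c, 1) and the Y axis (driving direction).
    beta_0 = np.arctan2(1.0 / k_c, 1.0) if k_c != 0 else np.pi / 2.0
    return beta_0
```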
In the specific implementation of step S13, the visual/map attitude matrix C_V/map is computed from the road attitude matrix and the included angle β₀ between the lane line and the vehicle.
Specifically, the visual/map attitude matrix C_V/map is given by the following formula:
[formula image]
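Since the patent's formula for C_V/map appears only as an image, the following sketch shows one plausible composition under the assumption that the visual/map attitude matrix is obtained by rotating the road attitude matrix about the vertical axis by the lane-vehicle angle β₀. This composition and the rotation direction are assumptions for illustration, not the patent's confirmed expression.

```python
import numpy as np

def visual_map_attitude(C_road, beta_0):
    """Assumed sketch: correct the road attitude matrix by the observed
    lane-vehicle angle beta_0 about the vertical (heading) axis to obtain
    C_V/map. The actual formula in the patent is given only as an image."""
    cb, sb = np.cos(beta_0), np.sin(beta_0)
    R_beta = np.array([[cb, -sb, 0.0],
                       [sb,  cb, 0.0],
                       [0.0, 0.0, 1.0]])   # rotation about the up axis by beta_0
    return C_road @ R_beta
```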
in the specific implementation of step S14, a pose combining filter model is constructed, and a visual/map pose matrix C is formed V/map And taking an inertial attitude matrix output by inertial navigation as an observed quantity, and estimating and correcting errors of the inertial attitude matrix;
in particular, a kalman filter may be employed as the attitude combining filter model, and the remaining filters may be selected, which is set as is conventional in the art. The attitude combining filter model includes an observation equation and a state equation,
the observation equation is as follows:
[observation equation formula image]
where δγ and δθ denote the observed roll-angle and pitch-angle differences between the inertial attitude matrix and the visual/map attitude matrix, the remaining observed quantity being the corresponding heading-angle difference; γ_IMU and θ_IMU denote the roll angle and pitch angle output by the inertial navigation, and γ_V/map and θ_V/map denote the roll angle and pitch angle obtained from the visual/map attitude matrix, each accompanied by the corresponding heading angle; and φ_e, φ_n, φ_u denote the misalignment angles in the east, north and up directions, respectively.
Selecting the attitude, position, velocity, gyroscope zero bias and accelerometer zero bias errors as the system state vector:
[state vector formula image]
the state equation is as follows:
[state equation formula images]
where v_E, v_N, v_U denote the east, north and up velocities; δv_E, δv_N denote the east and north velocity errors; ε_x, ε_y, ε_z denote the three gyroscope errors; ω_ie denotes the rotational angular velocity of the Earth; L, R_M and R_N denote the latitude, the meridian radius of curvature and the prime vertical radius of curvature of the Earth; the accelerometer bias terms denote the three accelerometer errors; f_E, f_N, f_U denote the specific forces in the east, north and up directions; and c_ij denotes the element in the i-th row and j-th column of the fusion attitude matrix.
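As a rough illustration of how the two attitude sources could feed the attitude combination filter, the sketch below forms the measurement vector from the Euler-angle differences between the inertial attitude and the visual/map attitude and applies a standard Kalman measurement update. The 15-state layout with the misalignment angles in the first three slots and the simple measurement matrix are assumptions for this sketch; the patent's observation and state equations are given only as formula images.

```python
import numpy as np

def attitude_measurement_update(x, P, euler_imu, euler_vmap, R):
    """Assumed sketch of the attitude combination filter measurement update.
    x: 15-element error state with the misalignment angles [phi_e, phi_n, phi_u]
       assumed to occupy the first three slots; P: state covariance.
    euler_imu, euler_vmap: (roll, pitch, heading) from inertial navigation and
    from the visual/map attitude matrix; R: 3x3 measurement noise covariance."""
    z = np.asarray(euler_imu) - np.asarray(euler_vmap)  # [d_gamma, d_theta, d_heading]
    H = np.zeros((3, x.size))
    H[:, 0:3] = np.eye(3)        # measurement mapped (approximately) to the misalignment angles
    S = H @ P @ H.T + R          # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(x.size) - K @ H) @ P
    return x, P
```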
In the specific implementation of step S15, the fusion attitude matrix output by the attitude combination filter model is acquired, and the odometer speed is decomposed into the navigation coordinate system.
Specifically, this step may comprise the following sub-steps:
updating the state of the attitude fusion filter according to the data output frequency of the inertial navigation equipment;
performing the measurement update of the attitude fusion filter according to the data output frequency of the map attitude matrix, reading the attitude angle parameters output by the filter in real time, and constructing the fusion attitude matrix;
decomposing the vehicle speed data output by the odometer to obtain the velocity measurement in the navigation coordinate system, as given by formula (8):
[formula (8) image]
where the quantities appearing in formula (8) are: the odometer velocity measurement expressed in the navigation coordinate system; the fusion attitude matrix from the vehicle body coordinate system to the navigation coordinate system; the attitude matrix from the odometer coordinate system to the vehicle body coordinate system, composed of the odometer installation error parameters, namely the pitch angle δθ, the roll angle δγ and the heading angle installation error; and the speed measurement directly output by the odometer.
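A short sketch of the velocity decomposition follows: the raw odometer speed is first rotated into the vehicle body frame through a small-angle installation-error matrix and then into the navigation frame through the fusion attitude matrix. The small-angle form, the forward-axis convention for the odometer speed and the variable names are assumptions for this sketch; the exact formula (8) is given only as an image in the original.

```python
import numpy as np

def decompose_odometer_velocity(C_b_n, d_theta, d_gamma, d_heading, v_od):
    """Assumed sketch of the odometer velocity decomposition.
    C_b_n: 3x3 fusion attitude matrix, vehicle body frame -> navigation frame.
    d_theta, d_gamma, d_heading: small odometer installation errors
    (pitch, roll, heading), in radians.
    v_od: scalar speed output by the odometer, assumed along the body
    Y (forward) axis. Returns the odometer velocity in the navigation frame."""
    # Small-angle approximation of the odometer-to-body attitude matrix (assumption).
    C_od_b = np.array([[1.0,       -d_heading,  d_gamma],
                       [d_heading,  1.0,       -d_theta],
                       [-d_gamma,   d_theta,    1.0]])
    v_vec = np.array([0.0, v_od, 0.0])   # odometer measures forward speed only
    return C_b_n @ C_od_b @ v_vec
```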
In the specific implementation of step S16, an inertial/odometer integrated navigation system filter model is constructed, and the inertial navigation errors, device errors and odometer errors are corrected so as to perform navigation.
Specifically, the attitude, position, velocity, gyroscope zero bias, accelerometer zero bias and odometer errors are selected as the system state variables of the inertial/odometer integrated navigation system:
[state vector formula image]
where δθ and δK_D denote the odometer pitch-angle installation error and scale factor error, and the remaining odometer state denotes the heading-angle installation error;
the state equation of the inertial/odometer integrated navigation system is as follows:
[state equation formula image]
where the specific expressions of (F_INS)_9×9 and (F_sg)_9×6 are given in the state equation of the attitude combination filter model;
the observation equation of the inertial/odometer integrated navigation system is as follows:
[observation equation formula image]
where the quantities appearing in the observation equation are: the inertial navigation velocity and velocity error in the navigation coordinate system; the odometer velocity and velocity error in the navigation coordinate system; the cross-product (skew-symmetric) matrix of the speed measurement directly output by the odometer; the identity matrix I; and the observation noise matrix V.
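To make the velocity-matching observation concrete, the sketch below forms the innovation as the difference between the inertial velocity and the odometer velocity decomposed into the navigation frame, with the odometer scale factor error applied to the raw speed; this difference is the quantity the integrated filter uses to estimate the inertial, device and odometer error states. The frame conventions, the small-angle installation-error matrix and the way the scale factor error enters are assumptions for this sketch.

```python
import numpy as np

def velocity_matching_innovation(v_ins_n, C_b_n, d_theta, d_gamma, d_heading, d_k, v_od):
    """Assumed sketch of the inertial/odometer velocity-matching observation.
    v_ins_n: inertial navigation velocity in the navigation frame (3-vector).
    C_b_n: fusion attitude matrix, vehicle body frame -> navigation frame.
    d_theta, d_gamma, d_heading: small odometer installation errors (radians).
    d_k: odometer scale factor error; v_od: raw odometer speed.
    Returns the innovation used by the integrated navigation filter."""
    C_od_b = np.array([[1.0,       -d_heading,  d_gamma],      # small-angle odometer-to-body
                       [d_heading,  1.0,       -d_theta],
                       [-d_gamma,   d_theta,    1.0]])
    v_od_vec = np.array([0.0, (1.0 + d_k) * v_od, 0.0])        # forward-axis odometer speed
    v_od_n = C_b_n @ C_od_b @ v_od_vec                          # decomposed into the nav frame
    return np.asarray(v_ins_n) - v_od_n
```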
Corresponding to the embodiment of the visual information assisted inertial/odometer combined navigation method, the application also provides an embodiment of the visual information assisted inertial/odometer combined navigation device.
FIG. 3 is a block diagram of a visual information assisted combined inertial/odometer navigation device, according to an example embodiment. Referring to fig. 3, the apparatus may include:
a construction module 21, configured to construct a road attitude matrix according to the high-precision map and the current vehicle position;
a first calculation module 22, configured to detect and identify the lane line with a machine vision algorithm based on a camera mounted on the vehicle, thereby calculating the included angle β₀ between the lane line and the vehicle;
a second calculation module 23, configured to compute a visual/map attitude matrix C_V/map according to the road attitude matrix and the included angle β₀ between the lane line and the vehicle;
a correction module 24, configured to construct an attitude combination filter model, take the visual/map attitude matrix C_V/map and the inertial attitude matrix output by inertial navigation as observations, and estimate and correct the errors of the inertial attitude matrix;
a decomposition module 25, configured to acquire the fusion attitude matrix output by the attitude combination filter model and decompose the odometer speed into the navigation coordinate system;
and a navigation module 26, configured to construct an inertial/odometer integrated navigation system filter model and correct the inertial navigation errors, device errors and odometer errors so as to perform navigation.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on multiple network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the scheme of the application. One of ordinary skill in the art can understand and implement it without inventive effort.
Correspondingly, the present application also provides an electronic device, comprising: one or more processors; and a memory for storing one or more programs, which when executed by the one or more processors cause the one or more processors to implement the visual information assisted inertial/odometer integrated navigation method described above. FIG. 4 is a hardware structure diagram of a device with data processing capability on which the visual information assisted inertial/odometer integrated navigation method provided in the embodiment of the present invention runs. In addition to the processor, memory and network interface shown in FIG. 4, such a device may also include other hardware according to its actual function, which is not described again here.
Accordingly, the present application also provides a computer readable storage medium having stored thereon computer instructions which, when executed by a processor, implement the visual information assisted inertial/odometer integrated navigation method described above. The computer readable storage medium may be an internal storage unit of any device with data processing capability described in the previous embodiments, such as a hard disk or a memory. It may also be an external storage device provided on the device, such as a plug-in hard disk, a Smart Media Card (SMC), an SD card or a flash card (Flash Card). Further, the computer readable storage medium may include both an internal storage unit and an external storage device of the device. The computer readable storage medium is used for storing the computer program and the other programs and data required by the device, and may also be used for temporarily storing data that has been or will be output.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses or adaptations of the invention that follow the general principles of the application and include such departures from the present disclosure as come within known or customary practice in the art to which the invention pertains.
It will be understood that the present application is not limited to the precise arrangements that have been described above and shown in the drawings, and that various modifications and changes may be made without departing from the scope thereof.

Claims (10)

1. A visual information assisted inertial/odometer combined navigation method, comprising:
constructing a road attitude matrix according to a high-precision map and the current vehicle position;
detecting and identifying the lane line with a machine vision algorithm based on a camera installed on the vehicle, thereby calculating the included angle β₀ between the lane line and the vehicle;
computing a visual/map attitude matrix C_V/map according to the road attitude matrix and the included angle β₀ between the lane line and the vehicle;
constructing an attitude combination filter model, taking the visual/map attitude matrix C_V/map and the inertial attitude matrix output by inertial navigation as observations, and estimating and correcting the errors of the inertial attitude matrix;
acquiring a fusion attitude matrix output by the attitude combination filter model, and decomposing the odometer speed into a navigation coordinate system;
and constructing a filter model of the inertial/odometer combined navigation system, and correcting inertial navigation errors, device errors and odometer errors so as to perform navigation.
2. The method of claim 1, wherein constructing the road attitude matrix according to the high-precision map and the current vehicle position comprises the following steps:
obtaining the direction angle φ, slope angle θ and cross slope angle γ of the road based on the current position and the high-precision map;
assuming the vehicle is a rigid body and the tires remain in good contact with the ground, approximating the road direction angle, slope angle and cross slope angle as the vehicle heading angle, pitch angle and roll angle, respectively, and constructing the road attitude matrix from these three angles:
[formula image]
3. The method of claim 1, wherein detecting and identifying the lane line using a machine vision algorithm based on a camera mounted on the vehicle, thereby calculating the included angle β₀ between the lane line and the vehicle, comprises the following steps:
acquiring the lane line coordinates: establishing a binocular camera carrier coordinate system with the camera center as the origin, the direction straight ahead as the Y axis and the direction horizontally to the right as the X axis, detecting and identifying the lane line with a machine vision algorithm, and projecting the three-dimensional lane line coordinates A(x_i, y_i, z_i) in the binocular camera carrier coordinate system onto the O-XY plane to obtain the two-dimensional lane line coordinates A(x_i, y_i);
acquiring the lane line tangent equation: in the horizontal coordinate system, fitting the points A(x_i, y_i) with a quadratic function to obtain the lane line function curve, and differentiating this curve to obtain the tangent expression of the lane line;
acquiring the intersection point between the vehicle and the lane line: calculating the lane line starting point A_1(x_1, y_1) in the camera field of view from the camera mounting height h and the vertical field-of-view angle α, the vertical distance to the camera being Δy = h/tan α, and obtaining the lane line coordinate A_0(x_0, y_0) corresponding to the current vehicle position from the starting point A_1(x_1, y_1) and the vertical distance Δy, where y_0 = y_1 − Δy;
calculating the included angle β₀ between the lane line and the vehicle: calculating the slope k_c of the tangent to the quadratic curve y = f(x) at the point A_0(x_0, y_0), and approximating the angle between this tangent and the Y axis as the angle between the lane line and the current driving direction of the vehicle.
4. The method of claim 1, wherein the visual/map attitude matrix C_V/map is computed from the road attitude matrix and the included angle β₀ between the lane line and the vehicle according to the following formula:
[formula image]
5. the method of claim 1, wherein the pose combining filter model comprises an observation equation and a state equation,
the observation equation is as follows:
[observation equation formula image]
where δγ and δθ denote the observed roll-angle and pitch-angle differences between the inertial attitude matrix and the visual/map attitude matrix, the remaining observed quantity being the corresponding heading-angle difference; γ_IMU and θ_IMU denote the roll angle and pitch angle output by the inertial navigation, and γ_V/map and θ_V/map denote the roll angle and pitch angle obtained from the visual/map attitude matrix, each accompanied by the corresponding heading angle; and φ_e, φ_n, φ_u denote the misalignment angles in the east, north and up directions, respectively;
selecting the attitude, position, velocity, gyroscope zero bias and accelerometer zero bias errors as the system state vector, the state equation is as follows:
[state equation formula images]
where v_E, v_N, v_U denote the east, north and up velocities; δv_E, δv_N denote the east and north velocity errors; ε_x, ε_y, ε_z denote the three gyroscope errors; ω_ie denotes the rotational angular velocity of the Earth; L, R_M and R_N denote the latitude, the meridian radius of curvature and the prime vertical radius of curvature of the Earth; the accelerometer bias terms denote the three accelerometer errors; and f_E, f_N, f_U denote the specific forces in the east, north and up directions.
6. The method of claim 1, wherein acquiring the fusion attitude matrix output by the attitude combination filter model and decomposing the odometer speed into the navigation coordinate system comprises:
updating the state of the attitude fusion filter according to the data output frequency of the inertial navigation equipment;
performing the measurement update of the attitude fusion filter according to the data output frequency of the map attitude matrix, reading the attitude angle parameters output by the filter in real time, and constructing the fusion attitude matrix;
decomposing the vehicle speed data output by the odometer to obtain the velocity measurement in the navigation coordinate system, as given by formula (8):
[formula (8) image]
where the quantities appearing in formula (8) are: the odometer velocity measurement expressed in the navigation coordinate system; the fusion attitude matrix from the vehicle body coordinate system to the navigation coordinate system; the attitude matrix from the odometer coordinate system to the vehicle body coordinate system, composed of the odometer installation error parameters, namely the pitch angle δθ, the roll angle δγ and the heading angle installation error; and the speed measurement directly output by the odometer.
7. The method of claim 1, wherein constructing the inertial/odometer integrated navigation system filter model and correcting the inertial navigation errors, device errors and odometer errors comprises:
selecting the attitude, position, velocity, gyroscope zero bias, accelerometer zero bias and odometer errors as the system state variables of the inertial/odometer integrated navigation system:
[state vector formula image]
where δθ and δK_D denote the odometer pitch-angle installation error and scale factor error, and the remaining odometer state denotes the heading-angle installation error;
the state equation of the inertial/odometer integrated navigation system is as follows:
[state equation formula image]
where the specific expressions of (F_INS)_9×9 and (F_sg)_9×6 are given in the state equation of the attitude combination filter model;
the observation equation of the inertial/odometer integrated navigation system is as follows:
[observation equation formula image]
where the quantities appearing in the observation equation are: the inertial navigation velocity and velocity error in the navigation coordinate system; the odometer velocity and velocity error in the navigation coordinate system; the cross-product (skew-symmetric) matrix of the speed measurement directly output by the odometer; the identity matrix I; and the observation noise matrix V.
8. A visual information assisted integrated inertial/odometer navigation device, comprising:
a construction module for constructing a road attitude matrix according to the high-precision map and the current vehicle position;
a first calculation module for detecting and identifying the lane line with a machine vision algorithm based on a camera installed on the vehicle, thereby calculating the included angle β₀ between the lane line and the vehicle;
a second calculation module for computing a visual/map attitude matrix C_V/map according to the road attitude matrix and the included angle β₀ between the lane line and the vehicle;
a correction module for constructing an attitude combination filter model, taking the visual/map attitude matrix C_V/map and the inertial attitude matrix output by inertial navigation as observations, and estimating and correcting the errors of the inertial attitude matrix;
a decomposition module for acquiring the fusion attitude matrix output by the attitude combination filter model and decomposing the odometer speed into the navigation coordinate system;
and a navigation module for constructing an inertial/odometer integrated navigation system filter model and correcting the inertial navigation errors, device errors and odometer errors so as to perform navigation.
9. An electronic device, comprising:
one or more processors;
a memory for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any of claims 1-7.
10. A computer-readable storage medium having stored thereon computer instructions, which when executed by a processor, perform the steps of the method according to any one of claims 1-7.
CN202211411406.XA 2022-11-11 2022-11-11 Visual information assisted inertial/odometer integrated navigation method and device Pending CN115790613A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211411406.XA CN115790613A (en) 2022-11-11 2022-11-11 Visual information assisted inertial/odometer integrated navigation method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211411406.XA CN115790613A (en) 2022-11-11 2022-11-11 Visual information assisted inertial/odometer integrated navigation method and device

Publications (1)

Publication Number Publication Date
CN115790613A true CN115790613A (en) 2023-03-14

Family

ID=85436925

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211411406.XA Pending CN115790613A (en) 2022-11-11 2022-11-11 Visual information assisted inertial/odometer integrated navigation method and device

Country Status (1)

Country Link
CN (1) CN115790613A (en)

Similar Documents

Publication Publication Date Title
CN109946732B (en) Unmanned vehicle positioning method based on multi-sensor data fusion
JP4897542B2 (en) Self-positioning device, self-positioning method, and self-positioning program
EP2095148B1 (en) Arrangement for and method of two dimensional and three dimensional precision location and orientation determination
US9921065B2 (en) Unit and method for improving positioning accuracy
US11227168B2 (en) Robust lane association by projecting 2-D image into 3-D world using map information
Rose et al. An integrated vehicle navigation system utilizing lane-detection and lateral position estimation systems in difficult environments for GPS
CN107782321B (en) Combined navigation method based on vision and high-precision map lane line constraint
CN110779521A (en) Multi-source fusion high-precision positioning method and device
US20110153198A1 (en) Method for the display of navigation instructions using an augmented-reality concept
CN111457902A (en) Water area measuring method and system based on laser SLAM positioning
CN111426332B (en) Course installation error determination method and device, electronic equipment and storage medium
CN111426320B (en) Vehicle autonomous navigation method based on image matching/inertial navigation/milemeter
JP2001331787A (en) Road shape estimating device
US20170074678A1 (en) Positioning and orientation data analysis system and method thereof
CN109059909A (en) Satellite based on neural network aiding/inertial navigation train locating method and system
CN112835085B (en) Method and device for determining vehicle position
CN109470276B (en) Odometer calibration method and device based on zero-speed correction
KR20190040818A (en) 3D vehicular navigation system using vehicular internal sensor, camera, and GNSS terminal
JP4986883B2 (en) Orientation device, orientation method and orientation program
CN114088114A (en) Vehicle pose calibration method and device and electronic equipment
JP7203805B2 (en) Analysis of localization errors of moving objects
CN109059913A (en) A kind of zero-lag integrated navigation initial method for onboard navigation system
CN115790613A (en) Visual information assisted inertial/odometer integrated navigation method and device
CN114705199A (en) Lane-level fusion positioning method and system
CN113048987A (en) Vehicle navigation system positioning method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20230807

Address after: Room 410, 4th Floor, No. 4, Yard 19, Huangping Road, Changping District, Beijing 102200

Applicant after: Beijing Kunpeng Borui Technology Co.,Ltd.

Address before: 261, Unit 2, Building 17, Phase 2, Xinchao Jiayuan, 185 Luyuan South Street, Tongzhou District, Beijing, 101125

Applicant before: Li Ziyue
