CN117031513A - Real-time monitoring and positioning method, system and device for roads and accessories - Google Patents


Publication number
CN117031513A
Authority
CN
China
Prior art date
Legal status
Pending
Application number
CN202311008483.5A
Other languages
Chinese (zh)
Inventor
许桥壮
高周正
吕洁
Current Assignee
China University of Geosciences Beijing
Original Assignee
China University of Geosciences Beijing
Priority date
Filing date
Publication date
Application filed by China University of Geosciences Beijing
Priority to CN202311008483.5A
Publication of CN117031513A

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42Determining position
    • G01S19/43Determining position using carrier phase measurements, e.g. kinematic positioning; using long or short baseline interferometry
    • G01S19/45Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G01S19/47Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial
    • G01S19/48Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
    • G01S19/485Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an optical system or imaging system
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems


Abstract

The invention discloses a real-time monitoring and positioning method, system and device for roads and accessories, relating to the technical field of monitoring and positioning. The method comprises the following steps: determining a calibrated internal reference matrix and an external reference matrix by the Zhang Zhengyou camera calibration method; performing Beidou satellite navigation RTK ambiguity-fixed calculation based on satellite observation data to obtain the real-time position, velocity and attitude of the Beidou receiver antenna; determining the relative position of the Beidou receiver antenna and the inertial measurement unit, then determining the final position and attitude rotation matrix of the inertial measurement unit by combining the current inertial data, the satellite observation data, the current road image, and the calibrated internal and external reference matrices; and further calculating the positions corresponding to the feature points in the current road image in the geocentric coordinate system, the feature points characterizing road features and accessory features on the road. The invention realizes accurate positioning of roads and accessories.

Description

Real-time monitoring and positioning method, system and device for roads and accessories
Technical Field
The invention relates to the technical field of monitoring and positioning, in particular to a real-time monitoring and positioning method, system and device for roads and accessories.
Background
Road and road-attachment detection methods are divided into dynamic detection and static detection. Dynamic detection locates and monitors the road and its attachments with a dynamic road monitoring vehicle, while static detection does so by absolute measurement. Large dynamic road monitoring vehicles are fast, but their precision is insufficient, their cost is high and they are not convenient, so they have certain limitations. Absolute measurement mainly includes the polar coordinate method and the PPK mode of GNSS, among others. The total station used in the polar coordinate method places strict requirements on the external environment and measures slowly, while in the PPK method the satellite signals are interfered with and blocked by the environment, which affects the accuracy of the positioning result and offers no real-time capability; both therefore have certain shortcomings. In summary, conventional road and attachment monitoring technology struggles to achieve detection accuracy, detection efficiency and real-time performance simultaneously.
Disclosure of Invention
The invention aims to provide a real-time monitoring and positioning method, a system and a device for roads and accessories, which realize accurate positioning of the roads and the accessories.
In order to achieve the above object, the present invention provides the following solutions:
In a first aspect, the present invention provides a method for real-time monitoring and positioning of roads and accessories, comprising:
acquiring historical inertial data acquired by an inertial measurement unit, a calibration image shot by a road image shooting component, first satellite observation data output by a Beidou receiver on a reference station and second satellite observation data output by the Beidou receiver on a carrier moving platform; the inertial measurement unit and the road image shooting component are both arranged on the carrier moving platform;
analyzing random noise in the historical inertial data by adopting an Allan variance method to obtain a random noise time domain model;
calibrating an internal reference matrix of the road image shooting component according to the calibration image by adopting a Zhang Zhengyou camera calibration method, and determining initial values of the external reference matrix of the inertial measurement unit and the road image shooting component;
optimizing the initial value of the external reference matrix based on the historical inertial data, the random noise time domain model and the calibrated internal reference matrix to obtain a calibrated external reference matrix;
based on the first satellite observation data and the second satellite observation data, performing Beidou satellite navigation RTK ambiguity fixed calculation to obtain a real-time position, a real-time speed and a real-time gesture at an antenna of the Beidou receiver, and then initializing the inertial measurement unit based on the real-time position, the real-time speed and the real-time gesture at the antenna of the Beidou receiver;
acquiring current inertial data acquired by the initialized inertial measurement unit and a current road image shot by the road image shooting component;
determining the relative position of an antenna of the Beidou receiver and the inertial measurement unit, and then determining a final position and posture rotation matrix of the inertial measurement unit by combining the current inertial data, the first satellite observation data, the second satellite observation data, the current road image, the calibrated internal reference matrix and the calibrated external reference matrix;
calculating the position corresponding to the characteristic point in the current road image under a geocentric coordinate system based on the final position and posture rotation matrix of the inertial measurement unit and the calibrated external parameter matrix; the feature points in the current road image represent road features and accessory features on the road.
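The final step above, mapping an image feature point to the geocentric (ECEF) coordinate system, can be sketched under a pinhole model. This is an illustrative sketch, not the patent's exact formula: the function and symbol names are assumptions, and the patent does not state how the feature depth is obtained (here it is simply taken as known, e.g. from a ground-plane intersection):

```python
import numpy as np

def feature_ecef(p_imu_e, C_be, C_cb, l_cb, K, uv, depth):
    """Map a pixel (u, v) with known depth to the geocentric (ECEF) frame.

    p_imu_e : IMU position in ECEF (3,)
    C_be    : body-to-ECEF rotation matrix (3, 3)
    C_cb    : camera-to-body rotation (3, 3); l_cb : lever arm (3,) -- extrinsics
    K       : camera intrinsic matrix (3, 3)
    depth   : distance along the optical axis (assumed known)
    """
    ray_c = np.linalg.inv(K) @ np.array([uv[0], uv[1], 1.0])  # back-project pixel
    p_c = depth * ray_c / ray_c[2]        # point in the camera frame
    p_b = C_cb @ p_c + l_cb               # camera frame -> body (IMU) frame
    return p_imu_e + C_be @ p_b           # body frame -> ECEF

# with identity attitude and zero lever arm, the principal point maps to a
# point 'depth' metres along the camera axis
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
p = feature_ecef(np.zeros(3), np.eye(3), np.eye(3), np.zeros(3), K, (320.0, 240.0), 5.0)
```

The chain camera → body → ECEF mirrors the claim: the calibrated external reference matrix supplies (C_cb, l_cb) and the final IMU position and attitude rotation matrix supply (p_imu_e, C_be).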
In a second aspect, the present invention provides a real-time monitoring and positioning system for roads and accessories, comprising:
the data acquisition module is used for acquiring historical inertial data acquired by the inertial measurement unit, a calibration image shot by the road image shooting component, first satellite observation data output by the Beidou receiver on the reference station and second satellite observation data output by the Beidou receiver on the carrier moving platform; the inertial measurement unit and the road image shooting component are both arranged on the carrier moving platform;
the noise analysis module is used for analyzing random noise in the historical inertial data by adopting an Allan variance method so as to obtain a random noise time domain model;
the internal reference calibration module is used for calibrating an internal reference matrix of the road image shooting component according to the calibration image by adopting a Zhang Zhengyou camera calibration method, and determining initial values of the external reference matrix of the inertial measurement unit and the road image shooting component;
the external parameter calibration module is used for optimizing the initial value of the external parameter matrix based on the historical inertial data, the random noise time domain model and the calibrated internal parameter matrix to obtain a calibrated external parameter matrix;
the Beidou positioning and resolving module is used for fixedly resolving the Beidou satellite navigation RTK ambiguity based on the first satellite observation data and the second satellite observation data to obtain a real-time position, a real-time speed and a real-time gesture at an antenna of the Beidou receiver, and initializing the inertial measurement unit based on the real-time position, the real-time speed and the real-time gesture at the antenna of the Beidou receiver;
the current data acquisition module is used for acquiring the current inertial data acquired by the initialized inertial measurement unit and the current road image shot by the road image shooting component;
the position determining module is used for determining the relative position of the antenna of the Beidou receiver and the inertial measurement unit, and then determining a final position and posture rotation matrix of the inertial measurement unit by combining the current inertial data, the first satellite observation data, the second satellite observation data, the current road image, the calibrated internal reference matrix and the calibrated external reference matrix;
the road and accessory positioning module is used for calculating the position corresponding to the characteristic point in the current road image under the geocentric coordinate system based on the final position and gesture rotation matrix of the inertial measurement unit and the calibrated external parameter matrix; the feature points in the current road image represent road features and accessory features on the road.
In a third aspect, the present invention provides a real-time monitoring and positioning device for roads and accessories, which comprises an inertial measurement unit, a road image shooting component, a Beidou receiver, a 5G communication module, an FPGA board card and a carrier moving platform, wherein the inertial measurement unit is connected with the road image shooting component;
the inertial measurement unit, the road image shooting component, the Beidou receiver, the 5G communication module and the FPGA board card are all arranged on the carrier moving platform;
The inertial measurement unit, the road image shooting component, the Beidou receiver and the 5G communication module are electrically connected with the FPGA board card;
the inertial measurement unit is used for acquiring inertial data;
the road image shooting component is used for acquiring a calibration image and a current road image;
the Beidou receiver is used for receiving satellite observation data;
the FPGA board is used for executing the real-time monitoring and positioning method for roads and accessories, so as to obtain the positions corresponding to the feature points in the current road image in the geocentric coordinate system;
and the 5G communication module is used for sending the position corresponding to the characteristic point in the current road image to a remote terminal.
According to the specific embodiment provided by the invention, the invention discloses the following technical effects:
the invention discloses a real-time monitoring and positioning method, a system and a device for roads and accessories, which are characterized in that an Allan variance method is adopted to analyze random noise in inertial data so as to obtain a random noise time domain model; and calibrating an internal reference matrix of the road image shooting component by adopting a Zhang Zhengyou camera calibration method, and calibrating an initial value of the external reference matrix. Based on satellite observation data, performing Beidou satellite navigation RTK ambiguity fixed calculation to obtain a real-time position, a real-time speed and a real-time posture of an antenna of a Beidou receiver, and then based on the real-time position, the real-time speed and the real-time posture of the antenna of the Beidou receiver. Then, determining the relative position of an antenna of the Beidou receiver and the inertial measurement unit, and then determining the final position and posture rotation matrix of the inertial measurement unit by combining current inertial data, the first satellite observation data, the second satellite observation data, a current road image, a calibrated internal reference matrix and a calibrated external reference matrix; further calculating the position corresponding to the characteristic point in the current road image under the geocentric coordinate system; feature points in the current road image characterize road features and accessory features on the road.
In summary, by combining the long-term high-precision positioning stability of the real-time kinematic (RTK) differential positioning technique of the Beidou satellite navigation system, the short-term positioning stability and high autonomy of the INS, and the rich image information of visual navigation, the invention obtains high-precision three-dimensional position coordinates of the road and its accessories through a real-time Beidou RTK/INS/vision tightly coupled data processing method, realizing low-cost daily intelligent road inspection. Real-time Beidou RTK provides centimeter-level real-time position information; the real-time RTK/INS/vision tight coupling technique provides high-precision, continuous and stable real-time position, velocity and attitude information; and applying pinhole camera projection theory enables accurate determination of the positions of the road and its accessories. To further monitor positional deviation of the road and its accessories, the internal spatial relationship between the image feature points and the monitored objects is used to accurately identify road facilities and road surfaces and to calibrate their lane-level precise positions, providing auxiliary decision support for road inspection planning, maintenance planning and management.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions of the prior art, the drawings that are needed in the embodiments will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a real-time monitoring and positioning method for roads and accessories according to the present invention;
FIG. 2 is a schematic diagram of a real-time monitoring and positioning system for roads and appendages according to the present invention;
fig. 3 is an integrated schematic diagram of the real-time monitoring and positioning device for roads and accessories according to the present invention.
Symbol description:
the system comprises a 1-inertial measurement unit, a 2-road image shooting component, a 3-Beidou receiver, a 4-5G communication module, a 5-FPGA board card, a 6-carrier moving platform, a 7-mobile power supply, an 8-Beidou receiver antenna, a 10-well lid, an 11-street lamp, a 12-traffic light, a 13-tree and a 14-telegraph pole.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The invention provides a real-time monitoring and positioning method, a system and a device for roads and accessories, which realize the double-antenna Beidou/inertia/camera data fusion positioning and real-time processing of road accessory detection data.
In order that the above-recited objects, features and advantages of the present invention will become more readily apparent, a more particular description of the invention will be rendered by reference to the appended drawings and appended detailed description.
Example 1
As shown in fig. 1, the present invention provides a real-time monitoring and positioning method for roads and accessories, comprising:
step 100, acquiring historical inertial data acquired by an inertial measurement unit, a calibration image shot by a road image shooting component, first satellite observation data output by a Beidou receiver on a reference station and second satellite observation data output by the Beidou receiver on a carrier mobile platform; the inertial measurement unit and the road image shooting component are both arranged on the carrier moving platform.
Step 200, analyzing random noise in the historical inertial data by adopting an Allan variance method to obtain a random noise time domain model.
The historical inertial data comprise $N$ consecutive IMU (Inertial Measurement Unit) sampling points $x_i$ ($1 \le i \le N$) with data sampling period $\tau_0$. For an arbitrary cluster time $\tau = (n-1)\tau_0$, where $n < N/2$, the sequence $x_i$ is divided into $N_m$ consecutive data blocks according to the cluster length. The average of the data in the block corresponding to each cluster time $\tau$ is calculated as:

$$\bar{x}_m(\tau) = \frac{1}{n}\sum_{i=1}^{n} x_{(m-1)n+i}, \qquad m = 1, \dots, N_m$$

where $\bar{x}_m(\tau)$ denotes the block mean and $\sum(\cdot)$ the summation symbol.

The difference between the averages of two adjacent data blocks is:

$$\xi_{m+1,m} = \bar{x}_{m+1}(\tau) - \bar{x}_m(\tau)$$

where $\bar{x}_{m+1}(\tau)$ and $\bar{x}_m(\tau)$ are the averages of adjacent blocks.

The Allan variance for the cluster time $\tau$ is then obtained as:

$$\sigma^2(\tau) = \frac{1}{2}\left\langle \xi_{m+1,m}^2 \right\rangle = \frac{1}{2(N_m - 1)}\sum_{m=1}^{N_m-1}\left(\bar{x}_{m+1}(\tau) - \bar{x}_m(\tau)\right)^2$$

where $\langle\cdot\rangle$ is the operator that calculates the overall average.

Finally, the variation of the Allan standard deviation with cluster time, i.e. the random noise time domain model, is drawn as a double-logarithmic curve.
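The Allan-variance procedure above can be sketched in a few lines. This is an illustrative implementation, not code from the patent; the conventional cluster time $n\tau_0$ is used, and for zero-mean white noise the resulting Allan deviation should fall with slope −1/2 on the log-log plot:

```python
import numpy as np

def allan_deviation(x, tau0, cluster_sizes):
    """Classical (non-overlapping) Allan deviation for each cluster size n.

    x: 1-D array of N IMU samples (e.g. a gyro rate channel), tau0: sampling period.
    Returns (taus, adevs). Illustrative names, not from the patent.
    """
    x = np.asarray(x, dtype=float)
    N = x.size
    taus, adevs = [], []
    for n in cluster_sizes:
        if n < 1 or n >= N // 2:
            continue                     # cluster must satisfy n < N/2
        Nm = N // n                      # number of complete blocks
        means = x[:Nm * n].reshape(Nm, n).mean(axis=1)  # block averages
        diffs = np.diff(means)           # differences of adjacent block means
        avar = 0.5 * np.mean(diffs ** 2) # Allan variance at this cluster time
        taus.append(n * tau0)
        adevs.append(np.sqrt(avar))
    return np.array(taus), np.array(adevs)

# white noise: Allan deviation shrinks as the cluster grows (slope -1/2)
rng = np.random.default_rng(0)
taus, adevs = allan_deviation(rng.standard_normal(100_000), 0.005, [1, 10, 100, 1000])
```

Plotting `adevs` against `taus` on log-log axes gives the time-domain noise model from which the random-walk and bias-instability coefficients are read off.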
Step 300, calibrating an internal reference matrix of the road image shooting component according to the calibration image by adopting a Zhang Zhengyou camera calibration method, and determining initial values of the external reference matrix between the inertial measurement unit and the road image shooting component. In order to project spatial points from the camera coordinate system into the pixel coordinate system, the internal parameters of the camera need to be calibrated in advance. Typically, the camera's internal parameters include the focal length, the principal point coordinates and the distortion parameters.
Step 300 specifically includes:
(1) Based on the conversion relation between the world coordinate system and the pixel coordinate system, establishing a shooting component calibration function model; the shooting component calibration function model comprises the internal parameter matrix of the road image shooting component and the external parameter matrix between the inertial measurement unit and the road image shooting component.
Assume that there is a point $P = (X_w\ Y_w\ Z_w)$ in the three-dimensional world coordinate system, whose image point in the pixel coordinate system has coordinates $(u\ v)$. The homogeneous coordinates of the image-space and object-space points are written as $\tilde{m} = [u\ v\ 1]^T$ and $\tilde{P} = [X_w\ Y_w\ Z_w\ 1]^T$. The collinearity condition of the pinhole model is:

$$s\,\tilde{m} = M\,[R\,|\,t]\,\tilde{P}, \qquad M = \begin{bmatrix} f_x & \gamma & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}$$

where $s$ is a scale factor, $[R\,|\,t]$ is the transformation matrix between the world coordinate system and the camera coordinate system (the exterior orientation matrix), $M$ is the interior orientation matrix of the camera, $(u_0\ v_0)$ are the coordinates of the principal point in the pixel coordinate system, $(f_x\ f_y)$ are the camera focal lengths, and $\gamma$ is the skew factor measuring the degree of orthogonality between the $u$ and $v$ axes.
By fixing the world coordinate system to the checkerboard, the spatial coordinates of any point on the checkerboard can be expressed as $(X_w\ Y_w\ 0)$, and the pinhole model reduces to:

$$s\begin{bmatrix}u\\v\\1\end{bmatrix} = M\,[r_1\ r_2\ t]\begin{bmatrix}X_w\\Y_w\\1\end{bmatrix}$$

where $[r_1\ r_2\ r_3]$ are the column vectors of the rotation matrix $R$.
(2) And constructing a homography matrix based on the shooting component calibration function model.
The homography matrix is constructed as $H = M\,[r_1\ r_2\ t] = [h_1\ h_2\ h_3]$. The two-dimensional pixel coordinates and three-dimensional world coordinates of the spatial points are known, so an $H$ matrix can be obtained for each image. Since $r_1$ and $r_2$ are orthonormal vectors, they satisfy:

$$r_1^T r_2 = 0, \qquad r_1^T r_1 = r_2^T r_2$$

With $r_1 = M^{-1}h_1$ and $r_2 = M^{-1}h_2$, combining the equations gives:

$$h_1^T M^{-T} M^{-1} h_2 = 0, \qquad h_1^T M^{-T} M^{-1} h_1 = h_2^T M^{-T} M^{-1} h_2$$

Let $B = M^{-T}M^{-1}$. Since $B$ is a symmetric matrix, it can be represented by the 6-dimensional vector $b = (B_{11}\ B_{12}\ B_{22}\ B_{13}\ B_{23}\ B_{33})^T$, so the constraints above can be written as:

$$\begin{bmatrix} v_{12}^T \\ (v_{11} - v_{22})^T \end{bmatrix} b = 0$$

where $v_{ij} = (h_{i1}h_{j1}\;\; h_{i1}h_{j2} + h_{i2}h_{j1}\;\; h_{i2}h_{j2}\;\; h_{i3}h_{j1} + h_{i1}h_{j3}\;\; h_{i3}h_{j2} + h_{i2}h_{j3}\;\; h_{i3}h_{j3})^T$, with $h_{ik}$ the $k$-th component of column $h_i$.

Assuming that there are $n$ images, stacking the constraints gives:

$$V b = 0$$

where $V$ is a $2n \times 6$ matrix.
(3) Solving the homography matrices from the calibration images by the least squares method and Cholesky decomposition to obtain the initial value of the internal reference matrix of the road image shooting component.
From the content of step (2), $V$ is a $2n \times 6$ matrix; by selecting more than 3 images, the vector $b$ can be obtained by least squares, and the interior orientation matrix (the initial value of the internal reference matrix) $M$ can then be obtained by Cholesky decomposition of $B$, where each element can be expressed as:

$$v_0 = \frac{B_{12}B_{13} - B_{11}B_{23}}{B_{11}B_{22} - B_{12}^2}, \qquad \lambda = B_{33} - \frac{B_{13}^2 + v_0\,(B_{12}B_{13} - B_{11}B_{23})}{B_{11}}$$

$$f_x = \sqrt{\frac{\lambda}{B_{11}}}, \qquad f_y = \sqrt{\frac{\lambda B_{11}}{B_{11}B_{22} - B_{12}^2}}, \qquad \gamma = -\frac{B_{12} f_x^2 f_y}{\lambda}, \qquad u_0 = \frac{\gamma v_0}{f_y} - \frac{B_{13} f_x^2}{\lambda}$$
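Steps (2) and (3) can be sketched as follows: build $V$ from per-image homographies, take $b$ as the null vector of $V$, and recover the intrinsics. The closed-form element expressions follow Zhang's classical calibration method, which this part of the patent adopts; all function names are illustrative. A synthetic check with homographies generated from a known $M$ should reproduce it:

```python
import numpy as np

def v_row(H, i, j):
    # one constraint row v_ij built from columns h_i, h_j of the homography H
    return np.array([
        H[0, i] * H[0, j],
        H[0, i] * H[1, j] + H[1, i] * H[0, j],
        H[1, i] * H[1, j],
        H[2, i] * H[0, j] + H[0, i] * H[2, j],
        H[2, i] * H[1, j] + H[1, i] * H[2, j],
        H[2, i] * H[2, j],
    ])

def intrinsics_from_homographies(Hs):
    """Recover (fx, fy, gamma, u0, v0) from >= 3 homographies (Zhang's method)."""
    V = np.vstack([np.vstack([v_row(H, 0, 1),
                              v_row(H, 0, 0) - v_row(H, 1, 1)]) for H in Hs])
    b = np.linalg.svd(V)[2][-1]           # least-squares null vector of V
    b *= np.sign(b[0])                    # fix overall sign so that B11 > 0
    B11, B12, B22, B13, B23, B33 = b
    v0 = (B12 * B13 - B11 * B23) / (B11 * B22 - B12 ** 2)
    lam = B33 - (B13 ** 2 + v0 * (B12 * B13 - B11 * B23)) / B11
    fx = np.sqrt(lam / B11)
    fy = np.sqrt(lam * B11 / (B11 * B22 - B12 ** 2))
    gamma = -B12 * fx ** 2 * fy / lam
    u0 = gamma * v0 / fy - B13 * fx ** 2 / lam
    return fx, fy, gamma, u0, v0

def rot(ax, ay):
    # rotation about x then y, used only to synthesize test views
    cx, sx, cy, sy = np.cos(ax), np.sin(ax), np.cos(ay), np.sin(ay)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    return Rx @ Ry

M = np.array([[800.0, 0.0, 320.0], [0.0, 820.0, 240.0], [0.0, 0.0, 1.0]])
Hs = []
for ax, ay in [(0.3, 0.1), (-0.2, 0.4), (0.1, -0.3), (0.5, 0.2)]:
    R = rot(ax, ay)
    Hs.append(M @ np.column_stack([R[:, 0], R[:, 1], np.array([0.1, -0.2, 2.0])]))
fx, fy, gamma, u0, v0 = intrinsics_from_homographies(Hs)
```

With noise-free synthetic homographies the recovered parameters match the true $M$ to numerical precision; in practice the homographies come from detected checkerboard corners and the result is only the initial value, refined later by maximum likelihood.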
(4) According to the calibrated internal reference matrix and the homography matrix, calculating the initial value of the exterior orientation for each calibration image, with the calculation formula:

$$[r_1\ r_2\ t] = \lambda M^{-1} H, \qquad \lambda = \frac{1}{\|M^{-1}h_1\|}, \qquad r_3 = r_1 \times r_2$$

(written compactly as $[R\,|\,t] = M^{-1}H$), thereby determining the initial values of the external reference matrix of the inertial measurement unit and the road image shooting component.
(5) Determining initial values of the distortion coefficients; after obtaining the closed-form solution of the interior and exterior orientation elements, distortion factors are added to the solution, and the initial values of the distortion coefficients are obtained from:

$$\hat{u} = u + (u - u_0)\,(k_1 r^2 + k_2 r^4), \qquad \hat{v} = v + (v - v_0)\,(k_1 r^2 + k_2 r^4), \qquad r^2 = x^2 + y^2$$

where $k_1$ and $k_2$ are the radial distortion coefficients, $(x\ y)$ are the normalized image coordinates, and $(u\ v)$ and $(\hat{u}\ \hat{v})$ are respectively the ideal pixel coordinates and the pixel coordinates containing distortion.
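A minimal sketch of step (5)'s radial-distortion model — two radial terms $k_1 r^2 + k_2 r^4$ acting about the principal point — with illustrative names:

```python
def distort_pixel(u, v, u0, v0, x, y, k1, k2):
    """Return the distorted pixel for an ideal pixel (u, v).

    (x, y) are the normalized image coordinates of the same point,
    (u0, v0) the principal point; k1, k2 are the radial coefficients.
    """
    r2 = x * x + y * y
    factor = k1 * r2 + k2 * r2 * r2       # k1*r^2 + k2*r^4
    return u + (u - u0) * factor, v + (v - v0) * factor

# a point on the horizontal axis is pushed outward; the principal point is fixed
uh, vh = distort_pixel(420.0, 240.0, 320.0, 240.0, 0.1, 0.0, 0.1, 0.01)
```

Writing this relation for every observed control point gives two linear equations in $(k_1, k_2)$ per point; stacking them yields the least-squares system solved for the initial distortion coefficients.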
(6) Based on the initial values of the internal reference matrix and the distortion coefficients, minimizing the reprojection error of the points by the maximum likelihood method to optimize the internal reference matrix and obtain the calibrated internal reference matrix. Because the preceding solution only minimizes an algebraic distance, maximum likelihood estimation is further required for refinement, finally yielding accurate camera intrinsics. Assuming that the image points are contaminated by independent, identically distributed noise, the maximum likelihood estimate is obtained by minimizing:

$$\sum_{i=1}^{n}\sum_{j=1}^{m}\left\| m_{ij} - \hat{m}(M, k_1, k_2, R_i, t_i, P_j)\right\|^2$$

where $m$ is the number of control points in each image and $\hat{m}(M, k_1, k_2, R_i, t_i, P_j)$ is the projection of point $P_j$ in the $i$-th image.
Step 400, optimizing the initial value of the external reference matrix based on the historical inertial data, the random noise time domain model and the calibrated internal reference matrix to obtain the calibrated external reference matrix. The historical inertial data comprise a historical acceleration measurement value, a historical angular velocity measurement value and a historical pose; the historical pose is the pose of the inertial measurement unit in the world coordinate system.
After the internal parameters are calibrated, the IMU-camera platform is moved in front of the calibration board so that every axis of the IMU and the camera is fully excited, giving the calibration parameters good observability; the spatial pose relation between the different sensors is then estimated by combining continuous-time batch estimation with maximum likelihood theory, thereby calibrating the IMU-camera external parameters. Specifically, after the IMU error parameters and the camera internal parameters are obtained, the joint calibration of the inertial/visual external parameters and the solution of the time delay become the key to realizing high-precision inertial/visual integrated navigation. To calibrate the relevant parameters, the invention defines the following state to be solved:
$$x(t) = \left[\,g^w \quad T_{c,i} \quad T_{w,i}(t) \quad d \quad b_a(t) \quad b_g(t)\,\right]$$

where $g^w$ is the gravity vector in the world coordinate system, $T_{c,i}$ are the external parameters between the camera and the IMU, $T_{w,i}(t)$ is the pose of the IMU in the world coordinate system, $d$ is the time delay, and $b_a(t)$ and $b_g(t)$ are the accelerometer and gyroscope biases, respectively.
Step 400 specifically includes:
(1) Converting the historical pose into a six-degree-of-freedom Bezier spline curve; the pose $T_{w,i}(t)$ of the IMU in the world coordinate system is parameterized as a 6-DOF Bezier spline, in which three degrees of freedom represent the rotation $\varphi(t)$ and three degrees of freedom represent the translation $t(t)$:

$$T_{w,i}(t) = \begin{bmatrix} C(\varphi(t)) & t(t) \\ 0 & 1 \end{bmatrix}$$

where $C(\cdot)$ is the function transforming the rotation parameters into a rotation matrix.
(2) Calculating the acceleration and angular velocity of the inertial measurement unit based on the six-degree-of-freedom spline; the motion acceleration $a(t)$ and angular velocity $\omega(t)$ are obtained by differentiating the spline:

$$a(t) = \ddot{t}(t), \qquad \omega(t) = S(\varphi(t))\,\dot{\varphi}(t)$$

where $S(\cdot)$ is the transformation matrix converting the rotation-parameter rates into angular velocity.
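The spline-derived accelerometer prediction can be checked numerically. The sketch below uses central finite differences for $\ddot{t}(t)$ and forms $C(\varphi(t))^T(a(t) - g^w)$, the gravity-compensated specific force in the sensor frame; the evaluator names `pos_fn` and `C_fn` are illustrative placeholders for the spline:

```python
import numpy as np

def predicted_accel(pos_fn, C_fn, t, g_w, h=1e-4):
    """Finite-difference sketch of the accelerometer prediction
    C(phi(t))^T (a(t) - g^w), with a(t) = t''(t) the second derivative of the
    translation spline. pos_fn(t) -> world position, C_fn(t) -> rotation
    matrix C(phi(t)); both are user-supplied spline evaluators.
    """
    a = (pos_fn(t + h) - 2.0 * pos_fn(t) + pos_fn(t - h)) / (h * h)  # t''(t)
    return C_fn(t).T @ (a - g_w)

# a body accelerating upward at 9.81 m/s^2 against gravity should read ~2g
pos = lambda t: np.array([0.0, 0.0, 0.5 * 9.81 * t * t])
f = predicted_accel(pos, lambda t: np.eye(3), 1.0, np.array([0.0, 0.0, -9.81]))
```

The angular-velocity prediction $\omega(t) = S(\varphi(t))\dot{\varphi}(t)$ can be differenced analogously from the rotation-parameter spline.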
(3) Based on the random noise time domain model and the calibrated internal reference matrix, a visual re-projection error function, an acceleration error function, an angular velocity error function, an accelerometer zero bias error function and a gyro zero bias error function are respectively constructed according to the historical acceleration measurement value, the historical angular velocity measurement value and the calculated acceleration and angular velocity of the inertial measurement unit at the same moment.
Assume that at time $t_k$ the accelerometer and gyroscope measurements are $\alpha_k$ and $\varpi_k$, respectively. The relation between the IMU measurements and the acceleration $a(t_k)$ and angular velocity $\omega(t_k)$, together with the pixel position of a landmark point $p_m^w$, can be expressed as:

$$\alpha_k = C(\varphi(t_k))^T\left(a(t_k) - g^w\right) + b_a(t_k) + \eta_{a_k}$$

$$\varpi_k = \omega(t_k) + b_g(t_k) + \eta_{g_k}$$

$$y_{mj} = h\!\left(T_{c,i}\,T_{w,i}^{-1}(t_j + d)\,p_m^w\right) + \eta_{y_{mj}}$$

where $\eta \sim N(0, R)$ denotes the measurement noise, $h(\cdot)$ is the nonlinear camera model, $t_j$ is the timestamp of the image, and $d$ is the unknown time delay.
The error terms and the objective function are then constructed according to the above formulas:
the visual re-projection error function is:
the acceleration error function is:
the angular velocity error function is:
the accelerometer zero offset error function is:
the gyro zero offset error function is as follows:
the calculation formula of the total error value is as follows:
where E represents the total error value, E_y the total visual reprojection error, E_α the total acceleration error, E_ω the total angular velocity error, E_ba the total accelerometer zero-bias error, and E_bg the total gyroscope zero-bias error; e_y represents the visual reprojection error, y_mj the pixel coordinates of the road landmark points (the landmark pixels are feature pixels representing the road and its appendages in the calibration image), h(·) the nonlinear road-image-capturing-component model, T_c,i the external reference matrix between the inertial measurement unit and the road image capturing component, T_w,i the pose of the inertial measurement unit in the world coordinate system, t_j the time stamp of the calibration image, d the time-delay parameter, l_m the road landmark points on the road in the calibration image, J the number of images, M the number of road landmark points, and R_y the pixel observation noise of the road landmark points; e_α represents the acceleration error, C(·) the function that converts the rotation parameters into a rotation matrix, r(t) the rotation parameters, a(t_k) the acceleration at time t_k, g_w the gravity vector in the world coordinate system, and b_a(t) and b_g(t) the zero biases of the accelerometer and the gyroscope at time t (here evaluated at t = t_k); k represents the epoch number, (·)^T the transpose, (·)^{-1} the inverse, and R_acc the acceleration observation noise; e_ω represents the angular velocity error, ω̄_k the gyroscope measurement output, ω(t_k) the angular velocity derived from the spline curve, b_ω(t_k) the gyroscope random walk, and R_ω the angular velocity observation noise; e_ba represents the accelerometer zero-bias error and R_a the accelerometer random-walk noise; e_bg represents the gyroscope zero-bias error and R_g the gyroscope random-walk noise.
(4) Establishing an external-reference-matrix optimization function that takes the minimum total error value as the objective, according to the visual reprojection error function, the acceleration error function, the angular velocity error function, the accelerometer zero-bias error function and the gyro zero-bias error function.
(5) Solving the external-reference-matrix optimization function with the Levenberg-Marquardt algorithm to obtain the calibrated external reference matrix.
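As an illustration of step (5), the snippet below refines a 6-DOF camera-IMU extrinsic (rotation vector plus translation) with the Levenberg-Marquardt solver in SciPy. This is a hedged sketch, not the patent's implementation: the residual is a toy point-transfer error standing in for the combined visual/inertial error terms, and all data are synthetic.

```python
# Hypothetical sketch of step (5): refining a 6-DOF extrinsic (3 rotation-vector
# components + 3 translation components) by Levenberg-Marquardt.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

rng = np.random.default_rng(0)
pts_imu = rng.uniform(-1.0, 1.0, (20, 3))           # points in the IMU frame
R_true = Rotation.from_rotvec([0.02, -0.05, 0.1])   # "true" camera-IMU rotation
t_true = np.array([0.10, -0.03, 0.25])              # "true" translation
pts_cam = R_true.apply(pts_imu) + t_true            # same points in camera frame

def residuals(x):
    """Point-transfer residual for extrinsic guess x = [rotvec, translation]."""
    R = Rotation.from_rotvec(x[:3])
    t = x[3:]
    return (R.apply(pts_imu) + t - pts_cam).ravel()

# method="lm" selects MINPACK's Levenberg-Marquardt implementation
sol = least_squares(residuals, x0=np.zeros(6), method="lm")
print(sol.x)  # approaches the true rotation vector and translation
```

In the patent's setting, `residuals` would instead stack the reprojection, acceleration, angular velocity and zero-bias error terms over the full trajectory.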
Step 500, performing Beidou satellite navigation RTK (Real-Time Kinematic) ambiguity-fixed resolution based on the first satellite observation data and the second satellite observation data to obtain the real-time position, real-time velocity and real-time attitude at the antenna of the Beidou (BeiDou Navigation Satellite System, BDS) receiver, and then initializing the inertial measurement unit based on the real-time position, real-time velocity and real-time attitude at the antenna of the Beidou receiver. The first satellite observation data and the second satellite observation data each comprise pseudorange observations, carrier observations, BDS satellite broadcast ephemeris and Doppler observations.
Performing the Beidou satellite navigation RTK ambiguity-fixed resolution based on the first satellite observation data and the second satellite observation data to obtain the real-time position, real-time velocity and real-time attitude of the antenna of the Beidou receiver specifically comprises the following steps:
(1) Constructing a double-difference observation equation based on the first satellite observation data and the second satellite observation data; the double-difference observation equations comprise a double-difference carrier observation equation and a double-difference pseudorange observation equation. Specifically, the double-difference observation equations are constructed from the dual-frequency pseudoranges and carrier phases output by the BDS receiver integrated on the hardware platform and by the reference-station BDS receiver, together with the BDS satellite broadcast ephemeris:
where ∇Δ represents the station-satellite double-difference operator, P and φ represent the pseudorange and carrier observations, ρ is the station-satellite distance, e and ε represent the observation noise of the pseudorange and carrier observations respectively, and λ and N represent the carrier wavelength and the integer ambiguity.
(2) Based on sequential nonlinear least squares, a BDS RTK float solution is computed from the double-difference observation equations to obtain the float double-difference ambiguity vectors of the carrier observations on the Beidou B1I and B3I frequencies (N̂_1,f and N̂_3,f) and the corresponding variance-covariance matrices (Q_N1,f and Q_N3,f), where the calculation formula is as follows:
where the hat denotes a vector estimate, δp is the receiver coordinate vector error, L_k, V_k and A_k respectively represent the observation vector, the observation residual vector and the design coefficient matrix, P_k represents the a-priori weight matrix of the observation vector, V_X,k represents the difference between the current state-vector estimate X_k and its predicted value, and P_X,k-1 represents the weight matrix of the predicted state vector. Q_X,k is the posterior covariance matrix of the state vector.
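The sequential least-squares recursion described in step (2) can be sketched in information form, where each epoch's prior weight matrix is the accumulated normal matrix. This is an illustrative toy (a 2-state linear system), not the patent's RTK float filter; the names A_k, L_k, P_k follow the text, and the system values are invented.

```python
# Illustrative sketch of sequential (recursive) least squares: each epoch's
# observations refine the previous state estimate.
import numpy as np

def sequential_lsq(x_prev, P_x_prev, A_k, L_k, P_k):
    """One sequential LSQ step: fuse the prior (x_prev with information matrix
    P_x_prev) with new observations L_k = A_k x + noise (weight P_k)."""
    # Normal equations combining prior information and new observations
    N = P_x_prev + A_k.T @ P_k @ A_k
    x_new = np.linalg.solve(N, P_x_prev @ x_prev + A_k.T @ P_k @ L_k)
    return x_new, N  # N is the updated information (inverse-covariance) matrix

x = np.zeros(2)
P_x = np.eye(2) * 1e-6          # very weak prior
truth = np.array([1.5, -0.7])
rng = np.random.default_rng(1)
for _ in range(50):             # 50 epochs of 3 observations each
    A = rng.normal(size=(3, 2))
    L = A @ truth + rng.normal(scale=1e-3, size=3)
    x, P_x = sequential_lsq(x, P_x, A, L, np.eye(3))
print(x)   # converges toward `truth`
```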
(3) Based on the float double-difference ambiguity vector and the corresponding variance-covariance matrix, the least-squares ambiguity decorrelation adjustment (LAMBDA) algorithm is adopted to determine the integer double-difference ambiguities corresponding to the real-valued double-difference ambiguities, where the calculation formula is as follows:
where Z^T is an integer transformation matrix, N̂_1,f and N̂_3,f are the estimated float double-difference ambiguities with corresponding posterior covariance matrices Q_N1,f and Q_N3,f, and ẑ and Q_ẑ are the decorrelated ambiguity vector and its corresponding covariance matrix. The problem of solving the integer ambiguity vector thus becomes the following minimization problem:
where z is the integer ambiguity vector in the decorrelated space. After the z-vector is obtained, the integer double-difference ambiguity values can be found according to the following equation:
N=Z -1 z。
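To make the back-transform N = Z^(-1) z concrete, the sketch below fixes ambiguities by simple rounding in the decorrelated space and maps them back with a unimodular Z. A real LAMBDA implementation also searches nearby integer candidates weighted by the covariance; the matrix and float values here are illustrative only.

```python
# Minimal illustration of the integer-recovery step after decorrelation.
import numpy as np

Z = np.array([[2, 1],
              [1, 1]])                 # unimodular: integer entries, |det| = 1
N_float = np.array([3.9, -2.1])        # float double-difference ambiguities
z_float = Z @ N_float                  # decorrelated float ambiguities
z_int = np.round(z_float)              # fix to integers in decorrelated space
N_int = np.linalg.solve(Z, z_int)      # back-transform N = Z^{-1} z
print(N_int)                           # integer because Z^{-1} is also integer
```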
(4) Based on a least square method, according to the integer double-difference ambiguity value and the double-difference carrier observation equation, calculating the real-time position of the antenna of the Beidou receiver, wherein the calculation formula is as follows:
it should be noted that in the observation vector L k The integer double-difference ambiguity value obtained by the double-difference carrier observation equation is replaced.
(5) Based on the real-time position of the antenna of the Beidou receiver and the Doppler observation value, the real-time speed and the real-time attitude of the antenna of the Beidou receiver are calculated by utilizing a least square theory, a BDS speed measurement theory and a dual-antenna attitude determination theory, wherein the calculation formula is as follows:
in the method, in the process of the invention,a= |v, which is the Doppler observed value on the j frequency of the ith satellite i -v r The | represents the rate of change of distance caused by satellite and receiver movements, +. >Indicating receiver clock variationTransformation rate (F/L)>Is a comprehensive error term such as satellite clock error, ionosphere error, troposphere error and the like. Applying the least squares method-> The speed of the receiver antenna can be solved. According to the dual-antenna attitude determination theory, the attitude information is obtained by the following formula:
where yaw and pitch represent the heading and pitch angles, the roll angle is typically given an empirical value of 0, and (Δx_n, Δy_n, Δz_n) are the coordinates of the baseline vector between the main and auxiliary antennas of the receiver in the n-frame.
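A minimal sketch of the dual-antenna attitude relations named above, assuming the main-to-auxiliary baseline is expressed in the local N-E-D n-frame; the baseline values and function name are invented for illustration.

```python
# Heading (yaw) and pitch from a dual-antenna baseline in the N-E-D n-frame.
import math

def baseline_attitude(dn, de, dd):
    """Yaw/pitch (radians) from an N-E-D baseline vector (dn, de, dd)."""
    yaw = math.atan2(de, dn)                       # heading from north, east-positive
    pitch = math.atan2(-dd, math.hypot(dn, de))    # up-tilt positive (D axis points down)
    return yaw, pitch

yaw, pitch = baseline_attitude(1.0, 1.0, 0.0)      # level baseline pointing north-east
print(math.degrees(yaw), math.degrees(pitch))
```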
Step 600, acquiring current inertial data acquired by the initialized inertial measurement unit and a current road image shot by the road image shooting component.
Step 700, determining the relative position of the antenna of the Beidou receiver and the inertial measurement unit, and then determining the final position and attitude rotation matrix of the inertial measurement unit by combining the current inertial data, the first satellite observation data, the second satellite observation data, the current road image, the calibrated internal reference matrix and the calibrated external reference matrix. The current inertial data include the current acceleration observations and the current angular velocity observations. Specifically, the high-precision real-time position, velocity and attitude of the IMU center are computed based on BDS/INS/vision data-fusion theory and multi-state-constraint Kalman filtering theory, using the real-time IMU-center position, velocity and attitude computed by inertial navigation theory, the RTK double-difference observation equations, and the pixel coordinates of the visual measurement feature points.
The relative position of the antenna of the Beidou receiver and the inertial measurement unit is determined as follows: taking the IMU measurement center as the coordinate origin, the forward direction (F) of carrier motion is set as the x-axis and the right direction (R) as the y-axis, and the z-axis (Down, D) is determined by the right-hand rule. In this F-R-D coordinate system, the three-dimensional coordinates of the BDS receiver main antenna relative to the coordinate origin (the IMU measurement center) are measured to compensate for the lever-arm error caused by the misalignment of the GNSS receiver center and the IMU center.
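The lever-arm compensation described above can be sketched as follows, under assumed values: the antenna position is translated to the IMU center using the body-frame lever arm and the body-to-navigation rotation matrix. All numbers and names are illustrative, not the patent's.

```python
# Illustrative lever-arm correction from GNSS antenna to IMU center.
import numpy as np

def antenna_to_imu(p_ant_n, C_bn, lever_b):
    """p_imu_n = p_ant_n - C_bn @ lever_b (positions in n-frame, lever in b-frame)."""
    return p_ant_n - C_bn @ lever_b

lever_b = np.array([0.5, -0.2, -1.1])     # antenna offset in the F-R-D body frame (m)
C_bn = np.eye(3)                          # level, north-pointing carrier for simplicity
p_ant_n = np.array([10.0, 20.0, -5.0])    # antenna position in local N-E-D (m)
print(antenna_to_imu(p_ant_n, C_bn, lever_b))   # position translated to the IMU center
```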
Step 700 specifically includes:
(1) Calculating the position, velocity and attitude rotation matrix of the inertial measurement unit in the inertial navigation coordinate system from the current inertial data. Specifically, using the specific-force (velocity-increment) observations output by the IMU accelerometer (b-frame) and the angular velocity observations output by the gyroscope (b-frame), the position, velocity and attitude of the IMU measurement center at time k in the navigation coordinate system (n, the North-East-Down frame, denoted N-E-D) are computed from the INS navigation equations.
The calculation formula of the speed of the inertial measurement unit under the inertial navigation coordinate system is as follows:
where v_k^n and v_{k-1}^n represent the velocities in the n-frame at times k and k-1 respectively, Δv_f,k^n represents the velocity increment produced by the specific force at time k, Δv_g/cor,k^n represents the velocity-compensation term at time k for gravity, centripetal acceleration and Coriolis acceleration, C_{n(k)}^{n(k-1)} represents the rotation matrix of the n-frame from time k to k-1, I represents the identity matrix, and C_b^n(k-1) is the rotation matrix from the b-frame to the n-frame at time k-1; both rotation matrices are functions of position and velocity. Δt represents the sampling interval of the inertial measurement unit, mid represents the intermediate instant between k-1 and k, Δv_f,k-1^b represents the velocity increment in the b-frame produced by the specific force at time k-1, Δv_k^b represents the velocity increment in the b-frame at time k, g^n represents the gravitational acceleration in the n-frame, v^n represents the velocity in the n-frame, and Δθ_k represents the attitude increment in the b-frame at time k. Here n denotes the navigation coordinate system and b denotes the coordinate system with the measurement center of the inertial measurement unit as the coordinate origin.
The calculation formula of the position of the inertial measurement unit in the inertial navigation coordinate system is as follows:

where h_imu,k and h_imu,k-1 represent the heights at times k and k-1, v_D,mid represents the down velocity at the intermediate instant, B_imu,k and L_imu,k represent the latitude and longitude at time k respectively, E is a diagonal matrix, R_M and R_N are the meridian and prime-vertical radii of curvature, q_k represents the position quaternion at time k, ⊗ represents quaternion multiplication, ‖·‖ represents the modulus operation, ζ_k represents the rotation vector of the n-frame from time k-1 to k, ξ_k represents the rotation vector of the e-frame from time k-1 to k, and atan(·) represents the arctangent function.
The calculation formula of the attitude rotation matrix of the inertial measurement unit under the inertial navigation coordinate system is as follows:
where q_k represents the attitude quaternion at time k, φ_k represents the rotation vector of the b-frame from time k-1 to k, and (Δθ_{k-1} × Δθ_k)/12 represents the second-order coning-effect compensation term; q^[Y] represents the Y-th element of the attitude quaternion.
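A hedged sketch of the b-frame attitude update with the second-order coning correction named in the text: the rotation vector over one IMU interval is the current gyro increment plus (Δθ_{k-1} × Δθ_k)/12, applied as a quaternion. The helper functions and increment values are ours, not the patent's.

```python
# Quaternion attitude update with second-order coning compensation.
import numpy as np

def rotvec_to_quat(phi):
    """Quaternion [w, x, y, z] for a rotation vector phi."""
    a = np.linalg.norm(phi)
    if a < 1e-12:
        return np.array([1.0, 0.0, 0.0, 0.0])
    return np.concatenate(([np.cos(a / 2)], np.sin(a / 2) * phi / a))

def quat_mul(p, q):
    """Hamilton product of quaternions p ⊗ q."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def attitude_update(q_prev, dtheta_prev, dtheta_k):
    """One b-frame update: rotation vector = Δθ_k + (Δθ_{k-1} × Δθ_k)/12."""
    phi = dtheta_k + np.cross(dtheta_prev, dtheta_k) / 12.0   # coning correction
    return quat_mul(q_prev, rotvec_to_quat(phi))

q = np.array([1.0, 0.0, 0.0, 0.0])
q = attitude_update(q, np.array([0.0, 0.0, 0.01]), np.array([0.0, 0.001, 0.01]))
print(q)   # remains a unit quaternion
```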
(2) According to the relative position of the antenna of the Beidou receiver and the inertial measurement unit and the position and attitude rotation matrix of the inertial measurement unit in the inertial navigation coordinate system, calculating the station-satellite distance predicted by the inertial measurement unit in the geocentric coordinate system, where the calculation formula is as follows:
where C_b^n represents the attitude rotation matrix from the b-frame to the n-frame, C_n^e represents the rotation matrix from the n-frame to the Earth-Centered Earth-Fixed frame (e-frame), r_s^e(k) represents the position of the BDS satellite in the e-frame at time k, and ‖·‖ represents the Euclidean norm.
(3) From multiple frames of the current road image, calculating by triangulation the three-dimensional coordinates of any feature point in the current road image in the geocentric coordinate system.
(4) Based on the three-dimensional coordinates of any feature point of the current road image in the geocentric coordinate system, the calibrated internal reference matrix of the road image capturing component (focal lengths f_x and f_y), the calibrated external reference matrix (rotation C_c^b and translation l_c^b, where c denotes the camera frame), and the position and attitude rotation matrix of the inertial measurement unit in the geocentric coordinate system corresponding to the current road image, the estimated pixel coordinates of the feature points in the current road image are calculated as follows:
where p_j^c is the three-dimensional coordinate of feature point j in the c-frame, and o_x and o_y represent the feature-point measurement noise.
(5) Based on the station-satellite distance predicted by the inertial measurement unit, the first satellite observation data, the second satellite observation data, and the estimated pixel coordinates of the feature points in the current road image, constructing the innovation vector (Z_k) of the observation equation of the multi-state-constraint Kalman filter, where the calculation formula is as follows:
where the first entries denote the measured pixel coordinates of the feature points; the double-difference pseudorange and carrier terms are all calculated from the first satellite observation data and the second satellite observation data, with the specific formulas given in step 500 above.
Parameter calculation is then carried out according to the measurement-update model of the multi-state-constraint Kalman filter:
δx_k = δx_{k-1} + K_k (Z_k − H_k δx_{k-1}).
where δx represents the parameter correction vector, and K_k and H_k respectively represent the gain matrix and the design coefficient matrix of the Kalman filter.
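The measurement update above can be sketched with the standard Kalman gain K = P Hᵀ (H P Hᵀ + R)⁻¹; the dimensions and values below are illustrative only, not the filter state of the patent.

```python
# Toy error-state Kalman measurement update: δx_k = δx_{k-1} + K (Z − H δx_{k-1}).
import numpy as np

def msckf_style_update(dx, P, H, Z, R):
    """One measurement update on the correction vector dx with covariance P."""
    S = H @ P @ H.T + R                       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    dx_new = dx + K @ (Z - H @ dx)            # correction-vector update
    P_new = (np.eye(len(dx)) - K @ H) @ P     # covariance update
    return dx_new, P_new

dx = np.zeros(3)                  # prior error-state correction
P = np.eye(3)                     # prior covariance
H = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])   # observe the first two states
Z = np.array([0.5, -0.2])         # innovation vector
R = 0.01 * np.eye(2)              # measurement noise
dx, P = msckf_style_update(dx, P, H, Z, R)
print(dx)
```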
(6) Substituting the integer double-difference ambiguity values into the innovation vector function and performing the parameter calculation to obtain the error-compensated position and attitude rotation matrix of the inertial measurement unit.
Step 800, calculating the positions corresponding to the feature points in the current road image in the geocentric coordinate system based on the final position and attitude rotation matrix of the inertial measurement unit and the calibrated external reference matrix; the feature points in the current road image represent road features and appendage features on the road. That is, the feature-point coordinates in the camera coordinate system (c-frame) are converted into position coordinates in the e-frame.
Specifically, based on the camera extrinsics and the error-compensated high-precision IMU position and attitude matrix, the position of the road appendage in the e-frame is calculated as follows:
where (C_b^n)^T is the transpose of the attitude matrix C_b^n, and C_c^b represents the attitude rotation matrix from the c-frame to the b-frame.
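The c-frame to e-frame chain implied by step 800 can be sketched as follows: a feature point in the camera frame is moved to the body frame by the extrinsics, then rotated to navigation and to ECEF, and translated by the IMU position. All rotation matrices and coordinates are toy values, and the function name is ours.

```python
# Camera-frame feature point -> ECEF position, via body and navigation frames.
import numpy as np

def cam_point_to_ecef(p_c, C_cb, l_cb, C_bn, C_ne, p_imu_e):
    """p_e = p_imu_e + C_ne @ C_bn @ (C_cb @ p_c + l_cb)."""
    p_b = C_cb @ p_c + l_cb          # camera frame -> body frame (extrinsics)
    return p_imu_e + C_ne @ C_bn @ p_b

p_c = np.array([0.0, 0.0, 5.0])       # feature 5 m along the optical axis
C_cb = np.eye(3)                      # toy camera-to-body rotation
l_cb = np.array([0.1, 0.0, 0.0])      # toy camera-to-body lever arm (m)
C_bn = np.eye(3)                      # toy body-to-navigation rotation
C_ne = np.eye(3)                      # toy navigation-to-ECEF rotation
p_imu_e = np.array([-2_180_000.0, 4_380_000.0, 4_090_000.0])  # toy ECEF position
print(cam_point_to_ecef(p_c, C_cb, l_cb, C_bn, C_ne, p_imu_e))
```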
Compared with current road and appendage monitoring technologies, the invention has the following advantages: (1) high accuracy, with position determination of the road and its appendages completed in real time and results available as soon as the measurement finishes; (2) by adopting the tightly coupled real-time Beidou RTK/INS/vision combination, the positions of roads and road appendages requiring maintenance can be surveyed and checked from images, meeting monitoring and maintenance requirements.
Example two
As shown in fig. 2, to realize the technical solution of the first embodiment and achieve the corresponding functions and technical effects, this embodiment further provides a real-time monitoring and positioning system for roads and accessories, including:
the data acquisition module 101 is used for acquiring historical inertial data acquired by the inertial measurement unit, a calibration image shot by the road image shooting component, first satellite observation data output by the Beidou receiver on the reference station and second satellite observation data output by the Beidou receiver on the carrier mobile platform; the inertial measurement unit and the road image shooting component are both arranged on the carrier moving platform.
The noise analysis module 201 is configured to analyze random noise in the historical inertial data by the Allan variance method to obtain a random noise time-domain model.
And the internal reference calibration module 301 is configured to calibrate an internal reference matrix of the road image capturing component according to the calibration image by adopting a Zhang Zhengyou camera calibration method, and determine initial values of external reference matrices of the inertial measurement unit and the road image capturing component.
And the external parameter calibration module 401 is configured to optimize the initial value of the external parameter matrix based on the historical inertial data, the random noise time domain model and the calibrated internal parameter matrix, so as to obtain a calibrated external parameter matrix.
The Beidou positioning and resolving module 501 is configured to perform the Beidou satellite navigation RTK ambiguity-fixed resolution based on the first satellite observation data and the second satellite observation data to obtain the real-time position, real-time velocity and real-time attitude at the antenna of the Beidou receiver, and then initialize the inertial measurement unit based on the real-time position, real-time velocity and real-time attitude at the antenna of the Beidou receiver.
The current data acquisition module 601 is configured to acquire current inertial data acquired by the initialized inertial measurement unit and a current road image captured by the road image capturing component.
The position determining module 701 is configured to determine a relative position of an antenna of the beidou receiver and the inertial measurement unit, and then determine a final position and an attitude rotation matrix of the inertial measurement unit by combining the current inertial data, the first satellite observation data, the second satellite observation data, the current road image, the calibrated internal reference matrix, and the calibrated external reference matrix.
The road and accessory positioning module 801 is configured to calculate a position corresponding to a feature point in the current road image in a geocentric coordinate system based on the final position and posture rotation matrix of the inertial measurement unit and the calibrated external parameter matrix; the feature points in the current road image represent road features and accessory features on the road.
Example three
As shown in fig. 3, the present embodiment provides a real-time monitoring and positioning device for roads and accessories, which includes an inertial measurement unit 1, a road image capturing component 2, a beidou receiver 3, a 5G communication module 4, an FPGA board card 5 and a carrier moving platform 6. The carrier moving platform 6 is also provided with a mobile power supply 7 to supply power to the device.
The inertial measurement unit 1, the road image capturing component 2, the Beidou receiver 3, the 5G communication module 4 and the FPGA board card 5 are all mounted on the carrier moving platform 6. Specifically, the inertial sensor is mounted on the right side of the carrier platform, the Beidou receiver antenna is mounted on the −y axis of the carrier coordinate system (F-R-D, b-frame) centered at the IMU, and the vision sensor is mounted on the −x axis of the IMU b-frame, placed at the front of the carrier. The inertial sensor, the Beidou receiver antenna and the vision sensor are rigidly connected, so that the IMU, the Beidou receiver antenna and the vision sensor move consistently and the spatial position relationship among them in the b-frame remains unchanged while the carrier platform moves.
The inertial measurement unit 1, the road image shooting component 2, the Beidou receiver 3 and the 5G communication module 4 are electrically connected with the FPGA board card 5.
The inertial measurement unit 1 is used to collect inertial data. The road image capturing component 2, specifically a camera, is used to acquire the calibration image and the current road image. The Beidou receiver 3 is used to receive satellite observation data, and a Beidou receiver antenna 8 is provided on the Beidou receiver 3. The coordinates of the main and auxiliary Beidou receiver antennas and of the vision sensor, in the b-frame with the IMU measurement center as the coordinate origin, are respectively:
The inertial data, image data and satellite observation data are transmitted to the FPGA board in real time. With BDS time as the reference, the IMU data and image data are hard-synchronized in time with the BDS data. The specific steps can be summarized as follows: first, the camera receives the PPS pulse (20 Hz) signal output by the Beidou receiver through its INPUT interface to realize hardware-triggered exposure; then, the camera's exposure signal is received, recorded and stored by the Beidou receiver, so that time synchronization among the camera, the inertial navigation unit and the Beidou receiver is realized at the hardware level.
After the above time synchronization, the FPGA board 5 is configured to execute the real-time monitoring and positioning method for the road and the accessories according to the first embodiment, so as to obtain the position corresponding to the feature point in the current road image under the geocentric coordinate system.
The 5G communication module 4 is used for sending the position corresponding to the characteristic point in the current road image to a remote terminal, so that real-time positioning and monitoring of roads and accessories can be realized. The appendages in the roadway include manhole covers 10, street lamps 11, traffic lights 12, trees 13, and utility poles 14, including but not limited to the features described above. When the road image shooting component 2 shoots images, the road and the accessories on the road are shot, so that the subsequent FPGA board card can sequentially perform calibration and image processing to obtain the positions of the corresponding feature points.
In the invention, an inertial sensor, a Beidou satellite receiver, a vision sensor and other sensors are mounted on the monitoring trolley, and real-time, high-precision position determination of the road and its appendages is realized through the tight real-time BDS RTK/INS/vision combination. Meanwhile, the time-synchronized real-time data are processed in real time on the FPGA to obtain carrier position and attitude information, visual navigation theory is used to realize rapid, high-precision, real-time monitoring of the road appendages, and the positioning and monitoring results are transmitted to the user in real time through the 5G communication module, so that the user can learn the relevant positioning information in time.
In this specification, the embodiments are described in a progressive manner, each focusing on its differences from the others; for identical or similar parts, the embodiments may be referred to one another. Since the system disclosed in an embodiment corresponds to the method disclosed in an embodiment, its description is relatively brief, and the relevant points can be found in the description of the method.
The principles and embodiments of the present invention have been described herein with reference to specific examples; the description of these embodiments is intended only to assist in understanding the method of the present invention and its core ideas. Those of ordinary skill in the art may make modifications in light of the ideas of the present invention. In view of the foregoing, this description should not be construed as limiting the invention.

Claims (9)

1. The real-time monitoring and positioning method for the roads and the accessories is characterized by comprising the following steps of:
acquiring historical inertial data acquired by an inertial measurement unit, a calibration image shot by a road image shooting component, first satellite observation data output by a Beidou receiver on a reference station and second satellite observation data output by the Beidou receiver on a carrier mobile platform; the inertial measurement unit and the road image shooting component are both arranged on the carrier moving platform;
Analyzing random noise in the historical inertial data by adopting an Allan variance method to obtain a random noise time domain model;
calibrating an internal reference matrix of the road image shooting component according to the calibration image by adopting a Zhang Zhengyou camera calibration method, and determining initial values of the external reference matrix of the inertial measurement unit and the road image shooting component;
optimizing the initial value of the external reference matrix based on the historical inertial data, the random noise time domain model and the calibrated internal reference matrix to obtain a calibrated external reference matrix;
based on the first satellite observation data and the second satellite observation data, performing Beidou satellite navigation RTK ambiguity fixed calculation to obtain a real-time position, a real-time speed and a real-time attitude at an antenna of the Beidou receiver, and then initializing the inertial measurement unit based on the real-time position, the real-time speed and the real-time attitude at the antenna of the Beidou receiver;
acquiring current inertial data acquired by an initialized inertial measurement unit and a current road image shot by a road image shooting component;
determining the relative position of an antenna of the Beidou receiver and the inertial measurement unit, and then determining a final position and posture rotation matrix of the inertial measurement unit by combining the current inertial data, the first satellite observation data, the second satellite observation data, the current road image, the calibrated internal reference matrix and the calibrated external reference matrix;
Calculating the position corresponding to the characteristic point in the current road image under a geocentric coordinate system based on the final position and posture rotation matrix of the inertial measurement unit and the calibrated external parameter matrix; the feature points in the current road image represent road features and accessory features on the road.
2. The method for real-time monitoring and positioning of roads and accessories according to claim 1, wherein calibrating the internal reference matrix of the road image capturing component according to the calibration image by the Zhang Zhengyou camera calibration method, and determining the initial values of the external reference matrix of the inertial measurement unit and the road image capturing component, specifically comprises:
based on the conversion relation between the world coordinate system and the pixel coordinate system, establishing a shooting component calibration function model; the shooting component calibration function model comprises an internal parameter matrix of the road image shooting component, an external parameter matrix of the inertia measurement unit and the road image shooting component;
constructing a homography matrix based on the shooting component calibration function model;
solving the homography matrix according to the calibration image based on a least square method and a cholesky decomposition method to obtain an initial value of an internal reference matrix of the road image shooting component;
Calculating the initial values of the external parameter matrix of the inertial measurement unit and the road image shooting component according to the calibrated internal parameter matrix and the homography matrix;
determining an initial value of a distortion coefficient;
and based on the initial value of the internal reference matrix and the initial value of the distortion coefficient, minimizing the reprojection error of the point by adopting a maximum likelihood method so as to optimize the initial value of the internal reference matrix and obtain the calibrated internal reference matrix.
3. The method of claim 1, wherein the historical inertial data includes historical acceleration measurements, historical angular velocity measurements, and historical pose; the historical pose is the pose of the inertial measurement unit under a world coordinate system;
optimizing the initial value of the external reference matrix based on the historical inertial data, the random noise time domain model and the calibrated internal reference matrix to obtain a calibrated external reference matrix, wherein the method specifically comprises the following steps of:
converting the historical pose into a Bezier spline curve with six degrees of freedom;
calculating the acceleration and the angular velocity of the inertial measurement unit based on the Bezier spline curve with six degrees of freedom;
Based on the random noise time domain model and the calibrated internal reference matrix, respectively constructing a visual re-projection error function, an acceleration error function, an angular velocity error function, an accelerometer zero bias error function and a gyro zero bias error function according to the historical acceleration measurement value, the historical angular velocity measurement value and the calculated acceleration and angular velocity of the inertial measurement unit at the same moment;
establishing an extrinsic matrix optimization function by taking the minimum total error value as a target according to the vision reprojection error function, the acceleration error function, the angular velocity error function, the accelerometer zero offset error function and the gyro zero offset error function;
and solving the external parameter matrix optimization function by adopting a Levenberg-Marquardt algorithm to obtain a calibrated external parameter matrix.
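The final step above solves the external parameter optimisation function with the Levenberg-Marquardt algorithm. A minimal sketch of that algorithm on a generic vector residual is given below; it is a toy damped Gauss-Newton loop with a numerical Jacobian, not the claimed error functions, and all names are assumptions.

```python
import numpy as np

def levenberg_marquardt(residual, x0, iters=50, lam=1e-3):
    """Minimal Levenberg-Marquardt loop: damped Gauss-Newton steps on a
    vector residual r(x); the damping factor lam is decreased when a step
    reduces the squared error and increased when it does not."""
    x = np.asarray(x0, dtype=float)

    def jac(x, eps=1e-6):
        # forward-difference Jacobian of the residual
        r0 = residual(x)
        J = np.empty((r0.size, x.size))
        for i in range(x.size):
            dx = np.zeros_like(x)
            dx[i] = eps
            J[:, i] = (residual(x + dx) - r0) / eps
        return r0, J

    for _ in range(iters):
        r, J = jac(x)
        A = J.T @ J + lam * np.eye(x.size)   # damped normal equations
        step = np.linalg.solve(A, -J.T @ r)
        if np.sum(residual(x + step) ** 2) < np.sum(r ** 2):
            x, lam = x + step, lam * 0.5     # accept step, relax damping
        else:
            lam *= 10.0                      # reject step, increase damping
    return x
```

In the claimed method the residual would stack the visual re-projection, acceleration, angular velocity and zero offset error terms; here any smooth residual (e.g. fitting a * exp(-b * t)) demonstrates the loop.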
4. The method for real-time monitoring and positioning of roads and accessories according to claim 3, wherein the visual re-projection error function is:

E_y = (1/2) Σ_{j=1}^{J} Σ_{m=1}^{M} e_{y,mj}^T R_y^{-1} e_{y,mj},  e_{y,mj} = y_{mj} - h(T_{c,i}, T_{w,i}(t_j + d), l_m)

the acceleration error function is:

E_α = (1/2) Σ_{k=1}^{K} e_{α,k}^T R_α^{-1} e_{α,k},  e_{α,k} = α(t_k) - C(φ(t_k))^T (a(t_k) - g^w) - b_a(t_k)

the angular velocity error function is:

E_ω = (1/2) Σ_{k=1}^{K} e_{ω,k}^T R_ω^{-1} e_{ω,k},  e_{ω,k} = ϖ(t_k) - ω(t_k) - b_g(t_k)

the accelerometer zero offset error function is:

E_ba = (1/2) ∫ ḃ_a(t)^T R_a^{-1} ḃ_a(t) dt

the gyro zero offset error function is:

E_bg = (1/2) ∫ ḃ_g(t)^T R_g^{-1} ḃ_g(t) dt

the calculation formula of the total error value is:

E = E_y + E_α + E_ω + E_ba + E_bg

wherein E represents the total error value; E_y, E_α, E_ω, E_ba and E_bg represent the total visual re-projection, acceleration, angular velocity, accelerometer zero offset and gyro zero offset error values respectively; e_{y,mj} represents the visual re-projection error value; y_{mj} represents the landmark pixel coordinates; h(·) represents the nonlinear road image shooting component model; T_{c,i} represents the external reference matrix between the inertial measurement unit and the road image shooting component; T_{w,i} represents the pose of the inertial measurement unit in the world coordinate system; t_j represents the time stamp of the calibration image; d represents the time delay parameter; l_m represents the road mark points on the road in the calibration image; J represents the number of images; M represents the number of road mark points; R_y represents the pixel observation noise of the road mark points; e_{α,k} represents the acceleration error value; α(t_k) represents the accelerometer measurement output; C(·) represents the function converting rotation parameters into a rotation matrix; φ(t_k) represents the rotation parameters; a(t_k) represents the acceleration at time t_k derived from the spline curve; g^w represents the gravity vector in the world coordinate system; b_a(t) and b_g(t) represent the accelerometer and gyroscope zero offsets at time t, written b_a(t_k) and b_g(t_k) when t = t_k; K represents the epoch number; (·)^T represents the transpose; (·)^{-1} represents the inverse; R_α represents the acceleration observation noise; e_{ω,k} represents the angular velocity error value; ϖ(t_k) represents the gyroscope measurement output; ω(t_k) represents the angular velocity derived from the spline curve; R_ω represents the angular velocity observation noise; ḃ_a(t) and ḃ_g(t) represent the time derivatives of the accelerometer and gyroscope zero offsets; R_a represents the acceleration random walk noise; R_g represents the gyro random walk noise.
5. The method of claim 1, wherein the first satellite observation data and the second satellite observation data each comprise pseudorange observations, carrier observations, BDS satellite broadcast ephemeris and Doppler observations;
based on the first satellite observation data and the second satellite observation data, performing Beidou satellite navigation RTK ambiguity fixed calculation to obtain a real-time position, a real-time speed and a real-time gesture of an antenna of the Beidou receiver, wherein the method specifically comprises the following steps:
constructing a double-difference observation equation based on the first satellite observation data and the second satellite observation data; the double-difference observation equation comprises a double-difference carrier observation equation and a double-difference pseudo-range observation equation;
based on a sequential nonlinear least square method, performing BDS RTK floating solution calculation on the double-difference observation equation to obtain floating solution double-difference ambiguity vectors of carrier observation values and corresponding variance-covariance matrixes on B1I frequency points and B3I frequency points of Beidou satellite navigation;
based on the floating solution double-difference ambiguity vector and the corresponding variance-covariance matrix, determining the integer double-difference ambiguity value corresponding to the real-valued double-difference ambiguity by adopting the least-squares ambiguity decorrelation adjustment (LAMBDA) algorithm;
based on a least square method, calculating a real-time position of an antenna of the Beidou receiver according to the integer double-difference ambiguity value and the double-difference carrier observation equation;
based on the real-time position of the antenna of the Beidou receiver and the Doppler observation value, a least square theory, a BDS speed measurement theory and a dual-antenna attitude determination theory are adopted to calculate the real-time speed and the real-time attitude of the antenna of the Beidou receiver.
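Claim 5 fixes the float double-difference ambiguities to integers with the LAMBDA algorithm. The sketch below substitutes a brute-force integer least-squares search for the LAMBDA decorrelation and tree search; it returns the same minimiser for low-dimensional problems but is far less efficient, and all names are assumptions.

```python
import itertools
import numpy as np

def integer_least_squares(a_float, Q, window=2):
    """Brute-force integer least-squares: search integer vectors within
    +/- window of the rounded float ambiguities and keep the candidate z
    minimising (a - z)^T Q^-1 (a - z). A stand-in for the LAMBDA
    decorrelation + search, practical only for small dimensions."""
    Qinv = np.linalg.inv(Q)
    base = np.round(a_float).astype(int)
    best, best_cost = None, np.inf
    for offs in itertools.product(range(-window, window + 1),
                                  repeat=len(a_float)):
        z = base + np.array(offs)
        d = a_float - z
        cost = d @ Qinv @ d          # weighted squared residual
        if cost < best_cost:
            best, best_cost = z, cost
    return best, best_cost
```

With an identity variance-covariance matrix the search reduces to componentwise rounding; a strongly correlated matrix is exactly the case where LAMBDA's decorrelation step pays off.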
6. The method of claim 5, wherein the current inertial data comprises current acceleration observations, current angular velocity observations;
determining the relative position of the antenna of the Beidou receiver and the inertial measurement unit, and then determining a final position and posture rotation matrix of the inertial measurement unit by combining the current inertial data, the first satellite observation data, the second satellite observation data, the current road image, the calibrated internal reference matrix and the calibrated external reference matrix, wherein the method specifically comprises the following steps of:
Calculating the position, the speed and the gesture rotation matrix of the inertial measurement unit under an inertial navigation coordinate system according to the current inertial data;
calculating the station star distance predicted by the inertial measurement unit under a geocentric coordinate system according to the relative position of the antenna of the Beidou receiver and the inertial measurement unit and the position and posture rotation matrix of the inertial measurement unit under an inertial navigation coordinate system;
according to the multi-frame current road image, calculating the three-dimensional coordinates of any feature point in the current road image under a geocentric coordinate system by adopting a triangulation method;
calculating a pixel coordinate estimated value of a feature point in the current road image based on a three-dimensional coordinate of any feature point in the current road image under a geocentric coordinate system, a calibrated internal reference matrix of the road image shooting component and a calibrated external reference matrix, and according to a position and an attitude rotation matrix of an inertial measurement unit corresponding to the current road image under the geocentric coordinate system;
constructing an innovation vector function in an observation equation of multi-state constraint Kalman filtering based on a station star distance predicted by the inertial measurement unit, the first satellite observation data, the second satellite observation data and pixel coordinate estimation values of feature points in the current road image under a geocentric coordinate system;
Substituting the integer double-difference ambiguity value into the innovation vector function and performing parameter calculation to obtain a position and posture rotation matrix of the inertial measurement unit after error compensation.
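The triangulation step of claim 6 computes the three-dimensional coordinates of a feature point from multiple frames. A minimal linear (DLT) triangulation sketch follows; the projection matrices and the test values are illustrative assumptions, not the claimed geometry.

```python
import numpy as np

def triangulate(P_list, uv_list):
    """Linear (DLT) triangulation: each 3x4 camera projection matrix P and
    pixel observation (u, v) contribute two rows to the homogeneous system
    A x = 0; the 3-D point is the least-squares null vector of A."""
    A = []
    for P, (u, v) in zip(P_list, uv_list):
        A.append(u * P[2] - P[0])    # u * (row 3) - (row 1)
        A.append(v * P[2] - P[1])    # v * (row 3) - (row 2)
    A = np.asarray(A)
    X = np.linalg.svd(A)[2][-1]      # smallest right singular vector
    return X[:3] / X[3]              # dehomogenise
```

For noise-free observations from two distinct viewpoints the null vector is exact; with noisy pixels the SVD gives the algebraic least-squares point.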
7. The method for real-time monitoring and positioning of roads and accessories according to claim 6, wherein the calculation formula of the speed of the inertial measurement unit in the inertial navigation coordinate system is:

v^n_k = v^n_{k-1} + Δv^n_{f,k} + Δv^n_{g/cor,k}
Δv^n_{f,k} = [I - (1/2)(ζ_k ×)] C^n_{b,k-1} Δv^b_{f,k}
Δv^b_{f,k} = Δv_k + (1/2) Δθ_k × Δv_k
Δv^n_{g/cor,k} = [g^n - (2ω^n_{ie} + ω^n_{en}) × v^n]_{mid} Δt

wherein v^n_k and v^n_{k-1} represent the velocity in the n frame at times k and k-1 respectively; Δv^n_{f,k} represents the velocity increment produced by the specific force at time k; Δv^n_{g/cor,k} represents the velocity compensation error term produced jointly by gravity, centripetal acceleration and Coriolis acceleration at time k; ζ_k represents the rotation vector of the n frame from time k-1 to k; I represents the identity matrix; C^n_{b,k-1} represents the rotation matrix from the b frame to the n frame at time k-1; ω^n_{ie} and ω^n_{en} are both functions of position and velocity; Δt represents the sampling interval of the inertial measurement unit; mid represents the intermediate instant between times k-1 and k; Δv^b_{f,k} represents the velocity increment in the b frame produced by the specific force; Δv_k represents the measured velocity increment; g^n represents the gravitational acceleration in the n frame; v^n represents the velocity in the n frame; Δθ_k represents the attitude increment in the b frame at time k; the n frame represents the navigation coordinate system and the b frame represents the coordinate system with the measurement centre of the inertial measurement unit as the coordinate origin;

the calculation formula of the position of the inertial measurement unit in the inertial navigation coordinate system is:

h_{imu,k} = h_{imu,k-1} - v_{D,mid} Δt
q_k = q(ξ_k)* ⊗ q_{k-1} ⊗ q(ζ_k)

and B_{imu,k} and L_{imu,k} are recovered from the elements of the position quaternion q_k through the atan() function; wherein h_{imu,k} and h_{imu,k-1} represent the elevation at times k and k-1; v_{D,mid} represents the down velocity at the intermediate instant; B_{imu,k} and L_{imu,k} represent the latitude and longitude at time k respectively; E is a diagonal matrix formed from R_M and R_N, the radii of the meridian and prime vertical circles, relating the navigation-frame velocity to the geographic coordinate rates; q_k represents the position quaternion at time k; ⊗ represents quaternion multiplication; ‖·‖ represents the modulo operation; ζ_k represents the rotation vector of the n frame from time k-1 to k; ξ_k represents the rotation vector of the e frame from time k-1 to k; atan() represents the arctangent function;

the calculation formula of the attitude rotation matrix of the inertial measurement unit in the inertial navigation coordinate system is:

φ_k = Δθ_k + (1/12) Δθ_{k-1} × Δθ_k
q^n_{b,k} = q(ζ_k)* ⊗ q^n_{b,k-1} ⊗ q(φ_k)

wherein q^n_{b,k} represents the attitude quaternion at time k; φ_k represents the equivalent rotation vector of the b frame from time k-1 to k; (1/12) Δθ_{k-1} × Δθ_k represents the second-order coning effect compensation term; the attitude rotation matrix is constructed from the attitude quaternion, with q^[Y] denoting the Y-th element of the attitude quaternion q^n_{b,k}.
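A simplified single-step strapdown update with the same structure as the claim-7 mechanisation (rotate the specific-force velocity increment into the navigation frame, add gravity, integrate position with the trapezoid rule, propagate attitude by the rotation vector) is sketched below. The earth-rate and transport-rate terms of the full mechanisation are omitted for clarity, and all names are assumptions.

```python
import numpy as np

def ins_update(p, v, C_bn, dv_b, dtheta_b, g_n, dt):
    """One simplified strapdown update in a local-level frame.
    p, v: position and velocity in the navigation frame; C_bn: body-to-
    navigation rotation matrix; dv_b, dtheta_b: velocity and angle
    increments from the IMU over dt; g_n: gravity vector."""
    # specific-force velocity increment with first-order rotation compensation
    dv_n = C_bn @ (dv_b + 0.5 * np.cross(dtheta_b, dv_b))
    v_new = v + dv_n + g_n * dt
    # trapezoidal position integration
    p_new = p + 0.5 * (v + v_new) * dt
    # attitude update by the rotation vector dtheta_b (Rodrigues formula)
    th = np.linalg.norm(dtheta_b)
    if th > 1e-12:
        k = dtheta_b / th
        Kx = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
        dC = np.eye(3) + np.sin(th) * Kx + (1 - np.cos(th)) * (Kx @ Kx)
    else:
        dC = np.eye(3)
    return p_new, v_new, C_bn @ dC
```

With zero IMU increments the update reduces to free fall under g_n, which makes the integration step easy to check by hand.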
8. A real-time monitoring and positioning system for roads and accessories, the system comprising:
the data acquisition module is used for acquiring historical inertial data acquired by the inertial measurement unit, a calibration image shot by the road image shooting component, first satellite observation data output by the Beidou receiver on the reference station and second satellite observation data output by the Beidou receiver on the carrier mobile platform; the inertial measurement unit and the road image shooting component are both arranged on the carrier moving platform;
The noise analysis module is used for analyzing random noise in the historical inertia data by adopting an Allan variance method so as to obtain a random noise time domain model;
the internal reference calibration module is used for calibrating an internal reference matrix of the road image shooting component according to the calibration image by adopting a Zhang Zhengyou camera calibration method, and determining initial values of the external reference matrix of the inertial measurement unit and the road image shooting component;
the external parameter calibration module is used for optimizing the initial value of the external parameter matrix based on the historical inertial data, the random noise time domain model and the calibrated internal parameter matrix to obtain a calibrated external parameter matrix;
the Beidou positioning and resolving module is used for fixedly resolving the Beidou satellite navigation RTK ambiguity based on the first satellite observation data and the second satellite observation data to obtain a real-time position, a real-time speed and a real-time gesture at an antenna of the Beidou receiver, and initializing the inertial measurement unit based on the real-time position, the real-time speed and the real-time gesture at the antenna of the Beidou receiver;
the current data acquisition module is used for acquiring the current inertial data acquired by the initialized inertial measurement unit and the current road image shot by the road image shooting component;
The position determining module is used for determining the relative position of the antenna of the Beidou receiver and the inertial measurement unit, and then determining a final position and posture rotation matrix of the inertial measurement unit by combining the current inertial data, the first satellite observation data, the second satellite observation data, the current road image, the calibrated internal reference matrix and the calibrated external reference matrix;
the road and accessory positioning module is used for calculating the position corresponding to the characteristic point in the current road image under the geocentric coordinate system based on the final position and gesture rotation matrix of the inertial measurement unit and the calibrated external parameter matrix; the feature points in the current road image represent road features and accessory features on the road.
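The noise analysis module recited above applies the Allan variance method to the historical inertial data to characterise its random noise. A non-overlapping Allan variance sketch follows; the function name and the white-noise check are illustrative assumptions.

```python
import numpy as np

def allan_variance(omega, fs, taus):
    """Non-overlapping Allan variance of a rate signal (e.g. raw gyro
    output) sampled at fs Hz: average the signal in bins of m = tau * fs
    samples, then take half the mean squared difference of consecutive
    bin averages for each averaging time tau."""
    out = []
    for tau in taus:
        m = int(round(tau * fs))
        n_bins = len(omega) // m
        bins = omega[: n_bins * m].reshape(n_bins, m).mean(axis=1)
        diffs = np.diff(bins)
        out.append(0.5 * np.mean(diffs ** 2))
    return np.asarray(out)
```

For white measurement noise of variance sigma^2 the Allan variance falls as sigma^2 / (tau * fs), i.e. a -1 slope of the Allan deviation on a log-log plot; bias instability and random walk produce the flat and rising segments read off when building the random noise time domain model.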
9. The real-time monitoring and positioning device for the roads and the accessories is characterized by comprising an inertial measurement unit, a road image shooting component, a Beidou receiver, a 5G communication module, an FPGA board card and a carrier moving platform;
the inertial measurement unit, the road image shooting component, the Beidou receiver, the 5G communication module and the FPGA board card are all arranged on the carrier moving platform;
The inertial measurement unit, the road image shooting component, the Beidou receiver and the 5G communication module are electrically connected with the FPGA board card;
the inertial measurement unit is used for acquiring inertial data;
the road image shooting component is used for acquiring a calibration image and a current road image;
the Beidou receiver is used for receiving satellite observation data;
the FPGA board is used for executing the real-time monitoring and positioning method of the road and the appendages according to any one of claims 1-7 so as to obtain the positions corresponding to the characteristic points in the current road image under the geocentric coordinate system;
and the 5G communication module is used for sending the position corresponding to the characteristic point in the current road image to a remote terminal.
CN202311008483.5A 2023-08-11 2023-08-11 Real-time monitoring and positioning method, system and device for roads and accessories Pending CN117031513A (en)

Publications (1)

Publication Number Publication Date
CN117031513A true CN117031513A (en) 2023-11-10

Family

ID=88601791


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117724141A (en) * 2024-02-18 2024-03-19 中国民航大学 Beidou orthogonal baseline direction finding angle optimization method for aircraft
CN117724141B (en) * 2024-02-18 2024-04-16 中国民航大学 Beidou orthogonal baseline direction finding angle optimization method for aircraft


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination