CN117433518A - Positioning control system, method, electronic device, storage medium and chip - Google Patents

Positioning control system, method, electronic device, storage medium and chip

Info

Publication number
CN117433518A
Authority
CN
China
Prior art keywords
controller
pose
point cloud
inertial navigation
error state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311738787.7A
Other languages
Chinese (zh)
Inventor
刘京
曲小飞
周江涛
徐昊
樊硕
尹力
马双涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sany Intelligent Equipment Co ltd
Sany Heavy Equipment Co Ltd
Original Assignee
Sany Intelligent Equipment Co ltd
Sany Heavy Equipment Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sany Intelligent Equipment Co Ltd and Sany Heavy Equipment Co Ltd
Priority to CN202311738787.7A
Publication of CN117433518A
Legal status: Pending


Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C21/1652 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
    • E - FIXED CONSTRUCTIONS
    • E21 - EARTH OR ROCK DRILLING; MINING
    • E21D - SHAFTS; TUNNELS; GALLERIES; LARGE UNDERGROUND CHAMBERS
    • E21D9/00 - Tunnels or galleries, with or without linings; Methods or apparatus for making thereof; Layout of tunnels or galleries
    • E21D9/003 - Arrangement of measuring or indicating devices for use during driving of tunnels, e.g. for guiding machines
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/183 - Compensation of inertial measurements, e.g. for temperature effects
    • G01C21/188 - Compensation of inertial measurements, e.g. for temperature effects for accumulated errors, e.g. by coupling inertial systems with absolute positioning systems
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 - Instruments for performing navigational calculations
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Mining & Mineral Resources (AREA)
  • Electromagnetism (AREA)
  • Environmental & Geological Engineering (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Life Sciences & Earth Sciences (AREA)
  • Geochemistry & Mineralogy (AREA)
  • Geology (AREA)
  • Navigation (AREA)

Abstract

The invention provides a positioning control system, a method, electronic equipment, a storage medium and a chip, which relate to the technical field of heading machine positioning. The positioning control system comprises a target, an inertial navigation system, a laser radar and a controller, the controller being electrically connected with the inertial navigation system and the laser radar. The controller determines the external parameters of the sensor system and the internal parameters of the inertial navigation system, acquires point clouds of the target and the roadway, matches point cloud data of adjacent frames and determines an optimization objective function of the pose using normal distribution transformation, determines the state variable of the iterative error-state Kalman filter, determines a linearized continuous-time model of the error state through a first-order Gauss-Markov process, determines prediction equations for the error state and the error-state covariance from the linearized continuous-time model, determines an optimization model of the error state from the error state and the optimization objective function, determines an iterative update equation of the error state, and thereby determines the pose of the heading machine in the roadway.

Description

Positioning control system, method, electronic device, storage medium and chip
Technical Field
The invention relates to the technical field of heading machine positioning, in particular to a positioning control system, a positioning control method, electronic equipment, a storage medium and a chip.
Background
In the related art, the heading machine is positioned by a pure inertial navigation system, but the integration error of a pure inertial navigation system accumulates and grows large over long periods of operation, which reduces the positioning accuracy of the heading machine.
Disclosure of Invention
In order to solve, or at least mitigate, the technical problem that the accumulated integration error of a pure inertial navigation system grows large under long-time operation and reduces the positioning accuracy of the heading machine, one object of the invention is to provide a positioning control system.
Another object of the present invention is to provide a positioning control method.
Another object of the present invention is to provide an electronic device.
It is another object of the present invention to provide a computer readable storage medium.
It is another object of the present invention to provide a chip.
To achieve the above object, a first aspect of the present invention provides a positioning control system applied to a heading machine. The heading machine includes a body frame and is configured to travel in a roadway. The positioning control system includes: a target arranged in the roadway; an inertial navigation system arranged on the body frame; a laser radar arranged on the body frame; and a controller arranged on the body frame and electrically connected with the inertial navigation system and the laser radar. The controller is configured to acquire a three-dimensional image of the heading machine and to determine, from the three-dimensional image, the pose of the inertial navigation system in the heading machine, the pose of the laser radar in the heading machine and the relative position between the inertial navigation system and the laser radar. The controller is configured to determine the noise parameters of the inertial navigation system through an internal reference calibration tool, to determine the position of the target in the roadway through the laser radar, and to acquire point clouds of the target and the roadway through the laser radar, the point clouds comprising multiple frames of point cloud data. The controller is configured to match the point cloud data of adjacent frames and determine an optimization objective function of the pose using normal distribution transformation, to determine the state variable of the iterative error-state Kalman filter according to the error state of the inertial navigation system, to determine a linearized continuous-time model of the error state through a first-order Gauss-Markov process according to the noise parameters and the state variable, to determine prediction equations for the error state and the error-state covariance according to the linearized continuous-time model, to determine an optimization model of the error state according to the error state and the optimization objective function, to determine an iterative update equation of the error state according to the optimization model, and to determine the pose of the heading machine in the roadway according to the iterative update equation.
According to the technical scheme of the positioning control system, in the first aspect, the positioning control system deeply fuses the data acquired by the inertial navigation system with the data acquired by the laser radar through the iterative error-state Kalman filtering algorithm, fully exploits the complementary nature of the two data sources, realizes the tight coupling of the inertial navigation system and the laser radar, reduces the accumulated integration error of the inertial navigation system under long-time operation, and improves the accuracy of pose estimation and the positioning precision of the heading machine in the roadway; in the second aspect, the targets, which can be arranged at arbitrary positions, help to increase the number of representative feature points in the point cloud.
The positioning control system is applied to the heading machine. The heading machine is used for advancing in the roadway so as to cut the working face of the roadway. The heading machine comprises a body frame. The body frame mainly plays a role of a mounting carrier for some devices or systems in a positioning control system (such as an inertial navigation system, a laser radar and a controller). The positioning control system is used for determining the pose of the heading machine in the roadway. It should be noted that, the pose referred to in the present invention refers to a position and a pose.
Specifically, the positioning control system includes a target, an inertial navigation system, a lidar, and a controller. The target is arranged in the roadway. Optionally, the staff randomly positions the targets at any position in the roadway. The target is used to assist the laser radar in scanning and acquiring point cloud data. Optionally, the number of targets is at least one, that is, there may be one, two or more targets, set flexibly according to the actual application scenario. Further, the inertial navigation system is arranged on the body frame. An inertial navigation system (INS) is an autonomous navigation system that neither depends on external information nor radiates energy to the outside. Optionally, the inertial navigation system is disposed on the upper surface of the body frame. Further, the laser radar is arranged on the body frame. Optionally, the laser radar and the inertial navigation system form the sensor system of the positioning control system. Optionally, the number of laser radars is at least one, that is, there may be one, two or more laser radars, set flexibly according to the actual application scenario. When the number of laser radars is two or more, the laser radars are installed at different heights of the body frame. Further, the controller is arranged on the body frame. The controller is electrically connected with the inertial navigation system; the controller is electrically connected with the laser radar.
Further, the controller is used for acquiring a three-dimensional image of the heading machine. The operator acquires a three-dimensional image of the surface of the heading machine using a 3D scanner and stores or transmits information related to the three-dimensional image to the controller so that the controller acquires the three-dimensional image of the heading machine.
Further, the controller is used for determining the pose of the inertial navigation system in the heading machine, the pose of the laser radar in the heading machine and the relative position between the inertial navigation system and the laser radar according to the three-dimensional image. The pose of the inertial navigation system in the heading machine, the pose of the laser radar in the heading machine and the relative position between the inertial navigation system and the laser radar are external parameters of the sensor system. In other words, the controller is configured to determine an external parameter of the sensor system from the three-dimensional image. Optionally, the controller is used for establishing a heading machine coordinate system, an inertial navigation coordinate system and a radar coordinate system according to the three-dimensional image, and determining a first coordinate origin of the heading machine coordinate system, a second coordinate origin of the inertial navigation coordinate system and a third coordinate origin of the radar coordinate system; the controller is used for determining a displacement matrix according to the coordinates of the second coordinate origin in the coordinate system of the tunneling machine and the coordinates of the third coordinate origin in the coordinate system of the tunneling machine; the controller is configured to determine angle data and to solve the rotation matrix based on the angle data to determine an external parameter of the sensor system. Optionally, the first origin of coordinates is an intersection point between the Z-axis in the inertial navigation coordinate system and the upper surface of the body frame.
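To make this extrinsic-calibration step concrete, the following is a minimal sketch in Python/NumPy, assuming Z-Y-X Euler angles for the measured axis angles and a homogeneous-transform representation; the function and variable names, and the example numbers, are illustrative assumptions rather than values from the patent.

```python
import numpy as np

def rotation_from_euler(roll: float, pitch: float, yaw: float) -> np.ndarray:
    """Rotation matrix from Z-Y-X Euler angles (radians) measured between
    the sensor axes and the heading-machine axes."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def extrinsic_matrix(origin_in_machine: np.ndarray, euler_angles: tuple) -> np.ndarray:
    """4x4 homogeneous transform of a sensor frame expressed in the
    heading-machine frame: rotation solved from the measured angles,
    translation taken from the measured origin coordinates."""
    T = np.eye(4)
    T[:3, :3] = rotation_from_euler(*euler_angles)
    T[:3, 3] = origin_in_machine          # displacement (translation) part
    return T

# Example with made-up measurements taken from the 3D scan of the machine:
T_machine_imu   = extrinsic_matrix(np.array([0.50, 0.00, 1.20]), (0.0, 0.0, 0.0))
T_machine_lidar = extrinsic_matrix(np.array([1.10, 0.30, 1.45]), (0.0, 0.02, 0.01))
# Relative pose between the inertial navigation system and the lidar:
T_imu_lidar = np.linalg.inv(T_machine_imu) @ T_machine_lidar
```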
Further, the controller is configured to determine the noise parameters in the inertial navigation system via the internal reference calibration tool. Optionally, the internal reference calibration tool is imu_tools. Optionally, the inertial navigation system is kept stationary for a first duration, and the internal parameter calibration tool processes the data acquired by the inertial navigation system to determine the internal parameters of the inertial navigation system. Optionally, the internal parameters of the inertial navigation system are the noise parameters of the inertial navigation system. Optionally, the first duration is not less than 2 h. Optionally, the noise parameters include the noise of acceleration, the random walk noise of acceleration, the noise of angular velocity, and the random walk noise of angular velocity.
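The noise-parameter (internal reference) calibration can be pictured with the rough sketch below; it simply estimates white-noise densities and bias random walks from a long static recording, and is only an assumed, simplified stand-in for a dedicated calibration tool (which would typically rely on Allan-variance analysis).

```python
import numpy as np

def estimate_imu_noise(acc: np.ndarray, gyro: np.ndarray, rate_hz: float) -> dict:
    """Rough noise parameters from a static IMU log.

    acc, gyro: (N, 3) arrays recorded while the sensor is kept still
               (the patent suggests at least 2 h of static data).
    Returns white-noise densities and bias random-walk estimates.
    """
    dt = 1.0 / rate_hz
    # White-noise density: std of the raw samples scaled by sqrt(dt).
    acc_noise = acc.std(axis=0) * np.sqrt(dt)
    gyro_noise = gyro.std(axis=0) * np.sqrt(dt)
    # Bias random walk: drift between 1-second averaged biases.
    n_per_s = int(rate_hz)
    n_blocks = len(acc) // n_per_s
    acc_bias = acc[: n_blocks * n_per_s].reshape(n_blocks, n_per_s, 3).mean(axis=1)
    gyro_bias = gyro[: n_blocks * n_per_s].reshape(n_blocks, n_per_s, 3).mean(axis=1)
    acc_walk = np.diff(acc_bias, axis=0).std(axis=0)
    gyro_walk = np.diff(gyro_bias, axis=0).std(axis=0)
    return {"acc_noise": acc_noise, "acc_random_walk": acc_walk,
            "gyro_noise": gyro_noise, "gyro_random_walk": gyro_walk}
```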
Further, the controller is used for determining the position of the target in the roadway through the laser radar. Optionally, the staff randomly arranges targets within the roadway. Alternatively, the target is square. Optionally, the number of targets is at least one, that is, the targets may be one, two or more, and the targets are flexibly set according to the actual application scenario.
Further, the controller is used for acquiring point clouds of the target and the roadway through the laser radar, and the point clouds comprise multiple frames of point cloud data. Optionally, the laser radar can scan the target and the roadway so that the controller obtains a point cloud of the target and the roadway via the laser radar. A point cloud is a set of point data. Optionally, the number of laser radars is multiple, and the controller is used for fusing the point clouds acquired by the multiple laser radars. When the number of laser radars is multiple, the pose of the laser radar in the heading machine includes the relative positions of the multiple laser radars in the heading machine. Optionally, the number of laser radars is three, namely a first laser radar, a second laser radar and a third laser radar. Based on the external parameters of the sensor system (the pose of the laser radar in the heading machine), the point cloud acquired by the second laser radar and the point cloud acquired by the third laser radar are projected into the point cloud of the first laser radar to obtain the fused point cloud.
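A minimal sketch of this multi-lidar fusion is given below, assuming the extrinsics are available as 4x4 homogeneous transforms into the first lidar's frame; the names and data layout are assumptions, not taken from the patent.

```python
import numpy as np

def transform_points(T: np.ndarray, points: np.ndarray) -> np.ndarray:
    """Apply a 4x4 homogeneous transform T to an (N, 3) point cloud."""
    return points @ T[:3, :3].T + T[:3, 3]

def fuse_point_clouds(cloud1, cloud2, cloud3, T_l1_l2, T_l1_l3):
    """Project the clouds of lidar 2 and lidar 3 into the frame of lidar 1
    (using the extrinsic transforms T_l1_l2, T_l1_l3) and concatenate."""
    return np.vstack([cloud1,
                      transform_points(T_l1_l2, cloud2),
                      transform_points(T_l1_l3, cloud3)])
```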
Further, the controller is used for matching the point cloud data of adjacent frames and determining an optimization objective function of the pose using normal distribution transformation. Optionally, the kth frame and the (k+1)th frame are adjacent frames. Optionally, the controller is configured to project the point cloud data of the kth frame to the termination time of the kth frame according to the first pose to obtain a first point cloud, divide the first point cloud into a first number of grids, and determine the mean and covariance of the first point cloud in each grid; the controller is configured to project the point cloud data of the (k+1)th frame to the start time of the (k+1)th frame according to the second pose to obtain a second point cloud, and determine the optimization objective function of the second pose according to the probability that the points of the second point cloud falling in each grid of the first point cloud lie within the normal distribution of the corresponding grid. Optionally, the first pose is $T_k$, the pose of the kth frame, and the second pose is $T_{k+1}$, the pose of the (k+1)th frame. Optionally, the termination time of the kth frame is $t_k$ and the start time of the (k+1)th frame is $t_{k+1}$; optionally, the termination time of the kth frame coincides with the start time of the (k+1)th frame, i.e. the two are the same moment. Optionally, the first number is M. Optionally, the first point cloud $P_k$ is divided into M grids, and the mean and covariance of the first point cloud in each grid are obtained. Optionally, the controller is configured to determine the mean of the first point cloud in each grid according to the first formula:

$$\mu_m = \frac{1}{n_m}\sum_{i=1}^{n_m} x_i^{(m)}$$

where $\mu_m$ is the mean, $n_m$ is the number of first-point-cloud points in the mth grid, and $x_i^{(m)}$ is the coordinate of the ith point in the mth grid. Optionally, the controller is configured to determine the covariance of the first point cloud in each grid according to the second formula:

$$\Sigma_m = \frac{1}{n_m}\sum_{i=1}^{n_m}\bigl(x_i^{(m)} - \mu_m\bigr)\bigl(x_i^{(m)} - \mu_m\bigr)^{T}$$

where $\Sigma_m$ is the covariance, $\mu_m$ is the mean, $n_m$ is the number of first-point-cloud points in the mth grid, $x_i^{(m)}$ is the coordinate of the ith point in the mth grid, and T denotes the matrix transpose (an operation on a matrix). Optionally, the second point cloud is $P_{k+1}$. The probability that a point of the second point cloud $P_{k+1}$ falling in the mth grid of the first point cloud $P_k$ lies within the normal distribution $N(\mu_m, \Sigma_m)$ of the corresponding grid is used. Optionally, the optimization objective function is given by the third formula:

$$F\bigl(T_{k+1}\bigr) = \sum_{m=1}^{M}\sum_{j=1}^{n'_m}\exp\Bigl(-\tfrac{1}{2}\bigl(x'_j-\mu_m\bigr)^{T}\Sigma_m^{-1}\bigl(x'_j-\mu_m\bigr)\Bigr)$$

where $n'_m$ is the number of points of the second point cloud that fall in the mth grid of the first point cloud, $x'_j$ is such a point projected with the second pose, F is the optimization objective function, T denotes the matrix transpose, and $T_{k+1}$ denotes the second pose. When the second pose is optimized, the first pose is a known quantity. Optionally, m is not greater than M.
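The normal-distribution-transform matching described above can be sketched as follows; the uniform cell size, the minimum point count per grid, and the exponential per-point score are assumptions consistent with standard NDT rather than the patent's exact third formula.

```python
import numpy as np
from collections import defaultdict

def build_ndt_grid(cloud: np.ndarray, cell: float = 1.0):
    """Mean and covariance of the first point cloud in every occupied grid cell."""
    cells = defaultdict(list)
    for p in cloud:
        cells[tuple(np.floor(p / cell).astype(int))].append(p)
    grid = {}
    for key, pts in cells.items():
        pts = np.asarray(pts)
        if len(pts) < 5:                       # skip nearly empty cells
            continue
        mu = pts.mean(axis=0)                  # cf. the first formula
        diff = pts - mu
        cov = diff.T @ diff / len(pts)         # cf. the second formula
        grid[key] = (mu, np.linalg.inv(cov + 1e-6 * np.eye(3)))
    return grid

def ndt_score(grid, cloud2: np.ndarray, T: np.ndarray, cell: float = 1.0) -> float:
    """Sum of per-point Gaussian scores of the transformed second point cloud;
    maximizing this score over the pose T is the matching objective."""
    pts = cloud2 @ T[:3, :3].T + T[:3, 3]      # project with the second pose
    score = 0.0
    for p in pts:
        key = tuple(np.floor(p / cell).astype(int))
        if key in grid:
            mu, cov_inv = grid[key]
            d = p - mu
            score += np.exp(-0.5 * d @ cov_inv @ d)
    return score
```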
Further, the controller is configured to determine the state variable of the iterative error-state Kalman filter based on the error state of the inertial navigation system. Optionally, according to the error state of the inertial navigation system, the state variable of the iterative error-state Kalman filter is determined by the fourth formula:

$$\delta x = \bigl[\delta p^{T},\ \delta v^{T},\ \delta q^{T},\ \delta b_a^{T},\ \delta b_\omega^{T},\ \delta g^{T}\bigr]^{T}$$

where $\delta$ denotes an error term; $\delta x$ is the error state, and $\delta p$, $\delta v$, $\delta q$, $\delta b_a$, $\delta b_\omega$ and $\delta g$ denote the displacement, velocity, rotation, acceleration-bias, angular-velocity-bias and local-gravity errors, respectively; the rotation error $\delta q$ is represented by a quaternion.
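For concreteness, the error-state variable can be held in a small container such as the sketch below; the 18-dimensional ordering follows the components listed above, and the rotation error is kept as a 3-vector here (the patent parameterizes rotation with a quaternion), so this is an illustrative assumption rather than the patent's exact fourth formula.

```python
import numpy as np
from dataclasses import dataclass, field

@dataclass
class ErrorState:
    """18-dimensional error state: [dp, dv, dtheta, db_a, db_g, dg]."""
    d_pos:   np.ndarray = field(default_factory=lambda: np.zeros(3))  # displacement error
    d_vel:   np.ndarray = field(default_factory=lambda: np.zeros(3))  # velocity error
    d_theta: np.ndarray = field(default_factory=lambda: np.zeros(3))  # rotation error (axis-angle)
    d_ba:    np.ndarray = field(default_factory=lambda: np.zeros(3))  # accelerometer-bias error
    d_bg:    np.ndarray = field(default_factory=lambda: np.zeros(3))  # gyroscope-bias error
    d_g:     np.ndarray = field(default_factory=lambda: np.zeros(3))  # local-gravity error

    def as_vector(self) -> np.ndarray:
        return np.concatenate([self.d_pos, self.d_vel, self.d_theta,
                               self.d_ba, self.d_bg, self.d_g])
```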
Further, the controller is configured to determine a linearized continuous-time model of the error state through a first-order Gauss-Markov process according to the noise parameters and the state variable. The first-order Gauss-Markov process is a time-series model built from a linear combination of the previous observation and the current noise term. Optionally, the first-order Gauss-Markov process comprises the fifth formula:

$$\delta\dot{x}_t = F_t\,\delta x_t + G_t\,n_t$$

where $n_t$ is the Gaussian noise vector corresponding to the internal parameters of the inertial navigation system, $F_t$ is the error-state transition matrix at time t and satisfies the sixth formula, and $G_t$ is the noise Jacobian at time t and satisfies the seventh formula. The sixth and seventh formulas are built from the operator $[\,\cdot\,]_\times \in R^{3\times 3}$, which maps a three-dimensional vector to its skew-symmetric matrix, the identity matrix $I_3 \in R^{3\times 3}$, the rotation matrix $R_t$ at time t, and the acceleration $a_t$ and angular velocity $\omega_t$ at time t.
Further, the controller is configured to determine the prediction equations for the error state and the error-state covariance based on the linearized continuous-time model. Optionally, the predicted error state is $\delta\hat{x}$ and satisfies the eighth formula; the error-state covariance is $\hat{P}$ and satisfies the ninth formula. Optionally, according to the error state $\delta\hat{x}$ and the error-state covariance $\hat{P}$, the state prediction is obtained by continuous pre-integration, where $\tau_i$ and $\tau_{i+1}$ are consecutive IMU time steps and $Q$ denotes the covariance matrix of the noise vector $n$. IMU means the inertial sensor that is part of the inertial navigation system, i.e. the inertial navigation system comprises an inertial sensor.
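The prediction step can be sketched as a standard error-state covariance propagation over one IMU time step. The F and G matrices below follow the common error-state Kalman filter convention for the state ordering sketched above; they are an assumption illustrating the structure, not the patent's exact sixth to ninth formulas.

```python
import numpy as np

def skew(v: np.ndarray) -> np.ndarray:
    """[v]_x : map a 3-vector to its skew-symmetric matrix."""
    return np.array([[0, -v[2], v[1]],
                     [v[2], 0, -v[0]],
                     [-v[1], v[0], 0]])

def predict_covariance(P: np.ndarray, R_t: np.ndarray, acc: np.ndarray,
                       gyro: np.ndarray, Q: np.ndarray, dt: float) -> np.ndarray:
    """Propagate the 18x18 error-state covariance over one IMU step.

    F / G follow the usual linearized continuous-time error dynamics
    (first-order Gauss-Markov model), discretized with Phi = I + F * dt.
    Q is the 12x12 covariance of [acc noise, gyro noise, bias random walks].
    """
    I3 = np.eye(3)
    F = np.zeros((18, 18))
    F[0:3, 3:6] = I3                       # d(dp)/dt   = dv
    F[3:6, 6:9] = -R_t @ skew(acc)         # d(dv)/dt   depends on rotation error,
    F[3:6, 9:12] = -R_t                    #             accelerometer-bias error
    F[3:6, 15:18] = I3                     #             and gravity error
    F[6:9, 6:9] = -skew(gyro)              # d(dtheta)/dt depends on rotation error
    F[6:9, 12:15] = -I3                    #               and gyroscope-bias error

    G = np.zeros((18, 12))                 # noise Jacobian
    G[3:6, 0:3] = -R_t
    G[6:9, 3:6] = -I3
    G[9:12, 6:9] = I3
    G[12:15, 9:12] = I3

    Phi = np.eye(18) + F * dt              # first-order discretization
    return Phi @ P @ Phi.T + G @ Q @ G.T * dt
```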
Further, the controller is configured to determine an optimization model of the error state according to the error state and the optimization objective function. Optionally, the final optimization objective function F of the optimization model satisfies the tenth formula, in which $\|\cdot\|$ denotes the Mahalanobis norm, the optimization objective function obtained from the point cloud matching serves as the measurement, the subscript k+1 denotes the change from time $t_k$ to time $t_{k+1}$, and the Mahalanobis norm is weighted by the covariance matrix of the measurement noise. Optionally, the final state $x_{k+1}$ satisfies the eleventh formula, where $x_{k+1}$ denotes the final state from time $t_k$ to time $t_{k+1}$, exp(·) denotes the conversion of Euler angles into a rotation quaternion, and the eleventh formula represents the superposition of the state obtained by inertial navigation from the previous moment to the next moment (obtained by integrating the angular velocity and acceleration measured by the inertial navigation system) with the error state.
Further, the controller is configured to determine the iterative update equation of the error state based on the optimization model. Optionally, the iterative update equation comprises the twelfth formula; optionally, the iterative update equation further comprises the thirteenth formula; optionally, the iterative update equation further comprises the fourteenth formula. Optionally, when the correction vector $\Delta x^{(j)}$ of the jth iteration falls below a certain threshold, the error-state covariance is updated; the update equation of the error-state covariance is the fifteenth formula, and the final state is then obtained through the eleventh formula. In these formulas, $\Delta x^{(j)}$ denotes the correction vector at the jth iteration and $H$ is the Jacobian of the measurement with respect to the error state.
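Since the twelfth to fifteenth formulas are not reproduced here, the sketch below shows a generic iterated error-state Kalman update (Kalman-gain iteration that stops when the correction falls below a threshold, followed by a covariance update); it is an assumed illustration of the structure, not the patent's exact equations.

```python
import numpy as np

def iterated_update(x_pred, P_pred, residual_fn, jacobian_fn, R_meas,
                    boxplus, max_iter: int = 10, tol: float = 1e-4):
    """Generic iterated error-state Kalman update.

    residual_fn(x) -> measurement residual (e.g. derived from the NDT objective)
    jacobian_fn(x) -> H, Jacobian of the measurement w.r.t. the error state
    boxplus(x, dx) -> state x corrected by the error state dx
    The covariance is updated once the correction drops below `tol`,
    yielding the final state.
    """
    x = x_pred
    K = H = None
    for _ in range(max_iter):
        r = residual_fn(x)
        H = jacobian_fn(x)
        S = H @ P_pred @ H.T + R_meas
        K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
        dx = K @ r                               # correction vector of this iteration
        x = boxplus(x, dx)
        if np.linalg.norm(dx) < tol:
            break
    P = (np.eye(P_pred.shape[0]) - K @ H) @ P_pred   # covariance update
    return x, P
```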
Further, the controller is used for determining the pose of the heading machine in the roadway according to the iterative update equation. Optionally, the next prediction is initialized with the final state, as shown in the sixteenth formula, and the final state is converted into the global state in the roadway coordinate system, as shown in the seventeenth and eighteenth formulas. The controller matches the point clouds of adjacent frames using normal distribution transformation to obtain the optimization objective function of the pose, which lays the foundation for the subsequent error-state estimation; combining the optimization objective function of the pose obtained from the point cloud matching, the pose of the heading machine is obtained in real time based on iterative error-state Kalman filtering.
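As a final illustration, converting the per-frame final state into the roadway coordinate system amounts to applying a fixed rigid-body transform to the estimated pose, and the next prediction starts from the final state; the conventions below are assumptions, not the patent's exact sixteenth to eighteenth formulas.

```python
import numpy as np

def to_roadway_frame(T_roadway_start: np.ndarray, p_local: np.ndarray,
                     R_local: np.ndarray):
    """Convert the estimated position/rotation (expressed relative to the
    starting pose) into the global roadway coordinate system."""
    R0, t0 = T_roadway_start[:3, :3], T_roadway_start[:3, 3]
    p_global = R0 @ p_local + t0          # position conversion
    R_global = R0 @ R_local               # rotation conversion
    return p_global, R_global

# The next prediction is simply initialized from the final state of this frame:
# state_pred_next = final_state
```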
In the technical scheme defined by the invention, in the first aspect, the positioning control system deeply fuses the data acquired by the inertial navigation system with the data acquired by the laser radar through the iterative error-state Kalman filtering algorithm, fully exploits the complementary nature of the two data sources, realizes the tight coupling of the inertial navigation system and the laser radar, reduces the accumulated integration error of the inertial navigation system under long-time operation, and improves the accuracy of pose estimation and the positioning precision of the heading machine in the roadway; in the second aspect, the targets, which can be arranged at arbitrary positions, help to increase the number of representative feature points in the point cloud.
In addition, the technical scheme provided by the invention can also have the following additional technical characteristics:
in some embodiments, optionally, the controller is configured to establish a heading machine coordinate system, an inertial navigation coordinate system, and a radar coordinate system according to the three-dimensional image, and determine a first origin of coordinates of the heading machine coordinate system, a second origin of coordinates of the inertial navigation coordinate system, and a third origin of coordinates of the radar coordinate system, the controller is configured to determine a displacement matrix according to coordinates of the second origin of coordinates in the heading machine coordinate system and coordinates of the third origin of coordinates in the heading machine coordinate system, and the controller is configured to determine angle data and solve the rotation matrix according to the angle data, so as to determine a pose of the inertial navigation system in the heading machine, a pose of the laser radar in the heading machine, and a relative position between the inertial navigation system and the laser radar.
In this solution, the controller is configured to determine the external parameters of the sensor system from the three-dimensional image. The external parameters of the sensor system comprise the pose of the inertial navigation system in the heading machine, the pose of the laser radar in the heading machine and the relative position between the inertial navigation system and the laser radar. Optionally, the first origin of coordinates is an intersection point between the Z-axis in the inertial navigation coordinate system and the upper surface of the body frame. Optionally, the X, Y and Z axes of the heading machine coordinate system are parallel to the X, Y and Z axes of the inertial navigation coordinate system, respectively. Optionally, the controller acquires coordinates of the coordinate origin (the second coordinate origin and the third coordinate origin) in the coordinate system of the sensor under the coordinate system of the heading machine through three-dimensional measurement software to construct a displacement matrix, and acquires an included angle between the coordinate axes of the sensor and the coordinate axes of the heading machine based on the three-dimensional measurement software to solve the rotation matrix. In the technical scheme, when the sensor external parameters are determined, a displacement matrix is constructed under the heading machine coordinate system by the second coordinate origin in the inertial navigation coordinate system and the third coordinate origin in the radar coordinate system, and the method is favorable for fully utilizing the complementary properties of the data acquired by the inertial navigation system and the data acquired by the laser radar and realizing the tight coupling of the inertial navigation system and the laser radar.
In some embodiments, optionally, the first origin of coordinates is an intersection point between a Z-axis in the inertial navigation coordinate system and the upper surface of the body frame.
In the technical scheme, an X axis, a Y axis and a Z axis of a heading machine coordinate system are respectively parallel to the X axis, the Y axis and the Z axis of an inertial navigation coordinate system. In the design mode, the displacement matrix is conveniently constructed under the coordinate system of the heading machine by the second coordinate origin in the inertial navigation coordinate system and the third coordinate origin in the radar coordinate system in the subsequent steps.
In some embodiments, optionally, the number of the lidars is multiple, and the controller is configured to fuse point clouds acquired by the multiple lidars.
In the technical scheme, under the condition that the number of the laser radars is multiple, the controller can fuse point clouds acquired by the multiple laser radars. When the number of the laser radars is plural, the pose of the laser radars in the heading machine includes the relative positions of the plurality of laser radars in the heading machine. Optionally, the number of the lidars is three, namely a first lidar, a second lidar and a third lidar. Based on external parameters (pose of the laser radar in the heading machine) of the sensor system, projecting the point cloud acquired by the second laser radar and the point cloud acquired by the third laser radar into the point cloud of the first laser radar to obtain fusion point cloud.
In some embodiments, optionally, the noise parameter includes noise of acceleration, random walk noise of acceleration, noise of angular velocity, and random walk noise of angular velocity.
In this technical solution, by determining the noise parameters, it is convenient to determine, in a subsequent step, a linearized continuous-time model concerning the error state by means of a first-order gaussian markov process, from the noise parameters and the state variables.
In some technical schemes, optionally, the controller is configured to project the point cloud data of the kth frame to a termination time of the kth frame according to the first pose to obtain a first point cloud, divide the first point cloud into grids with a first value, and determine a mean and a covariance of the first point cloud in each grid; the controller is used for projecting the point cloud data of the k+1th frame to the starting moment of the k+1th frame according to the second pose to obtain a second point cloud, and determining an optimized objective function of the second pose according to the probability that the points of the second point cloud in each grid of the first point cloud fall in the normal distribution of the corresponding grids, wherein the first pose is the pose of the k+1th frame, and the second pose is the pose of the k+1th frame.
In this technical scheme, the kth frame and the k+1th frame are adjacent frames. The controller is used for matching the point cloud data of the adjacent frames and determining an optimized target function of the pose by using normal distribution transformation.
In some embodiments, optionally, the ending time of the kth frame coincides with the starting time of the k+1th frame.
In the technical scheme, the termination time of the kth frame and the start time of the kth+1st frame are the same time, so that the controller can match the point cloud data of the adjacent frames.
In some embodiments, optionally, the controller is configured to determine the mean of the first point cloud in each grid according to the first formula:

$$\mu_m = \frac{1}{n_m}\sum_{i=1}^{n_m} x_i^{(m)}$$

where $\mu_m$ is the mean, $n_m$ is the number of first-point-cloud points in the mth grid, and $x_i^{(m)}$ is the coordinate of the ith point in the mth grid; and/or the controller is configured to determine the covariance of the first point cloud in each grid according to the second formula:

$$\Sigma_m = \frac{1}{n_m}\sum_{i=1}^{n_m}\bigl(x_i^{(m)} - \mu_m\bigr)\bigl(x_i^{(m)} - \mu_m\bigr)^{T}$$

where $\Sigma_m$ is the covariance, $\mu_m$ is the mean, $n_m$ is the number of first-point-cloud points in the mth grid, $x_i^{(m)}$ is the coordinate of the ith point in the mth grid, and T denotes the matrix transpose.
In the technical scheme, the purpose of the step is to determine the mean value and covariance of the first point cloud in each grid, so that the optimization objective function can be conveniently determined in the subsequent steps.
In some embodiments, optionally, the optimization objective function is:

$$F\bigl(T_{k+1}\bigr) = \sum_{m=1}^{M}\sum_{j=1}^{n'_m}\exp\Bigl(-\tfrac{1}{2}\bigl(x'_j-\mu_m\bigr)^{T}\Sigma_m^{-1}\bigl(x'_j-\mu_m\bigr)\Bigr)$$

where $\Sigma_m$ is the covariance, $\mu_m$ is the mean, $x'_j$ are the points of the second point cloud falling in the mth grid of the first point cloud, $n'_m$ is the number of such points in the mth grid of the first point cloud, M is the first number, T denotes the matrix transpose, and $T_{k+1}$ denotes the second pose.
In this technical solution, the objective of this step is to determine an optimization objective function based on the mean and covariance of the first point cloud in each grid. The first pose is a known state when the second pose is optimized.
The second aspect of the present invention provides a positioning control method, which is applied to the controller of the positioning control system in any one of the above technical solutions, and the positioning control method includes: acquiring a three-dimensional image of the heading machine; determining the pose of the inertial navigation system in the heading machine, the pose of the laser radar in the heading machine and the relative position between the inertial navigation system and the laser radar according to the three-dimensional image; determining noise parameters in the inertial navigation system through an internal reference calibration tool; determining the position of the target in the roadway; acquiring point clouds of a target and a roadway, wherein the point clouds comprise multi-frame point cloud data; matching the point cloud data of the adjacent frames, and determining an optimization objective function of the pose by using normal distribution transformation; determining a state variable of iterative error state Kalman filtering according to the error state of the inertial navigation system; determining a linearized continuous-time model for the error state by a first order gaussian markov process based on the noise parameter and the state variable; determining a predictive equation for the error state and the error state covariance from the linearized continuous time model; determining an optimization model about the error state according to the error state and the optimization objective function; determining an iterative update equation of the error state according to the optimization model; and determining the pose of the heading machine in the roadway according to the iterative updating equation.
According to the technical scheme of the positioning control method, the positioning control method is applied to the controller of the positioning control system in any one of the technical scheme. The positioning control method comprises the following specific steps:
firstly, acquiring a three-dimensional image of the heading machine. The operator acquires a three-dimensional image of the surface of the heading machine using a 3D scanner and stores or transmits information related to the three-dimensional image to the controller so that the controller acquires the three-dimensional image of the heading machine.
And secondly, determining the pose of the inertial navigation system in the heading machine, the pose of the laser radar in the heading machine and the relative position between the inertial navigation system and the laser radar according to the three-dimensional image. The pose of the inertial navigation system in the heading machine, the pose of the laser radar in the heading machine and the relative position between the inertial navigation system and the laser radar are external parameters of the sensor system. In other words, the controller is configured to determine an external parameter of the sensor system from the three-dimensional image. Optionally, the controller is used for establishing a heading machine coordinate system, an inertial navigation coordinate system and a radar coordinate system according to the three-dimensional image, and determining a first coordinate origin of the heading machine coordinate system, a second coordinate origin of the inertial navigation coordinate system and a third coordinate origin of the radar coordinate system; the controller is used for determining a displacement matrix according to the coordinates of the second coordinate origin in the coordinate system of the tunneling machine and the coordinates of the third coordinate origin in the coordinate system of the tunneling machine; the controller is configured to determine angle data and to solve the rotation matrix based on the angle data to determine an external parameter of the sensor system. Optionally, the first origin of coordinates is an intersection point between the Z-axis in the inertial navigation coordinate system and the upper surface of the body frame.
And thirdly, determining the noise parameters in the inertial navigation system through the internal reference calibration tool. Optionally, the internal reference calibration tool is imu_tools. Optionally, the inertial navigation system is kept stationary for a first duration, and the internal parameter calibration tool processes the data acquired by the inertial navigation system to determine the internal parameters of the inertial navigation system. Optionally, the internal parameters of the inertial navigation system are the noise parameters of the inertial navigation system. Optionally, the first duration is not less than 2 h. Optionally, the noise parameters include the noise of acceleration, the random walk noise of acceleration, the noise of angular velocity, and the random walk noise of angular velocity.
And fourthly, determining the position of the target in the roadway. The controller is used for determining the position of the target in the roadway through the laser radar. Optionally, the staff randomly arranges targets within the roadway. Alternatively, the target is square. Optionally, the number of targets is at least one, that is, the targets may be one, two or more, and the targets are flexibly set according to the actual application scenario.
And fifthly, acquiring the point clouds of the target and the roadway, wherein the point clouds comprise multiple frames of point cloud data. Optionally, the laser radar can scan the target and the roadway so that the controller obtains a point cloud of the target and the roadway via the laser radar. A point cloud is a set of point data. Optionally, the number of laser radars is multiple, and the controller is used for fusing the point clouds acquired by the multiple laser radars. When the number of laser radars is multiple, the pose of the laser radar in the heading machine includes the relative positions of the multiple laser radars in the heading machine. Optionally, the number of laser radars is three, namely a first laser radar, a second laser radar and a third laser radar. Based on the external parameters of the sensor system (the pose of the laser radar in the heading machine), the point cloud acquired by the second laser radar and the point cloud acquired by the third laser radar are projected into the point cloud of the first laser radar to obtain the fused point cloud.
And sixthly, matching the point cloud data of adjacent frames and determining an optimization objective function of the pose using normal distribution transformation. Optionally, the kth frame and the (k+1)th frame are adjacent frames. Optionally, the controller is configured to project the point cloud data of the kth frame to the termination time of the kth frame according to the first pose to obtain a first point cloud, divide the first point cloud into a first number of grids, and determine the mean and covariance of the first point cloud in each grid; the controller is configured to project the point cloud data of the (k+1)th frame to the start time of the (k+1)th frame according to the second pose to obtain a second point cloud, and determine the optimization objective function of the second pose according to the probability that the points of the second point cloud falling in each grid of the first point cloud lie within the normal distribution of the corresponding grid. Optionally, the first pose is $T_k$, the pose of the kth frame, and the second pose is $T_{k+1}$, the pose of the (k+1)th frame. Optionally, the termination time of the kth frame is $t_k$ and the start time of the (k+1)th frame is $t_{k+1}$; optionally, the termination time of the kth frame coincides with the start time of the (k+1)th frame, i.e. the two are the same moment. Optionally, the first number is M. Optionally, the first point cloud $P_k$ is divided into M grids, and the mean and covariance of the first point cloud in each grid are obtained. Optionally, the controller is configured to determine the mean of the first point cloud in each grid according to the first formula:

$$\mu_m = \frac{1}{n_m}\sum_{i=1}^{n_m} x_i^{(m)}$$

where $\mu_m$ is the mean, $n_m$ is the number of first-point-cloud points in the mth grid, and $x_i^{(m)}$ is the coordinate of the ith point in the mth grid. Optionally, the controller is configured to determine the covariance of the first point cloud in each grid according to the second formula:

$$\Sigma_m = \frac{1}{n_m}\sum_{i=1}^{n_m}\bigl(x_i^{(m)} - \mu_m\bigr)\bigl(x_i^{(m)} - \mu_m\bigr)^{T}$$

where $\Sigma_m$ is the covariance, $\mu_m$ is the mean, $n_m$ is the number of first-point-cloud points in the mth grid, $x_i^{(m)}$ is the coordinate of the ith point in the mth grid, and T denotes the matrix transpose (an operation on a matrix). Optionally, the second point cloud is $P_{k+1}$. The probability that a point of the second point cloud $P_{k+1}$ falling in the mth grid of the first point cloud $P_k$ lies within the normal distribution $N(\mu_m, \Sigma_m)$ of the corresponding grid is used. Optionally, the optimization objective function is given by the third formula:

$$F\bigl(T_{k+1}\bigr) = \sum_{m=1}^{M}\sum_{j=1}^{n'_m}\exp\Bigl(-\tfrac{1}{2}\bigl(x'_j-\mu_m\bigr)^{T}\Sigma_m^{-1}\bigl(x'_j-\mu_m\bigr)\Bigr)$$

where $n'_m$ is the number of points of the second point cloud that fall in the mth grid of the first point cloud, $x'_j$ is such a point projected with the second pose, and $T_{k+1}$ denotes the second pose. When the second pose is optimized, the first pose is a known quantity. Optionally, m is not greater than M.
And seventh, determining the state variable of the iterative error-state Kalman filter according to the error state of the inertial navigation system. Optionally, according to the error state of the inertial navigation system, the state variable of the iterative error-state Kalman filter is determined by the fourth formula:

$$\delta x = \bigl[\delta p^{T},\ \delta v^{T},\ \delta q^{T},\ \delta b_a^{T},\ \delta b_\omega^{T},\ \delta g^{T}\bigr]^{T}$$

where $\delta$ denotes an error term; $\delta x$ is the error state, and $\delta p$, $\delta v$, $\delta q$, $\delta b_a$, $\delta b_\omega$ and $\delta g$ denote the displacement, velocity, rotation, acceleration-bias, angular-velocity-bias and local-gravity errors, respectively; the rotation error $\delta q$ is represented by a quaternion.
Eighth, a linearized continuous-time model of the error state is determined through a first-order Gauss-Markov process according to the noise parameters and the state variable. The first-order Gauss-Markov process is a time-series model built from a linear combination of the previous observation and the current noise term. Optionally, the first-order Gauss-Markov process comprises the fifth formula:

$$\delta\dot{x}_t = F_t\,\delta x_t + G_t\,n_t$$

where $n_t$ is the Gaussian noise vector corresponding to the internal parameters of the inertial navigation system, $F_t$ is the error-state transition matrix at time t and satisfies the sixth formula, and $G_t$ is the noise Jacobian at time t and satisfies the seventh formula. The sixth and seventh formulas are built from the operator $[\,\cdot\,]_\times \in R^{3\times 3}$, which maps a three-dimensional vector to its skew-symmetric matrix, the identity matrix $I_3 \in R^{3\times 3}$, the rotation matrix $R_t$ at time t, and the acceleration $a_t$ and angular velocity $\omega_t$ at time t.
And a ninth step of determining the prediction equations for the error state and the error-state covariance based on the linearized continuous-time model. Optionally, the predicted error state is $\delta\hat{x}$ and satisfies the eighth formula; the error-state covariance is $\hat{P}$ and satisfies the ninth formula. Optionally, according to the error state $\delta\hat{x}$ and the error-state covariance $\hat{P}$, the state prediction is obtained by continuous pre-integration, where $\tau_i$ and $\tau_{i+1}$ are consecutive IMU time steps and $Q$ denotes the covariance matrix of the noise vector $n$. IMU means the inertial sensor that is part of the inertial navigation system, i.e. the inertial navigation system comprises an inertial sensor.
And tenth, determining an optimization model of the error state according to the error state and the optimization objective function. Optionally, the final optimization objective function F of the optimization model satisfies the tenth formula, in which $\|\cdot\|$ denotes the Mahalanobis norm, the optimization objective function obtained from the point cloud matching serves as the measurement, the subscript k+1 denotes the change from time $t_k$ to time $t_{k+1}$, and the Mahalanobis norm is weighted by the covariance matrix of the measurement noise. Optionally, the final state $x_{k+1}$ satisfies the eleventh formula, where $x_{k+1}$ denotes the final state from time $t_k$ to time $t_{k+1}$, exp(·) denotes the conversion of Euler angles into a rotation quaternion, and the eleventh formula represents the superposition of the state obtained by inertial navigation from the previous moment to the next moment (obtained by integrating the angular velocity and acceleration measured by the inertial navigation system) with the error state.
And eleventh, determining the iterative update equation of the error state according to the optimization model. Optionally, the iterative update equation comprises the twelfth formula; optionally, the iterative update equation further comprises the thirteenth formula; optionally, the iterative update equation further comprises the fourteenth formula. Optionally, when the correction vector $\Delta x^{(j)}$ of the jth iteration falls below a certain threshold, the error-state covariance is updated; the update equation of the error-state covariance is the fifteenth formula, and the final state is then obtained through the eleventh formula. In these formulas, $\Delta x^{(j)}$ denotes the correction vector at the jth iteration and $H$ is the Jacobian of the measurement with respect to the error state.
And twelfth, determining the pose of the heading machine in the roadway according to the iterative update equation. Optionally, the next prediction is initialized with the final state, as shown in the sixteenth formula, and the final state is converted into the global state in the roadway coordinate system, as shown in the seventeenth and eighteenth formulas. The controller matches the point clouds of adjacent frames using normal distribution transformation to obtain the optimization objective function of the pose, which lays the foundation for the subsequent error-state estimation; combining the optimization objective function of the pose obtained from the point cloud matching, the pose of the heading machine is obtained in real time based on iterative error-state Kalman filtering.
According to the technical scheme, the positioning control method deeply fuses the data acquired by the inertial navigation system with the data acquired by the laser radar through the iterative error-state Kalman filtering algorithm, fully exploits the complementary nature of the two data sources, realizes the tight coupling of the inertial navigation system and the laser radar, reduces the accumulated integration error of the inertial navigation system under long-time operation, and improves the accuracy of pose estimation and the positioning precision of the heading machine in the roadway.
A third aspect of the present invention provides an electronic device, including a processor, a memory, and a program or instructions stored in the memory and executable on the processor, the program or instructions implementing the steps of the positioning control method in the above technical solution when executed by the processor.
A fourth aspect of the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the positioning control method in the above-described technical solution.
The fifth aspect of the present invention provides a chip, the chip including a processor and a communication interface, the communication interface being coupled to the processor, the processor being configured to execute a program or instructions to implement the steps of the positioning control method in the above technical solution.
Additional aspects and advantages of the present invention will be made apparent from the description which follows, or may be learned by practice of the invention.
Drawings
FIG. 1 illustrates a schematic diagram of the connections between modules in a positioning control system according to one embodiment of the invention;
FIG. 2 shows a schematic diagram of a heading machine traveling within a roadway in accordance with an embodiment of the invention;
fig. 3 shows a flow chart of a positioning control method according to an embodiment of the invention.
The correspondence between the reference numerals and the component names in fig. 1 and 2 is:
100: a positioning control system; 110: a target; 120: an inertial navigation system; 130: a laser radar; 140: a controller; 210: a heading machine; 211: a body frame; 220: and (5) a roadway.
Detailed Description
In order that the above-recited objects, features and advantages of embodiments of the present invention can be more clearly understood, a further detailed description of embodiments of the present invention will be rendered by reference to the appended drawings and detailed description thereof. It should be noted that, in the case of no conflict, the embodiments of the present application and the features in the embodiments may be combined with each other.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application, however, embodiments of the invention may be practiced otherwise than as described herein, and therefore the scope of the present application is not limited to the specific embodiments disclosed below.
Positioning control systems 100, methods, electronic devices, storage media, and chips provided according to some embodiments of the present invention are described below with reference to fig. 1-3.
As shown in fig. 2, the positioning control system 100 is applied to a heading machine 210. Heading machine 210 is used to travel within roadway 220 to cut the working surface of roadway 220. Heading machine 210 includes a body frame 211. The body mount 211 primarily serves as a mounting carrier for some devices or systems in the positioning control system 100, such as the inertial navigation system 120, the lidar 130, and the controller 140. The positioning control system 100 is used to determine the pose of the heading machine 210 within the roadway 220. It should be noted that, the pose referred to in the present invention refers to a position and a pose.
In one embodiment according to the present invention, as shown in fig. 1 and 2, the positioning control system 100 includes a target 110, an inertial navigation system 120, a lidar 130, and a controller 140. Wherein the target 110 is disposed in the roadway 220. Alternatively, the staff member randomly positions the targets 110 at any position of the roadway 220. The target 110 is used to assist the lidar 130 in scanning and acquiring point cloud data. Optionally, the number of targets 110 is at least one, i.e. the targets 110 may be one, two or more, and the targets 110 are flexibly set according to the actual application scenario. Further, the inertial navigation system 120 is disposed on the body frame 211. Inertial navigation system 120 (INS, inertial navigation for short) is an autonomous navigation system that does not depend on external information nor radiates energy to the outside. Optionally, the inertial navigation system 120 is disposed on an upper surface of the body frame 211. Further, the lidar 130 is provided to the body frame 211. Optionally, lidar 130 and inertial navigation system 120 comprise a sensor system of positioning control system 100. Optionally, the number of the lidars 130 is at least one, that is, the lidars 130 may be one, two or more, and the lidars 130 are flexibly set according to the actual application scenario. When the number of the lidars 130 is two or more, the two or more lidars 130 are installed at different heights of the body frame 211, respectively. Further, the controller 140 is provided on the body frame 211. The controller 140 is electrically connected with the inertial navigation system 120; the controller 140 is electrically connected to the lidar 130.
Further, the controller 140 is configured to acquire a three-dimensional image of the heading machine 210. The worker acquires a three-dimensional image of the surface of the heading machine 210 using a 3D scanner, and stores or transmits information about the three-dimensional image to the controller 140 so that the controller 140 acquires the three-dimensional image of the heading machine 210.
Further, the controller 140 is configured to determine a pose of the inertial navigation system 120 in the heading machine 210, a pose of the lidar 130 in the heading machine 210, and a relative position between the inertial navigation system 120 and the lidar 130 based on the three-dimensional image. The pose of inertial navigation system 120 in heading machine 210, the pose of lidar 130 in heading machine 210, and the relative position between inertial navigation system 120 and lidar 130 are external parameters of the sensor system. In other words, the controller 140 is configured to determine the external parameters of the sensor system from the three-dimensional image. Optionally, the controller 140 is configured to establish a heading machine coordinate system, an inertial navigation coordinate system, and a radar coordinate system according to the three-dimensional image, and determine a first origin of coordinates of the heading machine coordinate system, a second origin of coordinates of the inertial navigation coordinate system, and a third origin of coordinates of the radar coordinate system; the controller 140 is configured to determine a displacement matrix according to the coordinates of the second origin of coordinates in the coordinate system of the heading machine and the coordinates of the third origin of coordinates in the coordinate system of the heading machine; the controller 140 is configured to determine angle data and solve the rotation matrix based on the angle data to determine external parameters of the sensor system. Optionally, the first origin of coordinates is the intersection point between the Z-axis in the inertial navigation coordinate system and the upper surface of the body frame 211.
Further, the controller 140 is configured to determine noise parameters in the inertial navigation system 120 via an internal parameter calibration tool. Optionally, the internal parameter calibration tool is the imu_utils tool. Optionally, the inertial navigation system 120 is kept stationary for a first duration, and the data collected by the inertial navigation system 120 is processed with the internal parameter calibration tool to determine the internal parameters of the inertial navigation system 120. Optionally, an internal parameter of the inertial navigation system 120 is a noise parameter of the inertial navigation system 120. Optionally, the first duration is not less than 2 h. Optionally, the noise parameters include the noise of the acceleration, the random walk noise of the acceleration, the noise of the angular velocity, and the random walk noise of the angular velocity.
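The following is a simplified numpy sketch of how these noise parameters might be estimated from a stationary recording. An internal parameter calibration tool such as imu_utils normally derives them from an Allan-variance analysis, so the windowing and scaling below are rough assumptions for illustration only, as are the array and function names.

```python
import numpy as np

def static_noise_parameters(accel, gyro, fs):
    """Rough noise estimates from a stationary IMU recording.

    accel, gyro: (N, 3) accelerometer [m/s^2] and gyroscope [rad/s] samples
    recorded while the inertial navigation system is at rest.
    fs: sampling frequency in Hz.
    Returns white-noise densities and crude random-walk estimates; a proper
    calibration tool derives these from the Allan deviation curve instead.
    """
    dt = 1.0 / fs
    # White-noise density: sample standard deviation scaled by sqrt(dt).
    accel_noise = accel.std(axis=0) * np.sqrt(dt)
    gyro_noise = gyro.std(axis=0) * np.sqrt(dt)
    # Crude bias random walk: drift of windowed means over the recording.
    win = int(100 * fs)
    accel_means = np.array([a.mean(axis=0)
                            for a in np.array_split(accel, max(len(accel) // win, 2))])
    gyro_means = np.array([g.mean(axis=0)
                           for g in np.array_split(gyro, max(len(gyro) // win, 2))])
    accel_walk = np.diff(accel_means, axis=0).std(axis=0) / np.sqrt(win * dt)
    gyro_walk = np.diff(gyro_means, axis=0).std(axis=0) / np.sqrt(win * dt)
    return accel_noise, accel_walk, gyro_noise, gyro_walk
```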
Further, the controller 140 is configured to determine the position of the target 110 within the roadway 220 via the lidar 130. Optionally, the staff members randomly arrange targets 110 within the roadway 220. Alternatively, the target 110 is square. Optionally, the number of targets 110 is at least one, i.e. the targets 110 may be one, two or more, and the targets 110 are flexibly set according to the actual application scenario.
Further, the controller 140 is configured to obtain, via the lidar 130, a point cloud of the target 110 and the roadway 220, where the point cloud includes multiple frames of point cloud data. Optionally, the lidar 130 scans the target 110 and the roadway 220 so that the controller 140 obtains the point cloud of the target 110 and the roadway 220 via the lidar 130. A point cloud is a set of point data. Optionally, the number of lidars 130 is plural, and the controller 140 is configured to fuse the point clouds acquired by the plurality of lidars 130. When the number of lidars 130 is plural, the pose of the lidar 130 in the heading machine 210 includes the relative positions of the plurality of lidars 130 in the heading machine 210. Optionally, the number of lidars 130 is three, namely a first lidar, a second lidar and a third lidar. Based on the external parameters of the sensor system (the pose of the lidar 130 in the heading machine 210), the point cloud acquired by the second lidar and the point cloud acquired by the third lidar are projected into the point cloud of the first lidar to obtain a fused point cloud.
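A minimal sketch, assuming 4x4 lidar extrinsics of the kind produced in the calibration step above, of projecting the second and third lidars' point clouds into the first lidar's frame to obtain the fused point cloud; the array shapes and function names are illustrative assumptions.

```python
import numpy as np

def transform_points(T, points):
    """Apply a 4x4 rigid transform T to an (N, 3) point array."""
    homog = np.hstack([points, np.ones((points.shape[0], 1))])
    return (homog @ T.T)[:, :3]

def fuse_point_clouds(clouds, extrinsics):
    """Fuse point clouds from several lidars into the first lidar's frame.

    clouds: list of (N_i, 3) arrays, one per lidar.
    extrinsics: list of 4x4 transforms T_machine_lidar_i from the calibration,
    so inv(extrinsics[0]) @ extrinsics[i] projects cloud i into lidar 1.
    """
    T0_inv = np.linalg.inv(extrinsics[0])
    fused = [clouds[0]]
    for cloud, T in zip(clouds[1:], extrinsics[1:]):
        fused.append(transform_points(T0_inv @ T, cloud))
    return np.vstack(fused)
```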
Further, the controller 140 is configured to match the point cloud data of adjacent frames and determine an optimization objective function of the pose using a normal distribution transformation. Optionally, the kth frame and the (k+1)th frame are adjacent frames. Optionally, the controller 140 is configured to project the point cloud data of the kth frame to the termination time of the kth frame according to the first pose to obtain a first point cloud, divide the first point cloud into a first value of grids, and determine the mean and covariance of the first point cloud in each grid; the controller 140 is configured to project the point cloud data of the (k+1)th frame to the start time of the (k+1)th frame according to the second pose to obtain a second point cloud, and determine the optimization objective function of the second pose according to the probability that the points of the second point cloud falling in each grid of the first point cloud lie in the normal distribution of the corresponding grid.

Optionally, the first pose is $T_k$, which is the pose of the kth frame. Optionally, the second pose is $T_{k+1}$, which is the pose of the (k+1)th frame. Optionally, the termination time of the kth frame is $t_k^{end}$. Optionally, the start time of the (k+1)th frame is $t_{k+1}^{start}$. Optionally, the termination time of the kth frame coincides with the start time of the (k+1)th frame, i.e. the two times are the same time. Optionally, the first value is $M$. Optionally, the first point cloud is $P_k$. Optionally, the first point cloud $P_k$ is divided into $M$ grids, and the mean and covariance of the first point cloud in each grid are obtained. Optionally, the controller 140 is configured to determine the mean of the first point cloud in each grid according to a first formula:

$$\mu_m=\frac{1}{n_m}\sum_{i=1}^{n_m}p_i^m$$

wherein $\mu_m$ is the mean, $n_m$ is the number of points of the first point cloud in the mth grid, and $p_i^m$ is the coordinate of the ith point in the mth grid. Optionally, the controller 140 is configured to determine the covariance of the first point cloud in each grid by a second formula:

$$\Sigma_m=\frac{1}{n_m}\sum_{i=1}^{n_m}\left(p_i^m-\mu_m\right)\left(p_i^m-\mu_m\right)^T$$

wherein $\Sigma_m$ is the covariance, $\mu_m$ is the mean, $n_m$ is the number of points of the first point cloud in the mth grid, and $p_i^m$ is the coordinate of the ith point in the mth grid; $T$ denotes the matrix transpose, an operation on a matrix. Optionally, the second point cloud is $P_{k+1}$. The probability that a point $q_j^m$ of the second point cloud $P_{k+1}$ falling in the mth grid of the first point cloud $P_k$ lies in the normal distribution $N(\mu_m,\Sigma_m)$ of the corresponding grid is given by that distribution. Optionally, the optimization objective function is shown in the third formula. The third formula is:

$$F\!\left(T_{k+1}\right)=\sum_{m=1}^{M}\sum_{j=1}^{n'_m}\left(T_{k+1}q_j^m-\mu_m\right)^T\Sigma_m^{-1}\left(T_{k+1}q_j^m-\mu_m\right)$$

wherein $n'_m$ is the number of points of the second point cloud falling in the mth grid of the first point cloud, $T$ denotes the matrix transpose, and $T_{k+1}$ denotes the second pose. The first pose is a known state when the second pose is optimized. Optionally, $m$ is not greater than $M$.
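The following Python sketch illustrates the normal-distributions-transform pieces reconstructed above: the per-grid mean (first formula), the per-grid covariance (second formula), and the quadratic objective evaluated for the second point cloud under a candidate pose (third formula). Indexing the grid by a cell size rather than a fixed count M, and the rotation-plus-translation pose parameterization, are implementation assumptions of this sketch.

```python
import numpy as np

def build_ndt_grid(first_cloud, cell_size=1.0):
    """Mean and inverse covariance of the first point cloud per occupied cell."""
    cells = {}
    keys = np.floor(first_cloud / cell_size).astype(int)
    for key, p in zip(map(tuple, keys), first_cloud):
        cells.setdefault(key, []).append(p)
    grid = {}
    for key, pts in cells.items():
        pts = np.asarray(pts)
        if len(pts) < 3:
            continue  # need a few points for a meaningful covariance
        mu = pts.mean(axis=0)                      # first formula
        diff = pts - mu
        sigma = diff.T @ diff / len(pts)           # second formula
        grid[key] = (mu, np.linalg.inv(sigma + 1e-6 * np.eye(3)))
    return grid

def ndt_objective(grid, second_cloud, R, t, cell_size=1.0):
    """Third formula: sum of Mahalanobis distances of the transformed points."""
    transformed = second_cloud @ R.T + t
    cost = 0.0
    for q in transformed:
        key = tuple(np.floor(q / cell_size).astype(int))
        if key in grid:
            mu, sigma_inv = grid[key]
            d = q - mu
            cost += d @ sigma_inv @ d
    return cost
```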
Further, the controller 140 is configured to determine the state variable of the iterative error state Kalman filter according to the error state of the inertial navigation system 120. Optionally, the state variable of the iterative error state Kalman filter is determined by a fourth formula according to the error state of the inertial navigation system 120. The fourth formula is:

$$\delta x=\left[\,\delta p^{T}\;\;\delta v^{T}\;\;\delta\theta^{T}\;\;\delta b_{a}^{T}\;\;\delta b_{\omega}^{T}\;\;\delta g^{T}\,\right]^{T}$$

wherein $\delta$ denotes an error term; $x$, $p$, $v$, $q$, $b_a$, $b_\omega$ and $g$ denote the state, displacement, velocity, rotation, acceleration bias, angular-velocity bias and local gravity, respectively; $\delta\theta\in\mathbb{R}^{3}$ is the rotation error, and $q$ denotes a quaternion.
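As a small illustration, the 18-dimensional error state of the fourth formula can be laid out as index slices; the names below are assumptions of this sketch, not identifiers from the disclosure.

```python
import numpy as np

# Index slices into the 18-dimensional error state of the fourth formula:
# [delta p, delta v, delta theta, delta b_a, delta b_w, delta g]
DP, DV, DTHETA, DBA, DBW, DG = (slice(3 * i, 3 * i + 3) for i in range(6))

delta_x = np.zeros(18)     # error state
P = np.eye(18) * 1e-4      # error-state covariance
```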
Further, the controller 140 is configured to determine a linearized continuous-time model for the error state via a first-order Gauss-Markov process based on the noise parameters and the state variable. The first-order Gauss-Markov process is a time-series model built from a linear combination of the previous observation and the current noise term. Optionally, the first-order Gauss-Markov process comprises a fifth formula. The fifth formula is:

$$\delta\dot{x}_{t}=F_{t}\,\delta x_{t}+G_{t}\,w_{t}$$

wherein $w_{t}$ is the Gaussian noise vector corresponding to the internal parameters of the inertial navigation system 120, and $F_{t}$ is the error state transition matrix at time $t$ and satisfies the sixth formula. Optionally, the sixth formula is:

$$F_{t}=\begin{bmatrix}0 & I_{3} & 0 & 0 & 0 & 0\\ 0 & 0 & -R_{t}\left[\hat{a}_{t}\right]_{\times} & -R_{t} & 0 & I_{3}\\ 0 & 0 & -\left[\hat{\omega}_{t}\right]_{\times} & 0 & -I_{3} & 0\\ 0 & 0 & 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 & 0 & 0\end{bmatrix}$$

Optionally, $G_{t}$ is the noise Jacobian at time $t$ and satisfies the seventh formula. Optionally, the seventh formula is:

$$G_{t}=\begin{bmatrix}0 & 0 & 0 & 0\\ -R_{t} & 0 & 0 & 0\\ 0 & -I_{3} & 0 & 0\\ 0 & 0 & I_{3} & 0\\ 0 & 0 & 0 & I_{3}\\ 0 & 0 & 0 & 0\end{bmatrix}$$

wherein $[\,\cdot\,]_{\times}\in\mathbb{R}^{3\times 3}$ denotes the transformation of a three-dimensional vector into its skew-symmetric matrix, $I_{3}\in\mathbb{R}^{3\times 3}$ is the identity matrix, $R_{t}$ is the rotation matrix at time $t$, and $\hat{a}_{t}$ and $\hat{\omega}_{t}$ are the acceleration and angular velocity at time $t$, respectively.
Further, the controller 140 is configured to determine prediction equations for the error state and the error state covariance based on the linearized continuous-time model. Optionally, the error state is $\delta\hat{x}$ and satisfies the eighth formula. The eighth formula is:

$$\delta\hat{x}_{t+\Delta t}=\left(I+F_{t}\,\Delta t\right)\delta\hat{x}_{t}$$

Optionally, the error state covariance is $P$ and satisfies the ninth formula. The ninth formula is:

$$P_{t+\Delta t}=\left(I+F_{t}\,\Delta t\right)P_{t}\left(I+F_{t}\,\Delta t\right)^{T}+\left(G_{t}\,\Delta t\right)Q\left(G_{t}\,\Delta t\right)^{T}$$

Optionally, according to the error state $\delta\hat{x}$ and the error state covariance $P$, the state prediction $\hat{x}_{k+1}$ is obtained by continuous pre-integration, wherein $\Delta t$ is a continuous IMU time step between $t$ and $t+\Delta t$, and $Q$ denotes the covariance matrix of $w_{t}$. IMU denotes an inertial sensor that is part of the inertial navigation system 120, i.e., the inertial navigation system 120 includes an inertial sensor.
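A hedged sketch of the propagation step: it builds F_t and G_t in the standard error-state form used in the reconstruction of the sixth and seventh formulas, then applies the eighth and ninth formulas. The state ordering follows the layout sketched earlier, and the 12x12 noise covariance Q (accelerometer noise, gyroscope noise and the two bias random walks) is an assumption of this sketch.

```python
import numpy as np

def skew(v):
    """[v]_x: map a 3-vector to its skew-symmetric matrix."""
    return np.array([[0, -v[2], v[1]],
                     [v[2], 0, -v[0]],
                     [-v[1], v[0], 0]])

def error_state_jacobians(R_t, acc, gyr):
    """F_t (sixth formula) and G_t (seventh formula) for the state order
    [dp, dv, dtheta, db_a, db_w, dg]."""
    I3 = np.eye(3)
    F = np.zeros((18, 18))
    F[0:3, 3:6] = I3                      # position error driven by velocity error
    F[3:6, 6:9] = -R_t @ skew(acc)        # velocity error vs. rotation error
    F[3:6, 9:12] = -R_t                   # velocity error vs. accel bias
    F[3:6, 15:18] = I3                    # velocity error vs. gravity error
    F[6:9, 6:9] = -skew(gyr)              # rotation error dynamics
    F[6:9, 12:15] = -I3                   # rotation error vs. gyro bias
    G = np.zeros((18, 12))
    G[3:6, 0:3] = -R_t                    # accelerometer white noise
    G[6:9, 3:6] = -np.eye(3)              # gyroscope white noise
    G[9:12, 6:9] = np.eye(3)              # accel bias random walk
    G[12:15, 9:12] = np.eye(3)            # gyro bias random walk
    return F, G

def predict(delta_x, P, R_t, acc, gyr, Q, dt):
    """Eighth and ninth formulas: propagate error state and covariance."""
    F, G = error_state_jacobians(R_t, acc, gyr)
    Phi = np.eye(18) + F * dt
    delta_x = Phi @ delta_x
    P = Phi @ P @ Phi.T + (G * dt) @ Q @ (G * dt).T
    return delta_x, P
```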
Further, the controller 140 is configured to determine an optimization model for the error state based on the error state and the optimization objective function. Optionally, the final optimization objective of the optimization model satisfies the tenth formula. Optionally, the tenth formula is:

$$\min_{\delta x_{k+1}}\;\left\|\delta x_{k+1}\right\|_{\hat{P}_{k+1}}^{2}+\left\|f\!\left(\delta x_{k+1}\right)\right\|_{R_{k+1}}^{2}$$

wherein $\left\|\cdot\right\|_{\Sigma}$ denotes the Mahalanobis norm, $f$ is the optimization objective function of the pose obtained from the point cloud matching and is used to measure the measurement noise, the subscript $k+1$ denotes the change from time $t_{k}$ to time $t_{k+1}$, and $R_{k+1}$ is the covariance matrix of the measurement noise. Optionally, the final state satisfies the eleventh formula. Optionally, the eleventh formula is:

$$x_{k+1}=\hat{x}_{k+1}\oplus\delta x_{k+1},\qquad q_{k+1}=\hat{q}_{k+1}\otimes\exp\!\left(\delta\theta_{k+1}\right)$$

wherein $x_{k+1}$ denotes the final state from time $t_{k}$ to time $t_{k+1}$, $\exp(\cdot)$ denotes the mapping of the Euler-angle rotation error to a quaternion, and $\hat{x}_{k+1}$ denotes the state obtained by the inertial navigation from the previous moment to the next moment (obtained by integrating the angular velocity and acceleration measured by the inertial navigation), onto which the error state is superimposed.
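A minimal sketch, under the notation assumed above, of applying the eleventh formula: the error state is superimposed on the predicted state, with the rotation error mapped to a quaternion via exp(·) and composed with the predicted quaternion. The function names are illustrative.

```python
import numpy as np

def quat_from_rotvec(theta):
    """exp(): map a small rotation vector to a unit quaternion (w, x, y, z)."""
    angle = np.linalg.norm(theta)
    if angle < 1e-12:
        return np.array([1.0, 0.0, 0.0, 0.0])
    axis = theta / angle
    return np.concatenate([[np.cos(angle / 2)], np.sin(angle / 2) * axis])

def quat_mul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def apply_error_state(p, v, q, ba, bw, g, dx):
    """Eleventh formula: superimpose the error state on the predicted state."""
    return (p + dx[0:3], v + dx[3:6],
            quat_mul(q, quat_from_rotvec(dx[6:9])),
            ba + dx[9:12], bw + dx[12:15], g + dx[15:18])
```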
Further, the controller 140 is configured to determine an iterative update equation for the error state based on the optimization model. Optionally, the iterative update equations include a twelfth formula. The twelfth formula is:

$$K_{j}=\hat{P}_{k+1}H_{j}^{T}\left(H_{j}\hat{P}_{k+1}H_{j}^{T}+R_{k+1}\right)^{-1}$$

Optionally, the iterative update equations further include a thirteenth formula. The thirteenth formula is:

$$\Delta^{j}=-K_{j}\,f\!\left(\delta x_{k+1}^{j}\right)-\left(I-K_{j}H_{j}\right)\delta x_{k+1}^{j}$$

Optionally, the iterative update equations further include a fourteenth formula. The fourteenth formula is:

$$\delta x_{k+1}^{j+1}=\delta x_{k+1}^{j}+\Delta^{j}$$

Optionally, when $\Delta^{j}$ is lower than a certain threshold, the error state covariance is updated; the update equation of the error state covariance is the fifteenth formula, and the final state is obtained through the eleventh formula. Optionally, the fifteenth formula is:

$$P_{k+1}=\left(I-K_{j}H_{j}\right)\hat{P}_{k+1}$$

wherein $\Delta^{j}$ denotes the correction vector at the $j$th iteration, $K_{j}$ is the Kalman gain at the $j$th iteration, and $H_{j}$ is the Jacobian of $f$ with respect to $\delta x$.
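The sketch below follows the standard iterated error-state Kalman update that the twelfth to fifteenth formulas were reconstructed as; the residual and Jacobian of the NDT objective are passed in as callables, and the convergence threshold is an assumption. It is an illustration of the technique, not a transcription of the patent's exact equations.

```python
import numpy as np

def iterated_update(delta_x, P, residual_fn, jacobian_fn, R_meas,
                    max_iters=10, tol=1e-4):
    """Iterated error-state Kalman update.

    residual_fn(dx) -> residual vector f(dx) from the NDT objective
    jacobian_fn(dx) -> H_j, Jacobian of f with respect to the error state
    R_meas          -> measurement noise covariance R_{k+1}
    """
    dx = delta_x.copy()
    K = H = None
    for _ in range(max_iters):
        H = jacobian_fn(dx)
        S = H @ P @ H.T + R_meas
        K = P @ H.T @ np.linalg.inv(S)                                 # twelfth formula
        delta = -K @ residual_fn(dx) - (np.eye(len(dx)) - K @ H) @ dx  # thirteenth formula
        dx = dx + delta                                                # fourteenth formula
        if np.linalg.norm(delta) < tol:                                # convergence test
            break
    P = (np.eye(len(dx)) - K @ H) @ P                                  # fifteenth formula
    return dx, P
```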
Further, the controller 140 is configured to determine the pose of the heading machine 210 within the roadway 220 according to the iterative update equations. Optionally, the next prediction is initialized by the final state, as shown in the sixteenth formula, and the final state is converted into a global state in the roadway coordinate system, as shown in the seventeenth formula and the eighteenth formula. Optionally, the sixteenth formula is:

$$\hat{x}_{k+2\mid k+1}=x_{k+1}$$

Optionally, the seventeenth formula is:

$$p_{k+1}^{G}=R_{k}^{G}\,p_{k+1}+p_{k}^{G}$$

Optionally, the eighteenth formula is:

$$R_{k+1}^{G}=R_{k}^{G}\,R_{k+1}$$

wherein the superscript $G$ denotes the roadway (global) coordinate system. The controller 140 matches the point clouds of adjacent frames using the normal distribution transformation to obtain the optimization objective function of the pose, which lays the foundation for the subsequent estimation of the error state; combining the optimization objective function of the pose obtained by the point cloud matching, the pose of the heading machine 210 is obtained in real time based on the iterative error state Kalman filtering.
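A short sketch of the seventeenth and eighteenth formulas as reconstructed: the newly estimated relative pose is composed with the previous global pose to express the heading machine pose in the roadway coordinate system. The initial global pose would come from elsewhere and is not shown.

```python
import numpy as np

def to_roadway_frame(R_global_prev, p_global_prev, R_rel, p_rel):
    """Compose the newly estimated relative pose into the roadway frame."""
    R_global = R_global_prev @ R_rel                      # eighteenth formula
    p_global = R_global_prev @ p_rel + p_global_prev      # seventeenth formula
    return R_global, p_global
```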
In the technical solution defined by the present invention, in a first aspect, the positioning control system 100 deeply fuses the data acquired by the inertial navigation system 120 with the data acquired by the lidar 130 through the iterative error state Kalman filtering algorithm, fully utilizing the complementary properties of the two data sources and realizing tight coupling between the inertial navigation system 120 and the lidar 130; this helps reduce the accumulated integration error of the inertial navigation system 120 during long-term operation and improves the accuracy of pose estimation and the positioning precision of the pose of the heading machine 210 in the roadway 220. In a second aspect, the randomly placeable targets 110 help increase the number of representative feature points in the point cloud.
In some embodiments, optionally, the controller 140 is configured to establish a heading machine coordinate system, an inertial navigation coordinate system, and a radar coordinate system from the three-dimensional image, and determine a first origin of coordinates of the heading machine coordinate system, a second origin of coordinates of the inertial navigation coordinate system, and a third origin of coordinates of the radar coordinate system, the controller 140 is configured to determine a displacement matrix from coordinates of the second origin of coordinates in the heading machine coordinate system and coordinates of the third origin of coordinates in the heading machine coordinate system, and the controller 140 is configured to determine angle data and solve the rotation matrix from the angle data to determine a pose of the inertial navigation system 120 in the heading machine 210, a pose of the lidar 130 in the heading machine 210, and a relative position between the inertial navigation system 120 and the lidar 130.
The controller 140 is configured to determine the external parameters of the sensor system from the three-dimensional image. The external parameters of the sensor system include the pose of the inertial navigation system 120 in the heading machine 210, the pose of the lidar 130 in the heading machine 210, and the relative position between the inertial navigation system 120 and the lidar 130. Optionally, the first origin of coordinates is the intersection point between the Z-axis in the inertial navigation coordinate system and the upper surface of the body frame 211. Optionally, the X, Y and Z axes of the heading machine coordinate system are parallel to the X, Y and Z axes of the inertial navigation coordinate system, respectively. Optionally, the controller 140 obtains, via three-dimensional measurement software, the coordinates of the sensor coordinate origins (the second origin of coordinates and the third origin of coordinates) in the heading machine coordinate system to construct the displacement matrix, and obtains, via the three-dimensional measurement software, the angles between the sensor coordinate axes and the coordinate axes of the heading machine 210 to solve the rotation matrix. In the technical solution defined by the present invention, when the external parameters of the sensors are determined, the displacement matrix is constructed in the heading machine coordinate system from the second coordinate origin of the inertial navigation coordinate system and the third coordinate origin of the radar coordinate system; this approach helps to fully utilize the complementary properties of the data acquired by the inertial navigation system 120 and the data acquired by the lidar 130 and to realize the tight coupling of the inertial navigation system 120 and the lidar 130.
In some embodiments, optionally, the first origin of coordinates is the intersection point between the Z-axis in the inertial navigation coordinate system and the upper surface of the body frame 211. The X-axis, Y-axis and Z-axis of the heading machine coordinate system are parallel to the X-axis, Y-axis and Z-axis of the inertial navigation coordinate system, respectively. This design makes it convenient, in subsequent steps, to construct the displacement matrix in the heading machine coordinate system from the second coordinate origin of the inertial navigation coordinate system and the third coordinate origin of the radar coordinate system.
In some embodiments, optionally, where the number of lidars 130 is multiple, the controller 140 can fuse the point clouds acquired by the multiple lidars 130. When the number of lidars 130 is plural, the pose of lidar 130 in heading machine 210 includes the relative positions of the plurality of lidars 130 in heading machine 210. Alternatively, the number of lidars 130 is three, namely a first lidar, a second lidar and a third lidar. Based on the external parameters of the sensor system (pose of the lidar 130 in the heading machine 210), the point cloud acquired by the second lidar and the point cloud acquired by the third lidar are projected into the point cloud of the first lidar to obtain a fused point cloud.
In some embodiments, optionally, the noise parameters include noise of acceleration, random walk noise of acceleration, noise of angular velocity, and random walk noise of angular velocity. By determining the noise parameters, it is convenient to determine in a subsequent step a linearized continuous-time model concerning the error state by means of a first order gaussian markov process from the noise parameters and the state variables.
In some embodiments, optionally, the controller 140 is configured to project the point cloud data of the kth frame to a termination time of the kth frame according to the first pose to obtain a first point cloud, divide the first point cloud into grids with a first value, and determine a mean and a covariance of the first point cloud in each grid; the controller 140 is configured to project the point cloud data of the k+1st frame to the start time of the k+1st frame according to the second pose to obtain a second point cloud, and determine an optimization objective function of the second pose according to a probability that the second point cloud falls in normal distribution of each grid of the first point cloud. The first pose is the pose of the kth frame, and the second pose is the pose of the k+1th frame.
The kth frame and the k+1 frame are adjacent frames. The controller 140 is configured to match point cloud data of adjacent frames and determine an optimized objective function of pose using normal distribution transformation.
In some embodiments, optionally, the ending time of the kth frame coincides with the starting time of the k+1th frame. The ending time of the kth frame and the starting time of the kth+1th frame are the same time, so that the controller 140 can match the point cloud data of the adjacent frames.
In one embodiment according to the present invention, the positioning control method is applied to the controller 140 of the positioning control system 100 in any of the above embodiments.
As shown in fig. 3, the positioning control method specifically includes the steps of:
s302, acquiring a three-dimensional image of the heading machine. The operator acquires a three-dimensional image of the surface of the heading machine using a 3D scanner and stores or transmits information related to the three-dimensional image to the controller so that the controller acquires the three-dimensional image of the heading machine.
S304, determining the pose of the inertial navigation system in the heading machine, the pose of the laser radar in the heading machine and the relative position between the inertial navigation system and the laser radar according to the three-dimensional image. The pose of the inertial navigation system in the heading machine, the pose of the laser radar in the heading machine and the relative position between the inertial navigation system and the laser radar are external parameters of the sensor system. In other words, the controller is configured to determine an external parameter of the sensor system from the three-dimensional image. Optionally, the controller is used for establishing a heading machine coordinate system, an inertial navigation coordinate system and a radar coordinate system according to the three-dimensional image, and determining a first coordinate origin of the heading machine coordinate system, a second coordinate origin of the inertial navigation coordinate system and a third coordinate origin of the radar coordinate system; the controller is used for determining a displacement matrix according to the coordinates of the second coordinate origin in the coordinate system of the tunneling machine and the coordinates of the third coordinate origin in the coordinate system of the tunneling machine; the controller is configured to determine angle data and to solve the rotation matrix based on the angle data to determine an external parameter of the sensor system. Optionally, the first origin of coordinates is an intersection point between the Z-axis in the inertial navigation coordinate system and the upper surface of the body frame.
S306, determining noise parameters in the inertial navigation system through an internal parameter calibration tool. Optionally, the internal parameter calibration tool is the imu_utils tool. Optionally, the inertial navigation system is kept stationary for a first duration, and the data collected by the inertial navigation system is processed with the internal parameter calibration tool to determine the internal parameters of the inertial navigation system. Optionally, the internal parameter of the inertial navigation system is a noise parameter of the inertial navigation system. Optionally, the first duration is not less than 2 h. Optionally, the noise parameters include the noise of the acceleration, the random walk noise of the acceleration, the noise of the angular velocity, and the random walk noise of the angular velocity.
S308, determining the position of the target in the roadway. The controller is used for determining the position of the target in the roadway through the laser radar. Optionally, the staff randomly arranges targets within the roadway. Alternatively, the target is square. Optionally, the number of targets is at least one, that is, the targets may be one, two or more, and the targets are flexibly set according to the actual application scenario.
S310, acquiring point clouds of the target and the roadway, wherein the point clouds comprise multiple frames of point cloud data. Optionally, the lidar scans the target and the roadway so that the controller obtains the point cloud of the target and the roadway via the lidar. A point cloud is a set of point data. Optionally, the number of lidars is plural, and the controller is configured to fuse the point clouds acquired by the plurality of lidars. When the number of lidars is plural, the pose of the lidar in the heading machine includes the relative positions of the plurality of lidars in the heading machine. Optionally, the number of lidars is three, namely a first lidar, a second lidar and a third lidar. Based on the external parameters of the sensor system (the pose of the lidar in the heading machine), the point cloud acquired by the second lidar and the point cloud acquired by the third lidar are projected into the point cloud of the first lidar to obtain a fused point cloud.
S312, matching the point cloud data of adjacent frames, and determining an optimization objective function of the pose using a normal distribution transformation. Optionally, the kth frame and the (k+1)th frame are adjacent frames. Optionally, the controller is configured to project the point cloud data of the kth frame to the termination time of the kth frame according to the first pose to obtain a first point cloud, divide the first point cloud into a first value of grids, and determine the mean and covariance of the first point cloud in each grid; the controller is configured to project the point cloud data of the (k+1)th frame to the start time of the (k+1)th frame according to the second pose to obtain a second point cloud, and determine the optimization objective function of the second pose according to the probability that the points of the second point cloud falling in each grid of the first point cloud lie in the normal distribution of the corresponding grid.

Optionally, the first pose is $T_k$, which is the pose of the kth frame. Optionally, the second pose is $T_{k+1}$, which is the pose of the (k+1)th frame. Optionally, the termination time of the kth frame is $t_k^{end}$. Optionally, the start time of the (k+1)th frame is $t_{k+1}^{start}$. Optionally, the termination time of the kth frame coincides with the start time of the (k+1)th frame, i.e. the two times are the same time. Optionally, the first value is $M$. Optionally, the first point cloud is $P_k$. Optionally, the first point cloud $P_k$ is divided into $M$ grids, and the mean and covariance of the first point cloud in each grid are obtained. Optionally, the controller is configured to determine the mean of the first point cloud in each grid according to the first formula:

$$\mu_m=\frac{1}{n_m}\sum_{i=1}^{n_m}p_i^m$$

wherein $\mu_m$ is the mean, $n_m$ is the number of points of the first point cloud in the mth grid, and $p_i^m$ is the coordinate of the ith point in the mth grid. Optionally, the controller is configured to determine the covariance of the first point cloud in each grid by the second formula:

$$\Sigma_m=\frac{1}{n_m}\sum_{i=1}^{n_m}\left(p_i^m-\mu_m\right)\left(p_i^m-\mu_m\right)^T$$

wherein $\Sigma_m$ is the covariance, $\mu_m$ is the mean, $n_m$ is the number of points of the first point cloud in the mth grid, and $p_i^m$ is the coordinate of the ith point in the mth grid; $T$ denotes the matrix transpose, an operation on a matrix. Optionally, the second point cloud is $P_{k+1}$. The probability that a point $q_j^m$ of the second point cloud $P_{k+1}$ falling in the mth grid of the first point cloud $P_k$ lies in the normal distribution $N(\mu_m,\Sigma_m)$ of the corresponding grid is given by that distribution. Optionally, the optimization objective function is shown in the third formula. The third formula is:

$$F\!\left(T_{k+1}\right)=\sum_{m=1}^{M}\sum_{j=1}^{n'_m}\left(T_{k+1}q_j^m-\mu_m\right)^T\Sigma_m^{-1}\left(T_{k+1}q_j^m-\mu_m\right)$$

wherein $n'_m$ is the number of points of the second point cloud falling in the mth grid of the first point cloud, $T$ denotes the matrix transpose, and $T_{k+1}$ denotes the second pose. The first pose is a known state when the second pose is optimized. Optionally, $m$ is not greater than $M$.
S314, determining the state variable of the iterative error state Kalman filter according to the error state of the inertial navigation system. Optionally, the state variable of the iterative error state Kalman filter is determined by the fourth formula according to the error state of the inertial navigation system. The fourth formula is:

$$\delta x=\left[\,\delta p^{T}\;\;\delta v^{T}\;\;\delta\theta^{T}\;\;\delta b_{a}^{T}\;\;\delta b_{\omega}^{T}\;\;\delta g^{T}\,\right]^{T}$$

wherein $\delta$ denotes an error term; $x$, $p$, $v$, $q$, $b_a$, $b_\omega$ and $g$ denote the state, displacement, velocity, rotation, acceleration bias, angular-velocity bias and local gravity, respectively; $\delta\theta\in\mathbb{R}^{3}$ is the rotation error, and $q$ denotes a quaternion.
S316, determining a linearized continuous-time model for the error state through the first-order Gauss-Markov process according to the noise parameters and the state variable. The first-order Gauss-Markov process is a time-series model built from a linear combination of the previous observation and the current noise term. Optionally, the first-order Gauss-Markov process comprises the fifth formula. The fifth formula is:

$$\delta\dot{x}_{t}=F_{t}\,\delta x_{t}+G_{t}\,w_{t}$$

wherein $w_{t}$ is the Gaussian noise vector corresponding to the internal parameters of the inertial navigation system, and $F_{t}$ is the error state transition matrix at time $t$ and satisfies the sixth formula. Optionally, the sixth formula is:

$$F_{t}=\begin{bmatrix}0 & I_{3} & 0 & 0 & 0 & 0\\ 0 & 0 & -R_{t}\left[\hat{a}_{t}\right]_{\times} & -R_{t} & 0 & I_{3}\\ 0 & 0 & -\left[\hat{\omega}_{t}\right]_{\times} & 0 & -I_{3} & 0\\ 0 & 0 & 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 & 0 & 0\end{bmatrix}$$

Optionally, $G_{t}$ is the noise Jacobian at time $t$ and satisfies the seventh formula. Optionally, the seventh formula is:

$$G_{t}=\begin{bmatrix}0 & 0 & 0 & 0\\ -R_{t} & 0 & 0 & 0\\ 0 & -I_{3} & 0 & 0\\ 0 & 0 & I_{3} & 0\\ 0 & 0 & 0 & I_{3}\\ 0 & 0 & 0 & 0\end{bmatrix}$$

wherein $[\,\cdot\,]_{\times}\in\mathbb{R}^{3\times 3}$ denotes the transformation of a three-dimensional vector into its skew-symmetric matrix, $I_{3}\in\mathbb{R}^{3\times 3}$ is the identity matrix, $R_{t}$ is the rotation matrix at time $t$, and $\hat{a}_{t}$ and $\hat{\omega}_{t}$ are the acceleration and angular velocity at time $t$, respectively.
S318, determining prediction equations for the error state and the error state covariance according to the linearized continuous-time model. Optionally, the error state is $\delta\hat{x}$ and satisfies the eighth formula. The eighth formula is:

$$\delta\hat{x}_{t+\Delta t}=\left(I+F_{t}\,\Delta t\right)\delta\hat{x}_{t}$$

Optionally, the error state covariance is $P$ and satisfies the ninth formula. The ninth formula is:

$$P_{t+\Delta t}=\left(I+F_{t}\,\Delta t\right)P_{t}\left(I+F_{t}\,\Delta t\right)^{T}+\left(G_{t}\,\Delta t\right)Q\left(G_{t}\,\Delta t\right)^{T}$$

Optionally, according to the error state $\delta\hat{x}$ and the error state covariance $P$, the state prediction $\hat{x}_{k+1}$ is obtained by continuous pre-integration, wherein $\Delta t$ is a continuous IMU time step between $t$ and $t+\Delta t$, and $Q$ denotes the covariance matrix of $w_{t}$. IMU denotes an inertial sensor that is part of the inertial navigation system, i.e., the inertial navigation system comprises an inertial sensor.
S320, determining an optimization model for the error state according to the error state and the optimization objective function. Optionally, the final optimization objective of the optimization model satisfies the tenth formula. Optionally, the tenth formula is:

$$\min_{\delta x_{k+1}}\;\left\|\delta x_{k+1}\right\|_{\hat{P}_{k+1}}^{2}+\left\|f\!\left(\delta x_{k+1}\right)\right\|_{R_{k+1}}^{2}$$

wherein $\left\|\cdot\right\|_{\Sigma}$ denotes the Mahalanobis norm, $f$ is the optimization objective function of the pose obtained from the point cloud matching and is used to measure the measurement noise, the subscript $k+1$ denotes the change from time $t_{k}$ to time $t_{k+1}$, and $R_{k+1}$ is the covariance matrix of the measurement noise. Optionally, the final state satisfies the eleventh formula. Optionally, the eleventh formula is:

$$x_{k+1}=\hat{x}_{k+1}\oplus\delta x_{k+1},\qquad q_{k+1}=\hat{q}_{k+1}\otimes\exp\!\left(\delta\theta_{k+1}\right)$$

wherein $x_{k+1}$ denotes the final state from time $t_{k}$ to time $t_{k+1}$, $\exp(\cdot)$ denotes the mapping of the Euler-angle rotation error to a quaternion, and $\hat{x}_{k+1}$ denotes the state obtained by the inertial navigation from the previous moment to the next moment (obtained by integrating the angular velocity and acceleration measured by the inertial navigation), onto which the error state is superimposed.
S322, determining the iterative update equation of the error state according to the optimization model. Optionally, the iterative update equations include the twelfth formula. The twelfth formula is:

$$K_{j}=\hat{P}_{k+1}H_{j}^{T}\left(H_{j}\hat{P}_{k+1}H_{j}^{T}+R_{k+1}\right)^{-1}$$

Optionally, the iterative update equations further include the thirteenth formula. The thirteenth formula is:

$$\Delta^{j}=-K_{j}\,f\!\left(\delta x_{k+1}^{j}\right)-\left(I-K_{j}H_{j}\right)\delta x_{k+1}^{j}$$

Optionally, the iterative update equations further include the fourteenth formula. The fourteenth formula is:

$$\delta x_{k+1}^{j+1}=\delta x_{k+1}^{j}+\Delta^{j}$$

Optionally, when $\Delta^{j}$ is lower than a certain threshold, the error state covariance is updated; the update equation of the error state covariance is the fifteenth formula, and the final state is obtained through the eleventh formula. Optionally, the fifteenth formula is:

$$P_{k+1}=\left(I-K_{j}H_{j}\right)\hat{P}_{k+1}$$

wherein $\Delta^{j}$ denotes the correction vector at the $j$th iteration, $K_{j}$ is the Kalman gain at the $j$th iteration, and $H_{j}$ is the Jacobian of $f$ with respect to $\delta x$.
S324, determining the pose of the heading machine in the roadway according to the iterative update equations. Optionally, the next prediction is initialized by the final state, as shown in the sixteenth formula, and the final state is converted into a global state in the roadway coordinate system, as shown in the seventeenth formula and the eighteenth formula. Optionally, the sixteenth formula is:

$$\hat{x}_{k+2\mid k+1}=x_{k+1}$$

Optionally, the seventeenth formula is:

$$p_{k+1}^{G}=R_{k}^{G}\,p_{k+1}+p_{k}^{G}$$

Optionally, the eighteenth formula is:

$$R_{k+1}^{G}=R_{k}^{G}\,R_{k+1}$$

wherein the superscript $G$ denotes the roadway (global) coordinate system. The controller matches the point clouds of adjacent frames using the normal distribution transformation to obtain the optimization objective function of the pose, which lays the foundation for the subsequent estimation of the error state; combining the optimization objective function of the pose obtained by the point cloud matching, the pose of the heading machine is obtained in real time based on the iterative error state Kalman filtering.
According to the technical solution of the present invention, the positioning control method deeply fuses the data acquired by the inertial navigation system with the data acquired by the lidar through the iterative error state Kalman filtering algorithm, fully utilizing the complementary properties of the two data sources and realizing tight coupling between the inertial navigation system and the lidar; this reduces the accumulated integration error of the inertial navigation system during long-term operation and improves the accuracy of pose estimation and the positioning precision of the heading machine pose in the roadway.
In one embodiment according to the present invention, an electronic device includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, which when executed by the processor, implement the steps of the positioning control method in the above embodiments.
It should be noted that the positioning control method may be implemented in a variety of different ways depending on the particular features and/or example applications. For example, the methods may be implemented by a combination of hardware, firmware, and/or software. For example, in a hardware implementation, a processor may be implemented in one or more Application-Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field-Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, electronic devices, other device units designed to perform the functions described above, and/or a combination thereof.
In an embodiment according to the invention, a computer-readable storage medium stores a computer program which, when executed by a processor, implements the steps of the positioning control method in the above embodiment. It should be noted that a computer-readable storage medium may be a tangible device that can retain and store instructions for use by an instruction execution device. The computer-readable storage medium may be, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer-readable storage medium includes: portable computer diskettes, hard disks, Random Access Memories (RAMs), Read-Only Memories (ROMs), Erasable Programmable Read-Only Memories (EPROMs or flash memories), Static Random Access Memories (SRAMs), portable Compact Disc Read-Only Memories (CD-ROMs), Digital Versatile Discs (DVDs), memory cards, floppy disks, mechanically encoded devices such as punch cards or raised structures in grooves having instructions recorded thereon, or any suitable combination of the above. A computer-readable storage medium, as used herein, should not be construed as being a transitory signal itself, such as a radio wave or other freely propagating electromagnetic wave, an electromagnetic wave propagating through a waveguide or other transmission medium, or an electrical signal transmitted through an electrical wire.
In an embodiment according to the invention, the chip comprises a processor and a communication interface, the communication interface and the processor being coupled, the processor being adapted to run a program or instructions for implementing the steps of the positioning control method in the above-described embodiment.
In the present invention, the terms "first," "second," "third," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance; the term "plurality" means two or more, unless expressly defined otherwise. The terms "mounted," "connected," "secured," and the like are to be construed broadly, and may be, for example, fixedly connected, detachably connected, or integrally connected; "coupled" may be directly coupled or indirectly coupled through intermediaries. The specific meaning of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific circumstances.
In the description of the present invention, it should be understood that the directions or positional relationships indicated by the terms "upper", "lower", "left", "right", "front", "rear", etc. are based on the directions or positional relationships shown in the drawings, are merely for convenience of describing the present invention and simplifying the description, and do not indicate or imply that the devices or units referred to must have a specific direction, be constructed and operated in a specific direction, and thus should not be construed as limiting the present invention.
In the description of the present specification, the terms "one embodiment," "some embodiments," "particular embodiments," and the like, mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
The above is only a preferred embodiment of the present invention, and is not intended to limit the present invention, but various modifications and variations can be made to the present invention by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (13)

1. A positioning control system for use with a heading machine (210), the heading machine (210) including a body frame (211), the heading machine (210) being configured to travel within a roadway (220), the positioning control system comprising:
a target (110) provided in the tunnel (220);
An inertial navigation system (120) provided on the body frame (211);
a laser radar (130) provided on the main body frame (211);
the controller (140) is disposed on the body frame (211), the controller (140) is electrically connected with the inertial navigation system (120), and the controller (140) is electrically connected with the laser radar (130); the controller (140) is configured to acquire a three-dimensional image of the heading machine (210); the controller (140) is configured to determine, according to the three-dimensional image, a pose of the inertial navigation system (120) in the heading machine (210), a pose of the laser radar (130) in the heading machine (210), and a relative position between the inertial navigation system (120) and the laser radar (130); the controller (140) is configured to determine noise parameters in the inertial navigation system (120) through an internal parameter calibration tool; the controller (140) is configured to determine a position of the target (110) within the roadway (220) through the laser radar (130); the controller (140) is configured to obtain a point cloud of the target (110) and the roadway (220) through the laser radar (130), the point cloud comprising multiple frames of point cloud data; the controller (140) is configured to match the point cloud data of adjacent frames and determine an optimization objective function of a pose using a normal distribution transformation; the controller (140) is configured to determine a state variable of an iterative error state Kalman filter according to an error state of the inertial navigation system (120); the controller (140) is configured to determine a linearized continuous-time model for the error state through a first-order Gauss-Markov process according to the noise parameters and the state variable; the controller (140) is configured to determine prediction equations for the error state and an error state covariance according to the linearized continuous-time model; the controller (140) is configured to determine an optimization model for the error state according to the error state and the optimization objective function; the controller (140) is configured to determine an iterative update equation for the error state according to the optimization model; and the controller (140) is configured to determine a pose of the heading machine (210) within the roadway (220) according to the iterative update equation.
2. The positioning control system according to claim 1, wherein the controller (140) is configured to establish a heading machine coordinate system, an inertial navigation coordinate system and a radar coordinate system from the three-dimensional image, and to determine a first origin of coordinates of the heading machine coordinate system, a second origin of coordinates of the inertial navigation coordinate system and a third origin of coordinates of the radar coordinate system, the controller (140) is configured to determine a displacement matrix from coordinates of the second origin of coordinates in the heading machine coordinate system and coordinates of the third origin of coordinates in the heading machine coordinate system, the controller (140) is configured to determine angle data and to solve a rotation matrix from the angle data to determine a pose of the inertial navigation system (120) in the heading machine (210), a pose of the laser radar (130) in the heading machine (210) and a relative position between the inertial navigation system (120) and the laser radar (130).
3. The positioning control system according to claim 2, characterized in that the first origin of coordinates is an intersection point between a Z-axis in the inertial navigation coordinate system and an upper surface of the body frame (211).
4. A positioning control system according to any one of claims 1 to 3, characterized in that the number of lidars (130) is a plurality, the controller (140) being configured to fuse the point clouds acquired by a plurality of the lidars (130).
5. A positioning control system according to any one of claims 1 to 3, characterized in that the noise parameters include noise of acceleration, random walk noise of the acceleration, noise of angular velocity and random walk noise of the angular velocity.
6. A positioning control system according to any of claims 1-3, characterized in that the controller (140) is configured to project the point cloud data of the kth frame according to a first pose to a termination time of the kth frame to obtain a first point cloud, divide the first point cloud into grids with a first value and determine a mean and covariance of the first point cloud within each of the grids; the controller (140) is configured to project the point cloud data of the kth+1st frame to a start time of the kth+1st frame according to a second pose, obtain a second point cloud, and determine the optimized objective function of the second pose according to a probability that a point of the second point cloud in each grid of the first point cloud falls in a normal distribution of the corresponding grid, where the first pose is a pose of the kth frame, and the second pose is a pose of the kth+1st frame.
7. The positioning control system of claim 6, wherein the ending time of the kth frame coincides with the starting time of the k+1 frame.
8. The positioning control system of claim 6, wherein the controller (140) is configured to determine the mean value of the first point cloud within each of the grids by a first formula:

$$\mu_m=\frac{1}{n_m}\sum_{i=1}^{n_m}p_i^m$$

wherein $\mu_m$ is the mean, $n_m$ is the number of points of the first point cloud in the mth grid, and $p_i^m$ is the coordinate of the ith point in the mth grid; and/or

the controller (140) is configured to determine the covariance of the first point cloud within each of the grids by a second formula:

$$\Sigma_m=\frac{1}{n_m}\sum_{i=1}^{n_m}\left(p_i^m-\mu_m\right)\left(p_i^m-\mu_m\right)^T$$

wherein $\Sigma_m$ is the covariance, $\mu_m$ is the mean, $n_m$ is the number of points of the first point cloud in the mth grid, $p_i^m$ is the coordinate of the ith point in the mth grid, and $T$ denotes the matrix transpose.
9. The positioning control system of claim 6, wherein the optimization objective function is:

$$F\!\left(T_{k+1}\right)=\sum_{m=1}^{M}\sum_{j=1}^{n'_m}\left(T_{k+1}q_j^m-\mu_m\right)^T\Sigma_m^{-1}\left(T_{k+1}q_j^m-\mu_m\right)$$

wherein $\Sigma_m$ is the covariance, $\mu_m$ is the mean, $q_j^m$ is a point of the second point cloud falling in the mth grid of the first point cloud, $n'_m$ is the number of points of the second point cloud falling in the mth grid of the first point cloud, $M$ is the first value, $T$ denotes the matrix transpose, and $T_{k+1}$ denotes the second pose.
10. A positioning control method, characterized by being applied to the controller of the positioning control system according to any one of claims 1 to 9, comprising:
acquiring a three-dimensional image of the heading machine;
determining the pose of an inertial navigation system in the heading machine, the pose of a laser radar in the heading machine and the relative position between the inertial navigation system and the laser radar according to the three-dimensional image;
determining noise parameters in the inertial navigation system through an internal reference calibration tool;
determining the position of the target in the roadway;
acquiring point clouds of the target and the roadway, wherein the point clouds comprise multi-frame point cloud data;
matching the point cloud data of the adjacent frames, and determining an optimized target function of the pose by using normal distribution transformation;
determining a state variable of iterative error state Kalman filtering according to the error state of the inertial navigation system;
determining a linearized continuous-time model for the error state by a first order gaussian markov process from the noise parameter and the state variable;
Determining a predictive equation for the error state and error state covariance from the linearized continuous time model;
determining an optimization model for the error state based on the error state and the optimization objective function;
determining an iterative update equation of the error state according to the optimization model;
and determining the pose of the heading machine in the roadway according to the iterative updating equation.
11. An electronic device comprising a processor, a memory and a program or instruction stored on the memory and executable on the processor, the program or instruction when executed by the processor implementing the steps of the positioning control method of claim 10.
12. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program which, when executed by a processor, implements the steps of the positioning control method according to claim 10.
13. A chip comprising a processor and a communication interface, the communication interface and the processor being coupled, the processor being configured to execute programs or instructions for implementing the steps of the positioning control method according to claim 10.