CN108020855B - posture and rotation instantaneous center joint estimation method for skid-steer robot - Google Patents

posture and rotation instantaneous center joint estimation method for skid-steer robot

Info

Publication number
CN108020855B
CN108020855B CN201711229584.XA CN201711229584A CN108020855B CN 108020855 B CN108020855 B CN 108020855B CN 201711229584 A CN201711229584 A CN 201711229584A CN 108020855 B CN108020855 B CN 108020855B
Authority
CN
China
Prior art keywords
matrix
terrain
ellipsoid
rotation
center
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711229584.XA
Other languages
Chinese (zh)
Other versions
CN108020855A (en)
Inventor
吕文君
刘葆林
贾晓敏
郑敏
李鲲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ANHUI 11TONG INFORMATION TECHNOLOGY CO LTD
Original Assignee
ANHUI 11TONG INFORMATION TECHNOLOGY CO LTD
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ANHUI 11TONG INFORMATION TECHNOLOGY CO LTD filed Critical ANHUI 11TONG INFORMATION TECHNOLOGY CO LTD
Priority to CN201711229584.XA priority Critical patent/CN108020855B/en
Publication of CN108020855A publication Critical patent/CN108020855A/en
Application granted granted Critical
Publication of CN108020855B publication Critical patent/CN108020855B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/53Determining attitude

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Image Analysis (AREA)
  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • External Artificial Organs (AREA)

Abstract

Compared with the prior art, the method does not need the probability distributions of the process noise and observation noise as prior knowledge, which means that a large number of statistical experiments are not needed before the method is implemented; the method is also more robust when the noise probability distribution is time-varying. In addition, because terrain detection is introduced, the method can adjust the process noise envelope matrix of the instantaneous center of rotation when the terrain changes significantly. This adaptive mechanism keeps the estimate of the instantaneous center of rotation stable while reducing the convergence time, making the method suitable for scenes with complex terrain.

Description

posture and rotation instantaneous center joint estimation method for skid-steer robot
Technical Field
The invention relates to the technical field of robots, and in particular to a joint estimation method for the pose and the instantaneous center of rotation of skid-steer robots.
Background
The skid-steer mechanism controls the direction of a mobile robot by changing the speeds of the left and right wheels or tracks, and it is widely applied to field robots due to its good robustness, its flexibility and its ability to perform zero-radius turns.
However, slip inevitably occurs while a skid-steer robot moves, so an instantaneous center of rotation must be introduced to establish an accurate motion model, and this instantaneous center changes constantly as the terrain changes. Acquiring the instantaneous center of rotation of a skid-steer robot in real time has therefore become an important and challenging task in the robotics field.
At present, research on joint estimation of the pose and the instantaneous center of rotation of skid-steer robots has only just begun, and existing results are few. The existing literature mainly adopts extended Kalman filtering and unscented Kalman filtering, which assume that the noise is Gaussian white noise; this assumption is not easy to satisfy in practice, for example because wheel deformation introduces a constant error into the system.
In addition, the process noise variance of the instantaneous center of rotation is often set to a constant value. If a large value is set, the estimate of the instantaneous center of rotation converges quickly when the terrain changes but exhibits large jitter; if a small value is set, the estimate is relatively stable but converges slowly after the terrain changes. Such methods are therefore unsuitable for application scenes with complex terrain.
Disclosure of Invention
The invention aims to provide a joint estimation method for the pose and the instantaneous center of rotation of a skid-steer robot that adapts to application scenes with complex terrain.
To this end, the invention provides a joint estimation method for the pose and the instantaneous center of rotation of a skid-steer robot, which comprises the following steps:
step one: initialize the sampling point sequence number k, the posterior state estimation ellipsoid (x̂_k, P_k), the terrain feature vector p_k, the envelope matrices Q_k and R_k of the process noise and observation noise, the sampling interval T and the vehicle body width B, wherein the six elements of the center x̂_k of the posterior state estimation ellipsoid are the centers of the posterior state estimation ellipsoid for the east coordinate, the north coordinate and the heading angle, and for the three kinematic parameters of the instantaneous center of rotation;
step two: increment the sampling point sequence number, k ← k+1; collect the accelerometer data for the acceleration along the axis perpendicular to the ground, sampling N times at equal time intervals within one sampling period to obtain the acceleration data set {a_{k,i}}, i = 1, …, N; take a photo of the ground with a downward-facing camera to obtain the pixel matrix M_k; collect the left and right wheel encoder data to obtain the left and right wheel rotation speeds v_{L,k} and v_{R,k}; collect the electronic compass data and GPS module data to obtain the observation vector z_k = [z_{e,k} z_{n,k} z_{θ,k}]', wherein z_{e,k} and z_{n,k} are the observed east and north coordinates acquired by the GPS module, and z_{θ,k} is the observed heading angle acquired by the electronic compass;
step three: perform terrain detection according to the terrain feature vector obtained in step one, and the acceleration data set and the ground photo pixel matrix obtained in step two, and judge whether the terrain has changed significantly;
step four: if the terrain is judged to have changed significantly, multiply the process noise envelope matrix of the instantaneous center of rotation by a set multiple in the next five sampling points; if the terrain has not changed significantly, keep the original envelope matrix;
step five: perform state prediction according to the posterior state estimation ellipsoid obtained in step one, the sampling interval, the wheel radius and the vehicle body width, the left and right wheel rotation speeds obtained in step two, and the process noise envelope matrix adjusted in step four, to obtain the prior state estimation ellipsoid;
step six: perform a state update according to the observation noise envelope matrix obtained in step one, the observation vector obtained in step two and the prior state estimation ellipsoid obtained in step five, to obtain the posterior state estimation ellipsoid; and
step seven: repeat steps two to six, and output the posterior state estimation ellipsoid of the pose and the instantaneous center of rotation at each sampling point, wherein the center of the posterior state estimation ellipsoid is the estimate of the pose and the instantaneous center of rotation.
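To make the flow of steps two to seven concrete, a minimal Python sketch is given below. The five callables (read_sensors, detect_terrain, adjust_q, predict and update) are hypothetical placeholders for the per-step operations, whose exact formulas appear only as equation images in the patent; the sampling interval, wheel radius and body width are assumed to be captured inside predict, and the state and envelope are assumed to be numpy arrays.

```python
def run_joint_estimation(x_hat, P, p_terrain, Q, R, n_steps,
                         read_sensors, detect_terrain, adjust_q, predict, update):
    """x_hat: 6-element ellipsoid center; P: 6x6 envelope matrix (from step one)."""
    estimates = []
    for k in range(1, n_steps + 1):
        accel, pixels, v_l, v_r, z = read_sensors(k)                    # step two
        changed, p_terrain = detect_terrain(p_terrain, accel, pixels)   # step three
        Qk = adjust_q(Q, changed)                                       # step four
        x_prior, P_prior = predict(x_hat, P, v_l, v_r, Qk)              # step five
        x_hat, P = update(x_prior, P_prior, z, R)                       # step six
        estimates.append((x_hat.copy(), P.copy()))                      # step seven: output
    return estimates
```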
Compared with the prior art, the method has the following advantages: 1) the probability distributions of the process noise and observation noise are not needed as prior knowledge, which means that a large number of statistical experiments are not required before the method is implemented, and the method remains robust when the noise probability distribution is time-varying; 2) the process noise and observation noise are not required to be Gaussian white noise, which matches reality well, because Gaussian white noise is only an idealized condition that is difficult to satisfy in practice; 3) because terrain detection is introduced, the method can adjust the process noise envelope matrix of the instantaneous center of rotation when the terrain changes significantly; this adaptive mechanism keeps the estimate of the instantaneous center of rotation stable while reducing the convergence time, making the method suitable for scenes with complex terrain; and 4) because gross-error detection is introduced, the influence of sensor faults on the estimation algorithm is weakened and the accuracy is improved.
In addition to the objects, features and advantages described above, the present invention has other objects, features and advantages as will become apparent from the following detailed description which proceeds with reference to the accompanying figures.
Drawings
The accompanying drawings, which form a part hereof, are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification; they illustrate exemplary embodiments of the invention and, together with the description, serve to explain the invention and not to limit it.
FIG. 1 is a flow chart of a method for joint estimation of pose and instant center of rotation of a skid steer robot in accordance with the present invention;
FIG. 2 shows the terrain similarity simulation results according to an embodiment of the present invention;
FIG. 3 shows the simulation results for the estimation of the instantaneous center of rotation according to an embodiment of the invention; and
FIG. 4 shows the pose estimation simulation results according to an embodiment of the invention.
Detailed Description
It should be noted that the embodiments in the present application, and the features of those embodiments, may be combined with each other without conflict. The present invention will be described in detail below with reference to the embodiments and the accompanying drawings.
Compared with the prior art, the method does not need the probability distributions of the process noise and observation noise as prior knowledge, which means that a large number of statistical experiments are not needed before the method is implemented; at the same time, the method is more robust when the noise probability distribution is time-varying.
As shown in FIG. 1, the joint estimation method for the pose and the instantaneous center of rotation of a skid-steer robot based on Kalman filtering proceeds as follows:
s10, initialization
Initialize the sampling point sequence number, the posterior state estimation ellipsoid, the terrain feature vector, the envelope matrices of the process noise and observation noise, the sampling interval and the vehicle body width. The details are as follows:
The sampling point sequence number is k = 0. The posterior state estimation ellipsoid (x̂_k, P_k) is initialized as follows: the six elements of the ellipsoid center x̂_k are determined according to the actual situation, and the ellipsoid envelope matrix is P_k = 0.1 × I_{6×6}; the pair notation denotes an ellipsoid set whose first element is the center of the ellipsoid and whose second element is the envelope matrix of the ellipsoid. The terrain feature vector is p_k = O_{8×1}. The envelope matrices Q_k and R_k of the process noise and observation noise are diagonal matrices of 6 rows and 6 columns and of 3 rows and 3 columns, respectively. The sampling interval T, the wheel radius φ and the vehicle body width B are determined according to the actual situation. The subscript k denotes the sampling point number, I_{6×6} is the identity matrix of 6 rows and 6 columns, and O_{8×1} is the zero vector of 8 rows and 1 column. Of the six elements of x̂_k, three are the centers of the posterior state estimation ellipsoid for the east coordinate, the north coordinate and the heading angle, and the other three are the centers of the posterior state estimation ellipsoid for the three kinematic parameters of the instantaneous center of rotation.
In the present invention, the prime superscript represents the transpose of the matrix, e.g., C' is the transpose of matrix C.
S20, collecting sensor data
The sampling point sequence number is incremented, and data from the accelerometer, the camera, the left and right wheel encoders, the electronic compass and the GPS module are acquired. The details are as follows:
The sampling point sequence number is incremented, k ← k+1. The accelerometer data for the acceleration along the axis perpendicular to the ground are collected N times at equal time intervals within one sampling period, giving the acceleration data set {a_{k,i}}, i = 1, …, N. A photo of the ground is taken with a downward-facing camera to obtain the pixel matrix M_k. The left and right wheel encoder data are collected to obtain the left and right wheel rotation speeds v_{L,k} and v_{R,k}. The electronic compass data and GPS module data are collected to obtain the observation vector z_k = [z_{e,k} z_{n,k} z_{θ,k}]', where z_{e,k} and z_{n,k} are the observed east and north coordinates acquired by the GPS module, and z_{θ,k} is the observed heading angle acquired by the electronic compass.
S30, terrain detection
Terrain detection is performed according to the terrain feature vector obtained in step S10, and the acceleration data set and ground photo pixel matrix obtained in step S20, and it is judged whether the terrain has changed significantly. The details are as follows:
3.1 Eliminate the DC component of the acceleration data set: subtract the mean of all elements of the set from each element, i.e. ā_{k,i} = a_{k,i} − (1/N) Σ_{j=1}^{N} a_{k,j}, i = 1, …, N, which gives the acceleration data set {ā_{k,i}} with the DC component removed.
3.2 Extract the dominant color of the ground photo: randomly extract 50 pixels from the ground photo pixel matrix M_k and average them to obtain the red, green and blue components l_{R,k}, l_{G,k} and l_{B,k} of the terrain dominant color.
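As an illustration of sub-steps 3.1 and 3.2, a short numpy sketch follows. The function names and array layout are assumptions (the patent describes the operations only in prose), with the ground photo taken to be an H×W×3 RGB array.

```python
import numpy as np

def remove_dc(a_k):
    """Sub-step 3.1: a_k holds the N vertical-acceleration samples of point k."""
    return a_k - a_k.mean()                       # subtract the mean of all elements

def dominant_color(M_k, n_samples=50, rng=None):
    """Sub-step 3.2: M_k is an H x W x 3 RGB pixel matrix of the ground photo."""
    rng = np.random.default_rng() if rng is None else rng
    flat = M_k.reshape(-1, 3).astype(float)
    idx = rng.integers(0, flat.shape[0], size=n_samples)   # 50 random pixels
    l_R, l_G, l_B = flat[idx].mean(axis=0)                 # red, green, blue components
    return l_R, l_G, l_B
```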
3.3 Compute the terrain feature vector p_k; its eight elements are defined by formulas that are given only as equation images in the source.
3.4 Normalize the terrain feature vector.
3.5 Judge whether the terrain has changed: compute the terrain similarity distance between the current and previous terrain feature vectors (the formula is given only as an equation image in the source), where ω_i ∈ (0, 1] is the weight of each feature component. If the distance does not exceed the set threshold, the terrain is judged not to have changed; otherwise, the terrain is judged to have changed significantly.
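A sketch of this decision is given below. Because the patent's similarity-distance formula appears only as an equation image, a weighted absolute-difference distance and a fixed threshold are assumed here purely for illustration.

```python
import numpy as np

def terrain_changed(p_curr, p_prev, weights, threshold):
    """p_curr, p_prev: 8-element terrain feature vectors; weights: omega_i in (0, 1]."""
    distance = float(np.sum(weights * np.abs(p_curr - p_prev)))   # assumed distance form
    return distance > threshold        # True means a significant terrain change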
S40, adjusting the process noise envelope matrix of the instantaneous center of rotation
According to the judgment in step S30 of whether the terrain has changed significantly, the process noise envelope matrix of the instantaneous center of rotation is adjusted: if the terrain has changed significantly, the 4th to 6th main-diagonal elements of Q_k, i.e. the process noise envelope of the instantaneous center of rotation, are multiplied by 10 in the next five sampling points; if the terrain has not changed, the original envelope matrix is kept.
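This adjustment can be sketched as follows. The function name and the hold counter are illustrative; the factor of 10, the 4th to 6th main-diagonal elements and the five-sampling-point window come from the text above.

```python
def adjust_icr_process_noise(Q, terrain_changed, boost_left, factor=10.0, hold=5):
    """Q: 6x6 diagonal process-noise envelope; boost_left: remaining boosted points."""
    if terrain_changed:
        boost_left = hold                          # restart the five-point window
    Qk = Q.copy()
    if boost_left > 0:
        Qk[3:6, 3:6] = Q[3:6, 3:6] * factor        # scale the ICR diagonal entries
        boost_left -= 1
    return Qk, boost_left
```

Calling the function with terrain_changed set to True returns a Q_k whose instantaneous-center block is ten times larger for that call and the following four calls.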
S50, state prediction
State prediction is performed according to the posterior state estimation ellipsoid obtained in step S10, the sampling interval and the vehicle body width, the left and right wheel rotation speeds obtained in step S20, and the process noise envelope matrix adjusted in step S40, to obtain the prior state estimation ellipsoid. The details are as follows:
Compute the prior state estimation ellipsoid (x̂_{k,k-1}, P_{k,k-1}); the center and envelope update formulas, as well as the state transition equation f(·), are given only as equation images in the source, where the matrix A_k is the Jacobian matrix of the state transition equation f(·) and tr(·) denotes the trace of a matrix.
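One conventional way to realize such an ellipsoidal prediction is sketched below. Because the patented formulas are given only as images, the trace-minimizing Minkowski-sum bound used here, including the choice of β, is an assumption inferred from the Jacobian A_k and the trace tr(·) mentioned in the text, and f and jacobian_f stand in for the image-only state transition equation and its Jacobian.

```python
import numpy as np

def predict_ellipsoid(x_hat, P, v_l, v_r, T, phi, B, Q, f, jacobian_f):
    """Propagate the posterior ellipsoid (x_hat, P) one step under process noise Q."""
    x_prior = f(x_hat, v_l, v_r, T, phi, B)        # propagate the center
    A = jacobian_f(x_hat, v_l, v_r, T, phi, B)     # 6x6 Jacobian A_k
    APA = A @ P @ A.T
    # Trace-minimizing outer bound of the Minkowski sum of two ellipsoids (assumed form).
    beta = np.sqrt(np.trace(Q)) / (np.sqrt(np.trace(APA)) + np.sqrt(np.trace(Q)))
    P_prior = APA / (1.0 - beta) + Q / beta
    return x_prior, P_prior
```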
S60, status update
A state update is performed according to the observation noise envelope matrix obtained in step S10, the observation vector obtained in step S20 and the prior state estimation ellipsoid obtained in step S50, to obtain the posterior state estimation ellipsoid. The details are as follows:
6.1 Compute the innovation ε_k (the formula is given only as an equation image in the source); the matrix appearing in it is the observation matrix.
6.2 Compute the innovation envelope matrix W_k (the formula is given only as an equation image in the source), where msvm(·) denotes the maximum singular value of a matrix.
6.3 Compute the posterior state estimation ellipsoid (x̂_k, P_k) (the formulas are given only as equation images in the source), where the health indicator function δ_k and the pre-envelope matrix of the posterior state estimation ellipsoid are likewise given only as equation images, and the matrix I_{6×6} is the 6-dimensional identity matrix.
6.4 Eliminate gross errors: if δ_k ≤ 0, indicating that a sensor is faulty, the posterior state estimation ellipsoid is computed with its center given by the formula shown as an equation image in the source and with P_k = P_{k,k-1}.
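The structure of this update can be sketched as follows. The innovation, innovation-envelope, correction and health-indicator formulas appear only as equation images, so they are represented by placeholder callables; the linear form of the innovation is an assumption, and only the envelope fallback P_k = P_{k,k-1} of sub-step 6.4 is explicit in the text (returning the prior center as well is assumed).

```python
def update_ellipsoid(x_prior, P_prior, z, R, H,
                     innovation_envelope, health_indicator, correction):
    """Structure of step S60; the three callables are placeholders for image-only formulas."""
    eps = z - H @ x_prior                          # 6.1: innovation (assumed linear form)
    W = innovation_envelope(H, P_prior, R)         # 6.2: innovation envelope matrix W_k
    delta = health_indicator(eps, W)               # health indicator delta_k
    if delta <= 0:                                 # 6.4: sensor fault, discard the update
        return x_prior.copy(), P_prior.copy()      # keep the prior (P_k = P_{k,k-1})
    return correction(x_prior, P_prior, eps, W, H, R, delta)   # 6.3: posterior ellipsoid
```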
S70: steps S20 to S60 are repeated to obtain the posterior state estimation ellipsoid of the pose and the instantaneous center of rotation at each sampling point; the center of the ellipsoid is the estimate of the pose and the instantaneous center of rotation.
To verify the invention, a simulation experiment was carried out in MATLAB. 2000 sampling points were set, the sampling interval was 0.4 second, the tire radius was 35 cm and the frame width was 65 cm. The instantaneous center of rotation parameters were initially 21.6, -21.6 and 10, and changed to 43.2, -43.2 and 5 at the 1001st sampling point. Meanwhile, an acceleration sensor and a camera were used to acquire 1000 groups of data for each of two terrains, cement ground and grassland, and a combined experiment with MATLAB simulated the mobile robot switching terrains. The terrain similarity simulation results are shown in FIG. 2: when the terrain does not change, the terrain similarity distance remains stable at relatively small values. The simulation results for the estimation of the instantaneous center of rotation are shown in FIG. 3, with the initial estimates set to 65, -65 and 20; the estimates converge quickly to the true values after the abrupt state change. The pose estimation simulation results are shown in FIG. 4; the estimated values almost coincide with the true values. These simulations verify the effectiveness of the invention.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (5)

1. A joint estimation method for the pose and the instantaneous center of rotation of a skid-steer robot, characterized by comprising the following steps:
step one: initialize the sampling point sequence number k, the posterior state estimation ellipsoid (x̂_k, P_k), the terrain feature vector p_k, the envelope matrices Q_k and R_k of the process noise and observation noise, the sampling interval T and the vehicle body width B, wherein the six elements of the center x̂_k of the posterior state estimation ellipsoid are the centers of the posterior state estimation ellipsoid for the east coordinate, the north coordinate and the heading angle, and for the three kinematic parameters of the instantaneous center of rotation;
step two: increment the sampling point sequence number, k ← k+1; collect the accelerometer data for the acceleration along the axis perpendicular to the ground, sampling N times at equal time intervals within one sampling period to obtain the acceleration data set {a_{k,i}}, i = 1, …, N; take a photo of the ground with a downward-facing camera to obtain the pixel matrix M_k; collect the left and right wheel encoder data to obtain the left and right wheel rotation speeds v_{L,k} and v_{R,k}; collect the electronic compass data and GPS module data to obtain the observation vector z_k = [z_{e,k} z_{n,k} z_{θ,k}]', wherein z_{e,k} and z_{n,k} are the observed east and north coordinates acquired by the GPS module, and z_{θ,k} is the observed heading angle acquired by the electronic compass;
step three: perform terrain detection according to the terrain feature vector obtained in step one, and the acceleration data set and the ground photo pixel matrix obtained in step two, and judge whether the terrain has changed significantly;
step four: if the terrain is judged to have changed significantly, multiply the process noise envelope matrix of the instantaneous center of rotation by a set multiple in the next five sampling points; if the terrain has not changed significantly, keep the original envelope matrix;
step five: perform state prediction according to the posterior state estimation ellipsoid obtained in step one, the sampling interval and the vehicle body width, the left and right wheel rotation speeds obtained in step two, and the process noise envelope matrix adjusted in step four, to obtain the prior state estimation ellipsoid;
step six: perform a state update according to the envelope matrix of the observation noise obtained in step one, the observation vector obtained in step two and the prior state estimation ellipsoid obtained in step five, to obtain the posterior state estimation ellipsoid; and
step seven: repeat steps two to six, and output the posterior state estimation ellipsoid of the pose and the instantaneous center of rotation at each sampling point, wherein the center of the posterior state estimation ellipsoid is the estimate of the pose and the instantaneous center of rotation.
2. The joint estimation method for the pose and the instantaneous center of rotation of a skid-steer robot as claimed in claim 1, wherein the third step comprises the following substeps:
2.1) eliminating the direct current component of the acceleration data set: subtracting the mean value of all elements of the set from all elements of the acceleration data set, i.e.
Figure FDA0002153879280000021
Obtaining an acceleration data set from which a DC component is eliminated
Figure FDA0002153879280000022
2.2) extract the dominant color of the ground photo: randomly extract 50 pixels from the ground photo pixel matrix M_k and average them to obtain the red, green and blue components l_{R,k}, l_{G,k} and l_{B,k} of the terrain dominant color;
2.3) compute the terrain feature vector p_k, whose eight elements are defined by formulas that are given only as equation images in the source;
2.4) normalize the terrain feature vector; and
2.5) judge whether the terrain has changed: compute the terrain similarity distance between the current and previous terrain feature vectors (the formula is given only as an equation image in the source), wherein ω_i ∈ (0, 1] is the weight of each feature component; if the distance does not exceed the set threshold, the terrain is judged not to have changed; otherwise, the terrain is judged to have changed significantly.
3. The joint estimation method of pose and instant center of rotation of a skid steer robot as claimed in claim 2, wherein the set multiple in the fourth step is 10 times.
4. The joint estimation method for the pose and the instantaneous center of rotation of a skid-steer robot as claimed in claim 3, wherein the prior state estimation ellipsoid (x̂_{k,k-1}, P_{k,k-1}) is calculated by formulas that are given only as equation images in the source, including the state transition equation f(·), wherein the matrix A_k is the Jacobian matrix of the state transition equation f(·), tr(·) denotes the trace of a matrix, and A'_k is the transpose of the matrix A_k.
5. The joint estimation method for the pose and the instantaneous center of rotation of a skid-steer robot as claimed in claim 4, wherein the posterior state estimation ellipsoid (x̂_k, P_k) is obtained as follows:
5.1) compute the innovation ε_k (the formula is given only as an equation image in the source), wherein the matrix appearing in it is the observation matrix;
5.2) compute the innovation envelope matrix W_k (the formula is given only as an equation image in the source), wherein msvm(·) denotes the maximum singular value of a matrix and C' is the transpose of a matrix C;
5.3) compute the posterior state estimation ellipsoid (the formulas are given only as equation images in the source), wherein the health indicator function δ_k and the pre-envelope matrix of the posterior state estimation ellipsoid are likewise given only as equation images, and the matrix I_{6×6} is the 6-dimensional identity matrix; and
5.4) eliminate gross errors: if δ_k ≤ 0, indicating that a sensor is faulty, compute the posterior state estimation ellipsoid with its center given by the formula shown as an equation image in the source and with P_k = P_{k,k-1}.
CN201711229584.XA 2017-11-29 2017-11-29 posture and rotation instantaneous center joint estimation method for skid-steer robot Active CN108020855B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711229584.XA CN108020855B (en) 2017-11-29 2017-11-29 posture and rotation instantaneous center joint estimation method for skid-steer robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711229584.XA CN108020855B (en) 2017-11-29 2017-11-29 posture and rotation instantaneous center joint estimation method for skid-steer robot

Publications (2)

Publication Number Publication Date
CN108020855A CN108020855A (en) 2018-05-11
CN108020855B true CN108020855B (en) 2020-01-31

Family

ID=62077453

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711229584.XA Active CN108020855B (en) 2017-11-29 2017-11-29 posture and rotation instantaneous center joint estimation method for skid-steer robot

Country Status (1)

Country Link
CN (1) CN108020855B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109085376B (en) * 2018-08-20 2020-09-18 东阳市维创工业产品设计有限公司 Target speed self-adaptive estimation method
CN110160527B (en) * 2019-05-06 2020-08-28 安徽红蝠智能科技有限公司 Mobile robot navigation method and device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0719818A (en) * 1993-06-30 1995-01-20 Kajima Corp Three-dimensional movement predicting device
KR20120092235A (en) * 2011-02-11 2012-08-21 고려대학교 산학협력단 Apparatus for controlling mobile robot and method of the same
CN103411621A (en) * 2013-08-09 2013-11-27 东南大学 Indoor-mobile-robot-oriented optical flow field vision/inertial navigation system (INS) combined navigation method
CN103995984A (en) * 2014-06-09 2014-08-20 武汉科技大学 Robot path planning method and device based on elliptic constrains
US9126338B2 (en) * 2012-08-16 2015-09-08 Hanwha Techwin Co., Ltd. Robot system and method for driving the same
CN107218939A (en) * 2017-06-04 2017-09-29 吕文君 A kind of mobile robot reckoning localization method based on Kinematic Decomposition

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0719818A (en) * 1993-06-30 1995-01-20 Kajima Corp Three-dimensional movement predicting device
KR20120092235A (en) * 2011-02-11 2012-08-21 고려대학교 산학협력단 Apparatus for controlling mobile robot and method of the same
US9126338B2 (en) * 2012-08-16 2015-09-08 Hanwha Techwin Co., Ltd. Robot system and method for driving the same
CN103411621A (en) * 2013-08-09 2013-11-27 东南大学 Indoor-mobile-robot-oriented optical flow field vision/inertial navigation system (INS) combined navigation method
CN103995984A (en) * 2014-06-09 2014-08-20 武汉科技大学 Robot path planning method and device based on elliptic constrains
CN107218939A (en) * 2017-06-04 2017-09-29 吕文君 A kind of mobile robot reckoning localization method based on Kinematic Decomposition

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"Terrain adaptive odometry for mobile skid-steer robots";M. Reinstein 等;《Proceedings of 2013 IEEE》;20131231;第4706–4711页 *
"基于视觉信息的工业机器人搬运系统研究";李洪涛 等;《机床与液压》;20150531;第43卷(第9期);第17-20页 *

Also Published As

Publication number Publication date
CN108020855A (en) 2018-05-11

Similar Documents

Publication Publication Date Title
CN105021184B (en) It is a kind of to be used for pose estimating system and method that vision under mobile platform warship navigation
CN108731670B (en) Inertial/visual odometer integrated navigation positioning method based on measurement model optimization
CN106780699B (en) Visual SLAM method based on SINS/GPS and odometer assistance
Baldwin et al. Complementary filter design on the Special Euclidean group SE (3)
CN108036789B (en) field robot track calculation method
CN108020855B (en) posture and rotation instantaneous center joint estimation method for skid-steer robot
Kang et al. Vins-vehicle: A tightly-coupled vehicle dynamics extension to visual-inertial state estimator
CN107991110B (en) A kind of caterpillar type robot sliding parameter detection method
CN113155129B (en) Holder attitude estimation method based on extended Kalman filtering
EP3566171A1 (en) Systems and methods for classifying road features
JP7173471B2 (en) 3D position estimation device and program
CN104406594B (en) The Measurement Algorithm of spacecrafts rendezvous spacecraft relative pose
CN108122255A (en) It is a kind of based on trapezoidal with circular combination terrestrial reference UAV position and orientation method of estimation
CN112710309B (en) Attitude heading parameter estimation method
CN108051004B (en) Instantaneous center of rotation estimation method for four-wheel robot
CN113223161A (en) Robust panoramic SLAM system and method based on IMU and wheel speed meter tight coupling
CN114234967A (en) Hexapod robot positioning method based on multi-sensor fusion
CN109443353B (en) Visual-inertial tight coupling combined navigation method based on fuzzy self-adaptive ICKF
CN115540860A (en) Multi-sensor fusion pose estimation algorithm
CN109211231A (en) A kind of shell Attitude estimation method based on Newton iteration method
CN107123128B (en) A kind of state of motion of vehicle estimation method guaranteeing accuracy
CN109443355B (en) Visual-inertial tight coupling combined navigation method based on self-adaptive Gaussian PF
CN107389092B (en) Gyro calibration method based on assistance of magnetic sensor
CN109211232A (en) A kind of shell Attitude estimation method based on least squares filtering
CN111145267B (en) 360-degree panoramic view multi-camera calibration method based on IMU assistance

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant