CN111890373A - Sensing and positioning method of vehicle-mounted mechanical arm - Google Patents

Sensing and positioning method of vehicle-mounted mechanical arm

Info

Publication number
CN111890373A
Authority
CN
China
Prior art keywords
positioning
mechanical arm
vector
inertial
vehicle
Prior art date
Legal status
Pending
Application number
CN202011047302.6A
Other languages
Chinese (zh)
Inventor
牛建伟
李默
马群
李青锋
Current Assignee
Changzhou Weishi Intelligent Iot Innovation Center Co ltd
Original Assignee
Changzhou Weishi Intelligent Iot Innovation Center Co ltd
Priority date
Filing date
Publication date
Application filed by Changzhou Weishi Intelligent Iot Innovation Center Co ltd
Priority to CN202011047302.6A
Publication of CN111890373A

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • B25J19/04Viewing devices
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/10Programme-controlled manipulators characterised by positioning means for manipulator elements
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Navigation (AREA)

Abstract

The invention relates to a sensing and positioning method for a vehicle-mounted mechanical arm, comprising the following steps. Step 1: obtain the visual positioning attitude vector and the visual positioning position vector of the mechanical arm. Step 2: obtain the inertial positioning attitude vector and the inertial positioning position vector of the mechanical arm, and use an adaptive fading extended Kalman filter (AFEKF) to fuse the visual positioning attitude vector, the visual positioning position vector, the inertial positioning attitude vector and the inertial positioning position vector into a joint state matrix X of the mechanical arm. Step 3: perform indoor positioning of the vehicle-mounted mechanical arm with ultra-wideband (UWB) to obtain the UWB measurement positioning vector. Step 4: use the SHFAF method to fuse the joint state matrix X with the UWB measurement positioning vector, and detect and correct abnormal values in the UWB measurement positioning vector to obtain an accurate state matrix. The invention effectively realizes positioning of the vehicle-mounted mechanical arm, improves positioning accuracy, and is safe and reliable.

Description

Sensing and positioning method of vehicle-mounted mechanical arm
Technical Field
The invention relates to a sensing and positioning method, and in particular to a sensing and positioning method for a vehicle-mounted mechanical arm.
Background
With the development of the robotics field, mobile robots have become increasingly common in recent years. For these robots, correctly perceiving the surrounding environment and knowing their own position is a prerequisite for normal operation. Conventional GPS positioning, however, is heavily constrained by the environment and is unreliable indoors.
Radar has traditionally been used to gather environmental information, but lidar is costly. More recently, cameras have been adopted for visual positioning; however, cameras are sensitive to illumination conditions, and image processing algorithms are too slow on large volumes of image data to meet real-time requirements. In contrast, inertial navigation systems (INS) based on inertial measurement units (IMU) run in real time and compensate for the camera's processing-speed limitations, but they accumulate error over time and are therefore unsuitable on their own for long-range indoor navigation and positioning.
Disclosure of Invention
The object of the invention is to overcome the deficiencies of the prior art and provide a sensing and positioning method for a vehicle-mounted mechanical arm that effectively positions the arm, improves positioning accuracy, and is safe and reliable.
According to the technical scheme provided by the invention, the sensing and positioning method of the vehicle-mounted mechanical arm comprises the following steps:
Step 1: use a vision device to perform indoor positioning of the vehicle-mounted mechanical arm, obtaining the visual positioning attitude vector and the visual positioning position vector of the mechanical arm.
Step 2: use an inertial device to perform indoor positioning of the vehicle-mounted mechanical arm, obtaining the inertial positioning attitude vector and the inertial positioning position vector of the mechanical arm; use the AFEKF to fuse the visual positioning attitude vector, the visual positioning position vector, the inertial positioning attitude vector and the inertial positioning position vector into a joint state matrix X of the mechanical arm.
Step 3: perform indoor positioning of the vehicle-mounted mechanical arm using ultra-wideband (UWB), obtaining the UWB measurement positioning vector.
Step 4: use the SHFAF method to fuse the joint state matrix X with the UWB measurement positioning vector, and perform abnormal-value detection and correction on the UWB measurement positioning vector, obtaining the accurate state matrix.
Step 1 comprises the following sub-steps:
Step 1.1: during indoor positioning, use a Kinect camera to acquire multiple consecutive color images and depth images in real time.
Step 1.2: extract common feature points from two consecutive color image frames using the SURF method, obtain the corresponding feature vectors, and derive from them the direction vector corresponding to the attitude change of the mechanical arm.
Step 1.3: use the absolute orientation method to compute the rotation matrix of the mechanical arm's orientation between the two consecutive frames, and obtain the visual positioning attitude vector of the mechanical arm from the rotation matrix and the direction vector; the visual positioning position vector is obtained from the displacement of the mechanical arm between the two consecutive depth image frames.
An inertial measurement unit (IMU) is mounted on the base of the vehicle-mounted mechanical arm, and the inertial positioning attitude vector and the inertial positioning position vector are obtained by the strapdown inertial navigation method.
The invention has the following advantages: visual positioning and IMU positioning are combined so that their strengths complement each other, improving the navigation accuracy of the inertial navigation system. At the same time, ultra-wideband (UWB) is used to compensate for the accumulated error of the IMU. Because UWB is susceptible to multipath effects and non-line-of-sight (NLOS) factors in complex indoor environments, a fuzzy adaptive filter (SHFAF) is adopted to handle time-varying noise in such environments, and outliers are detected and corrected, improving the positioning performance of the vehicle-mounted mechanical arm.
Drawings
FIG. 1 is a flow chart of the present invention.
FIG. 2 is a flow chart of the indoor positioning of the vehicle-mounted mechanical arm by the vision device.
Detailed Description
The invention is further illustrated by the following specific figures and examples.
As shown in fig. 1, in order to effectively position the vehicle-mounted mechanical arm and improve the positioning accuracy, the sensing and positioning method of the present invention includes the following steps:
Step 1: use a vision device to perform indoor positioning of the vehicle-mounted mechanical arm, obtaining the visual positioning attitude vector and the visual positioning position vector of the mechanical arm.
Specifically, because the base of the mechanical arm cannot move by itself, the arm is fixed to a trolley and moves together with it; it is therefore called a vehicle-mounted mechanical arm and needs to be positioned. When a vision device is used for indoor positioning, a Kinect camera acquires multiple consecutive frames of color and depth images in real time, and the SURF (Speeded-Up Robust Features) method is used for feature matching between two consecutive color frames to extract common feature points and the corresponding feature vectors. The three-dimensional coordinates of two matched common feature points in adjacent frames are subtracted and multiplied by the rotation matrix computed with the absolute orientation method to obtain the current visual positioning position vector. The direction vector of the arm's attitude change in the model coordinate system is obtained from the change of the feature vectors; the absolute orientation method gives the rotation matrix of the arm between the two consecutive color frames; multiplying the direction vector in the model coordinate system by this rotation matrix gives the direction vector in the target coordinate system, i.e. the attitude-change information of the arm, which is combined with the previous attitude vector to obtain the new visual positioning attitude vector.
In this example, the target coordinate system, i.e. the absolute coordinate system, has its origin fixed in the current working region, with the x-axis pointing horizontally to the right, the y-axis pointing forward, and the z-axis pointing vertically upward; all coordinates below are expressed relative to this origin.
The model coordinate system, i.e. the relative coordinate system, is the coordinate system of the mechanical arm itself. Directly acquired coordinates are coordinates in the model coordinate system and must be converted uniformly into the target coordinate system before calculation. The absolute orientation method is the process of converting coordinates from the model coordinate system into the target coordinate system.
In a specific implementation, to avoid occlusion of the field of view, the Kinect camera is fixed at a position above the gripper of the mechanical arm where it is not blocked by the arm itself. Let the pixel coordinates of a feature point observed by the Kinect camera be (u, v); after conventional distortion correction, the three-dimensional coordinates of the point in the model coordinate system are obtained from the depth value through the pinhole camera model and the camera's intrinsic parameters, as sketched below.
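The patent does not give the back-projection formula explicitly; the following is a minimal sketch of the standard pinhole back-projection, assuming hypothetical intrinsic parameters fx, fy, cx, cy and a depth map registered to the color image (the numeric values are illustrative, not from the patent).

```python
import numpy as np

def backproject(u, v, depth, fx, fy, cx, cy):
    """Standard pinhole back-projection of pixel (u, v) with depth (metres)
    into a 3D point in the camera (model) coordinate system."""
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])

# Example with assumed intrinsics
fx, fy, cx, cy = 525.0, 525.0, 319.5, 239.5
p_model = backproject(200, 150, 1.8, fx, fy, cx, cy)
```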
The SURF algorithm is an accelerated variant of the classical SIFT algorithm and is a common choice for feature matching. SURF is used to extract common feature points and feature vectors from the pixel data of consecutive color frames; the pixel coordinates of the feature points are extracted from each color image, and their three-dimensional coordinates are then obtained by the conventional means described above.
After feature points are extracted from the color images with SURF, many mismatched points remain because of noise, illumination and other external conditions, so the accuracy must be improved further. The RANSAC algorithm is used to eliminate the mismatched points, leaving matched feature points that are less affected by external conditions; a sketch of this matching-and-filtering step follows.
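Below is a minimal sketch of SURF matching followed by RANSAC-based mismatch rejection, assuming an OpenCV build that includes the contrib xfeatures2d module (SURF is not present in all builds); the Hessian threshold, ratio-test value and the choice of a fundamental-matrix model for RANSAC are illustrative assumptions, not values from the patent.

```python
import cv2
import numpy as np

def match_features(img1, img2):
    """SURF keypoints + descriptor matching, with RANSAC rejection of mismatches."""
    g1 = cv2.cvtColor(img1, cv2.COLOR_BGR2GRAY)
    g2 = cv2.cvtColor(img2, cv2.COLOR_BGR2GRAY)
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)  # requires opencv-contrib
    kp1, des1 = surf.detectAndCompute(g1, None)
    kp2, des2 = surf.detectAndCompute(g2, None)

    matcher = cv2.BFMatcher(cv2.NORM_L2)
    knn = matcher.knnMatch(des1, des2, k=2)
    good = [m for m, n in knn if m.distance < 0.7 * n.distance]  # Lowe ratio test

    pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

    # RANSAC here fits a fundamental matrix and keeps only inlier matches;
    # any robust geometric model serves the same mismatch-removal purpose.
    _, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 3.0, 0.99)
    inliers = mask.ravel().astype(bool)
    return pts1[inliers], pts2[inliers]
```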
Because the Kinect's depth sensing relies on an infrared camera, the effective infrared range can be affected by ambient conditions; if the infrared intensity is too high, the sensor saturates and the positioning accuracy is reduced.
The absolute orientation method proceeds in three steps: rotating the model coordinate system relative to the target coordinate system, translating the model coordinate system relative to the target coordinate system, and determining the model scale factor.
After the SURF and RANSAC steps, the three-dimensional coordinates of the feature points captured by the Kinect camera at the first position are denoted data1, and the three-dimensional coordinates of the feature points obtained at the second position are denoted data2. data1 and data2 are then sorted and averaged using a bubble-sort procedure, the two results are subtracted, the difference is multiplied by the rotation matrix computed with the absolute orientation method, and the previous position vector is added to obtain the current visual positioning position vector. Subtracting the feature vectors corresponding to data2 and data1 gives the direction vector in the model coordinate system; the absolute orientation method gives the rotation matrix of the mechanical arm between the two consecutive color frames; multiplying the direction vector in the model coordinate system by this rotation matrix gives the direction vector in the target coordinate system, i.e. the attitude-change information of the arm, which is combined with the previous attitude vector to obtain the current visual positioning attitude vector of the arm. Its components are the angles between the arm's attitude and the X, Y and Z axes of the absolute coordinate system.
The mechanical arm is initially located at the coordinate origin. When it moves from the first and second positions on to a third position, the newly obtained feature points become the new data2 and the feature points obtained at the second position become the new data1. After this data update, the relative motion parameters of the arm from the second to the third position are computed from the new data1 and data2, and the visual positioning attitude vector and the visual positioning position vector of the arm are obtained by combining them with the previous position vector. The visual positioning process is shown in FIG. 2; a sketch of the absolute orientation (rotation estimation) step is given below.
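The patent invokes the absolute orientation method without listing its equations; the following is a minimal sketch of one standard way to recover the rotation (and translation) between two matched 3D point sets, using the SVD-based Kabsch/Horn solution. The function name and the omission of the scale-factor step are simplifications of my own, not the patent's formulation.

```python
import numpy as np

def absolute_orientation(pts_model, pts_target):
    """Estimate rotation R and translation t such that pts_target ≈ R @ pts_model + t.
    pts_model, pts_target: (N, 3) arrays of matched 3D feature points."""
    cm = pts_model.mean(axis=0)
    ct = pts_target.mean(axis=0)
    H = (pts_model - cm).T @ (pts_target - ct)     # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = ct - R @ cm
    return R, t
```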
Step 2: use an inertial device to perform indoor positioning of the vehicle-mounted mechanical arm, obtaining the inertial positioning attitude vector and the inertial positioning position vector of the mechanical arm; use the AFEKF to fuse the visual positioning attitude vector, the visual positioning position vector, the inertial positioning attitude vector and the inertial positioning position vector into the joint state matrix X of the mechanical arm.
Specifically, an inertial measurement unit (IMU) is normally mounted at the centre of gravity of the measured object, so the IMU is mounted on the base of the mechanical arm. The core components of the IMU are a gyroscope and an accelerometer: the gyroscope measures the angular velocity of the arm's motion and the accelerometer measures its acceleration; the velocity is obtained from the acceleration using basic kinematic relations. The strapdown inertial navigation algorithm then yields the inertial positioning attitude vector and the inertial positioning position vector of the mechanical arm.
The basic idea of the strapdown inertial navigation method is to take the navigation parameters (attitude, velocity and position) at the previous instant as initial values and, using the IMU outputs (angular velocity and acceleration), compute the current navigation parameters of the mechanical arm by recursive integration, i.e. the inertial positioning attitude vector and the inertial positioning position vector. The current navigation parameters then serve as the known values for the next solution; a sketch of one propagation step is given below.
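The patent relies on the standard strapdown mechanisation without listing its equations; below is a minimal sketch of one recursive integration step under simplifying assumptions (rotation-matrix attitude, constant gravity, no Earth-rate or Coriolis terms), with all variable names my own.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def strapdown_step(R, v, p, omega, acc, dt, g=np.array([0.0, 0.0, -9.81])):
    """One strapdown INS step.
    R: 3x3 body-to-navigation rotation, v: velocity, p: position (navigation frame),
    omega: gyro angular rate (rad/s), acc: accelerometer specific force (m/s^2)."""
    R_new = R @ Rotation.from_rotvec(omega * dt).as_matrix()   # attitude update
    a_nav = R_new @ acc + g                                    # specific force -> nav acceleration
    v_new = v + a_nav * dt                                     # velocity integration
    p_new = p + v * dt + 0.5 * a_nav * dt**2                   # position integration
    return R_new, v_new, p_new
```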
Because a vision sensor is limited by lighting conditions, it cannot work in dark environments, and it also accumulates error during long-term operation; the IMU likewise has systematic and random errors. To obtain more accurate results, the position and attitude vectors output by the Kinect camera and the IMU are fused with an adaptive fading extended Kalman filter (AFEKF); in other words, a distribution with lower uncertainty is obtained from the high-uncertainty distributions caused by these errors. The AFEKF is widely used in sensor data fusion, is well known to those skilled in the art, and is not described further here.
At this point the AFEKF yields the fused result of the Kinect camera and the IMU, the joint state matrix X, which contains the position vector and the attitude vector obtained by fusing the two sensors; X is the matrix formed by combining these two vectors. This result is processed further in the subsequent steps; a sketch of the fading-factor idea behind the AFEKF is given below.
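The AFEKF itself is only referenced in the patent; the sketch below illustrates the adaptive fading idea in a simplified, generic form: a single fading factor, computed from the ratio between the observed and the predicted innovation statistics, inflates the predicted covariance so the filter discounts stale information. The scalar fading factor, the linear measurement model and all names are assumptions of my own, not the patent's formulation.

```python
import numpy as np

def afekf_update(x_pred, P_pred, z, H, R):
    """Simplified adaptive-fading Kalman update (linear measurement model assumed)."""
    innov = z - H @ x_pred
    S_pred = H @ P_pred @ H.T + R
    # Fading factor: inflate P_pred when the actual innovation is larger than predicted.
    lam = max(1.0, float(innov.T @ innov) / np.trace(S_pred))
    P_faded = lam * P_pred

    S = H @ P_faded @ H.T + R
    K = P_faded @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ innov
    P_new = (np.eye(len(x_pred)) - K @ H) @ P_faded
    return x_new, P_new
```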
Step 3: perform indoor positioning of the vehicle-mounted mechanical arm using UWB, obtaining the UWB measurement positioning vector.
In this example a new position and velocity vector is obtained by UWB measurement. In the UWB measurement system, a tag fixed on the mechanical arm emits pulse signals, and anchor points are base stations that receive the signals emitted by the tag; at least three anchor points are required. The position of the UWB tag is estimated by the least-squares method, with the following calculation.
setting the position vector of the UWB anchor point as
Figure 227373DEST_PATH_IMAGE043
Wherein
Figure 336143DEST_PATH_IMAGE044
Indicating the serial number of the anchor point, N being the total number of anchor points, which are the UWB measuring base stations, and therefore the position vector of the anchor points
Figure 790258DEST_PATH_IMAGE045
Is fixed and needs to be given.
Figure 100017DEST_PATH_IMAGE046
The position vector representing the tag is obtained by UWB measurement, which is the result obtained by equation (3-6). The position vector of the UWB tag at time k is represented as
Figure 216746DEST_PATH_IMAGE047
Then the distance from the UWB tag to the nth anchor point at time k
Figure 863628DEST_PATH_IMAGE048
Is calculated by the formula
Figure 703408DEST_PATH_IMAGE049
(3-1)
Squaring equation (3-1) for each anchor point gives

$$d_{n,k}^2 = \|p_n\|^2 - 2\,p_n \cdot p_k + \|p_k\|^2, \qquad n = 1, \dots, N$$   (3-2)

The matrix $G$ and the vector $b$ are defined as

$$G = 2\begin{bmatrix} x_2 - x_1 & y_2 - y_1 & z_2 - z_1 \\ \vdots & \vdots & \vdots \\ x_N - x_1 & y_N - y_1 & z_N - z_1 \end{bmatrix}$$   (3-3)

$$b = \begin{bmatrix} \|p_2\|^2 - \|p_1\|^2 + d_{1,k}^2 - d_{2,k}^2 \\ \vdots \\ \|p_N\|^2 - \|p_1\|^2 + d_{1,k}^2 - d_{N,k}^2 \end{bmatrix}$$   (3-4)

Subtracting the first line of (3-2) from each of lines $2, \dots, N$ eliminates the quadratic term $\|p_k\|^2$ and, combined with (3-3) and (3-4), yields the linear system

$$G\,p_k = b$$   (3-5)

Solving this system in the least-squares sense gives the UWB tag position vector

$$p_k = (G^{\mathsf T} G)^{-1} G^{\mathsf T} b$$   (3-6)
Let $v_k$ be the velocity vector of the tag and $t_k$ the time of the $k$-th instant; then, from basic kinematics, the velocity vector of the tag at time $k$ is

$$v_k = \frac{p_k - p_{k-1}}{t_k - t_{k-1}}$$   (3-7)
Let the position vector and the velocity vector of the mechanical arm obtained by IMU indoor positioning at time $k$ be $p_{I,k}$ and $v_{I,k}$, respectively; the UWB measurement result $Z_k$ is then formed from the differences between the UWB-derived and the IMU-derived position and velocity,

$$Z_k = \begin{bmatrix} p_k - p_{I,k} \\ v_k - v_{I,k} \end{bmatrix}$$   (3-8)
This gives the UWB measurement positioning vector $Z_k$, on which abnormal-value correction is performed in step 4. A sketch of the least-squares positioning and velocity computation follows.
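A minimal sketch of the least-squares tag positioning of equations (3-3)–(3-7) is given below; the anchor layout and ranges are illustrative assumptions, not values from the patent.

```python
import numpy as np

def uwb_position(anchors, d):
    """Least-squares UWB tag position from anchor coordinates and measured ranges.
    anchors: (N, 3) array of anchor positions; d: (N,) array of tag-anchor distances."""
    p1, d1 = anchors[0], d[0]
    G = 2.0 * (anchors[1:] - p1)                                  # equation (3-3)
    b = (np.sum(anchors[1:]**2, axis=1) - np.sum(p1**2)
         + d1**2 - d[1:]**2)                                      # equation (3-4)
    p, *_ = np.linalg.lstsq(G, b, rcond=None)                     # equations (3-5), (3-6)
    return p

def uwb_velocity(p_k, p_km1, t_k, t_km1):
    """Tag velocity by finite differences, equation (3-7)."""
    return (p_k - p_km1) / (t_k - t_km1)

# Illustrative example with an assumed, non-coplanar anchor layout
anchors = np.array([[0.0, 0.0, 2.5], [6.0, 0.0, 2.0], [6.0, 4.0, 2.8], [0.0, 4.0, 1.5]])
true_p = np.array([2.0, 1.5, 1.0])
d = np.linalg.norm(anchors - true_p, axis=1)
print(uwb_position(anchors, d))   # recovers approximately [2.0, 1.5, 1.0]
```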
Step 4: use the SHFAF method to fuse the joint state matrix X with the UWB measurement positioning vector $Z_k$, and perform abnormal-value detection and correction on $Z_k$, obtaining the accurate state matrix.
Because indoor environments are usually complex (dynamic, with occluding objects, etc.), UWB measurements are typically affected by multipath effects and NLOS factors, which can introduce outliers, degrade performance and even cause filter divergence. Using the SHFAF therefore improves estimation accuracy and robustness at low computational complexity. The SHFAF method fuses the joint state matrix X of the mechanical arm with the UWB measurement positioning vector $Z_k$: it applies correction and compensation to the joint state matrix X and corrects the outliers in $Z_k$, yielding more accurate position and attitude vectors.
The SHFAF (fuzzy adaptive filter) is used to handle the abnormal UWB measurements caused by time-varying noise in complex indoor environments. It consists of outlier detection and correction, an adaptive estimator, and an error-state Kalman filter. The outlier detection and correction module identifies outliers and reduces their dispersion by pre-processing the raw measurements; the adaptive estimator estimates the error accurately over time; the corrected measurement and its estimated noise covariance are then the inputs of the error-state Kalman filter. Outlier detection and correction uses the joint state matrix X to judge whether a UWB measurement is abnormal and to correct it, and the adaptive estimator estimates the error of the result using the covariance matrix and the error state matrix described below. The details of the SHFAF follow the prior art, are well known to those skilled in the art, and are not repeated here.
The joint state matrix of the mechanical arm obtained in step 2 is X. The measured joint state matrix X contains errors, so it can be regarded as the composition of an accurate state matrix and an error state matrix. The transition equation of the error state is used to predict the accurate state matrix recursively and obtain the final result. The specific steps are as follows.
(1) Predicting the error state and the observation covariance matrix
Define $\Phi_{k-1}$ as the error state transition matrix at time $k-1$, used to predict the error state at time $k$ (equation (4-1)). It is built from the third-order identity matrix $I_3$, the time interval $\Delta t$, the acceleration $a_{k-1}$ measured by the IMU at time $k-1$ and its measurement error $\delta a$, the angular velocity $\omega_{k-1}$ measured by the IMU at time $k-1$ and its measurement error $\delta\omega$, and the rotation transformation matrix $R_{k-1}$ at time $k-1$ obtained by the absolute orientation method in step 1.
The error state at time $k$ is predicted by left-multiplying the error state at time $k-1$ by the error state transition matrix at time $k-1$:

$$\delta\hat{x}_{k|k-1} = \Phi_{k-1}\,\delta x_{k-1}$$   (4-2)
The noise driving matrix $\Gamma_{k-1}$ is a user-defined matrix: to compute the covariance matrix, two matrices of different dimensions must be added, so $\Gamma_{k-1}$ is defined (equation (4-3)) to make the matrix addition in the following equation well formed, where $I_3$ is the third-order identity matrix.
The process noise covariance matrix $Q_{k-1}$ accounts for the error caused by noise: since the measurements are affected by noise, its influence must be removed during prediction. $Q_{k-1}$ is built (equation (4-4)) from the accelerometer and gyroscope noise terms, where the acceleration values from time 1 to time $k$ and the angular velocity values from time 1 to time $k$ enter through a linear substitution.
Thus, the observation covariance matrix at time $k$ is obtained by propagating the covariance at time $k-1$ through the error state transition matrix and adding the noise contribution:

$$P_{k|k-1} = \Phi_{k-1} P_{k-1} \Phi_{k-1}^{\mathsf T} + \Gamma_{k-1} Q_{k-1} \Gamma_{k-1}^{\mathsf T}$$   (4-5)

The observation covariance matrix is an important quantity in computing the expected measurement and the Kalman gain; a sketch of this prediction step is given below.
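A minimal sketch of the prediction step of equations (4-2) and (4-5) follows; the matrices Phi, Gamma and Q are assumed to be supplied by the caller, since their detailed construction in equations (4-1), (4-3) and (4-4) is not reproduced in the patent text.

```python
import numpy as np

def predict_error_state(dx, P, Phi, Gamma, Q):
    """Error-state and covariance prediction, cf. equations (4-2) and (4-5)."""
    dx_pred = Phi @ dx                                  # (4-2)
    P_pred = Phi @ P @ Phi.T + Gamma @ Q @ Gamma.T      # (4-5)
    return dx_pred, P_pred
```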
(2) Abnormal value detection and correction
Ultra-wideband is subject to unknown and uncertain interference such as multipath effects and NLOS factors, in which case outliers are likely to occur. The adverse effect of such outliers on filter performance is difficult to eliminate, so to avoid large estimation errors and even filter divergence, the outliers must be detected and corrected.
The estimated error (innovation) of the UWB measurement at time $k$, denoted $\varepsilon_k$, is computed from the UWB measurement result $Z_k$ of step 3 and the predicted state through a measurement matrix $H$ defined for the matrix operations. $C_k$ combines the expectation of this error with the expectation at time $k-1$ (equation (4-6)), and its calculation takes into account the error state, the predicted covariance and the measurement noise (equation (4-7)).
The measurement noise covariance $R_k$ at time $k$ is calculated from the measurement error and the covariance at time $k-1$. The modified innovation contribution weight (MICW) is defined with a weighting coefficient; when this coefficient takes its limiting value, the MICW reduces to the conventional innovation contribution weight (ICW). The forgetting factor $b$ is typically between 0.95 and 0.99.
Using $C_k$ computed from equations (4-6) and (4-7), it is determined whether the measurement at time $k$ is an abnormal value. A test statistic is formed (equation (4-8)) and the anomaly coefficient is defined from it (equation (4-9)), where the sensitivity threshold for outlier identification is predetermined and must be obtained experimentally.
If the anomaly coefficient is not equal to 1, the measurement $Z_k$ is treated as an abnormal value, and the corrected new measurement (equation (4-10)) is fed back into the calculation above. If the measurement $Z_k$ is not an abnormal value, the calculation proceeds as follows.
(3) Computing the Kalman gain
With the error state predicted as above, the Kalman gain $K_k$ is the ratio between the observation covariance matrix and the predicted innovation covariance and is used to update the error state:

$$K_k = P_{k|k-1} H^{\mathsf T}\left(H P_{k|k-1} H^{\mathsf T} + R_k\right)^{-1}$$   (4-11)

where $P_{k|k-1}$ is the observation covariance matrix of equation (4-5) and the bracketed term is the innovation covariance after accounting for the measurement noise.
(4) Updating the error state
The estimated error state matrix $\delta\hat{x}_k$ is the product of the Kalman gain $K_k$ and the current UWB estimation error $\varepsilon_k$:

$$\delta\hat{x}_k = K_k\,\varepsilon_k$$   (4-12)
(5) Updating the accurate state and resetting the error state to zero
The accurate state is updated with the estimated error state, and the error state is then reset to 0 (equation (4-13)). This yields the most accurate joint state matrix of the vehicle-mounted mechanical arm at time $k$, and positioning at time $k$ is achieved; a sketch of the update stage, including the outlier gate, is given below.
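A minimal sketch of the measurement-update stage (equations (4-11)–(4-13)) with a simple innovation-based outlier gate follows. The gate uses a normalized (Mahalanobis) innovation test and shrinks suspicious innovations toward zero; this is an illustrative stand-in for the MICW-based detection and correction of equations (4-8)–(4-10), whose exact form is not reproduced in the patent text, and the gate threshold is an assumption.

```python
import numpy as np

def shfaf_update(dx_pred, P_pred, z, H, R, gate=9.0):
    """Error-state Kalman update with a simple outlier gate (illustrative only)."""
    innov = z - H @ dx_pred                           # UWB estimation error (innovation)
    S = H @ P_pred @ H.T + R                          # innovation covariance
    d2 = float(innov.T @ np.linalg.solve(S, innov))   # normalized innovation (Mahalanobis^2)
    if d2 > gate:                                     # likely outlier: shrink the innovation
        innov = innov * np.sqrt(gate / d2)            # crude correction toward the prediction

    K = P_pred @ H.T @ np.linalg.inv(S)               # Kalman gain, cf. (4-11)
    dx = dx_pred + K @ innov                          # error-state update, cf. (4-12)
    P = (np.eye(len(dx_pred)) - K @ H) @ P_pred
    return dx, P                                      # dx corrects the joint state, then is reset to 0, cf. (4-13)
```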

Claims (3)

1. A sensing and positioning method of a vehicle-mounted mechanical arm, characterized by comprising the following steps:
Step 1: use a vision device to perform indoor positioning of the vehicle-mounted mechanical arm, obtaining the visual positioning attitude vector and the visual positioning position vector of the mechanical arm;
Step 2: use an inertial device to perform indoor positioning of the vehicle-mounted mechanical arm, obtaining the inertial positioning attitude vector and the inertial positioning position vector of the mechanical arm; use the AFEKF to fuse the visual positioning attitude vector, the visual positioning position vector, the inertial positioning attitude vector and the inertial positioning position vector into a joint state matrix of the mechanical arm;
Step 3: perform indoor positioning of the vehicle-mounted mechanical arm using UWB, obtaining the UWB measurement positioning vector;
Step 4: use the SHFAF method to fuse the joint state matrix with the UWB measurement positioning vector, and perform abnormal-value detection and correction on the UWB measurement positioning vector, obtaining the accurate state matrix.
2. The sensing and positioning method of the vehicle-mounted mechanical arm according to claim 1, characterized in that step 1 comprises the following sub-steps:
Step 1.1: during indoor positioning, use a Kinect camera to acquire multiple consecutive color images and depth images in real time;
Step 1.2: extract common feature points from two consecutive color image frames using the SURF method, obtain the corresponding feature vectors, and derive from them the direction vector corresponding to the attitude change of the mechanical arm;
Step 1.3: use the absolute orientation method to compute the rotation matrix of the mechanical arm's orientation between the two consecutive frames, obtain the visual positioning attitude vector of the mechanical arm from the rotation matrix and the direction vector, and obtain the visual positioning position vector from the displacement of the mechanical arm between the two consecutive depth image frames.
3. The sensing and positioning method of the vehicle-mounted mechanical arm according to claim 1, characterized in that an inertial measurement unit (IMU) is mounted on the base of the vehicle-mounted mechanical arm, and the inertial positioning attitude vector and the inertial positioning position vector are obtained by the strapdown inertial navigation method.
CN202011047302.6A 2020-09-29 2020-09-29 Sensing and positioning method of vehicle-mounted mechanical arm Pending CN111890373A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011047302.6A CN111890373A (en) 2020-09-29 2020-09-29 Sensing and positioning method of vehicle-mounted mechanical arm

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011047302.6A CN111890373A (en) 2020-09-29 2020-09-29 Sensing and positioning method of vehicle-mounted mechanical arm

Publications (1)

Publication Number Publication Date
CN111890373A (en) 2020-11-06

Family

ID=73224015

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011047302.6A Pending CN111890373A (en) 2020-09-29 2020-09-29 Sensing and positioning method of vehicle-mounted mechanical arm

Country Status (1)

Country Link
CN (1) CN111890373A (en)



Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106584466A (en) * 2017-02-15 2017-04-26 湖南天特智能科技有限公司 System for automatically conveying mobile robots
CN107179080A (en) * 2017-06-07 2017-09-19 纳恩博(北京)科技有限公司 The localization method and device of electronic equipment, electronic equipment, electronic positioning system
US20200192362A1 (en) * 2018-12-12 2020-06-18 GM Global Technology Operations LLC System and method for assisting a vehicle to park in alignment with a wireless battery charging pad
CN109916407A (en) * 2019-02-03 2019-06-21 河南科技大学 Indoor mobile robot combined positioning method based on adaptive Kalman filter
CN110609311A (en) * 2019-10-10 2019-12-24 武汉理工大学 Intelligent vehicle positioning method based on fusion of vehicle-mounted panoramic image and millimeter wave radar
CN110926460A (en) * 2019-10-29 2020-03-27 广东工业大学 Uwb positioning abnormal value processing method based on IMU

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
JIANFENG LIU et al.: "An Approach to Robust INS/UWB Integrated Positioning for Autonomous Indoor Mobile Robots", Sensors *
WEN Xi: "Research on Indoor Positioning Technology Combining Kinect and Inertial Navigation System", China Master's Theses Full-text Database *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113124856A (en) * 2021-05-21 2021-07-16 天津大学 Visual inertia tight coupling odometer based on UWB online anchor point and metering method
CN113124856B (en) * 2021-05-21 2023-03-14 天津大学 Visual inertia tight coupling odometer based on UWB (ultra wide band) online anchor point and metering method
CN113729655A (en) * 2021-09-26 2021-12-03 重庆邮电大学 Method for separating received signals of UWB radar sensor
CN113729655B (en) * 2021-09-26 2024-03-08 重庆邮电大学 Method for separating UWB radar sensor receiving signals
WO2024002276A1 (en) * 2021-11-01 2024-01-04 华人运通(江苏)技术有限公司 Method and apparatus for determining script sequence, and electronic device and vehicle
WO2023168849A1 (en) * 2022-03-08 2023-09-14 江南大学 Mechanical arm motion capture method, medium, electronic device, and system
US11766784B1 (en) 2022-03-08 2023-09-26 Jiangnan University Motion capture method and system of robotic arm, medium, and electronic device

Similar Documents

Publication Publication Date Title
CN111595333B (en) Modularized unmanned vehicle positioning method and system based on visual inertia laser data fusion
CN111890373A (en) Sensing and positioning method of vehicle-mounted mechanical arm
CN111795686B (en) Mobile robot positioning and mapping method
CN110009681B (en) IMU (inertial measurement unit) assistance-based monocular vision odometer pose processing method
CN111210477B (en) Method and system for positioning moving object
CN107941217B (en) Robot positioning method, electronic equipment, storage medium and device
JP6534664B2 (en) Method for camera motion estimation and correction
CN112233177B (en) Unmanned aerial vehicle pose estimation method and system
CN111156984A (en) Monocular vision inertia SLAM method oriented to dynamic scene
CN107909614B (en) Positioning method of inspection robot in GPS failure environment
CN112815939B (en) Pose estimation method of mobile robot and computer readable storage medium
CN104704384A (en) Image processing method, particularly used in a vision-based localization of a device
CN103020952A (en) Information processing apparatus and information processing method
CN114755662B (en) Road-vehicle fusion perception laser radar and GPS calibration method and device
Zhang et al. Vision-aided localization for ground robots
CN112388635B (en) Method, system and device for fusing sensing and space positioning of multiple sensors of robot
CN115371665B (en) Mobile robot positioning method based on depth camera and inertial fusion
CN112179373A (en) Measuring method of visual odometer and visual odometer
CN114964276A (en) Dynamic vision SLAM method fusing inertial navigation
CN113899364A (en) Positioning method and device, equipment and storage medium
CN110515088B (en) Odometer estimation method and system for intelligent robot
CN113362377B (en) VO weighted optimization method based on monocular camera
CN112762929B (en) Intelligent navigation method, device and equipment
CN112731503A (en) Pose estimation method and system based on front-end tight coupling
CN111862146B (en) Target object positioning method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20201106)