CN111890373A - Sensing and positioning method of vehicle-mounted mechanical arm - Google Patents
- Publication number
- CN111890373A CN111890373A CN202011047302.6A CN202011047302A CN111890373A CN 111890373 A CN111890373 A CN 111890373A CN 202011047302 A CN202011047302 A CN 202011047302A CN 111890373 A CN111890373 A CN 111890373A
- Authority
- CN
- China
- Prior art keywords
- positioning
- mechanical arm
- vector
- inertial
- vehicle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/04—Viewing devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/10—Programme-controlled manipulators characterised by positioning means for manipulator elements
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Navigation (AREA)
Abstract
The invention relates to a sensing and positioning method of a vehicle-mounted mechanical arm, which comprises the following steps. Step 1: obtain the visual positioning attitude vector and visual positioning position vector of the mechanical arm. Step 2: obtain the inertial positioning attitude vector and inertial positioning position vector of the mechanical arm, and fuse the visual and inertial attitude and position vectors with an adaptive fading extended Kalman filter (AFEKF) to obtain the joint state matrix of the mechanical arm. Step 3: perform indoor positioning of the vehicle-mounted mechanical arm with ultra-wideband (UWB) to obtain the UWB measurement positioning vector. Step 4: fuse the joint state matrix with the UWB measurement positioning vector by the SHFAF method, detecting and correcting abnormal values in the UWB measurement positioning vector, to obtain an accurate state matrix. The invention effectively realizes positioning of the vehicle-mounted mechanical arm, improves positioning accuracy, and is safe and reliable.
Description
Technical Field
The invention relates to a perception positioning method, in particular to a perception positioning method of a vehicle-mounted mechanical arm.
Background
With the development of robotics, mobile robots have become increasingly common in recent years. For such robots, correctly perceiving the surrounding environment and their own position is a prerequisite for normal operation. Conventional GPS positioning, however, is heavily constrained by the environment.
In the past, radar was commonly used to gather environmental information, but the cost of lidar is high. In recent years, cameras have been adopted for visual positioning, but they are susceptible to illumination conditions, and image-processing algorithms handle large volumes of image data too slowly to meet real-time requirements. By contrast, an inertial navigation system (INS) based on an inertial measurement unit (IMU) both meets real-time requirements and compensates for the camera's processing-speed problem, but it suffers from accumulated error and is therefore unsuitable on its own for long-range indoor navigation and positioning.
Disclosure of Invention
The invention aims to overcome the defects in the prior art and provide a sensing and positioning method of a vehicle-mounted mechanical arm that effectively realizes positioning of the vehicle-mounted mechanical arm, improves positioning accuracy, and is safe and reliable.
According to the technical scheme provided by the invention, the sensing and positioning method of the vehicle-mounted mechanical arm comprises the following steps:
step 1, performing indoor positioning of the vehicle-mounted mechanical arm with a vision device to obtain the visual positioning attitude vector and visual positioning position vector of the mechanical arm;
step 2, performing indoor positioning of the vehicle-mounted mechanical arm with an inertial device to obtain the inertial positioning attitude vector and inertial positioning position vector of the mechanical arm, and fusing the visual and inertial attitude and position vectors with an adaptive fading extended Kalman filter (AFEKF) to obtain the joint state matrix of the mechanical arm;
step 3, performing indoor positioning of the vehicle-mounted mechanical arm with ultra-wideband (UWB) to obtain the UWB measurement positioning vector;
step 4, fusing the joint state matrix with the UWB measurement positioning vector by the SHFAF method, and detecting and correcting abnormal values in the UWB measurement positioning vector, to obtain the accurate state matrix.
Step 1 comprises the following sub-steps:
step 1.1, during indoor positioning, acquiring multiple consecutive color images and depth images in real time with a Kinect camera;
step 1.2, extracting common feature points from two consecutive color frames with the SURF method, obtaining the feature vectors, and deriving from them the direction vector corresponding to the attitude change of the mechanical arm;
step 1.3, calculating the rotation matrix of the mechanical arm's orientation across the two consecutive frames with the absolute orientation method, and obtaining the visual positioning attitude vector of the mechanical arm from the rotation matrix and the direction vector; the visual positioning position vector is obtained from the displacement of the mechanical arm between the two consecutive depth frames.
An inertial measurement unit (IMU) is mounted on the base of the vehicle-mounted mechanical arm, and the inertial positioning attitude vector and inertial positioning position vector are obtained by the strapdown inertial navigation method.
The invention has the following advantages: visual positioning and IMU positioning are combined so that their strengths complement each other, improving the accuracy of the inertial navigation system. Meanwhile, ultra-wideband (UWB) is employed to compensate for the accumulated error of the IMU. Because UWB is susceptible to multipath effects and non-line-of-sight (NLOS) factors in complex indoor environments, the fuzzy adaptive filter (SHFAF) is adopted to handle time-varying noise, and by detecting and correcting outliers, the positioning performance of the vehicle-mounted mechanical arm is improved.
Drawings
FIG. 1 is a flow chart of the present invention.
FIG. 2 is a flow chart of the present invention for indoor positioning of a vehicle-mounted robotic arm by a vision apparatus.
Detailed Description
The invention is further illustrated by the following specific figures and examples.
As shown in fig. 1, in order to effectively position the vehicle-mounted mechanical arm and improve the positioning accuracy, the sensing and positioning method of the present invention includes the following steps:
step 1, performing indoor positioning of the vehicle-mounted mechanical arm with a vision device to obtain the visual positioning attitude vector and visual positioning position vector of the mechanical arm.
Specifically, because the base of the mechanical arm is inconvenient to move, the arm is fixed to a trolley and moves together with it; it is therefore called a vehicle-mounted mechanical arm and needs to be positioned. For indoor positioning with the vision device, a Kinect camera acquires multiple consecutive color and depth frames in real time, and the SURF (Speeded-Up Robust Features) method performs feature matching on two consecutive color frames to extract the common feature points and corresponding feature vectors. The three-dimensional coordinates of two adjacent common feature points are subtracted and the difference is multiplied by the rotation matrix computed by the absolute orientation method to obtain the current visual positioning position vector. The direction vector of the attitude change of the mechanical arm in the model coordinate system is obtained from the change of the feature vectors; the absolute orientation method gives the rotation matrix of the mechanical arm between the two color frames, and multiplying the model-frame direction vector by this rotation matrix yields the direction vector in the target coordinate system, i.e., the attitude-change information of the mechanical arm. Combining it with the previous attitude vector gives the new visual positioning attitude vector.
In this example, the target coordinate system, i.e., the absolute coordinate system, has its origin at a fixed point in the current region, with the x-axis horizontal to the right, the y-axis horizontal forward, and the z-axis vertical upward; all coordinates below are relative to the origin of this coordinate system.
The model coordinate system, i.e., the relative coordinate system, is the coordinate system of the mechanical arm itself. Directly acquired coordinates are all in the model coordinate system and must be converted uniformly into target-coordinate-system coordinates for calculation. The absolute orientation method is the process of converting coordinates in the model coordinate system into coordinates in the target coordinate system.
In a specific implementation, to avoid blocking the field of view, the Kinect camera is fixed at a position above the gripper of the mechanical arm where the arm cannot occlude it. Let the pixel coordinates of a feature point obtained by the Kinect camera be (u, v); after distortion correction by conventional means, the three-dimensional coordinates (x, y, z) of the point in the model coordinate system are obtained through the pinhole camera model and the camera's intrinsic parameters.
The SURF algorithm is an accelerated version of the classical SIFT algorithm and a common choice for feature matching. It extracts the common feature points and feature vectors from the pixel data of consecutive color frames; the pixel coordinates of the feature points are extracted from each color image, and their three-dimensional coordinates are obtained by conventional means in this technical field.
After SURF extracts feature points from the color images, many mismatched points remain because of noise, illumination, and other external conditions, so accuracy must be improved further. The RANSAC algorithm is used to eliminate mismatched points among the feature points, yielding matched feature points that are less affected by external conditions.
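The RANSAC rejection step can be sketched with a small self-contained example (NumPy only; the point sets, inlier threshold, iteration count, and the translation-only motion model are illustrative assumptions, not values from the patent):

```python
import numpy as np

def ransac_translation(src, dst, n_iter=100, threshold=0.1, seed=None):
    """Estimate a translation src -> dst with RANSAC, rejecting mismatched pairs.

    src, dst: (N, 3) arrays of nominally matched 3-D feature points.
    Returns (translation, inlier_mask).
    """
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(src), dtype=bool)
    for _ in range(n_iter):
        i = rng.integers(len(src))            # minimal sample: one pair fixes a translation
        t = dst[i] - src[i]
        residuals = np.linalg.norm(dst - (src + t), axis=1)
        inliers = residuals < threshold
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # refine the translation on the consensus set
    t = (dst[best_inliers] - src[best_inliers]).mean(axis=0)
    return t, best_inliers

# synthetic correspondences: a known translation plus two gross mismatches
rng = np.random.default_rng(0)
src = rng.normal(size=(20, 3))
dst = src + np.array([1.0, -0.5, 0.2]) + rng.normal(scale=0.01, size=(20, 3))
dst[3] += 5.0                                 # simulated mismatched features
dst[7] -= 4.0
t_est, mask = ransac_translation(src, dst, seed=1)
```

The same consensus idea carries over to the rigid-motion case used in the patent; only the minimal sample and the model fitted per sample change.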
Because the depth camera is infrared-based, its effective range is affected by ambient conditions: excessive infrared radiation intensity can saturate the sensor and reduce positioning accuracy.
The absolute orientation method comprises three steps: rotating the model coordinate system relative to the target coordinate system; translating the model coordinate system relative to the target coordinate system; and determining the model scale factor.
After the SURF and RANSAC steps, the three-dimensional feature-point coordinates captured by the Kinect camera at the first position are recorded as data1, and those obtained at the second position as data2. data1 and data2 are then sorted (by bubble sort) and averaged; the difference of the two averages is multiplied by the rotation matrix computed by the absolute orientation method and combined with the previous position vector to obtain the current visual positioning position vector. Subtracting the feature vectors corresponding to data2 and data1 gives the direction vector in the model coordinate system; the absolute orientation method gives the rotation matrix of the mechanical arm between the two color frames, and multiplying the model-frame direction vector by this rotation matrix yields the direction vector in the target coordinate system, i.e., the attitude-change information of the mechanical arm. Combined with the previous attitude vector, this gives the current visual positioning attitude vector, whose components are the angles between the arm's attitude and the X, Y, and Z axes of the absolute coordinate system.
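The rotation matrix at the heart of the absolute orientation method can be computed, for example, by a singular-value-decomposition (Kabsch-style) solution over matched 3-D point sets. The patent does not name this particular algorithm, so the sketch below is one standard way of solving the same sub-problem:

```python
import numpy as np

def absolute_orientation_rotation(model_pts, target_pts):
    """Least-squares rotation aligning model-frame points to target-frame points.

    model_pts, target_pts: (N, 3) matched 3-D feature coordinates.
    Returns a proper rotation matrix R such that target ≈ R @ model (after centering).
    """
    mc = model_pts - model_pts.mean(axis=0)   # remove the translation component first
    tc = target_pts - target_pts.mean(axis=0)
    H = mc.T @ tc                             # cross-covariance of the two point sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against an improper reflection
    return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T

# check: recover a known 30-degree rotation about the z-axis
theta = np.deg2rad(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
model = np.random.default_rng(0).normal(size=(10, 3))
target = model @ R_true.T                     # rotate each point: q_i = R_true @ p_i
R_est = absolute_orientation_rotation(model, target)
```

Centering the point sets first separates the rotation from the translation, mirroring the rotate-then-translate decomposition of the absolute orientation method described above.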
The mechanical arm initially sits at the coordinate origin. As it moves through the first, second, and third positions, the feature points obtained at the third position become the new data2 and those obtained at the second position become the new data1. After this data update, the relative motion parameters of the mechanical arm from the second to the third position are computed from the new data1 and data2, and the visual positioning attitude vector and visual positioning position vector are computed by combining the previous position vector. The visual positioning process is shown in Fig. 2.
Step 2, performing indoor positioning of the vehicle-mounted mechanical arm with an inertial device to obtain the inertial positioning attitude vector and inertial positioning position vector of the mechanical arm, and fusing the visual and inertial attitude and position vectors with the AFEKF to obtain the joint state matrix of the mechanical arm.
Specifically, the inertial measurement unit (IMU) is typically mounted at the center of gravity of the measured object, so the IMU is mounted on the base of the mechanical arm. The core components of the IMU are a gyroscope and an accelerometer: the gyroscope measures the angular velocity of the arm's motion and the accelerometer measures its acceleration, from which velocity follows by the basic kinematic formulas. The inertial positioning attitude vector and inertial positioning position vector of the mechanical arm are then obtained through the strapdown inertial navigation algorithm.
The basic idea of strapdown inertial navigation is to take the navigation parameters (attitude, velocity, and position) at the previous time as initial values and, from the IMU outputs (angular velocity and acceleration), compute the current navigation parameters of the mechanical arm by recursive integration, i.e., the inertial positioning attitude vector and inertial positioning position vector. The current navigation parameters then serve as the known values for the next solution.
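The recursion described above can be sketched in a minimal planar dead-reckoning example (yaw only, constant sampling interval — simplifications assumed here for clarity, not stated in the patent):

```python
import numpy as np

def strapdown_step(pos, vel, yaw, accel_body, omega_z, dt):
    """One strapdown update: integrate angular rate, rotate body-frame
    acceleration into the navigation frame, then integrate twice.

    pos, vel: 2-D position and velocity in the navigation frame
    yaw: heading angle (rad); accel_body: 2-D body-frame acceleration
    omega_z: yaw rate (rad/s); dt: sampling interval (s)
    """
    yaw = yaw + omega_z * dt                  # attitude update from the gyroscope
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s], [s, c]])           # body frame -> navigation frame
    accel_nav = R @ accel_body
    vel = vel + accel_nav * dt                # velocity update from the accelerometer
    pos = pos + vel * dt                      # position update
    return pos, vel, yaw

# straight-line check: constant 1 m/s^2 forward acceleration, no rotation, 1 s at 100 Hz
pos, vel, yaw = np.zeros(2), np.zeros(2), 0.0
for _ in range(100):
    pos, vel, yaw = strapdown_step(pos, vel, yaw, np.array([1.0, 0.0]), 0.0, 0.01)
```

Each output of one step feeds the next as its initial value, which is exactly why IMU noise accumulates over time and motivates the UWB correction of steps 3 and 4.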
Since the vision sensor is limited by lighting conditions, it cannot operate in a dark environment, and it also accumulates error over long runs. The IMU likewise has systematic and random errors. For a more accurate result, the position and attitude vectors output by the Kinect camera and the IMU are fused with the adaptive fading extended Kalman filter (AFEKF); that is, from distributions with high uncertainty caused by these errors, a distribution with lower uncertainty is obtained. The AFEKF is widely used in sensor data fusion and is well known to those skilled in the art, so it is not described further here.
At this point the AFEKF fusion of the Kinect camera and the IMU yields the joint state matrix, comprising the fused position vector and attitude vector; the joint state matrix is the matrix formed by combining the position vector and the attitude vector. This result is processed in the subsequent steps.
Step 3, performing indoor positioning of the vehicle-mounted mechanical arm with ultra-wideband (UWB) to obtain the UWB measurement positioning vector.
In this example, a new position and velocity vector is obtained with the UWB measurement method. In the UWB measurement system, a tag (Tag) fixed on the mechanical arm emits pulse signals, and anchor points (Anchors) are base stations that receive the signals emitted by the tag; more than three anchor points are needed. The position of the UWB tag is estimated by the least-squares method, with the following calculation:
Let the position vector of the n-th UWB anchor be a_n = (x_n, y_n, z_n), where n denotes the serial number of the anchor and N is the total number of anchors; the anchors are the UWB measurement base stations, so their position vectors are fixed and must be given. Let p_k = (x_k, y_k, z_k) denote the position vector of the tag at time k, to be obtained from the UWB measurements. The distance from the UWB tag to the n-th anchor at time k is calculated as

d_n^k = ‖p_k − a_n‖ = √((x_k − x_n)² + (y_k − y_n)² + (z_k − z_n)²).      (3-1)

Squaring equation (3-1) for every anchor gives the set of quadratic equations (3-2). Subtracting the first line from the second through the N-th lines of (3-2) eliminates the quadratic terms x_k² + y_k² + z_k², and combining equations (3-3) and (3-4) yields the linear system (3-5):

2(a_n − a_1)ᵀ p_k = (d_1^k)² − (d_n^k)² + ‖a_n‖² − ‖a_1‖²,  n = 2, …, N, i.e., A p_k = b.

Solving this linear system in the least-squares sense gives the UWB tag position vector

p_k = (AᵀA)⁻¹ Aᵀ b.      (3-6)
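The least-squares solution above can be sketched as follows (anchor layout and tag position are synthetic test values, not from the patent):

```python
import numpy as np

def uwb_least_squares(anchors, distances):
    """Tag position from anchor positions and measured ranges, per (3-1)-(3-6).

    Subtracting the first squared-range equation from the others removes the
    quadratic terms, leaving the linear system A p = b solved by least squares.
    anchors: (N, 3) with N > 3; distances: (N,) measured tag-anchor ranges.
    """
    a0, d0 = anchors[0], distances[0]
    A = 2.0 * (anchors[1:] - a0)
    b = (d0**2 - distances[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(a0**2))
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p

# synthetic room: five anchors, noise-free ranges to a known tag position
anchors = np.array([[0.0, 0.0, 0.0],
                    [5.0, 0.0, 0.0],
                    [0.0, 5.0, 0.0],
                    [0.0, 0.0, 3.0],
                    [5.0, 5.0, 3.0]])
tag_true = np.array([2.0, 1.0, 1.5])
distances = np.linalg.norm(anchors - tag_true, axis=1)
tag_est = uwb_least_squares(anchors, distances)
```

With noisy ranges the same routine returns the least-squares estimate; the residual noise is what steps 4's outlier detection and filtering then have to absorb.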
Let v_k be the velocity vector of the tag and t_k the time of the k-th instant; then, by the basic principles of kinematics, the velocity vector of the tag at time k is

v_k = (p_k − p_{k−1}) / (t_k − t_{k−1}),      (3-7)

where p_k is the tag position vector at time k.
Let the position vector and velocity vector of the mechanical arm obtained by IMU indoor positioning at time k be denoted p_k^IMU and v_k^IMU, respectively. The UWB measurement result is then composed of the UWB tag position and velocity vectors together with these IMU-derived quantities, as in equation (3-8).
Thus far, the UWB measurement result has been obtained; its abnormal values are detected and corrected in the following step.
Step 4, fusing the joint state matrix with the UWB measurement positioning vector by the SHFAF method, and detecting and correcting abnormal values in the UWB measurement positioning vector, to obtain the accurate state matrix.
Since indoor environments are usually complex (dynamic, with occluding factors, etc.), UWB measurements are commonly affected by multipath effects and NLOS factors, which introduce outliers, degrade performance, and can even cause filter divergence. SHFAF improves estimation accuracy and robustness at low computational complexity. The SHFAF method fuses the joint state matrix of the mechanical arm with the UWB measurement positioning vector: the joint state matrix is corrected and compensated, and the outliers of the UWB measurement positioning vector are corrected, yielding more accurate position and attitude vectors.
The SHFAF (fuzzy adaptive filter) addresses the abnormal UWB measurement values caused by time-varying noise in complex indoor environments. It comprises outlier detection and correction, an adaptive estimator, and an error-state Kalman filter. The outlier detection and correction module identifies outliers and reduces their dispersion by pre-processing the raw measurements; the adaptive estimator estimates the error accurately over time; and the corrected measurement with its estimated noise covariance is the input to the error-state Kalman filter. Outlier detection and correction uses the joint state matrix to judge and correct abnormal values in the UWB measurement result, while the adaptive estimator performs error estimation using the covariance matrix and error state matrix described below. The details of SHFAF follow the prior art and are well known to those skilled in the art, so they are not repeated here.
The joint state matrix of the mechanical arm obtained in step 2 contains measurement error; it can therefore be viewed as the composition of an accurate state matrix and an error state matrix. A transition equation recursively predicts the accurate state matrix to obtain the final result. The specific steps are as follows:
(1) Predict the error state and the observation covariance matrix
Φ_{k−1} is defined as the error state transition matrix at time k−1, used to predict the error state at time k, and is built from the following quantities: I₃, the third-order identity matrix; Δt, the time interval; a_{k−1}, the acceleration measured by the inertial unit IMU at time k−1, with its measurement error; ω_{k−1}, the angular velocity measured by the IMU at time k−1, with its measurement error; and R_{k−1}, the rotation transformation matrix at time k−1 obtained by the absolute orientation method of step 1.

The error state prediction at time k left-multiplies the error state of the previous time by the error state transition matrix:

δx_k⁻ = Φ_{k−1} δx_{k−1}.      (4-1)
The noise driving matrix G is a user-defined matrix: to compute the covariance matrix, two matrices of different dimensions must be added, so G is defined to make the matrix addition in the following equation well-formed.
The process noise covariance matrix Q accounts for the error due to noise. Since the measurement results are affected by noise, its influence must be removed during prediction.
Here a linear substitution is used: the entries of Q are formed from the acceleration values from time 1 through time k and, analogously, from the angular velocity values from time 1 through time k.
Thus the observation covariance matrix at time k is obtained by propagating the covariance of the previous time through the error state transition matrix and adding the noise effect:

P_k⁻ = Φ_{k−1} P_{k−1} Φ_{k−1}ᵀ + G Q Gᵀ,      (4-5)

where G is the noise driving matrix and Q the process noise covariance matrix. The observation covariance matrix is a very important parameter in calculating the expected measurement and the Kalman gain.
(2) Abnormal value detection and correction
Ultra-wideband is subject to unknown and uncertain interference such as multipath effects and NLOS factors, under which outliers readily occur. Their adverse effect on filter performance is difficult to eliminate, so to avoid large estimation errors and even filter divergence, outliers must be detected and corrected.
The estimated error of the UWB measurement at time k is the innovation, i.e., the difference between the UWB measurement result of step 3 and the expected measurement, where an auxiliary matrix and its transpose are defined for the matrix operations. The expectation of this error, summed over the moments up to k, is calculated by equation (4-6); the error state, the predicted covariance, and the measurement noise are all taken into account, with the calculation given by equation (4-7).
The measurement noise covariance at time k is calculated from the measurement error and covariance at that time, weighted by the modified innovation contribution weight (MICW); when the modification is removed, the MICW reduces to the conventional ICW. The forgetting factor in the weight typically lies between 0.95 and 0.99.
The quantities calculated by equations (4-6) and (4-7) determine whether the measured value at time k is an abnormal value, via the test of equations (4-8) and (4-9), in which the sensitivity threshold for outlier identification is predetermined and must be obtained by experiment. Data corrected by equation (4-10) is re-injected into the calculation above; if the measured value is not abnormal, the following calculation proceeds directly.
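The test and correction above can be sketched as a normalized-innovation (squared Mahalanobis distance) check; the threshold value and the policy of substituting the prediction for a rejected measurement are illustrative assumptions, not values specified by the patent:

```python
import numpy as np

def detect_and_correct_outlier(z, z_pred, S, gamma=9.0):
    """Flag a UWB measurement as an outlier via its normalized innovation.

    z: measurement; z_pred: predicted measurement; S: innovation covariance.
    If the squared Mahalanobis distance of the innovation exceeds gamma, the
    measurement is replaced by the prediction (one simple correction policy).
    """
    innov = z - z_pred
    m2 = float(innov @ np.linalg.solve(S, innov))   # squared Mahalanobis distance
    if m2 > gamma:
        return z_pred.copy(), True                  # corrected value, outlier flag
    return z, False

S = np.diag([0.04, 0.04, 0.09])        # assumed innovation covariance (m^2)
z_pred = np.array([1.0, 2.0, 0.5])
z_ok, is_out_ok = detect_and_correct_outlier(np.array([1.05, 1.95, 0.55]), z_pred, S)
z_bad, is_out_bad = detect_and_correct_outlier(np.array([4.0, 2.0, 0.5]), z_pred, S)
```

Normalizing by the innovation covariance makes the threshold dimensionless, so one tuned value can serve measurements of different scales and noise levels.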
(3) Compute the Kalman gain
Through the preceding steps, the Kalman gain of the error state is computed; it is the ratio of the predicted covariance to the innovation covariance and is used to update the error state:

K_k = P_k⁻ Hᵀ (H P_k⁻ Hᵀ + R_k)⁻¹,      (4-11)

where P_k⁻ is the observation covariance matrix of equation (4-5), H the observation matrix, and R_k the measurement noise covariance.
(4) Update the error state
The estimated error state matrix is the product of the Kalman gain and the current UWB estimation error:

δx̂_k = K_k e_k,      (4-12)

where K_k is the Kalman gain and e_k the estimated error of the UWB measurement at time k.
(5) Update the accurate state and reset the error state to zero: the estimated error state is applied to the joint state to obtain the accurate state matrix, after which the error state is reset to zero.
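The error-state cycle of steps (1)-(5) can be sketched end-to-end on a toy two-state system (position error and velocity error, with the position error measured). The matrix values are illustrative, and the outlier test of step (2) is assumed to have been applied to the measurement beforehand:

```python
import numpy as np

def eskf_step(dx, P, Phi, GQG, z_err, H, R):
    """One error-state Kalman cycle covering steps (1) and (3)-(5) above.

    dx: error state; P: its covariance; Phi: error state transition matrix;
    GQG: process-noise term G Q G^T; z_err: measured error (UWB minus joint
    state), assumed already outlier-corrected per step (2); H: observation
    matrix; R: measurement noise covariance.
    Returns (estimated error state to correct the accurate state,
             reset error state, updated covariance).
    """
    # (1) predict the error state and the observation covariance
    dx = Phi @ dx
    P = Phi @ P @ Phi.T + GQG
    # (3) Kalman gain from the predicted and innovation covariances
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    # (4) update the error state with the measured error
    dx_est = dx + K @ (z_err - H @ dx)
    P = (np.eye(len(dx)) - K @ H) @ P
    # (5) the caller applies dx_est to the accurate state; the error state resets to zero
    return dx_est, np.zeros_like(dx), P

# toy two-state system: position error and velocity error, position error measured
dt = 0.1
Phi = np.array([[1.0, dt], [0.0, 1.0]])
GQG = np.diag([1e-4, 1e-4])
H = np.array([[1.0, 0.0]])
R = np.array([[0.01]])
dx, P = np.zeros(2), np.eye(2)
dx_est, dx, P = eskf_step(dx, P, Phi, GQG, np.array([0.5]), H, R)
```

Because the estimated error is folded into the accurate state and the error state is zeroed each cycle, the filter always linearizes about the current best estimate, which is the defining property of the error-state formulation.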
Claims (3)
1. A sensing and positioning method of a vehicle-mounted mechanical arm, characterized by comprising the following steps:
step 1, performing indoor positioning of the vehicle-mounted mechanical arm with a vision device to obtain a visual positioning attitude vector and a visual positioning position vector of the mechanical arm;
step 2, performing indoor positioning of the vehicle-mounted mechanical arm with an inertial device to obtain an inertial positioning attitude vector and an inertial positioning position vector of the mechanical arm, and fusing the visual and inertial attitude and position vectors with an adaptive fading extended Kalman filter (AFEKF) to obtain a joint state matrix of the mechanical arm;
step 3, performing indoor positioning of the vehicle-mounted mechanical arm with ultra-wideband (UWB) to obtain a UWB measurement positioning vector;
step 4, fusing the joint state matrix with the UWB measurement positioning vector by the SHFAF method, and detecting and correcting abnormal values in the UWB measurement positioning vector, to obtain an accurate state matrix.
2. The sensing and positioning method of the vehicle-mounted mechanical arm according to claim 1, characterized in that step 1 comprises the following sub-steps:
step 1.1, during indoor positioning, acquiring multiple consecutive color images and depth images in real time with a Kinect camera;
step 1.2, extracting common feature points from two consecutive color frames with the SURF method, obtaining feature vectors, and deriving from them a direction vector corresponding to the attitude change of the mechanical arm;
step 1.3, calculating the rotation matrix of the mechanical arm's orientation across the two consecutive frames with the absolute orientation method, and obtaining the visual positioning attitude vector of the mechanical arm from the rotation matrix and the direction vector; the visual positioning position vector is obtained from the displacement of the mechanical arm between the two consecutive depth frames.
3. The sensing and positioning method of the vehicle-mounted mechanical arm according to claim 1, characterized in that an inertial measurement unit (IMU) is mounted on the base of the vehicle-mounted mechanical arm, and the inertial positioning attitude vector and inertial positioning position vector are obtained by the strapdown inertial navigation method.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011047302.6A CN111890373A (en) | 2020-09-29 | 2020-09-29 | Sensing and positioning method of vehicle-mounted mechanical arm |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011047302.6A CN111890373A (en) | 2020-09-29 | 2020-09-29 | Sensing and positioning method of vehicle-mounted mechanical arm |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111890373A true CN111890373A (en) | 2020-11-06 |
Family
ID=73224015
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011047302.6A Pending CN111890373A (en) | 2020-09-29 | 2020-09-29 | Sensing and positioning method of vehicle-mounted mechanical arm |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111890373A (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113124856A (en) * | 2021-05-21 | 2021-07-16 | 天津大学 | Visual inertia tight coupling odometer based on UWB online anchor point and metering method |
CN113729655A (en) * | 2021-09-26 | 2021-12-03 | 重庆邮电大学 | Method for separating received signals of UWB radar sensor |
WO2023168849A1 (en) * | 2022-03-08 | 2023-09-14 | 江南大学 | Mechanical arm motion capture method, medium, electronic device, and system |
US11766784B1 (en) | 2022-03-08 | 2023-09-26 | Jiangnan University | Motion capture method and system of robotic arm, medium, and electronic device |
WO2024002276A1 (en) * | 2021-11-01 | 2024-01-04 | 华人运通(江苏)技术有限公司 | Method and apparatus for determining script sequence, and electronic device and vehicle |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106584466A (en) * | 2017-02-15 | 2017-04-26 | 湖南天特智能科技有限公司 | System for automatically conveying mobile robots |
CN107179080A (en) * | 2017-06-07 | 2017-09-19 | 纳恩博(北京)科技有限公司 | The localization method and device of electronic equipment, electronic equipment, electronic positioning system |
CN109916407A (en) * | 2019-02-03 | 2019-06-21 | 河南科技大学 | Indoor mobile robot combined positioning method based on adaptive Kalman filter |
CN110609311A (en) * | 2019-10-10 | 2019-12-24 | 武汉理工大学 | Intelligent vehicle positioning method based on fusion of vehicle-mounted panoramic image and millimeter wave radar |
CN110926460A (en) * | 2019-10-29 | 2020-03-27 | 广东工业大学 | Uwb positioning abnormal value processing method based on IMU |
US20200192362A1 (en) * | 2018-12-12 | 2020-06-18 | GM Global Technology Operations LLC | System and method for assisting a vehicle to park in alignment with a wireless battery charging pad |
- 2020-09-29: CN application CN202011047302.6A, publication CN111890373A/en, status active, Pending
Non-Patent Citations (2)
Title |
---|
JIANFENG LIU et al.: "An Approach to Robust INS/UWB Integrated Positioning for Autonomous Indoor Mobile Robots", *Sensors* * |
WEN XI: "Research on Indoor Positioning Technology Combining Kinect and an Inertial Navigation System", *China Master's Theses Full-text Database* * |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113124856A (en) * | 2021-05-21 | 2021-07-16 | Tianjin University | Visual-inertial tightly coupled odometer based on UWB online anchor points, and metering method |
CN113124856B (en) * | 2021-05-21 | 2023-03-14 | Tianjin University | Visual-inertial tightly coupled odometer based on UWB online anchor points, and metering method |
CN113729655A (en) * | 2021-09-26 | 2021-12-03 | Chongqing University of Posts and Telecommunications | Method for separating received signals of a UWB radar sensor |
CN113729655B (en) * | 2021-09-26 | 2024-03-08 | Chongqing University of Posts and Telecommunications | Method for separating received signals of a UWB radar sensor |
WO2024002276A1 (en) * | 2021-11-01 | 2024-01-04 | Human Horizons (Jiangsu) Technology Co., Ltd. | Method and apparatus for determining script sequence, and electronic device and vehicle |
WO2023168849A1 (en) * | 2022-03-08 | 2023-09-14 | 江南大学 | Mechanical arm motion capture method, medium, electronic device, and system |
US11766784B1 (en) | 2022-03-08 | 2023-09-26 | Jiangnan University | Motion capture method and system of robotic arm, medium, and electronic device |
Similar Documents
Publication | Title |
---|---|
CN111595333B (en) | Modularized unmanned vehicle positioning method and system based on visual inertia laser data fusion | |
CN111890373A (en) | Sensing and positioning method of vehicle-mounted mechanical arm | |
CN111795686B (en) | Mobile robot positioning and mapping method | |
CN110009681B (en) | IMU (inertial measurement unit) assistance-based monocular vision odometer pose processing method | |
CN111210477B (en) | Method and system for positioning moving object | |
CN107941217B (en) | Robot positioning method, electronic equipment, storage medium and device | |
JP6534664B2 (en) | Method for camera motion estimation and correction | |
CN112233177B (en) | Unmanned aerial vehicle pose estimation method and system | |
CN111156984A (en) | Monocular vision inertia SLAM method oriented to dynamic scene | |
CN107909614B (en) | Positioning method of inspection robot in GPS failure environment | |
CN112815939B (en) | Pose estimation method of mobile robot and computer readable storage medium | |
CN104704384A (en) | Image processing method, particularly used in a vision-based localization of a device | |
CN103020952A (en) | Information processing apparatus and information processing method | |
CN114755662B (en) | Road-vehicle fusion perception laser radar and GPS calibration method and device | |
Zhang et al. | Vision-aided localization for ground robots | |
CN112388635B (en) | Method, system and device for fusing sensing and space positioning of multiple sensors of robot | |
CN115371665B (en) | Mobile robot positioning method based on depth camera and inertial fusion | |
CN112179373A (en) | Measuring method of visual odometer and visual odometer | |
CN114964276A (en) | Dynamic vision SLAM method fusing inertial navigation | |
CN113899364A (en) | Positioning method and device, equipment and storage medium | |
CN110515088B (en) | Odometer estimation method and system for intelligent robot | |
CN113362377B (en) | VO weighted optimization method based on monocular camera | |
CN112762929B (en) | Intelligent navigation method, device and equipment | |
CN112731503A (en) | Pose estimation method and system based on front-end tight coupling | |
CN111862146B (en) | Target object positioning method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20201106 |