CN111681279A - Driving suspension arm space pose measurement method based on improved lie group nonlinear optimization - Google Patents

Driving suspension arm space pose measurement method based on improved lie group nonlinear optimization

Info

Publication number
CN111681279A
CN111681279A (application CN202010304889.8A)
Authority
CN
China
Prior art keywords
camera
image
iteration
coordinates
lie
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010304889.8A
Other languages
Chinese (zh)
Other versions
CN111681279B (en)
Inventor
牛丹
傅星瑜
李奇
陈夕松
李世华
杨俊
刘进波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Keyuan Intelligent Technology Group Co ltd
Nanjing Wenwang Automation Co ltd
Southeast University
Original Assignee
Nanjing Keyuan Intelligent Technology Group Co ltd
Nanjing Wenwang Automation Co ltd
Southeast University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Keyuan Intelligent Technology Group Co ltd, Nanjing Wenwang Automation Co ltd, Southeast University filed Critical Nanjing Keyuan Intelligent Technology Group Co ltd
Priority to CN202010304889.8A priority Critical patent/CN111681279B/en
Publication of CN111681279A publication Critical patent/CN111681279A/en
Application granted granted Critical
Publication of CN111681279B publication Critical patent/CN111681279B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/02Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/70Denoising; Smoothing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/80Geometric correction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/155Segmentation; Edge detection involving morphological operators
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/187Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20024Filtering details
    • G06T2207/20032Median filtering
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a method for measuring the spatial pose of a crane jib based on improved Lie group nonlinear optimization, comprising the following steps. S1: acquire four target points distributed in a rectangle with a camera, and extract the four barycentric coordinates of the infrared light spots in the acquired image; S2: calibrate the internal and external parameters of the acquisition camera, determine the imaging center and distortion coefficients, and correct the distortion of the image acquired in step S1; S3: convert the relative pose problem of the camera and the cooperative target into a minimized reprojection problem; S4: calculate an initial rotation matrix and translation of the camera's relative pose with the EPnP method; S5: perform nonlinear iteration with the Gauss-Newton method, finally converging to the true solution to obtain the rotation matrix and translation. The invention improves both the precision of the measurement result and the speed of detection and calculation; it supports rapid target detection and acquisition of spatial parameters, and improves the real-time performance and accuracy of the spatial angle measurement of the unmanned crane's jib.

Description

Driving suspension arm space pose measurement method based on improved lie group nonlinear optimization
Technical Field
The invention relates to unmanned driving control of an industrial system, in particular to a method for measuring the spatial pose of a driving suspension arm.
Background
The unmanned traveling-crane system is a complex system comprising several subsystems; the anti-swing control system is one of them, responsible for accurately controlling the crane jib during moving operations and ensuring that the crane runs smoothly. The premise of anti-swing control is acquiring real-time, accurate spatial pose parameters of the crane jib. At present, the difficulty in detecting the spatial angle parameters of the crane jib is that they must be acquired and calculated in real time at high speed and then sent to the crane controller, so efficient and rapid target detection and spatial parameter calculation are essential. For angle measurement, the techniques applied to unmanned cranes are mainly direct measurement with an inclinometer and indirect measurement with a visual target. Compared with the inclinometer method, visual measurement of the jib swing angle yields more comprehensive and intuitive data and facilitates the optimization and computation of the anti-swing control algorithm. However, during real-time high-speed operation, the spatial pose parameters of an industrial crane jib are difficult to measure accurately.
Disclosure of Invention
The purpose of the invention is as follows: to overcome the defects of the prior art, the invention provides a method for measuring the spatial pose of a crane jib based on improved Lie group nonlinear optimization. The method can rapidly calculate the spatial pose parameters of the crane jib and effectively solves the problem that the spatial pose parameters of an industrial crane jib are difficult to measure accurately in real time.
The technical scheme is as follows: a method for measuring the spatial pose of a crane jib based on improved lie group nonlinear optimization comprises the following steps:
S1: A camera captures the target points of a cooperative target composed of four laser points arranged in a rectangle, and the four barycentric coordinates of the infrared light spots are extracted from the acquired image;
S2: Calibrating the internal and external parameters of the acquisition camera, determining an imaging center and a distortion coefficient, and correcting the distortion of the image acquired in step S1;
s3: converting the relative pose problem of the camera and the cooperative target into a minimized re-projection problem;
S4: Calculating an initial rotation matrix and translation of the relative pose of the camera by using the EPnP method;
s5: and performing nonlinear iteration by using a Gauss-Newton method, and then finally converging to a real solution to obtain a rotation matrix and a translation amount.
Further, in step S1, the process of extracting the four barycentric coordinates of the infrared light spots in the acquired image comprises the following steps:
s11: performing mixed filtering operation on an original image, namely judging to adopt a method of combining Gaussian filtering and median filtering according to a set threshold value, and smoothing the image to eliminate noise;
s12: performing morphological binary open operation on the image by adopting an optimal threshold segmentation method, finding a connected domain and eliminating isolated noise points and burrs in the image background;
s13: and searching cursor connected domains in the image by adopting a gray scale gravity center method and solving gravity center coordinates one by one.
Further, the step S3 specifically includes the following steps:
S31: Converting the relative pose problem of the camera and the cooperative target into a nonlinear optimization iteration, namely, minimizing the error between the true image coordinates and the image coordinates obtained in step S1, as shown in the following formula:
ε^2 = Σ_i ||q'_i − f(M_2 P_i^w)||^2
where ε^2 is the error term, q'_i is the measured calibration-point coordinate obtained in step S1, P_i^w is the actual physical coordinate of the infrared light spot, and M_2 is the coordinate transformation matrix composed of the rotation matrix and the translation vector;
S32: Expressing the formulation of step S31 with the Lie group method, i.e. representing the rotation matrix and translation between the camera coordinates and the calibration-object coordinates by the Lie algebra: ξ is a 6-dimensional vector whose first 3 dimensions represent translation and last 3 dimensions represent rotation, so the 4th-order coordinate transformation matrix M_2 = exp(ξ^) is parameterized by the 6-dimensional ξ, and the expression becomes:
ε^2 = Σ_i ||q'_i − f(exp(ξ^) P_i^w)||^2
where ξ is the Lie algebra element and ^ broadly denotes the mapping "from vector to matrix".
Further, the step S4 specifically includes the following steps:
s41: representing three-dimensional control points of a cooperative target by a weighted sum of four virtual space coordinate points;
s42: calculating coordinates of the four virtual space points in a world coordinate system, converting the coordinates into a camera coordinate system by using a camera projection model, and calculating the coordinates in the camera coordinate system;
S43: According to the 3D-3D matching, calculate the corresponding rotation matrix R and translation t using ICP (Iterative Closest Point), and feed R and t as initial values into the Lie group algorithm for the minimization.
Further, the step S5 includes the following steps:
S51: Compute in advance the leading factor of the Jacobian matrix,
∂(exp(ξ^)P_i^w)/∂δξ = [ I  −(RP_i^w + t)^ ],
together with the initial quantity A used in all subsequent Jacobian evaluations, where A is formed with a Kronecker product, I in A is the identity matrix, R is the rotation matrix, and t is the translation vector;
S52: Take the derivative of the minimized reprojection error term with respect to the optimization variable, i.e. linearize the objective function at the k-th iteration, discard the second-order terms after the first-order Taylor expansion, and iterate the left-multiplied perturbation exp(Δξ^) of the Lie algebra to convergence, which improves the convergence speed;
S53: Substitute the linearized expression of step S52 into the nonlinear iterative error expression of step S32 and differentiate the result; when solving the Jacobian matrix, compute it in vectorized form with the Kronecker product, reusing the precomputed quantity from step S51 in every evaluation;
S54: Set the derivative expression obtained in step S53 to 0 to obtain the Lie algebra increment required for the next iteration; with the initial condition of the k-th iteration given as ξ_k and E(ξ_k), substitute into the following formula to obtain the Lie algebra at the (k+1)-th iteration:
exp(ξ_k^) exp(Δξ_k^) = exp(ξ_{k+1}^)
In the above formula, exp(·^) denotes the matrix exponential, ξ_k denotes the Lie algebra after the k-th iteration, ξ_{k+1} the Lie algebra after the (k+1)-th iteration, and Δξ_k the Lie algebra increment;
S55: Update f(exp(ξ_{k+1}^) P_i^w) to obtain the new E(ξ_{k+1}), and judge whether the convergence termination condition is reached; if so, terminate the iteration, and if not, repeat S52 to S54 until convergence to the true value.
Beneficial effects: the invention discloses a method for measuring the spatial pose of a crane jib with improved Lie group nonlinear optimization. Using the correspondence between the geometric positions of the four coplanar infrared light spots and their imaging projections, the method solves the spatial pose parameters with a Lie-group-based nonlinear optimization iteration and then converts them into the attitude quantities of the jib model, so the spatial pose parameters of the crane jib can be calculated rapidly, improving the real-time performance of spatial pose measurement for industrial crane jibs.
Drawings
FIG. 1 is a flow chart of an algorithm for solving spatial pose parameters in an embodiment of the present invention;
FIG. 2 is a block diagram of a monocular vision measuring system;
fig. 3 is a graph of the iterative error drop for the improved lie group.
Detailed Description
The technical solution of the present invention will be further described with reference to the following detailed description and accompanying drawings.
Referring to fig. 1 and 2, the present embodiment discloses a method for measuring a crane jib spatial pose with improved lie group nonlinear optimization, comprising the following steps:
and step S1, a camera is adopted to collect a target point of the cooperation target, and the cooperation target is composed of four laser points which are distributed in a rectangular shape. And extracting the barycentric coordinates of the light spots of the collected image infrared cursor by using a gray scale barycentric method. The process of solving the barycentric coordinate of the infrared cursor in the collected image comprises the following steps:
S11: First apply a hybrid filtering operation to the original image, i.e. choose between Gaussian filtering and median filtering according to a threshold test, smoothing the image to suppress noise.
Specifically, based on the characteristics of mixed noise, the noise model within each filtering window is identified first: the variance of the pixels in the window is computed, and if it exceeds a preset threshold the window is considered contaminated by salt-and-pepper noise and median filtering is applied; otherwise Gaussian filtering is applied directly.
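As a concrete sketch of this variance-based filter selection (the 3×3 window, the binomial Gaussian kernel, and the variance threshold of 400 are illustrative assumptions, not values from the patent):

```python
import numpy as np

# 3x3 Gaussian kernel (binomial approximation); kernel and variance
# threshold are assumptions for illustration, not patent values.
G3 = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=float) / 16.0

def hybrid_filter(img, var_thresh=400.0):
    """Step S11 sketch: per 3x3 window, a large pixel variance is taken
    as evidence of salt-and-pepper noise (-> median filter); otherwise
    the window is smoothed with the Gaussian kernel."""
    h, w = img.shape
    out = img.astype(float).copy()
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = img[y - 1:y + 2, x - 1:x + 2].astype(float)
            if patch.var() > var_thresh:
                out[y, x] = np.median(patch)    # salt-and-pepper suspected
            else:
                out[y, x] = (patch * G3).sum()  # Gaussian smoothing
    return out
```

An isolated bright pixel drives the local variance up and is removed by the median branch, while smooth regions pass through the Gaussian branch unchanged in mean.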
S12: Using an optimal threshold segmentation method (the threshold taken near the valley between the two gray-level histogram peaks, a value of 30 in this embodiment), perform a morphological binary opening on the image, find the connected components, and eliminate isolated noise points and burrs in the image background.
S13: and searching cursor connected domains in the image by adopting a gray scale gravity center method and solving gravity center coordinates one by one.
Specifically, with the gray-scale centroid method only the annular part of each light spot, from its edge inward to a certain radius, is used to locate the spot connected components in the image and compute their barycentric coordinates one by one. The preceding gray-level threshold segmentation determines the lower gray-level bound of the annulus, i.e. the gray value of the spot edge is taken as Thresh1; the maximum intensity amplitude of the spot's gray-level distribution is found to be H, so the upper bound Thresh2 is obtained from:
Thresh2 = H − t_c
where t_c is a value that can be adjusted according to the light intensity. The standard deviation of the barycentric coordinates can be kept within 0.1 pixel.
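The annulus-restricted gray-scale centroid of step S13 can be sketched as follows, with Thresh2 = H − t_c as above (the threshold values passed in below are illustrative assumptions):

```python
import numpy as np

def spot_centroid(img, thresh1, t_c):
    """Gray-scale centroid of one light spot, restricted to the annulus
    between the spot-edge gray level Thresh1 and Thresh2 = H - t_c
    (step S13); t_c is the adjustable light-intensity margin."""
    H = img.max()                       # peak intensity amplitude of the spot
    thresh2 = H - t_c                   # upper gray-level bound
    mask = (img >= thresh1) & (img <= thresh2)
    ys, xs = np.nonzero(mask)
    w = img[ys, xs].astype(float)       # gray values as weights
    cx = (xs * w).sum() / w.sum()
    cy = (ys * w).sum() / w.sum()
    return cx, cy
```

For a symmetric spot the weighted average lands on the geometric center regardless of the exact annulus width, which is what keeps the centroid deviation small.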
Step S2: Calibrate the internal and external parameters of the acquisition camera, determine the imaging center, calculate the radial distortion coefficients using two-step calibration, and correct the distortion of the image acquired in step S1;
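A minimal sketch of radial-distortion correction on normalized image coordinates, assuming two coefficients k1, k2 from the calibration; the fixed-point inversion below is one common choice, not necessarily the patent's exact procedure:

```python
def undistort_point(xd, yd, k1, k2, iters=10):
    """Invert the radial model  xd = x * (1 + k1*r^2 + k2*r^4)
    by fixed-point iteration on the normalized coordinates (x, y).
    k1, k2 are assumed to come from the two-step calibration."""
    x, y = xd, yd                        # initial guess: distorted coords
    for _ in range(iters):
        r2 = x * x + y * y
        s = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = xd / s, yd / s            # refine toward undistorted coords
    return x, y
```

For the mild distortion typical of calibrated industrial lenses the iteration converges in a handful of steps.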
step S3, transforming the relative pose problem of the camera and the cooperative target into a minimized reprojection problem, specifically including the steps of:
S31: Converting the relative pose problem of the camera and the cooperative target into a nonlinear optimization iteration, namely, minimizing the error between the true image coordinates and the image coordinates obtained in step S1, as shown in the following formula:
ε^2 = Σ_i ||q'_i − f(M_2 P_i^w)||^2
where ε^2 is the error term, q'_i is the measured calibration-point coordinate obtained in step S1, P_i^w is the actual physical coordinate of the infrared light spot, M_2 is the coordinate transformation matrix composed of the rotation matrix and the translation vector, and f denotes the functional relationship giving the image coordinates in terms of M_2 and P_i^w;
S32: Expressing the formulation of step S31 with the Lie group method, i.e. representing the rotation matrix and translation between the camera coordinates and the calibration-object coordinates by the Lie algebra: ξ is a 6-dimensional vector whose first 3 dimensions represent translation and last 3 dimensions represent rotation, so the 4th-order coordinate transformation matrix M_2 = exp(ξ^) is parameterized by the 6-dimensional ξ, and the expression becomes:
ε^2 = Σ_i ||q'_i − f(exp(ξ^) P_i^w)||^2
where ξ is the Lie algebra element and ^ broadly denotes the mapping "from vector to matrix".
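The reprojection error ε² of steps S31–S32 can be written out directly; the intrinsic matrix K and the sample geometry below are illustrative assumptions (step S2 is assumed to have produced K):

```python
import numpy as np

def project(K, R, t, Pw):
    """f(M_2 P^w): pinhole projection of world points Pw (N x 3) with
    assumed intrinsics K and pose (R, t)."""
    Pc = (R @ Pw.T).T + t             # world -> camera frame
    uv = (K @ Pc.T).T
    return uv[:, :2] / uv[:, 2:3]     # perspective division

def reproj_error(K, R, t, Pw, q):
    """eps^2 = sum_i ||q'_i - f(M_2 P_i^w)||^2 over all spots."""
    d = q - project(K, R, t, Pw)
    return float((d ** 2).sum())
```

This is exactly the scalar that the Lie-group iteration of step S5 drives toward zero.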
Step S4, calculating the initial rotation matrix and translation of the relative pose of the camera by using an EPNP method, which comprises the following steps:
S41: Represent each three-dimensional control point of the cooperative target by a weighted sum of four virtual coordinate points in space, specifically:
P_i^w = Σ_{j=1..4} α_ij c_j^w
where P_i^w denotes the coordinates of a three-dimensional control point, c_j^w are the virtual space point coordinates, and α_ij are the uniquely determined barycentric coordinates satisfying:
Σ_{j=1..4} α_ij = 1
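The weights α_ij of step S41 can be obtained by solving a small linear system, assuming the four virtual points are chosen non-coplanar (a standard EPnP choice; the sample points below are illustrative):

```python
import numpy as np

def barycentric_weights(ctrl, Pw):
    """alpha_ij of step S41: express a 3-D control point P_i^w as a
    weighted sum of four virtual points c_j^w (rows of ctrl, assumed
    non-coplanar) with weights summing to 1. Solves the 4x4 system
    [c_j^w; 1] alpha = [P_i^w; 1]."""
    A = np.vstack([ctrl.T, np.ones(4)])   # rows: x, y, z, 1
    b = np.append(Pw, 1.0)
    return np.linalg.solve(A, b)
```

Because the constraint Σ_j α_ij = 1 is built into the last row, the returned weights automatically satisfy the normalization of step S41.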
s42: calculating coordinates of the four virtual space points in a world coordinate system, converting the coordinates into a camera coordinate system by using a camera projection model, and calculating the coordinates in the camera coordinate system;
S43: According to the 3D-3D matching, calculate the corresponding rotation matrix R and translation t using ICP (Iterative Closest Point), and feed R and t as initial values into the Lie group algorithm for the minimization.
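The 3D-3D alignment of step S43 can be sketched with the closed-form SVD (Kabsch) solution, a common way to realize the ICP step when the correspondences are already known, as they are here:

```python
import numpy as np

def align_3d(Pw, Pc):
    """3D-3D alignment of step S43: closed-form SVD solution for R, t
    with Pc ~= R @ Pw + t, given matched point sets (N x 3). The result
    serves as the initial value of the Lie-group optimization."""
    mw, mc = Pw.mean(axis=0), Pc.mean(axis=0)
    H = (Pw - mw).T @ (Pc - mc)                 # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # guard reflection
    R = Vt.T @ D @ U.T
    t = mc - R @ mw
    return R, t
```

With exact correspondences this recovers the pose in one shot, which is why it makes a good initializer for the nonlinear refinement of step S5.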
Step S5, carrying out nonlinear iteration by using a Gauss Newton method, and then finally converging to a real solution to obtain a rotation matrix and a translation amount, wherein the method specifically comprises the following steps:
S51: Compute in advance the leading factor of the Jacobian matrix,
∂(exp(ξ^)P_i^w)/∂δξ = [ I  −(RP_i^w + t)^ ],
together with the initial quantity A used in all subsequent Jacobian evaluations, where A is formed with a Kronecker product, I in A is the identity matrix, R is the rotation matrix, and t is the translation vector;
S52: Take the derivative of the minimized reprojection error term with respect to the optimization variable, i.e. linearize the objective function at the k-th iteration, discard the second-order terms after the first-order Taylor expansion, and iterate the left-multiplied perturbation exp(Δξ^) of the Lie algebra to convergence, which improves the convergence speed;
S53: Substitute the linearized expression of step S52 into the nonlinear iterative error expression of step S32 and differentiate the result; when solving the Jacobian matrix, compute it in vectorized form with the Kronecker product, reusing the precomputed quantity from step S51 in every evaluation;
S54: Set the derivative expression obtained in step S53 to 0 to obtain the Lie algebra increment required for the next iteration; with the initial condition of the k-th iteration given as ξ_k and E(ξ_k), substitute into the following formula to obtain the Lie algebra at the (k+1)-th iteration:
exp(ξ_k^) exp(Δξ_k^) = exp(ξ_{k+1}^)
In the above formula, exp(·^) denotes the matrix exponential, ξ_k denotes the Lie algebra after the k-th iteration, ξ_{k+1} the Lie algebra after the (k+1)-th iteration, and Δξ_k the Lie algebra increment;
S55: Update f(exp(ξ_{k+1}^) P_i^w) to obtain the new E(ξ_{k+1}), and judge whether the convergence termination condition is reached; if so, terminate the iteration, otherwise repeat steps S52 to S54 until convergence to the true value. Typically 6 iterations are required, and the computation is about 3 ms faster.
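A compact sketch of the Gauss-Newton loop of steps S51–S55, using the left perturbation exp(Δξ^) with the translation-first ξ convention of step S32. The intrinsics and the analytic Jacobian layout are standard pinhole assumptions for illustration, not lifted verbatim from the patent:

```python
import numpy as np

def hat(v):
    """^ operator: 3-vector to antisymmetric 3x3 matrix."""
    return np.array([[0, -v[2], v[1]],
                     [v[2], 0, -v[0]],
                     [-v[1], v[0], 0]])

def se3_exp(xi):
    """exp(xi^): first 3 components translation (rho), last 3 rotation
    (phi), following the convention of step S32. Returns (R, t)."""
    rho, phi = xi[:3], xi[3:]
    th = np.linalg.norm(phi)
    P = hat(phi)
    if th < 1e-8:
        R, V = np.eye(3) + P, np.eye(3) + 0.5 * P
    else:
        R = np.eye(3) + np.sin(th) / th * P + (1 - np.cos(th)) / th**2 * P @ P
        V = (np.eye(3) + (1 - np.cos(th)) / th**2 * P
             + (th - np.sin(th)) / th**3 * P @ P)
    return R, V @ rho

def gauss_newton_pnp(K, Pw, q, R, t, iters=6):
    """Gauss-Newton refinement (steps S51-S55) with the left-multiplied
    perturbation exp(dxi^); typically ~6 iterations as the text notes."""
    fx, fy = K[0, 0], K[1, 1]
    for _ in range(iters):
        H = np.zeros((6, 6)); g = np.zeros(6)
        for Pi, qi in zip(Pw, q):
            p = R @ Pi + t                        # point in camera frame
            x, y, z = p
            u = np.array([fx * x / z + K[0, 2], fy * y / z + K[1, 2]])
            r = u - qi                            # reprojection residual
            Jp = np.array([[fx / z, 0, -fx * x / z**2],
                           [0, fy / z, -fy * y / z**2]])
            Jxi = np.hstack([np.eye(3), -hat(p)]) # d p / d(dxi), S51 factor
            J = Jp @ Jxi
            H += J.T @ J; g += J.T @ r
        dxi = np.linalg.solve(H, -g)              # normal equations (S54)
        dR, dt = se3_exp(dxi)
        R, t = dR @ R, dR @ t + dt                # left update of the pose
        if np.linalg.norm(dxi) < 1e-10:
            break                                 # convergence test (S55)
    return R, t
```

Starting from the EPnP/ICP initial value of step S4, each iteration solves the linearized normal equations and left-multiplies the pose by exp(Δξ^), exactly the update of step S54.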
Regarding the convergence rate of the nonlinear iteration, the curve of projection error versus iteration count for the improved Lie group method is shown in fig. 3; the true solution is essentially reached within 6 iterations. Platform experiments show that the iterative algorithm is fast, fully meets the system's real-time requirement, and can quickly compute the three Euler angles and three translations of the target's spatial pose.
For this embodiment, table 1 compares the image-plane reprojection error of the proposed monocular vision measurement method for crane spatial pose parameters with that of the original Lie group nonlinear optimization method.
TABLE 1 image plane reprojection error contrast for different algorithms
(The data of Table 1 appear only as an image in the original publication and are not recoverable here.)
Table 2 compares the computation time of the improved Lie group nonlinear optimization method with that of the original Lie group method.
TABLE 2 comparison of the computation time consumption (ms) for the different algorithms
(The data of Table 2 appear only as an image in the original publication and are not recoverable here.)
As can be seen from the analysis of tables 1 and 2, the improved lie group algorithm provided by the invention has better precision and real-time performance than the pose estimation method based on the original lie group.

Claims (5)

1. A method for measuring the spatial pose of a crane jib based on improved lie group nonlinear optimization is characterized by comprising the following steps:
s1: acquiring a target point of a cooperation target by using a camera, wherein the cooperation target consists of four laser points which are distributed in a rectangular shape, and extracting four barycentric coordinates of an infrared cursor in an acquired image;
S2: Calibrating the internal and external parameters of the acquisition camera, determining an imaging center and a distortion coefficient, and correcting the distortion of the image acquired in step S1;
s3: converting the relative pose problem of the camera and the cooperative target into a minimized re-projection problem;
s4: calculating an initial rotation matrix and translation amount of the relative pose of the camera by using an EPNP method;
s5: and performing nonlinear iteration by using a Gauss-Newton method, and then finally converging to a real solution to obtain a rotation matrix and a translation amount.
2. The method for measuring the spatial pose of the crane boom based on the improved lie group nonlinear optimization as claimed in claim 1, wherein the step S1 of extracting barycentric coordinates of the infrared cursor in the acquired image comprises the following steps:
s11: performing mixed filtering operation on an original image, namely judging to adopt a method of combining Gaussian filtering and median filtering according to a set threshold value, and smoothing the image to eliminate noise;
s12: performing morphological binary open operation on the image by adopting an optimal threshold segmentation method, finding a connected domain and eliminating isolated noise points and burrs in the image background;
s13: and searching cursor connected domains in the image by adopting a gray scale gravity center method and solving gravity center coordinates one by one.
3. The method for measuring the spatial pose of the crane boom based on the improved lie group nonlinear optimization as claimed in claim 1, wherein the step S3 specifically comprises the following steps:
S31: Converting the relative pose problem of the camera and the cooperative target into a nonlinear optimization iteration, namely, minimizing the error between the true image coordinates and the image coordinates obtained in step S1:
ε^2 = Σ_i ||q'_i − f(M_2 P_i^w)||^2
where ε^2 is the error term, q'_i is the measured calibration-point coordinate obtained in step S1, P_i^w is the actual physical coordinate of the infrared light spot, M_2 is the coordinate transformation matrix composed of the rotation matrix and the translation vector, and f denotes the functional relationship;
S32: Expressing the formulation of step S31 with the Lie group method, i.e. representing the rotation matrix and translation between the camera coordinates and the calibration-object coordinates by the Lie algebra:
ε^2 = Σ_i ||q'_i − f(exp(ξ^) P_i^w)||^2
where ξ is the Lie algebra element and ^ broadly denotes the mapping "from vector to matrix".
4. The method for measuring the spatial pose of the crane boom based on the improved lie group nonlinear optimization as claimed in claim 3, wherein the step S4 specifically comprises the following steps:
s41: representing three-dimensional control points of a cooperative target by a weighted sum of four virtual space coordinate points;
s42: calculating coordinates of the four virtual space points in a world coordinate system, converting the coordinates into a camera coordinate system by using a camera projection model, and calculating the coordinates in the camera coordinate system;
S43: According to the 3D-3D matching, calculate the corresponding rotation matrix R and translation t using the ICP (Iterative Closest Point) method, and feed R and t as initial values into the Lie group algorithm for the minimization.
5. The method for measuring the spatial pose of the crane boom based on the improved lie group nonlinear optimization as claimed in claim 3, wherein the step S5 specifically comprises the following steps:
S51: Compute in advance the leading factor of the Jacobian matrix,
∂(exp(ξ^)P_i^w)/∂δξ = [ I  −(RP_i^w + t)^ ],
together with the initial quantity A used in all subsequent Jacobian evaluations, where A is formed with a Kronecker product, R is the rotation matrix, t is the translation vector, I is the identity matrix, and the Lie algebra represents the rotation matrix and translation between the camera coordinates and the calibration-object coordinates;
S52: Take the derivative of the minimized reprojection error term with respect to the optimization variable, i.e. linearize the objective function at the k-th iteration, discard the second-order terms after the first-order Taylor expansion, and iterate the left-multiplied perturbation exp(Δξ^) of the Lie algebra to convergence;
S53: Substitute the linearized expression of step S52 into the nonlinear iterative error expression and differentiate the result; when solving the Jacobian matrix, compute it in vectorized form with the Kronecker product, reusing the precomputed quantity from step S51 in every evaluation;
S54: Set the derivative expression obtained in step S53 to 0 to obtain the Lie algebra increment required for the next iteration; with the initial condition of the k-th iteration given as ξ_k and E(ξ_k), substitute into the following formula to obtain the Lie algebra at the (k+1)-th iteration:
exp(ξ_k^) exp(Δξ_k^) = exp(ξ_{k+1}^)
In the above formula, exp(·^) denotes the matrix exponential, ξ_k denotes the Lie algebra after the k-th iteration, ξ_{k+1} the Lie algebra after the (k+1)-th iteration, and Δξ_k the Lie algebra increment;
S55: Update f(exp(ξ_{k+1}^) P_i^w) to obtain the new E(ξ_{k+1}), and judge whether the convergence termination condition is reached; if so, terminate the iteration, and if not, repeat S52 to S54 until convergence to the true value.
CN202010304889.8A 2020-04-17 2020-04-17 Driving suspension arm space pose measurement method based on improved Liqun nonlinear optimization Active CN111681279B (en)

Publications (2)

Publication Number Publication Date
CN111681279A true CN111681279A (en) 2020-09-18
CN111681279B CN111681279B (en) 2023-10-31


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113155152A (en) * 2021-03-14 2021-07-23 北京工业大学 Camera and inertial sensor spatial relationship self-calibration method based on lie group filtering
CN113610149A (en) * 2021-08-05 2021-11-05 上海氢枫能源技术有限公司 Pose real-time display method and system of hydrogen compressor
CN113639639A (en) * 2021-08-31 2021-11-12 追觅创新科技(苏州)有限公司 Data processing method and device for position data and storage medium
CN113689492A (en) * 2021-10-22 2021-11-23 浙江建木智能系统有限公司 Sea surface distance measurement method and system based on monocular camera
CN115345943A (en) * 2022-08-08 2022-11-15 恩纳基智能科技无锡有限公司 Calibration method based on differential mode concept
CN117893610A (en) * 2024-03-14 2024-04-16 四川大学 Aviation assembly robot gesture measurement system based on zoom monocular vision

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108921895A (en) * 2018-06-12 2018-11-30 中国人民解放军军事科学院国防科技创新研究院 A kind of sensor relative pose estimation method
CN110097553A (en) * 2019-04-10 2019-08-06 东南大学 The semanteme for building figure and three-dimensional semantic segmentation based on instant positioning builds drawing system
CN110163902A (en) * 2019-05-10 2019-08-23 北京航空航天大学 A kind of inverse depth estimation method based on factor graph
CN110490932A (en) * 2019-08-21 2019-11-22 东南大学 The infrared coplanar cursor iteration optimization measurement driving boom spatial pose method of monocular
CN110503688A (en) * 2019-08-20 2019-11-26 上海工程技术大学 A kind of position and orientation estimation method for depth camera
CN110929642A (en) * 2019-11-21 2020-03-27 扬州市职业大学(扬州市广播电视大学) Real-time estimation method for human face posture based on two-dimensional feature points





Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant