CN111681279B - Crane boom spatial pose measurement method based on improved Lie-group nonlinear optimization - Google Patents

Crane boom spatial pose measurement method based on improved Lie-group nonlinear optimization Download PDF

Info

Publication number
CN111681279B
CN111681279B (application CN202010304889.8A)
Authority
CN
China
Prior art keywords
camera
iteration
coordinates
image
matrix
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010304889.8A
Other languages
Chinese (zh)
Other versions
CN111681279A (en)
Inventor
牛丹
傅星瑜
李奇
陈夕松
李世华
杨俊
刘进波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Keyuan Intelligent Technology Group Co ltd
Nanjing Wenwang Automation Co ltd
Southeast University
Original Assignee
Nanjing Keyuan Intelligent Technology Group Co ltd
Nanjing Wenwang Automation Co ltd
Southeast University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Keyuan Intelligent Technology Group Co ltd, Nanjing Wenwang Automation Co ltd, Southeast University filed Critical Nanjing Keyuan Intelligent Technology Group Co ltd
Priority to CN202010304889.8A priority Critical patent/CN111681279B/en
Publication of CN111681279A publication Critical patent/CN111681279A/en
Application granted granted Critical
Publication of CN111681279B publication Critical patent/CN111681279B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/02 Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/80 Geometric correction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/136 Segmentation; Edge detection involving thresholding
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/155 Segmentation; Edge detection involving morphological operators
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/187 Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20024 Filtering details
    • G06T2207/20032 Median filtering
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 Road transport of goods or passengers
    • Y02T10/10 Internal combustion engine [ICE] based vehicles
    • Y02T10/40 Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a crane boom spatial pose measurement method based on improved Lie-group nonlinear optimization, which comprises the following steps: S1: capturing with a camera a cooperative target consisting of four target points in a rectangular arrangement, and extracting the four centroid coordinates of the infrared light spots in the acquired image; S2: calibrating the internal and external parameters of the acquisition camera, determining the imaging center and distortion coefficients, and correcting the imaging distortion of the image acquired in step S1; S3: converting the relative pose problem of the camera and the cooperative target into a minimized reprojection problem; S4: calculating the initial rotation matrix and translation vector of the camera's relative pose with the EPnP method; S5: performing nonlinear iteration with the Gauss-Newton method and finally converging to the true solution to obtain the rotation matrix and translation vector. The invention improves both the precision of the measurement result and the speed of detection and computation, enables rapid target detection and acquisition of spatial parameters, and improves the real-time performance and accuracy of the spatial angle measurement of the unmanned crane boom.

Description

Crane boom spatial pose measurement method based on improved Lie-group nonlinear optimization
Technical Field
The invention relates to unmanned control of industrial overhead cranes, and in particular to a method for measuring the spatial pose of a crane boom.
Background
The unmanned crane system is a complex system comprising multiple subsystems, one of which is the anti-sway control system, responsible for accurately controlling the crane boom during motion and ensuring that the crane runs stably. The premise of anti-sway control is obtaining real-time, accurate spatial pose parameters of the crane boom. At present, the difficulty in detecting the boom's spatial angle parameters is that they must be acquired and computed at high speed in real time and sent to the crane controller, so efficient, fast target detection and spatial parameter calculation are particularly important. For angle measurement, the techniques applied to unmanned cranes are mainly direct measurement with an inclinometer and indirect measurement with a visual target. Compared with the inclinometer method, measuring the boom swing angle by vision yields more comprehensive and intuitive data and better supports the optimization and computation of the anti-sway control algorithm. However, during real-time high-speed operation of the crane, it is difficult to measure the spatial pose parameters of an industrial crane boom accurately.
Disclosure of Invention
The invention aims to: in order to overcome the defects and shortcomings of the prior art, the invention provides a crane boom spatial pose measurement method based on improved Lie-group nonlinear optimization, which can rapidly calculate the spatial pose parameters of a crane boom and effectively solves the problem that the spatial pose parameters of an industrial crane boom are difficult to measure accurately in real time.
The technical scheme is as follows: a crane boom spatial pose measurement method based on improved Lie-group nonlinear optimization comprises the following steps:
s1: using a camera to capture the target points of the cooperative target, which consists of four laser points in a rectangular arrangement, and extracting the four centroid coordinates of the infrared light spots in the acquired image;
s2: calibrating internal and external parameters of an acquisition camera, determining an imaging center and a distortion coefficient, and correcting imaging distortion parameters of the image acquired in the step S1;
s3: converting the relative pose problem of the camera and the cooperative target into a minimized re-projection problem;
s4: calculating the initial rotation matrix and translation vector of the camera's relative pose by using the EPnP method;
s5: performing nonlinear iteration with the Gauss-Newton method and finally converging to the true solution to obtain the rotation matrix and translation vector.
Further, in the step S1, the process of extracting the four centroid coordinates of the infrared light spots in the acquired image includes the following steps:
s11: firstly performing a hybrid filtering operation on the original image, i.e., choosing between Gaussian filtering and median filtering according to a preset threshold, and smoothing the image to suppress noise;
s12: performing a morphological binarization opening operation on the image with an optimal-threshold segmentation method, finding the connected components and eliminating isolated noise points and burrs in the image background;
s13: locating the light-spot connected components in the image with the gray-level centroid method and computing their centroid coordinates one by one.
Further, the step S3 specifically includes the following steps:
s31: converting the relative pose problem of the camera and the cooperative target into a nonlinear optimization iteration, i.e., solving for the true image coordinates while minimizing the error with respect to the image coordinates obtained in step S1, where the error is given by:
ε² = Σ_i ||q′_i − f(M₂P_i^w)||²
where ε² is the error term, q′_i are the measured image coordinates of the calibration points obtained in step S1, P_i^w are the actual physical coordinates of the infrared light spots, and M₂ is the coordinate transformation matrix composed of the rotation matrix and the translation vector;
s32: representing the expression in step S31 with the Lie-group method, i.e., representing the rotation matrix and translation between the camera coordinates and the calibration-object coordinates by the Lie algebra: ξ is a 6-dimensional vector whose first 3 dimensions represent the translation and whose last 3 dimensions represent the rotation, so the 4-order coordinate transformation matrix is expressed through the 6-dimensional ξ, and the expression can be converted into:
ε² = Σ_i ||q′_i − f(exp(ξ^∧)P_i^w)||²
where ξ is the Lie algebra and ∧ broadly denotes "from vector to matrix" (the hat operator).
Further, the step S4 specifically includes the following steps:
s41: representing each three-dimensional control point of the cooperative target as a weighted sum of four virtual control points in space;
s42: after computing the coordinates of the four virtual control points in the world coordinate system, converting them into the camera coordinate system with the camera projection model and computing their coordinates there;
s43: from the resulting 3D-3D correspondences, computing the rotation matrix R and translation vector t with ICP and using them as the initial values for the Lie-group minimization.
Further, the step S5 includes the following steps:
s51: determine the quantities of the Jacobian matrix that can be precomputed: the constant factor of each Jacobian term is computed once in advance and reused as the initial quantity in all subsequent Jacobian evaluations, where A is the transformation matrix composed of the rotation matrix R and the translation vector t, and I is the identity matrix;
s52: solve the derivative of the minimized reprojection error term with respect to the optimization variable, i.e., linearize the objective function at the k-th iteration, discard the quadratic-term information after the first-order Taylor expansion, and iterate by left-multiplying the Lie-algebra perturbation exp(Δξ^∧);
s53: combine the linearized expression of step S52 with the nonlinear iteration error expression of step S32 and differentiate it; when solving the Jacobian matrix, vectorize the computation with the Kronecker product, reusing the factor precomputed in step S51 at every evaluation;
s54: setting the derivative obtained in step S53 to 0 yields the Lie-algebra increment for the next iteration; with the initial condition at the k-th iteration denoted ξ_k and E(ξ_k), the Lie algebra at the (k+1)-th iteration is obtained from:
exp(ξ_k^∧)exp(Δξ_k^∧) = exp(ξ_{k+1}^∧)
where exp(·^∧) denotes the matrix exponential, ξ_k is the Lie algebra after the k-th iteration, ξ_{k+1} is the Lie algebra after the (k+1)-th iteration, and Δξ_k is the increment of the Lie algebra;
s55: update f(exp(ξ_{k+1}^∧)P_i^w) to obtain the new E(ξ_{k+1}) and judge whether the convergence termination condition is met: if so, terminate the iteration; if not, repeat S52 to S54 until convergence to the true value.
The beneficial effects are that: the invention discloses a crane boom spatial pose measurement method based on improved Lie-group nonlinear optimization. Through the correspondence between the geometric positions of the four coplanar infrared light spots and their imaging projections, the method solves for the spatial pose parameters with a Lie-group-based nonlinear optimization iteration and then converts them into the boom pose quantities, so the spatial pose parameters of the crane boom can be calculated rapidly, improving the real-time performance of measuring the spatial pose parameters of an industrial crane boom.
Drawings
FIG. 1 is a flowchart of an algorithm for solving spatial pose parameters in an embodiment of the invention;
FIG. 2 is a diagram of a monocular vision measurement system frame;
fig. 3 is a graph of the iterative error decrease of the improved Lie-group method.
Detailed Description
The technical scheme of the invention is further described below with reference to the detailed description and the accompanying drawings.
Referring to fig. 1 and 2, the present embodiment discloses a crane boom spatial pose measurement method based on improved Lie-group nonlinear optimization, comprising the following steps:
Step S1, using a camera to capture the cooperative target, which consists of four laser points in a rectangular arrangement, and extracting the centroid coordinates of the light spots from the infrared spots in the acquired image with the gray-level centroid method. The process of obtaining the centroid coordinates of the infrared light spots in the acquired images comprises the following steps:
s11: the original image is subjected to mixed filtering operation, namely, the method of combining Gaussian filtering and median filtering is adopted according to the set threshold value judgment, and the image is subjected to smoothing treatment to eliminate noise.
Specifically, based on the characteristics of the mixed noise, the noise model within the filtering window is judged first by computing the variance of the pixels in the window: if the variance is larger than a preset threshold, the window is considered contaminated by salt-and-pepper noise and median filtering is applied; otherwise, Gaussian filtering is applied directly.
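The window-variance test described above can be sketched in a few lines of NumPy. This is an illustrative reimplementation, not the patent's code: the 3×3 window, the Gaussian kernel, the function name, and the threshold value are all assumptions.

```python
import numpy as np

def hybrid_filter(img, var_thresh=200.0):
    """3x3 hybrid filter: median filtering where the local variance suggests
    salt-and-pepper (impulse) noise, Gaussian smoothing elsewhere."""
    g = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=float) / 16.0  # 3x3 Gaussian kernel
    out = img.astype(float).copy()
    h, w = img.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            win = img[y - 1:y + 2, x - 1:x + 2].astype(float)
            if win.var() > var_thresh:      # large variance: impulse noise suspected
                out[y, x] = np.median(win)
            else:                           # ordinary sensor noise: Gaussian smoothing
                out[y, x] = (win * g).sum()
    return out
```

In a production pipeline the same decision would more likely be made region-wise with optimized filtering routines rather than per pixel, but the selection logic is the same.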
S12: and (3) taking 30 of the nearest distance between the two peaks by adopting an optimal threshold segmentation method, performing morphological binarization opening operation on the image, finding a connected domain and eliminating isolated noise points and burrs in the image background.
S13: and searching the cursor connected domain in the image by adopting a gray level gravity center method and solving gravity center coordinates one by one.
Specifically, with the gray-level centroid method, only an annular part of each light spot, from the edge inward to a certain radius, is used to locate the spot's connected component in the image, and the centroid coordinates are computed one by one. The lower gray limit of the annulus can be determined by the preceding gray-threshold segmentation, i.e., the gray value at the spot edge is taken as Thresh1; from the gray distribution of the spot, the maximum light-intensity amplitude H can be found, and the upper limit Thresh2 is then obtained from:
Thresh2 = H − t_c
where t_c is a value that adapts to changes in light intensity. With this method the standard deviation of the centroid coordinates can be kept within 0.1 pixel.
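The annular gray-level centroid can be illustrated with a short NumPy sketch. This is a sketch under assumptions, not the patent's implementation: the function name, the threshold values, and the synthetic spot are all made up for the example.

```python
import numpy as np

def gray_centroid(img, thresh1, thresh2):
    """Gray-level centroid over the annular band thresh1 <= I <= thresh2:
    each selected pixel contributes with its gray value as weight."""
    ys, xs = np.nonzero((img >= thresh1) & (img <= thresh2))
    w = img[ys, xs].astype(float)
    u = (xs * w).sum() / w.sum()   # x centroid (columns)
    v = (ys * w).sum() / w.sum()   # y centroid (rows)
    return u, v

# Synthetic infrared spot centred at (x=12, y=10) on a 25x25 image
yy, xx = np.mgrid[0:25, 0:25]
spot = 200.0 * np.exp(-((xx - 12) ** 2 + (yy - 10) ** 2) / (2 * 3.0 ** 2))
```

Selecting only the band between the edge threshold and the near-saturation level discards both the dark background and the possibly saturated core, which is what stabilizes the centroid.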
Step S2, calibrating internal and external parameters of the acquisition camera, determining an imaging center, calibrating by a two-step method to obtain a radial distortion coefficient, and correcting imaging distortion parameters of the image acquired in the step S1;
step S3, converting the relative pose problem of the camera and the cooperative target into a minimized re-projection problem, and specifically comprising the following steps:
s31: converting the relative pose problem of the camera and the cooperative target into nonlinear optimization iteration, namely solving an image coordinate true solution and minimizing an error of the image coordinate obtained in the step S1, wherein the error is represented by the following formula:
ε² = Σ_i ||q′_i − f(M₂P_i^w)||²
where ε² is the error term, q′_i are the measured image coordinates of the calibration points obtained in step S1, P_i^w are the actual physical coordinates of the infrared light spots, M₂ is the coordinate transformation matrix composed of the rotation matrix and the translation vector, and f is the function mapping M₂ and P_i^w to image coordinates;
S32: representing the expression in step S31 with the Lie-group method, i.e., representing the rotation matrix and translation between the camera coordinates and the calibration-object coordinates by the Lie algebra: ξ is a 6-dimensional vector whose first 3 dimensions represent the translation and whose last 3 dimensions represent the rotation, so the 4-order coordinate transformation matrix is expressed through the 6-dimensional ξ, and the expression can be converted into:
ε² = Σ_i ||q′_i − f(exp(ξ^∧)P_i^w)||²
where ξ is the Lie algebra and ∧ broadly denotes "from vector to matrix" (the hat operator).
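The 6-dimensional ξ, the ∧ (hat) operator, and the error ε² can be made concrete with a small NumPy sketch. The translation-first ordering follows the text's convention; the pinhole intrinsics and all names are illustrative assumptions, not values from the patent.

```python
import numpy as np

def hat(phi):
    """^ operator for the rotation part: 3-vector -> skew-symmetric matrix."""
    return np.array([[0, -phi[2], phi[1]],
                     [phi[2], 0, -phi[0]],
                     [-phi[1], phi[0], 0]], dtype=float)

def se3_exp(xi):
    """exp(xi^): 6-vector (translation rho first, rotation phi last) -> 4x4 transform."""
    rho, phi = np.asarray(xi[:3], float), np.asarray(xi[3:], float)
    theta = np.linalg.norm(phi)
    W = hat(phi)
    if theta < 1e-12:
        R, V = np.eye(3) + W, np.eye(3)
    else:
        R = np.eye(3) + np.sin(theta) / theta * W \
            + (1 - np.cos(theta)) / theta**2 * (W @ W)
        V = np.eye(3) + (1 - np.cos(theta)) / theta**2 * W \
            + (theta - np.sin(theta)) / theta**3 * (W @ W)
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, V @ rho
    return T

FX = FY = 800.0; CX = CY = 320.0   # assumed pinhole intrinsics

def project(T, P):
    """f(T P): world point -> pixel coordinates under the assumed pinhole model."""
    pc = T[:3, :3] @ P + T[:3, 3]
    return np.array([FX * pc[0] / pc[2] + CX, FY * pc[1] / pc[2] + CY])

def reproj_error(xi, Pw, q):
    """epsilon^2 = sum_i ||q'_i - f(exp(xi^) P_i^w)||^2."""
    T = se3_exp(xi)
    return sum(np.sum((qi - project(T, P)) ** 2) for P, qi in zip(Pw, q))
```

With the true ξ the error vanishes; any other pose leaves a nonzero residual, which is exactly what the Gauss-Newton iteration of step S5 drives to zero.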
Step S4, calculating the initial rotation matrix and translation vector of the camera's relative pose with the EPnP method, specifically comprising the following steps:
S41: representing each three-dimensional control point of the cooperative target as a weighted sum of four virtual control points in space, specifically:
P_i^w = Σ_{j=1}^{4} α_ij c_j^w
where P_i^w are the coordinates of a three-dimensional control point, c_j^w are the virtual control-point coordinates, and α_ij are the uniquely determined barycentric coordinates, satisfying:
Σ_{j=1}^{4} α_ij = 1
S42: after computing the coordinates of the four virtual control points in the world coordinate system, converting them into the camera coordinate system with the camera projection model and computing their coordinates there;
S43: from the resulting 3D-3D correspondences, computing the rotation matrix R and translation vector t with ICP, which are then used as the initial values for the Lie-group minimization.
Step S5, performing nonlinear iteration with the Gauss-Newton method and finally converging to the true solution to obtain the rotation matrix and translation vector, specifically comprising the following steps:
S51: determine the quantities of the Jacobian matrix that can be precomputed: the constant factor of each Jacobian term is computed once in advance and reused as the initial quantity in all subsequent Jacobian evaluations, where A is the transformation matrix composed of the rotation matrix R and the translation vector t, and I is the identity matrix;
S52: solve the derivative of the minimized reprojection error term with respect to the optimization variable, i.e., linearize the objective function at the k-th iteration, discard the quadratic-term information after the first-order Taylor expansion, and iterate by left-multiplying the Lie-algebra perturbation exp(Δξ^∧);
S53: combine the linearized expression of step S52 with the nonlinear iteration error expression of step S32 and differentiate it; when solving the Jacobian matrix, vectorize the computation with the Kronecker product, reusing the factor precomputed in step S51 at every evaluation;
S54: setting the derivative obtained in step S53 to 0 yields the Lie-algebra increment for the next iteration; with the initial condition at the k-th iteration denoted ξ_k and E(ξ_k), the Lie algebra at the (k+1)-th iteration is obtained from:
exp(ξ_k^∧)exp(Δξ_k^∧) = exp(ξ_{k+1}^∧)
where exp(·^∧) denotes the matrix exponential, ξ_k is the Lie algebra after the k-th iteration, ξ_{k+1} is the Lie algebra after the (k+1)-th iteration, and Δξ_k is the increment of the Lie algebra;
S55: update f(exp(ξ_{k+1}^∧)P_i^w) to obtain the new E(ξ_{k+1}) and judge whether the convergence termination condition is met: if so, terminate the iteration; if not, repeat S52 to S54 until convergence to the true value. About 6 iterations are generally required, and the precomputation speeds up the iteration by roughly 3 ms.
The convergence behaviour of the nonlinear iteration with the improved Lie-group method is shown in figure 3: the true solution is essentially reached after 6 iterations. Platform experiments show that the iterative algorithm is fast, fully meets the real-time requirement of the system, and can rapidly compute the three Euler angles and the three translation components of the target's spatial pose.
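The Gauss-Newton iteration with a left-multiplied perturbation can be sketched end to end as follows. For brevity this illustration replaces the patent's precomputed Kronecker-product Jacobian with a finite-difference Jacobian, and the intrinsics and point layout are made up; it is a sketch of the technique, not the patent's implementation.

```python
import numpy as np

def hat(phi):
    return np.array([[0, -phi[2], phi[1]],
                     [phi[2], 0, -phi[0]],
                     [-phi[1], phi[0], 0]], dtype=float)

def se3_exp(xi):
    """6-vector (translation first, rotation last) -> 4x4 rigid transform."""
    rho, phi = np.asarray(xi[:3], float), np.asarray(xi[3:], float)
    theta = np.linalg.norm(phi); W = hat(phi)
    if theta < 1e-12:
        R, V = np.eye(3) + W, np.eye(3)
    else:
        R = np.eye(3) + np.sin(theta)/theta*W + (1-np.cos(theta))/theta**2*(W@W)
        V = np.eye(3) + (1-np.cos(theta))/theta**2*W + (theta-np.sin(theta))/theta**3*(W@W)
    T = np.eye(4); T[:3, :3] = R; T[:3, 3] = V @ rho
    return T

FX = FY = 800.0; CX = CY = 320.0   # assumed pinhole intrinsics

def residuals(T, Pw, q):
    """Stacked reprojection residuals f(T P_i) - q'_i."""
    r = []
    for P, qi in zip(Pw, q):
        pc = T[:3, :3] @ P + T[:3, 3]
        r += [FX*pc[0]/pc[2] + CX - qi[0], FY*pc[1]/pc[2] + CY - qi[1]]
    return np.array(r)

def gauss_newton_se3(T0, Pw, q, iters=10, eps=1e-6):
    """Minimize the reprojection error, left-multiplying exp(dxi^) each step."""
    T = T0.copy()
    for _ in range(iters):
        r = residuals(T, Pw, q)
        J = np.empty((len(r), 6))
        for k in range(6):                 # numerical Jacobian wrt left perturbation
            d = np.zeros(6); d[k] = eps
            J[:, k] = (residuals(se3_exp(d) @ T, Pw, q) - r) / eps
        dxi = np.linalg.solve(J.T @ J, -J.T @ r)   # normal equations
        T = se3_exp(dxi) @ T               # left-multiplicative update
    return T
```

With noise-free correspondences and a moderate initial pose error, the update converges in a handful of iterations, consistent with the roughly 6 iterations reported for the improved method.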
For this embodiment, after testing the monocular vision measurement method on the crane's spatial pose parameters, the image-plane reprojection errors are compared with those of the original Lie-group nonlinear optimization method in Table 1.
Table 1 image plane re-projection error contrast for different algorithms
The computation times of the improved Lie-group nonlinear optimization method and the original Lie-group method are compared in Table 2.
Table 2 comparison of the calculation time consumption (ms) of different algorithms
As the analysis of Tables 1 and 2 shows, both the accuracy and the real-time performance of the improved Lie-group algorithm provided by the invention are better than those of the pose estimation method based on the original Lie group.

Claims (2)

1. A crane boom spatial pose measurement method based on improved Lie-group nonlinear optimization, characterized by comprising the following steps:
s1: capturing a cooperative target with a camera, the cooperative target consisting of four laser points in a rectangular arrangement, and extracting the four centroid coordinates of the infrared light spots in the acquired image;
s2: calibrating internal and external parameters of an acquisition camera, determining an imaging center and a distortion coefficient, and correcting imaging distortion parameters of the image acquired in the step S1;
s3: converting the relative pose problem of the camera and the cooperative target to a minimized re-projection problem, comprising:
s31: converting the relative pose problem of the camera and the cooperative target into a nonlinear optimization iteration, i.e., solving for the true image coordinates while minimizing the error with respect to the image coordinates obtained in step S1:
ε² = Σ_i ||q′_i − f(M₂P_i^w)||²
wherein ε² is the error term, q′_i are the measured image coordinates of the calibration points obtained in step S1, P_i^w are the actual physical coordinates of the infrared light spots, M₂ is the coordinate transformation matrix composed of the rotation matrix and the translation vector, and f denotes the projection function;
s32: representing the expression in step S31 by the Lie-group method, i.e., representing the rotation matrix and translation between the camera coordinates and the calibration-object coordinates by the Lie algebra:
ε² = Σ_i ||q′_i − f(exp(ξ^∧)P_i^w)||²
wherein ξ is the Lie algebra and ∧ broadly denotes "from vector to matrix" (the hat operator);
s4: calculating the initial rotation matrix and translation vector of the camera's relative pose with the EPnP method, comprising the following steps:
s41: representing each three-dimensional control point of the cooperative target as a weighted sum of four virtual control points in space;
s42: after computing the coordinates of the four virtual control points in the world coordinate system, converting them into the camera coordinate system with the camera projection model and computing their coordinates there;
s43: from the 3D-3D correspondences, computing the rotation matrix R and translation vector t with the ICP method and using them as initial values for the Lie-group minimization;
s5: performing nonlinear iteration with the Gauss-Newton method and finally converging to the true solution to obtain the rotation matrix and translation vector, comprising the following steps:
s51: determine the quantities of the Jacobian matrix that can be precomputed, computing the constant factor in advance and using it as the initial quantity in all subsequent Jacobian evaluations, wherein A is the transformation matrix, ⊗ denotes the Kronecker product, R is the rotation matrix, t is the translation vector, I is the identity matrix, and the rotation matrix and translation between the camera coordinates and the calibration-object coordinates are represented by the Lie algebra;
s52: solve the derivative of the minimized reprojection error term with respect to the optimization variable, i.e., linearize the objective function at the k-th iteration, discard the quadratic-term information after the first-order Taylor expansion, and iterate by left-multiplying the Lie-algebra perturbation exp(Δξ^∧);
s53: combine the linearized expression of step S52 with the nonlinear iteration error expression, differentiate it, and vectorize the Jacobian computation with the Kronecker product, reusing the factor precomputed in step S51 at every evaluation;
s54: setting the derivative obtained in step S53 to 0 yields the Lie-algebra increment for the next iteration; with the initial condition at the k-th iteration denoted ξ_k and E(ξ_k), the Lie algebra at the (k+1)-th iteration is obtained from:
exp(ξ_k^∧)exp(Δξ_k^∧) = exp(ξ_{k+1}^∧)
wherein exp(·^∧) denotes the matrix exponential, ξ_k is the Lie algebra after the k-th iteration, ξ_{k+1} is the Lie algebra after the (k+1)-th iteration, and Δξ_k is the increment of the Lie algebra;
2. The crane boom spatial pose measurement method based on improved Lie-group nonlinear optimization according to claim 1, wherein in the step S1, extracting the centroid coordinates of the infrared light spots in the acquired image comprises the following steps:
s11: firstly performing a hybrid filtering operation on the original image, i.e., choosing between Gaussian filtering and median filtering according to a preset threshold, and smoothing the image to suppress noise;
s12: performing a morphological binarization opening operation on the image with an optimal-threshold segmentation method, finding the connected components and eliminating isolated noise points and burrs in the image background;
s13: locating the light-spot connected components in the image with the gray-level centroid method, and computing their centroid coordinates one by one.
CN202010304889.8A 2020-04-17 2020-04-17 Crane boom spatial pose measurement method based on improved Lie-group nonlinear optimization Active CN111681279B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010304889.8A CN111681279B (en) 2020-04-17 2020-04-17 Crane boom spatial pose measurement method based on improved Lie-group nonlinear optimization

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010304889.8A CN111681279B (en) 2020-04-17 2020-04-17 Crane boom spatial pose measurement method based on improved Lie-group nonlinear optimization

Publications (2)

Publication Number Publication Date
CN111681279A CN111681279A (en) 2020-09-18
CN111681279B true CN111681279B (en) 2023-10-31

Family

ID=72451607

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010304889.8A Active CN111681279B (en) 2020-04-17 2020-04-17 Crane boom spatial pose measurement method based on improved Lie-group nonlinear optimization

Country Status (1)

Country Link
CN (1) CN111681279B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113155152B (en) * 2021-03-14 2023-01-03 北京工业大学 Camera and inertial sensor spatial relationship self-calibration method based on lie group filtering
CN113610149B (en) * 2021-08-05 2024-03-26 上海氢枫能源技术有限公司 Method and system for displaying pose of hydrogen compressor in real time
CN113639639A (en) * 2021-08-31 2021-11-12 追觅创新科技(苏州)有限公司 Data processing method and device for position data and storage medium
CN113689492B (en) * 2021-10-22 2022-02-11 浙江建木智能系统有限公司 Sea surface distance measurement method and system based on monocular camera
CN115345943B (en) * 2022-08-08 2024-04-16 恩纳基智能装备(无锡)股份有限公司 Calibration method based on differential mode concept
CN117893610B (en) * 2024-03-14 2024-05-28 四川大学 Aviation assembly robot gesture measurement system based on zoom monocular vision

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108921895A (en) * 2018-06-12 2018-11-30 中国人民解放军军事科学院国防科技创新研究院 Sensor relative pose estimation method
CN110097553A (en) * 2019-04-10 2019-08-06 东南大学 Semantic mapping system based on simultaneous localization and mapping and three-dimensional semantic segmentation
CN110163902A (en) * 2019-05-10 2019-08-23 北京航空航天大学 Inverse depth estimation method based on factor graphs
CN110490932A (en) * 2019-08-21 2019-11-22 东南大学 Method for measuring the spatial pose of a crane boom through monocular infrared coplanar cursor iterative optimization
CN110503688A (en) * 2019-08-20 2019-11-26 上海工程技术大学 Pose estimation method for depth cameras
CN110929642A (en) * 2019-11-21 2020-03-27 扬州市职业大学(扬州市广播电视大学) Real-time face pose estimation method based on two-dimensional feature points


Also Published As

Publication number Publication date
CN111681279A (en) 2020-09-18

Similar Documents

Publication Publication Date Title
CN111681279B (en) Driving suspension arm space pose measurement method based on improved Liqun nonlinear optimization
CN110223348B (en) Robot scene self-adaptive pose estimation method based on RGB-D camera
CN105021124B (en) Method for computing the three-dimensional position and normal vectors of planar parts based on depth maps
CN109949375A (en) Mobile robot target tracking method based on depth-map regions of interest
CN109434251B (en) Welding seam image tracking method based on particle filtering
CN110928301A (en) Method, device and medium for detecting tiny obstacles
CN111897349A (en) Underwater robot autonomous obstacle avoidance method based on binocular vision
CN110796700B (en) Multi-object grabbing area positioning method based on convolutional neural network
JP2015522200A (en) Human face feature point positioning method, apparatus, and storage medium
CN110490932B (en) Method for measuring space pose of crane boom through monocular infrared coplanar cursor iteration optimization
CN111598172B (en) Rapid detection method for dynamic target grasping poses based on heterogeneous deep network fusion
CN115592324B (en) Automatic welding robot control system based on artificial intelligence
CN111311618A (en) Circular arc workpiece matching and positioning method based on high-precision geometric primitive extraction
CN115597512A (en) Automobile clearance measurement and error correction method
CN114549549A (en) Dynamic target modeling and tracking method based on instance segmentation in dynamic environments
CN108469729B (en) Human body target identification and following method based on RGB-D information
CN117111545A (en) Skin milling path real-time planning method based on line laser
CN116579955A (en) New-energy battery cell weld reflection point denoising and point cloud completion method and system
CN104240268B (en) Pedestrian tracking method based on manifold learning and sparse representation
CN116665097A (en) Self-adaptive target tracking method combining context awareness
CN113702941B (en) Point cloud speed measuring method based on improved ICP
CN111915632B (en) Machine learning-based method for constructing a ground-truth database of weakly textured target objects
CN112767481B (en) High-precision positioning and mapping method based on visual edge features
CN111239761B (en) Method for real-time construction of indoor two-dimensional maps
Ying et al. Technique of measuring leading vehicle distance based on digital image processing theory

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant