CN115063451B - High-speed micro-vision tracking method and system for planar three-degree-of-freedom pose measurement - Google Patents


Info

Publication number: CN115063451B (application CN202210655670.1A, China)
Other versions: CN115063451A
Inventors: 李海, 沈楠, 张宪民, 许诺, 廖祝, 庞水全
Assignee: South China University of Technology (SCUT)
Legal status: Active (granted)

Classifications

    • G06T 7/277 — Analysis of motion involving stochastic approaches, e.g. using Kalman filters
    • G06T 7/70 — Determining position or orientation of objects or cameras
    • G06T 2207/10056 — Microscopic image
    • G06T 2207/20024 — Filtering details
    • G06T 2207/20076 — Probabilistic image processing
    • Y02T 10/40 — Engine management systems


Abstract

The invention discloses a high-speed micro-vision tracking method and system for planar three-degree-of-freedom pose measurement. In the imaging module, the area-array camera adopts a high-speed data transmission interface, and the lens is a low-distortion telecentric lens or a microscope lens that supports a coaxial light source. In the real-time tracking algorithm, iteratively optimized template tracking is combined with motion prediction: the current optimal state is estimated and the state at the next moment is predicted by Kalman filtering, reducing the initial-value deviation of the optimization and the number of iterations between image frames, and realizing high-precision real-time image processing under high-speed imaging. The invention can be used in the field of precision measurement; by selecting and matching cameras and lenses of different types and combining them with the proposed algorithm, real-time high-precision detection of planar coupled three-degree-of-freedom motion can be realized.

Description

High-speed micro-vision tracking method and system for planar three-degree-of-freedom pose measurement
Technical Field
The invention belongs to the field of high-speed visual pose tracking for precision positioning platforms, and particularly relates to a high-speed micro-vision tracking method and system for planar three-degree-of-freedom pose measurement.
Background
With the development of information technology and the wide application of computer vision, image processing can now be used for real-time monitoring and tracking of moving targets. Visual tracking refers to detecting, extracting, identifying, and tracking a moving target in an image sequence to obtain its motion parameters (such as position, velocity, and acceleration) and its trajectory, which are further processed and analyzed to follow the target. The technology is widely applied in fields such as human-computer interaction, intelligent transportation, and intelligent robotics.
The target tracking method based on template matching is one of the existing visual tracking algorithms. In essence, the target to be tracked is used as a template image for template matching in the next frame, and the best matching position is taken as the tracking result. The template update strategy is an important research topic in template-matching-based tracking; it aims to keep the template information synchronized with the changing image information, adapt to geometric or gray-level distortion caused by camera motion, target motion, or other image noise, and improve tracking precision. After the best matching position is obtained, a sub-region of a certain size is cut out at that position as the updated template for matching in the next frame. Such updated templates reduce the risk of template failure to some extent, but each update incorporates the error of the current match, and as these errors accumulate from frame to frame they eventually cause tracking drift.
Therefore, in order to solve the error accumulation produced by updating the target template during iteration and to achieve high efficiency, high precision, and high bandwidth, the literature (Li H, Zhang X, Zhu B, et al. Online Precise Motion Measurement of 3-DOF Nanopositioners Based on Image Correlation [J]. IEEE Transactions on Instrumentation and Measurement, 2018: 1-9.) proposes using the target region selected in the first frame as the template, the sum-of-squared-differences function as the matching criterion, and the Gauss-Newton iterative method as the optimization algorithm, while updating the warping function during iteration with the inverse compositional (reverse synthesis) method, thereby realizing high-speed visual tracking. The literature (Li H, Zhang X, Yao S, et al. An improved template-matching-based pose tracking method for planar nanopositioning stages using enhanced correlation coefficient [J]. IEEE Sensors Journal, 2020.) studies the mutual constraint between measurement accuracy and measurement range and the low computational efficiency of micro-vision sensing; it proposes a tracking method based on an enhanced sum-of-squared-differences matching criterion, derives a solving algorithm based on the ECC operator, and converts the multi-degree-of-freedom motion problem into a multi-variable parameter optimization problem by combining planar Euclidean projective transformation with template matching. Building on the inverse compositional Gauss-Newton optimization algorithm, an optimized template selection strategy and a segmented-penalty acceleration scheme are proposed to further improve optimization efficiency and convergence accuracy, addressing the drop in computational efficiency caused by the increased complexity of the objective function. This scheme achieves higher accuracy in angle measurement and can effectively track a three-degree-of-freedom precision positioning platform.
However, neither of the above methods considers the number of algorithm iterations or simplifies the iterative process, so the tracking frame rate of the camera cannot be improved.
Disclosure of Invention
In order to solve the above problems, the present invention proposes a high-speed micro-vision tracking method for planar three-degree-of-freedom pose measurement. The Kalman filter converts the object tracking problem into an estimation problem for the posterior probability density of the system state, and when the input is a random signal generated by white noise, it minimizes the mean square error between the expected output and the actual output. Meanwhile, the prediction function of the Kalman filter is used to predict the position of the moving object in the next frame, converting the global image search into a local search; this reduces the search space, increases the matching speed, and improves the real-time performance of the system. This high-speed visual tracking method can overcome many defects of multi-sensor combined measurement schemes, provides visualization and a new idea for full closed-loop feedback of micro-nano positioning platforms, can expand their application range, can effectively improve the sampling frequency and tracking precision during tracking, and provides a new method for full feedback of subsequent precision positioning platforms.
The invention is realized at least by one of the following technical schemes.
The high-speed micro-vision tracking method for planar three-degree-of-freedom pose measurement comprises the following steps:
S1, acquiring motion information of the precision motion platform, and selecting picture information as the target area for tracking;
S2, obtaining the motion state expression of the tracking target according to the motion planning of the motion platform, obtaining the state transition matrix and control matrix of the Kalman filter from this expression, using Kalman filtering to obtain the optimal estimate of the state at the current moment, predicting the current value from the previous state value, and simultaneously predicting the state covariance matrix;
S3, obtaining the warping-function update value at the current moment from the optimal estimate of the current state, and converting the tracking process into a nonlinear optimization problem;
S4, obtaining the motion parameters of each iteration using template matching;
S5, after the warping function is updated, obtaining the position of the target template in the frame; tracking of the target is achieved once the position of the target image is found in each frame.
Further, step S2 includes the steps of:
predicting the current value from the previous state value according to the state transition matrix, and simultaneously predicting the state covariance matrix:

$$\hat{x}_k^- = A\hat{x}_{k-1} + Bu_k, \qquad P_k^- = AP_{k-1}A^T + Q$$

The position of the tracking target in the camera coordinate system is then obtained from the vision measurement, and the system state value, the Kalman gain, and the state covariance are updated:

$$K_k = P_k^- H^T \left(HP_k^- H^T + R\right)^{-1}, \qquad \hat{x}_k = \hat{x}_k^- + K_k\left(z_k - H\hat{x}_k^-\right), \qquad P_k = (I - K_k H)P_k^-$$

where $\hat{x}_k$ is the optimal estimate of the current state, $\hat{x}_k^-$ is the a priori state prediction, $P_k^-$ is the a priori (predicted) state covariance matrix, $A$ is the state transition matrix, $Q$ is the state transition covariance matrix, $H$ is the observation matrix, $R$ is the observation noise covariance matrix, $z_k$ is the observation value, $K_k$ is the Kalman gain, $B$ is the control matrix, $u_k$ is the control input of the system, $P_k$ is the state covariance matrix, and $I$ is the identity matrix.
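As a concrete illustration of the predict–update cycle described above, the following is a minimal Python sketch (NumPy-based; function names and all matrix values are ours, chosen for illustration, not taken from the patent):

```python
import numpy as np

def kf_predict(x, P, A, B, u, Q):
    """Prior prediction from the previous state (prediction step)."""
    x_prior = A @ x + B @ u
    P_prior = A @ P @ A.T + Q
    return x_prior, P_prior

def kf_update(x_prior, P_prior, z, H, R):
    """Kalman gain, posterior state, and posterior covariance (update step)."""
    S = H @ P_prior @ H.T + R                     # innovation covariance
    K = P_prior @ H.T @ np.linalg.inv(S)          # Kalman gain
    x_post = x_prior + K @ (z - H @ x_prior)      # optimal state estimate
    P_post = (np.eye(len(x_post)) - K @ H) @ P_prior
    return x_post, P_post
```

With a constant-velocity toy model, a single predict–update cycle pulls the prior toward the measurement in proportion to the gain K; in the patent's setting the prediction supplies the initial value for the template-matching iteration.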
Further, the warping (distortion) transfer function between the coordinate system $O_T\text{-}x'y'$ and the coordinate system $O_I\text{-}xy$ is denoted by $W$. For a planar Euclidean transform, the warping function is expressed as:

$$W(p; x') = \begin{bmatrix} W_x \\ W_y \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta & t_x \\ \sin\theta & \cos\theta & t_y \end{bmatrix} \begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix}$$

where $W(p; x')$ is the matrix of transformed target position coordinates, $p = (t_x, t_y, \theta)^T$ is the pose change vector, $x' = (x', y')^T$ is the coordinate position in the image before the change, $\theta$ is the rotation angle of the image transformation, $t_x$ and $t_y$ are the displacement components of the image transformation, and $W_x$ and $W_y$ are the transformed coordinates along the X and Y axes.
Further, in step S3, the optimal estimate of the state at the current moment provides the initial value of the warping-function update for template-matching-based visual tracking when the iteration starts, converting the tracking process into a nonlinear optimization problem. The objective function of the tracking target at the current moment is:

$$E(p) = \sum_{x' \in \Omega} \left[ \frac{T(x') - \bar{T}}{\Delta h} - \frac{I(W(p; x')) - \bar{I}}{\Delta g} \right]^2, \qquad p' = (t_x \ t_y \ \theta)^T$$

where $E(p)$ is the matching criterion and $p'$ is the parameter vector; $\theta$ is the rotation angle of the image transformation, and $t_x$ and $t_y$ are the displacement components of the image transformation. The pose change vector $p$ of the iterative process is obtained by optimizing the objective function, with its initial value taken from the Kalman-filter prediction: $t_{xk} = X(1,1)$, $t_{yk} = X(2,1)$, $\theta_k = X(3,1)$, where $X$ is the state-variable matrix obtained by Kalman filtering; $t_{xk}$ and $t_{yk}$ are the displacements of the target position along the X and Y axes in the state variable; $\theta_k$ is the rotation angle of the target position change in the state variable; $I(W(p; x'))$ is the gray value of the target area after the image change; $T(x')$ is the gray value of the selected template area; $\bar{T}$ and $\bar{I}$ are the corresponding mean gray values; $\Delta h$ and $\Delta g$ are the norms of the zero-mean template and sub-image gray values; and $\Omega$ is the set of all points on the image.
Further, the inverse compositional (reverse synthesis) method is selected to complete the template matching and optimize the objective function; the objective function at the k-th iteration is expressed as:

$$E(\Delta p_k) = \sum_{x' \in \Omega} \left[ \frac{h(W(\Delta p_k; x')) - \bar{h}}{\Delta h} - \frac{g(W(p_k; x')) - \bar{g}}{\Delta g} \right]^2$$

where $h(W(\Delta p_k; x'))$ is the gray value of the template region; $g(W(p_k; x'))$ is the gray value of the sub-image region; $\bar{h}$ is the mean gray value of the image in the template area; $\bar{g}$ is the mean gray value of the sub-image area; $\Delta h$ is the norm of the zero-mean template gray values; and $\Delta g$ is the norm of the zero-mean sub-image gray values.
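The zero-mean, norm-normalized intensities used in the criterion above make the residual invariant to gain and bias changes in illumination; a minimal Python sketch of that normalization (helper names are ours):

```python
import numpy as np

def zm_normalize(v):
    """Subtract the mean, then divide by the norm of the zero-mean vector."""
    v = np.asarray(v, dtype=float)
    v0 = v - v.mean()
    return v0 / np.linalg.norm(v0)

def ecc_residual(template, image):
    """Per-pixel residual of normalized intensities; the objective is the
    sum of squares of this residual."""
    return zm_normalize(template) - zm_normalize(image)
```

Two intensity vectors that differ only by a gain (and/or bias) produce a zero residual, which is exactly why this criterion is more robust than plain SSD under illumination change.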
Further, with $\Delta t_{xk}$, $\Delta t_{yk}$, $\Delta\theta_k$ of each iteration obtained from the per-iteration parameter $\Delta p_k$, the warping function is updated as:

$$W(p_{k+1}; x') = W(p_k; x') \circ W^{-1}(\Delta p_k; x')$$

where $\circ$ denotes the composition of warping functions; $\Delta t_{xk}$, $\Delta t_{yk}$, $\Delta\theta_k$ are the displacements of the target region along the X and Y axes and its rotation angle; $x'$ is a point on the image; $W(p_k; x')$ is the current warping function; and $W^{-1}(\Delta p_k; x')$ is the inverse of the incremental warp.
Further, the target tracking process is completed using gray-value-based template matching: the template-matching process is completed with the inverse compositional (reverse synthesis) method according to the optimization of the objective function, and the motion parameters of each iteration are obtained:

$$\Delta p_k = \left(J^T J\right)^{-1} J^T \left[ \frac{\Delta h}{\Delta g}\left(g(W(p_k; x')) - \bar{g}\right) - \left(h(x') - \bar{h}\right) \right]$$

where $g(W(p_k; x'))$ is the gray value of the sub-image region, $\bar{h}$ is the mean gray value of the image in the template area, $\Delta h$ is the norm of the zero-mean template gray values, $\bar{g}$ is the mean gray value of the sub-image region, and $J$ is the Jacobian matrix, whose expression is:

$$J = \begin{bmatrix} \dfrac{\partial T}{\partial x} & \dfrac{\partial T}{\partial y} \end{bmatrix} \frac{\partial W}{\partial p} = \begin{bmatrix} \dfrac{\partial T}{\partial x} & \dfrac{\partial T}{\partial y} \end{bmatrix} \begin{bmatrix} \dfrac{\partial W_x}{\partial t_x} & \dfrac{\partial W_x}{\partial t_y} & \dfrac{\partial W_x}{\partial \theta} \\[4pt] \dfrac{\partial W_y}{\partial t_x} & \dfrac{\partial W_y}{\partial t_y} & \dfrac{\partial W_y}{\partial \theta} \end{bmatrix}$$

where $W_x$ and $W_y$ are the transformed coordinates along the X and Y axes, $\partial T/\partial x$ and $\partial T/\partial y$ are the gradients of the target-region template in the selected first frame image (t = 1), and $\partial W/\partial p$ is the Jacobian of the warping function $W$ with respect to the pose change vector $p$;

$$\bar{g} = \frac{1}{N} \sum_{x' \in \Omega} g(W(p; x')), \qquad \Delta g = \left\| g(W(p_k; x')) - \bar{g} \right\|$$

where $\Delta g$ is the norm of the zero-mean gray values of the target sub-region, $N$ is the number of pixels of the template region, $g(W(p; x'))$ is the gray value of the sub-region pixels, $h(x')$ is the gray value vector of the target template region pixels, and $g(W(p_k; x'))$ is the gray value of the image region pixels after the iteration.
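Putting the pieces together, here is a deliberately simplified inverse-compositional tracking loop in Python, restricted to a pure-translation warp with nearest-pixel sampling — a sketch under our own simplifications, not the patent's full three-degree-of-freedom implementation:

```python
import numpy as np

def track_translation_ic(T, img, p0, iters=30):
    """Inverse-compositional Gauss-Newton for a pure-translation warp.
    T: template (h, w); img: search image; p0: initial (tx, ty) guess.
    The template gradient and Jacobian are computed once, outside the loop,
    which is the defining efficiency gain of the inverse compositional method."""
    h, w = T.shape
    gy, gx = np.gradient(T)                          # template gradients (computed once)
    J = np.stack([gx.ravel(), gy.ravel()], axis=1)   # dW/dp is identity for translation
    JtJ = J.T @ J                                    # Gauss-Newton normal matrix
    p = np.array(p0, dtype=float)
    for _ in range(iters):
        tx, ty = int(round(p[0])), int(round(p[1]))
        sub = img[ty:ty + h, tx:tx + w]              # nearest-pixel warp of the image
        if sub.shape != T.shape:
            break                                    # left the image; give up
        err = (sub - T).ravel()
        dp = np.linalg.solve(JtJ, J.T @ err)         # Gauss-Newton step
        if np.linalg.norm(dp) < 1e-3:
            break
        p -= dp                                      # inverse-compositional update
    return p
```

In the full method the warp also includes rotation, the sampling is interpolated, and the initial p0 comes from the Kalman prediction; here p0 is just a nearby guess supplied by the caller.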
Further, the motion plan of the end motion platform is uniformly accelerated linear motion with acceleration $a$. The state variable of the end motion platform is $X = [x \ v]^T$, with state transition matrix and control matrix

$$A = \begin{bmatrix} 1 & \Delta t \\ 0 & 1 \end{bmatrix}, \qquad B = \begin{bmatrix} \tfrac{1}{2}\Delta t^2 \\ \Delta t \end{bmatrix}$$

and observation matrix $H = [1 \ 1]$; $\Delta t$ is the image sampling interval, $x$ is the displacement of the target position along the X axis, and $v$ is the motion speed of the target image. $W(k)$ and $V(k)$ are mutually uncorrelated zero-mean white noise sequences satisfying:

$$E[W(k)] = 0, \quad \mathrm{Cov}[W(k), W(j)] = Q_k \delta_{kj}, \qquad E[V(k)] = 0, \quad \mathrm{Cov}[V(k), V(j)] = R_k \delta_{kj}, \qquad \mathrm{Cov}[W(k), V(j)] = 0$$

where $Q_k$ and $R_k$ are the covariance matrices of $W(k)$ and $V(k)$, respectively, and $\delta_{kj}$ is the Kronecker delta:

$$\delta_{kj} = \begin{cases} 1, & k = j \\ 0, & k \neq j \end{cases}$$

where $W(j)$ and $V(j)$ are zero-mean white noise sequences and $k$, $j$ are merely indices.
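With the matrices above, the constant-acceleration plan can be propagated exactly in discrete time; a short Python check (the Δt and a values are illustrative, not from the patent):

```python
import numpy as np

dt = 0.01                          # image sampling interval (illustrative value)
a = 2.0                            # planned acceleration (illustrative value)

A = np.array([[1.0, dt],
              [0.0, 1.0]])         # state transition for X = [x, v]
B = np.array([0.5 * dt**2, dt])    # control vector for input u = a

x = np.zeros(2)                    # start at rest at the origin
for _ in range(100):               # propagate 100 frames (t = 1 s)
    x = A @ x + B * a

# The discrete recursion reproduces x = a*t^2/2 and v = a*t exactly
```

Because the recursion sums v_k·Δt + a·Δt²/2 per step, the closed-form kinematics hold at every frame, so a Kalman filter built on this model has zero process-model error for the planned trajectory.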
Further, the displacement relationship between the output $(x, y, \theta)$ of the precision positioning stage and the capacitive sensors is as follows:

where $u_1$, $u_2$, $u_3$ are the measured values of the three capacitive sensors and $d$ is the distance between capacitive sensors No. 1 and No. 2. The precision positioning platform performs uniform circular motion with angular velocity $\omega$, and the state variable of this motion is $X = [x \ y \ v_x \ v_y]^T$, where $x$ and $y$ are the displacements of the target position along the X and Y axes, respectively, and $v_x$ and $v_y$ are the velocities of the target position along the X and Y axes, respectively. The corresponding state transition matrix and control matrix of this motion plan are as follows:

The observation matrix is as follows; $\Delta t$ is the image sampling interval, and $W(k)$ and $V(k)$ are mutually uncorrelated zero-mean white noise sequences satisfying:

$$E[W(k)] = 0, \quad \mathrm{Cov}[W(k), W(j)] = Q_k \delta_{kj}, \qquad E[V(k)] = 0, \quad \mathrm{Cov}[V(k), V(j)] = R_k \delta_{kj}$$

where $Q_k$ and $R_k$ are the covariance matrices of $W(k)$ and $V(k)$, respectively, and $\delta_{kj}$ is the Kronecker delta.
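The patent's transition matrix for the circular-motion case is not reproduced in the text above; for illustration only, a standard discrete "coordinated turn" form — an assumption on our part, not the patent's matrix — for X = [x y v_x v_y]ᵀ is:

```python
import numpy as np

def ct_transition(omega, dt):
    """Discrete state transition for uniform circular motion ('coordinated
    turn') with state X = [x, y, vx, vy]. This is a standard textbook form,
    assumed here for illustration; the patent's own matrix is not shown."""
    s, c = np.sin(omega * dt), np.cos(omega * dt)
    return np.array([
        [1.0, 0.0, s / omega,          -(1.0 - c) / omega],
        [0.0, 1.0, (1.0 - c) / omega,   s / omega],
        [0.0, 0.0, c,                  -s],
        [0.0, 0.0, s,                   c],
    ])
```

Propagating a state around one full revolution returns it to its starting point, which is a quick sanity check on the discretization.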
The system for realizing the above high-speed micro-vision tracking method for planar three-degree-of-freedom pose measurement comprises a vibration isolation table, a lens cone, a precision motion device, an industrial camera, an imaging system positioning platform, a mark pattern, a lens fixing sliding table, a coaxial light source, and a microscope lens. The industrial camera is connected with the lens cone and transmits acquired data to the computer for processing; the lens cone is arranged on the lens fixing sliding table; the microscope lens is connected with the lens cone; the lens fixing sliding table is arranged on the imaging system positioning platform, which is in turn arranged on the vibration isolation table; the mark pattern is located on the precision motion device, with the lens cone positioned above the mark pattern to facilitate information acquisition; the precision motion device is fixed on the vibration isolation table; and the coaxial light source is mounted on the microscope lens.
Compared with the prior art, the invention has the following beneficial effects:
(1) In the visual tracking process based on template matching, a Kalman filtering technology is added, the position of a tracking target in the next frame is predicted by utilizing the prediction function of the Kalman filtering technology, the image global search problem is converted into local search, the search space is reduced, the matching speed is increased, and the real-time performance of a system is improved.
(2) In the optimization of the solving algorithm, compared with the SSD operator used in earlier research, the method derives the algorithm with the more robust ECC operator. This reduces the average iteration time required by the tracking process and significantly lowers the tracking error, markedly improving the high-speed visual pose tracking performance of the precision positioning platform.
Drawings
FIG. 1 is a schematic structural flow diagram of a high-speed micro-vision tracking method for planar three-degree-of-freedom pose measurement according to an embodiment;
FIG. 2 is a block diagram of a high-speed micro-vision tracking system for planar three-degree-of-freedom pose measurement according to an embodiment;
FIG. 3 is a general block diagram of a high-speed micro-vision tracking system for planar three-degree-of-freedom pose measurement according to an embodiment;
FIG. 4 is a flow chart of a high-speed micro-vision tracking method for planar three-degree-of-freedom pose measurement according to an embodiment;
The figure shows: 1-vibration isolation table, 2-lens cone, 3-precision motion device, 4-industrial camera, 5-imaging system positioning platform, 6-mark pattern, 7-lens fixed slipway, 8-coaxial light source, 9-microscope lens.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Example 1
The application discloses a high-speed visual tracking method for the pose of a precision positioning platform based on motion prediction and iterative optimization, which is shown in figure 1 and comprises the following steps: the method comprises the steps of collecting motion information of a precision motion platform, preprocessing the motion information, planning the motion of the precision motion platform, predicting an optimal estimated value of a tracking target in a next frame by Kalman filtering, and completing visual tracking based on template matching. Specifically, the method comprises the following steps:
S1, adjusting the precision motion device so that the origin of the end motion platform coincides with the origin of the base, both being at their zero points;
S2, adjusting the sliding table on the camera positioning platform so that the mark pattern under the lens is in sharp focus;
S3, the industrial camera acquires image information of the initial position of the mark pattern and transmits it to the computer, and a template of suitable size is selected as the object for target tracking;
S4, according to the theoretical motion profile of the precision motion platform, converting the motion profile into pulse signals and transmitting them to the motor controller so that the precision motion platform produces the corresponding motion trajectory;
S5, obtaining the optimal estimate of the current state by Kalman filtering according to the theoretical motion state; the process is as follows:
After mounting of the tracking platform is completed and the camera is calibrated, the first-frame position of the mark pattern is acquired, and suitable picture information is selected as the target area for tracking. Next, according to the motion planning of the motion platform, the motion state space expression of the tracking target is obtained:
$$X_{k+1} = AX_k + Bu$$

where $u$ represents the control input of the system and $X_k$ represents the state variable of the system at time $k$.
According to the motion state expression, the state transition matrix $A$ and control matrix $B$ of the Kalman filter are obtained. With these parameters, the current value is predicted from the previous state value, and the state covariance matrix is predicted at the same time:

$$\hat{x}_k^- = A\hat{x}_{k-1} + Bu_k, \qquad P_k^- = AP_{k-1}A^T + Q$$

The system state value, the Kalman gain, and the state covariance are then updated according to the vision measurement:

$$K_k = P_k^- H^T \left(HP_k^- H^T + R\right)^{-1}, \qquad \hat{x}_k = \hat{x}_k^- + K_k\left(z_k - H\hat{x}_k^-\right), \qquad P_k = (I - K_k H)P_k^-$$

where $\hat{x}_k$ is the optimal estimate of the current state, $\hat{x}_k^-$ is the a priori state prediction, $P_k^-$ is the a priori (predicted) state covariance matrix, $A$ is the state transition matrix, $Q$ is the state transition covariance matrix, $H$ is the observation matrix, $R$ is the observation noise covariance matrix, $z_k$ is the observation value, $K_k$ is the Kalman gain, $B$ is the control matrix, $u_k$ is the control input of the system, $P_k$ is the state covariance matrix, and $I$ is the identity matrix.
After the Kalman-filter prediction of the target state is completed, the optimal estimate of the current state is obtained, i.e., the position of the target template ($t_{xk} = X(1,1)$, $t_{yk} = X(2,1)$, $\theta_k = X(3,1)$), and the warping function can be updated accordingly:
According to the optimal estimate at the current moment, the initial value of the warping-function update for template-matching-based visual tracking is obtained when the iteration starts, converting the tracking process into a nonlinear optimization problem. The objective function of the tracking target at the current moment is:

$$E(p) = \sum_{x' \in \Omega} \left[ \frac{T(x') - \bar{T}}{\Delta h} - \frac{I(W(p; x')) - \bar{I}}{\Delta g} \right]^2$$

where $E(p)$ is the matching criterion and $p' = (t_x \ t_y \ \theta)^T$ is the parameter vector, whose initial value is obtained from the Kalman-filter prediction: $t_{xk} = X(1,1)$, $t_{yk} = X(2,1)$, $\theta_k = X(3,1)$, with $X$ the state-variable matrix obtained by Kalman filtering. $t_{xk}$ and $t_{yk}$ are the displacements of the target position along the X and Y axes, $\theta_k$ is the rotation angle of the target position change, $I(W(p; x'))$ is the gray value of the target area after the image change, $T(x')$ is the gray value of the selected template area, $\bar{T}$ and $\bar{I}$ are the corresponding mean gray values, $\Delta h$ and $\Delta g$ are the norms of the zero-mean template and sub-image gray values, and $\Omega$ is the set of all points on the image.
The warping (distortion) transfer function between coordinate system $O_T\text{-}x'y'$ and coordinate system $O_I\text{-}xy$ is defined by $W$. For a planar Euclidean transform, the warping function can be expressed as:

$$W(p; x') = \begin{bmatrix} \cos\theta & -\sin\theta & t_x \\ \sin\theta & \cos\theta & t_y \end{bmatrix} \begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix}$$
the method for optimizing and solving the objective function mainly comprises the following steps:
1. For visual tracking based on regional gray values, gray-scale template matching is generally selected; common matching functions include the sum of squared differences (SSD), the sum of absolute differences (SAD), normalized cross-correlation (NCC), and normalized SSD.
2. Optimizing such a matching function is a nonlinear optimization problem. The optimization proceeds by repeated numerical calculation under a given logical structure, finding design points that continually decrease the objective function value until sufficient precision is obtained.
3. This example selects the inverse compositional (reverse synthesis) method to optimize the objective function. Compared with the other two methods, its calculation process is simpler: the template gradient, the Jacobian matrix, and the Hessian matrix are computed only once and need not be recomputed, which greatly reduces the amount of calculation and the iteration time.
Optimizing the above algorithm with the inverse compositional (reverse synthesis) method, the objective function at the k-th iteration can be expressed as:

$$E(\Delta p_k) = \sum_{x' \in \Omega} \left[ \frac{h(W(\Delta p_k; x')) - \bar{h}}{\Delta h} - \frac{g(W(p_k; x')) - \bar{g}}{\Delta g} \right]^2$$

where $\Delta p_k$ is the displacement vector and rotation angle obtained after the iteration is completed; $E(\Delta p_k)$ is the objective function; $h(W(\Delta p_k; x'))$ is the gray value of the template region; $g(W(p_k; x'))$ is the gray value of the target sub-region; $\bar{h}$ and $\bar{g}$ are the corresponding mean gray values; and $\Delta h$ and $\Delta g$ are the norms of the zero-mean template and sub-region gray values.
Tracking of the target is accomplished by template matching: the template-matching process is completed with the inverse compositional (reverse synthesis) method according to the objective function $E(\Delta p_k)$, and the motion parameters of each iteration are obtained as:

$$\Delta p_k = \left(J^T J\right)^{-1} J^T \left[ \frac{\Delta h}{\Delta g}\left(g(W(p_k; x')) - \bar{g}\right) - \left(h(x') - \bar{h}\right) \right]$$

where $J$ is the Jacobian matrix, whose expression can be written as:

$$J = \begin{bmatrix} \dfrac{\partial T}{\partial x} & \dfrac{\partial T}{\partial y} \end{bmatrix} \frac{\partial W}{\partial p}$$

in which $\partial T/\partial x$ and $\partial T/\partial y$ are the gradients of the target-region template in the selected first frame image (t = 1), and $\partial W/\partial p$ is the Jacobian of the warping function $W$ with respect to $p$. $h(x')$ is the gray value vector of the target template region pixels, and $g(W(p_k; x'))$ is the gray value vector of the sub-image region pixels.
From the above expression for $\Delta p_k$, the values $\Delta t_{xk}$, $\Delta t_{yk}$, $\Delta\theta_k$ of each iteration are obtained, and the warping function can be updated as:

$$W(p_{k+1}; x') = W(p_k; x') \circ W^{-1}(\Delta p_k; x')$$

where $\circ$ denotes the composition of warping functions, and $\Delta t_{xk}$, $\Delta t_{yk}$, $\Delta\theta_k$ are the iteration results of step 3 above, namely the displacements of the target region along the X and Y axes and its rotation angle.
S6, after the warping function is updated, the position of the target template in the frame can be obtained, and after the position of the target image in each frame is found, the target tracking process can be realized.
Example 2
As shown in FIG. 2, this embodiment provides a high-speed visual tracking device based on a macro-motion precision motion platform, comprising a vibration isolation table 1, a lens cone 2, a precision motion device 3, an industrial camera 4, an imaging system positioning platform 5, a mark pattern 6, a lens fixing sliding table 7, a coaxial light source 8, and a microscope lens 9. The industrial camera 4 is connected with the lens cone 2 through threads and transmits collected data to the computer over Ethernet for processing; the lens cone 2 is fixed on the lens fixing sliding table 7 by bolts; the microscope lens 9 is connected with the lens cone 2 through threads; the lens fixing sliding table 7 is fixed on the imaging system positioning platform 5 by bolts; the imaging system positioning platform 5 is connected with the vibration isolation table 1 by bolts; the mark pattern 6 is fixed on the precision motion device 3, with the lens cone 2 located above the mark pattern 6 to facilitate information acquisition; and the precision motion device 3 is fixed on the vibration isolation table 1 by bolts.
In the imaging module, the area-array camera adopts a high-speed data transmission interface, and the lens is a low-distortion telecentric lens or a microscope lens that supports a coaxial light source. In the real-time tracking algorithm, iterative-optimization template tracking is combined with motion prediction: the current optimal state is estimated and the state at the next moment is predicted by Kalman filtering, which reduces the deviation of the optimization initial value and the number of iterations between image frames, and thus enables high-precision real-time image processing under high-speed imaging.
In this example, the industrial camera is a 4M180MCL of the FLARE CL series, with a resolution of 2048 × 2048 and a pixel size of 5.5 μm × 5.5 μm; the lens is a 0.06X GoldTL telecentric lens, so the magnification of the whole imaging system is 0.06, giving a theoretical measurement range of 187.73 mm × 187.73 mm. The motion range of the single-degree-of-freedom reciprocating platform is set to 80 mm, which lies within the field of view of the vision system and meets the design requirement.
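As a quick check of the quoted measurement range, the field of view follows from the sensor side length divided by the system magnification:

```python
pixels = 2048            # sensor resolution per side
pixel_size_mm = 5.5e-3   # 5.5 um pixel pitch
magnification = 0.06     # telecentric lens magnification

sensor_side_mm = pixels * pixel_size_mm          # 11.264 mm sensor side
field_of_view_mm = sensor_side_mm / magnification
print(round(field_of_view_mm, 2))  # 187.73
```

The 80 mm stroke of the reciprocating platform is well inside this 187.73 mm field of view, as the text states.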
In this example, the motion of the reciprocating platform is planned as uniformly accelerated linear motion with acceleration a, so the state variable can be represented by X = [x v]^T, with state transition matrix
A = [1 Δt; 0 1],
control matrix
B = [Δt²/2; Δt],
and observation matrix H = [1 1]. Δt is the image sampling interval. W(k) and V(k) are mutually uncorrelated white-noise sequences with zero mean, satisfying:
E[W(k)] = 0, E[W(k)W^T(j)] = Q_k δ_kj, E[V(k)] = 0, E[V(k)V^T(j)] = R_k δ_kj,
where Q_k and R_k are the covariance matrices of W(k) and V(k), respectively, and δ_kj is the Kronecker delta, i.e. δ_kj = 1 for k = j and δ_kj = 0 for k ≠ j.
The above formulas complete the prediction function of the Kalman filter; the subsequent template-matching-based visual tracking process is similar to that of Example 1.
Example 3
As shown in fig. 3, this example differs from Example 2 in that the precision motion platform is a three-degree-of-freedom compliant mechanism. The platform of Example 3 is driven by piezoelectric ceramics and has motion capability in the X-Y-Z axes, with a maximum stroke of 100 μm per axis and a resolution down to 0.2 nm. The vision system of Example 3 is configured as follows: the camera is a 2M360MCL of the FLARE CL series, with a resolution of 2048 × 1088 and a pixel size of 5.5 μm × 5.5 μm; the lens is a 20× objective from Mitutoyo (Japan), combined with a lens barrel from Navitar (USA), the barrel and camera being connected by a Navitar adapter. The theoretical measurement range of the system is 563.2 μm × 299.2 μm. Since the uniaxial maximum stroke of the compliant mechanism is 100 μm, it lies within the field of view of the vision system of this design.
The displacement relation between the output (x, y, θ) of the precision positioning platform and the capacitive sensors is as follows:
where u_1, u_2, u_3 are the measured values of the three capacitive sensors, and d is the distance between capacitors No. 1 and No. 2. The precision positioning platform performs uniform circular motion with angular velocity ω, the state variables of the motion being X = [x y v_x v_y]^T; the state transition matrix and control matrix for this motion plan are therefore obtained as follows:
The observation matrix is:
Δt is the image sampling interval, and W(k) and V(k) are mutually uncorrelated white-noise sequences with zero mean, satisfying:
E[W(k)] = 0, E[W(k)W^T(j)] = Q_k δ_kj, E[V(k)] = 0, E[V(k)V^T(j)] = R_k δ_kj,
where Q_k and R_k are the covariance matrices of W(k) and V(k), respectively, and δ_kj is the Kronecker delta, i.e. δ_kj = 1 for k = j and δ_kj = 0 for k ≠ j.
The above formulas complete the prediction function of the Kalman filter; the subsequent template-matching-based visual tracking process is similar to that of Example 1. The invention can be used in the field of precision measurement: by selecting and matching cameras and lenses of different types and combining them with the proposed algorithm, real-time high-precision detection of planar coupled three-degree-of-freedom motion is achieved, with a ratio of measurement range to measurement resolution of 10^4–10^5 and a tracking frequency above 1000 Hz.
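The patent's transition matrix for the uniform circular motion is not reproduced in this text; a textbook constant-turn-rate model over the state X = [x y v_x v_y]^T, consistent with that motion plan, can be sketched as follows (the model form, observation matrix, and numeric values are assumptions):

```python
import numpy as np

def turn_model(omega, dt):
    """Constant-turn-rate state transition for X = [x, y, vx, vy]^T
    (a textbook coordinated-turn model, assumed here)."""
    s, c = np.sin(omega * dt), np.cos(omega * dt)
    return np.array([
        [1.0, 0.0,        s / omega, -(1 - c) / omega],
        [0.0, 1.0, (1 - c) / omega,         s / omega],
        [0.0, 0.0,               c,                -s],
        [0.0, 0.0,               s,                 c],
    ])

# Vision observes the planar position (x, y) only -- an assumption
H = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0]])

# One revolution per second sampled at 1 kHz (illustrative)
A = turn_model(omega=2 * np.pi, dt=1e-3)
```

This discretization is exact for constant ω, so a state started on a circle propagates around it without drift; that makes the prediction a reliable warm start for the per-frame template-matching iteration.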
The foregoing is only illustrative of the present invention and is not intended to limit the scope of the invention, and all equivalent structures using the teachings of the present invention or directly or indirectly applied to other related technical fields are included in the scope of the present invention.

Claims (8)

1. A high-speed micro-vision tracking method for planar three-degree-of-freedom pose measurement, characterized by comprising the following steps:
S1, acquiring motion information of a precision motion platform, and selecting picture information as a target area for tracking;
S2, obtaining a motion state expression of a tracking target according to motion planning of a motion platform, obtaining a state transition matrix and a control matrix of Kalman filtering according to the motion state expression, obtaining an optimal estimated value of a state at the current moment by using Kalman filtering, predicting the current value according to a previous state value, and predicting a state covariance matrix;
The motion plan of the terminal motion platform is uniformly accelerated linear motion with acceleration a; its state variable is X = [x v]^T, the state transition matrix is A = [1 Δt; 0 1], the control matrix is B = [Δt²/2; Δt], and the observation matrix is H = [1 1]; Δt is the image sampling interval, x represents the displacement of the target position along the X-axis, and v represents the motion speed of the target image; W(k) and V(k) are mutually uncorrelated white-noise sequences with zero mean, satisfying:
E[W(k)] = 0, E[W(k)W^T(j)] = Q_k δ_kj, E[V(k)] = 0, E[V(k)V^T(j)] = R_k δ_kj,
where Q_k and R_k are the covariance matrices of W(k) and V(k), respectively, and δ_kj is the Kronecker delta, i.e. δ_kj = 1 for k = j and δ_kj = 0 for k ≠ j,
where W(j) and V(j) denote the white-noise sequences with zero mean, and k and j are merely time indices;
the displacement relation between the output (x, y, θ) of the precision positioning platform and the capacitive sensors is as follows:
where u_1, u_2, u_3 are the measured values of the three capacitive sensors, and d is the distance between capacitors No. 1 and No. 2; the precision positioning platform performs uniform circular motion with angular velocity ω, the state variables of the motion being X = [x y v_x v_y]^T, where x and y represent the displacement of the target position along the X- and Y-axis directions and v_x and v_y its speeds along the X- and Y-axis directions, respectively; the state transition matrix and control matrix of the corresponding motion plan are obtained as follows:
The observation matrix is:
Δt is the image sampling interval, and W(k) and V(k) are mutually uncorrelated white-noise sequences with zero mean, satisfying:
E[W(k)] = 0, E[W(k)W^T(j)] = Q_k δ_kj, E[V(k)] = 0, E[V(k)V^T(j)] = R_k δ_kj,
where Q_k and R_k are the covariance matrices of W(k) and V(k), respectively, and δ_kj is the Kronecker delta, i.e. δ_kj = 1 for k = j and δ_kj = 0 for k ≠ j;
S3, obtaining a distortion function update value at the current moment according to the optimal estimated value of the state at the current moment, and converting the tracking process into a nonlinear optimization problem;
S4, obtaining motion parameters of each iteration process by using template matching;
S5, after the warping function is updated, the position of the target template in the frame is obtained, and tracking of the target is achieved after the position of the target image in each frame is found.
2. The high-speed micro-vision tracking method for planar three-degree-of-freedom pose measurement according to claim 1, characterized in that step S2 comprises the steps of:
predicting the current value from the previous state value according to the state transition matrix, and simultaneously predicting the state covariance matrix:
x̂_k⁻ = A x̂_{k−1} + B u_k, P_k⁻ = A P_{k−1} Aᵀ + Q;
and obtaining the position of the tracking target in the camera coordinate system from the vision measurement, then updating the system state value, the Kalman gain and the state covariance:
K = P_k⁻ Hᵀ (H P_k⁻ Hᵀ + R)⁻¹, x̂_k = x̂_k⁻ + K (z_k − H x̂_k⁻), P_k = (I − K H) P_k⁻;
where x̂_k denotes the optimal estimate of the current state, x̂_k⁻ the a priori state prediction, P_k⁻ the a priori state covariance matrix, A the state transition matrix, Q the state transition covariance matrix, H the observation matrix, R the observation noise covariance matrix, z_k the observation value, K the Kalman gain, B the control matrix, P_k the state covariance matrix, u_k the control input of the system, and I the identity matrix.
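The prediction and update equations listed in this claim can be sketched generically as one Kalman cycle; the matrix shapes and noise values used below are illustrative assumptions:

```python
import numpy as np

def kalman_step(x, P, z, A, B, u, H, Q, R):
    """One Kalman cycle: predict the prior, then correct with observation z."""
    x_prior = A @ x + B @ u                      # a priori state prediction
    P_prior = A @ P @ A.T + Q                    # a priori covariance
    S = H @ P_prior @ H.T + R                    # innovation covariance
    K = P_prior @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_post = x_prior + K @ (z - H @ x_prior)     # optimal state estimate
    P_post = (np.eye(len(x)) - K @ H) @ P_prior  # covariance update
    return x_post, P_post
```

With a near-noiseless observation (tiny R) the estimate snaps to the measurement, while a large R leaves it near the motion-model prediction; tuning this balance is what suppresses initial-value deviation between frames.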
3. The high-speed micro-vision tracking method for planar three-degree-of-freedom pose measurement according to claim 1, characterized in that: a warping transfer function W is defined between the coordinate system O_T-x'y' and the coordinate system O_I-xy; for a planar Euclidean transform, the warping function is expressed as:
W(p; x') = [cos θ, −sin θ; sin θ, cos θ] x' + [t_x; t_y]
where W(p; x') denotes the transformed target position coordinates, p the pose-change vector, and x' the coordinate position in the image before the change; θ is the rotation angle of the image transformation; t_x and t_y are the displacement components of the image transformation.
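A minimal sketch of applying the planar Euclidean warp of this claim to a single point; the function name and tuple interface are assumptions:

```python
import numpy as np

def warp_point(p, xp):
    """Apply the planar Euclidean warp W(p; x') to a point x' = (x, y).
    p = (t_x, t_y, theta) is the pose-change vector of claim 3."""
    tx, ty, th = p
    c, s = np.cos(th), np.sin(th)
    x, y = xp
    # rotate by theta, then translate by (t_x, t_y)
    return np.array([c * x - s * y + tx,
                     s * x + c * y + ty])
```

For example, rotating the point (1, 0) by 90° and translating by (1, 2) lands it at (1, 3).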
4. The high-speed micro-vision tracking method for planar three-degree-of-freedom pose measurement according to claim 1, characterized in that: in step S3, according to the optimal estimate of the state at the current moment, the initial value of the warping-function update for template-matching-based visual tracking at the current moment is obtained when the iteration starts, and the tracking process is converted into a nonlinear optimization problem, the objective function of the tracking target at the current moment being:
min_{p'} Σ_{x'∈Ω} [I(W(p; x')) − T(x')]²
p' = (t_x t_y θ)^T
where Σ_{x'∈Ω}[I(W(p; x')) − T(x')]² is the matching criterion and p' is the parameter vector; θ is the rotation angle of the image transformation, and t_x and t_y are the displacement components of the image transformation; the pose-change vector p of the iterative process is obtained by optimizing the objective function, its initial value being given by the Kalman-filter prediction: t_xk = X(1,1), t_yk = X(2,1), θ_k = X(3,1), where X denotes the state-variable matrix obtained by the Kalman filter; t_xk and t_yk are the displacements of the target position along the X- and Y-axes in the state variables; θ_k is the rotation angle of the target position change in the state variables; I(W(p; x')) is the gray value of the target region after the image change, T(x') is the gray value of the selected template region, and Ω is the set of all points in the image.
5. The high-speed micro-vision tracking method for planar three-degree-of-freedom pose measurement according to claim 4, characterized in that: template matching is completed by the inverse compositional method, which carries out the optimization of the objective function; the objective function of the k-th iteration is expressed as:
min_{Δp_k} Σ_{x'∈Ω} [ (h(W(Δp_k; x')) − h̄)/Δh − (g(W(p_k; x')) − ḡ)/Δg ]²
where h(W(Δp_k; x')) is the gray value of the template region; g(W(p_k; x')) is the gray value of the sub-image region; h̄ is the mean gray value of the image in the template region; ḡ is the mean gray value of the sub-image region; Δh is the norm of the template-region gray values about their mean; and Δg is the norm of the target sub-region gray values about their mean.
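The zero-normalized criterion of this claim can be sketched as follows, assuming the standard mean-centered, norm-scaled form; this normalization makes the match invariant to linear gray-level (illumination) changes:

```python
import numpy as np

def znssd(h, g):
    """Zero-normalized SSD between template gray values h and
    sub-image gray values g: each vector is mean-centered and
    scaled by its norm about the mean (the assumed standard form
    of the claim's criterion)."""
    dh = np.linalg.norm(h - h.mean())   # Delta-h of the claim
    dg = np.linalg.norm(g - g.mean())   # Delta-g of the claim
    return np.sum(((h - h.mean()) / dh - (g - g.mean()) / dg) ** 2)
```

A gain-and-offset change of the sub-image (g = a·h + b) gives a criterion of zero, which is why this form is robust to the illumination variation of a coaxial light source.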
6. The high-speed micro-vision tracking method for planar three-degree-of-freedom pose measurement according to claim 5, characterized in that: Δt_xk, Δt_yk, Δθ_k of each iteration are obtained from the per-iteration parameter Δp_k, and the warping function is updated as:
W(p_{k+1}; x') = W(p_k; x') ∘ W^{−1}(Δp_k; x')
where '∘' denotes composition of warping functions; Δt_xk, Δt_yk, Δθ_k are respectively the displacement components along the X- and Y-axes and the rotation angle of the moving target region; x' denotes a point in the image; W(p_k; x') is the warping function and W^{−1}(Δp_k; x') is the inverse of the incremental warp with parameter Δp_k.
7. The high-speed micro-vision tracking method for planar three-degree-of-freedom pose measurement according to claim 1, characterized in that: the target tracking process is completed by gray-value-based template matching; the template matching is carried out by the inverse compositional method through optimization of the objective function, yielding the motion parameters of each iteration:
where g(W(p_k; x')) is the gray value of the transformed sub-region, h̄ is the mean gray value of the image in the template region, Δh is the norm of the template-region gray values about their mean, ḡ is the mean gray value of the sub-image region, and J denotes the Jacobian matrix, expressed as:
where W_x and W_y are the coordinates along the X- and Y-axes after the image transformation, ∂T/∂W_x and ∂T/∂W_y are the gradients of the target-region template in the selected first frame image (t = 1), and ∂W/∂p is the Jacobian matrix of the warping function W with respect to p, p denoting the pose-change vector;
where Δg is the norm of the target sub-region gray values about their mean, and N denotes the number of pixels in the template region;
g(W(p; x')) denotes the gray value of a sub-region pixel, h(x') is the gray-value vector of the target template region pixels, and g(W(p_k; x')) is the gray value of the image-region pixels after the iteration.
8. A system for implementing the high-speed micro-vision tracking method for planar three-degree-of-freedom pose measurement according to claim 1, characterized in that it comprises a vibration isolation table, a lens barrel, a precision motion device, an industrial camera, an imaging system positioning platform, a logo pattern, a lens fixing sliding table, a coaxial light source and a microscope lens; the industrial camera is connected to the lens barrel and transmits the acquired data to a computer for processing; the lens barrel is mounted on the lens fixing sliding table; the microscope lens is connected to the lens barrel; the lens fixing sliding table is mounted on the imaging system positioning platform, which is mounted on the vibration isolation table; the logo pattern is located on the precision motion device, with the lens barrel above it to facilitate information acquisition; the precision motion device is fixed on the vibration isolation table; and the coaxial light source is mounted on the microscope lens.
CN202210655670.1A 2022-06-10 2022-06-10 High-speed micro-vision tracking method and system for planar three-degree-of-freedom pose measurement Active CN115063451B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210655670.1A CN115063451B (en) 2022-06-10 2022-06-10 High-speed micro-vision tracking method and system for planar three-degree-of-freedom pose measurement

Publications (2)

Publication Number Publication Date
CN115063451A CN115063451A (en) 2022-09-16
CN115063451B (en) 2024-06-04

Family

ID=83199549




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant