CN113051767A - AGV sliding mode control method based on visual servo - Google Patents

AGV sliding mode control method based on visual servo Download PDF

Info

Publication number
CN113051767A
CN113051767A CN202110370699.0A CN202110370699A CN113051767A CN 113051767 A CN113051767 A CN 113051767A CN 202110370699 A CN202110370699 A CN 202110370699A CN 113051767 A CN113051767 A CN 113051767A
Authority
CN
China
Prior art keywords
agv
sliding mode
camera
coordinate system
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110370699.0A
Other languages
Chinese (zh)
Inventor
邢科新
林叶贵
李星宏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shaoxing Mindong Technology Co ltd
Original Assignee
Shaoxing Mindong Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shaoxing Mindong Technology Co ltd filed Critical Shaoxing Mindong Technology Co ltd
Priority to CN202110370699.0A priority Critical patent/CN113051767A/en
Publication of CN113051767A publication Critical patent/CN113051767A/en
Pending legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00Computer-aided design [CAD]
    • G06F30/20Design optimisation, verification or simulation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • G06F17/16Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2119/00Details relating to the type or aim of the analysis or the optimisation
    • G06F2119/02Reliability analysis or reliability optimisation; Failure analysis, e.g. worst case scenario performance, failure mode and effects analysis [FMEA]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Analysis (AREA)
  • Computational Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Algebra (AREA)
  • Computing Systems (AREA)
  • Computer Hardware Design (AREA)
  • Evolutionary Computation (AREA)
  • Geometry (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses an AGV sliding mode control method based on visual servo, which comprises the following steps: step 1) establishing a camera imaging model, acquiring feature points through a camera, obtaining pixel coordinates of a current image, and obtaining world coordinates through coordinate transformation; step 2) performing kinematic modeling on the AGV, and establishing a kinematic model of the AGV; step 3) designing a corresponding sliding mode controller according to the kinematics model; and 4) adding the world coordinates obtained in the step 1) into a designed sliding mode controller, and enabling the AGV to move to a target point. The real-time position of the AGV can be accurately acquired through the visual servo; through a sliding mode control algorithm, the AGV can realize good track tracking.

Description

AGV sliding mode control method based on visual servo
Technical Field
The invention relates to the technical field of industrial robots, in particular to an AGV sliding mode control method based on visual servo.
Background
The AGV control algorithm comprises a classical PID control algorithm, a self-adaptive control algorithm, an intelligent control algorithm, a sliding mode variable structure control algorithm and the like. For different control requirements we need to choose the most suitable control algorithm. The guidance modes commonly used by AGVs at present include electromagnetic guidance, magnetic tape guidance, inertial guidance, ultrasonic positioning guidance, laser guidance, visual guidance, and the like. Different guidance methods are required in different applications to meet the needs of the industry.
Disclosure of Invention
The invention aims to provide an AGV sliding mode control algorithm based on visual servo, which can accurately acquire the real-time position of the AGV through the visual servo; through a sliding mode control algorithm, the AGV can realize excellent track tracking.
In order to achieve the above purpose, the invention provides the following technical scheme:
an AGV sliding mode control method based on visual servo comprises the following steps:
step 1) establishing a camera imaging model, acquiring feature points through a camera, obtaining pixel coordinates of a current image, and obtaining world coordinates through coordinate transformation;
step 2) performing kinematic modeling on the AGV, and establishing a kinematic model of the AGV;
step 3) designing a corresponding sliding mode controller according to the kinematics model;
and 4) adding the world coordinates obtained in the step 1) into a designed sliding mode controller, and enabling the AGV to move to a target point.
Further, the coordinate transformation in step 1) is performed by firstly transforming from the pixel coordinate system to the image physical coordinate system, then transforming from the image physical coordinate system to the camera coordinate system, and then transforming from the camera coordinate system to the world coordinate system.
Further, the specific process of coordinate transformation in step 1) is as follows:
1.1) converting the image pixel coordinates to image physical coordinates by a pixel conversion matrix:
Figure BDA0003009183360000021
in the formula: [ u v 1]TRepresenting an image pixel coordinate system, [ x y 1 ]]TRepresenting an image physical coordinate system; dx and dy respectively represent the unit length number occupied by one pixel in the x direction and the y direction; u. of0、v0Center pixel coordinates and map representing an imageThe number of horizontal and vertical pixels of the phase difference between the pixel coordinates of the image origin;
1.2) converting the image physical coordinates to camera coordinates through a focus diagonal matrix and a distortion coefficient:
Figure BDA0003009183360000022
in the formula: [ X ]c Yc Zc]TDenotes a camera coordinate system, [ x y 1 ]]TRepresenting a normalized image physical coordinate system; f represents a focal length;
1.3) external reference matrix
Figure BDA0003009183360000023
Solving, wherein R is a rotation matrix, and T is a translation vector;
1.4) converting the camera coordinates to world coordinates by:
Figure BDA0003009183360000031
in the formula: [ X ]c Yc Zc]TDenotes a camera coordinate system, [ X ]m Ym Zm]TRepresenting a world coordinate system.
Further, the external parameter matrix
Figure BDA0003009183360000032
The solving process comprises the steps of matching the current image and the image of the target point, estimating and optimizing the mathematical mapping relation corresponding to the two images, combining the internal parameters of the camera obtained by calibrating the camera, and calculating the external parameter matrix of the camera by adopting a fast decomposition algorithm
Figure BDA0003009183360000033
Further, the external parameter matrix
Figure BDA0003009183360000034
The solving process of (2) is specifically as follows:
1.3.1) estimation and optimization of projective homography matrix G
Corresponding pixel p of the characteristic point between the current image and the image of the target pointi,pi *Correlating by matrix G:
pi=λiGpi *
in the formula: p is a radical ofi=[1,ui,vi]T;pi *=[1,ui *,vi *]T(ii) a Pixel coordinate (u)i,vi) And (u)i *,vi *) Respectively representing the pixel coordinates of the current image and the target image; lambda [ alpha ]iRepresenting static characteristic points Oi(i ═ 1,2,3 …, n) in the current camera coordinate system F and the desired camera coordinate system F*The ratio of the depths of the lower portions;
after finishing, obtaining:
Figure BDA0003009183360000035
eliminating g11λiThis gives:
Figure BDA0003009183360000036
estimating 8 unknown variables in the projective homography matrix G by using the two linear constraint equations and adopting a least square method for pixel coordinates corresponding to more than four feature points under the condition of a difference of a scale factor, and finally obtaining the projective homography matrix G;
1.3.2) estimation of the Euclidean homography matrix H
Defining a 3X3 Euclidean homography matrix:
H=A-1GA
establishing a relation between the European homography matrix H and the relative pose parameters of the mobile robot, and expressing as follows:
Figure BDA0003009183360000041
n*=[nx *,ny *,nz *]Tis a target plane F under the camera's desired coordinate system*The term values of the unit normal vector of (2) are the unit normal vector n*In a coordinate system F*Projection of each coordinate axis; theta is a rotation angle between the current pose and the expected pose of the camera;
1.3.3) fast matrix decomposition to solve R and T
Estimating the proportional translation T between the current position and the expected position of the AGV, wherein the T is specifically represented as follows:
Figure BDA0003009183360000042
the corresponding relation between the European homography matrix and the relative pose parameters of the mobile robot is as follows:
Figure BDA0003009183360000043
wherein h isij(i 1,2, 3; j 1,2,3) represents the parameter value of the ith row of the Euclidean matrix H, the jth column, and n is a normal vector;
solving according to the corresponding relation to obtain a solution of theta and T, wherein
Figure BDA0003009183360000051
Finally obtaining the external parameter matrix
Figure BDA0003009183360000052
Further, the modeling process of step 2) is as follows:
2.1) pose of AGV by vector q ═ xm ymθm]TIs represented by vmAnd wmRespectively representing the integral linear speed and angular speed when the AGV advances; thus establishing a kinematic model of the AGV:
Figure BDA0003009183360000053
2.2) let qr=[xr yr θr]TAs coordinates of the desired pose, qe=[xe ye θe]TAnd if the position error coordinate is obtained, the position error equation of the AGV is as follows:
Figure BDA0003009183360000054
further, the design process of the sliding mode controller in the step 3) is as follows:
3.1) design the switching function as follows:
Figure BDA0003009183360000055
wherein S1,S2All tend to 0, xeToward 0, then yeAlso tends towards 0; wherein S1、S2Two quantities of the designed switching function;
3.2) designing a sliding mode controller by adopting a method combining an approximation rule and high gain:
Figure BDA0003009183360000056
wherein (x)r,yrr) And (v)rr) Respectively representing the expected pose and speed of the AGV, beta representing a virtual control quantity, epsilon1,ε2>0,0<a<1,k1,k2>0,ε1、ε2、a、k1、k2Are all constant.
Further, the specific process of step 4) is as follows:
4.1) converting the world coordinates (x) in step 1)m,ymm) As the real-time coordinates of the AGV, the coordinates (x) with the set coordinatesr,yrr) Inputting a position error equation of the wheeled mobile robot of the AGV in the step 3 together:
Figure BDA0003009183360000061
4.2) output by AGV pose error equation (x)e,yee) Inputting the sliding mode controller designed in the step 4
Figure BDA0003009183360000062
4.3) inputting the output angular velocity and the linear velocity of the sliding mode controller into the AGV, and converting v and omega into omega through motion decomposition1And ω2Wherein ω is1And ω2Respectively representing the angular velocity of the left wheel and the angular velocity of the right wheel;
4.4) the AGV acquires the world coordinate of the AGV at the moment through the step 1) and inputs the world coordinate into the error equation in the step 3 to form closed-loop control.
Compared with the prior art, the invention has the advantages that:
1. the invention adopts a position and pose estimation method of the homography matrix, and extracts the relative position and pose parameters of the AGV through the rapid decomposition of the homography matrix. The method is a process for acquiring three-dimensional attitude information from a two-dimensional image, a unique solution can be obtained under a common condition, and the algorithm efficiency is high.
2. Compared with the classical PID algorithm, the sliding mode control algorithm adopted by the invention has stronger anti-interference capability and better robustness.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further specifically described below by examples.
Example (b): an AGV sliding mode control method based on visual servo comprises the following specific steps:
step 1) establishing a camera imaging model, acquiring pictures through a camera on an AGV to obtain real-time pixel coordinates of characteristic points, and acquiring world coordinates of the AGV through calibration and coordinate transformation of the camera; the coordinate transformation is to transform the pixel coordinate system to an image physical coordinate system, then transform the image physical coordinate system to a camera coordinate system, and then transform the camera coordinate system to a world coordinate system; wherein: the world coordinate system refers to the coordinates of an object in the real world; the camera coordinate system is defined by taking the optical center of the camera lens as the origin of the camera coordinate system and taking the X and y directions parallel to the image as XcAxis and YcAxis, ZcAxis parallel to optical axis, Xc,Yc,ZcAre mutually vertical; the image physical coordinate system takes the intersection point of the main shaft and the image plane as the origin of coordinates, and the x-axis and the y-axis directions are respectively parallel to the u-axis and the v-axis; the pixel coordinate system is that the top left corner of the shot photo is taken as a vertex, the horizontal direction is a u axis, and the vertical direction is a v axis.
The specific steps of coordinate transformation are as follows:
1.1) converting the image pixel coordinates to image physical coordinates by a pixel conversion matrix:
Figure BDA0003009183360000071
in the formula: [ u v 1]TRepresenting an image pixel coordinate system, [ x y 1 ]]TRepresenting an image physical coordinate system; dx and dy respectively represent the unit length number occupied by one pixel in the x direction and the y direction; u. of0、v0The number of horizontal and vertical pixels representing the phase difference between the center pixel coordinate of the image and the image origin pixel coordinate.
1.2) converting the image physical coordinates to camera coordinates through a focus diagonal matrix and a distortion coefficient:
Figure BDA0003009183360000081
in the formula: [ X ]c Yc Zc]TDenotes a camera coordinate system, [ x y 1 ]]TRepresenting a normalized image physical coordinate system; f denotes a focal length.
1.3) external reference matrix
Figure BDA0003009183360000082
A solution is performed where R is the rotation matrix and T is the translation vector.
The solving process is as follows:
1.3.1) estimation and optimization of projective homography matrix G
According to the geometric knowledge, the projective homography matrix G represents the one-to-one mapping relation between the images shot by the camera at different positions in the motion process. Static feature points on a flat scene Oi(i is 1,2,3 …, n) moving to the current coordinate F and the target coordinate F when the mobile robot moves*Are respectively expressed as pi=[1,ui,vi]T,pi *=[1,ui *,vi *]TPixel coordinate (u)i,vi) And (u)i *,vi *) Respectively representing the pixel coordinates on the current image and the target image. Corresponding pixels p of the characteristic points between the current image and the image of the target point according to the geometric knowledge between the imagesi,pi *Correlating by matrix G:
pi=λiGpi *
in the formula: lambda [ alpha ]iRepresenting static characteristic points OiIn the current camera coordinate system F and the desired camera coordinate system F*The ratio of the depths of the lower portions.
After finishing, obtaining:
Figure BDA0003009183360000091
eliminating g11λiThis gives:
Figure BDA0003009183360000092
and (3) estimating 8 unknown variables in the projective homography matrix G by using the two linear constraint equations and adopting a least square method to the pixel coordinates corresponding to more than four feature points under the condition of a difference of a scale factor, and finally obtaining the projective homography matrix G.
1.3.2) estimation of the Euclidean homography matrix H
The projective homography matrix G represents the current image L and the target image L*Homogeneous pixel coordinate p corresponding to middle characteristic pointiAnd pi *The geometric transformation relationship between the camera and the camera implies the rotation angle between the current pose and the expected pose of the camera, the translation vector between the current position and the expected position of the camera and the interrelation between the internal parameters of the camera. Defining a 3X3 Euclidean homography matrix:
H=A-1GA
establishing a relation between the European homography matrix H and the relative pose parameters of the mobile robot, and expressing as follows:
Figure BDA0003009183360000093
n*=[nx *,ny *,nz *]Tis a target plane F under the camera's desired coordinate system*The term values of the unit normal vector of (2) are the unit normal vector n*In a coordinate system F*Projection of each coordinate axis; theta is a rotation angle between the current pose and the expected pose of the camera; since the plane on which the target scene is placed is not perpendicular to the plane of motion of the AGV, the unit normal vector n*N of (A) to (B)z *Not 0, i.e. nz *Not equal to 0; and d*Represents a coordinate system F*To the vertical distance of the target plane.
1.3.3) fast matrix decomposition to solve R and T
For the decomposition algorithm of the Euclidean homography matrix H, the process of extracting the three-dimensional information of the camera is essentially. Considering that the monocular vision system depth information is unknown, the proportional translation T between the current position and the expected position of the AGV can be estimated only under the condition that the difference is a scale factor, and the specific expression of T is as follows:
Figure BDA0003009183360000101
the corresponding relation between the European homography matrix and the relative pose parameters of the mobile robot is as follows:
Figure BDA0003009183360000102
wherein h isij(i 1,2, 3; j 1,2,3) represents the parameter value of the ith row of the Euclidean matrix H, the jth column, and n is a normal vector;
according to the above correspondence, discussion can be made by virtue of the following:
when h is generated13≠0,h23Not equal to 0, for this case, the relative pose parameter θ, T and the unit normal vector n between the current position coordinates and the expected position coordinates of the camera*There are two sets of solutions.
When h is generated13=0,h23When the position is not equal to 0, relative pose parameters theta and T between the current position coordinate and the expected position coordinate of the camera and a unit normal vector n*There are also two sets of solutions.
When h is generated13=0,h23When the position coordinate of the camera is equal to 0, relative position and orientation parameters theta, T between the current position coordinate and the expected position coordinate of the camera and a unit normal vector n*A set of solutions also exist.
Obtaining a solution of theta, T by the above solution, wherein
Figure BDA0003009183360000103
Finally, the external parameter matrix is obtained
Figure BDA0003009183360000104
1.4) converting the camera coordinates to world coordinates by means of a parameter matrix. When the world coordinate system and the camera coordinate system are converted, the world coordinate system and the camera coordinate system are expressed by homogeneous coordinates, and a 3x4 external reference matrix is pre-multiplied to obtain a relation formula of the world coordinate system and the camera coordinate system:
Figure BDA0003009183360000111
in the formula: [ X ]c Yc Zc]TDenotes a camera coordinate system, [ X ]m Ym Zm]TRepresenting a world coordinate system.
Substituting the external reference matrix obtained in the step 1.3) into the formula to obtain world coordinates.
Step 2) performing kinematic modeling on the AGV, and establishing a kinematic model of the AGV; the modeling process is as follows:
2.1) pose of AGV by vector q ═ xm ym θm]TIs represented by vmAnd wmRespectively representing the integral linear speed and angular speed when the AGV advances; thus establishing a kinematic model of the AGV:
Figure BDA0003009183360000112
2.2) let qr=[xr yr θr]TAs coordinates of the desired pose, qe=[xe ye θe]TAnd if the position error coordinate is obtained, the position error equation of the AGV is as follows:
Figure BDA0003009183360000113
step 3) designing a corresponding sliding mode controller according to the kinematics model; the design process of the sliding mode controller is as follows:
3.1) design the switching function as follows:
Figure BDA0003009183360000121
wherein S1,S2All tend to 0, xeToward 0, then yeAlso tends towards 0; wherein S1、S2Two quantities of the designed switching function;
3.2) the switching action of the designed switching function causes control discontinuity, so that the AGV has buffeting; adopting a method combining an approach law and a high gain to weaken buffeting of a system near a switching surface, firstly adopting the approach law method, and then combining a continuous thought of the high gain method to design a sliding mode controller, wherein the design sliding mode control law is as follows:
Figure BDA0003009183360000122
wherein (x)r,yrr) And (v)rr) Respectively representing the expected pose and speed of the AGV, beta representing a virtual control quantity, epsilon1,ε2>0,0<a<1,k1,k2>0,ε1、ε2、a、k1、k2Are all constant.
And 4) adding the world coordinates obtained in the step 1) into a designed sliding mode controller, and enabling the AGV to move to a target point. The specific process is as follows:
4.1) converting the world coordinates (x) in step 1)m,ymm) As the real-time coordinates of the AGV, the coordinates (x) with the set coordinatesr,yrr) Inputting a position error equation of the wheeled mobile robot of the AGV in the step 3 together:
Figure BDA0003009183360000123
4.2) output by AGV pose error equation (x)e,yee) Inputting the sliding mode controller designed in the step 4
Figure BDA0003009183360000131
4.3) inputting the output angular velocity and the linear velocity of the sliding mode controller into the AGV, and converting v and omega into omega through motion decomposition1And ω2Wherein ω is1And ω2Respectively representing the angular velocity of the left wheel and the angular velocity of the right wheel;
4.4) the AGV acquires the world coordinate of the AGV at the moment through the step 1) and inputs the world coordinate into the error equation in the step 3 to form closed-loop control.
The above embodiments are merely illustrative of the technical ideas and features of the present invention, and the purpose thereof is to enable those skilled in the art to understand the contents of the present invention and implement the present invention, and not to limit the protection scope of the present invention. All equivalent changes and modifications made according to the spirit of the present invention should be covered within the protection scope of the present invention.

Claims (8)

1. An AGV sliding mode control method based on visual servo is characterized by comprising the following steps:
step 1) establishing a camera imaging model, acquiring feature points through a camera, obtaining pixel coordinates of a current image, and obtaining world coordinates through coordinate transformation;
step 2) performing kinematic modeling on the AGV, and establishing a kinematic model of the AGV;
step 3) designing a corresponding sliding mode controller according to the kinematics model;
and 4) adding the world coordinates obtained in the step 1) into a designed sliding mode controller, and enabling the AGV to move to a target point.
2. The AGV sliding mode control method based on visual servo according to claim 1, wherein the coordinate transformation in step 1) is performed by firstly transforming from a pixel coordinate system to an image physical coordinate system, then transforming from the image physical coordinate system to a camera coordinate system, and then transforming from the camera coordinate system to a world coordinate system.
3. The AGV sliding mode control method based on the visual servo as claimed in claim 2, wherein the specific process of the coordinate transformation in the step 1) is as follows:
1.1) converting the image pixel coordinates to image physical coordinates by a pixel conversion matrix:
Figure FDA0003009183350000011
in the formula: [ u v 1]TRepresenting an image pixel coordinate system, [ x y 1 ]]TRepresenting an image physical coordinate system; dx and dy respectively represent the unit length number occupied by one pixel in the x direction and the y direction; u. of0、v0A number of horizontal and vertical pixels representing a phase difference between a central pixel coordinate of the image and an origin pixel coordinate of the image;
1.2) converting the image physical coordinates to camera coordinates through a focus diagonal matrix and a distortion coefficient:
Figure FDA0003009183350000021
in the formula: [ X ]c Yc Zc]TDenotes a camera coordinate system, [ x y 1 ]]TRepresenting a normalized image physical coordinate system; f represents a focal length;
1.3) external reference matrix
Figure FDA0003009183350000022
Solving, wherein R is a rotation matrix, and T is a translation vector;
1.4) converting the camera coordinates to world coordinates by:
Figure FDA0003009183350000023
in the formula: [ X ]c Yc Zc]TDenotes a camera coordinate system, [ X ]m Ym Zm]TRepresenting a world coordinate system.
4. The AGV sliding mode control method based on visual servo as claimed in claim 3, wherein the external parameter matrix is
Figure FDA0003009183350000024
The solving process comprises the steps of matching the current image and the image of the target point, estimating and optimizing the mathematical mapping relation corresponding to the two images, combining the internal parameters of the camera obtained by calibrating the camera, and calculating the external parameter matrix of the camera by adopting a fast decomposition algorithm
Figure FDA0003009183350000025
5. The AGV sliding mode control method based on visual servo as claimed in claim 4, wherein the external parameter matrix is
Figure FDA0003009183350000026
The solving process of (2) is specifically as follows:
1.3.1) estimation and optimization of projective homography matrix G
Corresponding pixel p of the characteristic point between the current image and the image of the target pointi,pi *Correlating by matrix G:
pi=λiGpi *
in the formula: p is a radical ofi=[1,ui,vi]T;pi *=[1,ui *,vi *]T(ii) a Pixel coordinate (u)i,vi) And (u)i *,vi *) Respectively representing the pixel coordinates of the current image and the target image; lambda [ alpha ]iRepresenting static characteristic points Oi(i ═ 1,2,3 …, n) in the current camera coordinate system F and the desired camera coordinate system F*The ratio of the depths of the lower portions;
after finishing, obtaining:
Figure FDA0003009183350000031
eliminating g11λiThis gives:
Figure FDA0003009183350000032
estimating 8 unknown variables in the projective homography matrix G by using the two linear constraint equations and adopting a least square method for pixel coordinates corresponding to more than four feature points under the condition of a difference of a scale factor, and finally obtaining the projective homography matrix G;
1.3.2) estimation of the Euclidean homography matrix H
Defining a 3X3 Euclidean homography matrix:
H=A-1GA
establishing a relation between the European homography matrix H and the relative pose parameters of the mobile robot, and expressing as follows:
Figure FDA0003009183350000033
n*=[nx *,ny *,nz *]Tis a target plane F under the camera's desired coordinate system*The term values of the unit normal vector of (2) are the unit normal vector n*In a coordinate system F*Projection of each coordinate axis; theta is taking upA rotation angle between a current pose of the camera and an expected pose;
1.3.3) fast matrix decomposition to solve R and T
Estimating the proportional translation T between the current position and the expected position of the AGV, wherein the T is specifically represented as follows:
Figure FDA0003009183350000041
the corresponding relation between the European homography matrix and the relative pose parameters of the mobile robot is as follows:
Figure FDA0003009183350000042
wherein h isij(i 1,2, 3; j 1,2,3) represents the parameter value of the ith row of the Euclidean matrix H, the jth column, and n is a normal vector;
solving according to the corresponding relation to obtain a solution of theta and T, wherein
Figure FDA0003009183350000043
Finally obtaining the external parameter matrix
Figure FDA0003009183350000044
6. The AGV sliding mode control method based on the visual servo as claimed in claim 1, wherein the modeling process of the step 2) is as follows:
2.1) pose of AGV by vector q ═ xm ym θm]TIs represented by vmAnd wmRespectively representing the integral linear speed and angular speed when the AGV advances; thus establishing a kinematic model of the AGV:
Figure FDA0003009183350000045
2.2) let qr=[xr yr θr]TAs coordinates of the desired pose, qe=[xe ye θe]TAnd if the position error coordinate is obtained, the position error equation of the AGV is as follows:
Figure FDA0003009183350000051
7. the AGV sliding mode control method based on the visual servo according to claim 6, wherein the sliding mode controller in the step 3) is designed as follows:
3.1) design the switching function as follows:
Figure FDA0003009183350000052
wherein S1,S2All tend to 0, xeToward 0, then yeAlso tends towards 0; wherein S1、S2Two quantities of the designed switching function;
3.2) designing a sliding mode controller by adopting a method combining an approximation rule and high gain:
Figure FDA0003009183350000053
wherein (x)r,yrr) And (v)rr) Respectively representing the expected pose and speed of the AGV, beta representing a virtual control quantity, epsilon1,ε2>0,0<a<1,k1,k2>0,ε1、ε2、a、k1、k2Are all constant.
8. The AGV sliding mode control method based on the visual servo as claimed in claim 7, wherein the specific process of the step 4) is as follows:
4.1) taking the world coordinate in the step 1) as a real-time coordinate of the AGV, and inputting the real-time coordinate and the set coordinate into a position error equation of the wheeled mobile robot of the AGV in the step 3 together:
Figure FDA0003009183350000054
4.2) output by AGV pose error equation (x)e,yee) Inputting the sliding mode controller designed in the step 4
Figure FDA0003009183350000061
4.3) inputting the output angular velocity and the linear velocity of the sliding mode controller into the AGV, and converting v and omega into omega through motion decomposition1And ω2Wherein ω is1And ω2Respectively representing the angular velocity of the left wheel and the angular velocity of the right wheel;
4.4) the AGV acquires the world coordinate of the AGV at the moment through the step 1) and inputs the world coordinate into the error equation in the step 3 to form closed-loop control.
CN202110370699.0A 2021-04-07 2021-04-07 AGV sliding mode control method based on visual servo Pending CN113051767A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110370699.0A CN113051767A (en) 2021-04-07 2021-04-07 AGV sliding mode control method based on visual servo

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110370699.0A CN113051767A (en) 2021-04-07 2021-04-07 AGV sliding mode control method based on visual servo

Publications (1)

Publication Number Publication Date
CN113051767A true CN113051767A (en) 2021-06-29

Family

ID=76517735

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110370699.0A Pending CN113051767A (en) 2021-04-07 2021-04-07 AGV sliding mode control method based on visual servo

Country Status (1)

Country Link
CN (1) CN113051767A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113625715A (en) * 2021-08-12 2021-11-09 上海海事大学 Rapid trajectory tracking control method for automatic container terminal AGV
CN113658221A (en) * 2021-07-28 2021-11-16 同济大学 Monocular camera-based AGV pedestrian following method

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5758298A (en) * 1994-03-16 1998-05-26 Deutsche Forschungsanstalt Fur Luft-Und Raumfahrt E.V. Autonomous navigation system for a mobile robot or manipulator
CN107037808A (en) * 2016-09-09 2017-08-11 东莞理工学院 Waggon autonomous tracing in intelligent vehicle based on sliding mode controller
CN109816687A (en) * 2017-11-20 2019-05-28 天津工业大学 The concurrent depth identification of wheeled mobile robot visual servo track following
CN110722533A (en) * 2018-07-17 2020-01-24 天津工业大学 External parameter calibration-free visual servo tracking of wheeled mobile robot
CN111103798A (en) * 2019-12-20 2020-05-05 华南理工大学 AGV path tracking method based on inversion sliding mode control
CN112578671A (en) * 2020-12-11 2021-03-30 上海应用技术大学 AGV track tracking control method based on U model optimization SMC

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5758298A (en) * 1994-03-16 1998-05-26 Deutsche Forschungsanstalt Fur Luft-Und Raumfahrt E.V. Autonomous navigation system for a mobile robot or manipulator
CN107037808A (en) * 2016-09-09 2017-08-11 东莞理工学院 Waggon autonomous tracing in intelligent vehicle based on sliding mode controller
CN109816687A (en) * 2017-11-20 2019-05-28 天津工业大学 The concurrent depth identification of wheeled mobile robot visual servo track following
CN110722533A (en) * 2018-07-17 2020-01-24 天津工业大学 External parameter calibration-free visual servo tracking of wheeled mobile robot
CN111103798A (en) * 2019-12-20 2020-05-05 华南理工大学 AGV path tracking method based on inversion sliding mode control
CN112578671A (en) * 2020-12-11 2021-03-30 上海应用技术大学 AGV track tracking control method based on U model optimization SMC

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
于晓龙: "移动机器人轨迹跟踪控制方法及自主导航性能评估技术研究", 《中国优秀博硕士学位论文全文数据库(硕士)信息科技辑》 *
林叶贵: "基于DM6437的移动机器人视觉系统研究与设计", 《中国优秀博硕士学位论文全文数据库(硕士)信息科技辑》 *
黄晓娟: "轮式机器人轨迹跟踪和路径规划算法研究", 《中国优秀博硕士学位论文全文数据库(硕士)信息科技辑》 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113658221A (en) * 2021-07-28 2021-11-16 同济大学 Monocular camera-based AGV pedestrian following method
CN113658221B (en) * 2021-07-28 2024-04-26 同济大学 AGV pedestrian following method based on monocular camera
CN113625715A (en) * 2021-08-12 2021-11-09 上海海事大学 Rapid trajectory tracking control method for automatic container terminal AGV
CN113625715B (en) * 2021-08-12 2024-04-09 上海海事大学 Automatic container terminal AGV rapid track tracking control method

Similar Documents

Publication Publication Date Title
CN110116407B (en) Flexible robot position and posture measuring method and device
Qiu et al. Visual servo tracking of wheeled mobile robots with unknown extrinsic parameters
CN110503688A (en) A kind of position and orientation estimation method for depth camera
CN111552293B (en) Mobile robot formation control method based on images under visual field constraint
CN113051767A (en) AGV sliding mode control method based on visual servo
CN110722533B (en) External parameter calibration-free visual servo tracking of wheeled mobile robot
Liu et al. Target tracking of moving and rotating object by high-speed monocular active vision
De Luca et al. Image-based visual servoing schemes for nonholonomic mobile manipulators
CN110136211A (en) A kind of workpiece localization method and system based on active binocular vision technology
CN114721275B (en) Visual servo robot self-adaptive tracking control method based on preset performance
Ziaei et al. Global path planning with obstacle avoidance for omnidirectional mobile robot using overhead camera
Liang et al. Calibration-free image-based trajectory tracking control of mobile robots with an overhead camera
CN110928311B (en) Indoor mobile robot navigation method based on linear features under panoramic camera
CN112947569A (en) Visual servo target tracking control method for quad-rotor unmanned aerial vehicle based on preset performance
CN117218210A (en) Binocular active vision semi-dense depth estimation method based on bionic eyes
Taherian et al. Image-based visual servoing improvement through utilization of adaptive control gain and pseudo-inverse of the weighted mean of the Jacobians
CN113211433A (en) Separated visual servo control method based on composite characteristics
CN109048911B (en) Robot vision control method based on rectangular features
CN109542094B (en) Mobile robot vision stabilization control without desired images
WO2020010625A1 (en) Method and system for optimizing kinematic model of robot, and storage device.
CN112767481B (en) High-precision positioning and mapping method based on visual edge features
Shao et al. Vision-based adaptive trajectory tracking control of wheeled mobile robot with unknown translational external parameters
Hwang et al. Robust 2D map building with motion-free ICP algorithm for mobile robot navigation
Yang et al. Optical-flow-based visual servoing for robotic moving control using closed-loop joints
CN112123370B (en) Mobile robot vision stabilization control with desired pose change

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20210629