CN112256001A - Visual servo control method for mobile robot under visual angle constraint - Google Patents


Info

Publication number
CN112256001A
Authority
CN
China
Prior art keywords
mobile robot
constraint
visual
servo
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011053311.6A
Other languages
Chinese (zh)
Other versions
CN112256001B (en)
Inventor
戴诗陆
梁健俊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology SCUT filed Critical South China University of Technology SCUT
Priority to CN202011053311.6A priority Critical patent/CN112256001B/en
Publication of CN112256001A publication Critical patent/CN112256001A/en
Application granted granted Critical
Publication of CN112256001B publication Critical patent/CN112256001B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G — PHYSICS
    • G05 — CONTROLLING; REGULATING
    • G05D — SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 — Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0094 — Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a visual servo control method for a mobile robot under a viewing-angle constraint, which comprises the following steps: constructing a kinematic model of the nonholonomic mobile robot and a projection model of the pinhole camera; combining the kinematic model of the mobile robot, carrying out a coordinate transformation on the projection model of the pinhole camera and establishing a system dynamic equation in terms of pixel coordinates; constructing a servo tracking error equation in terms of pixel coordinates, and converting the viewing-angle constraint of the pinhole camera into upper- and lower-bound constraints on the servo tracking error; and, based on a prescribed performance control method, constructing a barrier function to ensure that the servo control system satisfies the prescribed performance indexes and the viewing-angle constraint, and designing an image-based visual servo controller, a velocity observer and an adaptive parameter update law for the mobile robot with the Lyapunov design method. The controller, the observer and the parameter update law ensure that the viewing-angle constraint and the performance constraints are satisfied throughout the control process and that the closed-loop system is stable.

Description

Visual servo control method for mobile robot under visual angle constraint
Technical Field
The invention relates to the field of visual servo control of mobile robots, and in particular to a distributed visual servo control method for mobile robots that takes into account the viewing-angle constraint of the pinhole camera and control performance constraints.
Background
In recent years, the problem of controlling robot systems such as mobile robots, robot arms, and the like using visual feedback has become an important problem in the field of control engineering, and image-based visual servo (IBVS) control is one of the most popular topics among them. The visual servo control of the mobile robot has wide application prospect and can be applied to a plurality of fields such as military, industrial production, intelligent transportation and the like.
Compared with the visual servo control for a static target, the visual servo control of the mobile robot has the greatest characteristic that the tracked target is in continuous motion, and the realization of a servo controller usually needs to acquire the speed information of the target. This presents difficulties in the design of distributed servo controllers. One possible method is to install communication equipment for all the mobile robots, so that they can acquire the information required by themselves through communication, and further calculate their own control signals. However, this approach may increase the hardware cost of the system. Moreover, this method is even more impractical when the tracked object is a non-cooperative object. How to design a distributed visual servo controller without additional hardware and without restricting the controller application scenario is a significant issue.
Since visual servoing performs tracking control based on image information, the first problem it must solve is how to ensure that the tracked target (feature point) always remains within the viewing-angle range of the camera. In visual servo control, if the tracked target leaves the camera's field of view, the controller cannot acquire enough information and the servo task cannot be completed. This problem becomes even harder when the feature point is moving. Therefore, one of the key points in designing a visual servo controller is how to guarantee that the tracked target always stays within the viewing-angle range of the camera.
Another important issue in visual servo control of mobile robots is the transient and steady-state performance of the closed-loop system when the system contains unknown parameters and dynamic variables. In practical scenarios, because system modeling is imperfect and some system states are difficult to measure, the system model often contains unknown parameters or dynamic variables. In such a situation, it is difficult for existing methods to strictly guarantee that the closed-loop system meets specific transient and steady-state performance indexes. These transient and steady-state performance issues pose additional challenges to the design of visual servo controllers.
Disclosure of Invention
The invention mainly aims to overcome the defects of the prior art and provide a visual servo control method of a mobile robot under the visual angle constraint.
In order to achieve the purpose, the invention adopts the following technical scheme:
the invention provides a visual servo control method of a mobile robot under the constraint of visual angles, which comprises the following steps:
constructing a kinematic model of the nonholonomic mobile robot and a projection model of the pinhole camera;
coordinate transformation is carried out on the projection model of the pinhole camera by combining a kinematic model of the mobile robot, and a system dynamic equation about pixel coordinates is established;
constructing servo tracking error equations in terms of pixel coordinates, denoted here $e_{ui}$ and $e_{vi}$, where $i$ is the index of the tracked target robot and $j$ is the index of the mobile robot performing the servo tracking; because the pinhole camera has a viewing-angle constraint, and in order to avoid losing the target during the servo process of the mobile robot, the viewing-angle constraint of the pinhole camera is converted into upper- and lower-bound constraints on the servo tracking error by means of a prescribed performance control method;
constructing, based on the prescribed performance control method, barrier functions of the servo tracking errors $e_{ui}$ and $e_{vi}$ to ensure that the servo tracking errors always remain between the upper-bound and lower-bound performance functions; the constructed barrier functions are introduced into the Lyapunov design method, the unknown parameters and dynamic variables in the system are estimated and compensated by adaptive control and a velocity observer, and the image-based visual servo controller is designed.
Preferably, the kinematic model of the nonholonomic mobile robot is specifically:

$$\dot{x}_i = v_i\cos\theta_i,\qquad \dot{y}_i = v_i\sin\theta_i,\qquad \dot{\theta}_i = \omega_i$$

where $(x_i, y_i)$ is the position of the $i$-th mobile robot in the geodetic coordinate frame, $\theta_i$ is the heading angle of the $i$-th mobile robot in the geodetic coordinate frame, and $v_i$ and $\omega_i$ are the linear and angular velocities of the mobile robot relative to the geodetic coordinate frame.
Preferably, the pinhole camera model specifically includes:

$$Z_c\begin{bmatrix}u\\ v\\ 1\end{bmatrix} = A\begin{bmatrix}X_c\\ Y_c\\ Z_c\end{bmatrix}$$

where $(u, v)$ are the pixel coordinates, on the image plane, of the image of a space point, $(X_c, Y_c, Z_c)$ are the three-dimensional coordinates of the space point in the camera coordinate frame, and $A$ is the intrinsic (internal reference) matrix of the camera, of the form:

$$A = \begin{bmatrix} a_u & a_{uv} & u_0\\ 0 & a_v & v_0\\ 0 & 0 & 1\end{bmatrix}$$

where $a_u$ and $a_v$ are the scale factors along the $x$- and $y$-axes of the image plane, $a_{uv}$ is the skew (distortion) factor between the $x$- and $y$-axes, and $(u_0, v_0)$ are the pixel coordinates of the center point of the image plane.
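As a small illustration of the projection model above, the following Python sketch maps a point expressed in the camera frame to pixel coordinates and checks whether it lies inside the image; the numerical intrinsic values, image size and test point are assumptions chosen only for the example, not values from the patent.

```python
import numpy as np

# Assumed intrinsic matrix for illustration only: a_u, a_v (scale factors),
# a_uv = 0 (skew), and (u0, v0) the principal point in pixels.
A = np.array([[600.0,   0.0, 320.0],
              [  0.0, 600.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project(point_cam, K=A):
    """Project a 3D point (X_c, Y_c, Z_c) in the camera frame to pixel coordinates (u, v)."""
    Xc, Yc, Zc = point_cam
    if Zc <= 0:
        raise ValueError("Point is behind the camera (Z_c must be positive).")
    uvw = K @ np.array([Xc, Yc, Zc])   # Z_c * [u, v, 1]^T = A [X_c, Y_c, Z_c]^T
    return uvw[:2] / uvw[2]

def in_view(uv, width=640, height=480):
    """Check the viewing-angle (field-of-view) constraint expressed as pixel bounds."""
    u, v = uv
    return 0.0 <= u <= width and 0.0 <= v <= height

# Example: a point 2 m in front of the camera, slightly off-axis.
uv = project((0.1, -0.05, 2.0))
print(uv, in_view(uv))
```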
Preferably, combining the kinematic model of the mobile robot, the coordinate transformation of the projection model of the pinhole camera and the establishment of the system dynamic equation in terms of pixel coordinates are specifically as follows:

the camera mounted on the $j$-th mobile robot is referred to as the $j$-th camera for short, and its camera-frame coordinates are written $(X_{cj}, Y_{cj}, Z_{cj})$;

the body coordinate frame of the $j$-th mobile robot and the camera coordinate frame of the pinhole camera mounted on it are related by a transformation (given as an equation image in the original document), where $(X_j, Y_j, Z_j)$ are the coordinates of the space point in the body coordinate frame of the $j$-th mobile robot;

a feature point located on the $z$-axis of the $i$-th mobile robot's coordinate frame, abbreviated as the $i$-th feature point, is taken as the target point tracked by the $j$-th mobile robot, $i \neq j$; the coordinates of the feature point in the coordinate frames of the $i$-th mobile robot, the $j$-th mobile robot and the $j$-th camera are denoted accordingly, where $h_i$ denotes the height of the feature point; the first two coordinates are related by a transformation given as an equation image in the original document;

combining the two coordinate transformation relations with the projection model of the pinhole camera, an equation for the pixel coordinates is obtained (given as an equation image in the original document), in which $\bar p_i^j = [u_i^j, v_i^j]^T$ (notation used here) denotes the pixel coordinates of the $i$-th feature point on the image plane of the $j$-th camera and $a_1$, $a_2$ are the first and second rows of the intrinsic matrix $A$; differentiating this equation yields the system dynamic equation (given as an equation image in the original document), where $v'_i = v_i/h_i$ and the normalized pixel coordinates are defined as:

$$\begin{bmatrix}\bar u_i^j\\ \bar v_i^j\\ 1\end{bmatrix} = A^{-1}\begin{bmatrix}u_i^j\\ v_i^j\\ 1\end{bmatrix}.$$
preferably, for the j-th robot tracking the i-th robot, the servo tracking error is adjusted
Figure BDA0002710179450000042
And
Figure BDA0002710179450000043
is defined as:
Figure BDA0002710179450000044
wherein the content of the first and second substances,
Figure BDA0002710179450000045
is composed of
Figure BDA0002710179450000046
The expected value of (c) is,
Figure BDA0002710179450000047
and
Figure BDA0002710179450000048
are all constants;
view angle constraints of pinhole cameras: the field of view of a pinhole camera is limited, i.e. the camera has a range of horizontal viewing angles and a range of vertical viewing angles, called the viewing angle constraint of the pinhole camera, which are embodied in the image plane as a range of pixel coordinates, which is limited, i.e.:
umin≤u≤umax
vmin≤v≤vmax
wherein (u)min,vmin) And (u)max,vmax) The pixel coordinates of the upper left corner point and the pixel coordinates of the lower right corner point of the image plane are respectively combined with the definition about the servo tracking error to obtain the following error constraints:
Figure BDA0002710179450000049
preferably, in order to ensure that the closed-loop system can meet the specified transient and steady-state performance, a preset performance control method is used to introduce the following upper and lower bound performance constraints for the servo tracking error:
Figure BDA00027101794500000410
Figure BDA00027101794500000411
wherein the content of the first and second substances,β ui(t)、
Figure BDA00027101794500000412
β vi(t)、
Figure BDA00027101794500000413
for a pre-specified performance function, define asThe following:
Figure BDA00027101794500000420
Figure BDA00027101794500000419
in the above formula, the first and second carbon atoms are,
Figure BDA00027101794500000414
β *i,∞
Figure BDA00027101794500000415
represents the maximum allowable steady-state error, and respectively satisfies 0 <β *i,∞β *i,∞
Figure BDA00027101794500000416
Figure BDA00027101794500000417
For positive design parameters, ∈ { u, v }; the defined form of the performance function ensures that the performance constraint is more stringent than the view constraint, and when the servo tracking error satisfies the performance constraint, it must also satisfy the view constraint.
Preferably, the barrier functions $\eta_{ui}$ and $\eta_{vi}$ of the servo tracking errors are constructed (their explicit expressions are given as equation images in the original document) and have the following characteristics: the value of the barrier function tends to infinity if and only if the servo tracking error $e_{*i}$ approaches one of the performance functions $-\underline{\beta}_{*i}$ or $\bar{\beta}_{*i}$; and $\eta_{*i} = 0$ if and only if the servo tracking error $e_{*i} = 0$, with $* \in \{u, v\}$.
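For illustration, a minimal sketch of a barrier transformation with the two properties just listed is given below; the logarithmic form used here is a common choice in prescribed performance control and is an assumption, not the patent's specific barrier function (which is given only as equation images).

```python
import numpy as np

def barrier(e, beta_lower, beta_upper):
    """Illustrative barrier for a prescribed-performance constraint
    -beta_lower(t) < e(t) < beta_upper(t).

    Properties matching the description above:
      * |eta| blows up as e approaches -beta_lower or beta_upper,
      * eta equals 0 exactly when e = 0.
    This particular log form is an assumption for illustration only.
    """
    if not (-beta_lower < e < beta_upper):
        raise ValueError("error outside the prescribed performance bounds")
    return np.log(beta_upper * (beta_lower + e) / (beta_lower * (beta_upper - e)))

# Quick check of the stated properties.
print(barrier(0.0, 320.0, 320.0))      # 0.0 at e = 0
print(barrier(319.9, 320.0, 320.0))    # large positive value near the upper bound
print(barrier(-319.9, 320.0, 320.0))   # large negative value (|eta| large) near the lower bound
```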
Preferably, based on these characteristics of the barrier functions, the upper- and lower-bound constraints on the servo tracking error can be converted into a boundedness problem for the barrier functions, and a suitable controller can be designed by the Lyapunov design method; consider the Lyapunov function given (as an equation image) in the original document, in which $\eta_i = [\eta_{ui}, \eta_{vi}]^T$, $\gamma_1$ and $\gamma_2$ are positive design parameters, $\tilde h_i$ is the estimation error of $h_i$, $\hat h_i$ is the estimate of $h_i$, $\tilde v'_i$ is the estimation error of $v'_i$, and $\hat v'_i$ is the estimate of $v'_i$; according to the Lyapunov direct method, the servo controller is designed as given (as an equation image) in the original document, where $K_1$ is a positive-definite diagonal matrix and the remaining auxiliary terms of the control law are likewise defined by equations shown as images in the original document.
preferably, the method further comprises the following steps:
will be provided with
Figure BDA00027101794500000514
Adaptive update of rhythm and v'iThe observer of (a) is designed as:
Figure BDA00027101794500000515
Figure BDA00027101794500000516
wherein the content of the first and second substances,
Figure BDA00027101794500000517
is Gi -1First row of (a)1、σ2Are all design parameters greater than zero.
Preferably, the heights of the origins of the body coordinate systems of all the mobile robots from the ground are the same.
Compared with the prior art, the invention has the following advantages and beneficial effects:
1. The invention provides a distributed visual servo controller. Although the dynamic equation of the servo tracking error contains the linear velocity of the target robot, with the help of the velocity observer each mobile robot can compute its own control input relying only on information acquired by its own sensors, without any communication between robots. This guarantees a wide range of applications for the invention.
2. The controller provided by the invention can ensure that the transient and steady-state performance of the servo control system meets the preset performance requirement. By using the method of the preset performance control, the servo tracking error is restrained between the pre-specified upper and lower bound performance functions, and the transient state and steady state performance of the servo control system, such as overshoot, error convergence speed, steady state error and the like, can meet the design requirements.
3. The controller provided by the invention can ensure that the visual angle constraint of the pinhole camera can be always met. By designing a proper performance function, the invention can ensure that the tracked target is limited within the visual angle range of the pinhole camera in the whole control process, so that the servo control can be smoothly carried out.
Drawings
Fig. 1 is a schematic diagram of a positional relationship between coordinate systems according to an embodiment of the present invention.
Fig. 2 is a block diagram of the overall structure of a visual servo control system according to an embodiment of the present invention.
Fig. 3 is a diagram of a motion trajectory of the mobile robot according to the embodiment of the present invention.
Fig. 4 is a track diagram of the feature points on the image plane according to the embodiment of the present invention.
Fig. 5 is a simulation plot of the servo tracking error variable in the $u$-direction in an embodiment of the present invention.
Fig. 6 is a simulation plot of the servo tracking error variable in the $v$-direction in an embodiment of the present invention.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The visual servo control of mobile robots under the viewing-angle constraint in this embodiment involves two mobile robots in total: the first robot is the target robot, and the second robot is the robot performing the visual servoing. Fig. 1 is a schematic diagram of the positional relationship of the coordinate frames, Fig. 2 is a block diagram of the overall structure of the mobile robot visual servo control system under the viewing-angle constraint, and the detailed implementation of the control method comprises the following steps:

Step (1): establish the kinematic model of the $i$-th nonholonomic mobile robot:

$$\dot{x}_i = v_i\cos\theta_i,\qquad \dot{y}_i = v_i\sin\theta_i,\qquad \dot{\theta}_i = \omega_i$$

where $(x_i, y_i)$ is the position of the $i$-th mobile robot in the geodetic coordinate frame, $\theta_i$ is its heading angle in the geodetic coordinate frame, and $v_i$ and $\omega_i$ are its linear and angular velocities relative to the geodetic coordinate frame. In this example, the state of each robot at the initial time is $[x_1, y_1, \theta_1]^T = [0, 0, 0]^T$ and $[x_2, y_2, \theta_2]^T = [-1.5, 0, \pi/9]^T$, and the initial linear and angular velocities of both robots are 0. The robot serving as the tracking target moves along the following trajectory:

$$x_1(t) = 10\sin(0.1t),\qquad y_1(t) = 10\,[1-\cos(0.1t)],\qquad \theta_1(t) = 0.1t$$

From the above equations, the trajectory of the first robot is a circle with a radius of 10 meters. The motion trajectories of the two robots in this embodiment are shown in Fig. 3.
Further, according to the imaging principle of the pinhole camera, a point in three-dimensional space, its projected image on the image plane, and the optical center of the pinhole camera are collinear. Thus, the projection model of a pinhole camera can be described by:
$$Z_c\begin{bmatrix}u\\ v\\ 1\end{bmatrix} = A\begin{bmatrix}X_c\\ Y_c\\ Z_c\end{bmatrix}$$

In the above formula, $(u, v)$ are the pixel coordinates, on the image plane, of the image of a space point, $(X_c, Y_c, Z_c)$ are the three-dimensional coordinates of the space point in the camera coordinate frame, and $A$ is the intrinsic matrix of the camera, of the form:

$$A = \begin{bmatrix} a_u & a_{uv} & u_0\\ 0 & a_v & v_0\\ 0 & 0 & 1\end{bmatrix}$$

where $a_u$ and $a_v$ are the scale factors along the $x$- and $y$-axes of the image plane, $a_{uv}$ is the skew (distortion) factor between the $x$- and $y$-axes, and $(u_0, v_0)$ are the pixel coordinates of the center point of the image plane. For ease of distinction, the camera mounted on the $j$-th mobile robot is referred to as the $j$-th camera for short, and its camera-frame coordinates are written $(X_{cj}, Y_{cj}, Z_{cj})$. In this embodiment, the intrinsic matrix of the pinhole camera is specified numerically (given as an equation image in the original document).

Step (2): combining the kinematic model of the mobile robot, carry out the coordinate transformation of the projection model of the pinhole camera and establish the system dynamic equation in terms of pixel coordinates. In the present invention, the body coordinate frame of the $j$-th mobile robot and the camera coordinate frame of the pinhole camera mounted on it are related by a transformation given as an equation image in the original document, where $(X_j, Y_j, Z_j)$ are the coordinates of the space point in the body coordinate frame of the $j$-th mobile robot. In the present invention, the heights above the ground of the origins of the body coordinate frames of all mobile robots are the same.

Further, a feature point located on the $z$-axis of the $i$-th mobile robot's coordinate frame (hereinafter simply the $i$-th feature point) is taken as the target point to be tracked by the $j$-th mobile robot ($i \neq j$), and the coordinates of the feature point in the frames of the $i$-th robot, the $j$-th robot and the $j$-th camera are written accordingly, where $h_i$ denotes the height of the feature point; in this embodiment, $h_1 = 0.3$. The first two coordinates are related by the transformation given as an equation image in the original document.
combining the two coordinate transformation relations and the projection model of the pinhole camera, the following equation can be obtained:
[equation given as an image in the original document]

In the above formula, $\bar p_i^j = [u_i^j, v_i^j]^T$ (notation used here) denotes the pixel coordinates of the $i$-th feature point on the image plane of the $j$-th camera, and $a_1$, $a_2$ are the first and second rows of the intrinsic matrix $A$. Differentiating this formula and rearranging yields the system dynamic equation (given as an equation image in the original document), where $v'_i = v_i/h_i$ and the normalized pixel coordinates are defined as:

$$\begin{bmatrix}\bar u_i^j\\ \bar v_i^j\\ 1\end{bmatrix} = A^{-1}\begin{bmatrix}u_i^j\\ v_i^j\\ 1\end{bmatrix}$$

In this embodiment, the trajectory of the pixel coordinates $(u_1^2, v_1^2)$ of the 1st feature point on the image plane of the 2nd camera is shown in Fig. 4.
Step (3): for the $j$-th robot tracking the $i$-th robot, the servo tracking errors
$e_{ui}$ and $e_{vi}$ are defined as:

$$e_{ui} = u_i^j - u_{di},\qquad e_{vi} = v_i^j - v_{di}$$

where $u_{di}$ and $v_{di}$ are the desired values of $u_i^j$ and $v_i^j$, and are both constants. In this embodiment, the desired values are specified numerically (given as equation images in the original document).

Viewing-angle constraint of the pinhole camera: the field of view (FOV) of a pinhole camera is limited, i.e. the camera has one horizontal viewing-angle range and one vertical viewing-angle range, which are called the viewing-angle constraints of the pinhole camera. On the image plane, these two angle constraints are embodied as a limited range of pixel coordinates, namely:

$$u_{\min} \le u \le u_{\max},\qquad v_{\min} \le v \le v_{\max}$$

where $(u_{\min}, v_{\min})$ and $(u_{\max}, v_{\max})$ are the pixel coordinates of the top-left corner point and the bottom-right corner point of the image plane, respectively. In this embodiment, the pixel resolution of the camera is 640 × 480, i.e. $u_{\min} = v_{\min} = 0$, $u_{\max} = 640$, $v_{\max} = 480$. Combining this with the above definition of the servo tracking error, the following error constraints can be obtained:

$$u_{\min} - u_{di} \le e_{ui} \le u_{\max} - u_{di},\qquad v_{\min} - v_{di} \le e_{vi} \le v_{\max} - v_{di}.$$
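As a small illustration of this conversion, the snippet below turns the pixel bounds of the 640 × 480 image plane into bounds on the tracking errors; the desired pixel coordinates used here are placeholder values chosen for the example, since the embodiment's exact desired values are given only as images.

```python
# Viewing-angle constraint expressed as pixel bounds (640 x 480 image plane).
u_min, u_max = 0.0, 640.0
v_min, v_max = 0.0, 480.0

# Desired (constant) pixel coordinates of the tracked feature point.
# Placeholder values for illustration; the embodiment's values are not reproduced here.
u_d, v_d = 320.0, 160.0

# Error constraints obtained by shifting the pixel bounds by the desired values:
# u_min - u_d <= e_u <= u_max - u_d,   v_min - v_d <= e_v <= v_max - v_d
e_u_bounds = (u_min - u_d, u_max - u_d)
e_v_bounds = (v_min - v_d, v_max - v_d)
print("e_u in", e_u_bounds, " e_v in", e_v_bounds)
```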
further, in order to ensure that the closed-loop system can meet the specified transient and steady-state performance, a preset performance control method is used to introduce the following upper and lower bound performance constraints for the servo tracking error:
$$-\underline{\beta}_{ui}(t) < e_{ui} < \bar{\beta}_{ui}(t),\qquad -\underline{\beta}_{vi}(t) < e_{vi} < \bar{\beta}_{vi}(t)$$

where $\underline{\beta}_{ui}(t)$, $\bar{\beta}_{ui}(t)$, $\underline{\beta}_{vi}(t)$ and $\bar{\beta}_{vi}(t)$ are pre-designed performance functions, defined as follows:

$$\underline{\beta}_{*i}(t) = (\underline{\beta}_{*i,0} - \underline{\beta}_{*i,\infty})\,e^{-\underline{\kappa}_{*i}t} + \underline{\beta}_{*i,\infty},\qquad \bar{\beta}_{*i}(t) = (\bar{\beta}_{*i,0} - \bar{\beta}_{*i,\infty})\,e^{-\bar{\kappa}_{*i}t} + \bar{\beta}_{*i,\infty}$$

In the above formulas, $\underline{\beta}_{*i,\infty}$ and $\bar{\beta}_{*i,\infty}$ denote the maximum allowable errors and satisfy $0 < \underline{\beta}_{*i,\infty} \le \underline{\beta}_{*i,0}$ and $0 < \bar{\beta}_{*i,\infty} \le \bar{\beta}_{*i,0}$, and $\underline{\kappa}_{*i}$, $\bar{\kappa}_{*i}$ are positive design parameters, with $* \in \{u, v\}$. The defined form of the performance functions ensures that the performance constraint is more stringent than the viewing-angle constraint; whenever the servo tracking error satisfies the performance constraint, it also satisfies the viewing-angle constraint.
In one embodiment of the present application, the performance functions $\underline{\beta}_{ui}$, $\bar{\beta}_{ui}$, $\underline{\beta}_{vi}$ and $\bar{\beta}_{vi}$ take the specific forms

$$\underline{\beta}_{ui}(t) = (320-5)\,e^{-0.15t}+5,\qquad \underline{\beta}_{vi}(t) = (160-3)\,e^{-0.15t}+3,$$

with $\bar{\beta}_{ui}(t)$ and $\bar{\beta}_{vi}(t)$ given by analogous expressions (shown as equation images in the original document). From these expressions, the resulting constraint ranges of $e_{ui}$ and $e_{vi}$ (also shown as equation images in the original document) exactly satisfy the viewing-angle constraint of the pinhole camera.
Fig. 5 and 6 show the variation of the servo tracking error, and it can be seen that both tracking errors can converge to the vicinity of the zero point rapidly, and both error curves do not cross the boundary defined by the upper and lower bound performance functions during the whole control process. This indicates that the performance constraints and viewing angle constraints described above can always be satisfied.
Step (4): construct the following barrier functions of the servo tracking errors
$e_{ui}$ and $e_{vi}$:

[barrier functions given as equation images in the original document]

The above barrier functions have the following characteristics: the value of the barrier function tends to infinity if and only if the servo tracking error $e_{*i}$ approaches one of the performance functions $-\underline{\beta}_{*i}$ or $\bar{\beta}_{*i}$; and $\eta_{*i} = 0$ if and only if the servo tracking error $e_{*i} = 0$, $* \in \{u, v\}$. By virtue of these characteristics of the barrier functions, the upper- and lower-bound constraints on the servo tracking error can be converted into a boundedness problem for the barrier functions, and a suitable controller can then be designed by the Lyapunov design method. Consider the following Lyapunov function:
[Lyapunov function given as an equation image in the original document]

In the above formula, $\eta_i = [\eta_{ui}, \eta_{vi}]^T$, $\gamma_1$ and $\gamma_2$ are positive design parameters, $\tilde h_i$ is the estimation error of $h_i$ ($\hat h_i$ is the estimate of $h_i$), and $\tilde v'_i$ is the estimation error of $v'_i$ ($\hat v'_i$ is the estimate of $v'_i$). In this embodiment, $\gamma_1 = \gamma_2 = 0.05$, and the initial values of the estimates $\hat h_i$ and $\hat v'_i$ are all zero. According to the Lyapunov direct method, the servo controller is designed as:

[control law given as an equation image in the original document]

where $K_1$ is a positive-definite diagonal matrix; in this embodiment, $K_1 = \mathrm{diag}(5, 5)$. The remaining auxiliary terms appearing in the control law are defined by equations given as images in the original document.
Further,
the adaptive update law of $\hat h_i$ and the observer of $v'_i$ are designed as:

[adaptive update law and observer given as equation images in the original document]

where the vector appearing in the above equations (shown as an image in the original) is the first row of $G_i^{-1}$, and $\sigma_1$ and $\sigma_2$ are design parameters greater than zero; in this embodiment, $\sigma_1 = \sigma_2 = 0.15$.
The distributed visual servo controller of this embodiment enables the servo control system consisting of the mobile robot and the pinhole camera to track the designated feature point using only information acquired by the robot itself, and the servo tracking error converges into a small neighborhood of zero. Although the system model contains unknown parameters and dynamic variables, the closed-loop system still satisfies the prescribed transient and steady-state performance indexes. Moreover, the controller ensures that the feature point always remains within the viewing-angle range of the pinhole camera.

According to the method, the viewing-angle constraint of the pinhole camera is converted into upper- and lower-bound constraints on the servo tracking error, and these are finally converted, by the prescribed performance control technique, into a boundedness problem for the barrier functions. Further, a controller that guarantees the boundedness of the barrier functions is designed by the Lyapunov design method. By designing suitable performance functions, the method not only ensures that the viewing-angle constraint of the pinhole camera is always satisfied, but also guarantees, even when the height of the feature point and the target velocity are unknown, that transient and steady-state performance indexes of the closed-loop system, such as the error convergence rate and the steady-state error, meet the prescribed performance requirements.

In addition, the distributed visual servo controller provided by the invention does not need accurate velocity information of the target. By means of the designed velocity observer, the invention can estimate and compensate the target velocity without any communication between robots, which ensures that the controller can be applied in a wide variety of practical scenarios.
The above embodiments are preferred embodiments of the present invention, but the present invention is not limited to the above embodiments, and any other changes, modifications, substitutions, combinations, and simplifications which do not depart from the spirit and principle of the present invention should be construed as equivalents thereof, and all such changes, modifications, substitutions, combinations, and simplifications are intended to be included in the scope of the present invention.

Claims (10)

1. A visual servo control method for a mobile robot under a viewing-angle constraint, characterized by comprising the following steps:
constructing a kinematic model of the nonholonomic mobile robot and a projection model of the pinhole camera;
coordinate transformation is carried out on the projection model of the pinhole camera by combining a kinematic model of the mobile robot, and a system dynamic equation about pixel coordinates is established;
constructing servo tracking error equations in terms of pixel coordinates, $e_{ui}$ and $e_{vi}$, where $i$ is the index of the tracked target robot and $j$ is the index of the mobile robot performing the servo tracking; because the pinhole camera has a viewing-angle constraint, and in order to avoid losing the target during the servo process of the mobile robot, the viewing-angle constraint of the pinhole camera is converted into upper- and lower-bound constraints on the servo tracking error by means of a prescribed performance control method;
constructing, based on the prescribed performance control method, barrier functions of the servo tracking errors $e_{ui}$ and $e_{vi}$ that ensure the servo tracking errors always remain between the upper-bound and lower-bound performance functions, introducing the constructed barrier functions into the Lyapunov design method, and estimating and compensating the unknown parameters and dynamic variables in the system with adaptive control and a velocity observer, so as to design the image-based visual servo controller.
2. The visual servo control method for the mobile robot under the viewing-angle constraint of claim 1, wherein the kinematic model of the nonholonomic mobile robot is specifically:

$$\dot{x}_i = v_i\cos\theta_i,\qquad \dot{y}_i = v_i\sin\theta_i,\qquad \dot{\theta}_i = \omega_i$$

where $(x_i, y_i)$ is the position of the $i$-th mobile robot in the geodetic coordinate frame, $\theta_i$ is the heading angle of the $i$-th mobile robot in the geodetic coordinate frame, and $v_i$ and $\omega_i$ are the linear and angular velocities of the mobile robot relative to the geodetic coordinate frame.
3. The visual servo control method for the mobile robot under the viewing-angle constraint of claim 1, wherein the pinhole camera model is specifically:

$$Z_c\begin{bmatrix}u\\ v\\ 1\end{bmatrix} = A\begin{bmatrix}X_c\\ Y_c\\ Z_c\end{bmatrix}$$

In the above formula, $(u, v)$ are the pixel coordinates, on the image plane, of the image of a space point, $(X_c, Y_c, Z_c)$ are the three-dimensional coordinates of the space point in the camera coordinate frame, and $A$ is the intrinsic matrix of the camera, of the form:

$$A = \begin{bmatrix} a_u & a_{uv} & u_0\\ 0 & a_v & v_0\\ 0 & 0 & 1\end{bmatrix}$$

where $a_u$ and $a_v$ are the scale factors along the $x$- and $y$-axes of the image plane, $a_{uv}$ is the skew (distortion) factor between the $x$- and $y$-axes, and $(u_0, v_0)$ are the pixel coordinates of the center point of the image plane.
4. The visual servo control method for the mobile robot under the viewing-angle constraint of claim 3, wherein the coordinate transformation of the projection model of the pinhole camera and the establishment of the system dynamic equation in terms of pixel coordinates, combined with the kinematic model of the mobile robot, are specifically:

the camera mounted on the $j$-th mobile robot is referred to as the $j$-th camera for short, and its camera-frame coordinates are written $(X_{cj}, Y_{cj}, Z_{cj})$;

the body coordinate frame of the $j$-th mobile robot and the camera coordinate frame of the pinhole camera mounted on it are related by a transformation given as an equation image in the original document, where $(X_j, Y_j, Z_j)$ are the coordinates of the space point in the body coordinate frame of the $j$-th mobile robot;

a feature point located on the $z$-axis of the $i$-th mobile robot's coordinate frame, abbreviated as the $i$-th feature point, is taken as the target point tracked by the $j$-th mobile robot, $i \neq j$; the coordinates of the feature point in the coordinate frames of the $i$-th mobile robot, the $j$-th mobile robot and the $j$-th camera are denoted accordingly, where $h_i$ denotes the height of the feature point; the first two coordinates are related by a transformation given as an equation image in the original document;

combining the two coordinate transformation relations with the projection model of the pinhole camera, an equation for the pixel coordinates is obtained (given as an equation image in the original document), in which $\bar p_i^j = [u_i^j, v_i^j]^T$ denotes the pixel coordinates of the $i$-th feature point on the image plane of the $j$-th camera and $a_1$, $a_2$ are the first and second rows of the intrinsic matrix $A$; differentiating this equation yields the system dynamic equation (given as an equation image in the original document), where $v'_i = v_i/h_i$ and the normalized pixel coordinates are defined as:

$$\begin{bmatrix}\bar u_i^j\\ \bar v_i^j\\ 1\end{bmatrix} = A^{-1}\begin{bmatrix}u_i^j\\ v_i^j\\ 1\end{bmatrix}.$$
5. The visual servo control method for the mobile robot under the viewing-angle constraint of claim 1, wherein, for the $j$-th robot tracking the $i$-th robot, the servo tracking errors $e_{ui}$ and $e_{vi}$ are defined as:

$$e_{ui} = u_i^j - u_{di},\qquad e_{vi} = v_i^j - v_{di}$$

where $u_{di}$ and $v_{di}$ are the desired values of $u_i^j$ and $v_i^j$, and are both constants;

viewing-angle constraint of the pinhole camera: the field of view of a pinhole camera is limited, i.e. the camera has a horizontal viewing-angle range and a vertical viewing-angle range, which are called the viewing-angle constraint of the pinhole camera; on the image plane these are embodied as a limited range of pixel coordinates, i.e.:

$$u_{\min} \le u \le u_{\max},\qquad v_{\min} \le v \le v_{\max}$$

where $(u_{\min}, v_{\min})$ and $(u_{\max}, v_{\max})$ are the pixel coordinates of the top-left corner point and the bottom-right corner point of the image plane, respectively; combining this with the above definition of the servo tracking error gives the following error constraints:

$$u_{\min} - u_{di} \le e_{ui} \le u_{\max} - u_{di},\qquad v_{\min} - v_{di} \le e_{vi} \le v_{\max} - v_{di}.$$
6. The visual servo control method for the mobile robot under the viewing-angle constraint of claim 5, wherein, in order to ensure that the closed-loop system can meet the specified transient and steady-state performance, the prescribed performance control method is used to introduce the following upper- and lower-bound performance constraints on the servo tracking errors:

$$-\underline{\beta}_{ui}(t) < e_{ui} < \bar{\beta}_{ui}(t),\qquad -\underline{\beta}_{vi}(t) < e_{vi} < \bar{\beta}_{vi}(t)$$

where $\underline{\beta}_{ui}(t)$, $\bar{\beta}_{ui}(t)$, $\underline{\beta}_{vi}(t)$ and $\bar{\beta}_{vi}(t)$ are pre-specified performance functions, defined as follows:

$$\underline{\beta}_{*i}(t) = (\underline{\beta}_{*i,0} - \underline{\beta}_{*i,\infty})\,e^{-\underline{\kappa}_{*i}t} + \underline{\beta}_{*i,\infty},\qquad \bar{\beta}_{*i}(t) = (\bar{\beta}_{*i,0} - \bar{\beta}_{*i,\infty})\,e^{-\bar{\kappa}_{*i}t} + \bar{\beta}_{*i,\infty}$$

In the above formulas, $\underline{\beta}_{*i,\infty}$ and $\bar{\beta}_{*i,\infty}$ denote the maximum allowable steady-state errors and satisfy $0 < \underline{\beta}_{*i,\infty} \le \underline{\beta}_{*i,0}$ and $0 < \bar{\beta}_{*i,\infty} \le \bar{\beta}_{*i,0}$, and $\underline{\kappa}_{*i}$, $\bar{\kappa}_{*i}$ are positive design parameters, with $* \in \{u, v\}$; the defined form of the performance functions ensures that the performance constraint is more stringent than the viewing-angle constraint, and when the servo tracking error satisfies the performance constraint, it also satisfies the viewing-angle constraint.
7. The visual servo control method for the mobile robot under the viewing-angle constraint of claim 1, wherein the barrier functions $\eta_{ui}$ and $\eta_{vi}$ of the servo tracking errors are given as equation images in the original document and have the following characteristics: the value of the barrier function tends to infinity if and only if the servo tracking error $e_{*i}$ approaches one of the performance functions $-\underline{\beta}_{*i}$ or $\bar{\beta}_{*i}$; and $\eta_{*i} = 0$ if and only if the servo tracking error $e_{*i} = 0$, with $* \in \{u, v\}$.
8. The visual servo control method for the mobile robot under the viewing-angle constraint of claim 7, wherein, based on the characteristics of the barrier functions, the upper- and lower-bound constraints on the servo tracking error can be converted into a boundedness problem for the barrier functions, and a suitable controller is designed by the Lyapunov design method, considering the Lyapunov function given (as an equation image) in the original document, in which $\eta_i = [\eta_{ui}, \eta_{vi}]^T$, $\gamma_1$ and $\gamma_2$ are positive design parameters, $\tilde h_i$ is the estimation error of $h_i$, $\hat h_i$ is the estimate of $h_i$, $\tilde v'_i$ is the estimation error of $v'_i$, and $\hat v'_i$ is the estimate of $v'_i$; according to the Lyapunov direct method, the servo controller is designed as given (as an equation image) in the original document, where $K_1$ is a positive-definite diagonal matrix and the remaining auxiliary terms of the control law are likewise defined by equations shown as images in the original document.
9. The visual servo control method for the mobile robot under the viewing-angle constraint of claim 8, further comprising the following step:

the adaptive update law of $\hat h_i$ and the observer of $v'_i$ are designed as given (as equation images) in the original document, where the vector appearing in those equations is the first row of $G_i^{-1}$, and $\sigma_1$ and $\sigma_2$ are design parameters greater than zero.
10. The visual servo control method for the mobile robot under the viewing-angle constraint according to any one of claims 1 to 9, characterized in that the heights above the ground of the origins of the body coordinate frames of all mobile robots are the same.
CN202011053311.6A 2020-09-29 2020-09-29 Visual servo control method for mobile robot under visual angle constraint Active CN112256001B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011053311.6A CN112256001B (en) 2020-09-29 2020-09-29 Visual servo control method for mobile robot under visual angle constraint

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011053311.6A CN112256001B (en) 2020-09-29 2020-09-29 Visual servo control method for mobile robot under visual angle constraint

Publications (2)

Publication Number Publication Date
CN112256001A true CN112256001A (en) 2021-01-22
CN112256001B CN112256001B (en) 2022-01-18

Family

ID=74233901

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011053311.6A Active CN112256001B (en) 2020-09-29 2020-09-29 Visual servo control method for mobile robot under visual angle constraint

Country Status (1)

Country Link
CN (1) CN112256001B (en)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106940186A (en) * 2017-02-16 2017-07-11 华中科技大学 A kind of robot autonomous localization and air navigation aid and system
CN109358507A (en) * 2018-10-29 2019-02-19 东北大学 A kind of visual servo adaptive tracking control method of time-varying performance boundary constraint
CN109991636A (en) * 2019-03-25 2019-07-09 启明信息技术股份有限公司 Map constructing method and system based on GPS, IMU and binocular vision
CN109976347A (en) * 2019-04-11 2019-07-05 中南大学 A kind of visual servo paths planning method based on Quick Extended random tree and potential field method
CN110142785A (en) * 2019-06-25 2019-08-20 山东沐点智能科技有限公司 A kind of crusing robot visual servo method based on target detection
CN111197984A (en) * 2020-01-15 2020-05-26 重庆邮电大学 Vision-inertial motion estimation method based on environmental constraint
CN111553239A (en) * 2020-04-23 2020-08-18 厦门理工学院 Robot joint visual servo control method, terminal device and storage medium
CN111552293A (en) * 2020-05-13 2020-08-18 湖南大学 Mobile robot formation control method based on images under visual field constraint

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
刘江涛: "Research on Obstacle Avoidance for Wheeled Robots Based on Visual Localization", China Masters' Theses Full-text Database, Information Science and Technology Series *
宋健: "Research on Vision-based Localization Methods for Mobile Robots", China Masters' Theses Full-text Database, Information Science and Technology Series *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113031590A (en) * 2021-02-06 2021-06-25 浙江同筑科技有限公司 Mobile robot vision servo control method based on Lyapunov function
CN112947569A (en) * 2021-03-09 2021-06-11 中南大学 Visual servo target tracking control method for quad-rotor unmanned aerial vehicle based on preset performance
CN112947569B (en) * 2021-03-09 2022-08-12 中南大学 Visual servo target tracking control method for quad-rotor unmanned aerial vehicle based on preset performance
CN113190042A (en) * 2021-05-06 2021-07-30 南京云智控产业技术研究院有限公司 Unmanned aerial vehicle ground moving target tracking control method based on graphic moments
CN114518753A (en) * 2022-01-25 2022-05-20 华南理工大学 Unmanned ship vision servo control method based on preset performance control
CN114721275A (en) * 2022-05-13 2022-07-08 北京航空航天大学 Visual servo robot self-adaptive tracking control method based on preset performance

Also Published As

Publication number Publication date
CN112256001B (en) 2022-01-18

Similar Documents

Publication Publication Date Title
CN112256001B (en) Visual servo control method for mobile robot under visual angle constraint
CN110039542B (en) Visual servo tracking control method with speed and direction control function and robot system
CN105872371A (en) Information processing method and electronic device
CN113110495B (en) Formation control method of mobile robots under consideration of external interference
CN107966907B (en) Obstacle avoidance solution applied to redundant manipulator
CN107831761A (en) A kind of path tracking control method of intelligent vehicle
CN109032137B (en) Distributed tracking control method for multi-Euler-Lagrange system
CN109933096B (en) Cloud deck servo control method and system
CN109102525A (en) A kind of mobile robot follow-up control method based on the estimation of adaptive pose
CN111552293B (en) Mobile robot formation control method based on images under visual field constraint
CN109782759B (en) Approximate decoupling and rapid track following control method of wheeled mobile robot
CN110722533B (en) External parameter calibration-free visual servo tracking of wheeled mobile robot
CN112000135B (en) Three-axis holder visual servo control method based on human face maximum temperature point characteristic feedback
CN111553239A (en) Robot joint visual servo control method, terminal device and storage medium
CN109976347A (en) A kind of visual servo paths planning method based on Quick Extended random tree and potential field method
CN107192375A (en) A kind of unmanned plane multiple image adaptive location bearing calibration based on posture of taking photo by plane
CN110928311B (en) Indoor mobile robot navigation method based on linear features under panoramic camera
CN114721275B (en) Visual servo robot self-adaptive tracking control method based on preset performance
CN112504261A (en) Unmanned aerial vehicle landing pose filtering estimation method and system based on visual anchor point
CN112099505B (en) Low-complexity visual servo formation control method for mobile robot
CN113051767A (en) AGV sliding mode control method based on visual servo
CN108927807A (en) A kind of robot vision control method based on point feature
CN113211433A (en) Separated visual servo control method based on composite characteristics
CN111899303A (en) Novel feature matching and relative positioning method considering space inverse projection constraint
CN113689501B (en) Double-machine cooperative target machine positioning tracking control method based on convergence point

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant