CN109960145A - Hybrid visual trajectory tracking strategy for mobile robots - Google Patents

Hybrid visual trajectory tracking strategy for mobile robots Download PDF

Info

Publication number
CN109960145A
CN109960145A CN201711438733.3A CN201711438733A
Authority
CN
China
Prior art keywords
error
mobile robot
image
robot
track following
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201711438733.3A
Other languages
Chinese (zh)
Other versions
CN109960145B (en)
Inventor
闫凡雷
李宝全
师五喜
王栋伟
尹浩霖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin Polytechnic University
Original Assignee
Tianjin Polytechnic University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin Polytechnic University filed Critical Tianjin Polytechnic University
Priority to CN201711438733.3A priority Critical patent/CN109960145B/en
Publication of CN109960145A publication Critical patent/CN109960145A/en
Application granted granted Critical
Publication of CN109960145B publication Critical patent/CN109960145B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G — PHYSICS
    • G05 — CONTROLLING; REGULATING
    • G05B — CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B13/00 — Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion
    • G05B13/02 — Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric
    • G05B13/04 — Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric involving the use of models or simulators
    • G05B13/042 — Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric involving the use of models or simulators in which a parameter or coefficient is automatically adjusted to optimise the performance

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)
  • Image Processing (AREA)

Abstract

The present invention discloses a hybrid visual trajectory tracking strategy for mobile robots. A visual trajectory tracking control method is proposed for a wheeled mobile robot equipped with an onboard vision system, in which a 2.5-D visual servoing framework keeps the visual features easily within the camera's field of view and thereby improves the trajectory tracking performance. First, a hybrid system error is designed from image features and the robot orientation, according to the current image, the reference image and the desired image sequence. A new system error is then introduced, and after the corresponding transformation the open-loop error equation is derived. On this basis, an adaptive controller is designed to accomplish the visual servo trajectory tracking task, in which the unknown feature-point depth is compensated through a parameter update mechanism. Using the Lyapunov method and Barbalat's lemma, the proposed method is proven to drive the system errors asymptotically to zero even though the depth is unknown. Comparative simulation results demonstrate the performance of the proposed method.

Description

Hybrid visual trajectory tracking strategy for mobile robots
Technical field
The invention belongs to the technical fields of computer vision and mobile robotics, and more particularly relates to a hybrid visual trajectory tracking strategy for mobile robots.
Background technique
With the rapid development of robot technology, mobile robots play an increasingly important role. They offer advantages such as flexible motion, easy operation and a large workspace. In addition, visual sensors are widely used on such intelligent agents because they provide rich information, non-contact measurement and high reliability. Combining a mobile robot with a visual sensor enhances the robot system's ability to perceive the external environment, so that complex tasks can be accomplished more easily. Control based on visual feedback, also called visual servoing of mobile robots, uses the feedback of real-time images to guide the mobile robot so that it tracks a given trajectory or is regulated to a desired pose. This technique can therefore be applied in many fields, such as intelligent transportation, home services and automated logistics, and it is a research hotspot in robotics and automation.
Compared with pose stabilization control of mobile robots, trajectory tracking can be combined with other motion tasks and is therefore better suited to accomplishing complex tasks. For vision-based mobile robot systems, a major challenge is the lack of depth information of the environment, which prevents the robot pose from being fully recovered and introduces uncertainty into the closed-loop system. On the other hand, a mobile robot is a typical underactuated system subject to nonholonomic motion constraints, so existing robot control strategies, including trajectory tracking ones, cannot be applied to mobile robot systems directly. In view of uncertainties such as the nonholonomic constraint, it is challenging to design a high-performance hybrid visual trajectory tracking strategy for mobile robots.
To date, several methods have been devised to achieve trajectory tracking for mobile robots. Blazic proposed a framework for constructing various trajectory trackers, in which different tracking performances can be obtained by changing a time-varying function. Based on a polar representation, Chwa designed a sliding-mode controller to track a given trajectory and obtain an efficient motion path. Li et al. combined quadratic programming with model predictive control to solve the trajectory tracking problem while taking the velocity saturation constraints of the mobile robot into account. Zambelli et al. combined the tracking error with a decreasing function and designed a controller that adjusts both the transient behavior and the tracking performance. However, the above tracking methods are effective only when the full state of the mobile robot is measurable. When a visual sensor is used, the depth information is missing during trajectory tracking, which makes the task difficult.
When a visual sensor is used, the visual features should be kept within the camera's field of view as easily as possible. Fang et al. combined active vision with a pan-tilt unit so that the camera rotates accordingly to track the visual target. Mariottini et al. used an omnidirectional camera in the visual servoing task in order to keep the features visible while the mobile robot moves. In addition, to handle the unknown depth information, Zhang et al. designed an adaptive visual servo controller that compensates the feature depth. To track a desired trajectory, Wang et al. proposed an identification algorithm that estimates the feature depth and the global position of the robot online. Yang et al. proposed an adaptive torque-based trajectory tracking controller for mobile robots with uncalibrated visual parameters and uncertain robot dynamics. In a sense, the challenges come from the field-of-view constraint introduced by the visual sensor and the unknown feature depth, which must be handled with great care in visual servo tracking tasks.
Many solutions design visual servo trajectory tracking for mobile robots with a ceiling-mounted camera. For example, Liang et al. proposed a trajectory tracking controller based on uncalibrated camera images, in which the camera plane does not need to be parallel to the robot motion plane. However, the application potential of such eye-to-hand configurations is limited, because the mobile robot is confined to a small workspace. On the other hand, to accomplish the visual servo trajectory tracking task, various methods have been developed in recent years for the onboard camera configuration.
Chen et al. proposed a visual trajectory tracking controller with an adaptive update law for the feature-point depth, where the desired motion trajectory is determined by a prerecorded image sequence. To track a trajectory given by a series of key images, Jia et al. designed an adaptive tracking controller to handle a roughly installed camera. Cherubini et al. defined the desired motion path of the mobile robot with a key image sequence and designed a trajectory tracking controller based on image errors and obstacle positions.
Becerra et al. used the epipoles and the trifocal tensor to design the angular velocity of the mobile robot, where the trajectory to be tracked is defined from images acquired in a larger workspace. Unfortunately, existing methods seldom consider the field-of-view problem or discuss the trajectory tracking performance.
Summary of the invention
The present invention discloses a hybrid visual trajectory tracking strategy for mobile robots. A visual trajectory tracking control method is proposed for a wheeled mobile robot equipped with an onboard vision system, in which a 2.5-D visual servoing framework keeps the visual features easily within the camera's field of view and thereby improves the trajectory tracking performance. First, a hybrid system error is designed from image features and the robot orientation, according to the current image, the reference image and the desired image sequence. A new system error is then introduced, and after the corresponding transformation the open-loop error equation is derived. On this basis, an adaptive controller is designed to accomplish the visual servo trajectory tracking task, in which the unknown feature-point depth is compensated through a parameter update mechanism. Using the Lyapunov method and Barbalat's lemma, it is proven that the proposed method drives the system errors asymptotically to zero even though the depth is unknown. Comparative simulation results demonstrate the performance of the proposed method.
The hybrid visual trajectory tracking strategy for mobile robots provided by the invention includes:
1. Design of the visual servo trajectory tracking system
1.1 Description of the scheme
In the present invention, we design a hybrid visual servo trajectory tracking strategy for mobile robots. A 2.5-D visual servoing framework is used so that the visual features are easily kept within the camera's field of view, thereby improving the trajectory tracking performance. First, according to the current image, the reference image and the desired image sequence, 2-1/2-D visual servo tracking errors are defined from the image features and the robot rotation. An adaptive controller is then designed, in which the unknown feature-point depth information is compensated through a parameter update mechanism. Using the Lyapunov method and Barbalat's lemma, it is proven that the proposed visual trajectory tracking control method achieves an asymptotically stable result even when the scene depth is unknown. Comparative simulations show that, compared with our previous work, the proposed method only needs to tune fewer control parameters to accomplish the trajectory tracking task, and is therefore better suited to practical use;
2. System model construction
2.1 Problem description
The onboard camera frame Fc coincides with the frame of the mobile robot subject to the nonholonomic constraint. The zc axis of Fc lies along the camera optical axis and coincides with the heading of the mobile robot; the xc axis is parallel to the wheel axis, and the yc axis is perpendicular to the robot motion plane zcxc. In addition, Fd denotes the frame attached to the desired trajectory, where the desired trajectory is defined by a prerecorded image sequence. The static frame F* denotes the reference pose of the robot/camera and is set as the reference frame, so that the desired image sequence and the current image can be compared through the reference image. The two angles θc(t) and θd(t), computed by the homography method, denote the rotation angles of Fc and Fd with respect to the reference frame, respectively. Based on these coordinate frames, the present invention designs a visual trajectory tracking control method that makes the onboard camera frame Fc coincide with the desired trajectory frame Fd;
2.2 Measurable signals
Consider static feature points Pi (i = 1, 2, ..., N) in the scene, whose coordinates in F*, Fd and Fc are denoted by Pi*, Pid, Pic ∈ R3, respectively:
The homogeneous image pixel coordinates corresponding to these three coordinates are, respectively,
The normalized image coordinates are measurable:
where K ∈ R3×3 is the calibrated camera intrinsic parameter matrix;
For the convenience of the subsequent analysis, the depth ratios are defined as follows:
Since the mobile robot usually keeps a certain distance from the target object, the depth variables are positive; it follows that γi1(t) and γi2(t) have no singularity problem and can be estimated accordingly;
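As an illustration of how the measurable normalized coordinates described above can be obtained from calibrated images, the following Python sketch maps pixel coordinates to normalized image coordinates using the intrinsic matrix K; the numerical values of K and of the feature points are assumptions for the example, and this is not a reproduction of the patent's equations (1)-(3), which are given only in the original formula images.

```python
import numpy as np

# Assumed calibrated intrinsic matrix K (focal lengths and principal point are illustrative).
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])

def normalize(pixel_uv, K):
    """Map a pixel coordinate (u, v) to a normalized image coordinate via m = K^{-1} [u, v, 1]^T."""
    p = np.array([pixel_uv[0], pixel_uv[1], 1.0])
    m = np.linalg.inv(K) @ p
    return m / m[2]  # keep the homogeneous coordinate equal to 1

# Example: the same feature point observed in the current and in the desired image.
m_current = normalize((352.4, 261.7), K)
m_desired = normalize((340.1, 255.2), K)
```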
The angular velocity wd(t) on the desired trajectory and the desired linear velocity in the scaled sense can be computed in the following finite-difference form:
where θd(k) denotes the value of θd(t) at the current instant, θd(k−1) denotes its value at the previous instant, the scaled translation terms are defined analogously, and Δtk is the time interval between the two instants;
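The finite-difference computation of the desired angular velocity (and, analogously, the scaled desired linear velocity) described above can be sketched as follows; this is a minimal illustration of the backward difference under the stated definitions of θd(k), θd(k−1) and Δtk, not a reproduction of the patent's formula (5).

```python
def desired_angular_velocity(theta_d_k, theta_d_km1, dt_k):
    """Backward difference: w_d(k) ≈ (θ_d(k) - θ_d(k-1)) / Δt_k."""
    return (theta_d_k - theta_d_km1) / dt_k

def desired_scaled_linear_velocity(trans_k, trans_km1, dt_k):
    """Analogous backward difference for the scaled translation signal (illustrative name)."""
    return (trans_k - trans_km1) / dt_k

# Example with a 33 ms interval between two frames of the prerecorded image sequence.
w_d = desired_angular_velocity(0.102, 0.097, 0.033)
v_d_scaled = desired_scaled_linear_velocity(1.254, 1.249, 0.033)
```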
3. Controller design
The robot kinematics are analyzed first; a trajectory tracking controller is then designed to actively compensate the unknown feature-point depth, and the Lyapunov method is used to prove that the proposed controller drives the tracking errors asymptotically to zero.
3.1 Robot kinematics
The translation errors ez(t), ex(t) between Fc and Fd can be obtained from an arbitrary feature point Pi and are defined as follows:
Considering that the depth variables are positive, the above definition has no singularity problem. Furthermore, the rotation error eθ(t) between Fc and Fd is defined as:
eθ := θc − θd.     (7)
From (6) and (7), the constructed trajectory tracking error consists of image features and the estimated rotation angle; the invention is therefore built on a 2.5-D visual servoing framework, so that the visual features are easily kept within the camera's field of view;
To facilitate the controller design in the next part, the present invention constructs a new error vector:
Differentiating ρ1, ρ2 and ρ3 with respect to time yields the following chained kinematic equations:
where the unknown constant denotes the unknown feature-point depth information; in addition, the error signals ez(t), ex(t), eθ(t) can be obtained with the homography-based pose estimation algorithm;
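Since the error signals ez(t), ex(t), eθ(t) rely on a homography-based pose estimation between image pairs, one possible way to recover the rotation angles with OpenCV is sketched below; the decomposition call is a standard OpenCV routine, but the choice of candidate solution and the angle extraction are implementation assumptions rather than steps specified by the patent.

```python
import cv2
import numpy as np

def rotation_angle_from_images(pts_ref, pts_cur, K):
    """Estimate the robot rotation angle between a reference image and another image.

    pts_ref, pts_cur: (N, 2) float arrays of matched feature points (planar scene assumed).
    Returns the rotation angle about the camera y-axis of the first candidate solution.
    """
    H, _ = cv2.findHomography(pts_ref.astype(np.float32), pts_cur.astype(np.float32), cv2.RANSAC)
    _, rotations, translations, normals = cv2.decomposeHomographyMat(H, K)
    R = rotations[0]  # selecting the physically valid candidate is application-specific
    return np.arctan2(R[0, 2], R[2, 2])  # yaw about the vertical (y) camera axis

# theta_c is obtained from the current image, theta_d from the prerecorded desired image,
# and the rotation error of equation (7) is then e_theta = theta_c - theta_d.
```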
To facilitate the controller design in the next part, the present invention makes the following assumption:
Assumption 1: the velocities vd(t), wd(t) on the desired trajectory are bounded, and
3.2 Controller design
Based on the open-loop error system of equation (9), the present invention designs a hybrid visual servo trajectory tracking controller for the mobile robot under unknown feature-point depth;
Based on a Lyapunov stability analysis, the linear and angular velocity control laws of the mobile robot are designed in the following form:
where kv, kw are positive control gains, and the estimate of the unknown constant α related to the feature-point depth is updated as follows:
where Γ ∈ R+ is the update gain;
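To illustrate how an adaptive update mechanism of this kind can be discretized in practice, the skeleton below shows a generic gradient-type estimator for the depth-related constant α together with placeholder velocity commands. The specific control law (10) and update law (11) are given only in the original formula images, so the expressions used here are illustrative assumptions, not the patent's laws; only the gain values kv = 0.3, kw = 0.3, Γ = 80 are taken from the simulation section.

```python
class DepthAdaptiveTracker:
    """Illustrative skeleton of an adaptive visual-servo tracking loop (not the patent's exact laws)."""

    def __init__(self, kv=0.3, kw=0.3, gamma=80.0, alpha_hat0=1.0):
        self.kv, self.kw, self.gamma = kv, kw, gamma  # gains as used in the simulation section
        self.alpha_hat = alpha_hat0                   # running estimate of the unknown constant alpha

    def step(self, rho, v_d, w_d, dt):
        """rho = (rho1, rho2, rho3): transformed system errors; v_d, w_d: desired velocities."""
        rho1, rho2, rho3 = rho
        # Placeholder feedback structure: feed forward the desired motion and damp the errors.
        v_c = self.alpha_hat * v_d - self.kv * rho2
        w_c = w_d - self.kw * rho1
        # Placeholder gradient-type parameter update driven by the translational error (assumed form).
        self.alpha_hat += self.gamma * rho2 * v_d * dt
        return v_c, w_c
```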
Substituting the controller into the open-loop kinematic equation (9) yields the following closed-loop error equations:
where the parameter estimation error is defined as:
Theorem 1: The control law (10) and the parameter update law (11) drive the errors in the system dynamics (9) asymptotically to zero:
Proof: Choose the following non-negative Lyapunov function V(t):
Differentiating both sides of (14) with respect to time and substituting the closed-loop kinematic equation (12) gives:
Substituting equation (11) into (15) gives
From equations (14) and (16), it is clear that ρ1(t), ρ2(t), ρ3(t) and the parameter estimate are bounded, and then from equation (10) it follows that vc(t), wc(t) ∈ L∞; since vd(t), wd(t) are assumed to be bounded, it further follows from (9) and (11) that the error derivatives are bounded. Therefore, all system states are bounded;
Furthermore, from (16) it follows that ρ1, ρ2 ∈ L2; therefore, their convergence to zero is obtained directly from Barbalat's lemma. Then, the derivative of the term sin ρ1/ρ1 associated with ρ1(t) in equation (12) is as follows:
In addition, the following relationship obviously holds:
Since the above derivative is continuous on the interval (0, ∞), it can be concluded that sin ρ1/ρ1 is uniformly continuous;
In addition, it is easy to show that the remaining signals are also uniformly continuous and bounded. Therefore, applying the extended Barbalat lemma to the closed-loop error equation of ρ1(t), the corresponding error term is also shown to converge.
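For completeness, the versions of Barbalat's lemma invoked in the argument above can be stated as follows; these are standard results included only to make the convergence step explicit, not material added by the patent.

```latex
% Barbalat's lemma and the L2/L-infinity corollary commonly used in adaptive control proofs.
\textbf{Barbalat's lemma.}\; If $f:[0,\infty)\to\mathbb{R}$ is uniformly continuous and
$\lim_{t\to\infty}\int_{0}^{t} f(\tau)\,\mathrm{d}\tau$ exists and is finite, then $f(t)\to 0$ as $t\to\infty$.

\textbf{Corollary.}\; If $f\in L_{2}\cap L_{\infty}$ and $\dot{f}\in L_{\infty}$, then $f(t)\to 0$ as $t\to\infty$.
```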
From (6) and (7), when equation (13) holds, the visual trajectory tracking task is accomplished.
Under the stated assumption, it is readily seen that the constructed system errors all converge asymptotically to 0, that is:
In addition, from the relationship in (8) between ρ1(t), ρ2(t), ρ3(t) and ez(t), ex(t), eθ(t), the trajectory tracking errors converge asymptotically to zero, that is:
Brief description of the drawings:
Fig. 1 shows the coordinate frames in this visual servo tracking task;
Fig. 2 is the block diagram of the entire control system;
Fig. 3, proposed scheme: motion trajectory of the mobile robot, with the desired and current motion trajectories marked;
Fig. 4, proposed scheme: evolution of the mobile robot tracking errors [dotted line: desired value (zero)];
Fig. 5, proposed scheme: velocities of the mobile robot [solid line: current velocity; dotted line: velocity of the desired trajectory];
Fig. 6, proposed scheme: two-dimensional trajectories of the image features [solid line: current feature trajectory; dotted line: desired feature trajectory];
Fig. 7, comparison results: motion trajectory of the mobile robot, with the desired and current motion trajectories marked;
Fig. 8, comparison results: evolution of the mobile robot tracking errors [dotted line: desired value (zero)];
Fig. 9, comparison results: velocities of the mobile robot [solid line: current velocity; dotted line: velocity on the desired trajectory];
Fig. 10, comparison results: two-dimensional trajectories of the feature points in image space [solid line: current image trajectory; dotted line: desired image trajectory];
Fig. 11, comparison results: evolution of the position and orientation errors of the mobile robot [dotted line: desired value (zero)];
Fig. 12, comparison results: linear and angular velocities of the mobile robot [solid line: current velocity; dotted line: velocity on the desired trajectory].
Specific embodiment:
Embodiment 1
1. Design of the visual servo trajectory tracking system
1.1 Description of the scheme
In the present invention, we design a hybrid visual servo trajectory tracking strategy for mobile robots. A 2.5-D visual servoing framework is used so that the visual features are easily kept within the camera's field of view, thereby improving the trajectory tracking performance. First, according to the current image, the reference image and the desired image sequence, 2-1/2-D visual servo tracking errors are defined from the image features and the robot rotation. An adaptive controller is then designed, in which the unknown feature-point depth information is compensated through a parameter update mechanism. Using the Lyapunov method and Barbalat's lemma, it is proven that the proposed visual trajectory tracking control method achieves an asymptotically stable result even when the scene depth is unknown. Comparative simulations show that, compared with our previous work, the proposed method only needs to tune fewer control parameters to accomplish the trajectory tracking task, and is therefore better suited to practical use;
2. System model construction
2.1 Problem description
The onboard camera frame Fc coincides with the frame of the mobile robot subject to the nonholonomic constraint. The zc axis of Fc lies along the camera optical axis and coincides with the heading of the mobile robot; the xc axis is parallel to the wheel axis, and the yc axis is perpendicular to the robot motion plane zcxc. In addition, Fd denotes the frame attached to the desired trajectory, where the desired trajectory is defined by a prerecorded image sequence. The static frame F* denotes the reference pose of the robot/camera and is set as the reference frame, so that the desired image sequence and the current image can be compared through the reference image. The two angles θc(t) and θd(t), computed by the homography method, denote the rotation angles of Fc and Fd with respect to the reference frame, respectively. Based on these coordinate frames, the present invention designs a visual trajectory tracking control method that makes the onboard camera frame Fc coincide with the desired trajectory frame Fd;
2.2 Measurable signals
Consider static feature points Pi (i = 1, 2, ..., N) in the scene, whose coordinates in F*, Fd and Fc are denoted by Pi*, Pid, Pic ∈ R3, respectively:
The homogeneous image pixel coordinates corresponding to these three coordinates are, respectively,
The normalized image coordinates are measurable:
where K ∈ R3×3 is the calibrated camera intrinsic parameter matrix;
For the convenience of the subsequent analysis, the depth ratios are defined as follows:
Since the mobile robot usually keeps a certain distance from the target object, the depth variables are positive; it follows that γi1(t) and γi2(t) have no singularity problem and can be estimated accordingly;
The angular velocity wd(t) on the desired trajectory and the desired linear velocity in the scaled sense can be computed in the following finite-difference form:
where θd(k) denotes the value of θd(t) at the current instant, θd(k−1) denotes its value at the previous instant, the scaled translation terms are defined analogously, and Δtk is the time interval between the two instants;
3. Controller design
The robot kinematics are analyzed first; a trajectory tracking controller is then designed to actively compensate the unknown feature-point depth, and the Lyapunov method is used to prove that the proposed controller drives the tracking errors asymptotically to zero.
3.1 Robot kinematics
The translation errors ez(t), ex(t) between Fc and Fd can be obtained from an arbitrary feature point Pi and are defined as follows:
Considering that the depth variables are positive, the above definition has no singularity problem. Furthermore, the rotation error eθ(t) between Fc and Fd is defined as:
eθ := θc − θd.     (7)
From (6) and (7), the constructed trajectory tracking error consists of image features and the estimated rotation angle; the invention is therefore built on a 2.5-D visual servoing framework, so that the visual features are easily kept within the camera's field of view;
To facilitate the controller design in the next part, the present invention constructs a new error vector:
Differentiating ρ1, ρ2 and ρ3 with respect to time yields the following chained kinematic equations:
where the unknown constant denotes the unknown feature-point depth information; in addition, the error signals ez(t), ex(t), eθ(t) can be obtained with the homography-based pose estimation algorithm;
To facilitate the controller design in the next part, the present invention makes the following assumption:
Assumption 1: the velocities vd(t), wd(t) on the desired trajectory are bounded, and
3.2 Controller design
Based on the open-loop error system of equation (9), the present invention designs a hybrid visual servo trajectory tracking controller for the mobile robot under unknown feature-point depth;
Based on a Lyapunov stability analysis, the linear and angular velocity control laws of the mobile robot are designed in the following form:
where kv, kw are positive control gains, and the estimate of the unknown constant α related to the feature-point depth is updated as follows:
where Γ ∈ R+ is the update gain;
Substituting the controller into the open-loop kinematic equation (9) yields the following closed-loop error equations:
where the parameter estimation error is defined as:
Theorem 1: The control law (10) and the parameter update law (11) drive the errors in the system dynamics (9) asymptotically to zero:
Proof: Choose the following non-negative Lyapunov function V(t):
Differentiating both sides of (14) with respect to time and substituting the closed-loop kinematic equation (12) gives:
Substituting equation (11) into (15) gives
From equations (14) and (16), it is clear that ρ1(t), ρ2(t), ρ3(t) and the parameter estimate are bounded, and then from equation (10) it follows that vc(t), wc(t) ∈ L∞; since vd(t), wd(t) are assumed to be bounded, it further follows from (9) and (11) that the error derivatives are bounded. Therefore, all system states are bounded;
Furthermore, from (16) it follows that ρ1, ρ2 ∈ L2; therefore, their convergence to zero is obtained directly from Barbalat's lemma. Then, the derivative of the term sin ρ1/ρ1 associated with ρ1(t) in equation (12) is as follows:
In addition, the following relationship obviously holds:
Since the above derivative is continuous on the interval (0, ∞), it can be concluded that sin ρ1/ρ1 is uniformly continuous;
In addition, it is easy to show that the remaining signals are also uniformly continuous and bounded. Therefore, applying the extended Barbalat lemma to the closed-loop error equation of ρ1(t), the corresponding error term is also shown to converge.
From (6) and (7), when equation (13) holds, the visual trajectory tracking task is accomplished.
Under the stated assumption, it is readily seen that the constructed system errors all converge asymptotically to 0, that is:
In addition, from the relationship in (8) between ρ1(t), ρ2(t), ρ3(t) and ez(t), ex(t), eθ(t), the trajectory tracking errors converge asymptotically to zero, that is:
4. Simulation results
In this section, simulation results are provided to verify the performance of the proposed method. Randomly selected feature points in the scene are taken as the visual target, and the reference image is obtained at F*. To obtain the desired image sequence, the desired robot motion trajectory is defined and a virtual robot-camera system is driven along it with sinusoidal velocities. In addition, the proposed scheme is compared with a classical visual servo tracking strategy and a unified visual servoing method. The desired trajectory starts from the pose (−4.2 m, 1.2 m, 3°) and then moves in a serpentine fashion. The current motion trajectory starts from (−5.2 m, 1.7 m, 5°). Gaussian noise with a standard deviation of δ = 0.2 pixels is added to the pixels of the desired image sequence and of the current image.
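As a small illustration of the measurement noise described above, the following sketch perturbs the pixel coordinates of one frame with zero-mean Gaussian noise of standard deviation 0.2 px; the feature coordinates are made-up example values.

```python
import numpy as np

rng = np.random.default_rng(0)

def add_pixel_noise(pixels, sigma=0.2):
    """Add zero-mean Gaussian noise (standard deviation in pixels) to an (N, 2) array of pixel coordinates."""
    return pixels + rng.normal(0.0, sigma, size=pixels.shape)

# Example: four tracked feature points in one frame of the desired image sequence.
frame = np.array([[352.4, 261.7], [340.1, 255.2], [298.6, 270.3], [310.9, 244.8]])
noisy_frame = add_pixel_noise(frame, sigma=0.2)
```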
For the proposed scheme, the control parameters are selected as kv = 0.3, kw = 0.3, Γ = 80. In addition, the desired velocity wd(t) is computed with a filtered backward-difference algorithm. Fig. 3 shows the current and desired robot trajectories. Fig. 4 shows the tracking errors between the desired and current motion, where *Tdz(t), *Tdx(t) denote the Z and X coordinates of the origin of frame Fd expressed in frame F*, and *Tcz(t), *Tcx(t) denote the Z and X coordinates of the origin of frame Fc expressed in frame F*. From Figs. 3 and 4, the mobile robot overcomes a fairly large initial error and quickly reaches the desired motion trajectory. Fig. 5 shows the linear and angular velocities of the robot; it can be seen that while the mobile robot moves, the current velocities agree with the desired velocities. From the image trajectories shown in Fig. 6, the current features coincide with the desired image sequence.
For the compared method, the control parameters kv, kw, γ1 are carefully tuned and chosen as kv = 0.3, kw = 0.1, γ1 = 8, and the results are shown in Figs. 7 to 10. The motion trajectory and pose errors are shown in Figs. 7 and 8, while Figs. 9 and 10 show the robot velocities and the image trajectories, respectively. For this trajectory configuration, the initial longitudinal and lateral tracking errors are 1.0 m and 0.6 m, respectively. The tracking errors decrease rapidly after the controller has worked for about 10 seconds and are suppressed to a sufficiently small level by about 22 seconds, whereas for the compared method the lateral error becomes sufficiently small only after about 35 seconds and the orientation error converges after 50 seconds. Accordingly, the tracking errors converge slowly with the compared controller. Comparing Figs. 3 to 6 with Figs. 7 to 10, which share the same desired motion trajectory and target features, the response of the proposed method is better than that of the other method, and in this case the trajectory tracking errors converge to zero quickly.
For the other compared method, the control parameters are tuned to γ1 = 2, γ2 = 0.5, γ3 = 0.1, k1 = 0.4, k2 = 5, Γ1 = 100, Γ2 = 100, and the results are shown in Figs. 11 and 12. The tracking errors converge quickly, within about 15 seconds. However, chattering occurs because the angular velocity tracking error is large, even though the angular velocity is filtered. Compared with the other methods, the proposed method obtains comparable tracking performance with moderate linear and angular velocities. In addition, that method requires 7 control parameters to be tuned carefully to obtain satisfactory tracking performance, whereas the proposed method only needs to tune 3 parameters. Therefore, the proposed method is better suited to practical tracking tasks.
It should be noted that in the simulation scenario the sinusoidal vd(t) and wd(t) are finally set to the desired zero velocity. The theoretical analysis still holds as long as vd(t) does not remain equal to zero for an infinite time. It can be seen that the proposed controller has good performance, which shows that the method can be applied in practice.

Claims (1)

1. Design of the visual servo trajectory tracking system
1.1 Description of the scheme
In the present invention, we design a hybrid visual servo trajectory tracking strategy for mobile robots. A 2.5-D visual servoing framework is used so that the visual features are easily kept within the camera's field of view, thereby improving the trajectory tracking performance. First, according to the current image, the reference image and the desired image sequence, 2-1/2-D visual servo tracking errors are defined from the image features and the robot rotation. An adaptive controller is then designed, in which the unknown feature-point depth information is compensated through a parameter update mechanism. Using the Lyapunov method and Barbalat's lemma, it is proven that the proposed visual trajectory tracking control method achieves an asymptotically stable result even when the scene depth is unknown. Comparative simulations show that, compared with our previous work, the proposed method only needs to tune fewer control parameters to accomplish the trajectory tracking task, and is therefore better suited to practical use;
2. System model construction
2.1 Problem description
The onboard camera frame Fc coincides with the frame of the mobile robot subject to the nonholonomic constraint. The zc axis of Fc lies along the camera optical axis and coincides with the heading of the mobile robot; the xc axis is parallel to the wheel axis, and the yc axis is perpendicular to the robot motion plane zcxc. In addition, Fd denotes the frame attached to the desired trajectory, where the desired trajectory is defined by a prerecorded image sequence. The static frame F* denotes the reference pose of the robot/camera and is set as the reference frame, so that the desired image sequence and the current image can be compared through the reference image. The two angles θc(t) and θd(t), computed by the homography method, denote the rotation angles of Fc and Fd with respect to the reference frame, respectively. Based on these coordinate frames, the present invention designs a visual trajectory tracking control method that makes the onboard camera frame Fc coincide with the desired trajectory frame Fd;
2.2 Measurable signals
Consider static feature points Pi (i = 1, 2, ..., N) in the scene, whose coordinates in F*, Fd and Fc are denoted by Pi*, Pid, Pic ∈ R3, respectively:
The homogeneous image pixel coordinates corresponding to these three coordinates are, respectively,
The normalized image coordinates are measurable:
where K ∈ R3×3 is the calibrated camera intrinsic parameter matrix;
For the convenience of the subsequent analysis, the depth ratios are defined as follows:
Since the mobile robot usually keeps a certain distance from the target object, the depth variables are positive; it follows that γi1(t) and γi2(t) have no singularity problem and can be estimated accordingly;
The angular velocity wd(t) on the desired trajectory and the desired linear velocity in the scaled sense can be computed in the following finite-difference form:
where θd(k) denotes the value of θd(t) at the current instant, θd(k−1) denotes its value at the previous instant, the scaled translation terms are defined analogously, and Δtk is the time interval between the two instants;
3. Controller design
The robot kinematics are analyzed first; a trajectory tracking controller is then designed to actively compensate the unknown feature-point depth, and the Lyapunov method is used to prove that the proposed controller drives the tracking errors asymptotically to zero.
3.1 Robot kinematics
The translation errors ez(t), ex(t) between Fc and Fd can be obtained from an arbitrary feature point Pi and are defined as follows:
Considering that the depth variables are positive, the above definition has no singularity problem. Furthermore, the rotation error eθ(t) between Fc and Fd is defined as:
eθ := θc − θd.     (7)
From (6) and (7), the constructed trajectory tracking error consists of image features and the estimated rotation angle; the invention is therefore built on a 2.5-D visual servoing framework, so that the visual features are easily kept within the camera's field of view;
To facilitate the controller design in the next part, the present invention constructs a new error vector:
Differentiating ρ1, ρ2 and ρ3 with respect to time yields the following chained kinematic equations:
where the unknown constant denotes the unknown feature-point depth information; in addition, the error signals ez(t), ex(t), eθ(t) can be obtained with the homography-based pose estimation algorithm;
To facilitate the controller design in the next part, the present invention makes the following assumption:
Assumption 1: the velocities vd(t), wd(t) on the desired trajectory are bounded, and
3.2 Controller design
Based on the open-loop error system of equation (9), the present invention designs a hybrid visual servo trajectory tracking controller for the mobile robot under unknown feature-point depth;
Based on a Lyapunov stability analysis, the linear and angular velocity control laws of the mobile robot are designed in the following form:
where kv, kw are positive control gains, and the estimate of the unknown constant α related to the feature-point depth is updated as follows:
where Γ ∈ R+ is the update gain;
Substituting the controller into the open-loop kinematic equation (9) yields the following closed-loop error equations:
where the parameter estimation error is defined as:
Theorem 1: The control law (10) and the parameter update law (11) drive the errors in the system dynamics (9) asymptotically to zero:
Proof: Choose the following non-negative Lyapunov function V(t):
Differentiating both sides of (14) with respect to time and substituting the closed-loop kinematic equation (12) gives:
Substituting equation (11) into (15) gives
From equations (14) and (16), it is clear that ρ1(t), ρ2(t), ρ3(t) and the parameter estimate are bounded, and then from equation (10) it follows that vc(t), wc(t) ∈ L∞; since vd(t), wd(t) are assumed to be bounded, it further follows from (9) and (11) that the error derivatives are bounded. Therefore, all system states are bounded;
Furthermore, from (16) it follows that ρ1, ρ2 ∈ L2; therefore, their convergence to zero is obtained directly from Barbalat's lemma. Then, the derivative of the term sin ρ1/ρ1 associated with ρ1(t) in equation (12) is as follows:
In addition, the following relationship obviously holds:
Since the above derivative is continuous on the interval (0, ∞), it can be concluded that sin ρ1/ρ1 is uniformly continuous;
In addition, it is easy to show that the remaining signals are also uniformly continuous and bounded. Therefore, applying the extended Barbalat lemma to the closed-loop error equation of ρ1(t), the corresponding error term is also shown to converge.
From (6) and (7), when equation (13) holds, the visual trajectory tracking task is accomplished.
Under the stated assumption, it is readily seen that the constructed system errors all converge asymptotically to 0, that is:
In addition, from the relationship in (8) between ρ1(t), ρ2(t), ρ3(t) and ez(t), ex(t), eθ(t), the trajectory tracking errors converge asymptotically to zero, that is:
CN201711438733.3A 2017-12-22 2017-12-22 Mobile robot mixed vision trajectory tracking strategy Expired - Fee Related CN109960145B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711438733.3A CN109960145B (en) 2017-12-22 2017-12-22 Mobile robot mixed vision trajectory tracking strategy

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711438733.3A CN109960145B (en) 2017-12-22 2017-12-22 Mobile robot mixed vision trajectory tracking strategy

Publications (2)

Publication Number Publication Date
CN109960145A true CN109960145A (en) 2019-07-02
CN109960145B CN109960145B (en) 2022-06-14

Family

ID=67022786

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711438733.3A Expired - Fee Related CN109960145B (en) 2017-12-22 2017-12-22 Mobile robot mixed vision trajectory tracking strategy

Country Status (1)

Country Link
CN (1) CN109960145B (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111283683A (en) * 2020-03-04 2020-06-16 湖南师范大学 Servo tracking accelerated convergence method for robot visual feature planning track
CN111578947A (en) * 2020-05-29 2020-08-25 天津工业大学 Unmanned aerial vehicle monocular SLAM extensible framework with depth recovery capability
CN112363538A (en) * 2020-11-09 2021-02-12 哈尔滨工程大学 AUV (autonomous underwater vehicle) area tracking control method under incomplete speed information
CN114434441A (en) * 2021-12-31 2022-05-06 中南大学 Mobile robot visual servo tracking control method based on self-adaptive dynamic programming
US11380176B2 (en) * 2019-11-07 2022-07-05 Hon Hai Precision Industry Co., Ltd. Computing device and non-transitory storage medium implementing target tracking method
WO2022143626A1 (en) * 2020-12-31 2022-07-07 深圳市优必选科技股份有限公司 Method for controlling mobile robot, computer-implemented storage medium, and mobile robot

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102358287A (en) * 2011-09-05 2012-02-22 北京航空航天大学 Trajectory tracking control method used for automatic driving robot of vehicle
CN103019239A (en) * 2012-11-27 2013-04-03 江苏大学 Trajectory tracking sliding mode control system and control method for spraying mobile robot
CN103135549A (en) * 2012-12-21 2013-06-05 北京邮电大学 Motion control system and motion control method for spherical robot with visual feedback
CN104317299A (en) * 2014-11-11 2015-01-28 东南大学 Mixed control method based on trace tracking of wheeled mobile robot
CN104950887A (en) * 2015-06-19 2015-09-30 重庆大学 Transportation device based on robot vision system and independent tracking system
CN105955251A (en) * 2016-03-11 2016-09-21 北京克路德人工智能科技有限公司 Vision following control method of robot and robot
CN106774335A (en) * 2017-01-03 2017-05-31 南京航空航天大学 Guiding device based on multi-vision visual and inertial navigation, terrestrial reference layout and guidance method
CN107121981A (en) * 2017-04-20 2017-09-01 杭州南江机器人股份有限公司 A kind of AGV line walkings navigation of view-based access control model and localization method
CN107263511A (en) * 2017-05-26 2017-10-20 哈尔滨工程大学 A kind of omnidirectional's airfield runway detection robot system and its control method
CN107421540A (en) * 2017-05-05 2017-12-01 华南理工大学 A kind of Mobile Robotics Navigation method and system of view-based access control model

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102358287A (en) * 2011-09-05 2012-02-22 北京航空航天大学 Trajectory tracking control method used for automatic driving robot of vehicle
CN103019239A (en) * 2012-11-27 2013-04-03 江苏大学 Trajectory tracking sliding mode control system and control method for spraying mobile robot
CN103135549A (en) * 2012-12-21 2013-06-05 北京邮电大学 Motion control system and motion control method for spherical robot with visual feedback
CN104317299A (en) * 2014-11-11 2015-01-28 东南大学 Mixed control method based on trace tracking of wheeled mobile robot
CN104950887A (en) * 2015-06-19 2015-09-30 重庆大学 Transportation device based on robot vision system and independent tracking system
CN105955251A (en) * 2016-03-11 2016-09-21 北京克路德人工智能科技有限公司 Vision following control method of robot and robot
CN106774335A (en) * 2017-01-03 2017-05-31 南京航空航天大学 Guiding device based on multi-vision visual and inertial navigation, terrestrial reference layout and guidance method
CN107121981A (en) * 2017-04-20 2017-09-01 杭州南江机器人股份有限公司 A kind of AGV line walkings navigation of view-based access control model and localization method
CN107421540A (en) * 2017-05-05 2017-12-01 华南理工大学 A kind of Mobile Robotics Navigation method and system of view-based access control model
CN107263511A (en) * 2017-05-26 2017-10-20 哈尔滨工程大学 A kind of omnidirectional's airfield runway detection robot system and its control method

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
JIANFENG LIAO等: "Performance-oriented coordinated adaptive robust control for four-wheel independently driven skid steer mobile robot", 《IEEE ACCESS》 *
YING WANG等: "A hybrid visual servo controller for Robust Grasping by Wheeled Mobile Robots", 《IEEE/ASME TRANSACTIONS ON MECHATRONICS》 *
YONG-LIN KUO等: "Pose determination of a robot manipulator based on monocular vision", 《IEEE ACCESS》 *
管春苗 (Guan Chunmiao): "Research on moving-target trajectory tracking technology based on machine vision", China Master's Theses Full-text Database, Information Science and Technology Series *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11380176B2 (en) * 2019-11-07 2022-07-05 Hon Hai Precision Industry Co., Ltd. Computing device and non-transitory storage medium implementing target tracking method
CN111283683A (en) * 2020-03-04 2020-06-16 湖南师范大学 Servo tracking accelerated convergence method for robot visual feature planning track
CN111578947A (en) * 2020-05-29 2020-08-25 天津工业大学 Unmanned aerial vehicle monocular SLAM extensible framework with depth recovery capability
CN111578947B (en) * 2020-05-29 2023-12-22 国网浙江省电力有限公司台州市椒江区供电公司 Unmanned plane monocular SLAM (selective liquid level adjustment) expandable frame with depth recovery capability
CN112363538A (en) * 2020-11-09 2021-02-12 哈尔滨工程大学 AUV (autonomous underwater vehicle) area tracking control method under incomplete speed information
WO2022143626A1 (en) * 2020-12-31 2022-07-07 深圳市优必选科技股份有限公司 Method for controlling mobile robot, computer-implemented storage medium, and mobile robot
CN114434441A (en) * 2021-12-31 2022-05-06 中南大学 Mobile robot visual servo tracking control method based on self-adaptive dynamic programming

Also Published As

Publication number Publication date
CN109960145B (en) 2022-06-14

Similar Documents

Publication Publication Date Title
CN109960145A (en) Mobile robot mixes vision track following strategy
Zhang et al. Motion-estimation-based visual servoing of nonholonomic mobile robots
Engel et al. Scale-aware navigation of a low-cost quadrocopter with a monocular camera
Hu et al. Quaternion‐based visual servo control in the presence of camera calibration error
Huang et al. Through-the-lens drone filming
Kerl Odometry from rgb-d cameras for autonomous quadrocopters
Zhao et al. Vision-based tracking control of quadrotor with backstepping sliding mode control
Motlagh et al. Position Estimation for Drones based on Visual SLAM and IMU in GPS-denied Environment
Murrieri et al. A hybrid-control approach to the parking problem of a wheeled vehicle using limited view-angle visual feedback
Becerra et al. Visual navigation of wheeled mobile robots using direct feedback of a geometric constraint
Liang et al. Calibration-free image-based trajectory tracking control of mobile robots with an overhead camera
Li et al. Motion prediction and robust tracking of a dynamic and temporarily-occluded target by an unmanned aerial vehicle
Vakanski et al. An image-based trajectory planning approach for robust robot programming by demonstration
Barreto et al. Active Stereo Tracking of $ N\le 3$ Targets Using Line Scan Cameras
Yang et al. Vision-based localization and mapping for an autonomous mower
Marchand et al. Visual servoing through mirror reflection
Peretroukhin et al. Optimizing camera perspective for stereo visual odometry
CN109816717A (en) The vision point stabilization of wheeled mobile robot in dynamic scene
Kim et al. Absolute motion and structure from stereo image sequences without stereo correspondence and analysis of degenerate cases
Choi et al. Encoderless gimbal calibration of dynamic multi-camera clusters
Perron et al. Orbiting a moving target with multi-robot collaborative visual slam
Spica et al. A game theoretic approach to autonomous two-player drone racing
Ling et al. An iterated extended Kalman filter for 3D mapping via Kinect camera
Spica et al. Active structure from motion for spherical and cylindrical targets
Braganza et al. Euclidean position estimation of static features using a moving camera with known velocities

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20220614

CF01 Termination of patent right due to non-payment of annual fee